Expansions and asymptotics for statistics


MONOGRAPHS ON STATISTICS AND APPLIED PROBABILITY

General Editors: F. Bunea, V. Isham, N. Keiding, T. Louis, R.L. Smith, and H. Tong

[The front matter lists the 114 earlier volumes in the series, from Stochastic Population Models in Ecology and Epidemiology by M.S. Bartlett (1960) through Introduction to Time Series Modeling by Genshiro Kitagawa (2010).]

Monographs on Statistics and Applied Probability 115

Expansions and Asymptotics for Statistics

Christopher G. Small
University of Waterloo, Waterloo, Ontario, Canada

Chapman & Hall/CRC, Taylor & Francis Group, 6000 Broken Sound Parkway NW, Suite 300, Boca Raton, FL 33487-2742. © 2010 by Taylor and Francis Group, LLC. Chapman & Hall/CRC is an imprint of Taylor & Francis Group, an Informa business. No claim to original U.S. Government works. Printed in the United States of America on acid-free paper. International Standard Book Number: 978-1-58488-590-0 (Hardback).

This book contains information obtained from authentic and highly regarded sources. Reasonable efforts have been made to publish reliable data and information, but the author and publisher cannot assume responsibility for the validity of all materials or the consequences of their use. The authors and publishers have attempted to trace the copyright holders of all material reproduced in this publication and apologize to copyright holders if permission to publish in this form has not been obtained. If any
copyright material has not been acknowledged, please write and let us know so we may rectify it in any future reprint. Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying, microfilming, and recording, or in any information storage or retrieval system, without written permission from the publishers. For permission to photocopy or use material electronically from this work, please access www.copyright.com or contact the Copyright Clearance Center, Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400. Trademark notice: product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

Library of Congress Cataloging-in-Publication Data: Small, Christopher G. Expansions and asymptotics for statistics / Christopher G. Small. p. cm. (Monographs on statistics and applied probability; 115). Includes bibliographical references and index. ISBN 978-1-58488-590-0 (hardcover : alk. paper). 1. Asymptotic distribution (Probability theory). 2. Asymptotic expansions. I. Title. II. Series. QA273.6.S63 2010 519.5 dc22 2010010969.

Visit the Taylor & Francis Web site at http://www.taylorandfrancis.com and the CRC Press Web site at http://www.crcpress.com.

Contents

Preface
1 Introduction: 1.1 Expansions and approximations; 1.2 The role of asymptotics; 1.3 Mathematical preliminaries; 1.4 Two complementary approaches; 1.5 Problems
2 General series methods: 2.1 A quick overview; 2.2 Power series; 2.3 Enveloping series; 2.4 Asymptotic series; 2.5 Superasymptotic and hyperasymptotic series; 2.6 Asymptotic series for large samples; 2.7 Generalised asymptotic expansions; 2.8 Notes; 2.9 Problems
3 Padé approximants and continued fractions: 3.1 The Padé table; 3.2 Padé approximations for the exponential function; 3.3 Two applications; 3.4 Continued fraction expansions; 3.5 A continued fraction for the normal distribution; 3.6 Approximating transforms and other integrals; 3.7 Multivariate extensions; 3.8 Notes; 3.9 Problems
4 The delta method and its extensions: 4.1 Introduction to the delta method; 4.2 Preliminary results; 4.3 The delta method for moments; 4.4 Using the delta method in Maple; 4.5 Asymptotic bias; 4.6 Variance stabilising transformations; 4.7 Normalising transformations; 4.8 Parameter transformations; 4.9 Functions of several variables; 4.10 Ratios of averages; 4.11 The delta method for distributions; 4.12 The von Mises calculus; 4.13 Obstacles and opportunities: robustness; 4.14 Problems
5 Optimality and likelihood asymptotics: 5.1 Historical overview; 5.2 The organisation of this chapter; 5.3 The likelihood function and its properties; 5.4 Consistency of maximum likelihood; 5.5 Asymptotic normality of maximum likelihood; 5.6 Asymptotic comparison of estimators; 5.7 Local asymptotics; 5.8 Local asymptotic normality; 5.9 Local asymptotic minimaxity; 5.10 Various extensions; 5.11 Problems
6 The Laplace approximation and series: 6.1 A simple example; 6.2 The basic approximation; 6.3 The Stirling series for factorials; 6.4 Laplace expansions in Maple; 6.5 Asymptotic bias of the median; 6.6 Recurrence properties of random walks; 6.7 Proofs of the main propositions; 6.8 Integrals with the maximum on the boundary; 6.9 Integrals of higher dimension; 6.10 Integrals with product integrands; 6.11 Applications to statistical inference; 6.12 Estimating location parameters; 6.13 Asymptotic analysis of Bayes estimators; 6.14 Notes; 6.15 Problems
7 The saddle-point method: 7.1 The principle of stationary phase; 7.2 Perron's saddle-point method; 7.3 Harmonic functions and saddle-point geometry; 7.4 Daniels' saddle-point approximation; 7.5 Towards the Barndorff-Nielsen formula

[…]

SUMMATION OF SERIES

An important example of this kind is to be found in the computation of Khintchine's constant. This constant is the geometric mean arising in the continued fraction expansions of normal numbers. A brief explanation is as follows. Suppose X is a strictly positive random variable having some continuous distribution on the positive reals. Then there exists a unique continued fraction representation of X as

X = N_0 + 1/(N_1 + 1/(N_2 + 1/(N_3 + ···)))

where N_0, N_1, N_2, and so on, are positive integers. Khintchine (1964) discovered that there exists a real number ω such that

P( lim_{m→∞} (N_1 N_2 ··· N_m)^{1/m} = ω ) = 1.   (8.52)

In fact, ω is the geometric mean of the limiting distribution of N_m as m → ∞. The probability function of this limiting distribution can be shown to be

lim_{m→∞} P(N_m = j) = (1/ln 2) ln[1 + 1/(j(j+2))] ∼ O(j^{−2}) as j → ∞,

again, with probability one. Therefore, an infinite series representation for the value of Khintchine's constant is given by

ln ω = Σ_{j=1}^∞ (ln j / ln 2) ln[1 + 1/(j(j+2))].   (8.53)

Unfortunately, the convergence of this series is even slower than that of Σ j^{−2}, due to the presence of the additional factor ln j. However, the logarithm is a fairly flat function, and so a conservative acceleration of the series can be achieved using Richardson's method under the rough assumption that s_n − s = O(n^{−1}). Since the difference s_n − s is really of greater order than O(n^{−1}), Richardson's acceleration method will tend to underestimate the sum if we set p = 1 for the first iteration and increment p by one for each iteration thereafter. For Richardson's algorithm, let us take a set of partial sums s_n, where n = 2^m × 100 for m = 0, …. Setting
p = 1 in (8.44), and successively incrementing p by one, we get the values shown in Table 8.3. The estimate on the far right is our accelerated sum for the series. So, based on Richardson's algorithm and the given partial sums, our estimate for ω is

ω ≈ exp(0.9876902347) ≈ 2.685025526.

partial sums s_n      accelerated values
0.9080365395          0.9875263696
0.9427186946          0.9876902347
0.9627178807          0.9876876743
0.9740145133
0.9803014212
0.9837612607
0.9856484896

Table 8.3  Accelerated sums for Khintchine's constant using Richardson's algorithm (with intermediate values omitted for brevity).

This value can be compared with more precise estimates for ω based upon more extensive computations. It is known that ω = 2.685452001…. Khintchine's constant is known to many more decimal places than this; for example, Bailey et al. (1997) calculated the constant to more than 7000 decimal places. However, calculations of such precision use series for ω which converge much faster than the one given here.

8.8 Comparing acceleration techniques

We have now considered a number of methods for accelerating series. A comparison among these methods is rather difficult, because very few series satisfy the tidy regularity conditions necessary to make a particular acceleration method more effective than the others. While the series calculated in Section 8.7.1 fits the assumptions for Richardson's method better than Wynn's or Aitken's, the series calculated in Section 8.7.2 fits none of the assumptions that well. Nevertheless, the corrections obtained by these methods were all in the right direction, even if the order of the error was not reduced. In all cases, the success of an acceleration method must be judged in terms of the computational efficiencies obtained from it. Typically, the computational costs of summing a series are roughly proportional to the number of terms in the sum. More generally, we might suppose that the computational cost of summing n terms of a series is some
function C(n). We might also let C_0(n) denote the computational cost of accelerating the partial sums out to n terms. Suppose that m > n are two positive integers such that the accelerated sum based on the first n terms has error comparable to the ordinary partial sum s_m. It seems reasonable to measure the computational efficiency obtained by accelerating the first n terms as

Eff(n) = C(m) / [C(n) + C_0(n)].

Typically, acceleration will not be worthwhile unless m is much larger than n.

8.9 Divergent series

8.9.1 Antilimits

Divergent series have arisen in numerous places in earlier sections of this book. In particular, asymptotic series are often divergent. Although these series are of importance both theoretically and computationally, so far we have not had a unified way of summing such series. Some intriguing clues about the summability of divergent series can be found among the series acceleration techniques that we have considered so far. In some cases, series transformations such as Aitken's and Euler's can be applied to divergent series so as to make them convergent. It is worth formalising this idea. When it exists, let us define an antilimit s of a divergent series Σ a_j to be the sum of the series after it has been transformed by some standard acceleration procedure. For example, the series

1 + ρ + ρ^2 + ρ^3 + ···

has limit 1/(1 − ρ) for |ρ| < 1. When |ρ| > 1, the series diverges. However, we can still say that 1/(1 − ρ) is the antilimit of the series, because the partial sums are

s_n = (1 − ρ^n)/(1 − ρ), for all n ≥ 1,

and after a one-step application of Aitken's δ^2 method we obtain

s′_n = 1/(1 − ρ), for all n ≥ 1.

In qualitative terms, the partial sums of series converge towards limits and diverge away from antilimits.

8.9.2 Why should we sum divergent series?
Sometimes the properties of functions can be proved simply through their expansions. A very simple example of this principle is that the identity

1/(1 − x) + 1/(1 + x^2) − x/(1 − x^2) = 2/(1 − x^4),

while easily verified algebraically, can be derived immediately from series expansions of the rational functions. Thus

1/(1 − x) + 1/(1 + x^2) − x/(1 − x^2)
= (1 + x + x^2 + ···) + (1 − x^2 + x^4 − ···) − (x + x^3 + x^5 + ···)
= 2 + 2 x^4 + 2 x^8 + ···
= 2/(1 − x^4).

While the original identity is valid for all |x| ≠ 1, the proof by series expansion is only valid for |x| < 1. However, the proof remains valid if the series sums are interpreted as limits for |x| < 1 and antilimits for |x| > 1.

• A unified theory for the summation of divergent and convergent series should allow the free manipulation of functions through their series expansions. This should lead to the construction of simpler proofs of some identities without restrictions on the domains of the functions, or the need for the extension of those domains by analytic continuation.

A minimal condition for a unified methodology for summing both convergent and divergent series is the following.

• A summation method will be said to be regular provided that it sums all convergent series to their usual sums.

Thus a summation method should be an extension of the usual concept of series convergence to a more general class of series. At this stage, we should clarify an ambiguity in the definition of regularity. Consider the series 1 + 2 + 4 + 8 + ···. Understood in the usual sense, this is a divergent series. However, as a series in the extended real numbers (including ±∞), the sum of this series exists and is +∞. That is, the partial sums of the series converge in the extended reals to +∞. In this respect, this series is quite different from 1 − 1 + 1 − 1 + ···, whose oscillations prevent the partial sums of the series from converging even among the extended reals. The latter series is therefore divergent in a stricter sense than the former. So, we may interpret the definition of
regularity given above in two senses, depending on whether the series Σ 2^j is understood as divergent or as convergent in the extended reals.

• A summation method is said to be totally regular if it sums series to s where lim inf s_n ≤ s ≤ lim sup s_n, the limits superior and inferior being defined in the extended real numbers.

For example, a summation method which sums diverging geometric series to the antilimit using Aitken's δ^2 method is not totally regular, because it sums 1 + 2 + 4 + ··· to −1 rather than +∞. While regularity is clearly a desirable property of any summation method, the desirability of total regularity will depend on the context in which the series is studied.

What other properties should summation methods have? The following laws must be satisfied if we are to be able to manipulate series with standard algebraic techniques.

1. If Σ a_j = s, then we must have Σ c a_j = c s.
2. If Σ a_j = s and Σ b_j = t, then Σ (a_j + b_j) = s + t.
3. Σ_{j≥1} a_j = s if and only if Σ_{j≥2} a_j = s − a_1.

The reader who is interested in learning more about the fascinating subject of divergent series is encouraged to read G. H. Hardy's classic work, Divergent Series, which is available as a reprint from the American Mathematical Society; see Hardy (1991). While the subject has not become mainstream within statistics, it continues to fascinate. The existence of asymptotic series, which have great power for the calculation of functions but which do not usually converge, is a reminder of other paradigms for series summation, most notably that of Euler. Such generalised summation methods can be used to provide formal definitions of the expectations of some discrete non-integrable random variables. This is of some value in discussing examples such as the St Petersburg paradox, involving finite random variables with infinite expectations. A generalisation of the law of large numbers was proposed by Feller (1968, 1971); see Problem 14, which explores this further. We finish with one final formula, the merits of
which are left to the reader to judge, to wit:

0! − 1! + 2! − 3! + 4! − ··· = 0.5963….

8.10 Problems

1. Use Bertrand's test to prove that the harmonic series

1 + 1/2 + 1/3 + 1/4 + ···

diverges.

2. Does the series Σ_{j=1}^∞ j^{−(j+1)/j} converge?

3. Prove that if Σ_{j=1}^∞ a_j is a convergent series whose terms are positive, then the series Σ_{j=1}^∞ a_j^{j/(j+1)} is also convergent. (This was problem B4 from the 1988 William Lowell Putnam competition.)

4. Prove that if the series Σ_{j=1}^∞ a_j is a convergent series with positive terms, then Σ_{j=1}^∞ (a_1 a_2 ··· a_j)^{1/j} is also convergent. (This is a toughie!)

5. Let X_1, X_2, X_3, … be independent identically distributed N(0, 1). Show that for all values of θ the series Σ_{j=1}^∞ X_j sin(jθ)/j converges with probability one.

6. Let X_j, j ≥ 1, be any sequence of random variables. Prove that there exists a sequence of positive constants a_j such that lim_{j→∞} X_j/a_j = 0 with probability one.

7. Suppose X_1, X_2, X_3, … are independent identically distributed Poisson random variables. Let a_j = ln j / ln ln j. Prove that P( lim sup_{j→∞} X_j/a_j = 1 ) = 1.

8. Use the Euler-Maclaurin sum formula to prove the following identities.
(a) The sum of positive integers: 1 + 2 + 3 + ··· + n = n(n+1)/2.
(b) The sum of squares of positive integers: 1^2 + 2^2 + 3^2 + ··· + n^2 = n(n+1)(2n+1)/6.
(c) The sum of the odd positive integers: 1 + 3 + 5 + ··· + (2n − 1) = n^2.
In each case, don't forget to evaluate the constant c using special cases of n.

9. It is well known that the harmonic series diverges. What about

1/(n+1) + 1/(n+2) + 1/(n+3) + ··· + 1/(2n)?

Does this go to infinity in n?
If not, find the limit.

10. Little Tommy wants to collect toy cars from boxes of cereal. Tommy knows that each box contains a car, and that there are n different types of cars available to be collected. Tommy is good at probability, and assumes that each type of car is equally likely in any box, and that the toy cars are inserted independently into each box. The family buys one box of cereal each week. Prove that the expected number of weeks until Tommy has all n types of cars is asymptotically n ln n + γ n + 1/2 for large n, where γ is Euler's constant.

11. An urn contains n balls, each individually numbered from 1 to n. Two random numbers N and M are obtained by drawing a ball from the urn two times with replacement.
(a) Let n = k p, where p is a prime number and k is any positive integer. Find an expression for the probability that the greatest common divisor of N and M is not divisible by p.
(b) Let n = k p q, where p and q are distinct prime numbers. Find an expression for the probability that the greatest common divisor of N and M is not divisible by p or q.
(c) As n → ∞, show that the probability that N and M are relatively prime is given by Π_{p prime} (1 − 1/p^2), where the product is over all primes.
(d) Take the reciprocal of this product and expand to show that for large n, the probability that N and M are relatively prime is approximately 6/π^2.

12. The following two formulas were useful in an application of Kummer acceleration. Prove that

Σ_{k=1}^∞ 1/[k (k+1) (k+2) ··· (k+n)] = 1/(n · n!),

and

Σ_{j=1}^∞ 1/j^2 = Σ_{j=1}^n 1/j^2 + n! Σ_{j=1}^∞ 1/[j^2 (j+1) ··· (j+n)].

13. Prove (8.50), namely that

Σ_{j=0}^n (n choose j)^r ∼ (2^{nr}/√r) [2/(πn)]^{(r−1)/2}

as n → ∞.

14. Alice and Bob decide to play the following game. A fair coin is tossed independently until it lands heads. Let N be the number of tosses required. If N is odd, then Alice pays Bob 2^N dollars. If N is even, Bob must pay Alice 2^N dollars. Let X be the total amount of money that Alice pays Bob in this game, which may be positive or negative. Is the game favourable to either player? The reader will recognise that this game is a thinly disguised version of the St Petersburg paradox. It is well known that random variables with infinite expectations do not obey a law of large numbers. However, a generalisation of the law of large numbers to the St Petersburg paradox has been given by Feller (1968, p. 252) for games with variable "entrance fees." Discuss the following.
(a) The game would be favourable to Bob if E(X) > 0, favourable to Alice if E(X) < 0, and fair if E(X) = 0. Formally,

E(X) = 2 · 2^{−1} − 2^2 · 2^{−2} + 2^3 · 2^{−3} − ··· = 1 − 1 + 1 − ···,

which is undefined. Therefore, the concept of fairness is not meaningful here.
(b) In practice, the number of tosses must be bounded. Any partial sum of the series 1 − 1 + 1 − 1 + ··· will be greater than or equal to zero, with many of them strictly positive. Therefore, under any finite termination rule after n tosses, E(X | N ≤ n) ≥ 0, with strict inequality when n is odd. Therefore, the game is in Bob's favour.
(c) Suppose we let v be the value of the game, as measured by some method yet to be determined. In the conditional model where the first toss is a head, the value is clearly 2. Conditionally on the first toss being a tail, the roles of Alice and Bob are reversed, with all the payoffs based on subsequent tosses being doubled. So the value of the game conditional on an initial tail is −2v. Thus

v = (1/2) · 2 + (1/2) · (−2v) = 1 − v,

which is solved by v = 1/2. Thus the game is slightly in Bob's favour by 50 cents!
(d) The game is fair in the classical sense.∗∗ Suppose this coin tossing game is repeated many times under independent and identical conditions After k repetitions, let Sk be the total amount of money that Alice has paid Bob over those trials where N was odd, and let Sk be the total amount of money that Bob has paid Alice over those trials where N was even Equivalently, Sk = max(X1 , 0) + max(X2 , 0) + · · · + max(Xk , 0) and Sk = max(−X1 , 0) + max(−X2 , 0) + · · · + max(−Xk , 0) , where X1 , , Xk are the payments (positive or negative) from Alice to Bob on each of the k trials The game is fair in the classical sense because, for all > 0, P Sk −1 > Sk → 0, as k → ∞ ∗∗ This is Feller’s terminology See Feller (1968, p 252) and Feller (1971, p 236) CHAPTER Glossary of symbols Calculus f (n) (x) The n-th derivative of f (x) p 25 ∂ n : f → f (n) Derivative operator acting on f p 37 Contour integral in complex plane along the vertical line through t0 on real axis p 239 t0 +i ∞ t0 −i ∞ C C f (z) dz f (z) dz f (z) dz Contour integral in complex plane along the directed closed contour C Contour integral in complex plane around the closed contour C taken in counterclockwise sense p 229 p 264 Special constants, functions and sets π 3.14159 p e 2.71828 p γ Euler’s constant 0.57721 p 294 ln(x) Natural logarithm of x √ Usually −1 and not an integer p (z) Real part of complex number z p 60 (z) p 229 E1 (y) Imaginary part of complex number z Exponential integral Γ(x) Gamma function at x p 32 i 321 p 46 p 73 322 GLOSSARY OF SYMBOLS normal distribution p 43 Standard function Standard normal density function The difference in the natural logarithms of Γ(x) and Stirling’s approximation Φ(x) φ(x) ζ(x) p 43 p 72 Z+ × Z+ The lattice of nonnegative (i.e., zero and positive) integers in the plane p 93 #(B) The number of elements in the (finite) set B p 93 The number of subsets of j items from a set of n “n choose j.” p n j Bn The number of ways to order a set of n items “n factorial.” The 
$B_n$: The $n$-th Bernoulli number (Abramowitz & Stegun), p. 73
$B_n^*$: The $n$-th Bernoulli number (Whittaker & Watson), p. 73
$\det(H)$: Determinant of the matrix $H$, p. 218
$n!$: $n$ factorial, p. 73

Order symbols

$f(n) = O(n^k)$: The function $f(n)$ is of order $n^k$ as $n \to \infty$, p. 12
$f(n) = o(n^k)$: The function $f(n)$ is of smaller order than $n^k$ as $n \to \infty$, p. 13
$f(n) = O_p(n^k)$: The function $f(n)$ is of order $n^k$ in probability as $n \to \infty$, p. 13
$f(n) = o_p(n^k)$: The function $f(n)$ is of smaller order than $n^k$ in probability as $n \to \infty$, p. 13
$f(n) \sim g(n)$: The functions $f(n)$ and $g(n)$ are asymptotically equivalent as $n \to \infty$, p. 11

Series and sequences

$\limsup x_n$: The limit superior of the sequence $x_n$
$\liminf x_n$: The limit inferior of the sequence $x_n$
$f(x) \sim \sum_n a_n x^{-n}$: $f(x)$ has asymptotic expansion $\sum_n a_n x^{-n}$ as $x \to \infty$, p. 48
$f(x) \sim h(x) \sum_n a_n x^{-n}$: $f(x)/h(x)$ has asymptotic expansion $\sum_n a_n x^{-n}$ as $x \to \infty$, p. 48
$\sum_n a_n \ ❀\ s$: The sum $\sum_n a_n$ envelops $s$, p. 40

Probability distributions and statistics

$E(T)$ or $E\,T$: Expectation of the random variable $T$, p. 15
$E_\theta(T)$ or $E_F(T)$: Expectation of $T$ assuming parameter $\theta$ or distribution $F$, p. 126
$\operatorname{Var}(T)$: Variance of the random variable $T$, p. 15
$\operatorname{Var}_\theta(T)$ or $\operatorname{Var}_F(T)$: Variance of $T$ assuming parameter $\theta$ or distribution $F$
$X \stackrel{d}{=} Y$: The random variables $X$ and $Y$ are identically distributed
$X_n \stackrel{d}{\Longrightarrow} F$: The random variables $X_n$ converge in distribution to $F$
$X_n \stackrel{P}{\to} c$: The random variables $X_n$ converge in probability to $c$
$X_n \stackrel{a.s.}{\to} c$: The random variables $X_n$ converge almost surely to $c$
$\mu_j$: The $j$-th moment of a random variable, $E(X^j)$, p. 34
$\kappa_j$: The $j$-th cumulant of a random variable, p. 34
$B(n, \theta)$: Binomial distribution for $n$ independent trials with probability weight $\theta$ for each trial
$C(\theta, a)$: Cauchy distribution centred at $\theta$ with scale parameter $a$, p. 225
$E(\lambda)$: Exponential distribution with mean $\lambda^{-1}$, p. 64
$\chi^2(k)$: Chi-square distribution with $k$ degrees of freedom, p. 32
$\chi^2(k, \nu)$: Non-central chi-square distribution with $k$ degrees of freedom and non-centrality parameter $\nu$, p. 32
$G(\alpha, \lambda)$: Gamma distribution with shape $\alpha$, scale $\lambda$ and mean $\alpha\lambda^{-1}$, p. 70
$N(\mu, \sigma^2)$: Normal distribution with mean $\mu$ and variance $\sigma^2$
$P(\mu)$: Poisson distribution with mean $\mu$, p. 20
$U(a, b)$: Continuous uniform distribution on $[a, b]$, p. 67
$M(t)$: Moment generating function $M(t) = E(e^{tX})$, p. 19
$A(t)$: Probability generating function $A(t) = E(t^X)$, p. 20
$K(t)$: Cumulant generating function $K(t) = \ln M(t)$, p. 34
$\chi(t)$: Characteristic function $\chi(t) = E(e^{itX})$, p. 57

Likelihood statistics

$L_n(\theta)$: Likelihood function of a parameter $\theta$ for sample size $n$, p. 143
$\ell_n(\theta)$: Log-likelihood function $\ell_n(\theta) = \ln L_n(\theta)$, p. 143
$\hat\theta_n$: Maximum likelihood estimator $\hat\theta_n = \arg\max_\theta L_n(\theta)$, p. 143
$u_n(\theta)$: Score function $u_n(\theta) = \ell_n'(\theta)$, p. 153
$i_n(\theta)$: Observed information function $i_n(\theta) = -\ell_n''(\theta)$, p. 153
$i_n(\hat\theta_n)$: Observed information, p. 153
$I(\theta)$: Expected information function $I(\theta) = n^{-1} E_\theta\, i_n(\theta)$, p. 153

CHAPTER 10

Useful limits, series and products

Limits

Unless otherwise specified, limits are taken as $x \to \infty$ or $n \to \infty$.

$\dfrac{(\ln x)^k}{x} \to 0$
$\dfrac{x^k}{e^x} \to 0$
$(1 + n^{-1})^n \to e$
$\dfrac{\sin x}{x} \to 1$ as $x \to 0$
$n^{1/n} \to 1$
$\sqrt[n]{n^n/n!} \to e$

Series

$1 + 2 + 3 + \cdots + n = \dfrac{n(n+1)}{2}$
$a + (a+d) + (a+2d) + \cdots + [a + (n-1)d] = na + \dfrac{n(n-1)d}{2}$
$1 + 3 + 5 + \cdots + (2n-1) = n^2$
$2 + 4 + 6 + \cdots + 2n = n(n+1)$
$1^2 + 2^2 + 3^2 + \cdots + n^2 = \dfrac{n(n+1)(2n+1)}{6}$
$1^3 + 2^3 + 3^3 + \cdots + n^3 = (1 + 2 + 3 + \cdots + n)^2$
$1^2 + 3^2 + 5^2 + \cdots + (2n-1)^2 = \dfrac{n(4n^2-1)}{3}$
$a + ar + ar^2 + ar^3 + \cdots + ar^{n-1} = \dfrac{a(1-r^n)}{1-r}$
$\binom{n}{0} + \binom{n}{1} + \cdots + \binom{n}{n} = 2^n$
$\binom{n}{0} - \binom{n}{1} + \binom{n}{2} - \cdots + (-1)^n \binom{n}{n} = 0$
$\binom{n}{0}^2 + \binom{n}{1}^2 + \cdots + \binom{n}{n}^2 = \binom{2n}{n}$
$\binom{n}{1} + 2\binom{n}{2} + 3\binom{n}{3} + \cdots + n\binom{n}{n} = n\,2^{n-1}$
$\dfrac{1}{1 \times 2} + \dfrac{1}{2 \times 3} + \dfrac{1}{3 \times 4} + \cdots + \dfrac{1}{(n-1) \times n} = 1 - \dfrac{1}{n}$

Products

$\left(1 - \dfrac{1}{2^2}\right)\left(1 - \dfrac{1}{3^2}\right)\left(1 - \dfrac{1}{4^2}\right) \cdots \left(1 - \dfrac{1}{n^2}\right) = \dfrac{n+1}{2n}$
$\left(1 + \dfrac{1}{1 \times 3}\right)\left(1 + \dfrac{1}{2 \times 4}\right)\left(1 + \dfrac{1}{3 \times 5}\right) \cdots \left(1 + \dfrac{1}{n \times (n+2)}\right) = \dfrac{2(n+1)}{n+2}$
$\left(1 - \dfrac{1}{2}\right)\left(1 + \dfrac{1}{3}\right)\left(1 - \dfrac{1}{4}\right) \cdots \left(1 + \dfrac{(-1)^{n+1}}{n}\right) = \dfrac{2n + 1 + (-1)^{n+1}}{4n}$
$\sin\dfrac{x}{k}\,\sin\dfrac{x+\pi}{k}\,\sin\dfrac{x+2\pi}{k} \cdots \sin\dfrac{x+(k-1)\pi}{k} = 2^{1-k}\sin x$
$\sin\dfrac{x+\pi/2}{k}\,\sin\dfrac{x+3\pi/2}{k}\,\sin\dfrac{x+5\pi/2}{k} \cdots \sin\dfrac{x+(2k-1)\pi/2}{k} = 2^{1-k}\cos x$
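All of the identities above are exact, so each can be spot-checked numerically. The following Python sketch (not from the book; the variable names are illustrative) verifies the sum-of-cubes identity, the telescoping sum and product, and the first sine-product identity:

```python
import math

n = 50

# 1^3 + 2^3 + ... + n^3 = (1 + 2 + ... + n)^2  (exact integer identity)
assert sum(j**3 for j in range(1, n + 1)) == sum(range(1, n + 1))**2

# 1/(1*2) + 1/(2*3) + ... + 1/((n-1)*n) = 1 - 1/n  (telescoping sum)
s = sum(1.0 / (j * (j + 1)) for j in range(1, n))
assert abs(s - (1 - 1.0 / n)) < 1e-12

# (1 - 1/2^2)(1 - 1/3^2)...(1 - 1/n^2) = (n + 1)/(2n)  (telescoping product)
p = 1.0
for j in range(2, n + 1):
    p *= 1 - 1.0 / j**2
assert abs(p - (n + 1) / (2 * n)) < 1e-12

# sin(x/k) sin((x+pi)/k) ... sin((x+(k-1)pi)/k) = 2^(1-k) sin(x)
x, k = 0.7, 5
q = 1.0
for j in range(k):
    q *= math.sin((x + j * math.pi) / k)
assert abs(q - 2**(1 - k) * math.sin(x)) < 1e-12
```

Checks of this kind are a useful guard against transcription errors when copying closed-form sums and products from a table.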

Date posted: 25/11/2016, 13:36


Table of contents

  • Cover & Table of Contents - Expansions and Asymptotics for Statistics.pdf (p.1-15)

    • Expansions and Asymptotics for Statistics

      • Front Cover

      • Contents

      • Preface

    • 1. Introduction

      • 1.1 Expansions and approximations

      • 1.2 The role of asymptotics

      • 1.3 Mathematical preliminaries

      • 1.4 Two complementary approaches

      • 1.5 Problems

    • 2. General series methods

      • 2.1 A quick overview

      • 2.2 Power series

      • 2.3 Enveloping series

      • 2.4 Asymptotic series

      • 2.5 Superasymptotic and hyperasymptotic series

      • 2.6 Asymptotic series for large samples

      • 2.7 Generalised asymptotic expansions

      • 2.8 Notes

      • 2.9 Problems

    • 3. Padé approximants and continued fractions

      • 3.1 The Padé table

      • 3.2 Padé approximations for the exponential function

      • 3.3 Two applications

      • 3.4 Continued fraction expansions

      • 3.5 A continued fraction for the normal distribution

      • 3.6 Approximating transforms and other integrals

      • 3.7 Multivariate extensions

      • 3.8 Notes

      • 3.9 Problems

    • 4. The delta method and its extensions

      • 4.1 Introduction to the delta method

      • 4.2 Preliminary results

      • 4.3 The delta method for moments

      • 4.4 Using the delta method in Maple

      • 4.5 Asymptotic bias

      • 4.6 Variance stabilising transformations

      • 4.7 Normalising transformations

      • 4.8 Parameter transformations

      • 4.9 Functions of several variables

      • 4.10 Ratios of averages

      • 4.11 The delta method for distributions

      • 4.12 The von Mises calculus

      • 4.13 Obstacles and opportunities: robustness

      • 4.14 Problems

    • 5. Optimality and likelihood asymptotics

      • 5.1 Historical overview

      • 5.2 The organisation of this chapter

      • 5.3 The likelihood function and its properties

      • 5.4 Consistency of maximum likelihood

      • 5.5 Asymptotic normality of maximum likelihood

      • 5.6 Asymptotic comparison of estimators

      • 5.7 Local asymptotics

      • 5.8 Local asymptotic normality

      • 5.9 Local asymptotic minimaxity

      • 5.10 Various extensions

      • 5.11 Problems

    • 6. The Laplace approximation and series

      • 6.1 A simple example

      • 6.2 The basic approximation

      • 6.3 The Stirling series for factorials

      • 6.4 Laplace expansions in Maple

      • 6.5 Asymptotic bias of the median

      • 6.6 Recurrence properties of random walks

      • 6.7 Proofs of the main propositions

      • 6.8 Integrals with the maximum on the boundary

      • 6.9 Integrals of higher dimension

      • 6.10 Integrals with product integrands

      • 6.11 Applications to statistical inference

      • 6.12 Estimating location parameters

      • 6.13 Asymptotic analysis of Bayes estimators

      • 6.14 Notes

      • 6.15 Problems

    • 7. The saddle-point method

      • 7.1 The principle of stationary phase

      • 7.2 Perron's saddle-point method

      • 7.3 Harmonic functions and saddle-point geometry

      • 7.4 Daniels' saddle-point approximation

      • 7.5 Towards the Barndorff-Nielsen formula

      • 7.6 Saddle-point method for distribution functions

      • 7.7 Saddle-point method for discrete variables

      • 7.8 Ratios of sums of random variables

      • 7.9 Distributions of M-estimators

      • 7.10 The Edgeworth expansion

      • 7.11 Mean, median and mode

      • 7.12 Hayman's saddle-point approximation

      • 7.13 The method of Darboux

      • 7.14 Applications to common distributions

      • 7.15 Problems

    • 8. Summation of series

      • 8.1 Advanced tests for series convergence

      • 8.2 Convergence of random series

      • 8.3 Applications in probability and statistics

      • 8.4 Euler-Maclaurin sum formula

      • 8.5 Applications of the Euler-Maclaurin formula

      • 8.6 Accelerating series convergence

      • 8.7 Applications of acceleration methods

      • 8.8 Comparing acceleration techniques

      • 8.9 Divergent series

      • 8.10 Problems

    • 9. Glossary of symbols

    • 10. Useful limits, series and products

    • References

  • References - Expansions and Asymptotics for Statistics.pdf (p.16-19)

  • Chapter 1 Introduction - Expansions and Asymptotics for Statistics.pdf (p.20-40)

  • Chapter 2 General Series Methods.pdf (p.41-92)

  • Chapter 3 Pade Approximants And Continued Fractions.pdf (p.93-115)

  • Chapter 4 The Delta Method And Its Extensions.pdf (p.116-158)

  • Chapter 5 Optimality And Likelihood Asymptotics.pdf (p.159-207)

    • 5. Optimality and likelihood asymptotics

      • 5.1 Historical overview

      • 5.2 The organisation of this chapter

      • 5.3 The likelihood function and its properties

      • 5.4 Consistency of maximum likelihood

      • 5.5 Asymptotic normality of maximum likelihood

      • 5.6 Asymptotic comparison of estimators

      • 5.7 Local asymptotics

      • 5.8 Local asymptotic normality

      • 5.9 Local asymptotic minimaxity

      • 5.10 Various extensions

      • 5.11 Problems

    • Expansions and Asymptotics for Statistics

      • Front Cover

      • Contents

      • Preface

    • 1. Introduction

      • 1.1 Expansions and approximations

      • 1.2 The role of asymptotics

      • 1.3 Mathematical preliminaries

      • 1.4 Two complementary approaches

      • 1.5 Problems

    • 2. General series methods

      • 2.1 A Quick overview

      • 2.2 Power series

      • 2.3 Enveloping series

      • 2.4 Asymptotic series

      • 2.5 Superasymptotic and hyperasymptotic series

      • 2.6 Asymptotic series for large samples

      • 2.7 Generalised asymptotic expansions

      • 2.8 Notes

      • 2.9 Problems

    • 3. Padè approximants and continued fractions

      • 3.1 The Padè table

      • 3.2 Padè approximations for the exponential function

      • 3.3 Two Applications

      • 3.4 Continued fraction expansions

      • 3.5 A continued fraction for the normal distribution

      • 3.6 Approximating transforms and other integrals

      • 3.7 Multivariate extensions

      • 3.8 Notes

      • 3.9 Problems

    • 4. The delta method and its extensions

      • 4.1 Introduction to the delta method

      • 4.2 Preliminary results

      • 4.3 The Delta method for moments

      • 4.4 Using the delta method in Maple

      • 4.5 Asymptotic bias

      • 4.6 Variance stabilising transformations

      • 4.7 Normalising transformations

      • 4.8 Parameter transformations

      • 4.9 Functions of several variables

      • 4.10 Ratios of averages

      • 4.11 The delta method for distributions

      • 4.12 The von Mises calculus

      • 4.13 Obstacles and opportunities: robustness

      • 4.14 Problems

    • 5. Optimality and likelihood asymptotics

      • 5.1 Historical overview

      • 5.2 The organisation of this chapter

      • 5.3 The likelihood function and its properties

      • 5.4 Consistency of maximum likelihood

      • 5.5 Asymptotic normality of maximum likelihood

      • 5.6 Asymptotic comparison of estimators

      • 5.7 Local asymptotics

      • 5.8 Local asymptotic normality

      • 5.9 Local asymptotic minimaxity

      • 5.10 Various extensions

      • 5.11 Problems

    • 6. The Laplace approximation and series

      • 6.1 A simple example

      • 6.2 The basic approximation

      • 6.3 The Stirling series for factorials

      • 6.4 Laplace expansions in Maple

      • 6.5 Asymptotic bias of the median

      • 6.6 Recurrence properties of random walks

      • 6.7 Proofs of the main propositions

      • 6.8 Integrals with the maximum on the boundary

      • 6.9 Integrals of higher dimension

      • 6.10 Integrals with product integrands

      • 6.11 Applications to statistical inference

      • 6.12 Estimating location parameters

      • 6.13 Asymptotic analysis of Bayes estimators

      • 6.14 Notes

      • 6.15 Problems

    • 7. The saddle-point method

      • 7.1 The principle of stationary phase

      • 7.2 Perron's saddle-point method

      • 7.3 Harmonic functions and saddle-point geometry

      • 7.4 Daniels' saddle-point approximation

      • 7.5 Towards the Barndorff-Nielsen formula

      • 7.6 Saddle-point method for distribution functions

      • 7.7 Saddle-point method for discrete variables

      • 7.8 Ratios of sums of random variables

      • 7.9 Distributions of M-estimators

      • 7.10 The Edgeworth expansion

      • 7.11 Mean, median and mode

      • 7.12 Hayman's saddle-point approximation

      • 7.13 The method of Darboux

      • 7.14 Applications to common distributions

      • 7.15 Problems

    • 8. Summation of series

      • 8.1 Advanced tests for series convergence

      • 8.2 Convergence of random series

      • 8.3 Applications in probability and statistics

      • 8.4 Euler-Maclaurin sum formula

      • 8.5 Applications of the Euler-Maclaurin formula

      • 8.6 Accelerating series convergence

      • 8.7 Applications of acceleration methods

      • 8.8 Comparing acceleration techniques

      • 8.9 Divergent series

      • 8.10 Problems

    • 9. Glossary of symbols

    • 10. Useful limits, series and products

    • References

  • Chapter 6 The Laplace Approximation And Series.pdf (p.208-240)

    • 6. The Laplace approximation and series

      • 6.1 A simple example

      • 6.2 The basic approximation

      • 6.3 The Stirling series for factorials

      • 6.4 Laplace expansions in Maple

      • 6.5 Asymptotic bias of the median

      • 6.6 Recurrence properties of random walks

      • 6.7 Proofs of the main propositions

      • 6.8 Integrals with the maximum on the boundary

      • 6.9 Integrals of higher dimension

      • 6.10 Integrals with product integrands

      • 6.11 Applications to statistical inference

      • 6.12 Estimating location parameters

      • 6.13 Asymptotic analysis of Bayes estimators

      • 6.14 Notes

      • 6.15 Problems

  • Chapter 7 The Saddle-Point Method.pdf (p.241-291)

    • 7. The saddle-point method

      • 7.1 The principle of stationary phase

      • 7.2 Perron's saddle-point method

      • 7.3 Harmonic functions and saddle-point geometry

      • 7.4 Daniels' saddle-point approximation

      • 7.5 Towards the Barndorff-Nielsen formula

      • 7.6 Saddle-point method for distribution functions

      • 7.7 Saddle-point method for discrete variables

      • 7.8 Ratios of sums of random variables

      • 7.9 Distributions of M-estimators

      • 7.10 The Edgeworth expansion

      • 7.11 Mean, median and mode

      • 7.12 Hayman's saddle-point approximation

      • 7.13 The method of Darboux

      • 7.14 Applications to common distributions

      • 7.15 Problems

  • Chapter 8 Summation Of Series.pdf (p.292-333)

    • 8. Summation of series

      • 8.1 Advanced tests for series convergence

      • 8.2 Convergence of random series

      • 8.3 Applications in probability and statistics

      • 8.4 Euler-Maclaurin sum formula

      • 8.5 Applications of the Euler-Maclaurin formula

      • 8.6 Accelerating series convergence

      • 8.7 Applications of acceleration methods

      • 8.8 Comparing acceleration techniques

      • 8.9 Divergent series

      • 8.10 Problems

  • Chapter 9 Glossary Of Symbols.pdf (p.334-337)

    • 9. Glossary of symbols

  • Chapter 10 Useful Limits, Series And Products.pdf (p.338-339)

    • 10. Useful limits, series and products
