Open Access. Powered by Scholars. Published by Universities.®

Economics

Yale University

Edgeworth expansion

Articles 1 - 11 of 11

Full-Text Articles in Social and Behavioral Sciences

Smoothing Local-To-Moderate Unit Root Theory, Peter C.B. Phillips, Tassos Magdalinos, Liudas Giraitis May 2008


Cowles Foundation Discussion Papers

A limit theory is established for autoregressive time series that smooths the transition between local and moderate deviations from unity and provides a transitional form that links conventional unit root distributions and the standard normal. Edgeworth expansions of the limit theory are given. These expansions show that the limit theory holds up to the second order for values of the autoregressive coefficient that are closer to stationarity than local (i.e., deviations of the form ρ = 1 + c/n, where n is the sample size and c < 0). Similar expansions around the limiting Cauchy density are provided for the mildly explosive case.
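The local-to-unity setup in the abstract can be made concrete with a short simulation. This is an illustrative sketch, not the paper's code: it assumes an AR(1) with coefficient ρ = 1 + c/n, c < 0, and Gaussian errors, and estimates ρ by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(n, c):
    """Simulate y_t = rho * y_{t-1} + e_t with rho = 1 + c/n (local to unity)."""
    rho = 1.0 + c / n              # drifts toward unity as n grows
    y = np.zeros(n)
    eps = rng.standard_normal(n)
    for t in range(1, n):
        y[t] = rho * y[t - 1] + eps[t]
    return y, rho

y, rho = simulate_ar1(n=500, c=-5.0)
# Least-squares estimate of the autoregressive coefficient
rho_hat = (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])
```

With n = 500 and c = -5 the true coefficient is ρ = 0.99, close to the unit root yet on the stationary side, which is exactly the region the transitional limit theory targets.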


Optimal Bandwidth Choice For Interval Estimation In Gmm Regression, Yixiao Sun, Peter C.B. Phillips May 2008


Cowles Foundation Discussion Papers

In time series regression with nonparametrically autocorrelated errors, it is now standard empirical practice to construct confidence intervals for regression coefficients on the basis of nonparametrically studentized t-statistics. The standard error used in the studentization is typically estimated by a kernel method that involves some smoothing process over the sample autocovariances. The underlying parameter (M) that controls this tuning process is a bandwidth or truncation lag, and it plays a key role in the finite sample properties of tests and the actual coverage properties of the associated confidence intervals. The present paper develops a bandwidth choice rule …
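The role of the bandwidth M can be seen in a minimal kernel estimator of the long-run variance used for studentization. This sketch assumes the Bartlett kernel (one common choice; the paper's setting is more general): M controls how many sample autocovariances enter and how they are downweighted.

```python
import numpy as np

def bartlett_lrv(u, M):
    """Long-run variance of u via Bartlett-weighted sample autocovariances."""
    u = np.asarray(u, dtype=float) - np.mean(u)
    n = len(u)
    lrv = u @ u / n                          # lag-0 autocovariance
    for j in range(1, M + 1):
        w = 1.0 - j / (M + 1)                # Bartlett weight, tapers to 0 at lag M+1
        gamma_j = u[j:] @ u[:-j] / n         # lag-j sample autocovariance
        lrv += 2.0 * w * gamma_j
    return lrv

rng = np.random.default_rng(1)
e = rng.standard_normal(1000)
u = np.convolve(e, [1.0, 0.5])[:1000]        # MA(1) errors: autocorrelated
lrv = bartlett_lrv(u, M=10)                  # M is the tuning parameter
```

Setting M = 0 recovers the naive variance that ignores autocorrelation; larger M picks up more of the serial dependence at the cost of extra variance in the estimate, which is the trade-off behind the bandwidth choice rule.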


Smoothed Empirical Likelihood Methods For Quantile Regression Models, Yoon-Jae Whang Mar 2004


Cowles Foundation Discussion Papers

This paper considers an empirical likelihood method to estimate the parameters of quantile regression (QR) models and to construct confidence regions that are accurate in finite samples. To achieve the higher-order refinements, we smooth the estimating equations for the empirical likelihood. We show that the smoothed empirical likelihood (SEL) estimator is first-order asymptotically equivalent to the standard QR estimator and establish that confidence regions based on the smoothed empirical likelihood ratio have coverage errors of order n⁻¹ and may be Bartlett-corrected to produce regions with an error of order n⁻², where n denotes the sample size. …
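The smoothing step can be sketched in isolation. The quantile estimating equation involves the indicator 1{u ≤ 0}, which is not differentiable; smoothing replaces it with a smooth CDF evaluated at -u/h for a bandwidth h. This toy example (my own, using a logistic CDF as the smoother, which is an assumption) shows the smoothed version approaching the indicator as h shrinks:

```python
import math

def indicator(u):
    """Non-smooth term in the QR estimating equation: 1{u <= 0}."""
    return 1.0 if u <= 0 else 0.0

def smooth_indicator(u, h):
    """Smoothed replacement: logistic CDF evaluated at -u/h."""
    return 1.0 / (1.0 + math.exp(u / h))

# As the bandwidth h -> 0, the smoothed term converges to the indicator.
vals = [smooth_indicator(0.5, h) for h in (1.0, 0.1, 0.01)]
```

The smoothed function is differentiable in u, which is what makes the higher-order (Edgeworth-type) analysis and the Bartlett correction tractable.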


Higher-Order Improvements Of The Parametric Bootstrap For Long-Memory Gaussian Processes, Donald W.K. Andrews, Offer Lieberman Aug 2002


Cowles Foundation Discussion Papers

This paper determines coverage probability errors of both delta method and parametric bootstrap confidence intervals (CIs) for the covariance parameters of stationary long-memory Gaussian time series. CIs for the long-memory parameter d₀ are included. The results establish that the bootstrap provides higher-order improvements over the delta method. Analogous results are given for tests. The CIs and tests are based on one or other of two approximate maximum likelihood estimators. The first estimator solves the first-order conditions with respect to the covariance parameters of a “plug-in” log-likelihood function that has the unknown mean replaced by the sample mean. The second …


The Block-Block Bootstrap: Improved Asymptotic Refinements, Donald W.K. Andrews May 2002


Cowles Foundation Discussion Papers

The asymptotic refinements attributable to the block bootstrap for time series are not as large as those of the nonparametric iid bootstrap or the parametric bootstrap. One reason is that the independence between the blocks in the block bootstrap sample does not mimic the dependence structure of the original sample. This is the join-point problem. In this paper, we propose a method of solving this problem. The idea is not to alter the block bootstrap. Instead, we alter the original sample statistics to which the block bootstrap is applied. We introduce block statistics that possess join-point features that are similar …
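The join-point problem is easy to see in a plain moving-blocks bootstrap. This sketch (illustrative only; the paper's block-block proposal modifies the statistics, not this resampling step) resamples fixed-length blocks and glues them together, so the dependence across block boundaries in the original series is lost:

```python
import numpy as np

def block_bootstrap_mean(x, block_len, n_boot, rng):
    """Moving-blocks bootstrap distribution of the sample mean of x."""
    n = len(x)
    n_blocks = n // block_len
    # Independent random block start points: the glued sample is independent
    # across the join points between consecutive blocks.
    starts = rng.integers(0, n - block_len + 1, size=(n_boot, n_blocks))
    means = np.empty(n_boot)
    for b in range(n_boot):
        sample = np.concatenate([x[s:s + block_len] for s in starts[b]])
        means[b] = sample.mean()
    return means

rng = np.random.default_rng(2)
# A serially dependent series (noisy slow-moving level), for illustration
x = np.cumsum(rng.standard_normal(200)) * 0.1 + rng.standard_normal(200)
boot_means = block_bootstrap_mean(x, block_len=10, n_boot=500, rng=rng)
```

Each resampled series is dependent within blocks but independent across the join points, which is precisely the mismatch with the original sample's dependence structure that limits the refinement.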


Valid Edgeworth Expansions For The Whittle Maximum Likelihood Estimator For Stationary Long-Memory Gaussian Time Series, Donald W.K. Andrews, Offer Lieberman Apr 2002


Cowles Foundation Discussion Papers

In this paper, we prove the validity of an Edgeworth expansion to the distribution of the Whittle maximum likelihood estimator for stationary long-memory Gaussian models with unknown parameter θ. The error of the (s-2)-order expansion is shown to be o(n^{-(s-2)/2}), the usual iid rate, for a wide range of models, including the popular ARFIMA(p,d,q) models. The expansion is valid under mild assumptions on the behavior of the spectral density and its derivatives in the neighborhood of the origin. As a by-product, we generalize a theorem by Fox and Taqqu (1987) …
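A minimal version of the Whittle objective makes the estimator concrete. This sketch (my own illustration, not the paper's implementation) uses the ARFIMA(0, d, 0) spectral density f(λ; d) = (2 sin(λ/2))^(-2d) / (2π) and matches it to the periodogram at the Fourier frequencies:

```python
import numpy as np

def whittle_objective(x, d):
    """Whittle criterion for an ARFIMA(0, d, 0) spectral density."""
    n = len(x)
    fft = np.fft.fft(x - x.mean())
    lam = 2.0 * np.pi * np.arange(1, n // 2) / n            # Fourier freqs, excl. 0
    I = (np.abs(fft[1:n // 2]) ** 2) / (2.0 * np.pi * n)    # periodogram
    f = (2.0 * np.sin(lam / 2.0)) ** (-2.0 * d) / (2.0 * np.pi)
    return np.mean(np.log(f) + I / f)

rng = np.random.default_rng(3)
x = rng.standard_normal(512)        # white noise: true long-memory parameter d = 0
# Crude grid search: the minimizer should sit near d = 0 for white noise.
grid = np.linspace(-0.4, 0.4, 81)
d_hat = grid[np.argmin([whittle_objective(x, d) for d in grid])]
```

The Whittle MLE minimizes this criterion over the parameter; the paper's Edgeworth expansion refines the normal approximation to the distribution of that minimizer.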


Higher-Order Improvements Of The Parametric Bootstrap For Markov Processes, Donald W.K. Andrews Oct 2001


Cowles Foundation Discussion Papers

This paper provides bounds on the errors in coverage probabilities of maximum likelihood-based, percentile-t, parametric bootstrap confidence intervals for Markov time series processes. These bounds show that the parametric bootstrap for Markov time series provides higher-order improvements (over confidence intervals based on first-order asymptotics) that are comparable to those obtained by the parametric and nonparametric bootstrap for iid data and are better than those obtained by the block bootstrap for time series. Additional results are given for Wald-based confidence regions. The paper also shows that k-step parametric bootstrap confidence intervals achieve the same higher-order improvements as …
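A percentile-t parametric bootstrap can be sketched for the simplest Markov process, a Gaussian AR(1). This is a toy illustration under assumed model and error choices, not the paper's construction: fit the model, resimulate from the fit, bootstrap the studentized statistic, and invert its quantiles into an interval.

```python
import numpy as np

rng = np.random.default_rng(4)

def draw_ar1(n, mu, rho, rng):
    """Simulate a Gaussian AR(1) (a Markov process) with mean mu."""
    y = np.empty(n)
    y[0] = mu + rng.standard_normal()
    for t in range(1, n):
        y[t] = mu + rho * (y[t - 1] - mu) + rng.standard_normal()
    return y

def studentized(y, mu0):
    """Naive t-statistic for the mean (same formula inside the bootstrap)."""
    return np.sqrt(len(y)) * (y.mean() - mu0) / y.std(ddof=1)

y = draw_ar1(200, mu=1.0, rho=0.5, rng=rng)
mu_hat = y.mean()
d = y - mu_hat
rho_hat = (d[:-1] @ d[1:]) / (d[:-1] @ d[:-1])      # fitted AR coefficient

# Bootstrap the t-statistic under the fitted parametric (Markov) model.
t_star = np.array([studentized(draw_ar1(200, mu_hat, rho_hat, rng), mu_hat)
                   for _ in range(500)])
lo_q, hi_q = np.quantile(t_star, [0.025, 0.975])
s = y.std(ddof=1) / np.sqrt(len(y))
ci = (mu_hat - hi_q * s, mu_hat - lo_q * s)          # percentile-t interval
```

Because the bootstrap quantiles come from the fitted dependent model rather than the normal table, the interval adapts to the serial dependence, which is the mechanism behind the higher-order coverage improvements.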


Second Order Expansions For The Distribution Of The Maximum Likelihood Estimator Of The Fractional Difference Parameter, Offer Lieberman, Peter C.B. Phillips Jul 2001


Cowles Foundation Discussion Papers

The maximum likelihood estimator (MLE) of the fractional difference parameter in the Gaussian ARFIMA(0, d, 0) model is well known to be asymptotically N(0, 6/π²). This paper develops a second order asymptotic expansion to the distribution of this statistic. The correction term for the density is shown to be independent of d, so that the MLE is second order pivotal for d. This feature of the MLE is unusual, at least in time series contexts. Simulations show that the normal approximation is poor and that the expansions make significant improvements in accuracy.
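As a quick numerical aside (my own arithmetic, not from the paper): the stated N(0, 6/π²) limit for √n(d̂ - d) implies an approximate standard error of √(6/π²/n) for the MLE of d, which is easy to evaluate:

```python
import math

# Asymptotic variance of sqrt(n) * (d_hat - d) from the stated limit
avar = 6.0 / math.pi ** 2          # approximately 0.608

def approx_se(n):
    """First-order approximate standard error of the MLE of d at sample size n."""
    return math.sqrt(avar / n)
```

The paper's point is precisely that this first-order normal approximation is poor in finite samples, and that the second order expansion corrects it.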


Equivalence Of The Higher-Order Asymptotic Efficiency Of K-Step And Extremum Statistics, Donald W.K. Andrews Jul 2000


Cowles Foundation Discussion Papers

It is well known that a one-step scoring estimator that starts from any N^{1/2}-consistent estimator has the same first-order asymptotic efficiency as the maximum likelihood estimator. This paper extends this result to k-step estimators and test statistics for k > 1, higher-order asymptotic efficiency, and general extremum estimators and test statistics. The paper shows that a k-step estimator has the same higher-order asymptotic efficiency, to any given order, as the extremum estimator towards which it is stepping, provided (i) k is sufficiently large, (ii) some smoothness and moment conditions hold, and (iii) a condition on the …
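The k-step idea can be illustrated on a deliberately simple toy problem (my own example, not the paper's setting): minimize the Gaussian negative log-likelihood of the mean by Newton-Raphson steps. Because this objective is quadratic, a single step from any starting value already lands exactly on the extremum estimator (the sample mean), the extreme case of the equivalence the paper proves in general.

```python
def k_step(data, mu0, k):
    """k Newton-Raphson steps on the objective 0.5 * sum((x - mu)^2)."""
    n = len(data)
    mu = mu0
    for _ in range(k):
        grad = sum(mu - x for x in data)   # derivative of the objective in mu
        hess = n                           # second derivative (constant here)
        mu -= grad / hess                  # one Newton-Raphson step
    return mu

data = [1.0, 2.0, 3.0, 6.0]
mle = sum(data) / len(data)                # the extremum estimator: sample mean
one_step = k_step(data, mu0=0.0, k=1)      # coincides with mle for a quadratic
```

For non-quadratic extremum problems the match is not exact, and the paper's contribution is to quantify, order by order, how large k must be for the k-step estimator to share the extremum estimator's higher-order efficiency.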


Higher-Order Improvements Of A Computationally Attractive K-Step Bootstrap For Extremum Estimators, Donald W.K. Andrews Jul 1999


Cowles Foundation Discussion Papers

This paper establishes the higher-order equivalence of the k-step bootstrap, introduced recently by Davidson and MacKinnon (1999a), and the standard bootstrap. The k-step bootstrap is a very attractive alternative computationally to the standard bootstrap for statistics based on nonlinear extremum estimators, such as generalized method of moments and maximum likelihood estimators. The paper also extends results of Hall and Horowitz (1996) to provide new results regarding the higher-order improvements of the standard bootstrap and the k-step bootstrap for extremum estimators (compared to procedures based on first-order asymptotics). The results of the paper apply to Newton-Raphson (NR), default …

