
The Law of the Iterated Logarithm for the Error Distribution Estimator in First-Order Autoregressive Models

Wang, Bing; Jin, Yi; Wang, Lina; Shi, Xiaoping; Yang, Wenzhi

Abstract

This paper investigates the asymptotic behavior of kernel-based estimators for the error distribution in a first-order autoregressive model with dependent errors. The model assumes that the error terms form an α-mixing sequence with an unknown cumulative distribution function (CDF) and finite second moment. Because the true errors are unobservable, we construct kernel-smoothed estimators from the residuals obtained by least squares. Under mild assumptions on the kernel function, the bandwidth, and the mixing coefficients, we establish a logarithmic law of the iterated logarithm (LIL) for the supremum-norm distance between the residual-based kernel estimator and the true distribution function. The limiting bound is shown to be 1/2, matching the classical LIL for independent samples. To support the theoretical results, simulation studies compare the empirical and kernel distribution estimators across various sample sizes and error distributions. The kernel estimators exhibit smoother convergence behavior and improved finite-sample performance. These results strengthen the theoretical foundation for nonparametric inference in autoregressive models with dependent errors and highlight the advantages of kernel smoothing for distribution function estimation under dependence.
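The following is a minimal sketch of the construction the abstract describes: simulate an AR(1) series, estimate the autoregressive coefficient by least squares, and compare the residual-based kernel-smoothed CDF estimator with the empirical CDF of the residuals in supremum norm. The specific choices here (coefficient 0.5, standard normal errors, a Gaussian kernel, and bandwidth n^(-1/5)) are illustrative assumptions, not the settings used in the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Illustrative setup (assumed, not from the paper): AR(1) with N(0,1) errors
n, rho = 500, 0.5
eps = rng.standard_normal(n + 1)
y = np.zeros(n + 1)
for t in range(1, n + 1):
    y[t] = rho * y[t - 1] + eps[t]

# Least-squares estimate of the AR coefficient and the resulting residuals
y_lag, y_cur = y[:-1], y[1:]
rho_hat = np.sum(y_lag * y_cur) / np.sum(y_lag ** 2)
resid = y_cur - rho_hat * y_lag

# Kernel-smoothed CDF estimator: average of the integrated kernel evaluated at
# (x - resid_i) / h; a Gaussian kernel and bandwidth n^(-1/5) are assumed here
h = n ** (-1 / 5)
grid = np.linspace(-4, 4, 401)
F_kernel = norm.cdf((grid[:, None] - resid[None, :]) / h).mean(axis=1)

# Empirical CDF of the residuals for comparison
F_emp = (resid[None, :] <= grid[:, None]).mean(axis=1)

# Sup-norm distances from the true error CDF (standard normal in this example)
F_true = norm.cdf(grid)
print("sup|F_kernel - F|:", np.abs(F_kernel - F_true).max())
print("sup|F_emp    - F|:", np.abs(F_emp - F_true).max())
```

In this sketch the kernel estimator typically traces the true CDF more smoothly than the step-function empirical CDF, which is the finite-sample behavior the simulation studies in the paper are reported to examine.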
