UBC Faculty Research and Publications
The Law of the Iterated Logarithm for the Error Distribution Estimator in First-Order Autoregressive Models
Wang, Bing; Jin, Yi; Wang, Lina; Shi, Xiaoping; Yang, Wenzhi
Abstract
This paper investigates the asymptotic behavior of kernel-based estimators for the error distribution in a first-order autoregressive model with dependent errors. The model assumes that the error terms form an α-mixing sequence with an unknown cumulative distribution function (CDF) and finite second moment. Due to the unobservability of true errors, we construct kernel-smoothed estimators based on residuals obtained via least squares. Under mild assumptions on the kernel function, bandwidth selection, and mixing coefficients, we establish a logarithmic law of the iterated logarithm (LIL) for the supremum norm difference between the residual-based kernel estimator and the true distribution function. The limiting bound is shown to be 1/2, matching the classical LIL for independent samples. To support the theoretical results, simulation studies are conducted to compare the empirical and kernel distribution estimators under various sample sizes and error term distributions. The kernel estimators demonstrate smoother convergence behavior and improved finite-sample performance. These results contribute to the theoretical foundation for nonparametric inference in autoregressive models with dependent errors and highlight the advantages of kernel smoothing in distribution function estimation under dependence.
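A minimal simulation sketch of the construction described in the abstract: fit the AR(1) coefficient by least squares, form residuals, smooth their distribution with a kernel, and compare the kernel and empirical CDF estimators against the true error CDF. This is not the paper's implementation; it assumes i.i.d. standard normal errors rather than an α-mixing sequence, a Gaussian kernel, an illustrative bandwidth h = n^(-1/5), coefficient ρ = 0.5, and the classical √(n / (2 log log n)) LIL normalisation. All function names are hypothetical.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def simulate_ar1(n, rho=0.5, burn=200):
    """Simulate an AR(1) series X_t = rho * X_{t-1} + eps_t with N(0,1) errors (toy setup)."""
    eps = rng.standard_normal(n + burn)
    x = np.zeros(n + burn)
    for t in range(1, n + burn):
        x[t] = rho * x[t - 1] + eps[t]
    return x[burn:]

def ls_residuals(x):
    """Least-squares estimate of the AR(1) coefficient and the resulting residuals."""
    y, ylag = x[1:], x[:-1]
    rho_hat = np.sum(y * ylag) / np.sum(ylag ** 2)
    return rho_hat, y - rho_hat * ylag

def kernel_cdf(resid, grid, h):
    """Kernel-smoothed distribution estimator: average of Phi((x - e_t)/h) over residuals."""
    return norm.cdf((grid[:, None] - resid[None, :]) / h).mean(axis=1)

def empirical_cdf(resid, grid):
    """Ordinary empirical distribution function of the residuals."""
    return (resid[None, :] <= grid[:, None]).mean(axis=1)

n = 2000
x = simulate_ar1(n)
rho_hat, resid = ls_residuals(x)
h = len(resid) ** (-1 / 5)          # illustrative bandwidth, not the paper's choice
grid = np.linspace(-4, 4, 400)

F_true = norm.cdf(grid)             # errors were drawn N(0,1) in this toy setup
sup_kernel = np.max(np.abs(kernel_cdf(resid, grid, h) - F_true))
sup_edf = np.max(np.abs(empirical_cdf(resid, grid) - F_true))

# Classical LIL-type normalisation sqrt(n / (2 log log n)); the abstract's limiting bound is 1/2
scale = np.sqrt(len(resid) / (2 * np.log(np.log(len(resid)))))
print(f"rho_hat = {rho_hat:.3f}")
print(f"sup|F_kernel - F| = {sup_kernel:.4f}, scaled = {scale * sup_kernel:.3f}")
print(f"sup|F_edf    - F| = {sup_edf:.4f}, scaled = {scale * sup_edf:.3f}")
```

In this toy run the kernel estimator's sup-norm deviation is typically close to, and somewhat smoother across the grid than, the empirical distribution function's, which is the qualitative behaviour the abstract's simulation study reports.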
Item Metadata

| Field | Value |
| --- | --- |
| Title | The Law of the Iterated Logarithm for the Error Distribution Estimator in First-Order Autoregressive Models |
| Creator | Wang, Bing; Jin, Yi; Wang, Lina; Shi, Xiaoping; Yang, Wenzhi |
| Publisher | Multidisciplinary Digital Publishing Institute |
| Date Issued | 2025-10-26 |
| Language | eng |
| Date Available | 2025-11-26 |
| Provider | Vancouver : University of British Columbia Library |
| Rights | CC BY 4.0 |
| DOI | 10.14288/1.0450875 |
| Citation | Axioms 14 (11): 784 (2025) |
| Publisher DOI | 10.3390/axioms14110784 |
| Peer Review Status | Reviewed |
| Scholarly Level | Faculty |
| Aggregated Source Repository | DSpace |