UBC Theses and Dissertations
Penalized and constrained data sharpening methods for kernel regression
Wang, Dongying
Abstract
Data sharpening is a semiparametric method that is more flexible than parametric regression and less variable than nonparametric regression. In this thesis we study two kinds of data sharpening for local polynomial regression. One version is penalized data sharpening, which constrains the regression function estimate globally; the other is constrained data sharpening, which operates more locally. Each approach requires a good bandwidth, so implementation details for direct plug-in bandwidth selection for local linear regression are reviewed and extended to higher-order local polynomial regression. The next critical step in solving the penalized data sharpening problem is selecting a good tuning parameter, and we propose and study several tuning parameter selectors. For constrained data sharpening, we study the optimization problem and solve it numerically using the Douglas-Rachford algorithm. By combining it with the backfitting algorithm, we can apply constrained data sharpening to higher-dimensional data, with a focus on additive models. We apply both penalized and constrained data sharpening to real data, including wildfire rate-of-spread data, temperature data, and images extracted from videos of small smoldering fires.
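For orientation, the sketch below illustrates two of the ingredients the abstract names: a local linear kernel smoother and a ridge-type penalized data sharpening step that perturbs the responses before smoothing. It is a minimal illustration under assumed choices (Gaussian kernel, second-difference roughness penalty, hand-picked bandwidth h and tuning parameter lam), not the formulation or implementation developed in the thesis; in particular, the thesis uses direct plug-in bandwidth selection, data-driven tuning parameter selectors, and the Douglas-Rachford algorithm for the constrained variant, none of which are shown here.

```python
# Minimal sketch (assumptions noted in the text above): local linear kernel
# regression plus a ridge-style "penalized data sharpening" of the responses.
import numpy as np

def local_linear_smoother_matrix(x, h):
    """Hat matrix S with yhat = S @ y for local linear regression
    using a Gaussian kernel with bandwidth h."""
    n = len(x)
    S = np.zeros((n, n))
    for i in range(n):
        d = x - x[i]                        # design centred at x[i]
        w = np.exp(-0.5 * (d / h) ** 2)     # Gaussian kernel weights
        X = np.column_stack([np.ones(n), d])
        W = np.diag(w)
        # first row of (X'WX)^{-1} X'W gives the local linear fit at x[i]
        S[i, :] = np.linalg.solve(X.T @ W @ X, X.T @ W)[0]
    return S

def sharpen_penalized(y, S, lam):
    """Sharpened responses minimising ||ytil - y||^2 + lam * ||D S ytil||^2,
    where D is the second-difference operator (a roughness penalty on the
    fitted curve). Closed form: ytil = (I + lam * (DS)'(DS))^{-1} y."""
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)     # (n-2) x n second differences
    A = D @ S
    return np.linalg.solve(np.eye(n) + lam * A.T @ A, y)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.sort(rng.uniform(0, 1, 100))
    y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=100)
    S = local_linear_smoother_matrix(x, h=0.08)   # h and lam fixed by hand
    y_sharp = sharpen_penalized(y, S, lam=5.0)    # here purely for illustration
    fit = S @ y_sharp                             # smooth the sharpened data
    print(fit[:5])
```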
Item Metadata
Title | Penalized and constrained data sharpening methods for kernel regression
Creator | Wang, Dongying
Supervisor |
Publisher | University of British Columbia
Date Issued | 2022
Description | Same as the Abstract above.
Genre |
Type |
Language | eng
Date Available | 2022-08-30
Provider | Vancouver : University of British Columbia Library
Rights | Attribution-NonCommercial-NoDerivatives 4.0 International
DOI | 10.14288/1.0418431
URI |
Degree |
Program |
Affiliation |
Degree Grantor | University of British Columbia
Graduation Date | 2022-09
Campus |
Scholarly Level | Graduate
Rights URI |
Aggregated Source Repository | DSpace