General Regression Methods

Multifold Cross-Validation Model Averaging for Generalized Additive Partial Linear Models

Pages 1649-1659 | Received 17 Apr 2022, Accepted 18 Jan 2023, Published online: 07 Mar 2023

Abstract

Generalized Additive Partial Linear Models (GAPLMs) are appealing for model interpretation and prediction. However, for GAPLMs, the covariates and the degree of smoothing in the nonparametric parts are often difficult to determine in practice. To address this model selection uncertainty, we develop a computationally feasible Model Averaging (MA) procedure. The model weights are data-driven and selected by multifold Cross-Validation (CV), rather than leave-one-out CV, for computational savings. When all candidate models are misspecified, we show that the proposed MA estimator for GAPLMs is asymptotically optimal in the sense of achieving the lowest possible Kullback-Leibler loss. When the candidate model set contains at least one quasi-correct model, the weights chosen by multifold CV asymptotically concentrate on the quasi-correct models. As a by-product, we propose a variable importance measure, based on the MA weights, that quantifies the importance of the predictors in GAPLMs; it is shown to asymptotically identify the variables in the true model. Moreover, a model screening method is provided for the case where the number of candidate models is very large. Numerical experiments demonstrate the superiority of the proposed MA method over several existing model averaging and selection methods. Supplementary materials for this article are available online.
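The R sketch below illustrates the core idea described in the abstract: fit several candidate GAPLMs, form multifold cross-validated predictions from each, and choose nonnegative weights summing to one by minimizing a CV criterion. It is a minimal illustration only and not the authors' implementation: it assumes a Gaussian response with a squared-error CV loss in place of the paper's Kullback-Leibler criterion, and the candidate formulas, fold count, and simulated data are hypothetical placeholders.

```r
## Minimal sketch (not the authors' code): multifold-CV weight selection for
## averaging candidate GAPLMs, with a Gaussian response and squared-error CV
## loss standing in for the paper's Kullback-Leibler criterion.
library(mgcv)      # gam() for (generalized) additive partial linear models
library(quadprog)  # solve.QP() for weight optimization on the simplex

set.seed(1)
n   <- 300
dat <- data.frame(x1 = runif(n), x2 = runif(n), z = rbinom(n, 1, 0.5))
dat$y <- sin(2 * pi * dat$x1) + 0.5 * dat$z + rnorm(n, sd = 0.3)

## Candidate models differing in which covariates enter, and whether
## they enter parametrically or nonparametrically (illustrative choices).
forms <- list(y ~ s(x1) + z,
              y ~ s(x1) + s(x2) + z,
              y ~ x1 + z)
K     <- 5                                   # number of CV folds
folds <- sample(rep(1:K, length.out = n))

## Multifold cross-validated predictions from each candidate model.
cv_pred <- sapply(forms, function(f) {
  p <- numeric(n)
  for (k in 1:K) {
    fit <- gam(f, data = dat[folds != k, ])
    p[folds == k] <- predict(fit, newdata = dat[folds == k, ])
  }
  p
})

## Data-driven weights: minimize ||y - cv_pred %*% w||^2 over the simplex
## {w : w >= 0, sum(w) = 1} via quadratic programming.
M    <- length(forms)
Dmat <- crossprod(cv_pred) + 1e-8 * diag(M)  # small ridge for numerical stability
dvec <- drop(crossprod(cv_pred, dat$y))
Amat <- cbind(rep(1, M), diag(M))            # equality constraint first, then w >= 0
bvec <- c(1, rep(0, M))
w    <- solve.QP(Dmat, dvec, Amat, bvec, meq = 1)$solution
round(w, 3)                                  # multifold-CV model averaging weights
```

The averaged prediction for new data would then be the weighted combination of the candidate models' predictions using these weights.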

Supplementary Materials

Text document:

Supplemental Materials (proofs, additional numerical results and justifications of conditions for the theoretical results) for “Multifold Cross-Validation Model Averaging for Generalized Additive Partial Linear Models” (.pdf file).

R code:

R programs that can be used to replicate the numerical results in this article.

Acknowledgments

We are grateful to the editor, the associate editor, and two anonymous reviewers for their insightful and constructive comments, which substantially improved our article.

Disclosure Statement

The authors report there are no competing interests to declare.

Additional information

Funding

Xu's work was supported by the Beijing Natural Science Foundation (no. Z200001) and the National Natural Science Foundation of China (no. 11971478). Liao's work was partially supported by the National Natural Science Foundation of China (grant nos. 12001534 and 11971323).
