Machine Learning

A Generalization Gap Estimation for Overparameterized Models via the Langevin Functional Variance

Pages 1287-1295 | Received 31 May 2022, Accepted 27 Mar 2023, Published online: 17 May 2023

Abstract

This article discusses estimation of the generalization gap, the difference between generalization performance and training performance, for overparameterized models, including neural networks. We first show that the functional variance, a key concept in defining the widely applicable information criterion (WAIC), characterizes the generalization gap even in overparameterized settings where conventional theory does not apply. Because computing the functional variance is expensive for overparameterized models, we propose an efficient approximation, the Langevin approximation of the functional variance (Langevin FV). This method uses only the first-order gradient of the squared loss function, without reference to the second-order gradient; this keeps the computation efficient and the implementation consistent with gradient-based optimization algorithms. We demonstrate Langevin FV numerically by estimating the generalization gaps of overparameterized linear regression and nonlinear neural network models containing more than a thousand parameters. Supplementary materials for this article are available online.
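To make the idea concrete, the following sketch illustrates a functional-variance computation of the kind the abstract describes: an overparameterized linear model is fit by gradient descent, unadjusted Langevin dynamics is then run around the fit using only the first-order gradient of the squared loss, and the per-observation variances of the loss over the Langevin samples are summed. This is a rough illustration under stated assumptions, not the paper's implementation; the problem sizes, step sizes, iteration counts, and the unit-temperature target `exp(-loss)` are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical overparameterized linear regression (p > n); the data and
# all tuning constants below are illustrative, not the paper's settings.
n, p = 50, 80
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + 0.5 * rng.normal(size=n)

def per_sample_loss(theta):
    """Squared loss of each observation; shape (n,)."""
    return 0.5 * (y - X @ theta) ** 2

def grad_total_loss(theta):
    """First-order gradient of the summed squared loss (no Hessian used)."""
    return -X.T @ (y - X @ theta)

# Fit by plain gradient descent (reaches an interpolating solution here).
theta = np.zeros(p)
for _ in range(5000):
    theta -= 1e-3 * grad_total_loss(theta)

# Unadjusted Langevin dynamics started at the fit: a gradient step plus
# Gaussian noise of matched scale, targeting exp(-loss) at unit
# temperature (an illustrative choice).
eta, burn_in, n_iter = 1e-4, 1000, 5000
losses = []
for t in range(n_iter):
    theta = (theta - eta * grad_total_loss(theta)
             + np.sqrt(2 * eta) * rng.normal(size=p))
    if t >= burn_in:
        losses.append(per_sample_loss(theta))

# Functional variance: sum over observations of the sampling variance of
# the per-sample log-likelihood (here, up to constants, the squared loss).
losses = np.asarray(losses)           # shape (n_iter - burn_in, n)
fv = losses.var(axis=0).sum()
print(f"Langevin FV estimate: {fv:.3f}")
```

Note that only `grad_total_loss`, a first-order quantity, drives the sampler, so the loop has the same per-step cost as gradient-based training; no second-order information is ever formed.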

Supplementary Materials

The supplementary material contains proofs of the theorems and descriptions of the source code used to reproduce the experimental results.

Acknowledgments

We thank the editor, the AE, and two anonymous reviewers for constructive comments and suggestions. We also thank Eiki Shimizu for suggesting several references, and Tetsuya Takabatake and Yukito Iba for helpful discussions.

Additional information

Funding

A. Okuno was supported by JST CREST (JPMJCR21N3) and JSPS KAKENHI (21K17718, 22H05106). K. Yano was supported by JST CREST (JPMJCR1763), JSPS KAKENHI (19K20222, 21H05205, 21K12067), and MEXT (JPJ010217).