The article explores the generalization error curves of kernel regression methods. The authors provide a full characterization of these curves for kernel gradient descent and a broad class of analytic spectral algorithms, describing how the generalization error depends on the kernel, the regression function, the noise level, and the choice of regularization parameter. These results aim to improve our understanding of the generalization behavior of wide neural networks. The study also introduces a new technical tool, the analytic functional argument. This research contributes to the ongoing study of generalization in neural networks and kernel methods.
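To make the setting concrete, the sketch below is a minimal, illustrative implementation of kernel gradient descent on a synthetic 1-D regression task, where the number of iterations (early stopping) plays the role of the regularization parameter and the test error traces out a generalization error curve. The RBF kernel, step size, data, and all names here are assumptions for illustration only, not the paper's exact setting or method.

```python
import numpy as np

# Minimal sketch of kernel gradient descent on a toy 1-D problem.
# Kernel choice, bandwidth, step size, and data are illustrative assumptions.

def rbf_kernel(a, b, bandwidth=0.3):
    """Gaussian (RBF) kernel matrix between 1-D sample vectors a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-d2 / (2 * bandwidth ** 2))

rng = np.random.default_rng(0)
n, noise = 100, 0.1
x_train = rng.uniform(-1, 1, n)
y_train = np.sin(2 * np.pi * x_train) + noise * rng.standard_normal(n)
x_test = np.linspace(-1, 1, 500)
y_test = np.sin(2 * np.pi * x_test)          # noise-free target for test error

K = rbf_kernel(x_train, x_train)             # n x n Gram matrix
K_test = rbf_kernel(x_test, x_train)         # test-vs-train kernel values

alpha = np.zeros(n)                          # dual coefficients, f(x) = sum_i alpha_i k(x_i, x)
eta = 0.5                                    # step size (stable since lambda_max(K/n) <= 1)
for t in range(1, 2001):
    residual = K @ alpha - y_train           # training residuals f(x_i) - y_i
    alpha -= (eta / n) * residual            # kernel gradient descent step in the RKHS
    if t % 500 == 0:
        test_mse = np.mean((K_test @ alpha - y_test) ** 2)
        print(f"iteration {t:5d}  test MSE {test_mse:.4f}")
```

Here the iteration count acts as an implicit regularizer: stopping early corresponds to stronger regularization, while running longer moves toward interpolation of the noisy training data, so the printed test-error trajectory is one slice of a generalization error curve.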


Publication date: 4 Jan 2024
Project Page: https://arxiv.org/abs/2401.01599
Paper: https://arxiv.org/pdf/2401.01599