The authors derive generalization bounds involving an arbitrary convex comparator function that measures the discrepancy between training and population loss. The analysis assumes that the cumulant-generating function (CGF) of the comparator is upper-bounded by the corresponding CGF within a family of bounding distributions. The paper confirms the near-optimality of known bounds for bounded and sub-Gaussian losses and proposes novel bounds under other bounding distributions.
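The CGF-domination idea can be illustrated with the standard Cramér–Chernoff argument (a generic sketch of the mechanism, not the paper's exact derivation): once the CGF of a random variable is dominated by that of a bounding distribution, a tail bound follows from the convex conjugate of the dominating CGF.

```latex
% CGF of X, and the domination assumption:
\psi_X(\lambda) \;=\; \log \mathbb{E}\!\left[e^{\lambda X}\right],
\qquad
\psi_X(\lambda) \;\le\; \psi(\lambda) \quad \forall\, \lambda \ge 0.

% Chernoff step: for any \lambda \ge 0,
\Pr(X \ge t)
\;\le\; e^{-\lambda t}\,\mathbb{E}\!\left[e^{\lambda X}\right]
\;\le\; e^{-(\lambda t - \psi(\lambda))}.

% Optimizing over \lambda yields the conjugate-rate bound:
\Pr(X \ge t) \;\le\; e^{-\psi^{*}(t)},
\qquad
\psi^{*}(t) \;=\; \sup_{\lambda \ge 0} \bigl(\lambda t - \psi(\lambda)\bigr).

% Example: sub-Gaussian bounding CGF \psi(\lambda) = \lambda^{2}\sigma^{2}/2
% gives \psi^{*}(t) = t^{2}/(2\sigma^{2}), i.e.
% \Pr(X \ge t) \le e^{-t^{2}/(2\sigma^{2})}.
```

Taking X to be the comparator of the train–population discrepancy recovers the style of bound the paper studies: the achievable rate is governed by the conjugate of the dominating CGF.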
Publication date: 16 Oct 2023
Project Page: https://arxiv.org/abs/2310.10534v1
Paper: https://arxiv.org/pdf/2310.10534