Strengthened Information-theoretic Bounds on the Generalization Error

The following problem is considered: given a joint distribution $P_{XY}$ and an event $E$, bound $P_{XY}(E)$ in terms of $P_X P_Y(E)$ (where $P_X P_Y$ is the product of the marginals of $P_{XY}$) and a measure of dependence of $X$ and $Y$. Such bounds have direct applications in the analysis of the generalization error of learning algorithms, where $E$ represents a large error event and the measure of dependence controls the degree of overfitting. Herein, bounds are demonstrated using several information-theoretic metrics, in particular: mutual information, lautum information, maximal leakage, and $J_\infty$. The mutual information bound can outperform comparable bounds in the literature by an arbitrarily large factor.
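To make the setup concrete, the sketch below numerically checks one classical bound of this type, $P_{XY}(E) \le \big(I(X;Y) + \log 2\big) / \log\!\big(1/P_X P_Y(E)\big)$, which follows from the data-processing inequality $I(X;Y) = D(P_{XY}\,\|\,P_X P_Y) \ge d\big(P_{XY}(E)\,\|\,P_X P_Y(E)\big)$. It illustrates the baseline flavor of bound, not the strengthened bounds of the paper; the joint distribution and the event are arbitrary toy choices.

```python
import numpy as np

# Toy joint distribution P_XY over a 3x3 alphabet (arbitrary illustrative choice).
P = np.array([[0.30, 0.05, 0.05],
              [0.05, 0.30, 0.05],
              [0.05, 0.05, 0.10]])

Px = P.sum(axis=1, keepdims=True)   # marginal of X (column vector)
Py = P.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
Q = Px @ Py                         # product of marginals P_X P_Y

# Mutual information I(X;Y) = D(P_XY || P_X P_Y), in nats.
I = float(np.sum(P * np.log(P / Q)))

# Event E as a 0/1 indicator over the joint alphabet (toy choice: the diagonal).
E = np.eye(3, dtype=bool)
p_joint = P[E].sum()   # P_XY(E)
p_prod = Q[E].sum()    # P_X P_Y(E), assumed < 1 so the bound is meaningful

# Classical data-processing-style bound: P_XY(E) <= (I(X;Y) + log 2) / log(1 / P_X P_Y(E)).
bound = (I + np.log(2)) / np.log(1.0 / p_prod)
print(f"P_XY(E) = {p_joint:.4f}, P_X P_Y(E) = {p_prod:.4f}")
print(f"MI-based bound = {bound:.4f}  (holds: {p_joint <= bound})")
```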


Published in:
2019 IEEE International Symposium on Information Theory (ISIT), 582-586
Presented at:
2019 IEEE International Symposium on Information Theory (ISIT), Paris, France, July 7 - 12, 2019
Year:
2019
Publisher:
IEEE