Distributionally Optimistic Optimization Approach to Nonparametric Likelihood Approximation

The likelihood function is a fundamental component of Bayesian statistics; however, evaluating the likelihood of an observation is intractable in many applications. In this paper we consider a nonparametric approximation of the likelihood obtained by solving a distributionally optimistic optimization problem: among all probability measures in a neighborhood of a nominal measure, we identify the one that maximizes the probability of observing the data. We show that when the neighborhood is constructed using the Kullback-Leibler divergence, moment conditions, or the Wasserstein distance, the proposed optimistic likelihood is amenable to a convex optimization approach and admits an analytical expression in special cases. The posterior inference problem with the optimistic likelihood approximation enjoys theoretical performance guarantees and performs competitively in probabilistic classification tasks.
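
As a reading aid, the core construction can be written compactly; the notation below is ours, not taken from the paper. Writing $\hat{\mathbb{P}}$ for the nominal measure and $\mathcal{B}_\rho(\hat{\mathbb{P}})$ for a neighborhood of radius $\rho$ (a Kullback-Leibler, moment, or Wasserstein ball), the optimistic likelihood of an observation $x_0$ is

$$\hat{L}(x_0) \;=\; \sup_{\mathbb{Q} \,\in\, \mathcal{B}_\rho(\hat{\mathbb{P}})} \mathbb{Q}\bigl(\{x_0\}\bigr).$$

The sketch below illustrates one such instance on a finite support, assuming the neighborhood is the set of measures $\mathbb{Q}$ with $\mathrm{KL}(\hat{\mathbb{P}} \,\|\, \mathbb{Q}) \le \rho$ (this orientation lets $\mathbb{Q}$ place extra mass on the observation; the paper's exact construction may differ). It is a minimal illustration with generic solver tooling, not the authors' code; the names `p_hat`, `obs_idx`, and `rho` are ours.

```python
# Minimal sketch: optimistic likelihood over a KL neighborhood of a
# nominal measure supported on finitely many atoms. Assumes the ball
# KL(P_hat || Q) <= rho; this is an illustration, not the paper's code.
import cvxpy as cp
import numpy as np

def optimistic_likelihood_kl(p_hat, obs_idx, rho):
    """Maximize Q({x_obs}) over measures Q with KL(P_hat || Q) <= rho."""
    n = len(p_hat)
    q = cp.Variable(n, nonneg=True)
    constraints = [
        cp.sum(q) == 1,                       # Q is a probability measure
        cp.sum(cp.rel_entr(p_hat, q)) <= rho  # KL(P_hat || Q) <= rho
    ]
    # Linear objective, convex feasible set: a convex program.
    prob = cp.Problem(cp.Maximize(q[obs_idx]), constraints)
    prob.solve()
    return q.value[obs_idx]

# Nominal measure on 5 atoms; the observation is atom 0 with small nominal mass.
p_hat = np.array([0.05, 0.25, 0.25, 0.25, 0.20])
print(optimistic_likelihood_kl(p_hat, obs_idx=0, rho=0.1))
```

Because the objective is linear in the decision measure and the KL ball is convex, the problem is a convex program, consistent with the tractability claim in the abstract.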


Published in:
Advances in Neural Information Processing Systems 32 (NeurIPS 2019)
Presented at:
NeurIPS 2019: Thirty-third Conference on Neural Information Processing Systems, Vancouver, Canada, December 8-14, 2019
Year:
2019