Sparse projections onto the simplex

Most learning methods with rank or sparsity constraints use convex relaxations, which lead to optimization with the nuclear norm or the ℓ1-norm. However, several important learning applications cannot benefit from this approach, as they feature these convex norms as constraints in addition to the non-convex rank and sparsity constraints. In this setting, we derive efficient sparse projections onto the simplex and its extension, and illustrate how to use them to solve high-dimensional learning problems in quantum tomography, sparse density estimation and portfolio selection with non-convex constraints.
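
The following is a minimal sketch, in Python/NumPy, of the kind of operator the abstract refers to: a k-sparse Euclidean projection onto the scaled simplex, obtained by keeping the k largest entries of the input and projecting that subvector onto the simplex with the standard sorting-based routine. The function names, the "radius" argument, and the top-k selection step are illustrative assumptions, not the paper's stated algorithm.

    import numpy as np

    def simplex_projection(v, radius=1.0):
        # Euclidean projection of v onto {x : x >= 0, sum(x) = radius}
        # via the standard sorting/thresholding routine.
        u = np.sort(v)[::-1]                        # entries in decreasing order
        cumsum = np.cumsum(u) - radius
        rho = np.nonzero(u - cumsum / np.arange(1, len(v) + 1) > 0)[0][-1]
        theta = cumsum[rho] / (rho + 1.0)           # shift enforcing the sum constraint
        return np.maximum(v - theta, 0.0)

    def sparse_simplex_projection(v, k, radius=1.0):
        # Sketch: keep the k largest entries of v, project that subvector
        # onto the scaled simplex, and set all other coordinates to zero.
        x = np.zeros_like(v, dtype=float)
        support = np.argsort(v)[::-1][:k]           # indices of the k largest entries
        x[support] = simplex_projection(v[support], radius)
        return x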


Published in:
Proceedings of the 30th International Conference on Machine Learning (ICML), 2013, 28, 2, 280-288
Presented at:
The 30th International Conference on Machine Learning (ICML) 2013, Atlanta, USA, June 16-21, 2013
Year:
2013
Publisher:
JMLR W&CP