Infoscience
research article

Dictionary Identification - Sparse Matrix-Factorisation via $\ell_1$-Minimisation

Gribonval, Rémi • Schnass, Karin
2010
IEEE Transactions on Information Theory

This article treats the problem of learning a dictionary providing sparse representations for a given signal class, via $\ell_1$-minimisation. The problem can also be seen as factorising a $d \times N$ matrix $Y = (y_1, \ldots, y_N)$, $y_n \in \mathbb{R}^d$, of training signals into a $d \times K$ dictionary matrix $\Phi$ and a $K \times N$ coefficient matrix $X = (x_1, \ldots, x_N)$, $x_n \in \mathbb{R}^K$, which is sparse. The exact question studied here is when a dictionary-coefficient pair $(\Phi, X)$ can be recovered as a local minimum of a (nonconvex) $\ell_1$-criterion with input $Y = \Phi X$. First, for general dictionaries and coefficient matrices, algebraic conditions ensuring local identifiability are derived, which are then specialised to the case when the dictionary is a basis. Finally, assuming a random Bernoulli-Gaussian sparse model on the coefficient matrix, it is shown that sufficiently incoherent bases are locally identifiable with high probability. The perhaps surprising result is that the typically sufficient number of training samples $N$ grows, up to a logarithmic factor, only linearly with the signal dimension, i.e. $N \approx C K \log K$, in contrast to previous approaches requiring combinatorially many samples.
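The $\ell_1$-criterion described in the abstract can be illustrated with a small numerical sketch. The code below is not the paper's analysis (the article studies identifiability, not an algorithm); it is a generic alternating-minimisation heuristic for $\min_{\Phi, X} \tfrac{1}{2}\|Y - \Phi X\|_F^2 + \lambda \|X\|_1$ over a column-normalised dictionary, with the sparse-coding step done by ISTA (soft-thresholding). All function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    # Entrywise soft-thresholding: the proximal operator of the l1-norm.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def dictionary_learning_l1(Y, K, lam=0.1, n_iter=50, n_ista=20, seed=0):
    """Alternating minimisation of 0.5*||Y - Phi X||_F^2 + lam*||X||_1
    over a column-normalised dictionary Phi (d x K) and coefficients X (K x N).
    Illustrative sketch only, not the method analysed in the article."""
    rng = np.random.default_rng(seed)
    d, N = Y.shape
    Phi = rng.standard_normal((d, K))
    Phi /= np.linalg.norm(Phi, axis=0)
    X = np.zeros((K, N))
    for _ in range(n_iter):
        # Sparse coding: ISTA steps on X with Phi fixed.
        L = np.linalg.norm(Phi, 2) ** 2  # Lipschitz constant of the gradient
        for _ in range(n_ista):
            X = soft_threshold(X - (Phi.T @ (Phi @ X - Y)) / L, lam / L)
        # Dictionary update: least squares on Phi, then renormalise columns.
        G = X @ X.T + 1e-8 * np.eye(K)
        Phi = np.linalg.solve(G, X @ Y.T).T
        Phi /= np.maximum(np.linalg.norm(Phi, axis=0), 1e-12)
    return Phi, X

# Synthetic data matching the abstract's model: Y = Phi X with
# Bernoulli-Gaussian sparse coefficients (each entry active w.p. 0.1).
rng = np.random.default_rng(1)
d, K, N = 16, 16, 400
Phi_true = rng.standard_normal((d, K))
Phi_true /= np.linalg.norm(Phi_true, axis=0)
X_true = rng.standard_normal((K, N)) * (rng.random((K, N)) < 0.1)
Y = Phi_true @ X_true
Phi, X = dictionary_learning_l1(Y, K, lam=0.05, n_iter=30)
```

The local-minimum framing in the article corresponds to this objective being nonconvex in $(\Phi, X)$ jointly, which is why alternating schemes like the one above can only hope to find local minima.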

Type: research article
DOI: 10.1109/TIT.2010.2048466
ArXiv ID: 0904.4774
Author(s): Gribonval, Rémi; Schnass, Karin
Date Issued: 2010
Publisher: Institute of Electrical and Electronics Engineers
Published in: IEEE Transactions on Information Theory
Volume: 56
Issue: 7
Start page: 3523
End page: 3539
Subjects: LTS2 • $\ell_1$-minimisation • sparse coding • random matrices • sparse representation • dictionary learning • dictionary identification • nonconvex optimisation • independent component analysis • blind source separation • blind source localisation
Editorial or Peer reviewed: REVIEWED
Written at: EPFL
EPFL units: LTS2
Available on Infoscience: May 1, 2009
Use this identifier to reference this record: https://infoscience.epfl.ch/handle/20.500.14299/38230
Contact: infoscience@epfl.ch
Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.