000111353 001__ 111353
000111353 005__ 20190509132130.0
000111353 0247_ $$2doi$$a10.5075/epfl-thesis-3936
000111353 02470 $$2urn$$aurn:nbn:ch:bel-epfl-thesis3936-1
000111353 02471 $$2nebis$$a5419056
000111353 037__ $$aTHESIS
000111353 041__ $$aeng
000111353 088__ $$a3936
000111353 245__ $$aAlgorithmic aspects of sparse approximations
000111353 269__ $$a2007
000111353 260__ $$bEPFL$$c2007$$aLausanne
000111353 300__ $$a104
000111353 336__ $$aTheses
000111353 502__ $$aChristophe De Vleeschouwer, Jean-Philippe Thiran, Laurent Daudet
000111353 520__ $$aTypical signal processing tasks can be carried out more simply or more efficiently if the signals to be analyzed are represented in a suitable way. This thesis deals with algorithmic problems related to signal approximation, more precisely in the novel field of sparse approximation using redundant dictionaries of functions. Orthogonal bases allow signals to be approximated by simply keeping the N waveforms whose associated projections have the largest amplitudes. This convenient property no longer holds when the basis is replaced by a redundant dictionary; in fact, finding the best decomposition becomes an NP-hard problem in the general case. Suboptimal heuristics have therefore been developed, the best known being Matching Pursuit and Basis Pursuit. Both remain computationally expensive, which prevents their use in many practical situations. The first part of the thesis addresses this computational bottleneck. We propose a tree structure built over the dictionary that groups similar atoms in the same branches. An approximation algorithm exploiting this structure, called Tree-Based Pursuit, is presented; it considerably lowers the cost of finding good approximations with redundant dictionaries. The quality of the representation depends not only on the approximation algorithm but also on the dictionary used. One of the main advantages of these techniques is that the atoms can be tailored to match the features present in the signal. In some cases, knowledge about the class of signals to approximate leads directly to a dictionary. For most natural signals, however, the underlying structures are not clearly known and may be hidden. Learning dictionaries from examples is an alternative to manual design and is gaining interest. Most natural signals exhibit behaviors invariant to translations in space or in time; we therefore propose an algorithm to learn redundant dictionaries under a translation invariance constraint. In the case of images, the proposed solution recovers atoms similar to Gabor functions, line edge detectors and curved edge detectors. The first two categories had been observed before; the third completes the range of natural features and is a major contribution of this algorithm. Sparsity is used both to measure the efficiency of approximation algorithms and to characterize good dictionaries; it follows directly from the fact that these techniques aim at approximating signals with few significant terms. This property has been successfully exploited as a dimension reduction method for various signal processing tasks such as analysis, de-noising and compression. In the last chapter, we tackle the problem of finding the nearest neighbor to a query signal in a set of signals that have a sparse representation. We take advantage of sparsity to quickly approximate the distance between the query and all elements of the database. In this way, we can recursively prune all elements that do not match the query while providing bounds on the true distance. Validation of this technique on synthetic and real data sets confirms that it is well suited to processing queries over large databases of compressed signals, avoiding most of the decoding burden.
000111353 6531_ $$asparse approximation
000111353 6531_ $$aredundant dictionaries
000111353 6531_ $$adictionary learning
000111353 6531_ $$adata mining
000111353 6531_ $$afast algorithms
000111353 6531_ $$anearest neighbor
000111353 6531_ $$astructured dictionaries
000111353 6531_ $$agreedy algorithms
000111353 6531_ $$areprésentations parcimonieuses
000111353 6531_ $$adictionnaires redondants
000111353 6531_ $$aapprentissage de dictionnaires
000111353 6531_ $$adata mining
000111353 6531_ $$aalgorithmes rapides
000111353 6531_ $$aplus proche voisin
000111353 6531_ $$adictionnaires structurés
000111353 6531_ $$aalgorithmes gloutons
000111353 700__ $$0241006$$g114569$$aJost, Philippe
000111353 720_2 $$aVandergheynst, Pierre$$edir.$$g120906$$0240428
000111353 8564_ $$uhttps://infoscience.epfl.ch/record/111353/files/EPFL_TH3936.pdf$$zTexte intégral / Full text$$s5852029$$yTexte intégral / Full text
000111353 909C0 $$xU10380$$0252392$$pLTS2
000111353 909CO $$pthesis$$pthesis-bn2018$$pDOI$$ooai:infoscience.tind.io:111353$$qDOI2$$qGLOBAL_SET$$pSTI
000111353 918__ $$dEDIC2005-2015$$cITS$$aSTI
000111353 919__ $$aLTS2
000111353 920__ $$b2007$$a2007-10-19
000111353 970__ $$a3936/THESES
000111353 973__ $$sPUBLISHED$$aEPFL
000111353 980__ $$aTHESIS