Abstract

Many signal processing tasks become simpler or more efficient when the signals to be analyzed are represented in a suitable way. This thesis deals with algorithmic problems related to signal approximation, more precisely in the novel field of sparse approximation using redundant dictionaries of functions. Orthogonal bases make it possible to approximate a signal by simply keeping the N waveforms whose projections have the largest amplitudes. This convenient property no longer holds when the dictionary is redundant; in fact, finding the best decomposition becomes an NP-hard problem in the general case. Suboptimal heuristics have therefore been developed, the best known being Matching Pursuit and Basis Pursuit. Both remain computationally expensive, which prevents their use in many practical situations. The first part of the thesis addresses this computational bottleneck. We propose to equip the dictionary with a tree structure that groups similar atoms in the same branches, and we present an approximation algorithm, called Tree-Based Pursuit, that exploits this structure. It considerably lowers the cost of finding good approximations with redundant dictionaries.

The quality of the representation depends not only on the approximation algorithm but also on the dictionary used. One of the main advantages of these techniques is that the atoms can be tailored to the features present in the signal. Sometimes, prior knowledge about the class of signals to approximate directly suggests a dictionary. For most natural signals, however, the underlying structures are not clearly known and may be hidden. Learning dictionaries from examples is an alternative to manual design and is gaining interest. Most natural signals exhibit structures that are invariant to translations in space or time, so we propose an algorithm that learns redundant dictionaries under a translation-invariance constraint. In the case of images, the proposed solution recovers atoms similar to Gabor functions, line edge detectors and curved edge detectors. The first two categories had already been observed; the third completes the range of natural features and is a major contribution of this algorithm.

Sparsity is used both to measure the efficiency of approximation algorithms and to characterize good dictionaries; it follows directly from the fact that these techniques aim at approximating signals with few significant terms. This property has been successfully exploited as a dimensionality reduction method for signal processing tasks such as analysis, de-noising and compression. In the last chapter, we tackle the problem of finding the nearest neighbor of a query signal in a set of signals that have a sparse representation. We take advantage of sparsity to quickly approximate the distance between the query and every element of the database, which allows us to recursively prune the elements that do not match the query while providing bounds on the true distance. Validation of this technique on synthetic and real data sets confirms that it is well suited to processing queries over large databases of compressed signals, avoiding most of the burden of decoding.
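
The two approximation regimes mentioned above can be illustrated with a short sketch. The following Python/NumPy code is not taken from the thesis; it is a minimal illustration, assuming a generic dictionary matrix D with unit-norm columns, of (i) N-term approximation in an orthonormal basis, where keeping the N largest-magnitude projections is optimal, and (ii) plain Matching Pursuit over a redundant dictionary, whose exhaustive per-iteration scan of all atoms is precisely the cost that a tree structure over the dictionary is meant to reduce.

    import numpy as np

    def nterm_orthonormal(signal, B, N):
        """N-term approximation in an orthonormal basis B (columns = basis vectors):
        keep the N coefficients with the largest magnitudes."""
        coeffs = B.T @ signal                      # analysis: all projections
        keep = np.argsort(np.abs(coeffs))[-N:]     # indices of the N largest amplitudes
        approx = B[:, keep] @ coeffs[keep]         # synthesis from the retained terms
        return approx, keep, coeffs[keep]

    def matching_pursuit(signal, D, n_iter):
        """Matching Pursuit over a redundant dictionary D (unit-norm columns):
        at each step, select the atom most correlated with the current residual."""
        residual = signal.copy()
        coeffs = np.zeros(D.shape[1])
        for _ in range(n_iter):
            correlations = D.T @ residual          # exhaustive scan of all atoms
            best = np.argmax(np.abs(correlations)) # the step a tree search would speed up
            coeffs[best] += correlations[best]
            residual -= correlations[best] * D[:, best]
        return D @ coeffs, coeffs, residual

    # Hypothetical usage with a random redundant dictionary
    rng = np.random.default_rng(0)
    D = rng.standard_normal((64, 256))
    D /= np.linalg.norm(D, axis=0)                 # unit-norm atoms
    s = D[:, [3, 42, 200]] @ np.array([1.0, -0.7, 0.4])
    approx, coeffs, residual = matching_pursuit(s, D, n_iter=10)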
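
Similarly, the idea of bounding the true distance to a query directly from sparse representations can be made concrete. The sketch below is only an illustration of one possible bound, not the algorithm of the last chapter: assuming each stored signal s is kept as a sparse code c over a dictionary D together with its residual norm eps = ||s - D c||, the triangle inequality gives ||q - D c|| - eps <= ||q - s|| <= ||q - D c|| + eps, and ||q - D c|| can be evaluated from the k nonzero coefficients alone once D^T q and the Gram matrix D^T D are precomputed.

    import numpy as np

    def distance_bounds(q, sparse_entries, D, gram=None):
        """Lower/upper bounds on ||q - s|| for each database signal s stored as
        (indices, values, eps), where s ~ D[:, indices] @ values and
        eps >= ||s - D[:, indices] @ values||.  Only the sparse coefficients are
        touched per entry once D.T @ q and the Gram matrix are precomputed."""
        if gram is None:
            gram = D.T @ D                       # computed once per dictionary
        corr = D.T @ q                           # projections of the query on all atoms
        q_energy = q @ q
        bounds = []
        for idx, val, eps in sparse_entries:
            # ||q - D c||^2 = ||q||^2 - 2 c.(D^T q) + c^T (D^T D) c, using k nonzeros only
            cross = val @ corr[idx]
            atom_part = val @ gram[np.ix_(idx, idx)] @ val
            dist = np.sqrt(max(q_energy - 2.0 * cross + atom_part, 0.0))
            bounds.append((max(dist - eps, 0.0), dist + eps))   # triangle inequality
        return bounds

    # Hypothetical usage: one stored signal with two nonzero coefficients
    rng = np.random.default_rng(0)
    D = rng.standard_normal((64, 256)); D /= np.linalg.norm(D, axis=0)
    q = rng.standard_normal(64)
    entries = [(np.array([3, 42]), np.array([1.0, -0.5]), 0.1)]
    print(distance_bounds(q, entries, D))

A search procedure could then discard every database entry whose lower bound exceeds the best upper bound found so far, and fully decode only the surviving candidates.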
