An optimally concentrated Gabor transform for localized time-frequency components

Gabor analysis is one of the most common instances of time-frequency signal analysis. Choosing a suitable window for the Gabor transform of a signal is often a challenge for practical applications, in particular in audio signal processing. Many time-frequency (TF) patterns of different shapes may be present in a signal, and they cannot all be sparsely represented in the same spectrogram. We propose several algorithms that provide optimal windows for a user-selected TF pattern with respect to different concentration criteria. We base our optimization algorithms on ℓ^p-norms as measures of TF spreading. For a given number of sampling points in the TF plane, we also propose optimal lattices to be used with the obtained windows. We illustrate the potential of the method on selected numerical examples.
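As a rough illustration of the underlying idea (not the paper's optimization algorithms), the sketch below scores candidate Gaussian window widths by the ℓ^p norm (p < 2) of the energy-normalized spectrogram, a common proxy for TF spreading, and keeps the most concentrated one. The helper name, test signal, and grid of widths are illustrative assumptions, not part of the paper.

```python
# Minimal sketch, assuming a simple grid search over Gaussian window widths
# and an l^p norm (p < 2) of the normalized spectrogram as the spreading
# measure: smaller values indicate a more concentrated representation.
import numpy as np
from scipy.signal import stft

def lp_spreading(x, std, p=0.5, nperseg=256, fs=1.0):
    """l^p spreading measure of the spectrogram computed with a
    Gaussian window of standard deviation `std` (in samples)."""
    _, _, Z = stft(x, fs=fs, window=('gaussian', std), nperseg=nperseg)
    S = np.abs(Z)
    S = S / np.linalg.norm(S)            # normalize to unit l^2 energy
    return (S ** p).sum() ** (1.0 / p)   # small => concentrated

# Hypothetical test signal: a chirp whose optimal window width depends
# on the TF pattern one wishes to represent sparsely.
fs = 8000
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * (200 * t + 300 * t ** 2))

widths = np.linspace(4, 64, 16)
scores = [lp_spreading(x, w, fs=fs) for w in widths]
best = widths[int(np.argmin(scores))]
print(f"most concentrated Gaussian window std: {best:.1f} samples")
```

The grid search stands in for the paper's concentration-optimization algorithms; the lattice (hop size and number of frequency channels) is left at scipy's defaults rather than being optimized as in the paper.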


Published in:
Advances in Computational Mathematics, 40(3), 683-702
Year:
2014
Publisher:
Springer, New York
ISSN:
1019-7168