000100973 001__ 100973
000100973 005__ 20190316233936.0
000100973 02470 $$2ISI$$a000252357704040
000100973 037__ $$aCONF
000100973 245__ $$aDimensionality Reduction with Adaptive Approximation
000100973 269__ $$a2007
000100973 260__ $$c2007
000100973 336__ $$aConference Papers
000100973 520__ $$aIn this paper, we propose the use of (adaptive) nonlinear approximation for dimensionality reduction. In particular, we present a dimensionality reduction method that learns a parts-based representation of signals using redundant dictionaries. A redundant dictionary is an overcomplete set of basis vectors that spans the signal space. The signals are jointly represented in a common subspace extracted from the redundant dictionary, using greedy pursuit algorithms for simultaneous sparse approximation. The design of the dictionary is flexible and enables direct control over the shape and properties of the basis functions. Moreover, it allows a priori and application-driven knowledge to be incorporated into the basis vectors during the learning process. We apply our dimensionality reduction method to images and compare it with Principal Component Analysis (PCA) and with Non-negative Matrix Factorization (NMF) and its variants, in the context of handwritten digit recognition and face recognition. The experimental results suggest that the proposed dimensionality reduction algorithm is competitive with PCA and NMF and that it yields meaningful features with high discriminative value.
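The joint-representation step named in the abstract, greedily selecting dictionary atoms shared by all signals, can be pictured with a minimal SOMP-style sketch in Python; the function name `somp`, the random dictionary `D`, and the signal matrix `Y` below are illustrative assumptions, not code or data from the paper, whose exact pursuit algorithm and dictionary design may differ.

```python
# Minimal, illustrative SOMP-style greedy pursuit (an assumption, not
# the paper's implementation): atoms of a redundant dictionary D are
# selected jointly for all signals in Y, which are then represented in
# the common subspace spanned by the selected atoms.
import numpy as np

def somp(D, Y, n_atoms):
    """Select n_atoms columns of D (unit-norm atoms) that jointly
    approximate every signal (column) of Y; return the selected
    indices and the least-squares coefficients."""
    residual = Y.copy()
    selected = []
    coeffs = None
    for _ in range(n_atoms):
        # Score each atom by its summed correlation with all residuals.
        scores = np.sum(np.abs(D.T @ residual), axis=1)
        scores[selected] = -np.inf      # never pick the same atom twice
        selected.append(int(np.argmax(scores)))
        # Re-fit all signals on the selected atoms and update residuals.
        subdict = D[:, selected]
        coeffs, *_ = np.linalg.lstsq(subdict, Y, rcond=None)
        residual = Y - subdict @ coeffs
    return selected, coeffs

# Toy usage: a random 8x20 dictionary, five 8-dimensional signals,
# reduced to a common 3-atom subspace.
rng = np.random.default_rng(0)
D = rng.standard_normal((8, 20))
D /= np.linalg.norm(D, axis=0)          # unit-norm atoms
Y = rng.standard_normal((8, 5))
atoms, coeffs = somp(D, Y, n_atoms=3)
```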
000100973 6531_ $$aLTS4
000100973 700__ $$0240462$$aKokiopoulou, Effrosyni$$g170201
000100973 700__ $$0241061$$aFrossard, Pascal$$g101475
000100973 7112_ $$aIEEE Int. Conf. on Multimedia & Expo (ICME)$$cBeijing, China$$dJuly 2-5, 2007
000100973 773__ $$tIEEE Int. Conf. on Multimedia & Expo (ICME)
000100973 8564_ $$s191207$$uhttps://infoscience.epfl.ch/record/100973/files/Conf-dimred-lts4.pdf$$zn/a
000100973 909C0 $$0252393$$pLTS4$$xU10851
000100973 909CO $$ooai:infoscience.tind.io:100973$$pconf$$pSTI$$qGLOBAL_SET
000100973 937__ $$aEPFL-CONF-100973
000100973 973__ $$aEPFL$$rREVIEWED$$sPUBLISHED
000100973 980__ $$aCONF