000268680 001__ 268680
000268680 005__ 20190812204806.0
000268680 020__ $$a978-1-4799-8131-1
000268680 0247_ $$2doi$$a10.1109/ICASSP.2019.8683286
000268680 037__ $$aCONF
000268680 245__ $$aStochastic Gradient Descent for Spectral Embedding with Implicit Orthogonality Constraint
000268680 269__ $$a2019
000268680 260__ $$bIEEE$$c2019
000268680 336__ $$aConference Papers
000268680 520__ $$aIn this paper, we propose a scalable algorithm for spectral embedding. The latter is a standard tool for graph clustering. However, its computational bottleneck is the eigendecomposition of the graph Laplacian matrix, which prevents its application to large-scale graphs. Our contribution consists of reformulating spectral embedding so that it can be solved via stochastic optimization. The idea is to replace the orthogonality constraint with an orthogonalization matrix injected directly into the criterion. As the gradient can be computed through a Cholesky factorization, our reformulation allows us to develop an efficient algorithm based on mini-batch gradient descent. Experimental results, on both synthetic and real data, confirm the efficiency of the proposed method in terms of execution speed with respect to similar existing techniques.
000268680 700__ $$0251490$$aEl Gheche, Mireille$$g291868
000268680 700__ $$aChierchia, Giovanni
000268680 700__ $$0241061$$aFrossard, Pascal$$g101475
000268680 7112_ $$aIEEE ICASSP$$cBrighton, UK$$d12-17 May 2019
000268680 773__ $$tProceedings of IEEE ICASSP
000268680 8560_ $$fpascal.frossard@epfl.ch
000268680 85641 $$yView paper in IEEE Xplore$$uhttps://ieeexplore.ieee.org/document/8683286
000268680 909C0 $$pLTS4$$mpascal.frossard@epfl.ch$$0252393$$zMarselli, Béatrice$$xU10851
000268680 909CO $$pconf$$pSTI$$ooai:infoscience.epfl.ch:268680
000268680 960__ $$apascal.frossard@epfl.ch
000268680 961__ $$aalessandra.bianchi@epfl.ch
000268680 973__ $$aEPFL$$rREVIEWED
000268680 980__ $$aCONF
000268680 981__ $$aoverwrite
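Note (not part of the catalogued record): the abstract above describes replacing the orthogonality constraint of spectral embedding with a Cholesky-based orthogonalization injected into the criterion, optimized by mini-batch gradient descent. The following is a minimal PyTorch sketch of that idea, written for illustration only; it is not the authors' code, and the function name, edge-sampling scheme, step size, iteration count, and the small diagonal regularizer eps are assumptions of this sketch.

```python
import torch

def spectral_embed_sgd(edges, weights, n_nodes, dim=8, n_iters=2000,
                       batch_size=1024, lr=1e-2, eps=1e-6):
    """Illustrative sketch: mini-batch SGD spectral embedding with an
    implicit (Cholesky-based) orthogonality constraint.

    edges:   (m, 2) LongTensor of graph edges
    weights: (m,)   FloatTensor of edge weights
    """
    Z = torch.randn(n_nodes, dim, requires_grad=True)
    opt = torch.optim.SGD([Z], lr=lr)

    for _ in range(n_iters):
        # Sample a mini-batch of edges: a stochastic estimate (up to scaling)
        # of the Laplacian quadratic form  sum_ij w_ij ||y_i - y_j||^2.
        idx = torch.randint(0, edges.shape[0], (batch_size,))
        i, j = edges[idx, 0], edges[idx, 1]
        w = weights[idx]

        # Orthogonalization injected into the criterion: with G = Z^T Z and
        # G = C C^T (Cholesky, C lower-triangular), the embedding Y = Z C^{-T}
        # satisfies Y^T Y = I, so no explicit orthogonality constraint is
        # needed. Autograd differentiates through the Cholesky factorization.
        G = Z.t() @ Z + eps * torch.eye(dim)
        C = torch.linalg.cholesky(G)
        Y = torch.linalg.solve_triangular(C, Z.t(), upper=False).t()

        loss = (w * (Y[i] - Y[j]).pow(2).sum(dim=1)).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

    # Return the orthogonalized embedding.
    with torch.no_grad():
        C = torch.linalg.cholesky(Z.t() @ Z + eps * torch.eye(dim))
        return torch.linalg.solve_triangular(C, Z.t(), upper=False).t()
```

The key point mirrored from the abstract is that orthogonality is enforced implicitly, inside the loss, through the Cholesky factor of Z^T Z, which keeps each update compatible with plain mini-batch gradient descent instead of requiring an eigendecomposition of the full graph Laplacian.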