000232446 001__ 232446
000232446 005__ 20190317000855.0
000232446 037__ $$aCONF
000232446 245__ $$aLarge Scale Graph Learning from Smooth Signals
000232446 269__ $$a2017
000232446 260__ $$c2017
000232446 336__ $$aConference Papers
000232446 520__ $$aGraphs are a prevalent tool in data science, as they model the inherent structure of the data, and they have been used successfully in unsupervised and semi-supervised learning. Typically, they are either constructed by connecting nearest samples or learned from data by solving an optimization problem. While graph learning yields higher-quality graphs, it also comes with a higher computational cost: the current state-of-the-art model costs O(n^2) for n samples. In this paper, we show how to scale it, obtaining an approximation with a leading cost of O(n log(n)) and quality that approaches that of the exact graph learning model. Our algorithm uses known approximate nearest neighbor techniques to reduce the number of variables, and it automatically selects the correct parameters of the model, requiring a single intuitive input: the desired edge density.
000232446 6531_ $$aGraphs
000232446 6531_ $$aGraph learning
000232446 6531_ $$aMachine learning
000232446 6531_ $$aUnsupervised learning
000232446 6531_ $$aAlgorithm
000232446 6531_ $$aLarge scale
000232446 700__ $$aKalofolias, Vassilis
000232446 700__ $$aPerraudin, Nathanaël
000232446 8564_ $$s2098452$$uhttps://infoscience.epfl.ch/record/232446/files/large_scale_graph_learning.pdf$$yPreprint$$zPreprint
000232446 909C0 $$0252392$$pLTS2$$xU10380
000232446 909CO $$ooai:infoscience.tind.io:232446$$pconf$$pSTI$$qGLOBAL_SET
000232446 917Z8 $$x179669
000232446 937__ $$aEPFL-CONF-232446
000232446 973__ $$aEPFL$$rREVIEWED$$sSUBMITTED
000232446 980__ $$aCONF