TY  - JOUR
DO - 10.1109/TSP.2016.2602809
AB  - The construction of a meaningful graph plays a crucial role in the success of many graph-based representations and algorithms for handling structured data, especially in the emerging field of graph signal processing. However, a meaningful graph is not always readily available from the data, nor is it easy to define depending on the application domain. In particular, it is often desirable in graph signal processing applications that a graph be chosen such that the data admit certain regularity or smoothness on the graph. In this paper, we address the problem of learning graph Laplacians, which is equivalent to learning graph topologies, such that the input data form graph signals with smooth variations on the resulting topology. To this end, we adopt a factor analysis model for the graph signals and impose a Gaussian probabilistic prior on the latent variables that control these signals. We show that the Gaussian prior leads to an efficient representation that favors the smoothness property of the graph signals. We then propose an algorithm for learning graphs that enforces this property, based on minimizing the variations of the signals on the learned graph. Experiments on both synthetic and real-world data demonstrate that the proposed graph learning framework can efficiently infer meaningful graph topologies from signal observations under the smoothness prior.
T1 - Learning Laplacian Matrix in Smooth Graph Signal Representations
IS - 23
DA - 2016
AU - Dong, Xiaowen
AU - Thanou, Dorina
AU - Frossard, Pascal
AU - Vandergheynst, Pierre
JF - IEEE Transactions on Signal Processing
SP  - 6160
VL - 64
EP  - 6173
PB - Institute of Electrical and Electronics Engineers
PP - Piscataway
ID - 200187
KW - graph learning
KW - signal processing on graphs
KW - representation theory
KW - factor analysis
KW - Gaussian prior
SN - 1053-587X
UR - http://arxiv.org/abs/1406.7842
ER -