Augmenting and Tuning Knowledge Graph Embeddings
Knowledge graph embeddings rank among the most successful methods for link prediction in knowledge graphs, i.e., the task of completing an incomplete collection of relational facts. A downside of these models is their strong sensitivity to model hyperparameters, in particular regularizers, which have to be extensively tuned to reach good performance [Kadlec et al., 2017]. We propose an efficient method for large-scale hyperparameter tuning by interpreting these models in a probabilistic framework. After a model augmentation that introduces per-entity hyperparameters, we use a variational expectation-maximization approach to tune thousands of such hyperparameters with minimal additional cost. Our approach is agnostic to details of the model and results in a new state of the art in link prediction on standard benchmark data.
WOS:000722423500046
2020-01-01
San Diego
Proceedings of Machine Learning Research
115
508
518
REVIEWED
EPFL
Event name | Event place | Event date
 | Tel Aviv, ISRAEL | Jul 22-25, 2019