Title: Augmenting and Tuning Knowledge Graph Embeddings

Authors: Bamler, Robert; Salehi, Farnood; Mandt, Stephan

Dates: 2021-12-18; 2020-01-01

Handle: https://infoscience.epfl.ch/handle/20.500.14299/183888

Web of Science ID: WOS:000722423500046

Abstract: Knowledge graph embeddings rank among the most successful methods for link prediction in knowledge graphs, i.e., the task of completing an incomplete collection of relational facts. A downside of these models is their strong sensitivity to model hyperparameters, in particular regularizers, which have to be extensively tuned to reach good performance [Kadlec et al., 2017]. We propose an efficient method for large-scale hyperparameter tuning by interpreting these models in a probabilistic framework. After a model augmentation that introduces per-entity hyperparameters, we use a variational expectation-maximization approach to tune thousands of such hyperparameters with minimal additional cost. Our approach is agnostic to details of the model and results in a new state of the art in link prediction on standard benchmark data.

Subjects: Computer Science, Artificial Intelligence; Computer Science, Theory & Methods; Mathematics, Applied; Statistics & Probability; Computer Science; Mathematics

Type: text::conference output::conference proceedings::conference paper