Embedding Logical Queries on Knowledge Graphs
Abstract
Learning low-dimensional embeddings of knowledge graphs is a powerful approach used to predict unobserved or missing edges between entities. However, an open challenge in this area is developing techniques that can go beyond simple edge prediction and handle more complex logical queries, which might involve multiple unobserved edges, entities, and variables. For instance, given an incomplete biological knowledge graph, we might want to predict "what drugs are likely to target proteins involved with both diseases X and Y?", a query that requires reasoning about all possible proteins that might interact with diseases X and Y. Here we introduce a framework to efficiently make predictions about conjunctive logical queries, a flexible but tractable subset of first-order logic, on incomplete knowledge graphs. In our approach, we embed graph nodes in a low-dimensional space and represent logical operators as learned geometric operations (e.g., translation, rotation) in this embedding space. By performing logical operations within a low-dimensional embedding space, our approach achieves a time complexity that is linear in the number of query variables, compared to the exponential complexity required by a naive enumeration-based approach. We demonstrate the utility of this framework in two application studies on real-world datasets with millions of relations: predicting logical relationships in a network of drug-gene-disease interactions and in a graph-based representation of social interactions derived from a popular web forum.
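The core idea of the abstract can be illustrated with a minimal sketch: entities live in a shared embedding space, following a relation is a learned geometric operation, and conjunction ("and") combines sub-query embeddings. The sketch below is a simplified illustration, not the paper's exact model; it stands in a TransE-style translation for the projection operator and an element-wise mean for intersection, and all entity/relation names and embeddings are made-up toy data.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 16

# Toy entity and relation embeddings (in a real system these are learned).
entities = {name: rng.normal(size=DIM)
            for name in ["disease_X", "disease_Y", "protein_A", "drug_B"]}
relations = {name: rng.normal(size=DIM)
             for name in ["associated_with", "targets"]}

def project(query_emb, relation):
    """Follow an edge type: modeled here as a translation in embedding space."""
    return query_emb + relations[relation]

def intersect(embs):
    """Conjunction of sub-queries: combine embeddings (element-wise mean here)."""
    return np.mean(embs, axis=0)

def score(query_emb, entity):
    """Rank candidate answers by negative distance to the query embedding."""
    return -np.linalg.norm(query_emb - entities[entity])

# "Entities associated with both disease X and disease Y":
# two projections followed by one intersection.
q = intersect([project(entities["disease_X"], "associated_with"),
               project(entities["disease_Y"], "associated_with")])

# Score all candidate answers against the single query embedding.
ranked = sorted(entities, key=lambda e: score(q, e), reverse=True)
```

Note that answering the query costs one vector operation per edge and logical operator, which is where the linear (rather than exponential) complexity in the number of query variables comes from: no enumeration over intermediate variable bindings is needed.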
 Publication:

arXiv e-prints
 Pub Date:
 June 2018
 arXiv:
 arXiv:1806.01445
 Bibcode:
 2018arXiv180601445H
 Keywords:

 Computer Science - Social and Information Networks;
 Computer Science - Machine Learning;
 Statistics - Machine Learning
 E-Print:
 Published in NeurIPS 2018