Multi-hop Path Queries over Knowledge Graphs with Neural Memory Networks
Author(s)
Wang, Qinyong
Yin, Hongzhi
Wang, Weiqing
Huang, Zi
Guo, Guibing
Nguyen, Quoc Viet Hung
Griffith University Author(s)
Year published
2019
Abstract
There has been increasing research interest in inferring missing information from existing knowledge graphs (KGs), driven by the emergence of a wide range of downstream applications such as question answering systems and search engines. Reasoning over knowledge graphs, which retrieves the correct entities by following a path of multiple consecutive relations (hops) from a starting entity, is an effective approach to this task, but it has rarely been studied. As one attempt, the compositional training method treats constructed multi-hop paths and one-hop relations equally when building training data, and then trains conventional knowledge graph completion models, such as TransE, in a compositional manner on that data. However, it does not incorporate additional information along the paths during training, such as the intermediate entities and their types, which can help guide the reasoning towards the correct destination answers. Moreover, compositional training can only extend existing models that are composable, which significantly limits its applicability. We therefore design a novel model based on the recently proposed neural memory networks, which provide large external memories and flexible writing/reading schemes, to address these problems. Specifically, we first introduce a single network layer, which is then used as the building block of a multi-layer neural network called TravNM, and we develop a flexible memory updating method to write intermediate entity information into the memories during multi-hop reasoning. Finally, extensive experiments on large datasets show the superiority of the proposed TravNM for multi-hop reasoning over knowledge graphs.
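The compositional training baseline the abstract describes can be illustrated with TransE, where a relation is modeled as a translation in embedding space and a multi-hop path is scored by summing its relation vectors. The sketch below is a minimal, illustrative example, not the paper's TravNM model; the entity/relation names, the embedding dimension, and the random toy embeddings are all hypothetical (in practice the embeddings would be learned from KG triples).

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 16  # embedding dimension (illustrative choice)

# Toy embeddings for a few entities and relations (hypothetical names;
# real models learn these from knowledge graph triples).
entities = {name: rng.normal(size=dim) for name in ["paris", "france", "europe"]}
relations = {name: rng.normal(size=dim) for name in ["capital_of", "located_in"]}

def transe_path_score(head, path, tail):
    """Compositional TransE: a k-hop path is scored by summing the relation
    vectors along it, i.e. score = -||h + r1 + ... + rk - t||.
    Higher (less negative) means a more plausible answer."""
    composed = entities[head] + sum(relations[r] for r in path)
    return -np.linalg.norm(composed - entities[tail])

# A 2-hop query: rank candidate answers for
# paris -[capital_of]-> ? -[located_in]-> ?
candidates = ["france", "europe"]
scores = {c: transe_path_score("paris", ["capital_of", "located_in"], c)
          for c in candidates}
best = max(scores, key=scores.get)
```

Note how the path score uses only the composed relation vectors: the intermediate entity (here, whatever lies between `paris` and the answer) never enters the computation. This is exactly the limitation the abstract points out, which TravNM addresses by writing intermediate entity information into an external memory during reasoning.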
Conference Title
Database Systems for Advanced Applications
Volume
11446
Subject
Artificial intelligence
Science & Technology
Computer Science, Information Systems
Computer Science, Theory & Methods