Holistic Representations for Memorization and Inference

Yunpu Ma (LMU/Siemens AG), Marcel Hildebrandt (LMU/Siemens AG), Stephan Baier (LMU) and Volker Tresp (LMU/Siemens AG) presented "Holistic Representations for Memorization and Inference" at the Quantum Machine Learning & Biomimetic Quantum Technologies conference in Bilbao, Spain, which ran from 19-23 March 2018. GDELT's Global Knowledge Graph (GKG) was used as one of the knowledge graph datasets to test their approach.

In this paper we introduce a novel holographic memory model for the distributed storage of complex association patterns and apply it to knowledge graphs. In a knowledge graph, a labelled link connects a subject node with an object node, jointly forming a subject-predicate-object triple. In the presented work, nodes and links have initial random representations, and holistic representations are derived from the initial representations of the nodes and links in their local neighbourhoods. A memory trace is represented in the same vector space as the holistic representations themselves. To reduce the interference between stored information, the initial random vectors must be pairwise quasi-orthogonal. We show that pairwise quasi-orthogonality can be improved by drawing vectors from heavy-tailed distributions, e.g., a Cauchy distribution, and that, as a result, the memory capacity of holistic representations can be significantly improved. Furthermore, we show that, in combination with a simple neural network, the presented holistic representation approach is superior to other methods for link prediction on knowledge graphs.
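The role of heavy-tailed initial vectors can be probed with a few lines of NumPy. The sketch below (dimension and sample count chosen arbitrarily, not taken from the paper) draws random vectors from a Gaussian and from a Cauchy distribution and reports the largest off-diagonal cosine similarity as a rough proxy for pairwise quasi-orthogonality. It is an illustration of the idea only, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_vectors = 512, 1000  # illustrative sizes, not from the paper

def max_abs_cosine(vectors):
    # Normalize rows and return the largest off-diagonal |cosine similarity|:
    # the smaller this value, the closer the set is to pairwise orthogonality.
    unit = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    sims = unit @ unit.T
    np.fill_diagonal(sims, 0.0)
    return np.abs(sims).max()

# Initial random representations from a light-tailed (Gaussian) and a
# heavy-tailed (Cauchy) distribution, as contrasted in the abstract.
gaussian = rng.normal(size=(n_vectors, dim))
cauchy = rng.standard_cauchy(size=(n_vectors, dim))

print("Gaussian max |cos|:", max_abs_cosine(gaussian))
print("Cauchy   max |cos|:", max_abs_cosine(cauchy))
```

Running such a comparison for a range of dimensions is one simple way to check how the worst-case pairwise similarity behaves for the two distributions.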

Read The Full Paper.