LowFER: Low-rank Bilinear Pooling for Link Prediction

Saadullah Amin · Stalin Varanasi · Katherine Ann Dunfield · Günter Neumann


Keywords: [ Matrix/Tensor Methods ] [ Natural Language Processing / Dialogue ] [ Applications - Language, Speech and Dialog ]

Tue 14 Jul 11 a.m. PDT — 11:45 a.m. PDT
Wed 15 Jul midnight PDT — 12:45 a.m. PDT


Knowledge graphs are incomplete by nature, with only a limited number of observed facts from world knowledge represented as structured relations between entities. To partly address this issue, an important task in statistical relational learning is that of link prediction or knowledge graph completion. Both linear and non-linear models have been proposed to solve the problem, with the former being parameter-efficient and interpretable. Bilinear models, while expressive, are prone to overfitting and lead to quadratic growth of parameters in the number of relations. Simpler models have therefore become the standard, imposing certain constraints on the bilinear maps that serve as relation parameters. In this work, we propose a factorized bilinear pooling model, commonly used in multi-modal learning, for better fusion of entities and relations, leading to an efficient and constraint-free model. We prove that our model is fully expressive, providing bounds on the embedding dimensionality and factorization rank. Our model naturally generalizes TuckER (Balazevic et al., 2019), which has been shown to generalize other models, as an efficient low-rank approximation without substantially compromising performance. Due to this low-rank approximation, the model complexity can be controlled by the factorization rank, avoiding the possible cubic growth of TuckER. Empirically, we evaluate on real-world datasets, reaching performance on par with or above the state of the art.
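To make the idea concrete, the factorized bilinear pooling described in the abstract can be sketched as follows. This is an illustrative NumPy sketch under assumed conventions (MFB-style fusion with factor matrices `U`, `V` of factorization rank `k`, followed by non-overlapping k-size sum pooling and an inner product with the tail-entity embedding); the dimension names and layout are assumptions for illustration, not the authors' reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
de, dr, k = 8, 6, 3  # entity dim, relation dim, factorization rank (all illustrative)

# Factor matrices of the low-rank bilinear map (assumed MFB-style layout:
# window j uses columns j*k .. (j+1)*k - 1).
U = rng.normal(size=(de, k * de))  # entity-side factors
V = rng.normal(size=(dr, k * de))  # relation-side factors

def score(h, r, t):
    """Score a triple (h, r, t) via factorized bilinear pooling."""
    fused = (U.T @ h) * (V.T @ r)          # element-wise fusion, size k*de
    pooled = fused.reshape(de, k).sum(-1)  # non-overlapping sum pooling, window size k
    return pooled @ t                      # inner product with the tail embedding

h, r, t = rng.normal(size=de), rng.normal(size=dr), rng.normal(size=de)
s = score(h, r, t)
```

Each pooled component equals a bilinear form h^T W_j r with W_j a sum of k rank-one matrices built from the corresponding columns of `U` and `V`, which is how the factorization rank `k` controls model complexity instead of storing a full bilinear tensor per relation.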
