Creative Commons Attribution 4.0 International license
For over a decade, numerous Knowledge Graph Embedding (KGE) models have been designed and evaluated on reference datasets, with ever-increasing performance. In this paper, we re-evaluate these models with respect to their computational efficiency during training by estimating the computational cost of the procedure, expressed in floating-point operations. We design a cost model based on analytical expressions and apply it to a collection of 20 KGE models representative of the state of the art. We show that dimensionality and parameter efficiency, used in the literature to compare models with each other, are not suitable for evaluating the true cost of models. Through fixed-budget experiments, a novel approach to evaluating KGE models based on cost estimates, we re-assess the relative performance of model families compared to the state of the art. Bilinear models such as ComplEx underperform with a low computational budget, while hyperbolic linear models appear to offer no particular benefit compared to simpler Euclidean models, especially the MuRE model. Neural models, such as ConvE or CompGCN, achieve reasonable performance in the literature, but their high computational cost appears unnecessary when compared with other models. The trade-off between efficiency and expressivity of both linear and neural models remains to be explored further.
@Article{charpenay_et_al:TGDK.4.1.1,
author = {Charpenay, Victor and Zoubeirou A Mayaki, Mansour and Zimmermann, Antoine},
title = {{On the Computational Cost of Knowledge Graph Embeddings}},
journal = {Transactions on Graph Data and Knowledge},
pages = {1:1--1:30},
ISSN = {2942-7517},
year = {2026},
volume = {4},
number = {1},
publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
address = {Dagstuhl, Germany},
URL = {https://drops.dagstuhl.de/entities/document/10.4230/TGDK.4.1.1},
URN = {urn:nbn:de:0030-drops-256863},
doi = {10.4230/TGDK.4.1.1},
annote = {Keywords: Knowledge Graph Embedding, Parameter Efficiency, Computational Budget, Green AI}
}