
HybEA: Hybrid Models for Entity Alignment (2407.02862v2)

Published 3 Jul 2024 in cs.DB

Abstract: Entity Alignment (EA) aims to detect descriptions of the same real-world entities across different Knowledge Graphs (KGs). Several embedding methods have been proposed to rank potentially matching entities of two KGs according to their similarity in the embedding space. However, existing EA embedding methods are challenged by the diverse levels of structural (i.e., neighborhood entities) and semantic (e.g., entity names and literal property values) heterogeneity exhibited by real-world KGs, especially when they span several domains (e.g., DBpedia, Wikidata). Existing methods focus on only one of the two kinds of heterogeneity, depending on the context (mono- vs. multi-lingual). To address this limitation, we propose a flexible framework, HybEA, that hybridizes two models: a novel attention-based factual model co-trained with a state-of-the-art structural model. Our experimental results demonstrate that HybEA outperforms state-of-the-art EA systems, achieving a 16% average relative improvement in Hits@1, ranging from 3.6% to 40% across 5 monolingual datasets, some of which can now be considered solved. We also show that HybEA outperforms state-of-the-art methods on 3 multi-lingual datasets, as well as on 2 datasets that drop the unrealistic, yet widely adopted, one-to-one assumption. Overall, HybEA outperforms all 11 baseline methods on all 3 measures across all 10 datasets evaluated, with a statistically significant difference.
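For context, the ranking-and-evaluation step the abstract refers to (scoring candidate entity pairs by embedding similarity and reporting Hits@k) can be sketched as follows. This is a minimal illustration with hypothetical names, assuming pre-computed entity embeddings and the common benchmark convention that row i of each matrix corresponds to the same gold-aligned entity; it is not the HybEA implementation.

```python
import numpy as np

def hits_at_k(src_emb: np.ndarray, tgt_emb: np.ndarray, k: int = 1) -> float:
    """Rank each source entity's candidates in the target KG by cosine
    similarity and report Hits@k. Assumes row i of src_emb is gold-aligned
    with row i of tgt_emb (the one-to-one setup used by most EA benchmarks)."""
    # L2-normalize rows so the dot product equals cosine similarity.
    src = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    tgt = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    sim = src @ tgt.T  # (n_src, n_tgt) pairwise similarity matrix

    # For each source entity, count candidates that outrank its gold match;
    # a count below k means the gold match sits within the top-k ranking.
    gold = np.diag(sim)
    rank = (sim > gold[:, None]).sum(axis=1)
    return float((rank < k).mean())

# Toy usage with random embeddings (a real EA system learns these from the KGs).
rng = np.random.default_rng(0)
emb = rng.normal(size=(100, 64))
noisy = emb + 0.1 * rng.normal(size=emb.shape)  # perturbed "target KG" copies
print(f"Hits@1 = {hits_at_k(emb, noisy, k=1):.2f}")
```

In a hybrid setup like the one the paper describes, the factual and structural models would each produce such a similarity matrix, and their scores or rankings would be combined; the exact co-training and combination scheme is specified in the paper itself.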
