
EL Embeddings: Geometric construction of models for the Description Logic EL++ (1902.10499v1)

Published 27 Feb 2019 in cs.AI

Abstract: An embedding is a function that maps entities from one algebraic structure into another while preserving certain characteristics. Embeddings are being used successfully for mapping relational data or text into vector spaces where they can be used for machine learning, similarity search, or similar tasks. We address the problem of finding vector space embeddings for theories in the Description Logic $\mathcal{EL}^{++}$ that are also models of the TBox. To find such embeddings, we define an optimization problem that characterizes the model-theoretic semantics of the operators in $\mathcal{EL}^{++}$ within $\mathbb{R}^n$, thereby solving the problem of finding an interpretation function for an $\mathcal{EL}^{++}$ theory given a particular domain $\Delta$. Our approach is mainly relevant to large $\mathcal{EL}^{++}$ theories and knowledge bases such as the ontologies and knowledge graphs used in the life sciences. We demonstrate that our method can be used for improved prediction of protein--protein interactions when compared to semantic similarity measures or knowledge graph embeddings.
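As a rough illustration of the geometric idea, a sketch of one such containment-based loss is shown below. It assumes (as the paper's construction does for atomic subsumption) that each class $C$ is represented as an $n$-ball with center $c_C \in \mathbb{R}^n$ and radius $r_C$, and that an axiom $C \sqsubseteq D$ holds when the ball of $C$ lies inside the ball of $D$. The function name and the `margin` parameter here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def subsumption_loss(c_sub, r_sub, c_sup, r_sup, margin=0.0):
    """Margin-based violation of ball containment for the axiom C ⊑ D.

    Hedged sketch: the loss is zero when the ball of the subclass
    (center c_sub, radius r_sub) fits inside the ball of the superclass
    (center c_sup, radius r_sup), and grows with the amount by which
    the subclass ball protrudes.
    """
    return max(0.0, np.linalg.norm(c_sub - c_sup) + r_sub - r_sup - margin)

# A contained ball incurs zero loss; a protruding ball is penalized.
inside = subsumption_loss(np.array([0.0, 0.0]), 1.0, np.array([0.5, 0.0]), 2.0)
outside = subsumption_loss(np.array([3.0, 0.0]), 1.0, np.array([0.0, 0.0]), 2.0)
```

Minimizing such losses over all TBox axioms is what drives the embedding toward being a geometric model of the theory.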

Authors (4)
  1. Maxat Kulmanov (3 papers)
  2. Wang Liu-Wei (1 paper)
  3. Yuan Yan (10 papers)
  4. Robert Hoehndorf (27 papers)
Citations (100)
