
Truveta Mapper: A Zero-shot Ontology Alignment Framework (2301.09767v3)

Published 24 Jan 2023 in cs.LG, cs.AI, and cs.CL

Abstract: This paper suggests a new perspective on unsupervised Ontology Matching (OM), or Ontology Alignment (OA), by treating it as a translation task. Ontologies are represented as graphs, and translation is performed from a node in the source ontology graph to a path in the target ontology graph. The proposed framework, Truveta Mapper (TM), leverages a multi-task sequence-to-sequence transformer model to perform alignment across multiple ontologies in a zero-shot, unified, and end-to-end manner. Multi-tasking enables the model to implicitly learn the relationships between different ontologies via transfer learning, without requiring any explicit, manually labeled cross-ontology data. It also enables the framework to outperform existing solutions in both runtime latency and alignment quality. The model is pre-trained and fine-tuned only on a publicly available text corpus and intra-ontology data. The proposed solution outperforms state-of-the-art approaches (Edit-Similarity, LogMap, AML, BERTMap, and the new OM frameworks recently presented in the Ontology Alignment Evaluation Initiative, OAEI 2022), offers log-linear complexity, and makes the OM task efficient and more straightforward overall, requiring little post-processing for mapping extension or mapping repair. We are open-sourcing our solution.
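The core idea above, aligning ontologies by translating a source-graph node into a path in the target graph with a sequence-to-sequence transformer, can be sketched in a few lines. The snippet below is a minimal illustration only, not the authors' released implementation: the t5-small checkpoint, the prompt template, and the decoding settings are all stand-in assumptions.

```python
# Minimal sketch of node-to-path translation for ontology alignment.
# The task framing (source node -> target path as seq2seq translation)
# follows the paper; everything concrete below is an assumption.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "t5-small"  # placeholder checkpoint, not the TM model weights
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

def translate_node_to_path(source_node_label: str, target_ontology: str) -> str:
    """Map a source-ontology node label to a path in the target ontology.

    The prompt format and decoding parameters are illustrative; the actual
    framework is pre-trained and fine-tuned on its own serializations.
    """
    prompt = f"translate {source_node_label} to {target_ontology} path:"
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Hypothetical usage: align a clinical concept to a path in another ontology.
print(translate_node_to_path("Myocardial infarction", "ICD-10"))
```

Framing the output as a path rather than a single node is what gives the approach its claimed efficiency: decoding proceeds level by level down the target hierarchy, which is consistent with the log-linear complexity mentioned in the abstract.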

Authors (7)
  1. Mariyam Amir (1 paper)
  2. Murchana Baruah (2 papers)
  3. Mahsa Eslamialishah (1 paper)
  4. Sina Ehsani (4 papers)
  5. Alireza Bahramali (6 papers)
  6. Sadra Naddaf-Sh (2 papers)
  7. Saman Zarandioon (1 paper)
Citations (7)
