
BERT Meets Relational DB: Contextual Representations of Relational Databases (2104.14914v1)

Published 30 Apr 2021 in cs.CL, cs.DB, and cs.LG

Abstract: In this paper, we address the problem of learning low-dimensional representations of entities in relational databases consisting of multiple tables. Embeddings help capture the semantics encoded in the database and can be used in a variety of settings, such as auto-completion of tables, fully neural query processing of relational join queries, seamless handling of missing values, and more. Current work is restricted to a single table, or uses embeddings pretrained on an external corpus, making it unsuitable for real-world databases. In this work, we explore ways of using attention-based models to learn embeddings for entities in a relational database. We are inspired by BERT-style pretraining methods and investigate how they can be extended to representation learning on structured databases. We evaluate our approach on the auto-completion of relational databases and achieve improvements over standard baselines.
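The abstract describes BERT-style masked pretraining applied to relational rows. A minimal sketch of the core idea, assuming one plausible serialization (the token layout, special tokens, and column names below are illustrative, not the paper's actual scheme): each row is linearized into a token sequence of column/value pairs, and a fraction of value tokens are masked so an encoder can be trained to recover them.

```python
import random

def linearize_row(table_name, row):
    """Serialize a relational row into a BERT-style token sequence:
    [CLS] table col value col value ... [SEP] (toy scheme)."""
    tokens = ["[CLS]", table_name]
    for col, val in row.items():
        tokens += [col, str(val)]
    tokens.append("[SEP]")
    return tokens

def mask_cells(tokens, mask_prob=0.15, seed=0):
    """Randomly replace value tokens with [MASK]; return the masked
    sequence and a dict mapping masked positions to original tokens."""
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        # In this toy layout, value tokens sit at odd offsets after
        # the leading [CLS] and table-name tokens.
        is_value = i >= 2 and (i - 2) % 2 == 1 and tok != "[SEP]"
        if is_value and rng.random() < mask_prob:
            targets[i] = tok
            masked.append("[MASK]")
        else:
            masked.append(tok)
    return masked, targets

# Hypothetical example table and row.
row = {"name": "Alice", "dept": "CS", "salary": "90k"}
tokens = linearize_row("employees", row)
masked, targets = mask_cells(tokens, mask_prob=0.5, seed=1)
```

A Transformer encoder pretrained with this objective would then yield contextual embeddings for cells and rows, usable for tasks such as the table auto-completion the paper evaluates.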

Authors (4)
  1. Siddhant Arora (50 papers)
  2. Vinayak Gupta (25 papers)
  3. Garima Gaur (5 papers)
  4. Srikanta Bedathur (41 papers)
Citations (2)
