Are Large Language Models the New Interface for Data Pipelines? (2406.06596v1)

Published 6 Jun 2024 in cs.CL, cs.AI, and cs.DB

Abstract: Large Language Model (LLM) is a term that encompasses various types of models designed to understand and generate human communication. LLMs have gained significant attention due to their ability to process text with human-like fluency and coherence, making them valuable for a wide range of data-related tasks organized as pipelines. Their capabilities in natural language understanding and generation, combined with their scalability, versatility, and state-of-the-art performance, enable innovative applications across various AI-related fields, including eXplainable Artificial Intelligence (XAI), Automated Machine Learning (AutoML), and Knowledge Graphs (KG). Furthermore, we believe these models can extract valuable insights and make data-driven decisions at scale, a practice commonly referred to as Big Data Analytics (BDA). In this position paper, we discuss directions for unlocking synergies among these technologies, which can lead to more powerful and intelligent AI solutions, driving improvements in data pipelines across a wide range of applications and domains that integrate humans, computers, and knowledge.

Authors (8)
  1. Sylvio Barbon Junior (10 papers)
  2. Paolo Ceravolo (13 papers)
  3. Sven Groppe (7 papers)
  4. Mustafa Jarrar (34 papers)
  5. Samira Maghool (5 papers)
  6. Soror Sahri (4 papers)
  7. Maurice Van Keulen (9 papers)
  8. Florence Sèdes (3 papers)
Citations (5)