
From Text to Transformation: A Comprehensive Review of Large Language Models' Versatility (2402.16142v1)

Published 25 Feb 2024 in cs.CL and cs.AI

Abstract: This study explores the expanse of LLMs, such as Generative Pre-Trained Transformer (GPT) and Bidirectional Encoder Representations from Transformers (BERT), across varied domains ranging from technology, finance, and healthcare to education. Despite their established prowess in NLP, these LLMs have not been systematically examined for their impact on domains such as fitness and holistic well-being, urban planning, climate modelling, and disaster management. This review paper, in addition to furnishing a comprehensive analysis of the vast expanse and extent of LLMs' utility in diverse domains, identifies the research gaps and realms where the potential of LLMs is yet to be harnessed. The study uncovers innovative ways in which LLMs can leave a mark in fields such as fitness and well-being, urban planning, climate modelling, and disaster response, which could inspire future research and applications in these avenues.

Authors (6)
  1. Pravneet Kaur (1 paper)
  2. Gautam Siddharth Kashyap (10 papers)
  3. Ankit Kumar (140 papers)
  4. Md Tabrez Nafis (3 papers)
  5. Sandeep Kumar (143 papers)
  6. Vikrant Shokeen (1 paper)
Citations (27)