InternLM-Law: An Open Source Chinese Legal Large Language Model (2406.14887v1)

Published 21 Jun 2024 in cs.CL

Abstract: While LLMs have showcased impressive capabilities, they struggle with addressing legal queries due to the intricate complexities and specialized expertise required in the legal field. In this paper, we introduce InternLM-Law, a specialized LLM tailored for addressing diverse legal queries related to Chinese laws, spanning from responding to standard legal questions (e.g., legal exercises in textbooks) to analyzing complex real-world legal situations. We meticulously construct a dataset in the Chinese legal domain, encompassing over 1 million queries, and implement a data filtering and processing pipeline to ensure its diversity and quality. Our training approach involves a novel two-stage process: initially fine-tuning LLMs on both legal-specific and general-purpose content to equip the models with broad knowledge, followed by exclusive fine-tuning on high-quality legal data to enhance structured output generation. InternLM-Law achieves the highest average performance on LawBench, outperforming state-of-the-art models, including GPT-4, on 13 out of 20 subtasks. We make InternLM-Law and our dataset publicly available to facilitate future research in applying LLMs within the legal domain.
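The abstract's central technique is a two-stage fine-tuning recipe: first train on a mixture of legal and general-purpose data, then continue training exclusively on a curated high-quality legal subset. Below is a minimal sketch of that recipe, assuming a standard Hugging Face Trainer supervised fine-tuning loop. The base checkpoint, dataset file paths, and all hyperparameters are illustrative placeholders, not the paper's actual settings.

```python
# Hypothetical two-stage SFT sketch; paths, checkpoint, and hyperparameters
# are placeholders, not values from the InternLM-Law paper.
from datasets import concatenate_datasets, load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

BASE_MODEL = "internlm/internlm2-7b"  # placeholder base checkpoint

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL, trust_remote_code=True)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL, trust_remote_code=True)

def tokenize(batch):
    # Assumes each record carries a flattened "text" field (query + answer).
    return tokenizer(batch["text"], truncation=True, max_length=2048)

# Causal-LM collator: pads batches and copies input_ids into labels.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

def run_stage(dataset, output_dir, lr):
    Trainer(
        model=model,
        args=TrainingArguments(output_dir=output_dir,
                               num_train_epochs=1,
                               per_device_train_batch_size=4,
                               learning_rate=lr),
        train_dataset=dataset,
        data_collator=collator,
    ).train()

# Stage 1: mix legal-specific and general-purpose data so the model gains
# legal knowledge without losing broad general capability.
legal = load_dataset("json", data_files="legal_sft.jsonl")["train"]      # placeholder path
general = load_dataset("json", data_files="general_sft.jsonl")["train"]  # placeholder path
stage1 = (concatenate_datasets([legal, general])
          .shuffle(seed=42)
          .map(tokenize, batched=True, remove_columns=["text"]))
run_stage(stage1, "stage1", lr=2e-5)

# Stage 2: continue training exclusively on a curated high-quality legal
# subset to sharpen structured legal output generation.
high_quality = (load_dataset("json", data_files="legal_hq.jsonl")["train"]
                .map(tokenize, batched=True, remove_columns=["text"]))
run_stage(high_quality, "stage2", lr=1e-5)
```

The design intuition behind the staging, as described in the abstract, is that the mixed first stage equips the model with broad knowledge while the legal-only second stage specializes it; a lower learning rate in stage 2 (as in this sketch) is one common way to avoid overwriting stage-1 knowledge, though the paper's exact schedule is not given here.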

Authors (12)
  1. Zhiwei Fei (4 papers)
  2. Songyang Zhang (116 papers)
  3. Xiaoyu Shen (73 papers)
  4. Dawei Zhu (46 papers)
  5. Xiao Wang (507 papers)
  6. Maosong Cao (9 papers)
  7. Fengzhe Zhou (7 papers)
  8. Yining Li (29 papers)
  9. Wenwei Zhang (77 papers)
  10. Dahua Lin (336 papers)
  11. Kai Chen (512 papers)
  12. Jidong Ge (17 papers)
Citations (2)