Regex-augmented Domain Transfer Topic Classification based on a Pre-trained Language Model: An application in Financial Domain (2305.18324v1)

Published 23 May 2023 in cs.CL and cs.AI

Abstract: A common way to use large pre-trained language models for downstream tasks is to fine-tune them with additional layers. This may not work well when the downstream domain is specialized while the model has been pre-trained on a generic corpus. In this paper, we discuss the use of regular expression patterns employed as features for domain knowledge during fine-tuning, in addition to domain-specific text. Our experiments on real production data show that fine-tuning with these features improves downstream text classification compared to fine-tuning on domain-specific text alone. We also show that using an attention network for fine-tuning improves results compared to simple linear layers.
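
The abstract only outlines the approach, so below is a minimal PyTorch sketch of the core idea, not the authors' exact architecture: binary regex-match indicators act as domain-knowledge features and are concatenated with the pre-trained encoder's pooled text representation before the classification head. The encoder name (bert-base-uncased), the financial-domain patterns, and the layer sizes are all illustrative assumptions.

```python
# Sketch of regex-augmented fine-tuning for topic classification.
# Assumptions (not from the paper): encoder choice, example patterns,
# and a simple linear head over the concatenated features.
import re
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

PATTERNS = [  # hypothetical financial-domain regex features
    re.compile(r"\$\s?\d[\d,]*(\.\d+)?"),     # dollar amounts
    re.compile(r"\b\d+(\.\d+)?\s?%"),         # percentages
    re.compile(r"\b(?:Q[1-4]|FY\d{2,4})\b"),  # fiscal periods
]

def regex_features(text: str) -> torch.Tensor:
    """One binary slot per pattern: 1.0 if it matches the text, else 0.0."""
    return torch.tensor([1.0 if p.search(text) else 0.0 for p in PATTERNS])

class RegexAugmentedClassifier(nn.Module):
    def __init__(self, encoder_name: str = "bert-base-uncased", num_labels: int = 5):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # Classify over [pooled text embedding ; regex-match features].
        self.head = nn.Linear(hidden + len(PATTERNS), num_labels)

    def forward(self, input_ids, attention_mask, regex_feats):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        pooled = out.last_hidden_state[:, 0]  # [CLS] token embedding
        combined = torch.cat([pooled, regex_feats], dim=-1)
        return self.head(combined)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = RegexAugmentedClassifier()
text = "Revenue grew 12% to $4.2B in Q3."
enc = tokenizer(text, return_tensors="pt", truncation=True)
logits = model(enc["input_ids"], enc["attention_mask"],
               regex_features(text).unsqueeze(0))
```

The linear head here is the simplest stand-in; per the abstract, the authors find that an attention-based fine-tuning head outperforms simple linear layers, so the `nn.Linear` could be replaced by an attention layer over the combined features.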

Authors (4)
  1. Vanessa Liao (1 paper)
  2. Syed Shariyar Murtaza (1 paper)
  3. Yifan Nie (8 papers)
  4. Jimmy Lin (208 papers)
