
Domain-Specific Language Model Post-Training for Indonesian Financial NLP (2310.09736v1)

Published 15 Oct 2023 in cs.CL and cs.AI

Abstract: BERT and IndoBERT have achieved impressive performance on several NLP tasks, and there have been several investigations of their adaptation to specialized domains, especially for the English language. We focus on the financial domain and the Indonesian language, performing post-training on pre-trained IndoBERT for the financial domain using a small-scale Indonesian financial corpus. In this paper, we construct an Indonesian self-supervised financial corpus, an Indonesian financial sentiment analysis dataset, and an Indonesian financial topic classification dataset, and we release a family of BERT models for financial NLP. We also evaluate the effectiveness of domain-specific post-training on the sentiment analysis and topic classification tasks. Our findings indicate that post-training increases the effectiveness of a language model when it is fine-tuned on domain-specific downstream tasks.

Authors (4)
  1. Ni Putu Intan Maharani (2 papers)
  2. Yoga Yustiawan (1 paper)
  3. Fauzy Caesar Rochim (1 paper)
  4. Ayu Purwarianti (39 papers)