Fine-Tuning Transformer Models for Enhanced Financial Sentiment Detection
Published: July 20, 2025
Abstract
Financial text mining is gaining popularity, and advances in deep learning models pretrained on generic corpora have shown promising results in financial applications, including sentiment analysis. Financial sentiment research remains challenging, however, due to the scarcity of labeled data and the specialized terminology of the financial domain, which makes general-purpose deep learning models less effective. This work aims to improve financial text mining performance through NLP transfer learning: pretrained language models require only limited labeled samples, and further training them on domain-specific corpora can improve their performance. We present an enhanced model family, \finsentiment, which adapts pretrained models such as BERT, XLNet, RoBERTa, GPT, Llama, and T5 to financial domain corpora, yielding the finance-specific models Fin-BERT, Fin-XLNet, Fin-RoBERTa, Fin-GPT, Fin-Llama, and Fin-T5. We propose jointly training these models on financial and general corpora. Our finance-specific sentiment models outperform their general-purpose counterparts across three datasets, even when fine-tuned on a smaller training set, and improve on all examined metrics for these datasets. The results show that RoBERTa pretrained on financial corpora is particularly effective and robust. We demonstrate that NLP transfer learning approaches effectively address the challenges of financial sentiment analysis.
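The workflow the abstract describes, adapting a pretrained Transformer encoder with a classification head and fine-tuning it on labeled financial sentiment examples, can be sketched as follows. This is a minimal, self-contained illustration only: it uses a tiny randomly initialized encoder as a stand-in for a pretrained model such as BERT or RoBERTa, with synthetic token IDs in place of tokenized financial sentences; all sizes and names are illustrative, not the paper's actual configuration.

```python
# Sketch of fine-tuning an encoder for 3-way financial sentiment
# (positive / neutral / negative). A small randomly initialized
# Transformer stands in for a pretrained checkpoint; in practice one
# would load pretrained weights and a real tokenizer instead.
import torch
import torch.nn as nn

torch.manual_seed(0)
VOCAB, DIM, CLASSES, SEQ = 100, 32, 3, 16  # toy sizes

class SentimentClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        layer = nn.TransformerEncoderLayer(d_model=DIM, nhead=4,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(DIM, CLASSES)  # sentiment classification head

    def forward(self, ids):
        h = self.encoder(self.embed(ids))      # contextual token states
        return self.head(h.mean(dim=1))        # mean-pool, then classify

model = SentimentClassifier()
opt = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Synthetic stand-in for tokenized, labeled financial sentences.
x = torch.randint(0, VOCAB, (64, SEQ))
y = torch.randint(0, CLASSES, (64,))

losses = []
for step in range(30):  # a few fine-tuning steps
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
    losses.append(loss.item())

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The same head-plus-encoder structure and training loop apply when the encoder is a genuinely pretrained checkpoint; the paper's domain adaptation step additionally continues pretraining on financial corpora before this supervised fine-tuning.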
Authors
Ahmed Laarbaoui
National School of Computer Science and Systems Analysis, Mohammed V University in Rabat, Rabat, Morocco
NOURI Hicham, HABBAT Nassera