Enhancing Financial Named Entity Recognition through Adaptive Few-Shot Learning: A Comparative Study of Pre-trained Language Models
DOI: https://doi.org/10.69987/JACS.2024.40702

Keywords: Financial NER, Few-shot Learning, Transfer Learning, Pre-trained Language Models

Abstract
Financial document processing faces significant challenges in extracting structured information from diverse document types, including loan applications, financial statements, and regulatory filings. This paper presents an adaptive few-shot learning framework for Named Entity Recognition (NER) in financial documents, addressing the critical need to reduce annotation requirements while maintaining high extraction accuracy. We conduct a comprehensive comparative analysis of pre-trained language models, including BERT, RoBERTa, and domain-specific FinBERT variants, under few-shot learning scenarios. Our methodology integrates meta-learning approaches with prompt-based optimization strategies, enabling effective entity recognition from minimal labeled examples. Experimental results on financial document datasets demonstrate that the adaptive framework achieves a 91.3% F1-score with only 10 labeled examples per entity type, a 68% reduction in annotation requirements compared to traditional supervised approaches. The proposed approach benefits financial institutions by reducing manual processing costs while maintaining regulatory compliance standards.
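To make the meta-learning idea in the abstract concrete, the sketch below shows the nearest-prototype classification step that underlies many few-shot NER methods: average the few labeled "support" embeddings per entity type into prototypes, then assign each query token to the nearest prototype. This is a minimal illustration, not the paper's implementation; in practice the embeddings would come from a pre-trained encoder such as BERT or FinBERT, whereas here synthetic vectors and the entity types `ORG` and `AMOUNT` stand in.

```python
import numpy as np

def build_prototypes(support_embeddings, support_labels):
    """Average the few labeled support embeddings per entity type."""
    prototypes = {}
    for label in set(support_labels):
        vecs = [e for e, l in zip(support_embeddings, support_labels) if l == label]
        prototypes[label] = np.mean(vecs, axis=0)
    return prototypes

def classify(query_embedding, prototypes):
    """Assign the entity type whose prototype is nearest (Euclidean distance)."""
    return min(prototypes,
               key=lambda lbl: np.linalg.norm(query_embedding - prototypes[lbl]))

# Toy demo: 4-dim stand-in "embeddings", 2 entity types, 2 support examples each
rng = np.random.default_rng(0)
org_center = np.array([1.0, 0.0, 0.0, 0.0])
amt_center = np.array([0.0, 1.0, 0.0, 0.0])
support = [org_center + 0.1 * rng.standard_normal(4) for _ in range(2)] + \
          [amt_center + 0.1 * rng.standard_normal(4) for _ in range(2)]
labels = ["ORG", "ORG", "AMOUNT", "AMOUNT"]
protos = build_prototypes(support, labels)

query = amt_center + 0.1 * rng.standard_normal(4)
print(classify(query, protos))  # prints: AMOUNT
```

Because prototypes are simple averages, adding a new entity type requires only a handful of labeled embeddings rather than retraining, which is the property that drives the annotation savings reported above.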
