Enhancing Aspect-Based Sentiment Analysis with Fine-Tuned BERT for Multi-Domain Text

  • Unique Paper ID: 179747
  • Volume: 11
  • Issue: 12
  • PageNo: 8819-8823
  • Abstract:
  • Aspect-Based Sentiment Analysis (ABSA) is a fine-grained approach to understanding sentiments by identifying specific aspects of an entity and the sentiments expressed towards them. This paper explores the application of deep learning techniques to ABSA, addressing the challenges of handling unstructured and ambiguous text from diverse domains such as social media, product reviews, and customer feedback. We propose a deep learning-based model utilizing the pre-trained Bidirectional Encoder Representations from Transformers (BERT) for ABSA tasks. Our methodology reformulates input sentences to explicitly highlight aspect terms and fine-tunes a bert-base-uncased model using labeled sentiment data. The input format is designed to integrate aspect-specific context by pairing aspect terms with their corresponding review sentences, enabling the model to disambiguate sentiment in multi-aspect scenarios. The dataset is preprocessed and encoded using BERT's tokenizer, and class labels are transformed using label encoding for multi-class classification. We train the model using the AdamW optimizer and monitor performance using categorical cross-entropy loss. Experimental results show that the proposed model achieves high accuracy in distinguishing positive, negative, and neutral sentiments at the aspect level.
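The aspect-conditioned input reformulation and label encoding described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' released code: the example review, the aspect terms, and the label ordering are assumptions. With the Hugging Face transformers library the same pairing is normally produced by passing the aspect and the sentence as a sentence pair, e.g. `tokenizer(aspect, sentence, truncation=True)`; the sketch below builds the equivalent `[CLS]`/`[SEP]` string by hand so it stays self-contained.

```python
def build_absa_input(aspect: str, sentence: str) -> str:
    """Pair an aspect term with its review sentence, BERT sentence-pair style.

    BERT's tokenizer would produce this layout automatically when given
    two text segments; we spell it out here for illustration.
    """
    return f"[CLS] {aspect} [SEP] {sentence} [SEP]"

# Label encoding for multi-class sentiment classification.
# The class order is an assumption for this sketch; the paper uses
# label encoding over {positive, negative, neutral}.
LABELS = ["negative", "neutral", "positive"]
LABEL2ID = {name: i for i, name in enumerate(LABELS)}

def encode_label(label: str) -> int:
    """Map a sentiment class name to its integer id."""
    return LABEL2ID[label]

# A multi-aspect review yields one training example per aspect term,
# which is what lets the model disambiguate sentiment per aspect:
review = "The battery life is great but the screen scratches easily."
examples = [
    (build_absa_input("battery life", review), encode_label("positive")),
    (build_absa_input("screen", review), encode_label("negative")),
]
```

In a full pipeline, each such pair would be tokenized and fed to a fine-tuned bert-base-uncased classification head, trained with AdamW against a categorical cross-entropy objective over the encoded labels.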

Copyright & License

Copyright © 2025. The authors retain the copyright of this article. This article is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

BibTeX

@article{179747,
        author = {Madhuri Saxena and Anurag Shrivastava},
        title = {Enhancing Aspect-Based Sentiment Analysis with Fine-Tuned BERT for Multi-Domain Text},
        journal = {International Journal of Innovative Research in Technology},
        year = {2025},
        volume = {11},
        number = {12},
        pages = {8819--8823},
        issn = {2349-6002},
        url = {https://ijirt.org/article?manuscript=179747},
        abstract = {Aspect-Based Sentiment Analysis (ABSA) is a fine-grained approach to understanding sentiments by identifying specific aspects of an entity and the sentiments expressed towards them. This paper explores the application of deep learning techniques to ABSA, addressing the challenges of handling unstructured and ambiguous text from diverse domains such as social media, product reviews, and customer feedback. We propose a deep learning-based model utilizing the pre-trained Bidirectional Encoder Representations from Transformers (BERT) for ABSA tasks. Our methodology reformulates input sentences to explicitly highlight aspect terms and fine-tunes a bert-base-uncased model using labeled sentiment data. The input format is designed to integrate aspect-specific context by pairing aspect terms with their corresponding review sentences, enabling the model to disambiguate sentiment in multi-aspect scenarios. The dataset is preprocessed and encoded using BERT's tokenizer, and class labels are transformed using label encoding for multi-class classification. We train the model using the AdamW optimizer and monitor performance using categorical cross-entropy loss. Experimental results show that the proposed model achieves high accuracy in distinguishing positive, negative, and neutral sentiments at the aspect level.},
        keywords = {Aspect-Based Sentiment Analysis, Deep Learning, BERT, Transformer Models, Sentiment Analysis},
        month = {May},
        }
