Exploiting BERT for End-to-End Aspect-based Sentiment Analysis
Abstract
In this paper, we investigate the modeling power of contextualized embeddings from pre-trained language models, e.g., BERT, on the E2E-ABSA task. Specifically, we build a series of simple yet insightful neural baselines for E2E-ABSA. The experimental results show that, even with a simple linear classification layer, our BERT-based architecture can outperform state-of-the-art works. In addition, we standardize the comparative study by consistently using a hold-out validation set for model selection, a practice largely ignored in previous work. Our work can therefore serve as a BERT-based benchmark for E2E-ABSA.
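The simplest baseline the abstract mentions stacks a linear classification layer on top of BERT's per-token contextualized embeddings, predicting one unified tag (boundary + sentiment) per token. The following is a minimal NumPy sketch of that idea, not the authors' code: the tag set, random embeddings standing in for BERT output, and the weight shapes are all illustrative assumptions.

```python
import numpy as np

# Hypothetical unified E2E-ABSA tag set (illustrative, not the paper's exact scheme):
# each tag fuses an aspect boundary (B/I/O) with a sentiment polarity.
TAGS = ["O", "B-POS", "I-POS", "B-NEG", "I-NEG", "B-NEU", "I-NEU"]

def linear_head(hidden, W, b):
    """Token-level linear classifier: logits = hidden @ W + b."""
    return hidden @ W + b

def predict_tags(hidden, W, b):
    """Greedy per-token decoding: argmax over the tag logits."""
    logits = linear_head(hidden, W, b)
    return [TAGS[i] for i in logits.argmax(axis=-1)]

rng = np.random.default_rng(0)
seq_len, dim = 6, 768                          # 768 = BERT-base hidden size
hidden = rng.standard_normal((seq_len, dim))   # stand-in for BERT's contextual output
W = rng.standard_normal((dim, len(TAGS))) * 0.02
b = np.zeros(len(TAGS))

tags = predict_tags(hidden, W, b)
print(tags)  # one unified tag per input token
```

In the paper's setting the `hidden` matrix would come from a fine-tuned BERT encoder; the point of the baseline is that even this single affine map over those embeddings is competitive.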
- Publication: arXiv e-prints
- Pub Date: October 2019
- DOI: 10.48550/arXiv.1910.00883
- arXiv: arXiv:1910.00883
- Bibcode: 2019arXiv191000883L
- Keywords: Computer Science - Computation and Language
- E-Print: NUT workshop@EMNLP-IJCNLP-2019