Enhanced Word Representations for Bridging Anaphora Resolution
Abstract
Most current models of word representations (e.g., GloVe) successfully capture fine-grained semantics. However, the semantic similarity encoded in these word embeddings is not suitable for resolving bridging anaphora, which requires knowledge of associative similarity (i.e., relatedness) rather than the semantic similarity that holds between synonyms or hypernyms. We create word embeddings (embeddings_PP) that capture such relatedness by exploiting the syntactic structure of noun phrases. We demonstrate that using embeddings_PP alone achieves an accuracy of around 30% for bridging anaphora resolution on the ISNotes corpus. Furthermore, we achieve a substantial gain over the state-of-the-art system (Hou et al., 2013) for bridging antecedent selection.
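The abstract only states that embeddings_PP are built from the syntactic structure of noun phrases. The following is a minimal, hypothetical sketch of that general idea, not the authors' actual pipeline: extract head–modifier pairs from prepositional and possessive noun phrases (e.g., "the door of the house" → ("door_PP", "house")) and train skip-gram vectors on these two-word pairs so that a head noun ends up close to the nouns it is typically related to. The pattern extraction rules, the "_PP" suffix handling, and the use of spaCy and gensim here are all assumptions made for illustration.

```python
# Hypothetical sketch of relatedness-oriented embeddings from NP patterns.
# Assumes spaCy ("en_core_web_sm") and gensim are installed; the extraction
# rules are illustrative and do not reproduce the paper's exact method.
import spacy
from gensim.models import Word2Vec

nlp = spacy.load("en_core_web_sm")

def extract_np_pairs(text):
    """Extract (head_PP, modifier) pairs from prepositional and possessive NPs,
    e.g. "the door of the house" -> ["door_PP", "house"]."""
    pairs = []
    for tok in nlp(text):
        # "X of Y": Y is the object of a preposition attached to noun X
        if tok.dep_ == "pobj" and tok.head.dep_ == "prep":
            head = tok.head.head
            if head.pos_ == "NOUN" and tok.pos_ == "NOUN":
                pairs.append([head.lemma_ + "_PP", tok.lemma_])
        # "Y's X": Y is a possessive modifier of noun X
        if tok.dep_ == "poss" and tok.head.pos_ == "NOUN":
            pairs.append([tok.head.lemma_ + "_PP", tok.lemma_])
    return pairs

corpus = [
    "The residents of the village protested.",
    "The door of the house was open.",
    "The company's employees went on strike.",
]
pair_sentences = [p for text in corpus for p in extract_np_pairs(text)]

# Train skip-gram vectors on the two-word "sentences"; with window=1 the head
# and its modifier only ever predict each other, which encodes relatedness
# (associative similarity) rather than synonym/hypernym similarity.
model = Word2Vec(pair_sentences, vector_size=50, window=1,
                 min_count=1, sg=1, epochs=50)
```

In a sketch like this, querying the neighbours of "door_PP" would favour nouns that typically stand in a part-of or belongs-to relation to "door", which is the kind of signal bridging antecedent selection needs.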
- Publication:
- arXiv e-prints
- Pub Date:
- March 2018
- DOI:
- 10.48550/arXiv.1803.04790
- arXiv:
- arXiv:1803.04790
- Bibcode:
- 2018arXiv180304790H
- Keywords:
- Computer Science - Computation and Language
- E-Print:
- To appear in NAACL2018 (the final version will be forthcoming)