[2023 SIGIR] Exploiting Ubiquitous Mentions in Document-Level Relation Extraction

Ruoyu Zhang's paper "Exploiting Ubiquitous Mentions in Document-Level Relation Extraction" has been accepted by SIGIR 2023.

Recent years have witnessed the transition from sentence-level to document-level relation extraction (RE), bringing new formulations, new methods, and new insights. Yet the fundamental concept of a mention remains neither well considered nor well defined. Current datasets usually use automatically detected named entities as mentions, which leads to the missing reference problem. We show that this phenomenon hinders models' reasoning abilities. To address it, we propose incorporating coreferences (e.g., pronouns and common nouns) into mentions, and on this basis we refine and re-annotate the widely used DocRED benchmark as R-DocRED. We evaluate various methods and conduct thorough experiments to demonstrate the efficacy of our formulation. Specifically, the results indicate that incorporating coreferences helps reduce long-term dependencies, further improving models' robustness and generalization under adversarial and low-resource settings. The new dataset is made publicly available for future research.