Dongqi Pu

Doctoral researcher in Computational Linguistics

Campus C7.4, Saarland University, 66123 Saarbrücken, Germany

dongqi.me [AT] gmail.com

Incorporating Distributions of Discourse Structure for Long Document Abstractive Summarization

Abstract:

For text summarization, the role of discourse structure is pivotal in discerning the core content of a text. Regrettably, prior studies on incorporating Rhetorical Structure Theory (RST) into transformer-based summarization models only consider the nuclearity annotation, thereby overlooking the variety of discourse relation types. This paper introduces "RSTformer", a novel summarization model that comprehensively incorporates both the types and uncertainty of rhetorical relations. Our RST-attention mechanism, rooted in document-level rhetorical structure, is an extension of the recently devised Longformer framework. Through rigorous evaluation, the model proposed herein exhibits significant superiority over state-of-the-art models, as evidenced by its notable performance on several automatic metrics and human evaluation.
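
The abstract describes biasing the summarizer's attention with a distribution over RST relation types between text spans. The following is a minimal, hedged sketch of that general idea in PyTorch; class and argument names such as RSTBiasedAttention and rst_relation_probs are illustrative assumptions, not the actual RSTformer API, and the sketch omits the paper's Longformer windowing and exact formulation (see the repository linked below for the real implementation).

```python
# Illustrative sketch: single-head attention whose scores receive an additive
# bias derived from a per-token-pair distribution over RST relation types.
import torch
import torch.nn as nn
import torch.nn.functional as F


class RSTBiasedAttention(nn.Module):
    """Toy attention layer biased by RST relation-type distributions.
    Hypothetical names; not the paper's exact mechanism."""

    def __init__(self, hidden_dim: int, num_relations: int):
        super().__init__()
        self.q = nn.Linear(hidden_dim, hidden_dim)
        self.k = nn.Linear(hidden_dim, hidden_dim)
        self.v = nn.Linear(hidden_dim, hidden_dim)
        # Projects each pair's relation distribution to a scalar attention bias.
        self.rel_bias = nn.Linear(num_relations, 1, bias=False)
        self.scale = hidden_dim ** -0.5

    def forward(self, x: torch.Tensor, rst_relation_probs: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden_dim)
        # rst_relation_probs: (batch, seq_len, seq_len, num_relations),
        #   e.g. parser posteriors over relation types between token pairs.
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = torch.einsum("bqd,bkd->bqk", q, k) * self.scale
        # Additive bias from the (uncertain) discourse-relation distribution.
        scores = scores + self.rel_bias(rst_relation_probs).squeeze(-1)
        attn = F.softmax(scores, dim=-1)
        return torch.einsum("bqk,bkd->bqd", attn, v)


if __name__ == "__main__":
    batch, seq_len, hidden_dim, num_relations = 2, 16, 32, 18
    layer = RSTBiasedAttention(hidden_dim, num_relations)
    x = torch.randn(batch, seq_len, hidden_dim)
    rel = torch.softmax(torch.randn(batch, seq_len, seq_len, num_relations), dim=-1)
    print(layer(x, rel).shape)  # torch.Size([2, 16, 32])
```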

RSTformer

Code:

Code is available at: https://github.com/dongqi-me/RSTformer

Citation:

Incorporating Distributions of Discourse Structure for Long Document Abstractive Summarization (Pu et al., ACL 2023)