[3Rin4-81] Length-controllable Abstractive Summarization by Guiding with Summary Prototype
Keywords: Abstractive Summarization, Length Control, Extraction of Important Words
We propose a new length-controllable abstractive summarization model. Recent state-of-the-art abstractive summarization models based on encoder-decoder architectures generate only one summary per source text. However, controllability of the summary, especially of its length, is important for practical applications. Previous studies on length-controllable abstractive summarization incorporate length embeddings into the decoder module to control the summary length. Unlike these models, our model incorporates a word-level extractive module into the encoder-decoder model instead of length embeddings. This module identifies the parts of the source text that should be included in the summary under a given length constraint. Since the extractive module guides both the content and the length of the summary, our model can generate an informative and length-controlled summary. Experiments on the CNN/Daily Mail and NEWSROOM datasets show that our model outperformed previous models in length-controlled settings.
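The following is a minimal sketch of the core idea rather than the authors' implementation: a word-level extractor scores every source token, the top-scoring tokens within the desired length budget form a summary prototype, and that prototype is fed to the decoder alongside the encoded source. The module sizes, the GRU choice, and the way the prototype is pooled and concatenated are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class PrototypeGuidedSummarizer(nn.Module):
    """Toy encoder-decoder whose decoder is guided by a token-level prototype."""

    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Word-level extractive module: one importance score per source token.
        self.extractor = nn.Linear(hid_dim, 1)
        self.decoder = nn.GRU(emb_dim + hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, src_ids, tgt_ids, length_budget):
        src_emb = self.embed(src_ids)                        # (B, S, E)
        enc_states, enc_last = self.encoder(src_emb)         # (B, S, H)

        # Score every source token and keep the top `length_budget` tokens
        # as the summary prototype; the budget controls the summary length.
        scores = self.extractor(enc_states).squeeze(-1)      # (B, S)
        top_idx = scores.topk(length_budget, dim=1).indices  # (B, K)
        proto_states = torch.gather(
            enc_states, 1,
            top_idx.unsqueeze(-1).expand(-1, -1, enc_states.size(-1)))

        # Guide the decoder with a pooled prototype representation.
        proto_ctx = proto_states.mean(dim=1, keepdim=True)   # (B, 1, H)
        tgt_emb = self.embed(tgt_ids)                        # (B, T, E)
        dec_in = torch.cat(
            [tgt_emb, proto_ctx.expand(-1, tgt_emb.size(1), -1)], dim=-1)
        dec_states, _ = self.decoder(dec_in, enc_last)
        return self.out(dec_states), scores                  # logits + token scores


# Toy usage: request 10-token summaries for a batch of 50-token sources.
model = PrototypeGuidedSummarizer(vocab_size=1000)
src = torch.randint(0, 1000, (2, 50))
tgt = torch.randint(0, 1000, (2, 10))
logits, token_scores = model(src, tgt, length_budget=10)
```

In this sketch the length budget only determines how many source tokens enter the prototype; the paper's point is that such a prototype constrains both what the decoder talks about and how long the generated summary is.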