
Summarization in NLP: Get straight to the point


Abstract

Sequence-to-sequence models have introduced promising new approaches to abstractive summarization. However, they have two shortcomings:

  • liable to reproduce factual details inaccurately

  • tend to repeat themselves

The authors propose a novel architecture that builds on the standard sequence-to-sequence attentional model.

First, we use a hybrid pointer-generator network that can copy words from the source text via pointing, which aids accurate reproduction of information, while retaining the ability to produce novel words through the generator.
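To make the copy mechanism concrete, here is a minimal NumPy sketch of the mixture described above: the final word distribution is a weighted combination of the generator's vocabulary distribution and the attention-weighted copy distribution over source tokens. The names (`final_distribution`, `p_gen`, and so on) are illustrative, not from the paper's code, and the sketch keeps a fixed vocabulary rather than extending it with out-of-vocabulary source words as the full model does.

```python
import numpy as np

def final_distribution(p_gen, p_vocab, attention, source_ids):
    """Mix the generation and copy distributions for one decoder step.

    p_gen      : scalar in [0, 1], probability of generating vs. copying
    p_vocab    : (vocab_size,) softmax over the fixed vocabulary
    attention  : (src_len,) attention weights over source positions
    source_ids : (src_len,) vocabulary id of each source token
    """
    p_final = p_gen * p_vocab
    # Scatter the copy mass onto the ids that occur in the source;
    # a token appearing several times accumulates all of its weights.
    np.add.at(p_final, source_ids, (1.0 - p_gen) * attention)
    return p_final

# Toy example: 5-word vocabulary, 3 source tokens (id 2 appears twice).
p_vocab = np.array([0.1, 0.4, 0.2, 0.2, 0.1])
attention = np.array([0.7, 0.2, 0.1])
source_ids = np.array([2, 2, 4])
print(final_distribution(0.6, p_vocab, attention, source_ids))
# [0.06 0.24 0.48 0.12 0.10] -- still sums to 1
```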

Second, we use coverage to keep track of what has been summarized, which discourages repetition.
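The coverage idea can likewise be sketched in a few lines: keep a running sum of past attention distributions and penalize any new attention that overlaps what has already been attended to, which is the min-overlap penalty the paper describes. Again, the names here are illustrative.

```python
import numpy as np

def coverage_loss(attentions):
    """Given per-step attention distributions over the source
    (a list of (src_len,) arrays), accumulate a coverage vector and
    penalize attending again to positions already covered.
    """
    coverage = np.zeros_like(attentions[0])
    total = 0.0
    for attn in attentions:
        # Overlap between this step's attention and accumulated coverage.
        total += np.minimum(attn, coverage).sum()
        coverage += attn
    return total

# Repeatedly attending to position 0 is penalized.
attns = [np.array([0.9, 0.1, 0.0]),
         np.array([0.9, 0.1, 0.0])]
print(coverage_loss(attns))  # 1.0: heavy overlap on the second step
```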

We apply our model to the CNN / Daily Mail summarization task, outperforming the current abstractive state-of-the-art by at least 2 ROUGE points.
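For a sense of scale, a "ROUGE point" refers to the ROUGE metric family, typically reported on a 0–100 scale. Below is a simplified ROUGE-1 F1 computation based on unigram overlap; the official toolkit also applies stemming and supports multiple references, so treat this as an illustration only, with the function name being my own.

```python
from collections import Counter

def rouge_1_f1(candidate, reference):
    """Simplified ROUGE-1 F1: unigram overlap between token lists."""
    cand, ref = Counter(candidate), Counter(reference)
    overlap = sum((cand & ref).values())  # clipped matching counts
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(rouge_1_f1("the cat sat on the mat".split(),
                 "the cat lay on the mat".split()))  # ~0.833
```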

Introduction

At a high level, summarization methods divide into extractive approaches, which select and stitch together spans of the source text, and abstractive approaches, which generate new phrasing; the model above is abstractive.

```mermaid
graph TB
    sum{Summarization}
    sum --> abs[Abstractive]
    sum --> ext[Extractive]
```
