A Neural Attention Model for Abstractive Sentence Summarization


TLDR; The authors apply a neural sequence-to-sequence (seq2seq) model to abstractive sentence summarization. The decoder is conditioned on the input via an attention mechanism (soft alignment) rather than a single fixed encoding.
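The soft alignment idea can be sketched as follows: score each input position against the current decoder state, softmax the scores into weights, and take the weighted average of the encoder states as the context vector. This is a minimal NumPy sketch of generic soft attention, not the paper's exact parameterization; all names here are illustrative.

```python
import numpy as np

def soft_attention(encoder_states, decoder_state):
    """Soft alignment: weight input positions by similarity to the decoder state.

    encoder_states: (T, d) matrix, one row per input token.
    decoder_state:  (d,) current decoder representation.
    Returns the attention weights over positions and the context vector.
    """
    scores = encoder_states @ decoder_state          # (T,) dot-product scores
    scores -= scores.max()                           # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax over input positions
    context = weights @ encoder_states               # (d,) soft-aligned context
    return weights, context
```

The context vector, rather than a single encoder summary, then feeds the next-word distribution, so the model can attend to different input words at each output step.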

Key Points

  • Summaries are generated at the sentence level, not the paragraph level.
  • Summaries have a fixed output length.
  • Decoding uses a beam-search decoder.
  • Extractive tuning of the scoring function encourages the model to copy words from the input sequence.
  • Training data: headline + first-sentence pairs.
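The beam-search decoding mentioned above can be sketched generically: keep the k highest-scoring partial summaries, extend each with its candidate next words, and prune back to k. This is a hypothetical standalone sketch (the `step_fn` interface is an assumption, not the paper's API); `step_fn` maps a prefix to log-probabilities over next tokens.

```python
def beam_search(step_fn, start_token, end_token, beam_size=4, max_len=10):
    """Generic beam-search decoder sketch.

    step_fn(prefix) -> dict mapping next token -> log-probability.
    Keeps the `beam_size` highest-scoring partial outputs at each step.
    """
    beams = [([start_token], 0.0)]  # (token sequence, cumulative log-prob)
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq[-1] == end_token:
                candidates.append((seq, score))  # finished hypotheses carry over
                continue
            for tok, logp in step_fn(seq).items():
                candidates.append((seq + [tok], score + logp))
        # prune: keep only the top-k hypotheses by cumulative score
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_size]
        if all(seq[-1] == end_token for seq, _ in beams):
            break
    return beams[0][0]
```

With beam_size=1 this reduces to greedy decoding; larger beams trade compute for better approximate argmax over full summaries.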