Disclaimer

This is not an official Google product. This is work-in-progress research code; it is completely unsupported and may even be unfinished.

Models for extractive summarization

Inference, learning, and evaluation code for extractive summarization.

Models currently have two main components: Extractors and Losses.

The main model is CompressiveSummarizerModel. It works in three stages (a minimal sketch of how the pieces fit together follows this list):

  1. The ModelInputs and SummarizerFeatures classes extract RNN embeddings and a score for each token in an input article.
  2. The scores are ingested by an Extractor to compute extractions, along with likelihoods, samples, and marginals.
  3. An ExtractorLoss provides gradients to the Extractor and SummarizerFeatures models from supervision.
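
Below is a minimal sketch of how these three stages might be wired together, using NumPy stand-ins. The class names come from this README, but the function signatures, shapes, and internals are illustrative assumptions, not the repo's actual (TensorFlow) API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the repo's components: names follow the README,
# internals are invented purely for illustration.

def summarizer_features(token_ids, embed_dim=8):
    """Stand-in for ModelInputs + SummarizerFeatures: an embedding and a score per token."""
    embeddings = rng.normal(size=(len(token_ids), embed_dim))  # RNN states in the real model
    scores = embeddings @ rng.normal(size=embed_dim)           # one scalar score per token
    return embeddings, scores

def extract(scores, k=3):
    """Stand-in for an Extractor: keep the k highest-scoring tokens."""
    extraction = np.zeros_like(scores)
    extraction[np.argsort(scores)[-k:]] = 1.0
    return extraction

def oracle_xent_loss(scores, oracle_labels):
    """Stand-in for an ExtractorLoss: cross-entropy of the scores against oracle labels.
    In the real model this is where gradients flow back into the feature extractor."""
    probs = 1.0 / (1.0 + np.exp(-scores))
    return -np.mean(oracle_labels * np.log(probs) + (1.0 - oracle_labels) * np.log(1.0 - probs))

# One toy "training step" over an article of 10 tokens.
token_ids = list(range(10))
oracle = rng.integers(0, 2, size=10).astype(float)

_, scores = summarizer_features(token_ids)   # stage 1: embeddings and per-token scores
extraction = extract(scores, k=3)            # stage 2: constrained extraction from scores
loss = oracle_xent_loss(scores, oracle)      # stage 3: supervision signal for learning
print("extracted positions:", np.flatnonzero(extraction), "loss:", round(float(loss), 3))
```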

Extractor implementations:

  • IndependentCardinalityPotentialsExtractor: uses k-constrained inference to extract summaries (a small sketch of exact-k inference follows this list).
  • TreeConstrainedExtractor: uses k- and discourse-parse-constrained inference to extract summaries.
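
To make the cardinality constraint concrete, here is a small NumPy sketch of exact-k inference over independent per-item scores. With independent potentials, the MAP extraction under a "select exactly k" constraint is just the top-k items, and the constrained log-partition function can be computed with a simple dynamic program. This illustrates the idea only; it is not the repo's implementation.

```python
import numpy as np

def topk_extraction(scores, k):
    """MAP extraction under an exactly-k constraint with independent potentials:
    simply the k highest-scoring items."""
    mask = np.zeros(len(scores))
    mask[np.argsort(scores)[-k:]] = 1.0
    return mask

def constrained_log_partition(scores, k):
    """log of the sum over all size-k subsets S of exp(sum_{i in S} scores[i]),
    computed with a dynamic program over (items seen, items selected)."""
    # log_alpha[j] = log-partition over the items processed so far with exactly j selected.
    log_alpha = np.full(k + 1, -np.inf)
    log_alpha[0] = 0.0
    for s in scores:
        new = log_alpha.copy()
        # selecting the current item moves mass from j-1 selections to j selections
        new[1:] = np.logaddexp(log_alpha[1:], log_alpha[:-1] + s)
        log_alpha = new
    return log_alpha[k]

scores = np.array([2.0, -1.0, 0.5, 1.5, 0.0])
print(topk_extraction(scores, k=2))            # -> [1. 0. 0. 1. 0.]
print(constrained_log_partition(scores, k=2))  # normalizer over all exactly-2 extractions
```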

ExtractorLoss implementations:

  • OracleXentExtractorLoss: learns to extract summaries from supervised (oracle) labels using cross-entropy.
  • ROUGEReinforceExtractorLoss: learns to extract summaries using ROUGE-1 recall against the ground-truth summary as the reward for the REINFORCE algorithm (a sketch follows this list).
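
As a rough illustration of the reinforcement-learning loss, the sketch below scores a sampled extraction by ROUGE-1 recall against a reference summary and forms the standard score-function (REINFORCE) gradient estimate for independent Bernoulli selections. The reward definition and estimator here are textbook versions and only approximate whatever the repo actually does.

```python
import numpy as np
from collections import Counter

def rouge1_recall(extracted_tokens, reference_tokens):
    """ROUGE-1 recall: clipped unigram overlap divided by the reference length."""
    ref_counts = Counter(reference_tokens)
    ext_counts = Counter(extracted_tokens)
    overlap = sum(min(ext_counts[w], c) for w, c in ref_counts.items())
    return overlap / max(len(reference_tokens), 1)

def reinforce_gradient(scores, article_tokens, reference_tokens, rng, baseline=0.0):
    """Score-function (REINFORCE) gradient estimate for independent Bernoulli selections
    with probabilities sigmoid(scores): (reward - baseline) * d log p(sample) / d scores."""
    probs = 1.0 / (1.0 + np.exp(-scores))
    sample = (rng.random(len(scores)) < probs).astype(float)   # sampled extraction mask
    extracted = [t for t, keep in zip(article_tokens, sample) if keep]
    reward = rouge1_recall(extracted, reference_tokens)
    grad_log_prob = sample - probs                              # gradient of log Bernoulli w.r.t. scores
    return (reward - baseline) * grad_log_prob, reward

rng = np.random.default_rng(0)
article = "the cat sat on the mat while the dog slept".split()
reference = "the cat sat on the mat".split()
scores = np.zeros(len(article))                                 # untrained model: p = 0.5 everywhere
grad, reward = reinforce_gradient(scores, article, reference, rng)
print("reward (ROUGE-1 recall):", round(reward, 3))
print("gradient estimate:", np.round(grad, 2))
```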
