---
license: bsd-3-clause
base_model: pszemraj/pegasus-x-large-book-summary
tags:
- generated_from_trainer
- synthsumm
metrics:
- rouge
datasets:
- pszemraj/synthsumm
pipeline_tag: summarization
language:
- en
---

# pegasus-x-large-book_synthsumm - bf16
This is a new repo containing the model in bf16 (its training precision). Refer to the original repo for details: https://huggingface.co/pszemraj/pegasus-x-large-book_synthsumm

Fine-tuned on a synthetic dataset of curated long-context text paired with GPT-3.5-turbo-1106 summaries spanning multiple domains, plus "random" long-context examples drawn from pretraining datasets.
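
Below is a minimal sketch of running the checkpoint locally with the `transformers` summarization pipeline in its bf16 training precision. The model id and generation settings are assumptions, not part of this card; substitute this repo's id and your preferred parameters.

```python
import torch
from transformers import pipeline

# Assumed model id: swap in this bf16 repo's id if it differs from the original.
model_id = "pszemraj/pegasus-x-large-book_synthsumm"

summarizer = pipeline(
    "summarization",
    model=model_id,
    torch_dtype=torch.bfloat16,  # load weights in the bf16 training precision
    device_map="auto",           # requires accelerate; drop this for plain CPU use
)

long_text = "..."  # your long-context document goes here
result = summarizer(long_text, max_length=256, no_repeat_ngram_size=3)
print(result[0]["summary_text"])
```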
Try it: Gradio demo | example outputs .md (gauntlet) | code for the free HF Inference API
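
As a sketch of the free HF Inference API route, the call below uses `huggingface_hub.InferenceClient`. The model id is an assumption (replace it with this repo's id), and the free tier may queue or rate-limit requests.

```python
from huggingface_hub import InferenceClient

# Assumed model id; replace with this repo's id if it differs.
client = InferenceClient(model="pszemraj/pegasus-x-large-book_synthsumm")

long_text = "..."  # document to summarize
summary = client.summarization(long_text)
# Newer huggingface_hub versions return a SummarizationOutput with .summary_text;
# older ones return a plain string. Printing works for either.
print(summary)
```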