---
license: bsd-3-clause
base_model: pszemraj/pegasus-x-large-book-summary
tags:
- generated_from_trainer
- synthsumm
metrics:
- rouge
datasets:
- pszemraj/synthsumm
pipeline_tag: summarization
language:
- en
---
# pegasus-x-large-book_synthsumm - bf16
> This repo contains the same model in bf16 (its training precision). Refer to the original repo for details: https://huggingface.co/pszemraj/pegasus-x-large-book_synthsumm

Fine-tuned on a synthetic dataset of curated long-context text paired with `GPT-3.5-turbo-1106` summaries spanning multiple domains, plus "random" long-context examples from pretraining datasets.

Try it: [gradio demo](https://huggingface.co/spaces/pszemraj/document-summarization) | [example outputs .md](evals-outputs/GAUNTLET.md) (gauntlet) | code for the free [HF inference API](https://gist.github.com/pszemraj/08f527380ed00ef2f2169e220341c489)
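
A minimal usage sketch with the `transformers` summarization pipeline. The repo id below is the original checkpoint linked above (swap in this bf16 repo's id if you want the bf16 weights); the generation parameters are illustrative, not the settings used for the linked example outputs.

```python
import torch
from transformers import pipeline

# Load the summarization pipeline in bf16 to match the training precision.
summarizer = pipeline(
    "summarization",
    model="pszemraj/pegasus-x-large-book_synthsumm",  # or this repo's id
    torch_dtype=torch.bfloat16,
)

long_text = "..."  # your long-context document here

# Generation kwargs are passed through to model.generate(); values are examples.
result = summarizer(
    long_text,
    max_length=256,
    no_repeat_ngram_size=4,
)
print(result[0]["summary_text"])
```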