---
license: bsd-3-clause
base_model: pszemraj/pegasus-x-large-book-summary
tags:
- generated_from_trainer
- synthsumm
metrics:
- rouge
datasets:
- pszemraj/synthsumm
pipeline_tag: summarization
language:
- en
---
|
|
|
# pegasus-x-large-book_synthsumm - bf16
|
|
|
> This is the same model re-uploaded in bf16 (the training precision). Refer to the original repo for details: https://huggingface.co/pszemraj/pegasus-x-large-book_synthsumm
|
|
|
|
|
Fine-tuned on a synthetic dataset of curated long-context text paired with `GPT-3.5-turbo-1106` summaries spanning multiple domains, plus "random" long-context examples drawn from pretraining datasets.
|
|
|
|
|
Try it: [Gradio demo](https://huggingface.co/spaces/pszemraj/document-summarization) | [example outputs](evals-outputs/GAUNTLET.md) (Gauntlet) | [code for the free HF Inference API](https://gist.github.com/pszemraj/08f527380ed00ef2f2169e220341c489)
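
A minimal usage sketch with the `transformers` summarization pipeline is below. The model id shown is the original repo linked above (swap in this repo's id to load the bf16 weights directly), and the generation settings are illustrative, not tuned values.

```python
import torch
from transformers import pipeline

# Load as a summarization pipeline; torch_dtype=bfloat16 matches the training
# precision noted above. Model id assumes the original repo linked in this card.
summarizer = pipeline(
    "summarization",
    model="pszemraj/pegasus-x-large-book_synthsumm",
    torch_dtype=torch.bfloat16,
)

long_text = "..."  # replace with your long-context document

result = summarizer(
    long_text,
    max_length=512,           # illustrative generation settings
    no_repeat_ngram_size=3,
)
print(result[0]["summary_text"])
```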
|
|
|
|
|
|