---
datasets:
  - yuvalkirstain/summ_screen_fd_t5_lm
  - urialon/summ_screen_validation
  - urialon/summ_screen_test
pipeline_tag: text2text-generation
---

Baseline model for the preprint Unlimiformer: Long-Range Transformers with Unlimited Length Input.

This model was fine-tuned from a BART-base model as a baseline. It was fine-tuned on the SummScreen dataset using the data preprocessing pipeline from SLED; to load the validation or test set for use with this model, please use the datasets urialon/summ_screen_validation and urialon/summ_screen_test.
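The splits above can be loaded with the `datasets` library and the model run through `transformers`. The sketch below is a minimal, hedged example: the checkpoint id and the dataset column name are assumptions (replace them with the actual repo id of this model and the real field names), and note that BART-base accepts at most 1024 input tokens, so long SummScreen episodes are truncated here.

```python
# Minimal usage sketch. CHECKPOINT and the "input" column name are
# assumptions -- substitute the actual Hub repo id and dataset fields.
VAL_DATASET = "urialon/summ_screen_validation"
TEST_DATASET = "urialon/summ_screen_test"
CHECKPOINT = "path/to/this-bart-base-baseline"  # hypothetical placeholder


def load_eval_split(name=VAL_DATASET, split="validation"):
    """Load one of the SLED-preprocessed SummScreen evaluation splits."""
    from datasets import load_dataset  # imported lazily to keep the sketch light

    return load_dataset(name, split=split)


def summarize(text, checkpoint=CHECKPOINT, max_new_tokens=128):
    """Summarize one document with the BART-base baseline."""
    from transformers import AutoTokenizer, BartForConditionalGeneration

    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = BartForConditionalGeneration.from_pretrained(checkpoint)
    # BART-base's encoder is limited to 1024 positions, so truncate the input.
    inputs = tokenizer(text, truncation=True, max_length=1024, return_tensors="pt")
    summary_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(summary_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    validation = load_eval_split()
    print(summarize(validation[0]["input"]))  # column name is an assumption
```

Because this is the truncating baseline, anything past the first 1024 tokens of an episode is simply discarded; the Unlimiformer models from the same preprint are the variants that consume the full input.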