This model is from the preprint Unlimiformer: Long-Range Transformers with Unlimited Length Input.
This model was finetuned from a BART-base model using Unlimiformer-aware early stopping, described in section 3.1 of the paper. It was trained on the SummScreen dataset using the data preprocessing pipeline from SLED; to load the validation or test set for use with this model, please use the datasets urialon/summ_screen_validation and urialon/summ_screen_test.
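A minimal sketch of loading those splits with the Hugging Face `datasets` library follows; the split names passed to `split=` are an assumption, so check the dataset cards if loading fails.

```python
from datasets import load_dataset

# Load the SLED-preprocessed SummScreen splits released alongside Unlimiformer.
# Assumption: each repo exposes a split matching its name ("validation"/"test");
# adjust the split argument if the dataset card lists different split names.
validation = load_dataset("urialon/summ_screen_validation", split="validation")
test = load_dataset("urialon/summ_screen_test", split="test")

print(validation)
```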
This model generally performs worse than the retrieval-trained model but better than the baseline.
The inference demo is disabled because the Unlimiformer files must be added to your repository before this model can handle unlimited-length input. See the Unlimiformer GitHub for setup instructions.
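As a rough sketch, the checkpoint itself can be loaded like any BART-base model via `transformers`; the model id below is a placeholder for this repository's actual id, and the Unlimiformer wrapper that enables unlimited-length input must be obtained separately from the Unlimiformer GitHub.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder id -- substitute the actual model id shown on this page.
model_id = "path/to/this-unlimiformer-bart-base-checkpoint"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Without the Unlimiformer wrapper, generation is limited to BART's
# usual fixed-length context; the code from the Unlimiformer GitHub
# is what removes that limit at inference time.
```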