This model is a T5-3B reranker fine-tuned on the MS MARCO passage dataset for 10k steps (or 1 epoch).
For more details on how to use it, check [pygaggle.ai](http://pygaggle.ai).
Paper describing the model: [Document Ranking with a Pretrained Sequence-to-Sequence Model](https://www.aclweb.org/anthology/2020.findings-emnlp.63/)
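As described in the paper above, the reranker scores a query–passage pair by the probability of generating the token "true" versus "false". The sketch below shows this scoring scheme with Hugging Face `transformers`; the model identifier is a placeholder and should be replaced with this repository's id (pygaggle also provides a ready-made wrapper for this).

```python
# Minimal monoT5-style scoring sketch (assumes a T5 reranker checkpoint; model id is a placeholder).
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_name = "path/to/this-t5-3b-reranker"  # placeholder: use this repository's model id
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name).eval()

query = "how do rerankers work"
passage = "A reranker rescores candidate passages returned by a first-stage retriever."

# The model sees the pair in a "Query: ... Document: ... Relevant:" prompt
# and is asked to generate "true" (relevant) or "false" (not relevant).
inputs = tokenizer(f"Query: {query} Document: {passage} Relevant:", return_tensors="pt")
decoder_input_ids = torch.full(
    (1, 1), model.config.decoder_start_token_id, dtype=torch.long
)

with torch.no_grad():
    logits = model(**inputs, decoder_input_ids=decoder_input_ids).logits[0, -1]

# Relevance score = log-probability of "true" under a softmax over {false, true}.
true_id = tokenizer.encode("true")[0]
false_id = tokenizer.encode("false")[0]
score = torch.nn.functional.log_softmax(logits[[false_id, true_id]], dim=0)[1].item()
print(score)  # closer to 0 (less negative) means more relevant
```

To rerank a candidate list, compute this score for each passage and sort in descending order.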
This model is also the state of the art on the BEIR benchmark:
- Paper: [No Parameter Left Behind: How Distillation and Model Size Affect Zero-Shot Retrieval](https://arxiv.org/abs/2206.02873)
- Repository: [Scaling Zero-shot Retrieval](https://github.com/guilhermemr04/scaling-zero-shot-retrieval)