This model is a T5-3B reranker fine-tuned on the MS MARCO passage ranking dataset for 10k steps (approximately 1 epoch).
For more details on how to use it, check out pygaggle.ai
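As a minimal sketch of how a monoT5-style reranker is typically queried (the helper name is illustrative, and the model id is a placeholder, not taken from this card): the model receives a query-document pair in a fixed prompt format and the relevance score is derived from the probability it assigns to the token "true".

```python
# Illustrative sketch of the monoT5 input format; the function name is
# hypothetical and the Hugging Face usage below is only outlined in comments.

def monot5_prompt(query: str, document: str) -> str:
    """Build the prompt format used by monoT5-style rerankers."""
    return f"Query: {query} Document: {document} Relevant:"

# With the transformers library (assumed installed), scoring might look like:
#
#   from transformers import T5Tokenizer, T5ForConditionalGeneration
#   tok = T5Tokenizer.from_pretrained("<this model's id>")
#   model = T5ForConditionalGeneration.from_pretrained("<this model's id>")
#   inputs = tok(monot5_prompt(query, passage), return_tensors="pt")
#   # ...compare the logits of "true" vs. "false" as the relevance score.

print(monot5_prompt("what is dna", "DNA is a molecule that carries..."))
```

Candidate passages for a query are then sorted by this score, highest first.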
Paper describing the model: Document Ranking with a Pretrained Sequence-to-Sequence Model
This model is also the state of the art on the BEIR benchmark.
Paper: No Parameter Left Behind: How Distillation and Model Size Affect Zero-Shot Retrieval
Repository: Scaling Zero-shot Retrieval