---
license: apache-2.0
---

# Model Card for sbhargav/baseline-distilbert-tot24
This is the baseline model released by the organizers for the 2024 edition of TREC-ToT. See the guidelines for more information about the track and how to participate!
## Model Details

### Model Description

A full description of how this model was trained (with code) is available in the repository linked below under Model Sources.
- Developed by: TREC-ToT '24 Organizers
- License: Apache 2.0
- Finetuned from model: distilbert/distilbert-base-uncased
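
The checkpoint loads with the standard Hugging Face `transformers` API. A minimal sketch, assuming the released weights are a bare DistilBERT encoder with no task-specific head:

```python
from transformers import AutoModel, AutoTokenizer

# Assumption: the checkpoint is a bare DistilBERT encoder, so AutoModel
# (rather than a task-specific class) is used to load it.
tokenizer = AutoTokenizer.from_pretrained("sbhargav/baseline-distilbert-tot24")
model = AutoModel.from_pretrained("sbhargav/baseline-distilbert-tot24")
```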
### Model Sources
- Repository: https://github.com/TREC-ToT/bench
- Paper: NA
- Website: https://trec-tot.github.io/
## Uses

- First-stage retrieval for the ToT task, producing candidate documents for a downstream re-ranker (see the sketch below).
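
As a rough illustration of how a first-stage retriever built on this encoder might be used, the sketch below embeds a query and a handful of candidate documents, then ranks the documents by similarity. The CLS-token pooling and dot-product scoring are assumptions for illustration only; the training repository above documents the exact setup used by the baseline.

```python
import torch
from transformers import AutoModel, AutoTokenizer

MODEL = "sbhargav/baseline-distilbert-tot24"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModel.from_pretrained(MODEL)
model.eval()

def embed(texts):
    # Assumption: CLS-token pooling; check the training code for the
    # pooling and scoring actually used by the baseline.
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state
    return hidden[:, 0]  # [batch, hidden_dim] embedding of the first token

query = "comedy where a weatherman relives the same day over and over"
docs = [
    "Groundhog Day is a 1993 American fantasy comedy film ...",
    "Twister is a 1996 American disaster film about storm chasers ...",
]

# Score candidates with a dot product and print them best-first.
scores = (embed([query]) @ embed(docs).T).squeeze(0)
for idx in scores.argsort(descending=True).tolist():
    print(f"{scores[idx].item():.3f}  {docs[idx][:50]}")
```

In a real first-stage setup the document embeddings would be pre-computed and indexed (e.g. with an approximate nearest-neighbour library) rather than re-encoded for every query.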
## Bias, Risks, and Limitations

- Trained only on a small set of ToT queries from a single domain, so performance may not transfer to other domains or query types.
## Training Details

- Trained on the TREC-ToT '24 train and dev1 sets.
- Hyperparameter tuning was performed on the dev2 set.
## Citation

**BibTeX:**