# Model Card for MediNote-13B-v1.0
MediNote is a suite of open-source medical Large Language Models (LLMs) fine-tuned from the MediTron foundation models for clinical note generation.
MediNote-13B is a 13-billion-parameter model trained to generate clinical notes from doctor-patient conversations.
## Model Details
- Developed by: Antoine Bonnet and Paul Boulenger
- Model type: Causal decoder-only transformer language model
- Language(s): English only
- Model License: LLAMA 2 COMMUNITY LICENSE AGREEMENT
- Code License: MIT
- Fine-tuned from model: Llama-2-13B with continued pre-training on PubMed Central (MediTron-13B equivalent)
- Context length: 2K tokens
- Input: Text-only data
- Output: Model generates text only
## Model Sources
- Repository: EPFL-IC-Make-Team/ClinicalNotes
- Trainer: epflLLM/Megatron-LLM
- Paper: MediNote: Automatic Clinical Notes
## Uses
### Direct Use
This model can be used to generate clinical notes from doctor-patient conversations, which is useful for experimentation and for understanding its capabilities. It should not be used directly in production or for any work that may impact people.
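The model card ships no official inference snippet. As a minimal sketch, assuming the weights are published on the Hugging Face Hub under a repository ID like `EPFL-IC-Make-Team/MediNote-13B` (an assumption) and a plain instruction-style prompt (the actual fine-tuning format may differ), generation could look like:

```python
# Illustrative sketch only: the Hub repo ID, prompt template, and
# generation settings below are assumptions, not the confirmed
# fine-tuning format of MediNote-13B.

def build_prompt(dialogue: str) -> str:
    """Wrap a doctor-patient dialogue in a simple instruction prompt
    (hypothetical template; the real training format may differ)."""
    return (
        "Below is a doctor-patient conversation. "
        "Write the corresponding clinical note.\n\n"
        f"Conversation:\n{dialogue}\n\n"
        "Clinical note:\n"
    )

if __name__ == "__main__":
    # transformers is imported lazily so the prompt helper above can be
    # used without the library or the 13B weights installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "EPFL-IC-Make-Team/MediNote-13B"  # assumed repo ID
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    prompt = build_prompt("Doctor: What brings you in today?\nPatient: ...")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=512, do_sample=False)
    # Decode only the newly generated tokens (the note itself),
    # skipping the echoed prompt.
    note = tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
    print(note)
```

Note the 2K-token context window listed above: long conversations plus the generated note must fit within it, so lengthy dialogues may need truncation or summarization before prompting.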
### Downstream Use
### Out-of-Scope Use
We do not recommend using this model for natural language generation in a production environment, fine-tuned or otherwise.
### Recommendations
## Citation
BibTeX: If you use MediNote or its training data, please cite our work:
ADD CITATION