OctoCoder is an instruction tuned model with 15.5B parameters created by finetuning StarCoder on CommitPackFT & OASST as described in the OctoPack paper.
| Data | CommitPack | 4TB of GitHub commits across 350 programming languages |
|---|---|---|
| | CommitPackFT | Filtered version of CommitPack for high-quality commit messages that resemble instructions |
| Model | OctoCoder | StarCoder (16B parameters) instruction tuned on CommitPackFT + OASST |
| | OctoGeeX | CodeGeeX2 (6B parameters) instruction tuned on CommitPackFT + OASST |
| Evaluation | HumanEvalPack | Extension of OpenAI's HumanEval to cover 3 scenarios across 6 languages |
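The datasets in the table above are hosted on the Hugging Face Hub and can be inspected with the `datasets` library. A minimal sketch, assuming the dataset id `bigcode/commitpackft` and a per-language configuration name such as `"python"` (check the Hub listing for the exact ids):

```python
# pip install -q datasets
from datasets import load_dataset

# Dataset id and config name are assumptions -- verify them on the Hub.
ds = load_dataset("bigcode/commitpackft", "python", split="train")

print(ds)     # number of rows and column names
print(ds[0])  # one commit: message (the "instruction") plus code before/after the change
```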
The model follows instructions provided in the input. You should always preface your input with "Question: " and finish it with "Answer:", for example: "Question: Please write a function in Python that performs bubble sort.\n\nAnswer:"
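If you script many prompts, a small helper keeps the template consistent. This is purely illustrative and not part of the released code:

```python
def build_prompt(instruction: str) -> str:
    """Wrap a free-form instruction in the Question/Answer template OctoCoder expects."""
    return f"Question: {instruction}\n\nAnswer:"

print(build_prompt("Please write a function in Python that performs bubble sort."))
```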
Feel free to share your generations in the Community tab!
```python
# pip install -q transformers
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/octocoder"
device = "cuda"  # for GPU usage or "cpu" for CPU usage

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint).to(device)

inputs = tokenizer.encode("Question: Please write a function in Python that performs bubble sort.\n\nAnswer:", return_tensors="pt").to(device)
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```
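Note that `generate` defaults to a short greedy continuation, so the answer may be cut off. To get a complete function you will usually want to set the generation length explicitly; a sketch with illustrative parameter values:

```python
outputs = model.generate(
    inputs,
    max_new_tokens=256,                    # illustrative budget; enough room for a full function body
    do_sample=False,                       # greedy decoding; set do_sample=True and a temperature to sample
    pad_token_id=tokenizer.eos_token_id,   # silences the missing-pad-token warning
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```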
```bibtex
@article{muennighoff2023octopack,
  title={OctoPack: Instruction Tuning Code Large Language Models},
  author={Niklas Muennighoff and Qian Liu and Armel Zebaze and Qinkai Zheng and Binyuan Hui and Terry Yue Zhuo and Swayam Singh and Xiangru Tang and Leandro von Werra and Shayne Longpre},
  journal={arXiv preprint arXiv:2308.07124},
  year={2023}
}
```