Moe-4x7b-math-reason-code / generation_config.json
{
  "_from_model_config": true,
  "bos_token_id": 1,
  "eos_token_id": 32000,
  "transformers_version": "4.42.4",
  "use_cache": false
}
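
Because _from_model_config is true, this file was auto-derived from the model's config.json rather than hand-written; use_cache: false is likely a training-time setting carried over, and it slows autoregressive decoding unless overridden. Below is a minimal sketch of how transformers picks these values up at generation time, assuming the repo id is bhavnah/Moe-4x7b-math-reason-code (inferred from this page, not confirmed):

from transformers import GenerationConfig

# Assumed repo id, inferred from the file path above.
repo_id = "bhavnah/Moe-4x7b-math-reason-code"

# from_pretrained fetches this generation_config.json from the Hub, so
# bos_token_id=1, eos_token_id=32000, and use_cache=False become the
# defaults for model.generate() unless overridden per call.
gen_config = GenerationConfig.from_pretrained(repo_id)

print(gen_config.eos_token_id)  # 32000
print(gen_config.use_cache)     # False

# Re-enabling the KV cache at call time typically speeds up decoding:
# model.generate(**inputs, generation_config=gen_config, use_cache=True)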