vLLM does not support MistralForCausalLM
#11 opened by narenzen
```
ValueError: Model architectures ['MistralForCausalLM'] are not supported for now. Supported architectures: ['AquilaModel', 'BaiChuanForCausalLM', 'BaichuanForCausalLM', 'BloomForCausalLM', 'FalconForCausalLM', 'GPT2LMHeadModel', 'GPTBigCodeForCausalLM', 'GPTJForCausalLM', 'GPTNeoXForCausalLM', 'InternLMForCausalLM', 'LlamaForCausalLM', 'LLaMAForCausalLM', 'MPTForCausalLM', 'OPTForCausalLM', 'QWenLMHeadModel', 'RWForCausalLM']
```
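For reference, the error is raised as soon as the engine loads the checkpoint. A minimal reproduction on a vLLM build that predates Mistral support (the checkpoint name is an assumption; any MistralForCausalLM model triggers it):

```python
# Sketch: loading a Mistral checkpoint on a pre-Mistral vLLM build
# raises the ValueError above during model loading.
from vllm import LLM

llm = LLM(model="mistralai/Mistral-7B-v0.1")  # assumed checkpoint name
# ValueError: Model architectures ['MistralForCausalLM'] are not supported for now. ...
```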
Hey, it will be in the next release! https://github.com/vllm-project/vllm/issues/1089
We were a bit slow addressing some issues with our PR yesterday :)
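Once you are on a release that includes Mistral support (`pip install --upgrade vllm`), the usual vLLM generation API should work unchanged. A minimal sketch, assuming the same checkpoint name as above and arbitrary sampling settings:

```python
# Sketch assuming a vLLM release with Mistral support installed;
# model name and sampling values are illustrative assumptions.
from vllm import LLM, SamplingParams

llm = LLM(model="mistralai/Mistral-7B-v0.1")
sampling = SamplingParams(temperature=0.8, max_tokens=64)

# Generate a completion for a single prompt and print the text.
outputs = llm.generate(["The capital of France is"], sampling)
print(outputs[0].outputs[0].text)
```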
timlacroix changed discussion status to closed