Please add the missing "MambaForCausalLM" to config.json

#1 · opened by count-zero

@jondurbin Please add the missing `"architectures": ["MambaForCausalLM"]` line to config.json, so that the model can be quantized with llama.cpp without any manual edits.
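
In the meantime, a minimal sketch of a local workaround, assuming the model files have already been downloaded and `config.json` sits in the current working directory:

```python
import json

# Load the existing config.json, add the missing "architectures" field,
# and write it back so downstream tooling (e.g. llama.cpp conversion
# scripts) can identify the model class without manual editing.
with open("config.json") as f:
    config = json.load(f)

config["architectures"] = ["MambaForCausalLM"]

with open("config.json", "w") as f:
    json.dump(config, f, indent=2)
```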
