Chat template is not consistent with documentation?
#34 opened by ejschwartz
The chat template in the tokenizer is quite different from the format described here. In particular, the tokenizer version instructs the model to always make a tool call, even when doing so doesn't make sense. Is this an oversight?
This conversation is relevant: https://github.com/meta-llama/llama-stack-apps/issues/36
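For reference, here is a minimal sketch of how to inspect the tokenizer's template and see exactly what tool-calling instructions it injects, so it can be compared against the documented format. This assumes a recent transformers release that accepts a `tools` argument to `apply_chat_template`; the model id and the `get_weather` tool are placeholders, not taken from this repo.

```python
# Sketch: dump the tokenizer's chat template and render a tool-call prompt.
from transformers import AutoTokenizer

# Placeholder model id -- substitute the repo this discussion belongs to.
model_id = "meta-llama/Llama-3.1-8B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)

# The raw Jinja template stored alongside the tokenizer.
print(tokenizer.chat_template)

# A hypothetical tool; transformers converts a type-hinted, documented
# function into a JSON schema for the template.
def get_weather(city: str):
    """Get the current weather for a city.

    Args:
        city: The city to look up.
    """
    ...

messages = [{"role": "user", "content": "What is the weather in Paris?"}]

# Render without tokenizing so the injected instructions are visible as text.
prompt = tokenizer.apply_chat_template(
    messages,
    tools=[get_weather],
    tokenize=False,
    add_generation_prompt=True,
)
print(prompt)
```

Comparing the printed prompt with the documented prompt format should make the discrepancy (e.g., wording that forces a tool call on every turn) easy to pin down.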