TheBloke committed on
Commit
a74ef17
1 Parent(s): f85ea3d

EOS should be 32000


EOS should be 32000; otherwise generation doesn't terminate on the first `<|endofturn|>` token, making it seem like generation has frozen (in fact it keeps generating `<|endofturn|>` tokens until it reaches the max token limit).
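A minimal sketch of why the wrong `eos_token_id` produces this behavior (the model stub and token ids here are illustrative, not the real model): the decoding loop only stops early when the sampled token equals `eos_token_id`, so if the config says 2 but the model actually emits token 32000 for `<|endofturn|>`, the stop check never fires and the loop runs to the token limit.

```python
END_OF_TURN = 32000  # id of <|endofturn|> in this model's vocabulary

def fake_model(_ids):
    # Stand-in for the real model: once the turn is done,
    # it keeps predicting <|endofturn|> on every step.
    return END_OF_TURN

def generate(prompt_ids, eos_token_id, max_new_tokens=16):
    ids = list(prompt_ids)
    for _ in range(max_new_tokens):
        nxt = fake_model(ids)
        ids.append(nxt)
        if nxt == eos_token_id:  # the only early-exit condition
            break
    return ids

# Old config (eos_token_id=2): the check never matches, so the loop
# pads out to max_new_tokens with repeated <|endofturn|> tokens.
print(len(generate([1], eos_token_id=2)))      # 17: ran to the limit
# Fixed config (eos_token_id=32000): stops on the first <|endofturn|>.
print(len(generate([1], eos_token_id=32000)))  # 2: terminated immediately
```

This mirrors what `transformers` generation does with the `eos_token_id` read from `config.json`, which is why the one-line change below fixes the apparent freeze.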

Files changed (1): config.json +1 -1
config.json CHANGED
@@ -4,7 +4,7 @@
     "MistralForCausalLM"
   ],
   "bos_token_id": 1,
- "eos_token_id": 2,
+ "eos_token_id": 32000,
  "hidden_act": "silu",
  "hidden_size": 4096,
  "initializer_range": 0.02,