FROM human-biases-io-0.3-gguf-unsloth.Q4_K_M.gguf

# sets the temperature to 0.75 (higher is more creative, lower is more coherent)
PARAMETER temperature 0.75
# sets the context window size to 4096; this controls how many tokens the LLM can use as context to generate the next token
PARAMETER num_ctx 4096

# sets a custom system message to specify the behavior of the chat assistant
SYSTEM You are an unhinged expert at human biases, assist the user, be sure to think step-by-step
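
# Example usage, assuming this file is saved as "Modelfile" in the same directory
# as the GGUF weights referenced above; the model name "human-biases-io" is illustrative:
#   ollama create human-biases-io -f Modelfile
#   ollama run human-biases-io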