Commit 96fc028 by Mikael110
Parent: d99179a

Remove invalid KoboldCpp option from examples


KoboldCpp does not actually support 16K contexts; it maxes out at 8K. A [Feature Request](https://github.com/LostRuins/koboldcpp/issues/287) has been opened for it, but it is currently a low priority, so it may be a while before larger contexts are supported.

Files changed (1): README.md (+1 -1)
README.md CHANGED
@@ -28,7 +28,7 @@ In order to use the increased context length, you can presently use:
 
 Support is also expected to come to llama.cpp, however work is still being done to find the optimal implementation.
 
-To use the increased context with KoboldCpp, use `--contextsize` to set the desired context, eg `--contextsize 4096` or `--contextsize 8192` or `--contextsize 16384`.
+To use the increased context with KoboldCpp, use `--contextsize` to set the desired context, eg `--contextsize 4096` or `--contextsize 8192`. Koboldcpp does not currently support context sizes above 8192.
 
 **NOTE 1**: Currently RoPE models can _only_ be used at a context size greater than 2048. At 2048 it will produce gibberish. Please make sure you're always setting `--contextsize` and specifying a value higher than 2048, eg 3072, 4096, etc.
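The constraint this commit documents can be sketched as a small validation helper. This is a hypothetical illustration, not part of KoboldCpp; the 8192 cap comes from the linked feature request, and the lower bound comes from NOTE 1 in the diff:

```python
# Hypothetical helper illustrating the --contextsize limits described above.
# Upper bound: KoboldCpp currently maxes out at 8192 (per the linked issue).
# Lower bound: RoPE models need a context size greater than 2048 (NOTE 1).

KOBOLDCPP_MAX_CONTEXT = 8192  # assumed current maximum


def validate_contextsize(n: int) -> int:
    """Return n if it is a usable --contextsize value, else raise ValueError."""
    if n <= 2048:
        raise ValueError("RoPE models produce gibberish at 2048 or below")
    if n > KOBOLDCPP_MAX_CONTEXT:
        raise ValueError("KoboldCpp does not currently support contexts above 8192")
    return n


print(validate_contextsize(4096))  # a valid value passes through unchanged
```

Under these assumptions, `--contextsize 16384` (the value removed by this commit) would be rejected, while 4096 and 8192 remain valid.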