Cannot reproduce with the latest main branch; I can set the context limit as high as 131072. Since the OP did not specify which version they tried and this issue was opened a while ago, I'll assume it has been fixed.
Bug Report
You should be able to override the max context window for the models. Especially because it seems to be enforcing a stricter limit based on the base model and not the model in use:
The model below should have a 128k context window, but GPT4All enforces 4096:
https://huggingface.co/failspy/Phi-3-mini-128k-instruct-abliterated-v3-GGUF/tree/main
The same happens with https://huggingface.co/mradermacher/Llama-3-8B-source-lewd-context-GGUF, which should also allow an extremely large context window but is capped at 8k.
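The trained context length that a loader should respect is stored in the GGUF file's own metadata under the architecture-specific `<arch>.context_length` key (e.g. `phi3.context_length`), not in anything belonging to the base model. As a minimal sketch of where that number lives, the snippet below reads the metadata section of a GGUF header per the GGUF specification; the helper names (`read_gguf_metadata`, `trained_context_length`) are my own for illustration and are not part of GPT4All:

```python
import struct

# struct formats for scalar GGUF metadata value types (per the GGUF spec):
# 0=u8 1=i8 2=u16 3=i16 4=u32 5=i32 6=f32 7=bool 10=u64 11=i64 12=f64
_FMT = {0: "<B", 1: "<b", 2: "<H", 3: "<h", 4: "<I", 5: "<i",
        6: "<f", 7: "<?", 10: "<Q", 11: "<q", 12: "<d"}

def _read(f, fmt):
    return struct.unpack(fmt, f.read(struct.calcsize(fmt)))[0]

def _read_string(f):
    # GGUF strings: uint64 length followed by UTF-8 bytes
    return f.read(_read(f, "<Q")).decode("utf-8")

def _read_value(f, vtype):
    if vtype == 8:                      # string
        return _read_string(f)
    if vtype == 9:                      # array: element type, count, elements
        etype = _read(f, "<I")
        count = _read(f, "<Q")
        return [_read_value(f, etype) for _ in range(count)]
    return _read(f, _FMT[vtype])        # scalar

def read_gguf_metadata(path):
    """Return the metadata key/value dict from a GGUF file header."""
    with open(path, "rb") as f:
        assert f.read(4) == b"GGUF", "not a GGUF file"
        _version = _read(f, "<I")
        _tensor_count = _read(f, "<Q")
        kv_count = _read(f, "<Q")
        meta = {}
        for _ in range(kv_count):
            key = _read_string(f)
            vtype = _read(f, "<I")
            meta[key] = _read_value(f, vtype)
        return meta

def trained_context_length(meta):
    """Find the '<arch>.context_length' key, whatever the architecture."""
    for key, value in meta.items():
        if key.endswith(".context_length"):
            return value
    return None
```

For the Phi-3 file above, this key should hold 131072; if GPT4All is instead deriving the limit from the base model's architecture defaults, that would explain the 4096 cap.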
Steps to Reproduce
Expected Behavior
Models that were built to support an extended context window should be usable at that full context length.
Your Environment