This isn't possible with CopilotKit at the moment. Tagging @ataibarkai and @arielweinberger for this :)
🔖 Feature description
Add configuration support for custom OpenAI-compatible endpoints, so that tools like Ollama or LocalAI can be used instead of the proprietary OpenAI API.
🎤 Why is this feature needed ?
I prefer to self-host when possible and have been very keen on local-only AI tools.
✌️ How do you aim to achieve this?
I want this feature to add a new configuration variable for the OpenAI API endpoint (base URL).
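As a rough illustration of the idea (not existing CopilotKit code), a minimal sketch of such a configuration variable could look like this. The names `OPENAI_API_BASE_URL` and `makeChatCompletionsUrl` are hypothetical; the key point is that an overridable base URL falls back to the official endpoint, so OpenAI-compatible servers like Ollama's (`http://localhost:11434/v1`) can be pointed at transparently:

```typescript
// Hypothetical sketch: a configurable base URL for OpenAI-compatible backends.
// OPENAI_API_BASE_URL and makeChatCompletionsUrl are illustrative names,
// not part of any existing CopilotKit API.
const DEFAULT_BASE_URL = "https://api.openai.com/v1";

function makeChatCompletionsUrl(baseUrl?: string): string {
  // Prefer an explicit argument, then an environment variable, then the
  // official endpoint; strip trailing slashes so self-hosted values like
  // "http://localhost:11434/v1/" also work.
  const base = (baseUrl ?? process.env.OPENAI_API_BASE_URL ?? DEFAULT_BASE_URL)
    .replace(/\/+$/, "");
  return `${base}/chat/completions`;
}

// Ollama's OpenAI-compatible endpoint:
console.log(makeChatCompletionsUrl("http://localhost:11434/v1/"));
```

Since Ollama and LocalAI both expose OpenAI-compatible `/v1` routes, a single base-URL override like this is usually all that's needed, plus treating the API key as optional for local servers.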
🔄️ Additional Information
No response
👀 Have you spent some time to check if this feature request has been raised before?
Are you willing to submit PR?
None