Update bedrock v4 docs #1554
Conversation
🎊 PR Preview has been successfully built and deployed to https://localstack-docs-preview-pr-1554.surge.sh 🎊
Force-pushed from e59db2b to 5407f1e, then from 5407f1e to c65d2c6.
LGTM, thanks for the fast turnaround!
@@ -75,5 +86,5 @@ $ awslocal bedrock-runtime converse \

## Limitations

* LocalStack Bedrock implementation is mock-only and does not run any LLM model locally.
* LocalStack Bedrock currently only officially supports text-based models.
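For readers without the full file open, here is a hedged sketch of what the `converse` call this hunk anchors to might look like; the model ID and message payload are illustrative placeholders, not values taken from the docs:

```bash
# Sketch only: model ID and prompt are placeholder assumptions.
awslocal bedrock-runtime converse \
    --model-id "meta.llama3-8b-instruct-v1:0" \
    --messages '[{"role": "user", "content": [{"text": "Say hello!"}]}]'
```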
What does "currently only officially" mean exactly?
It means that, at this point in time, we only officially support text-based models, as opposed to image models or other kinds of binary-data models.
Do you have an alternative way to read this sentence? I can clarify based on that :D
My question would be what kind of models are unofficially supported 😅. It's fine for me now; this was just the first question that popped into my head. For me they are either supported or not, but this phrasing leads me to believe other models might be unofficially supported :P
Unofficially, we support any model that can run in Ollama, but not every single one is tested. The `invoke-model` endpoint accepts binary data, so you could theoretically run non-text-based models and probably use them without any issue - we just haven't tested that yet :)
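To make that concrete, here is a minimal sketch of pushing a raw JSON body through `invoke-model`; the model ID and body shape are assumptions for illustration:

```bash
# Sketch only: model ID and body format are illustrative assumptions.
# invoke-model treats --body as an opaque blob, so any payload the
# backing model understands (including binary data) can pass through.
awslocal bedrock-runtime invoke-model \
    --model-id "meta.llama3-8b-instruct-v1:0" \
    --body '{"prompt": "Say hello!", "max_gen_len": 50}' \
    --cli-binary-format raw-in-base64-out \
    outfile.json
cat outfile.json
```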
This PR updates the documentation to cover the new LLM-enabled functionality of Bedrock and the new configuration variables.
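For reviewers who want a feel for those variables, a hedged sketch of how they might be set; the variable names here are assumptions based on the LocalStack Bedrock docs and should be checked against the updated page:

```bash
# Assumed variable names - confirm against the updated docs.
# DEFAULT_BEDROCK_MODEL selects the Ollama model that backs Bedrock requests;
# BEDROCK_PREWARM pulls and starts it at startup instead of on first use.
DEFAULT_BEDROCK_MODEL=smollm2:360m BEDROCK_PREWARM=1 localstack start
```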