Add response streaming support #1635
Comments
Discussion #1632 was opened a few days ago.
I was able to get this working with pretty minimal changes. I'll push it somewhere in case anyone wants to replicate it until it's officially supported.
It's certainly not hardened, but it is working for my use case of returning dynamic HTTP content from a Lambda URL. Usable by returning a new ...
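A simplified illustration of that usage shape (the HttpStreamingResponse record below is a placeholder invented for this sketch, not the type from the commit or an official API):

```csharp
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Amazon.Lambda.APIGatewayEvents;
using Amazon.Lambda.Core;

// Placeholder response type standing in for a streaming-capable return value.
public record HttpStreamingResponse(int StatusCode, IDictionary<string, string> Headers, Stream Body);

public class Function
{
    // Illustrative handler behind a Lambda Function URL: hand back a Stream to be
    // forwarded to the caller as it is read, instead of a fully buffered string body.
    public Task<HttpStreamingResponse> Handler(APIGatewayHttpApiV2ProxyRequest request, ILambdaContext context)
    {
        var body = new MemoryStream(Encoding.UTF8.GetBytes($"<html><body>Hello {request.RawPath}</body></html>"));
        return Task.FromResult(new HttpStreamingResponse(
            200,
            new Dictionary<string, string> { ["Content-Type"] = "text/html" },
            body));
    }
}
```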
Another use case would be streaming an LLM response (e.g. with Anthropic Claude).
@paolofulgoni Good use case. Definitely a feature I want to get to.
As far as I can see, all .NET Lambdas currently work based on ...
@Dreamescaper The streams you are seeing are different from Lambda's response stream. Currently the .NET Lambda runtime client only supports invoking the .NET code and taking the return value from the invocation; if it is a POCO, it is converted to a stream first, and then the complete content of the stream is uploaded back to the Lambda service. With response streaming we need a new programming model that gives the .NET Lambda function the ability to write data back to the user without the function returning.
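To illustrate the kind of programming model that implies (every name below is invented for the sketch; this is not an existing Amazon.Lambda.RuntimeSupport API), a handler could be given a writable response stream and push chunks to the caller while it is still running:

```csharp
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Amazon.Lambda.Core;

// Illustrative only: a handler shape where the function pushes data to the caller
// while it is still executing, instead of handing back a finished payload.
public class StreamingFunction
{
    public async Task Handler(Stream requestStream, Stream responseStream, ILambdaContext context)
    {
        for (var i = 0; i < 5; i++)
        {
            var chunk = Encoding.UTF8.GetBytes($"chunk {i}\n");
            await responseStream.WriteAsync(chunk, 0, chunk.Length);
            await responseStream.FlushAsync();   // each flush reaches the client before the handler returns
            await Task.Delay(500);
        }
    }
}
```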
@normj |
Hey there, I've checked your commit, and I would like to achieve a streamed response for a custom GPT assistant via a .NET Lambda. However, I'm not sure how to use this custom runtime to leverage your commit. Do you have an example of such usage? I have set up something like this:
And this is my Function:
Specifically, I'm not sure what to set for the runtime in my SAM template.yaml definition:
This code returns an error like "Error converting the Lambda event JSON payload to type System.String[]: The JSON value could not be converted to System.String[]", which I haven't been able to resolve yet.
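For reference, that conversion error usually means the handler registered with the custom runtime declares an input type (here System.String[]) that does not match the incoming JSON event. A minimal custom-runtime entry point for a Function URL event might look something like the following (illustrative only, using the standard Amazon.Lambda.RuntimeSupport bootstrap pattern, not the poster's actual project):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Amazon.Lambda.APIGatewayEvents;
using Amazon.Lambda.Core;
using Amazon.Lambda.RuntimeSupport;
using Amazon.Lambda.Serialization.SystemTextJson;

public class Program
{
    // The input type matches the Function URL event (payload format 2.0), so the JSON
    // payload deserializes cleanly instead of failing to convert to an unrelated type.
    private static Task<APIGatewayHttpApiV2ProxyResponse> Handler(
        APIGatewayHttpApiV2ProxyRequest request, ILambdaContext context)
    {
        return Task.FromResult(new APIGatewayHttpApiV2ProxyResponse
        {
            StatusCode = 200,
            Headers = new Dictionary<string, string> { ["Content-Type"] = "text/plain" },
            Body = $"Hello from {context.FunctionName}"
        });
    }

    public static async Task Main()
    {
        Func<APIGatewayHttpApiV2ProxyRequest, ILambdaContext, Task<APIGatewayHttpApiV2ProxyResponse>> handler = Handler;
        await LambdaBootstrapBuilder.Create(handler, new DefaultLambdaJsonSerializer())
            .Build()
            .RunAsync();
    }
}
```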
I've never used SAM so I'm not sure of the exact details, but: ...
Describe the feature
Add support for using Lambda response streaming when returning streams from a .NET Lambda function.
https://aws.amazon.com/blogs/compute/introducing-aws-lambda-response-streaming/
Use Case
I would like to stream large S3 objects to HTTP clients using a Lambda URL.
Proposed Solution
Add functionality to enable returned streams to be sent using the application/vnd.awslambda.http-integration-response content type, prefixed with the required JSON prelude and null-byte separator. Allow the status code and HTTP headers to be configured and added to the JSON prelude.
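For illustration, a minimal sketch of writing that prelude and delimiter to a response stream (the helper below is hypothetical; the prelude fields and the eight-null-byte separator follow the format described in the blog post linked above):

```csharp
using System.Collections.Generic;
using System.IO;
using System.Text.Json;
using System.Threading.Tasks;

// Sketch (not RuntimeSupport code) of the wire format: a JSON prelude carrying the
// HTTP metadata, an eight-null-byte delimiter, then the raw payload bytes.
static async Task WriteHttpIntegrationResponseAsync(Stream lambdaResponseStream, Stream payload)
{
    var prelude = JsonSerializer.SerializeToUtf8Bytes(new Dictionary<string, object>
    {
        ["statusCode"] = 200,
        ["headers"] = new Dictionary<string, string> { ["Content-Type"] = "application/octet-stream" }
    });

    await lambdaResponseStream.WriteAsync(prelude);
    await lambdaResponseStream.WriteAsync(new byte[8]);   // delimiter: eight null bytes
    await payload.CopyToAsync(lambdaResponseStream);      // stream the body without buffering it
    await lambdaResponseStream.FlushAsync();
}
```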
Other Information
No response
Acknowledgements
AWS .NET SDK and/or Package version used
Amazon.Lambda.RuntimeSupport 1.10.0
Targeted .NET Platform
.NET 8
Operating System and version
Debian container in Lambda