
Add response streaming support #1635

Open
plaisted opened this issue Dec 7, 2023 · 10 comments
Labels
feature-request A feature should be added or improved. module/lambda-client-lib p2 This is a standard priority issue queued

Comments

plaisted commented Dec 7, 2023

Describe the feature

Add support for using lambda response streaming when returning streams in a dotnet lambda function.

https://aws.amazon.com/blogs/compute/introducing-aws-lambda-response-streaming/

Use Case

I would like to stream large S3 objects to http clients using a lambda URL.

Proposed Solution

Add functionality to send returned streams using the application/vnd.awslambda.http-integration-response content type, prefixed with the required JSON prelude and null-byte separator. Allow the status code and HTTP headers to be configured and included in the JSON prelude.
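
For context, the streamed payload format described in the AWS blog post linked above is a JSON prelude (status code, headers, cookies) followed by an eight-null-byte separator and then the raw body. A minimal sketch of framing such a response follows; the helper name and signature are illustrative, not part of any shipped package:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Text.Json;
using System.Threading.Tasks;

// Sketch: frame a streamed Lambda URL response in the
// application/vnd.awslambda.http-integration-response format.
// The prelude + 8 NUL bytes + body layout follows the AWS
// response-streaming announcement; this helper is hypothetical.
static async Task WriteHttpIntegrationResponseAsync(
    Stream lambdaResponseStream,
    Stream body,
    int statusCode,
    IDictionary<string, string> headers)
{
    // JSON prelude carrying the HTTP metadata.
    var prelude = JsonSerializer.SerializeToUtf8Bytes(new { statusCode, headers });
    await lambdaResponseStream.WriteAsync(prelude);

    // Eight null bytes separate the prelude from the body.
    await lambdaResponseStream.WriteAsync(new byte[8]);

    // Copy the body through without buffering it all in memory.
    await body.CopyToAsync(lambdaResponseStream);
    await lambdaResponseStream.FlushAsync();
}
```
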

Other Information

No response

Acknowledgements

  • I may be able to implement this feature request
  • This feature might incur a breaking change

AWS .NET SDK and/or Package version used

Amazon.Lambda.RuntimeSupport 1.10.0

Targeted .NET Platform

.net 8

Operating System and version

Debian container in Lambda

@plaisted plaisted added feature-request A feature should be added or improved. needs-triage This issue or PR still needs to be triaged. labels Dec 7, 2023
@ashishdhingra ashishdhingra added module/lambda-client-lib needs-review and removed needs-triage This issue or PR still needs to be triaged. labels Dec 7, 2023
@ashishdhingra (Contributor)

Discussion #1632 was opened a few days ago.


plaisted commented Dec 8, 2023

I was able to get this working with pretty minimal changes. I'll push it somewhere in case anyone wants to replicate until it's officially supported.


plaisted commented Dec 8, 2023

Certainly not hardened but is working for my use case of returning dynamic http content from a lambda URL:
plaisted@3d45f5a

Usable by returning the new StreamedResponse class, which is a basic wrapper around a normal .NET stream plus HTTP info:

    [LambdaFunction]
    public async Task<StreamedResponse> CustomerCDN(APIGatewayHttpApiV2ProxyRequest req, ILambdaContext ctx)
    {
        // dummy content
        await Task.Yield();
        var content = new MemoryStream(Encoding.UTF8.GetBytes("Example content!!"));

        // return using lambda response streaming
        return new StreamedResponse(content)
        {
            Headers = new Dictionary<string, string>
            {
                ["Content-Type"] = "text/plain; charset=utf-8",
                ["Cache-Control"] = "no-store, private, stale-if-error=0"
            }
        };
    }

@ashishdhingra ashishdhingra added p2 This is a standard priority issue queued and removed needs-review labels Dec 8, 2023
@paolofulgoni
Another use case would be streaming an LLM response (e.g. with Anthropic Claude).

normj (Member) commented Apr 1, 2024

@paolofulgoni Good use case. Definitely a feature I want to get to.

Dreamescaper (Contributor) commented Aug 23, 2024

As far as I can see, all .NET Lambdas currently work on top of a Stream response underneath.
Given that, are there any downsides to enabling streaming for all requests?

normj (Member) commented Aug 24, 2024

@Dreamescaper The streams you are seeing are different from Lambda's response stream. Currently the .NET Lambda runtime client only supports invoking the .NET code, taking the return value from the invoke (converting it to a stream first if it's a POCO), and then uploading the complete content of that stream back to the Lambda service. With response streaming we need a new programming model that gives the .NET Lambda function the ability to write data back to the user before the function returns.
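
One hypothetical shape such a programming model could take, assuming nothing about the API that eventually ships (all type and member names below are illustrative): the handler receives a writable response stream and pushes bytes while still executing, rather than returning a finished payload.

```csharp
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Amazon.Lambda.APIGatewayEvents;
using Amazon.Lambda.Core;

// Hypothetical streaming handler contract (not a real
// Amazon.Lambda.RuntimeSupport API): the runtime client would
// hand the function the open response stream to write into.
public interface IStreamingHandler
{
    Task HandleAsync(
        APIGatewayHttpApiV2ProxyRequest request,
        Stream responseStream,   // written incrementally while the function runs
        ILambdaContext context);
}

public class ChunkedHandler : IStreamingHandler
{
    public async Task HandleAsync(
        APIGatewayHttpApiV2ProxyRequest request,
        Stream responseStream,
        ILambdaContext context)
    {
        for (var i = 0; i < 5; i++)
        {
            var chunk = Encoding.UTF8.GetBytes($"chunk {i}\n");
            await responseStream.WriteAsync(chunk);
            // Flushing pushes bytes toward the client now,
            // instead of after the handler returns.
            await responseStream.FlushAsync();
        }
    }
}
```

The key difference from the current model is that the return value no longer carries the payload; the function drives the response stream directly, which is what makes incremental delivery possible.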

@Dreamescaper (Contributor)

@normj
But why isn't the Stream type suitable for this purpose? ASP.NET Core uses it for exactly this: you can use IAsyncEnumerable<...> in your code, which is serialized to the response Stream afterwards.
Why is a new programming model needed when the existing Stream-based model already supports streaming (and it only needs to be wired up in RuntimeSupport)?

@jakubbloksa
> Certainly not hardened but is working for my use case of returning dynamic http content from a lambda URL: plaisted@3d45f5a

Hey there, I've checked your commit and I'd like to achieve a streamed response for a custom GPT assistant via a .NET Lambda, but I'm not sure how to use the custom runtime to leverage your commit. Do you have an example of such usage?

I have set up something like this:

public class Program
{
    [LambdaSerializer(typeof(DefaultLambdaJsonSerializer))]
    public static async Task Main(string[] args)
    {
        // Create an instance of the Function class
        var function = new Function();

        // Wrap the instance method
        using var handlerWrapper = HandlerWrapper.GetHandlerWrapper(
            (Func<APIGatewayHttpApiV2ProxyRequest, ILambdaContext, Task<StreamedResponse>>)function.FunctionHandler,
            new DefaultLambdaJsonSerializer()
        );

        // Initialize Lambda runtime
        using var bootstrap = new LambdaBootstrap(handlerWrapper);
        await bootstrap.RunAsync();
    }
}

And this is my Function

public class Function
    {
        public async Task<StreamedResponse> FunctionHandler(APIGatewayHttpApiV2ProxyRequest req, ILambdaContext ctx)
        {
            // dummy content
            await Task.Yield();
            var content = new MemoryStream(Encoding.UTF8.GetBytes("Example content!!".ToString()));

            // return using lambda response streaming
            return new StreamedResponse(content)
            {
                Headers = new Dictionary<string, string>
                {
                    ["Content-Type"] = "text/plain; charset=utf-8",
                    ["Cache-Control"] = "no-store, private, stale-if-error=0"
                }
            };
        }
      }

Specifically, I'm not sure what to set for the runtime in my SAM template.yaml definition:

ChatbotEndpointFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: ./src/ChatbotEndpoint/
      Handler: ChatbotEndpoint::ChatbotEndpoint.Program::Main
      Runtime: dotnet8
      Architectures:
        - x86_64
      MemorySize: 512
      Timeout: 120
      Environment:
        Variables:
          Something: something
      FunctionUrlConfig:
        AuthType: NONE
        Cors:
          AllowOrigins:
            - "*"
          AllowMethods:
            - POST
            - GET
          AllowHeaders:
            - "*"
        InvokeMode: RESPONSE_STREAM

This code returns an error like "Error converting the Lambda event JSON payload to type System.String[]: The JSON value could not be converted to System.String[]", which I haven't been able to resolve yet.

@plaisted (Author)

I've never used SAM so I'm not sure of the exact details, but:

  • You are passing Main as your handler; the handler should be your actual Lambda function. (This is where the string[] issue comes from, since the runtime is trying to deserialize the event into Main's args parameter.)
  • You are using the built-in dotnet8 runtime, which calls your Lambda directly based on the handler you specify. I'm pretty sure the built-in runtime includes the logic my custom code replaces, so it won't pick up the changes that handle streaming; this won't work even after fixing the handler reference.
  • I think you will need to use a custom runtime to use the changed code. In the SAM docs this looks like the "provided" option, but I've never used it, so I'm not sure how it would be set up.
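
If the "provided" route is what's needed, the SAM definition might look roughly like this. This is an untested sketch under the usual custom-runtime assumptions: the deployment package contains a self-contained executable named `bootstrap` (which is what your Main() method becomes), and the Handler value is ignored by custom runtimes:

```yaml
ChatbotEndpointFunction:
  Type: AWS::Serverless::Function
  Properties:
    CodeUri: ./src/ChatbotEndpoint/
    Handler: bootstrap            # custom runtimes run an executable named bootstrap
    Runtime: provided.al2023      # OS-only runtime; your Main() drives LambdaBootstrap
    Architectures:
      - x86_64
    MemorySize: 512
    Timeout: 120
    FunctionUrlConfig:
      AuthType: NONE
      InvokeMode: RESPONSE_STREAM
```
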

6 participants