I’m creating a serverless function with Go using the provided.al2023
runtime.
To start, let me clarify that the examples below are deployed to both Lambda and Vercel, to demonstrate the difference. My reasoning: since Vercel internally hosts Serverless Functions on Lambda, anything Lambda supports should be straightforward (or 'just work') to make work on Vercel the same way. The only exception is if Vercel wraps the response, but my assumption is that it wraps the response the same way for both the AL2023 and Node runtimes, since the wrapping should be runtime-agnostic (a proxy gateway); so if streaming works in the Node runtime, it should work in AL2023 too.
I’ve compiled my function and deployed it without streaming enabled (`"supportsResponseStreaming": false`) and verified that my function works. Under the hood, the Vercel CLI builds the handler and wraps it in `vercel/go-bridge`, which is a simple wrapper around `aws-lambda-go`. I’ve verified that compiling my own handler with `aws-lambda-go` directly, the same way `@vercel/go` compiles them, works as intended.
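For reference, here's the shape of the function config I'm working with — a minimal sketch assuming the Build Output API layout; the exact file path (`.vercel/output/functions/<name>.func/.vc-config.json`) and field set are from my own setup, not authoritative:

```json
{
  "runtime": "provided.al2023",
  "handler": "bootstrap",
  "supportsResponseStreaming": false
}
```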
Here’s how the application is started via Lambda, which I have verified works on Vercel:
```go
func Start() {
	lambda.Start(func(request events.APIGatewayProxyRequest) (events.APIGatewayProxyResponse, error) {
		log.Println("Hello world")
		response := events.APIGatewayProxyResponse{
			StatusCode: 200,
			Body:       "Hello world",
		}
		return response, nil
	})
}
```
Enabling streaming (`"supportsResponseStreaming": true`, as documented in Vercel Primitives) and modifying the code to use `LambdaFunctionURLStreamingResponse`, like so:
```go
func Start() {
	lambda.Start(func() (*events.LambdaFunctionURLStreamingResponse, error) {
		fmt.Println("Hello world!")
		return &events.LambdaFunctionURLStreamingResponse{
			StatusCode: 200,
			Headers: map[string]string{
				"Content-Type": "text/html",
			},
			Body: strings.NewReader("<html><body>Hello World!</body></html>"),
		}, nil
	})
}
```
Deploying this to AWS Lambda (no Vercel) with `InvocationMode: RESPONSE_STREAM`, I have verified that it works as it should. Here’s a deployed Lambda with the above code:
https://sylbl74crhzsbpbuqbh2yajkti0boxjj.lambda-url.eu-north-1.on.aws/
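For completeness, streaming on the AWS side is enabled on the function URL itself. With the AWS CLI that corresponds to something like the following (the function name is a placeholder from my test, and `NONE` auth is just for this demo):

```shell
# Create a function URL with streaming enabled (invoke mode RESPONSE_STREAM).
aws lambda create-function-url-config \
  --function-name my-stream-test \
  --auth-type NONE \
  --invoke-mode RESPONSE_STREAM
```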
Now, deploying this to Vercel does not work: all I get back is `FUNCTION_INVOCATION_FAILED` with no meaningful log message, just a 500 response. I have verified that the `Hello world!` log line does appear, so my handler is being called, but its response is not accepted.
Here’s a deployed Vercel serverless function with the above code:
https://vc-stream-test.vercel.app/test2
Interestingly, when I enable `supportsResponseStreaming` but keep the original `APIGatewayProxyRequest` code, the function no longer works either, which indicates that Vercel does something with the `supportsResponseStreaming` flag on AL2023 — so there’s definitely something there.
I had hoped this would work as-is given the announcement about framework-agnostic streaming, and I have also cross-checked this against the source code of the `@vercel/node` package, which just returns a `Readable`.
If Vercel simply exposes the AWS Lambda, this should work fine, since AL2023 on AWS supports streaming (as my earlier link shows); but if they proxy it, something in the proxy could mangle the response.
For the record, I’ve disabled the Vercel Toolbar middleware to rule it out as the culprit, but that made no difference.
Does anyone have any thoughts on this? Does Vercel intercept the response for the AL2023 runtime and expect it in the non-streaming format (JSON with a string `body` field) rather than the streaming format (a JSON prelude, with the body streamed afterwards)? And why does the Node runtime work with streaming while the AL2023 runtime does not?