We are trying to use Vercel Queues to process Klaviyo events for analytics. I noticed that messages above ~250 KB fail with a generic error. We have split our messages to stay under this undocumented constraint, but the splitting adds rate-limit pressure.
Current Behavior
Messages larger than ~250 KB fail with a generic error.
Our volume is high enough that the message splitting frequently runs into rate limits.
The documentation states messages can be up to 100 MB.
New messages have higher priority than retry messages, which delays retries during high traffic.
Expected Behavior
Support for message sizes up to the documented 100 MB limit.
Recommendation
A setting to prioritize older (retry) messages over new ones.
The generic error on oversized messages is:
InternalServerError: {"error":"Failed to store message payload"}
I didn't pin down the exact threshold, but comparing the messages that succeeded with the ones that failed, the cutoff falls somewhere between 240 KB and 260 KB, which is reminiscent of SQS's 256 KB limit. Since we started chunking at 240 KB (roughly as in the sketch below), we have had no issues sending messages.
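For reference, this is roughly how we chunk batches today. It is only a sketch: `send` is a placeholder for the actual queue send call, and the batchId/index/total envelope is just our own convention for letting the consumer correlate chunks, not anything provided by @vercel/queue.

```ts
import { randomUUID } from 'node:crypto';

// Empirically safe threshold; failures started somewhere between 240 KB and 260 KB.
const MAX_CHUNK_BYTES = 240 * 1024;

interface ChunkEnvelope {
  batchId: string;   // lets the consumer correlate chunks from one batch
  index: number;     // position of this chunk within the batch
  total: number;     // total number of chunks in the batch
  events: unknown[]; // the Klaviyo events carried by this chunk
}

export async function sendInChunks(
  events: unknown[],
  send: (body: string) => Promise<void>, // placeholder for the real send call
): Promise<void> {
  const batchId = randomUUID();
  const chunks: unknown[][] = [];
  let current: unknown[] = [];

  for (const event of events) {
    const candidate = [...current, event];
    // Measure serialized size in UTF-8 bytes; the envelope adds only a small,
    // roughly constant overhead, which the 240 KB margin absorbs.
    if (
      Buffer.byteLength(JSON.stringify(candidate), 'utf8') > MAX_CHUNK_BYTES &&
      current.length > 0
    ) {
      chunks.push(current);
      current = [event]; // a single event bigger than the limit will still fail upstream
    } else {
      current = candidate;
    }
  }
  if (current.length > 0) chunks.push(current);

  await Promise.all(
    chunks.map((chunk, index) => {
      const envelope: ChunkEnvelope = {
        batchId,
        index,
        total: chunks.length,
        events: chunk,
      };
      return send(JSON.stringify(envelope));
    }),
  );
}
```

Each extra chunk is an extra send, which is where the rate-limit pressure described above comes from.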
At about Apr 06 10:33:19.67 GMT, the most common error changed from
Queue callback error: Error: Failed to receive/acknowledge message by ID: 429 Too Many Requests
at throwCommonHttpError (file:///var/task/node_modules/@vercel/queue/dist/index.mjs:1426:9)
at _ApiClient.receiveMessageById (file:///var/task/node_modules/@vercel/queue/dist/index.mjs:1793:7)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async ConsumerGroup.consume (file:///var/task/node_modules/@vercel/queue/dist/index.mjs:432:24)
at async handleCallback (file:///var/task/node_modules/@vercel/queue/dist/index.mjs:707:5)
at async file:///var/task/node_modules/@vercel/queue/dist/index.mjs:2045:9
to
Queue callback error: MultipartParseError: Unexpected end of stream
at StreamingMultipartParser.startProducer (file:///var/task/node_modules/mixpart/dist/index.mjs:207:19)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
I preferred the 429 error: it at least says what is going on, whereas the MultipartParseError makes it look like something unexpected is happening. These messages generally succeed on the next retry.
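For completeness, the kind of producer-side mitigation this pushes us toward is a plain retry with exponential backoff around each chunked send. This is only an illustrative sketch under assumptions: `send` is our own wrapper, and detecting a rate-limit failure by looking for "429" in the error message mirrors the stack trace above rather than any documented @vercel/queue error shape.

```ts
const MAX_ATTEMPTS = 5;
const BASE_DELAY_MS = 500;

const sleep = (ms: number) =>
  new Promise<void>((resolve) => setTimeout(resolve, ms));

// Assumption: rate-limit failures surface with "429" in the error message,
// as in the stack trace above.
const isRateLimited = (err: unknown) =>
  err instanceof Error && err.message.includes('429');

export async function sendWithBackoff(
  body: string,
  send: (body: string) => Promise<void>, // placeholder for the real send call
): Promise<void> {
  for (let attempt = 1; attempt <= MAX_ATTEMPTS; attempt++) {
    try {
      await send(body);
      return;
    } catch (err) {
      if (!isRateLimited(err) || attempt === MAX_ATTEMPTS) throw err;
      // Exponential backoff with a little jitter to spread retries out.
      await sleep(BASE_DELAY_MS * 2 ** (attempt - 1) + Math.random() * 100);
    }
  }
}
```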