Vercel Community · Feedback

# Option For Expanded Lambda Sizes

109 views · 2 likes · 3 posts

**Versecafe (@versecafe)** · 2024-07-25 · ♥ 1

AWS supports up to `10240MB` of memory for Lambda functions. For tasks like PDF generation or heavy data processing (e.g. large k-means calculations over vector embeddings), it would be useful to run certain functions with 4, 6, or even 10 GB of RAM instead of the current 2 GB cap.

I've had to splinter projects across AWS + Vercel, using tools like Pulumi or Terraform to keep things managed, and it's a huge hassle that could be avoided if the maximum memory for Vercel's Lambda wrappers were raised to match AWS. As a bonus for the sales and finance teams, it would earn Vercel more on function usage and pull in some ML-ops workloads that, at least in my case, are forced over to AWS.

> See https://docs.aws.amazon.com/lambda/latest/dg/gettingstarted-limits.html for docs on the current Lambda memory range: 128 to 10240 MB as of the most recent check.

**Pauline P. Narvas (@pawlean)** · 2024-07-25 · ♥ 1

@versecafe Thanks for your feedback! I've shared this internally with our product team.

**Pauline P. Narvas (@pawlean)** · 2026-02-25

Great news: this has been partially addressed! Vercel has increased the Python Functions bundle size limit to **500MB unzipped** (previously 250MB). It's not the 4–10GB memory tiers you originally requested, but doubling the bundle limit should meaningfully help with the data processing, PDF generation, and vector embedding workloads you described. Hopefully a step in the right direction!

Reference: https://vercel.com/changelog/python-vercel-functions-bundle-size-limit-increased-to-500mb
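For context on where a per-function memory tier would surface: Vercel already lets you set function memory per path in `vercel.json`. A minimal sketch, assuming a hypothetical API route (`api/generate-pdf.py`) and an illustrative memory value; the actual ceiling depends on your plan, which is exactly the limit this thread asks to raise:

```json
{
  "functions": {
    "api/generate-pdf.py": {
      "memory": 3008,
      "maxDuration": 60
    }
  }
}
```

If the cap were lifted to match AWS, the same `memory` field could in principle accept values up to 10240.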