I’m encountering an issue with my deployment where I receive the error: Error: A Serverless Function has exceeded the unzipped maximum size of 250 MB. Unfortunately, the error message doesn’t specify which function is causing the problem or provide additional context like library sizes.
By analyzing the .nft.json files in the .next directory, I managed to identify the function responsible for this issue. However, I’m wondering if there’s a more efficient or recommended way to pinpoint the problematic function without manually inspecting these files.
Additionally, when I sum up all the file sizes defined in the .nft.json file, the total is still smaller than the size displayed in the Vercel dashboard. Is there a way to see exactly what is deployed within a serverless function to understand the discrepancy?
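For anyone wanting to automate the manual inspection described above, here is a minimal sketch that ranks build traces by the total on-disk size of the files they reference. It assumes the default Next.js output layout (`.next/server/**/*.nft.json`) and that paths inside each trace are relative to the trace file itself; adjust `build_dir` if your project differs.

```python
# Sketch: rank Next.js serverless function traces by total traced file size.
# Assumption: traces live under .next/server and list paths relative to
# the .nft.json file itself (the Node File Trace format).
import json
from pathlib import Path


def traced_size(nft_path: Path) -> int:
    """Sum the on-disk sizes of all files referenced by one .nft.json trace."""
    data = json.loads(nft_path.read_text())
    total = 0
    for rel in data.get("files", []):
        f = (nft_path.parent / rel).resolve()
        if f.is_file():
            total += f.stat().st_size
    return total


def rank_functions(build_dir: str = ".next/server") -> list[tuple[str, int]]:
    """Return (trace path, total bytes) pairs, largest first."""
    traces = Path(build_dir).rglob("*.nft.json")
    sizes = [(str(p), traced_size(p)) for p in traces]
    return sorted(sizes, key=lambda t: t[1], reverse=True)


if __name__ == "__main__":
    for name, size in rank_functions():
        print(f"{size / 1024 / 1024:8.1f} MB  {name}")
```

Note that this only measures the files a trace points at, so it will not account for anything the platform adds on top (which may explain part of the dashboard discrepancy), but it is usually enough to spot the one function that blows past the limit.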
Any insights or guidance on better managing and diagnosing these size issues would be greatly appreciated!
This can happen when a large number of runtime dependencies is being shipped with the function. There are some dependency management tools that can help, such as:
Thank you @amyegan
Should the sum of the client bundle size and the sizes of all files listed in the .nft.json file approximately match the function size displayed on the dashboard?
I’m trying to understand what ultimately gets included in the serverless function.
Relevant update for anyone landing on this thread — Vercel has increased the Python Functions bundle size limit to 500 MB unzipped (up from 250 MB), so there’s a lot more breathing room now.
The point about the error message not identifying which specific function or dependency is responsible is still valid for cases where you do hit the limit. If you have tips or tooling that helped you debug bundle size issues, sharing them here would be really useful for the community!