[▲ Vercel Community](/)

[Help](/c/help/9)

# Python - Serverless Function has exceeded the unzipped maximum size

309 views · 0 likes · 3 posts


mchen10 (@mchen10) · 2025-03-21

I know this is an oft-asked question. Just curious if you have ideas for my specific use case. 

I'm deploying a Python backend. The packages are fairly heavy (an LLM use case), so bundling all my requirements together exceeds the 250 MB limit. I worked around this temporarily by splitting my backend into two serverless functions: one for my server and one for my LLM calls.

When I actually call the LLM functions from my server code, it fails with "Module not found" errors, where the missing modules are the LLM packages I imported in my LLM serverless function.

I think I might not be understanding serverless functions fundamentally, but is there a way to split up the functions such that one serverless function only depends on its own packages, and can make calls to another without having to import everything?
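For concreteness, here's roughly the shape of what I'd like to do instead of importing: call the LLM function's route over HTTP. (The URL and payload below are simplified placeholders, not my real code.)

```python
# Sketch: invoking the second serverless function over HTTP instead of
# importing its packages. URL, route, and payload are placeholders.
import json
import urllib.request

AI_FUNCTION_URL = "https://my-app.vercel.app/api/ai_framework"  # placeholder route


def build_request(prompt: str) -> urllib.request.Request:
    """Build a POST request to the LLM function's route."""
    return urllib.request.Request(
        AI_FUNCTION_URL,
        data=json.dumps({"prompt": prompt}).encode(),
        headers={"Content-Type": "application/json"},
    )


def call_llm(prompt: str) -> dict:
    """Call the other function over the network; no LLM packages needed here."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        return json.load(resp)
```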

My vercel.json, if helpful:

```json
  "builds": [
    {
      "src": "api/main.py",
      "use": "@vercel/python",
      "config": {
        "installCommand": "pip install -r requirements.txt"
      }
    },
    {
      "src": "api/ai_framework/__init__.py",
      "use": "@vercel/python",
      "config": {
        "installCommand": "pip install -r requirements-ai.txt"
      }
    },
    {
      "src": "package.json",
      "use": "@vercel/static-build",
      "config": {
        "distDir": "dist"
      }
    }
  ],
```


Anshuman Bhardwaj (@anshumanb) · 2025-03-21

Hi @mchen10, thanks for the detailed question.

First things first: the vercel.json you shared uses the legacy `builds` configuration. I'd recommend following the latest project configuration docs: https://vercel.com/docs/project-configuration
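With the modern setup you usually don't need `builds` at all: files under `api/` are detected automatically, and a `requirements.txt` at the project root is installed for you. If you still need per-function settings, something like this sketch (adjust the path and values to your project) replaces the legacy block:

```json
{
  "functions": {
    "api/main.py": {
      "maxDuration": 60
    }
  }
}
```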

Now, about your use case: serverless functions are meant to spin up quickly and get torn down just as easily depending on traffic, so they need to stay lightweight. Backend APIs that depend on very large local dependencies aren't a great fit here.

Also, splitting your code into multiple functions won't solve the problem by itself. Each function is bundled and deployed in isolation, so one function can never import packages that were only installed for another; that's exactly why you're seeing "Module not found". In practice, splitting mostly makes dependency management and the overall codebase harder.

I'd recommend using API-first LLM providers such as OpenAI or Anthropic (Claude), because calling a hosted model doesn't require such big dependencies.
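For example, calling a hosted model needs nothing beyond the standard library. This is just an illustrative sketch: the model name is an example, and it assumes an `OPENAI_API_KEY` environment variable is set.

```python
# Sketch: API-first LLM call with zero heavy dependencies.
# Model name is an example; assumes OPENAI_API_KEY is set in the environment.
import json
import os
import urllib.request

API_URL = "https://api.openai.com/v1/chat/completions"


def build_payload(prompt: str, model: str = "gpt-4o-mini") -> dict:
    """Assemble the chat-completion request body."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}


def ask(prompt: str) -> str:
    """Send the prompt to the hosted model and return the reply text."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

The function bundle then only ships your own code plus the standard library, which stays far below any size limit.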

I hope this was helpful.


Pauline P. Narvas (@pawlean) · 2026-02-25

Good news — this should be resolved now! Vercel has increased the Python Functions bundle size limit from **250MB to 500MB** (unzipped).

You had to split your LLM backend into two separate functions as a workaround, but you should now be able to consolidate everything back into a single function and redeploy without hitting the size limit.

Reference: https://vercel.com/changelog/python-vercel-functions-bundle-size-limit-increased-to-500mb

Let us know if you still run into any issues!