My deployment doesn't get updated with the latest MongoDB snapshot

Current versus Expected behavior
We are facing an issue where our Astro project, whose content lives in MongoDB, does not use the latest MongoDB data after a redeploy. Specifically, when we make updates to MongoDB via Payload CMS, a new build is triggered successfully via a Vercel deploy hook, but the deployment still uses outdated MongoDB data instead of the latest changes.

Expected behavior: The redeploy process should use the most recent MongoDB data during the build and deployment process.
Current behavior: The deployment uses cached or outdated MongoDB data, even though the build completes successfully.


Code, configuration, and steps that reproduce this issue

  1. Our project is built with Astro and uses MongoDB for database management.
  2. Payload CMS is integrated to manage and update the MongoDB data.
  3. We have set up a hook in Payload CMS to trigger the build process whenever a database update is made (see the sketch after this list).
  4. The build completes successfully, but when deployed, the application does not reflect the updated MongoDB data. Instead, it shows the previous data.
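
For illustration, here is a minimal sketch of such a hook. The posts collection, the import path (which differs between Payload versions), and the VERCEL_DEPLOY_HOOK_URL environment variable are placeholders, not our exact config:

// collections/Posts.ts (sketch)
import type { CollectionConfig } from 'payload/types';

export const Posts: CollectionConfig = {
  slug: 'posts',
  fields: [{ name: 'title', type: 'text' }],
  hooks: {
    afterChange: [
      async ({ doc }) => {
        // Ping a Vercel deploy hook (created under Settings -> Git -> Deploy Hooks)
        // so every content change kicks off a fresh build.
        await fetch(process.env.VERCEL_DEPLOY_HOOK_URL as string, { method: 'POST' });
        return doc;
      },
    ],
  },
};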

Steps to reproduce:

  1. Update data in Payload CMS.
  2. Trigger a build process using the Payload CMS hook.
  3. Redeploy the Astro project.
  4. Observe that the deployed version does not reflect the updated MongoDB data.

Project information (URL, framework, environment, project settings)

  • Framework: Astro
  • Database: MongoDB, managed via Payload CMS
  • Hosting Environment: Vercel (with Astro Vercel adapter)
  • Project Behavior: Static build with hooks triggering builds when MongoDB data is updated.
  • CMS Integration: Payload CMS to update MongoDB content.

We believe this issue may be related to caching during the build process or how static builds are generated in Astro. How can we ensure the deployment reflects the most recent MongoDB data?

Hi, @kedimuzafer! Welcome to the Vercel Community :smile:

Can you share some error logs with us? Might help us get closer to a resolution.

Thanks, it’s a nice place!
I don’t get any errors; builds finish successfully, but the information on the website that comes from my MongoDB doesn’t get updated. When I try the same thing locally, it builds as expected: it fetches the latest information from my MongoDB server and builds.
But on Vercel, it deploys file changes like colors, layout, etc. without any problem, but it doesn’t fetch the latest information from MongoDB. I tried flushing the cache, but it didn’t work. I tried changing the output from static to server, and that didn’t work either. I found some information about the same problem on Vercel; you can check it here: next.js - NextJS application uploaded on Vercel not fetching updated record - Stack Overflow

Thanks.

Some ideas:

  • Ensure fresh data fetching: Add timestamps to MongoDB queries (see the sketch after this list).
  • Implement cache busting: Use unique identifiers for each build.
  • Manage Vercel’s build cache: Force fresh builds by modifying the “Ignored Build Step” (see the vercel.json sketch after this list).
  • Consider Incremental Static Regeneration (ISR) if possible.
  • Verify Payload CMS webhook is correctly triggering Vercel builds.
  • Add more logging to debug the build process.
  • Check Vercel deployment settings, ensuring compatible Node.js version.
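
On the first point, a minimal sketch of a fresh build-time fetch in an Astro page, assuming the official mongodb Node.js driver and placeholder names (MONGODB_URI, mydb, posts) that you would adapt to your setup:

// src/pages/index.astro frontmatter (sketch)
import { MongoClient } from 'mongodb';

// Open a fresh connection for this build so nothing is reused from an earlier process.
const client = new MongoClient(import.meta.env.MONGODB_URI as string);
await client.connect();

// Sorting on a timestamp field makes it easy to confirm in the build logs
// that the newest documents were actually fetched.
const posts = await client
  .db('mydb')
  .collection('posts')
  .find({})
  .sort({ updatedAt: -1 })
  .toArray();
console.log('Fetched', posts.length, 'posts; newest updated at', posts[0]?.updatedAt);

await client.close();

And for the Ignored Build Step, vercel.json accepts an ignoreCommand; a command that exits non-zero makes every deployment run a build instead of being skipped:

// vercel.json (sketch)
{
  "ignoreCommand": "exit 1"
}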

It might be worth sharing your set-up, or as much of it as you can, with v0 to get some more refined troubleshooting steps.

Thank you for your suggestions, Pauline. Let me explain our specific situation:

We’re facing a persistent issue where Vercel keeps using prebuilt artifacts despite our attempts to force fresh builds. Here’s what we’ve tried:

  1. Cache Control Measures:
// vercel.json
{
  "buildCommand": "VERCEL_FORCE_NO_BUILD_CACHE=1 TURBO_FORCE=true NODE_OPTIONS='--no-cache' pnpm install && pnpm build",
  "headers": [
    {
      "source": "/(.*)",
      "headers": [
        {
          "key": "Cache-Control",
          "value": "no-store, no-cache, must-revalidate, proxy-revalidate, max-age=0"
        },
        {
          "key": "Clear-Site-Data",
          "value": "\"cache\", \"storage\""
        }
      ]
    }
  ]
}
  2. Build Process Optimization:
// package.json
{
  "scripts": {
    "clean": "rm -rf .vercel/output dist .output node_modules/.cache .astro",
    "build": "NODE_OPTIONS='--no-cache' pnpm clean && VERCEL_FORCE_NO_BUILD_CACHE=1 TURBO_FORCE=true astro build --force"
  }
}
  3. Deployment Settings:
  • Set VERCEL_FORCE_NO_BUILD_CACHE=1 and TURBO_FORCE=true in environment variables
  • Changed output mode from static to server in Astro config (sketch after this list)
  • Disabled build cache through Git settings
  • Set ignoreBuildStep: false in vercel.json
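
For reference, the output-mode change we made looks roughly like this sketch (the @astrojs/vercel/serverless import is the adapter's serverless entry point; exact paths can vary between adapter versions):

// astro.config.mjs (sketch)
import { defineConfig } from 'astro/config';
import vercel from '@astrojs/vercel/serverless';

export default defineConfig({
  // 'server' renders pages per request instead of baking data in at build time
  output: 'server',
  adapter: vercel(),
});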

However, we’re still seeing this in deployment logs:

[16:54:34.154] VERCEL_FORCE_NO_BUILD_CACHE is set so skipping build cache step
[16:54:35.582] Using prebuilt build artifacts...

The strange part is:

  • Source code changes are visible in the deployment
  • But the build output remains unchanged
  • Local builds work perfectly
  • On Vercel, no changes are reflected in the output at all, whether they’re static or dynamic
  • The site keeps serving the same old build despite multiple deployments
  • Our longest build time is 3 seconds

We’ve tried your suggestions:

  1. :white_check_mark: Fresh data fetching (switched to server output)
  2. :white_check_mark: Cache busting (implemented aggressive cache headers)
  3. :white_check_mark: Build cache management (using environment variables and clean scripts)
  4. :x: ISR (not applicable with our Astro setup)

The core issue seems to be that Vercel is somehow bypassing all our cache-busting attempts and still using prebuilt artifacts, even though the logs indicate the build cache is being skipped.

Would you have any insights on why Vercel might still use prebuilt artifacts despite all these measures? Is there perhaps a deeper level of caching we’re missing?
