First off, other Vercel customers are also reporting this problem:
https://www.reddit.com/r/nextjs/comments/1ijmhqx/soft_404_on_nextjs_webpages_hosted_on_vercel/
Hi everyone, I’ve been dealing with a major indexing issue on a site running the latest versions of Next.js (15.1.6) and React (19.0.0), with PayloadCMS integrated, hosted on Vercel (for reference, https://finlywealth.com/ is the home page).
This issue is practically impossible to reproduce on your own. I’ve tried almost everything you can think of: Chrome Canary with JavaScript disabled, changing the user agent to Googlebot Smartphone, disabling caching, etc. Even the “live test” in Google Search Console renders the page just fine.
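For anyone who wants to poke at this outside the browser, this is roughly the kind of request I’ve been replaying by hand (the UA string is approximately the published Googlebot Smartphone one, and the URL is just the home page as an example):

```ts
// replay-googlebot.ts — rough reproduction attempt (Node 18+, built-in fetch)
// The UA string is approximately the published Googlebot Smartphone UA.
const GOOGLEBOT_UA =
  "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 " +
  "(KHTML, like Gecko) Chrome/120.0.0.0 Mobile Safari/537.36 " +
  "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

async function main() {
  const res = await fetch("https://finlywealth.com/", {
    headers: { "User-Agent": GOOGLEBOT_UA, "Cache-Control": "no-cache" },
  });
  console.log(res.status, res.headers.get("x-vercel-cache"));
  const html = await res.text();
  // A soft 404 still returns 200 here, so the status alone doesn't prove much.
  console.log(html.length, "bytes of HTML");
}

main().catch(console.error);
```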
I’ve even bought a Screaming Frog license to crawl my entire site with the “Googlebot” user agent, and everything came back fine. I ended up setting up Sentry on our Next.js site to narrow down which errors the Googlebot crawler might be hitting, and pretty much it all comes down to this:
Error: Connection closed.
at t(./node_modules/.pnpm/next@15.1.6_@babel+core@7.24.5_@opentelemetry+api@1.9.0_react-dom@19.0.0_react@19.0.0__react@19.0.0_sass@1.77.4/node_modules/next/dist/compiled/react-server-dom-webpack/cjs/react-server-dom-webpack-client.browser.production.js:1498:1)
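For context, Sentry is wired up with the standard @sentry/nextjs setup, nothing exotic; roughly this (the DSN and sample rates below are placeholders, not our real config):

```ts
// sentry.client.config.ts — simplified; DSN and sample rates are placeholders
import * as Sentry from "@sentry/nextjs";

Sentry.init({
  dsn: process.env.NEXT_PUBLIC_SENTRY_DSN,
  tracesSampleRate: 0.1,
  // Capture every error so crawler-only failures like "Connection closed" show up.
  sampleRate: 1.0,
});
```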
Here’s a publicly shareable link on Sentry:
I’ve been trying to figure out what I’m doing wrong on our website for 2+ weeks now. The soft 404 issue persists for Googlebot on most pages, even though a lot of these pages are static content. I’m using ISR and SSG to produce most of these pages, and I’ve integrated unstable_cache around the DB calls made via PayloadCMS to fetch data from our MongoDB Cloud cluster.
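To give a concrete idea of the data-fetching pattern, it looks roughly like this (simplified; the collection name, cache key, tag, and revalidate value are illustrative, not our real ones):

```ts
// lib/getPosts.ts — simplified sketch of a cached Payload/MongoDB call
import { unstable_cache } from "next/cache";
import { getPayload } from "payload";
import config from "@payload-config";

export const getPosts = unstable_cache(
  async () => {
    const payload = await getPayload({ config });
    // Payload's local API talks to MongoDB Cloud under the hood.
    const { docs } = await payload.find({
      collection: "posts",
      limit: 20,
    });
    return docs;
  },
  ["posts-list"],                        // cache key parts
  { revalidate: 3600, tags: ["posts"] }  // illustrative revalidate window
);
```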
I’ve even increased the maxDuration setting on all our pages to 60 seconds and set the function duration in the Vercel dashboard to 45 seconds. I’ve also boosted the function CPU allocation on Vercel to the maximum, hoping to get this resolved, but still no success.
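For completeness, the maxDuration bump is just the standard route segment config on each page, along the lines of this sketch (the revalidate value here is illustrative, and the real pages obviously render actual content):

```ts
// app/some-page/page.tsx — sketch of the segment config on the affected pages
export const maxDuration = 60;  // seconds; the 60s bump mentioned above
export const revalidate = 3600; // illustrative ISR window, not the real value

export default async function Page() {
  // ...static/ISR content rendered here
  return <main>…</main>;
}
```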