Sitemap not getting updated

I have been trying to update my sitemap.xml, but it is not getting updated in production. Here’s what I’ve done so far:

  1. Updated the sitemap.
  2. Added a custom header in next.config.mjs to revalidate the cache (a sketch of that kind of config is below).
  3. Purged the Vercel cache from project settings.
  4. Redeployed several times.
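
Since the config itself wasn't shared, here is a rough sketch of the kind of next.config.mjs header override step 2 describes; the source path and Cache-Control value are placeholders, not necessarily what was actually used:

`next.config.mjs`
/** @type {import('next').NextConfig} */
const nextConfig = {
  async headers() {
    return [
      {
        // Serve the static sitemap with a header that discourages stale caches
        source: "/sitemap.xml",
        headers: [
          { key: "Cache-Control", value: "no-cache, no-store, must-revalidate" },
        ],
      },
    ];
  },
};

export default nextConfig;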

Despite these steps, I am still seeing the old sitemap at the production URL. For reference, I created a static sitemap.xml and am using Next.js version 14.2.20.

Can you help me resolve this issue?

Hey, Finfao! Welcome :wave:

Could you make sure your sitemap.xml is placed in the correct location? In Next.js, static files should go in the public folder, so your sitemap.xml should be at public/sitemap.xml.
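
For reference, a minimal static sitemap served from public/sitemap.xml looks something like this (the URL and dates below are placeholders, not your actual content):

`public/sitemap.xml`
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2025-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>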

Yes, I do see in the build logs that the sitemap is generated, but the newer one is not visible. I even tried with Postman, and I've tried everything the forums suggest.

Hi @live-play1, thanks for sharing more context. When you run the build locally, is the new sitemap.xml file generated and placed in the public/ folder?

If you need more help, please share your public repo or a minimal reproducible example. That will let us all work together from the same code to figure out what’s going wrong.

Current vs. Expected Behavior

I deployed my Next.js application on Vercel and generated a sitemap.xml file. However, when I upload the sitemap to Google Search Console, it doesn't seem to work as expected. I get this error: "Sitemap could not be read".

I expect the sitemap to be accessible at https://dt-media.vercel.app/sitemap.xml and correctly list all relevant pages with proper <loc>, <lastmod>, <changefreq>, and <priority> attributes.

Code, Configuration, and Steps to Reproduce

  1. My sitemap.xml output:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://dt-media.vercel.app/icon.png</loc><lastmod>2025-02-18T15:17:02.667Z</lastmod><changefreq>weekly</changefreq><priority>1</priority></url>
  <url><loc>https://dt-media.vercel.app/portfolio/logo-design</loc><lastmod>2025-02-18T15:17:02.668Z</lastmod><changefreq>weekly</changefreq><priority>1</priority></url>
  <url><loc>https://dt-media.vercel.app/portfolio</loc><lastmod>2025-02-18T15:17:02.668Z</lastmod><changefreq>weekly</changefreq><priority>1</priority></url>
  <url><loc>https://dt-media.vercel.app</loc><lastmod>2025-02-18T15:17:02.668Z</lastmod><changefreq>weekly</changefreq><priority>1</priority></url>
  <url><loc>https://dt-media.vercel.app/portfolio/photography</loc><lastmod>2025-02-18T15:17:02.668Z</lastmod><changefreq>weekly</changefreq><priority>1</priority></url>
  <url><loc>https://dt-media.vercel.app/portfolio/graphic</loc><lastmod>2025-02-18T15:17:02.668Z</lastmod><changefreq>weekly</changefreq><priority>1</priority></url>
  <url><loc>https://dt-media.vercel.app/services</loc><lastmod>2025-02-18T15:17:02.668Z</lastmod><changefreq>weekly</changefreq><priority>1</priority></url>
  <url><loc>https://dt-media.vercel.app/contact</loc><lastmod>2025-02-18T15:17:02.668Z</lastmod><changefreq>weekly</changefreq><priority>1</priority></url>
  <url><loc>https://dt-media.vercel.app/portfolio/videography</loc><lastmod>2025-02-18T15:17:02.668Z</lastmod><changefreq>weekly</changefreq><priority>1</priority></url>
  <url><loc>https://dt-media.vercel.app/about</loc><lastmod>2025-02-18T15:17:02.668Z</lastmod><changefreq>weekly</changefreq><priority>1</priority></url>
</urlset>
  2. Steps to Reproduce:
  • I generated the sitemap in my Next.js app using the next-sitemap package (npm install next-sitemap).
  • Deployed the project to Vercel.
  • The sitemap can be found at https://dt-media.vercel.app/sitemap.xml.
  • The sitemap does not load properly when I try to add it to Google Search Console.

Project Information

  • Framework: Next.js app router
  • Hosting: Vercel
  • Deployment Type: Static Export
  • Custom Configuration: None specific to sitemap.xml
  • Expected Behavior: The sitemap should be accessible at https://dt-media.vercel.app/sitemap.xml and correctly serve all URLs.

Questions:

  1. Does Vercel require additional configuration to serve a sitemap.xml correctly?
  2. Could this be a caching issue, or does Next.js need additional settings for sitemaps in next.config.js?
  3. How do I verify if search engines can access and parse this sitemap correctly?

Hi, @oo-ng! Welcome to the Vercel Community :wave:

I’ve moved your discussion topic here as it seems related.

Do you have any error messages that you can share with us? :pray:

I am using a static sitemap, so sitemap.xml is already in the public folder. Should I use a dynamic approach or a ts file to generate the sitemap?


This is what I get when I build it locally; however, I don't see sitemap.xml or a public folder in my dist folder.
I have a next-sitemap.config.js, if it's relevant:

/** @type {import('next-sitemap').IConfig} */
module.exports = {
    siteUrl: process.env.SITE_URL || 'https://liveplay.in',
    generateRobotsTxt: true,
}

Should I use sitemap.ts and generate the sitemap, or should I have a sitemap.xml that contains all the relevant entries?

On Vercel it shows sitemap-0.xml as well: I have a sitemap.xml that points to sitemap-0.xml, and that file has all the XML data. But when the build is run locally, I don't see anything of the sort.

Hi @live-play1, thanks for sharing more details.

Let’s take it step by step:

  1. If you're using a pre-generated sitemap.xml and keeping it at /public/sitemap.xml, that's a correct approach and should work. If that's what you're doing, let's debug it.
  2. If you're using the next-sitemap package, then we will have to debug it differently.

So, don't change the approach just yet. Can you confirm which method you were using initially?
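
For reference, if it turns out you're relying on the next-sitemap package, the usual setup is the config file you already shared plus a postbuild script, so the files in public/ are regenerated on every build. This is a general sketch of the documented setup, not necessarily how your project is currently wired:

`package.json` (scripts only)
{
  "scripts": {
    "build": "next build",
    "postbuild": "next-sitemap"
  }
}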

Hi, I see this discussion was nipped, but I'm getting a similar issue. I have a sitemap.ts that generates a sitemap XML for me, and a robots.txt that directs the robots to the correct sitemap location. The sitemap XML is available via the browser, validate-xml-sitemap gives me "no issues detected", and I get a 200 and correct headers via a curl command. However, when I submit the sitemap to Google Search Console, I get "Sitemap could not be read". I tried generating a static XML and saving it in public/sitemap.xml, but hit the same error; I even tried generating a simple XML sitemap with just three links - public/test-sitemap.xml - but got the same error on GSC. I suspect the issue might be Vercel accidentally blocking the crawlers? I see DDoS mitigation instances in the project's firewall (I didn't set up any firewall rules), which might be the crawlers being blocked by Vercel (?..). Can you help? What should I do now?

@sfkislev Welcome to the Community! :waving_hand:

Funnily enough I actually did this myself over the weekend for a project, and it worked fine.

Sharing my configuration, in case it’s helpful:

`robots.ts`
import type { MetadataRoute } from "next/types"

const SITE_URL = "URL"

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: "*",
      allow: "/",
    },
    sitemap: `${SITE_URL}/sitemap.xml`,
  }
}

`sitemap.ts`
import type { MetadataRoute } from "next/types"

const WORDPRESS_API_URL = "URL"
const SITE_URL = "URL"

// Define types for WordPress content
interface WordPressPost {
  id: number
  slug: string
  date: string
  modified: string
  link: string
}

// Fetch posts from WordPress
async function fetchPosts(page = 1, perPage = 100): Promise<WordPressPost[]> {
  try {
    const res = await fetch(
      `${WORDPRESS_API_URL}/posts?page=${page}&per_page=${perPage}&_fields=id,slug,date,modified,link`,
      { next: { revalidate: 3600 } },
    )

    if (!res.ok) {
      throw new Error(`Failed to fetch posts: ${res.status}`)
    }

    const posts = (await res.json()) as WordPressPost[]
    const totalPages = Number.parseInt(res.headers.get("X-WP-TotalPages") || "1", 10)

    if (page < totalPages && page < 10) {
      // Limit to 10 pages to avoid too many requests
      const nextPagePosts = await fetchPosts(page + 1, perPage)
      return [...posts, ...nextPagePosts]
    }

    return posts
  } catch (error) {
    console.error("Failed to fetch WordPress posts:", error)
    return []
  }
}

// Fetch pages from WordPress
async function fetchPages(): Promise<WordPressPost[]> {
  try {
    const res = await fetch(`${WORDPRESS_API_URL}/pages?per_page=100&_fields=id,slug,date,modified,link`, {
      next: { revalidate: 3600 },
    })

    if (!res.ok) {
      throw new Error(`Failed to fetch pages: ${res.status}`)
    }

    return (await res.json()) as WordPressPost[]
  } catch (error) {
    console.error("Failed to fetch WordPress pages:", error)
    return []
  }
}

// Create a URL that uses the main domain
function createUrl(wpUrl: string): string {
  try {
    // Extract the path from the WordPress URL
    const urlObj = new URL(wpUrl)
    const path = urlObj.pathname

    // Create a new URL with the main domain
    return `${SITE_URL}${path}`
  } catch (error) {
    console.error("Error creating URL:", error)
    return wpUrl
  }
}

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  // Fetch all posts and pages
  const [posts, pages] = await Promise.all([fetchPosts(), fetchPages()])

  // Create sitemap entries for posts
  const postEntries = posts.map((post) => ({
    url: createUrl(post.link),
    lastModified: new Date(post.modified || post.date),
    changeFrequency: "weekly" as const,
    priority: 0.6,
  }))

  // Create sitemap entries for pages
  const pageEntries = pages.map((page) => ({
    url: createUrl(page.link),
    lastModified: new Date(page.modified || page.date),
    changeFrequency: "monthly" as const,
    priority: 0.8,
  }))

  // Add homepage and feed
  const staticEntries = [
    {
      url: SITE_URL,
      lastModified: new Date(),
      changeFrequency: "daily" as const,
      priority: 1.0,
    },
    {
      url: `${SITE_URL}/api/feed`,
      lastModified: new Date(),
      changeFrequency: "daily" as const,
      priority: 0.8,
    },
  ]

  // Combine all entries
  return [...staticEntries, ...pageEntries, ...postEntries]
}


It could be helpful to look at your current setup!

Hi, thanks for sharing. Here's the sitemap.ts, here's the robots.ts, and here's the resulting sitemap.xml. It's in public/sitemap.xml. Not sure how you attached the code so nicely - sorry about the links.

TL;DR: Working fine for me

I’m using next-sitemap 4.2.3 right now with no issues. My config:

/** @type {import('next-sitemap').IConfig} */
module.exports = {
  siteUrl: process.env.NEXT_PUBLIC_SITE_URL || "https://bestcodes.dev",
  generateRobotsTxt: true,
};

This generates public/sitemap.xml and public/sitemap-0.xml files, which are mostly empty since I recently rebuilt my website and haven’t migrated all the old pages, but they work:

https://bestcodes.dev/sitemap.xml
https://bestcodes.dev/sitemap-0.xml

My site is deployed on Vercel with no issues (using Next.js 15.3.0).

Logs:

next-sitemap logs from my last deployment
$ rm -rf public/sitemap*.xml public/robots.txt && next-sitemap
✨ [next-sitemap] Loading next-sitemap config: file:///vercel/path0/next-sitemap.config.js
✅ [next-sitemap] Generation completed
┌───────────────┬────────┐
│ (index)       │ Values │
├───────────────┼────────┤
│ indexSitemaps │ 1      │
│ sitemaps      │ 1      │
└───────────────┴────────┘
-----------------------------------------------------
 SITEMAP INDICES 
-----------------------------------------------------
   ○ https://bestcodes.dev/sitemap.xml
-----------------------------------------------------
 SITEMAPS 
-----------------------------------------------------
   ○ https://bestcodes.dev/sitemap-0.xml

Questions for people using next-sitemap:

  • Are the sitemap files git-ignored? (They should be)
  • What version of next-sitemap are you using?
  • Do you remove current sitemap files (if they exist) before running next-sitemap? (You should)

Thanks for your response! I'm not using next-sitemap, but the native MetadataRoute.Sitemap. The resulting sitemap looks fine, I think (it's attached above). It's quite large, at ~15k entries. I've also downloaded it and put it in /public so that it behaves like a static file.

There's a promising sign: the test-sitemap.xml I created and submitted yesterday, with only 3 entries, got a "couldn't fetch" at first and is now a green "success" in Google Search Console, which might indicate that "couldn't fetch" is Google's confusing way of saying "wait"?
However, in the Vercel firewall settings, looking at yesterday, I see 213 denials and 588 challenges, all DDoS mitigation and coming from the same IPs (154.83.103.202 challenged, 154.83.103.113 denied). These seem to be the crawlers of either Googlebot or the Ahrefs site audit I attempted to run, being blocked. (?)

I think the original poster, @live-play1, was using next-sitemap, so my post was to him. :slight_smile:

As for the "couldn't fetch" error in Google Search Console, ignore it. I get the same error every time I upload a sitemap. Just wait a few minutes and reload the page.

+1, seen this myself. What’s the update today, @sfkislev? Sounds like you made some progress here.

I don’t see how this could be the case given your robots file is permissive.

Well.. that's confusing, and I did see it happen with my smaller test file. Still, I'm about 24 hours on, and I still see a "couldn't fetch" message in Google Search Console for my sitemap. On the Vercel firewall dashboard I still see DDoS mitigation denials from today (though only ~150, compared to yesterday's ~2k, and to the ~15k-entry sitemap…). I strongly suspect that this mitigation is blocking the crawlers. Don't you?
