ISR Writes still occurring, maybe due to new Date(publishedAt) etc?

I noticed that even though I removed all generateStaticParams() usage from my codebase on August 16th, I have still had over 170,000 ISR writes since then. My assumption is it's something in my [slug] page (apps/web/app/[slug]/page.tsx in JamesSingleton/redshirt-sports on GitHub), but I honestly have no idea.

In reading Incremental Static Regeneration usage and pricing, it mentions making sure you're not using new Date(). Well, that's kind of hard to do when you have to show dates in your blog posts or add dates to your ld+json.

I switched as much as I could to use date-fns and date-fns-tz. However, I still need new Date().getFullYear() for things like the copyright in my ld+json scripts as well as the footer.

Is there even a replacement for this?
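The closest thing I've sketched so far (function name is illustrative, not from my actual codebase): for the per-article ld+json at least, the year can be derived from data that is already stable per page, like the post's publishedAt, so each regeneration produces identical output instead of calling new Date() at render time:

```typescript
// Sketch: derive the copyright year from per-page data that never changes
// between regenerations (the post's publishedAt), instead of new Date().
// Function name is illustrative.
export function copyrightYear(publishedAt: string): number {
  return new Date(publishedAt).getUTCFullYear();
}
```

That only covers the ld+json case, though; the site-wide footer year would still need hardcoding or a build-time constant.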

Hi @jamesrsingleton, sorry for the late response.

Can you check in Dashboard > Your project > Observability > ISR to see what could be causing this?

Once you have this information we can dig into why we are seeing so many writes.

Here is a screenshot of that page. I just don't know why /[slug] has had 586 ISR writes in the last 12 hours but only 394 reads. This is the code for that page.

Hi @jamesrsingleton, thanks for sharing. Let me dig into this.

Hi @jamesrsingleton, how many pages are there under the /slug route based on the Sanity data you have? Also, how often is the underlying data changing?

We have 642 articles that would fall under /[slug]. Articles are published maybe 2-3 times a week; some weeks it's more and some weeks it's less.


I see. I get that new articles are published 2-3 times a week but what about changes to the existing ones? I’m still working on this one. It’s quite an interesting case.

Oh, once an article is published it's rarely re-published. There might be one-offs where a capitalization was missed or something like that, but that's few and far between. There's more publishing of new articles than updating of old ones. The only thing I can think of off the top of my head is Sanity Live and how it revalidates a tag when new content is published, if the newly published thing impacts a page. But I'm also not publishing that many things in our studio to hit 700+ writes every 12 hours.

Got it. This information will help me debug as well. My initial hunch is also about Sanity Live because otherwise you have 1 week of caching, so even if all pages are generated in 12 hours, they shouldn’t be generated in the next 12 hours.

I think we've gone down this path before, but have you tried time-based revalidation in Sanity? This should give us some answers. Try setting it to the same value as your page (1 week). If this works, then we can use path-based revalidation to add more granular support for page refreshes.
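For reference, a minimal sketch of that suggestion, assuming a standard App Router page (the file path is assumed):

```typescript
// app/[slug]/page.tsx (fragment) -- assumed path.
// Time-based revalidation via Next.js route segment config:
// regenerate each article page at most once per week, matching
// the 1-week page cache discussed above.
export const revalidate = 604800; // 60 * 60 * 24 * 7 seconds
```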

Except there's an issue with that, right? Let's say I set export const revalidate = 604800; that would mean any new posts wouldn't be fetched and no updates to articles would be captured until a week later, right?

I also noticed that I am still having a 50% cache miss rate every 12 hours, which is odd.

So I thought maybe it was the new Date().getFullYear() that was in my /[slug] ld+json markup or in the footer of the entire website. However, I hardcoded the values 45 minutes ago in production and I am still getting 7x more writes than reads for ISR in the past hour. There haven't been any new articles published in this time frame; the last one was 2 hours ago at the time of writing this comment.

It's like every single article that gets hit is updating the ISR cache for some reason.

In the case of the screenshot, it's updating the cache for an article that was published 3 years ago: published at 2022-11-22T16:58:45.679Z and last updated 2025-07-16T20:12:33Z.

I have a hunch, based on the Sanity Live docs, that it is doing a revalidation. Also, here you can learn about tag-based revalidation; my assumption is there is a change in data that's revalidating all posts.

We can confirm that if you put time-based or tag-based revalidation in sanityFetch like I suggested. I think it's worth a try at least.

So according to their docs (The official Sanity toolkit for Next.js | Sanity.io plugin), they automatically add tags prefixed with sanity: when using sanityFetch from defineLive. I have confirmed this by watching the Sanity Live revalidations come through.
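For context, this is roughly the shape of that setup (the client import path is an assumption, not my actual file layout):

```typescript
// Sketch of a next-sanity Live setup; the client import path is assumed.
import { defineLive } from "next-sanity";
import { client } from "@/sanity/lib/client";

// sanityFetch automatically tags its underlying fetches with
// "sanity:"-prefixed cache tags, and SanityLive triggers revalidation
// of those tags when content changes in the dataset.
export const { sanityFetch, SanityLive } = defineLive({ client });
```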

So sanityFetch is already adding tags for revalidating on their end, so I shouldn't need to add them on my end. Furthermore, I tried adding cache: 'force-cache' in the client, but it looks like that completely disables revalidation: after I deployed that change, I made an update to an article and it didn't update at all. The tag revalidation doc you linked to is for the old way of revalidating, which uses webhooks and a Next.js API endpoint. With defineLive you don't need that anymore.

I see. About the image you shared: is this Sanity revalidating multiple tags in one page invocation? Have you confirmed what these tags belong to?

Yeah, those 5 revalidated tag: calls are from a single post publish. It kind of makes sense when you consider this post appears on its own page, on the home page, on /college/news, /college/football/news, /college/football/news/fcs, and /college/football/news/fcs/big-sky-conference. All of those pages make a request for posts, so when one is published (or updated, in this case), they all have to be revalidated. It would be awesome if it were smarter and just revalidated the actual page itself, but that's outside of Next.js :sweat_smile:
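To illustrate the fan-out (the "posts" tag here is hypothetical; the real tags are the sanity:-prefixed ones Sanity generates): any route whose fetch shares a cache tag gets invalidated together, so one publish touches every page that queries posts:

```typescript
// Illustrative only; the real tags are "sanity:"-prefixed ones
// generated by Sanity, not this hypothetical "posts" tag.
import { revalidateTag } from "next/cache";

// Suppose /, /college/news, and /[slug] all fetch posts like this:
//   fetch(url, { next: { tags: ["posts"] } })
// Then a single call after a publish marks all of those pages stale:
revalidateTag("posts");
```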

As for confirming what these tags belong to, is there a way to see that in Vercel somewhere? These tags are generated on Sanity's backend, which I don't have any visibility into that I know of.

What's kind of odd is that I went back this morning and looked at the past 12 hours, and even though I had pretty much all cache hits, I was still seeing 1.3k ISR writes. In those 12 hours none of my writers were active on the website, and a new version of the website was deployed 13 hours ago. I removed all the possible new Date() usage in my /[slug] route by hardcoding the values. So if I'm understanding the docs correctly, there should have been no ISR writes happening. However, there were 1.3k writes during those 12 hours… I don't even have 1,300 articles on the site; I have about half that. Unfortunately I can't see the actual paths on the free tier, but this basically suggests that each article is being re-written twice every 12 hours.

What's even weirder now is that, an hour after the last screenshot, we have way more ISR reads than writes, which is good. But honestly it just makes me even more confused.

I see. Thanks for explaining. I think Next.js and Vercel have no visibility on the tags’ source either. So, it’s all Sanity black magic.