Google Search Console is suddenly reporting crawl issues. Since this morning it has been intermittently saying it can't fetch robots.txt or the URLs themselves (server connection issue). The site has been working fine for two years and I didn't really change anything. What could be going on? (It's a blog.domain dot com setup.)
Welcome to the Vercel Community, @anirudhbg-3848!
First, let’s check the current status:
- Can you access https://blog.yourdomain.com/robots.txt directly in your browser?
- Try testing with curl:
curl -I https://blog.yourdomain.com/robots.txt
Usually the issue is around:
- Middleware errors: If using Next.js middleware, ensure it handles requests without locale headers (Googlebot doesn’t always send Accept-Language)
- DNS configuration: Verify your subdomain CNAME is pointing correctly to cname.vercel-dns.com
- Firewall rules: Check if you have any WAF rules that might be blocking crawlers intermittently
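The middleware point above can be sketched as a defensive header parse. This is a hypothetical helper, not code from your project: it extracts a locale from an Accept-Language header that may be absent (as with some Googlebot requests) and falls back to a default instead of throwing. The supported-locale list is an assumption for illustration.

```typescript
// Hypothetical helper for Next.js middleware: never assume
// Accept-Language exists, since Googlebot may omit it.
const SUPPORTED = ["en", "de", "fr"] as const; // assumed locale list
type Locale = (typeof SUPPORTED)[number];

function pickLocale(acceptLanguage: string | null): Locale {
  if (!acceptLanguage) return "en"; // missing header: fall back, don't crash
  // Take the first language tag, e.g. "de-DE,de;q=0.9" -> "de"
  const primary = acceptLanguage.split(",")[0].split("-")[0].trim().toLowerCase();
  return (SUPPORTED as readonly string[]).includes(primary)
    ? (primary as Locale)
    : "en";
}
```

In actual middleware you would call `pickLocale(request.headers.get("accept-language"))` and rewrite or redirect accordingly; the key point is that a `null` header returns the default rather than throwing inside the edge function.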
For immediate debugging:
- Check your Logs for any 500 errors around the time Google reported failures
- Look for EDGE_FUNCTION_INVOCATION_FAILED errors, which indicate middleware issues
- Try the URL Inspection tool in Google Search Console to get real-time crawl results
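If middleware does turn out to be the culprit, a common mitigation is to serve robots.txt statically so crawler fetches don't depend on the edge function at all. A minimal sketch, assuming the Next.js App Router file convention `app/robots.ts` (the return shape mirrors Next's `MetadataRoute.Robots`; the sitemap URL is a placeholder):

```typescript
// app/robots.ts — assumed Next.js App Router metadata file convention.
// Next generates /robots.txt from this object, so crawler fetches
// don't depend on middleware succeeding.
function robots() {
  return {
    rules: [{ userAgent: "*", allow: "/" }],
    sitemap: "https://blog.yourdomain.com/sitemap.xml", // placeholder URL
  };
}

export default robots;
```

You can also exclude `/robots.txt` from your middleware's `matcher` config so those requests never invoke the edge function in the first place.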