Hi Vercel team,
Hobby plan, project “primex” (team zv5jrjrqw8-1039s-projects, project ID
prj_2S6n5cPcTEdkFzd6zeCAdjvKwfzg). Subdomain gamsat.primexstudy.com.au.
Issue: Facebook’s social-card scraper (facebookexternalhit) is being
challenged at the edge by your automatic DDoS Mitigation. Meta’s Sharing
Debugger consistently returns HTTP 403 with the misleading “could be due
to robots.txt block” hint. Net effect: every shared GAMSAT link renders
a broken or hostname-only OG card on Facebook, Messenger, and WhatsApp.
What I have ruled out:
- robots.txt: explicitly allows facebookexternalhit / Twitterbot /
  LinkedInBot / Slackbot-LinkExpanding / WhatsApp / TelegramBot
- Application-layer block: direct curl with a facebookexternalhit/1.1 UA
  from my IP returns 200 OK
- Custom Firewall Rule: I added one matching "User-Agent contains
  facebookexternalhit" (and the other scrapers) with action Bypass. It
  does not help, because the system DDoS Mitigation runs before Custom
  Rules in the execution order, so the request is rejected before my
  rule fires
- LinkedIn Post Inspector and direct shares: work fine, return 200
- Vercel Firewall Overview: shows ~41 Challenged requests per hour from
  the DDoS Mitigation rule, matching the FB scrape attempts
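For reference, this is roughly the application-layer check I ran (UA string is the one Meta publishes for its scraper; note this only proves the app itself returns 200 — my residential IP is presumably treated differently at the edge than Meta's crawl ranges):

```shell
# HEAD request with Facebook's scraper User-Agent; print only the
# status line. Returns "HTTP/2 200" when run from my machine.
curl -sI \
  -A "facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)" \
  "https://gamsat.primexstudy.com.au/" | head -n 1
```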
What I would like:
Could you whitelist Facebook’s published facebookexternalhit IP ranges
at the system level for this project, so the scraper can fetch OG meta
without hitting the IP-based DDoS challenge?
The Pro plan's System Bypass Rules feature would let me do this myself
by adding the CIDR ranges, but I am running a hobby-stage launch and
would prefer not to upgrade for this single issue if a manual whitelist
is feasible on your end.
Verified working as expected:
- LinkedIn cards render correctly
- Twitter/X cards re-fetch on share
- Direct curl from outside Vercel’s PoPs returns 200
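For completeness, the relevant robots.txt entries look roughly like this (paraphrased; the live file is served at the subdomain root):

```text
User-agent: facebookexternalhit
Allow: /

User-agent: Twitterbot
Allow: /

User-agent: LinkedInBot
Allow: /

User-agent: Slackbot-LinkExpanding
Allow: /

User-agent: WhatsApp
Allow: /

User-agent: TelegramBot
Allow: /
```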
Thanks!