Questions Regarding AI Gateway Billing, Usage Tracking, etc.

Hi Vercel team and developer community,

I’m currently evaluating the Vercel AI Gateway for a multi-tenant SaaS platform focused on developers. The AI Gateway is incredibly appealing because it abstracts away the complexity of multi-model LLM integration, and it aligns well with our goal at Wise Duck Dev GPTs of supporting 800+ specialized AI Assistants (e.g., for web, mobile, blockchain, and game development), especially now that we are building version 3 of the platform.

That said, I’m still determining whether to:

  • Fully adopt the Gateway as the foundation of our LLM infrastructure
  • Or continue building a BYOK (Bring Your Own Key) system to manage user consumption independently

Core Use Case for V3 of Wise Duck Dev GPTs (Hypothetical for Now)

  • Users interact with domain-specific Assistants that use different LLMs
  • Models can change mid-conversation (e.g., GPT-4o → Claude 3 → Gemini 1.5)
  • Usage-based billing is tied to individual user behavior (a rough sketch of what we picture follows this list)
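To make that concrete, here is roughly how we picture one conversation hopping between models through the Gateway, based on the AI SDK's generateText call and the provider/model strings shown in the Gateway announcement. The model IDs and message contents are placeholders, not a claim about the final API:

```ts
import { generateText } from 'ai';

// Placeholder conversation history; in our app this comes from the chat UI.
const messages = [
  { role: 'user' as const, content: 'Review this Solidity contract for reentrancy issues.' },
];

// First turn on one model...
const first = await generateText({
  model: 'openai/gpt-4o', // Gateway-style "provider/model" string (assumed)
  messages,
});

// ...then the same conversation continues on a different model mid-thread.
const second = await generateText({
  model: 'anthropic/claude-3-sonnet', // placeholder model ID
  messages: [
    ...messages,
    { role: 'assistant' as const, content: first.text },
    { role: 'user' as const, content: 'Now suggest gas optimizations.' },
  ],
});

console.log(second.text);
```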

Key Questions

1. Billing Scope
Will the AI Gateway bill usage at the project level (i.e., to the app owner’s Vercel account), or is there support or future intent for user-scoped usage tracking and billing?
Can we issue unique Gateway tokens per user or session and attribute usage accordingly?
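For context on why we ask: if per-user tokens aren't available, the fallback we're considering is to attribute usage ourselves from the token counts the AI SDK already returns, keyed by our own user IDs. A minimal sketch, assuming a usage object on the result (field names differ between SDK versions) and a hypothetical recordUsage helper on our side:

```ts
import { generateText } from 'ai';

// Hypothetical persistence helper on our side (e.g., a database insert), not a Gateway API.
async function recordUsage(userId: string, model: string, inputTokens: number, outputTokens: number) {
  // e.g. insert into a usage_events table; logged here as a stand-in.
  console.log({ userId, model, inputTokens, outputTokens });
}

export async function askForUser(userId: string, prompt: string) {
  const model = 'openai/gpt-4o'; // placeholder Gateway model ID
  const result = await generateText({ model, prompt });

  // Usage field names vary across AI SDK versions
  // (promptTokens/completionTokens vs. inputTokens/outputTokens), hence the loose cast.
  const usage = result.usage as any;
  const inputTokens = usage.inputTokens ?? usage.promptTokens ?? 0;
  const outputTokens = usage.outputTokens ?? usage.completionTokens ?? 0;

  await recordUsage(userId, model, inputTokens, outputTokens);
  return result.text;
}
```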

2. Usage Transparency
If user-scoped billing is planned:

  • Will we be able to retrieve detailed usage logs per user (e.g., token usage per LLM)?
  • Or will Vercel generate billing reports that we can retrieve through an endpoint? (A hypothetical shape is sketched below.)
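To illustrate what we are hoping for, this is the kind of per-user record we would want such a report or endpoint to return. The type and the endpoint are entirely hypothetical, a wish list rather than an existing Gateway API:

```ts
// Entirely hypothetical shape of a per-user usage report entry.
interface GatewayUsageRecord {
  userId: string;       // our end-user identifier (or a per-user token ID)
  model: string;        // e.g. "openai/gpt-4o"
  inputTokens: number;
  outputTokens: number;
  costUsd: number;      // cost attributed to this call
  timestamp: string;    // ISO 8601
}

// Hypothetical retrieval; the URL is a placeholder, not a real Vercel endpoint.
async function fetchUsageReport(from: string, to: string): Promise<GatewayUsageRecord[]> {
  const res = await fetch(`https://example.com/hypothetical-usage-report?from=${from}&to=${to}`);
  return res.json();
}
```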

3. Feature Suggestion: Credit-Based Model
Would Vercel consider introducing a credit system, where users:

  • Purchase usage credits (e.g., $10 = X tokens)
  • Can auto top up when their balance drops below a threshold
  • Set usage limits to avoid unexpected charges

In my opinion, this would be ideal for SaaS platforms that want to offer pay-as-you-go access while maintaining usage clarity and cost control for their users. (A rough sketch of the credit bookkeeping we would otherwise build ourselves follows.)
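For what it's worth, here is a minimal sketch of the credit bookkeeping we would otherwise have to build ourselves on top of token counts. The conversion rate, threshold, and in-memory storage are all placeholders:

```ts
// Minimal in-memory sketch of the credit model described above.
// A real system would persist balances and call a payment provider.
const CREDITS_PER_DOLLAR = 1_000;   // e.g. $10 = 10,000 credits (placeholder rate)
const AUTO_TOP_UP_THRESHOLD = 500;  // credits
const AUTO_TOP_UP_DOLLARS = 10;

const balances = new Map<string, number>();

export function purchaseCredits(userId: string, dollars: number) {
  balances.set(userId, (balances.get(userId) ?? 0) + dollars * CREDITS_PER_DOLLAR);
}

// Deduct credits for a call and auto top up when the balance drops below the threshold,
// unless the user has opted out (e.g., because they set a hard usage limit).
export function chargeForTokens(userId: string, totalTokens: number, autoTopUpEnabled: boolean) {
  const cost = Math.ceil(totalTokens * 0.1); // placeholder: 0.1 credit per token
  const remaining = (balances.get(userId) ?? 0) - cost;
  balances.set(userId, remaining);

  if (remaining < AUTO_TOP_UP_THRESHOLD && autoTopUpEnabled) {
    purchaseCredits(userId, AUTO_TOP_UP_DOLLARS); // would charge the payment provider here
  }
  return balances.get(userId)!;
}
```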

4. Timeline
Do you have a timeline in mind for when the billing functionality will be deployed?

5. Model-Specific Features
Will the Gateway eventually manage access to model-specific capabilities like:

  • File uploads (e.g., GPT-4o vision or Claude tools)
  • Image generation or interpretation
  • Tool use (code interpreter, retrieval, etc.; a small AI SDK sketch appears after this question)

If so, will those capabilities be:

  • Abstracted automatically (like standard prompts)?
  • Or customizable? And if so, how?

Understanding how AI Gateway will expose and secure these advanced model features is important as we begin designing chat interfaces and workflows.
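As a concrete reference point for the tool-use bullet: with the AI SDK today, a provider-agnostic tool call looks roughly like the sketch below (the tool itself is invented, and the tool-definition property has been renamed between SDK versions). Our question is essentially whether the Gateway will pass this kind of capability through uniformly for every underlying model:

```ts
import { generateText, tool } from 'ai';
import { z } from 'zod';

const result = await generateText({
  model: 'anthropic/claude-3-sonnet', // placeholder Gateway model ID
  prompt: 'Look up the latest Solidity compiler release and summarize the changes.',
  tools: {
    // Invented tool for illustration; not a built-in Gateway or SDK feature.
    lookupCompilerRelease: tool({
      description: 'Fetch the latest Solidity compiler release for a channel',
      parameters: z.object({ channel: z.enum(['stable', 'nightly']) }), // "inputSchema" in newer SDK versions
      execute: async ({ channel }) => {
        return { channel, version: '0.8.26' }; // placeholder implementation
      },
    }),
  },
});

console.log(result.text);
```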


Why This Matters To Us

These details will help determine whether:

  • We adopt AI Gateway as our default model access layer
  • Or we continue building our own system, where advanced users rely on BYOK and credit tracking

The Vercel AI SDK + Gateway is already an amazing foundation. If the roadmap includes per-user billing, usage transparency, and feature access controls, it could become an incredible LLM infrastructure for modern SaaS teams.


Thanks again to the team for building such a forward-thinking developer stack — and thanks in advance for any clarity you can provide.

Yann
Founder, Wise Duck Dev GPTs (AI Platform for Developer-Centric Assistants)


Vercel is dead along with v0 chat; they’ve destroyed it entirely.

Time to move on.