LiteLLM Gateway is now available on Vercel

We’re excited to share that you can now deploy LiteLLM Gateway on Vercel. This gives you an OpenAI-compatible gateway to connect with any supported provider, including Vercel AI Gateway.
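Deploying the gateway is configuration-driven: the LiteLLM proxy reads a `config.yaml` that maps the model names you expose to the underlying provider credentials. A minimal sketch (the model names and environment variables below are illustrative, not required values):

```yaml
model_list:
  # Public name your clients will request -> actual provider model.
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY
```

Clients then request `gpt-4o` or `claude-sonnet` through the gateway without knowing which provider serves it.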

This integration makes it easier to:

  • Standardize access: Use a single API format to interact with multiple LLM providers.
  • Stay flexible: Switch between models or providers without rewriting your integration code.
  • Enhance observability: works directly with Vercel AI Gateway for better monitoring and management of your LLM traffic.
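Because the gateway speaks the OpenAI API format, any OpenAI-style client can target it just by pointing at your deployment URL. A minimal sketch using only the Python standard library (the deployment URL and API key are placeholders for your own values):

```python
import json
import urllib.request

# Placeholder values -- substitute your own Vercel deployment URL and key.
GATEWAY_URL = "https://my-litellm-gateway.vercel.app"
API_KEY = "sk-litellm-placeholder"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible /v1/chat/completions request.

    Switching models or providers only means changing `model`;
    the payload shape and endpoint stay the same.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{GATEWAY_URL}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_chat_request("gpt-4o", "Hello from LiteLLM on Vercel!")
# urllib.request.urlopen(req) would send it against a live deployment.
```

Swapping `"gpt-4o"` for any other model configured in the gateway requires no other client changes.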

Are you planning to use LiteLLM in your next project? Let us know what you’re building below! 👇