LiteLLM server is now available on Vercel

We’re excited to share that you can now deploy the LiteLLM server on Vercel! This deployment gives you an OpenAI-compatible gateway in front of any supported provider, including Vercel AI Gateway, so you can standardize your LLM calls and swap models without changing your application code. Give it a try on your next project and let us know what you think below! 👇
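As a quick sketch of what “OpenAI-compatible” means in practice: any client that speaks the OpenAI chat-completions schema can target the gateway, so the same request body works regardless of which provider LiteLLM routes to behind the scenes. The deployment URL and model name below are placeholders for illustration, not real endpoints.

```python
import json

# Hypothetical Vercel deployment URL for your LiteLLM gateway (placeholder).
BASE_URL = "https://my-litellm-gateway.vercel.app"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style /v1/chat/completions request body.

    Because the gateway exposes the OpenAI schema, this same body works
    whether it routes to OpenAI, Anthropic, or another supported provider;
    only the `model` string (and the gateway's routing config) changes.
    """
    return {
        "model": model,  # any model name your gateway is configured to route
        "messages": [{"role": "user", "content": prompt}],
    }

body = build_chat_request("gpt-4o-mini", "Hello!")
# POST this JSON to f"{BASE_URL}/v1/chat/completions" with your API key.
print(json.dumps(body))
```

Swapping models then comes down to changing the `model` string (or the gateway’s routing rules), with no changes to the calling code.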