@browser-ai: better DX for running local models directly in the browser

Hi, community!

My project @browser-ai was selected as part of the Vercel OSS Program, and I’ve been encouraged to share it with y’all here.

If you’ve been experimenting with running local language models directly in the browser using Transformers.js, WebLLM, or the new Prompt API in Chrome/Edge, you’re likely familiar with the challenges:

  • Custom hooks and UI components: Each framework requires its own integration patterns

  • Fallback complexity: Building a robust integration layer that automatically falls back to server-side models when the client can't run a model

  • API fragmentation: Significant differences in API specifications across different in-browser LLM frameworks

The @browser-ai library bridges this gap by providing a unified solution that lets you:

  • Experiment with in-browser AI models using familiar patterns

  • Seamlessly fall back to server-side models when needed

  • Use the same Vercel AI SDK ecosystem you already know

  • Avoid building complex integration layers from scratch
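The client-to-server fallback mentioned above boils down to trying models in order of preference and moving on when one fails (for example, when the browser lacks WebGPU support). Here's a minimal framework-agnostic sketch of that pattern; the names (`Generate`, `generateWithFallback`) are illustrative and not part of the @browser-ai API.

```typescript
// A model is anything that turns a prompt into generated text.
type Generate = (prompt: string) => Promise<string>;

// Try each model in order; on failure, fall back to the next one
// (e.g. an in-browser model first, then a server-side endpoint).
async function generateWithFallback(
  prompt: string,
  models: Generate[],
): Promise<string> {
  let lastError: unknown;
  for (const generate of models) {
    try {
      return await generate(prompt);
    } catch (err) {
      lastError = err; // remember the failure and try the next model
    }
  }
  throw lastError ?? new Error("no models provided");
}
```

In practice a library like this also has to normalize the different API shapes (Transformers.js, WebLLM, the Prompt API) behind that single `Generate`-style interface, which is where most of the integration work lives.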

Excited to be part of the community :slight_smile:


Welcome, I’m really excited about this project


Thank you, Jacob!

Congrats on joining the OSS program! :smiley:


Thank you!!