Client-side “Code Interpreter” for AI SDK chatbots with WASM

I’ve been building chatbots with the Vercel AI SDK and kept running into the point Anthropic makes: general-purpose agents perform much better when they can write and run code. We build support chatbots, and I really didn’t want to run LLM-generated code on our own infrastructure.

Cloud sandboxes are expensive, and since we already ship an SPA, I explored pushing the compute to the client with WASM instead.

So I built 1MCP – an open-source, browser-only WASM sandbox that your AI SDK chatbot can call to execute JS/TS on the client, Code Interpreter–style.

What it does

  • Runs model-generated or user-generated JS/TS in the browser via WASM

  • No backend required for “code execution” mode – works in a plain AI SDK chat widget

  • Optional server mode to act as an MCP proxy (for CORS + extra npm packages in the bundle)
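To make the "code execution" mode concrete, here's a minimal sketch of the tool shape an AI SDK chatbot would expose. Everything here is an assumption for illustration: the `runInSandbox` name is hypothetical, and a plain `Function` constructor stands in for the real WASM runtime so the snippet is self-contained.

```typescript
// Sketch only: in 1MCP the code would run inside a WASM sandbox;
// `new Function` is a stand-in so this example runs anywhere.
type SandboxResult = { ok: boolean; value?: unknown; error?: string };

async function runInSandbox(code: string): Promise<SandboxResult> {
  try {
    // Evaluate a JS expression the model produced. The real sandbox
    // would isolate this in WASM instead of the host JS engine.
    const value = new Function(`"use strict"; return (${code});`)();
    return { ok: true, value };
  } catch (e) {
    return { ok: false, error: String(e) };
  }
}

// Mirrors an AI SDK tool definition: description, parameters, execute.
// The model calls this tool with generated code; the result goes back
// into the conversation, Code Interpreter-style.
const executeCode = {
  description: "Execute model-generated JS in the client and return the result",
  parameters: { code: "string (a JS expression to evaluate)" },
  execute: async ({ code }: { code: string }) => runInSandbox(code),
};
```

Because everything happens in the browser tab, the only thing the server ever sees is ordinary chat traffic; the generated code never touches your infra.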

Links
