# Supabase works locally but not in prod/Vercel (Next.js, Supabase, Edge runtime)

219 views · 0 likes · 2 posts

**Wordopera (@wordopera)** · 2024-08-18

I started with the create-next-app template, and the LLM chat inputs and outputs work fine both locally and in prod on Vercel. The app inserts a record into Supabase containing the chat input and output.

**It works fine locally.** **On Vercel, nothing breaks** and there are no error messages. However, **Supabase never receives the record**. I triple-checked the API keys in the Vercel dashboard, and also tried removing and re-adding the Supabase integration.

The operation occurs inside the POST function, specifically within the ReadableStream's `start` method:

```js
const { data, error } = await supabase
  .from('messages')
  .insert([{
    content: message,          // User input
    ai_response: fullResponse, // LLM output
    model: model               // Model used
  }]);
```

It inserts a new row into the `messages` table with three fields:

- `content`: the user's input message.
- `ai_response`: the full response generated by the LLM (OpenAI API).
- `model`: the model name used for this interaction.

All packages are on current versions.

---

**Pauline P. Narvas (@pawlean)** · 2024-08-18

Hi, @wordopera! It sounds like there may be an issue with the integration between Vercel and Supabase, given that there are no error messages on Vercel. Do you get a response locally? Cross-posting a relevant blog in case it's helpful: https://supabase.com/blog/using-supabase-with-vercel
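One common cause of this symptom is that the insert's `error` return is never checked (Supabase's client returns errors as values rather than throwing), or that the insert promise fires inside the stream's `start` callback without completing before the serverless function is frozen. A minimal self-contained sketch of the error-surfacing pattern, with the Supabase call replaced by a hypothetical `mockInsert` (all names here are illustrative, not from the original post):

```typescript
type InsertResult = { data: unknown[] | null; error: { message: string } | null };

// Hypothetical stand-in for supabase.from('messages').insert([...]),
// which resolves asynchronously like a real network call would.
async function mockInsert(row: Record<string, unknown>): Promise<InsertResult> {
  await new Promise((resolve) => setTimeout(resolve, 10)); // simulate latency
  return { data: [row], error: null };
}

async function saveChatMessage(message: string, fullResponse: string, model: string) {
  // Await the insert *and* inspect the error value: a silently dropped
  // record on Vercel often means the error was discarded, or the promise
  // never resolved before the function's execution context was suspended.
  const { data, error } = await mockInsert({
    content: message,          // User input
    ai_response: fullResponse, // LLM output
    model: model,              // Model used
  });
  if (error) {
    // Log so the failure shows up in the Vercel function logs instead
    // of disappearing without a trace.
    console.error('Supabase insert failed:', error.message);
    throw new Error(error.message);
  }
  return data;
}
```

Calling `await saveChatMessage(...)` from inside the stream's `start` method (rather than firing the insert and ignoring the result) ensures the function stays alive until the write completes, and any database-side rejection becomes a visible error in the logs.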