
[AI SDK](/c/ai-sdk/62)

# Any plans on supporting Gemini Interactions API?

95 views · 4 likes · 5 posts


founders-4062 (@founders-4062) · 2025-12-16

Would be nice to have it, similar to the OpenAI Responses API.


Amy Egan (@amyegan) · 2025-12-16

I think it would take some time before we could safely add the Interactions API since it's very new and still in public beta. But I like the suggestion!

I marked this as a #feature-request and shared internally.

If anyone else reading this is interested, please let us know in the comments 🙏


Amy Egan (@amyegan) · 2026-01-23

Hey @founders-4062! 😊 Just checking in to see if you still need help with your question about the Gemini Interactions API. Have you found any solutions, or is there anything specific you'd like to discuss further? Let’s figure this out together!


Mateusz (@mateusz-7385) · 2026-03-03 · ♥ 3

Hey @amyegan 

We've been running this pattern in production with AI SDK v5 and openai.responses() for a multi-channel property management AI agent. Conversations span dozens of turns across days, so context efficiency matters a lot.

We pass previousResponseId today via providerOptions.openai and persist the returned ID from result.providerMetadata?.openai?.responseId after each call. From turn 2 onward, we send only the new message — OpenAI reconstructs the full context server-side. We even use it in standalone utility calls like chat summarization, so the model gets the full conversation history without us constructing any message array at all.

The passthrough works, but it deserves first-class treatment:

* previousResponseId as a top-level SDK param, not buried in providerOptions

This is the exact same value prop as the Gemini Interactions API — server-side context storage to avoid re-transmitting full history. A provider-agnostic abstraction (e.g. conversationId in/out) with OpenAI and Gemini as the first two implementations would be ideal. Happy to contribute to a design RFC if that helps move it forward.
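To make the RFC idea concrete, here is one hypothetical shape such an abstraction could take. All names here are made up for illustration (this is not AI SDK API): a generic `conversationId` flows in and out, and each provider adapter maps it onto its native mechanism (previousResponseId for OpenAI, an interaction ID for Gemini). The in-memory provider below just demonstrates the contract.

```typescript
// Hypothetical provider-agnostic interface — names are illustrative, not AI SDK API.
interface ConversationContext {
  conversationId?: string; // OpenAI: previousResponseId; Gemini: interaction ID
}

interface TurnOutput {
  text: string;
  conversationId: string; // persist and pass back on the next turn
}

interface ContextProvider {
  generate(prompt: string, ctx: ConversationContext): Promise<TurnOutput>;
}

// In-memory demo implementation: the "server" keeps the history,
// the caller only ever sends the new message plus the ID.
class MockProvider implements ContextProvider {
  private store = new Map<string, string[]>();

  async generate(prompt: string, ctx: ConversationContext): Promise<TurnOutput> {
    const id = ctx.conversationId ?? `conv_${this.store.size + 1}`;
    const history = this.store.get(id) ?? [];
    history.push(prompt);
    this.store.set(id, history);
    // Server-side context: the provider sees the full history.
    return { text: `seen ${history.length} message(s)`, conversationId: id };
  }
}
```

With OpenAI and Gemini as the first two adapters, application code could stay identical across providers and only the adapter would change.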


Tomtien (@tomtien) · 2026-03-06 · ♥ 1

This is something I would also like to see. APIs like OpenAI's Responses API will probably become the standard way to interface with these models soon, due to the obvious benefits (not having to keep track of reasoning, better caching, etc.).

Having something similar to previousResponseId for Gemini models would be great.