Hey, I’m using the v0 Platform API to develop a coding agent that helps our customers build self-contained HTML blocks just by chatting with v0 models. I noticed some odd behavior in a few API endpoints:
1 - The Chat Initialize endpoint ( Initialize Chat | v0 Docs ) accepts the projectId parameter, but it does not link the created chat to the project. I believe this is a bug. As a workaround, I’m now using the Project Assign endpoint ( Assign Project to Chat | v0 Docs ) to assign the chat to the project afterwards, but I’m not sure whether the chat starts considering my project instructions after that. I believe this should be fixed.
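For reference, this is roughly the workaround I’m using today, as a minimal sketch. The base URL and endpoint paths here are assumptions based on my own usage, not copied from the docs, so treat them as placeholders:

```python
import json
import os
import urllib.request
from typing import Optional

BASE_URL = "https://api.v0.dev/v1"  # assumed base URL, adjust as needed


def endpoint(path: str) -> str:
    """Build a full API URL from a path fragment."""
    return BASE_URL + path


def _post(path: str, payload: Optional[dict] = None) -> dict:
    """POST a JSON payload to the API and decode the JSON response."""
    req = urllib.request.Request(
        endpoint(path),
        data=json.dumps(payload).encode() if payload is not None else None,
        headers={
            "Authorization": f"Bearer {os.environ.get('V0_API_KEY', '')}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def create_chat(message: str, project_id: str) -> dict:
    # Initialize Chat: projectId is accepted here, but in my experience
    # the created chat does NOT end up linked to the project (the bug above).
    return _post("/chats/init", {"message": message, "projectId": project_id})


def assign_chat_to_project(chat_id: str, project_id: str) -> dict:
    # Workaround: call Assign Project to Chat explicitly after creation.
    # Path and body shape are assumptions.
    return _post(f"/projects/{project_id}/assign", {"chatId": chat_id})
```

The two-step create-then-assign dance is exactly what I’d expect the projectId parameter on Chat Initialize to make unnecessary.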
2 - The async response_mode in the Send Chat Message endpoint ( Send Message | v0 Docs ) does not return the last sent user message in the response. I believe this is also a bug: I would expect the user message to be persisted in your database and returned in the response body regardless of the response mode. As the async mode suggests, the assistant message needs to be polled, which is fine, but the user message should be returned immediately in the endpoint’s response body, included in the messages array.
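Because of this, my client currently has to keep the user message around locally and only poll for the assistant reply. A transport-agnostic sketch of that polling loop (the `fetch_chat` callable and the `messages`/`role`/`id` field names are assumptions about the response shape):

```python
import time
from typing import Callable, Optional


def find_new_assistant_message(messages: list, known_ids: set) -> Optional[dict]:
    """Return the first assistant message whose id we have not seen yet."""
    for m in messages:
        if m.get("role") == "assistant" and m.get("id") not in known_ids:
            return m
    return None


def wait_for_assistant(
    fetch_chat: Callable[[str], dict],
    chat_id: str,
    known_ids: set,
    interval: float = 2.0,
    timeout: float = 120.0,
) -> Optional[dict]:
    """Poll fetch_chat(chat_id) -> {"messages": [...]} until a new assistant
    message appears, or until the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        chat = fetch_chat(chat_id)
        reply = find_new_assistant_message(chat.get("messages", []), known_ids)
        if reply is not None:
            return reply
        time.sleep(interval)
    return None  # timed out without a reply
```

If the user message were echoed back in the send response, the client could seed `known_ids` directly from the server instead of tracking it on its own.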
3 - I’d like some clarification docs about the abstract syntax tree (AST) returned in experimental_content, and also about how to properly parse the AST in the events when the response mode is set to experimental_stream. This is a cool feature, but I’m having a lot of difficulty understanding and parsing the events and assembling the assistant’s messages, specifically telling apart what the LLM is answering in the chat from the corresponding payload where it makes changes to the chat files version.
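To make the question concrete, here is the kind of event routing I’m trying to write. I’m assuming a server-sent-events-style stream of `data:` lines carrying JSON, and the event field names below (`type`, `delta`) are hypothetical, which is exactly why documented event schemas would help:

```python
import json
from typing import Iterator


def parse_sse_chunk(chunk: str) -> Iterator[dict]:
    """Yield JSON payloads from raw SSE-style text ('data: {...}' lines).

    The wire format here is an assumption about experimental_stream."""
    for line in chunk.splitlines():
        if line.startswith("data:"):
            body = line[len("data:"):].strip()
            if body and body != "[DONE]":
                yield json.loads(body)


def route_event(event: dict, text_parts: list, file_ops: list) -> None:
    """Split events into chat text vs. file-change payloads.

    The 'type' values are placeholders for whatever the API actually emits."""
    kind = event.get("type")
    if kind == "text-delta":
        # Assistant prose answering in the chat.
        text_parts.append(event.get("delta", ""))
    elif kind == "file":
        # Payload describing a change to the chat files version.
        file_ops.append(event)
    # Unknown event types are ignored in this sketch.
```

Knowing the real discriminator field and the AST node shapes would let me replace the guesses above with proper parsing.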
I’m really impressed by the results we are getting with your API, and I would love to contribute to making it better. I started creating a Python client SDK in my project that I could later publish on PyPI if you’re interested. I’m having a bit of trouble defining good type hints for it right now, as the API documentation is a bit sparse.
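As an example of what I mean by type hints, this is the sort of model I’m sketching in the SDK; every field name here is inferred from responses I’ve observed, not from a published schema, so I’d love an authoritative reference to type against:

```python
from typing import List, Literal, TypedDict


class Message(TypedDict, total=False):
    # Field names inferred from observed responses, not official docs.
    id: str
    role: Literal["user", "assistant"]
    content: str


class Chat(TypedDict, total=False):
    id: str
    projectId: str
    messages: List[Message]


def as_chat(raw: dict) -> Chat:
    """Narrow an untyped API response to the Chat shape, dropping unknown keys."""
    known = ("id", "projectId", "messages")
    return {k: raw[k] for k in known if k in raw}  # type: ignore[return-value]
```

With documented response schemas I could make these models exhaustive and publish the SDK with confidence.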