When I send a message to the AI, it fails to load the reply and keeps looping this error… After I forked my project, it replied to me in the new chats, but after I deployed the project, it stopped responding again.
I have been facing something similar today. v0 is not responding correctly to any long prompt.
Yep, I’ve noticed I’m not the only one with this problem, and it still isn’t solved.
Thank you for the feedback, folks! I’ve shared this with the team internally.
Please let me know if you’re still seeing this today
Yep, there are still some problems, as shown in the picture. But I promise it’s not a network problem on my end, because all my other chats work normally; only this one has the issue.
Deploying to vercel and seeing styling issues.
Asked for suggestions and just got the v0 equivalent of the spinning beachball of doom.
It’s still adding context no matter what I say.
Tried refreshing, etc.
Anyone seen this before?
Hey @gichigi-mecom! Just merged your post here to keep track of this issue. Sharing internally
I’m running into the same issue, so posting here to flag the incident and follow the resolution:
- Ask a question or prompt
- Agent thinks…
- No response provided.
Thanks for reporting. Can you share a link to the chat and we’ll take a look?
Hey Max, thanks for getting back to me.
Here you go: https://v0.dev/chat/choir-v0-138-Eqn0U2gaPI5
BTW, forking the chat seems to work, but I still don’t know why this chat failed. Only v12.
Hi Max,
We’re currently experiencing the same issue on our end and wanted to share some observations:
- Deployments are consistently failing, and we’re unable to resolve them as v0 responds with an empty message and does not proceed with any action.
- Forking the project only provides a temporary workaround. We’re able to submit one task after forking, but then the system becomes unresponsive again - requiring us to fork repeatedly. This also prevents us from progressing beyond the initial version of any fork (e.g., we’re unable to reach v3 or higher).
Could you please advise on how we can resolve this issue or if there’s a known fix in progress?
Could you share a v0 chat link with us, Hamlet? Would help us debug