[▲ Vercel Community](/)

[AI SDK](/c/ai-sdk/62)

# streamText broken with @ai-sdk/google-vertex version 2 and above

293 views · 0 likes · 4 posts


Jon (@jon2718) · 2025-01-13

Hi, 

I have a custom model deployed in Google Vertex AI. Using streamText, I am able to get predictions from the model with version 1.0.4 of `@ai-sdk/google-vertex`. However, with the latest version, or anything at 2.0.0 or above, it does not work.

The following code runs fine with `@ai-sdk/google-vertex` 1.0.4, but breaks with the error below on version 2 and above.

```ts
import { createVertex } from '@ai-sdk/google-vertex';
import { streamText } from 'ai';

export const maxDuration = 30;

const euVertex = createVertex({
  project: <PROJECT_ID>,
  location: 'us-central1', // optional
  googleAuthOptions: {
    keyFilename: <KEY_FILE>,
  },
});

// Point the provider at the custom endpoint rather than a named model
const model = euVertex(`projects/${PROJECT_ID}/locations/us-central1/endpoints/${ENDPOINT}`);

const result = streamText({
  model,
  prompt: 'Hello, how are you?',
});

for await (const textPart of result.textStream) {
  console.log(textPart);
}
```
The error I get is: `Uncaught (in promise) AI_APICallError: Not Found`

Any help would be greatly appreciated.

Thanks.
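Until the regression is diagnosed, a minimal workaround sketch is to pin the last version the poster reports as working (1.0.4). This assumes npm as the package manager; adjust for pnpm or yarn as needed.

```shell
# Pin the last version known to work with custom Vertex AI endpoints
npm install @ai-sdk/google-vertex@1.0.4

# Confirm the resolved version in the dependency tree
npm ls @ai-sdk/google-vertex
```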


Haula Dv (@haula-dv) · 2025-05-21

```ts
import { createVertex } from '@ai-sdk/google-vertex';
import { streamText } from 'ai';

export const maxDuration = 30;

const euVertex = createVertex({
  project: <PROJECT_ID>,
  location: 'us-central1', // optional
  googleAuthOptions: {
    keyFilename: <KEY_FILE>,
  },
});

const result = streamText({
  model: euVertex('gemini-1.5-flash'),
  prompt: 'Hello, how are you?',
});

for await (const textPart of result.textStream) {
  console.log(textPart);
}
```


Anshuman Bhardwaj (@anshumanb) · 2025-05-21

Hi @haula-dv, can you share what issue you are facing? The code by itself isn't enough for community members to help.


Anshuman Bhardwaj (@anshumanb) · 2025-05-21

Hi @jon2718, thanks for posting this here. I'll check with the team about it.