AI Gateway when using your own API keys

Hey,

I tried out AI Gateway today and ran into two issues:

  1. Is there an easy way to see from the Vercel UI if an LLM call uses my Vertex API key or Vercel’s? OpenRouter shows this information.
  2. Does the Vertex integration accept plain API keys, or does the key need to be provided in the JSON format shown in the placeholder text?
  1. You can see this in the response's providerMetadata, which you can log from code, but we don't have a way to surface it in the UI right now. We are working on adding it to the UI as well.
  2. It requires the JSON format as shown, including location. If it isn't working, you should be able to see the error detail by logging the providerMetadata mentioned above. Here is sample code to do this; make sure to use JSON.stringify as shown to get the full detail from the attempts sub-object, which will have the specific error. Does Vertex AI support simple API keys now as well? If you can link to the documentation, we can look at adding support.
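A minimal sketch of the logging described above. The providerMetadata object would come from an AI SDK call (e.g. the result of generateText from the 'ai' package); the gateway/attempts field names below are a hypothetical illustration of the routing detail, not a guaranteed shape:

```javascript
// In real code, providerMetadata comes back from an AI SDK call, e.g.:
//   import { generateText } from 'ai';
//   const { providerMetadata } = await generateText({ model: ..., prompt: ... });
// The object below is a hypothetical stand-in for that response metadata.
const providerMetadata = {
  gateway: {
    attempts: [
      // Hypothetical fields showing which credentials were used per attempt.
      { provider: 'vertex', credentialType: 'byok', success: true },
    ],
  },
};

// A plain console.log can truncate deeply nested objects in some runtimes;
// JSON.stringify with indentation prints the full attempts sub-object.
console.log(JSON.stringify(providerMetadata, null, 2));
```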

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.