Vertex AI
The Vertex AI model integration in LLM Labs lets you connect and manage various LLM models, including your own Vertex AI models. With this integration, your team can access your private models directly in the LLM Labs environment for evaluation and deployment.
Navigate to the Models page under the LLM Labs menu.
Open the My models tab, and click the Manage providers button.
Choose Vertex AI as the provider, and enter your client email, private key, project ID, and region (see the sketch after the steps below for where these values come from). To find the region:
From Google Cloud Console:
Go to Vertex AI section
Look at the region selector in the top navigation bar
Or check where your Vertex AI resources are deployed
From Project Settings:
Go to "IAM & Admin" > "Settings"
Look for "Location" or "Region"
Use the correct region format as specified in Vertex AI locations, for example, us-east5.
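The client email, private key, and project ID all come from a Google Cloud service account key file (JSON), which you can download from IAM & Admin > Service Accounts > Keys. The snippet below is a minimal sketch of pulling those fields out of the key file so you can paste them into the provider form; the filename is a placeholder.

```python
import json

# Load the service account key file downloaded from Google Cloud Console.
# "service-account-key.json" is a placeholder path.
with open("service-account-key.json") as f:
    key = json.load(f)

# These fields map directly to the Vertex AI provider form in LLM Labs.
print("Client email:", key["client_email"])
print("Project ID:  ", key["project_id"])
print("Private key: ", key["private_key"][:40], "... (truncated)")
```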
Once you have connected your Vertex AI models to LLM Labs, you will see a list of the LLM models you have already deployed in Vertex AI. You can use these models in LLM Labs immediately.
Every new model you deploy in Vertex AI is synced to Datasaur and can be used right away. If a model you just deployed hasn't appeared in Datasaur, click Sync models.
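If you want to double-check which models are actually deployed in your Google Cloud project before syncing, the sketch below uses the Vertex AI Python SDK to list endpoints and their deployed models. The project ID and region are placeholders; use the same values you entered in LLM Labs. The sync itself still happens inside LLM Labs.

```python
from google.cloud import aiplatform

# Placeholder project and region; match the values used in the provider form.
aiplatform.init(project="my-project-id", location="us-east5")

# List endpoints and the models deployed to them; these are the models
# that should appear in LLM Labs after syncing.
for endpoint in aiplatform.Endpoint.list():
    for deployed in endpoint.list_models():
        print(endpoint.display_name, "->", deployed.display_name)
```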
These models are accessible to all workspace members for use in their projects. However, only an Admin can remove the Vertex AI provider from the workspace.