Models

Overview

The Models catalog page allows you to explore and deploy LLMs. It includes over 200 base models, so you can pick one that fits your needs. You can connect your own models by adding credentials for our supported providers, or try Direct Access LLMs: popular models you can use right away in Datasaur without entering credentials.

Explore the models

  1. Go to the Models catalog page under the LLM Labs menu.

  2. Navigate to the Explore tab.

Integrate LLM providers

You can also integrate with several LLM providers, such as Amazon SageMaker JumpStart, Amazon Bedrock, Azure OpenAI, OpenAI, and Google Vertex AI, through the Manage providers button.

Manage providers dialog

To connect a provider, you need to set up and add your credentials in Datasaur.

Input the credentials
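
If you are connecting an AWS-based provider such as Amazon SageMaker JumpStart or Amazon Bedrock, it can help to confirm the credentials work before entering them in Datasaur. The sketch below is a minimal check using boto3; the region is a placeholder, and it assumes the access key and secret key you plan to add are available in your environment.

```python
# Minimal sketch: verify AWS credentials locally before entering them in Datasaur.
# Assumes boto3 is installed and AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY are set
# to the credentials you plan to add.
import boto3

sts = boto3.client("sts", region_name="us-east-1")  # region is a placeholder
identity = sts.get_caller_identity()
print(f"Credentials resolve to account {identity['Account']} ({identity['Arn']})")
```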

Once you connect a provider, some providers fetch your models automatically so you can use them right away, while others require you to deploy a model first.

Deploy the models

Here’s how to deploy a model from Amazon SageMaker JumpStart. Click Deploy model, then in the dialog that appears, enter an endpoint name and choose your preferred instance type.

Deploy model dialog
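
If you want to see what the equivalent deployment looks like on the AWS side, the sketch below uses the SageMaker Python SDK. The model ID, endpoint name, and instance type are placeholders; the endpoint name and instance type correspond to the values you would otherwise enter in the Deploy model dialog.

```python
# Minimal sketch: deploy a SageMaker JumpStart model directly with the SageMaker
# Python SDK. All identifiers below are placeholders.
from sagemaker.jumpstart.model import JumpStartModel

model = JumpStartModel(model_id="huggingface-llm-falcon-7b-instruct-bf16")  # placeholder model ID
predictor = model.deploy(
    endpoint_name="my-llm-endpoint",   # matches the endpoint name field in the dialog
    initial_instance_count=1,
    instance_type="ml.g5.2xlarge",     # matches the instance type field in the dialog
)
```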

After you click Deploy model, the deployment may take several minutes. You can track its progress in the model status at the top right.

Deploying process
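
You can follow the same deployment from the AWS side as well. The sketch below polls the endpoint status with boto3; the endpoint name and region are placeholders, and InService is the SageMaker status that should line up with the model becoming usable (the model status in the Datasaur UI remains the source of truth).

```python
# Minimal sketch: poll the SageMaker endpoint status while the deployment runs.
import time
import boto3

sm = boto3.client("sagemaker", region_name="us-east-1")  # region is a placeholder

while True:
    status = sm.describe_endpoint(EndpointName="my-llm-endpoint")["EndpointStatus"]
    print(f"Endpoint status: {status}")
    if status in ("InService", "Failed"):
        break
    time.sleep(30)
```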

After deployment is complete, the model status will change to Available, and you can start using the model right away.

Deployed model

Available models

The Available tab shows all deployed models and lets you deploy or undeploy them. It also lists our Direct Access LLMs: popular models you can use right away without having to add credentials.

Every new model that you deploy through your LLM providers is synced to Datasaur, and you can use it right away. If a model you just deployed hasn't appeared on the Available tab, click the Sync models button.
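
As a quick cross-check before clicking Sync models, you can list the endpoints that exist on the provider side. The sketch below assumes a SageMaker-backed provider and prints each endpoint in the account together with its status.

```python
# Minimal sketch: list SageMaker endpoints to cross-check what should appear
# on the Available tab after syncing.
import boto3

sm = boto3.client("sagemaker", region_name="us-east-1")  # region is a placeholder
for endpoint in sm.list_endpoints()["Endpoints"]:
    print(endpoint["EndpointName"], endpoint["EndpointStatus"])
```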

Undeploy models

To undeploy a model, click the three dots on the model card and select Undeploy model.

Undeploy models

The model status changes to Unavailable once you click the Undeploy model button.

Unavailable models
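
If you also manage endpoints directly on the provider side, the sketch below shows the SageMaker call for deleting an endpoint. This assumes a SageMaker-backed model and is independent of the Undeploy model button in Datasaur.

```python
# Minimal sketch: delete a SageMaker endpoint directly on the provider side.
# The endpoint name and region are placeholders.
import boto3

sm = boto3.client("sagemaker", region_name="us-east-1")
sm.delete_endpoint(EndpointName="my-llm-endpoint")
```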

Disconnect LLM providers

To disconnect a provider, click Manage providers, then View details for the provider you want to disconnect. In the dialog that opens, click Disconnect at the bottom left.

Disconnect dialog

Once you click Disconnect, the provider is disconnected from your workspace.
