LLM Labs (beta)
Enable your integration with models from Datasaur LLM Lab
Easily integrate with models from Datasaur's LLM Labs. If you have already tested and deployed an experiment in the Datasaur LLM Labs Sandbox, this integration lets you use your deployed LLM Sandbox to enhance your labeling process.
Creating an LLM Labs Sandbox
To begin using ML-Assisted Labeling with LLM Labs, you first need to create and deploy a Sandbox. See this page to learn more about deploying an LLM Labs Sandbox.
The output of the LLM Labs Sandbox must be in JSON object format, aligned with the label set defined in your NLP project. This ensures compatibility with regex-based string matching for labeling in your NLP platform.
For example, suppose the label set consists of `party`, `date`, `contextual_reference`, and `signatory`. The expected output from the LLM Labs Sandbox should then look something like this:
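As an illustration (the exact field names and values here are hypothetical and depend on your project's label set), the Sandbox output might be a JSON object like the one below, which can be parsed and checked against the label set before labeling:

```python
import json
import re

# Label set defined in the NLP project (example from above).
LABEL_SET = {"party", "date", "contextual_reference", "signatory"}

# Hypothetical Sandbox output: a JSON object keyed by the label set.
sandbox_output = """
{
  "party": "Acme Corporation",
  "date": "January 1, 2024",
  "contextual_reference": "the Agreement",
  "signatory": "Jane Doe"
}
"""

labels = json.loads(sandbox_output)

# The keys must align with the label set so regex-based string
# matching can locate each value in the document text.
assert set(labels) == LABEL_SET

# Example: build a regex that finds one predicted value verbatim.
pattern = re.compile(re.escape(labels["party"]))
print(bool(pattern.search("This contract is made by Acme Corporation.")))
```

Because the matching is string-based, the model's output values should appear verbatim in the source text; escaping the value (as above) avoids regex metacharacter issues.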
Prompt example to generate the correct JSON Object format
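An illustrative prompt (adapt the field names to your own label set; this is a sketch, not a Datasaur-provided template) that instructs the model to return the correct JSON object format might look like:

```
Extract the following fields from the contract text below and return
ONLY a JSON object with exactly these keys: "party", "date",
"contextual_reference", "signatory". Do not include any other text.

Contract text:
{input}
```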
Accessing Your Deployed LLM Labs Sandbox in ML-Assisted Labeling
Follow these steps to access your deployed LLM Labs Sandbox (from Datasaur LLM Labs) in ML-Assisted Labeling:
Create a custom project for Row Labeling or Span Labeling.
Click "Manage Extension" in the right sidebar.
The Manage Extension pop-up will appear; enable the Datasaur ML-Assisted extension.
Once enabled, select "LLM Labs" as the provider and you will see the following menu:
Target text: the column(s) containing the data this ML assistance is based on.
Target question: the column(s) you wish to answer.
LLM application: your deployed LLM Sandbox name from LLM Labs.
API token: your API key for accessing the deployed LLM Application. To get one, visit LLM Labs, go to `Settings` in the left sidebar, then select the `API Keys` menu.
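Under the hood, the extension authenticates to your deployed LLM Application with this token. A minimal sketch of how such an authenticated request could be assembled (the field names, token format, and bearer-token scheme here are assumptions for illustration, not Datasaur's documented API):

```python
import json

# Hypothetical values -- substitute your own deployment details.
API_TOKEN = "dsr-xxxxxxxx"          # from Settings > API Keys in LLM Labs
APPLICATION = "contract-extractor"  # your deployed LLM Sandbox name

# Bearer-token authentication is assumed; check the API Keys page
# in LLM Labs for the exact scheme your deployment expects.
headers = {
    "Authorization": f"Bearer {API_TOKEN}",
    "Content-Type": "application/json",
}

# The target text column(s) selected above become the model input.
payload = json.dumps({
    "application": APPLICATION,
    "input": "This contract is made by Acme Corporation on January 1, 2024.",
})

print(headers["Content-Type"])
```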
Prediction Process
After setting up the options above, simply click "Predict Labels" to start predicting and obtaining labels from your deployed LLM Application.