Labeling Agent (beta)

In some projects, ML models are just as important as human labelers. Labeling Agents let you assign ML models as labelers in your project and evaluate their performance alongside human labelers, helping you understand which labeling approach works best for your needs: human, machine, or both.

Why Use Labeling Agents?

Labeling Agents simplify the process of testing and comparing ML models inside Datasaur:

  • You no longer need to create separate accounts or log in as the model to run predictions.

  • Model outputs are now part of the same analytics and comparison tools used for human labelers.

  • It’s easier to measure performance and decide what labeling strategy to use.

Requirements

  • Models must be in the same team workspace as the Data Studio project.

  • ML models must be deployed applications from LLM Labs with “Deployed” status.

  • This feature is currently supported only for Span Labeling projects.

    • This is a current limitation that will be improved in the future.

How to Create a Labeling Agent

To use a machine learning model for the labeling task, first set it up in LLM Labs. This involves creating a prompt with system and user instructions that define the model’s behavior.

1. Prepare the label set

Make sure the label set you will use in Data Studio matches the labels you describe in the instructions in step 2. Below is a simple example of a label set that can later be used in Data Studio:

{
  "name": "Labeling agent Label set",
  "options": [
    { "id": "NhsjWIgaAQH3g6dsvtW6a", "color": "#f93b90", "parentId": null, "label": "PERSON" },
    { "id": "X1bKK7Nxf9SGaBfDpzH7g", "color": "#d4e455", "parentId": null, "label": "DATE" },
    { "id": "NP2RJr7tD5aMfVBnG6TOm", "color": "#85c98e", "parentId": null, "label": "ORG" }
  ]
}
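
If you want to confirm which categories your step 2 instructions need to cover, you can read the label names directly from this label set JSON. The snippet below is a minimal sketch that assumes the label set above has been saved locally as label_set.json (the file name is only an example):

import json

# Load the label set shown above (the file name is only an example).
with open("label_set.json") as f:
    label_set = json.load(f)

# Collect the label names that the step 2 instructions should describe.
label_names = [option["label"] for option in label_set["options"]]
print(label_names)  # ['PERSON', 'DATE', 'ORG']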

2. Define your instructions

In LLM Labs, create a new sandbox and set up the model to act as a labeling agent. To help the model understand what to label, you’ll need to provide clear system and user instructions. Below is an example setup:

System Instruction

You are an expert data labeler

User Instruction

Given the document text, please extract the following information and present it in JSON format as shown below:

PERSON: People, including fictional.  
DATE: Absolute or relative dates or periods.
ORG: Companies, agencies, institutions, etc.

Instructions Summary:  
1. Extract and present the information in the specified JSON format.  
2. Ensure that all extracted data is accurate and corresponds directly to the content of each document.

Return the value of extracted fields in JSON structure in plain text, following this JSON FORMAT  
{
    "PERSON": ["People, including fictional."],
    "DATE": ["Absolute or relative dates or periods."],
    "ORG": ["Companies, agencies, institutions, etc."],
}

VERY IMPORTANT  
RETURN THE ANSWER WITHOUT ```json  
ANSWER PRECISELY GIVEN FROM THE SENTENCE PROMPT AND DON'T MASK THE ANSWER, ANSWER BASED ON THE GIVEN SENTENCE

3. Test with a prompt example

To check if your instructions work as expected, you can test them using an example sentence. Here's how you might write a prompt:

Label set:
- PERSON
- DATE
- ORG

Sentence:
Ivan Lee is the CEO and Founder of Datasaur.ai. He graduated with a Computer Science B.S. from Stanford University. He was chosen for the selective Mayfield Fellows entrepreneurship program in 2010. Ivan went on to found Loki Studios, an iOS game studio. After raising institutional funding from DCM's A-Fund and launching a profitable game, Loki was acquired by Yahoo.

After you click the Run button, the expected output will be:

{
  "PERSON": ["Ivan Lee"],
  "DATE": ["2010"],
  "ORG": ["Datasaur.ai", "Stanford University", "Mayfield Fellows", "Loki Studios", "DCM's A-Fund", "Yahoo"]
}
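
Because the prompt asks the model to reply in plain-text JSON, it can help to sanity-check the output before deploying. The snippet below is a minimal, hypothetical check that is not part of Datasaur or LLM Labs; it assumes the model’s reply is available as a Python string:

import json

EXPECTED_KEYS = {"PERSON", "DATE", "ORG"}

def check_output(response_text):
    # Parse the reply; json.loads raises an error if it is not valid JSON.
    data = json.loads(response_text)
    # Confirm the reply only uses the label names defined in step 1.
    unexpected = set(data) - EXPECTED_KEYS
    if unexpected:
        raise ValueError(f"Unexpected keys in model output: {unexpected}")
    return data

sample = '{"PERSON": ["Ivan Lee"], "DATE": ["2010"], "ORG": ["Datasaur.ai", "Yahoo"]}'
print(check_output(sample))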

4. Deploy the model

You need to deploy the model before it becomes available and visible in Data Studio as a Labeling Agent.

Using Labeling Agents

Once the model is set up, you can assign it as a labeler in Data Studio.

1. Assign models as labelers

You can assign models during the project creation process:

  1. Go to Projects page > Create New Project.

  2. Upload files and select Span Labeling.

  3. In the Assignment step, open the Labeling agents tab.

  4. Select one or more deployed models to assign them as labelers.

  5. Complete the project setup.

You can assign both human members and models. Each model counts toward your assignment limit.

2. Launch the project and trigger labeling

When you click Launch Project, models will automatically begin applying labels.

Current limitations:

  • Only the first label set is used.

  • Each span will only have one label.

  • Labeling agents cannot yet draw arrows.

3. Review labels applied by the labeling agent

Once all documents are fully labeled, whether through model assistance or manual input, the project can undergo a final review. This stage typically involves a reviewer checking the consistency and accuracy of all annotations in Reviewer Mode before submission or export.

4. View and compare performance

You can track the performance of both human labelers and models from the Analytics page.

From here, you can compare IAA scores and other metrics across all labelers, human and model alike.

Best Practices

  • Use the model as a time-saving aid, but always include a human review step.

  • Train your model with high-quality data to improve suggestion accuracy.

  • Communicate clearly with labelers about how to handle model predictions.

  • Automate part of the work with consensus by using multiple models, e.g. set a consensus of 3 and deploy 3 Labeling Agents, then focus review only on labels that are not accepted through consensus (see the sketch below).
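
To illustrate the consensus idea, here is a minimal sketch of majority voting across three models’ span predictions. The data structures are hypothetical and Datasaur computes consensus for you during review; this is only meant to show the concept:

from collections import Counter

# Hypothetical predictions from three deployed Labeling Agents:
# each maps a text span to the label that agent assigned.
agent_predictions = [
    {"Ivan Lee": "PERSON", "2010": "DATE", "Yahoo": "ORG"},
    {"Ivan Lee": "PERSON", "2010": "DATE", "Yahoo": "PERSON"},
    {"Ivan Lee": "PERSON", "2010": "DATE", "Yahoo": "DATE"},
]

def consensus(predictions, threshold=2):
    # Keep spans where at least `threshold` agents agree; flag the rest for review.
    accepted, needs_review = {}, []
    spans = {span for p in predictions for span in p}
    for span in spans:
        votes = Counter(p[span] for p in predictions if span in p)
        label, count = votes.most_common(1)[0]
        if count >= threshold:
            accepted[span] = label
        else:
            needs_review.append(span)
    return accepted, needs_review

accepted, needs_review = consensus(agent_predictions)
print(accepted)      # e.g. {'Ivan Lee': 'PERSON', '2010': 'DATE'}
print(needs_review)  # ['Yahoo'] - reviewers focus only on spans without consensus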

FAQs

  • Can I assign multiple models to the same project?

    • Yes. You can assign up to 10 Labeling Agents.

  • Can I use Labeling Agents in Line Labeling?

    • Not yet. They can be assigned to Span + Line projects but will only apply labels for Span Labeling.

  • How are Labeling Agent labels shown in the UI?

    • They are treated like human labelers, but their identity is masked. You’ll see their labels in Reviewer Mode and in analytics.
