Review

Last updated 14 days ago

Overview

The “Review Extension” is designed to streamline your labeling review process, providing you with a dedicated tool to ensure the accuracy and consistency of your labeled data. Whether you're working on a small dataset or a large-scale project, this feature ensures that no labeling discrepancies go unnoticed.

How to enable it

Enable it by clicking the Manage Extension button in the right sidebar.

How it works

This extension provides essential information related to labeling work and review, including labelers' information, conflicting labels, and reviewing progress.

This extension consists of three main tabs:

  1. Conflicts

  2. Members

  3. Progress

Conflicts

There are three key aspects in this tab:

  1. Document allows you to view the list of uploaded documents for the project and the conflict information.

  2. Show work from allows you to select a labeler and view their specific work.

  3. Conflicts allows you to identify all conflicting answers provided by different labelers.

    1. For projects with multiple files, make sure to specify the file by selecting it from the Document dropdown.

    2. The displayed information varies depending on the project type.
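As an illustration only (not Datasaur's actual implementation), the core idea behind the Conflicts tab, flagging an item whenever assigned labelers submit different answers, can be sketched as:

```python
from collections import defaultdict

def find_conflicts(answers):
    """answers: iterable of (document, item, labeler, answer) tuples.

    Returns the (document, item) pairs where labelers disagree."""
    seen = defaultdict(set)
    for document, item, labeler, answer in answers:
        seen[(document, item)].add(answer)
    return sorted(key for key, values in seen.items() if len(values) > 1)

answers = [
    ("doc1.csv", "row 1", "labeler_a", "Positive"),
    ("doc1.csv", "row 1", "labeler_b", "Negative"),  # disagreement -> conflict
    ("doc1.csv", "row 2", "labeler_a", "Neutral"),
    ("doc1.csv", "row 2", "labeler_b", "Neutral"),   # agreement -> no conflict
]
print(find_conflicts(answers))  # [('doc1.csv', 'row 1')]
```

The Document dropdown in the extension plays the role of filtering this result down to a single file.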

Members

You can easily access the list of labelers assigned to the project and their labeling status. Each project type has different displayed information.

Progress

This tab will only be visible when the “Set review sampling rate” setting is enabled. Reviewers can track their progress in this tab, determining whether they are still in progress or have fulfilled the configured sampling rate.

Span Based Labeling

Conflicts

The Labeling Conflicts in Span Based labeling will be shown under the Show work from dropdown.

It will be separated into three sections:

  1. Contents: This will show you the conflicted text that was changed by the labelers, for example:

    1. The original sentence is “the quick brown fox.”

    2. Labeler 1 edits the sentence to “the quick black fox.”

    3. Labeler 2 edits the sentence to “the quick fox.”

  2. Spans: This will show you the conflicted spans that were labeled by the labelers.

  3. Arrows: This will show you the conflicted arrows that were labeled by the labelers.

Accept or Reject All Conflicted Labels

Previously, managing a large number of unresolved or conflicting labels was time-consuming for Reviewers, as they had to manually accept or reject each label individually. This capability streamlines the process, allowing Reviewers to accept or reject all remaining labels with a single action.

How to: Accept or Reject All

This action can be applied to both Span and Arrow conflicts.

Navigate to the Review extension and then click the "View All" button.

Click "Accept All" or "Reject All" depending on the action you would like to perform.

Important Notes

  • Both actions cannot be undone and only apply to labels that are currently visible in the Reviewer’s interface.

  • Accepting or rejecting labels will not affect labels associated with a span or arrow that is part of a sentence in conflict. The correct sentence must be selected first by the Reviewer.

Accept All Conflicted Labels

  • When the project setting “Spans should have at most one label” is unchecked, all conflicted labels will be accepted.

  • When the project setting “Spans should have at most one label” is checked, then:

    • If a conflicted label belongs to a span/arrow that has another different label applied to it, then the label will remain conflicted. These require manual resolution by the Reviewer.

    • If a conflicted label belongs to a span/arrow that has no other different label applied to it, then the label will be accepted.
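The rules above can be modeled as follows. This is an illustrative sketch only, not Datasaur's actual implementation, and the field names (`label`, `other_labels`) are hypothetical:

```python
def accept_all(conflicted_labels, at_most_one_label):
    """conflicted_labels: list of dicts with hypothetical fields:
    "label" (the conflicted label) and "other_labels" (different labels
    already applied to the same span/arrow)."""
    accepted, still_conflicted = [], []
    for item in conflicted_labels:
        if at_most_one_label and item["other_labels"]:
            # "Spans should have at most one label" is enforced and the
            # span/arrow already carries a different label ->
            # left conflicted for manual resolution by the Reviewer
            still_conflicted.append(item)
        else:
            accepted.append(item)
    return accepted, still_conflicted

labels = [
    {"label": "PERSON", "other_labels": []},
    {"label": "ORG", "other_labels": ["LOCATION"]},
]
accepted, pending = accept_all(labels, at_most_one_label=True)
print([x["label"] for x in accepted])  # ['PERSON']
print([x["label"] for x in pending])   # ['ORG']
```

With `at_most_one_label=False`, both labels would be accepted, matching the unchecked-setting behavior described above.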

Reject All Conflicted Labels

All conflicted labels will be rejected.

Members

In the Members tab, you can see each labeler’s progress. It will also show how many files they have already labeled.

Row Based Labeling

Conflicts

The Labeling Conflicts in Row Based labeling will be shown under the Show work from dropdown.

Click the View All button to see the list of conflicted answers that occurred during the labeling process.

You can also filter the conflicting answers by labeler. To do so, select the labeler you want to review from the Show work from dropdown.

This will show only the conflicting answers from that labeler.

Members

In the Members tab, you can see each labeler’s progress.

Document Labeling

Conflicts

To resolve conflicted answers in Document Based labeling, go to the Document Labeling extension.

There you can see all labelers’ answers and resolve each conflict by choosing the best answer and submitting it.

Members

In the Members tab, you can see each labeler’s progress. It will also show how many files they have already labeled.

Read more about Review Sampling here.

For a detailed walkthrough of the reviewing process, please refer to the Reviewer Mode page.