Analytics
The Analytics extension provides real-time, detailed metrics and performance insights for each labeler assigned to the project. Users can:
View Metrics and Performance: Track each labeler's performance and status, including accepted labels, rejected labels, and more.
Filter by Document: Narrow analytics down to specific documents to pinpoint areas of improvement and of high performance. Applying this filter also changes how the metrics are calculated.
Filter by Member: Analyze performance metrics for individual members to identify top performers and those who need additional support.
Reviewers can view the performance of every labeler assigned to the project, even before the project is marked as complete; the main goal of the extension is to make each labeler's progress easy to see. Labelers, however, can view their own performance only after reviewers have marked the project as complete.
For mixed labeling (projects that combine more than one labeling type), all metrics are shown, i.e. label and answer metrics combined.
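As a rough illustration, showing "all metrics" for a mixed-labeling project amounts to presenting both metric groups together. The dictionary keys and shapes below are assumptions made for this sketch, not the product's actual schema.

```python
def combine_metrics(label_metrics, answer_metrics):
    # For mixed labeling, label and answer metrics are shown side by side.
    # The key names here are illustrative, not a real export format.
    return {**label_metrics, **answer_metrics}

combined = combine_metrics(
    {"accepted_labels": 12, "rejected_labels": 3},
    {"accepted_answers": 8, "rejected_answers": 1},
)
print(combined)
# {'accepted_labels': 12, 'rejected_labels': 3, 'accepted_answers': 8, 'rejected_answers': 1}
```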
Metrics
Effective Time Spent: The actual time spent by labelers during the labeling process.
Labels
These metrics apply only to Span, Conversational, and Bounding Box Labeling.
Accepted Labels: Labels that have been accepted (through manual or consensus) during the review process.
Rejected Labels: Labels that have been rejected during the review process.
Conflicted Labels: Labels that remain in conflict because they have not yet been resolved through manual review or consensus.
Missing Labels: Labels that were applied in Reviewer Mode but not provided by the labeler.
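The four label metrics above are essentially per-status counts. A minimal sketch of that tally, assuming a list of label records with a `status` field (an illustrative shape, not the product's real data model):

```python
from collections import Counter

# Hypothetical review statuses mirroring the four label metrics above.
LABEL_STATUSES = ("accepted", "rejected", "conflicted", "missing")

def tally_label_metrics(labels):
    """Count a labeler's labels by review status.

    `labels` is a list of dicts like {"labeler": ..., "status": ...},
    an assumed shape used only for this example.
    """
    counts = Counter(label["status"] for label in labels)
    return {status: counts.get(status, 0) for status in LABEL_STATUSES}

reviewed = [
    {"labeler": "alice", "status": "accepted"},
    {"labeler": "alice", "status": "accepted"},
    {"labeler": "alice", "status": "rejected"},
    {"labeler": "alice", "status": "conflicted"},
]
print(tally_label_metrics(reviewed))
# {'accepted': 2, 'rejected': 1, 'conflicted': 1, 'missing': 0}
```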
Answers
These metrics apply only to Row and Document Labeling. If the project contains multiple questions, each answer is counted separately.
Accepted Answers: Answers that have been accepted (through manual or consensus) during the review process.
Rejected Answers: Answers that have been rejected during the review process.
Answered Rows: The number of rows that have been labeled; specific to Row Labeling.
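To make the counting rules concrete, the sketch below counts every answer once (so a row with two questions can contribute two accepted answers) and counts a row as answered when it has at least one answer. The row/answer record shapes are assumptions for illustration, not a real export schema.

```python
def count_answer_metrics(rows):
    """Aggregate answer metrics across rows.

    `rows` is a list of {"row_id": ..., "answers": [...]} dicts, where each
    answer is {"question": ..., "status": ...}. This shape is assumed.
    """
    accepted = rejected = answered_rows = 0
    for row in rows:
        if row["answers"]:  # a row counts as answered if it has any answer
            answered_rows += 1
        for answer in row["answers"]:
            # Each answer counts once, even several within the same row.
            if answer["status"] == "accepted":
                accepted += 1
            elif answer["status"] == "rejected":
                rejected += 1
    return {
        "accepted_answers": accepted,
        "rejected_answers": rejected,
        "answered_rows": answered_rows,
    }

rows = [
    {"row_id": 1, "answers": [{"question": "q1", "status": "accepted"},
                              {"question": "q2", "status": "accepted"}]},
    {"row_id": 2, "answers": [{"question": "q1", "status": "rejected"},
                              {"question": "q2", "status": "accepted"}]},
    {"row_id": 3, "answers": []},
]
print(count_answer_metrics(rows))
# {'accepted_answers': 3, 'rejected_answers': 1, 'answered_rows': 2}
```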