Consensus
Determines how much labeler agreement is required before a label is automatically accepted in your project.
Consensus determines how many labelers must agree on a label for that label to be automatically accepted by Datasaur. If a label does not reach enough agreement, it is marked as a "conflict" label, because it conflicts with the peer review consensus number you have set for the project. Currently, we provide two types of consensus:
No Consensus (every label is considered a conflict label, to be resolved by a reviewer)
Peer Review Consensus (a label is accepted once the set number of labelers agree on it; if the consensus is set to 1, every label is automatically accepted)
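The rule boils down to a simple threshold check. The sketch below is only an illustration of that rule, not Datasaur's actual implementation; the label_status helper and its arguments are hypothetical.

```python
from typing import Optional


def label_status(agreeing_labelers: int, consensus: Optional[int]) -> str:
    """Classify a label under a hypothetical consensus rule.

    consensus=None models the No Consensus option: every label is a
    conflict and must be resolved by a reviewer. Otherwise the label is
    accepted automatically once the number of agreeing labelers reaches
    the peer review consensus number set for the project.
    """
    if consensus is None:
        return "conflict"   # No Consensus: the reviewer resolves everything
    if agreeing_labelers >= consensus:
        return "accepted"   # enough agreement: accepted automatically
    return "conflict"       # not enough agreement: the reviewer resolves it


# A consensus of 1 means every label is accepted automatically.
assert label_status(agreeing_labelers=1, consensus=1) == "accepted"
# With a consensus of 3, a label applied by only 2 labelers is a conflict.
assert label_status(agreeing_labelers=2, consensus=3) == "conflict"
# With No Consensus, even a unanimous label stays in conflict for review.
assert label_status(agreeing_labelers=5, consensus=None) == "conflict"
```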
Setting it up takes two simple steps:
1) Go to Step 4: Assignment, where the No Consensus option is available.
2) Choose the “No Consensus” option. Done!
Use case: What can No Consensus be used for? If you would like your reviewers to review every label executed by their labelers, this is a good option. With No Consensus, every label a labeler executes is considered a conflict label, so reviewers must manually accept or reject each one in Reviewer Mode.
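To make the trade-off concrete, the snippet below contrasts the reviewer's queue under a peer review consensus of 2 with the queue under No Consensus. The label data and field names are hypothetical and purely illustrative.

```python
# Hypothetical labels, each with the number of labelers who agreed on it.
labels = [
    {"text": "John Doe", "label": "PERSON", "agreeing_labelers": 3},
    {"text": "Jakarta", "label": "LOCATION", "agreeing_labelers": 1},
    {"text": "Datasaur", "label": "ORGANIZATION", "agreeing_labelers": 2},
]

# Peer review consensus of 2: only labels short of 2 agreements need review.
needs_review_consensus_2 = [l for l in labels if l["agreeing_labelers"] < 2]

# No Consensus: the reviewer sees every label, regardless of agreement.
needs_review_no_consensus = list(labels)

print(len(needs_review_consensus_2))   # 1 -> only "Jakarta"
print(len(needs_review_no_consensus))  # 3 -> everything
```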
Peer Review Consensus is explained further in the QA/Review section.