# Inter-Annotator Agreement (IAA)

Data scientists have long used inter-annotator agreement (IAA) to measure how consistently multiple annotators make the same annotation decision for a given label category or class. This information is available directly through Datasaur's dashboards. The calculation helps you gauge the clarity of your annotation guidelines and the reproducibility of your results.

We support two algorithms to calculate the agreement between two annotators.

1. [Cohen's Kappa](https://docs.datasaur.ai/workspace-management/analytics/inter-annotator-agreement/cohens-kappa-calculation)
2. [Krippendorff's Alpha](https://docs.datasaur.ai/workspace-management/analytics/inter-annotator-agreement/krippendorffs-alpha-calculation)
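To illustrate the first of these, a minimal sketch of Cohen's Kappa for two annotators labeling the same items (the labels and values here are illustrative, not Datasaur's internal implementation):

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e is the agreement expected by chance from each
    annotator's label distribution."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: fraction of items both annotators labeled identically.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: sum over classes of the product of marginal probabilities.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return 1.0 if p_e == 1 else (p_o - p_e) / (1 - p_e)

# Two annotators agree on 4 of 5 items.
a = ["POS", "POS", "NEG", "NEG", "POS"]
b = ["POS", "NEG", "NEG", "NEG", "POS"]
print(round(cohens_kappa(a, b), 3))  # 0.615
```

Note that kappa corrects raw agreement (0.8 here) downward for agreement expected by chance, which is why the score is lower than the raw match rate.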

Note that we apply Krippendorff's Alpha scale interpretation to both methods, as shown in Image 1.

![Image 1. Krippendorff's Alpha Scale](https://docs.google.com/drawings/u/0/d/skeGno0Afrw0ZIKAZpYbZ2A/image?w=513\&h=93\&rev=137\&ac=1\&parent=1UmG4KLgmliNMX-9QkW8wSX64xkFfe-GlU7jp5IM8lDI)

* **Discard** will be presented in red.
* **Tentative** will be presented in yellow.
* **Good** will be presented in green.
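The color-coded verdict above can be sketched as a simple threshold mapping. The cutoffs below are an assumption based on Krippendorff's commonly cited recommendation (scores of at least 0.800 reliable, at least 0.667 tentative); refer to Image 1 for the exact cutoffs Datasaur applies:

```python
def interpret_score(score):
    """Map an agreement score to the color-coded verdict.

    ASSUMPTION: thresholds follow Krippendorff's commonly cited
    recommendation; check Image 1 for the exact values used."""
    if score >= 0.800:
        return "Good"       # shown in green
    if score >= 0.667:
        return "Tentative"  # shown in yellow
    return "Discard"        # shown in red

print(interpret_score(0.85))  # Good
```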

✍ IAA is calculated in the background as soon as a project's status changes to **Ready for review** (after all labelers mark the project as complete) or **Complete** (after a reviewer marks the project as complete).

## Multiple Ways to View the Data

1. Overview

   * You can view the inter-annotator agreement page by navigating to the **Analytics** page in the left sidebar, selecting **Team Overview**, then opening the **Inter-Annotator Agreement** tab. Please note that only **admins** can access this page.
   * You can also filter the IAA for a specific project.
   * At the workspace level, IAA is calculated as a weighted average across all projects. This means it doesn't re-calculate the entire set of labels or answers from each project. Instead, the weighted approach considers the number of occurrences of each label or answer between two annotators, resulting in a figure that more accurately represents the real situation.

   <figure><img src="https://448889121-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MbjY0HseEqu7LtYAt4d%2Fuploads%2Fgit-blob-80840320523a3112b48483bf18e73599af2b0672%2FAnalytics%20-%20IAA%20-%20Cohen&#x27;s%20Kappa%20-%20All%20projects.png?alt=media" alt=""><figcaption><p>Image 2. Team Overview IAA</p></figcaption></figure>
2. Project

   * The IAA is also available when you open a project's detail page. It shows the same IAA information as above, scoped to that project.

   <figure><img src="https://448889121-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MbjY0HseEqu7LtYAt4d%2Fuploads%2Fgit-blob-40df0fc741aad5c06206b822a5bdc8bf79dc65e1%2FAnalytics%20-%20IAA%20-%20Cohen&#x27;s%20Kappa%20-%20specific%20project.png?alt=media" alt=""><figcaption><p>Image 3. Project IAA</p></figcaption></figure>
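The workspace-level weighted average described under Team Overview can be sketched as follows. The project names, scores, and pair counts are illustrative; the assumption is that each project's IAA score is weighted by the number of label or answer occurrences shared by the two annotators:

```python
def workspace_iaa(projects):
    """Weighted average of per-project IAA scores.

    Each project contributes its score weighted by the number of shared
    label/answer occurrences ("pairs"), so larger projects influence the
    workspace-level figure more than smaller ones."""
    total_weight = sum(p["pairs"] for p in projects)
    if total_weight == 0:
        return 0.0
    return sum(p["iaa"] * p["pairs"] for p in projects) / total_weight

projects = [
    {"name": "project-a", "iaa": 0.90, "pairs": 300},  # illustrative values
    {"name": "project-b", "iaa": 0.60, "pairs": 100},
]
print(workspace_iaa(projects))  # 0.825
```

A plain (unweighted) mean of the two scores would be 0.75; weighting by occurrence count pulls the figure toward the larger project.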


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available on this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.datasaur.ai/workspace-management/analytics/inter-annotator-agreement.md?ask=<question>
```
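A minimal sketch of building and issuing such a request with Python's standard library. The question text is illustrative, and the response format is not specified on this page:

```python
from urllib.parse import urlencode

BASE = "https://docs.datasaur.ai/workspace-management/analytics/inter-annotator-agreement.md"

def build_ask_url(question):
    """Build the documentation-query URL; the question must be URL-encoded."""
    return f"{BASE}?{urlencode({'ask': question})}"

url = build_ask_url("Which roles can view the Inter-Annotator Agreement tab?")
print(url)
# To perform the GET (requires network access):
#   from urllib.request import urlopen
#   answer = urlopen(url).read().decode()
```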

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question, along with relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present on the current page, when you need clarification or additional context, or when you want to retrieve related documentation sections.
