# Cohen's Kappa Calculation

[Cohen's Kappa](https://en.wikipedia.org/wiki/Cohen's_kappa) is one of the algorithms supported by Datasaur for calculating inter-annotator agreement while taking the possibility of chance agreement into account. This section explains how labels from labelers and reviewers are processed into an agreement matrix and used to compute Cohen's Kappa.

<figure><img src="/files/1LvsQClMvAreWFDnvjIv" alt="" width="194"><figcaption></figcaption></figure>
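Outside Datasaur, the same metric can be sanity-checked with scikit-learn's `cohen_kappa_score`. The sketch below is only an illustration: the label lists are hypothetical placeholders, not the sample data used on this page, and this is not Datasaur's internal implementation.

```python
# Minimal sketch: Cohen's Kappa between two labelers with scikit-learn.
# The label lists are hypothetical placeholders, not the sample data below.
from sklearn.metrics import cohen_kappa_score

labeler_a = ["PER", "ORG", "PER", "LOC", "PER", "ORG", "LOC"]
labeler_b = ["PER", "ORG", "LOC", "LOC", "PER", "PER", "LOC"]

print(cohen_kappa_score(labeler_a, labeler_b))  # agreement corrected for chance
```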

## Sample data

Suppose there are two labelers, Labeler A and Labeler B, who labeled the same sentences.

![Labeler A](/files/vzeXRZAUFjuwCMMLQN6C)

![Labeler B](/files/sxVnm8hCNDJBh3Fl2AcE)

There is also a reviewer who labeled the same sentences.

![Reviewer](/files/YeKsPDlk3nry560V27aB)

## Calculating the data

### Agreement records

Based on the screenshots above, we map those labels into the agreement records below:

### Agreement table/confusion matrix

The agreement records are then converted into a confusion matrix. For this example, the matrix is constructed using data from Labeler A and Labeler B.

![](/files/f8E2rMfySEkX9vqvmh0R)
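As a rough sketch of how an agreement matrix like this can be assembled, the snippet below cross-tabulates two labelers' answers with pandas. The label lists are the same hypothetical placeholders as above, not the data from the screenshots, so the counts will not match the matrix image.

```python
# Sketch: build an agreement (confusion) matrix from two labelers' labels.
# Hypothetical placeholder data; a missing label would appear as its own class.
import pandas as pd

labeler_a = ["PER", "ORG", "PER", "LOC", "PER", "ORG", "LOC"]
labeler_b = ["PER", "ORG", "LOC", "LOC", "PER", "PER", "LOC"]

matrix = pd.crosstab(
    pd.Series(labeler_a, name="Labeler A"),
    pd.Series(labeler_b, name="Labeler B"),
)
print(matrix)
```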

### Calculating the Kappa

From the matrix above, there are **7** records with **4** agreements.

![](/files/-MbjYJooXhsKEIXkvCE0)

The observed proportionate agreement is:

![](/files/-MbjYJop2IZQ3GROglNg)
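Written out with the numbers above (4 agreements out of 7 records):

```latex
p_o = \frac{\text{agreements}}{\text{records}} = \frac{4}{7} \approx 0.571
```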

To calculate the probability of random agreement, we note that:

* Labeler A labeled `EVE` once and Labeler B didn't label `EVE`. Therefore, the probability of random agreement on the label `EVE` is:

![](/files/voqPzphxcDRdsuFfbUwP)

* The probability of random agreement is computed in the same way for every label (the general form is written out after this list):

![](/files/-MbjYJormS9xj_2DE282)
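In general terms, if $N$ is the number of records and $n_{A,k}$, $n_{B,k}$ are how many times Labeler A and Labeler B used label $k$, each per-label term is the product of the two proportions. For `EVE`, with one use by Labeler A and none by Labeler B:

```latex
p_k = \frac{n_{A,k}}{N} \cdot \frac{n_{B,k}}{N}
\qquad\Rightarrow\qquad
p_{\mathrm{EVE}} = \frac{1}{7} \cdot \frac{0}{7} = 0
```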

The overall probability of random agreement is the sum of these per-label probabilities:

![](/files/-MbjYJos19gO4G0E6zKB)
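In symbols, summing the per-label terms:

```latex
p_e = \sum_{k} p_k = \sum_{k} \frac{n_{A,k}}{N} \cdot \frac{n_{B,k}}{N}
```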

Finally, we can calculate Cohen's Kappa:

![](/files/-MbjYJotnA6mMLbDJk-k)

The Kappa value for Labeler A and Labeler B is 0.49.
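The full walkthrough can also be reproduced in a few lines of Python. This is a sketch using the same hypothetical placeholder labels as earlier, not the exact data from the screenshots, so the printed value will not be 0.49.

```python
# Sketch: Cohen's Kappa from scratch, following the steps above.
from collections import Counter

labeler_a = ["PER", "ORG", "PER", "LOC", "PER", "ORG", "LOC"]
labeler_b = ["PER", "ORG", "LOC", "LOC", "PER", "PER", "LOC"]

n = len(labeler_a)

# Observed proportionate agreement: share of records with identical labels.
p_o = sum(a == b for a, b in zip(labeler_a, labeler_b)) / n

# Chance agreement: for each label, multiply the proportion of use by each labeler.
counts_a, counts_b = Counter(labeler_a), Counter(labeler_b)
p_e = sum((counts_a[k] / n) * (counts_b[k] / n) for k in set(counts_a) | set(counts_b))

kappa = (p_o - p_e) / (1 - p_e)
print(f"p_o={p_o:.3f}  p_e={p_e:.3f}  kappa={kappa:.3f}")
```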

#### **Kappa for Labeler A and Reviewer**

With the same calculation, the Kappa value for Labeler A and the reviewer is 0.36.

![](/files/uYZhwykZn25xuFy0S14x)

#### **Kappa for Labeler B and Reviewer**

With the same calculation, the Kappa value for Labeler B and the reviewer is 0.475.

![](/files/kGfToI5BXZHvO5E1I7tI)

![](/files/lvx1e98p4uXISa2qk1vP)

## Summary

* Missing labels from a labeler are treated as empty labels (see the sketch after this list).
* Chance agreement depends on:
  * The number of labels in a project.
  * The number of label classes.
* When both labelers agree but the reviewer rejects the labels:
  * The agreement between the two labelers increases.
  * The agreement between the labelers and the reviewer decreases.
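The first point can be illustrated with a small sketch. It assumes, as a simplification, that a record the second labeler skipped is represented as an empty-string class before the agreement is computed; the data is hypothetical.

```python
# Sketch: a skipped record is kept as an empty label ("") rather than dropped,
# so it still counts as a disagreement between the two labelers.
from sklearn.metrics import cohen_kappa_score

labeler_a = ["PER", "ORG", "PER", "LOC"]
labeler_b = ["PER", "ORG", "PER", ""]  # the last record was not labeled

print(cohen_kappa_score(labeler_a, labeler_b))
```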

