IoU (Intersection over Union)
How well does a prediction fit the ground truth?
IoU is used to estimate how well a predicted mask or bounding box matches the ground truth data.
IoU is also known as the Jaccard index or Jaccard similarity coefficient.

Interpretation / calculation

The IoU is calculated by dividing the area of overlap between the prediction and the ground truth label by the area of their union.
The output ranges from 0 (no overlap) to 1 (perfect overlap) and is often reported as a percentage.
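
As a formula, for a predicted region A and a ground-truth region B: IoU(A, B) = |A ∩ B| / |A ∪ B|.

For intuition, here is a minimal sketch of IoU for two axis-aligned bounding boxes. The (x1, y1, x2, y2) box format and the `box_iou` name are illustrative assumptions, not part of the implementations below.

```python
def box_iou(a, b):
    # Illustrative sketch: boxes are assumed to be (x1, y1, x2, y2) tuples.
    # Coordinates of the intersection rectangle
    x1 = max(a[0], b[0])
    y1 = max(a[1], b[1])
    x2 = min(a[2], b[2])
    y2 = min(a[3], b[3])

    # Clamp to zero when the boxes do not overlap at all
    inter = max(0, x2 - x1) * max(0, y2 - y1)

    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter

    return inter / union if union > 0 else 0.0

print(box_iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ≈ 0.1429
```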

Code implementation

NumPy

```python
import numpy as np

SMOOTH = 1e-6

def iou_numpy(outputs: np.ndarray, labels: np.ndarray):
    outputs = outputs.squeeze(1)  # BATCH x 1 x H x W => BATCH x H x W

    intersection = (outputs & labels).sum((1, 2))  # Will be zero if Truth=0 or Prediction=0
    union = (outputs | labels).sum((1, 2))         # Will be zero if both are 0

    iou = (intersection + SMOOTH) / (union + SMOOTH)  # Smoothed division to avoid 0/0

    # Equivalent to comparing the IoU against the thresholds 0.5, 0.55, ..., 0.95
    thresholded = np.ceil(np.clip(20 * (iou - 0.5), 0, 10)) / 10

    return thresholded  # Or thresholded.mean() for the average across the batch
```
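
A hypothetical call, assuming boolean masks; the shapes and the random data here are illustrative only:

```python
rng = np.random.default_rng(0)
preds = rng.random((4, 1, 8, 8)) > 0.5  # BATCH x 1 x H x W boolean predictions
truth = rng.random((4, 8, 8)) > 0.5     # BATCH x H x W boolean ground truth

print(iou_numpy(preds, truth))          # Per-sample thresholded IoU
print(iou_numpy(preds, truth).mean())   # Average across the batch
```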
PyTorch

```python
import torch

SMOOTH = 1e-6

def iou_pytorch(outputs: torch.Tensor, labels: torch.Tensor):
    # You can comment out this line if you are passing tensors of equal shape,
    # but if you are passing output from a UNet or similar, it will most probably
    # have the BATCH x 1 x H x W shape
    outputs = outputs.squeeze(1)  # BATCH x 1 x H x W => BATCH x H x W

    intersection = (outputs & labels).float().sum((1, 2))  # Will be zero if Truth=0 or Prediction=0
    union = (outputs | labels).float().sum((1, 2))         # Will be zero if both are 0

    iou = (intersection + SMOOTH) / (union + SMOOTH)  # We smooth our division to avoid 0/0

    # Equivalent to comparing the IoU against the thresholds 0.5, 0.55, ..., 0.95
    thresholded = torch.clamp(20 * (iou - 0.5), 0, 10).ceil() / 10

    return thresholded  # Or thresholded.mean() if you are interested in the average across the batch
```
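
A hypothetical call mirroring the NumPy example, again with illustrative random boolean masks:

```python
preds = torch.rand(4, 1, 8, 8) > 0.5  # BATCH x 1 x H x W boolean predictions
truth = torch.rand(4, 8, 8) > 0.5     # BATCH x H x W boolean ground truth

print(iou_pytorch(preds, truth))          # Per-sample thresholded IoU
print(iou_pytorch(preds, truth).mean())   # Average across the batch
```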
