Computer Vision Essentials
Intersection Over Union (IoU) and Evaluation Metrics
How It's Computed
Mathematically, IoU is given by:

$$\text{IoU} = \frac{\text{Area of Overlap}}{\text{Area of Union}}$$

Where:
Area of Overlap is the intersection of the predicted and actual bounding boxes;
Area of Union is the total area covered by both boxes.
def compute_iou(boxA, boxB):
    # Coordinates of the intersection rectangle
    xA = max(boxA[0], boxB[0])
    yA = max(boxA[1], boxB[1])
    xB = min(boxA[2], boxB[2])
    yB = min(boxA[3], boxB[3])

    # Intersection area (zero if the boxes do not overlap)
    interArea = max(0, xB - xA) * max(0, yB - yA)

    # Areas of both boxes
    boxAArea = (boxA[2] - boxA[0]) * (boxA[3] - boxA[1])
    boxBArea = (boxB[2] - boxB[0]) * (boxB[3] - boxB[1])

    # Union area
    unionArea = boxAArea + boxBArea - interArea

    # Guard against degenerate boxes with zero union area
    iou = interArea / unionArea if unionArea > 0 else 0.0
    return iou

# Example usage
box1 = [50, 50, 150, 150]  # [x1, y1, x2, y2]
box2 = [100, 100, 200, 200]
iou_score = compute_iou(box1, box2)
print("IoU Score:", iou_score)
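For the example above, the two 100×100 boxes overlap in a 50×50 region, so the intersection area is 2,500; the union is 10,000 + 10,000 - 2,500 = 17,500, giving an IoU of 2,500 / 17,500 ≈ 0.143.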
IoU as a Metric for Bounding Box Accuracy
IoU is commonly used to assess how well a predicted bounding box aligns with the ground truth. Higher IoU values indicate better alignment, with an IoU of 1.0 meaning perfect overlap and 0.0 meaning no overlap at all.
Thresholding IoU for True Positives and False Positives
To determine whether a detection counts as correct (a true positive) or incorrect (a false positive), an IoU threshold is typically set. With the commonly used threshold of 0.5:
IoU ≥ 0.5: considered a True Positive (TP);
IoU < 0.5: considered a False Positive (FP).
Setting a higher IoU threshold increases precision but may decrease recall, since fewer detections meet the stricter criterion.
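To make the thresholding concrete, below is a minimal sketch of how detections can be classified against ground-truth boxes, reusing the compute_iou function defined above. The classify_detections name and the greedy one-to-one matching are simplifying assumptions for illustration; production evaluators (such as the COCO tools) also sort predictions by confidence before matching.

def classify_detections(pred_boxes, gt_boxes, iou_threshold=0.5):
    # Greedy matching: each ground-truth box can match at most one prediction
    matched_gt = set()
    tp, fp = 0, 0
    for pred in pred_boxes:
        # Find the best-overlapping, still-unmatched ground-truth box
        best_iou, best_idx = 0.0, None
        for i, gt in enumerate(gt_boxes):
            if i in matched_gt:
                continue
            iou = compute_iou(pred, gt)
            if iou > best_iou:
                best_iou, best_idx = iou, i
        if best_idx is not None and best_iou >= iou_threshold:
            tp += 1  # overlap meets the threshold: true positive
            matched_gt.add(best_idx)
        else:
            fp += 1  # no sufficiently overlapping ground truth: false positive
    fn = len(gt_boxes) - len(matched_gt)  # undetected ground truths: false negatives
    return tp, fp, fn

# Example usage with hypothetical boxes
preds = [[50, 50, 150, 150], [200, 200, 300, 300]]
gts = [[60, 60, 160, 160]]
print(classify_detections(preds, gts))  # -> (1, 1, 0)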
Evaluation Metrics: Precision, Recall, and mAP
In addition to IoU, other evaluation metrics help assess object detection models:
Precision: measures the proportion of correctly predicted bounding boxes among all predictions;
Recall: measures the proportion of correctly predicted bounding boxes among all ground truth objects;
Mean Average Precision (mAP): averages the average precision (AP) over object categories, and in benchmarks such as COCO also over multiple IoU thresholds, providing a comprehensive single-number summary of model performance (a sketch of the AP computation follows the code below).
def precision_recall(tp, fp, fn):
    # Precision: fraction of predictions that are correct
    precision = tp / (tp + fp) if (tp + fp) > 0 else 0
    # Recall: fraction of ground-truth objects that were detected
    recall = tp / (tp + fn) if (tp + fn) > 0 else 0
    return precision, recall

# Example usage
tp, fp, fn = 50, 10, 20
precision, recall = precision_recall(tp, fp, fn)
print(f"Precision: {precision:.2f}, Recall: {recall:.2f}")
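Finally, here is a minimal sketch of how average precision (AP), the building block of mAP, can be computed from a precision-recall curve. It follows the all-point interpolation used in PASCAL VOC-style evaluation; the average_precision name and the assumption that the inputs are ordered by decreasing detection confidence (so recall is non-decreasing) are illustrative choices, not a specific library's API. mAP is then the mean of AP over all object categories and, in COCO-style evaluation, over several IoU thresholds.

import numpy as np

def average_precision(precisions, recalls):
    # precisions/recalls: metric values at successive confidence cutoffs,
    # with recall assumed non-decreasing (an assumption of this sketch)
    recalls = np.concatenate(([0.0], recalls, [1.0]))
    precisions = np.concatenate(([0.0], precisions, [0.0]))

    # Interpolate: make precision monotonically non-increasing, right to left
    for i in range(len(precisions) - 2, -1, -1):
        precisions[i] = max(precisions[i], precisions[i + 1])

    # Sum rectangle areas wherever recall increases
    idx = np.where(recalls[1:] != recalls[:-1])[0]
    return float(np.sum((recalls[idx + 1] - recalls[idx]) * precisions[idx + 1]))

# Example usage with a hypothetical precision-recall curve
p = np.array([1.0, 1.0, 0.67, 0.75])
r = np.array([0.25, 0.5, 0.5, 0.75])
print(f"AP: {average_precision(p, r):.3f}")  # -> AP: 0.688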
IoU serves as a fundamental metric in evaluating object detection models, helping assess the accuracy of predicted bounding boxes. By combining IoU with precision, recall, and mAP, researchers and engineers can fine-tune their models to achieve higher detection accuracy and reliability.