Written by the Fiveable Content Team • Last updated September 2025
Verified for the 2026 exam
Definition
Edge detection is the process of identifying and highlighting the boundaries between different objects or regions in an image. It works by detecting rapid changes in pixel intensity, since a sharp intensity change usually marks the transition from one object or region to another.
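As a minimal sketch of the idea (toy intensity values and an arbitrary threshold, not a production detector), a sharp jump between neighboring pixels in a 1-D scanline can be flagged as an edge:

```python
import numpy as np

# Toy 1-D "scanline" of pixel intensities: a dark region, then a bright region
intensities = np.array([10, 12, 11, 13, 200, 198, 201, 199], dtype=float)

# An edge shows up as a large jump between neighboring pixels
diffs = np.abs(np.diff(intensities))

# Flag positions where the jump exceeds a threshold (50 is an arbitrary choice)
edge_positions = np.where(diffs > 50)[0]
print(edge_positions)  # [3] -> an edge between pixels 3 and 4
```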
Related terms
Contrast: Contrast is the difference in brightness, color, or intensity between two adjacent pixels or regions. In edge detection, high contrast makes sharp transitions easier to detect.
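One common way to quantify contrast is root-mean-square (RMS) contrast, the standard deviation of the intensities in a region; a minimal sketch with made-up pixel values:

```python
import numpy as np

# Two small patches: one nearly uniform (low contrast),
# one with large intensity swings (high contrast)
low = np.array([[100, 105], [102, 98]], dtype=float)
high = np.array([[10, 240], [250, 5]], dtype=float)

def rms_contrast(region):
    """RMS contrast: the standard deviation of a region's intensities."""
    return region.std()

print(rms_contrast(low), rms_contrast(high))  # small value vs. large value
```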
Gradient: The gradient measures how rapidly, and in which direction, pixel values change across neighboring pixels. By analyzing gradients, edge detection algorithms can locate areas where intensity changes sharply and mark them as edges.
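A minimal sketch using NumPy's np.gradient on a synthetic two-tone image (the image size and values are arbitrary): the gradient magnitude peaks exactly where the intensity jumps.

```python
import numpy as np

# Synthetic image: dark left half, bright right half -> one vertical edge
img = np.zeros((5, 8), dtype=float)
img[:, 4:] = 255.0

# np.gradient returns the rate of change along each axis (rows, then columns)
gy, gx = np.gradient(img)

# The gradient magnitude is large only where intensity changes rapidly
magnitude = np.hypot(gx, gy)
print(magnitude.max(axis=0))  # peaks around columns 3-4 (the edge), zero elsewhere
```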
Canny Edge Detector: The Canny edge detector is a widely used multi-stage algorithm for accurate and reliable edge detection. It combines noise reduction (typically Gaussian smoothing), gradient calculation, non-maximum suppression to thin edges to single-pixel width, and hysteresis thresholding to keep strong edges along with their weaker, connected continuations.
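OpenCV exposes this algorithm as cv2.Canny; a minimal sketch on a synthetic image (the thresholds 100 and 200 are illustrative choices, not recommendations):

```python
import numpy as np
import cv2  # OpenCV; install with: pip install opencv-python

# Synthetic test image: a bright square on a dark background
img = np.zeros((100, 100), dtype=np.uint8)
img[30:70, 30:70] = 255

# cv2.Canny takes low/high hysteresis thresholds: gradients above the high
# threshold start edges, and weaker ones above the low threshold extend them
edges = cv2.Canny(img, threshold1=100, threshold2=200)

print(edges.max())              # 255 where edges were found
print(np.count_nonzero(edges))  # roughly the square's perimeter in pixels
```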