Abstract
In computer vision, we need significant edges: the boundaries between different surfaces, or between an object and its background.
In this paper, we propose a method for detecting significant edges based on the local distribution of gradient magnitude values. Analyzing this distribution tells us whether any significant edges exist locally: in edge areas, the gradient magnitude values separate into two distinct clusters, whereas in non-edge areas they form a single cluster. Significant edges in an edge area are detected by an automatic thresholding technique, followed by line thinning. Since the method does not use any smoothing, the detected edges remain at their true locations. Experiments show that significant edges are detected while textural edges are suppressed.
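A minimal sketch of the approach described above, assuming simple finite-difference gradients and Otsu's method as the "automatic thresholding technique"; the window size, the bimodality test, and the separability cutoff are illustrative choices, not parameters taken from the paper.

```python
import numpy as np

def otsu_threshold(values, bins=64):
    """Return (threshold, separability) for a 1-D sample via Otsu's criterion."""
    hist, bin_edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / max(hist.sum(), 1)
    centers = 0.5 * (bin_edges[:-1] + bin_edges[1:])
    w0 = np.cumsum(p)                      # class-0 probability at each split
    mu = np.cumsum(p * centers)            # cumulative mean
    mu_t = mu[-1]                          # total mean
    # Between-class variance for every candidate split point.
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    sigma_b = np.nan_to_num(sigma_b)
    k = int(np.argmax(sigma_b))
    # Separability measure in [0, 1]: high when two clusters are well separated.
    separability = sigma_b[k] / max(values.var(), 1e-12)
    return centers[k], separability

def detect_significant_edges(image, win=16, sep_min=0.7):
    """Mark pixels whose local gradient-magnitude distribution splits
    into two well-separated clusters (an 'edge area')."""
    gy, gx = np.gradient(image.astype(float))
    mag = np.hypot(gx, gy)
    edges = np.zeros(mag.shape, dtype=bool)
    for r in range(0, mag.shape[0], win):
        for c in range(0, mag.shape[1], win):
            block = mag[r:r + win, c:c + win]
            t, sep = otsu_threshold(block.ravel())
            if sep >= sep_min:             # two clusters -> edge area: threshold
                edges[r:r + win, c:c + win] = block > t
            # one cluster -> non-edge (textural) area: suppressed
    return edges  # line thinning (e.g. morphological skeletonization) would follow
```

Blocks whose gradient histogram is unimodal never contribute edge pixels, which is how textural responses get suppressed; only blocks with a clearly bimodal distribution are thresholded.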