Skeletonization is a process for reducing foreground regions in a binary image to a skeletal remnant that largely preserves the extent and connectivity of the original region while discarding most of the original foreground pixels. One way to picture it is as a grassfire: imagine the boundary of the region being set alight everywhere at once, with the fire burning inwards at a uniform rate; the skeleton is the set of points where advancing fire fronts meet and extinguish each other. Seen this way, it is clear that thinning produces a sort of skeleton.
The skeleton can be produced in two main ways. The first is to use some kind of morphological thinning that successively erodes pixels away from the boundary, while preserving the end points of line segments, until no more thinning is possible. The alternative is to first calculate the distance transform of the image and then keep only its local maxima (i.e. pixels whose value is not exceeded by any of their 8 neighbours), setting all other pixels to background. Both methods are very sensitive to noise and to small variations in the boundary: a minor bump on the outline of a region can give rise to a long spurious branch in the skeleton. A sketch of both methods in code is given after the figure.
Figure: an example binary region before and after skeletonization.
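To make the two routes concrete, here is a minimal sketch of both, assuming NumPy, SciPy and scikit-image are available; the toy rectangle image and all variable names are ours, not taken from any particular reference implementation.

```python
import numpy as np
from scipy import ndimage
from skimage.morphology import skeletonize

# Toy binary image: a filled rectangle as the foreground region.
image = np.zeros((40, 60), dtype=bool)
image[10:30, 10:50] = True

# Method 1: iterative thinning that preserves end points.
skeleton_thin = skeletonize(image)

# Method 2: local maxima of the Euclidean distance transform.
# A foreground pixel is kept if none of its 8 neighbours has a
# strictly larger distance value (plateaus are kept whole here).
dist = ndimage.distance_transform_edt(image)
neighbour_max = ndimage.maximum_filter(dist, size=3)
skeleton_dist = (dist == neighbour_max) & image

print(skeleton_thin.sum(), skeleton_dist.sum())
```

Note that the distance-transform route typically yields a sparser set of pixels that is not guaranteed to be connected, which is one reason thinning-based skeletons are often preferred when connectivity matters.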
Another way to think about the skeleton is as the locus of centres of bi-tangent circles, i.e. circles that fit entirely within the foreground region and touch its boundary at two or more points. Equivalently, each skeleton point is the centre of a maximal inscribed circle, and the radius of that circle is exactly the distance-transform value at the point, which links this view back to the second method above.
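The maximal-circle half of that claim is easy to check numerically. The following snippet, under the same assumptions and toy image as the sketch above, verifies that the circle whose radius is the distance-transform value at a centrally placed pixel fits inside the region; it does not test bitangency itself, only the inscribed-circle property.

```python
import numpy as np
from scipy import ndimage

# Same toy foreground region as before.
image = np.zeros((40, 60), dtype=bool)
image[10:30, 10:50] = True
dist = ndimage.distance_transform_edt(image)

def inscribed_circle_fits(p):
    """Does the open circle of radius dist[p] centred at p lie in the foreground?"""
    yy, xx = np.ogrid[:image.shape[0], :image.shape[1]]
    inside = (yy - p[0]) ** 2 + (xx - p[1]) ** 2 < dist[p] ** 2
    return bool(np.all(image[inside]))

# A pixel on the rectangle's central axis: its maximal circle fits.
print(inscribed_circle_fits((20, 30)))  # True
```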