Width Distributions for Shape Description
Author(s)
Zhang, X
Gao, Y
Griffith University Author(s)
Year published
2011
Abstract
Measuring the similarity between articulated shapes is a fundamental yet challenging problem. This paper proposes a novel shape descriptor based on Width Distributions (WD) that is robust to articulations. We show that width distributions are articulation insensitive yet descriptive enough to distinguish shapes with varied part structures. Because it measures distributions only, the proposed method requires no alignment between two objects and is therefore more robust than correspondence-based measurements. First, the medial axes of the objects are extracted and fitted with B-splines to remove outliers. The width of the object perpendicular to the medial axis is then calculated at each position along the medial axis. Histograms of these widths are compared using the chi-square distance to measure similarity. Experiments on a standard 2D shape database show that the proposed method outperforms standard shape distribution algorithms and performs comparably to other articulation-robust shape descriptors such as the inner-distance. It is also much faster than the more sophisticated inner-distance, since it is simpler in nature and requires only limited image processing and pattern classification. These results suggest that it is an efficient and effective method for describing and matching articulated shapes.
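The matching step described in the abstract, comparing width histograms with the chi-square distance, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the per-position widths along the medial axis have already been computed (from the B-spline-fitted medial axis), and the function names and bin count are hypothetical choices.

```python
import numpy as np

def width_histogram(widths, bins=32, max_width=1.0):
    """Build a normalized histogram of widths sampled along a medial axis.

    `widths` are assumed to be precomputed perpendicular widths, scaled so
    that max_width bounds them (e.g. normalized by the largest width).
    """
    hist, _ = np.histogram(widths, bins=bins, range=(0.0, max_width))
    return hist / hist.sum()

def chi_square_distance(h1, h2, eps=1e-10):
    """Symmetric chi-square distance between two normalized histograms.

    Small values mean similar width distributions; eps guards against
    division by zero in empty bins.
    """
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

# Example: two samplings of the same shape should be close,
# while a shape with very different part widths should be far.
a = width_histogram([0.10, 0.20, 0.20, 0.30], bins=8)
b = width_histogram([0.11, 0.19, 0.21, 0.29], bins=8)
c = width_histogram([0.70, 0.80, 0.90], bins=8)
```

Because only distributions are compared, no point-to-point correspondence or alignment between the two shapes is needed, which is the source of the method's robustness to articulation.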
Conference Title
Proceedings - 2011 International Conference on Digital Image Computing: Techniques and Applications, DICTA 2011
Copyright Statement
© 2011 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Subject
Computer vision