JOURNAL ARTICLE
RESEARCH SUPPORT, NON-U.S. GOV'T

Automatic thoracic anatomy segmentation on CT images using hierarchical fuzzy models and registration.

Medical Physics, March 2016
PURPOSE: In an attempt to overcome several hurdles that exist in organ segmentation approaches, the authors previously described a general automatic anatomy recognition (AAR) methodology for segmenting all major organs in multiple body regions body-wide [J. K. Udupa et al., "Body-wide hierarchical fuzzy modeling, recognition, and delineation of anatomy in medical images," Med. Image Anal. 18(5), 752-771 (2014)]. That approach utilized fuzzy modeling strategies and a hierarchical organization of organs, and divided the segmentation task into a recognition step to localize organs, followed by a delineation step to demarcate their boundaries. It achieved speed and accuracy without employing image/object registration, which is commonly utilized in many reported methods, particularly atlas-based ones. In this paper, the authors' aim is to study how registration may influence the performance of the AAR approach. By tightly coupling the recognition and delineation steps, performing registration in the hierarchical order of the organs, and applying several object-specific refinements, the authors demonstrate that improved accuracy for recognition and delineation can be achieved by judicious use of image/object registration.

METHODS: The presented approach consists of three processes: model building, hierarchical recognition, and delineation. Labeled binary images for each organ are registered and aligned into a 3D fuzzy set representing the fuzzy shape model for that organ. The hierarchical relation and mean location relation between different organs are captured in the model, as are the gray-level intensity distributions of the corresponding organ regions in the original images. Following the hierarchical structure and location relation, the fuzzy shape models of the organs are registered in turn to the given target image to achieve object recognition. A fuzzy connectedness delineation method is then employed to obtain the final segmentation of each organ, with seed points provided by the recognition step. The authors assess the performance of this method for both nonsparse (compact blob-like) and sparse (thin tubular) objects in the thorax.
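The model-building step described above amounts to fusing co-aligned binary organ masks into a voxel-wise membership map. The sketch below is an illustrative simplification of that idea only (the paper's actual pipeline involves registration and hierarchical relations not shown here); the function name and toy data are assumptions, not from the source.

```python
import numpy as np

def build_fuzzy_model(aligned_masks):
    """Average co-aligned binary organ masks into a 3D fuzzy shape model.

    Each voxel's value is the fraction of training samples in which that
    voxel belongs to the organ, i.e. a membership value in [0, 1].
    """
    stack = np.stack([m.astype(np.float64) for m in aligned_masks], axis=0)
    return stack.mean(axis=0)

# Toy example: three already-aligned 2x2x1 binary masks.
masks = [
    np.array([[[1], [1]], [[0], [1]]]),
    np.array([[[1], [0]], [[0], [1]]]),
    np.array([[[1], [1]], [[0], [0]]]),
]
model = build_fuzzy_model(masks)
# model[0, 0, 0] == 1.0 (organ present in all samples)
# model[1, 0, 0] == 0.0 (organ present in none)
```

At recognition time, such a fuzzy model would be placed in the target image via registration, and its high-membership voxels would supply seeds for the subsequent fuzzy connectedness delineation.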

RESULTS: Results for eight thoracic organs on 30 real CT images are presented. Overall, the delineation accuracy in terms of mean false positive and false negative volume fractions is 0.34% and 4.02%, respectively, for nonsparse objects, and 0.16% and 12.6%, respectively, for sparse objects. The two object groups achieve mean boundary distances relative to ground truth of 1.31 and 2.28 mm, respectively.
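The volume-fraction metrics above can be computed directly from a binary segmentation and its ground truth. The sketch below assumes the common definitions normalized by ground-truth volume (FPVF = |S \ G| / |G|, FNVF = |G \ S| / |G|); the exact normalization used in the paper may differ, and the function name and toy arrays are illustrative assumptions.

```python
import numpy as np

def volume_fractions(seg, gt):
    """False positive and false negative volume fractions.

    Assumed definitions, normalized by ground-truth volume |G|:
      FPVF = |seg AND NOT gt| / |gt|
      FNVF = |gt AND NOT seg| / |gt|
    """
    seg = seg.astype(bool)
    gt = gt.astype(bool)
    gt_vol = gt.sum()
    fpvf = np.logical_and(seg, ~gt).sum() / gt_vol
    fnvf = np.logical_and(~seg, gt).sum() / gt_vol
    return fpvf, fnvf

# Toy 1x4 example: one false positive voxel, one false negative voxel,
# against a ground truth of two voxels.
seg = np.array([[1, 1, 0, 0]])
gt = np.array([[0, 1, 1, 0]])
fp, fn = volume_fractions(seg, gt)
# fp == 0.5, fn == 0.5
```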

CONCLUSIONS: The hierarchical structure and location relation integrated into the model provide the initial pose for registration and make the recognition process efficient and robust. The 3D fuzzy model combined with hierarchical affine registration ensures that accurate recognition can be obtained for both nonsparse and sparse organs. Tailoring the registration process for each organ by specialized similarity criteria and updating the organ intensity properties based on refined recognition improve the overall segmentation process.
