
Automated Segmentation and Quantification of the Right Ventricle in 2-D Echocardiography.

OBJECTIVE: The right ventricle receives less attention than its left counterpart in echocardiography research, practice and the development of automated solutions. In the work described here, we sought to determine whether deep learning methods for automated segmentation of the left ventricle in 2-D echocardiograms are also valid for the right ventricle. Additionally, we describe and explore a keypoint detection approach to segmentation that guards against the erratic behavior often displayed by segmentation models.

METHODS: We used a data set of right ventricle-focused echocardiographic images from 250 participants to train and evaluate several deep learning models for segmentation and keypoint detection. We propose a compact architecture (U-Net KP) employing the latter approach, designed to balance high speed with accuracy and robustness.
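A practical advantage of keypoint-based contours is that area-based quantification follows directly from the predicted coordinates. A minimal sketch, assuming the model emits an ordered closed contour of (x, y) keypoints (the contours below are hypothetical placeholders, not model output), computes right ventricular fractional area change via the shoelace formula:

```python
def polygon_area(points):
    """Area of a simple polygon from ordered (x, y) vertices (shoelace formula)."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def fractional_area_change(ed_points, es_points):
    """RV fractional area change (%): 100 * (EDA - ESA) / EDA,
    where EDA/ESA are end-diastolic/end-systolic cavity areas."""
    eda = polygon_area(ed_points)
    esa = polygon_area(es_points)
    return 100.0 * (eda - esa) / eda

# Hypothetical end-diastolic and end-systolic contours (toy squares, not echo data):
ed = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]              # area 16
es = [(0.5, 0.5), (3.5, 0.5), (3.5, 3.5), (0.5, 3.5)]              # area 9
fac = fractional_area_change(ed, es)                               # -> 43.75 %
```

Because the keypoints already form a valid closed contour, no post-hoc mask cleanup (hole filling, largest-component selection) is needed before measuring area.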

RESULTS: All featured models achieved segmentation accuracy close to the inter-observer variability. When computing the metrics of right ventricular systolic function from contour predictions of U-Net KP, we obtained the bias and 95% limits of agreement of 0.8 ± 10.8% for the right ventricular fractional area change measurements, -0.04 ± 0.54 cm for the tricuspid annular plane systolic excursion measurements and 0.2 ± 6.6% for the right ventricular free wall strain measurements. These results were also comparable to the semi-automatically derived inter-observer discrepancies of 0.4 ± 11.8%, -0.37 ± 0.58 cm and -1.0 ± 7.7% for the aforementioned metrics, respectively.
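The agreement figures above follow the standard Bland-Altman formulation: bias is the mean paired difference and the 95% limits of agreement are bias ± 1.96 SD of the differences. A minimal sketch, using hypothetical paired fractional-area-change measurements (not the study data):

```python
import statistics

def bland_altman(auto, manual):
    """Bland-Altman agreement statistics for paired measurements:
    bias (mean difference) and 95% limits of agreement
    (bias +/- 1.96 * sample SD of the differences)."""
    diffs = [a - m for a, m in zip(auto, manual)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample standard deviation
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical automated vs. manual FAC measurements (%):
auto_fac = [38.0, 42.5, 35.0, 47.0, 40.0]
manual_fac = [37.0, 44.0, 33.5, 46.0, 41.5]
bias, (lo, hi) = bland_altman(auto_fac, manual_fac)  # bias -> 0.1 %
```

Reporting bias together with the limits of agreement, rather than correlation alone, is what allows the direct comparison against inter-observer discrepancies made above.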

CONCLUSION: Given the appropriate data, automated segmentation and quantification of the right ventricle in 2-D echocardiography are feasible with existing methods. However, keypoint detection architectures may offer higher robustness and information density for the same computational cost.
