Graph-based automatic detection and classification of lesion changes in pairs of CT studies for oncology follow-up.
PURPOSE: Radiological follow-up of oncology patients requires the quantitative analysis of lesion changes in longitudinal imaging studies, which is time-consuming, requires expertise, and is subject to variability. This paper presents a comprehensive graph-based method for the automatic detection and classification of lesion changes in current and prior CT scans.
METHODS: The inputs are the current and prior CT scans and their organ and lesion segmentations. Classification of lesion changes is formalized as bipartite graph matching where lesion pairings are computed by adaptive overlap-based lesion matching. Six types of lesion changes are computed by connected components analysis. The method was evaluated on 208 pairs of lung and liver CT scans from 57 patients with 4600 lesions, 1713 lesion matchings and 2887 lesion changes. Ground-truth lesion segmentations, lesion matchings and lesion changes were created by an expert radiologist.
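Below is a minimal illustrative sketch of the kind of pipeline the METHODS paragraph describes: lesions from the prior and current scans form the two sides of a bipartite graph, overlap-based pairings form the edges, and connected components determine the change type. Everything specific in this sketch is an assumption rather than the paper's implementation: lesions are taken as registered binary numpy masks, a fixed overlap threshold stands in for the adaptive overlap-based matching, and the six change-type labels are illustrative names only.

```python
# Hedged sketch of graph-based lesion-change classification (not the authors' code).
import numpy as np
import networkx as nx

def overlap_fraction(a: np.ndarray, b: np.ndarray) -> float:
    """Fraction of the smaller lesion's voxels covered by the other lesion."""
    inter = np.logical_and(a, b).sum()
    return inter / max(min(a.sum(), b.sum()), 1)

def classify_lesion_changes(prior_lesions, current_lesions, min_overlap=0.2):
    """Build a bipartite lesion graph and label each connected component."""
    g = nx.Graph()
    g.add_nodes_from(("prior", i) for i in range(len(prior_lesions)))
    g.add_nodes_from(("current", j) for j in range(len(current_lesions)))
    # Edge = candidate lesion matching based on spatial overlap
    # (the paper uses adaptive overlap-based matching; a fixed threshold is assumed here).
    for i, p in enumerate(prior_lesions):
        for j, c in enumerate(current_lesions):
            if overlap_fraction(p, c) >= min_overlap:
                g.add_edge(("prior", i), ("current", j))
    changes = []
    for comp in nx.connected_components(g):
        n_prior = sum(1 for side, _ in comp if side == "prior")
        n_curr = len(comp) - n_prior
        # Illustrative change-type labels; the paper's six categories may differ.
        if n_prior == 0:
            label = "new"            # lesion appears only in the current scan
        elif n_curr == 0:
            label = "disappeared"    # lesion appears only in the prior scan
        elif n_prior == 1 and n_curr == 1:
            label = "unique match"   # one-to-one lesion pairing
        elif n_prior == 1:
            label = "split"          # one prior lesion maps to several current lesions
        elif n_curr == 1:
            label = "merged"         # several prior lesions map to one current lesion
        else:
            label = "complex"        # many-to-many pattern
        changes.append((sorted(comp), label))
    return changes
```

Grouping matches by connected components rather than scoring pairs in isolation is what lets one-to-many, many-to-one, and many-to-many patterns emerge naturally from the same graph.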
RESULTS: Our method yields a lesion matching accuracy of 99.7% (394/395) and 95.0% (1252/1318) for the lung and liver datasets, respectively. For the detection of lesion changes, precision is > 0.99 and recall is 0.94 and 0.95, respectively. The analysis of lesion changes helped the radiologist detect 48 missed lesions and 8 spurious lesions in the input ground-truth lesion datasets.
CONCLUSION: The classification of lesion changes provides the clinician with a readily accessible and intuitive identification of lesion changes and their patterns in support of clinical decision making. Comprehensive automatic computer-aided lesion matching and analysis of lesion changes may improve quantitative follow-up and evaluation of disease status, assessment of treatment efficacy, and response to therapy.