Fusion of shallow and deep features from 18F-FDG PET/CT for predicting EGFR-sensitizing mutations in non-small cell lung cancer.
Quantitative Imaging in Medicine and Surgery 2024 August 1
BACKGROUND: Non-small cell lung cancer (NSCLC) patients with epidermal growth factor receptor-sensitizing (EGFR-sensitizing) mutations exhibit a positive response to tyrosine kinase inhibitors (TKIs). Given the limitations of current clinical predictive methods, it is critical to explore radiomics-based approaches. In this study, we leveraged deep-learning technology with multimodal radiomics data to more accurately predict EGFR-sensitizing mutations.
METHODS: A total of 202 patients who underwent both fluorine-18 fluorodeoxyglucose positron emission tomography/computed tomography (18F-FDG PET/CT) scans and EGFR sequencing prior to treatment were included in this study. Deep and shallow features were extracted by a residual neural network and the Python package PyRadiomics, respectively. We used least absolute shrinkage and selection operator (LASSO) regression to select predictive features and applied a support vector machine (SVM) to classify the EGFR-sensitive patients. Moreover, we compared predictive performance across different deep models and imaging modalities.
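As a rough illustration of the workflow this abstract describes, the sketch below wires together the four named components: PyRadiomics for handcrafted (shallow) features, a residual network for deep features, LASSO regression for feature selection, and an SVM classifier. It is not the authors' code: the ImageNet-pretrained torchvision ResNet-50, the file paths, and the synthetic 202-patient cohort are all stand-in assumptions.

```python
# Minimal sketch of the pipeline the abstract describes, not the authors' code.
# Assumptions: torchvision's ImageNet-pretrained ResNet-50 stands in for the
# paper's residual network, file paths are hypothetical, and a synthetic
# 202-patient cohort replaces the real fused feature matrix.
import numpy as np
import torch
from torchvision.models import resnet50, ResNet50_Weights
from radiomics import featureextractor  # PyRadiomics
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def shallow_features(image_path: str, mask_path: str) -> np.ndarray:
    """Handcrafted (shallow) radiomics features for one lesion via PyRadiomics."""
    extractor = featureextractor.RadiomicsFeatureExtractor()
    result = extractor.execute(image_path, mask_path)
    return np.array([v for k, v in result.items()
                     if not k.startswith("diagnostics")], dtype=float)


def deep_features(roi: torch.Tensor) -> np.ndarray:
    """2048-d pooled features from a ResNet-50 with its classifier head removed."""
    net = resnet50(weights=ResNet50_Weights.DEFAULT)
    net.fc = torch.nn.Identity()  # expose the global-average-pooled features
    net.eval()
    with torch.no_grad():
        return net(roi.unsqueeze(0)).squeeze(0).numpy()


# Synthetic fused matrix: one row per patient, deep + shallow concatenated.
rng = np.random.default_rng(0)
n_patients, n_features = 202, 500
X = rng.normal(size=(n_patients, n_features))
# Hypothetical labels (1 = EGFR-sensitizing mutation), tied to a few columns
# so that LASSO has genuine signal to select.
y = (X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=n_patients) > 0).astype(int)

X = StandardScaler().fit_transform(X)
lasso = LassoCV(cv=5).fit(X, y)   # LASSO regression for feature selection
keep = lasso.coef_ != 0           # in practice, guard against an empty selection
clf = SVC(probability=True).fit(X[:, keep], y)  # SVM on the selected features
print(f"LASSO kept {keep.sum()} of {n_features} features")
```

In practice the two extraction functions would be run per patient on the PET, CT, or fused PET/CT volumes and their outputs concatenated row-wise before selection and classification.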
RESULTS: In the classification of EGFR-sensitive mutations, the areas under the curve (AUCs) of the ResNet-based deep-shallow features and the shallow-only features across the different imaging inputs were as follows: RES_TRAD, PET/CT vs. CT-only vs. PET-only: 0.94 vs. 0.89 vs. 0.92; and ONLY_TRAD, PET/CT vs. CT-only vs. PET-only: 0.68 vs. 0.50 vs. 0.38. Additionally, the receiver operating characteristic (ROC) curves of the model using both deep and shallow features differed significantly from those of the model built using only shallow features (P<0.05).
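The abstract reports a significant difference between the two models' ROC curves (P<0.05) without naming the statistical test. As a hedged illustration of one common way to compare paired AUCs on the same patients, the sketch below uses a paired bootstrap on the AUC difference (DeLong's test is another standard choice); the score vectors are synthetic stand-ins, not the study's data.

```python
# Hedged sketch of comparing paired AUCs; the abstract does not name its test.
# A paired bootstrap on the AUC difference is shown here (DeLong's test is
# another common choice). The score vectors below are synthetic stand-ins.
import numpy as np
from sklearn.metrics import roc_auc_score


def bootstrap_auc_pvalue(y, scores_a, scores_b, n_boot=2000, seed=0):
    """Two-sided p-value for AUC(a) != AUC(b) via a paired bootstrap."""
    rng = np.random.default_rng(seed)
    n, diffs = len(y), []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        if len(np.unique(y[idx])) < 2:  # resample must contain both classes
            continue
        diffs.append(roc_auc_score(y[idx], scores_a[idx]) -
                     roc_auc_score(y[idx], scores_b[idx]))
    diffs = np.asarray(diffs)
    return min(1.0, 2 * min((diffs <= 0).mean(), (diffs >= 0).mean()))


# Synthetic example: a stronger "deep + shallow" model vs. a weaker
# "shallow-only" model scored on the same patients.
rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=100)
fused_scores = y + rng.normal(scale=0.8, size=100)
shallow_scores = y + rng.normal(scale=2.0, size=100)
print("AUC fused:", roc_auc_score(y, fused_scores))
print("AUC shallow:", roc_auc_score(y, shallow_scores))
print("p-value:", bootstrap_auc_pvalue(y, fused_scores, shallow_scores))
```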
CONCLUSIONS: Our findings suggest that deep features significantly enhance the detection of EGFR-sensitizing mutations, especially those extracted with ResNet. Moreover, PET/CT images are more effective than CT-only and PET-only images in producing EGFR-sensitizing mutation-related signatures.