Near-Infrared Autofluorescence Signature: A New Parameter for Intraoperative Assessment of Parathyroid Glands in Primary Hyperparathyroidism.
Journal of the American College of Surgeons 2024 July 17
BACKGROUND: The success of parathyroidectomy in primary hyperparathyroidism depends on the intraoperative differentiation of diseased from normal glands. Deep learning could potentially be applied to digitize this subjective interpretation process, which relies heavily on surgeon expertise. In this study, we aimed to investigate whether diseased versus normal parathyroid glands have different near-infrared autofluorescence (NIRAF) signatures and whether related deep learning models can predict normal versus diseased parathyroid glands based on intraoperative in vivo images.
STUDY DESIGN: This prospective study included patients who underwent parathyroidectomy for primary hyperparathyroidism or thyroidectomy using intraoperative NIRAF imaging at a single tertiary referral center from November 2019 to March 2024. Autofluorescence intensity and heterogeneity index of normal versus diseased parathyroid glands were compared, and a deep learning model was developed.
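The abstract does not spell out how the two image parameters were computed. The sketch below illustrates one plausible computation under stated assumptions: that intensity is normalized against a background reference region, and that the heterogeneity index is the coefficient of variation (standard deviation over mean) of pixel values within the gland region of interest. Function names, parameters, and values are illustrative, not taken from the study.

```python
# Minimal sketch of the two NIRAF parameters described in STUDY DESIGN.
# Assumptions (not confirmed by the abstract): intensity is normalized
# against a background reference region, and the heterogeneity index is
# the coefficient of variation (std / mean) of gland-pixel intensities.
import numpy as np

def normalized_intensity(gland_pixels: np.ndarray, background_pixels: np.ndarray) -> float:
    """Mean gland autofluorescence expressed as a multiple of background."""
    return float(gland_pixels.mean() / background_pixels.mean())

def heterogeneity_index(gland_pixels: np.ndarray) -> float:
    """Within-gland signal variability as a coefficient of variation."""
    return float(gland_pixels.std() / gland_pixels.mean())

# Synthetic example: a gland ROI roughly 2.2x brighter than background
rng = np.random.default_rng(0)
gland = rng.normal(loc=220.0, scale=35.0, size=5000)
background = rng.normal(loc=100.0, scale=10.0, size=5000)
print(normalized_intensity(gland, background))  # ~2.2
print(heterogeneity_index(gland))               # ~0.16
```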
RESULTS: NIRAF images of a total of 1,506 normal and 597 diseased parathyroid glands from 797 patients were analyzed. Normal glands exhibited a higher median normalized NIRAF intensity than diseased glands [2.68 (2.19-3.23) vs 2.09 (1.68-2.56) pixels, p<0.0001] and a lower heterogeneity index [0.11 (0.08-0.15) vs 0.18 (0.13-0.23), p<0.0001]. On receiver operating characteristic (ROC) analysis, the optimal thresholds for predicting a diseased gland were 2.22 for pixel intensity and 0.14 for heterogeneity index. The deep learning model achieved precision and recall of 83.3% each, with an area under the precision-recall curve (AUPRC) of 0.908.
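The abstract names an ROC-derived optimal threshold and the AUPRC without stating the selection criterion. The sketch below shows one common approach, Youden's J statistic, for threshold selection, plus an AUPRC computation; the data are synthetic and the 0.14 heterogeneity cutoff reported above is not reproduced here.

```python
# Sketch of the evaluation metrics named in RESULTS: an ROC-derived
# threshold (via Youden's J, one common criterion; the study's exact
# criterion is not stated in the abstract) and the AUPRC.
# All data below are synthetic, for illustration only.
import numpy as np
from sklearn.metrics import roc_curve, precision_recall_curve, auc

rng = np.random.default_rng(1)
y_true = rng.integers(0, 2, size=500)                       # 1 = diseased gland
scores = y_true * 0.18 + rng.normal(0.14, 0.05, size=500)   # e.g., heterogeneity index

# Threshold maximizing Youden's J = TPR - FPR
fpr, tpr, thresholds = roc_curve(y_true, scores)
optimal_threshold = thresholds[np.argmax(tpr - fpr)]

# Area under the precision-recall curve
precision, recall, _ = precision_recall_curve(y_true, scores)
auprc = auc(recall, precision)

print(f"optimal threshold: {optimal_threshold:.2f}, AUPRC: {auprc:.3f}")
```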
CONCLUSIONS: Normal and diseased parathyroid glands in primary hyperparathyroidism have different intraoperative NIRAF patterns that could be quantified with intensity and heterogeneity analyses. Visual deep learning models relying on these NIRAF signatures could be built to assist surgeons in differentiating normal from diseased parathyroid glands.