Development of a deep learning-based system for aiding in the determination of Prostate Imaging Reporting and Data System (PI-RADS) scores: an international multicenter study

Abstract

Background. Prostate multiparametric magnetic resonance imaging is widely recommended prior to biopsy in clinical practice, with the Prostate Imaging Reporting and Data System (PI-RADS) as the standard tool for guiding diagnosis and treatment decisions. However, analyzing multiparametric magnetic resonance imaging data demands substantial expertise, and the process is often time-intensive and cognitively challenging, leading to variability between and within readers.

Aim. To create a deep learning-based computer-aided diagnosis (DL-CAD) system to minimize manual influence on PI-RADS score determination.

Materials and methods. Between January 2020 and May 2024, 108 patients with histopathologically confirmed prostate cancer and PI-RADS scores of 4–5 were retrospectively selected for model development and training. An additional 28 benign cases were included for model validation. Prostate zones were labeled according to PI-RADS v2.1 guidelines to facilitate model selection. Prostate regions and lesions were manually segmented on T2-weighted (T2W) sequences, and a 3D U-Net architecture was implemented for the DL model using the MONAI framework. Diagnostic performance was assessed with Python-based statistical analysis.

Results. The DL-CAD system achieved an average accuracy of 78 %, sensitivity of 60 %, and specificity of 84 % for lesion detection. The Dice similarity coefficient for prostate segmentation was 0.71, and the area under the receiver operating characteristic curve (AUROC) was 81.16 %.
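The reported Dice similarity coefficient of 0.71 measures voxel-wise overlap between the predicted and manually drawn masks, defined as 2|A∩B| / (|A| + |B|). A minimal sketch of the metric (the function name and toy masks are illustrative, not from the study):

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks: 2|A∩B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    return 2.0 * float(intersection) / total if total else 1.0

# Toy 3D masks (in real use: predicted vs. manually segmented prostate volumes).
pred = np.zeros((4, 4, 4), dtype=np.uint8)
truth = np.zeros((4, 4, 4), dtype=np.uint8)
pred[1:3, 1:3, 1:3] = 1   # 8 voxels
truth[1:3, 1:3, 2:4] = 1  # 8 voxels, 4 of them overlapping with pred
print(dice_coefficient(pred, truth))  # 2*4 / (8+8) = 0.5
```

A Dice score of 1.0 means perfect overlap; 0.71, as reported here, indicates substantial but imperfect agreement with the manual reference.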

Conclusion. The DL-CAD system shows promise for improving diagnostic accuracy in patients with clinically significant prostate cancer. While it achieves high specificity, its sensitivity and segmentation accuracy require further improvement. Larger datasets and advanced deep learning techniques, such as transfer learning or ensemble learning, could enhance sensitivity without compromising specificity. Further multicenter validation is required before the system can be integrated into clinical practice.

About the authors

Mingze He

Institute for Urology and Reproductive Health, I.M. Sechenov First Moscow State Medical University, Ministry of Health of Russia (Sechenov University)

Author for correspondence.
Email: hemingze97@gmail.com
ORCID iD: 0000-0003-0601-4713

Build. 1, 2 Bol’shaya Pirogovskaya St., Moscow 119435

Russian Federation

M. E. Enikeev

Institute for Urology and Reproductive Health, I.M. Sechenov First Moscow State Medical University, Ministry of Health of Russia (Sechenov University)

Email: enikmic@mail.ru
ORCID iD: 0000-0002-3007-1315

Build. 1, 2 Bol’shaya Pirogovskaya St., Moscow 119435

Russian Federation

R. T. Rzaev

Department of Radiology, The Second University Hospital, I.M. Sechenov First Moscow State Medical University, Ministry of Health of Russia (Sechenov University)

Email: ramin-rz@mail.ru
ORCID iD: 0000-0002-6005-6247

Build. 1, 6 Bol’shaya Pirogovskaya St., Moscow 119435

Russian Federation

I. M. Chernenkiy

Institute for Urology and Reproductive Health, I.M. Sechenov First Moscow State Medical University, Ministry of Health of Russia (Sechenov University)

Email: chernenkiy_i_m@staff.sechenov.ru
ORCID iD: 0000-0001-5968-9883

Build. 1, 2 Bol’shaya Pirogovskaya St., Moscow 119435

Russian Federation

M. V. Feldsherov

Department of Radiology, The Second University Hospital, I.M. Sechenov First Moscow State Medical University, Ministry of Health of Russia (Sechenov University)

Email: feldsherov_m_v@staff.sechenov.ru
ORCID iD: 0000-0001-6808-7489

Build. 1, 6 Bol’shaya Pirogovskaya St., Moscow 119435

Russian Federation

He Li

Department of Radiology, The First Hospital of Jilin University

Email: lihe2018@jlu.edu.cn

Changchun

China

Kebang Hu

Department of Urology, The First Hospital of Jilin University

Email: hukb@jlu.edu.cn
ORCID iD: 0000-0003-2860-276X

Changchun

China

E. V. Shpot

Institute for Urology and Reproductive Health, I.M. Sechenov First Moscow State Medical University, Ministry of Health of Russia (Sechenov University)

Email: shpot_e_v@staff.sechenov.ru
ORCID iD: 0000-0003-1121-9430

Build. 1, 2 Bol’shaya Pirogovskaya St., Moscow 119435

Russian Federation

L. M. Rapoport

Institute for Urology and Reproductive Health, I.M. Sechenov First Moscow State Medical University, Ministry of Health of Russia (Sechenov University)

Email: leonidrapoport@yandex.ru
ORCID iD: 0000-0001-7787-1240

Build. 1, 2 Bol’shaya Pirogovskaya St., Moscow 119435

Russian Federation

P. V. Glybochko

Institute for Urology and Reproductive Health, I.M. Sechenov First Moscow State Medical University, Ministry of Health of Russia (Sechenov University)

Email: rector@staff.sechenov.ru
ORCID iD: 0000-0002-5541-2251

Build. 1, 2 Bol’shaya Pirogovskaya St., Moscow 119435

Russian Federation



Copyright (c) 2024



The media outlet is registered by the Federal Service for Supervision of Communications, Information Technology and Mass Media (Roskomnadzor).
Registration number and date of the registration decision: series PI No. FS 77-36986, dated 21.07.2009.