Cancer Urology

Development of a deep learning-based system for aiding in the determination of Prostate Imaging Reporting and Data System (PI-RADS) scores: an international multicenter study

https://doi.org/10.17650/1726-9776-2024-20-4-15-23

Abstract

Background. Prostate multiparametric magnetic resonance imaging (mpMRI) is widely recommended prior to biopsy in clinical practice, with the Prostate Imaging Reporting and Data System (PI-RADS) as the standard tool for guiding diagnosis and treatment decisions. However, analyzing mpMRI data demands substantial expertise, and the process is often time-intensive and cognitively challenging, leading to variability between and within readers.

Aim. To create a deep learning-based computer-aided diagnosis (DL-CAD) system to minimize manual influence on PI-RADS score determination.

Materials and methods. Between January 2020 and May 2024, 108 patients with histopathologically confirmed prostate cancer and PI-RADS scores of 4–5 were retrospectively selected for model development and training. An additional 28 benign cases were included for model validation. Prostate zones were labeled according to PI-RADS v2.1 guidelines to facilitate model selection. Prostate regions and lesions were manually segmented on T2-weighted (T2W) sequences, and the DL model was implemented as a 3D U-Net architecture using the MONAI framework. Diagnostic performance was assessed with Python-based statistical analysis.
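
The segmentation pipeline described above can be outlined in code. The following is a minimal sketch of a 3D U-Net built with the MONAI framework on single-channel T2W volumes; the channel sizes, strides, loss function, optimizer, learning rate, and input dimensions are illustrative assumptions, not the authors' published configuration.

```python
# Hedged sketch: 3D U-Net for prostate/lesion segmentation on T2W volumes
# using MONAI. Hyperparameters below are assumptions for illustration only.
import torch
from monai.networks.nets import UNet
from monai.losses import DiceLoss

model = UNet(
    spatial_dims=3,                      # volumetric (3D) segmentation
    in_channels=1,                       # single T2W input channel
    out_channels=2,                      # background vs. prostate/lesion
    channels=(16, 32, 64, 128, 256),     # assumed encoder channel sizes
    strides=(2, 2, 2, 2),
    num_res_units=2,
)
loss_fn = DiceLoss(to_onehot_y=True, softmax=True)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One illustrative training step on a dummy volume and mask
x = torch.rand(1, 1, 96, 96, 32)                 # (batch, channel, H, W, D)
y = torch.randint(0, 2, (1, 1, 96, 96, 32))      # binary ground-truth mask
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
```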

Results. The DL-CAD system achieved an average accuracy of 78 %, sensitivity of 60 %, and specificity of 84 % for lesion detection. The Dice similarity coefficient for prostate segmentation was 0.71, and the area under the receiver operating characteristic curve (AUROC) was 81.16 %.
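
For reference, the reported performance measures can be computed with standard Python tooling; the sketch below uses placeholder per-case predictions rather than the study's data.

```python
# Hedged sketch: computing Dice, AUROC, accuracy, sensitivity, and
# specificity from binary masks and per-case scores (placeholder data).
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

def dice_coefficient(pred_mask: np.ndarray, true_mask: np.ndarray) -> float:
    """Dice = 2|A ∩ B| / (|A| + |B|) for binary segmentation masks."""
    intersection = np.logical_and(pred_mask, true_mask).sum()
    return 2.0 * intersection / (pred_mask.sum() + true_mask.sum())

# Per-case lesion detection: 1 = lesion present, 0 = benign (dummy values)
y_true = np.array([1, 1, 0, 0, 1, 0])
y_score = np.array([0.9, 0.4, 0.2, 0.1, 0.8, 0.6])    # model probabilities
y_pred = (y_score >= 0.5).astype(int)

auroc = roc_auc_score(y_true, y_score)
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / (tp + tn + fp + fn)
```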

Conclusion. The DL-CAD system shows promise for improving diagnostic accuracy in patients with clinically significant prostate cancer. While it exhibits high specificity, further improvements in sensitivity and segmentation accuracy are needed. These could be achieved through larger datasets and advanced deep learning techniques, such as transfer learning or ensemble learning, which may enhance sensitivity without compromising specificity. Further multicenter validation is required to accelerate integration of the system into clinical practice.
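
As one way to realize the transfer-learning direction mentioned above, a pretrained 3D U-Net could be fine-tuned on the local cohort at a reduced learning rate. The checkpoint path, architecture settings, and learning rate in this sketch are hypothetical.

```python
# Hedged sketch: transfer learning by fine-tuning a 3D U-Net initialized
# from a hypothetical pretrained checkpoint (not an actual released model).
import torch
from monai.networks.nets import UNet

model = UNet(spatial_dims=3, in_channels=1, out_channels=2,
             channels=(16, 32, 64, 128, 256), strides=(2, 2, 2, 2),
             num_res_units=2)

state = torch.load("pretrained_prostate_unet.pt")   # hypothetical checkpoint path
model.load_state_dict(state, strict=False)          # reuse matching layers only
optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)  # reduced lr for fine-tuning
```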

About the Authors

Mingze He
Institute for Urology and Reproductive Health, I.M. Sechenov First Moscow State Medical University, Ministry of Health of Russia (Sechenov University)
Russian Federation

Build. 1, 2 Bol’shaya Pirogovskaya St., Moscow 119435


Competing Interests:

None



M. E. Enikeev
Institute for Urology and Reproductive Health, I.M. Sechenov First Moscow State Medical University, Ministry of Health of Russia (Sechenov University)
Russian Federation

Build. 1, 2 Bol’shaya Pirogovskaya St., Moscow 119435


Competing Interests:

None



R. T. Rzaev
Department of Radiology, The Second University Hospital, I.M. Sechenov First Moscow State Medical University, Ministry of Health of Russia (Sechenov University)
Russian Federation

Build. 1, 6 Bol’shaya Pirogovskaya St., Moscow 119435


Competing Interests:

None



I. M. Chernenkiy
Institute for Urology and Reproductive Health, I.M. Sechenov First Moscow State Medical University, Ministry of Health of Russia (Sechenov University)
Russian Federation

Build. 1, 2 Bol’shaya Pirogovskaya St., Moscow 119435


Competing Interests:

None



M. V. Feldsherov
Department of Radiology, The Second University Hospital, I.M. Sechenov First Moscow State Medical University, Ministry of Health of Russia (Sechenov University)
Russian Federation

Build. 1, 6 Bol’shaya Pirogovskaya St., Moscow 119435


Competing Interests:

None



He Li
Department of Radiology, The First Hospital of Jilin University
China

Changchun


Competing Interests:

None



Kebang Hu
Department of Urology, The First Hospital of Jilin University
China

Changchun


Competing Interests:

None



E. V. Shpot
Institute for Urology and Reproductive Health, I.M. Sechenov First Moscow State Medical University, Ministry of Health of Russia (Sechenov University)
Russian Federation

Build. 1, 2 Bol’shaya Pirogovskaya St., Moscow 119435


Competing Interests:

None



L. M. Rapoport
Institute for Urology and Reproductive Health, I.M. Sechenov First Moscow State Medical University, Ministry of Health of Russia (Sechenov University)
Russian Federation

Build. 1, 2 Bol’shaya Pirogovskaya St., Moscow 119435


Competing Interests:

None



P. V. Glybochko
Institute for Urology and Reproductive Health, I.M. Sechenov First Moscow State Medical University, Ministry of Health of Russia (Sechenov University)
Russian Federation

Build. 1, 2 Bol’shaya Pirogovskaya St., Moscow 119435


Competing Interests:

None



For citations:


He M., Enikeev M.E., Rzaev R.T., Chernenkiy I.M., Feldsherov M.V., Li H., Hu K., Shpot E.V., Rapoport L.M., Glybochko P.V. Development of a deep learning-based system for aiding in the determination of Prostate Imaging Reporting and Data System (PI-RADS) scores: an international multicenter study. Cancer Urology. 2024;20(4):15-23. (In Russ.) https://doi.org/10.17650/1726-9776-2024-20-4-15-23


This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 1726-9776 (Print)
ISSN 1996-1812 (Online)