Background/Aim To automatically detect and classify the early stages of retinopathy of prematurity (ROP) using a deep convolutional neural network (CNN).
Methods This retrospective cross-sectional study was conducted in a referral medical centre in Taiwan. Data were collected from December 17, 2013, to May 24, 2019, and analysed from December 2018 to January 2020. Only premature infants with no ROP, stage 1 ROP or stage 2 ROP were enrolled. Overall, 11 372 retinal fundus images were compiled and split into 10 235 images (90%) for training, 1137 (10%) for validation and 244 for testing. A deep CNN was implemented to classify images according to ROP stage. Sensitivity, specificity and area under the receiver operating characteristic curve were adopted to evaluate the performance of the algorithm relative to the reference standard diagnosis.
Results The model was trained using fivefold cross-validation, yielding an average accuracy of 99.93%±0.03 during training and 92.23%±1.39 during testing. The sensitivity and specificity scores of the model were 96.14%±0.87 and 95.95%±0.48, 91.82%±2.03 and 94.50%±0.71, and 89.81%±1.82 and 98.99%±0.40 when predicting no ROP versus ROP, stage 1 ROP versus no ROP and stage 2 ROP, and stage 2 ROP versus no ROP and stage 1 ROP, respectively.
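The stage-wise results above come from one-vs-rest comparisons, in which one class is treated as positive and the remaining classes as negative before computing confusion-matrix counts. A minimal sketch of that evaluation, using illustrative labels rather than study data (the function name and toy class encoding are assumptions, not from the paper):

```python
def sensitivity_specificity(y_true, y_pred, positive):
    """One-vs-rest sensitivity and specificity for the class `positive`."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    sens = tp / (tp + fn) if tp + fn else 0.0  # true positive rate
    spec = tn / (tn + fp) if tn + fp else 0.0  # true negative rate
    return sens, spec

# Toy encoding: 0 = no ROP, 1 = stage 1 ROP, 2 = stage 2 ROP
y_true = [0, 0, 1, 1, 2, 2, 0, 1]
y_pred = [0, 1, 1, 1, 2, 0, 0, 1]

# Evaluate "stage 2 ROP versus no ROP and stage 1 ROP"
sens, spec = sensitivity_specificity(y_true, y_pred, positive=2)
```

Repeating this with `positive=0` and `positive=1` yields the other two comparisons reported in the Results.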
Conclusions The proposed system can accurately differentiate among ROP early stages and has the potential to help ophthalmologists classify ROP at an early stage.
Contributors W-CW has full access to the data and takes overall responsibility. Conception and design: Y-PH, HB and W-CW. Data collection and collation: EY-CK, K-JC, Y-SH, C-CL and SK. Data analysis: HB and EY-CK. Data interpretation: JPC, MFC, RVPC and SK. Writing: Y-PH, W-CW.
Funding This study was funded in part by the Ministry of Science and Technology, Taiwan, under grants MOST108-2321-B-027-001 and MOST108-2221-E-027-111-MY3, and by the Chang Gung Memorial Hospital and National Taipei University of Technology Joint Research Program (CGMH-NTUT-2020-No. 01). This study was also supported by Chang Gung Memorial Hospital research grants (CMRPG3I0071~3 and CORPG3K0131) and a Ministry of Science and Technology, Taiwan, research grant (MOST 106-2314-B-182A-040-MY3). Additionally, this project was supported by grants R01EY19474, K12EY027720 and P30EY10572 from the National Institutes of Health (Bethesda, Maryland), by grant SCH-1622679 from the National Science Foundation (Arlington, Virginia), and by unrestricted departmental funding and a Career Development Award (JPC) from Research to Prevent Blindness (New York, New York). The sponsors had no role in the design or conduct of this research.
Competing interests MFC is an unpaid member of the Scientific Advisory Board for Clarity Medical Systems (Pleasanton, California), a consultant for Novartis (Basel, Switzerland) and an initial member of Inteleretina (Honolulu, Hawaii). RVPC is on the Scientific Advisory Board for Visunex Medical Systems (Pleasanton, California) and a consultant for Genentech (South San Francisco, California). MFC and JPC receive research support from Genentech.
Provenance and peer review Not commissioned; externally peer reviewed.
Data availability statement Data are available upon reasonable request.