Purpose: Missed fractures are the most common radiologic error in clinical practice, and erroneous classification could lead to inappropriate treatment and unfavorable prognosis. Here, we developed a fully automated deep learning model to detect and classify femoral neck fractures using plain radiographs, and evaluated its utility for diagnostic assistance and physician training.
Methods: A total of 1527 plain pelvic and hip radiographs obtained between April 2014 and July 2023 at our hospital were selected for model training and evaluation. Faster R-CNN was used to locate the femoral neck, and DenseNet-121 was used for Garden classification of femoral neck fractures, while an additional segmentation method was used to visualize the probable fracture area. The model was assessed by the area under the receiver operating characteristic curve (AUC). The accuracy, sensitivity, and specificity of clinicians' fracture detection were determined in the diagnostic assistance and physician training experiments.
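The abstract does not specify implementation details, but the two-stage design (Faster R-CNN localization followed by DenseNet-121 classification) can be illustrated with a minimal sketch using off-the-shelf torchvision components; the detector backbone, input size, and class grouping below are assumptions for illustration, not the authors' settings.

    # Hypothetical two-stage pipeline: a detector proposes the femoral-neck region,
    # then a DenseNet-121 classifier grades the crop (no fracture, Garden I/II, Garden III/IV).
    import torch
    from torchvision.models.detection import fasterrcnn_resnet50_fpn
    from torchvision.models import densenet121
    from torchvision.transforms import functional as TF

    NUM_CLASSES = 3  # grouping used in the abstract's results

    detector = fasterrcnn_resnet50_fpn(num_classes=2)   # background + femoral neck (backbone is an assumption)
    classifier = densenet121(num_classes=NUM_CLASSES)   # 3-way Garden grouping head
    detector.eval()
    classifier.eval()

    @torch.no_grad()
    def classify_radiograph(image):
        """image: float tensor of shape (3, H, W), values scaled to [0, 1]."""
        det = detector([image])[0]
        if len(det["boxes"]) == 0:
            return None                                  # no femoral neck detected
        best = det["scores"].argmax()                    # keep the highest-scoring box
        x1, y1, x2, y2 = det["boxes"][best].round().int().tolist()
        crop = TF.resized_crop(image, top=y1, left=x1,
                               height=y2 - y1, width=x2 - x1, size=[224, 224])
        return classifier(crop.unsqueeze(0)).softmax(dim=1).squeeze(0)  # class probabilities

In practice both networks would be fine-tuned on the annotated radiographs before inference; the segmentation-based fracture visualization mentioned above is a separate component not reproduced here.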
Results: The accuracy of the model for fracture detection was 94.1 %. The model achieved AUCs of 0.99 for no femoral neck fracture, 0.94 for Garden I/II fractures, and 0.99 for Garden III/IV fractures. In the diagnostic assistance study, emergency physicians detected fractures with an average accuracy of 86.33 % unaided versus 92.03 % aided, a sensitivity of 85.94 % versus 91.78 %, and a specificity of 87.88 % versus 93.13 %. In the physician training study, the accuracy, sensitivity, and specificity of the trainees for fracture classification were 81.83 %, 77.28 %, and 84.85 %, respectively, before training, compared with 90.65 %, 88.31 %, and 92.21 % after training.
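For reference, the reported detection metrics follow the standard definitions illustrated below; the sketch uses toy labels and scores (not the study's data), and scikit-learn is an assumed tooling choice.

    # Toy illustration of accuracy, sensitivity, specificity, and AUC for binary fracture detection.
    import numpy as np
    from sklearn.metrics import roc_auc_score, confusion_matrix

    y_true = np.array([0, 0, 1, 1, 1, 0, 1, 0])                   # 0 = no fracture, 1 = fracture
    y_score = np.array([0.1, 0.3, 0.8, 0.9, 0.4, 0.2, 0.7, 0.6])  # predicted fracture probability
    y_pred = (y_score >= 0.5).astype(int)

    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)     # true-positive rate
    specificity = tn / (tn + fp)     # true-negative rate
    auc = roc_auc_score(y_true, y_score)
    print(f"acc={accuracy:.2f} sens={sensitivity:.2f} spec={specificity:.2f} AUC={auc:.2f}")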
Conclusions: The model is a valuable tool that helps physicians better visualize fractures and improves training outcomes, indicating that deep learning algorithms are a promising approach to improving clinical practice and medical education.
Keywords: Deep learning; Femoral neck fractures; Training support; X-rays.