Purpose: This study aims to evaluate current reporting practices in radiomics research, with a focus on the CheckList for EvaluAtion of Radiomics research (CLEAR).
Methods: We conducted a citation search using Google Scholar to collect original research articles on radiomics citing the CLEAR guideline up to June 17, 2024. We examined the adoption of the guideline, adherence scores per publication, item-wise adherence rates, and self-reporting practices. An expert panel from the European Society of Medical Imaging Informatics Radiomics Auditing Group conducted a detailed item-by-item confirmation analysis of the self-reported CLEAR checklists.
Results: Out of 100 unique citations from 104 records, 48 original research papers on radiomics were included. The overall adoption rate in the literature was 2 %. Among the citing articles, 94 % (45/48) adopted CLEAR for reporting purposes, applying it to hand-crafted radiomics (89 %) and deep learning (24 %) studies. Self-reported checklists were included in 58 % (26/45) of these papers. The median study-wise adherence score based on self-reported data was 91 % (interquartile range = 18 %), whereas the mean confirmed adherence score was 66 % (standard deviation = 14 %). The difference between self-reported and confirmed scores was statistically significant (mean difference = 21 %; standard deviation = 11 %; p < 0.001). Using an arbitrary 50 % adherence cut-off, the number of items with poor adherence increased from 3 to 15 after the confirmation analysis, most of them open science-related items. In addition, several items were frequently misreported.
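For readers wishing to reproduce this type of comparison, the sketch below (not the authors' code) illustrates how a study-wise adherence score and a paired comparison between self-reported and confirmed scores might be computed; the example data and the choice of a paired t-test are assumptions for illustration only.

```python
# Minimal sketch (assumptions, not the authors' pipeline): study-wise CLEAR
# adherence scores and a paired comparison of self-reported vs. confirmed scores.
import numpy as np
from scipy import stats

def adherence_score(items_met: int, items_applicable: int) -> float:
    """Study-wise adherence as the percentage of applicable checklist items met."""
    return 100.0 * items_met / items_applicable

# Hypothetical per-study scores (percent), one pair per publication.
self_reported = np.array([95.0, 88.0, 92.0, 79.0, 97.0])
confirmed     = np.array([72.0, 61.0, 70.0, 55.0, 68.0])

differences = self_reported - confirmed
print(f"Mean difference: {differences.mean():.1f}% (SD {differences.std(ddof=1):.1f}%)")

# Paired comparison; a Wilcoxon signed-rank test could be substituted if
# normality of the paired differences is doubtful.
t_stat, p_value = stats.ttest_rel(self_reported, confirmed)
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
```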
Conclusion: This study revealed significant discrepancies between self-reported and confirmed adherence to the CLEAR guideline in radiomics research, indicating a need for improved reporting accuracy and verification practices.
Keywords: Artificial intelligence; Machine learning; Radiomics; Reporting; Texture analysis.