CLEAR item#7

“Adherence to guidelines or checklists (e.g., CLEAR checklist). Indicate that the CLEAR checklist was used for reporting and submit the checklist as supplemental data. Do the same with other checklists or guidelines if used in addition to the CLEAR checklist.” [1] (from the article by Kocak et al.; licensed under CC BY 4.0)

Reporting examples for CLEAR item#7

Example#1. “It was designed in accordance with the reporting guidelines proposed by the Image Biomarker Standardization Initiative (IBSI), and our methodology was self-assessed via the modified Radiomics Study Quality (mRQS) score and CheckList for EvaluAtion of Radiomics research (CLEAR). […] The modified radiomics quality score (mRQS) for this study was 16/36 (ESM Table 7) and was much higher than the 7.5/36 that was recently observed in recently published work. In addition, we satisfied 47 of the 58 items in the Checklist for Evaluation of Radiomics Research (CLEAR) checklist (ESM Table 8), where the unaccomplished items included two non-applicable and nine unsatisfied items.” [2] (from the article by Wang et al.; licensed under CC BY 4.0)

Example#2. “We present this article in accordance with the CLEAR and TRIPOD reporting checklists (available at https://jgo.amegroups.com/article/view/10.21037/jgo-23-627/rc).” [3] (from the article by Xiang et al.; licensed under CC BY-NC-ND 4.0)

Explanation and elaboration of CLEAR item#7

Item#7 emphasizes the importance of reporting with checklists and of adhering to guidelines. Recent research shows that self-reporting (i.e., checklist documentation by the study authors) is infrequent in both the radiomics [4] and AI [5] literature. The same two studies [4, 5] also suggest that, even when checklists are self-reported, the claimed items are not always accurate; a simple statement of adherence may therefore not be adequate. In the examples above, CLEAR was used as a reporting guideline alongside other tools: a quality scoring tool (mRQS) in Example#1 and another reporting guideline (TRIPOD) in Example#2. Example#1 also gave a concise summary of the items that were satisfied. In addition, both example studies submitted their filled-out checklists as supplementary material.

References

  1. Kocak B, Baessler B, Bakas S, et al (2023) CheckList for EvaluAtion of Radiomics research (CLEAR): a step-by-step reporting guideline for authors and reviewers endorsed by ESR and EuSoMII. Insights Imaging 14:75. https://doi.org/10.1186/s13244-023-01415-8
  2. Wang K, Karalis JD, Elamir A, et al (2023) Delta radiomic features predict resection margin status and overall survival in neoadjuvant-treated pancreatic cancer patients. Ann Surg Oncol. https://doi.org/10.1245/s10434-023-14805-5
  3. Xiang Y, Hu Y, Chen C, et al (2023) Radiomics based on machine learning algorithms could predict prognosis and postoperative chemotherapy benefits of patients with gastric cancer: a retrospective cohort study. J Gastrointest Oncol 14:2048–2063. https://doi.org/10.21037/jgo-23-627
  4. Kocak B, Akinci D’Antonoli T, Ates Kus E, et al (2024) Self-reported checklists and quality scoring tools in radiomics: a meta-research. Eur Radiol. https://doi.org/10.1007/s00330-023-10487-5
  5. Kocak B, Keles A, Akinci D’Antonoli T (2023) Self-reporting with checklists in artificial intelligence research on medical imaging: a systematic review based on citations of CLAIM. Eur Radiol. https://doi.org/10.1007/s00330-023-10243-9
