Item #1

“Adherence to radiomics and/or machine learning-specific checklists or guidelines.” [1] (licensed under CC BY)

Explanation

“Whether any guideline or checklist, e.g., CLEAR checklist, is used in designing and reporting, as appropriate for the study design (e.g., handcrafted radiomics or deep learning pipeline).” [1] (licensed under CC BY)

Positive examples from the literature

Example #1: “This study strictly adhered to the CheckList for EvaluAtion of Radiomics reporting guidelines to enhance the credibility, reproducibility, and transparency of the study. The checklist is available in the Electronic Supplemental Material.” [2] (licensed under CC BY)

Example #2: “This study conducted a self-assessment of our work using the CLEAR checklist and METRICS, which were submitted as Supplementary 2 and Supplementary 3.” [3] (licensed under CC BY)

Example #3: “In addition, this study adhered to the CLEAR (CheckList for EvaluAtion of Radiomics) reporting guidelines to improve the credibility, reproducibility, and transparency of the study.” [4] (licensed under CC BY-NC-ND)

Hypothetical negative examples

Example #4: We followed standard radiomics methodology to ensure robust feature extraction and model validation. Our study design was guided by best practices in the field. A detailed description of our methods is provided in the supplementary material.

Example #5: We followed the STARD (Standards for Reporting of Diagnostic Accuracy Studies) guideline in planning and reporting this study.

Importance of the item

Adherence to radiomics and machine learning-specific checklists, such as the CheckList for EvaluAtion of Radiomics research (CLEAR) [5], plays a crucial role in ensuring transparency in radiomics studies. These guidelines provide a standardized approach to designing, implementing, and reporting radiomics studies, which is essential for reproducibility and comparability across studies. By following these checklists, researchers can minimize biases, ensure proper data handling, and enhance the interpretability of their findings [6, 7]. Moreover, guideline adherence promotes consistency in critical aspects like feature extraction, model validation, and clinical applicability, ultimately supporting the translation of radiomics into clinical practice [8, 9]. Without such standardization, radiomics studies risk producing results that are difficult to validate or generalize, undermining their clinical utility and scientific value [9, 10].

Specifics about the positive examples

The provided examples demonstrate how the authors used guidelines or checklists to ensure transparent and standardized reporting of their radiomics studies. In all three examples, the authors followed the CheckList for EvaluAtion of Radiomics research (CLEAR) to enhance reproducibility, transparency, and methodological rigor [5, 6].

In Example #2, the authors also incorporated the METhodological RadiomICs Score (METRICS) [1] to assess and report their study’s quality with this additional tool.

Moreover, the completed checklists in the first two examples were included in the supplementary materials, offering readers, reviewers, and editors a detailed account of each study’s adherence to the reporting guidelines and further exemplifying best practices. This approach not only aids peer review but also serves as a valuable resource for readers seeking to understand or replicate the methodologies used in these studies.

In Example #3, the authors mention their adherence to CLEAR without providing the completed checklist.

Specifics about the negative examples

In Example #4, the study makes vague claims about following “best practices” but does not explicitly reference any known checklist (e.g., CLEAR, METRICS, CLAIM).

In Example #5, the study mentions adherence to STARD, a general reporting guideline for diagnostic accuracy studies that is not specific to radiomics or machine learning and therefore does not satisfy this item.

Recommendations for appropriate scoring

Studies should be scored based on their adherence to radiomics-specific guidelines such as CLEAR [5] or METRICS [1]. Artificial intelligence or machine learning guidelines such as CLAIM [11] can also be considered to meet this item. Note that the item score should be awarded only if the study explicitly references the radiomics or machine learning-specific guideline it used.

While providing a completed checklist is not mandatory for scoring this item, researchers are strongly encouraged to include it as supplementary material. This practice enhances reproducibility and allows reviewers to verify compliance with the guidelines, fostering a higher standard of reporting in radiomics research.

References

  1. Kocak B, Akinci D’Antonoli T, Mercaldo N, et al (2024) METhodological RadiomICs Score (METRICS): a quality scoring tool for radiomics research endorsed by EuSoMII. Insights Imaging 15:8. https://doi.org/10.1186/s13244-023-01572-w
  2. Foltyn-Dumitru M, Mahmutoglu MA, Brugnara G, et al (2024) Shape matters: unsupervised exploration of IDH-wildtype glioma imaging survival predictors. Eur Radiol. https://doi.org/10.1007/s00330-024-11042-6
  3. Li C, Hu J, Zhang Z, et al (2024) Biparametric MRI of the prostate radiomics model for prediction of pelvic lymph node metastasis in prostate cancers: a two-centre study. BMC Med Imaging 24:185. https://doi.org/10.1186/s12880-024-01372-8
  4. Park JH, Cho E-S, Yoon J, et al (2024) MRI radiomics model differentiates small hepatic metastases and abscesses in periampullary cancer patients. Sci Rep 14:23541. https://doi.org/10.1038/s41598-024-74311-w
  5. Kocak B, Baessler B, Bakas S, et al (2023) CheckList for EvaluAtion of Radiomics research (CLEAR): a step-by-step reporting guideline for authors and reviewers endorsed by ESR and EuSoMII. Insights Imaging 14:75. https://doi.org/10.1186/s13244-023-01415-8
  6. Kocak B, Ponsiglione A, Stanzione A, et al (2024) CLEAR guideline for radiomics: early insights into current reporting practices endorsed by EuSoMII. Eur J Radiol 181:111788. https://doi.org/10.1016/j.ejrad.2024.111788
  7. Kocak B, Akinci D’Antonoli T, Ates Kus E, et al (2024) Self-reported checklists and quality scoring tools in radiomics: a meta-research. Eur Radiol 34:5028–5040. https://doi.org/10.1007/s00330-023-10487-5
  8. Akinci D’Antonoli T, Cuocolo R, Baessler B, Pinto Dos Santos D (2023) Towards reproducible radiomics research: introduction of a database for radiomics studies. Eur Radiol 34:436–443. https://doi.org/10.1007/s00330-023-10095-3
  9. Kocak B, Pinto dos Santos D, Dietzel M (2025) The widening gap between radiomics research and clinical translation: rethinking current practices and shared responsibilities. Eur J Radiol Artif Intell 1:100004. https://doi.org/10.1016/j.ejrai.2025.100004
  10. Pinto Dos Santos D, Dietzel M, Baessler B (2021) A decade of radiomics research: are images really data or just patterns in the noise? Eur Radiol 31:1–4. https://doi.org/10.1007/s00330-020-07108-w
  11. Tejani AS, Klontzas ME, Gatti AA, et al (2024) Checklist for Artificial Intelligence in Medical Imaging (CLAIM): 2024 Update. Radiol Artif Intell 6:e240300. https://doi.org/10.1148/ryai.240300
