By Paquier, Zelda; Chao, Shih-Li; Acquisto, Anais; Fenton, Chifra; Guiot, Thomas; Dhont, Jennifer; Levillain, Hugo; Gulyban, Akos; Bali, Maria Antonietta; Reynaert, Nick
Reference: Biomedical Physics & Engineering Express, 8, 6, page 065008
Publication: Published, 2022-09-01
Peer-reviewed article
Abstract:
Introduction. Radiomics is a promising imaging-based tool that could enhance clinical observation and identify representative features. To avoid divergent interpretations, the Image Biomarker Standardisation Initiative (IBSI) imposed conditions for harmonisation. This study evaluates IBSI-compliant radiomics applications against a known benchmark and against clinical datasets to assess agreement.
Materials and methods. The three radiomics platforms compared were RadiomiX Research Toolbox, LIFEx v7.0.0, and syngo.via Frontier Radiomics v1.2.5 (based on PyRadiomics v2.1). The basic assessment included comparing feature names and their formulas. The IBSI digital phantom was used for evaluation against reference values. For the agreement evaluation (including the same software in different versions), two clinical datasets were used: 27 contrast-enhanced computed tomography (CECT) scans of colorectal liver metastases and 39 magnetic resonance imaging (MRI) examinations of breast cancer, including intravoxel incoherent motion (IVIM) and dynamic contrast-enhanced (DCE) MRI. The intraclass correlation coefficient (ICC, lower bound of the 95% confidence interval) was used, with 0.9 as the threshold for excellent agreement.
Results. The three radiomics applications share 41 features (3 shape, 8 intensity, 30 texture) out of 172, 84, and 110 features for RadiomiX, LIFEx, and syngo.via, respectively, as well as wavelet filtering; the naming conventions, however, differ between them. Syngo.via showed excellent agreement with the IBSI benchmark, while LIFEx and RadiomiX agreed slightly less well. Excellent reproducibility was achieved for shape features only, while intensity and texture features varied considerably with the imaging type. For intensity features, excellent agreement ranged from 46% for the DCE maps to 100% for CECT, while for texture features this dropped to 44% and 73%, respectively. Wavelet features produced the greatest variation between applications, with excellent agreement for only 3% to 11% of features.
Conclusion. Even with IBSI compliance, the reproducibility of features between radiomics applications is not guaranteed. To evaluate this variation, quality assurance of radiomics applications should be performed and repeated when updating to a new version or adding a new modality.
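The agreement criterion used in the study (lower bound of the ICC 95% confidence interval at or above 0.9) can be illustrated with a short, self-contained sketch. The snippet below assumes a two-way mixed-effects, single-measurement, consistency model (ICC(3,1)); the abstract does not state which ICC formulation the authors used, and the lesion values here are synthetic, so this is purely illustrative.

```python
# Minimal sketch: agreement of one radiomics feature extracted by two
# platforms on the same lesions, using ICC(3,1) and the lower bound of
# its 95% confidence interval (Shrout & Fleiss). Synthetic data only.
import numpy as np
from scipy.stats import f as f_dist

def icc_3_1(ratings: np.ndarray, alpha: float = 0.05):
    """ratings: (n_subjects, k_raters) array of feature values.
    Returns (ICC(3,1), lower bound of the (1 - alpha) confidence interval)."""
    n, k = ratings.shape
    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-lesion means
    col_means = ratings.mean(axis=0)   # per-platform means

    # Two-way ANOVA decomposition
    ss_rows = k * np.sum((row_means - grand_mean) ** 2)
    ss_cols = n * np.sum((col_means - grand_mean) ** 2)
    ss_total = np.sum((ratings - grand_mean) ** 2)
    ss_err = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))

    icc = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

    # Lower confidence bound via the F distribution
    f_obs = ms_rows / ms_err
    f_crit = f_dist.ppf(1 - alpha / 2, n - 1, (n - 1) * (k - 1))
    f_lower = f_obs / f_crit
    lower = (f_lower - 1) / (f_lower + k - 1)
    return icc, lower

# Hypothetical example: one feature from 27 CECT lesions on two platforms
rng = np.random.default_rng(0)
platform_a = rng.normal(100.0, 20.0, size=27)
platform_b = platform_a + rng.normal(0.0, 5.0, size=27)  # correlated re-measurement
icc, lower = icc_3_1(np.column_stack([platform_a, platform_b]))
print(f"ICC = {icc:.3f}, lower 95% CI bound = {lower:.3f}, "
      f"excellent agreement: {lower >= 0.9}")
```

In practice this check would be repeated per feature and per imaging type (CECT, IVIM, DCE), and the reported percentages correspond to the share of features whose lower confidence bound clears the 0.9 threshold.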