
Development and Validation of a Radiomics-Based Decision-Making Supporting Tool to Improve the Management and Outcome of Patients with Pancreatic Cancer

Authors:
*Ayoub Mokhtari,1 Roberto Casale,1 Zelda Paquier,2 Thomas Guiot,2 Zohaib Salahuddin,3 Henry Woodruff,3,4 Philippe Lambin,3,4 Jean-Luc Van Laethem,5 Maria Antonietta Bali1
  • 1. Department of Radiology, Institut Jules Bordet, Hôpital Universitaire de Bruxelles, Université Libre de Bruxelles, Belgium
  • 2. Medical Physics Department, Institut Jules Bordet, Hôpital Universitaire de Bruxelles, Université Libre de Bruxelles, Belgium
  • 3. The D-Lab, Department of Precision Medicine, GROW-School for Oncology and Reproduction, Maastricht University, the Netherlands
  • 4. Department of Radiology and Nuclear Medicine, GROW-School for Oncology, Maastricht University Medical Center, the Netherlands
  • 5. Department of Digestive Oncology, Hôpital Universitaire de Bruxelles, Université Libre de Bruxelles, Belgium
*Correspondence to [email protected]
Disclosure:

Woodruff has minority shares in the company Radiomics SA but has confirmed this entity was not involved in the preparation of this paper. Lambin has minority shares in the companies Radiomics SA, Convert pharmaceuticals, Comunicare, and LivingMed Biotech, and he is co-inventor of two issued patents with royalties on radiomics (PCT/NL2014/050248 and PCT/NL2014/050728), licensed to Radiomics SA; one issued patent on mtDNA (PCT/EP2014/059089), licensed to ptTheragnostic/DNAmito; one non-issued patent on LSRT (PCT/P126537PC00), licensed to Varian; three non-patented inventions (software) licensed to ptTheragnostic/DNAmito, Radiomics SA, and Health Innovation Ventures; and two non-issued, non-licensed patents on Deep Learning-Radiomics (N2024482, N2024889). However, Lambin has confirmed that none of the above entities or funding sources were involved in the preparation of this paper. The other authors have declared no conflicts of interest.

Acknowledgements:

The authors would like to thank ‘Les amis de Bordet’ for providing a grant for a research fellowship for this project.

Citation:
EMJ Radiol. ;4[1]:43-45. DOI: 10.33590/emjradiol/10309894. https://doi.org/10.33590/emjradiol/10309894.
Keywords:
CT, pancreatic adenocarcinoma (PDAC), radiomics.

Each article is made available under the terms of the Creative Commons Attribution-Non Commercial 4.0 License.

BACKGROUND AND AIMS

The aims of the study were to identify reliable clinical and radiomics features to build machine learning models predicting progression-free survival (PFS) and overall survival (OS), using pre-treatment CT in patients with pathology-confirmed pancreatic adenocarcinoma (PDAC).1

MATERIALS AND METHODS

Pre-treatment portal venous phase contrast-enhanced CT scans of 253 patients with PDAC, performed between 2010 and 2019, were retrospectively analysed. The CT scans were collected from different hospitals in Belgium, with non-uniform scanner models and acquisition protocols. Demographic, clinical, and survival data were collected from medical records.

OS was used to stratify patients into long and short survival groups (OS ≥10 months versus OS <10 months). Similarly, patients were stratified into long and short PFS groups (PFS ≥3 months versus PFS <3 months). Lesions were semi-manually segmented using MIM software version 6.9.0 (Cleveland, Ohio, USA), and radiomics features were extracted using the RadiomiX research software (supported by Radiomics, Liège, Belgium).
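
As an illustration of this stratification step, a minimal Python sketch is given below; the table layout and column names (os_months, pfs_months) are assumptions for illustration, not the authors' code.

```python
import pandas as pd

def stratify(cohort: pd.DataFrame) -> pd.DataFrame:
    """Label patients as long/short survivors using the study cut-offs."""
    cohort = cohort.copy()
    # OS cut-off: 10 months; PFS cut-off: 3 months (as stated above).
    cohort["os_group"] = (cohort["os_months"] >= 10).map({True: "long", False: "short"})
    cohort["pfs_group"] = (cohort["pfs_months"] >= 3).map({True: "long", False: "short"})
    return cohort
```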

Two-thirds of the patients were randomly assigned to the training-validation dataset, and the remaining one-third to the testing dataset.
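
A minimal sketch of this 2/3 versus 1/3 random split, assuming scikit-learn and the hypothetical cohort table from the sketch above; the authors' actual splitting procedure is not specified beyond the ratio.

```python
from sklearn.model_selection import train_test_split

# Hold out one-third of patients for testing; stratifying on the OS label is
# an assumption made here to preserve class balance, not stated in the abstract.
train_val, test = train_test_split(
    cohort,
    test_size=1/3,
    stratify=cohort["os_group"],
    random_state=42,
)
```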

A four-step method was applied for feature selection. Firstly, reproducible features were selected according to recent studies on phantoms and human cohorts.2-4 Secondly, features with good intra-rater reliability were retained, based on an intra-class correlation coefficient >0.75.5 Thirdly, highly correlated, redundant features were removed (Spearman correlation coefficient >0.95). Finally, the number and identity of the features entering the final models were determined using the wrapper method (WEKA software version 3.8.6 [University of Waikato, Hamilton, New Zealand]),6,7 which searches for the best combination of features using a defined classifier (here, a random forest classifier). This procedure was applied to the training-validation dataset with three-fold cross-validation.
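
The Python sketch below illustrates steps two to four of this pipeline under stated assumptions: the study itself used WEKA for the wrapper step, so pingouin's ICC and scikit-learn's SequentialFeatureSelector are stand-ins for the tools actually employed, and the data layouts (a long-format reliability table, a feature matrix X with labels y) are hypothetical.

```python
import pandas as pd
import pingouin as pg
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SequentialFeatureSelector

# Step 2: keep features with intra-rater ICC > 0.75 (long-format table with
# columns "feature", "patient", "rater", "value" is assumed).
def reliable_features(long_df: pd.DataFrame) -> list[str]:
    keep = []
    for feat, grp in long_df.groupby("feature"):
        icc = pg.intraclass_corr(data=grp, targets="patient",
                                 raters="rater", ratings="value")
        if icc.set_index("Type").loc["ICC3", "ICC"] > 0.75:
            keep.append(feat)
    return keep

# Step 3: for each pair of features with |Spearman rho| > 0.95, drop one.
def drop_redundant(X: pd.DataFrame, thr: float = 0.95) -> pd.DataFrame:
    corr = X.corr(method="spearman").abs()
    drop, cols = set(), corr.columns
    for i, a in enumerate(cols):
        for b in cols[i + 1:]:
            if corr.loc[a, b] > thr and b not in drop:
                drop.add(b)
    return X.drop(columns=list(drop))

# Step 4: wrapper-style selection with a random forest and 3-fold CV
# (SequentialFeatureSelector stands in for WEKA's wrapper subset evaluator).
def wrapper_select(X: pd.DataFrame, y) -> list[str]:
    sfs = SequentialFeatureSelector(
        RandomForestClassifier(random_state=0),
        direction="forward", cv=3, scoring="roc_auc",
        n_features_to_select="auto",
    )
    sfs.fit(X, y)
    return list(X.columns[sfs.get_support()])
```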

Based on the random forest classifier, different machine learning models were trained and tested to predict OS and PFS. Model performance was assessed using receiver operating characteristic curves and the associated area under the curve (AUC). AUCs were compared using the DeLong test, with significance set at p<0.05. This part of the pipeline was computed using SPSS version 28.0.1.1 (IBM, Armonk, New York, USA) and WEKA software version 3.8.6.
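
An illustrative training and evaluation loop is sketched below, assuming scikit-learn; the study itself used WEKA and SPSS, and the DeLong comparison of AUCs is not reproduced here.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score, roc_curve

def fit_and_evaluate(X_train, y_train, X_test, y_test):
    """Train a random forest and return the fitted model, test AUC, and ROC curve."""
    model = RandomForestClassifier(n_estimators=500, random_state=0)
    model.fit(X_train, y_train)
    prob = model.predict_proba(X_test)[:, 1]
    fpr, tpr, _ = roc_curve(y_test, prob)
    return model, roc_auc_score(y_test, prob), (fpr, tpr)
```

One such call per feature subgroup (clinical, radiomics, and combined) and per endpoint (OS and PFS) would yield the six models reported below.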

RESULTS

A total of 171 radiomics features were extracted. Of these, 36 features were retained after assessing inter-scanner reproducibility, 28 were kept based on intra-rater reliability, and 18 remained after removal of highly correlated features. Finally, using the wrapper method, six feature subgroups were selected (clinical and radiomics features, clinical features only, and radiomics features only, each for both OS and PFS).

Subsequently, six random forest models were trained and tested. The Clinical&Radiomics model was the most predictive for both OS (AUC: 0.75) and PFS (AUC: 0.66). Other models reached lower AUCs (Figure 1 and Table 1).

Figure 1: Receiver operating characteristics and area under the curves for overall survival (A) and progression-free survival (B) prediction of the different models.
AUC: area under the curve; CI: confidence interval; OS: overall survival; PFS: progression-free survival; ROC: receiver operating characteristic.

Table 1: Area under the curve for overall survival and progression-free survival prediction of the different models.
AUC: area under the curve; CI: confidence interval; OS: overall survival; PFS: progression-free survival.

It is worth noting that the Clinical&Radiomics model for OS prediction included six clinical features and one radiomics feature (GLCM_homogeneity1),8,9 while the Clinical&Radiomics model for PFS prediction included four clinical features and two radiomics features (GLCM_invDiffMomNor9 and Stats_mean).10
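
For reference, these texture and first-order features are commonly defined as below (standard Haralick-type formulations, where P(i,j) is the normalised grey-level co-occurrence matrix and N_g the number of grey levels); the exact RadiomiX implementation may differ slightly.

```latex
\[
\mathrm{GLCM\_homogeneity1} = \sum_{i=1}^{N_g}\sum_{j=1}^{N_g}\frac{P(i,j)}{1+|i-j|},
\qquad
\mathrm{GLCM\_invDiffMomNor} = \sum_{i=1}^{N_g}\sum_{j=1}^{N_g}\frac{P(i,j)}{1+\frac{(i-j)^2}{N_g^2}}
\]
\[
\mathrm{Stats\_mean} = \frac{1}{N}\sum_{k=1}^{N} x_k
\quad \text{(mean intensity over the } N \text{ voxels of the segmented lesion)}
\]
```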

CONCLUSION

Radiomics is an emerging methodology that can be used to predict outcomes. In PDAC, the combination of clinical and radiomics features achieved better predictive performance than either feature set alone.

Given the significant variability in acquisition protocols and scanners among patients, it is crucial to investigate the reproducibility and repeatability of radiomics features, particularly in the absence of harmonisation techniques.

Future directions of the current project will focus on other outcomes, such as genetic and histological data, and response to treatment.

References
1. Bartoli M et al. CT and MRI of pancreatic tumors: an update in the era of radiomics. Jpn J Radiol. 2020;38(12):1111-24.
2. Chen Y et al. Robustness of CT radiomics features: consistency within and between single-energy CT and dual-energy CT. Eur Radiol. 2022;32(8):5480-90.
3. Jha AK et al. Repeatability and reproducibility study of radiomic features on a phantom and human cohort. Sci Rep. 2021;11(1):2055.
4. Li Y et al. The impact of phantom design and material-dependence on repeatability and reproducibility of CT-based radiomics features. Med Phys. 2022;49(3):1648-59.
5. Koo TK, Li MY. A guideline of selecting and reporting intraclass correlation coefficients for reliability research. J Chiropr Med. 2016;15(2):155-63.
6. Kohavi R, John GH. Wrappers for feature subset selection. Artificial Intelligence. 1997;97(1-2):273-324.
7. Hall M et al. The WEKA data mining software. SIGKDD Explor. 2009;11:10-8.
8. Clausi DA. An analysis of co-occurrence texture statistics as a function of grey level quantization. Canadian Journal of Remote Sensing. 2002;28:45-62.
9. Haralick RM et al. Textural features for image classification. IEEE Transactions on Systems, Man, and Cybernetics. 1973;3(6):610-21.
10. Pietka E. Image standardization in PACS. In: Bankman IN (ed.), Handbook of Medical Imaging (2000), Cambridge: Academic Press, pp.783-801.
