Image graph-based neural network enhances clinical image analysis
Medical imaging can be used to diagnose diseases, monitor their progression, and evaluate their treatment, often providing immediate and accurate feedback that can lead to faster responses and better patient outcomes. Analyzing images by hand can be tedious, however, so many researchers turn to the field of radiomics, which uses data characterization algorithms to identify features quickly and precisely.
Yang et al. developed a versatile radiomics framework for processing cancer images that can learn to recognize relevant features while taking advantage of similarities between groups of images for better analysis.
Many modern radiomics algorithms rely on neural networks that are trained on collections of relevant medical images. Most of these networks treat each image as independent, but that is often not the case: images can come from the same patient, the same tissue, or the same location, and are therefore related to one another.
Graph neural networks (GNNs) are designed to account for these relationships but require them to be explicitly defined. For medical images, this is often impossible. To solve this, the researchers developed an image graph-based neural network (IG-Net) that can learn image features and image relationships simultaneously.
“IG-Net was specifically designed to address these challenges by dynamically learning both feature representations and their underlying relational structure during model training,” said author Jielong Yang.
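To illustrate the general idea, the sketch below shows, in plain PyTorch, how a network might infer an image-to-image graph from its own embeddings and use it for message passing while the feature extractor is trained end to end. It is a simplified illustration only, not the authors' IG-Net implementation: the paper learns discrete structures, whereas this toy example uses a differentiable soft adjacency, and every module and parameter name here (LearnedGraphLayer, JointFeatureGraphNet, and so on) is hypothetical.

```python
# A minimal PyTorch sketch of jointly learning image features and an
# image-to-image graph. This is NOT the authors' IG-Net code; all names
# and design choices here are hypothetical illustrations.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LearnedGraphLayer(nn.Module):
    """Builds a soft adjacency matrix from the current image embeddings and
    performs one step of message passing between related images."""

    def __init__(self, dim):
        super().__init__()
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)
        self.update = nn.Linear(dim, dim)

    def forward(self, x):
        # x: (num_images, dim) image embeddings
        # Pairwise similarities define a learned, differentiable graph.
        scores = self.query(x) @ self.key(x).t() / x.shape[-1] ** 0.5
        adjacency = torch.softmax(scores, dim=-1)   # soft edges between images
        messages = adjacency @ x                    # aggregate related images
        return F.relu(self.update(messages)) + x    # residual feature update


class JointFeatureGraphNet(nn.Module):
    """Feature extractor + learned-graph message passing + classifier."""

    def __init__(self, in_features, hidden, num_classes):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_features, hidden), nn.ReLU())
        self.graph = LearnedGraphLayer(hidden)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, images_flat):
        h = self.encoder(images_flat)   # per-image features
        h = self.graph(h)               # refine using learned relationships
        return self.classifier(h)


# Toy usage: 32 images, each flattened to 1024 values, 2 diagnostic classes.
model = JointFeatureGraphNet(in_features=1024, hidden=64, num_classes=2)
logits = model(torch.randn(32, 1024))
print(logits.shape)  # torch.Size([32, 2])
```

The point the sketch conveys is that the adjacency matrix is computed from the current embeddings rather than supplied as an input, so the relationships between images are refined as the learned features improve.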
The team plans to continue developing IG-Net by scaling the framework to handle larger datasets and improving the interpretability of the model’s outputs. They are also investigating potential applications of their approach in other clinical contexts.
Source: “Learning discrete structures for cancer radiomics,” by Jielong Yang, Jing Yang, Tianye Niu, Zhili Wang, Linbo Liu, Yongsheng Huang, Si Chen, and Xin Ge, APL Bioengineering (2025). The article can be accessed at https://doi.org/10.1063/5.0260927