Relation equivariant graph neural networks to explore the mosaic-like tissue architecture of kidney diseases on spatially resolved transcriptomics

dc.contributor.author: Raina, Mauminah
dc.contributor.author: Cheng, Hao
dc.contributor.author: Ferreira, Ricardo Melo
dc.contributor.author: Stansfield, Treyden
dc.contributor.author: Modak, Chandrima
dc.contributor.author: Cheng, Ying-Hua
dc.contributor.author: Suryadevara, Hari Naga Sai Kiran
dc.contributor.author: Xu, Dong
dc.contributor.author: Eadon, Michael T.
dc.contributor.author: Ma, Qin
dc.contributor.author: Wang, Juexin
dc.contributor.department: Biomedical Engineering and Informatics, Luddy School of Informatics, Computing, and Engineering
dc.date.accessioned: 2025-07-14T08:15:56Z
dc.date.available: 2025-07-14T08:15:56Z
dc.date.issued: 2025
dc.description.abstract: Motivation: Chronic kidney disease (CKD) and acute kidney injury (AKI) are prominent public health concerns affecting more than 15% of the global population. The ongoing development of spatially resolved transcriptomics (SRT) technologies presents a promising approach for discovering the spatial distribution patterns of gene expression within diseased tissues. However, existing computational tools are predominantly calibrated and designed for the ribbon-like structure of the brain cortex, presenting considerable computational obstacles in discerning the highly heterogeneous mosaic-like tissue architectures of the kidney. Consequently, timely and cost-effective annotation and interpretation of kidney tissue remain a challenge in exploring the cellular and morphological changes within renal tubules and their interstitial niches. Results: We present an empowered graph deep learning framework, REGNN (Relation Equivariant Graph Neural Networks), designed for SRT data analyses on heterogeneous tissue structures. To increase expressive power in the SRT lattice using graph modeling, REGNN integrates equivariance to handle n-dimensional symmetries of the spatial area, while additionally leveraging positional encoding to strengthen the relative spatial relations of the nodes uniformly distributed in the lattice. Given the limited availability of well-labeled spatial data, this framework implements both graph autoencoder and graph self-supervised learning strategies. On heterogeneous samples from different kidney conditions, REGNN outperforms existing computational tools in identifying tissue architectures on the 10× Visium platform. This framework offers a powerful graph deep learning tool for investigating tissues with highly heterogeneous expression patterns and paves the way to pinpoint underlying pathological mechanisms that contribute to the progression of complex diseases.
Availability and implementation: REGNN is publicly available at https://github.com/Mraina99/REGNN.
dc.eprint.version: Final published version
dc.identifier.citation: Raina M, Cheng H, Ferreira RM, et al. Relation equivariant graph neural networks to explore the mosaic-like tissue architecture of kidney diseases on spatially resolved transcriptomics. Bioinformatics. 2025;41(6):btaf303. doi:10.1093/bioinformatics/btaf303
dc.identifier.uri: https://hdl.handle.net/1805/49378
dc.language.iso: en_US
dc.publisher: Oxford University Press
dc.relation.isversionof: 10.1093/bioinformatics/btaf303
dc.relation.journal: Bioinformatics
dc.rights: Attribution 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.source: PMC
dc.subject: Acute kidney injury
dc.subject: Computational biology
dc.subject: Deep learning
dc.subject: Gene expression profiling
dc.subject: Graph neural networks
dc.subject: Kidney diseases
dc.subject: Transcriptome
dc.title: Relation equivariant graph neural networks to explore the mosaic-like tissue architecture of kidney diseases on spatially resolved transcriptomics
dc.type: Article
Files
Original bundle
Name: Raina2025Relation-CCBY.pdf
Size: 5.68 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 2.04 KB
Format: Item-specific license agreed upon to submission