Abstract: Knowledge distillation (KD) is an effective compression technique used to reduce the resource consumption of graph neural networks (GNNs) and facilitate their deployment on ...
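As a generic illustration of the distillation objective mentioned above (not the specific method of this paper), a minimal sketch of the classic temperature-softened KD loss might look like the following; the function names and the temperature value are illustrative assumptions:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; a higher T softens the distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 as in the standard KD formulation.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

In practice this term is combined with the ordinary cross-entropy loss on ground-truth labels, with the temperature `T` controlling how much of the teacher's "dark knowledge" (relative probabilities of wrong classes) is transferred to the student.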
A preliminary implementation of the multi-modal sparse interpretable GCN framework (SGCN) for the detection of Alzheimer's disease (AD). In our experiments, SGCN learned the sparse regional ...
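The description above names a sparse, interpretable GCN; as a generic sketch only (not the SGCN implementation itself), one common recipe combines a standard graph-convolution layer with an L1 penalty that drives weights toward zero for interpretability. The helper names and the penalty weight are assumptions:

```python
import numpy as np

def gcn_layer(A, X, W):
    # One GCN layer: add self-loops, symmetrically normalize the
    # adjacency, then propagate: ReLU(D^{-1/2} A_hat D^{-1/2} X W).
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0.0)

def l1_sparsity_penalty(W, lam=1e-3):
    # L1 regularizer encouraging sparse (hence interpretable) weights;
    # added to the task loss during training.
    return lam * np.abs(W).sum()
```

The L1 term zeroes out connections with little predictive value, so the surviving nonzero weights can be read as the regions or modalities the model actually relies on.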