Adversarial Label Poisoning Attack on Graph Neural Networks via Label Propagation



Liu, Ganlin, Huang, Xiaowei ORCID: 0000-0001-6267-0366 and Yi, Xinping ORCID: 0000-0001-5163-2364
(2022) Adversarial Label Poisoning Attack on Graph Neural Networks via Label Propagation.

Adversarial_Attack_on_Graphs_ECCV_Camera_Ready.pdf - Author Accepted Manuscript


Abstract

Graph neural networks (GNNs) have achieved outstanding performance in semi-supervised learning tasks with partially labeled graph-structured data. However, labeling graph data for training is a challenging task, and inaccurate labels may mislead the training process toward erroneous GNN models for node classification. In this paper, we consider label poisoning attacks on training data, where the labels of input data are modified by an adversary before training, to understand to what extent state-of-the-art GNN models are resistant or vulnerable to such attacks. Specifically, we propose a label poisoning attack framework for graph convolutional networks (GCNs), inspired by the equivalence between label propagation and decoupled GCNs that separate message passing from neural networks. Instead of attacking the entire GCN model, we propose to attack only the label propagation component used for message passing. It turns out that a gradient-based attack on label propagation is effective and efficient at misleading GCN training. More remarkably, such a label attack can be topology-agnostic, in the sense that the labels to be attacked can be chosen efficiently without knowledge of the graph structure. Extensive experimental results demonstrate the effectiveness of the proposed method against state-of-the-art GCN-like models.
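
For illustration only (not the authors' released code or exact algorithm): the sketch below shows label propagation on a toy graph in NumPy, together with a brute-force perturbation score that stands in for the gradient-based criterion for choosing which training labels to flip. All names (propagate, poison_scores), the toy graph, and the finite-difference surrogate are assumptions introduced here for clarity.

# Minimal sketch, assuming a symmetric-normalized adjacency S and one-hot
# training labels Y0 with zero rows for unlabeled nodes.
import numpy as np

def propagate(S, Y0, alpha=0.9, iters=10):
    """Label propagation: Y <- alpha * S @ Y + (1 - alpha) * Y0."""
    Y = Y0.copy()
    for _ in range(iters):
        Y = alpha * S @ Y + (1 - alpha) * Y0
    return Y

# Toy 4-node path graph with self-loops, symmetrically normalized.
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)
d = A.sum(axis=1)
S = A / np.sqrt(np.outer(d, d))

# Two labeled nodes (0 and 3), two classes; unlabeled rows stay zero.
Y0 = np.zeros((4, 2))
Y0[0, 0] = 1.0
Y0[3, 1] = 1.0

def poison_scores(S, Y0):
    """Score each labeled node by how much flipping its label perturbs
    the propagated predictions (a surrogate for a gradient-based score)."""
    base = propagate(S, Y0)
    scores = np.zeros(Y0.shape[0])
    for i in np.flatnonzero(Y0.sum(axis=1) > 0):  # only labeled nodes can be flipped
        Y_pert = Y0.copy()
        Y_pert[i] = Y_pert[i][::-1]               # flip the two-class one-hot label
        scores[i] = np.abs(propagate(S, Y_pert) - base).sum()
    return scores

print(poison_scores(S, Y0))  # larger score = more damaging label flip

In this toy setting the attack budget would be spent on the labeled nodes with the largest scores; note that the score here depends only on the propagation operator and the labels, which is the intuition behind the topology-agnostic selection discussed in the abstract.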

Item Type: Conference or Workshop Item (Unspecified)
Uncontrolled Keywords: Label poisoning attack, Graph neural networks, Label propagation, Graph convolutional network
Divisions: Faculty of Science and Engineering > School of Electrical Engineering, Electronics and Computer Science
Depositing User: Symplectic Admin
Date Deposited: 13 Jan 2023 08:16
Last Modified: 02 Mar 2023 08:34
DOI: 10.1007/978-3-031-20065-6_14
Related URLs:
URI: https://livrepository.liverpool.ac.uk/id/eprint/3166832