BI-GCN: Boundary-Aware Input-Dependent Graph Convolution Network for Biomedical Image Segmentation

Meng, Yanda ORCID: 0000-0001-7344-2174, Zhang, Hongrun, Gao, Dongxu, Zhao, Yitian, Yang, Xiaoyun, Qian, Xuesheng, Huang, Xiaowei and Zheng, Yalin ORCID: 0000-0002-7873-0922
(2021) BI-GCN: Boundary-Aware Input-Dependent Graph Convolution Network for Biomedical Image Segmentation. [Preprint]



Segmentation is an essential operation in image processing. The convolution operation suffers from a limited receptive field, while global modelling is fundamental to segmentation tasks. In this paper, we apply graph convolution to the segmentation task and propose an improved Laplacian. Unlike existing methods, our Laplacian is data-dependent, and we introduce two attention diagonal matrices to learn better vertex relationships. In addition, it takes advantage of both region and boundary information when performing graph-based information propagation. Specifically, we model and reason about boundary-aware region-wise correlations of different classes through learned graph representations, enabling long-range semantic reasoning across regions with spatial enhancement along the object's boundary. Our model captures global semantic region information while simultaneously accommodating local spatial boundary characteristics. Experiments on two types of challenging datasets demonstrate that our method outperforms state-of-the-art approaches on the segmentation of polyps in colonoscopy images and of the optic disc and optic cup in colour fundus images.
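To make the core idea concrete, the following is a minimal NumPy sketch of a graph-convolution step with a data-dependent (input-dependent) adjacency, modulated by two learned diagonal attention matrices, as the abstract describes. All names (`input_dependent_gcn_layer`, `wa`, `wb`) and the exact way the diagonal attention and adjacency are formed are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def input_dependent_gcn_layer(X, W, wa, wb):
    """One graph-convolution step with a data-dependent adjacency.

    X      : (N, C) vertex features
    W      : (C, F) feature-projection weights
    wa, wb : (C,)   weights producing the two diagonal attention
             matrices (illustrative stand-ins for the paper's
             learned attention; not the authors' exact formulation).
    """
    # Two input-dependent diagonal attention matrices.
    Da = np.diag(softmax(X @ wa))            # (N, N), diagonal
    Db = np.diag(softmax(X @ wb))            # (N, N), diagonal
    # Data-dependent adjacency from attention-modulated feature
    # similarity; softmax row-normalises it.
    A = softmax(Da @ X @ X.T @ Db, axis=-1)  # (N, N), rows sum to 1
    # Graph propagation followed by feature projection and ReLU.
    return np.maximum(A @ X @ W, 0.0)

# Toy usage on random vertex features.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 4))
W = rng.standard_normal((4, 3))
wa = rng.standard_normal(4)
wb = rng.standard_normal(4)
H = input_dependent_gcn_layer(X, W, wa, wb)
```

Because the adjacency is computed from the input features themselves rather than a fixed graph, each vertex can aggregate information from any other vertex, which is the mechanism that gives the global receptive field the abstract contrasts with local convolution.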

Item Type: Preprint
Additional Information: Accepted in BMVC2021 as Oral
Uncontrolled Keywords: cs.CV, cs.AI
Divisions: Faculty of Health and Life Sciences
Faculty of Health and Life Sciences > Institute of Life Courses and Medical Sciences
Depositing User: Symplectic Admin
Date Deposited: 25 Apr 2022 13:22
Last Modified: 06 Oct 2022 06:16