Diffeomorphic unsupervised deep learning model for mono- and multi-modality registration



Theljani, Anis and Chen, Ke ORCID: 0000-0002-6093-6623
(2020) Diffeomorphic unsupervised deep learning model for mono- and multi-modality registration. Journal of Algorithms & Computational Technology, 14. p. 174830262097352.

Access the full text of this item by clicking on the Open Access link.

Abstract

Different from image segmentation, developing a deep learning network for image registration is less straightforward because training data cannot be prepared or supervised by humans unless they are trivial (e.g. pre-designed affine transforms). One approach for an unsupervised deep learning model is to self-train the deformation fields by a network based on a loss function with an image similarity metric and a regularisation term, just as with traditional variational methods. Such a function consists of a smoothing constraint on the derivatives and a constraint on the determinant of the transformation, in order to obtain a spatially smooth and plausible solution. Although any variational model may be used with a deep learning algorithm, the challenge lies in achieving robustness. The proposed algorithm is first trained on a new and robust variational model and tested on synthetic and real mono-modal images. The results show how it deals with large-deformation registration problems and leads to a real-time solution with no folding. It is then generalised to multi-modal images. Experiments and comparisons with learning and non-learning models demonstrate that this approach delivers good performance and simultaneously generates an accurate diffeomorphic transformation.
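The loss described in the abstract combines an image similarity metric, a smoothness penalty on the derivatives of the deformation, and a constraint on the Jacobian determinant to discourage folding. A minimal NumPy/SciPy sketch of such a loss is given below; the SSD similarity, the specific penalty forms, and the weights `alpha` and `beta` are illustrative assumptions, not the paper's actual model.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def registration_loss(fixed, moving, u, alpha=0.1, beta=1.0):
    """Illustrative unsupervised registration loss (not the paper's model):
    SSD similarity + smoothness of the displacement field u (shape (2, H, W))
    + a hinge penalty on negative Jacobian determinants to discourage folding."""
    H, W = fixed.shape
    ys, xs = np.mgrid[0:H, 0:W].astype(float)

    # Warp the moving image by the transform (identity + displacement).
    warped = map_coordinates(moving, [ys + u[0], xs + u[1]],
                             order=1, mode='nearest')
    ssd = np.mean((warped - fixed) ** 2)  # similarity metric

    # Smoothness: squared finite differences of each displacement component.
    duy_y, duy_x = np.gradient(u[0])
    dux_y, dux_x = np.gradient(u[1])
    smooth = np.mean(duy_y**2 + duy_x**2 + dux_y**2 + dux_x**2)

    # Jacobian determinant of (y + u0, x + u1); penalise det < 0 (folding).
    det = (1 + duy_y) * (1 + dux_x) - duy_x * dux_y
    fold = np.mean(np.maximum(0.0, -det))

    return ssd + alpha * smooth + beta * fold
```

In a learning setting, `u` would be the output of the network and this scalar would be minimised over the training images; with a zero displacement and identical images the loss is zero, as expected.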

Item Type: Article
Uncontrolled Keywords: Deep learning, optimisation, similarity measures, mapping, inverse problem, image registration
Depositing User: Symplectic Admin
Date Deposited: 19 Jan 2021 16:36
Last Modified: 18 Jan 2023 23:02
DOI: 10.1177/1748302620973528
Open Access URL: https://journals.sagepub.com/doi/full/10.1177/1748...
Related URLs:
URI: https://livrepository.liverpool.ac.uk/id/eprint/3114203