Self-Adaptation for Unsupervised Domain Adaptation



Cui, Xia ORCID: 0000-0002-1726-3814 and Bollegala, Danushka ORCID: 0000-0003-4476-7003
(2019) Self-Adaptation for Unsupervised Domain Adaptation. In: Recent Advances in Natural Language Processing (RANLP), 2-4 September 2019, Varna, Bulgaria.

Cui_RANLP_2019.pdf - Author Accepted Manuscript

Abstract

Lack of labelled data in the target domain for training is a common problem in domain adaptation. To overcome this problem, we propose a novel unsupervised domain adaptation method that combines projection- and self-training-based approaches. Using the labelled data from the source domain, we first learn a projection that maximises the distance between nearest neighbours with opposite labels in the source domain. Next, we project the source domain labelled data using the learnt projection and train a classifier for target class prediction. We then use the trained classifier to predict pseudo labels for the target domain unlabelled data. Finally, we learn a projection for the target domain, as we did for the source domain, using the pseudo-labelled target domain data, maximising the distance between nearest neighbours with opposite pseudo labels. Experiments on a standard benchmark dataset for domain adaptation show that the proposed method consistently outperforms numerous baselines and returns results competitive with the state of the art (SOTA), including self-training, tri-training, and neural adaptation methods.
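
The abstract describes a four-step pipeline: learn a source projection that pushes apart opposite-label nearest neighbours, train a classifier on the projected source data, pseudo-label the unlabelled target data, and repeat the projection step on the target side. The Python sketch below illustrates one possible reading of that pipeline; the function names (learn_projection, self_adapt), the gradient-ascent projection objective, the logistic-regression classifier, and the projection dimensionality are illustrative assumptions and not the authors' actual implementation.

```python
# Hypothetical sketch of the projection + self-training pipeline outlined in the
# abstract. All optimisation details below are assumptions made for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors


def learn_projection(X, y, dim, lr=0.01, epochs=100, seed=0):
    """Learn a linear projection W that increases the distance between each
    point and its nearest neighbour carrying the opposite label (a simple
    surrogate for the objective sketched in the abstract)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.1, size=(X.shape[1], dim))
    # Pair every point with its nearest neighbour from the opposite class.
    pairs = []
    for label in np.unique(y):
        nn = NearestNeighbors(n_neighbors=1).fit(X[y != label])
        _, idx = nn.kneighbors(X[y == label])
        own = np.where(y == label)[0]
        other = np.where(y != label)[0][idx.ravel()]
        pairs.extend(zip(own, other))
    pairs = np.array(pairs)
    diffs = X[pairs[:, 0]] - X[pairs[:, 1]]
    for _ in range(epochs):
        projected = diffs @ W
        # Gradient ascent on the mean squared projected distance between
        # opposite-label neighbour pairs.
        grad = 2 * diffs.T @ projected / len(pairs)
        W += lr * grad
        # Normalise columns so the objective cannot grow unboundedly.
        W /= np.linalg.norm(W, axis=0, keepdims=True)
    return W


def self_adapt(Xs, ys, Xt, dim=50):
    # 1. Source projection and classifier trained on projected source data.
    Ws = learn_projection(Xs, ys, dim)
    clf = LogisticRegression(max_iter=1000).fit(Xs @ Ws, ys)
    # 2. Pseudo-label the unlabelled target domain data.
    yt_pseudo = clf.predict(Xt @ Ws)
    # 3. Target projection learnt from the pseudo labels, then retrain.
    Wt = learn_projection(Xt, yt_pseudo, dim)
    clf_t = LogisticRegression(max_iter=1000).fit(Xt @ Wt, yt_pseudo)
    return Wt, clf_t
```

In this reading, the only supervision comes from the source domain; the target-side projection and classifier are fitted purely on pseudo labels, which is what makes the procedure unsupervised with respect to the target domain.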

Item Type: Conference or Workshop Item (Unspecified)
Date Deposited: 02 Sep 2019 08:29
Last Modified: 19 Jan 2023 00:28
DOI: 10.26615/978-954-452-056-4_025
URI: https://livrepository.liverpool.ac.uk/id/eprint/3053017