Multi-source Attention for Unsupervised Domain Adaptation



Cui, Xia ORCID: 0000-0002-1726-3814 and Bollegala, Danushka
(2020) Multi-source Attention for Unsupervised Domain Adaptation. CoRR, abs/2004.06608.


Abstract

Domain adaptation considers the problem of generalising a model learnt using data from a particular source domain to a different target domain. Often it is difficult to find a single suitable source to adapt from, and one must consider multiple sources. Using an unrelated source can result in sub-optimal performance, a phenomenon known as negative transfer. However, it is challenging to select the appropriate source(s) for classifying a given target instance in multi-source unsupervised domain adaptation (UDA). We model source selection as an attention-learning problem, where we learn attention over the sources for a given target instance. For this purpose, we first independently learn source-specific classification models and a relatedness map between the source and target domains using pseudo-labelled target domain instances. Next, we learn attention weights over the sources for aggregating the predictions of the source-specific models. Experimental results on cross-domain sentiment classification benchmarks show that the proposed method outperforms prior proposals in multi-source UDA.
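To make the aggregation step concrete, the following is a minimal Python sketch (not the authors' implementation) of attention-weighted combination of source-specific classifier predictions for a single target instance. The function names, the use of scikit-learn classifiers, and the softmax over relatedness scores are illustrative assumptions; the paper learns the relatedness map from pseudo-labelled target instances.

import numpy as np
from sklearn.linear_model import LogisticRegression


def softmax(scores):
    """Numerically stable softmax over a 1-D array of scores."""
    z = np.asarray(scores, dtype=float)
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()


def predict_target(x, source_models, relatedness):
    """Classify one target instance x by attention-weighted aggregation.

    source_models: fitted classifiers exposing predict_proba, one per source domain.
    relatedness:   source-to-target relatedness scores (assumed here to be given;
                   in the paper they are estimated from pseudo-labelled target
                   instances), converted to attention weights via softmax.
    """
    attention = softmax(relatedness)                          # attention weights over sources
    probs = np.stack([m.predict_proba(x.reshape(1, -1))[0]    # per-source class probabilities
                      for m in source_models])
    combined = attention @ probs                              # weighted aggregation of predictions
    return int(combined.argmax()), combined


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two toy "source domains" with binary labels, used only to exercise the sketch.
    source_models = []
    for _ in range(2):
        X = rng.normal(size=(200, 5))
        y = (X[:, 0] > 0).astype(int)
        source_models.append(LogisticRegression().fit(X, y))
    relatedness = [0.8, 0.2]        # hypothetical relatedness scores
    x_target = rng.normal(size=5)   # a single target-domain instance
    label, probs = predict_target(x_target, source_models, relatedness)
    print(label, probs)

The design choice illustrated is that the target-side classifier is never retrained: each source-specific model is learnt independently, and adaptation happens only through the per-instance attention over their outputs.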

Item Type: Article
Uncontrolled Keywords: cs.CL, cs.LG
Depositing User: Symplectic Admin
Date Deposited: 30 Apr 2020 10:37
Last Modified: 18 Jan 2023 23:53
Related URLs:
URI: https://livrepository.liverpool.ac.uk/id/eprint/3085261