Learning Meta Word Embeddings by Unsupervised Weighted Concatenation of Source Embeddings



Bollegala, Danushka ORCID: 0000-0003-4476-7003
(2022) Learning Meta Word Embeddings by Unsupervised Weighted Concatenation of Source Embeddings. In: Thirty-First International Joint Conference on Artificial Intelligence (IJCAI-22), 23-29 July 2022, Vienna, Austria.

Text: meta-concat.pdf - Author Accepted Manuscript (298kB)

Abstract

Given multiple source word embeddings learnt using diverse algorithms and lexical resources, meta word embedding learning methods attempt to learn more accurate and wide-coverage word embeddings. Prior work on meta-embedding has repeatedly found that simple vector concatenation of the source embeddings is a competitive baseline. However, it remains unclear why and when simple vector concatenation produces accurate meta-embeddings. We show that weighted concatenation can be seen as a spectrum matching operation between each source embedding and the meta-embedding, minimising the pairwise inner-product loss. Following this theoretical analysis, we propose two unsupervised methods to learn the optimal concatenation weights for creating meta-embeddings from a given set of source embeddings. Experimental results on multiple benchmark datasets show that the proposed weighted concatenated meta-embedding methods outperform previously proposed meta-embedding learning methods.
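To illustrate the basic operation the abstract refers to, the following is a minimal Python sketch of weighted concatenation of source word embeddings. The function name, the toy vectors, and the fixed weights are illustrative assumptions only; the paper's contribution is the unsupervised estimation of the concatenation weights, which is not reproduced here.

```python
import numpy as np

def weighted_concat_meta_embedding(source_embeddings, weights):
    """Build meta-embeddings by concatenating weighted source embeddings.

    source_embeddings: list of dicts mapping word -> np.ndarray, one per source.
    weights: list of scalar concatenation weights, one per source
             (assumed given here; the paper learns them without supervision).
    """
    # For simplicity, keep only words covered by every source.
    common_vocab = set.intersection(*(set(emb) for emb in source_embeddings))
    meta = {}
    for word in common_vocab:
        # Scale each source vector by its weight, then concatenate.
        parts = [w * emb[word] for w, emb in zip(weights, source_embeddings)]
        meta[word] = np.concatenate(parts)
    return meta

# Toy usage with two hypothetical 3-dimensional sources.
src1 = {"cat": np.array([0.1, 0.2, 0.3]), "dog": np.array([0.0, 0.5, 0.1])}
src2 = {"cat": np.array([0.4, 0.1, 0.0]), "dog": np.array([0.2, 0.2, 0.2])}
meta = weighted_concat_meta_embedding([src1, src2], weights=[0.8, 1.2])
print(meta["cat"].shape)  # (6,): the two weighted 3-d vectors concatenated
```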

Item Type: Conference or Workshop Item (Unspecified)
Divisions: Faculty of Science and Engineering > School of Electrical Engineering, Electronics and Computer Science
Depositing User: Symplectic Admin
Date Deposited: 04 May 2022 13:50
Last Modified: 15 Mar 2024 02:25
DOI: 10.24963/ijcai.2022/563
Related URLs:
URI: https://livrepository.liverpool.ac.uk/id/eprint/3154243