Gender Bias in Meta-Embeddings



Kaneko, Masahiro, Bollegala, Danushka ORCID: 0000-0003-4476-7003 and Okazaki, Naoaki
(2022) Gender Bias in Meta-Embeddings. In: Findings of the Association for Computational Linguistics: EMNLP 2022, December 2022, Abu Dhabi.

Text
EMNLP2022_bias_in_meta_embedding.pdf - Author Accepted Manuscript


Abstract

Different methods have been proposed to develop meta-embeddings from a given set of source embeddings. However, the source embeddings can contain unfair gender-related biases, and how these influence the resulting meta-embeddings has not been studied yet. We study gender bias in meta-embeddings created under three settings: (1) meta-embedding multiple sources without performing any debiasing (Multi-Source No-Debiasing), (2) meta-embedding multiple sources debiased by a single method (Multi-Source Single-Debiasing), and (3) meta-embedding a single source debiased by different methods (Single-Source Multi-Debiasing). Our experimental results show that meta-embedding amplifies gender biases relative to the input source embeddings. We find that debiasing not only the sources but also their meta-embedding is needed to mitigate those biases. Moreover, we propose a novel debiasing method based on meta-embedding learning, in which we apply multiple debiasing methods to a single source embedding and then combine the results into a single unbiased meta-embedding.
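The Single-Source Multi-Debiasing setting described above can be illustrated with a minimal sketch: take one source embedding matrix, produce several differently debiased views of it, and combine them by unweighted averaging, a common meta-embedding baseline. The debiasing functions here (projecting out a gender direction, length normalisation) and all variable names are illustrative assumptions, not the paper's actual method, which learns the meta-embedding rather than averaging.

```python
import numpy as np

def remove_gender_direction(E, g):
    # Hard-debias-style step (assumed for illustration): project out the
    # gender direction g from every row of the embedding matrix E.
    g = g / np.linalg.norm(g)
    return E - np.outer(E @ g, g)

def length_normalise(E):
    # A second "debiased view" placeholder: unit-length rows.
    return E / np.linalg.norm(E, axis=1, keepdims=True)

def average_meta_embedding(views):
    # Unweighted averaging of the views, a standard meta-embedding baseline.
    return np.mean(np.stack(views, axis=0), axis=0)

rng = np.random.default_rng(0)
E = rng.normal(size=(5, 4))   # toy source embedding: 5 words, 4 dimensions
g = rng.normal(size=4)        # hypothetical gender direction

views = [remove_gender_direction(E, g), length_normalise(E)]
M = average_meta_embedding(views)
print(M.shape)  # meta-embedding has the same shape as the source: (5, 4)
```

After projection, the first view carries no component along `g`; the averaged meta-embedding, however, can reinherit bias from the other view, which is the kind of interaction the paper's settings are designed to probe.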

Item Type: Conference or Workshop Item (Unspecified)
Depositing User: Symplectic Admin
Date Deposited: 12 Oct 2022 08:14
Last Modified: 15 Mar 2024 02:25
DOI: 10.18653/v1/2022.findings-emnlp.227
Related URLs:
URI: https://livrepository.liverpool.ac.uk/id/eprint/3165408