Error-Correcting Neural Sequence Prediction



O'Neill, James and Bollegala, Danushka
(2019) Error-Correcting Neural Sequence Prediction. CoRR, abs/1901.07002.

1901.07002v1.pdf - Submitted version (449 kB)

Abstract

We propose a novel neural sequence prediction method based on \textit{error-correcting output codes} (ECOC) that avoids exact softmax normalization and allows a tradeoff between speed and performance. Instead of minimizing a divergence measure between the predicted probability distribution and the true distribution, we use error-correcting codes to represent both predictions and outputs. Second, we propose several ways to improve accuracy and convergence rates by maximizing the separability between codes, assigning codes to classes in proportion to word embedding similarities. Lastly, we introduce our main contribution, \textit{Latent Variable Mixture Sampling}, a technique for mitigating exposure bias that can be integrated into the training of latent variable-based neural sequence predictors such as ECOC. It mixes the latent codes of past predictions and past targets in one of two ways: (1) according to a predefined sampling schedule, or (2) via a differentiable sampling procedure in which the mixing probability is learned throughout training by replacing the greedy argmax operation with a smooth approximation. ECOC-NSP leads to consistent improvements on language modelling datasets, and the proposed Latent Variable Mixture Sampling methods perform well on text generation tasks such as image captioning.
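The two ideas in the abstract can be illustrated with a toy sketch: classes are represented by binary output codes decoded by nearest-code matching (avoiding softmax over the full vocabulary), and the decoder input at each step is a mixture of the previous target code and the model's previous predicted code. Everything below is a minimal illustration, not the paper's implementation: the codebook, code length, noise model, and linear annealing schedule are all assumptions made for the example, and the learned/differentiable mixing of the paper is stood in for by a plain scalar weight.

```python
# Toy sketch of ECOC-style prediction and Latent Variable Mixture Sampling.
# All sizes and the schedule are hypothetical; the paper constructs codes
# to maximize separability and can learn the mixing probability.
import random

NUM_CLASSES, CODE_LEN = 8, 6
# Assign each class a distinct 6-bit code (here: its binary representation).
codebook = [[int(b) for b in format(i, "06b")] for i in range(NUM_CLASSES)]

def hamming_decode(pred_bits):
    """Return the class whose code is nearest (L1 distance) to the
    predicted bit activations -- no softmax over classes required."""
    dists = [sum(abs(c - p) for c, p in zip(code, pred_bits))
             for code in codebook]
    return dists.index(min(dists))

def mix_codes(target_code, pred_code, p):
    """Latent variable mixture: convex combination of the previous
    target code and the previous predicted code. p is the weight on
    the model's own prediction; it may follow a schedule or be learned."""
    return [(1.0 - p) * t + p * q for t, q in zip(target_code, pred_code)]

# Example: anneal p from 0 (pure teacher forcing) to 1 (free running).
random.seed(0)
target = codebook[3]
# A noisy stand-in for the model's predicted code activations.
pred = [min(1.0, max(0.0, t + random.gauss(0, 0.2))) for t in target]
schedule = [i / 4 for i in range(5)]
mixed_inputs = [mix_codes(target, pred, p) for p in schedule]
```

With `p = 0` the mixture reduces to teacher forcing (the true code is fed back); with `p = 1` the model conditions entirely on its own prediction, which is the regime exposure-bias mitigation targets.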

Item Type: Article
Uncontrolled Keywords: cs.LG, cs.CL, stat.ML
Depositing User: Symplectic Admin
Date Deposited: 21 Feb 2019 10:24
Last Modified: 19 Jan 2023 01:03
Related URLs:
URI: https://livrepository.liverpool.ac.uk/id/eprint/3033045