Efficient training of interval Neural Networks for imprecise training data



Sadeghi, Jonathan ORCID: 0000-0003-4106-2374, de Angelis, Marco ORCID: 0000-0001-8851-023X and Patelli, Edoardo ORCID: 0000-0002-5007-7247
(2019) Efficient training of interval Neural Networks for imprecise training data. Neural Networks, 118, pp. 338-351.

NeuralNetJournal_authoraccepted.pdf - Author Accepted Manuscript


Abstract

This paper describes a robust and computationally feasible method for training Neural Networks and quantifying the uncertainty of their predictions. Specifically, we propose a backpropagation algorithm for Neural Networks with interval predictions. To maintain numerical stability, we propose minimising the maximum of the batch of errors at each step. Our approach can accommodate incertitude in the training data, and therefore adversarial examples from a commonly used attack model can be trivially accounted for. We present results on a test function example and a more realistic engineering test case. The reliability of the predictions of these networks is guaranteed by the non-convex Scenario approach to chance-constrained optimisation, which takes place after training and is hence robust to the performance of the optimiser. A key result is that, by using minibatches of size M, the complexity of the proposed approach scales as O(M·N_iter) and does not depend on the number of training data points, unlike other Interval Predictor Model methods. In addition, troublesome penalty function methods are avoided. To the authors' knowledge, this contribution presents the first computationally feasible approach for dealing with convex-set-based epistemic uncertainty in huge datasets.
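To illustrate why the cost per step is O(M·N_iter), independent of the total dataset size, the sketch below performs subgradient descent on the maximum of the per-sample errors within each minibatch: only the worst point in the batch contributes a subgradient. Everything here is an illustrative assumption, not the paper's method: the interval model is a simple polynomial-basis pair of bound predictors rather than a Neural Network, and the per-point cost uses a width-plus-violation penalty weight `lam` for simplicity, whereas the paper explicitly avoids penalty function methods.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy noisy data standing in for an imprecise training set.
x = rng.uniform(-1.0, 1.0, size=200)
y = np.sin(np.pi * x) + 0.1 * rng.normal(size=200)

def features(x):
    """Cubic polynomial basis (a hypothetical stand-in for a network's last layer)."""
    return np.stack([np.ones_like(x), x, x**2, x**3], axis=-1)

d = features(x).shape[-1]
w_lo, w_hi = np.zeros(d), np.zeros(d)   # weights of the lower/upper bound models
lam, lr, M = 10.0, 2e-3, 32             # illustrative penalty weight, step size, batch size

for t in range(5000):
    idx = rng.choice(len(x), size=M, replace=False)
    phi, yb = features(x[idx]), y[idx]
    lo, hi = phi @ w_lo, phi @ w_hi
    # Per-point cost: interval width plus penalised coverage violation.
    per_point = (hi - lo) + lam * (np.maximum(lo - yb, 0) + np.maximum(yb - hi, 0))
    k = int(np.argmax(per_point))       # worst point in the minibatch
    # Subgradient of the batch maximum: only point k contributes, so one
    # step costs O(M) to find k, regardless of the full dataset size.
    g_lo = -phi[k] + lam * phi[k] * (lo[k] > yb[k])
    g_hi = phi[k] - lam * phi[k] * (yb[k] > hi[k])
    w_lo -= lr * g_lo
    w_hi -= lr * g_hi

lo, hi = features(x) @ w_lo, features(x) @ w_hi
coverage = np.mean((lo <= y) & (y <= hi))
print(f"coverage = {coverage:.2f}, mean width = {np.mean(hi - lo):.3f}")
```

In the paper, the probabilistic reliability of the trained interval is then certified after training by the non-convex Scenario approach, so the guarantee does not hinge on this descent loop finding an exact optimum.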

Item Type: Article
Uncontrolled Keywords: Machine learning, Imprecise probability, Uncertainty quantification, Neural Networks, Interval Predictor Models
Depositing User: Symplectic Admin
Date Deposited: 08 Jul 2019 13:54
Last Modified: 19 Jan 2023 00:38
DOI: 10.1016/j.neunet.2019.07.005
URI: https://livrepository.liverpool.ac.uk/id/eprint/3049133