The Importance of Risk and Uncertainty for Humane Algorithms



Gray, Nicholas ORCID: 0000-0002-0930-4575
(2023) The Importance of Risk and Uncertainty for Humane Algorithms. PhD thesis, University of Liverpool.

201280368_Feb2023.pdf - Author Accepted Manuscript

Abstract

Computers should make our lives easier; they should complement human abilities and allow us to perform to our maximum potential. They should not be Kafkaesque machines that force us to navigate arbitrary bureaucracies and communicate with black boxes, unsure whether we are even putting the right information into the machine or getting the right answer out. Nevertheless, the algorithms used in our daily lives are often inhumane. One solution is for algorithms to better understand the risks and uncertainties present in the situations where they are used: within their inputs, their internal calculations and their outputs. Such an approach has many potential benefits. Understanding the uncertainties can allow algorithms to make better decisions. Uncertainty in an algorithm's output offers ways in which its decisions can be interrogated. Allowing algorithms to handle variability and ambiguity in their inputs means they do not need to force people into uncomfortable classifications. It is essential to compute with what we know rather than make assumptions that may be unjustified.

There are two types of uncertainty that algorithms must deal with: aleatory uncertainty, caused by the natural variability of a system, and epistemic uncertainty, caused by a lack of knowledge. Traditionally, both types of uncertainty are treated using probability theory. However, using precise probability distributions to model epistemic uncertainty can lead to illogical, inconsistent and incorrect results. This thesis explores imprecise probabilities, through probability bounds analysis, as a way of computing with both types of uncertainty. The mathematics of computing with these objects is reviewed, before considering the prospect of an automatic uncertainty compiler that translates uncertainty-naive code into code taking complete account of the uncertainty.
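As a loose illustration of the kind of computation involved (a minimal sketch, not code from the thesis or its uncertainty compiler), the basic building block of probability bounds analysis is interval arithmetic, in which an epistemically uncertain quantity is carried as a range rather than collapsed to a single best guess:

```python
# Minimal interval arithmetic: an epistemically uncertain quantity
# is represented as a [lo, hi] range, and operations return ranges
# guaranteed to enclose every possible true value.
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        # Sum of two intervals: the endpoints add directly.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Product: take the min and max over all four endpoint
        # combinations, which bounds the product for any values
        # inside the two intervals.
        corners = [self.lo * other.lo, self.lo * other.hi,
                   self.hi * other.lo, self.hi * other.hi]
        return Interval(min(corners), max(corners))

# A measurement known only to lie between 2 and 3, combined with an
# uncertain factor known only to lie between -1 and 4:
x = Interval(2.0, 3.0)
k = Interval(-1.0, 4.0)
print(x + k)   # Interval(lo=1.0, hi=7.0)
print(x * k)   # Interval(lo=-3.0, hi=12.0)
```

The point is that no precise distribution over [2, 3] needs to be assumed: the output interval is rigorous under exactly the knowledge stated, which is what "computing with what we know" means in practice.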
Such an approach can help analysts who may be unwilling or unable to rewrite their code to include intrusive uncertainty analysis in their algorithms.

Problems related to epistemic uncertainty within binary classification are then considered, in both the inputs and the outputs of such algorithms. The first is characterising the uncertainties associated with binary classification when there is no gold standard that can perfectly reveal the true class. This problem is particularly relevant in medicine, where harm may be caused by incorrectly interpreting the results of diagnostic tests. Epistemically uncertain inputs, in the form of intervals, are also considered within logistic regression, a popular binary classifier in statistics and machine learning. A novel imprecise approach is presented that retains the set of possible models rather than reducing the epistemic uncertainty to a single middle-of-the-road model, and it works when there is uncertainty in both the dependent and independent variables.

Finally, a chapter is devoted to the impact of diagnostic uncertainty within epidemiological models. Throughout the COVID-19 pandemic, governments relied on a combination of testing and epidemiological modelling to guide policy and public health interventions. However, many of the tests performed were imperfect, and it is vital to understand the impact this uncertainty has on the models. This work addresses that problem with a compartmental model that includes a testing-and-quarantine system, which can be used to analyse the effect of imperfect mass testing on the spread of the disease.
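As a toy illustration of why epistemic uncertainty in diagnostic tests matters (the numbers, function names and interval bounds here are invented for this sketch, not taken from the thesis), interval-valued sensitivity and specificity can be propagated through Bayes' rule to bound a test's positive predictive value:

```python
# Sketch: propagating interval uncertainty in a diagnostic test's
# sensitivity and specificity through to its positive predictive
# value (PPV), i.e. the probability of disease given a positive test.

def ppv(sensitivity, specificity, prevalence):
    """Bayes' rule: P(disease | positive test)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

def ppv_bounds(se_lo, se_hi, sp_lo, sp_hi, prevalence):
    # For fixed prevalence, PPV is monotone increasing in both
    # sensitivity and specificity, so exact bounds are attained by
    # evaluating at the matching interval endpoints.
    return (ppv(se_lo, sp_lo, prevalence),
            ppv(se_hi, sp_hi, prevalence))

# A test whose sensitivity is known only to lie in [0.85, 0.95] and
# specificity in [0.90, 0.99], used where prevalence is 2%:
lo, hi = ppv_bounds(0.85, 0.95, 0.90, 0.99, 0.02)
print(f"PPV lies between {lo:.2f} and {hi:.2f}")
```

Even modest imprecision in the test characteristics yields a wide interval for the predictive value, which illustrates the abstract's point that interpreting an imperfect test as if its characteristics were precisely known can cause real harm, both for individual diagnoses and for epidemiological models fed by mass-testing data.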

Item Type: Thesis (PhD)
Divisions: Faculty of Science and Engineering > School of Engineering
Depositing User: Symplectic Admin
Date Deposited: 22 Aug 2023 13:24
Last Modified: 22 Aug 2023 13:24
DOI: 10.17638/03170731
Supervisors:
  • Ferson, Scott
  • De Angelis, Marco
URI: https://livrepository.liverpool.ac.uk/id/eprint/3170731