Active and collaborative Bayesian calibration of model parameters and bias



Hong, Fangqi, Wei, Pengfei, Xu, Haonan and Beer, Michael ORCID: 0000-0002-0611-0345
(2026) Active and collaborative Bayesian calibration of model parameters and bias. Mechanical Systems and Signal Processing, 244, p. 113817. ISSN 0888-3270, 1096-1216

Text
parameters&biasCorrection-Revision2-cleanVersion.pdf - Author Accepted Manuscript
Available under License Creative Commons Attribution.


Abstract

Calibrating computational models to minimize the discrepancy between model predictions and observations is a fundamental yet challenging task in modern computational science. The root sources of the uncertainties producing this discrepancy can be complex, but they can generally be captured by quantifying the uncertainties of the model parameters and the model bias. Joint calibration of model parameters and model bias has therefore attracted widespread attention, but it remains extremely challenging, especially when the simulators are expensive to evaluate. To fill this gap, a new modular Bayesian inference framework, named active and collaborative Bayesian calibration, is developed based on Bayesian numerical methods, with the model bias described by Gaussian process regression (GPR) models. By reformulating the calibration problem as an equivalent optimize-then-integrate nested problem, a Bayesian algorithm is developed that infers the hyper-parameters of the model bias and generates training samples for the model parameters in a sequential manner, allowing both model parameters and bias to be calibrated collaboratively with high efficiency. The upper confidence bound function is generalized to identify the hyper-parameters of the model bias, and a posterior error contribution function is developed for actively enriching the training data set and further updating the trained GPR models. With these two acquisition functions working sequentially, the model bias can be quantified with a sound trade-off between accuracy and efficiency, and the model parameters are calibrated as a byproduct. Two numerical examples and two real-world engineering examples demonstrate the performance of the proposed method.
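The two core ingredients the abstract names, a GP model of the bias (observations minus simulator output) and an upper-confidence-bound acquisition that targets where the bias is large or uncertain, can be illustrated with a minimal 1-D sketch. Everything below (the toy simulator, the "true" process, the RBF kernel, the grid, and the value of beta) is a hypothetical illustration under simplified assumptions, not the paper's algorithm or implementation.

```python
import numpy as np

def simulator(x, theta):
    # cheap stand-in simulator with a single calibration parameter theta
    return theta * np.sin(x)

def true_process(x):
    # ground truth = simulator at theta = 1.2 plus a systematic bias term
    return 1.2 * np.sin(x) + 0.3 * x

def rbf_kernel(a, b, ls=1.0, var=1.0):
    # squared-exponential kernel between two 1-D point sets
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-6):
    # standard GP regression posterior mean and pointwise variance
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    Kss = rbf_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - np.sum(v ** 2, axis=0)
    return mu, np.maximum(var, 0.0)

# observations at a few sites; simulator run with a provisional theta
x_obs = np.array([0.5, 1.5, 2.5, 3.5])
theta0 = 1.0
residual = true_process(x_obs) - simulator(x_obs, theta0)  # empirical bias

# GP posterior of the bias over a candidate grid
x_grid = np.linspace(0.0, 4.0, 81)
mu, var = gp_posterior(x_obs, residual, x_grid)

# UCB-style acquisition: next evaluation where the bias could be largest
beta = 2.0
ucb = np.abs(mu) + beta * np.sqrt(var)
x_next = x_grid[np.argmax(ucb)]
```

In a full active loop, the simulator would be re-run at `x_next`, the bias GP and the parameter posterior re-updated, and the process repeated until the acquisition value falls below a tolerance; this sketch shows only a single acquisition step.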

Item Type: Article
Uncontrolled Keywords: Uncertainty quantification, Model calibration, Model bias, Bayesian optimization, Active learning
Divisions: Faculty of Science & Engineering
Faculty of Science & Engineering > School of Engineering
Faculty of Science & Engineering > School of Engineering > Civil and Environmental Engineering
Depositing User: Symplectic Admin
Date Deposited: 09 Jan 2026 11:20
Last Modified: 28 Feb 2026 14:53
DOI: 10.1016/j.ymssp.2025.113817
URI: https://livrepository.liverpool.ac.uk/id/eprint/3196528