TY - JOUR

T1 - Randomize-then-optimize for sampling and uncertainty quantification in electrical impedance tomography

AU - Bardsley, Johnathan M.

AU - Seppänen, Aku

AU - Solonen, Antti

AU - Haario, Heikki

AU - Kaipio, Jari

N1 - Funding Information:
Received by the editors July 21, 2014; accepted for publication (in revised form) October 1, 2015; published electronically December 8, 2015. http://www.siam.org/journals/juq/3/97827.html. Author affiliations: Department of Mathematical Sciences, University of Montana, Missoula, MT 59812-0864 (bardsleyj@mso.umt.edu); Department of Applied Physics, University of Eastern Finland, FI-70211 Kuopio, Finland (Aku.Seppanen@uef.fi), whose research was supported by the Academy of Finland (projects 270174 and 273536); Department of Aeronautics and Astronautics, Massachusetts Institute of Technology, Cambridge, MA 02139, and Department of Mathematics and Physics, Lappeenranta University of Technology, FI-53851 Lappeenranta, Finland (antti.solonen@gmail.com); Department of Mathematics and Physics, Lappeenranta University of Technology, FI-53851 Lappeenranta, Finland (heikki.haario@lut.fi); Department of Mathematics, University of Auckland, Auckland 1142, New Zealand, and Department of Applied Physics, University of Eastern Finland, FI-70211 Kuopio, Finland (j.kaipio@math.auckland.ac.nz).
Publisher Copyright:
Copyright © by SIAM and ASA.

PY - 2015

Y1 - 2015

N2 - In a typical inverse problem, a spatially distributed parameter in a physical model is estimated from measurements of model output. Since measurements are stochastic in nature, so is any parameter estimate. Moreover, in the Bayesian setting, the choice of regularization corresponds to the definition of the prior probability density function, which in turn is an uncertainty model for the unknown parameters. For both of these reasons, significant uncertainties exist in the solution of an inverse problem; thus, to fully understand the solution, it is important to quantify these uncertainties. When the physical model is linear and the error model and prior are Gaussian, the posterior density function is Gaussian with a known mean and covariance matrix. However, the electrical impedance tomography inverse problem is nonlinear, and hence no closed-form expression exists for the posterior density. The typical approach for such problems is to sample from the posterior and then use the samples to compute statistics (such as the mean and variance) of the unknown parameters. Sampling methods for electrical impedance tomography have been studied by various authors in the inverse problems community. However, up to this point the focus has been on developing increasingly sophisticated implementations of the Gibbs sampler, whose samples are known to converge very slowly to the correct density for large-scale problems. In this paper, we implement a recently developed sampling method called randomize-then-optimize (RTO), which provides a nearly independent sample for each application of an appropriate numerical optimization algorithm. The sample density for RTO is not the posterior density, but RTO can be used as a very effective proposal within a Metropolis-Hastings algorithm to obtain samples from the posterior. Here our focus is on applying the method to synthetic examples from electrical impedance tomography, and we show that it is both computationally efficient and provides good results. We also compare the performance of RTO with that of the Metropolis-adjusted Langevin algorithm and find RTO to be much more efficient.

AB - In a typical inverse problem, a spatially distributed parameter in a physical model is estimated from measurements of model output. Since measurements are stochastic in nature, so is any parameter estimate. Moreover, in the Bayesian setting, the choice of regularization corresponds to the definition of the prior probability density function, which in turn is an uncertainty model for the unknown parameters. For both of these reasons, significant uncertainties exist in the solution of an inverse problem; thus, to fully understand the solution, it is important to quantify these uncertainties. When the physical model is linear and the error model and prior are Gaussian, the posterior density function is Gaussian with a known mean and covariance matrix. However, the electrical impedance tomography inverse problem is nonlinear, and hence no closed-form expression exists for the posterior density. The typical approach for such problems is to sample from the posterior and then use the samples to compute statistics (such as the mean and variance) of the unknown parameters. Sampling methods for electrical impedance tomography have been studied by various authors in the inverse problems community. However, up to this point the focus has been on developing increasingly sophisticated implementations of the Gibbs sampler, whose samples are known to converge very slowly to the correct density for large-scale problems. In this paper, we implement a recently developed sampling method called randomize-then-optimize (RTO), which provides a nearly independent sample for each application of an appropriate numerical optimization algorithm. The sample density for RTO is not the posterior density, but RTO can be used as a very effective proposal within a Metropolis-Hastings algorithm to obtain samples from the posterior. Here our focus is on applying the method to synthetic examples from electrical impedance tomography, and we show that it is both computationally efficient and provides good results. We also compare the performance of RTO with that of the Metropolis-adjusted Langevin algorithm and find RTO to be much more efficient.

KW - Bayesian methods

KW - Electrical impedance tomography

KW - Inverse problems

KW - Markov chain Monte Carlo

KW - Numerical optimization

UR - http://www.scopus.com/inward/record.url?scp=84984861050&partnerID=8YFLogxK

U2 - 10.1137/140978272

DO - 10.1137/140978272

M3 - Article

AN - SCOPUS:84984861050

SN - 2166-2525

VL - 3

SP - 1136

EP - 1158

JO - SIAM/ASA Journal on Uncertainty Quantification

JF - SIAM/ASA Journal on Uncertainty Quantification

IS - 1

ER -