TY - JOUR
T1 - MCMC algorithms for computational UQ of nonnegativity constrained linear inverse problems
AU - Bardsley, Johnathan M.
AU - Hansen, Per Christian
N1 - Publisher Copyright:
© 2020 Society for Industrial and Applied Mathematics Publications. All rights reserved.
PY - 2020
Y1 - 2020
N2 - In many inverse problems, a nonnegativity constraint is natural. Moreover, in some cases, we expect the vector of unknown parameters to have zero components. When a Bayesian approach is taken, this motivates a desire for prior probability density (and hence posterior probability density) functions that have positive mass at the boundary of the set {x ∈ R^N | x ≥ 0}. Unfortunately, it is difficult to define a prior with this property that yields computationally tractable inference for large-scale inverse problems. In this paper, we use nonnegativity constrained optimization to define such prior and posterior density functions when the measurement error is either Gaussian or Poisson distributed. The numerical optimization methods we use are highly efficient, and hence our approach is computationally tractable even in large-scale cases. We embed our nonnegativity constrained optimization approach within a hierarchical framework, obtaining Gibbs samplers for both Gaussian and Poisson distributed measurement cases. Finally, we test the resulting Markov chain Monte Carlo methods on examples from both image deblurring and positron emission tomography.
AB - In many inverse problems, a nonnegativity constraint is natural. Moreover, in some cases, we expect the vector of unknown parameters to have zero components. When a Bayesian approach is taken, this motivates a desire for prior probability density (and hence posterior probability density) functions that have positive mass at the boundary of the set {x ∈ R^N | x ≥ 0}. Unfortunately, it is difficult to define a prior with this property that yields computationally tractable inference for large-scale inverse problems. In this paper, we use nonnegativity constrained optimization to define such prior and posterior density functions when the measurement error is either Gaussian or Poisson distributed. The numerical optimization methods we use are highly efficient, and hence our approach is computationally tractable even in large-scale cases. We embed our nonnegativity constrained optimization approach within a hierarchical framework, obtaining Gibbs samplers for both Gaussian and Poisson distributed measurement cases. Finally, we test the resulting Markov chain Monte Carlo methods on examples from both image deblurring and positron emission tomography.
KW - Bayesian methods
KW - Inverse problems
KW - Markov chain Monte Carlo
KW - Nonnegativity constraints
KW - Uncertainty quantification
UR - http://www.scopus.com/inward/record.url?scp=85084480726&partnerID=8YFLogxK
U2 - 10.1137/18M1234588
DO - 10.1137/18M1234588
M3 - Article
AN - SCOPUS:85084480726
SN - 1064-8275
VL - 42
SP - A1269
EP - A1288
JO - SIAM Journal on Scientific Computing
JF - SIAM Journal on Scientific Computing
IS - 2
ER -