MCMC algorithms for computational UQ of nonnegativity constrained linear inverse problems

Johnathan M. Bardsley, Per Christian Hansen

Research output: Contribution to journal › Article › peer-review



In many inverse problems, a nonnegativity constraint is natural. Moreover, in some cases, we expect the vector of unknown parameters to have zero components. When a Bayesian approach is taken, this motivates a desire for prior probability density (and hence posterior probability density) functions that have positive mass at the boundary of the set {x ∈ ℝ^N | x ≥ 0}. Unfortunately, it is difficult to define a prior with this property that yields computationally tractable inference for large-scale inverse problems. In this paper, we use nonnegativity constrained optimization to define such prior and posterior density functions when the measurement error is either Gaussian or Poisson distributed. The numerical optimization methods we use are highly efficient, and hence our approach is computationally tractable even in large-scale cases. We embed our nonnegativity constrained optimization approach within a hierarchical framework, obtaining Gibbs samplers for both Gaussian and Poisson distributed measurement cases. Finally, we test the resulting Markov chain Monte Carlo methods on examples from both image deblurring and positron emission tomography.
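To make the idea of "sampling via nonnegativity constrained optimization" concrete, the following is a minimal, hypothetical sketch for the Gaussian-noise case: approximate posterior samples are drawn by repeatedly solving perturbed nonnegative least-squares problems (a randomize-then-optimize-style heuristic). The test problem, the fixed regularization weight `lam`, and all variable names are illustrative assumptions, not the paper's exact algorithm, which embeds this step in a hierarchical Gibbs sampler.

```python
import numpy as np
from scipy.optimize import nnls  # active-set nonnegative least squares

rng = np.random.default_rng(0)

# Hypothetical small deblurring-style test problem (illustration only):
# Gaussian blur matrix A, sparse nonnegative truth with exact zeros.
n = 50
idx = np.arange(n)
A = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 3.0) ** 2)
A /= A.sum(axis=1, keepdims=True)
x_true = np.zeros(n)
x_true[15:25] = 1.0
sigma = 0.01
b = A @ x_true + sigma * rng.normal(size=n)

lam = 1.0  # assumed fixed hyperparameter; the paper samples such weights hierarchically
samples = []
for _ in range(100):
    # Perturb both the data-fit and the prior term, then solve a
    # nonnegativity constrained least-squares problem for one sample.
    b_pert = b + sigma * rng.normal(size=n)
    v = rng.normal(size=n) / np.sqrt(lam)
    A_aug = np.vstack([A / sigma, np.sqrt(lam) * np.eye(n)])
    b_aug = np.concatenate([b_pert / sigma, np.sqrt(lam) * v])
    x_s, _ = nnls(A_aug, b_aug)  # each sample is >= 0, often with exact zeros
    samples.append(x_s)

samples = np.array(samples)
x_mean = samples.mean(axis=0)  # posterior-mean estimate of the unknown image
```

Because each sample solves a constrained optimization problem, components pinned at the boundary come out exactly zero, which is precisely the positive-mass-at-the-boundary behavior the abstract describes.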

Original language: English
Pages (from-to): A1269-A1288
Journal: SIAM Journal on Scientific Computing
Issue number: 2
State: Published - 2020


  • Bayesian methods
  • Inverse problems
  • Markov chain Monte Carlo
  • Nonnegativity constraints
  • Uncertainty quantification


