TY - JOUR
T1 - Efficient marginalization-based MCMC methods for hierarchical Bayesian inverse problems
AU - Saibaba, Arvind K.
AU - Bardsley, Johnathan
AU - Brown, D. Andrew
AU - Alexanderian, Alen
N1 - Publisher Copyright:
© 2019 Society for Industrial and Applied Mathematics and American Statistical Association.
PY - 2019
Y1 - 2019
N2 - Hierarchical models in Bayesian inverse problems are characterized by an assumed prior probability distribution for the unknown state and measurement error precision, and hyper-priors for the prior parameters. Combining these probability models using Bayes' law often yields a posterior distribution that cannot be sampled from directly, even for a linear model with Gaussian measurement error and Gaussian prior, both of which we assume in this paper. In such cases, Gibbs sampling can be used to sample from the posterior [Bardsley, SIAM J. Sci. Comput., 34 (2012), pp. A1316-A1332], but problems arise when the dimension of the state is large. This is because the Gaussian sample required for each iteration can be prohibitively expensive to compute, and because the statistical efficiency of the Markov chain degrades as the dimension of the state increases. The latter problem can be mitigated using marginalization-based techniques, such as those found in [Fox and Norton, SIAM/ASA J. Uncertain. Quantif., 4 (2016), pp. 1191-1218; Joyce, Bardsley, and Luttman, SIAM J. Sci. Comput., 40 (2018), pp. B766-B787; Rue and Held, Monogr. Statist. Appl. Probab. 104, Chapman & Hall/CRC, Boca Raton, FL, 2005], but these can be computationally prohibitive as well. In this paper, we combine the low-rank techniques of [Brown, Saibaba, and Vallelian, SIAM/ASA J. Uncertain. Quantif., 6 (2018), pp. 1076-1100] with the marginalization approach of [Rue and Held, Monogr. Statist. Appl. Probab. 104, Chapman & Hall/CRC, Boca Raton, FL, 2005]. We consider two variants of this approach: delayed acceptance and pseudomarginalization. We provide a detailed analysis of the acceptance rates and computational costs associated with our proposed algorithms and compare their performances on two numerical test cases: image deblurring and the inverse heat equation.
AB - Hierarchical models in Bayesian inverse problems are characterized by an assumed prior probability distribution for the unknown state and measurement error precision, and hyper-priors for the prior parameters. Combining these probability models using Bayes' law often yields a posterior distribution that cannot be sampled from directly, even for a linear model with Gaussian measurement error and Gaussian prior, both of which we assume in this paper. In such cases, Gibbs sampling can be used to sample from the posterior [Bardsley, SIAM J. Sci. Comput., 34 (2012), pp. A1316-A1332], but problems arise when the dimension of the state is large. This is because the Gaussian sample required for each iteration can be prohibitively expensive to compute, and because the statistical efficiency of the Markov chain degrades as the dimension of the state increases. The latter problem can be mitigated using marginalization-based techniques, such as those found in [Fox and Norton, SIAM/ASA J. Uncertain. Quantif., 4 (2016), pp. 1191-1218; Joyce, Bardsley, and Luttman, SIAM J. Sci. Comput., 40 (2018), pp. B766-B787; Rue and Held, Monogr. Statist. Appl. Probab. 104, Chapman & Hall/CRC, Boca Raton, FL, 2005], but these can be computationally prohibitive as well. In this paper, we combine the low-rank techniques of [Brown, Saibaba, and Vallelian, SIAM/ASA J. Uncertain. Quantif., 6 (2018), pp. 1076-1100] with the marginalization approach of [Rue and Held, Monogr. Statist. Appl. Probab. 104, Chapman & Hall/CRC, Boca Raton, FL, 2005]. We consider two variants of this approach: delayed acceptance and pseudomarginalization. We provide a detailed analysis of the acceptance rates and computational costs associated with our proposed algorithms and compare their performances on two numerical test cases: image deblurring and the inverse heat equation.
KW - Hierarchical Bayesian approach
KW - Inverse problems
KW - Low-rank approximations
KW - Markov chain Monte Carlo
KW - One-block algorithm
UR - http://www.scopus.com/inward/record.url?scp=85078703430&partnerID=8YFLogxK
U2 - 10.1137/18M1220625
DO - 10.1137/18M1220625
M3 - Article
AN - SCOPUS:85078703430
SN - 2166-2525
VL - 7
SP - 1105
EP - 1131
JO - SIAM/ASA Journal on Uncertainty Quantification
JF - SIAM/ASA Journal on Uncertainty Quantification
IS - 3
ER -