TY - JOUR

T1 - An efficient computational method for total variation-penalized poisson likelihood estimation

AU - Bardsley, Johnathan M.

N1 - Funding Information:
This work was supported by the NSF under grant DMS-0504325 and was done during the author’s visit to the University of Helsinki, Finland in 2006-07 under the University of Montana Faculty Exchange Program. The author would like to thank the referees for their helpful comments. The paper is better because of their efforts. The author would also like to acknowledge the support of the University of Montana International Exchange Program and of the Department of Mathematics and Statistics at the University of Helsinki.
Publisher Copyright:
© 2008 American Institute of Mathematical Sciences.

PY - 2008

Y1 - 2008

N2 - Approximating non-Gaussian noise processes with Gaussian models is standard in data analysis. This is due in large part to the fact that Gaussian models yield parameter estimation problems of least squares form, which have been extensively studied from both the theoretical and computational points of view. In image processing applications, for example, data is often collected by a CCD camera, in which case the noise is a Gaussian/Poisson mixture with the Poisson noise dominating for a sufficiently strong signal. Even so, the standard approach in such cases is to use a Gaussian approximation that leads to a negative-log likelihood function of weighted least squares type. In the Bayesian point of view taken in this paper, a negative-log prior (or regularization) function is added to the negative-log likelihood function, and the resulting function is minimized. We focus on the case where the negative-log prior is the well-known total variation function and give a statistical interpretation. Regardless of whether the least squares or Poisson negative-log likelihood is used, the total variation term yields a minimization problem that is computationally challenging. The primary result of this work is the efficient computational method that is presented for the solution of such problems, together with its convergence analysis. With the computational method in hand, we then perform experiments indicating that the Poisson negative-log likelihood yields a more computationally efficient method than does the use of the least squares function. We also present results indicating that this may even be the case when the data noise is i.i.d. Gaussian, suggesting that regardless of noise statistics, using the Poisson negative-log likelihood can yield a more computationally tractable problem when total variation regularization is used.

AB - Approximating non-Gaussian noise processes with Gaussian models is standard in data analysis. This is due in large part to the fact that Gaussian models yield parameter estimation problems of least squares form, which have been extensively studied from both the theoretical and computational points of view. In image processing applications, for example, data is often collected by a CCD camera, in which case the noise is a Gaussian/Poisson mixture with the Poisson noise dominating for a sufficiently strong signal. Even so, the standard approach in such cases is to use a Gaussian approximation that leads to a negative-log likelihood function of weighted least squares type. In the Bayesian point of view taken in this paper, a negative-log prior (or regularization) function is added to the negative-log likelihood function, and the resulting function is minimized. We focus on the case where the negative-log prior is the well-known total variation function and give a statistical interpretation. Regardless of whether the least squares or Poisson negative-log likelihood is used, the total variation term yields a minimization problem that is computationally challenging. The primary result of this work is the efficient computational method that is presented for the solution of such problems, together with its convergence analysis. With the computational method in hand, we then perform experiments indicating that the Poisson negative-log likelihood yields a more computationally efficient method than does the use of the least squares function. We also present results indicating that this may even be the case when the data noise is i.i.d. Gaussian, suggesting that regardless of noise statistics, using the Poisson negative-log likelihood can yield a more computationally tractable problem when total variation regularization is used.

KW - Bayesian statistical methods

KW - Image reconstruction

KW - Nonnegatively constrained optimization

KW - Total variation

UR - http://www.scopus.com/inward/record.url?scp=70350288449&partnerID=8YFLogxK

U2 - 10.3934/ipi.2008.2.167

DO - 10.3934/ipi.2008.2.167

M3 - Article

AN - SCOPUS:70350288449

SN - 1930-8337

VL - 2

SP - 167

EP - 185

JO - Inverse Problems and Imaging

JF - Inverse Problems and Imaging

IS - 2

ER -