Open Access. Powered by Scholars. Published by Universities.®

Digital Commons Network


Mathematics

Selected Works of Jodi Mead

2013

Articles 1 - 2 of 2


χ2 Tests For The Choice Of The Regularization Parameter In Nonlinear Inverse Problems, J. L. Mead, C. C. Hammerquist Oct 2013


We address discrete nonlinear inverse problems with weighted least squares and Tikhonov regularization. Regularization is a way to add more information to the problem when it is ill-posed or ill-conditioned. However, it is still an open question as to how to weight this information. The discrepancy principle considers the residual norm to determine the regularization weight or parameter, while the χ2 method [J. Mead, J. Inverse Ill-Posed Probl., 16 (2008), pp. 175–194; J. Mead and R. A. Renaut, Inverse Problems, 25 (2009), 025002; J. Mead, Appl. Math. Comput., 219 (2013), pp. 5210–5223; R. A. Renaut, I. Hnetynkova, and J. L. …


Discontinuous Parameter Estimates With Least Squares Estimators, J. L. Mead Jan 2013


We discuss weighted least squares estimates of ill-conditioned linear inverse problems where weights are chosen to be inverse error covariance matrices. Least squares estimators are the maximum likelihood estimate for normally distributed data and parameters, but here we do not assume particular probability distributions. Weights for the estimator are found by ensuring its minimum follows a χ2 distribution. Previous work with this approach has shown that it is competitive with regularization methods such as the L-curve and Generalized Cross Validation (GCV) [20]. In this work we extend the method to find diagonal weighting matrices, rather than a scalar regularization parameter. …
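A minimal sketch of the weighted least squares estimator the abstract describes, with weights taken as inverse error covariance matrices. Under normally distributed data and parameters (zero prior mean), the minimum value of the weighted functional follows a χ2 distribution with m = len(b) degrees of freedom, which is the property used to tune the weights. This sketch covers only the basic estimator, not the paper's extension to estimated diagonal weighting matrices; the function name is illustrative:

```python
import numpy as np

def weighted_ls_estimate(A, b, Cb_inv, Cx_inv):
    """Regularized weighted least squares:
        min_x (A x - b)^T Cb_inv (A x - b) + x^T Cx_inv x
    Returns the minimizer x and the minimum functional value J.
    When Cb_inv and Cx_inv are the true inverse covariances of the
    data errors and parameters (zero prior mean), J is distributed
    as chi^2 with m = len(b) degrees of freedom."""
    x = np.linalg.solve(A.T @ Cb_inv @ A + Cx_inv, A.T @ Cb_inv @ b)
    r = A @ x - b
    J = float(r @ Cb_inv @ r + x @ Cx_inv @ x)
    return x, J
```

Checking that the sample mean of J over repeated draws is close to m is a quick sanity test of the χ2 property; replacing the scalar regularization parameter with a diagonal Cx_inv, as the abstract proposes, allows discontinuous parameters to be weighted differently from smooth ones.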