Open Access. Powered by Scholars. Published by Universities.®

Digital Commons Network


Mathematics



Regularization


Articles 1 - 4 of 4

Full-Text Articles in Entire DC Network

χ2 Tests For The Choice Of The Regularization Parameter In Nonlinear Inverse Problems, J. L. Mead, C. C. Hammerquist, Oct 2013

Jodi Mead

We address discrete nonlinear inverse problems with weighted least squares and Tikhonov regularization. Regularization is a way to add more information to the problem when it is ill-posed or ill-conditioned. However, it is still an open question as to how to weight this information. The discrepancy principle considers the residual norm to determine the regularization weight or parameter, while the χ2 method [J. Mead, J. Inverse Ill-Posed Probl., 16 (2008), pp. 175–194; J. Mead and R. A. Renaut, Inverse Problems, 25 (2009), 025002; J. Mead, Appl. Math. Comput., 219 (2013), pp. 5210–5223; R. A. Renaut, I. Hnetynkova, and J. L. …
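The χ2 method itself is developed in the papers cited above; as a point of contrast, the discrepancy principle mentioned in the abstract can be sketched on a small synthetic problem. The function names, grid, and noise level below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Minimize ||Ax - b||^2 + lam * ||x||^2 (standard-form Tikhonov)."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

def discrepancy_lambda(A, b, delta, lams):
    """Discrepancy principle: return the largest lam on the grid whose
    residual norm still stays below the assumed noise level delta."""
    for lam in sorted(lams, reverse=True):
        x = tikhonov_solve(A, b, lam)
        if np.linalg.norm(A @ x - b) <= delta:
            return lam, x
    lam = min(lams)
    return lam, tikhonov_solve(A, b, lam)

# Small synthetic least squares problem with known noise level.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.ones(5)
noise = 0.1 * rng.standard_normal(20)
b = A @ x_true + noise
delta = np.linalg.norm(noise)      # discrepancy target

lam, x = discrepancy_lambda(A, b, delta, np.logspace(-10, 2, 60))
```

The χ2 method replaces this residual-norm target with a statistical one: the value of the regularized functional at its minimum is matched against a χ2 distribution.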


Discontinuous Parameter Estimates With Least Squares Estimators, J. L. Mead, Jan 2013

Jodi Mead

We discuss weighted least squares estimates of ill-conditioned linear inverse problems where weights are chosen to be inverse error covariance matrices. Least squares estimators are the maximum likelihood estimate for normally distributed data and parameters, but here we do not assume particular probability distributions. Weights for the estimator are found by ensuring its minimum follows a χ2 distribution. Previous work with this approach has shown that it is competitive with regularization methods such as the L-curve and Generalized Cross Validation (GCV) [20]. In this work we extend the method to find diagonal weighting matrices, rather than a scalar regularization parameter. …
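The statistical idea underlying the weight selection — that a correctly weighted least squares functional has a minimum value following a χ2 distribution with as many degrees of freedom as data points — can be checked by simulation. A loose sketch assuming Gaussian data and parameter errors with known covariances (the setup is illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 20, 5
A = rng.standard_normal((m, n))
sigma = 0.5                # data noise standard deviation
C = np.eye(n)              # parameter (prior) covariance, identity here
x0 = np.zeros(n)           # prior parameter mean

def functional_min(b):
    """Minimum value of (Ax-b)' S^-1 (Ax-b) + (x-x0)' C^-1 (x-x0),
    with S = sigma^2 I the data error covariance."""
    Sinv = np.eye(m) / sigma**2
    Cinv = np.linalg.inv(C)
    H = A.T @ Sinv @ A + Cinv
    x_hat = np.linalg.solve(H, A.T @ Sinv @ b + Cinv @ x0)
    r = A @ x_hat - b
    return r @ Sinv @ r + (x_hat - x0) @ Cinv @ (x_hat - x0)

# Draw parameters x ~ N(x0, C) and data b = Ax + noise from the model;
# the functional minima then behave like chi^2 samples with m degrees
# of freedom, so their sample mean should be close to m.
vals = [functional_min(A @ (x0 + rng.standard_normal(n)) +
                       sigma * rng.standard_normal(m))
        for _ in range(2000)]
mean_J = float(np.mean(vals))
```

When the assumed covariances are wrong, the sample mean drifts away from m, which is what the method exploits to calibrate the weights.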


A Particle Method And Numerical Study Of A Quasilinear Partial Differential Equation, R. Camassa, P. H. Chiu, Long Lee, T. W. H. Sheu, Jun 2012

Long Lee

We present a particle method for studying a quasilinear partial differential equation (PDE) in a class proposed for the regularization of the Hopf (inviscid Burgers) equation via nonlinear dispersion-like terms. These are obtained in an advection equation by coupling the advecting field to the advected one through a Helmholtz operator. Solutions of this PDE are "regularized" in the sense that the additional terms generated by the coupling prevent solution multivaluedness from occurring. We propose a particle algorithm to solve the quasilinear PDE. "Particles" in this algorithm travel along characteristic curves of the equation, and their positions and momenta determine the …
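A heavily simplified sketch of the generic particle idea (not the scheme of the paper): particles carry momenta, the smooth advecting velocity comes from convolving those momenta with the Green's function of the 1D Helmholtz operator 1 - α²∂ₓₓ, and positions evolve along characteristics. The parameter values and the frozen-momentum update below are illustrative assumptions:

```python
import numpy as np

alpha = 0.5   # Helmholtz length scale (illustrative)

def green(x):
    """Green's function of (1 - alpha^2 d^2/dx^2) on the real line."""
    return np.exp(-np.abs(x) / alpha) / (2.0 * alpha)

def velocity(x_eval, x_p, p):
    """Smooth advecting field v(x) = sum_i p_i * G(x - x_i)."""
    return green(x_eval[:, None] - x_p[None, :]) @ p

def step(x_p, p, dt):
    """Move particles along characteristics dx/dt = v(x) by forward
    Euler; momenta are frozen here, a simplification of the dynamics."""
    return x_p + dt * velocity(x_p, x_p, p), p

# Particles sampling a smooth positive momentum bump.
x_p = np.linspace(-2.0, 2.0, 40)
p = np.exp(-x_p**2) * (x_p[1] - x_p[0])

for _ in range(100):
    x_p, p = step(x_p, p, dt=0.01)
```

Because the velocity is a smoothed version of the momentum field, nearby characteristics do not develop the multivaluedness that the unregularized Hopf equation produces.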


Least Squares Problems With Inequality Constraints As Quadratic Constraints, Jodi Mead, Rosemary A. Renaut, Apr 2010

Jodi Mead

Linear least squares problems with box constraints are commonly solved to find model parameters within bounds based on physical considerations. Common algorithms include Bounded Variable Least Squares (BVLS) and the MATLAB function lsqlin. Here, the goal is to find solutions to ill-posed inverse problems that lie within box constraints. To do this, we formulate the box constraints as quadratic constraints, and solve the corresponding unconstrained regularized least squares problem. Using box constraints as quadratic constraints is an efficient approach because the optimization problem has a closed-form solution.

The effectiveness of the proposed algorithm is investigated through solving three …
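A minimal sketch of the recasting described above, with a fixed penalty weight lam rather than one chosen to activate the constraint (names and values are illustrative): the box l ≤ x ≤ u is centered at c = (l + u)/2, scaled into a quadratic term by its half-widths, and the resulting regularized least squares problem is solved in closed form.

```python
import numpy as np

def box_as_quadratic(A, b, l, u, lam):
    """Solve min ||Ax - b||^2 + lam * ||D (x - c)||^2, where the box
    l <= x <= u is recast via its center c and half-width scaling D."""
    c = 0.5 * (l + u)
    D2 = np.diag((2.0 / (u - l)) ** 2)   # D'D for diagonal D
    return np.linalg.solve(A.T @ A + lam * D2, A.T @ b + lam * D2 @ c)

# Synthetic problem whose true parameters lie inside the unit box.
rng = np.random.default_rng(2)
A = rng.standard_normal((30, 5))
b = A @ np.full(5, 0.4) + 0.01 * rng.standard_normal(30)
l, u = np.zeros(5), np.ones(5)

x = box_as_quadratic(A, b, l, u, lam=1.0)
```

The key efficiency point of the abstract: unlike BVLS-style active-set iterations, this update is a single linear solve once the weight is fixed.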