MATH Seminar

Title: Efficient Solvers for Nonlinear Problems in Imaging
Defense: Dissertation
Speaker: James L. Herring of Emory University
Contact: James Herring, james.lincoln.herring@emory.edu
Date: 2018-05-16 at 3:00PM
Venue: MSC W301
Abstract:
Nonlinear inverse problems arise in numerous imaging applications, and solving them is often difficult due to ill-posedness and high computational cost. In this work, we introduce tailored solvers for several nonlinear inverse problems in imaging within a Gauss-Newton optimization framework.

We develop a linearize and project (LAP) method for a class of nonlinear problems with two (or more) sets of coupled variables. At each iteration of the Gauss-Newton optimization, LAP linearizes the residual around the current iterate, eliminates one block of variables via a projection, and solves the resulting reduced-dimensional problem for the Gauss-Newton step. The method is best suited for problems where the subproblem associated with one set of variables is comparatively well-posed or easy to solve. LAP supports iterative, direct, and hybrid regularization, as well as element-wise bound constraints on all blocks of variables, which offers various options for incorporating prior knowledge of a desired solution. We demonstrate the advantages of these characteristics with several numerical experiments. We test LAP on two- and three-dimensional problems in super-resolution and MRI motion correction, two separable nonlinear least-squares problems that are linear in one block of variables and nonlinear in the other. We also use LAP for image registration subject to local rigidity constraints, a problem that is nonlinear in all sets of variables. Together, these two classes of problems demonstrate the utility and flexibility of the LAP method.

We also implement an efficient Gauss-Newton optimization scheme for the problem of phase recovery in bispectral imaging, a univariate nonlinear inverse problem. Using a fixed approximate Hessian, matrix reordering, and stored matrix factors, we accelerate the Gauss-Newton step solve, resulting in a second-order optimization method that outperforms first-order methods in terms of cost per iteration and solution quality.
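For concreteness, here is a minimal sketch of one LAP step on a coupled least-squares problem $\min_{y,w} \tfrac12\|r(y,w)\|_2^2$; the notation ($r_k$ for the residual at the current iterate, $J_y$ and $J_w$ for its Jacobian blocks, $\dagger$ for the Moore-Penrose pseudoinverse) is assumed here for illustration and is not taken from the talk. Linearizing about $(y_k, w_k)$,
\[
r(y_k+\delta y,\, w_k+\delta w) \;\approx\; r_k + J_y\,\delta y + J_w\,\delta w,
\]
one eliminates the better-behaved block $\delta y$ for fixed $\delta w$ via its least-squares solution and projects, leaving a reduced problem in $\delta w$ alone:
\[
\delta y(\delta w) \;=\; -\,J_y^{\dagger}\bigl(r_k + J_w\,\delta w\bigr),
\qquad
\min_{\delta w}\; \tfrac12\,\bigl\|\bigl(I - J_y J_y^{\dagger}\bigr)\bigl(r_k + J_w\,\delta w\bigr)\bigr\|_2^2,
\]
after which $\delta y$ is recovered by back-substitution. Regularization and the bound constraints mentioned above are omitted from this sketch.

A similarly hedged sketch of the fixed-Hessian scheme for bispectral phase recovery, assuming a least-squares objective $f(\phi) = \tfrac12\|r(\phi)\|_2^2$ in the phase $\phi$ (the use of a Cholesky factorization is an assumption for illustration): with a fixed approximation $\tilde H \approx \tilde J^{\top}\tilde J$ formed and factored once, after a fill-reducing reordering,
\[
\tilde H \;=\; R^{\top}R,
\qquad
\tilde H\,\delta\phi_k \;=\; -\,\nabla f(\phi_k),
\qquad
\phi_{k+1} \;=\; \phi_k + \alpha_k\,\delta\phi_k,
\]
each Gauss-Newton step then costs one gradient evaluation plus two triangular solves with the stored factor $R$, which is how reusing factors can make a second-order method cheap per iteration.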
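The step length $\alpha_k$ above would typically come from a standard line search (e.g., Armijo backtracking); that choice is likewise an assumption of this sketch rather than a detail from the abstract.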
