Lars Ruthotto

Recent Talks

Optimal Control Methods for Training Deep Neural Networks with E Haber, B Chang, L Meng, E Holtham, et al.

We present a new mathematical framework that simplifies designing, training, and analyzing deep neural networks. It is based on the interpretation of deep learning as a dynamic optimal control problem, similar to path-planning problems. We show how this interpretation helps in designing, analyzing, and training deep neural networks. First, we focus on ways to ensure the stability of the dynamics in both the continuous and discrete setting, and on ways to exploit the discretization to obtain adaptive neural networks. Second, we present new multilevel and multiscale approaches derived from the continuous formulation. Finally, we discuss adaptive higher-order discretization methods and illustrate their impact on the optimization problem.
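The dynamical-systems view sketched above interprets a residual network as a discretization of an ODE, where stability can be encouraged by constraining the layer transformation. The following sketch (my own illustration, not the authors' code; the function name and parameters are hypothetical) shows a residual block as one forward-Euler step, using an antisymmetric matrix so the Jacobian's eigenvalues lie on the imaginary axis, which avoids exponential blow-up or decay of the features:

```python
# Sketch: a ResNet forward pass viewed as forward-Euler integration of
#   dY/dt = tanh(K Y + b),  with step size h.
# Assumption (not from the talk text): K = W - W^T is made antisymmetric,
# a common recipe for stable forward propagation in this line of work.
import numpy as np

def euler_resnet_forward(Y, weights, biases, h=0.1):
    """Propagate features Y (n_features x n_examples) through all layers."""
    for W, b in zip(weights, biases):
        K = W - W.T                      # antisymmetric transformation
        Y = Y + h * np.tanh(K @ Y + b)   # one forward-Euler residual step
    return Y

rng = np.random.default_rng(0)
n, depth = 4, 50
weights = [rng.standard_normal((n, n)) for _ in range(depth)]
biases = [rng.standard_normal((n, 1)) for _ in range(depth)]
Y0 = rng.standard_normal((n, 3))
Y = euler_resnet_forward(Y0, weights, biases)
# The feature norm stays moderate even over many layers
print(np.linalg.norm(Y) / np.linalg.norm(Y0))
```

Shrinking the step size h (while adding layers) refines the time discretization, which is one way the continuous viewpoint suggests adaptive architectures.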

See also: Handout Version, Beamer Version, Videos, Ref1, Ref2, Ref3

A Multigrid Preconditioner for Hyperelastic Image Registration with J Modersitzki, C Greif.

Image registration is a central problem in a variety of areas involving imaging techniques, and is known to be challenging and ill-posed. Regularization functionals based on hyperelasticity provide a powerful mechanism for limiting the ill-posedness. A key feature of hyperelastic image registration approaches is their ability to model large deformations while guaranteeing their invertibility. In this talk, we focus on the computational challenges that arise in approximately solving the Hessian system. We show that the Hessian is a discretization of a strongly coupled system of partial differential equations whose coefficients can be severely inhomogeneous. Motivated by a local Fourier analysis, we stabilize the system by thresholding the coefficients.

See also: Handout Version, Ref.