Abstract
We develop a method for optimization in shape spaces, i.e., sets of surfaces modulo re-parametrization. Unlike previously proposed gradient flows, we achieve superlinear convergence rates through an approximation of the shape Hessian, which is generally hard to compute and suffers from a series of degeneracies. Our analysis highlights the role of mean curvature motion in comparison with first-order schemes: instead of surface area, our approach penalizes deformation, either by its Dirichlet energy or total variation, and hence does not suffer from shrinkage. The latter regularizer sparks the development of an alternating direction method of multipliers on triangular meshes. Therein, a conjugate-gradient solver enables us to bypass formation of the Gaussian normal equations appearing in the course of the overall optimization. We combine all of these ideas in a versatile geometric variation-regularized Levenberg-Marquardt-type method applicable to a variety of shape functionals, depending on intrinsic properties of the surface such as normal field and curvature as well as its embedding into space. Promising experimental results are reported.