Abstract. Robust cost optimization is the challenging task of fitting a
large number of parameters to data points containing a significant and
unknown fraction of outliers. In this work we identify three classes of
deterministic second-order algorithms that are able to tackle this type of
optimization problem: direct approaches that aim to optimize the robust
cost directly with a second-order method; lifting-based approaches that
add so-called lifting variables to embed the given robust cost function
into a higher-dimensional space; and graduated optimization methods
that solve a sequence of smoothed cost functions. We study each of these
classes of algorithms and propose improvements either to reduce their
computational time or to make them find better local minima. Finally,
we experimentally demonstrate the superiority of our improved graduated
optimization method over state-of-the-art algorithms on both synthetic
and real data for four different problems.
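To make the third class concrete, the following is a minimal sketch of graduated optimization on a toy robust estimation problem. The choice of the Geman-McClure kernel, the annealing schedule, and the use of an IRLS inner solver (rather than a full second-order method) are illustrative assumptions, not details taken from this work:

```python
import numpy as np

rng = np.random.default_rng(0)
# 80 inliers around 5.0 plus 20 gross outliers near 50.0
data = np.concatenate([rng.normal(5.0, 0.5, 80), rng.normal(50.0, 1.0, 20)])

def irls_mean(data, sigma, x0, iters=50):
    """Minimize sum_i rho_sigma(data_i - x) for the Geman-McClure
    kernel rho_sigma(r) = r^2 / (sigma^2 + r^2) via IRLS."""
    x = x0
    for _ in range(iters):
        r = data - x
        # IRLS weight w(r) = rho'(r) / r (up to a constant factor)
        w = sigma**2 / (sigma**2 + r**2)**2
        x = np.sum(w * data) / np.sum(w)
    return x

# Graduated optimization: start from a heavily smoothed (near-quadratic)
# cost and anneal sigma downward, warm-starting each stage from the last.
x = np.mean(data)  # least-squares initialization, biased by the outliers
for sigma in [64.0, 16.0, 4.0, 1.0]:
    x = irls_mean(data, sigma, x)
print(x)  # close to the inlier mean of 5.0 despite 20% outliers
```

With a single small `sigma` the cost is highly non-convex and the least-squares initialization (around 14 here) can get stuck; the annealing schedule guides the iterate toward the inlier mode before the outliers are fully down-weighted.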