We give the first polynomial-time algorithm for robust regression in the list-decodable setting where an adversary can corrupt a greater than 1/2 fraction of examples. For any α < 1, our algorithm takes as input a sample of n linear equations where αn of the equations satisfy y_i = ⟨ℓ*, x_i⟩ + ζ for some small noise ζ and (1−α)n of the equations are arbitrarily chosen. It outputs a list L of size O(1/α), a fixed constant, that contains an ℓ that is close to ℓ*. Our algorithm succeeds whenever the inliers are chosen from a certifiably anti-concentrated distribution D. As a corollary of our algorithmic result, we obtain a d^{poly(1/α)}-time algorithm to find an O(1/α)-size list when the inlier distribution is the standard Gaussian. For discrete product distributions that are anti-concentrated only in regular directions, we give an algorithm that achieves a similar guarantee under the promise that ℓ* has all coordinates of the same magnitude. To complement our result, we prove that the anti-concentration assumption on the inliers is information-theoretically necessary. To solve the problem, we introduce a new framework for list-decodable learning that strengthens the "identifiability to algorithms" paradigm based on the sum-of-squares method.
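To make the input model concrete, the following is a minimal sketch (not the paper's algorithm) of how a list-decodable regression instance is generated: αn inlier equations follow the linear model with small noise ζ, and the adversary fills in the remaining (1−α)n responses arbitrarily. All parameter values here (d, n, α, the noise level) are illustrative choices, not ones taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: dimension, sample size, inlier fraction, noise scale.
d, n, alpha, eta = 5, 1000, 0.2, 0.01

# Unknown unit-norm regressor ell* that the algorithm must (approximately) recover.
ell_star = rng.standard_normal(d)
ell_star /= np.linalg.norm(ell_star)

m = int(alpha * n)                      # number of inlier equations
X = rng.standard_normal((n, d))         # inlier x_i ~ N(0, I_d), an anti-concentrated distribution
y = np.empty(n)
y[:m] = X[:m] @ ell_star + eta * rng.standard_normal(m)  # y_i = <ell*, x_i> + zeta
y[m:] = rng.uniform(-10.0, 10.0, n - m)  # the (1-alpha)n outlier responses are arbitrary

# A list-decoding algorithm must output a list L of size O(1/alpha) such that
# some ell in L is close to ell_star. As a sanity check on the instance itself,
# ell_star fits the inlier equations up to the noise level eta.
inlier_err = np.sqrt(np.mean((X[:m] @ ell_star - y[:m]) ** 2))
assert inlier_err < 5 * eta
```

Note that since more than half the equations can be corrupted, outputting a single candidate ℓ is information-theoretically impossible; the adversary can plant several consistent linear models, which is exactly why the output is a short list rather than one vector.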