We study the problem of properly learning large margin halfspaces in the agnostic PAC model. In more detail, we study the complexity of properly learning d-dimensional halfspaces on the unit ball within misclassification error α · OPT_γ + ε, where OPT_γ is the optimal γ-margin error rate and α ≥ 1 is the approximation ratio. We give learning algorithms and computational hardness results for this problem, for all values of the approximation ratio α, that are nearly matching for a range of parameters. Specifically, for the natural setting that α is any constant bigger than one, we provide an essentially tight complexity characterization. On the positive side, we give an α = 1.01-approximate proper learner that uses Õ(1/(ε²γ²)) samples (which is optimal) and runs in time 2^{Õ(1/γ²)} · poly(d, 1/ε). On the negative side, we show that any constant factor approximate proper learner has runtime 2^{(1/γ)^{2−o(1)}}, assuming the Exponential Time Hypothesis.
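For concreteness, the learning guarantee above can be written out formally. The display below is a sketch using the standard definitions of γ-margin error and proper (halfspace) output; these definitions are assumed from the literature, as the abstract does not spell them out. Here D is a distribution over pairs (x, y) with x in the unit ball of R^d and y in {±1}:

```latex
% Standard definitions (assumed): D is a distribution over B_d x {-1,+1},
% where B_d denotes the unit ball in R^d.

% The optimal gamma-margin error rate, over all unit-norm halfspaces:
\[
  \mathrm{OPT}_\gamma \;=\; \min_{\|w\|_2 = 1}
    \;\Pr_{(x,y)\sim D}\bigl[\, y\,\langle w, x\rangle < \gamma \,\bigr].
\]

% An alpha-approximate proper learner outputs a halfspace
% h_w(x) = sign(<w,x>) whose misclassification error satisfies
\[
  \Pr_{(x,y)\sim D}\bigl[\, \mathrm{sign}(\langle w, x\rangle) \neq y \,\bigr]
    \;\le\; \alpha \cdot \mathrm{OPT}_\gamma \;+\; \epsilon .
\]
```

Properness means the learner must itself output a halfspace, rather than an arbitrary hypothesis; this is what makes the problem computationally harder than its improper counterpart.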