Abstract
Bayesian inference provides a powerful framework to optimally integrate statistically learned prior knowledge into numerous computer vision algorithms. While the Bayesian approach has been successfully applied in the Markov random field literature, the resulting combinatorial optimization problems have commonly been treated with rather inefficient and inexact general-purpose optimization methods such as Simulated Annealing. An efficient method to compute the global optima of certain classes of cost functions defined on binary-valued variables is given by graph min-cuts. In this paper, we propose to reconsider the problem of statistical learning for Bayesian inference in the context of efficient optimization schemes. Specifically, we address the question: Which prior information may be learned while retaining the ability to apply Graph Cut optimization? We provide a framework to learn and impose prior knowledge on the distribution of pairs and triplets of labels. As an illustration, we demonstrate that one can optimally restore binary textures from very noisy images with runtimes on the order of a second while imposing hundreds of statistically learned constraints per pixel.
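The core construction referred to above, restoring a binary image by minimizing a pairwise cost function via a graph min-cut, can be sketched as follows. This is a minimal illustration using a generic Ising smoothness prior rather than the paper's statistically learned texture priors; the function name `denoise_binary` and the smoothness weight `lam` are hypothetical, and networkx's general-purpose max-flow routine stands in for the specialized graph-cut solvers used in practice.

```python
import networkx as nx
import numpy as np

def denoise_binary(noisy, lam=2.0):
    """Restore a binary image by solving a min-cut on an s-t graph.

    Pixels ending on the source side of the cut receive label 1,
    those on the sink side label 0. Unary terms penalize disagreement
    with the noisy observation; pairwise Ising terms of weight lam
    penalize label differences between 4-connected neighbors. Such
    submodular pairwise costs are exactly the graph-representable
    class for which the min-cut yields the global optimum.
    """
    h, w = noisy.shape
    G = nx.DiGraph()
    s, t = 's', 't'
    for y in range(h):
        for x in range(w):
            p = (y, x)
            # Unary term: unit cost for flipping the observed label.
            if noisy[y, x] == 1:
                G.add_edge(s, p, capacity=1.0)  # cut iff p is labeled 0
            else:
                G.add_edge(p, t, capacity=1.0)  # cut iff p is labeled 1
            # Pairwise terms: cost lam whenever neighbors disagree.
            for q in ((y + 1, x), (y, x + 1)):
                if q[0] < h and q[1] < w:
                    G.add_edge(p, q, capacity=lam)
                    G.add_edge(q, p, capacity=lam)
    # Global optimum of the binary cost function via max-flow/min-cut.
    _, (source_side, _) = nx.minimum_cut(G, s, t)
    out = np.zeros_like(noisy)
    for node in source_side:
        if node != s:
            out[node] = 1
    return out
```

For example, a 3x3 image of ones with a single flipped center pixel is restored to all ones: flipping the center back costs one unary unit, whereas keeping it would cut four pairwise edges of weight `lam` each.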