Q. Huang, Y. Chen, and L. Guibas. Scalable Semidefinite Relaxation for Maximum A Posterior Estimation. International Conference on Machine Learning (ICML), 2014.

Abstract:

Maximum a posteriori (MAP) inference over discrete Markov random fields is a fundamental task spanning a wide spectrum of real-world applications, and is known to be NP-hard for general graphs. In this paper, we propose a novel semidefinite relaxation formulation (referred to as SDR) to estimate the MAP assignment. Algorithmically, we develop an accelerated variant of the alternating direction method of multipliers (referred to as SDPAD-LR) that can effectively exploit the special structure of the new relaxation. Encouragingly, the proposed procedure allows solving SDR for large-scale problems, e.g., problems on a grid graph comprising hundreds of thousands of variables with multiple states per node. Compared with prior SDP solvers, SDPAD-LR is capable of attaining comparable accuracy while exhibiting remarkably improved scalability, in contrast to the commonly held belief that semidefinite relaxation can only be applied to small-scale MRF problems. We have evaluated the performance of SDR on various benchmark datasets including OPENGM2 and PIC in terms of both the quality of the solutions and computation time. Experimental results demonstrate that for a broad class of problems, SDPAD-LR outperforms state-of-the-art algorithms in producing better MAP assignments in an efficient manner.
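For context, the classical semidefinite relaxation of MAP inference for a binary pairwise MRF can be written as follows; this is a simplified, generic illustration and not necessarily the exact SDR formulation proposed in the paper:

\[
\begin{aligned}
\max_{X \in \mathbb{S}^{n}} \;\; & \langle C, X \rangle \\
\text{s.t.} \;\; & \operatorname{diag}(X) = \mathbf{1}, \qquad X \succeq 0,
\end{aligned}
\]

where \(C\) collects the (negated) pairwise potentials of the quadratic objective \(x^{\top} C x\) over labelings \(x \in \{-1,+1\}^{n}\). The relaxation replaces the rank-one matrix \(x x^{\top}\) with a general positive semidefinite matrix \(X\), and an integral assignment is then recovered by rounding. The "LR" in SDPAD-LR presumably refers to a low-rank factorization of \(X\) inside the ADMM iterations, which is what keeps the per-iteration cost manageable on graphs with hundreds of thousands of variables.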

Bibtex:

@inproceedings{cg-ssrmrf-14,
  author    = {Qixing Huang and
               Yuxin Chen and
               Leonidas J. Guibas},
  title     = {Scalable Semidefinite Relaxation for Maximum A Posterior
               Estimation},
  booktitle = {International Conference on Machine Learning},
  year      = {2014}
}