Inverse Problems and Imaging
doi:10.3934/ipi.2012.6.623
Volume 6, No. 4, 2012, 623–644

GLOBAL MINIMIZATION OF MARKOV RANDOM FIELDS WITH APPLICATIONS TO OPTICAL FLOW

Tom Goldstein
Rice University, Department of Electrical and Computer Engineering, Houston, 77251, USA

Xavier Bresson
City University of Hong Kong, Department of Computer Science, Hong Kong, China

Stan Osher
UCLA, Department of Mathematics, Los Angeles, 90095, USA

(Communicated by Antonin Chambolle)

Abstract. Many problems in image processing can be posed as non-convex minimization problems. For certain classes of non-convex problems involving scalar-valued functions, it is possible to recast the problem in a convex form using a "functional lifting" technique. In this paper, we present a variational functional lifting technique that can be viewed as a generalization of previous works by Pock et al. and Ishikawa. We then generalize this technique to the case of minimization over vector-valued functions, and discuss a condition which allows us to determine when the solution to the convex problem corresponds to a global minimizer. This generalization allows functional lifting to be applied to a wider range of problems than previously considered. Finally, we present a numerical method for solving the convexified problems, and apply the technique to find global minimizers for optical flow image registration.

1. Introduction. Many problems in image processing are posed as regularized minimization problems. Classical examples include total-variation (TV) and non-local TV regularized models for denoising [53, 46, 31], deconvolution [45, 57], and segmentation [11, 43, 31]. When these models are convex, standard minimization methods provide reliable results. Simple and reliable variational techniques include gradient descent [53], dual formulations [18, 15, 1], and split Bregman schemes [32, 55, 26].
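To make the variational setting concrete, the following is a minimal sketch (not from the paper) of gradient descent on a smoothed ROF/TV denoising energy; the function name `tv_denoise_gd`, the smoothing constant `eps`, and all parameter values are illustrative assumptions:

```python
import numpy as np

def tv_denoise_gd(f, lam=4.0, eps=0.1, tau=2e-3, iters=200):
    """Gradient descent on a smoothed ROF energy (illustrative sketch):
       E(u) = sum sqrt(|grad u|^2 + eps^2) + (lam/2) * sum (u - f)^2."""
    u = f.copy()
    for _ in range(iters):
        # forward differences with replicated (Neumann) boundary
        ux = np.diff(u, axis=1, append=u[:, -1:])
        uy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(ux**2 + uy**2 + eps**2)
        # gradient of the TV term is -div of the normalized gradient field
        px, py = ux / mag, uy / mag
        div = (np.diff(px, axis=1, prepend=0.0)
               + np.diff(py, axis=0, prepend=0.0))
        u = u - tau * (-div + lam * (u - f))
    return u

# Example: smooth a noisy step image while preserving the edge.
rng = np.random.default_rng(0)
clean = np.zeros((16, 16)); clean[:, 8:] = 1.0
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
denoised = tv_denoise_gd(noisy)
```

The small step size `tau` keeps the descent stable despite the stiffness introduced by the smoothing parameter `eps`; the dual and split Bregman schemes cited above avoid this smoothing entirely.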
In the discrete setting, state-of-the-art methods can be derived by embedding the problem into a graph or Markov random field (MRF), and solving the resulting problem using graph cuts [42, 20, 8].

When problems are non-convex, these techniques fail because they get "stuck" at local minima. A classical example of one such non-convex problem is image registration. For this problem, gradient descent techniques can be very ineffective, especially if the images involved suffer from strong noise contamination or repetitive details such as textures.

Recent advances in optimization theory have allowed certain classes of non-convex problems to be solved in polynomial time. Most of these techniques rely

2000 Mathematics Subject Classification. 46N10, 68U10, 90C26.
Key words and phrases. Optical flow, functional lifting, nonconvex optimization.

© 2012 American Institute of Mathematical Sciences
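The graph-cut approach mentioned above can be illustrated on a toy problem. The sketch below (not the constructions of [42, 20, 8]) uses the standard reduction of a submodular binary MRF to min-cut/max-flow, solved with SciPy's `maximum_flow`; the 1D signal and the integer weights `unary` and `lam` are illustrative assumptions:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import maximum_flow, breadth_first_order

# Toy problem: denoise a 1D binary signal by minimizing the MRF energy
#   E(x) = sum_i unary*|f_i - x_i| + sum_i lam*|x_i - x_{i+1}|,  x_i in {0,1}.
f = np.array([1, 1, 0, 1, 1, 0, 0, 0])   # observed binary signal
n = len(f)
unary, lam = 3, 2                        # integer weights (max-flow needs ints)
s, t = n, n + 1                          # source and sink node indices

rows, cols, caps = [], [], []
for i in range(n):
    # t-links: cutting s->i costs theta_i(0), cutting i->t costs theta_i(1),
    # so source-reachable nodes carry label 1 in the minimum cut.
    c0, c1 = unary * f[i], unary * (1 - f[i])
    if c0: rows.append(s); cols.append(i); caps.append(int(c0))
    if c1: rows.append(i); cols.append(t); caps.append(int(c1))
for i in range(n - 1):
    # n-links: pay lam whenever neighboring labels differ
    rows += [i, i + 1]; cols += [i + 1, i]; caps += [lam, lam]

graph = csr_matrix((caps, (rows, cols)), shape=(n + 2, n + 2))
res = maximum_flow(graph, s, t)

# Recover the cut: nodes reachable from s in the residual graph get label 1.
residual = graph - res.flow
residual.data = np.maximum(residual.data, 0)
residual.eliminate_zeros()
reachable = breadth_first_order(residual, s, return_predecessors=False)
labels = np.zeros(n, dtype=int)
labels[[i for i in reachable if i < n]] = 1
print(labels)
```

The minimum cut here removes the isolated flips, recovering the piecewise-constant labeling `[1 1 1 1 1 0 0 0]` at cost 5 (one unary flip plus one label boundary).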