AN OPTIMAL ALGORITHM FOR CONSTRAINED DIFFERENTIABLE CONVEX OPTIMIZATION

CLÓVIS C. GONZAGA, ELIZABETH W. KARAS, AND DIANE R. ROSSETTO

June 6, 2011

Abstract. We describe three algorithms for solving differentiable convex optimization problems constrained to simple sets in $\mathbb{R}^n$, i.e., sets onto which it is easy to project an arbitrary point. The first two algorithms are optimal in the sense that they achieve an absolute precision of $\varepsilon$ in relation to the optimal value in $O(1/\sqrt{\varepsilon})$ iterations using only first-order information. This complexity depends on a Lipschitz constant $L$ for the function derivatives and on a strong convexity constant $\mu \geq 0$. The first algorithm extends to the constrained case a well-known method devised by Nesterov [7] for unconstrained problems, and includes a procedure for guessing the unknown value of $L$. The complexity analysis follows a simpler geometric approach. The other algorithms have several enhancements, including line searches that improve the performance: the second algorithm is enhanced and optimal; the third relaxes the optimality somewhat to obtain the best practical performance. Numerical tests for box-constrained quadratic problems are presented in the last section.

1. Introduction. We study the nonlinear programming problem

$$(P)\qquad \begin{array}{ll} \text{minimize} & f(x) \\ \text{subject to} & x \in \Omega, \end{array}$$

where $\Omega \subset \mathbb{R}^n$ is a closed convex set and $f : \mathbb{R}^n \to \mathbb{R}$ is convex and continuously differentiable, with a Lipschitz constant $L > 0$ for the gradient and a convexity parameter $\mu \geq 0$. This means that for all $x, y \in \mathbb{R}^n$,

$$\|\nabla f(x) - \nabla f(y)\| \leq L \|x - y\| \tag{1.1}$$

and

$$f(x) \geq f(y) + \nabla f(y)^T (x - y) + \tfrac{1}{2}\mu \|x - y\|^2. \tag{1.2}$$

If $\mu > 0$, the function is said to be strongly convex. Note that $\mu \leq L$.

Simple sets: we assume that $\Omega$ is a "simple" set in the following sense: given an arbitrary point $x \in \mathbb{R}^n$, an oracle is available to compute

$$P_\Omega(x) = \operatorname*{argmin}_{y \in \Omega} \|x - y\|,$$

the orthogonal projection of $x$ onto the set $\Omega$. A well-known algorithm for solving Problem $(P)$ is the projected gradient method described by Bertsekas [2].
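As an illustrative sketch (not the algorithms analyzed in this paper), the projection oracle is trivial for a box constraint, and a projected gradient step with step size $1/L$ can be written in a few lines. The function, box bounds, and step count below are arbitrary choices for the example.

```python
import numpy as np

def project_box(x, lower, upper):
    """Projection oracle P_Omega for a box: clip each coordinate to [lower, upper]."""
    return np.clip(x, lower, upper)

def projected_gradient_step(x, grad_f, project, L):
    """One projected gradient step: x+ = P_Omega(x - grad f(x) / L),
    where L is a Lipschitz constant for grad f."""
    return project(x - grad_f(x) / L)

# Example: minimize f(x) = 0.5 * ||x - c||^2 over the box [0, 1]^2.
c = np.array([2.0, -1.0])
grad_f = lambda x: x - c                      # gradient of f; here L = 1
project = lambda x: project_box(x, 0.0, 1.0)

x = np.array([0.5, 0.5])
for _ in range(50):
    x = projected_gradient_step(x, grad_f, project, L=1.0)
# x converges to the projection of c onto the box, namely (1, 0).
```

With the exact step size $1/L$ this strongly convex example reaches the solution immediately; in general the projected gradient method only attains the $O(1/\varepsilon)$ rate, which is what motivates the optimal $O(1/\sqrt{\varepsilon})$ methods developed in this paper.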
Our methods are based only on first-order information, and each iteration contains a projected gradient step.

Optimal methods: the main reference for this paper is the book by Yurii Nesterov [7], and our algorithms are extensions of his basic method described in Chapter 2. His method [7, Algorithm 2.2.6] solves unconstrained convex problems and is in some sense a short-steps method. We shall show how to modify his method to deal with constrained problems, and then show how to improve the speed at the cost of inexact projected line searches.

Department of Mathematics, Federal University of Santa Catarina, Cx. Postal 5210, 88040-970 Florianópolis, SC, Brazil; e-mail: clovis@mtm.ufsc.br.
Department of Mathematics, Federal University of Paraná, Cx. Postal 19081, 81531-980 Curitiba, PR, Brazil; e-mail: ewkaras@ufpr.br.
Department of Mathematics, University of São Paulo, SP, Brazil; e-mail: dianerr@ime.usp.br.
The authors are supported by CNPq.