Electronic Transactions on Numerical Analysis. Volume 7, 1998, pp. 163-181. Copyright 1998, Kent State University. ISSN 1068-9613.

RESTARTING TECHNIQUES FOR THE (JACOBI-)DAVIDSON SYMMETRIC EIGENVALUE METHODS

ANDREAS STATHOPOULOS† AND YOUSEF SAAD‡

∗ Received February 2, 1998. Accepted for publication August 3, 1998. Recommended by R. Lehoucq. Work supported by NSF grants DMR-9217287 and ASC 95-04038, and by the Minnesota Supercomputer Institute.
† Department of Computer Science, College of William and Mary, Box 8795, Williamsburg, Virginia 23187-8795 (andreas@cs.wm.edu).
‡ Department of Computer Science, University of Minnesota, 4-192 EE/CSci Bldg., Minneapolis, Minnesota 55455-0154 (saad@cs.umn.edu).

Abstract. The (Jacobi-)Davidson method, a popular preconditioned extension of the Arnoldi method for solving large eigenvalue problems, is often used with restarting. Restarting has significant performance shortcomings, since important components of the invariant subspace may be discarded. One way of saving more information at restart is through "thick" restarting, a technique that keeps more Ritz vectors than needed. This technique, and especially its dynamic version, has proved very efficient for symmetric cases. A different restarting strategy for the Davidson method has been proposed in [14], motivated by the similarity between the spaces built by the Davidson and Conjugate Gradient methods. For the latter method, a three-term recurrence implicitly maintains all required information. In this paper, we consider the effects of preconditioning on the dynamic thick restarting strategy, and we analyze, both theoretically and experimentally, the strategy based on Conjugate Gradient. Our analysis shows that, in some sense, the two schemes are complementary, and that their combination provides an even more powerful technique. We also describe a way to implement this scheme without additional orthogonalizations or matrix multiplications.

Key words. Davidson, Jacobi-Davidson, Lanczos, Conjugate Gradient methods, eigenvalue, implicit restarting, deflation, preconditioning.

AMS subject classifications. 65F15.

1. Introduction. Many scientific and engineering applications require the solution of large, sparse, symmetric eigenvalue problems, $Ax = \lambda x$, for a few of the lowest or highest (extreme) eigenvalues and eigenvectors (eigenpairs). The Lanczos method, and its equivalent in the non-symmetric case, the Arnoldi method, have traditionally been used to solve these problems [18]. However, as the matrix size increases, clustering of the eigenvalues deteriorates the performance of these methods, and because the inverse of $A$ cannot be computed directly, preconditioning becomes necessary to compensate for the loss of efficiency and robustness of iterative methods. The Davidson method and its generalization, the Jacobi-Davidson method [5, 13, 3, 20], are popular extensions to the Arnoldi method. Instead of extracting the eigenvectors from a generated Krylov space, these methods gradually build a different space by incorporating into the existing basis the approximate solution of a correction equation. Procedurally, the two methods are similar to the FGMRES method [19], and in this sense we refer to the approximate solution of the correction equation as preconditioning.

Despite the benefits of preconditioning, for many hard problems the (Jacobi-)Davidson method may still require a large number of steps. Because the vector iterates must be saved for extracting the eigenvectors, the storage requirements can become overwhelming. The problem is actually aggravated in the symmetric case, where the better theoretical framework and software have led researchers to consider matrices of huge size that allow only a few vectors to be stored.
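To make the procedure concrete, the following is a minimal sketch of a basic Davidson iteration for a symmetric matrix, with a crude restart that keeps only the current Ritz vector. The diagonal (Jacobi) preconditioner stands in for the approximate solution of the correction equation, and all names (davidson_sketch, m_max, and so on) are illustrative, not taken from this paper.

```python
import numpy as np

def davidson_sketch(A, m_max=20, tol=1e-8, max_iter=200):
    """Minimal Davidson iteration for the lowest eigenpair of a
    symmetric matrix A. The approximate correction-equation solve
    is a diagonal (Jacobi) preconditioner; restarting keeps only
    the current Ritz vector."""
    n = A.shape[0]
    d = np.diag(A)                        # diagonal preconditioner data
    V = np.zeros((n, m_max))              # orthonormal search-space basis
    v = np.random.default_rng(0).standard_normal(n)
    V[:, 0] = v / np.linalg.norm(v)
    m = 1
    for _ in range(max_iter):
        # Rayleigh-Ritz projection on the current basis V_m
        W = A @ V[:, :m]
        H = V[:, :m].T @ W                # m x m projected matrix
        theta, s = np.linalg.eigh(H)
        theta, s = theta[0], s[:, 0]      # lowest Ritz pair
        u = V[:, :m] @ s                  # Ritz vector (unit norm)
        r = W @ s - theta * u             # residual r = (A - theta I) u
        if np.linalg.norm(r) < tol:
            return theta, u
        # Davidson correction: t = (diag(A) - theta I)^{-1} r;
        # the small shift guards against exact zeros in the sketch
        t = r / (d - theta + 1e-12)
        if m == m_max:                    # basis full: restart with u only
            V[:, 0] = u
            m = 1
        # orthogonalize the correction against the basis and expand
        t -= V[:, :m] @ (V[:, :m].T @ t)
        V[:, m] = t / np.linalg.norm(t)
        m += 1
    return theta, u
```

A thick restart, as discussed in this paper, would keep several Ritz vectors at the restart step rather than only u, which is exactly the information the crude restart above throws away.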
Even in the non-preconditioned Lanczos method, where a three-term recurrence is known, loss of orthogonality and spurious solutions prevent the application of the method for a large number of steps. For these reasons, many restarting variants of the Lanczos and (Jacobi-)Davidson methods are used in practice [4, 17, 21, 1, 7]. For the Lanczos method, the requirement of maintaining the tridiagonal matrix necessitates restarting the iteration with only one vector, chosen as a linear combination of the desired Ritz vectors.
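For reference, the three-term recurrence mentioned above is the standard Lanczos relation (written here in standard notation, not quoted from this paper):

\[
\beta_{j+1} v_{j+1} = A v_j - \alpha_j v_j - \beta_j v_{j-1},
\qquad \alpha_j = v_j^T A v_j ,
\]

where the scalars $\alpha_j$ (diagonal) and $\beta_j$ (off-diagonal) form the tridiagonal projected matrix $T_m$. Restarting with more than one vector would destroy this tridiagonal structure, which is why the restart vector must be a single linear combination of Ritz vectors.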