Under review - comments welcome but please do not cite!

Quantum Logic of Word Meanings: Concept Lattices in Vector Space Models

Dominic Widdows and Stanley Peters
Center for the Study of Language and Information, Stanford University

November 12, 2003

Abstract. This paper systematically develops the logical and algebraic possibilities inherent in vector space models for language, going considerably beyond those customarily used in semantic applications such as information retrieval and word sense disambiguation. The cornerstone of the approach is a simple implementation of the connectives of quantum logic as introduced by Birkhoff and von Neumann (1936), which defines the negation of a concept as the projection onto its orthogonal subspace, and the disjunction and conjunction of two concepts as the vector sum and intersection of their subspaces, respectively. This enables us to use the full lattice structure of a vector space, bringing these models much closer to traditional semantic lattice representations such as taxonomic concept hierarchies. We describe selected examples of this process with both negation and disjunction, and summarise experiments showing that the non-local nature of these connectives has clear advantages over their Boolean counterparts in removing the synonyms and neighbours of negated terms in information retrieval, as well as removing the negated terms themselves. Having thus validated the approach, we explore its implications for assigning semantics to some compositional phrases, showing cases where a quantum interpretation is preferable to a traditional Boolean formulation (and vice versa). Finally, we draw attention to the danger that quantum connectives may overgeneralise, and suggest another (also non-Boolean) alternative.

Keywords: Quantum Logic, Word Vectors, Non-Distributive Lattice
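The connectives summarised in the abstract can be sketched numerically. In the following minimal sketch (toy NumPy vectors; all names and values are hypothetical), negation "a NOT b" projects a onto the subspace orthogonal to b, and a disjunction such as "b1 OR b2" is modelled by the subspace spanned by both vectors, with similarity measured as the relative length of a's projection onto that subspace:

```python
import numpy as np

def negate(a, b):
    # Quantum negation "a NOT b": project a onto the subspace
    # orthogonal to b by removing a's component along b.
    return a - (a @ b) / (b @ b) * b

def subspace_similarity(a, vectors):
    # Disjunction: the subspace spanned by `vectors` represents
    # b1 OR b2 OR ...; similarity of a to the disjunction is the
    # length of a's projection onto that subspace, relative to |a|.
    Q, _ = np.linalg.qr(np.stack(vectors, axis=1))  # orthonormal basis
    proj = Q @ (Q.T @ a)
    return np.linalg.norm(proj) / np.linalg.norm(a)

a = np.array([1.0, 2.0, 3.0])
b = np.array([0.0, 1.0, 0.0])
c = negate(a, b)  # the part of a unrelated to b
# c is orthogonal to b: its scalar product with b vanishes
```

Note that, unlike Boolean removal of a term, this negation also suppresses any vector component correlated with b, which is the "non-local" behaviour the experiments exploit.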
1. Introduction

Vector spaces have become widespread in natural language processing: initially used to assign coordinates to queries and documents for information retrieval (Salton and McGill, 1983), variants of the vector model have since been used for modelling human language acquisition (Landauer and Dumais, 1997), word sense disambiguation (Schütze, 1998), and the automatic mapping of unknown words into a concept hierarchy (Widdows, 2003c). These models have, however, made use of only a small part of the algebraic structure inherent in vector spaces. The only algebraic operations standardly performed on word vectors are commutative addition (for combining several word vectors into a 'document vector') and taking the scalar product of two vectors to measure their similarity.

© 2003 Kluwer Academic Publishers. Printed in the Netherlands.
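The two standard operations just mentioned can be illustrated concretely. This is a toy sketch only: the 3-dimensional space and the word vectors below are made up for illustration, not drawn from any real model.

```python
import numpy as np

def cosine(u, v):
    # scalar product normalised by vector lengths:
    # the standard vector-model similarity measure
    return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

# hypothetical word vectors in a toy 3-dimensional space
word_vecs = {
    "retrieval": np.array([1.0, 0.2, 0.0]),
    "document":  np.array([0.8, 0.1, 0.3]),
    "banana":    np.array([0.0, 0.1, 1.0]),
}

# commutative addition: a 'document vector' is the
# sum of the vectors of the words it contains
doc = word_vecs["retrieval"] + word_vecs["document"]

# scalar-product similarity between the document and a query word
sim = cosine(doc, word_vecs["banana"])
```

Because addition is commutative, the document vector is independent of word order, which is precisely why richer algebraic structure is needed for the compositional phenomena discussed later in the paper.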