IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. IT-28, NO. 5, SEPTEMBER 1982

Causal Source Codes

DAVID L. NEUHOFF, MEMBER, IEEE, AND R. KENT GILBERT, MEMBER, IEEE

Abstract—Causal source codes are defined. These include quantizers, delta modulators, differential pulse code modulators, and adaptive versions of these. Several types of causal codes are identified. For memoryless sources it is shown that the optimum performance attainable by causal codes can be achieved by memoryless codes or by time-sharing memoryless codes.
This optimal performance can be evaluated straightforwardly.

I. INTRODUCTION

THE TASK of a source code, which consists of an encoder and decoder, is to encode the source output X into a compressed representation Z (we shall assume it is binary) and, subsequently, to decode Z into a reproduction X̂ of X. Roughly speaking, we will say that a source code is causal if the reproduction created by the code of the present source output depends on present and past outputs but not on future ones. It is not required that the compressed representation be produced causally. Quantizers, delta modulators, differential pulse code modulators, and adaptive versions of these are all causal in the above sense. Indeed, these codes provided the initial motivation for this study. A precise definition of causality will be given in the next section.

Manuscript received June 6, 1979; revised January 6, 1982. This work was supported by NSF Grants ENG76-82531 and ECS79-21075. Portions of this work were presented at The Sixteenth Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, October 1978; and at the IEEE International Symposium on Information Theory, Grignano, Italy, June 1979.

D. L. Neuhoff is with the Department of Electrical and Computer Engineering, University of Michigan, Ann Arbor, MI 48109.

R. K. Gilbert was with the Computer, Information, and Control Engineering Program, University of Michigan, Ann Arbor, MI 48109; he is now with Bell Northern Research, Inc., Ann Arbor, MI 48106.

Source coding theorems for block, sliding-block, tree, and trellis source codes have shown that these classes contain codes that achieve the rate-distortion function R(D), which is the optimum performance theoretically attainable. Although these classes contain some codes that are causal, it is widely believed that it is the noncausal codes, and only the noncausal codes, that achieve R(D). Nevertheless, there are causal codes of great practical interest.
Indeed, for some sources (e.g., Gaussian memoryless) there are causal codes (e.g., quantizers with entropy coding) whose performance comes quite close to R(D). In other cases there are causal codes that are believed to perform quite well but have not yielded to rigorous analysis.

In this paper we define causality for source codes, describe several specific kinds of causal codes, and characterize the optimum performance theoretically attainable by them for memoryless sources. The goal is to discover how much is lost relative to R(D) by the restriction to causal codes, or equivalently, how much can be gained by noncausal codes. The principal result is that for memoryless sources the optimum performance by causal source codes can be achieved by memoryless codes or by time-sharing memoryless codes. This best performance can be evaluated straightforwardly.

Lloyd [1] was the first to consider causal source codes (in the sense described herein). He determined how well sliding-block causal codes can perform for the binary symmetric memoryless source by finding a lower bound to the best performance, which was obviously achievable. Piret [2] showed that causal sliding-block codes with feedback could also achieve Lloyd's bound. Our approach is similar to Lloyd's.

0018-9448/82/0900-0701$00.75 ©1982 IEEE
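As a concrete illustration of the classes of codes discussed above, the following sketch (ours, not from the paper; all step sizes and the mixing fraction are hypothetical choices) implements a memoryless uniform quantizer, the simplest causal code: each reproduction depends only on the present source output. A second function time-shares two such quantizers, the construction invoked in the principal result, by applying a coarse quantizer to an initial fraction of the samples and a fine one to the rest.

```python
import math

def uniform_quantizer(x, step):
    # Nearest reproduction level on a uniform grid of the given step size.
    return step * math.floor(x / step + 0.5)

def causal_encode(source, step):
    # A memoryless code: each reproduction depends only on the present
    # source output, so the code is trivially causal in the above sense.
    return [uniform_quantizer(x, step) for x in source]

def time_share(source, step_coarse, step_fine, p):
    # Time-sharing two memoryless codes: quantize a fraction p of the
    # samples coarsely and the remainder finely, yielding rate-distortion
    # pairs on the chord between the two memoryless operating points.
    n_coarse = int(p * len(source))
    out = []
    for i, x in enumerate(source):
        step = step_coarse if i < n_coarse else step_fine
        out.append(uniform_quantizer(x, step))
    return out
```

This is only a schematic: a full causal code would also specify an entropy coder for the quantizer outputs, which (as noted above) need not itself operate causally.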