IEEE COMMUNICATIONS LETTERS, VOL. 3, NO. 2, FEBRUARY 1999

On the Error Exponent for Memoryless Flat Fading Channels with Channel-State-Information Feedback

Walid K. M. Ahmed, Member, IEEE, and Peter J. McLane

Abstract— In this letter, we derive the random coding error exponent for the time-independent flat fading channel with perfect knowledge of the channel state information (CSI) at both the receiver and the transmitter; that is, the CSI is fed back from the receiver to the transmitter. In this situation, the transmitter can adapt the power allocation of the transmitted signal to the fading state in order to obtain, for our case, the best error exponent. The power scheme obtained here is different from the water-pouring one, which is known to maximize the channel capacity.

I. INTRODUCTION

THE channel capacity and the random coding error exponent of memoryless flat fading channels with ideal channel state information (CSI) at the receiver have been studied previously in [1] (see also Günther [2]), and in [3] and [4], respectively. It has been shown in [3] and [4] that although the loss in channel capacity due to fading is not dramatic, a significant loss occurs in the random coding exponent, reflecting a considerable increase in the coding complexity required for the fading channel to achieve the same error performance obtained over the additive white Gaussian noise (AWGN) channel. If the CSI, however, can be sent from the receiver to the transmitter, the achievable performance can be improved by adapting the transmitted symbol power to the channel state. The capacity of a memoryless flat fading channel with additive Gaussian noise and ideal CSI at both transmitter and receiver, under an ensemble per-letter input average power constraint, has been studied by Goldsmith and Varaiya [4].
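The capacity-achieving allocation in this feedback setting is the water-pouring scheme noted in the abstract. As a point of reference, it can be sketched numerically; the sketch below assumes a fading distribution discretized into SNR states, and the function and variable names (`water_pouring`, `gammas`) are illustrative, not from this letter.

```python
import numpy as np

def water_pouring(gammas, probs, s_bar=1.0, tol=1e-10):
    """Water-pouring power allocation over a discretized fading pdf.

    gammas : SNR of each fading state (per unit transmit power)
    probs  : probability of each state (must sum to 1)
    s_bar  : ensemble average power budget, E[S(gamma)] <= s_bar
    Returns the power S(gamma) allocated to each state.
    """
    gammas = np.asarray(gammas, dtype=float)
    probs = np.asarray(probs, dtype=float)

    def avg_power(cutoff):
        # Water-pouring form: S(gamma) = 1/cutoff - 1/gamma above the cutoff.
        s = np.maximum(1.0 / cutoff - 1.0 / gammas, 0.0)
        return float(np.sum(probs * s))

    # Bisection on the cutoff gamma_0 so that E[S(gamma)] = s_bar.
    lo, hi = 1e-12, float(gammas.max())
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if avg_power(mid) > s_bar:
            lo = mid  # too much power spent: raise the cutoff
        else:
            hi = mid
    cutoff = 0.5 * (lo + hi)
    return np.maximum(1.0 / cutoff - 1.0 / gammas, 0.0)
```

Under this scheme stronger fading states receive more power, which is the opposite of channel inversion; the letter's point is that the exponent-optimal allocation differs from this capacity-optimal one.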
That is, a constraint of the form $E_A\{\mathcal{S}(a)\} \le \bar{S}$, where $E_A\{\cdot\}$ denotes the expected value of the enclosed quantity over the pdf $p_A(a)$, $\mathcal{S}(a)$ is the power allocated to fading state $a$, and $a$ is the fading variable 1 associated with the received symbol. In this letter, we consider the random coding error exponent for the same channel. It is assumed that the CSI is perfectly sent from the receiver to the transmitter. We also assume an ensemble per-letter average power constraint of the form $E_A\{\mathcal{S}(a)\} \le \bar{S}$. We show that the aforementioned constraint yields an optimal power allocation scheme for the error exponent that is different from the often-used water-pouring method, which is known to achieve the channel capacity [5].

Manuscript received December 31, 1997. The associate editor coordinating the review of this letter and approving it for publication was Dr. B. R. Vojcic. This work was supported in part by the Telecommunications Research Institute of Ontario (TRIO), Canada, and by the Natural Sciences and Engineering Research Council of Canada (NSERC).
W. K. M. Ahmed was with the Electrical and Computer Engineering Department, Queen's University, Kingston, ON K7L 3N6, Canada. He is now with Bell Laboratories, Lucent Technologies, Holmdel, NJ 07733 USA (e-mail: walidahmed@lucent.com).
P. J. McLane is with the Electrical and Computer Engineering Department, Queen's University, Kingston, ON K7L 3N6, Canada.
Publisher Item Identifier S 1089-7798(99)02049-9.
1 Upper case letters denote the random quantities and lower case letters denote the values of the corresponding random quantities.

II. BACKGROUND

For a discrete-time channel, let $P(y|x)$ be the transition probability assignment of the channel, where $x \in \mathcal{X}$, the set of transmitted symbols or codeword letters, and $y \in \mathcal{Y}$, the set of received symbols. Also, let each input letter be chosen with a probability assignment $Q(x)$. Suppose that maximum-likelihood decoding is employed.
Then, the average probability of block decoding error over this ensemble of codes is bounded, for any choice of $\rho$, $0 \le \rho \le 1$, by [5]

$$\bar{P}_e \le \exp\{-N[E_0(\rho, Q) - \rho R]\} \quad (1)$$

where $N$ is the block code length, $R$ is the transmission rate, and

$$E_0(\rho, Q) = -\ln \int \left[ \int Q(x) P(y|x)^{1/(1+\rho)} \, dx \right]^{1+\rho} dy. \quad (2)$$

Since $\rho$ and $Q$ are arbitrary, the tightest bound in (1) is obtained by choosing $\rho$ and $Q$ to maximize the quantity $E_0(\rho, Q) - \rho R$. The general shape of the resulting exponent $E_r(R) = \max_{\rho, Q}[E_0(\rho, Q) - \rho R]$ is shown in Fig. 1 [the quantity $E_0(1, Q)$ is denoted as $R_0$]. The quantity $R_0$ is usually referred to as the "cut-off rate" of the channel. The "critical rate" $R_c$ is the largest rate at which the exponent is maximized by setting $\rho = 1$. In Fig. 1, $C$ is the average mutual information of the channel. For more discussion the reader is referred to [5].

III. ERROR EXPONENT CALCULATION

For a memoryless channel, the received signal at the $k$th time interval can be written as

$$y_k = a_k x_k + n_k \quad (3)$$

where $x_k$ are the transmitted (input) symbols, which are complex-valued, independent and identically distributed (i.i.d.) random variables, $a_k$ are complex-valued i.i.d. fading random variables, and $n_k$ are additive i.i.d. complex-valued Gaussian

1089-7798/99$10.00 © 1999 IEEE
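The Gallager bound and exponent described above can be evaluated numerically. The following is a minimal sketch, assuming a discrete memoryless channel (sums replacing the integrals in (2)) with a fixed input assignment $Q$ and a grid search over $\rho$; the binary symmetric channel example and function names are illustrative, not from this letter.

```python
import numpy as np

def E0(rho, Q, P):
    """Gallager function E_0(rho, Q) in nats for a discrete channel.

    Q : input probability assignment, shape (|X|,)
    P : transition probabilities P[x, y], shape (|X|, |Y|)
    """
    s = 1.0 / (1.0 + rho)
    inner = Q @ P**s  # sum_x Q(x) P(y|x)^(1/(1+rho)), one entry per y
    return -np.log(np.sum(inner**(1.0 + rho)))

def random_coding_exponent(R, Q, P, n_grid=1001):
    """E_r(R) = max over 0 <= rho <= 1 of [E_0(rho, Q) - rho R], Q fixed."""
    rhos = np.linspace(0.0, 1.0, n_grid)
    return max(E0(r, Q, P) - r * R for r in rhos)

# Illustrative example: binary symmetric channel, uniform input.
p = 0.01
P = np.array([[1 - p, p], [p, 1 - p]])
Q = np.array([0.5, 0.5])
R0 = E0(1.0, Q, P)  # cutoff rate, in nats
```

For the BSC with uniform input, `R0` reduces to the known closed form $\ln 2 - \ln(1 + 2\sqrt{p(1-p)})$, which is a convenient sanity check on the implementation.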