Signal Processing in Digital Communications – The Fall of Science

Tor Aulin
Chalmers University of Technology
D & IT, Telecommunication Theory, SE-412 96 Gothenburg, SWEDEN
aulin@chalmers.se

Abstract — The last decades have seen rapid growth in the number of published papers concerning the scientific investigation and design of digital communication systems. A well-developed theory for this purpose has long existed; central to it are the theories of Hypothesis Testing and Random Processes. Over the years this approach has been abandoned in favour of procedures used in the area of Signal Processing. The result is non-optimal solutions, and the underlying models are erroneous and lead to contradictions. This paper gives some simple basic examples of ad hoc approaches using Signal Processing and also points out some obviously erroneous approaches. These involve the use of sampling, the treatment of unknown channels and the general treatment of band-limited Rayleigh Fading. The latter demonstrates the loss of Implicit Diversity when Signal Processing approaches are used. Many more examples could be listed, but this paper is mainly limited to these simple ones; references are given for a detailed treatment.

I. INTRODUCTION

The area of Communication Theory is by now well developed and has its origins in the 1940s. Detection Theory is an important part of this area, dealing with the derivation of optimal detectors (given a certain criterion of optimality, e.g. Maximum Likelihood (ML) or Maximum a Posteriori (MAP)) and with how to analyse these detectors. Most communication systems are analogue in both time and amplitude, except at the input and output. For radio channels this is certainly so, and there carrier modulated techniques are also utilized [1]. A first step in learning to derive and analyse detectors is the classical textbook by Wozencraft & Jacobs [1].
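The two optimality criteria mentioned above can be illustrated numerically. The following is a minimal sketch, assuming binary antipodal signalling over an AWGN channel; the noise variance and the (deliberately unequal) a priori probabilities are hypothetical values chosen for illustration, not taken from the references:

```python
import math

# Hypothetical setting: binary antipodal signals in AWGN.
sigma2 = 0.5                  # assumed noise variance
priors = {0: 0.8, 1: 0.2}     # assumed, unequal a priori probabilities
signals = {0: -1.0, 1: +1.0}  # antipodal signal points

def likelihood(r, s):
    """Gaussian likelihood p(r | s) for noise variance sigma2."""
    return math.exp(-(r - s) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

def ml_decision(r):
    """Maximum Likelihood: ignore the priors, maximise p(r | s)."""
    return max(signals, key=lambda h: likelihood(r, signals[h]))

def map_decision(r):
    """Maximum a Posteriori: weight the likelihoods by the priors."""
    return max(signals, key=lambda h: priors[h] * likelihood(r, signals[h]))

r = 0.2                       # a received sample slightly on the positive side
print(ml_decision(r))         # ML picks hypothesis 1 (the closer signal point)
print(map_decision(r))        # MAP picks hypothesis 0 (its prior dominates)
```

With equal priors the two rules coincide; the example makes the distinction visible by skewing the priors.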
Here digital information is mapped into channel waveforms, and these are observed after transmission over a noisy channel. Statistical properties such as the a priori probabilities of the data symbols (M-ary) and of the channel (a conditional distribution) are introduced. A central role is played by the Additive White Gaussian Noise (AWGN) channel, which is well known [1], [2]. Here the concept of a Vector Channel is introduced, and vectors are considered instead of time-dependent waveforms. The important concept of a sufficient statistic is introduced, guaranteeing that no information is lost when the received waveforms are converted into random vectors. The General Gaussian Problem is considered in [2]. By forming the Likelihood Function [1] using these vectors and optimizing it with respect to the transmitted data, the ML detector is easily found. The receiver implementation involves the dot products between the received vector and all the possible transmitted signal vectors. These operations can also be performed in continuous time, involving the correlation between the received waveform and all the possible transmitted waveforms.

II. SAMPLING

Recently, many new textbooks have been published in which the area of Communication Theory is approached by initially sampling the received signal. The reason for this is probably the involvement of Signal Processing in the area of Communication Theory. The whole machinery and theory in e.g. [1], [2] is then completely abandoned. Instead, the sampling theorem due to Nyquist [6] is invoked and only band-limited systems are considered. By sampling the received signal at or above the Nyquist frequency, the received signal can be reconstructed from its samples. Thus, it is claimed that no information loss occurs through sampling. The whole treatment after sampling takes place in discrete time, an approach which is very convenient for computer simulations.
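The dot-product receiver described in Section I can be sketched in a few lines. The signal set, its dimensions and the noise level below are hypothetical; this is a minimal illustration of ML detection over the AWGN vector channel with equal-energy signals, not an optimised implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical M-ary signal set: M = 4 orthogonal, equal-energy signal
# vectors of dimension N = 8 (the rows of an identity-like matrix).
N, M = 8, 4
signals = np.eye(M, N)

def ml_detect(r, signals):
    """ML detection over the AWGN vector channel (equal-energy signals):
    correlate the received vector with every candidate signal vector
    (a dot product per hypothesis) and pick the largest correlation."""
    correlations = signals @ r
    return int(np.argmax(correlations))

# Transmit symbol 2 over the AWGN channel and detect it.
tx = 2
r = signals[tx] + 0.1 * rng.standard_normal(N)   # assumed noise level
print(ml_detect(r, signals))                     # recovers symbol 2 here
```

For unequal-energy signal sets the correlations would have to be corrected by half the signal energies; the orthonormal rows above make that term identical for all hypotheses.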
Discrete-time treatment is especially convenient when using Matlab, resulting in what has been coined Matlab Communications. Anyhow, returning to the general approach summarized in Section I above: to accomplish ML detection, the received signal waveform shall be correlated with the possible transmitted signal waveforms over the time interval of interest. Thus, the received signal waveform must be reconstructed, and this involves all the samples, from minus infinity to plus infinity. This is of course not possible; instead a few samples around the time interval where the received signal is present are used. In most cases, only those samples generated within the actual data symbol time interval are used. Further, it is common that ad hoc strategies are used to arrive at a decision concerning the transmitted data symbol. It is thus clear that there is a performance loss compared to ML. How big this loss is, is in general not dealt with, since the ML detector is not analysed. The result is complete confusion, and it is hard, if not impossible, to draw any scientific conclusions concerning the general properties of such a system.

III. UNKNOWN CHANNEL

Sometimes channels are band limited, resulting in InterSymbol Interference (ISI). A good model of such a channel is to consider it as a linear filter, having a

978-1-4244-6316-9/10/$26.00 © 2010 IEEE, ISWCS 2010
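The linear-filter channel model introduced in Section III can be sketched as a discrete convolution. The two-tap impulse response and the data symbols below are hypothetical; the sketch only illustrates how a band-limited channel smears each symbol into its neighbour, producing ISI:

```python
import numpy as np

# Hypothetical channel impulse response: each symbol leaks
# half of its amplitude into the following symbol interval.
h = np.array([1.0, 0.5])

# A short block of BPSK data symbols (illustrative values).
symbols = np.array([+1.0, -1.0, +1.0, +1.0])

# The noiseless channel output is the convolution of the data with the
# filter, so every inner sample mixes the current and previous symbol: ISI.
received = np.convolve(symbols, h)
print(received)   # the middle samples are no longer +/-1
```

With the values above the output is [1.0, -0.5, 0.5, 1.5, 0.5]: only the first sample is free of interference, which is why such channels call for sequence detection rather than symbol-by-symbol decisions.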