International Journal of Engineering, Applied and Management Sciences Paradigms, Vol. 22, Issue 01, Publishing Month: March 2015. An Indexed and Referred Journal. ISSN (Online): 2320-6608. www.ijeam.com

Bit Error Rate Performance of Long Term Evolution (LTE) Network

Ahmed Suliman Mohammed Ahmed 1, Dr. Khalid Hamid Bilal 2 and Dr. Amin Babiker A/Nabi Mustafa 3
1,3 Department of Communication, Faculty of Engineering, Al-Neelain University, Khartoum, Sudan
2 University of Science and Technology, Khartoum, Sudan

Abstract
In this paper we evaluate the bit error rate (BER) performance of an LTE network under adaptive modulation and OFDM over additive white Gaussian noise (AWGN) and multipath fading (frequency-selective and flat fading) channels, using a MATLAB-based simulation. The parameters taken into consideration are the AWGN, frequency-selective, and flat fading channels, the cyclic prefix, and the bandwidth. The results are presented as charts of bit error rate against signal-to-noise ratio.

Keywords: LTE, BER, OFDM, AWGN, Fading.

I. Introduction
Mobile broadband is a reality today and is growing fast, as members of the internet generation grow accustomed to having broadband access wherever they go, not just at home or in the office. To meet this demand for high-speed wireless communication, a new wireless data network has emerged and been standardized by the 3rd Generation Partnership Project (3GPP). This new standard is marketed as 4G Long Term Evolution (LTE) [1]. In LTE, wireless data speed and throughput are increased by combining a number of novel technologies, notably Multiple-Input Multiple-Output (MIMO) antennas. The baseline antenna configuration consists of two transmit antennas at the base station and two receive antennas at the Mobile Terminal (MT).
The possibilities for higher-order schemes are considered up to a maximum of four transmit and four receive antennas [2]. Further key technologies are Orthogonal Frequency Division Multiplexing (OFDM) and Orthogonal Frequency Division Multiple Access (OFDMA) on the downlink, Single Carrier Frequency Division Multiple Access (SC-FDMA) on the uplink, and support for QPSK, 16QAM, and 64QAM modulation. The performance of a MIMO-OFDM communication system depends significantly on channel estimation [3]. MIMO technology involves the use of multiple antennas at the transmitter, the receiver, or both; diversity and multiplexing are the two main modes of operation of multiple-antenna systems. OFDM, on the other hand, is a modulation technique that transforms a frequency-selective channel into a set of parallel flat-fading channels [4]. Hence LTE will be very attractive to network operators that already have HSPA networks running [5]. Comparing the performance of 3G and its evolution to LTE, LTE does not offer anything unique to improve spectral efficiency in bps/Hz; rather, it improves system performance by using wider bandwidths where spectrum is available [6]. The LTE downlink transmission scheme is based on OFDMA, while the uplink transmission is based on SC-FDMA. The main drawback of OFDMA relative to SC-FDMA is its high Peak-to-Average Power Ratio (PAPR) [7]. OFDMA allocates individual users in both the time and the frequency domain, and its signal generation in the transmitter is based on the Inverse Fast Fourier Transform (IFFT) [8]. LTE focuses on optimum support of Packet Switched (PS) services [9]. This paper is organized as follows: Section II presents the mathematical models, Section III the descriptive analysis, Section IV the implementation, and Section V the simulation results. Finally, conclusions are given in Section VI.
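The IFFT-based OFDM signal generation described above, including the role of the cyclic prefix and the PAPR drawback, can be sketched in a few lines. This is an illustrative sketch only, not the paper's simulation: the subcarrier count (64) and cyclic prefix length (16) are assumed example values, not LTE parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example parameters (not from the paper): 64 subcarriers, CP of 16 samples.
n_sub = 64
cp_len = 16

# Random QPSK symbols, one per subcarrier (Gray-mapped, unit average power).
bits = rng.integers(0, 2, size=2 * n_sub)
qpsk = ((1 - 2.0 * bits[0::2]) + 1j * (1 - 2.0 * bits[1::2])) / np.sqrt(2)

# OFDM modulation: the IFFT maps frequency-domain symbols to a time-domain block.
time_block = np.fft.ifft(qpsk) * np.sqrt(n_sub)  # unitary scaling

# Cyclic prefix: copy the last cp_len samples to the front. This makes the
# channel's linear convolution look circular, so each subcarrier sees a
# single-tap (flat) channel -- the property cited from [4] above.
ofdm_symbol = np.concatenate([time_block[-cp_len:], time_block])

# Peak-to-Average Power Ratio (PAPR) of the composite symbol, the drawback
# of OFDMA relative to SC-FDMA noted in [7].
papr_db = 10 * np.log10(np.max(np.abs(ofdm_symbol) ** 2)
                        / np.mean(np.abs(ofdm_symbol) ** 2))
```

Because the OFDM time signal is a sum of many independently modulated subcarriers, occasional large peaks are unavoidable, which is why the uplink (power-limited handset side) uses SC-FDMA instead.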
II. Mathematical Models
The theoretical bit error rate performance versus SNR for digital modulation schemes such as BPSK, 4QAM, and 16QAM in a Rayleigh flat-fading environment is given, for BPSK, by:

P(BPSK) = 0.5 (1 − √(SNR / (1 + SNR)))   (1)
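Equation (1), the standard closed form for BPSK over a Rayleigh flat-fading channel, can be evaluated numerically to produce the theoretical curve against which simulated BER is compared. A minimal sketch (the function name and SNR grid are illustrative, not from the paper):

```python
import numpy as np

def ber_bpsk_rayleigh(snr_db):
    """Theoretical BPSK BER over a Rayleigh flat-fading channel:
    P = 0.5 * (1 - sqrt(g / (1 + g))), where g is the average SNR (linear)."""
    g = 10.0 ** (np.asarray(snr_db, dtype=float) / 10.0)
    return 0.5 * (1.0 - np.sqrt(g / (1.0 + g)))

# Evaluate over an example SNR grid (dB), as would be done for a BER chart.
snr_db = np.arange(0, 31, 5)
ber = ber_bpsk_rayleigh(snr_db)
```

At high SNR the expression behaves like 1/(4·SNR), i.e. BER falls only one decade per 10 dB, in contrast to the waterfall-shaped AWGN curve; this is the characteristic penalty of Rayleigh fading.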