International Journal of Engineering Research, ISSN: 2319-6890 (online), 2347-5013 (print), Volume No. 3, Issue No: Special 2, pp. 81-87, 22 March 2014, NCSC@2014

Low Leakage Bit-Line SRAM Design Architectures

Ratan Kumar S.V., Assistant Professor, Dept. of ECE, RGMCET, Nandyal, saneratankumar@gmail.com
Mr. S. Kashif Hussain, Asst. Prof., Dept. of ECE, RGMCET, Nandyal, kashif1919@gmail.com
Dr. S. GovindaRajulu, Professor, Dept. of ECE, RGMCET, Nandyal, rajulusg09@gmail.com

Abstract: In high-performance Systems-on-Chip, leakage power consumption has become comparable to the dynamic component, and its relevance increases as technology scales. These trends are even more evident for memory devices, for two main reasons. First, memories have historically been designed with performance as the primary figure of merit; they are therefore intrinsically power-inefficient structures. Second, memories are accessed in small chunks, leaving the vast majority of memory cells unaccessed for a large fraction of the time. In this paper, we present an overview of the techniques proposed in both the academic and the industrial domains for minimizing leakage power, and in particular its subthreshold component, in SRAMs. The surveyed solutions range from cell-level techniques to architectural solutions suitable for system-level design [15].

Keywords: memory, Systems-on-Chip, leakage power consumption

I. INTRODUCTION

The emergence of static power consumption in CMOS devices has been one of the first adverse effects of technology scaling. Roughly when feature size broke the 100 nm barrier, the CMOS transistor ceased to be a virtually ideal switch that consumes power only when changing state. While static (or leakage) power affects all kinds of CMOS circuits, it is particularly critical for SRAMs, for two main reasons. First, leakage power is proportional to the total number of transistors on chip.
As reported in the ITRS roadmap, the fraction of transistors devoted to memory structures in a typical microprocessor-based system is about 70% today, and it is expected to rise to 80% in the near future [1], [15]. Another reason is the temperature dependence of some sources of leakage power. SRAMs are highly optimized structures resulting in very high density: typical SRAM cells have areas on the order of 0.1 μm². Such high density, coupled with large power consumption, translates into an increase in temperature, which in turn affects leakage current (and, in particular, its subthreshold component) exponentially. For these reasons, there has been a wide spectrum of research on the reduction of leakage power in SRAMs at various abstraction levels, from optimized cell structures to alternative memory architectures. The purpose of this survey is to present an exhaustive review of such methods, and to provide a systematic classification and qualitative assessment of the various solutions proposed in the literature [15].

The paper is organized as follows [15]. Section II describes the relevant sources of leakage currents in SRAM cells, characterizing the functional conditions under which the most important source of leakage, namely subthreshold current, manifests itself. It then provides a classified review of the approaches for reducing subthreshold leakage in SRAMs. In particular, Section III addresses methods for bit-line leakage minimization, and Section IV highlights some technological perspectives and presents a qualitative comparative analysis of the various classes of solutions considered. Finally, Section V closes the paper with some general guidelines for memory designers interested in exploiting the techniques described here [15].

II. OVERVIEW

A. Sources of Leakage Consumption in SRAMs [15]

Leakage power in a CMOS transistor originates from several sources, corresponding to various leakage currents flowing in the device [2].
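The exponential temperature dependence mentioned above can be illustrated with the classic first-order subthreshold current model, I_sub = I0 · exp((V_GS − V_th)/(n·kT/q)) · (1 − exp(−V_DS·q/kT)), where the thermal voltage kT/q grows with temperature and V_th itself drops as temperature rises. The following is a minimal sketch, not taken from the paper; the parameter values (I0, nominal V_th, slope factor n, and the V_th temperature coefficient) are illustrative assumptions chosen only to show the trend:

```python
import math

K_B = 1.380649e-23     # Boltzmann constant, J/K
Q_E = 1.602176634e-19  # elementary charge, C

def subthreshold_current(vgs, vds, temp_k,
                         i0=1e-7,      # assumed pre-exponential current, A
                         vth0=0.3,     # assumed threshold voltage at 300 K, V
                         n=1.5,        # assumed subthreshold slope factor
                         vth_tc=-1e-3):  # assumed Vth temperature coeff., V/K
    """First-order subthreshold leakage model:
    I_sub = I0 * exp((Vgs - Vth)/(n*vT)) * (1 - exp(-Vds/vT)),
    with thermal voltage vT = kT/q and Vth varying linearly around 300 K.
    """
    v_t = K_B * temp_k / Q_E                 # thermal voltage rises with T
    vth = vth0 + vth_tc * (temp_k - 300.0)   # Vth drops as T rises
    return i0 * math.exp((vgs - vth) / (n * v_t)) * (1.0 - math.exp(-vds / v_t))

# Off-state leakage (Vgs = 0, Vds = 1 V) at two die temperatures:
i_300 = subthreshold_current(0.0, 1.0, 300.0)
i_360 = subthreshold_current(0.0, 1.0, 360.0)
print(f"I_off at 300 K: {i_300:.3e} A")
print(f"I_off at 360 K: {i_360:.3e} A (about {i_360 / i_300:.0f}x higher)")
```

Even with these rough numbers, a 60 K temperature rise multiplies the off-state current by roughly an order of magnitude, which is why the thermal feedback loop described above (density → heat → more subthreshold leakage) is a central concern in SRAM design.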
The list mostly comprises currents that are present when the channel is non-conducting (off state): subthreshold leakage, gate-induced drain leakage, and depletion punch-through leakage. Two other relevant sources of leakage exist independent of the conduction state of the channel: gate tunneling leakage through bulk, source, and drain (usually regarded as a single current), and p-n junction leakage (from both source and drain). The latter has various sub-sources and is dominated by the band-to-band tunneling (BTBT) effect. Fig. 1 summarizes the flow of these currents in the transistor's schematic. Most of the leakage sources are parasitic effects of negligible magnitude. In [3], the authors show that three currents should be considered for power analysis: subthreshold, gate, and junction leakage. Their projections on predictive technologies indicate that the three sources will have comparable magnitudes already at the 32 nm node. However, technologies