Copyright © 2006 American Scientific Publishers. All rights reserved. Printed in the United States of America.
Journal of Low Power Electronics, Vol. 2, 1–11, 2006
SRAM Cell Optimization for Ultra-Low Power Standby
Huifang Qin¹∗, Rakesh Vattikonda², Thuan Trinh¹, Yu Cao², and Jan Rabaey¹

¹ University of California at Berkeley, Berkeley, California 94720, USA
² Arizona State University, Tempe, Arizona 85281, USA
(Received: 20 June 2006; Accepted: 13 October 2006)
This paper† proposes a comprehensive SRAM cell optimization scheme that minimizes leakage power under an ultra-low standby supply voltage (V_DD). The theoretical limit of the data retention voltage (DRV), the minimum V_DD that preserves the state of a memory cell, was derived to be 50 mV for an industrial 90 nm technology. A DRV design model was developed covering parameters including body bias, sizing, and channel length. A test chip was implemented and measured to obtain the DRV sensitivities to key design parameters. Based on these results, a low-leakage SRAM cell design methodology is derived, and the feasibility of a 270 mV standby V_DD, including a 100 mV safety margin, is demonstrated. As a result, SRAM leakage power is reduced by 97%.
Keywords: SRAM, Standby, Leakage, DRV, Data Retention, Sizing, Body Bias, Process Variation.
1. INTRODUCTION
Technology scaling, together with the growing fraction of chip area devoted to memory, has made SRAM leakage control increasingly important. In microprocessor designs, memory consumes a significant portion of the system power budget during light-duty operation. For example, a past study on a 0.13 μm high-end processor showed that leakage energy accounted for 30% of L1 cache energy and 80% of L2 cache energy [1]. For mobile applications, low standby power is crucial: since active periods are short, leakage power largely determines battery life.
To minimize SRAM leakage power, many architecture- and circuit-level techniques have been proposed, including dynamic biasing, V_DD gating, and novel cell designs. Dynamic-biasing techniques adjust the substrate-source and gate-source biases to enhance active driving strength and create low-leakage paths [1-4]. V_DD-gating techniques use sleep transistors to turn off unused memory sections [5, 6], or reduce V_DD to the data-retention level of the memory [7, 8]. Recently, a 10 T SRAM cell with improved margins under very low V_DD was proposed; its low-voltage operation reduces both active and standby power [9]. In contrast to these existing approaches, this work focuses on improving the conventional 6 T SRAM cell for ultra-low voltage standby operation, based on an in-depth understanding of the SRAM cell standby voltage limit. A comprehensive optimization methodology is developed, applying a combination of V_DD gating, dynamic biasing, and sizing techniques. Targeting ultra-low power mobile applications, the design goal is to achieve maximum standby power saving and reliable data retention, with minimum penalty in area, speed, and read-write stability.

∗ Author to whom correspondence should be addressed. Email: huifangq@eecs.berkeley.edu
† This work was sponsored by the MARCO GSRC center and SRC. Fabrication support from STMicroelectronics is appreciated.
An effective method to minimize SRAM leakage power is to reduce the memory standby V_DD. The minimum V_DD that preserves memory data is the data retention voltage (DRV). Measurement results from a 32 Kbit SRAM module implemented in a 130 nm technology showed that the SRAM cell DRV ranges from 60 mV to 390 mV. At a 100 mV safety margin above the DRV, SRAM leakage power can be reduced by 85% [8].
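A first-order sense of why standby V_DD scaling saves so much leakage power can be sketched with a simple subthreshold-leakage model. This is only an illustration: the slope factor, DIBL coefficient, and nominal voltage below are assumed values, not parameters from the paper's measurements.

```python
import math

V_T = 0.026   # thermal voltage kT/q at room temperature (V)
N = 1.5       # subthreshold slope factor (assumed)
ETA = 0.1     # DIBL coefficient (assumed)

def leakage_power_ratio(v_low, v_high):
    """Ratio of standby leakage power P = Vdd * Ileak between two supply
    voltages, modeling Ileak as proportional to exp(ETA * Vdd / (N * V_T))
    to capture the DIBL-driven drop in subthreshold current at lower Vdd."""
    i_ratio = math.exp(ETA * (v_low - v_high) / (N * V_T))
    return (v_low / v_high) * i_ratio

# Scaling standby Vdd from a nominal 1.0 V down to 270 mV (DRV plus margin)
saving = 1.0 - leakage_power_ratio(0.27, 1.0)
print(f"standby leakage power saving: {saving:.0%}")
```

With these assumed parameters the model predicts a saving in the mid-90% range, consistent in magnitude with the 97% figure quoted in the abstract; the exact number depends on the technology's DIBL and slope factor.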
Based on predictive device models [10], SPICE simulations showed that the DRV increases with technology scaling due to larger process variations (Fig. 1). At the 32 nm node with a 700 mV V_DD, the DRV under 3σ process variations reaches 570 mV for a standard SRAM cell. As a result, DRV-aware optimization is critical for future low-power and reliable SRAM designs. Figure 1 also shows that simply using a 5% larger channel length (L) for the four transistors in the SRAM cell inverter loop achieves a 10-80 mV reduction in DRV and a 50-90% saving in leakage power. This is due to the reduced device mismatch at larger channel lengths. As technology scales, this tuning effect on DRV and leakage saving becomes more significant.
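The mismatch argument can be made concrete with Pelgrom's area-scaling model, σ(ΔVth) = A_VT / sqrt(W·L). The coefficient and device dimensions below are assumed for illustration and are not taken from the paper; note that this captures only the variation component, while the full DRV and leakage gains also reflect the weaker DIBL of longer-channel devices.

```python
import math

A_VT = 3.0e-9  # Pelgrom coefficient, ~3 mV*um expressed in V*m (assumed)

def sigma_dvth(w, l):
    """Pelgrom mismatch model: sigma(dVth) = A_VT / sqrt(W * L)."""
    return A_VT / math.sqrt(w * l)

w, l = 120e-9, 90e-9             # illustrative device dimensions (assumed)
base = sigma_dvth(w, l)
tuned = sigma_dvth(w, 1.05 * l)  # same width, 5% longer channel
reduction = 1.0 - tuned / base
print(f"mismatch sigma reduced by {reduction:.1%}")
```

Because σ scales as 1/sqrt(L), a 5% longer channel shrinks the threshold-voltage spread by only a few percent; the outsized DRV benefit reported in Figure 1 comes from the cell's exponential sensitivity to that spread near the retention limit.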
Starting from an analysis of the theoretical bound of the SRAM DRV, the impact of design parameters and process variations on the SRAM low-voltage data-retention behavior
J. Low Power Electronics 2006, Vol. 2, No. 3, 1546-1998/2006/2/001/011, doi:10.1166/jolpe.2006.097