Abstract

The rise of data-intensive applications has increased the demand for high-density and low-power embedded memories. Among them, gain-cell embedded DRAM (GC-eDRAM) is a suitable alternative to static random-access memory (SRAM) due to its high memory density and low leakage current. However, as GC-eDRAM stores data dynamically, its memory content has to be periodically refreshed according to the data retention time (DRT). Even though different DRT characterization methodologies have been reported in the literature, a practical and accurate method to quantify the DRT across Monte Carlo (MC) runs to evaluate the impact of local process variations (LPVs) has not yet been proposed. As a result, the minimum memory refresh rate is generally estimated with large design guard bands to avoid any loss of data, at the expense of higher power consumption and lower memory bandwidth. In this work, we present a current-based DRT characterization methodology that enables an accurate LPV analysis without the need for a large number of costly electronic design automation (EDA) software licenses. The presented approach is compared with other DRT characterization methodologies in terms of both accuracy and practical aspects. Furthermore, the DRT of a 3-transistor (3T) gain cell (GC) designed in a 28 nm FD-SOI process technology is characterized for different design choices as well as global and local variations. The analysis of the results shows that LPVs have the most degrading effect on the DRT, and that the proposed approach is therefore key to the design of GC-eDRAMs and the choice of their refresh rate, avoiding the need for overly pessimistic worst-case margins.
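
The exact current-based methodology is detailed in the paper itself; as a rough illustration only, the sketch below shows how a first-order, current-based DRT estimate could be aggregated across Monte Carlo samples to pick a refresh period from the resulting distribution rather than a blanket worst-case guard band. It assumes the common approximation DRT ≈ C_SN · ΔV_max / I_leak (storage-node capacitance times tolerable droop divided by leakage current); all numerical values, the log-normal leakage model, and the percentile choice are hypothetical placeholders, not data from the 28 nm FD-SOI design.

```python
import numpy as np

# Minimal sketch (not the paper's exact methodology): a first-order,
# current-based DRT estimate across Monte Carlo samples, assuming
#     DRT ~= C_SN * dV_max / I_leak
# where C_SN is the storage-node capacitance, dV_max the tolerable
# storage-node droop before a read failure, and I_leak the leakage
# current discharging the node. All values are hypothetical.

rng = np.random.default_rng(0)

C_SN = 1.0e-15        # storage-node capacitance [F] (hypothetical)
DV_MAX = 0.2          # tolerable voltage droop [V] (hypothetical)
N_MC = 10_000         # number of Monte Carlo samples

# Local process variations make the leakage current roughly log-normally
# distributed across cells; median and sigma here are illustrative only.
i_leak = rng.lognormal(mean=np.log(5e-12), sigma=0.8, size=N_MC)  # [A]

# Per-sample retention time from the current-based estimate.
drt = C_SN * DV_MAX / i_leak  # [s]

# Derive the refresh period from a low percentile of the DRT distribution
# instead of a single pessimistic worst-case margin.
refresh_period = np.percentile(drt, 0.1)

print(f"median DRT: {np.median(drt) * 1e3:.2f} ms")
print(f"0.1st-percentile DRT (refresh period): {refresh_period * 1e3:.3f} ms")
```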
