Abstract

Gain-cell embedded DRAM (GC-eDRAM) is an interesting alternative to SRAM for reasons such as high density, low bitcell leakage, logic compatibility, and suitability for 2-port memories. The major drawbacks of GC-eDRAMs are their limited data retention times (RTs) and the large spread of RT across an array, which degrade energy efficiency due to refresh cycles. While the array refresh rate can be determined through circuit simulation or post-manufacturing calibration, there is a lack of analytical and statistical RT models for GC-eDRAM that could unveil the limiters and circuit parameters behind the large observed RT spreads. In this work, we derive the first comprehensive analytical model for the statistical distribution of the per-cell retention time of 2T-bitcell GC-eDRAMs, which is found to follow a log-normal distribution. The accuracy of the proposed retention time model is verified by extensive Monte Carlo and worst-case distance circuit simulations and by silicon measurements of a 0.18 µm test chip. Furthermore, a sensitivity analysis unveils the circuit parameters with the highest impact on the RT spread. Interestingly, the variability of the threshold voltage of the write access transistor has a much higher impact on the RT spread than the variability of any other circuit parameter, including the storage node capacitance. This holds true under process scaling, for nodes as advanced as 28 nm, as shown through simulation. The insights gained from the retention time model help circuit designers achieve GC-eDRAMs with longer RTs and sharper RT distributions. In addition, the model presented herein can serve as a basis for studying the reliability/energy trade-off of GC-eDRAM in fault-tolerant VLSI systems.
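To illustrate why a log-normal RT distribution arises, consider that the storage node is discharged mainly by subthreshold leakage through the write access transistor, which depends exponentially on that transistor's threshold voltage; a Gaussian spread on the threshold voltage then makes the retention time log-normal. The following minimal NumPy sketch demonstrates this mechanism under a simplified retention model (it is not the paper's full analytical model), and every device parameter in it (I0, C_SN, dV_max, the Vt mean and sigma, and the slope factor) is a hypothetical placeholder:

```python
import numpy as np

# Simplified retention model (illustrative, not the paper's full model):
#   I_leak = I0 * exp(-Vt / (n * vT))        subthreshold leakage
#   RT     = C_SN * dV_max / I_leak          time until max tolerable droop
# With Vt ~ Normal, log(RT) is linear in Vt, so RT is log-normal.

rng = np.random.default_rng(0)

n_cells = 1_000_000   # number of simulated bitcells
vT = 0.0259           # thermal voltage kT/q at 300 K [V]
n_slope = 1.3         # subthreshold slope factor (assumed)
I0 = 1e-9             # leakage prefactor [A] (assumed)
C_SN = 1e-15          # storage node capacitance [F] (assumed)
dV_max = 0.3          # tolerable storage-node voltage droop [V] (assumed)

# Gaussian threshold-voltage variation of the write access transistor
Vt = rng.normal(loc=0.45, scale=0.03, size=n_cells)  # mean/sigma assumed

I_leak = I0 * np.exp(-Vt / (n_slope * vT))  # per-cell leakage current
RT = C_SN * dV_max / I_leak                 # per-cell retention time

# Check that log(RT) is Gaussian with the analytically expected spread
log_rt = np.log(RT)
print(f"mean(log RT) = {log_rt.mean():.3f}, std(log RT) = {log_rt.std():.3f}")
print(f"predicted std(log RT) = sigma_Vt / (n * vT) = {0.03 / (n_slope * vT):.3f}")
```

The exponential Vt dependence also explains the sensitivity result quoted in the abstract: a parameter entering RT only linearly (such as the storage node capacitance) shifts log(RT) logarithmically, whereas Vt variation shifts it linearly, so Vt variability dominates the RT spread.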
