Abstract

With the growing number of process variation sources in deeply scaled nanometer technologies, parameterized device and circuit modeling is becoming increasingly important for chip design and verification. However, the high dimensionality of the parameter space in process variation analysis poses a serious modeling challenge for emerging VLSI technologies. These parameters correspond to various inter-die and intra-die variations and considerably increase the difficulty of design validation. Today's response surface models and the most commonly used parameter reduction methods, such as Principal Component Analysis (PCA) and Independent Component Analysis (ICA), restrict the reduction to linear or quadratic forms and do not capture higher-order nonlinear dependencies between process and performance parameters. In this paper, we propose and validate a feature selection method that reduces the circuit modeling complexity associated with high parameter dimensionality. The method relies on learning-based nonlinear sparse regression and selects parameters in the original input space rather than constructing a new one. It handles mixed Gaussian and non-Gaussian parameters and yields a more precise selection by accounting for nonlinear statistical dependencies between input and output parameters. We demonstrate the method on digital circuit timing analysis in both FinFET and Silicon Nanowire technologies. The results confirm that it significantly reduces the number of required simulations while keeping estimation error small.
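The abstract does not spell out the regression formulation, so the following is only an illustrative sketch of the general idea of nonlinear sparse regression for input-space feature selection, in the style of HSIC Lasso; it is not the authors' exact method. It assumes numpy and scikit-learn, and the function names and toy data are hypothetical. Each input parameter is represented by a centered kernel Gram matrix, and a sparse non-negative regression of the output's Gram matrix onto the per-parameter Gram matrices assigns each parameter a relevance weight; nonzero weights mark the selected parameters.

import numpy as np
from sklearn.linear_model import Lasso

def gaussian_gram(v, sigma):
    """Centered, Frobenius-normalized Gaussian Gram matrix of a 1-D variable."""
    d = v[:, None] - v[None, :]
    K = np.exp(-d**2 / (2.0 * sigma**2))
    n = len(v)
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    G = H @ K @ H
    return G / (np.linalg.norm(G) + 1e-12)

def hsic_lasso_select(X, y, lam=0.01, sigma=1.0):
    """
    Rank input parameters by nonlinear relevance to the output y.
    Solves a sparse, non-negative least-squares fit of the output Gram
    matrix by the per-parameter Gram matrices (HSIC-Lasso-style sketch).
    `sigma` is the kernel width; data is assumed roughly standardized.
    """
    n, p = X.shape
    # Vectorize the Gram matrices; the factor n keeps entries at O(1)
    # so the regularization strength `lam` is on a sane scale.
    Ly = n * gaussian_gram(y, sigma).ravel()
    Phi = np.column_stack(
        [n * gaussian_gram(X[:, k], sigma).ravel() for k in range(p)]
    )
    model = Lasso(alpha=lam, positive=True, fit_intercept=False)
    model.fit(Phi, Ly)
    return model.coef_   # nonzero entries mark selected parameters

# Toy usage: the output depends nonlinearly on parameters 0 and 2 only.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = np.sin(X[:, 0]) + X[:, 2]**2 + 0.1 * rng.standard_normal(200)
w = hsic_lasso_select(X, y)
print(np.argsort(w)[::-1][:3])   # indices of the strongest parameters

Because selection happens in the original input space, as the abstract emphasizes, the chosen parameters retain their physical meaning (unlike PCA/ICA components), and only the selected subset needs to be swept in subsequent circuit simulations, which is how the simulation count is reduced.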
