Abstract

This paper proposes a novel approach to modeling gate-level timing errors during high-level instruction set simulation. In contrast to conventional, purely random fault injection, our physically motivated approach directly relates to the underlying circuit structure and hence allows a significantly more detailed characterization of application performance under scaled frequency/voltage, including supply noise. The model uses gate-level timing statistics, extracted by dynamic timing analysis from the post-place-and-route netlist of a general-purpose processor, to perform instruction-aware fault injection. We employ a 28 nm OpenRISC core as a case study to demonstrate how statistical fault injection provides a more accurate and realistic analysis of power vs. error performance.
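To illustrate the idea of instruction-aware statistical fault injection, the following minimal Python sketch shows how per-instruction timing-error probabilities could drive fault injection in the execute stage of an instruction set simulator. The opcode names and probability values are hypothetical placeholders; in the flow described above they would be derived from the gate-level timing statistics of the dynamic timing analysis, not hard-coded.

```python
# Minimal sketch (not from the paper): instruction-aware statistical fault
# injection during instruction set simulation. Probabilities are hypothetical
# stand-ins for the netlist-level dynamic timing analysis results.
import random

# Hypothetical per-opcode probability of a timing violation at a given
# scaled voltage/frequency operating point.
ERROR_PROB = {
    "l.add": 1e-6,
    "l.mul": 4e-4,   # longer critical path -> more likely to miss timing
    "l.lwz": 5e-5,
}

def execute_with_faults(opcode, result, rng=random):
    """Return the architecturally visible result of one instruction,
    flipping a random result bit when a timing error is injected."""
    p = ERROR_PROB.get(opcode, 0.0)
    if rng.random() < p:
        bit = rng.randrange(32)    # assume a 32-bit datapath
        result ^= 1 << bit         # model the timing error as a bit flip
    return result

# Usage: fold the fault model into the simulator's execute stage.
faulty_result = execute_with_faults("l.mul", 6 * 7)
```

In a full simulator the error probabilities would also depend on operand values and the chosen operating point, which is what distinguishes this approach from purely random fault injection.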
