000203972 001__ 203972
000203972 005__ 20180913062907.0
000203972 0247_ $$2doi$$a10.1145/2594291.2594316
000203972 022__ $$a0362-1340
000203972 02470 $$2ISI$$a000344455800008
000203972 037__ $$aCONF
000203972 245__ $$aSurgical Precision JIT Compilers
000203972 260__ $$bAssociation for Computing Machinery$$c2014$$aNew York
000203972 269__ $$a2014
000203972 300__ $$a12
000203972 336__ $$aConference Papers
000203972 520__ $$aJust-in-time (JIT) compilation of running programs provides more optimization opportunities than offline compilation. Modern JIT compilers, such as those in virtual machines like Oracle's HotSpot for Java or Google's V8 for JavaScript, rely on dynamic profiling as their key mechanism to guide optimizations. While these JIT compilers offer good average performance, their behavior is a black box and the achieved performance is highly unpredictable. In this paper, we propose to turn JIT compilation into a precision tool by adding two essential and generic metaprogramming facilities: First, allow programs to invoke JIT compilation explicitly. This enables controlled specialization of arbitrary code at run-time, in the style of partial evaluation. It also enables the JIT compiler to report warnings and errors to the program when it is unable to compile a code path in the demanded way. Second, allow the JIT compiler to call back into the program to perform compile-time computation. This lets the program itself define the translation strategy for certain constructs on the fly and gives rise to a powerful JIT macro facility that enables "smart" libraries to supply domain-specific compiler optimizations or safety checks. We present Lancet, a JIT compiler framework for Java bytecode that enables such a tight, two-way integration with the running program. Lancet itself was derived from a high-level Java bytecode interpreter: staging the interpreter using LMS (Lightweight Modular Staging) produced a simple bytecode compiler. Adding abstract interpretation turned the simple compiler into an optimizing compiler. This fact provides compelling evidence for the scalability of the staged-interpreter approach to compiler construction. In the case of Lancet, JIT macros also provide a natural interface to existing LMS-based toolchains such as the Delite parallelism and DSL framework, which can now serve as accelerator macros for arbitrary JVM bytecode.
000203972 6531_ $$aDesign
000203972 6531_ $$aLanguages
000203972 6531_ $$aPerformance
000203972 6531_ $$aJIT Compilation
000203972 6531_ $$aStaging
000203972 6531_ $$aProgram Generation
000203972 700__ $$0243345$$g185682$$uEPFL, Lausanne, Switzerland$$aRompf, Tiark
000203972 700__ $$aSujeeth, Arvind K.
000203972 700__ $$aBrown, Kevin J.
000203972 700__ $$aLee, Hyoukjoong
000203972 700__ $$aChafi, Hassan
000203972 700__ $$aOlukotun, Kunle
000203972 7112_ $$dJUN 09-11, 2014$$cEdinburgh, Scotland$$a35th ACM SIGPLAN Conference on Programming Language Design and Implementation (PLDI)
000203972 773__ $$j49$$tACM SIGPLAN Notices$$k6$$q41-52
000203972 909C0 $$xU10409$$0252187$$pLAMP
000203972 909CO $$pconf$$pIC$$ooai:infoscience.tind.io:203972
000203972 917Z8 $$x166927
000203972 937__ $$aEPFL-CONF-203972
000203972 973__ $$rREVIEWED$$sPUBLISHED$$aEPFL
000203972 980__ $$aCONF