Ahead-of-time compilation

In computer science, ahead-of-time (AOT) compilation is the act of compiling a higher-level programming language such as C or C++, or an intermediate representation such as Java bytecode or .NET Framework Common Intermediate Language (CIL) code, into native (system-dependent) machine code so that the resulting binary file can execute natively.

AOT compilation produces optimized machine code, just like a standard native compiler. The difference is that AOT transforms the bytecode of an existing virtual machine (VM) into machine code.
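
A minimal sketch of this pipeline, assuming Java as the source language and GraalVM's native-image tool as one possible AOT compiler (neither is prescribed by the text above): the class below is first compiled to Java bytecode with javac, and that bytecode is then compiled ahead of time into a native executable.

```java
// Hello.java -- a minimal class used only to illustrate the AOT pipeline.
public class Hello {
    public static void main(String[] args) {
        System.out.println("Hello, AOT");
    }
}

// Typical steps (shown as comments; native-image is one example AOT tool and
// assumes a GraalVM installation):
//   javac Hello.java     ->  Hello.class  (Java bytecode, the intermediate representation)
//   native-image Hello   ->  ./hello      (native, system-dependent executable)
// The resulting binary runs directly on the target CPU, with no bytecode
// interpreter or JIT compiler involved at run time.
```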

Reduced runtime overhead

Some programming languages with a managed code runtime that can be compiled to an intermediate representation use just-in-time (JIT) compilation. A JIT compiler translates the intermediate code into machine code while that code is executing, which may slow an application's performance, particularly at startup. Ahead-of-time compilation eliminates this overhead by performing the translation before execution rather than during it.
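
As a rough illustration of that runtime cost, the sketch below (in Java, with a hypothetical Warmup class) times the same hot method over several rounds. On a JIT-based VM the early rounds typically take longer because the method is still being interpreted or compiled while it runs; an AOT-compiled binary starts with the machine code already in place.

```java
// Warmup.java -- hedged sketch of JIT warmup; exact timings depend on the VM.
public class Warmup {
    static long sum(int n) {
        long s = 0;
        for (int i = 0; i < n; i++) {
            s += i;
        }
        return s;
    }

    public static void main(String[] args) {
        long total = 0;  // keep results live so the work is not eliminated
        for (int round = 0; round < 5; round++) {
            long t0 = System.nanoTime();
            total += sum(10_000_000);
            long t1 = System.nanoTime();
            // On a JIT VM, early rounds are usually slower than later ones.
            System.out.printf("round %d: %d us%n", round, (t1 - t0) / 1_000);
        }
        System.out.println(total);
    }
}
```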

Ahead-of-time compilation of dynamically typed languages to native machine code or other static VM bytecode is possible only in a limited number of cases. For example, the High Performance Erlang Project (HiPE) AOT compiler for the language Erlang can do this because of advanced static type reconstruction techniques and type speculation.
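
As a rough analogy (in Java rather than Erlang, and not taken from the HiPE project itself), the sketch below contrasts generic code, which must check operand types at run time, with the specialized code a compiler can emit once types have been reconstructed or speculated.

```java
// TypeSpecialization.java -- illustrative only; the class and method names are hypothetical.
public class TypeSpecialization {
    // Without type information, the compiler must emit generic code that
    // inspects the runtime type of every operand.
    static Object addGeneric(Object a, Object b) {
        if (a instanceof Integer && b instanceof Integer) {
            return (Integer) a + (Integer) b;   // unbox, add, re-box
        }
        if (a instanceof Double && b instanceof Double) {
            return (Double) a + (Double) b;
        }
        throw new IllegalArgumentException("unsupported operand types");
    }

    // Once the types are known to be int, the compiler can emit a single
    // machine add instruction with no checks or boxing.
    static int addInts(int a, int b) {
        return a + b;
    }

    public static void main(String[] args) {
        System.out.println(addGeneric(2, 3)); // goes through the runtime type checks
        System.out.println(addInts(2, 3));    // straight-line specialized code
    }
}
```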

When programs and libraries are fully AOT compiled, it is often possible to drop a useful fraction of the runtime environment, saving disk space, memory, and battery life, and reducing startup time (there is no JIT warmup phase). Because of this, AOT compilation can be especially useful for embedded or mobile devices.

Performance trade-offs

AOT compilers can perform complex and advanced code optimizations that would, in most JIT settings, be considered much too costly. In contrast, AOT usually cannot perform some optimizations possible in a JIT, such as runtime profile-guided optimization, pseudo-constant propagation, or indirect/virtual function inlining.
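
The sketch below (hypothetical Java classes, not from the text above) shows why indirect/virtual function inlining is difficult ahead of time: at the call site in total(), the concrete receiver class is only known at run time, so an AOT compiler must emit an indirect call, whereas a JIT that has only ever observed one implementation can inline it behind a cheap type check.

```java
// Virtual.java -- hedged illustration of a virtual call site.
interface Shape {
    double area();
}

class Circle implements Shape {
    final double r;
    Circle(double r) { this.r = r; }
    public double area() { return Math.PI * r * r; }
}

class Square implements Shape {
    final double s;
    Square(double s) { this.s = s; }
    public double area() { return s * s; }
}

public class Virtual {
    static double total(Shape[] shapes) {
        double sum = 0;
        for (Shape shape : shapes) {
            sum += shape.area();   // virtual call: the receiver type is unknown statically
        }
        return sum;
    }

    public static void main(String[] args) {
        System.out.println(total(new Shape[] { new Circle(1), new Square(2) }));
    }
}
```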

Further, JIT compilers can speculatively optimize hot code by making assumptions about it. The generated code can be deoptimized if a speculative assumption later proves wrong. Such a deoptimization slows the performance of the running software until the code is optimized again by adaptive optimization. An AOT compiler cannot make such assumptions and must infer as much information as possible at compile time. It has to resort to less specialized code because it cannot know which concrete types will flow through a method. Such problems can be alleviated by profile-guided optimization, but even then the generated code cannot adapt dynamically to a changing runtime profile as a JIT compiler would.
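
A minimal sketch of this speculation-and-deoptimization cycle, using a hypothetical process() method in Java: a JIT that has only ever seen non-negative inputs may compile the method assuming the negative branch is never taken, and must deoptimize and recompile when that assumption first fails. An AOT compiler has to keep both branches in the generated code from the start.

```java
// Speculation.java -- hedged illustration; real JIT behavior varies by VM.
public class Speculation {
    static long processed = 0;

    static void process(int value) {
        if (value >= 0) {
            processed += value;            // the only path observed during warmup
        } else {
            processed -= negate(value);    // rare path a JIT might speculate away
        }
    }

    static long negate(int value) {
        return -(long) value;
    }

    public static void main(String[] args) {
        for (int i = 0; i < 1_000_000; i++) {
            process(i);                    // only the non-negative path is exercised
        }
        process(-42);                      // first failure of the assumption: a JIT would deoptimize here
        System.out.println(processed);
    }
}
```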

Example

The Android mobile operating system was delivered in 2008 with Dalvik, a virtual machine using a JIT compiler. In 2013, it was replaced by Android Runtime (ART), a new virtual machine using AOT compilation, but in 2017 ART also received a JIT compiler.[1]

References

  1. "Implementing ART Just-In-Time (JIT) Compiler". android.com. Retrieved 25 January 2018.