
Despite appearances to the contrary, work is always afoot to improve the performance of pretty much every operating system out there – and that includes Android. Google has revealed details of its latest technique for speeding things up.
This time the work has been done at the kernel level with Automatic Feedback-Directed Optimization (AutoFDO). If you are curious about just what this is, Google has a detailed explanation.
Google says that it has concentrated on the kernel because it accounts for 40 percent of CPU time in Android, meaning there is plenty of room for improvements that will be noticeable. The company says the work has resulted in not only faster launch times for apps, but also faster device boot times.
Going on to explain just what AutoFDO is, Google says:
During a standard software build, the compiler makes thousands of small decisions, such as whether to inline a function and which branch of a conditional is likely to be taken, based on static code hints. While these heuristics are useful, they don’t always accurately predict code execution during real-world phone usage.
AutoFDO changes this by using real-world execution patterns to guide the compiler. These patterns represent the most common instruction execution paths the code takes during actual use, captured by recording the CPU’s branching history. While this data can be collected from fleet devices, for the kernel we synthesize it in a lab environment using representative workloads, such as running the top 100 most popular apps. We use a sampling profiler to capture this data, identifying which parts of the code are ‘hot’ (frequently used) and which are ‘cold’. When we rebuild the kernel with these profiles, the compiler can make much smarter optimization decisions tailored to actual Android workloads.
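The general open-source AutoFDO flow that the quoted passage describes can be sketched for an ordinary user-space binary (the kernel build is more involved, and the file names and workload flag here are placeholders):

```shell
# 1. Build a release binary with debug info, so samples can be mapped
#    back to source lines.
clang -O2 -g -o mybinary main.c

# 2. Record the CPU's branch history (LBR) while running a
#    representative workload, using the Linux perf sampling profiler.
perf record -b -e br_inst_retired.near_taken -- ./mybinary --workload

# 3. Convert the raw samples into a profile the compiler understands,
#    with the create_llvm_prof tool from the AutoFDO project.
create_llvm_prof --binary=./mybinary --profile=perf.data --out=prof.afdo

# 4. Rebuild with the sample profile; the compiler now knows which
#    paths are hot and can inline and lay out code accordingly.
clang -O2 -g -fprofile-sample-use=prof.afdo -o mybinary main.c
```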
The blog post in which Google discusses AutoFDO goes on to provide quite detailed information about how the optimized processes work, something that will be of interest to developers. Addressing concerns about possible unwanted side effects of optimization, the company adds:
A common question with profile-guided optimization is whether it introduces stability risks. Because AutoFDO primarily influences compiler heuristics, such as function inlining and code layout, rather than altering the source code’s logic, it preserves the functional integrity of the kernel. This technology has already been proven at scale, serving as a standard optimization for Android platform libraries, ChromeOS, and Google’s own server infrastructure for years.
To further guarantee consistent behavior, we apply a “conservative by default” strategy. Functions not captured in our high-fidelity profiles are optimized using standard compiler methods. This ensures that the “cold” or rarely executed parts of the kernel behave exactly as they would in a standard build, preventing performance regressions or unexpected behaviors in corner cases.
You can check out the full blog post here.
