Announcing ElectricAccelerator JobCache

Today, we are excited to announce a new add-on to ElectricAccelerator 8.0 called JobCache.

What's the big deal, you ask, since ElectricAccelerator already super-accelerates your software builds and tests? Yes, it is true that ElectricAccelerator has achieved some pretty amazing things over its seven product generations. Customers have told us that they see up to 20x acceleration in their build times (and some have even reported 65x!). We've accomplished this through massive parallelization.

This means that to get even higher speeds, you need to throw more hardware at the build problem and parallelize even more. But what happens when you can't parallelize any further? Does that mean you've smacked into the proverbial brick wall?

Don’t do it at all (unless you really have to)!

This got us thinking: the fastest way to do something is not to do it at all, unless you really, really have to. Remember how many of us dealt with chores as kids? Don't do it at all (unless you really have to)!

In other words, we started exploring object file caching. This isn't necessarily novel, as other object file caching technologies are available. But during our preliminary research, our customer design partners told us that they chose not to use the existing object file caching technologies because those tools didn't reliably produce the same results every time.

We knew right then that we had an advantage with ElectricAccelerator, because ElectricAccelerator guarantees deterministic builds that produce the same results every time.

Getting to Work

Once we had a clear vision of what we needed to do, we got to work pronto.

We used our own software, ElectricAccelerator, as the initial benchmark for JobCache. This had two benefits: the source was readily available, and we had an extensive test suite to verify the correctness of the cache behavior. When we built ElectricAccelerator with JobCache enabled on our CI builds, we regularly saw over 230% acceleration and a 97% cache hit rate.

Let's sanity-check that: ElectricAccelerator is almost 1M SLOC, and industry-standard code churn is on the order of 100 SLOC per developer. By rebuilding only the changed components and replaying the rest from cache, JobCache reduced our build time to less than half of what it used to be.
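A rough back-of-the-envelope calculation shows why that is plausible. The figures below are illustrative assumptions for the sake of the example (the team size in particular is hypothetical), not measurements from our CI system:

# Back-of-the-envelope: what fraction of the tree changes between builds?
# All figures here are illustrative assumptions, not measured data.
total_sloc = 1_000_000        # ElectricAccelerator is almost 1M SLOC
churn_per_dev = 100           # roughly 100 changed SLOC per developer
developers = 30               # hypothetical team size for the example

changed_sloc = churn_per_dev * developers
churn_fraction = changed_sloc / total_sloc
print(f"{changed_sloc} changed SLOC per build ({churn_fraction:.1%} of the tree)")
# -> 3000 changed SLOC per build (0.3% of the tree)

With well under one percent of the source changing between builds, the vast majority of compile jobs can simply be replayed from cache, which is consistent with the ~97% hit rate we observed.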

So encouraging news indeed…

Next, we built Android Lollipop with JobCache enabled. JobCache provided an additional 500% speedup over the acceleration provided by ElectricAccelerator alone (which was itself significant). If you are curious about build-time comparisons for Android Lollipop, check out this blog post.

The Android results made us a bit more ambitious, so we ran a benchmark against a caching alternative. We compared ElectricMake with JobCache against GNU Make with ccache (v3.1.9), and ElectricMake with JobCache consistently performed better than GNU Make with ccache. See for yourself!

[Graph: build times for ElectricMake with JobCache vs. GNU Make with ccache]

The message is clear: even though building with ElectricAccelerator is fast, adding JobCache makes it even faster. 

But one of the things that surprised us as we began engaging with our early customer design partners was that they saw value in the flexibility JobCache provides. In the past, if they needed to decrease their build times (or maintain them as their SLOC increased), they would need more hardware, which resulted not only in higher budgets but also in higher operational costs (rack space, power, cooling, etc.).

Yippee!

The Rubber Meets the Road

But the picture would not be complete without some "real-world" data (a.k.a. the world where outages and legacy "will-optimize-next" and "didn't-know-it-was-still-there" code exist) from our customers' test laboratories.

We asked customers to help us design the next acceleration booster, and they were happy to oblige: everyone understands the value of fast, accurate builds, and no one wants to be left behind. These customers span multiple verticals, company sizes, and IT/build infrastructures. As expected, no two of their test projects are alike in size, composition, or update characteristics. All of our design partners saw at least a 25% speedup, regardless of whether they ran on physical or virtual build infrastructure. Two leading telecom vendors reported upwards of a 45% speedup!

How does JobCache work?

Here is what's happening under the hood. With JobCache enabled, we create cache "slots" keyed by the combination of environment variables, current working directory, and command-line options. A slot can be empty, or it can hold a previously computed outcome along with information about the inputs that produced that outcome.

In the most likely case, where the cache slot already has content, we compare the inputs to determine whether anything has changed. If any input doesn't match, ElectricAccelerator rebuilds the object; the inputs include all source files, gcc precompiled headers, and the compilation tools used. Conversely, when all of the inputs match, the outputs saved in the cache slot are reused. The sketch below illustrates the idea.
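To make that concrete, here is a minimal sketch of the lookup logic in Python. It is purely illustrative: the function names, the dictionary-based cache, and the SHA-256 digests are assumptions made for this example, not details of ElectricAccelerator's actual implementation.

import hashlib
import json

# Illustrative model of a job cache: slots keyed by (environment, working
# directory, command line); each filled slot records the digests of the
# inputs that produced its outputs.

def slot_key(env, cwd, argv):
    """Key a cache slot by the combination of environment variables,
    current working directory, and command-line options."""
    blob = json.dumps([sorted(env.items()), cwd, list(argv)]).encode()
    return hashlib.sha256(blob).hexdigest()

def digest(path):
    """Content hash of one input: a source file, a precompiled header,
    or a compilation tool."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def run_job(cache, env, cwd, argv, run_compiler):
    key = slot_key(env, cwd, argv)
    slot = cache.get(key)

    # Reuse the slot only if every recorded input still hashes to the
    # value it had when the slot was filled.
    if slot is not None and all(digest(p) == d for p, d in slot["inputs"].items()):
        return slot["outputs"]              # replay the cached outputs

    # Otherwise rebuild, then refill the slot with the new outputs and
    # the digests of the inputs that produced them.
    outputs, inputs_used = run_compiler(env, cwd, argv)
    cache[key] = {"inputs": {p: digest(p) for p in inputs_used},
                  "outputs": outputs}
    return outputs

The essential property is that a cached outcome is replayed only when the environment, working directory, command line, and every recorded input all match; any mismatch falls back to a real rebuild, which is what keeps cached results consistent with a from-scratch build.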

Jane Kuo

Jane Kuo is a product manager at Electric Cloud responsible for the ElectricAccelerator product family. Jane comes to Electric Cloud with over a decade of industry experience building cool products. Prior to joining Electric Cloud, she was with Guidewire Software, where she helped define a new Business Intelligence strategy and delivered the first product offering. Jane was also at VMware, where she helped deliver the award-winning VMware Workstation and was responsible for the introduction of the (free!) VMware Player.

Jane holds a BS in Computer Science, a BS in Management Science, and an MEng in Computer Science from MIT.