Problem/Objective

Like many large embedded-software organizations, Intel worked on a quarterly release cycle, with each release taking between several days and two weeks to complete. Each application consisted of 16 different features, each of which had to be verified on five different platforms and 10 chip variations, meaning 96 permutations of hardware and software per product across 3 million lines of code. Each permutation required hundreds of test combinations to validate every release. Every new software release had to work seamlessly on all previous generations and couldn’t break any legacy code. The relatively small team of 15 test engineers had to work with a constantly growing matrix of more than 20,000 tests, yet could only execute a few hundred tests per day.

Unfortunately, the team was so badly outnumbered that it was forced to deliver software nearly blind. “85 percent of our code was of unknown quality, which caused a lot of anxiety for everyone,” said Manish Aggarwal, Software Engineering Manager at Intel. “We simply didn’t know what we were shipping out.”

“Despite the growing number and complexity of tests for releases, we couldn’t ask for more people,” continued Aggarwal. “The business reality is that the budget and headcount are limited, so the only solution is to deliver; deliver with the same number of people; and deliver with known quality.”

  • QA lead time per CI cycle was between a few days and a few weeks
  • The team could only run ~100 test cases per day
  • QA engineers spent their time executing tests rather than designing more effective ones
  • No visibility into code quality or history
  • Software delivery was “Push and Pray”
  • Release velocity was limited to 4 releases per year; going faster meant sacrificing features
  • Adding new features meant slower delivery times

Solution

Manish and his team realized that the only way to resolve these growing problems was to embrace test automation. Automation would “shift quality left” by running tests earlier in the process, so developers could get feedback faster, before software went into production.
“Machines are much better at performing repetitive tasks,” says Aggarwal. “Instead of spending time executing what already existed, engineers would be freed up to spend their time designing more test cases to maximize the quality of their product.”
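
As a rough illustration of the repetitive execution being automated – not Intel’s actual test code – a single parameterized test in a framework such as pytest can fan one script out across the platform and chip matrix; the feature, platform, and chip names below are hypothetical placeholders.

    # Hypothetical sketch: one test script exercised across a platform/chip matrix,
    # so the framework handles the repetitive fan-out instead of an engineer.
    import pytest

    PLATFORMS = ["platform_a", "platform_b", "platform_c"]    # placeholder names
    CHIP_VARIANTS = ["chip_rev1", "chip_rev2", "chip_rev3"]   # placeholder names

    def flash_and_boot(platform: str, chip: str) -> bool:
        """Stand-in for provisioning and booting the target environment."""
        return True

    @pytest.mark.parametrize("platform", PLATFORMS)
    @pytest.mark.parametrize("chip", CHIP_VARIANTS)
    def test_feature_smoke(platform, chip):
        # pytest generates one test case per (platform, chip) pair, so adding a
        # new variant means adding a list entry, not writing a new test.
        assert flash_and_boot(platform, chip)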

The team devised their test automation solution using Electric Cloud ElectricFlow based on the following key principles:

  • Test “On-Demand”
    Anybody who wants to run a test – whether they’re from QA, Development, or Support – can do so from either the self-service catalog or the CLI, without any dependence on a test or infrastructure engineer. Environment provisioning and configuration are built in, ensuring all tests run in the same environment (see the sketch after this list).
  • Share / Reuse tests
    It’s not possible to write separate test automation for every test, so all test scripts handle more than one test suite, multiple features, and multiple blocks. By building in reusability, test automation naturally becomes faster and more efficient.
  • Share appropriate access
    Even though tens of thousands of tests were available via the ElectricFlow self-service catalog and other access methods, Intel still needed a way to make them easy to find and use. The team leveraged the ElectricFlow APIs to create a custom data-entry form that finds applicable tests from a limited set of key search parameters, such as firmware version and environment variables.
  • Share appropriate visibility
    All stakeholders can see product quality at any time, through real-time reports of which tests are running, when they last passed, and which tests are failing and why. The reports also let stakeholders easily drill down into the details.
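
A minimal sketch of how this kind of on-demand, searchable test launch might be scripted is shown below. It assumes ElectricFlow’s ectool command-line client; the catalog, project, procedure, and parameter names are invented stand-ins for the custom search form Intel built on the ElectricFlow APIs.

    # Hypothetical sketch of on-demand test selection and launch.
    # The catalog stands in for the custom search form built on ElectricFlow APIs;
    # project, procedure, and parameter names are illustrative only.
    import subprocess

    TEST_CATALOG = [
        {"procedure": "Run_Boot_Tests",   "firmware": "fw_2.1", "env": "lab_rack_1"},
        {"procedure": "Run_Power_Tests",  "firmware": "fw_2.1", "env": "lab_rack_2"},
        {"procedure": "Run_Legacy_Suite", "firmware": "fw_1.9", "env": "lab_rack_1"},
    ]

    def find_tests(firmware: str, env: str):
        """Filter the catalog by a small set of key search parameters."""
        return [t for t in TEST_CATALOG if t["firmware"] == firmware and t["env"] == env]

    def launch(test: dict, project: str = "HardwareValidation"):
        """Kick off a test procedure via ectool; environment provisioning is assumed
        to be part of the procedure itself, so every caller gets the same setup."""
        subprocess.run(
            ["ectool", "runProcedure", project,
             "--procedureName", test["procedure"],
             "--actualParameter",
             f"firmware={test['firmware']}", f"environment={test['env']}"],
            check=True,
        )

    if __name__ == "__main__":
        for test in find_tests(firmware="fw_2.1", env="lab_rack_1"):
            launch(test)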

“In addition to these design principles, ElectricFlow allowed us to have a truly metrics-driven solution,” says Aggarwal. “This included a bird’s-eye view for our management into all of the pertinent metrics.”
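
As an illustration of the kind of metrics roll-up described here – not the actual ElectricFlow reports – a few lines of aggregation are enough to turn raw test results into a per-feature pass/fail view that stakeholders can drill into; the result records below are invented.

    # Hypothetical sketch of a quality roll-up; the result records are invented
    # and stand in for the data collected during pipeline runs.
    from collections import Counter

    results = [
        {"feature": "boot",  "test": "cold_boot",  "status": "pass"},
        {"feature": "boot",  "test": "warm_reset", "status": "fail"},
        {"feature": "power", "test": "sleep_wake", "status": "pass"},
    ]

    def summarize(records):
        """Count passes and failures per feature for an at-a-glance quality view."""
        summary = {}
        for r in records:
            summary.setdefault(r["feature"], Counter())[r["status"]] += 1
        return summary

    for feature, counts in summarize(results).items():
        print(f"{feature}: {counts['pass']} passed, {counts['fail']} failed")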

Results

Intel is now delivering software 25 times more frequently with ElectricFlow. Rather than quarterly deployment cycles, the team is now deploying every two weeks, on average – with no additional headcount. In addition, known code coverage has grown from 15% (85% of the code was previously of unknown quality) to 87%, so the team now has confidence in the quality of the code it is shipping.

ElectricFlow has helped to transform the team from “test executors” to “test engineers”. Each team member is adding more value, is more motivated, and is doing a better job.

Intel now benefits from streamlined, visible workflows, as well as a shared language and metrics. Approval gates, change requests, and subsequent pipeline stages are fully auditable. QA lead time has gone from days to minutes; testing is now a continuous process that relies less on individuals and more on automation.

  • Transformed team from test executors to motivated test engineers
  • 25X improvement in software delivery with the same size QA team
  • Known code coverage grew from 15% to over 87%
  • QA lead time per CI cycle is now 10 minutes
  • Over 15,000 test cases run per day
  • Can add more tests without increasing headcount
  • Full visibility into pipeline stages, approval gates, and change requests
  • Framework is in place to confidently release new features on business demand

“The biggest benefit we’ve reaped from ElectricFlow is that now we have known quality for the software we’re shipping,” says Aggarwal. “It is essential and gratifying to be able to stand by the quality of our products, rather than hoping that it doesn’t fail at the customer’s site.”