Last Updated on December 15, 2020 by Umme Kulsum
We live in an interesting era where rapid changes in customer demand and high expectations for quality are quickly making standard software engineering and quality practices obsolete. To keep up with the pace of change, we can no longer have long development life cycles followed by standard tests and quality assurance practices. That is a sure way to fall behind your competition, or worse, to cheat time-to-market by cutting corners on QA.
Apart from delivering customer features quickly and with great quality, we also have rapidly emerging security concerns to deal with. As performance, quality, and security testing all come together, delays in getting fixes and incremental improvements to market cannot be tolerated.
If there’s ever a time to overhaul your Quality Assurance practices, it’s now.
Structural quality is replacing transactional quality
In the software world, Quality Assurance has long been looked down upon and relegated to the back of the line. QA receives the output of the more important upstream functions, such as Project Management, Business Analysis, and Development.
Well, that’s all ready for a major change.
It’s time to make QA the surround that glues all of the organization’s functions together, while also bringing Customer Experience, Operations, and more into the process.
Process-centric disciplines, such as Six Sigma, have been on to this notion for years, completely transforming the quality of manufacturing and services. Rather than seeking incremental improvements in quality by improving the way outputs are inspected, this is about embedding quality into every stage of the game and changing the processes, approaches, and measurements–resulting in level-jumps in quality.
It’s time to apply these approaches to Quality Assurance as it exists in today’s software engineering and delivery lifecycles.
Reactive vs Proactive vs Predictive Quality Assurance in Software Engineering
To explain the difference between the three maturity levels of Quality Assurance for Software Engineering, let me use a car manufacturing example.
Reactive QA would check the parts and functions of an assembled car before it is shipped to the showroom. Testers would receive the completed car and go through checklists of tests. Perhaps they identify a gasket that won’t hold pressure or an incorrect brake pad. In this case, the part is fixed, retested, then shipped off to the dealer. Better Reactive QA shops might add inspections at major stages throughout the manufacturing process, rejecting defective items as they receive them.
Proactive QA would begin inserting sensors and automated checks throughout the manufacturing process. QA has the authority to drive changes in the entire manufacturing process. Improved equipment, processes, and measures are all initiated by the Quality Organization. The company is committed to quality first and foremost. As cars move down the assembly line, issues are identified in real-time along the line. Once the car is completed, it goes through a final checklist, but issues are virtually non-existent. The same car gets to the dealer’s showroom much faster.
Predictive QA takes all of the benefits of Proactive QA a step further and gets squarely in the customer’s business. Predictive QA organizations begin putting sensors everywhere throughout the car’s manufacturing process, in the car itself (to get data once in the hands of the customer), at the dealer, and in its repair shops. The car manufacturer is now collecting quality data across not just the Software Development Lifecycle, but the entire Product Lifecycle. Thanks to Big Data, IoT, and Analytics, the Quality Organization is now making novel discoveries and changes. QA impacts vendor management, procurement, and nearly every other aspect of the company related to the car’s production.
In software engineering, this can be visualized as follows.
Note: SDLC stands for Software Development Lifecycle. For proactive organizations, when I say QA encompasses the SDLC, I am including Project Management, Business Analysis, Software Engineering, Quality Test and Automation, and Customer Experience.
In this model, quality is a business decision and culture; not just a function.
How top software engineering organizations achieve fast time-to-market and exceptional quality
As with any great business practice, to transform the Quality Assurance function, people, processes, tools, and measurements all come into play.
Let’s start with the key factors and goals of Predictive Quality Assurance:
Zero Production Defects…and Measure, Measure, Measure
Does zero production defects sound crazy? It’s not. Predictive QA has its roots in Six Sigma. The term “Six Sigma” in fact means 3.4 defects per million opportunities. Expressed another way, that’s a 0.0000034 defect rate. Let’s just say 0.
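The arithmetic behind that claim is worth making concrete. A quick sketch, using only the figures from the paragraph above:

```python
# Six Sigma quality level: 3.4 defects per million opportunities (DPMO)
dpmo = 3.4
opportunities = 1_000_000

defect_rate = dpmo / opportunities   # 0.0000034, as stated above
yield_rate = 1 - defect_rate         # fraction of defect-free opportunities

print(f"Defect rate: {defect_rate:.7f}")   # 0.0000034
print(f"Yield: {yield_rate:.5%}")          # 99.99966%
```

At a yield of 99.99966%, rounding to "zero defects" as the working target is a fair simplification.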
This is the new standard and customer expectation. Apart from customer satisfaction, the cost of leaking defects to the customer is escalating rapidly, especially for companies that operate at scale (something Samsung can now firmly attest to). The new standard is precision and end-to-end quality across all platforms.
This is especially true as the reach of software explodes, thanks to factors such as IoT, which embeds complex software in billions of devices, both consumer- and enterprise-facing. A single defect that reaches millions of devices is extremely costly to address and may cause unrecoverable damage to the brand.
The future of Software Quality Assurance is in the measurements. Yes, it’s time to break out your statistics textbook. Analytics, data, and real-time measurements are the basis.
Agile and DevOps
Software engineering and QA must be able to adapt to change by being attached at the hip with customers and operations. This means Agile practices, and more specifically DevOps: development and operations functions combine to speed time-to-market. Quality is no longer after the fact; it is built in and integrated into every phase of the engineering lifecycle.
Adaptive, Uniform Technology
Having the right technology to gain real-time feedback and analytics across multiple platforms stretches the typical bounds of QA to include feedback from new channels, such as social. This technology also needs to be able to mimic end-to-end customer transactions in production environments. Analytics and measures are the new, sought-after output.
Upstream and Integrated Culture
Above all else, delivering predictive and proactive QA is a cultural mindset and philosophy that needs to be ingrained from leadership to delivery. QA now leads, rather than responds. QA makes organizational, technological, and process recommendations that need to be implemented by other organizations. Products and releases move rapidly. Being engaged with the customer trumps keeping a distance from the customer.
Customer Experience Focus
The days of a tester running through a set of test cases written by a BA are over. The true test is understanding the entire customer experience and engineering environments that mimic customer usage in actual customer environments. This means QA is systematically plugged into the customer experience. Processes and scripts are actively running in your production environment simulating end-to-end customer transactions and reporting defects in real-time. If there is an issue on Amazon.com, Amazon QA will know before customers begin to report it.
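One common way to plug QA into the live customer experience is a synthetic-transaction monitor: a script that periodically walks a real customer journey in production and flags failures before users report them. A minimal sketch follows; the journey steps and latency budget are illustrative assumptions, and in a real monitor each step would drive HTTP calls or UI automation against the production system:

```python
import time

# Hypothetical customer journey: each step is a callable returning True on success.
def search_catalog():
    return True

def add_to_cart():
    return True

def checkout():
    return True

JOURNEY = [("search", search_catalog), ("add_to_cart", add_to_cart), ("checkout", checkout)]
LATENCY_BUDGET_S = 2.0  # illustrative per-journey time budget

def run_synthetic_transaction():
    """Walk the journey end-to-end; return (passed, failed_step, elapsed_seconds)."""
    start = time.monotonic()
    for name, step in JOURNEY:
        if not step():
            return False, name, time.monotonic() - start
    elapsed = time.monotonic() - start
    return elapsed <= LATENCY_BUDGET_S, None, elapsed

passed, failed_step, elapsed = run_synthetic_transaction()
print(f"passed={passed} failed_step={failed_step} elapsed={elapsed:.3f}s")
```

Run continuously on a schedule and wired into alerting, a monitor like this is how QA learns about a production issue before the support queue does.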
The 6 Steps to Predictive Quality Assurance
Given the key factors above, here are the six steps to making the transition toward fast time-to-market, zero defects, and a predictive methodology.
Step 1: Culture and Job Types
New competencies and training are put in place to drive “old practices” out and build “new practices” into the fabric of the culture. This typically means bringing in senior test engineers, test management, and process improvement leaders who have direct experience leading teams through end-to-end proactive quality assurance across organizations.
You need people who understand software, know how to measure, and can matrix manage. This is your transitional core team and they need to infect the rest of your Quality Organization and ultimately the company as a whole.
Quality Assurance needs to have its staff moved into the lifecycle as well. QA is no longer downstream of project management, business analysis, and development. QA is ingrained and a leader in each of these functions. This is an organizational redesign and requires QA team members to have cross-functional leadership skills, which traditional testing teams may not have. This requires a nexus of skills and leadership across multiple team members.
A transition to Proactive Quality Assurance generally requires executive sponsorship.
Step 2: Process Changes
With the right team in place, it’s time to break down the silos.
First and foremost is customer awareness, insight, and focus. The teams must have insight into actual customer-facing metrics and understand how to drive impact.
Second is the adoption of an end-to-end test strategy that spans Project Managers, Business Analysts, Architects, Engineers, QA, and testers.
Last is the elimination of phase-specific processes and silos. Rather, the entire software development lifecycle is unified as a Quality Organization and allows for agility across the end-to-end process.
This is just a start. Process change will become the new norm. Once measurements are in place, root cause analysis will drive continuous improvement.
Step 3: Measurements
Predictive Quality Assurance relies on measurements. Measurements are king.
Process engineering, especially in manufacturing, has focused heavily on data capture and statistical analysis to drive structural change and deliver quality outputs that are orders of magnitude better than incremental improvements that can be achieved by reactive quality.
Having detailed metrics of the entire end-to-end customer experience allows the QA team to analyze, objectively, that experience. Analytics tools can sift through data, issue warnings and identify problems before humans (namely customers) can. But, to get to this point, the right measures and metrics must be established.
Sample quality indexes to establish for end-to-end quality performance include:
- Defect leakage
- QA cycle time
- Customer found defects
- Cost of poor quality
- Security and performance
- Cost of end-to-end QA
- Customer experience measures
- Product and scenario-specific measures
Organizations will have clear real-time, end-customer quality analytics and visualizations, targeting zero production defects.
These metrics are looking at the full lifecycle, with just-in-time metrics, sensors, and analytics that allow for process and systematic automation.
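Several of the indexes above reduce to simple ratios once every defect is tagged with the phase in which it was detected. A sketch of two of them, defect leakage and cost of poor quality, using made-up counts and assumed per-phase fix costs:

```python
# Defects tagged by the lifecycle phase where they were detected (illustrative data)
defects_by_phase = {"requirements": 4, "development": 22, "qa": 31, "production": 3}

# Defect leakage: the share of all defects that escaped to production
total_defects = sum(defects_by_phase.values())
leakage = defects_by_phase["production"] / total_defects

# Cost of poor quality: defects found later cost far more to fix
# (multipliers below are assumptions for illustration, not industry figures)
cost_per_defect = {"requirements": 100, "development": 500, "qa": 1500, "production": 15000}
copq = sum(count * cost_per_defect[phase] for phase, count in defects_by_phase.items())

print(f"Defect leakage: {leakage:.1%}")        # 5.0%
print(f"Cost of poor quality: ${copq:,}")      # $102,900
```

Even with toy numbers, the cost skew is the point: the three leaked defects dominate the total, which is why leakage is the metric a Predictive QA organization drives toward zero.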
Step 4: Cloud-based Technology and Environments
The infrastructure behind proactive and predictive QA organizations requires consistent, scalable, and simple-to-deploy environments.
Modern cloud infrastructure allows for rapid spin-up and provisioning, resulting in simple scaling. This lends itself well to matching numerous customer environments, performance testing, and integration testing. Given their lower cost and rapid delivery, cloud-based infrastructure and test tools are fundamental to predictive and proactive QA.
Additionally, test tools, which are typically niche, compartmentalized, and heavily customized, should be standardized across the organization to produce consistent results org-to-org, individual-to-individual, and group-to-group. Individual testers with their own special scripts and best practices will have those practices shared and integrated into the standard procedure.
Step 5: Continuous Integration and Delivery
In order to facilitate ongoing QA, developers need to integrate code into the repository multiple times per day. Continuous integration and delivery rule the roost. You can see our article and white paper on this topic for more information.
Automation follows to compile and build the application, automating the deployment of code. This enables a continuous development model providing constant and immediate feedback across the entire delivery process.
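At its core, the build-test-deploy loop is scripted automation that runs on every commit and fails fast. A deliberately simplified sketch of that gatekeeping logic; the stage names and commands are placeholders, not a specific CI product's syntax:

```python
import subprocess
import sys

# Hypothetical pipeline stages; in a real CI system these map to build, test,
# and deploy jobs triggered on every commit. The commands here are placeholders.
STAGES = [
    ("build",      [sys.executable, "-c", "print('compiling...')"]),
    ("unit tests", [sys.executable, "-c", "print('running tests...')"]),
    ("deploy",     [sys.executable, "-c", "print('deploying...')"]),
]

def run_pipeline(stages):
    """Run each stage in order; stop on the first non-zero exit code."""
    for name, cmd in stages:
        if subprocess.run(cmd).returncode != 0:
            print(f"Stage '{name}' failed; feedback goes straight back to the committer.")
            return False
    return True

ok = run_pipeline(STAGES)
print("pipeline passed" if ok else "pipeline failed")
```

The fail-fast behavior is what delivers the "constant and immediate feedback" described above: a broken build never reaches the test or deploy stages, and the committer hears about it within minutes.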
Such approaches lead to such real-life metrics as a 40% reduction in development costs, a 140% increase in programs under development, and a 5x increase in time spent on innovation (Continuous Integration Best Practices for Software Engineering).
Step 6: Proactive and Predictive QA Implementation
Across all of the areas mentioned, QA is now responsible for implementing and owning measures to identify which points in the overall process are contributing to defects and leakage. Using these metrics, QA can begin to predict quality and defects, moving toward a solution that understands and improves code quality systematically.
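"Predicting quality" can begin as simply as scoring each change against signals that historically correlate with leaked defects, and routing high-risk changes to extra review. A minimal risk-scoring sketch; the feature names, weights, and threshold are illustrative assumptions standing in for a model fit on your own historical defect data:

```python
# Illustrative per-change signals (assumed to correlate with leaked defects)
change = {
    "lines_changed": 420,
    "files_touched": 12,
    "past_defects_in_module": 3,
    "test_coverage": 0.55,  # fraction of changed lines covered by tests
}

def defect_risk(c):
    """Weighted sum of normalized risk signals; 0.0 = low risk, 1.0 = high risk."""
    score = 0.0
    score += min(c["lines_changed"] / 1000, 1.0) * 0.3       # big changes are riskier
    score += min(c["files_touched"] / 20, 1.0) * 0.2         # wide changes are riskier
    score += min(c["past_defects_in_module"] / 5, 1.0) * 0.3  # defect-prone modules
    score += (1.0 - c["test_coverage"]) * 0.2                 # untested code is riskier
    return score

risk = defect_risk(change)
action = "extra review + targeted tests" if risk > 0.5 else "standard pipeline"
print(f"Defect risk: {risk:.2f} -> {action}")
```

A mature Predictive QA organization would replace the hand-tuned weights with a model trained on the leakage metrics from Step 3, but the workflow is the same: score before merge, act before the defect ships.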
The result is an ongoing program to improve organizational quality, rather than a function that flags issues as they arise.
Similar to process improvement initiatives, such as those led by Six Sigma organizations, QA organizations deliver focused processes, people, and technology changes that achieve the same goal of no defects–just in a different manner.
This model leverages a bigger reliance on root-cause analyses, statistical investigation, and metric analytics.
Conclusion: Predictive QA Drives Change in Your Entire Company
As this article shows, software QA should not be the last stop for the software before it heads out the door. To achieve Predictive and Proactive Quality Assurance, the QA organization becomes a surround for the entire solution delivery (spanning Project Management, Business Analysis, Development, and Customer Experience).
QA no longer simply flags issues to be fixed. QA identifies root causes to prevent future quality issues. The result: changes in process, technology, and organization that alter the way the company operates, ultimately level-jumping quality results and speed-to-market.