Want huge profits AND free cash this year? Start spending the OPTIMUM amount of money on quality-related projects by forecasting two KPIs: Cost of Quality (COQ) and Cost of Poor Quality (COPQ).
With these metrics and real-time analytics in SAP Analytics Cloud, you'll be able to:
Know how much you ARE spending on avoidable quality errors
See how much you SHOULD be spending on Quality Management Systems (QMS)
Analyze and predict quality, on-demand at any level of the business
...by product, work center, program, customer, business unit, etc...
Perform What-If simulations
To see the impact of business tradeoffs between cost, lead time, risk, and safety
And see the impact on Operating Income (Op Inc) and Cost of Goods Sold (COGS)
And (as a bonus) identify which processes in your manufacturing plant are in control, which are out of control, and which are bottlenecks creating waste
In this blog, I'll discuss how to get started with Cost of Quality (COQ) and Cost of Poor Quality (COPQ) calculations, and I'll share why spending MORE money on Quality can improve your brand, save lots of money, and make customers happy.
See my blog articles at http://www.happylittledashboards.com for more KPI discussion like this and for help getting started with SAP Analytics Cloud. Also, I'll keep these KPI blogs updated, so I'd love to hear your feedback.
Also, a quick shoutout to my teammates from 60+ projects and hundreds of passionate discussions....
at the largest American defense contractors (who have the most complicated build processes imaginable)
with some of Germany's brightest engineers at SAP (where they study the best run businesses in the world)
with fellow students from the University of Utah school of Operations
and at SAP, ASUG, and SAPinsider conferences around the world
Warning: Portions of this blog might be kind of technical. If, on the other hand, they're not technical enough, let me know. :p
Understanding your quality costs
Let's be frank ... your company's quality management processes determine the cost of your products. Period.
Now, we're not just talking about your Quality department and your QMS system. We're even talking bigger than your Operations department. Beyond the standard costs like materials and labor, we accrue costs in inspection, test, repair, and recall ... we need to look at overhead expense. We need to account for lost capacity at bottlenecks on your production floor.
The costs of quality operations, both the ones that add value and the ones stealing money from your bottom-line, are hidden throughout your time charging, your manufacturing data, your HR systems, and your overhead accounts. And that's only the costs we can quantify.
We're also talking about the impact of quality on your brand. IBM is a computer, but Apple is a brand. Reebok is a shoe company, but Nike is an institution. GM makes cars, but BMW makes the Ultimate Driving Machine.
Quality directly impacts cost, but it also conveys a message about what your company is and what it stands for.
Why should we measure quality outcomes?
How much do your Quality processes cost? Unless you're already measuring COPQ and COQ, there's a good chance that you can't answer. If you ask me, the biggest gap in most MBA programs is a lack of emphasis on fundamental KPIs in the business. You might talk about Moneyball, and the fact that statistical management CAN exist, but most schools don't teach you how to be like Billy Beane.
You see, some parts of the company - like Quality and Information Technology - seem like expenses because they don't generate product or profit. They don't win contracts. They aren't in the spotlight, heroically saving the day for our customers.
They're Cost Centers, making the tough calls and unpopular decisions that keep the wheels on your company. They're the first to be impacted by budget cuts; and if we aren't measuring them correctly, cuts may be destroying your bottom line.
When we have accurate measurements of COQ and COPQ, we can identify their relationship over time at a granular level, allowing us to calculate the optimum amount to spend on:
Quality department employees
Quality-related business operations and processes
And Continuous Improvement projects (Lean / Six Sigma)
Spending the correct amount on quality will allow you to maximize long-term profits while retaining a brand identity for customers and shareholders.
Anything that doesn't add value to your product is waste. Improper management, and a lack of adequate measurement, creates waste that you often don't see until times of crisis.
And in times of crisis or change, when your costs are out of control, your quality data can help:
Increase cash and reduce the Cash Conversion Cycle (CCC), by decreasing cycle time and reducing stock
Decrease expense immediately, in expediting costs, overtime labor, contract labor, lease costs, usage fees, overhead, and related fringe
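As a refresher, the Cash Conversion Cycle mentioned above is conventionally computed from three day-count metrics. A minimal sketch (the sample values are invented):

```python
# Cash Conversion Cycle (CCC) sketch, using the standard textbook formula.
# The DIO/DSO/DPO inputs are illustrative, not tied to any particular ERP.
def cash_conversion_cycle(dio: float, dso: float, dpo: float) -> float:
    """CCC = Days Inventory Outstanding + Days Sales Outstanding
             - Days Payable Outstanding."""
    return dio + dso - dpo

# Cutting cycle time and stock lowers DIO, which reduces CCC directly.
print(cash_conversion_cycle(dio=60, dso=45, dpo=30))  # 75
```

Shaving days off DIO through faster cycle times shows up in CCC one-for-one, which is why quality data can free up cash quickly.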
When we cut funding to Quality without understanding the impact to business processes
We increase production variability and lead time
We increase both the obvious and hidden costs associated with poor quality
And it becomes impossible to know if you're getting rid of part of the undocumented "secret sauce" in your human capital, which could lead to a loss of competitive advantage, loss of Trade Secrets, and damage to your brand
I can't stress this point enough. We're often eager to cut overhead budgets to reduce the cost of business operations. But unless you can measure their impact to the bottom line, you have no freaking clue what you're doing when you reduce their headcount or quality program budgets.
For the fundamentals, here's a link that's a good introduction to COPQ at the ASQ. As a side-note, the ASQ - which provides most international Quality certification - is one of the most organized and scientific certification bodies that I've worked with. If your business is involved in complex manufacturing, all of your executives should understand why a Quality Management System (QMS) is important.
How to categorize cost
Here's a scenario.... Your manufacturing floor is always busy. Employee utilization is high, deliveries are on time, customers are happy, employees love management, and you're profitable. How much money are you wasting?
It's another trick question. We can't really tell, unless we gather some data.
First off, it's important to note that costs associated with poor quality are avoidable. We're not talking about costs associated with your Quality Assurance team, because their salaries and fringe are part of the Cost of Quality, which is a basic part of doing business and adds value to the product.
Costs associated with poor quality are avoidable.
Next, before we dive into some of the costs, let's be clear that we're not merely talking about avoiding direct costs and expense.
When I was working in Continuous Improvement at L3 Technologies, we devised a strategy to break down costs (and savings) into four categories:
Savings: Direct costs, which come directly off your books
Value: Indirect costs, things like "busy work", poor sequencing (not working the right thing at the right time), and poor flow. Value costs decrease production capacity, increase Cycle Time (CT), or consume resources without adding value.
Risk: A hypothetical measurement of costs associated with your brand or with your ability to remain certified for production
Safety: Costs which directly impact employee morale and physical safety, often considered a cost avoidance.
Isn't it interesting that only one of the four cost types is easy to measure? And there's also a bit of overlap.
You can come up with your own equation to measure value or risk or safety costs. But it's often difficult to communicate these costs to business leaders of other organizations, so many organizations only focus on Savings and Value.
And that's where COPQ comes into play. It's hard to communicate and quantify the business value of cost avoidance activities that aren't readily available on your balance sheet. How much is the savings of avoiding a shutdown due to a poor audit? What is the value of avoiding a recall across your product base? What if you're not ADA compliant and someone gets hurt? Your reputation as an employer is important to keeping critical talent.
Based on my experience, a 10 to 15% reduction in Cost of Goods Sold over the next few quarters is entirely possible if you're just starting to measure quality. And I'd anticipate an additional 10% of value savings that can be realized in subsequent years, either in increased capacity, improved product, restructured staff, or reduced overhead.
But keep in mind, much of the savings will be difficult to track. You'll see costs reduce in places you don't expect. But you'll be able to track so much savings, it won't matter TOO much where the profits come from.
Identifying Savings opportunity on direct costs
To build your calculations, you'll need to collect very specific data. Let's start with identifying direct costs that are traceable to your P&L. You may have more than this, but here are a few you should be measuring:
Engineering changes (when due to quality)
Expediting costs (when you order a part too late or when you need to replace a spare part)
Identifying Value on indirect costs
Next, there are your indirect costs:
Overstaffing (of specific job roles)
Decreased machine capacity
Increased Lead Time
Increased Cycle Time
Too many test fixtures
Increased cost to manage customer / government property
Increased overhead, staffing, and fringe to support all other excess costs
Did you notice that SOME of these look like direct costs? They're not. We often call them Cost Avoidance, because the losses are actually to capacity. Yes, you can quantify them, but they're costs that are ALREADY INCURRED.
By measuring indirect costs, we can avoid future expense, and we can reduce the cost to create products (which often DOES affect COGS), but it's a lagging indicator. Changes you make will be harder to trace directly to your books and bottom-line.
Calculating Risk-related costs
Risk cost can be even harder to quantify. If you lose the ability to bid on contracts, or if your customers are unhappy, you could lose everything.
Some risk costs include:
Compliance risk, the risk of failing to meet certification requirements, or of damaging the customer relationship through perceived incompetence
Product risk, or insufficient engineering, test, or build, increasing the odds of field failures and damaging perceived product quality
Design risk, risk of failing to meet customer requirements
Catastrophic risk, or the risk of loss of business due to risks above
Risk costs are generally hard to assign a value to, but they are often the basic cost of doing business. These are generally used for the Cost of Quality calculation, where you need to meet a "minimum level" of compliance to continue operating. Treat it like a sunk cost or a one-time fee.
How you calculate risk is entirely up to you. What would be the cost of lost contracts, lost customers, or a work stoppage?
It can be difficult to quantify results, either in improved product quality or in changes to your bottom-line. But Risk costs can be the most critical to your reputation.
Calculating the value of employee Safety
Safety costs, much like Risk costs, constitute the minimum threshold for doing business. They include:
Health risks, like ADA compliance, OSHA and work safety, etc
Product design risks, which might create a hazard for customers
Global risks, such as risks to the environment or society in general.
Quantifying spending on Safety is difficult, as it appears as a direct expense, but it's important to measure.
Every company needs to understand that addressing Risk and Safety costs is part of doing business, and part of the "overhead" expense that YOU CANNOT CUT FROM YOUR BUDGET. In fact, that might be a good way to frame the next point....
Quality as corporate identity and market differentiation
The Cost of Quality measures both the perceived and actual value of your products and how they meet customer expectations, and it needs to align with your corporate strategy.
Where are your products positioned?
Do you make a "high quality product"?
Or are you structured to sell commodities?
Quality is a differentiator that your customers will pay for. A lack or decrease in quality, or a perceived lack, devalues the products your customers have purchased and damages your brand.
A quality calculation model
Okay, so how do we actually calculate Cost of Quality and Cost of Poor Quality? Once we have arrived at a calculation, how do we improve?
To begin with, we need to calculate direct costs, sort them into COQ and COPQ categories, and decide how to allocate those costs to a particular part, job, or product.
From a top-down approach, we can calculate costs by simply summing accounts from your ledger. Cost of Quality will include:
Salaries of your Quality team (including fringe)
A percentage of other overhead, to support the Quality Management Team, including training, HR, and other fixed costs
Cost related to preventative maintenance
A percentage of other non-Quality employee labor where the employees are serving quality functions like
Test operations (of all types)
Replacement part procurement
Quality-related training and documentation
Support for repair contracts and customer / government liaisons (DCMA, etc)
Facilities management for any test / repair / rework stations
You will also want to look at any costs related to constraints, and anything else where costs accrue when you exceed normal operating capacity.
Keep in mind that Quality employees can report to any organization, so it's critical to correctly identify job roles prior to calculating labor.
Also remember, there is no "one size fits all" calculation. A manufacturing inspector might be considered a part of the cost of quality if their primary job is inspecting build operations. But you might want to only include a portion of their time, if they're also operating machines, performing packaging operations, or moving parts on the floor.
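To make that "portion of their time" idea concrete, here's a minimal sketch of role-based labor allocation. The names, costs, and time splits below are hypothetical:

```python
# Hypothetical sketch: allocate a fraction of each employee's labor cost
# (including fringe) to Cost of Quality based on how their time splits
# across job functions. All figures are illustrative.
labor = [
    # (employee, annual cost incl. fringe, fraction of time on quality work)
    ("inspector_a", 80_000, 1.00),  # full-time inspection -> all COQ
    ("operator_b",  70_000, 0.25),  # inspects 25%, runs machines 75%
    ("tester_c",    90_000, 0.60),  # test operations count toward COQ
]

coq_labor = sum(cost * quality_frac for _, cost, quality_frac in labor)
print(f"COQ labor allocation: ${coq_labor:,.0f}")  # $151,500
```

In practice the time fractions would come from time charging or job-role data, not hand-entered estimates.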
A few problems with calculating quality metrics
Costs of Poor Quality include other operations that are avoidable, but there are some "gotchas" to calculating it.
A defect trend, for example, can be viewed in a couple of different ways.
Let's say we've identified a problem with a part, but we don't anticipate building very many of them, so we decide not to correct the problem, and we spend $1000 reworking every item we build.
Is that $1000 part of our COQ? Is it a standard part of doing business? Or is it Poor Quality?
One might argue that our design intentionally includes build operations that necessitate rework. No part is "perfect", is it? The design itself requires a number of operations, including this $1000 rework job.
You'll encounter this argument more in high cost / complexity jobs with lower volume, since it often isn't worth the cost or effort to redesign.
But this argument doesn't hold water, since this cost is avoidable. Certainly, it requires investment to avoid the defect, but that's a necessary part of the Cost of Quality. Avoiding the $1000 rework would constitute a Value savings, so before deciding whether the redesign is justified, you have to compare the cost of redesign to the cost of the rework.
This argument doesn't hold water, since this cost is avoidable....
Compare the project cost to the cost avoidance. One might say that it doesn't make sense to redesign if redesign and rework costs are the same. I would argue, however, that if everything else is equal - if redesign and rework costs are exactly the same - you should always redesign.
There are hidden costs involved that you can never measure, like customer Goodwill, and there is always an inherent value in higher quality products and services.
Calculating COPQ as a percentage of COGS
Let's go back to our calculations. We really need to define how much of our labor, as a percentage, is actually devoted to COQ.
Most of your business operations should be measurable in your Manufacturing Execution System (MES). Whether you run on Excel or have a fancy ERP, you should be measuring how long each build operation takes.
Most systems should contain a standard process file, basically the "Order of Operations" that are required to build a part. It might look something like BASIC computer code:
Assemble the Box
Inspect the Box
Assemble the Crate
Inspect the Crate
Ship the Crate and Box to customer
By mining the data from the standard process, we can look at every operation in aggregate and identify how many times we performed extra operations that should be treated as avoidable:
Assemble the Box
Inspect the Box
Rework the Box
Assemble the Crate
Inspect the Crate
Modify the Crate
Ship the Crate and Box to customer
In this case, there are two operations - Rework and Modify - that are avoidable, since they're not part of the standard process.
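That comparison can be automated by diffing the actual operation log against the standard process file. A minimal Python sketch, assuming operation names match exactly between the routing and the MES log:

```python
from collections import Counter

# Flag operations that aren't in the standard process file as avoidable
# (COPQ). The operation names mirror the Box/Crate example above.
standard = ["Assemble Box", "Inspect Box", "Assemble Crate",
            "Inspect Crate", "Ship"]
actual = ["Assemble Box", "Inspect Box", "Rework Box", "Assemble Crate",
          "Inspect Crate", "Modify Crate", "Ship"]

extra = Counter(actual) - Counter(standard)  # multiset difference
print(sorted(extra.elements()))  # ['Modify Crate', 'Rework Box']
```

Using a multiset difference (rather than a set difference) matters when an operation legitimately appears in the routing but was performed extra times.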
Sometimes we under-engineer, but that's an avoidable cost.
For example, what if you source a chip that's only designed for temperatures down to freezing...
But your customer requirement is 10 degrees below that.
Let's say you determine you can screen these parts in testing and use the 80% of parts that meet the requirement...
This is still a Cost of Poor Quality since it's avoidable.
If it's avoidable, we can change it. You can't get away with NOT spending money. If you don't spend it in the design, you pay for it in operations.
You can't get away with NOT spending money.
When we aggregate these operations on a part, product, or across the production floor, we can take the percentage of non-standard operational time and count it all as COPQ. This gets us part of the way to understanding the burden of COPQ, but there's another cost that we often forget.
Let's say in step 1 where we Assemble the Box, there's a wide standard deviation in the amount of time it takes to complete an operation. Sometimes it takes 5 minutes to assemble, but sometimes it takes 5 hours. Is there a COPQ cost hiding in this operation?
When our data shows variability in a process, it's often a sign of a process that isn't under control. If a process isn't under control, the variation is an avoidable cost. So we definitely can make an argument that a portion of high variability processes should be counted as COPQ.
It's pretty easy to write a calculation if the mean is close to the lower limit.
Let's say our minimum process time is 5 minutes, our mean is 10 minutes, and our maximum time is 5 hours. That's a pretty steep curve ... most of our operations only take a few minutes, but a handful take hours. One could reasonably calculate that anything over the mean is COPQ and avoidable.
So should we always use the mean as a "cut off" for standard cost?
I don't think so. Every process will have normal variation when it's under control. You can make traditional Box and Whisker charts, or even more advanced charts in SAP Analytics Cloud to analyze each distribution.
It might make sense only to count high variation process labor as COPQ. Or you might want to take a different approach, only counting the upper quartile as COPQ.
One approach I've used is to find the range first, or the difference between the minimum and maximum.
In this example, the range is 4 hours and 55 minutes, and my mean is 10 minutes.
If my mean were around 50% of my range, I might have a controlled process with a roughly normal distribution curve.
But if my mean is less than 25% of my range, the distribution has a long tail: a minority of parts take exponentially longer than the rest, indicating a problem.
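The mean-vs-range heuristic can be sketched in a few lines. This is one interpretation (measuring the mean's position within the range), and both the 25% threshold and the "everything above the mean" rule are judgment calls to tune against your own data:

```python
# Sketch of the mean-vs-range heuristic for flagging high-variability
# process time as COPQ. The thresholds are illustrative, not prescriptive.
def copq_minutes(times: list[float], threshold: float = 0.25) -> float:
    """If the mean sits in the bottom `threshold` of the range, treat
    time above the mean as avoidable (COPQ); otherwise report zero."""
    mean = sum(times) / len(times)
    rng = max(times) - min(times)
    if rng == 0 or (mean - min(times)) / rng >= threshold:
        return 0.0  # process looks controlled; no COPQ flagged
    return sum(t - mean for t in times if t > mean)

# Mostly 5-minute assemblies with one 5-hour (300-minute) outlier
print(copq_minutes([5, 5, 6, 5, 300]))
```

A distribution-aware alternative, such as counting only the upper quartile as discussed below, would use the same structure with a different cutoff.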
So, ideally, we'd calculate the percentage of labor or process time from the process files to understand avoidable time charging. But if that's not possible, we need a formula that we can challenge and model over time.
Establishing a solid COQ and COPQ formula is highly customized and creates competitive advantage.
I'd recommend using a modern analytic platform like SAP Analytics Cloud to ensure you're calculating every single process file and every job.
If you start with a solid in-memory platform when you're performing the calculations, they'll be scalable, performant, and can run on the fly. Investing in a solid system to measure performance is a critical component of the Cost of Quality and will create value throughout your organization.
Three more notes about process mining before I move on:
Plants with mixed manufacturing, combining production with engineering or repairs on the same floor, will have a harder time identifying waste and defects than traditional production plants, especially without real-time data systems like SAP S/4 and the PEO products.
If you don't follow a standard process, it can be extremely difficult to identify standard versus non-standard operations.
This same problem also obfuscates field repairs and product maintenance, as there is no "standard" process for resolving a complicated product defect unless it's part of a trend (which should probably be calculated differently).
Build your own COPQ model
Now that we can measure non-standard operations and have an idea about identifying the hidden avoidable rework in our standard operations, we can start seeing how to calculate Poor Quality.
COPQ is composed of
Repair parts and labor
All labor on non-standard operations (since it's theoretically avoidable with enough investment, as discussed above)
A percentage of labor exceeding the mean in highly deviated processes (one approach is calculating all time above the mean when the mean is less than 25% of the range)
Any consumables or spare parts associated with non-standard operations
Any costs created by schedule delays
A percentage of costs for program management and production planning, especially for any resources devoted to rework and repair contracts
A percentage of the cost of maintaining extra employees, machines, production lines, or outsourcing to address capacity concerns
All costs associated with product recalls, including legal and communication costs
All repair costs
It is important to consider each of these costs carefully, look at the data and trends, and identify a way to calculate them, either now or in the future. Yes, some expediting and repair costs are normal, and may be included as Value in a COQ calculation. But especially when considering labor as a resource, a certain percentage is avoidable. And yes, it might be considered a sunk cost already, but in aggregate we need to consider how much potential we have to perform work, NOT how much we actually perform.
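Pulling that list together, a first-cut COPQ roll-up can start as a simple sum over named components. Every figure below is invented for illustration; in practice each line would come from the calculations described in this post:

```python
# Illustrative COPQ roll-up. The component names follow the list above;
# the dollar amounts are made up.
copq_components = {
    "repair_parts_and_labor":     120_000,
    "non_standard_operations":    340_000,
    "high_variability_labor":      85_000,
    "consumables_on_rework":       22_000,
    "schedule_delay_costs":        60_000,
    "pm_and_planning_allocation":  45_000,
    "excess_capacity_costs":      150_000,
    "recall_costs":                     0,
}

total_copq = sum(copq_components.values())
print(f"Total COPQ: ${total_copq:,}")  # $822,000
```

Keeping the components named, rather than burying them in one number, makes it possible to challenge and refine each piece independently over time.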
Busy work will always exist in a non-constrained operation, but it doesn't add value. Other KPIs (like Utilization) are designed to address those issues in your company independently.
Integrating data from different systems
So, for everyone keeping track, we'll need data from a number of systems to calculate COPQ.
Data from your MES to count standard and non-standard operations.
Labor charging, to calculate the percentage of cost associated with quality-related operations, job roles, and unnecessary expedite and overtime fees
HR data to identify how much fringe and overhead expense you can calculate, based on the percentage of avoidable labor and overhead, as well as avoidable process operations
Any other systems where overhead expense is stored, such as property management systems.
Customer and government property records, and any cost reserves associated with spare parts.
Procurement data for outsourcing-related and service-related costs
And any other administrative fees associated with avoidable costs
The ultimate goals of calculating COQ and COPQ are these:
If I invest one dollar in Quality, what is my ROI?
Am I overstaffed or understaffed?
Are my processes under control?
What value are we adding to products, and are we recognizing that value in pricing, Goodwill, or Trade Secrets?
At the end of the day, any quality-related cost that does not improve the product, reduce timelines, increase capacity, mitigate risk, or improve safety is likely not adding value.
Statistical prediction and optimization
So, now that you have a model, you need to test its performance. The COQ is the predictor variable. If the relationship is statistically significant (a low enough p-value), we can use the COQ to predict changes to the COPQ.
Regression analysis will allow us to find out how much of the variation in COPQ over time is due to the COQ. And the more data we have, the better the model we can construct. It might be time to bring in your data science team or your Operations Research engineers to help.
Keep in mind that a wider and deeper dataset will improve predictor accuracy. And you may find that some of your cost categories are not good predictors of COPQ performance. They still need to be measured for your top-line calculations, but omit them from predictive models until you understand their relationship to avoidable costs.
There may be other reasons why they do not accurately predict COPQ - perhaps they're operations that aren't on the critical path or constrained, perhaps they're not a problem due to excess capacity - but their relationship to your process can evolve as your business does.
You may also want to develop indexes for your COQ measurements, such as scrap rates, first pass yield, test acceptance rates, etc. These indexes simplify mathematical concepts from real-world models and can often act as better predictors than raw data measurements. Consider a number of different indexes, automate them, and track them over time - you never know which might prove critical to solving a tough business problem.
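Two of the indexes mentioned, first pass yield and scrap rate, reduce to simple ratios of counts. The sample counts are hypothetical:

```python
# Common quality indexes computed from simple MES counts.
def first_pass_yield(passed_first_time: int, units_started: int) -> float:
    """Fraction of units that pass inspection with no rework."""
    return passed_first_time / units_started

def scrap_rate(units_scrapped: int, units_started: int) -> float:
    """Fraction of started units that end up scrapped."""
    return units_scrapped / units_started

print(f"FPY:   {first_pass_yield(940, 1000):.1%}")  # 94.0%
print(f"Scrap: {scrap_rate(15, 1000):.1%}")  # 1.5%
```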
Keep in mind that we can usually only predict common cause variation; catastrophic incidents can rarely be predicted, except when we're taking on too much Risk.
If we have a catastrophic failure or product recall resulting in huge costs and containment efforts, do we exclude it from our calculations? One could certainly make the argument that it is special cause variation that doesn't contribute to a baseline, predictable metric of either COQ or COPQ.
But these events impact cost. They are part of a larger, real-world system that we're trying to simulate mathematically, and they will have ripple effects that we can't always predict.
What happens when vendors cut costs and have several "one off" problems due to cost cutting efforts?
Could we have avoided the issues by having a larger Quality Management System to begin with?
This gets us back to a traditional reporting scenario.
We need to track all our costs. These costs are evaluated as part of our predictive models, and omitting them would be reckless. We just need to understand that our baseline KPIs will evolve over time, by a variety of factors, but maintaining the discipline to measure them, as a part of a consolidated enterprise reporting platform, is the key to deriving savings and value from product quality.
COQ and COPQ are two of the most difficult KPIs to monitor, but they may be the most essential for manufacturing organizations to keep costs in check. Every equation will be unique, based on your systems and business challenges. I'll leave you with a few pieces of advice.
First, don't rely on sampling when extracting data. Having all the data available for mathematical and predictive models ensures that variability is suitably accounted for.
Second, invest in the technology to calculate data accurately. I would recommend a combination of SAP HANA and SAP Analytics Cloud, and I'll tell you why.
You will be most effective if every product, every work center, and every business operation has transparency into their impact on COQ and COPQ calculations, into their specific trends over time, and into their own predictive models.
They need to see which KPIs are influencing their performance, when their processes are out of control, and when they are creating unnecessary waste.
With a real-time engine that can calculate COQ and COPQ "on demand", you empower every stakeholder to actively participate in cost containment. And the ability of in-memory, real-time analytics to perform this task cannot be overstated.
Last, don't be discouraged.
You may find that your COQ calculation is a poor predictor for COPQ, and that your models need a great deal of revision before they're usable.
By exploring the data, engaging the teams, and educating leadership on the importance of COQ, you are building the foundation of a true quality organization. Ultimately, we do have the ability to measure the value of our quality. It isn't a sunk cost or simply a cost of doing business. It is a differentiator. It's essential.