The Cost of Quality
Increasingly, large parts of manufacturing are outsourced to lower costs and reduce time-to-market. Boeing typifies this manufacturing model: it uses its expertise to assemble aircraft in its own factories, but the quality of the aircraft that rolls out depends on a large number of suppliers around the world. Boeing’s 787 Dreamliner appeared to benefit from such global collaboration. The 787, its most fuel-efficient aircraft, used lithium-ion batteries to power its electrical systems and became the fastest-selling aircraft in history. But trouble began soon after: the batteries began overheating, leading to fires on board, and the US, Japan, India and Chile grounded their 787 fleets.1
Boeing is not alone in this predicament. When Maps failed to work on Apple’s iPhone 5C, the company’s stock dropped. The market is unforgiving when a company breaches product quality, especially when the brand promise has set expectations high.
What went wrong at these companies? How can the risk of reputational damage be mitigated? Clearly, there is a significant cost to ignoring quality (see Figure 1: Cost of Quality).
Data and analytics can help reduce the cost of not attaining quality by reporting trends, providing alerts, determining priorities, spelling out trade-offs and accurately indicating the ROI of each decision.
Figure 1: Cost of Quality
The cost of poor quality is also rising as legislative bodies and society become increasingly unforgiving. For example, General Motors has said it will spend USD 1.2 billion to fix the 25.68 million cars it has recalled2 – among them the 2.6 million switch-related recalls. This is in addition to the USD 1 million to USD 5 million that GM is expected to pay in compensation to each of the families of the 13 people who died in accidents3 caused by the defective switch (GM will also pay each surviving dependent USD 300,000 to cover emotional damages).
Large corporations with modern production processes have begun to grasp the potential of machine data and rightly believe that data and analytics can help them meet elevated quality objectives. In fact, the data can deliver more: a competitive edge. Manufacturers can use data to predict failure and reduce downtime, refine spares inventory management, forecast maintenance needs, feed insights back into product design, define skills requirements and reduce the cost of fixing problems. The real value is a greatly improved experience for the customers who use these products, thanks to better reliability and reduced downtime.
Meeting Today’s Quality Norms
Three stages must be considered to meet quality norms. Manufacturers can capitalize on machine data and analytics to shape their quality curve at each stage:
Components: Every lot manufactured in-house or by outsourced partners must be inspected and examined, using relevant quality measurements, testing or sampling methods, to determine whether the components meet quality expectations.
Assembly Process: Multiple test stations must be established to ensure quality at each step of the assembly/production process. This is especially important for new products, where failure rates can be high. The metric that captures quality at this stage is First Pass Yield (FPY) – the fraction of units that pass all tests without rework.
Shipped Product: Failure at this stage can occur for a number of reasons, ranging from poor shipment and installation processes to unsuitable product usage, poor integration in the product ecosystem and weather. The impact is felt in brand reputation, litigation costs, customer compensation, and recall and repair costs.
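To make the assembly-stage metric concrete, FPY can be computed per test station and multiplied across stations to give the overall rolled throughput yield. A minimal sketch in Python; the station names and unit counts are purely hypothetical:

```python
def first_pass_yield(units_in: int, units_passed_first_time: int) -> float:
    """First Pass Yield: fraction of units that pass without any rework."""
    return units_passed_first_time / units_in

# Hypothetical counts for three assembly test stations: (entered, passed)
stations = {"solder": (1000, 950), "calibration": (950, 912), "final": (912, 903)}

rolled_throughput_yield = 1.0
for name, (entered, passed) in stations.items():
    fpy = first_pass_yield(entered, passed)
    rolled_throughput_yield *= fpy          # yields compound across stations
    print(f"{name}: FPY = {fpy:.1%}")
print(f"Rolled throughput yield = {rolled_throughput_yield:.1%}")
```

A low FPY at one station pinpoints where quality effort should be concentrated, while the rolled figure shows the end-to-end health of the line.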
The first two stages depend on factory data, while the third depends on machine-to-machine data collected from networked machines. Manufacturers have by and large focused their quality management efforts on the factory, using statistical process control methods; in the field, however, they have found themselves reacting to quality problems after the fact.
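Statistical process control of the kind used on the factory floor can be sketched in a few lines: compute control limits from in-control baseline measurements, then flag any new reading that falls outside them. A simplified individuals-chart example; the measurements are hypothetical:

```python
import statistics

def control_limits(samples: list[float], sigma_mult: float = 3.0) -> tuple[float, float]:
    """Shewhart-style control limits: mean ± k·sigma of the baseline samples."""
    mean = statistics.fmean(samples)
    sd = statistics.stdev(samples)
    return mean - sigma_mult * sd, mean + sigma_mult * sd

def out_of_control(samples: list[float], new_value: float) -> bool:
    """True if a new measurement falls outside the control limits."""
    lo, hi = control_limits(samples)
    return not (lo <= new_value <= hi)

# Hypothetical shaft-diameter measurements (mm) from an in-control process
baseline = [10.01, 9.99, 10.02, 10.00, 9.98, 10.01, 10.00, 9.99]
print(out_of_control(baseline, 10.30))  # a clear excursion flags True
```

Real SPC implementations add run rules and subgrouping, but the principle is the same: the baseline data itself defines what "out of control" means.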
A 2014 study, Manufacturing and the Data Conundrum – Too much? Too little? Or just right?6, commissioned by Wipro and conducted by the Economist Intelligence Unit (EIU), suggests that the news on data usage is only partially positive. Surveying C-suite and factory executives across North America and Europe, the research shows that 86% of respondents had increased the amount of production and quality-control data collected for analysis over the previous two years. This is offset by the fact that only 14% reported having no problems managing data. The obvious conclusion: while manufacturing is well on its way to capturing and acquiring data, it lags in its ability to leverage that data.
Figure 2: Manifold Data Sources. Source: Manufacturing and the Data Conundrum – Too much? Too little? Or just right?
The study exposed extreme polarization in the intensity of data capture and use. Some companies are forging ahead: GE, for example, has heightened its attention to quality at 400 of its factories – which it calls “brilliant factories” – with one battery plant capturing 10,000 variables, some as frequently as every 250 milliseconds. Yet only two-thirds of the study’s respondents said they were capturing sensor-generated data (see Figure 2: Manifold Data Sources). The study observes that, despite decades of quality improvement programs after World War II, tens of thousands of factories in North America and Europe remain light years removed from the advanced, cutting-edge digital processes needed to ensure product quality.
Bigger Drivers of Machine Data on Their Way
Examine Figure 2 in the light of the Internet of Things – where billions of connected devices will require products to work quite differently – and the pre-eminent role of data and analytics in quality management becomes clear. We are transitioning into a world where intelligent machines will be used far more to manage daily life, and humans will work in tandem with them, distributing work based on their respective strengths.
Telematics in vehicles, smart grids and connected wearable medical devices are already creating an overwhelming amount of data. We now need a model to manage big data – or, put another way, models that can translate big data into meaningful and useful insights.
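Separating valuable signals from noise in such telemetry streams often starts with something as simple as a rolling baseline: flag readings that deviate sharply from the recent window. A minimal sketch, with hypothetical battery-temperature readings standing in for a real sensor feed:

```python
import statistics
from collections import deque

def anomalies(stream: list[float], window: int = 5, threshold: float = 2.0) -> list[int]:
    """Return indices of readings that deviate from the rolling mean
    by more than `threshold` times the rolling standard deviation."""
    buf = deque(maxlen=window)
    flagged = []
    for i, x in enumerate(stream):
        if len(buf) == window:
            mu = statistics.fmean(buf)
            sd = statistics.stdev(buf)
            if sd > 0 and abs(x - mu) > threshold * sd:
                flagged.append(i)
        buf.append(x)          # the new reading joins the baseline window
    return flagged

# Hypothetical battery-temperature readings (°C) with one spike
readings = [30.1, 30.0, 30.2, 29.9, 30.1, 30.0, 45.0, 30.1, 30.2]
print(anomalies(readings))  # → [6]
```

Production systems layer far more sophistication on top (seasonality, multivariate models, learned thresholds), but the structure – baseline, deviation, alert – is the core of turning raw machine data into insight.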
In effect, there are two main challenges to be overcome:
1. Devising a strategy to extract signals from the data, and then separate noise from the valuable signals that contain insights.
2. Arriving at a data structure that can support this at the required scale.
The Road Ahead
Today, manufacturing is no longer decoupled from its customer; there is just one degree of separation between the two. Soon the manufacturing industry will leverage the Internet of Things to collect product and usage data directly and regularly, rather than depending on dealers and surveyors to send in data after a machine or device has broken down. This will give manufacturers visibility into product and customer issues in near real time and enable them to fix problems remotely, in a semi-automated fashion, even before customers notice their machines malfunctioning.
A powerful example of this is vehicle telematics. In-vehicle systems can export vehicle performance, location, driving conditions and usage data to a central server, which analyses the data and can alert the owner about driving-behaviour improvements, fuel-optimization strategies, service needs, expected failures, the nearest dealer and how to avoid disruption. By ensuring that the right spares and skills are available at the right time, the data can also help dealers reduce service time.
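The server-side alerting step can be as simple as rules evaluated against each uploaded record. A sketch under assumed field names and illustrative thresholds – none of these values come from any actual OEM system:

```python
from dataclasses import dataclass

@dataclass
class VehicleReading:
    """One hypothetical telematics record uploaded to the central server."""
    odometer_km: float
    engine_temp_c: float
    km_since_service: float

def alerts(reading: VehicleReading) -> list[str]:
    """Rule-based alerting; thresholds are illustrative, not OEM values."""
    out = []
    if reading.engine_temp_c > 110:
        out.append("Engine overheating: find the nearest dealer")
    if reading.km_since_service > 15000:
        out.append("Service overdue: schedule maintenance")
    return out

print(alerts(VehicleReading(82000, 118, 16500)))
```

In practice such rules would be replaced or augmented by predictive models trained on fleet-wide failure data, but even simple rules close the loop between usage data and proactive service.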
Vehicle telematics is not only bringing automobile manufacturers and dealers closer to their customers; it is also creating whole new products and services, from autonomous (driverless) vehicles such as the Google Car to services such as Uber and Lyft that offer low-cost rides to people who do not own cars. It is further helping communities address traffic congestion and pollution with creative solutions that move more cars per hour without the large capital outlay of widening roads or building new ones.
The focus of business is shifting from one-time sales transactions to lasting customer relationships. Manufacturing organizations that bring this focus to their data strategy are already on course to deliver better customer experiences that translate into profitable long-term growth. This is validated by the EIU survey results, which indicate that the most data-adept companies are also the most profitable.
Nitesh Jain
Nitesh Jain is General Manager and Global Practice Head for the MyViews practice, Advanced Technologies & Solutions (ATS) at Wipro. He is responsible for the P&L, providing vision, business direction and strategy to the practice. He is a proven business analytics and IT professional with leadership experience in advanced analytics and performance management, BI and data warehousing projects, and program management. He is responsible for solutions, services, thought leadership and growth in analytics and performance management across industry verticals globally.
He has more than 17 years of industry experience, ranging from managing an advanced analytics practice and leading data warehousing and analytics projects for the retail, CPG, manufacturing and banking sectors to architecting and developing operational and analytical solutions for the retail, medical devices and banking/financial services domains.
He holds a Master’s in Business Administration from Pune University and has a Bachelor’s degree in Mathematics.
Dr. Kandathil Jacob
Dr. Kandathil Jacob is a General Manager within the Advanced Technologies and Solutions (ATS) service line and focuses on defining and implementing solutions that provide analytics-based insights to major corporations across the world. His engagements span the entire customer value chain and include trade and marketing spend optimization, customer life-cycle analytics, manufacturing quality management, supply chain analytics and, most recently, machine learning analytics using big data platforms. His personal expertise is in providing thought leadership on how predictive analytics can enhance the profitability of manufacturing and hi-tech companies.
In the past, he has held Director and Sr. Manager positions at Apple, HSBC, Del Monte, FICO and several analytics-oriented start-ups. Dr. Jacob has taught MBA courses at the Haas School of Business and at the School of Management, IIT Bombay. He holds a Doctor of Engineering in Industrial Engineering and an MBA in Marketing, both from the University of California, Berkeley. His undergraduate degree is in Mechanical Engineering with Honors from IIT Bombay.