Key learnings and success factors from our journey with PathWave Manufacturing Analytics in the electronics manufacturing industry thus far – Part 1

2021-02-01  |  10 min read 

Industry 4.0 has been the hot topic for the past few years in the electronics industry, especially when the narrative steers towards the digital transformation of the traditional factory. It seemed everyone, everywhere, was buzzing with anticipation, visions, and dreams of what a smart factory would be. The goal was to squeeze every last drop of productivity from the manufacturing equipment invested out on the floor, so much of the focus was on downtime and throughput. This was, and still is, an important business outcome of any successful Industry 4.0 implementation.

Then Covid-19 happened. Apart from manufacturing driven by the race to 5nm chips, 5G, and cloud computing, some sectors of the electronics industry saw a drastic drop in volume, leading to a surplus of production assets on the floor. For some, machines idled. For others, Covid-19 caused massive supply chain disruptions that left manufacturers in a hand-to-mouth situation. The steps that governments around the globe have taken, necessary as they were, to manage and halt the spread of the pandemic have restricted the movement of factory employees and consequently lowered productivity and output. The trade situation between the US and China has also forced manufacturers to rush to shuffle their operations for business continuity. Covid-19 has brought lasting shifts in manufacturing paradigms, and the new “norm” will require rethinking how Industry 4.0 technology enablers are used to address the new challenges.

Quality Over Quantity

Before Covid-19, Industry 4.0 adoption mostly revolved around asset utilization. With the current “norm”, however, a better outcome may be to ensure that every single product manufactured is of the highest quality the process allows. Due to the shortage of materials and parts, rising logistics costs, and restricted factory workforces, manufacturers will have to minimize Return Merchandise Authorizations (RMAs) even more than before. Their customers will be in a hand-to-mouth situation as well, and every good part that can be shipped and used is crucial to meeting demand and generating revenue. Better quality may also prove to be a compelling value differentiator against competitors to win more business.

This isn’t to say that quality has never been one of a manufacturer’s most important performance metrics. Rather, the usual narrative of adopting Industry 4.0 technologies such as big data analytics, Artificial Intelligence (AI), and the Industrial Internet of Things (IIoT) to maximize asset utilization will need to pivot toward a greater focus on improving the quality of the product being manufactured. Keeping machines up and running with minimal downtime yields little Return on Investment (ROI) if product recalls are happening or if assets are loaded only half the time most days.

That means that qualitative and quantitative data on the product, usually from test and measurement equipment on the floor, will now be an indispensable source of insights for any big data analytics implementation. These insights allow engineers to maintain the process parameters that yield the highest quality. In addition, they serve as a real-time barometer of the gross reproducibility and repeatability of the equipment and processes, which is important for a predictable quality standard of the products.

To put it simply, lowering the Cost of Poor Quality (COPQ) is something Industry 4.0 technology adoption has to address quickly.

The Dangers of Anomaly Detection and Things to Look Out for

Since the launch of Keysight’s PathWave Manufacturing Analytics in 2018, we have seen more manufacturers embracing the new “norm” and emphasizing its big data advanced analytics capabilities on the test and measurement data generated every second on the production floor. In fact, all of our customers are using PathWave Manufacturing Analytics to analyze data from the test equipment on the floor for insights into the quality of the products manufactured. This is a major shift from the initial ideas of predictive maintenance and asset utilization. It’s also why we decided to change our tagline from ‘Measurement Science meets Data Science’ to ‘Build It Better With Advanced Analytics’ at the start of the year.

A core analytics insight from the platform is the ability to predict potential quality issues before they happen. The machine learning technique usually used for this is anomaly detection. We have seen many examples of factories investing in a generic big data platform and running publicly available open-source anomaly detection algorithms in production. What eventually becomes evident is that these algorithms tend to have low accuracy on test and measurement data, as opposed to continuous signals from sensors. This is what drove us to develop our own anomaly detection machine learning model, tuned to provide the highest accuracy for test and measurement data from the floor. In fact, we recently released a whitepaper with a comprehensive study comparing the performance of our anomaly detection model against other popular open-source algorithms on the same set of test and measurement data.
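To make the idea concrete, here is a minimal sketch of anomaly detection on a batch of measurements from one test step. It uses a modified z-score based on the median and median absolute deviation (MAD), a simple open-source-style baseline of the kind discussed above; it is purely illustrative and is not the proprietary model described in this article.

```python
import statistics

def robust_anomaly_flags(measurements, threshold=3.5):
    """Flag anomalous measurements with a modified z-score
    (median / MAD). Illustrative baseline only, not the
    PathWave Manufacturing Analytics model."""
    med = statistics.median(measurements)
    # Median absolute deviation: robust to the very outliers we seek.
    mad = statistics.median(abs(x - med) for x in measurements)
    if mad == 0:
        # All values (essentially) identical: nothing to flag.
        return [False] * len(measurements)
    # 0.6745 scales MAD to be comparable with a standard deviation.
    return [abs(0.6745 * (x - med) / mad) > threshold
            for x in measurements]

# Hypothetical voltage readings from one test step; the last one drifts.
readings = [3.30, 3.29, 3.31, 3.30, 3.32, 3.28, 3.30, 4.10]
flags = robust_anomaly_flags(readings)  # only the 4.10 reading is flagged
```

A fixed statistical threshold like this is exactly where generic methods struggle on discrete, per-test-step measurement data, which is the accuracy gap the whitepaper quantifies.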

In that whitepaper, we also brought up an issue normally referred to as “alert fatigue” in other industries that have successfully used anomaly detection as a predictor. We have learnt on this journey with our customers that the fatigue is real when applying anomaly detection in manufacturing. Hundreds of thousands of measurements are taken in real time in production; one can imagine the number of anomalies alerted to operators or engineers every minute of every day. It is an impossible task for users to decide which anomaly is most important and which actions are most urgent. Ultimately, the fatigue leads users to ignore the alerts, and the slow but sure demise of the entire advanced analytics project begins. If the right actions to prevent losses cannot be taken, the ROI cannot be realized. We knew this was as important as anything else in making any investment in big data advanced analytics in the factory worthwhile. It correlates directly with business outcomes.

Last year, we put together a team of data scientists and test and measurement experts at Keysight to develop an alert scoring machine learning model that works seamlessly with our anomaly detection algorithms to score measurement anomaly alerts in real time. We are proud to say that we have achieved this goal, and we are planning to release the new Alert Scoring feature in the PathWave Manufacturing Analytics 2.4.0 release in spring 2021. Alerts are labelled and sorted by the machine learning model as high, medium, or low severity. Teaching the model to interpret severity required supervised learning with labels that Keysight’s test and measurement experts could provide.
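The supervised scoring idea can be sketched as follows: expert-labelled alerts train a classifier, which then assigns a severity to each new alert. The sketch below uses a nearest-centroid classifier over hypothetical alert features (anomaly score, test-step failure rate, recent repeat count); both the features and the algorithm are illustrative assumptions, not PathWave’s actual model.

```python
import math

# Hypothetical expert-labelled training alerts:
# (anomaly score, test-step failure rate, recent repeats) -> severity.
LABELLED_ALERTS = [
    ((9.5, 0.20, 6), "high"),
    ((8.0, 0.15, 4), "high"),
    ((5.0, 0.05, 2), "medium"),
    ((4.5, 0.08, 1), "medium"),
    ((1.5, 0.01, 0), "low"),
    ((2.0, 0.02, 1), "low"),
]

def _centroids(labelled):
    """Average the feature vectors for each expert-assigned severity."""
    sums, counts = {}, {}
    for features, label in labelled:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, value in enumerate(features):
            acc[i] += value
        counts[label] = counts.get(label, 0) + 1
    return {label: tuple(v / counts[label] for v in acc)
            for label, acc in sums.items()}

CENTROIDS = _centroids(LABELLED_ALERTS)

def score_alert(features):
    """Assign the severity whose training centroid is nearest."""
    return min(CENTROIDS,
               key=lambda label: math.dist(features, CENTROIDS[label]))
```

With a model like this in place, a dashboard can sort by severity and surface only the “high” alerts for disposition, which is the mechanism behind the alert-volume reduction described next.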

With this first-in-industry alert scoring model, we were able to reduce the number of alerts sent to users for disposition by 90% in real-life testing! Instead of a hundred alerts, the engineer or operator receives only the ten most severe or important ones.

It is this unique ability to combine domain knowledge and data science that really sets us apart from a generic big data platform partner. We are really looking forward to helping our customers to achieve more tangible business outcomes with our exciting 2021 roadmap.