
Artificial intelligence, real quality control

Submitted by Derick Jose on Wed, 05/02/2018 – 10:53

What do process chemical manufacturing and cooking have in common?

Flutura's Derick Jose
Both have recipes: cookbooks in the kitchen, standard operating procedures in process-chemical manufacturing. Both need quality inputs. Both need dynamic control as the process unfolds, whether that means adding the right amount of pepper or calibrating a temperature. And both need feedback signals, like a chef sampling a dish midway or quality measurements in process chemicals.

The problem facing the chemical-manufacturing industry is that, while standard operating procedures exist, they do not take into account the dynamic conditions under which actual manufacturing happens. For example, the mixer's vessels may have been used before, leaving residuals, and the ambient environment may carry moisture or dust that influences product quality.

As a manufacturer, you face specific blind spots:

What influence does each of these factors have on product-quality outcomes? (Which factors are noise and which are signals? A sketch of one way to answer this follows the list.)
What is the rank of each influencer variable? (Some variables may have disproportionately more influence on quality outcomes than others.)
What is the expected quality outcome based on current conditions and what would be the next best frontline action to take in order to reduce wasteful production?
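
One way to attack the first two blind spots is to fit a model on historical batch data and rank how much each recorded condition contributes to the measured quality outcome. The sketch below is a minimal illustration of that idea, not the method used in any specific deployment; the data file and column names are hypothetical, and it assumes pandas and scikit-learn are available.

```python
# Minimal sketch: rank candidate influencer variables by their
# contribution to a measured quality outcome (hypothetical data).
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Hypothetical batch history: one row per batch, with sensor, ambient and
# process readings plus the lab-measured quality score for that batch.
batches = pd.read_csv("batch_history.csv")
features = ["mixer_temp", "ambient_humidity", "residual_level",
            "feed_rate", "cure_time"]          # assumed column names
X, y = batches[features], batches["quality_score"]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_train, y_train)

# Permutation importance on held-out batches: variables whose shuffling
# barely moves the error behave like noise; the rest are signals.
result = permutation_importance(model, X_test, y_test,
                                n_repeats=20, random_state=0)
ranking = sorted(zip(features, result.importances_mean),
                 key=lambda item: item[1], reverse=True)
for name, importance in ranking:
    print(f"{name:>18}: {importance:.3f}")
```
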
I can illustrate this with a real-world story. We recently executed a project for an industrial-glue manufacturer and scaled it across multiple production lines in several countries. The problem: wasted production was costing the customer hundreds of millions of dollars because of the industry's stringent quality controls, and they did not have the tools to pinpoint what influenced the quality outcome.

The solution: we built surgical AI apps that process multiple input signals, including lab-quality signals, sensor anomalies, process signals and ambient-condition data, to predict the quality of current production and to surface correlations between the various parameters and quality outcomes.
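
As a rough illustration of the "correlations between parameters and quality outcomes" piece, a first pass could be as simple as merging the signal pools into one table and scanning for parameters that move with the quality score. This is a hedged sketch with hypothetical file and column names, not the actual app we built.

```python
# Sketch: merge the different signal pools into one table keyed by batch
# and scan for parameters that move with the quality outcome.
import pandas as pd

# Hypothetical extracts from lab, sensor and ambient-condition systems.
lab = pd.read_csv("lab_results.csv")         # batch_id, quality_score
sensors = pd.read_csv("sensor_summary.csv")  # batch_id, mixer_temp, ...
ambient = pd.read_csv("ambient.csv")         # batch_id, humidity, dust_index

merged = lab.merge(sensors, on="batch_id").merge(ambient, on="batch_id")

# Pearson correlation of every numeric parameter with the quality score;
# strongly positive or negative values are candidates for closer forensics.
correlations = (merged.drop(columns=["batch_id"])
                      .corr(numeric_only=True)["quality_score"]
                      .drop("quality_score")
                      .sort_values(key=abs, ascending=False))
print(correlations)
```
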

The best aspect of the process was that we closed the decision loop with the frontline folks by translating complex statistical signals into a simple quality "smiley" that indicates whether all is going well. When the smiley changed, production was shut down and forensics were initiated to nail down the specific parameter that caused the quality deviation.
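
As a hedged sketch of what such a frontline indicator could look like (the thresholds and function name here are hypothetical, not the ones used in the actual deployment):

```python
# Sketch: collapse a predicted quality score into a frontline "smiley"
# and flag when production should stop for forensics (thresholds assumed).
GOOD_THRESHOLD = 0.95   # predicted in-spec probability above this: all well
WATCH_THRESHOLD = 0.85  # between the two thresholds: keep a close eye

def quality_smiley(predicted_in_spec_probability: float) -> str:
    """Translate a model score into a simple operator-facing indicator."""
    if predicted_in_spec_probability >= GOOD_THRESHOLD:
        return "🙂"   # all is well
    if predicted_in_spec_probability >= WATCH_THRESHOLD:
        return "😐"   # drifting: check the top-ranked influencers
    return "🙁"       # stop production and start forensics

if __name__ == "__main__":
    for score in (0.97, 0.90, 0.70):
        print(score, quality_smiley(score))
```
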

The learnings? If you are in process-chemical manufacturing and want to stay competitive, consider embedding AI into your frontline-manufacturing actions to boost quality outcomes. And before getting started, ask yourself:

Which product lines experience the highest quality rejection rates? Can we isolate the top three product lines?
What is the economic impact of wasted quality? A best-case estimate? A realistic estimate?
If product quality is enhanced by 3-5 percent, how much economic value would that unlock in the first year, second year and third year?
What data pools exist? What about sensor data, lab data, SCADA/PLC data, maintenance ticket data, operator data?
Which OT/IT systems hold this event data?
Who can be the executive champion to shepherd the project?
What if initial results from the AI processes can be consumed in 90 days?
I believe that the process-manufacturing industry has to view industrial AI as a massive shift, not a temporary phenomenon. Rather than being paralyzed by the threats, manufacturers that embrace industrial AI will boost efficiency.

The risk of digital inaction is greater than the risk of no returns.

Derick Jose is co-founder and chief data scientist at Flutura Decision Sciences and Analytics.

