What should you be doing with your live and historical data?

From analysts and academics to companies and their stakeholders, everyone knows that data is the big game changer in today’s digitalized world. Whatever your market, using your data to its full potential is key to staying competitive and growing your business.

Data can be exploited at different levels of complexity, from simple to advanced. Naturally, the higher the complexity, the higher the potential value return in any context where data is relevant.

Generating value from HISTORICAL data

Historical data may be used in simple applications such as analyzing single trends of process variables to find disturbances. A production batch that does not meet quality criteria may, for example, have deviations in one or more process variables that are evident from a simple inspection of data trends. Having a tool that provides a clear graphical presentation of process variables is, of course, the basis for making such analyses and identifying these “disturbance patterns” in a straightforward way.

More advanced analyses can provide in-depth knowledge that can increase the productivity, efficiency, flexibility, and quality of your business processes. For instance, a tool that lets you position process variables from different batches or production times on the same timescale allows you to identify repeating patterns within your process.

This provides additional insight on top of the simple analysis of a single trend line. For example, it is possible to determine whether the “disturbance pattern” identified above always causes a quality deficiency, or whether other factors influence the outcome. Applying multivariate mathematical models to your historical data lets you identify the other factors contributing to a quality loss.
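As a minimal sketch of this kind of multivariate screening, the snippet below ranks process variables by the strength of their correlation with a quality outcome across historical batches. The variable names, batch values, and the use of plain Pearson correlation are illustrative assumptions; a production tool would use a richer model and real historian data.

```python
# Sketch: ranking process variables by correlation with a quality outcome.
# Variable names and values are hypothetical illustrations.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

batches = [
    {"temperature": 71.0, "pressure": 2.1, "stirring": 120, "quality": 0.92},
    {"temperature": 74.5, "pressure": 2.0, "stirring": 118, "quality": 0.85},
    {"temperature": 70.2, "pressure": 2.3, "stirring": 121, "quality": 0.95},
    {"temperature": 76.0, "pressure": 1.9, "stirring": 119, "quality": 0.80},
]

quality = [b["quality"] for b in batches]
ranked = sorted(
    ((abs(pearson([b[v] for b in batches], quality)), v)
     for v in ("temperature", "pressure", "stirring")),
    reverse=True,
)
for strength, variable in ranked:
    print(f"{variable}: |r| = {strength:.2f}")
```

Even this simple ranking separates the variables that track the quality outcome from those that merely vary; a multivariate model then untangles how they act together.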

The Golden Batch

This approach of analyzing process signals can also be used to identify the golden batch. The golden batch is where the best values for parameters such as temperature, pressure, and concentration coincide with optimal outcomes regarding yield, quality, or other important results. To identify these settings you can use your historical process data as your process knowledge bank and apply tools to derive the information needed – as opposed to the traditional approach of relying on the hands-on experience of specific individuals.
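In its simplest form, deriving golden-batch settings means picking the best-performing historical batch and reading off its parameters. The sketch below does exactly that over a tiny hypothetical record set; the field names and values are invented for illustration, and a real analysis would of course consider many more batches and variables.

```python
# Sketch: deriving "golden batch" settings from historical records.
# Field names (yield_pct, temperature, pressure) are illustrative assumptions.

history = [
    {"batch": "B-101", "temperature": 71.0, "pressure": 2.1, "yield_pct": 92.4},
    {"batch": "B-102", "temperature": 74.5, "pressure": 2.0, "yield_pct": 88.1},
    {"batch": "B-103", "temperature": 70.2, "pressure": 2.3, "yield_pct": 95.0},
]

# The batch with the best outcome becomes the reference for future runs.
golden = max(history, key=lambda b: b["yield_pct"])
print(f"Golden batch {golden['batch']}: "
      f"T={golden['temperature']} °C, p={golden['pressure']} bar")
```

In practice one would rather average the top-performing batches or fit a model over them, but the principle is the same: the historical data bank, not individual memory, defines the target settings.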

Even the most optimized process can, in fact, become more efficient. One common problem is bottlenecks in the workflow. Historical data on run hours, downtime causes, and maintenance logs are a rich source for revealing these bottlenecks. Modeling tools, together with planning tools, can help structure the data to show where a production modification would provide the greatest impact to achieve, or even exceed, the production quota.
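Structuring downtime data can start as simply as summing lost hours per cause. The sketch below aggregates a hypothetical stoppage log and ranks the causes; the cause names and durations are made up for illustration.

```python
from collections import Counter

# Sketch: ranking downtime causes from a hypothetical stoppage log,
# to show where a modification would have the greatest impact.
stoppages = [
    ("filler jam", 1.5), ("CIP overrun", 0.5), ("filler jam", 2.0),
    ("labeler fault", 0.8), ("filler jam", 1.2), ("CIP overrun", 0.7),
]

downtime = Counter()
for reason, hours in stoppages:
    downtime[reason] += hours

for reason, hours in downtime.most_common():
    print(f"{reason}: {hours:.1f} h lost")
```

Ranking by lost hours rather than by stoppage count matters: a frequent but quick fault may cost less than a rare, long one, and the ranking tells you which modification pays off first.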

Generating value from LIVE data

While historical data offers important insights into your processes and how to optimize them, the real challenge is applying this knowledge to your live, running processes.

Not all solutions offer you the possibility to visualize truly live data. However, where this can be achieved, and especially when it is integrated with your historical data, valuable new possibilities open up.

Identifying a “disturbance pattern” is valuable, but being able to apply process controls to your live process – e.g. warning when these patterns first begin to appear – may give you time to adjust your process and save the batch or avoid a production stoppage. There are tools available today that can deliver this extra value.
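One simple way such an early warning can work is to compare each live reading against a rolling baseline and alert on unusual deviation. The sketch below does this with a 3-sigma rule; the window size and threshold are illustrative assumptions, not recommendations, and real tools match learned disturbance patterns rather than a single threshold.

```python
from collections import deque
from statistics import mean, stdev

# Sketch of an early-warning check on a live signal: alert when a reading
# drifts more than THRESHOLD standard deviations from a rolling baseline.
# Window size and threshold are illustrative assumptions.
WINDOW, THRESHOLD = 20, 3.0
baseline = deque(maxlen=WINDOW)

def check(reading):
    """Return True if the reading deviates enough to warrant a warning."""
    alert = False
    if len(baseline) == WINDOW:
        mu, sigma = mean(baseline), stdev(baseline)
        alert = sigma > 0 and abs(reading - mu) > THRESHOLD * sigma
    baseline.append(reading)
    return alert
```

Because the baseline rolls forward with the process, slow seasonal drift is tolerated while a sudden excursion, the first sign of a disturbance pattern, raises the flag early.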

The golden batch approach can also be applied to live data. Modern applications can apply your multivariate process model to your current process parameters, showing you a prediction of where on your quality scale you will end up. This prediction gives you the possibility to alter your settings and immediately visualize where the new settings will take you at the end of the batch. Small adjustments may then be made to obtain the optimal result.
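The core of such a what-if prediction is applying a pre-fitted model to the current settings and to a proposed adjustment. The sketch below uses a toy linear model; the coefficients and parameter names are invented for illustration, whereas a real application would fit a multivariate model to historical batch data.

```python
# Sketch: applying a pre-fitted (here: hypothetical, linear) quality model
# to live settings, then checking what an adjusted setting would predict.
MODEL = {"intercept": 120.0, "temperature": -0.4, "pressure": 2.5}

def predict_quality(settings):
    """End-of-batch quality predicted from the current parameter settings."""
    return MODEL["intercept"] + sum(
        MODEL[name] * value for name, value in settings.items()
    )

current = {"temperature": 75.0, "pressure": 2.0}
print(f"Predicted quality now: {predict_quality(current):.1f}")

# Try a lower temperature before touching the real process.
adjusted = dict(current, temperature=72.0)
print(f"Predicted after adjustment: {predict_quality(adjusted):.1f}")
```

The point of the sketch is the workflow, not the model: the operator sees the predicted end point for both the current and the adjusted settings before committing the change to the live process.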

Generating value from data using ARTIFICIAL INTELLIGENCE

The production optimization methods described so far are all, in some way, still based on human intelligence and limitations. If we take human limitations out of the equation and instead let artificial intelligence do the work for us, how does this change the picture?

“Disturbance patterns” need to reach a certain level of distinctiveness before a human is able to detect them. We humans are not optimized for analyzing small variabilities across several parameters. Artificial intelligence, on the other hand, is.

Applying self-learning mathematical algorithms to your historical data can dramatically reduce the time needed to identify disturbances, giving you a larger time window to mitigate, or even eliminate, the effect. Small segments of code (agents) lie in your systems with the sole purpose of identifying and alerting you to tiny variations that are known to lead to a failure or low-quality product.
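Structurally, such an agent can be as small as one object watching one known precursor in the live data. The sketch below shows the shape of the idea with simple threshold checks; the agent names, signal names, and limits are invented for illustration, and in practice each agent's pattern would be learned from historical failure data.

```python
# Sketch: tiny "agents", each dedicated to one known disturbance precursor.
# Names, signals, and limits are hypothetical illustrations.

class Agent:
    def __init__(self, name, variable, limit):
        self.name, self.variable, self.limit = name, variable, limit

    def inspect(self, snapshot):
        """Return a warning string if this agent's precursor is present."""
        value = snapshot.get(self.variable)
        if value is not None and value > self.limit:
            return f"{self.name}: {self.variable}={value} exceeds {self.limit}"
        return None

agents = [
    Agent("bearing-wear", "vibration_mm_s", 4.5),
    Agent("seal-leak", "pressure_drop_bar", 0.3),
]

# One snapshot of live signals; each agent checks only what it owns.
snapshot = {"vibration_mm_s": 5.1, "pressure_drop_bar": 0.1}
warnings = [w for a in agents if (w := a.inspect(snapshot))]
```

Because each agent owns exactly one failure mode, adding coverage for a newly discovered disturbance means adding one more agent, not retraining a monolithic model.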

If each of your repeating disturbances had a dedicated agent that gave you enough warning to eliminate that disturbance, how much additional yield could your process deliver?

Data is the lifeblood of an optimized, smooth-running, and self-learning factory. To be able to digitalize your future you need to have control of your historical data, as well as a strategy for storing, analyzing, and using your data in your live processes.

Case story

SITUATION: A customer had a problem with a particular piece of production equipment, which led to 8-10 stoppages per year. Each stoppage resulted in a significant disturbance and loss of production. However, far more seriously, 5-6 such incidents resulted in total machine breakdown with devastating consequences.

CHALLENGE: Using human trend recognition, operators received only about five minutes of warning before a stoppage incident. This time frame needed to be extended.

SOLUTION: By applying artificial intelligence-based tools to historical data, including failure events, the reaction time frame was extended to 1 hour and 50 minutes. This gave operators the time they needed to make a controlled stop and avoid a full breakdown.

RESULT: This single application has resulted in savings of 3 million SEK per year.

Does your organization need help generating VALUE from data?

The methods outlined above are just a few examples of how we can help you to exploit your data in a more strategic way. PlantVision’s Manufacturing Information team has many years of experience implementing and managing systems for data capture and storage, along with the insights and tools needed to turn data into value.

From mining to molecular biology
We have experience from many different production industries. Our consultants have both in-depth knowledge and extensive practical experience regarding the operational requirements and quality standards for all kinds of processes, from raw material extraction to highly regulated industrial environments.

We can help you to implement and maintain multivariate live process control, machine learning and much more. At the same time, we provide a visionary, life cycle approach that emphasizes mutual trust as the basis for successful long-term collaboration.

Get in touch and find out how we can help your organization turn data into value.

Cecilia Jacobsson
Business Area Manager, Operational Information
Tel: +46 (0)8 568 595 15
