- By Mark Pietryka
- February 29, 2024
- Seeq Corporation
- Feature
Summary
Machine learning paired with advanced analytics solutions provides modern process manufacturers with the insights required to increase production uptime, optimize operational efficiency, and enhance product quality and yield.
Equipment failures cripple the industrial sector, resulting in significant production losses and unplanned downtime. These losses amount to billions of dollars annually for process manufacturers around the world. For example, a critical conveyor belt grinding to a halt mid-shift can force an entire factory line to sit idle for hours, potentially backing up whole supply chains.
Fortunately, modern machine learning (ML) provides a breakthrough solution. By analyzing a wealth of sensor data, ML algorithms can predict breakdowns and backlogs before they occur, enabling proactive repairs and dramatically reducing downtime. Beyond failure prediction, ML also reveals hidden patterns in production data, optimizing processes, cutting waste, and boosting overall efficiency.
Before organizations can unleash the full potential of ML, however, they must master an essential building block: teamwork. To establish accurate and impactful models, data scientists must partner with domain experts who understand the complexities of industrial equipment. This collaboration translates factory floor knowledge into the language of data, driving ML solutions that deliver results.
Overcoming the shortcomings of traditional industrial data
Improving operational efficiency with ML insights does not occur overnight. The first challenge is making sense of raw industrial data.
In its native formats, industrial data is vast, diverse, and often riddled with errors or irrelevant information, such as readings logged during downtime. Without guidance, data scientists waste precious time and resources sifting through this extraneous complexity, often producing misleading models. This is why domain experts, including process engineers and operations personnel, are critical in preparing data for accurate models; their extensive process knowledge helps pinpoint the right data and the relevant time periods.
However, identifying the right data is only the first step. Raw industrial data is typically messy and requires context to interpret correctly. Imagine a dataset in which temperature readings taken during maintenance periods are intermixed with those taken during normal operation: fed blindly into a predictive model, it would send the model into a tailspin. Cleansing and contextualizing data before analysis is therefore essential. Process experts can help define caveats like this, reducing algorithmic errors, ensuring consistency, and identifying the specific operating conditions that matter most for the model’s success.
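As an illustration, a minimal cleansing sketch in Python might separate in-service readings from maintenance-period values before any modeling begins. The file name, column names, and plausibility limits below are hypothetical.

```python
import pandas as pd

# Hypothetical raw export: timestamped temperature readings plus an equipment
# status column distinguishing normal operation from maintenance.
raw = pd.read_csv("reactor_temperatures.csv", parse_dates=["timestamp"])

# Keep only readings captured while the unit was running, so maintenance-period
# values do not distort the training data.
operating = raw[raw["status"] == "RUNNING"]

# Flag physically implausible values (e.g., sensor dropouts recorded as 0)
# for review rather than passing them silently to a model.
plausible = operating["temperature_c"].between(5, 400)
clean = operating[plausible]
```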
Once data is cleansed, there is still work to be done before it is ML-ready. Feature engineering bridges this gap, requiring data scientists’ and process experts’ continued partnership to transform raw readings into contextual insights that speak directly to the problem at hand. These informative insights, or “features,” include statistical summaries, frequency patterns, and other clever combinations of sensor data, helping ML algorithms spot hidden patterns, enhancing model accuracy, and aiding complex operational decision-making.
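A short sketch shows what such features might look like in practice for a hypothetical one-minute sensor log; the file, column names, and assumed design load are illustrative only.

```python
import pandas as pd

# Hypothetical one-minute sensor log for a single pump.
df = (pd.read_csv("pump_sensors.csv", parse_dates=["timestamp"])
        .set_index("timestamp")
        .sort_index())

features = pd.DataFrame(index=df.index)
# Statistical summaries over a rolling one-hour window.
features["vibration_mean_1h"] = df["vibration_mm_s"].rolling("1h").mean()
features["vibration_std_1h"] = df["vibration_mm_s"].rolling("1h").std()
# Rate of change captures how quickly conditions are drifting.
features["temp_delta_per_min"] = df["temperature_c"].diff()
# A ratio feature relating measured motor load to an assumed design load.
features["load_vs_design"] = df["motor_load_pct"] / 85.0

# Drop warm-up rows where the rolling windows are not yet fully populated.
features = features.dropna()
```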
Deploying ML models in industrial settings demands more than just accuracy. To be truly useful, model outputs must reach operators in a form they can act on during production. This means interfaces must be easy to read, presenting predictions, alerts, and real-time data clearly and concisely. Additionally, including explanations in operational interfaces wherever feasible builds trust and understanding among end users.
Adding to the challenge, industrial processes change over time, so successful ML deployment requires methods for retraining models with fresh data to maintain accuracy. This demands continued collaboration between data scientists and operations teams to monitor performance and iterate on models as conditions evolve.
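One simple pattern for keeping a model current is to score it against the newest labeled data and refit when performance drifts. The sketch below illustrates that idea only; the file, label column, window, and threshold are hypothetical.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score

# Hypothetical feature table indexed by time, with a label column supplied by
# operations indicating whether a failure occurred within the next 24 hours.
data = pd.read_parquet("labeled_features.parquet").sort_index()
cutoff = data.index.max() - pd.Timedelta(days=30)
history, recent = data[data.index < cutoff], data[data.index >= cutoff]

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(history.drop(columns="failure_within_24h"), history["failure_within_24h"])

# Score the model on the newest month of labeled data; if performance has
# drifted below an agreed threshold, refit on the full history.
score = f1_score(recent["failure_within_24h"],
                 model.predict(recent.drop(columns="failure_within_24h")))
if score < 0.7:
    model.fit(data.drop(columns="failure_within_24h"), data["failure_within_24h"])
```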
Advanced analytics enhance industrial ML initiatives
Navigating the many steps to building and implementing ML models into operational workflows is not easy, but modern advanced analytics solutions are simplifying the procedure, providing a holistic solution for ML integration into industrial processes.
These solutions, such as Seeq, cut through common industrial data chaos by connecting to a multitude of sources in real time. Beyond aggregation, these software tools automate data cleansing, eliminating much of the manual data manipulation and conditioning (Figure 1).
This adaptability is crucial as process characteristics change over time because it keeps ML models supplied with relevant information reflecting current operating conditions. Recalling the conveyor belt failure scenario from this article’s introduction, advanced analytics solutions empower engineers to quickly identify anomalies, handle inconsistencies, and extract meaningful information immediately. This high-quality data then informs troubleshooting steps, fuels actionable ML insights, and increases confidence in operational decision-making.
Feature engineering is vital for ML success in industrial settings, but it demands collaboration. Advanced analytics solutions foster this synergy with curated user profiles built for different expert roles, each providing the tools needed to share findings seamlessly among operational groups (Figure 2).
For example, Seeq’s Data Lab empowers data scientists to easily deploy models for direct consumption by engineering and operational groups, who can provide feedback to help refine the models. Predictions and alerts then flow to Workbench, Organizer, and external visualization tools, which are commonly accessible by managerial and administrative users. Advanced analytics solutions bridge these historically segregated divisions, transforming models into powerful tools for tighter process control, operational optimization, and better-informed decision-making throughout an organization.
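For teams working in Seeq Data Lab specifically, the SPy Python module provides that bridge between data scientists and operational views. The sketch below is only a minimal illustration of the pull-model-push pattern; the signal search terms, date range, and workbook name are hypothetical, and the modeling step itself is omitted.

```python
import pandas as pd
from seeq import spy

# Retrieve a set of compressor signals into a pandas DataFrame for modeling.
signals = spy.search({"Name": "Compressor*", "Type": "Signal"})
data = spy.pull(signals, start="2024-01-01", end="2024-02-01", grid="5min")

# ...train or apply a model here to produce a risk score per timestamp...
predictions = pd.DataFrame({"Failure Risk Score": 0.0}, index=data.index)

# Push the score back to Seeq as a new signal so it can be trended in Workbench
# and surfaced in Organizer reports alongside the raw process data.
spy.push(data=predictions, workbook="Compressor Health Monitoring")
```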
Taming compressor failures with predictive analytics
Real-world results showcase the usefulness of advanced analytics solutions for slashing costly downtime events. For example, one large chemical manufacturer plagued by unforeseen failures in critical compressors used Seeq to identify subtle compressor deviations from one operating cycle to the next. Each incident cost an estimated $1 million, so finding a way to anticipate and prevent these failures quickly became a priority.
The company began by collecting a copious amount of process data, but the sheer volume and complexity, spanning more than 170 variables, made it difficult to discern true patterns from noise. Traditional analysis fell short in pinpointing the combinations of factors that reliably preceded a failure.
The manufacturer then turned to Seeq, leveraging the software’s built-in ML tooling to empower its domain experts to tackle model development without relying solely on data scientists. The solution’s user-friendly interface placed the power of ML directly in the hands of process engineers with comprehensive compressor expertise, bridging the knowledge gap between subject matter experts and data scientists that traditional analysis tends to widen. This helped ensure the predictive models incorporated the right domain understanding and evolved along with the process.
By leveraging purpose-built capabilities within the advanced analytics solution, the company turned model results into near-real-time operational insights. The models homed in on subtle deviations in compressor parameters that indicated developing problems, and visual dashboards helped alert operational and engineering groups early enough to take preventive actions to avoid costly breakdowns. This predictive approach empowered the team to transform reactive maintenance into a proactive strategy.
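The manufacturer’s actual model is not detailed here, but one common way to flag subtle multivariate deviations is to score live data against a baseline of known-healthy operation, for instance with an isolation forest. The sketch below uses hypothetical file names, dates, and cutoffs.

```python
import pandas as pd
from sklearn.ensemble import IsolationForest

# Hypothetical table: one row per five-minute interval, ~170 compressor variables.
X = pd.read_parquet("compressor_variables.parquet")

# Fit on a stretch of known-healthy operation so departures from that baseline
# register as anomalies.
baseline = X.loc["2023-01-01":"2023-06-30"]
detector = IsolationForest(n_estimators=300, random_state=0)
detector.fit(baseline)

# Lower scores mean stronger deviation from the healthy baseline; a simple
# cutoff turns them into early-warning flags for the operations dashboard.
scores = pd.Series(detector.score_samples(X), index=X.index, name="anomaly_score")
alerts = scores[scores < scores.quantile(0.01)]
```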
By addressing issues prior to failure, the company experienced significantly fewer costly downtime events. The advanced analytics solution provided not only a technical backbone, but also a new level of data fluency, giving engineers greater agency over equipment health.
Tackling meter freezes for optimized gas delivery
Frozen meters threatened profitability at an oil and natural gas provider, causing measurement errors and costly product giveaway. The operator’s vast network, spanning 32,000 miles of pipeline with a processing capacity of 7.4 billion cubic feet of natural gas per day, amplified the scale of this issue. Messy data and a rule-based approach to identifying freeze events proved both time-consuming and unreliable: maintaining the rules, in addition to sifting through many false positives and missed detections, sapped valuable resources.
Requiring a different approach, the company turned to Seeq to simplify cleansing and access to its vast array of meter data. Domain experts used the software tool to improve data quality and label past freeze events, while data scientists collaborated with the engineers to develop accurate models, moving beyond rigid rules and embracing ML.
Within the advanced analytics solution, the operator established a fully automated workflow, including data pre-processing, model configuration, and automated retraining to maintain model accuracy as operating conditions changed. Model predictions fed directly to visual dashboards and populated reports, providing stakeholders with real-time insight into potential freeze issues.
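The operator’s pipeline is not published in detail, but the general pattern of training a classifier on expert-labeled freeze events and refitting it on a schedule might look like the following minimal sketch, with hypothetical file and column names.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import TimeSeriesSplit, cross_val_score

# Hypothetical table: engineered meter features plus a domain-expert label
# marking intervals confirmed as freeze events.
df = pd.read_parquet("meter_features_labeled.parquet").sort_index()
X, y = df.drop(columns="freeze_event"), df["freeze_event"]

# Time-ordered cross-validation avoids leaking future data into training folds.
model = GradientBoostingClassifier(random_state=0)
recall = cross_val_score(model, X, y, cv=TimeSeriesSplit(n_splits=5), scoring="recall")

# Refit on all available history; in production this step would run on a
# schedule so the model keeps pace with changing operating conditions.
model.fit(X, y)
freeze_risk = pd.Series(model.predict_proba(X)[:, 1], index=df.index, name="freeze_risk")
```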
This streamlined workflow enabled proactive intervention to mitigate freezes, and even slight accuracy improvements in certain locations resulted in millions of dollars saved annually due to reduced product giveaway. Beyond increased accuracy, the solution fostered the data-driven collaboration essential for continuously bolstering operational efficiency.
The effort left the provider with three key takeaways:
- Scalability: The advanced analytics solution handled the company’s massive dataset, a key benefit for large-scale asset management.
- ML as efficiency multiplier: Automating detection tasks freed up engineers to focus on higher-value problems.
- From insight to dollars: The advanced analytics solution streamlined the path from prediction to cost savings, a vital hallmark of impactful ML deployment.
Embrace advanced analytics to usher in ML effectively
ML is undeniably transforming manufacturing processes. Its ability to automate complex tasks, optimize production cycles, and enable predictive maintenance provides tangible advantages over traditional approaches. By improving asset uptime, increasing yields, and enhancing decision-making processes, ML provides efficiency gains and cost savings in many industrial sectors.
While implementing ML presents its own challenges, the vast benefits far outweigh the hurdles, and advanced analytics solutions help ensure successful deployment. These software tools provide powerful data analytics specially designed to handle the demands of time-series data and ML applications in industrial settings. With user-friendly interfaces and a focus on collaboration, these solutions empower companies to fully embrace ML-based insights, providing significant efficiency and profitability advantages in increasingly competitive manufacturing markets.
All figures courtesy of Seeq
About The Author
Mark Pietryka is a senior analytics engineer at Seeq. He has an engineering background with a BS in chemical and biomolecular engineering from North Carolina State University. Mark has nearly a decade of experience working for and with major oil & gas companies to solve high-value business problems. In his current role, Mark enjoys supporting industrial organizations as they maximize value from their time series data.