Must-do Steps to Unlock Value Through Digital Transformation


Industry 4.0, also known as the Fourth Industrial Revolution, refers to the integration of advanced technologies such as the Internet of Things (IoT), Artificial Intelligence (AI) and Big Data into industrial processes. Our readers in the United States and Canada may have also heard the terms Smart or Advanced Manufacturing used to refer to the same efforts as Industry 4.0. These initiatives promise to optimize processes, reduce downtime and improve product quality. In our practical experience supporting users and integrators, three key things need to happen in the digital transformation of a business: connect, get the data and decide where to store it, and transform it to make it useful before sending it somewhere.


Connect, get the data and decide where to store it

We all know you can't improve what you don't measure. The scale and wide-reaching scope of digital transformation initiatives require efficient, performant and inclusive communication technologies. Methods to connect and get the data fall into two broad areas: control-specific networks and protocols, and open/standards-based protocols. We will address the first shortly; on the second, there are two ongoing trends involving OPC UA and MQTT.

The open protocols MQTT + Sparkplug and OPC UA + OPC Companion Specifications & Information Models are leaders in addressing standardization and interoperability in Industrial Internet of Things (IIoT). Our intention is not to determine the superior communication solution, but rather to discuss their characteristics and factors worth considering.


Open: MQTT + Sparkplug: transport and information organization

MQTT is a lightweight communications transport protocol suitable for limited bandwidth networks and applications with multiple clients and devices sharing data in a many-to-many arrangement. It enables clients to publish and subscribe to data in cloud or premise hosted brokers that manage the data and route it to subscribed clients.
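To make the many-to-many publish/subscribe pattern concrete, here is a minimal in-memory sketch. Real deployments use an MQTT broker such as Mosquitto or HiveMQ with a client library; the `MiniBroker` class below is purely illustrative, not part of any MQTT library API.

```python
from collections import defaultdict

class MiniBroker:
    """Toy in-memory broker illustrating MQTT-style many-to-many pub/sub.
    Illustrative only; real MQTT brokers run as network services."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, payload):
        # Route the payload to every client subscribed to this topic.
        for callback in self._subscribers[topic]:
            callback(topic, payload)

broker = MiniBroker()
received = []
# Two independent consumers (an HMI and a historian) share the same data.
broker.subscribe("plant1/line3/temperature", lambda t, p: received.append(("hmi", p)))
broker.subscribe("plant1/line3/temperature", lambda t, p: received.append(("historian", p)))
broker.publish("plant1/line3/temperature", 72.4)
```

Because publishers and subscribers only agree on topic names, either side can be added or replaced without the other knowing.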

MQTT alone does not define how data is organized in packets, known as the payload, leading to risks of interoperability issues and vendor lock-in due to vendor-specific payload formats. Sparkplug extends basic MQTT with a standardized payload format that users, integrators and suppliers can use to define models for interchanging data, albeit not as well defined as the OPC UA Information Models we will discuss shortly.
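As a sketch of what a standardized payload buys you, the snippet below builds a Sparkplug-B-style payload. Note this is a JSON stand-in for illustration only: real Sparkplug B payloads are Protocol Buffers (see the Eclipse Tahu project), and only the metric field names below mirror the specification.

```python
import json
import time

def build_payload(metrics):
    """Assemble a Sparkplug-B-*style* payload (JSON stand-in; the real
    spec uses Protocol Buffers). Each metric carries a name, value and
    datatype so any consumer can decode it without vendor knowledge."""
    return json.dumps({
        "timestamp": int(time.time() * 1000),  # milliseconds since epoch
        "metrics": [
            {"name": name, "value": value, "datatype": datatype}
            for name, value, datatype in metrics
        ],
        "seq": 0,  # Sparkplug sequence number (0-255), incremented per message
    })

payload = build_payload([("Line3/Temperature", 72.4, "Double"),
                         ("Line3/Running", True, "Boolean")])
decoded = json.loads(payload)
```

Because every publisher structures metrics the same way, a subscriber written against the format works with any compliant device.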



Users of MQTT and Sparkplug need to consider using Smart MQTT Clients and Brokers that go beyond just moving data around. A Smart Client or Broker will handle multiple payload formats over the same connection, automatically extract MQTT topic data into tags for consumption via standards like OPC, and propagate data connection quality status. If control decisions rely on MQTT data, it is imperative for the Smart Client or Broker to manage missed and out-of-sequence messages, preserve message order, handle failed writes, and provide store & forward for network downtime.
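One of those responsibilities, delivering messages in order despite out-of-sequence arrival, can be sketched with a per-message sequence number, as Sparkplug's `seq` field provides. The `SequencedReceiver` class below is a hypothetical illustration, not a real library API.

```python
class SequencedReceiver:
    """Sketch of how a 'smart' client might reorder messages using a
    sequence number (illustrative; real clients also handle wraparound,
    timeouts and rebirth requests for unrecoverable gaps)."""

    def __init__(self):
        self.expected = 0
        self.buffer = {}      # seq -> payload, held until gaps fill
        self.delivered = []   # payloads released in correct order

    def on_message(self, seq, payload):
        self.buffer[seq] = payload
        # Release consecutive messages; anything after a gap waits.
        while self.expected in self.buffer:
            self.delivered.append(self.buffer.pop(self.expected))
            self.expected += 1

rx = SequencedReceiver()
rx.on_message(0, "a")
rx.on_message(2, "c")   # arrives early -> buffered, not yet delivered
rx.on_message(1, "b")   # fills the gap -> "b" then "c" are released
```

A control application consuming `rx.delivered` never sees values out of order, which is the property that matters when MQTT data drives decisions.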

 

Open: OPC UA + companion specifications and information models

The OPC UA (Open Platform Communication Unified Architecture) standards are another means of providing a standardized framework for data exchange and communication between diverse industrial systems, devices and applications. An evolution of the OPC Classic standards, the OPC UA standards define a secure integrated means of exchanging a wide range of industrial data, along with standardized information models with well-defined namespaces for interchange of data, often in specific vertical industries.

OPC UA Companion Specifications provide information models published and available from the OPC Foundation with XML definition files to rapidly empower client and server applications to share the industry specific data in the model. Sophisticated users and integrators can define and publish their own information models to exchange plant data within their businesses, or with supply chain or other partners. This data encompasses not only the raw data but also historical & event data, metadata, including details about data sources, data quality and interrelationships between data points.
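To illustrate what an information model's browsable hierarchy looks like, here is a minimal stand-in for an OPC UA address-space fragment. Real servers and clients (e.g. open62541, Python asyncua) expose far richer NodeClasses, NodeIds and references; the node names below are modeled loosely on machinery companion specifications and are assumptions for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """Minimal stand-in for an OPC UA address-space node: a browse name,
    an optional value and child nodes. Illustrative only."""
    browse_name: str
    value: object = None
    children: dict = field(default_factory=dict)

    def add(self, child):
        self.children[child.browse_name] = child
        return child

# Model a fragment of a companion-specification-style hierarchy.
root = Node("Machine")
ident = root.add(Node("Identification"))
ident.add(Node("Manufacturer", "Acme"))
ident.add(Node("SerialNumber", "SN-0042"))

def browse(node, path):
    """Walk a browse path like 'Identification/Manufacturer'."""
    for part in path.split("/"):
        node = node.children[part]
    return node.value
```

Because the hierarchy and names come from a published specification rather than a vendor, any client that knows the model can browse to `Identification/Manufacturer` on any compliant server.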



OPC UA Pub/Sub (Publish/Subscribe) is an extension to the OPC UA protocol for applications requiring many-to-many communications. It offers an efficient transport for data of all types, whether raw or organized in OPC UA information models, either from companion specifications or user defined. OPC UA Pub/Sub can also be used over an MQTT transport, enabling businesses to benefit from the best of both worlds: the strong standardization of OPC UA companion specifications & information models and the flexibility and simplicity of MQTT.


Control network protocols and devices

In any implementation, there will be systems and devices to connect that do not natively implement MQTT or OPC UA in any form. Strategies to connect and integrate standard PLC & control protocols, legacy devices and non-standard protocol devices must be addressed early, either through device replacement or integration software. The MQTT and OPC standards have empowered a market supply of off-the-shelf software with visual configuration interfaces to connect just about anything with a serial or Ethernet connection and a documented, published protocol to the OPC & MQTT standards. Off-the-shelf solutions exist to enable serial, USB and Ethernet devices not using an open/standard protocol to be integrated without writing any custom code.
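The core of that gateway pattern is mapping raw register reads from a legacy device into named, scaled tags that an OPC or MQTT layer can then publish. The sketch below assumes a hypothetical register map and read function; commercial tools do this through visual configuration rather than code.

```python
def poll_legacy_device(read_register):
    """Sketch of gateway-style integration: raw register reads from a
    legacy serial/Ethernet device become named, engineering-unit tags.
    The register map and scaling factors here are hypothetical."""
    register_map = {
        "Temperature_F": {"register": 40001, "scale": 0.1},
        "Pressure_PSI":  {"register": 40002, "scale": 0.01},
    }
    return {
        tag: read_register(spec["register"]) * spec["scale"]
        for tag, spec in register_map.items()
    }

# Simulated device memory: register 40001 holds tenths of a degree, etc.
fake_registers = {40001: 724, 40002: 1450}
tags = poll_legacy_device(fake_registers.get)
```

Once the device's raw values are exposed as tags, everything upstream (OPC servers, MQTT publishers, the UNS) can treat the legacy device like any modern one.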


 


Orchestrating OT and IT + event-driven data exchange

The convergence of Operational Technology (OT) and Information Technology (IT) is also vital for dismantling data silos, facilitating uninterrupted data exchange, empowering real-time analysis, and supporting predictive maintenance, quality control and more. A flexible event-driven middleware and visual workflow tool can play a pivotal role in achieving these goals.

Modern implementors expect integration without writing custom code. They want a visual working environment where they construct basic and complex workflows to exchange data and automate processes, test and deploy at scale using templates and mass imports. They also expect support for a wide array of protocols for OT, cloud and IT, including OPC UA, OPC Classic, MQTT, REST, ERP interfaces, databases and others, to ensure seamless data flow and ETL (data transformation) throughout the entire industrial ecosystem.
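The runtime core of such a workflow tool can be sketched as a chain of steps through which each event's data flows. The step names and fields below are illustrative assumptions; real middleware adds branching, retries, scheduling and protocol connectors on top of this idea.

```python
def make_workflow(*steps):
    """Sketch of an event-driven workflow: each step is a function and
    an event's data flows through them in order. Illustrative only."""
    def run(event):
        for step in steps:
            event = step(event)
        return event
    return run

# Example workflow: normalize units (ETL), then enrich with context.
def to_celsius(event):
    return {**event, "value": round((event["value"] - 32) * 5 / 9, 2)}

def add_context(event):
    return {**event, "site": "Plant1", "unit": "degC"}

workflow = make_workflow(to_celsius, add_context)
result = workflow({"tag": "Line3/Temperature", "value": 72.4})
```

A visual tool presents the same composition as drag-and-drop blocks, which is why implementors can build, test and template these flows without custom code.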

Off-the-shelf solutions exist that provide an interface meeting those expectations and enable bi-directional connections to non-standard interfaces (think LIMS systems, databases on the OT layer, ERP & more). They also support OPC UA methods, RESTful or SOAP webservices, and are naturally able to convert the information into a format that can be shared/published.


Once the devices are connected, and IT and OT technologies orchestrated, there is still the key step of bringing transactional, operational and business data together, then cleansing and contextualizing it.
 

Where to store it? The Unified Name Space or UNS

There is an active trend to create and make real the concept of a Unified Name Space (UNS) to empower efficient information interchange in support of real-time decision making. The use of a UNS helps to break down information silos, enabling businesses to collect and analyze data from a wide range of sources, providing a more complete picture of operations. The UNS does not necessarily live in one place, but rather in a distributed environment involving all applications. The tools used to gather information that goes into the UNS model should empower the standardization efforts. OPC UA, UA Information Models, UA Pub/Sub, MQTT and MQTT Sparkplug are technologies that meet those requirements, enabling and empowering technologies to get the data into the UNS.
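The essence of a UNS can be sketched as a hierarchical namespace where producers publish tags under ISA-95-style paths and any consumer queries by prefix. In practice the UNS is typically realized on an MQTT broker's topic tree or OPC UA address space rather than a single in-memory store; the class and path names below are illustrative assumptions.

```python
class UnifiedNamespace:
    """Toy Unified Name Space: tags published under hierarchical,
    ISA-95-style paths, queryable by any consumer. Illustrative only;
    a real UNS is distributed across brokers and applications."""

    def __init__(self):
        self._tags = {}

    def publish(self, path, value):
        self._tags[path] = value

    def query(self, prefix):
        """Return every tag whose path starts with the given prefix."""
        return {p: v for p, v in self._tags.items() if p.startswith(prefix)}

uns = UnifiedNamespace()
uns.publish("Enterprise/Dallas/Packaging/Line3/Temperature", 72.4)
uns.publish("Enterprise/Dallas/Packaging/Line3/Speed", 120)
uns.publish("Enterprise/Dallas/Utilities/Boiler1/Pressure", 14.5)
line3 = uns.query("Enterprise/Dallas/Packaging/Line3")
```

Because every data source lands in one agreed hierarchy, a new consumer only needs to know the path convention, not the details of each source system, which is exactly how the UNS breaks down silos.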


Transforming the data and doing something with it

Validating, contextualizing and turning data into information adds value by providing actionable insights and enabling informed decision-making, driving operational efficiencies, and supporting continuous improvement initiatives. We have discussed OT data access and OT to IT orchestration, but how do we address the integration of operational and business or transactional databases and systems? What about the issues of sending unvalidated data to the cloud and wasting bandwidth, data ingestion fees and time?

Bridging this so-called “Data Divide” requires a comprehensive solution known as a Unified Analytics Framework or UAF that goes beyond the typical connectivity discussed so far. Essential functionality for such a solution includes cleansing, normalizing and contextualizing, and providing data to various consumers, including the larger Unified Name Space (UNS) that was mentioned earlier. This work is more efficiently done closer to the data source, rather than sending potentially invalid or unaggregated data to the cloud for analytics, machine learning and other value additions.

Software for building a UAF and transformational bridge exists, built for OT first, with a keen understanding of business and cloud integration. The purpose of a UAF solution is to transform the data collected at the edge so it is ready for the advanced analytics, AI and ML that will likely occur in the cloud. Ideally, the UAF solution runs at the edge and includes native OT connectors to the sources mentioned in this article, plus process historians, combined with connectors to transactional business data sources. By operating at the edge while also being able to publish results up through business layers and to the cloud, calculations and contextualization are kept as close to the source as possible. The benefit is reduced cost for ingestion of data by cloud systems, but more importantly, quality data is delivered to cloud systems rather than time being wasted cleaning data in the cloud before analysis. Templatization that can be managed across the business at different layers provides scalable engineering and data governance.
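The edge-side preparation described above can be sketched as a validate-then-aggregate step, so that only cleansed, summarized data is ever sent upstream. The thresholds, field names and fault value below are illustrative assumptions.

```python
def cleanse_for_cloud(samples, lo, hi):
    """Sketch of edge-side UAF preparation: out-of-range readings are
    rejected and the remainder aggregated, so the cloud receives one
    validated summary instead of raw, possibly invalid samples.
    Thresholds and field names are illustrative."""
    valid = [s for s in samples if lo <= s <= hi]
    if not valid:
        return None  # nothing trustworthy to send upstream
    return {
        "count": len(valid),
        "rejected": len(samples) - len(valid),
        "min": min(valid),
        "max": max(valid),
        "avg": round(sum(valid) / len(valid), 2),
    }

# 999.9 simulates a sensor fault that never reaches the cloud.
summary = cleanse_for_cloud([71.8, 72.4, 999.9, 72.1], lo=0, hi=150)
```

Sending the small summary instead of every raw sample is what reduces cloud ingestion cost, and rejecting the faulty reading at the edge is what spares the cloud-side cleanup effort.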

 

Conclusion

Success in Industry 4.0 or Smart/Advanced Manufacturing & Digital Transformation initiatives requires consideration of many factors. In our experience, a clear focus on getting the right data, transforming it and getting it to the right places for further use is key.

Software Toolbox is an experienced partner for users and integrators seeking to address these key enablers. Established in 1996, the company offers an extensive array of open, standards-based tools that function collectively as solutions or as value-enhancing supplements to enterprise vendor applications.

Software Toolbox's commitment to providing outstanding customer support and technical consulting ensures clients have the necessary resources to implement, maintain and expand their digital transformation initiatives.

For more information, visit Software Toolbox.

About The Author


Dawid Sadie is a Business Developer at Software Toolbox with a 23-year background in automation technology. With experience ranging from working as an integrator to leading technical sales teams and heading a process automation partnership, Dawid delivers his global expertise and experience in IoT/IIoT, Smart Manufacturing and Advanced Solutions to our clients.

