The Last Word: Enabling Kubernetes at the Edge

Kubernetes is an open-source container orchestration system for automating software deployment, scaling, and management. Originally designed by Google, it is now maintained by the Cloud Native Computing Foundation (CNCF). Many organizations building cloud-native software use Kubernetes to deploy those applications in the cloud or the data center.

Given its suitability for running and managing large cloud-native workloads, Kubernetes is being widely adopted in data centers, and multiple distributions of the platform are available from independent software vendors (ISVs) and major public cloud vendors. With so many industrial automation and control systems moving to the cloud, operational technology (OT) practitioners need to understand it.

Kubernetes was a big topic at the recent Ignition Community Conference (ICC) 2023, put on by Inductive Automation in September. Lead software engineer Kevin Collins explained that Kubernetes joins multiple computers into a cluster, organized into a control plane that manages the cluster and a data plane that runs the workloads. True to its intended purpose, it handles the deployment, scaling, and management of containerized applications.

“It gives us a standardized set of resources, a common application programming interface (API) for most of the typical things we do in application development,” Collins said.

Although Collins’ presentation focused primarily on how to deploy Ignition on Kubernetes, it also provided insight into the inner workings of the technology and how it behaves in both on-premises and cloud deployments. He said Kubernetes is very modular, much like Linux. That modularity governs everything from how containers execute to how networking is configured, and it lets users run Kubernetes almost anywhere, from the edge all the way up to massive cloud cluster deployments.
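Collins’ point about standardized resources can be made concrete with a manifest. The following is a minimal sketch of a Kubernetes Deployment, the standard resource for running replicated containers. The image name is the Ignition image Inductive Automation publishes on Docker Hub; the version tag, replica count, and port shown here are illustrative assumptions, not a recommended production configuration:

```yaml
# Illustrative sketch only: a minimal Deployment asking Kubernetes to run
# and manage two replicas of an Ignition gateway container.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ignition-gateway
spec:
  replicas: 2            # illustrative; size for your environment
  selector:
    matchLabels:
      app: ignition
  template:
    metadata:
      labels:
        app: ignition
    spec:
      containers:
        - name: gateway
          image: inductiveautomation/ignition:8.1   # tag assumed; pin your own
          ports:
            - containerPort: 8088                   # Ignition's default gateway port
```

Applying the manifest (for example, with `kubectl apply -f ignition-deployment.yaml`) hands this desired state to the control plane, which schedules the pods onto worker nodes and restarts or reschedules them if they fail.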

The technology is not limited to Inductive Automation’s Ignition platform, however. For example, ZEDEDA recently announced its ZEDEDA Edge Kubernetes Service, a fully managed Kubernetes service for the distributed edge. According to ZEDEDA, in little more than a decade, with the continually evolving edge as a backdrop, organizations have gone from using a virtual machine as their application form factor to containerizing just about everything.


Containerization advantages

Containerization offers several advantages, not the least of which are ease of development and ease of deployment, according to ZEDEDA. As the leading platform for orchestrating containers, Kubernetes has already helped countless organizations run and manage the full lifecycle of their containers in a variety of environments. The CNCF estimates that more than 5.6 million developers use Kubernetes today; its recent survey revealed that 96 percent of organizations are either using or evaluating Kubernetes, a substantial increase from 83 percent in 2020 and 78 percent in 2019.

ZEDEDA contends that organizations are likely to be evaluating Kubernetes through one of two lenses:

1. Users who have recently invested significant resources and capital in Kubernetes in cloud or data center environments and are now looking to capitalize on those investments by applying them to the edge.

2. Users who have already deployed Kubernetes in the cloud or a data center but are just starting to deploy it at the edge. They realize they need to safeguard legacy workloads and modernize their assets, hardware, and applications, while closely investigating potential risks at the edge.

Regardless of the edge deployment objective, users should first address the inherent challenges of edge computing before devising a plan to implement Kubernetes. These challenges include everything from hardware and operating system diversity, network connectivity, and safety and security to diverse and remote environments and a lack of skilled resources in the field. Once these challenges are adequately resolved, users can plan for the deployment of a long-term Kubernetes edge solution.


Final thoughts

Running Kubernetes at the edge has very little to do with Kubernetes itself and everything to do with how to enable Kubernetes at the edge. The biggest challenges are orchestrating Kubernetes environments at the edge, securing them, managing them, and monitoring them at scale.

This column originally appeared in the December 2023 issue of InTech digital magazine.

About the author

Jack Smith is senior contributing editor for Automation.com and InTech digital magazine, publications of ISA, the International Society of Automation. Jack is a senior member of ISA, as well as a member of IEEE. He has an AAS in Electrical/Electronic Engineering and experience in instrumentation, closed loop control, PLCs, complex automated test systems, and test system design. Jack also has more than 20 years of experience as a journalist covering process, discrete, and hybrid technologies.
