What is Edge Computing for Cloud and IoT?

Edge computing is a term that gets thrown around more and more these days, often without an easy-to-digest definition of what it actually means. Explanations tend to be either so packed with technical jargon that a layman can’t decipher them, or too vague to provide a clear-cut understanding of what edge computing really is, why it’s useful, and why so many organizations are turning to it as a way of handling emerging IT obstacles and amplifying other technologies, namely cloud computing and IoT.

What is Edge Computing?

Below, we’ll explain exactly what edge computing is and why it’s becoming increasingly important in our digital world as we grapple with the new data processing challenges that ever more advanced technologies bring with them.

Cloud Computing and IoT Explained

Before we can illustrate the mechanics of edge computing, it’s important to first understand how cloud computing works and the obstacles it currently faces. The two are related but distinct technologies, and the terms are in no way interchangeable.

Cloud computing delivers computing power over the internet by connecting users to powerful servers maintained and secured by a third party. This lets users leverage the computing power of those servers to process data for them.

Cloud computing services like Microsoft Azure, Amazon Web Services, Google Cloud Platform and IBM Cloud allow users to avoid the substantial upfront costs of building a heavy-duty local server setup, as well as the responsibility of maintaining and securing those servers. This gives people and companies a pay-as-you-go option for their information processing needs, with costs that scale with use.

The Internet of Things, or IoT, is a related concept that involves networking everyday devices over the internet via cloud computing. This allows non-computer devices to speak to each other, gather data, and be controlled remotely without being directly connected to each other.

Take, for example, a home security camera. The camera can send its footage to the cloud via the home Wi-Fi network, while the user can access that footage from their phone while at work. Neither device needs to be directly connected to the other; each only needs a connection to the internet.

In this way, the user can send and receive information through a server that both devices reach over their own internet connections.
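To make that relay model concrete, here is a minimal Python sketch of both sides of the exchange, using the popular requests library. The endpoint URL and the API paths are hypothetical placeholders, not a real cloud service; the point is simply that the camera and the phone each talk only to the cloud, never to each other.

```python
# A minimal sketch of the camera-to-cloud-to-phone relay described above.
# The endpoint and paths are hypothetical, not a real service.
import requests

CLOUD_API = "https://cloud.example.com/api/cameras"  # hypothetical endpoint


def upload_snapshot(device_id: str, image_bytes: bytes) -> None:
    """Camera side: push a snapshot to the cloud over the home Wi-Fi."""
    requests.post(f"{CLOUD_API}/{device_id}/snapshots",
                  data=image_bytes, timeout=10)


def fetch_latest(device_id: str) -> bytes:
    """Phone side: pull the most recent snapshot from anywhere with internet."""
    resp = requests.get(f"{CLOUD_API}/{device_id}/snapshots/latest", timeout=10)
    resp.raise_for_status()
    return resp.content
```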

The same model can be used in all sorts of ways: everything from smart home technology like smart lights, smart ACs, and other appliances, to industrial safety mechanisms like heat and pressure sensors can use IoT to increase automation and generate actionable data.

By allowing devices to connect with one another wirelessly, IoT helps reduce human workload and improve overall efficiency for both consumers and producers.

Obstacles Facing Cloud Computing and IoT

While IoT continues to grow, with applications in nearly every industry, the burden on the data centers that power cloud computing is increasing rapidly. Demand for computational resources is beginning to outstrip supply, reducing overall availability.

When cloud computing first emerged, the only devices connecting to it were client computers. As IoT has exploded, however, the sheer volume of data that needs to be processed and analyzed cuts into the computational power available at any given moment. This slows data processing and increases latency, dragging down performance across the network.

This Is Where Edge Computing Comes In

Now that you understand cloud computing, IoT, and the obstacles facing both technologies, the concept of edge computing should be easy to grasp.

In simple terms, edge computing moves more of the workload to the place where data is first collected, rather than leaving it all to the cloud. As the name suggests, the goal is to push the burden of data processing closer to the source of the data, i.e. to the “edge” of the network.

In practice, this means doing some of the work that would otherwise happen at the data center on the local device before anything is sent off, reducing both processing time (latency) and bandwidth use. For a security camera, this could mean software that filters footage according to set priorities, picking and choosing which data to send to the cloud for further processing.

This way, the data center need only process perhaps 45 minutes or so of important footage rather than a full 24 hours of video. This lessens the burden on data centers, reduces the amount of information that has to travel between devices, and increases the overall efficiency of the network.
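As a rough illustration, here is what that kind of on-device filtering might look like in Python. Both helper functions are stand-ins invented for this sketch; a real camera would use its vendor’s detection and upload APIs.

```python
# A toy sketch of on-device filtering for the security-camera example.
# Both helper functions are hypothetical stand-ins, not a real camera API.
import random
from datetime import datetime, timezone


def is_motion_detected(frame: bytes) -> bool:
    """Stand-in for a real on-device motion detector."""
    return random.random() < 0.03  # pretend ~3% of frames contain motion


def upload_clip(frame: bytes, timestamp: datetime) -> None:
    """Stand-in for the actual cloud upload call."""
    print(f"uploading {len(frame)} bytes captured at {timestamp.isoformat()}")


def process_frame(frame: bytes) -> None:
    """The edge decision: filter locally, upload selectively."""
    if is_motion_detected(frame):  # cheap local check, no network round trip
        upload_clip(frame, datetime.now(timezone.utc))
    # Frames without motion are discarded on the device, so the data
    # center only ever sees the small slice of footage that matters.
```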

Speed and processing power have become especially important with the rise of more demanding technologies. Earlier uses of IoT in cloud computing required smaller amounts of data to be processed and were generally less time-sensitive.

With more advanced use cases, however, the importance of low latency cannot be overstated. No example illustrates this point better than self-driving cars, which must safely navigate a complex, high-stakes environment where mistakes carry dire physical consequences.

A self-driving car requires cloud computing to receive updates, send information, and communicate with other servers over the internet. It does not, however, have the luxury of letting its processing depend on the availability of that connection.

Outages and other complications can weaken any connection and bottleneck the data processing a self-driving car needs to safely navigate roads and highways. Much of the most time-sensitive data is therefore processed locally, right on the vehicle’s CPU, protecting it from such bottlenecks and ensuring the car can operate at full capability even over an unpredictable connection.
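Here is a simplified Python sketch of that split, under invented assumptions about the sensor reading format and thresholds: anything time-critical is handled immediately on the vehicle, while non-urgent telemetry is queued and uploaded only when a connection happens to be available.

```python
# A simplified sketch of the local-first split described above.
# The sensor reading format and thresholds are invented for illustration.
from queue import Queue

telemetry_queue: Queue = Queue()


def apply_brakes() -> None:
    print("braking (local decision, no network involved)")


def handle_sensor_reading(reading: dict) -> None:
    """Time-critical work happens on the vehicle itself."""
    if reading.get("obstacle_distance_m", float("inf")) < 5.0:
        apply_brakes()               # decided locally, never waits on the cloud
    telemetry_queue.put(reading)     # non-urgent data waits for connectivity


def sync_telemetry(connection_up: bool) -> None:
    """Drain queued telemetry only when a connection is available."""
    while connection_up and not telemetry_queue.empty():
        reading = telemetry_queue.get()
        print(f"uploading telemetry: {reading}")  # stand-in for a cloud call
```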

This combination of increased local workload and sustained cloud connectivity is a prime example of edge computing, and of how this kind of architecture can improve the efficiency of every technology involved.

Still a little complicated? That’s fine. You can always reach out to us in the comments below with any questions you still have; we love answering them, and we love helping people understand the increasingly complex world we’re building for ourselves every day.

Posted by
Will

Will Heydecker is a writer, screenwriter and illustrator who still likes dragons. As part of his bitter war against adulthood, he likes to distill art, gaming, technology, and entertainment info into digestible topics people actually enjoy reading.