Fundamentals: What is edge computing?

It’s the weekend – time to get cozy and fire up the latest binge-worthy show on Netflix. You open the app and see the dreaded loading wheel. You’re not sure how long it’ll spin – 5 seconds or 15 – but it’s enough to get under your skin. That lag is Netflix connecting to the cloud to pull up your show. Just about everything is connected via the cloud, but as the Internet of Things (IoT) keeps growing, it puts more stress on the cloud, causing bigger latency issues. So how do we get faster speeds? Enter edge computing.

What is edge computing? Simply put, it brings computation and data storage closer to the devices that need them. Typically, IoT devices rely on a central data center that could be thousands of miles away. Edge computing moves storage and servers closer to where the data is generated, so data – especially real-time data – can be processed without a long round trip to a distant data center degrading performance.

For example, a retail store may use internet-connected cameras for surveillance. A single camera can easily transmit its data across the network, but multiple cameras can cause network problems, including lag. Edge computing localizes the servers and data processing to reduce latency. Edge devices include IoT sensors, security cameras, and edge gateways, which process data from an edge device and send back only the relevant information via the cloud, reducing bandwidth needs.
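To make the gateway idea concrete, here is a minimal sketch of that filtering step in Python. Everything in it is illustrative – the frame representation (a flat list of pixel brightness values), the thresholds, and the function names are assumptions, not any real product’s API. The point is simply that the gateway decides locally which frames are worth uploading.

```python
# Hypothetical edge-gateway sketch: frames are flat lists of pixel
# brightness values; thresholds are illustrative assumptions.

def detect_motion(prev_frame, frame, min_changed=30):
    """Flag motion when enough pixels change between consecutive frames."""
    changed = sum(1 for a, b in zip(prev_frame, frame) if abs(a - b) > 25)
    return changed >= min_changed

def gateway_filter(frames, min_changed=30):
    """Keep only frames showing motion; discard the rest locally."""
    uploads = []
    prev = frames[0]
    for frame in frames[1:]:
        if detect_motion(prev, frame, min_changed):
            uploads.append(frame)
        prev = frame
    return uploads

# A still scene with one burst of motion: only the frames around the
# change leave the gateway, cutting upstream bandwidth.
still = [10] * 100
moving = [200] * 100
sent = gateway_filter([still, still, moving, still])
```

Of the three candidate frames here, only the two around the motion event are queued for upload; the unchanged frame never leaves the store’s network.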

Another benefit of edge computing is cost. Bandwidth and cloud resources cost money – if you’ve ever filled up your iCloud account, you know that the more space you need, the more you pay. Because edge computing reduces bandwidth and cloud usage, it reduces cost. Edge computing can also provide security advantages. Cloud storage is vulnerable to attack – think of the numerous data breaches over the years. Since edge computing processes and sends data locally, it reduces the amount of data at risk.
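A back-of-envelope calculation shows why the bandwidth savings can be dramatic. All the numbers below are assumed for illustration – a camera bitrate of 4 Mbps and 200 half-megabyte event summaries a day – not measurements from any real deployment.

```python
# Illustrative comparison: streaming raw video to the cloud vs.
# uploading only edge-processed event summaries. All figures assumed.

RAW_MBPS = 4.0                 # assumed camera bitrate, megabits/second
SECONDS_PER_DAY = 24 * 3600
EVENTS_PER_DAY = 200           # assumed motion events per day
EVENT_SIZE_MB = 0.5            # assumed size of one summary, megabytes

raw_mb_per_day = RAW_MBPS * SECONDS_PER_DAY / 8   # megabits -> megabytes
edge_mb_per_day = EVENTS_PER_DAY * EVENT_SIZE_MB

savings = 1 - edge_mb_per_day / raw_mb_per_day
print(f"raw: {raw_mb_per_day:.0f} MB/day, "
      f"edge: {edge_mb_per_day:.0f} MB/day, savings: {savings:.1%}")
```

Under these assumptions a single camera goes from roughly 43 GB of upstream traffic per day to about 100 MB – a savings of over 99% – and bandwidth bills tend to scale accordingly.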

Edge computing devices are connected to their respective clouds, which are in turn served by an enterprise hub.

Ironically, security is also one of edge computing’s drawbacks. While some view the smaller, localized servers as an improvement, others worry they invite new opportunities for malicious attacks. Sending data via a centralized hub may be slow, but it’s at least stable and well secured. Because edge computing is still fairly new, its devices may not be as hardened as established cloud infrastructure. Additionally, varying requirements for electricity, processing power, and network connectivity can affect an edge device’s reliability, so managing these devices to ensure they don’t fail is crucial.

Despite its drawbacks, edge computing is being implemented in areas as varied as farming, manufacturing, medical devices, and self-driving cars. But it’s being used most prominently in 5G wireless technology. Like edge computing, 5G promises high bandwidth and low latency for applications. Rather than relying solely on the cloud, companies like Verizon are building edge computing into their 5G deployments to offer fast, real-time processing for mobile devices.

As the IoT continues to grow, edge computing will continue to advance and evolve. Though there are still kinks to work out, the technology can make a difference well beyond resolving bandwidth and latency issues. Given its range of applications, don’t be surprised if edge computing takes over more and more of the cloud’s workload in the coming years.