What Does Edge Computing Mean?

Imagine you have a smart home with lots of gadgets: lights, cameras, and sensors.

Instead of sending all the data these gadgets collect to a distant data processing center or a faraway cloud server, edge computing brings data processing closer to the gadgets, where the data is created. In other words, to the edge.

What is the cloud?

The cloud refers to a network of servers that provide various services over the Internet, such as data storage, software applications, computing power, and analytics. The cloud enables users to access these services without having to install or maintain hardware or software on their own devices. The cloud is not a physical location but rather a distributed system that can span multiple regions and countries.


A brief history of edge computing

In a sense, edge computing is an example of what’s old is new.

Beginning in the late sixties, the invention first of the Internet and later of the World Wide Web enabled the creation of the client-server model, in which data processing shifted from individual devices to centralized servers.

When computers were first invented, however, there was no Internet, and when personal computers (PCs) first began to make their way into households in the seventies and eighties, there was still no World Wide Web. All data processing occurred right on the device itself.

But with the enabling technologies of the Internet and the Web, the client-server model gained prominence.

This model was attractive for several reasons. Centralized servers were built with higher-grade components than PCs, and most also had built-in redundancies. Offloading files to servers not only freed up space on PC hard drives but also served as a backup. And unlike PCs, which were constrained by their fixed hardware, servers could be scaled up as needed. All these qualities made the client-server model efficient and cost-effective.

But, as more devices connected to the Internet, the servers processing all the data generated by those devices began to feel the strain.

Then, in 1998, a group of computer scientists from MIT presented a novel idea at the MIT $50K Entrepreneurship Competition (now the MIT $100K Entrepreneurship Competition).

Their idea? Edge computing.

Edge computing

Let’s break it down:

Edge computing is a distributed computing framework that allows data processing and analysis to be performed closer to the source of the data - on an edge server or edge device - rather than by a centralized cloud server. 

By performing computation at the edge, edge computing reduces the latency, bandwidth usage, and energy consumption of data transmission, allowing devices to respond faster.

Edge computing is especially relevant to Internet of Things (IoT) devices: by performing computation at the edge, these devices are able not only to respond faster to events but also to operate more autonomously. They can keep working independently of the cloud in the event of network disruptions or failures.

What is the Internet of Things (IoT)?

The IoT is a network of physical devices, vehicles, appliances, sensors, and other objects that can connect to the Internet and exchange data. The explosion of IoT devices has resulted in unprecedented volumes of data. Edge computing makes it possible for IoT devices to more quickly process and act on that data.
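To make the idea concrete, here is a minimal Python sketch of the pattern described above, under some loudly labeled assumptions: the sensor, cloud upload, and local action functions are hypothetical stand-ins rather than any real IoT API. The device analyzes its own readings, acts on them locally, and forwards only summary events to the cloud, queuing them whenever the network happens to be down.

```python
import random
import statistics
import time

THRESHOLD_C = 60.0   # alert if the rolling average exceeds this temperature
WINDOW = 10          # number of recent readings kept on the device

def read_sensor() -> float:
    # Hypothetical stand-in for reading a local temperature sensor.
    return random.uniform(20.0, 80.0)

def send_to_cloud(event: dict) -> bool:
    # Hypothetical stand-in for an upload; pretend the uplink drops out half the time.
    if random.random() < 0.5:
        return False
    print("uploaded:", event)
    return True

def act_locally(event: dict) -> None:
    # Hypothetical stand-in for a local response that needs no network at all.
    print("local action:", event)

readings: list[float] = []
pending: list[dict] = []   # events queued while the network is unavailable

for _ in range(30):
    readings.append(read_sensor())
    readings = readings[-WINDOW:]          # raw data stays on the device

    avg = statistics.mean(readings)
    if avg > THRESHOLD_C:
        event = {"type": "overheat", "avg_c": round(avg, 1), "ts": time.time()}
        act_locally(event)                 # respond immediately, no cloud round trip
        pending.append(event)

    # Only summaries and alerts go upstream; if the upload fails, the event
    # stays queued locally and the device keeps operating on its own.
    pending = [e for e in pending if not send_to_cloud(e)]
```

The point of the sketch is the division of labor: the raw stream never leaves the gadget, and the cloud only ever sees compact, meaningful events.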

While slow or disrupted service may seem merely – albeit, perhaps, severely – frustrating when it comes to smart home devices such as thermostats, lights, security cameras, locks, or appliances, bear in mind that IoT devices also include health monitors, industrial robots on factory floors, smart city traffic lights, and the onboard sensors and cameras in self-driving cars.

In the case of self-driving cars, an IoT-based system shares information about road conditions and the vehicle itself while the vehicle is moving. To enable the vehicle to navigate roads and avoid collisions, its sensors, cameras, and onboard processors must rapidly detect and respond to traffic signals, road signs, other drivers, pedestrians, potholes, curbs, lane changes, changes in road conditions, and any number of other variables – processing huge amounts of data in real time and making split-second decisions. In this context, latency or a network disruption is not merely inopportune; it is a matter of life and death.

Edge AI

While edge computing accelerates performance and improves the reliability of network devices, edge artificial intelligence – edge AI – takes things to the next level, fusing edge computing and artificial intelligence.

Edge AI allows AI models to run directly on local edge devices, executing machine learning tasks on sensors and IoT devices themselves and enabling real-time data processing and analysis without constant back-and-forth data transfer to cloud infrastructure.

As with all edge computing applications, edge AI addresses latency issues and reduces bandwidth use. Though edge devices are certainly more limited in processing capabilities than the cloud, edge AI greatly enhances real-time data analytics. And by processing information locally on devices, edge AI also reduces data privacy concerns and security risks.

A significant real-world use case is video surveillance. In smart home security systems, for example, AI-enabled cameras have two primary applications. The first is to make quick decisions and take immediate action, sending alerts or initiating security protocols when the camera detects unusual activity. The second is to intelligently process data for later review.

Using edge AI, security cameras can determine which clips are relevant and which are simply noise, sending only selected data to third-party monitoring centers for immediate processing instead of burdening the network 24/7 with constant video streams.

Moreover, by processing, analyzing, and categorizing videos before storage, edge AI uses expensive cloud resources more selectively – storing only processed data for later analysis – and also makes subsequent searches of that data more efficient.
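As a rough illustration of that triage step, the Python sketch below scores each clip with a hypothetical on-device model and marks only clips above a confidence threshold for upload. The detect_person() function and the 0.8 threshold are assumptions made for this example, not any camera vendor's actual API.

```python
from dataclasses import dataclass

@dataclass
class Clip:
    clip_id: str
    frames: list            # raw video frames held in the camera's local memory

def detect_person(clip: Clip) -> float:
    # Hypothetical on-device model returning a 0-1 relevance score.
    # A real camera would run a compact vision model here; this is stand-in logic.
    return 0.0 if len(clip.frames) == 0 else 0.9

UPLOAD_THRESHOLD = 0.8      # assumed cutoff for "worth sending upstream"

def triage(clips: list[Clip]) -> tuple[list[Clip], list[Clip]]:
    """Split clips into those worth uploading and local-only noise."""
    relevant, noise = [], []
    for clip in clips:
        score = detect_person(clip)
        (relevant if score >= UPLOAD_THRESHOLD else noise).append(clip)
    return relevant, noise

# Only `relevant` clips would be transmitted to the monitoring center; `noise`
# stays on the camera or is discarded, sparing the network a constant stream.
relevant, noise = triage([Clip("front-door", frames=[object()]), Clip("empty-yard", frames=[])])
print(len(relevant), "clip(s) to upload,", len(noise), "kept local")
```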

How are edge AI models trained?

As described by IBM in its post “What is edge AI?”, this is how edge AI technology operates:

Edge AI models are first trained at "centralized data centers or the cloud." Neural networks and deep learning are used to teach models to "accurately recognize, classify, and describe objects" within the voluminous data used to train them.

"After deployment, edge AI models progressively improve," writes IBM. However, "[s]hould the AI encounter an issue, the problematic data" is generally transferred back to the data center and use to further train the original AI model.

This feedback loop is used to improve model performance over time, with the retrained model ultimately replacing the model that was initially deployed.
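In schematic terms, that cycle might look like the Python sketch below. Every function is an illustrative placeholder (none of this is IBM's or any framework's actual API); it simply traces the same train, deploy, collect, retrain loop described above.

```python
def train_in_cloud(dataset: list) -> dict:
    # Stand-in for the initial training run at a centralized data center or cloud.
    return {"version": 1, "examples_seen": len(dataset)}

def run_on_edge(model: dict, inputs: list) -> list:
    # Stand-in for on-device inference; returns the inputs the model handled poorly.
    return [x for x in inputs if x.get("confidence", 1.0) < 0.5]

def retrain(model: dict, hard_cases: list) -> dict:
    # Stand-in for retraining with the problematic data sent back from the edge.
    return {"version": model["version"] + 1,
            "examples_seen": model["examples_seen"] + len(hard_cases)}

model = train_in_cloud(dataset=[{}, {}, {}])             # 1. train centrally
field_inputs = [{"confidence": 0.3}, {"confidence": 0.9}]
hard_cases = run_on_edge(model, field_inputs)            # 2. edge devices flag problem data
model = retrain(model, hard_cases)                       # 3. retrained model replaces the original
print(model)                                             # {'version': 2, 'examples_seen': 4}
```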

Conclusion

Today’s homes are filled with “smart” devices like doorbells, thermostats, fridges, entertainment systems, and connected lightbulbs. These devices form an ecosystem in the smart home, using edge computing and, increasingly, edge AI to improve the lives of residents.

Whether it’s identifying a visitor at the door or adjusting the home’s temperature, edge technology processes data quickly on-site, eliminating the need to send information to a remote server.

This not only preserves residents’ privacy but also accelerates performance and ensures a more reliable experience for them.

"The cloud computing model may be a wonderful system when it works, but it's a nightmare when it fails. And the more people who come to depend upon it, the bigger the nightmare." 

- Jamais Cascio, American author and futurist.
