Edge Computing

Computing has gone through multiple waves of trends. When the personal computer first arrived, the only option was computing locally: for years, everything was saved and every function was performed directly on the device and nowhere else. Then came the cloud, which moved storage and processing onto remote servers reached over the Internet, and now edge computing has arrived to create something of a happy medium.

Not Quite Central, Not Quite Local

Think of the cloud as a giant warehouse. There’s plenty of room there, and there’s no danger you’ll lose all your assets if your office burns down, so long as they’re housed in this warehouse. What about when it comes time to retrieve those assets, though? What if you find one day that you have a large number of assets you’d like to add to the warehouse, but there’s only one door? And what’s to stop someone from breaking into the warehouse?

These are the problems that edge computing aims to solve. It does not occur entirely on the device itself, nor does it occur entirely in the cloud. Edge computing happens as close to the source (the device creating the data) as it possibly can, on the edge of the network, hence the name.

What’s The Point?

As the Internet of Things (IoT) becomes increasingly real, there’s a great deal of data that may need to be processed in a given location. Sending all of that information to the cloud to be processed and stored can create a traffic jam, and the round trip adds latency no matter what, which limits how responsive IoT devices can be.

Edge computing takes care of a fair amount of the data processing before anything ever reaches the cloud. This way, there’s less to be done with the information once it’s sent off, because the hard part was handled immediately.
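To make the idea concrete, here is a minimal sketch (in Python, with purely illustrative names, not tied to any particular IoT framework) of what that edge-side processing might look like: instead of shipping every raw sensor sample to the cloud, the edge node reduces a whole batch to one compact summary and sends only that.

```python
# Hypothetical example: an edge node aggregates raw sensor readings locally
# and forwards only a small summary payload upstream, rather than every sample.
from statistics import mean

def summarize_readings(readings):
    """Reduce a batch of raw samples to the summary the cloud actually needs.

    All names here are illustrative assumptions, not a real IoT API.
    """
    if not readings:
        return None
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 2),
    }

# Edge node: 1,000 raw temperature samples become one small payload.
raw_samples = [20 + (i % 10) for i in range(1000)]
payload = summarize_readings(raw_samples)
print(payload)  # one dict instead of 1,000 uploads
```

The bandwidth saving is the point: a thousand readings collapse into four numbers, so the cloud link carries a fraction of the traffic and the latency-sensitive work never leaves the edge.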

The most basic example of edge computing with which you are probably familiar is the set of security features on the iPhone. Rather than sending an iPhone user’s security information to the cloud, the phone keeps those functions local. This prevents lag time while still making use of a centralized system.

Where The Concept Is Headed

Perhaps the most futuristic yet already-real example of edge computing is the autonomous car. It drives itself without receiving constant instructions about what move to make from a centralized data center, but it is also managed largely through the cloud.

These self-driving vehicles have to communicate regularly with the cloud in order to receive updates and provide data that can help improve their algorithms, but they cannot depend on that communication for moment-to-moment decisions, or latency would undermine their performance.

This fusion between local and cloud computing presents a best-of-both-worlds vision for the future of the Internet of Things, capitalizing on both systems’ strengths while compensating for their weaknesses.