For many years now, companies have been migrating to the cloud, aiming to reap its many advantages. That drove rapid growth: the cloud computing market ballooned from roughly $90bn in 2015 to upwards of $312bn in 2020, according to Statista. Meanwhile, edge computing has been growing steadily as well.
That might seem like a step backward to the uninitiated: why would organizations use semi-local edge servers after all the fuss about the cloud? The answer is simple: edge servers are not here to replace the cloud. They serve use cases that need smaller, distributed servers with shorter response times than the public cloud can offer.
Let us take a closer look at edge computing: what is it about, and what benefits can companies adopting it expect?
Computing on the edge
When cloud computing became common, it allowed companies to focus on the core aspects of their businesses while outsourcing computing to companies specialized in data management, storage, and analysis. Yet, the shortcomings of the cloud became apparent with time.
Companies that relied on the cloud for time-sensitive operations suffered from the delays of sending data halfway across the world and back. That created high demand for what we now know as edge computing.
The "edge" in the name refers to the placement of servers "on the edge of the network." That distinguishes it both from traditional computing done on central servers and data centers operated by the companies themselves, and from cloud computing, which sometimes relies on servers halfway across the globe.
Edge servers are smaller than those typically found in data centers, and instead of being buried deep in central locations, they are distributed across various locations to be closer to the devices that generate data or require quick access to it.
The core idea of edge computing is to put servers closer to the users or devices that need them. Edge servers are therefore usually within close range of their targets, unlike cloud servers, which are tens to hundreds of kilometers away in the best cases. That reduces latency and enables time-sensitive use cases with little to no tolerance for high ping times.
Where does edge computing shine?
As it is all about speed and low latency, edge computing is very useful in use cases that need almost instantaneous computing and response. It is also effective in data distribution as it reduces the need for transferring data over long distances and within limited bandwidths. All in all, edge computing has proven itself especially useful in fields like:
Self-driving vehicles
For now, self-driving technology is still in its early stages and relies entirely on onboard computers that make decisions from a very narrow perspective. If we want self-driving to reach levels where human intervention is not needed, we will need networks of connected autonomous vehicles that act together to avoid crashes and accidents.
It would not be practical to base such networks on internal car computers, as they are far too underpowered to manage a large network of vehicles. The cloud is not a good choice here either: even though data travels at nearly the speed of light, a round trip over thousands of kilometers adds precious milliseconds to the response time.
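To put that claim in numbers, here is a rough back-of-envelope sketch; the distances and the fiber speed factor are illustrative assumptions, and real networks add routing and processing overhead on top of this physical minimum:

```python
# Back-of-envelope propagation delay: nearby edge server vs. distant cloud.
# Light in optical fiber travels at roughly 2/3 its speed in vacuum.
SPEED_OF_LIGHT_KM_S = 300_000          # km/s in vacuum (rounded)
FIBER_FACTOR = 2 / 3                   # typical slowdown in optical fiber

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds (signal path only,
    ignoring routing, queuing, and processing overhead)."""
    speed = SPEED_OF_LIGHT_KM_S * FIBER_FACTOR
    return 2 * distance_km / speed * 1000

# Illustrative distances: an edge server at a nearby cell tower
# versus a cloud region on another continent.
for label, km in [("edge, 50 km", 50), ("cloud, 5000 km", 5000)]:
    print(f"{label}: {round_trip_ms(km):.2f} ms minimum")
```

Even this best-case calculation shows the distant cloud path costing tens of milliseconds per round trip, which is exactly the budget a vehicle network cannot afford to spend.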
Like any other use case that requires real-time performance, high-level autonomous driving would be almost impossible without edge servers. Thankfully, adding such infrastructure will not be very hard, as small edge servers can be mounted on cell towers, making large-scale traffic control possible.
Smart city applications
Many smart city technologies rely on large-scale data collection and real-time processing, both things that edge computing excels at. Let us take the smart grid and smart thermostats as an example of the real advantages of using edge computing in a smart city setting.
Grid operators can track energy consumption patterns in homes and control smart thermostats so that cooling houses on hot summer days happens during low-consumption periods. That makes power outages less common, even in extreme situations like heat waves or winter blizzards that put high pressure on electricity grids.
Such tactics were used effectively during the 2021 heat wave that hit Texas, USA: it put such immense pressure on the grid that energy companies raised preset temperatures on smart thermostats to curb power demand and avoid widespread blackouts.
Many companies are starting to use sensors and other IoT devices to monitor real-time power consumption at their facilities. They can track consumption patterns to determine how to achieve optimal performance while reducing costs by shifting the highest energy-consuming activities to off-peak hours.
Such applications are not limited to cost efficiency; they are also crucial for enabling greater reliance on renewable energy. Solar and wind output are not always predictable and fluctuate enough to make them hard to rely on. Controlling energy consumption to synchronize with generation would ease the migration toward solar and wind energy and allow gas and coal stations to be phased out slowly.
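As a toy illustration of the load-shifting pattern described above, the sketch below picks the cheapest hours of the day for a flexible task; all tariff and load figures are invented for the example:

```python
# Toy load-shifting sketch: schedule a flexible task in the cheapest hours.
# Tariffs (USD per kWh, by hour of day) and loads are invented for illustration.
TARIFF = [0.08] * 7 + [0.20] * 12 + [0.08] * 5   # cheap nights, pricey daytime

def cheapest_hours(duration_h: int) -> list[int]:
    """Pick the `duration_h` cheapest hours of the day for a shiftable load."""
    return sorted(range(24), key=lambda h: TARIFF[h])[:duration_h]

def cost(hours: list[int], load_kw: float) -> float:
    """Total cost of running a constant load during the given hours."""
    return sum(TARIFF[h] * load_kw for h in hours)

peak_plan = list(range(9, 13))          # naive plan: run 9:00-13:00
smart_plan = cheapest_hours(4)          # shifted plan: the 4 cheapest hours
print(cost(peak_plan, load_kw=100))     # cost at peak tariff
print(cost(smart_plan, load_kw=100))    # cost after shifting off-peak
```

Real demand-response systems optimize against live grid data and forecasts rather than a fixed tariff table, but the core idea is the same: move flexible consumption to where supply is cheap and plentiful.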
Healthcare
Healthcare providers are digitizing quickly and relying increasingly on IoT and connected medical devices. The proliferation of IoMT (the Internet of Medical Things) means hospitals and health centers will depend on data processing more than ever.
Just as with self-driving cars, a few milliseconds may save a life or reduce the risk of serious injury by enabling quicker response times for medical staff and faster access to crucial patient records.
Add the plethora of smartwatches and fitness bands that track heart rate and blood-oxygen levels in real time, some of which even detect falls and other health issues, and the need for edge computing in healthcare becomes apparent.
Edge computing is without doubt here to stay, and as the use cases above demonstrate, it is poised to grow at a fast pace. That growth will not cut into the huge existing cloud market, but edge computing will replace the cloud in some applications where quick response is of the utmost importance.