Why Edge Computing is Key to a 5G Future
Today, your neighborhood might have a few thousand devices all connecting to a mobile wireless network. But if predictions become reality, by 2025 that number will increase—ultimately reaching well over a million devices in the same area.
While this proliferation of connected things holds promise for everything from our health to home security, there’s a major barrier standing in the way of the widespread adoption of the internet of things: bandwidth. Today’s 4G and LTE networks, while powerful, simply can’t accommodate the needs of millions of new connections.
Thankfully, the advent of 5G (the next generation of wireless technology, which could operate at throughputs 10 to 1,000 times those of current networks) has arrived just in time for the explosion of the IoT. 5G will open the door to all manner of new services, but those services come at a cost: a much higher likelihood of widespread network congestion.
Clearing the Bottleneck on the Edge
Edge computing moves processing power from the center of the network (traditional servers) to the edge, closer to where the data is consumed by a computer, phone, or other device. Placing smaller, decentralized servers with compute power nearer to the point of use reduces congestion and strain on the network, increasing performance for everyone.
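As a rough illustration (not from the article), a toy latency model shows why moving compute closer to the user matters: round-trip time grows with the distance a request travels, so serving it from a nearby edge node rather than a distant server farm removes most of the propagation delay. All distances and processing times below are hypothetical.

```python
# Toy model: round-trip latency = propagation delay (both directions)
# plus server processing time. Signals travel roughly 200 km per
# millisecond in optical fiber.
SPEED_IN_FIBER_KM_PER_MS = 200

def round_trip_ms(distance_km, processing_ms=5.0):
    """Estimated round-trip time for a request to a server distance_km away."""
    propagation = 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS
    return propagation + processing_ms

# Hypothetical comparison: a centralized server farm ~2,000 km away
# versus an edge micro-server ~10 km away, near the cell tower.
central_ms = round_trip_ms(2000)  # 25.0 ms
edge_ms = round_trip_ms(10)       # 5.1 ms
print(f"central: {central_ms} ms, edge: {edge_ms} ms")
```

Even in this simplified sketch, the edge server eliminates nearly all of the propagation component, which is why real-time applications are so sensitive to where compute sits.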
According to Ihab Tarazi, a former telecom engineer and the new CTO of cloud hosting company Packet, with 5G, this increased performance is essential because there will soon be so many devices connecting to the network that a traditional, centralized design would cause everything to grind to a halt. The network would be so busy that “you simply wouldn’t be able to connect.”
“But edge computing changes the picture,” Tarazi adds. “Today’s telecommunications architecture is very traditional, with calls and data transferred from tower to tower and compute power located within centralized server farms. With edge computing we can push processing closer and closer to the tower.”
This will redefine cloud technologies, minimize latency issues, and let users benefit from the increased speed that 5G promises. Without the edge, 5G won't be much different from its predecessors (just with many more devices trying to run on it).
Upgrading to the Edge
Experts agree that while edge is top of mind for telcos, its implementation will take time.
In an ideal universe, every cell tower radio would be upgraded with its own computing system, a micro-server of sorts that could provide muscle and serve commonly used data without having to call back to a server farm in Iowa. But with over 200,000 cell towers and other stations in the U.S. alone, it isn't financially feasible to retrofit every tower in the country this way.
“The right answer probably lies with regional data centers and on-board computing [rising] as we slowly migrate compute power to the edge over time,” says Joe Madden, lead analyst with Mobile Experts, which has been studying the economics of 5G for the last four years.
The upshot of this move to the edge is that bottlenecks will ease, reducing sluggish data delivery in the last mile and transforming the customer experience and expectations of what the network can do. We're not just talking about getting movies downloaded faster, either. The combination of 5G and edge computing could reinvent everything from real-time machine control systems to autonomous vehicles.
With the very first 5G deployments set to go live later this year, Tarazi predicts that by 2019 all major telcos will have some sort of 5G capability, and some are already deploying edge computing test zones to prepare.
Tarazi doesn’t think we’ll see universal 5G adoption until 2023. “But,” he says, “it’s going to be awesome.”
This content is produced by WIRED Brand Lab in collaboration with Western Digital Corporation.