As part of 5G, there’s a big movement to deploy thousands of new edge compute locations to bring applications closer to consumers. The topic will be discussed on Tuesday, December 1, at FierceWireless’ 5G Blitz Week.
5G and the edge are inextricably connected. “The 5G air interface is so low latency that operators are able to look at it almost like a wire,” said Dan Picker, CTO of Inseego. “Suddenly, it makes a difference to put compute at the 5G edge rather than in the cloud. 5G has made it the right time to build the compute resources at the edge.”
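Picker's point is essentially a latency-budget argument: when the radio link stops dominating the round trip, the distance to the compute does. A minimal sketch of that arithmetic, using purely illustrative figures (none of these numbers come from Inseego or any operator):

```python
# Illustrative latency budget: edge compute vs. a distant cloud region.
# All millisecond figures are assumptions for illustration only.

AIR_INTERFACE_MS = 5    # assumed 5G radio round trip
EDGE_HOP_MS = 2         # assumed hop from base station to a nearby edge site
CLOUD_BACKHAUL_MS = 40  # assumed transit to a centralized cloud region

edge_rtt = AIR_INTERFACE_MS + EDGE_HOP_MS        # app served at the edge
cloud_rtt = AIR_INTERFACE_MS + CLOUD_BACKHAUL_MS # app served in the cloud

print(edge_rtt, cloud_rtt)
```

With a 4G air interface of tens of milliseconds, the backhaul was a rounding error; once the radio round trip drops to a few milliseconds, the backhaul becomes most of the budget, which is why compute placement suddenly matters.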
But what are the business models?
Patrick Lopez, the founder of the consulting firm Core Analysis, advises clients such as Nokia, Ericsson, Huawei, Facebook and Deutsche Telekom. He worked at Telefonica for three years as global VP of network innovation where he helped the service provider develop its mobile and fixed innovation strategy for 21 countries.
Lopez said all operators are trying to understand the intersection between their networks and hyperscale networks and where edge computing fits. And they’re also trying to figure out their business models for edge and 5G.
“The only way this works is if operators keep and increase their enterprise customers,” said Lopez.
Many enterprise customers are already using public clouds to run their workloads, and it’s probably easier for them to continue using public clouds.
Operators see this, and they don’t want to be left out of the action. Some of them have begun to partner with hyperscalers such as AWS and Microsoft Azure. For example, Verizon is working with AWS, and AT&T is working with Azure.
But it’s unclear what percentage of revenue the operators might receive. They would prefer to be the primary provider to the enterprise rather than just part of “the pipe” over which the hyperscalers provide services.
Lopez said, “The risk is you build 5G for enterprise, but enterprise will buy cloud service from cloud providers, and it is the cloud providers who will capture most of the value.”
He said this scenario has already played out in the content delivery realm. “Think of Netflix,” Lopez said. “When they started deploying their caches in networks, there was a revenue share to put their box in the network. Then, when Netflix was deployed in most networks, they published a network quality index and shamed operators into providing a network cache. At the next step, there was no revenue share. They said, ‘Your client is my client – if you want to kick me out, go ahead; your client will suffer.’ I think for the cloud and edge it could be very much like that.”
Lopez is advising telcos that some of this evolution is inevitable. “But they have control over where they deploy the hyperscalers and how fast,” he said. “I do think the largest telcos would be well-served to have their own edge that would be integrated with connectivity that could be differentiated from the hyperscalers.”
Francesca Serravalle, emerging technology director with Colt Technology Services, agreed that the business case has not been hammered out for 5G edge.
“They have proved the capability of 5G and edge but not the monetization opportunity,” said Serravalle. “In our case, we are incubating some specific propositions.” She said Colt is hosting some proofs of concept related to smart buildings and smart office use cases. But she said, “I think we’re far from proving the business case.”
Asked if hyperscalers would be friends or foes to operators in terms of edge compute, she said, “I see [them] more as a friend than a foe. They need us; we have the customer, we have the relationship. We provide the connectivity fabric that connects customers with their cloud providers.”
But she said Colt hasn’t sorted out any type of revenue-sharing model. It has launched a steering committee to examine different synergies with some of the cloud providers. “When we position 5G and edge, there is also some cloud migration that comes into the picture,” she said.
Where is the edge deployed?
Ever since the concept of mobile edge compute (MEC) began gaining traction a few years ago, there have been many discussions about “where is the edge?” But at this point, the answer seems to be “wherever you want it.” The main thing is that edge compute resources can be placed in thousands of new locations, closer to end users so that data doesn’t have to trombone back and forth to a traditional, centralized location.
Inseego’s Dan Picker said he’s been involved in a lot of conversations with AWS, which is starting to migrate a lot of its servers closer to the customer.
AWS is working with Verizon in some cases to place AWS Wavelength compute and storage functionality at Verizon’s 5G Edge.
“When you look at the cellular space, most operators are starting to build these edge clouds, which they call their MEC,” said Picker. He said some of those are sitting on the cellular edge by the base stations.
But Inseego sells customer premises equipment, so its edge installations are at enterprises. “Our view is to do everything we can before sending anything over a data link,” said Picker.
He gave the example of customers that monitor their businesses with video cameras. The cameras are connected to artificial intelligence (AI) software, and if the software detects any unexpected activity it will stream the video data to notify the appropriate person. There’s no reason for any of that information to flow off premises.
“That’s a perfect example of edge AI taking place,” said Picker. “It reduces the amount of data that has to be sent across.”
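Picker's camera example boils down to filtering at the edge: run the detector on-premises and stream only the frames it flags. A minimal sketch of that gating logic, where the mean-pixel-difference "detector," thresholds and frame format are illustrative stand-ins, not Inseego's actual implementation:

```python
# Hedged sketch: gate video frames at the edge so only anomalous
# frames leave the premises. The detector here is a toy pixel-diff
# check standing in for a real AI model.

def is_unexpected(frame, baseline, threshold=30.0):
    """Flag a frame whose mean absolute pixel difference from the
    baseline scene exceeds the threshold."""
    diff = sum(abs(a - b) for a, b in zip(frame, baseline)) / len(frame)
    return diff > threshold

def filter_at_edge(frames, baseline):
    """Return only the frames worth streaming off-premises."""
    return [f for f in frames if is_unexpected(f, baseline)]

baseline = [100] * 8  # quiet scene
frames = [
    [100] * 8,                                  # nothing happening
    [101] * 8,                                  # sensor noise
    [100, 100, 200, 200, 200, 200, 100, 100],   # activity in frame
]
to_stream = filter_at_edge(frames, baseline)
print(len(to_stream))  # only the anomalous frame leaves the edge
```

Everything below the threshold stays on-site, which is the data reduction Picker describes.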
Edge and routing
In addition to edge compute devices, there might need to be routing software to connect the edge points with existing networks.
DriveNets is a company that provides its Network Cloud software to service providers to run a variety of applications. The company announced in September it had won a contract with AT&T to provide core network routing software.
But the DriveNets network infrastructure platform can run a variety of network services. “AT&T is running a core routing service,” said Run Almog, head of product strategy at DriveNets. “But from our perspective, a service is a service.”
DriveNets is involved in the edge computing space because most of the new MEC resources will need to be connected with routing software.
Routing happens where networks meet each other. Those meeting places could be large colocation facilities, internet peering sites, network cores, aggregation sites and access sites.
“Each is a separate standalone routing service provided by a dedicated machine,” said Almog. “From our perspective all of these are services. Network Cloud collapses all of these onto a unified infrastructure, with software that acts as a cloud orchestrator.”
Almog said each type of network is built to support its own peak requirements, which are rare events. DriveNets’ software “collapses all these” networks, freeing up excess resources. The software also frees up resources because the number of interfaces shrinks.
DriveNets also determines where functions are best processed. “We need to prevent the network from shifting traffic from place to place,” said Almog.
The goal is to keep compute resources from running functions that are network-native, and to keep network resources from running compute-intensive functions.
Cell tower edge
There are some edge compute locations that are so far on the edge of the network that they don’t meet other networks at that point. An example would be American Tower’s deployment of edge compute at the base of its macro cell towers. The company is using its real estate — which already has electricity and security — to provide enterprises with edge compute close to their enterprise locations. The resources are connected by fiber.
But American Tower plans to eventually connect those MEC resources with its tower tenants. Once that happens, networks will meet other networks.
Core Analysis’ Lopez said that operators are primarily deploying edge compute in their central offices. He doesn’t think base stations are commercially ready for the technology yet.