Published: 15/08/2016
Author: Greg Collins

Originally published on RCR Wireless on August 12, 2016

Transformation of service assurance for virtualized environments being led by aggressive telecom operators

While far from perfect, mobile operators have come to understand how service assurance works in the real world of hardware-based network operations. Sure, some of the processes in place may seem a bit arcane for platforms designed to transport high-tech services, but at least carriers – and more importantly their engineers – have had a lot of time to work through the kinks and understand those processes.

But as operators begin moving network control over to virtualized platforms, the game changes as “real” tools are set to be replaced with virtual code. Of even more concern for operators is the near-term need to support hybrid deployments of both physical and virtualized infrastructure, which many think will be around for up to a decade.

“The good news is that service quality management direction works just as well in the new world as the old world,” said Chris Rice, VP of advanced technology at AT&T Labs. “From a black box point of view, you are just looking at black boxes.”

AT&T is seen by many as one of the telecom leaders in terms of its migration to a software-controlled future. In late 2014, AT&T announced plans to control 75% of its network resources using virtualization technologies by 2020, and the carrier said that at the end of 2015 it had reached 5.7% control, ahead of its 5% target.

Krish Prabhu, CTO and president of AT&T Labs, recently explained that the carrier’s network “typically has about 250 distinct network functions. So when we say 5% or 6% of the network has been virtualized, we’re essentially saying that any new network function that’s deployed in the network in that cluster of virtualized network functions is deployed as a software element running on cloud infrastructure as opposed to buying off-the-shelf hardware.”

In comparing service assurance in a hardware world with service assurance in a virtualized world, Rice noted the difference is not cut and dried.

“There are a couple of differences in a cloud world, however,” Rice said. “No. 1 is that there is no such thing as a pure cloud world. Everything is old and new world. It’s all brownfield and that brings some challenges. … Reliability is an example. In the old world you would have special gear and boxes you would buy and implement. There were a lot of proprietary elements, but that allowed for what I could call an atomic level of resiliency. In the virtual world you can get the same or better level of resiliency with a combination of things.”

In looking to help bridge those worlds, Rice pointed to AT&T’s recently launched enhanced control, orchestration, management and policy (ECOMP) project as a move in that direction. AT&T said it launched the ECOMP project due to a lack of guidance for network functions virtualization and software-defined networking deployments in a wide area network environment. ECOMP is said to provide automation support for service delivery, service assurance, performance management, fault management and SDN tasks. The platform also is designed to work with OpenStack, though the company has noted it is extensible to other cloud and compute environments.
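ECOMP’s internals are not public, but the closed-loop automation pattern behind fault and service assurance – collect events, match them against policy, trigger an orchestration action – can be sketched generically. Everything below (names, metrics, thresholds) is hypothetical for illustration; it is not AT&T’s actual design.

```python
# Illustrative closed-loop assurance sketch: collect -> policy -> act.
# All VNF names, metrics and thresholds here are hypothetical.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Event:
    vnf: str       # which virtual network function raised the event
    metric: str    # e.g. "packet_loss_pct"
    value: float

# Policy table: metric -> (threshold, remediation action).
POLICIES = {
    "packet_loss_pct": (5.0, "restart_vnf"),
    "cpu_util_pct": (90.0, "scale_out"),
}

def decide(event: Event) -> Optional[str]:
    """Return a remediation action if the event breaches policy, else None."""
    rule = POLICIES.get(event.metric)
    if rule is None:
        return None
    threshold, action = rule
    return action if event.value > threshold else None

actions = [decide(e) for e in (
    Event("vFirewall", "packet_loss_pct", 7.2),  # breaches 5.0
    Event("vRouter", "cpu_util_pct", 45.0),      # under 90.0
)]
print(actions)  # ['restart_vnf', None]
```

The point of the sketch is simply that in a virtualized network the remediation step is itself software – a policy lookup and an orchestrator call – rather than a truck roll to a proprietary box.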

That lack of guidance from vendors was a common theme among telecom operators, which now find themselves driving the virtualization movement – a reversal of the legacy model in which vendors led the charge in rolling out new technologies.

Verizon Communications, which of late has become more vocal about the progress it’s making with virtualized platforms, noted the shift puts the carrier in a decidedly different position in terms of comfort level, but said it is becoming more confident in the transition.

“No one is completely comfortable yet with virtualization, but it’s OK that it’s an uncomfortable world,” said Victoria Lonker, director of product management for MPLS, SDN and Mobile Private Network at Verizon Communications. “We have made sure we could service chain multiple vendor deployments together so they could work, which took some work because some vendors said their design would work in an open environment, but they didn’t work as planned once deployed. … We also have that challenge today with physical devices, but have been able to leverage that with managed services. The move toward virtualization is really not that much different today.”
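The service chaining Lonker describes – steering traffic through an ordered sequence of virtual network functions from different vendors – can be reduced to a simple composition pattern. The VNFs and their behaviors below are made up for illustration; real chains are built by an orchestrator steering actual packet flows.

```python
# Minimal sketch of a VNF service chain: traffic passes through an ordered
# list of functions, each standing in for a different vendor's VNF.
# All names and behaviors are hypothetical.

def firewall(packet: dict) -> dict:
    """Stand-in vendor-A VNF: mark packets on a blocked port as dropped."""
    if packet["port"] == 23:
        packet["dropped"] = True
    return packet

def wan_optimizer(packet: dict) -> dict:
    """Stand-in vendor-B VNF: shrink the payload (a proxy for compression)."""
    if not packet.get("dropped"):
        packet["payload"] = packet["payload"][:16]
    return packet

def service_chain(packet: dict, vnfs: list) -> dict:
    """Pass a packet through each VNF in order -- the essence of chaining."""
    for vnf in vnfs:
        packet = vnf(packet)
    return packet

chain = [firewall, wan_optimizer]
result = service_chain({"port": 443, "payload": "x" * 64}, chain)
print(len(result["payload"]))  # 16
```

The interoperability problem Lonker raises maps directly onto this picture: if one vendor’s function emits a “packet” in a shape the next function does not expect, the chain breaks – which is why Verizon says it had to verify multivendor chains in deployment rather than trust design claims.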

Analysts are also cognizant of the challenges facing carriers, especially around service assurance in the multivendor environments those operators are looking to deploy. Much of this could be surmounted using the open source community, but even that has its issues.

“If you look at a basic OpenStack software model, there needs to be some hardening to support carriers in terms of service assurance, and that seems to be pretty clear,” said Greg Collins, founder of Exact Ventures. “Just getting the scripts in place to tackle the issues is still a challenge.”
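The “scripts” Collins alludes to are typically modest assurance checks layered on top of the cloud platform. As a hedged sketch of the idea – with a stubbed status source standing in for a real query to the compute API, and all instance names invented – such a check might look like:

```python
# Sketch of a basic assurance check: poll each instance's reported status
# and flag anything not ACTIVE for follow-up. The status source is a stub;
# a real script would query the cloud platform's compute API instead.
# Instance names and statuses are hypothetical.

def fetch_statuses() -> dict:
    """Stub standing in for an API call; values are illustrative."""
    return {"vFirewall-1": "ACTIVE", "vRouter-1": "ERROR", "vLB-1": "ACTIVE"}

def unhealthy(statuses: dict) -> list:
    """Return the sorted names of instances needing attention."""
    return sorted(name for name, status in statuses.items()
                  if status != "ACTIVE")

print(unhealthy(fetch_statuses()))  # ['vRouter-1']
```

Even a check this simple hints at the hardening gap Collins describes: an instance reporting ACTIVE says nothing about whether the network function inside it is actually carrying traffic correctly, which is the level of assurance carriers need.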

As an example of operators recognizing the importance of the open source community, AT&T recently said it is set to move its ECOMP program to developers interested in building upon the already established software code. AT&T said it is working with the Linux Foundation on the structure of the open source release.

“This is a big decision and getting it right is crucial,” said John Donovan, chief strategy officer and group president for technology and operations at AT&T. “We want to build a community – where people contribute to the code base and advance the platform. And we want this to help align the global industry. We’ve engaged a third-party company to be the integrator and provide support in the industry for the ECOMP platform. And we’ve received positive feedback from major global telecom companies.”

In touting the platform, AT&T said ECOMP is “mature, feature-complete and tested in real-world deployments. And, we believe it will mature SDN and become the industry standard. Releasing this software into open source levels the worldwide playing field for everyone. Most importantly, we believe this will rapidly accelerate innovation across the cloud and networking ecosystems.”
