DataCenterKnowledge reported at the beginning of 2017 that AT&T had virtualized more than 30 percent of its customer-facing network. This is part of a three-year program that aims to achieve network virtualization by 2018, by which point the whole of the company’s network should reach this goal.
This example neatly illustrates how the telecommunications industry is moving toward next-generation technologies. Virtualization is meant to enable modern services, as well as to support the massive data traffic driven by IoT-style communications and video streaming.
What is the overall state of network virtualization? Let’s see if we can work out a more comprehensive picture in what follows.
The big operators going into network virtualization
Looking at a list of mobile virtual network operators (MVNOs) in the United States, we can easily see that the host networks include all the big carriers. We have seen where AT&T stands with regard to network virtualization. According to this source, Verizon, Sprint, and T-Mobile also host a variety of providers that in turn deliver telco services to businesses and individuals.
As far as the stage of these transformations is concerned, DataCenterKnowledge cites a relevant professional assessment. Big-scale projects take about one year of planning, followed by three years of slow growth. Only after this period can companies reach around 50 percent or more of their proposed target.
Verizon is going the same way as AT&T. They have invested in network virtualization on a big scale, and now provide services under the tagline “Boost agility with networks that deliver quickly on demand”. Take a look here at what they attribute to virtualization in terms of faster, better services.
T-Mobile, on the other hand, takes a different angle. Their CTO, Neville Ray, has talked about their modern core technologies, which make the virtualization move less “material” and evidently less needed for this carrier.
With their eyes set on 5G and on delivering the promised 1 Gb/s speeds, T-Mobile depends less on legacy systems and would probably consider software-defined networking only after 5G implementation. Once they employ network slicing, SDN’s benefits might be welcome.
Why the two mandatory steps in network virtualization are necessary
We may only assume that telecommunication R&D will continue to deliver improved solutions over time. For now, going for network virtualization means implementing two main elements:
- Software-Defined Networking (SDN);
- Network Function Virtualization (NFV).
Once carriers adopt this technology and gain experience in using it to provide better services, it branches out into various new applications. For example, Cisco estimates that NFV will spread into the enterprise, bringing benefits such as easier service provisioning, better service chaining, and improved scalability.
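To make the SDN half of this pairing slightly more concrete: in SDN, forwarding behavior is expressed as data handed to a central controller rather than configured box by box. The sketch below builds a simple flow-rule payload of the kind a controller’s northbound API might accept; the field names and JSON shape here are assumptions for illustration (real controllers each define their own schema), not any carrier’s actual setup.

```python
import json

def build_flow_rule(flow_id, in_port, out_port, priority=100):
    """Build a simple port-forwarding flow rule as a controller payload.

    The structure loosely follows OpenFlow concepts (match on ingress
    port, action: forward to an egress port). The exact JSON schema
    varies by controller, so treat this purely as illustrative.
    """
    return {
        "id": str(flow_id),
        "priority": priority,
        "match": {"in-port": in_port},
        "actions": [{"output": {"port": out_port}}],
    }

# A controller would receive this over its northbound API (e.g. REST);
# here we just serialize it to show the declarative shape of the rule.
rule = build_flow_rule(flow_id=1, in_port=1, out_port=2)
payload = json.dumps(rule)
```

The point is the shape, not the schema: the network’s behavior becomes structured data that software can generate, validate, and version, which is exactly what makes the SDN/NFV pairing attractive to operators.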
Also, in the same company’s opinion, the three driving factors behind virtualization will persist throughout 2017. These are:
- the increasing number of mobile devices pushing rich data and video onto networks;
- the public’s thirst for cloud-hosted applications (another data-consuming factor);
- the Internet of Everything moving on and generating ever more bulk data with its sub-networks and connected clusters of devices.
New mindsets for new times
Going beyond the financial and technical strain that virtualization currently places on telecommunications operators, RCR Wireless analyzes other necessary readjustments. More precisely, the publication points out that network monitoring and performance will in fact require a different mindset once virtualization is adopted.
SDx’s 2015 SDN and NFV Market Size and Forecast Report predicted that by 2021, 80 percent of service providers would embrace virtualization for their networks. In such a process, the operational changes are huge. Adapting as a specialist is not easy: new concepts, new functionality, and a lot of hardware being replaced by virtual equivalents all represent quite a challenge.
The vital role belongs to operations directors. It is up to them to make sure systems keep their integrity and performance levels during the transition. Monitoring deployments and maintaining comprehensive visibility are important for cyber-security reasons. For clients to remain undisturbed during the entire process, everything must go as planned.
In other words, companies embarking on the ample process of virtualization need, and count on, professionals who have thoroughly familiarized themselves with the new technology, from the technical details to the different mindset needed to find specific solutions whenever required.
From network administrators to network programmers
Cisco’s Trends in Enterprise Networking for 2017 mentions, in 8th place, that network administrators will most likely become network programmers as programming moves from the device to the controller. It all comes down to the different mindset required by modern technologies.
The cloud era takes the weight off hardware components and moves it into software environments. However, handling, organizing, and structuring data in a virtual environment is no less sophisticated; quite the contrary. It takes a whole new set of skills to “weave” a sustainable virtual structure that will withstand the ever-increasing data traffic.
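A hedged sketch of what the administrator-to-programmer shift looks like in practice: instead of typing CLI commands on each box, an operator states an intent once and a controller-side program expands it into per-device rules. The device names, topology format, and rule layout below are invented purely for illustration.

```python
def compile_intent(intent, topology):
    """Expand a high-level reachability intent into per-switch rules.

    `topology` maps each switch on the path to its (in_port, out_port)
    pair for this flow. All names and the rule format are hypothetical,
    meant only to show the controller-side style of network programming.
    """
    rules = {}
    for switch, (in_port, out_port) in topology.items():
        rules[switch] = {
            "match": {
                "src": intent["src"],
                "dst": intent["dst"],
                "in_port": in_port,
            },
            "action": {"forward": out_port},
        }
    return rules

# One declared intent replaces a per-device CLI session on every switch.
intent = {"src": "10.0.0.1", "dst": "10.0.0.2"}
topology = {"sw1": (1, 2), "sw2": (3, 1)}
rules = compile_intent(intent, topology)
```

The design choice this illustrates is the mindset change the article describes: the operator reasons about the network as a whole, and the per-device configuration becomes a compilation target rather than something managed by hand.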
An estimate from Cato Networks suggests that 40 percent of organizations could hold off on upgrading their infrastructure over the next few years due to complexity issues. The fear of being unable to properly protect their data is the main reason for stalling virtualization. Also, comprehensive visibility is neither easy to establish nor easy to explain to reluctant clients.