By Brian Handrigan on Tuesday, 26 August 2014
Category: Network Access Solutions

Why Network Monitoring Is Changing

IT needs end-to-end visibility, meaning tool access to any point in the physical and virtual network, and it has to be scalable and easy to manage, says Jason Echols, IXIA

Networking is a rapidly evolving landscape. We've quickly moved from parsing bits and bytes between workstations to delivering powerful applications and services to millions of consumers. Speed and bandwidth requirements have grown exponentially, new traffic types appear daily, and a functioning network is now a crucial part of running a successful business of any size, in any market. Traffic that stays within the data center will remain the majority throughout the forecast period, accounting for 76 per cent of data center traffic, and data center traffic on a global scale will grow at a 25 per cent CAGR.

At the same time that technologies like cloud and LTE are proliferating, the number of applications, both beneficial and malicious, is rapidly increasing and places further demands upon service providers and large enterprises. These organizations have to maintain constant vigilance to respond in real time, eliminate network blind spots, and identify hidden network applications so they can mitigate the security threats posed by rogue applications and users.

Rapid evolution of the data center creates both urgent challenges and outstanding revenue opportunities. Of course, these vary according to the organisation and its unique goals and concerns. However, all IT teams have to deal with the proliferation of devices from multiple vendors, threats in every form, and scalability needs.

As business networks continue to respond to user demands for access to more data, BYOD and the Internet of Things, a new chapter has opened for IT personnel. Eighty-nine per cent of global IT professionals report personal devices on corporate networks, more than 10 per cent of organizations are "fully aware" of all the devices accessing their network, and 1 in 4 employed adults has been the victim of malware or hacking on a personal device. These numbers are staggering and alarming.

While much of the traffic that runs through service provider and enterprise networks is stateful and application-based, access to application and user data has been costly and often lacking. Simply looking at layers 2-4 of the OSI model no longer provides deep insight into the character of the traffic. The layer 2-4 data continues to have value, but to really understand your network infrastructure and how to respond to customer demands, you need to see which applications are running and examine performance artifacts at the application layer, i.e. layer 7.
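To make the layer 2-4 versus layer 7 contrast concrete, here is a minimal Python sketch. The port mappings, signature byte patterns and function names are illustrative assumptions, not any vendor's actual detection logic; they simply show how a port-only guess can mislead while a payload check identifies the real application.

```python
# Illustrative only: layer 4 port guess vs. layer 7 payload signature match.

# Layer 4 view: ports suggest an application, but many apps tunnel over 443/80,
# so the guess is often wrong or too coarse.
PORT_GUESSES = {80: "http", 443: "https", 53: "dns"}

# Layer 7 view: simple, hypothetical payload signatures for known applications.
APP_SIGNATURES = {
    b"BitTorrent protocol": "bittorrent",
    b"SSH-2.0": "ssh",
    b"GET ": "http",
    b"\x16\x03": "tls",  # TLS handshake record header
}

def classify_l4(dst_port: int) -> str:
    """Guess the application from the destination port alone (layers 2-4)."""
    return PORT_GUESSES.get(dst_port, "unknown")

def classify_l7(payload: bytes) -> str:
    """Refine the guess by matching application-layer signatures (layer 7)."""
    for signature, app in APP_SIGNATURES.items():
        if signature in payload[:64]:
            return app
    return "unknown"

# Traffic on port 443 looks like HTTPS at layer 4, but the payload reveals a
# BitTorrent handshake tunnelled over the "well-known" port.
payload = b"\x13BitTorrent protocol" + b"\x00" * 8
print(classify_l4(443))      # -> https
print(classify_l7(payload))  # -> bittorrent
```

The point of the sketch is the mismatch between the two answers: only the application-layer view tells you what is actually running on the wire.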

IT organisations are tasked with providing their customers with connectivity for communication and for their business-critical applications. Customer expectations are now higher and more service-focused; infrastructure and a functioning network by themselves are mere "table stakes" in the network management game.

IT is expected to provide the highest possible customer experience in a secure and always-up network environment. In order to meet these new demands for impeccable service, IT organizations must deal with a myriad of dynamic forces that challenge their ability to meet expectations.

Another needed component of visibility is application intelligence: the ability to monitor packets based on application type and usage. This technology is the next evolution in network visibility.

Application intelligence can be used to dynamically identify all applications running on a network. Distinct signatures for known and unknown applications can be identified and captured to give network managers a complete view of their network. In addition, well-designed visibility solutions will generate additional contextual information, such as the geolocation of application usage, network user types, and the operating systems and browser types in use on the network.
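As a rough illustration of the kind of contextual record described above, here is a hedged Python sketch. The FlowContext fields, the geolocate() stub and the User-Agent parsing are hypothetical stand-ins for what a real visibility solution would derive from GeoIP databases and full protocol decoding.

```python
# Hypothetical structure for the per-flow context an application-intelligence
# layer might emit: detected application plus geolocation, OS and browser hints.
import re
from dataclasses import dataclass

@dataclass
class FlowContext:
    app: str
    src_country: str
    os_family: str
    browser: str

def geolocate(ip: str) -> str:
    """Assumed stub: a real system would consult a GeoIP database here."""
    return "IN" if ip.startswith("103.") else "unknown"

def parse_user_agent(payload: bytes) -> tuple[str, str]:
    """Very rough OS/browser hints pulled from an HTTP User-Agent header."""
    match = re.search(rb"User-Agent: ([^\r\n]+)", payload)
    ua = match.group(1).decode(errors="replace") if match else ""
    os_family = "Windows" if "Windows" in ua else "Android" if "Android" in ua else "unknown"
    browser = "Chrome" if "Chrome" in ua else "Firefox" if "Firefox" in ua else "unknown"
    return os_family, browser

def enrich_flow(src_ip: str, app: str, payload: bytes) -> FlowContext:
    """Combine the classified application with contextual metadata."""
    os_family, browser = parse_user_agent(payload)
    return FlowContext(app, geolocate(src_ip), os_family, browser)

payload = b"GET / HTTP/1.1\r\nUser-Agent: Mozilla/5.0 (Windows NT 10.0) Chrome/99\r\n\r\n"
print(enrich_flow("103.25.0.9", "http", payload))
# -> FlowContext(app='http', src_country='IN', os_family='Windows', browser='Chrome')
```

The value of records like this is that network managers can answer questions such as "which countries and browsers are generating this application's traffic" without deploying a separate tool for each question.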

Many companies are using an adaptive, intelligent network visibility architecture. A visibility architecture is a holistic approach to network visibility that controls costs and administrative burdens, while optimizing the investment value of monitoring and security tools. A visibility architecture helps speed application delivery and enables effective troubleshooting and monitoring for network security, application performance, and service level agreement (SLA) fulfillment - and allows IT to meet compliance mandates.

Network and IT organisations are caught in a constant cycle of deploying new services, supporting new use cases, and managing growth - which results in networks that are always trying to get back to a reliable state before the next round of changes hits.

One of the results of these changes over the last 15-20 years is there are more monitoring, visibility, and security tools in use today than ever before. In fact, these tools are typically required today for all enterprise data center and campus networks, as well as service provider IT, data, and LTE production networks.

But all of these tools need access to data on the production network. In fact, most of the tools function better when they get data from across the entire network, including the data center, security DMZs, the network core, and the different campus and remote office locations. The problem is that many of these tools aren't getting the data access they need.

IT needs end-to-end visibility, meaning tool access to any point in the physical and virtual network, and it has to be scalable and easy to manage. But more than that, IT needs control. These tools often can't handle all the traffic from across the network, so IT needs the ability to control what information is directed to each tool - and it needs to do all this within existing budget constraints.
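To illustrate the kind of control meant here, below is a small Python sketch of filter rules of the sort a network packet broker applies. The rule names, match fields and tool ports are hypothetical; real deployments express such rules in the broker's own management interface rather than in code.

```python
# Hypothetical rule format: direct only the traffic each monitoring or security
# tool actually needs, so no tool is oversubscribed with data it cannot use.
from dataclasses import dataclass

@dataclass
class FilterRule:
    name: str
    match: dict      # attributes a packet/flow must have to match this rule
    tool_port: str   # which tool (output port) receives the matching traffic

RULES = [
    FilterRule("voip-to-perf-monitor", {"vlan": 100, "protocol": "udp"}, "tool-A"),
    FilterRule("web-to-ids", {"dst_port": 443}, "tool-B"),
    FilterRule("dmz-to-forensics", {"segment": "dmz"}, "tool-C"),
]

def route(packet_meta: dict) -> list[str]:
    """Return the tool ports that should receive a packet with these attributes."""
    outputs = []
    for rule in RULES:
        if all(packet_meta.get(key) == value for key, value in rule.match.items()):
            outputs.append(rule.tool_port)
    return outputs

# A DMZ HTTPS flow goes to the IDS and the forensics tool, but never to the
# VoIP performance monitor, which is spared traffic it cannot use.
print(route({"segment": "dmz", "dst_port": 443, "protocol": "tcp"}))
# -> ['tool-B', 'tool-C']
```

The design point is that filtering happens before the tools, so each tool's capacity (and licence cost) is spent only on the traffic it is meant to analyse.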

Thanks to Business World for the article. 
