According to IDC, nearly 5 million 10 Gb ports were shipped in the third quarter of 2013, surpassing Gigabit port shipments for the first time since the technology was introduced in 2001. That's a lot of upgrades, but with Big Data analysis, VoIP, video, and other bandwidth-intensive applications gobbling up more and more network resources, it's no surprise.
The real challenge for many network teams today is managing performance monitoring expectations in the new 10 Gb environment, as some of the old standbys will have to be put out to pasture, or significantly modified to keep up with the higher network speeds.
“You can’t argue with Wireshark’s price point, but its inability to scale and perform in 10 Gb environments becomes apparent when packet captures get too sizeable,” says Sam Wang, Regional Sales Director and Packet Analyst Extraordinaire. “Whenever a capture runs over a gigabyte, Wireshark is either very slow or it crashes completely.”
But the network protocol analyzer still has its uses, as Wang is also quick to point out. “Wireshark is very usable in the Gigabit world, but when you go to 10 Gb, the utilization, throughput, and amount of raw data are ten times greater. It becomes an issue for a software-only product like Wireshark to keep up. You have to have some sort of built-in 10 Gb performance product [like Network Instruments GigaStor] to capture the information. You can still export the data to Wireshark, which is common practice. But you can’t just run Wireshark on a laptop, a desktop, or a server anymore and expect to monitor 10 Gb.”
Another concern for network administrators is what to do with all the old Gigabit tools. The fastest and easiest method, though by no means the cheapest, is to recycle them and purchase new 10 Gb tools. But is that necessary?
“Another approach that is being used a lot when going from Gigabit to 10 Gb, especially for teams that don’t have the budget to move immediately to 10 Gb tools, is to put in a product like the Matrix network monitoring switch,” Wang says. “Assuming your 10 Gb throughput is low and you do the math to avoid oversubscribing the one-Gigabit tool, you can use Matrix as a middleman between your 10 Gb network and your legacy Gigabit performance devices.”
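The "do the math" step Wang describes can be sketched quickly. The figures and the helper below are illustrative assumptions, not a Matrix feature: the idea is simply that the filtered, aggregated traffic forwarded to a legacy tool must stay under that tool's capacity.

```python
# Rough oversubscription check when feeding tapped 10 Gb links into a
# 1 Gb legacy tool through a monitoring switch. All figures are
# illustrative assumptions, not vendor specifications.

def is_oversubscribed(link_rates_gbps, tool_capacity_gbps=1.0, filter_ratio=1.0):
    """Return True if the filtered aggregate exceeds the tool's capacity.

    link_rates_gbps: observed average throughput of each tapped link (Gb/s)
    filter_ratio:    fraction of traffic forwarded after filtering (1.0 = all)
    """
    forwarded_gbps = sum(link_rates_gbps) * filter_ratio
    return forwarded_gbps > tool_capacity_gbps

# Two 10 Gb links averaging 0.3 Gb/s each, filtered down to 40% of traffic,
# forward only 0.24 Gb/s -- well within a 1 Gb tool's capacity:
print(is_oversubscribed([0.3, 0.3], filter_ratio=0.4))  # False

# The same two links at 2 Gb/s each would forward 1.6 Gb/s and swamp
# the tool even with the same filtering:
print(is_oversubscribed([2.0, 2.0], filter_ratio=0.4))  # True
```

Average rates hide bursts, so in practice you would budget against peak utilization rather than the mean.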
Common scenarios see 10 Gb links running around 5 Gb per second, still two to four times typical Gigabit throughput. But compare that level of activity to a fully saturated, full-duplex 10 Gb link, which can run at 18 or 19 Gb of combined traffic. That's tens of millions of packets per second.

How do you monitor tens of millions of packets per second?
“That’s very high utilization,” says Wang. “Most enterprises don’t have that sort of throughput. Most see around 5 Gigabits per second on their 10 Gb links, which is still millions of packets per second traversing the link we’re trying to monitor. It’s important to understand the amount of raw data being passed, to be able to capture that data without dropping any packets, and to maintain adequate storage for the amount of time you want to be collecting it. It’s much more of a challenge than Gigabit, and the need for a performance-oriented appliance front-ending the 10 Gb network becomes paramount to getting the necessary information.”
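The packet rates above follow directly from link speed and frame size. A quick sketch of the line-rate arithmetic, using standard Ethernet wire overhead (7-byte preamble, 1-byte start delimiter, 12-byte minimum inter-frame gap); the traffic figures are illustrative:

```python
# Back-of-the-envelope Ethernet packet rates. Beyond the frame itself,
# each frame occupies 20 extra bytes on the wire: 7-byte preamble,
# 1-byte start-of-frame delimiter, and a 12-byte minimum inter-frame gap.

WIRE_OVERHEAD_BYTES = 20

def packets_per_second(link_gbps, frame_bytes):
    """Maximum frames per second for a given rate and uniform frame size."""
    bits_per_frame = (frame_bytes + WIRE_OVERHEAD_BYTES) * 8
    return link_gbps * 1e9 / bits_per_frame

# A saturated 10 Gb link at minimum-size (64-byte) frames, per direction:
print(f"{packets_per_second(10, 64):,.0f} pps")   # ~14.9 million

# A more typical 5 Gb/s of traffic at a ~500-byte average frame:
print(f"{packets_per_second(5, 500):,.0f} pps")   # ~1.2 million
```

Real traffic mixes frame sizes, so actual rates land between these bounds, but even the conservative case is far beyond what software-only capture on commodity hardware can sustain without drops.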
Another concern is multiple data centers. Wang says, “People will want to consider the implementation of multiple data centers for things like redundancy, load sharing, and load balancing. You’ll want to be cognizant of where the data can travel in today’s enterprise environment, and consider a unified performance platform with a good workflow. If the data traverses from one data center to another, there needs to be an effective way to get to the information across different physical locations.”
“The criteria for evaluating 10 Gb network performance tools are scalability and the ability to keep up with current and future 10 Gb throughput without dropping packets, performing slowly, or running out of physical storage to go back as far as people want to go,” says Wang.
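That last criterion, storage depth, is easy to estimate up front: retention is just sustained capture rate multiplied by the lookback window you need. A minimal sizing sketch, with purely illustrative numbers:

```python
# Rough capture-storage sizing: disk consumed by a sustained capture
# rate over a retention window. Numbers here are illustrative, not a
# recommendation for any particular appliance.

def storage_tb(sustained_gbps, retention_hours):
    """Terabytes of raw capture produced at a sustained rate over a window."""
    bytes_per_second = sustained_gbps * 1e9 / 8
    total_bytes = bytes_per_second * retention_hours * 3600
    return total_bytes / 1e12

# Retaining 5 Gb/s of captured traffic for a full day:
print(f"{storage_tb(5, 24):.0f} TB")  # 54 TB

# A 72-hour lookback at the same rate triples that:
print(f"{storage_tb(5, 72):.0f} TB")  # 162 TB
```

Indexing, metadata, and filesystem overhead add on top of the raw figure, so appliances are typically sized with headroom beyond this baseline.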
Thanks to Network Instruments for the article.