Now capable of storing 96 TB of network and application traffic for analysis in a single chassis, the GigaStor Expandable scales past a petabyte with additional units. It is the highest-capacity retrospective network analysis solution on the market today.*

To put the size of a petabyte into perspective: if the average smartphone camera photo is 3 MB and the average printed photo is 8.5 inches wide, then a petabyte of those photos placed side by side would stretch more than 48,000 miles, almost far enough to wrap around the equator twice.
Source: ComputerWeekly.com
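
As a rough sanity check on that comparison, here is a short Python sketch of the arithmetic. The 3 MB photo size and 8.5-inch print width come from the article; the equator length and the choice between a decimal and a binary petabyte are assumptions, and the article's roughly 48,000-mile figure falls between the two results.

```python
# Back-of-the-envelope check of the petabyte-of-photos comparison above.
PHOTO_BYTES = 3 * 10**6      # 3 MB per smartphone photo (article's figure)
PHOTO_WIDTH_IN = 8.5         # width of one printed photo, in inches
INCHES_PER_MILE = 63_360
EQUATOR_MILES = 24_901       # Earth's equatorial circumference (assumption)

for label, petabyte_bytes in (("decimal PB", 10**15), ("binary PiB", 2**50)):
    photos = petabyte_bytes / PHOTO_BYTES
    miles = photos * PHOTO_WIDTH_IN / INCHES_PER_MILE
    print(f"{label}: {photos:,.0f} photos -> {miles:,.0f} miles "
          f"({miles / EQUATOR_MILES:.2f}x around the equator)")
```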

Form Factor: Expandable
Deployment: Data Center, Server Access Layer, Long-Term Retention
Capacity: 96 TB to over 1 PB
Networks: 1 Gb, 10 Gb & 40 Gb
Rack Size: 5U

GigaStor Expandable capacity starts at 96 TB and grows in 96 TB increments up to 288 TB; from there it scales in 288 TB steps to 576 TB, 864 TB, and ultimately beyond a petabyte of packet capture storage. The Expandable product line offers in-the-field scalability to meet growing performance monitoring needs on gigabit, 10 Gb, and 40 Gb links.
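
A minimal sketch of those expansion tiers, assuming the 96 TB and 288 TB step sizes read straight from the paragraph above; the resulting tier list is inferred from the article, not taken from a vendor spec sheet:

```python
# Enumerate GigaStor Expandable capacity tiers: 96 TB steps up to 288 TB,
# then 288 TB steps until total capacity passes a (decimal) petabyte.
tiers_tb = [96, 192, 288]          # 96 TB increments
while tiers_tb[-1] <= 1000:        # then 288 TB increments past 1 PB
    tiers_tb.append(tiers_tb[-1] + 288)

for tb in tiers_tb:
    note = "  (over a petabyte)" if tb > 1000 else ""
    print(f"{tb:>5} TB{note}")
```

Running this prints the 96, 192, 288, 576, and 864 TB tiers named in the article, topping out at 1,152 TB once the final 288 TB step crosses the petabyte mark.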

GigaStor provides network forensics and data retention solutions to network teams for retrospective network analysis and troubleshooting. Using GigaStor’s analytics, users navigate to the exact moment a problem occurred, view packet-level details around the event, and resolve the issue. GigaStor, along with other Observer Platform solutions, saves significant troubleshooting time and eliminates the need to recreate the performance problem.

*Compared to major vendors Netscout, Riverbed, and Fluke, Network Instruments’ GigaStor provides the largest long-term packet capture and storage capacity in a single appliance.

Thanks to Network Instruments for the article. 
