Big iron network security specialist Crossbeam Systems proved to me in a live network security test that its hardware can breezily execute firewall and intrusion prevention inspection, as well as Network Address Translation (NAT), on the equivalent of more than one million mobile device users simultaneously.
EANTC used a stack of Spirent’s Avalanche devices to emulate the traffic of one million simultaneous mobile device users running HTTP sessions, checking email, and downloading over-the-air smartphone OS updates. Engineers directed that traffic over 10 Gigabit links into Crossbeam’s highest-end product, an X80-S chassis fully populated with processor blades running Check Point Software’s R75 security technology. Throughout the network security test there were no signs of service interruption or degradation of user experience.
The Crossbeam chassis
Crossbeam's X-Series chassis-based platforms, which host third-party network security software such as firewalls from McAfee and Check Point Software, are carrier-grade devices (the company counts 16 of the world's 20 largest carriers as customers), but Crossbeam also targets the enterprise, claiming customers such as Volkswagen.
The Crossbeam chassis feature an extensible, scalable architecture, with performance ranging from 5 Gbps up to 140 Gbps in the high-end X80-S. Each chassis holds two classes of modules: Network Processor Modules (NPM) that manage and load balance incoming traffic flows, and multi-core Application Processor Modules (APM) that execute the network security processes of the hosted applications. To add or remove capacity, customers can hot-swap NPMs and APMs on the fly. (Note the X80-S at left, with four APM blades pulled out mid-test. The photo was taken through soundproof glass; please excuse my reflection.)
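To make the two-tier design concrete, here is a toy model of an NPM-style front end pinning flows to a pool of APM-style worker blades. This is purely illustrative -- the class names, the use of rendezvous hashing, and all numbers are my own assumptions, not Crossbeam's actual scheme -- but it shows why pulling a blade mid-test disturbs only that blade's flows:

```python
# Toy model: an NPM-style balancer pinning flows to APM-style blades.
# Rendezvous hashing is an assumption for illustration; Crossbeam's
# real load-balancing algorithm is not documented in the article.
import hashlib

class Chassis:
    def __init__(self, num_apms):
        # blade id -> set of flow ids currently pinned to that blade
        self.blades = {i: set() for i in range(num_apms)}

    def _pick(self, flow_id):
        # Deterministic per-flow hash so every packet of a flow hits the
        # same blade; highest-score-wins means minimal churn on removal.
        def score(blade):
            digest = hashlib.sha256(f"{flow_id}:{blade}".encode()).hexdigest()
            return int(digest, 16)
        return max(self.blades, key=score)

    def assign(self, flow_id):
        blade = self._pick(flow_id)
        self.blades[blade].add(flow_id)
        return blade

    def hot_swap_out(self, blade):
        # Pulling a blade: only its orphaned flows are reassigned;
        # flows on the surviving blades stay exactly where they were.
        orphans = self.blades.pop(blade)
        return [(flow, self.assign(flow)) for flow in orphans]

chassis = Chassis(num_apms=12)
for flow in range(100_000):
    chassis.assign(flow)

moved = chassis.hot_swap_out(blade=0)
# Roughly 1/12 of flows move; the other 11 blades keep their state.
print(len(moved), sum(len(s) for s in chassis.blades.values()))
```

The design choice the sketch highlights is that flow-to-blade pinning with a stable hash lets capacity come and go without a global reshuffle, which is consistent with the modest latency impact EANTC observed when blades were removed.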
Crossbeam’s network security test: “We’ve emulated China.”
The scale of this particular test was immense -- the stack of Spirent boxes assembled to emulate the traffic required 10 minutes to start up and another 20 minutes just to ramp the synthesized traffic up to one million users.
The network security test validated that the X80-S was able to run at 106 Gbps with stateful firewall, IPS and NAT enabled. It supported one million simultaneous users, four million active TCP connections and 242,000 new connections per second. Spirent’s software measured that the latency introduced by the Crossbeam’s processing of traffic was only 10 milliseconds.
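A few back-of-envelope ratios help put those measured figures in perspective. The measurements are from the test; the derived numbers below are my own arithmetic:

```python
# Reported measurements from the EANTC test
users = 1_000_000
active_conns = 4_000_000
new_conns_per_sec = 242_000
throughput_gbps = 106

# Derived ratios (my arithmetic, not figures from the test)
conns_per_user = active_conns / users                  # 4 connections per user
table_turnover_s = active_conns / new_conns_per_sec    # ~16.5 s to refresh the table
per_conn_kbps = throughput_gbps * 1e6 / active_conns   # ~26.5 kbps per connection

print(conns_per_user, round(table_turnover_s, 1), round(per_conn_kbps, 1))
```

In other words, at 242,000 new connections per second the entire four-million-entry connection table churns completely in well under half a minute, which is what makes the sustained 10-millisecond latency figure notable.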
As the test ran its course and the aggregate number of simulated connections crossed 2.1 billion, someone in the lab quipped: “We’ve emulated China.”
Chris Chapman, a technical marketing engineer with Spirent, said, “I test a wide range of devices, and this is a factor of five times more than what I typically see with service provider firewalls.”
As the network security test ramped up to 600,000 transactions per second, Chapman added: “The firewall look-ups per second is huge, and it’s maintaining a good user experience. I have a hard time classifying this device.”
Crossbeam’s X-Series chassis are known for the hot-swappable capabilities of their APM blades. As the test was passing its peak, Peter Doggart, Crossbeam’s director of product marketing, was feeling heady; he ran into the room where the equipment was running and started pulling out APM blades just to see what would happen.
The emulated traffic from the Spirent stack was already starting to drop, but the overall performance of the X80-S barely registered a blip from the loss of processing power. The NPM blades were able to load balance traffic enough so that latency never spiked beyond 400 milliseconds, even with half of the 12 APM blades yanked out.
Doggart said Crossbeam wanted to prove out its X-Series products because IT organizations typically don’t believe the performance numbers they get from vendors, and they rarely have the means to verify them on their own. Crossbeam recently surveyed enterprises about how they evaluate network security gear before rolling it into production.
“Half of respondents never test their equipment,” he said. “They don’t even do a proof of concept. And two-thirds had to buy additional equipment to meet their performance goals.”