
World Cyber Games' network is 'organized chaos'

ProCurve Networking by HP built an enterprise-grade network for the World Cyber Games in Seattle last week, giving thousands of gamers uninterrupted access.

SEATTLE – Designing and building the network for the World Cyber Games (WCG) Grand Final, considered by many to be the Super Bowl of gaming, was a labor of love for a pair of network pros from ProCurve Networking by HP.

The event brings together thousands of gamers from around the globe to compete in games like Counter-Strike, StarCraft, FIFA 07, Tony Hawk's Project 8 and many more for cash and prizes. The battle for bragging rights is set to turn Seattle's Qwest Field Event Center into the world's largest arcade of sorts from Oct. 4 to Oct. 7.

But ensuring that the more than 700 gaming PCs and 300-plus Xboxes, more than 20 edge switches, a core switch, a wireless overlay and miles of cable offer uninterrupted game play requires an enterprise-grade network, even though it's only temporary. Essentially, ProCurve was charged with building a network to support thousands of devices and thousands of users in just a couple of days, a challenge that would take the average enterprise weeks, even months, to pull off.

Last Thursday, technical teams converged on the event center. Hundreds of yards of cable were laid out, the PCs were set up and the network, designed and diagramed in the two months leading up to the kickoff, was set in motion.

"It's like a big dance project going into there," said ProCurve Networking technical consultant for the Americas Chris Ruybal, calling the planning and setup "organized chaos."

"You have to weave [the network] into something you can show other people," he said.

Getting all of the teams on the same page in the days leading up to the WCG was a challenge, Ruybal said, especially since no one really meets each other in person until crunch time.

"We all have to meet each other on the first day we can get into the facility," he said. "Then you have to take your plan, lay it out and map the physical equipment to that layout. The trick is to compartmentalize, focus on the function that it's providing and have it all collapse back into one backbone."

Ben Van Kerkwyk, lead engineer and WCG network architect, interjected: "You have to contain, control and focus on the function you're trying to provide. You have to get all of the parties together for planning. If you can do that right, it should be fairly simple in terms of how you build it. When everybody meets for the first time it gets real."


From the start there were minor issues: The cabling crew showed up late; the ProCurve 8212, which serves as the network's core switch, didn't arrive as planned; all of the Samsung gaming PCs showed up with Korean power jacks; and other events slated for the venue, namely a concert by George Thorogood and Bryan Adams, put time constraints on an already tight deadline.

Even before access was granted to the Qwest Field Event Center, there were some last-minute adjustments. Ruybal flew into Seattle from California carrying a white Safeway shopping bag containing a backup power supply, just in case. Van Kerkwyk sat beside him on the plane carrying additional modules for the 8212 switch, which, come tournament time, would have a wireless services module, a redundant management module, four 24-port gigabit PoE modules and two 2x4 fiber modules.

"We're dependent on a bunch of different aspects. This one's a little tricky, because I only have a third of the access at the moment," Van Kerkwyk said, noting that because of the concert and other planned events, only the gaming balcony could be set up last week. The main event floor had to be put off until after the weekend.

When all is said and done, the network will have more than 20 ProCurve 2650 edge switches -- one at each gaming table -- feeding into the core switch and powering the network, the ProCurve team's monitoring stations, the management server and other technologies. A wireless overlay for VIPs and a few others will be added to the mix, using somewhere between seven and 10 radio ports. The network infrastructure will also deliver real-time video feeds over the Internet, which could draw more than 30 million viewers from around the globe.

One key component of the WCG network, Van Kerkwyk said, is sFlow traffic monitoring, a technology that statistically samples packets to see exactly what's happening on the network. sFlow is ideal, he said, because it adds minimal traffic of its own while giving his staff visibility into who's doing what and when.
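The idea behind that kind of sampling is simple enough to sketch in a few lines of Python. What follows is a hand-rolled illustration of 1-in-N statistical sampling, not ProCurve's implementation; the 1-in-128 sample rate and the flow labels are invented for the example:

```python
import random
from collections import Counter

def sflow_estimate(packets, sample_rate=128):
    """Sample roughly 1 in every `sample_rate` packets, then scale the
    counts back up to estimate the true per-flow traffic mix."""
    sampled = Counter()
    for pkt in packets:
        if random.randrange(sample_rate) == 0:      # ~1-in-N sampling
            sampled[pkt["flow"]] += 1
    # Each sampled packet stands in for about `sample_rate` real ones.
    return {flow: n * sample_rate for flow, n in sampled.items()}

# Illustrative traffic: one chatty host hiding among game traffic.
traffic = ([{"flow": "counter-strike"}] * 90_000 +
           [{"flow": "suspicious-host"}] * 10_000)
random.shuffle(traffic)
print(sflow_estimate(traffic))  # rough per-flow packet estimates
```

Because only a small fraction of packets are ever inspected, the monitoring itself adds almost no load -- exactly the property Van Kerkwyk was after.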

Ruybal summed it up like this: "If you run into an issue, the more you can see into what the root cause is, the easier it is to track down and stop it."

And for Ruybal and Van Kerkwyk, keeping the network running and running smoothly is a top priority, not only because the event hinges on it, but because their reputations rely on it as well.

"Anything from a communications perspective that goes wrong it's our fault," Ruybal said. "The more tools and capabilities in your products and design, the easier it is to go from problem to solution immediately."

And when it comes to potential problems that could interrupt the network, there are plenty. Van Kerkwyk said security is a concern, since someone could tap in and manipulate games. Other concerns fall on the human-error side of things. Either way, any delay or interruption that affects gameplay means the match must be replayed, slowing the overall competition.

"There's a lot of things to worry about," he said. "Like people stepping on the cables and pulling them out. We're putting our equipment in front of 1,000 different variables."


Ruybal added: "We have to come in and set a certain kind of standard."

Thursday's setup was mostly just that: laying the groundwork. The network in the gaming area wouldn't be officially up and running until Friday afternoon. Ruybal and Van Kerkwyk matched their layout to the diagrams and decided how to run everything back to the core switch. The team placed the 50-port 2650s on each gaming table and powered them up to ensure they worked. They had a few spares on hand in case one went bust -- "always over-provision," Van Kerkwyk said -- but as of the weekend, they hadn't been needed.

Powering up the edge switches, though monotonous, is the cost of doing business. Come game day, if one isn't working, that could put a major wrench in things.

"It's one of those pay now or pay later kind of things," Ruybal said. "You don't want the opening ceremonies to start and someone goes to power up their stuff and …."

Each table needed its own switch, since games are highly sensitive to delay and jitter. The setup essentially makes each gaming table its own network with 1-gig speeds.

"I don't want a lot of hops," Ruybal said, adding that could degrade performance. "I don't want daisy-chaining."

"It's a fairly intensive environment for a network," Van Kerkwyk added. "Like voice, it can't stand any jitter and can't stand any delay."

Ruybal interjected: "Talk about a mission-critical application. If you mess up, it's pretty noticeable."
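Jitter, in this context, is the variation in packet delay from one packet to the next. As a rough illustration of how it's measured, here is the smoothed estimator from RFC 3550 (the RTP spec) in Python; the transit times are invented for the example and have nothing to do with ProCurve's tooling:

```python
def interarrival_jitter(transit_ms):
    """RFC 3550-style smoothed jitter over the transit times
    (send-to-receive delay) of consecutive packets, in milliseconds."""
    jitter = 0.0
    for prev, curr in zip(transit_ms, transit_ms[1:]):
        d = abs(curr - prev)            # transit-time variation
        jitter += (d - jitter) / 16.0   # exponential smoothing, gain 1/16
    return jitter

# A steady 20 ms path with one 35 ms spike: small but measurable jitter.
print(round(interarrival_jitter([20, 20, 20, 35, 20, 20]), 2))
```

Keeping every gaming PC one hop from its own table switch keeps that number near zero, which is why Ruybal refused to daisy-chain.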


In the corner sat the lonely rack for the missing-in-action 8212 core switch, which would serve as mission control. Once they tracked down the 8212 -- it was delivered to the wrong part of the building -- it was mounted in the rack and, after a quick trip to Sears for a wrench and a screwdriver set, it was powered on. On Friday, everything was fed into it and all of the edge switches were configured with IP addresses and names so they could be centrally managed.

"If I do this right, I only need to touch these switches once," Van Kerkwyk said.

Once all of the switches were configured, different games were segmented into separate VLANs and subnets to ensure they couldn't interfere with each other.
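A per-game plan along those lines can be sketched in a few lines of Python using the standard ipaddress module; the VLAN IDs, game list and 10.10.0.0/16 block below are invented for illustration and aren't the actual WCG addressing:

```python
import ipaddress

# Hypothetical plan: each game gets its own VLAN and /24 subnet so
# traffic from one tournament can't interfere with another.
games = ["counter-strike", "starcraft", "fifa-07", "project-8"]
block = ipaddress.ip_network("10.10.0.0/16")    # illustrative space

plan = {
    game: {"vlan": 100 + i, "subnet": subnet}
    for i, (game, subnet) in enumerate(
        zip(games, block.subnets(new_prefix=24)))
}

for game, cfg in plan.items():
    print(f"VLAN {cfg['vlan']:>3}  {cfg['subnet']}  <- {game}")
```

Isolating each game this way means a broadcast storm or a misbehaving host in one tournament stays in that tournament's broadcast domain.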

The true test will be this week, before the event kicks off. Van Kerkwyk and Ruybal will have access to the main event floor and will have just days to get that area ready. They'll rely on the in-house infrastructure and run fiber. Still, as of Friday afternoon, the ProCurve team was comfortable with its progress.

"It'd always be nice to be further along than you are, but we'll get it together," Van Kerkwyk said. "Our objective for these two days was to get this upstairs part all running and done, and we've done that."

And on Friday, two more core switches popped up. Again, over-provisioning just in case. "We now have three core switches and 15 power supplies, so we're good if something goes down," Van Kerkwyk said.

Ruybal added: "We could power all of Seattle with all of the bandwidth and power we brought."
