If nothing else, network architects expect 10 GbE core switches to deliver raw performance. Bottlenecks can lurk
inside a high-speed switch, however, and that could cause its actual performance to clock in far below theoretical maximum. The solution, of course, is to benchmark any of these high-end devices before purchase and deployment. Testing core switch performance brings another set of challenges, though.
Test tool vendors such as Ixia and Spirent Communications provide a wide range of powerful network switch benchmarking gear that includes 10 GbE interfaces. Such gear is essential to benchmarking core switches. Often, however, organizations must share access to this gear across multiple test teams, leaving little time for testing core switch performance. In other cases, you might assemble all your high-end traffic generation gear and find out that your core switch is cruising along showing only 20% CPU utilization.
If buying or renting additional traffic-generation firepower is economically or logistically impossible, it is time to dust off your old statistics book and let linear regression come to the rescue in testing core switch performance. With a sufficient number of data points of key system variables, it is possible and, indeed, easy to use statistical modeling to make reasonable projections of switch performance beyond the load points that you are able to test.
Why use traffic modeling for benchmarking core switch performance?
In a test of high-end 10 GbE L4-7 application switches published in November 2009, The Tolly Group found it needed to employ a modeling technique to postulate the likely connection, transaction and DNS query response capacity of a Brocade ADX 10000 outfitted with 16 10 GbE interfaces. (See Tolly Report 209150.)
Even with an extensive set of traffic generators, engineers were able to drive the system, which ran 32 application processing cores, to only 28% CPU utilization in the L4 transaction rate test, which corresponded to roughly 5 million transactions per second.
Modeling core switch traffic with regression analysis and least squares
So engineers performed regression analysis to approximate the limits of the switch, correlating CPU utilization to transaction rate. Specifically, engineers employed a method known as "least squares," which is, thankfully, described in detail on Wikipedia at http://en.wikipedia.org/wiki/Least_squares.
To see how this is applied, let's look at Figure 1 from the aforementioned Tolly document, which shows how the model was built.
Having reached the limits of the test harness for the L4 transaction rate test, engineers ran multiple iterations at lower traffic loads -- in this case, 1, 2, 3 and 4 million tps -- to provide data for the model. Combined with the maximum measured load, this gave engineers a total of five pairs of traffic load and CPU usage values.
After the set of equations was run on the data, engineers projected that the system could process approximately 18 million transactions per second before the CPU resource was exhausted. The least squares regression formula output two numbers that became the key constants for the projection: "m," the slope value, and "b," the y-intercept value.
These constants were plugged into the slope-intercept formula, y = mx + b. Simply put, this allowed engineers to project the transaction rate (y) for a given CPU usage (x) and to extend the fitted line from the point where physical benchmarking stopped to the point where the CPU would be running at 100%.
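The fit and projection can be sketched in a few lines of Python. The data points below are illustrative only -- they are not the Tolly Group's actual measurements, just numbers that assume the roughly linear CPU/transaction relationship the test observed.

```python
# Least squares fit of transaction rate (y) against CPU utilization (x),
# then a slope-intercept projection of the rate at 100% CPU.
# The data points are illustrative, not the Tolly Group's measurements.

cpu = [5.6, 11.2, 16.8, 22.4, 28.0]   # % CPU utilization at each load point
tps = [1e6, 2e6, 3e6, 4e6, 5e6]       # measured transactions per second

n = len(cpu)
sum_x = sum(cpu)
sum_y = sum(tps)
sum_xy = sum(x * y for x, y in zip(cpu, tps))
sum_xx = sum(x * x for x in cpu)

# Standard least squares formulas for slope (m) and y-intercept (b)
m = (n * sum_xy - sum_x * sum_y) / (n * sum_xx - sum_x ** 2)
b = (sum_y - m * sum_x) / n

# Slope-intercept projection: transaction rate when CPU reaches 100%
projected_tps = m * 100 + b
print(f"Projected capacity at 100% CPU: {projected_tps / 1e6:.1f} million tps")
```

With these sample points the projection lands near 17.9 million tps -- in the same neighborhood as the roughly 18 million figure the engineers reported, though again the inputs here are invented for illustration.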
Finally, engineers calculated the coefficient of determination (R2) to be 0.9999, meaning that 99.99% of the relationship between the two variables is accounted for by the regression equation -- in other words, a very strong statistical fit between the model and the measured data.
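The coefficient of determination can be computed the same way. The data below is again illustrative, with a little noise added to the CPU readings so the fit is strong but not perfect, as it would be with real measurements.

```python
# R-squared (coefficient of determination) for a least squares fit.
# Illustrative data only: CPU readings with slight measurement noise.

cpu = [5.7, 11.2, 16.7, 22.5, 28.0]   # % CPU utilization (slightly noisy)
tps = [1e6, 2e6, 3e6, 4e6, 5e6]       # transactions per second

n = len(cpu)
mean_x = sum(cpu) / n
mean_y = sum(tps) / n

# Least squares slope and intercept via the deviation form
m = sum((x - mean_x) * (y - mean_y) for x, y in zip(cpu, tps)) / \
    sum((x - mean_x) ** 2 for x in cpu)
b = mean_y - m * mean_x

# R^2 = 1 - (residual sum of squares / total sum of squares)
ss_res = sum((y - (m * x + b)) ** 2 for x, y in zip(cpu, tps))
ss_tot = sum((y - mean_y) ** 2 for y in tps)
r_squared = 1 - ss_res / ss_tot

print(f"R^2 = {r_squared:.4f}")   # a value near 1.0 indicates a strong linear fit
```

An R-squared this close to 1.0 is what justifies extrapolating beyond the measured load points; if the fit were poor, the projection to 100% CPU would not be trustworthy.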
While it might seem a bit daunting, everything we've described here can be done in Excel with a few simple functions, such as SLOPE(), INTERCEPT() and RSQ(). Once you set up an Excel template, it is easy to modify it for use with future projects. Then it is a piece of cake compared with finding an extra million dollars or so of test gear to borrow for your benchmarking project.
About the author: Kevin Tolly is president and CEO of The Tolly Group, an independent test lab and research firm that publishes a series of network testing strategies called the Tolly Common Test Plan. You can read more of The Tolly Group's reports on the Tolly Common Test Plan site.