Virtualization in the data center has quietly become one of the most widespread IT trends going. Though the concepts behind virtualization have been around for some time, only now are they taking root, foreshadowing a future in which acquiring or provisioning IT resources could be as easy as plucking fruit from a tree.
Traditionally, running a data center means having many standalone computers that perform dedicated functions for individual applications. But virtualization technology allows applications and servers to work in a non-dedicated manner, interacting as needed to get the job done, said Andrew Schroepfer, founder and president of Tier 1 Research in Minneapolis.
Schroepfer said there are essentially three types of virtualization technologies: networking, server and storage. On a network, virtualization can allow one or a few machines to handle encryption, load balancing, firewall protection and other appliance services, instead of using a number of dedicated computers for each task.
"With servers, it's a lot simpler," Schroepfer said. "Instead of configuring servers to a specific application, you would just say, 'Here's my application. I have this much traffic and it needs this much processing power,' and it would go access as much power -- or storage -- as it needs."
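The model Schroepfer describes -- declaring what an application needs rather than configuring specific servers for it -- can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual API; the `ServerPool` class and its method names are invented for the example.

```python
# Hypothetical sketch: a virtualized server pool that satisfies declarative
# resource requests instead of binding applications to dedicated machines.
# All names here are illustrative, not a real product's interface.

class ServerPool:
    def __init__(self, total_cpu, total_storage_gb):
        self.free_cpu = total_cpu
        self.free_storage_gb = total_storage_gb
        self.allocations = {}

    def provision(self, app, cpu, storage_gb):
        """Grant the app as much power or storage as it asks for, if available."""
        if cpu > self.free_cpu or storage_gb > self.free_storage_gb:
            return False  # pool exhausted; a real system might queue or scale out
        self.free_cpu -= cpu
        self.free_storage_gb -= storage_gb
        self.allocations[app] = (cpu, storage_gb)
        return True

    def release(self, app):
        """Return an app's resources to the shared pool."""
        cpu, storage_gb = self.allocations.pop(app)
        self.free_cpu += cpu
        self.free_storage_gb += storage_gb

# An application states its requirements; the pool does the rest.
pool = ServerPool(total_cpu=16, total_storage_gb=500)
pool.provision("crm", cpu=4, storage_gb=100)
pool.provision("web", cpu=8, storage_gb=50)
print(pool.free_cpu)  # 4
```

The point of the sketch is the inversion of responsibility: the application never names a server, it only states requirements, and the shared pool decides where capacity comes from.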
Efficiency and cost savings are at the heart of the technology's proliferation. IT services providers IBM Corp. and EDS Corp., financial services firms like Morgan Stanley and Goldman Sachs, and other data center operators like Comcast and State Farm Insurance are all implementing virtualization in some form.
Hosting and network services provider Savvis Communications Inc., which operates seven data centers on three continents, has been using virtualization for about three years. Greg Furst, vice president and general manager for hosting services at Savvis, said his company is using virtualization products from Inkra Networks Corp. to provide the same "five 9s" level of service to its network services customers as it does to customers whose systems reside in one of its data centers.
"We're looking to create an environment where our clients can get services on demand and pay for what they use," Furst said. "It enables us to deploy services -- things like firewall services, SSL and load balancing -- in a very cost-effective [and] fast way, so the provisioning time is short."
What's more, Furst said, because Savvis is spending less on managing its data center, it's been able to charge its customers less as well. "I can say that it has helped us win some business," he said.
Oddly enough, today's virtualization technology is similar to what application service providers (ASPs) and management service providers (MSPs) touted a few years ago. Those companies offered outsourced applications and services on a subscription basis using shared resources, but few found success. Most were eventually acquired or went out of business.
So why is virtualization a hit now? Schroepfer said the technology was just emerging then, and many of those companies were developing it in house. Rather than providing the best possible service, they became bogged down by development.
Today, companies like Inkra are churning out virtualization products en masse, which means that data center operators can buy off-the-shelf components and integrate them for a fraction of what it cost ASPs and MSPs to get started, Schroepfer said.
So does that mean virtualization will inevitably make the provisioning and use of IT services as easy as turning on a faucet? "It's foolish not to believe that's the goal, but it's just as foolish to believe you can have that soon," Schroepfer said.
He said that the management or business policy layer is still missing from the virtualization paradigm. That piece of the puzzle tells the data center systems which processes have priority over others, ensuring, for instance, that bandwidth for a mission-critical customer relationship management (CRM) application takes precedence over other, nonessential traffic. Without this layer, a data center operator is forced to build the framework for those rules on its own, often having to integrate them with existing management software, which is not always a smooth process.
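The missing policy layer Schroepfer describes amounts to a set of rules that rank workloads so that mission-critical traffic is served first. A minimal sketch, with invented application names and an assumed fixed bandwidth budget, might look like this:

```python
# Hypothetical sketch of a "business policy" layer: rules that rank traffic
# so mission-critical applications (e.g., CRM) get bandwidth before
# nonessential ones. Illustrative only -- not any vendor's actual product.

POLICY = {"crm": 1, "email": 2, "batch-reports": 3}  # lower number = higher priority

def schedule(requests, capacity_mbps):
    """Admit bandwidth requests in policy order until capacity runs out."""
    admitted = []
    for app, mbps in sorted(requests, key=lambda r: POLICY.get(r[0], 99)):
        if mbps <= capacity_mbps:
            admitted.append(app)
            capacity_mbps -= mbps
    return admitted

# CRM is admitted first even though batch-reports arrived earlier.
print(schedule([("batch-reports", 60), ("crm", 50), ("email", 30)], 100))
# ['crm', 'email']
```

Without a standard layer like this, each data center operator must encode such rules itself and wire them into whatever management software it already runs, which is the integration pain the article describes.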
"I don't think one company will be able to dominate all the pieces of the infrastructure," said Inkra's co-founder and vice president of strategy, Dave Roberts. He said his company and others in the space are looking to XML and Web services to provide the common denominator, though it will take time before those efforts are realized.
Schroepfer also said that most service providers have not yet been able to pass cost savings from virtualization along to customers. He said most are focusing on maintaining and improving their profit margins and, at best, putting themselves in position to share cost savings with customers down the road.
Still, virtualization's potential is undeniable. Schroepfer predicts that, in three or four years, most Global 2000 companies will have upgraded to blade servers in order to take advantage of virtualization technology. Though it remains to be seen whether mission-critical "on tap" IT resources will ever materialize, virtualization is making its mark.
"We may not have 'water faucet' IT computing," said Schroepfer, "but we'll have a radically improved data center with less manual effort. It'll allow either for fewer people or more productive people but, either way, it'll create a more productive and efficient data center."