Sunday, March 16, 2008

The many faces of virtualization

Virtualization is arguably one of the most abused words in the tech industry, and has been since at least 2006. It means different things to different vendors, depending on what is appended to it.

Server and operating system (OS) vendors like IBM, HP and VMware define it as the partitioning of a server's resources, processor and OS included, into multiple isolated virtual environments. Why would you want to do this? Gartner claims that, on average, we use less than 10 percent of a physical server's computing resources. So why buy another server if you are underutilizing an existing one?
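As one concrete illustration, here is a minimal sketch of carving an isolated virtual environment out of a physical host using the open-source libvirt Python bindings. The connection URI, VM name and sizing are illustrative assumptions, not anything prescribed by the vendors above:

```python
# Minimal sketch: boot one isolated virtual machine on a physical KVM
# host via libvirt. The URI, VM name and sizing are assumptions.
import libvirt

# A bare-bones domain description: one virtual CPU and 512 MiB of RAM
# carved out of the physical server's resources. A production domain
# would also declare disks, networking and other devices.
DOMAIN_XML = """
<domain type='kvm'>
  <name>demo-vm</name>
  <memory unit='MiB'>512</memory>
  <vcpu>1</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
</domain>
"""

conn = libvirt.open("qemu:///system")  # connect to the local hypervisor
dom = conn.createXML(DOMAIN_XML, 0)    # create and start a transient VM
print("Started virtual environment:", dom.name())
conn.close()
```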

Then there is storage virtualization, with vendors like EMC, IBM and HP battling it out for the top three spots worldwide. Storage virtualization pools all of your storage across the enterprise into a single logical entity, the idea being to dynamically allocate storage to applications according to their needs. The physical storage can be disks, tapes or optical devices; the application only knows it has storage space allotted to it. Nothing else matters.
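The pooling idea can be pictured with a toy model: several physical devices contribute capacity to one logical pool, and applications draw space from the pool without ever knowing which device backs it. This is a hypothetical sketch; the class, device and application names are invented for illustration:

```python
# Toy model of storage virtualization: disks, tapes and optical devices
# contribute capacity to a single logical pool, and applications are
# allotted space without knowing which device backs it.

class StoragePool:
    def __init__(self, devices):
        # devices: mapping of physical device name -> capacity in GB
        self.capacity = sum(devices.values())
        self.allocated = {}

    def allocate(self, app, size_gb):
        """Dynamically allot space to an application by need."""
        used = sum(self.allocated.values())
        if used + size_gb > self.capacity:
            raise RuntimeError("logical pool exhausted")
        self.allocated[app] = self.allocated.get(app, 0) + size_gb

pool = StoragePool({"disk-array": 500, "tape-library": 1000, "optical": 100})
pool.allocate("billing-db", 200)   # the app sees only its allotment
pool.allocate("mail-server", 50)
print(sum(pool.allocated.values()), "GB allotted of", pool.capacity, "GB pooled")
```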

Applications are written to use resources available from the OS. When we install an application, it often tweaks the OS to suit its needs, and there are instances when conflicts arise as different applications vie for the same OS resources. Microsoft, XenSource (recently purchased by Citrix) and Altiris (a Symantec subsidiary) skirt around this by developing technology that effectively isolates the application from the OS. They call this application virtualization.
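One way to picture that isolation: intercept an application's writes to shared OS locations and redirect them into a private per-application area, so two applications can no longer collide over the same resource. This is a deliberately simplified, hypothetical sketch; the real products also virtualize registry settings, libraries and services:

```python
# Simplified sketch of application virtualization: each application's
# writes to shared OS paths are redirected into its own sandbox, so
# applications no longer conflict over the same file.
import os

class AppSandbox:
    def __init__(self, app_name, root="/tmp/appvirt"):
        self.base = os.path.join(root, app_name)

    def redirect(self, path):
        """Map a shared OS path to this app's private copy."""
        return os.path.join(self.base, path.lstrip("/"))

    def write(self, path, data):
        private = self.redirect(path)
        os.makedirs(os.path.dirname(private), exist_ok=True)
        with open(private, "w") as f:
            f.write(data)
        return private

# Both apps "tweak" the same config file, yet never actually collide.
for app in ("app-a", "app-b"):
    print(AppSandbox(app).write("/etc/shared.conf", "settings for " + app))
```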

Brocade and Cisco dominate the technology we know as network virtualization, which applies the same principles as storage virtualization but involves the management and pooling of all resources connected to the network, including storage, servers and services.

If you are confused, you are not alone. But don't blame it on technology. IT merely mirrors the complexity with which businesses operate today. That, however, does not give vendors a blanket excuse to create solutions that only techies can understand. If anything, vendors must deliver solutions that are easy to install, operate, manage and maintain. The underlying complexity should be masked, and exposed only as and when it is absolutely critical.

So how did we get to the point where we are today?

Too many servers; too many users demanding access to data across platforms of all shapes and sizes; and too much data sitting in different locations, creating the potential for data loss or security breaches, as well as a swelling headache for IT as the custodian of a company's information.

"Users need instant access to real-time data. They want to be able to securely share that information with partners, customers and outsourcers. They also want their applications kept up-to-date without impacting their productivity," says Dennis Rose, vice president, Citrix Systems Asia Pacific.

Meanwhile, the IT organization isn't growing as fast as the business although its responsibilities to the user, to management, and to the company are growing exponentially.

Consolidation initiatives emerged as companies tried to rein in the uncontrolled proliferation of IT infrastructure that comes with opening offices and operations in new markets.

"One way of managing these growing resources is to pool them together via virtualization and manage the pool instead of the individual entities. This strategy has the knock-on benefit of requiring less resources to properly manage the infrastructure," says Rene Aerdts, EDS Fellow.

Virtualization payback question
Virtualization is not free (no vendor in his right mind will give you something without strings attached). It does entail financial investment to bring in virtualization technology, including the hardware, software and services needed to bring it all together.

"There are immediate cost saving items that can be realized with server consolidation and virtualization," says Johann Muller, Senior Director for Global Systems Practice at Sun Microsystems. "These include reduction in power, cooling, software licensing and system maintenance costs. There are longer term benefits such as reductions in system administration cost, faster time to market and ease of management that are harder to quantify."

The larger the scale of virtualization, the more compounded these savings become. "We see in many cases, customers that have performed consolidation and virtualization experience savings in the vicinity of 20 to 50 percent of their normalized IT spend. These savings span a three- to five-year period and often amount to tens of millions of dollars," adds Muller.
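To make the scale concrete, here is a back-of-the-envelope check of Muller's figures. The 20 to 50 percent range and the three- to five-year horizon are his; the $20 million normalized annual IT spend is an assumed input for illustration only:

```python
# Back-of-the-envelope check of Muller's savings figures. The annual IT
# spend is an assumed input; the percentage and year ranges are his.
annual_it_spend = 20_000_000  # assumed normalized IT spend, USD per year

for rate, years in ((0.20, 3), (0.50, 5)):
    savings = annual_it_spend * rate * years
    print(f"{rate:.0%} over {years} years: ${savings:,.0f}")

# Prints $12,000,000 to $50,000,000 -- i.e. "tens of millions of dollars".
```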

Rose notes that application virtualization also plays a central role in providing many of the operational benefits that help ensure the longevity of a company's IT investments.

"More importantly, virtualization allows IT departments to take the costs and resources that would be used to perform menial maintenance tasks to focus instead on adding real value to the business strategy of a company," said Rose.

United Overseas Bank Malaysia now delivers compute-intensive applications over low-bandwidth links, saving more than $700,000 annually by eliminating the need to upgrade 400 PCs and by cutting bandwidth costs.

Virtualization showstoppers
Cost is still cost. If you want to consolidate and virtualize your infrastructure, you will have to invest time and money, and be willing to change the way you do things. The latter is the hardest to sell to users who are happy with the way things are.

According to Muller, "organizations are constantly seeking methods to create a consolidated infrastructure without sacrificing the manageability and security of applications. Variations in application tuning, patch level, operating system revision, and security requirements often prevent consolidation projects from moving forward."

He also concedes that the lack of deep understanding of how applications behave and the inability to present a financial justification for virtualization are common stumbling blocks towards adoption.

"From our experience, inertia and lack of knowledge are the primary stumbling blocks toward the adoption of any technology, including virtualization. Legacy infrastructure and extended experience with the same technology may hold some companies back from looking for better ways to optimize their IT environment," laments Rose.

Aerdts warns that architectural limitations can make virtualization impossible to deploy in high-end computing environments. "At the same time, vendor support for applications may be a showstopper if vendors are not willing, or are unable, to provide full support for their product in a virtualized environment," says Aerdts.

The way forward
We've been told about the good news and the bad news. What else do we need to know to make an informed decision?

Before you make the final decision to virtualize your IT infrastructure, it makes sense to take a step back and assess what you want to achieve. Business leaders need to fully understand the pros and cons of the technology, how it impacts current business practice, and what it will mean for operations moving forward.

"It is unlikely that one technology alone will meet your needs completely, so it is best to take a holistic view towards virtualization. Especially with the latter, it is important to note that not all applications are suited for virtualization. Thus mapping out specific goals and needs will help ensure that the organization gets the most out of deploying virtualization," advices Muller.

He also warns that unwavering executive-level support must be in place from the outset, before execution begins. "Too many virtualization projects fail because the individual business owners of the servers will not support virtualization implementation," he adds.

The next step will be to assess the business areas that will be affected and give each area a weight based on financial and operational impact. This will help you prioritize deployment.
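A minimal sketch of that prioritization step, with invented business areas, scores and weights; the scoring model itself is an assumption, not a prescribed method:

```python
# Sketch of prioritizing a virtualization rollout: weight each affected
# business area by financial and operational impact, then sort. The
# areas, 1-10 scores and the 60/40 weighting are all invented.
areas = {
    # area: (financial impact, operational impact)
    "test-and-dev": (4, 3),
    "email":        (5, 7),
    "core-banking": (9, 9),
}

W_FIN, W_OPS = 0.6, 0.4  # assumed relative weights of the two impacts

def score(impacts):
    fin, ops = impacts
    return W_FIN * fin + W_OPS * ops

# Lowest-impact areas first, matching the staged approach described below.
for area, impacts in sorted(areas.items(), key=lambda kv: score(kv[1])):
    print(f"{area:14} priority score {score(impacts):.1f}")
```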

As with all things new, it is wise to implement in stages. "Many corporations start with the test and development environment to 'get their feet wet' and get used to a virtualized environment. Usually this step is followed by a small, non-mission critical, system (or environment or application) that is virtualized. And finally, full-scale production roll-out," adds Aerdts.
