There’s a battle brewing in the agency IT world, and for once, it has nothing to do with cloud computing — at least on the surface.
This fight is over hypervisors – a key component of any virtualization strategy and the foundation of a successful private or hybrid cloud implementation. Specific battle lines are being drawn around the various hypervisor “flavors” and which, if any, federal IT teams should standardize on.
From open source to proprietary, and from VMware to Microsoft to a host of newer players, the choices facing agency IT decision makers in the war for their virtualization dollars are seemingly endless. But virtualization decisions need to be made deliberately, given the price tag and the impact the chosen hypervisor will have on an agency’s future cloud computing needs.
So is there a “right” answer when it comes to hypervisors in the agency world?
As with any other technology decision, there is no silver-bullet solution when it comes to virtualization. With a sound strategy and a well-thought-out roadmap, however, agencies can determine the right path for their operations. Answering the following questions can help:
- Open or Closed? The first virtualization question on every agency CIO’s mind is likely something along the lines of “what about open source?” Open source is indeed making headway in the virtualization world – open hypervisors like Xen and KVM have strong backing from industry players like Citrix, Red Hat and IBM – but these offerings aren’t nearly as mature as their proprietary counterparts. Open source hypervisors are the way to go if an agency is planning to build an OpenStack cloud, as NASA has done. Otherwise, the complexity of the current open source offerings can simply be too much for many agencies to take on right now, especially if they will be operating in a heterogeneous environment. That same flexibility, however, makes an open hypervisor a strong fit for a virtualization strategy that requires heavy customization – if an agency IT team is ready to take on that work, open source is a good choice. Otherwise, implementing a more established hypervisor is the better route.
- Which Giant to Back? When it comes to mature hypervisors, the only two notable players are VMware (the granddaddy of virtualization) and Microsoft’s Hyper-V. VMware is the decided incumbent – its solution is battle-proven and supports dozens, if not hundreds, of virtualization installations across government. The current version of Hyper-V, while not truly comparable to VMware, has one excellent feature in its favor: the technology is free with Windows Server. That free price tag hits hard against VMware, which holds a near-monopoly on licensing among its end users and can effectively cripple virtualization implementations if it elects to modify the pricing model (which it did for a short time last year). Hyper-V, however, lacks VMware’s virtualization “ecosystem” of affiliated applications, like Virtual Center and vSphere, which truly form the foundation of an effective virtualization strategy. While it seems that VMware should simply overwhelm Hyper-V, it’s not that simple. A new version of Hyper-V (3.0) is pending, and it should compete on a much more equal footing with VMware while keeping the free price tag. And while operating a VMware-only shop might seem like a good idea, in truth lock-in of any kind, in virtualization or otherwise, is never a good idea. What if VMware jacks up licensing fees or changes the pricing model (again)? Agencies cannot afford to be held hostage by virtualization, especially in today’s budget climate. Instead, every agency looking at virtualization should adopt (or plan to adopt) a heterogeneous approach to hypervisors – VMware’s proven solutions can carry mission-critical workloads, while Hyper-V, in its current state, can handle lower-priority workloads, helping to offset costs considerably. Heterogeneous technology approaches, however, bring additional challenges, most notably…
- How to Manage? Managing virtual infrastructure is vital to keeping cloud plans on track and ensuring that critical applications and networks stay online; this, however, is far easier said than done. Private and hybrid clouds, like those pursued in the agency world, carry a heavy management burden, especially when users provision IT resources themselves – without proper management tools, users will quickly consume all available resources, crippling the cloud and, with it, agency operations. While a heterogeneous virtualization strategy is good for the bottom line, it makes managing a federal IT deployment that much harder. Most virtualization providers offer tools that support only their own technologies; if agency IT personnel go the vendor route for management, that means watching over two separate management consoles and adding complexity to an already difficult workload. To avoid this problem, most federal IT teams will look for a third-party option that can watch over the entire virtualization deployment in a vendor-agnostic manner. While great on paper, these tools can, again, be overly complex, or fail to work with management technologies already deployed, like those used to monitor existing network, storage or physical infrastructure. Agency CIOs need management solutions that are easy to use, easy to deploy and do not require months of additional training. These tools should fit seamlessly into existing operations without causing an integration nightmare, and they must be able to monitor the full virtualization mix without excessive tweaking. Effective virtualization management tools should support not only high-level performance management but also drill-down to the individual virtual machine. They must also watch over capacity and sprawl, keeping the number of “zombie” and “orphan” virtual machines to a minimum.
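Sprawl control is one place where even simple tooling helps. The sketch below is a minimal, hypothetical illustration of the zombie/orphan distinction described above – the `VM` record, the 30-day retention window, and the `classify` helper are all assumptions for the example, not any vendor’s actual API; in practice this data would come from a hypervisor management API or a CMDB export:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical inventory record; real data would come from a
# hypervisor management API or a CMDB export.
@dataclass
class VM:
    name: str
    powered_on: bool
    last_seen_active: datetime   # last time the VM did real work
    owner: Optional[str]         # None means nobody claims it

ZOMBIE_WINDOW = timedelta(days=30)  # assumed retention policy

def classify(vm: VM, now: datetime) -> str:
    """Label a VM as 'orphan', 'zombie', or 'ok'."""
    if vm.owner is None:
        return "orphan"   # no owner on record: reclaim or reassign
    if not vm.powered_on and now - vm.last_seen_active > ZOMBIE_WINDOW:
        return "zombie"   # long powered off: candidate for deletion
    return "ok"

if __name__ == "__main__":
    now = datetime(2012, 6, 1)
    inventory = [
        VM("web01", True, now, "ops"),
        VM("test-old", False, now - timedelta(days=90), "dev"),
        VM("unknown-vm", True, now, None),
    ]
    for vm in inventory:
        print(f"{vm.name}: {classify(vm, now)}")
```

Running the example flags `test-old` as a zombie and `unknown-vm` as an orphan; a real management tool applies the same kind of policy continuously across every hypervisor in the mix.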
To the Cloud and Beyond
Choosing the right hypervisor is not just about an agency’s virtualization strategy; it’s about building the platform for a cloud computing strategy. The technology selected will ripple across the entire agency, if not the government, so the choice is neither a coin flip nor an off-the-cuff decision.
When the dust settles and a solution is chosen, that isn’t the end of the plan; it’s the first step toward a more efficient and effective government IT strategy. Choose wisely.
This article was contributed by SolarWinds, an IT management software provider.