In 2010, as part of his 25-point plan, former federal CIO Vivek Kundra called for the reduction of 800 of the federal government’s 2,100 data centers by 2015. But one key issue to consider with data center consolidation is data center innovation. When done correctly, data center innovation can reap operational and financial rewards.

And we’re not talking five-year safe-harbor plans either; we’re talking mere months. Data centers aren’t like cars. You don’t keep the same model until the engine drops out. This year’s model will be faster and more energy-efficient than last year’s. That’s the basis of innovation.

Commencing a plan of radical innovation in the data center requires a few steps. You simply cannot get from Point A to Point B in one bold leap. This isn’t necessarily because the cost of purchasing servers dents the IT budget, but because of the cost of powering up and cooling down those servers. With that in mind, what are the logical steps to innovating your data center?

You start by taking a hardware inventory, because you can’t know what you need until you know what you already have. Every CIO needs to know what IT they have under their control. One analogy is, how could you secure and manage your own personal bank accounts if you weren’t certain of how many accounts you had?

Former Federal CIO Vivek Kundra started with a data center count, which sounds like a somewhat simplistic approach until you realize how many structures — some not much larger than a broom closet — counted as data centers. He found the government had some 2,100 of them and decided that was 800 too many.

Once you have the list, consider the following next steps:

1. Server countdown: Commence ranking and rating the hardware that lies within. How much of it is bespoke, so-called “heavy iron”? What is ancient? What is powered up but apparently not passing packets – and if so, did it simply get overlooked? And let’s not forget those mystery racked servers that aren’t even powered up – did they ever get turned on, or is that a two-year-old fully populated server that has just sat there looking shiny yet forlorn?

2. Age: How old is the equipment in the racks? Thanks to Moore’s Law, as each generation of servers is launched, up goes the processing power and down go the power requirements. By performing some independent modeling on your existing hardware, you can compare processing power against energy requirements, factor in the increased A/C load, and soon discover the financial error in shoring up server equipment that is a few generations old. When running the numbers on the energy savings from new multi-socketed Intel Xeon-based servers, for example, the return on investment can be measured in just months.

3. Architecture: How many architectures are you running and supporting, and how much time and money is being spent getting them to talk to each other? Of course, from our admittedly biased view, the solution to this dilemma is x86: a platform that covers all your needs, from high-performance computing through blades, and even takes into account the desktop and mobile client.

4. Data center consolidation through virtualization: Another advantage of having the latest generation of processors in your reduced rack space is their innate ability to support a fully virtualized environment. With the average federal server idling along at 22 percent utilization, it behooves the innovator to securely virtualize a number of workloads on the same system. It works, and it’s definitely where the smart IT money is investing right now.

5. The cloud: If ever a term should be voted buzzword of the decade, it would be cloud computing. Not a panacea, but something of a new religion for those looking to offload services that aren’t millisecond-response-time dependent, aren’t print servers, and don’t involve ancient legacy systems that you’re too scared to even think about touching.

Everyone has an interpretation of what cloud computing is, and I like this explanation best: “Cloud computing is a utility service – giving you access to technology resources managed by experts and available on-demand.”

Now you need to decide which of those services is best suited to putting onto the cloud. At present, email and collaboration services seem to be the easiest to unbuckle from the mother ship.
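The arithmetic behind steps 2 and 4 above can be sketched in a few lines. This is a back-of-the-envelope model only; the server counts, wattages, electricity rate, and cooling overhead below are illustrative assumptions, not Intel or agency figures.

```python
# Hypothetical model: compare the annual power-and-cooling bill of an
# aging rack against a smaller, newer, virtualized one.

def annual_energy_cost(servers, watts_per_server, dollars_per_kwh=0.10,
                       cooling_overhead=0.8):
    """Annual cost to power and cool a group of servers.

    cooling_overhead approximates the extra A/C load as a fraction of
    IT power (equivalent to a PUE of 1.8 in this sketch).
    """
    hours_per_year = 24 * 365
    it_kwh = servers * watts_per_server * hours_per_year / 1000.0
    total_kwh = it_kwh * (1 + cooling_overhead)
    return total_kwh * dollars_per_kwh

def consolidation_ratio(old_utilization=0.22, target_utilization=0.70):
    """How many lightly loaded servers one virtualized host can absorb."""
    return target_utilization / old_utilization

# 100 legacy servers idling at 22 percent utilization...
old_cost = annual_energy_cost(servers=100, watts_per_server=400)

# ...collapse onto far fewer virtualized hosts run at 70 percent.
new_servers = round(100 / consolidation_ratio())
new_cost = annual_energy_cost(servers=new_servers, watts_per_server=350)

print(f"Consolidate 100 servers onto {new_servers}")
print(f"Annual power + cooling: ${old_cost:,.0f} -> ${new_cost:,.0f}")
print(f"Yearly savings: ${old_cost - new_cost:,.0f}")
```

Even with deliberately conservative assumptions, the energy line alone shrinks by a factor of three, which is why the payback on a refresh can be measured in months rather than years.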

Less is More
So now you have the map of your IT “battlefield.” Please don’t be fooled into thinking that the more hardware you have online, the more potent your powers as CIO. In an age of open, efficient government under budget constraints, the aim is to do more with less, and technology can help. Newer truly is better when it comes to data centers.

Out with the Old
How many bodies routinely wander around your current data center armed with patch cables and laptops? Intel recently walked the CIO of a large agency through one of our recently refreshed Xeon-based data centers. The CIO was surprised to learn that we had just one security person physically at the location and that the server rooms ran hotter (requiring less cooling) than the agency’s. Why and how, we were asked. Well, cooling costs money, and if you engineer your equipment to be securely accessed remotely, then you don’t need to cool the rooms for the comfort of humans.

We share our best practices readily with anyone looking to update and innovate. I don’t want to sound too much like a marketing wonk, but because of Intel’s commitment to continual innovation, standardization on Intel architecture translates to significant performance improvements at regular intervals. We refresh our own silicon because that refresh translates to faster processing and lower energy bills. Simple economics.

The Status Quo
At a recent reception in Washington, D.C., I made the opening remarks to the majority of the Federal CIO Council, who were there to wish Vivek all the best with his Harvard Fellowship. I couldn’t help but raise the issue that time and time again, the reason I encounter for why the government won’t do something isn’t cost, isn’t complexity, and isn’t even policy-related. Simply put, it’s change – a word that hampers many agencies in adapting to new practices.

For example, since the President signed into law the 2010 Telework Improvement Act, you would think that all agencies would stop buying desktops and instead buy laptops in readiness for the new mobile work paradigm, but sadly that isn’t the case. The United States Patent and Trademark Office and U.S. Department of Agriculture are two that don’t need to be asked, as they already received that particular memo, but for some others it is the fear of lost power, lack of control and yes, just fear of change itself.

The private sector can’t afford that fear; we adapt and innovate or we wither on the vine, well aware that our competitors aren’t treading water.

Drivers of Innovation and Change
Vivek Kundra clearly wasn’t afraid to ruffle feathers. It’s true that you can’t make an omelet without breaking a few eggs or apparently upsetting a few of the old guard who’d been doing things the same way since the IBM golfball typewriter was the new kid on the block.

In the last two years, the administration managed to slash $3 billion worth of overrun projects, and I have every confidence that the new Federal CIO, Steven VanRoekel, will continue the good fight. He has already gone on record as stating that “The right investments in IT spending can yield huge gains in productivity,” so I know he gets it. It isn’t about spending; it’s about spending smart. Newer technology works faster and more efficiently; that is the yardstick here.

Security Innovation Begins with Prevention
In the private sector we share a number of the same drivers as the public sector, including wanting to share some documents with a wide audience while securing others from prying eyes. The greater the number of data centers and live end-points you present to the outside world, the more IP addresses become potentially vulnerable. We all realize that keeping out unwanted guests is part of the cost of having an IT infrastructure. And for some, that can be an increasingly costly undertaking, playing cat and mouse but for stakes somewhat larger than cheese.

There are different approaches, and if you operate from a reactive state, you’ll always be behind the intruder, following a trail of forensic breadcrumbs and trying to discover who they were and what they took. If, however, you operate from a proactive stance, then you are less likely to discover problems after the fact.

Staying ahead of the puck is the goal, and upcoming advances that incorporate security features below the traditional software layer will provide an optimized state of security never before achieved.

My first paying job was in retail, where I quickly learned that shoplifters were a fact of life. In such instances, prevention was much more effective and efficient than catching them after the fact – risking assault, having to call the police, providing statements, attending court, and so on.

The problem is more profound when it comes to data theft. Prevention remains the most effective medicine.

Nigel Ballard is director of federal marketing, Intel Americas.