Steven VanRoekel


Federal Chief Information Officer Steven VanRoekel described his path forward for federal IT in a policy speech in Silicon Valley this October and again in the draft Federal IT Shared Services Strategy released just this month. He articulated a “Shared First” paradigm that will lead agencies to root out waste and duplication by sharing IT services, infrastructure, procurement vehicles, and best practices.
_________________________________________________________

This article originally appeared on GSA’s Great Government Through Technology blog.
_________________________________________________________
Keep reading →


This is one in a series of articles highlighting Breaking Gov’s best stories of the past year. As we reflected on our 2011 coverage of innovation, technology and management across federal agencies and the workforce, this was among the stories that stood out for delivering key insight into the top issues facing today’s government community.

From his first public appearance to his recent commitment to close data centers, Steven VanRoekel has been busy since taking over as federal CIO from Vivek Kundra in August. Keep reading →

The President has been clear that every federal dollar spent must generate a positive return for the American people and that, as we tackle our long-term fiscal challenges, we must root out waste in government. One area where we know we can do better is the thousands of duplicative data centers that have sprung up over the last decade.
___________________________________________________
This article originally appeared as a White House blog post.
___________________________________________________
These data centers, some as big as a football field and others as small as a closet, represent billions in wasted capital that could be better used to improve critical services for American taxpayers. By closing data centers, agencies are on track to save taxpayers billions of dollars by cutting spending on wasteful, underutilized hardware, software and operations, while also enhancing our cybersecurity, shrinking our energy and real estate footprints, and taking advantage of transformational technologies like cloud computing to make government work better for our nation’s families. Keep reading →


Federal CIO Steve VanRoekel, speaking publicly for the first time to the government IT community since being appointed last August, laid out a redirected vision for how the federal government needs to move forward using information technology, and highlighted his primary imperatives heading into 2012 that call for making “little things big and big things little.”

VanRoekel outlined several imperatives Friday for his office in the coming year that build on, and to some extent recast, the policies of his predecessor, Vivek Kundra. Specifically, he stressed his desire to: Keep reading →

If you have been to a recent Washington Capitals hockey game when the opponent scored a goal, you know the crowd routinely shouts out “Who cares!”

Last week, Federal CIO Steven VanRoekel released the long-awaited OMB plan for the Federal Risk and Authorization Management Program, or FedRAMP, which reminds me to be thankful for pronounceable acronyms. The purpose of FedRAMP, per the implementing OMB memorandum, is to “provide a cost-effective, risk-based approach for the adoption and use of cloud services.” Keep reading →

As technologists in the private sector know, when money is tight, it’s often technology that enables us to do more with less. In a lean fiscal environment, organizations look for ways to take existing resources and use the latest advances and tools to do the seemingly impossible: improve and expand services while cutting costs. It is no different with the Federal Government. To deliver on the President’s commitment to an effective and efficient government, we are leveraging the latest advances in technology to save taxpayer dollars and cut waste. We are working aggressively to meet the challenge of doing more with less, and we are seeing real results.
______________________________________________
This article was originally published as a blog post Dec. 8 on the White House website.
________________________________________________
By holding underperforming IT projects accountable, we are identifying efficiencies and eliminating waste to deliver better technology solutions sooner and at lower cost. This year we took our rigorous TechStat accountability sessions and open sourced the model, giving agencies the tools to turn around or terminate failing projects at the agency level. As a result, agencies identified nearly $1 billion in efficiencies, bringing the grand total of TechStat efficiencies to $4 billion in less than two years. You can read more about that in the TechStat Report published today.

Having the right people matters too. To ensure we have the experienced and talented managers we need to oversee these large, complex IT investments and maximize the return on taxpayer dollars at every step in the process, we created a new role for IT program managers with more rigorous requirements. We also launched the Presidential Technology Fellows Program this fall to attract new talent to the federal IT workforce by reducing barriers to entry for young IT professionals. Keep reading →


Data.gov evangelist Jeanne Holm emailed and tweeted today: “I wanted you to be among the first to hear about an open source release for an Open Government Platform” that, among other things, shares Data.gov with India.


The action is part of the recently launched U.S. National Action Plan on Open Government, announced by President Obama. It represents another step under the U.S.-India Strategic Dialogue to produce “Data.gov-in-a-Box,” an open source version of the U.S. Data.gov data portal and India’s India.gov.in document portal. In fact, according to the Data.gov site, 28 countries have now adopted open data sites to share information.

The U.S. and India are working together to produce an open source version available for implementation by countries globally, encouraging governments around the world to stand up open data sites that promote transparency, improve citizen engagement, and engage application developers in continuously improving these efforts.

Technical teams from the governments of the U.S. and India have been working together since August of this year, with a planned launch in early 2012 of a complete open source product, now called the Open Government Platform (OGPL) to reflect its broad scope.

It’s less clear what is new here. You can find out more about the evolution of this project from U.S. CIO Steven VanRoekel and Chief Technology Officer Aneesh Chopra in a joint announcement they made in a White House blog post today. More about the Open Government Platform is in its repository on GitHub, where you’ll find a growing set of open source, open government platform code that allows any city, organization, or government to create an open data site.

In reviewing the repository, I essentially found only two instructions: create a new database in MySQL and log in using the default Drupal administration username and password. I also went to India.gov.in and found something that had been copyrighted in 2005. Is there something new I am missing here?
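For readers wondering what that one setup step actually involves, here is a minimal sketch, in Python with the pymysql package, of provisioning the MySQL database and account that a Drupal-based site such as OGPL would be pointed at. The host, database name, account name and passwords below are illustrative assumptions on my part, not values taken from the OGPL repository.

# Minimal sketch: provision the MySQL database and account that a
# Drupal-based open data site (such as OGPL) would use. Every name and
# password below is an illustrative assumption, not an OGPL default.
import pymysql

# Connect as an administrative MySQL user (assumed credentials).
conn = pymysql.connect(host="localhost", user="root", password="admin-password")
try:
    with conn.cursor() as cur:
        # The database the Drupal installer will be pointed at.
        cur.execute("CREATE DATABASE IF NOT EXISTS ogpl_site "
                    "CHARACTER SET utf8 COLLATE utf8_general_ci")
        # A dedicated account for the site, rather than reusing root.
        cur.execute("CREATE USER 'ogpl'@'localhost' IDENTIFIED BY 'choose-a-strong-password'")
        cur.execute("GRANT ALL PRIVILEGES ON ogpl_site.* TO 'ogpl'@'localhost'")
        cur.execute("FLUSH PRIVILEGES")
    conn.commit()
finally:
    conn.close()

That is standard Drupal provisioning rather than anything specific to open government data, which may be part of why the repository’s instructions are so brief.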

The announcement goes on to say: The first module released is the Data Management System, which provides the tools and capabilities for an automated process for publishing data in the Open Government Platform, an open source product designed to help governments around the world stand up their own open government data sites.

Our next planned release will be from India and related to the web site for the Open Government Platform. The U.S. and India will be providing additional modules in the future, and developers are encouraged to participate, provide feedback, and create new modules and capabilities! The teams working on this project are the National Informatics Centre in India and Data.gov in the U.S.

I asked Jeanne for an interview to get more details, and she responded: “The best place to get an interview is through the White House Media Office. You are welcome to contact them directly at media@omb.eop.gov and they are ready to respond!” I did, and heard back that there were no officials to interview. When I asked where I could see what this would look like, she said: “The code release is on GitHub, which is where we and others can provide additional modules as they come up.” So I am left wondering: where is the platform, and what is the real rationale for doing this?

I know that Data.gov has experimented with several platforms (as I have written previously), trying to evolve from catalog to repository to platform so users can actually find and access data and build apps, but the latest experiment with Socrata has been criticized as being neither an open procurement nor an open source platform. So maybe that is what this is all about: trying to straighten that out while appearing to provide a service to the world.

I just wrote about how Data.gov.uk provides a real service to the UK and how Data.gov would do well to emulate it as a concrete service to the US taxpayers who have been paying for, expecting, and even demanding such a service for several years now.

It is hard not to think that when our leaders make mistakes (which we all do), they are prone to try things that are even bigger mistakes rather than admit them and ask for help, which in this case is right next door in the UK!

This week the National Institute of Standards and Technology (NIST) marked another milestone in the US Federal Government’s march to the cloud with Cloud Computing Workshop IV. Held at the NIST Gaithersburg campus November 1-3, the workshop gave attendees the pleasure of hearing Federal CIO Steven VanRoekel as the first day’s keynote speaker.

Mr. VanRoekel highlighted the great partnership that has been established between government and industry around cloud computing. He also reaffirmed the administration’s support for cloud computing, praising NIST for its effectiveness in fulfilling a unique leadership role. Keep reading →

The National Institute of Standards and Technology launched Phase 2 of its efforts to guide the adoption of cloud computing in the federal government with the release this week of the first two volumes of the U.S. Government Cloud Computing Technology Roadmap, Release 1.0.

Release 1.0 of the roadmap is “designed to support the secure and effective adoption of the cloud computing model by federal agencies to reduce costs and improve services,” according to the authors of the NIST document. Keep reading →

In 2010, as part of his 25-point plan, former federal CIO Vivek Kundra called for the reduction of 800 of the federal government’s 2,100 data centers by 2015. But one key issue to consider with data center consolidation is data center innovation. When done correctly, data center innovation can reap operational and financial rewards.

And we’re not talking five-year safe harbor plans either; we’re talking mere months. Data centers aren’t like cars. You don’t keep the same model until the engine drops out. This year’s model will be faster and more energy efficient than last year’s. That’s the basis of innovation. Keep reading →
