The General Services Administration has done the equivalent of expanding from a busy brick-and-mortar bookstore to a burgeoning e-reader business – a mini-Amazon that has already saved the government tens of thousands of dollars.

Until recently, the public used the mail to request printed documents from the GSA’s distribution site in Pueblo, Colo., where the government’s printed publications are stored. The site continues to operate, handling millions of print requests. Keep reading →

The American Council for Technology-Industry Advisory Council released its list of top 30 government IT programs and projects for 2012.

The finalists in ACT-IAC’s 11th annual Excellence.gov Awards Program were selected by a panel of more than 50 judges representing senior government and industry IT organizations. Keep reading →

The names of this year’s 100 most influential executives in the government IT community were released by Federal Computer Week magazine this morning.

The Federal 100 Awards recognize government and industry leaders who have played pivotal roles in the federal government IT community – and who “have made a difference in the way technology has transformed their agency or accelerated their agency’s mission.” Keep reading →

While the news that Aneesh Chopra is stepping down from his White House post as chief technology officer may have earned the most chatter on government IT blogs this week, the bigger buzz behind the scenes was the controversy over Google’s new privacy policies and what it would mean for government employees.

If the controversy began with Google’s announcement Jan. 24 that it plans to follow the activities of users as they move across Google’s various websites and platforms, it escalated quickly the following day with an article by Karen Evans and Jeff Gould. Keep reading →


The latest data table at Data.gov featuring an interactive snapshot of the government’s progress in consolidating data centers is nice to look at (“eye candy” as some might say). But there are two big problems with it:

  • First, it is not real data that can be copied directly into a spreadsheet and reused (try highlighting it and copying it to a spreadsheet – it fails); and
  • Second, when you do download the spreadsheet from the Socrata interface, it has to be reformatted to map the data because the “Data Center Location” column is not formatted properly. Among other issues, the latitude and longitude data need to be in separate columns, without accompanying text.
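As a sketch of the cleanup involved: assuming the combined location cell ends with the coordinates in parentheses (the real Socrata export may differ in detail), the latitude and longitude can be split out into their own numeric columns with pandas:

```python
import pandas as pd

# Hypothetical rows mimicking the "Data Center Location" column, where
# latitude/longitude are embedded in parentheses after the address text.
df = pd.DataFrame({
    "Data Center Location": [
        "Pueblo, CO (38.2544, -104.6091)",
        "Washington, DC (38.9072, -77.0369)",
    ]
})

# Pull the two coordinates out into separate numeric columns.
coords = df["Data Center Location"].str.extract(
    r"\((?P<latitude>-?\d+\.?\d*),\s*(?P<longitude>-?\d+\.?\d*)\)"
).astype(float)

df = df.join(coords)
print(df[["latitude", "longitude"]])
```

With clean numeric columns like these, the data maps directly without hand reformatting.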

Perhaps more importantly, the table still does not deliver a result that the public and decision makers can use without some additional work.

I have done a good deal of that work for two previous stories, with details published elsewhere. Over that time, the number of data centers listed in the table has grown:

  • 6/18/2011: “2010-2011” – 137 data centers (first story)
  • 7/21/2011: “2010-2012” – 373 data centers (second story)
  • 1/12/2012: “2010-2012” – 525 data centers (current data set)
I reviewed the current data table and it shows:

  • 525 rows in the table
  • 158 without locations altogether
  • 33 without longitude and latitude

In addition it shows:

  • 149 data centers closed between the initiative kickoff on 2/26/2010 and the 11/15/2011 report
  • 66 to be closed between 11/15/2011 and 12/31/2011
  • 310 to be closed between 1/1/2012 and 12/31/2012
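Closure counts like these can be reproduced from the downloaded spreadsheet by bucketing each center’s closing date into the reporting windows. A minimal sketch, using hypothetical dates standing in for a closing-date column (the real column name and values come from the spreadsheet):

```python
import pandas as pd

# Hypothetical closure dates; in practice these come from the
# downloaded Data.gov spreadsheet's closing-date column.
closings = pd.to_datetime(pd.Series([
    "2010-06-01", "2011-11-01",   # closed before the 11/15/2011 report
    "2011-12-15",                 # closing in the 11/15-12/31/2011 window
    "2012-03-01", "2012-10-31",   # planned for calendar 2012
]))

kickoff = pd.Timestamp("2010-02-26")
report = pd.Timestamp("2011-11-15")

# Tally each reporting window.
already_closed = ((closings >= kickoff) & (closings < report)).sum()
closing_2011 = ((closings >= report) & (closings <= "2011-12-31")).sum()
closing_2012 = ((closings >= "2012-01-01") & (closings <= "2012-12-31")).sum()
print(already_closed, closing_2011, closing_2012)
```

Running the same bucketing over the full 525-row table is how the window totals above can be checked against each new release of the data set.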

It appears that additional information about data centers continues to be released for the same or different years, but the data still suffers from two important gaps: missing locations and no cost-savings data.

There is a real disconnect between this table and a statement in the recent GSA Office of Citizen Services and Innovative Technologies 2011 Annual Report, which claims:

“Data Center Consolidation savings by the end of 2015 are expected to be $3 billion, based on analysis of information provided in October, which shows that agencies plan to close 472 data centers by the end of next year.” (Do they mean 2012 or 2013?)

Note that 472 is yet another number, different from the 525 in the most recent data set.

And it would be nice to see a column for the cost savings by data center, so citizens could see the individual closures and savings in their own locations.

So I call this progress in accountability to taxpayers and transparency in reporting, but it still does not give us real data that I, as a data scientist, and the readers I work for can readily use to support decisions and understanding.


As the firewalls and silos of an obstructive government of the past have come down, Bev Godwin has been working to entice the public to interact with, and absorb information from, web and social media tools.

As director of GSA’s Federal Citizen Information Center (FCIC), Godwin helps federal agencies develop, promote and distribute useful communications on many channels – USA.gov, GobiernoUSA.gov, 1-800-FED-INFO, email, web chat, social media, and publications online or by mail – on laptop, mobile, and e-reader devices. Keep reading →

I recently was involved in a discussion debating the successes of Open Government.


Some of the individuals in the discussion felt the success of the Open Government initiative was the creation of Data.gov, but I disagreed, saying it is only a data catalog and that even the featured data sets are difficult to use and understand.

We really need data apps and data stories built from those data sets – ones that the public and decision-makers could use to justify funding and claim success.

The 25 most popular apps at Data.gov were mostly XML feeds; only five were Excel files that I could easily make into data apps. Keep reading →
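Turning an XML feed into something spreadsheet-ready is mostly a matter of flattening it. A hedged sketch, using illustrative tag names rather than the real Data.gov feed schema:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical XML snippet standing in for a Data.gov dataset feed;
# the tag names here are illustrative, not the real schema.
feed = """<datasets>
  <dataset><title>Data Center Inventory</title><format>XML</format></dataset>
  <dataset><title>Agency Spending</title><format>Excel</format></dataset>
</datasets>"""

root = ET.fromstring(feed)
rows = [
    {"title": d.findtext("title"), "format": d.findtext("format")}
    for d in root.iter("dataset")
]

# Flatten to CSV text that opens directly in a spreadsheet.
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["title", "format"])
writer.writeheader()
writer.writerows(rows)
print(out.getvalue())
```

This is the kind of small conversion step that stands between an XML feed and the flat, reusable tables a data app actually needs.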


Federal Chief Information Officer Steven VanRoekel described his path forward for federal IT in a policy speech in Silicon Valley this October and again in the draft Federal IT Shared Services Strategy released just this month. He articulated a “Shared First” paradigm that will lead agencies to root out waste and duplication by sharing IT services, infrastructure, procurement vehicles, and best practices.
_________________________________________________________

This article originally appeared on GSA’s Great Government Through Technology blog.
_________________________________________________________ Keep reading →

The General Services Administration will begin accepting applications Jan. 9, 2012, for the first group of companies to be chosen as Third Party Assessment Organizations (3PAO) for the newly launched FedRAMP initiative, also known as the Federal Risk and Authorization Management Program.

Officials for GSA and the National Institute of Standards and Technology made the joint announcement during the “Industry Forum on FedRAMP and Third Party Assessment Organizations”, held December 16 at GSA headquarters in Washington, DC. The half-day session presented the most up-to-date guidance for industry representatives on the FedRAMP Third Party Assessment Organization (3PAO) application process. Keep reading →

If you have been at a recent Washington Capitals hockey game when the opponent scores a goal, you know the crowd routinely shouts out “Who cares!”

Last week, Steven VanRoekel, the Federal CIO, released the long-awaited OMB plan for the Federal Risk and Authorization Management Program, or FedRAMP – which reminds me to be thankful for pronounceable acronyms. The purpose of FedRAMP, per the implementing OMB memorandum, is to “provide a cost-effective, risk-based approach for the adoption and use of cloud services.” Keep reading →
