
While the Obama administration and the federal government have worked to set up ways to share geospatial data between agencies, a new report from the Government Accountability Office finds that a lack of coordination between departments is resulting in costly duplication and millions of wasted tax dollars.

While the GAO report said the full extent of duplication in geospatial investments is unknown, it noted that billions of dollars are being spent across the federal government on duplicative geospatial investments.

Further, “many mission-critical applications, such as those used to respond to natural disasters – floods, hurricanes, and fires – depend on geospatial information to protect lives and property. Thus, it is important that the data acquired to support these critical functions be done in a timely and coordinated manner, with minimal duplication,” the report concluded.

The government has tried to coordinate the use of geographically related data by setting up the Federal Geographic Data Committee (FGDC), under the direction of the Office of Management and Budget. One of the FGDC’s tasks was to create a metadata standard to mark geospatial information and a clearinghouse to store and disseminate it.
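For readers curious what such a metadata record looks like in practice, here is a minimal sketch in Python. The element names loosely follow the FGDC’s Content Standard for Digital Geospatial Metadata, but the specific fields and values are illustrative assumptions, not a complete or schema-validated record:

```python
# Minimal sketch of an FGDC-style geospatial metadata record.
# Element names loosely follow the FGDC Content Standard for Digital
# Geospatial Metadata (CSDGM); this is an illustrative fragment only,
# not a complete or schema-validated record.
import xml.etree.ElementTree as ET

def build_metadata(title: str, abstract: str, pubdate: str) -> ET.Element:
    """Assemble minimal identification info for one geospatial dataset."""
    metadata = ET.Element("metadata")
    idinfo = ET.SubElement(metadata, "idinfo")  # identification information
    citation = ET.SubElement(idinfo, "citation")
    citeinfo = ET.SubElement(citation, "citeinfo")
    ET.SubElement(citeinfo, "title").text = title
    ET.SubElement(citeinfo, "pubdate").text = pubdate  # YYYYMMDD
    descript = ET.SubElement(idinfo, "descript")
    ET.SubElement(descript, "abstract").text = abstract
    return metadata

# Example record for a hypothetical road-data layer
record = build_metadata(
    title="Primary Roads (illustrative)",
    abstract="Road centerlines collected for emergency response.",
    pubdate="20120401",
)
print(ET.tostring(record, encoding="unicode"))
```

A clearinghouse can index records like this one, so an agency planning a new acquisition can first search for an existing dataset covering the same area and theme.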

But the GAO found that agencies that collect and use such data are not using the clearinghouse to identify geospatial investments, coordinate activities and avoid duplication. According to the GAO, the FGDC has not planned or launched an approach that would allow agencies to manage and more effectively share geospatial data and avoid costly redundancies.

Additionally, the report said the FGDC’s master plan is missing key elements, such as performance measures for many of its defined goals.

The three departments responsible for implementing and managing geospatial information government-wide – Commerce, Transportation and Interior – have put only some of the steps needed for national geospatial data sharing into effect.

The only major goal all three departments achieved was making metadata available on the clearinghouse. Only the Interior Department has designated a senior official to oversee sharing geospatial information with other departments and agencies. None of the three has launched a strategy to share data, and only the Commerce Department has partially established a metadata policy.

OMB, meanwhile, does not have complete and reliable information to identify duplication in agency investments, the report said.

One example of the lack of coordination cited in the report: the Census Bureau, the USGS and the Department of Homeland Security are independently acquiring road data, duplication that has reportedly wasted millions of tax dollars.

“Unless OMB, the FGDC and federal departments and agencies decide that coordinating geospatial investments is a priority, the situation is likely to continue,” the report said.

To improve coordination and reduce duplication, the GAO report recommended that the FGDC develop a national strategy to coordinate geospatial information, that federal agencies follow federal guidelines to manage geospatial investments, and that OMB develop a mechanism to identify and report on geospatial investments.

OMB and two of the departments agreed with the GAO’s recommendations, while one department neither agreed nor disagreed with the findings, the report said.


It’s not easy following Todd Park, the federal government’s chief technology officer, and his breathless on-stage enthusiasm for promoting technical innovation in government and the virtues of collaboration.

Park clearly found an avid proponent, however, in Seth Harris, U.S. Deputy Secretary of Labor, who made a persuasive case last week for government and the private sector to work jointly in turning information into useful tools for the American public and the U.S. economy.

First, Todd Park, former Department of Health and Human Services chief technology officer, bet on health data in a big way, got his upcoming Health Datapalooza, and then became our new federal CTO.

Then Gus Hunt, the CIA’s CTO, bet on big data for the Intelligence Community and got its budget increased by Congress, reflecting a governmental shift in IT priorities from a Defense Department-style network-centric focus toward the IC’s big data-centric focus.

Now the Defense Department is in the big data game with its own big bet, to the tune of $250 million, announced Thursday as part of the White House Office of Science and Technology Policy’s Big Data Research and Development Initiative.

The assistant secretary of Defense, in a letter released yesterday, said: “We intend to change the game and plan to be the first to leverage big data across the full scope of military operations in new and unconventional ways.”

Five other agencies present at the AAAS Auditorium event are contributing much smaller (or undisclosed) amounts:

  • National Science Foundation: $10 million, plus several smaller grants
  • DARPA: $25 million annually for four years
  • National Institutes of Health: No money, but it is making the world’s largest set of data on human genetic variation freely available
  • Department of Energy: $25 million
  • USGS: New grants for unspecified amounts
But where does this new initiative leave us?

I think it leaves us with a disconnected federal big data program, split between the science and intelligence communities, with the former considerably behind the latter.

The report, “Designing a Digital Future: Federally Funded Research and Development in Networking and Information Technology,” prepared by the President’s Council of Advisors on Science and Technology (PCAST), said: “Every federal agency needs to have a ‘big data’ strategy.”

I did not hear that today, either from individual agencies or across all the agencies. The recent 2012 Big Data Government Forum provided a much more comprehensive view of best practices around big data technology, trends and issues from senior government executives, data scientists and vendors.

As Jim Hendler, an RPI computer science professor, commented during the meeting: “Computer scientists like us have to move to the social science side of things to really do big data.”

This new White House Initiative needs Todd Park’s entrepreneurial spirit, Gus Hunt’s experience, and DoD’s new money, spent in a coordinated way with the IC and civilian agencies to make big data across the federal government a reality.

Nearly 90,000 high-resolution scans of the more than 200,000 historical U.S. Geological Survey topographic maps, some dating as far back as 1884, are now available online.

The Historical Topographic Map Collection includes published U.S. maps of all scales and editions. The historical maps are available to the public for free digital download in GeoPDF format. Printed copies are also available for $15, plus a $5 handling charge, from the USGS Store.
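For anyone scripting a bulk download of these scans, the sketch below shows the general shape of the task in Python. The URL is a placeholder, not a real USGS endpoint: actual links come from the collection’s search interface on the USGS site, and the files can be large, hence the chunked streaming:

```python
# Sketch: stream one historical topo GeoPDF to disk.
# The URL below is a placeholder, not a real USGS endpoint;
# actual links come from the Historical Topographic Map
# Collection's search interface.
import urllib.request
from pathlib import Path

def download_geopdf(url: str, dest: Path) -> None:
    """Fetch a GeoPDF in 1 MB chunks to avoid holding it all in memory."""
    with urllib.request.urlopen(url) as resp, dest.open("wb") as out:
        while chunk := resp.read(1 << 20):
            out.write(chunk)

download_geopdf(
    "https://example.usgs.gov/topo/historical_quad_1884.pdf",  # placeholder
    Path("historical_quad_1884.pdf"),
)
```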

The highlight of yesterday’s Geospatial Summit for me was mention of the National Hydrography Dataset.

Tommy Dewald (U.S. Environmental Protection Agency) and Keven Roth (U.S. Geological Survey, retired) set about developing a national surface water dataset in the 1990s, with a vision of creating a solution for the 21st century, when suitable water resources would become critical. What oil was for the 20th century, water would be for the 21st.