
Data analytics is proving to be a powerful tool for improving the results of government programs, according to a new report released today, “From Data To Decisions: The Power of Analytics.”

The report, which examines how several federal agencies have used data, is a joint effort between the Partnership for Public Service and the Public Sector Business Analytics & Optimization practice at the IBM Center for The Business of Government.

It is clear to me that the CIA needs big data, on the scale of zettabytes (10 to the 21st power bytes), and the ability to find and connect the “terrorist dots” in it. As of 2009, the entire Internet was estimated to contain close to 500 exabytes, which is half a zettabyte.
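The unit arithmetic above can be checked with a quick sketch (decimal SI prefixes assumed, i.e. an exabyte as 10^18 bytes and a zettabyte as 10^21 bytes):

```python
# Storage-unit arithmetic, using decimal (SI) prefixes.
EXABYTE = 10**18    # bytes
ZETTABYTE = 10**21  # bytes

# The 2009 estimate cited above: the entire Internet at ~500 exabytes.
internet_2009 = 500 * EXABYTE

# Expressed in zettabytes, that is 0.5 -- half a zettabyte.
print(internet_2009 / ZETTABYTE)  # 0.5
```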

Recently I listened to three senior CIA officials — two former and one current — talk about this challenge and the need for data science and data scientists to make sense of it.

Gen. Michael Hayden, former director of the CIA and the National Security Agency and former Principal Deputy Director of National Intelligence, and Bob Flores, former chief technology officer at the CIA, spoke about this at the MarkLogic Government Summit; Gus Hunt, the CIA's current CTO, spoke about it at the Amazon Web Services Summit I wrote about recently.

Gen. Hayden framed the problem as follows: in the Cold War era it was easy to find the enemy but hard to stop them (e.g., Soviet tanks in East Germany), whereas in the Global War on Terrorism it is hard to find the terrorists but easy to stop them once they're found (e.g., the underwear bomber on the airplane). He said we live in an era where the problem is not a failure to share data, but rather processing the sheer volume, variety, and velocity of data that results from sharing.

He shared his experience of meeting with former Egyptian President Mubarak before the recent social-media-driven Arab awakening that led to Mubarak's overthrow, and of later meeting Twitter co-founder Jack Dorsey, whom he asked: how does it feel to overthrow a government, something the CIA, when Hayden was director, was never able to do?

Hayden also said we need tools to predict the future from social media and data scientists to use them.

I told him about my work with Recorded Future, which was also the subject of a Breaking Gov story.

Bob Flores, former CIA CTO, said that Recorded Future was a fantastic new technology and that the old model of collect, winnow, and disseminate fails spectacularly in the big data world we live in now. He used the recent movie “Moneyball” as an example of how sabermetrics, the new field of baseball analytics, has shown that there is no more rigorous test of a business plan than empirical evidence.

He said that in this time of budget cuts and downsizing, the cream will rise to the top: the people and organizations that can solve real problems with data will survive. Flores also agrees with Gen. Hayden that while all budgets are on a downslope (including defense, intelligence, and cyber), cyber's slope is the shallowest, because it is understood that limiting the analysis of big data would be the equivalent of disarmament in the Cold War era.

When it comes to proactive law enforcement, intelligence and counterterrorism operations, the New York City Police Department – the NYPD – is viewed by many of its counterparts as one of the most innovative and successful police departments in the nation’s history.

However, the NYPD has also gained another, more insidious reputation in recent years for what many regard as an unprecedented challenge to privacy and civil liberties in America, and what others regard as overreach internationally.

The highlight of yesterday’s Geospatial Summit for me was mention of the National Hydrography Dataset.

Tommy Dewald (U.S. Environmental Protection Agency) and Keven Roth (U.S. Geological Survey, retired) set about the task of developing a surface water dataset of the nation in the 1990s with a vision to create a solution for the 21st century, when suitable water resources would become critical. What oil was for the 20th century, water would be for the 21st century.

One of the nation’s most authoritative sources for residential address data, the U.S. Census Bureau, may soon have to confront a costly legal constraint that prevents it from sharing basic street address information with thousands of county and state governments and other organizations.

The limitation means not only that state and local governments must spend more to validate address information, but that the Census Bureau and other federal agencies must as well, according to a group of data specialists speaking at a conference on the use of geographic data.

Recovery.gov is the U.S. government’s official website that provides easy access to data related to Recovery Act spending and allows for the reporting of potential fraud, waste, and abuse. My AOL colleague Richard Walker recently wrote about how Recovery.gov “Shows The Power Of Transparency In Tracking Federal Spending,” since the Recovery Accountability and Transparency Board (the RAT Board) has provided “a commendable model of transparency… the tremendous success of the RAT Board is worthy of replication throughout the federal bureaucracy.”

He also mentions how the proposed Digital Accountability and Transparency Act of 2011 (DATA Act) would establish consistent data elements and standards for federal financial information to ensure that reported information is comparable and reliable. He notes, too, that recipient reporting through federalreporting.gov is the most cutting-edge feature of the transparency process and should be an integral part of federal spending accountability.
