Big Data

In this satellite handout from the National Oceanic and Atmospheric Administration (NOAA), Hurricane Rina churns in the Caribbean Sea on October 26, 2011.

A new study has concluded that serious weather events cost the U.S. $485 billion annually. Keep reading →



The federal government’s recently announced $200 million Big Data Research and Development Initiative is prompting new and higher-profile attention from industry on how to better address the explosion of big data.

The latest example is the decision by the TechAmerica Foundation, announced late last week, to create an expert commission on big data issues.

“Big Data is one of the biggest issues that the technology industry has to tackle in the near term and we want to bring together the leading thinkers on the issue to provide the path forward,” said Jennifer Kerber, President of the TechAmerica Foundation.

Kerber cited the fact that more than 90% of the data that has ever existed was created in the past two years, according to a report in Fortune Magazine. Yet the notion of what actually distinguishes big data from the ongoing flood of information, and what new opportunities are emerging around it, remains vague and not well understood, she said.

With the world’s data doubling every 18 months, the real question is how to make intelligent decisions based on that data, a question that is critical for government and industry to answer, she said.
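
To give a sense of scale, here is a quick, hypothetical calculation (the 18-month doubling period quoted above is the only input) of what that growth rate implies over longer horizons:

```python
# What "data doubling every 18 months" implies over longer horizons.
# volume after t months = today's volume * 2 ** (t / 18)
DOUBLING_PERIOD_MONTHS = 18

def growth_factor(months: float) -> float:
    """Multiple of today's data volume after the given number of months."""
    return 2 ** (months / DOUBLING_PERIOD_MONTHS)

for years in (1, 5, 10):
    print(f"after {years:2d} year(s): ~{growth_factor(12 * years):.1f}x today's volume")
```

At that rate, data volume grows roughly 100-fold per decade.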

The government of course has been dealing with big data issues for many years and has a variety of major initiatives already underway.

But the explosive growth of data emerging from social media, mobile applications, machine sensors and other sources is creating new challenges in terms of how to harness and obtain value from it.

Kerber said the new commission will seek to explore what capabilities are required to succeed with big data; how to use big data to make intelligent decisions; how agencies will effectively govern and secure huge volumes of information while protecting privacy and civil liberties; and what value big data will really deliver to the U.S. government and U.S. citizens.

The TechAmerica Foundation said it is currently accepting applications for commissioners.

COMMENTARY: Yesterday Todd Park, the federal CTO, used Twitter to answer questions about “big data.” Well, sort of: while the chat reportedly generated 413 tweets, reaching an audience of 3.5 million, I counted only 131 actual questions, 9 actual answers and 7 retweets – so it really was a big data event with small results, like so many these days.
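
For what it's worth, a tally like the one above is easy to reproduce once the chat's tweets are in hand; the sketch below is a hypothetical illustration (the tweet texts are invented, not pulled from the actual chat archive) of one way to count questions, answers and retweets.

```python
# Hypothetical sketch: tallying a Twitter chat the way the commentary above
# does (retweets vs. questions vs. reply-style answers).
# The tweet texts here are made up; a real tally would use the chat archive.
tweets = [
    "RT @todd_park: Ask me anything about #bigdata at 2pm ET",
    "@todd_park How can small companies get ready to harness big data? #bigdata",
    "@someuser Great question - start with open health data sets #bigdata",
    "Looking forward to the #bigdata chat today!",
]

def tally(tweets: list[str]) -> dict[str, int]:
    """Count retweets, questions, and reply-style answers in a list of tweets."""
    counts = {"retweets": 0, "questions": 0, "answers": 0, "other": 0}
    for text in tweets:
        if text.startswith("RT @"):
            counts["retweets"] += 1
        elif "?" in text:
            counts["questions"] += 1
        elif text.startswith("@"):  # a reply that is not itself a question
            counts["answers"] += 1
        else:
            counts["other"] += 1
    return counts

print(tally(tweets))  # {'retweets': 1, 'questions': 1, 'answers': 1, 'other': 1}
```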


The highlights of Todd Park’s responses, in my opinion, were:

  • Librarians becoming the new data liberators – check out what NLM is doing
  • Great places for health/data startups to go: Health Challenges, HDI Forum and Code Fests
  • Key 2 do: make data liquid + accessible for beneficial use while rigorously protecting privacy. This is doable

To me, the most penetrating question he received was this: "How can small companies get ready to harness big data? It seems to be a big boys playground."

Ironic, given that the point of all of this health data activity by Todd Park and his predecessor, Aneesh Chopra, was to release lots of government data (big and small) to foster innovation, investment and job growth.

Keep reading →


U.S. Chief Technology Officer Todd Park told millions of Twitter followers today that skillful use of “big data” can help patients understand their health status and provide insights into how to improve their own health.

In conjunction with Big Data Week, he answered questions from around the world, focusing on practical applications for large aggregations of data, particularly in the health care field. Park, who previously held the CTO post at HHS, said privacy concerns are surmountable and that widespread dissemination of the information can help individuals care for themselves, as well as help medical professionals provide better care. Keep reading →

Government agencies are facing a tidal wave of data. But a number of healthcare agencies face particularly challenging obstacles to achieving their missions in a digital, data-interoperable world. This is particularly true for regulatory healthcare agencies such as the Food and Drug Administration (FDA).

As outlined in the agency’s 2007 report, “FDA Science and Mission at Risk,” the FDA anticipated many of these challenges. The report detailed new data sources coming from new digital sciences, including the use of molecular data for medicine (e.g., genomics, proteomics and pan-omics), wireless healthcare, nanotechnology, medical imaging, telemedicine platforms, electronic health records and more. Keep reading →

Todd Park, the newly appointed U.S. Chief Technology Officer at the White House, will attempt to address questions about Big Data in 140 characters or less during a live Twitter chat scheduled today at 2 p.m. ET.

The 30-minute Twitter chat is expected to give Park an opportunity to highlight the White House’s new Big Data Research and Development Initiative, and more specifically, efforts supporting the use of big data in the health sector. Keep reading →

I recently attended a meeting about something that many people are beginning to hear about but most do not understand — Ontology for Big Systems.

Even the words describing the outcome of the meeting are misleading: Summit and Communique. A summit is usually a global meeting of world leaders, and a communique is usually a short statement for the public and press. This year’s communique is 11 pages long, up from previous years: 1 page in 2006, 4 in 2007, 8.4 in 2008, 8.4 in 2009, 8.6 in 2010 and 8 in 2011.


Ontology has two definitions — one from philosophy and another from computer science. I won’t even bother you with their definitions because the Intelligence Community prefers to use the term Knowledge Base instead to describe a collection with a very large number of documents that can be analyzed and searched “for more needles in bigger haystack.” Keep reading →
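
For readers wondering what a searchable collection of that kind looks like under the hood, the toy inverted index below is a minimal, hypothetical sketch (not any particular agency system) of how a knowledge base finds the few relevant documents without scanning the whole haystack:

```python
from collections import defaultdict

# Toy inverted index: the basic structure behind searching a very large
# document collection ("knowledge base") for the few documents that matter.
documents = {
    "doc1": "ontology for big systems summit communique",
    "doc2": "knowledge base of analyzed intelligence documents",
    "doc3": "big data needles in a bigger haystack",
}

index: dict[str, set[str]] = defaultdict(set)
for doc_id, text in documents.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def search(query: str) -> set[str]:
    """Return the documents containing every term in the query."""
    terms = query.lower().split()
    if not terms:
        return set()
    results = index[terms[0]].copy()
    for term in terms[1:]:
        results &= index[term]
    return results

print(search("big haystack"))  # {'doc3'}
```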

The epic shift to cloud computing and the need to process massive volumes of data are spurring a high-stakes race to build global data center capacity while making information available on whatever kind of device consumers want, Microsoft CEO Steve Ballmer told a group of Northern Virginia technology executives Thursday.

“We need to think of data center capacity in real time,” Ballmer said, part of Microsoft’s broader bet that businesses and government will use a combination of public and private clouds in the future. He made the comment in a series of wide-ranging remarks about how technology is changing and the implications that will have for individuals, businesses and government during an industry breakfast in McLean, Va., hosted by the Northern Virginia Technology Council. Keep reading →


Big Data and shared services, arguably two of the hottest trends in federal IT, possess the unquestionable power to revolutionize our ability to share information, make informed decisions and create knowledge – all while saving taxpayers boatloads of cash. However, despite the myriad memos, initiatives and projects focused on these transformational strategies, the federal IT community does not seem to be giving much attention to one of the most critical requirements needed to truly maximize these systems: bandwidth.

Here is the problem: Keep reading →
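
As a rough, back-of-the-envelope illustration of why bandwidth matters (the 10 TB data set and the link speeds below are assumptions for the sketch, not figures from the article), the time to move a big data set scales directly with link speed:

```python
# Hypothetical back-of-the-envelope: how long does it take to move 10 TB
# over links of various speeds? (8 bits per byte; protocol overhead ignored.)
DATA_TB = 10
data_bits = DATA_TB * 1e12 * 8

for label, mbps in [("100 Mbps", 100), ("1 Gbps", 1_000), ("10 Gbps", 10_000)]:
    seconds = data_bits / (mbps * 1e6)
    print(f"{label:>9}: {seconds / 3600:6.1f} hours")
```

At 100 Mbps that 10 TB transfer takes more than nine days; even at 10 Gbps it takes a couple of hours, which is why bandwidth planning belongs alongside storage and compute.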


When it comes to big data and high public demand, the cloud can be a federal agency’s salvation.

That’s what the National Archives and Records Administration learned during the recent and long-anticipated 1940 census launch — the largest-ever release of publicly available data in the federal government. Keep reading →
