Big Data

Government is at a crossroads: it has the ability to process vast volumes of data but too few executives who understand how to tap its potential, according to a report on “big data” released today.

The TechAmerica Foundation report offers recommendations on public policy, research and development, privacy, and overcoming barriers, drawing on agencies that have established early successes in leveraging big data, such as the Internal Revenue Service and the Centers for Medicare and Medicaid Services. Keep reading →

White House officials brought together dozens of senior government leaders and private sector entrepreneurs Monday, including Energy Secretary Dr. Steven Chu, to demonstrate how energy data is fueling new products and services aimed at promoting greater energy efficiency in America.

The “Energy Datapalooza” was the latest in a series of White House-sponsored events designed to showcase innovative applications built on government data – this one focused on the energy sector – hosted by Federal Chief Technology Officer Todd Park. Keep reading →

Ryan Panchadsaram was selected as a Presidential Innovation Fellow working on Blue Button, one of the initiatives in the new White House Presidential Innovation Fellows program. The program pairs top innovators from the private sector, nonprofits, and academia with top innovators in government to collaborate on solutions designed to deliver significant results in six months.

The Blue Button program aims to give individuals easy access to their health records by enabling them to securely download their own health information – such as current medications and drug allergies, claims and treatment data, and lab reports – as a simple text file. The Department of Veterans Affairs is collaborating on the project with the Department of Health & Human Services, the Department of Defense, and others.
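
To make the download format concrete, here is a minimal sketch of how a Blue Button-style text file might be split into its labeled sections. The “-- SECTION --” header style, the section names, and the sample content are illustrative assumptions for this sketch, not the official Blue Button specification.

```python
# Minimal sketch: split a Blue Button-style plain-text export into
# labeled sections. The header style and section names below are
# assumptions for illustration, not the official format.

def parse_blue_button(text: str) -> dict[str, list[str]]:
    """Group non-blank lines under '-- SECTION --' style headers."""
    sections: dict[str, list[str]] = {}
    current = "HEADER"
    for line in text.splitlines():
        stripped = line.strip()
        if stripped.startswith("--") and stripped.endswith("--"):
            current = stripped.strip("- ").upper()  # e.g. 'MEDICATIONS'
            sections.setdefault(current, [])
        elif stripped:
            sections.setdefault(current, []).append(stripped)
    return sections

sample = """-- MEDICATIONS --
Lisinopril 10 mg daily
-- ALLERGIES --
Penicillin
"""
print(parse_blue_button(sample)["MEDICATIONS"])  # ['Lisinopril 10 mg daily']
```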

_______________________________________________

This is one in a series introducing 18 Fellows working on five initiatives that are part of the White House Presidential Innovation Fellows program.

_______________________________________________

Panchadsaram was most recently head of Customer & Product at Ginger.io, an MIT Media Lab spin-off that uses big data to transform health. Keep reading →

MeriTalk has released a report on how federal IT managers view the barriers to, current status of, and future plans for moving mission-critical applications to the cloud.

The report, released this week, also reveals that government could save an estimated $16.6 billion annually if all agencies move just three mission-critical applications to the cloud. Keep reading →

It’s not easy following Todd Park, the federal government’s chief technology officer, and his breathless on-stage enthusiasm for promoting technical innovation in government and the virtues of collaboration.

Park clearly found an avid proponent, however, in Seth Harris, U.S. Deputy Secretary of Labor, who made a persuasive case last week for government and the private sector working jointly to turn information into useful tools for the American public and the U.S. economy. Keep reading →

Make the rounds with government agency CTOs or at any public sector technology conference these days and two words are likely to emerge in the conversation very quickly: big data.

Whether the discussion is around Hadoop, MapReduce or other big data tools, the focus to date has been largely on data analytics and how to rapidly and efficiently analyze extremely large and unstructured datasets. Big data analytics is important for government agencies seeking to gain better insights into their data and make more informed decisions based on that insight, but analytics represents only the tip of the iceberg in making big data work. Keep reading →
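
For readers new to the pattern, here is a toy, single-process sketch of the MapReduce idea behind tools like Hadoop: map each record to key-value pairs, group by key, then reduce each group. Real MapReduce jobs run these phases in parallel across a cluster; the word count below only illustrates the data flow.

```python
# Toy, single-process illustration of the MapReduce pattern:
# map records to (key, value) pairs, group by key, reduce each group.
from itertools import groupby
from operator import itemgetter

def map_phase(records):
    for record in records:               # "map" each input record
        for word in record.split():
            yield (word.lower(), 1)      # emit (key, value) pairs

def reduce_phase(pairs):
    ordered = sorted(pairs, key=itemgetter(0))    # "shuffle": group by key
    for key, group in groupby(ordered, key=itemgetter(0)):
        yield (key, sum(value for _, value in group))  # "reduce"

docs = ["big data big insights", "data drives decisions"]
print(dict(reduce_phase(map_phase(docs))))
# {'big': 2, 'data': 2, 'decisions': 1, 'drives': 1, 'insights': 1}
```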

Big data means different things to different federal agencies. To the Department of Energy, it means not only managing an information-sharing network to promote big science, but also making the results of that research available to the public.

This information can be blended in a variety of ways, depending on end users’ needs, explained Robert Bectel, CTO and senior policy advisor at DOE’s Office of Energy Efficiency and Renewable Energy (EERE). Speaking at a recent federal IT event, he said that as one of the department’s technology evangelists, his goal is to make sure taxpayers get the most for their money by enabling federal workers to do more on the job. Keep reading →

The federal government’s vast collection of searchable data has begun to feature information from city databases as part of the effort to increase transparency, promote efficiency and spur innovation.

Now, city officials and developers will work together to improve the information available to city residents via the new ‘Cities’ community on Data.gov. Databases are currently available for Chicago, New York City, San Francisco and Seattle, according to Jeanne Holm, GSA’s Data.gov evangelist. Next up: Santa Cruz, Calif., Louisville and Atlanta. Keep reading →
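
As a sketch of how a developer might pull those city datasets programmatically, the example below searches the Data.gov catalog. It assumes the CKAN-style package_search endpoint at catalog.data.gov; the exact endpoint and response fields are assumptions based on CKAN conventions, not details taken from the article.

```python
# Sketch: search the Data.gov catalog for city datasets, assuming the
# CKAN-style package_search endpoint at catalog.data.gov. Endpoint and
# response fields follow CKAN conventions and may differ in practice.
import json
import urllib.parse
import urllib.request

query = urllib.parse.urlencode({"q": "Chicago", "rows": 5})
url = f"https://catalog.data.gov/api/3/action/package_search?{query}"

with urllib.request.urlopen(url) as resp:
    result = json.load(resp)["result"]   # CKAN wraps results in 'result'

print(f"{result['count']} matching datasets; first {len(result['results'])}:")
for dataset in result["results"]:
    print("-", dataset["title"])
```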

Dr. Stephan Fihn is sitting on the edge of a revolution at the Department of Veterans Affairs, where big data is becoming easily accessible to clinicians and analysts throughout its 160 hospitals.

Fihn is director of Business Intelligence and Analytics for the Veterans Health Administration and a practicing physician at the VA Puget Sound Health Care System in Seattle, where he is both helping to develop and benefiting from the VA’s big data warehouse. Keep reading →

A National Institutes of Health (NIH) doctor aims to revolutionize a notoriously unpleasant medical test used to detect a leading cause of cancer death.

Ronald M. Summers, M.D., Ph.D., pioneered the virtual colonoscopy, which uses noninvasive imaging similar to a CT scan to find polyps in the colon that are precursors to cancer. Keep reading →
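
As a heavily simplified illustration of the kind of image analysis behind computer-aided detection, the sketch below thresholds a synthetic 3D volume and labels connected candidate regions. This is a toy stand-in under stated assumptions, not Summers’ method; real CT colonography software relies on far more sophisticated shape and texture analysis.

```python
# Toy sketch of candidate detection in a 3D volume: threshold, then
# label connected components. Synthetic data stands in for a CT scan;
# this is not the NIH method, only an illustration of the idea.
import numpy as np
from scipy import ndimage

volume = np.zeros((32, 32, 32))
volume[10:13, 10:13, 10:13] = 1.0           # synthetic "polyp-like" blob

candidates = volume > 0.5                   # crude intensity threshold
labels, count = ndimage.label(candidates)   # connected 3D components
sizes = ndimage.sum(candidates, labels, index=range(1, count + 1))
print(f"{count} candidate region(s); voxel sizes: {sizes}")
```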

Page 4 of 9