NASA, the National Science Foundation and the Department of Energy launched a new public challenge contest Wednesday to generate novel approaches to using “big data” information sets from various U.S. government agencies.

Dr. Suzanne Iacono, senior science advisor for the National Science Foundation’s Computer and Information Science directorate, made the announcement during an industry forum at the Capitol marking the release of a new report on big data in government.

In a related announcement, NSF, with support from the National Institutes of Health, will invest nearly $15 million in new big data fundamental research projects.

“To get the most value from the massive biological data sets we are now able to collect, we need better ways of managing and analyzing the information they contain,” said NIH Director Francis S. Collins.

NIH announced eight project awards aimed at accelerating health research “by developing methods for extracting important, biomedically relevant information from large amounts of complex data,” Collins said.

Wu Feng, associate professor of computer science at Virginia Tech, will be one of the participants in the big data research initiative. Together with Srinivas Aluru of Iowa State University and Oyekunle Olukotun of Stanford University, he is making promising advances in genomics and metagenomics.

“Big data is characterized not only by the enormous volume or the velocity of its generation but also by the heterogeneity, diversity and complexity of the data,” said Iacono, who co-chairs an interagency steering group on big data issues.

“There are enormous opportunities to extract knowledge from these large-scale diverse data sets, and to provide powerful new approaches to drive discovery and decision-making, and to make increasingly accurate predictions,” she said. “We’re excited to see what this competition will yield.”

The challenge is being led by the NASA Tournament Lab and is the latest in a series of government-sponsored challenge programs.

The Big Data Challenge series asks experts in a variety of fields to examine health, energy and Earth science data and to imagine new or better analytical techniques and software tools that can make government information more usable.

The contest calls for applicants to describe how the data could be shared through universal, cross-agency solutions that transcend the limitations of individual agencies.

“The ability to create new applications and algorithms using diverse data sets is a key element for the NTL,” said Jason Crusan, director of the Advanced Exploration Systems Division at NASA Headquarters in Washington, in a prepared release.

The competition will be run by the NTL, a collaboration between NASA, Harvard University and TopCoder, a competitive community of digital creators. TopCoder will provide its Open Innovation platform, which allows U.S. government agencies to “conduct high risk/high reward challenges in an open and transparent environment with predictable cost… and the potential to move quickly into unanticipated directions,” according to the program’s website.

Registration is open through Oct. 13 for the Ideation Challenge phase, the first of four idea generation competitions in the series.

The eight projects announced by NSF and NIH today run the gamut of scientific techniques for big data management, new data analytic approaches, and e-science collaboration environments with possible future applications in a variety of fields, such as physics, economics and medicine.

“Data represents a transformative new currency for science, engineering, and education,” said Farnam Jahanian, assistant director for NSF’s Directorate for Computer and Information Science and Engineering, in a prepared release.

“By advancing the techniques and technologies for data management and knowledge extraction, these new research awards help to realize the enormous opportunity to capitalize on the transformative potential of data.”