In this video, Tiffany Shlain, founder of The Webby Awards, Jeff Jonas, of IBM, and Mari Maeda, of DARPA, discuss ways data can change the world.
It was taken at The Economist’s Ideas Economy: Information 2012 event in San Francisco, California. The session was moderated by Kenneth Cukier, data editor for The Economist. Keep reading →
A group of technology leaders came to the Capitol this week to make the case that the age of “big data” is not only upon us, but also represents a powerful and practical way for federal agencies to create substantially more value for the public – for relatively little incremental investment.
That comes as refreshing news compared to the relentless promises of big savings that accompanied the dawn of cloud computing, which has proven far trickier to implement. The difference is largely that big data is more a phenomenon than a technology. Keep reading →
Computers can’t simulate the Earth’s ever-changing climate in real time, the interaction of the human heart with each of thousands of different drugs, or the tiniest details of a nuclear weapon’s detonation.
But that could soon change. Keep reading →
With cloud computing becoming an integral part of the business of government, the recent Derecho storm raised legitimate concerns about the increasing reliance on large, grid-dependent data centers.
With smart, self-healing grids still years away, cloud service providers are starting to see a new selling point in offering grid-independent solutions. Keep reading →
In a sign that the worlds of big data and government-owned high performance computing centers are beginning to converge, the Department of Energy’s Lawrence Livermore National Laboratory and IBM announced that they are joining forces to help boost the competitiveness of U.S. industries in the global economy.
The announcement drew the attention and praise of Sen. Dianne Feinstein (D-Calif.) during a Capitol Hill briefing June 27, during which Feinstein stressed the growing importance of high performance computing and data analytics in the U.S. Keep reading →
The federal digital strategy released today is the next step in President Barack Obama’s effort to streamline and improve government services through mobile and web-based technologies and solidifies many efforts already under way.
Analysts mostly applauded the strategy, saying it provides specific, measurable goals, demonstrates a commitment to transforming the use of technology to better serve citizens, requires the use of analytics to enable more responsive government and builds security into the federal digital architecture. Keep reading →
In this satellite handout from National Oceanic and Atmospheric Administration (NOAA), Hurricane Rina churns October 26, 2011 in the Caribbean Sea.
A new study has concluded that severe weather events cost the U.S. $485 billion annually. Keep reading →
The future of federal technology spending may not be as bleak as current government budget cutbacks seem to suggest, a group of former government information technology officials suggested at a Federal IT forum today.
But changes in the types of technology services agencies are acquiring and in the way they acquire them, compounded by election-year uncertainty, are forcing contractors to reassess their strategies. Keep reading →