
Make the rounds with government agency CTOs, or attend any public sector technology conference these days, and two words are likely to emerge in conversation very quickly: big data.

Whether the discussion is around Hadoop, MapReduce or other big data tools, the focus to date has been largely on data analytics and how to rapidly and efficiently analyze extremely large and unstructured datasets. Big data analytics is important for government agencies seeking to gain better insights into their data and to make more informed decisions based on those insights, but analytics represents only the tip of the iceberg in making big data work.

Only 6% of civilian agencies and 3% of Defense and Intelligence agencies currently have the infrastructure and processes in place to take full advantage of big data, and most federal organizations will need at least three years before they can, according to a just-released survey of federal IT professionals.

The survey’s findings suggest a rocky road ahead for President Obama’s “Big Data Research and Development Initiative,” announced in late March. As part of that initiative, six federal departments and agencies announced more than $200 million in new big data projects.

Government agencies are saving billions of dollars through virtualization, and those savings are projected to grow as workloads in virtualized server and desktop environments are expected to double by 2015. But agencies must overcome funding uncertainties, concerns about legacy systems and other barriers to realize virtualization’s full potential, according to a new industry survey of government IT executives.

The new study found that 82% of federal and 77% of state and local IT professionals say their agencies have already implemented some degree of server virtualization, in which computing work is done in artificially created, software-controlled workspaces.