Make the rounds with government agency CTOs or attend any public sector technology conference these days, and two words are likely to emerge in the conversation very quickly: big data.
Whether the discussion is around Hadoop, MapReduce or other big data tools, the focus to date has been largely on data analytics and how to rapidly and efficiently analyze extremely large, unstructured datasets. Big data analytics is important for government agencies seeking to gain better insights into their data and make more informed decisions based on those insights, but analytics represents only the tip of the iceberg in making big data work.
Only 6% of civilian agencies and 3% of Defense and Intelligence agencies currently have the infrastructure and processes in place to take full advantage of big data sets, and most federal organizations will need at least three years before they can, according to a just-released survey of federal IT professionals.
Government agencies are saving billions of dollars through virtualization, and those savings are projected to grow as workloads in virtualized server and desktop environments are expected to double by 2015. But agencies must overcome funding uncertainties, concerns about legacy systems and other barriers to achieve virtualization’s full potential, according to a new industry survey of government IT executives.