Only 6% of civilian agencies and 3% of defense and intelligence agencies currently have the infrastructure and processes in place to take full advantage of big data sets, and most federal organizations will need at least three years before they can, according to a just-released survey of federal IT professionals.
The survey’s findings seem to indicate a rocky road ahead for President Obama’s “Big Data Research and Development Initiative” announced in late March. As part of that initiative, six federal departments and agencies announced more than $200 million in new big data projects.
The survey, conducted in March and released this week by MeriTalk and sponsored by NetApp, interviewed 151 federal government CIOs and IT managers (half with civilian agencies, half with defense/intelligence agencies).
The survey found that 96% of respondents expect their data to grow in the next two years by an average of 64%. However, less than half of those surveyed estimated they had the data storage/access, computational power and personnel needed to strategically use the data.
In addition, 57% said they have at least one dataset that has grown too big to work with using their current management tools and/or infrastructure.
Nine out of 10 respondents reported facing a variety of challenges to using big data. The most significant challenges to managing large amounts of data were:
- storage capacity – 40% of respondents
- distribution and sharing – 36%
- search and retrieval – 35%
- slow analysis and processing speed – 34%
Another challenge is the rise of unstructured data, which constitutes 31% of agencies' data, on average. Complicating data management and resource matters further is the question of who actually owns the data agencies capture: the IT department or the department that generates it, for instance.
At the same time, IT managers in the survey believe effectively harnessing big data will result in a variety of benefits, including improved efficiency (59%), decision-making (51%), and forecasting (30%).
Nearly two out of three respondents (64%) said they could easily expand or upgrade their data management system, but they estimated it would take an average of ten months to double their short-to-medium-term capacity.
That may or may not be fast enough to keep up with the rising volume of data IT managers need to manage.
IT managers polled in the survey projected that over the next two years, the amount of data they store will increase, on average, to 2.63 petabytes from a current 1.61 petabytes.
Some federal agencies, however, are further along than others in preparing to manage larger and larger data sets.
Asked where agencies are in their planning for big data, approximately 18% of respondents said they are designing a plan or proof of concept on how to use the data. Only 6% of respondents at civilian agencies and 3% at defense/intel agencies reported having the infrastructure and processes in place to successfully leverage big data.
Many more, however, are just getting started. The survey found that 31% of DOD/Intel agency IT managers are not yet discussing big data and 42% are just learning how to use it. Civilian agencies appear a little further along, with 60% of respondents reporting they are learning about big data with only 9% not yet talking about it.
At the same time, roughly 60% of IT managers said they are currently capturing and analyzing large amounts of data and 40% are using it in their decision-making process. The survey suggested that “agencies are spending to collect data but are yet to unlock the return” on that effort.
Of those agencies taking steps to improve how they manage and make decisions with big data, the areas of greatest effort include:
- investing in IT infrastructure to optimize data storage – 39%
- training IT professionals to manage and analyze big data – 33%
- improving data storage security – 31%
- educating senior management on big data issues – 28%
The report has a margin of error of +/- 7.95% at a 95% confidence level. The full study is available for download to those who register for it at MeriTalk.com.