Government agencies will add a petabyte of new data in the next two years, but most lack the personnel resources, storage and computing power to take advantage of the explosion, according to a survey from government online community MeriTalk.

The new report, "The Big Data Gap," suggests that Federal IT professionals agree that growing stores of data can improve government but that the usefulness of big data is limited by its current inaccessibility.

The study surveyed 151 Federal government CIOs and IT managers in March of this year and was sponsored by NetApp, a provider of storage technology and management software that stands to benefit from the results.

A broad premise of big data and analytics is the ability to tackle unstructured data, the documents and other kinds of records not organized into table form. The survey authors say about one-third of agency data is unstructured and therefore less useful, and that the share of unstructured data continues to grow.

Also hindering the usefulness of data stores are questions of governance and ownership: respondents are split over whether IT, individual departments or C-level officials are the true owners of the data.

By the study's projections, agencies have only 49 percent of the storage and access capacity, 46 percent of the bandwidth and computing power, and 44 percent of the personnel required to drive big data mission results. Fifty-seven percent say they have at least one data set too big for their current management tools and infrastructure to handle.

The Federal government recently announced a $200 million Big Data Research and Development Initiative to fund projects exploring how the government could apply emerging big data practices to problem-solving.
