Big data analysis has the potential to save government agencies 14% of their annual budgets, says poll, but few government IT managers are confident of success.
Federal IT professionals estimate that government agencies could save 14% of their budgets, or nearly $500 billion across the government, by successfully analyzing big data. But while nearly one-fourth of federal IT managers in a new government poll report that their agencies have launched at least one big-data project, only 31% believe their agency's big-data strategy is sufficient to deliver on that potential.
The numbers come from a recent report, "Smarter Uncle Sam: The Big Data Forecast," by government IT networking group MeriTalk. The report, sponsored by EMC Corporation, is based on a survey of 150 federal IT professionals.
According to the findings, feds predict they will spend 16% of their annual IT budget, or nearly $13 billion, on big data in five years. Big data could help agencies fulfill their missions in several ways, including improving processes and efficiency (51%), enhancing security (44%) and predicting trends (31%).
A majority (70%) of those surveyed believe that five years from now, successfully leveraging big data will be critical to fulfilling these federal mission objectives.
NASA chief technology officer Dr. Sasi Pillay, speaking during a webinar last week, illustrated how analyzing large volumes of data could improve agency operations. The space agency stores and processes vast amounts of climate and weather data at its Center for Climate Simulation, which provides supercomputing resources to NASA scientists and engineers. Officials at the center want to reduce the number of physical tests and do more computational modeling, which would allow scientists to see differences between data sets and spot anomalies visually. It's a way for the agency to save time and billions of dollars, Pillay said.
Another area is multilevel domain data, which involves creating a mechanism to help pilots fly planes in different types of conditions, Pillay said. The idea is to take large amounts of data and integrate it into the methodology to understand how a plane and an engine will perform in various conditions.
"NASA is in the business of collecting information and looking at how we can make it useful," Pillay said during the webinar. "We engage citizen scientists to share that data through Data.gov. We're working on making it available in digital formats. It takes time to catch up with the data. There's a long lag. Hopefully we'll improve the use of that data for more scientific discoveries."
"The big component" to making big data work more effectively for agencies "will be getting their hands around metadata and properly tagging it," Rich Campbell, federal chief technologist at EMC, said during the webinar, which was held in conjunction with the release of the findings.
Currently, only 26% of government data has been tagged and 23% has been analyzed, according to average estimates collected by the survey. Federal IT execs believe agencies should roughly double those data management efforts, aiming for 46% of data tagged and 45% analyzed, the study suggested.
When it comes to long-term impact, 69% of the study's respondents said big data will help government work better.
Top federal functions that are expected to benefit most from analyzing big data include executing military/intelligence, surveillance, and reconnaissance missions (54%); combating fraud, waste and abuse (48%); and managing the transportation infrastructure (27%).
"It varies agency by agency," said EMC's Campbell. "The timeframe for big data projects is six to nine months for most agencies, and 18 to 36 months for larger projects."