The Obama administration announced plans to fund an R&D push for big data analysis, with applications ranging from health to defense.
The Obama administration on Thursday announced plans to spend hundreds of millions of dollars on a new "big data initiative" for research and development into technology to "access, store, visualize, and analyze" massive and complicated data sets.
The initiative comes as volumes of data used by government and the private sector expand exponentially. It includes commitments from several federal agencies to develop new technologies to manipulate and manage big quantities of data and use those technologies in science, national security, and education. John Holdren, director of the White House's Office of Science and Technology Policy, compared the effort to federal research that led to breakthroughs in supercomputing and to the development of the Internet.
"While the private sector will take the lead on big data, we believe that the government can play an important role, funding big data research, launching a big data workforce, and using big data approaches to make progress on key national challenges," Holdren said in a press conference to announce the effort. The government is also helping to set big data standards.
The federal agencies working on the initiative will be the National Science Foundation, the National Institutes of Health, the Department of Defense, the Department of Energy, and the U.S. Geological Survey.
Among the big data projects will be a joint solicitation from the National Science Foundation and the National Institutes of Health, which will award up to $25 million in funding for 15 to 20 research projects that, according to the solicitation, will "advance the core scientific and technological means of managing, analyzing, visualizing, and extracting useful information from large, diverse, distributed, and heterogeneous data sets."
In addition to the joint solicitation, the National Science Foundation is implementing a long-term big data strategy that includes encouraging research, funding a $10 million data project at the University of California, Berkeley, supporting a geosciences data effort called EarthCube, and more.
The Department of Defense, meanwhile, plans to spend about $250 million annually, including $60 million on new research projects, on big data. The Defense Advanced Research Projects Agency is creating the XDATA program, a $100 million effort over four years to "develop computational techniques and software tools for sifting through large structured and unstructured data sets."
The National Institutes of Health announced as part of the effort that it has placed 200 terabytes of genomic data (the world's largest set of human genetic data, according to the White House) on Amazon Web Services as part of the international 1000 Genomes Project.
The Department of Energy is no stranger to big data, being home to some of the most powerful supercomputers in the world. As part of the big data initiative, the agency's Lawrence Berkeley National Laboratory will spend $25 million to create a new research facility, the Scalable Data Management, Analysis, and Visualization Institute.
In a blog post accompanying the announcement, OSTP deputy director Tom Kalil called on industry, universities, and non-profit organizations to join the administration in its efforts.
For their part, technology companies applauded the effort. "The administration's work to advance research and funding of big data projects, in partnership with the private sector, will help federal agencies accelerate innovations in science, engineering, education, business and government," said David McQueeney, VP of software for IBM Research.