Immediately after the 9.0 earthquake struck off the coast of Japan on March 11, the National Oceanic and Atmospheric Administration, using real-time data from ocean sensors, generated computer models of the tsunami that followed. Those models were quickly shared around the world via YouTube and other websites, providing vital information to an anxious public.
The sensors, located on buoys and the ocean floor, are part of a global network that provides a steady stream of data on the Earth's oceans and weather. Combined with a vast archive of historical data, that stream means the agency manages some of the largest databases in the federal government. Its Princeton, N.J., data center alone stores more than 20 petabytes of data.
"I focus much of my time on data lifecycle management," said Joe Klimavicz, who discussed his IT strategy in a recent interview with InformationWeek at NOAA headquarters in Silver Spring, Md. The keys to ensuring that data is usable and easy to find, he said, include using accurate metadata, publishing data in standard formats, and having a well-conceived data storage strategy.
NOAA is responsible for weather and climate forecasts, coastal restoration, and fisheries management, and much of Uncle Sam’s oceanic, environmental, and climate research. The agency, which spends about $1 billion annually on IT, is investing in new supercomputers for improved weather and climate forecasting and making information available to the public through Web portals such as Climate.gov and Drought.gov.
Weather and climate sensors attached to planes, bridges, and buildings are becoming ubiquitous, feeding data into NOAA's forecasting models. NOAA collects 80 TB of scientific data daily, and Klimavicz expects there to be a ten-fold increase in measurements by 2020.
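Those two figures imply a striking growth curve. As a rough illustration, the back-of-the-envelope calculation below projects what a ten-fold increase over 80 TB per day would mean; the assumption of uniform compound growth between 2011 and 2020 is ours, not NOAA's, and is used only to put the numbers in perspective.

```python
# Sketch: what NOAA's cited figures (80 TB/day in 2011, 10x more
# measurements by 2020) imply for storage planning. Uniform compound
# growth is an illustrative assumption, not an agency projection.

CURRENT_TB_PER_DAY = 80
GROWTH_FACTOR = 10
YEARS = 2020 - 2011  # horizon cited in the article

# Compound annual growth rate needed to reach 10x in that span
annual_growth = GROWTH_FACTOR ** (1 / YEARS)

projected_tb_per_day = CURRENT_TB_PER_DAY * GROWTH_FACTOR
projected_pb_per_year = projected_tb_per_day * 365 / 1024

print(f"Implied annual growth: {(annual_growth - 1) * 100:.1f}%")
print(f"Projected 2020 intake: {projected_tb_per_day} TB/day "
      f"(~{projected_pb_per_year:.0f} PB/year)")
```

In other words, hitting that ten-fold mark would require data volumes to grow nearly 30% every year, ending at roughly 285 petabytes of new data annually, which helps explain Klimavicz's emphasis on lifecycle management and tiered storage.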
As is true in other agencies, NOAA uses a mix of legacy IT systems and newer platforms. "While we probably have some of the most cutting-edge storage around at some locations, we have more primitive technology elsewhere," he said. "A lot of these things take time, and I don’t see a lot of influx of cash to do this in a big bang, so we're tackling it incrementally."
Last year, NOAA began real-time monitoring from a new cybersecurity center, which is open 12 hours a day, five days a week. Klimavicz wants to expand that to around-the-clock, 24-by-7 coverage. The agency uses ArcSight security tools for monitoring events and correlating logs. "The earlier you react, the less work you have to do," he said.
With 122 field offices, NOAA is highly decentralized. The agency’s CIO office -- with about 115 federal employees and an equal number of contractors -- oversees IT policy and manages telecom, supercomputing, cybersecurity, and other IT operations. Six line offices, including the National Weather Service and the National Environmental Satellite, Data and Information Service, have their own CIOs, who work under Klimavicz's guidance and meet with him weekly.