White House order means agencies will need to give more attention to data lifecycle management.
The CIOs of federal agencies are assessing the full scope of a White House executive order, introduced May 9, that requires agencies to make government data available in open, machine-readable formats. One likely outcome is that federal IT teams will have to give data sets the same lifecycle management attention they already apply to hardware and software investments, said Simon Szykman, CIO at the Department of Commerce.
"Data will be part of what gets addressed early on in the lifecycle [of new systems or applications]," Szykman said during a May 14 presentation at the FOSE conference in Washington, D.C. "This makes it more systematic requirement and a bigger part of what we'll be doing in the future."
Responding to questions about the policy, Szykman predicted the executive order will lead to "less ad hoc modeling and sharing and more systematic sharing of data."
Richard Holgate, assistant director for science and technology and CIO at the Bureau of Alcohol, Tobacco, Firearms & Explosives (ATF), said the executive order is significant not only for making government data more readily available to the public, but also for sharing data internally between government components.
The Department of Justice, of which ATF is one of many component organizations, publishes a variety of crime statistics that are useful to the public. But the majority of data it produces revolves around case files used by law enforcement agencies, attorneys, the courts, and other parts of the criminal justice system.
"We do a relatively poor job of passing along all that information," Holgate said. Stricter requirements to make data machine readable from the start would help make such data more readily discoverable, more accurate and timely, and reduce duplication.
The executive order was accompanied by a new Open Data Policy, issued by the Office of Management and Budget and the Office of Science and Technology Policy, which instructs federal agencies to manage government information as an asset.
Under the new policy, agencies are required to create an internal index of their data, publish a list of their publicly available data, and identify additional data sets that can be made public. Within 30 days of the policy's issuance, agencies will get access to an open online repository of tools and best practices to assist them in integrating the policy into their operations.
Federal CIO Steven VanRoekel and CTO Todd Park will be responsible for regularly updating the online repository so that agencies continue adopting open data practices, according to the executive order. VanRoekel, speaking at FOSE on May 15, said the goal of the new policy is to make interoperable, machine-readable data "the new default" in how federal agencies manage their data.
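To illustrate what "machine-readable by default" means in practice, here is a minimal sketch of a public data listing published as JSON rather than as a web page or PDF. The field names and URL below are illustrative assumptions, not the official metadata schema the policy references.

```python
import json

# Hypothetical catalog entry for a public data set. Field names
# (title, description, format, access_url) are illustrative only.
catalog = [
    {
        "title": "National Crime Statistics, 2012",
        "description": "Annual summary of reported offenses.",
        "format": "CSV",
        "access_url": "https://example.gov/data/crime-2012.csv",
    },
]

# Serialize to JSON: an open, machine-readable format.
listing = json.dumps(catalog, indent=2)

# A downstream consumer can recover the full structure programmatically,
# with no screen-scraping of human-oriented pages.
recovered = json.loads(listing)
print(recovered[0]["title"])
```

The point of the sketch is the round trip: because the listing is structured data, any agency, journalist, or developer can parse it and build on it directly, which is the discoverability benefit Holgate describes.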