A complete overhaul of how the government approaches its IT projects and systems is needed if officials want to see the Open Government Directive work, open-government advocate Carl Malamud said Tuesday.
Speaking at the Gov 2.0 Summit in Washington Tuesday, Malamud, president of Public.Resource.Org, said that while the commitment to open government is a good start, the federal IT system is fundamentally flawed and most of its IT investment is an exercise in futility.
The Gov 2.0 Summit, produced by O'Reilly and UBM TechWeb, is being held in Washington Tuesday and Wednesday.
"Our federal government spends $81.9 billion a year on information technology," Malamud said, according to a transcript of his speech at the conference. "Much of that is a wasted effort. We build systems so badly; it is crippling the infrastructure of government."
Malamud cited several examples of woefully managed and financed IT projects to illustrate his point. One is a project he testified to Congress about: the Electronic Records Archives system the National Archives and Records Administration (NARA) has been building. The project is far over budget, badly managed, and has been flagged by U.S. CIO Vivek Kundra for review.
"The Inspector General testified he had no idea what the system did or how much it costs," Malamud said. "GAO issued reports saying they couldn't figure it out either." So far, the government has spent $250 million on the system and has projected it will cost $500 million to finish. However, Malamud predicted that the project will end up costing $1 billion when all is said and done.
Another "fiasco" is the FBI's Sentinel system, which has cost $451 million but still doesn't work, and may cost another billion dollars to finish, he said. The ambitious project, which aims to build an automated case-management system, is also on Kundra's review list.
Malamud suggested the government take some specific steps to avoid such scenarios in the future and to create a system that supports transparency. One is to digitize as much government data as possible, a move in line with the mission of his nonprofit Public.Resource.Org, which puts government information on the web.
"Prior efforts at digitization have been halfhearted," Malamud said. "We should be spending a minimum of $250 million per year for a decade on a national scanning initiative."
Currently, the government's vast data stores sit in places where most people have no access to them, Malamud said. Making them available could provide a "platform that provides access to knowledge for all," he said.
The federal government should also conduct an agency-by-agency review and rebuild IT systems mainly on open-source infrastructure, carefully choosing which proprietary technology to use, to avoid the "reliance on over-designed custom systems" he sees in federal agencies now.