Top 10 Government IT Innovators Of 2013
From low-cost open-source tech tools to better citizen services, government IT staffs show how creative -- and cost-conscious -- they can be.
Information technology has become a pivotal tool for government agencies trying to improve services, even as budgets slide steadily backward. Continuous advances in technology, from mobile applications to cloud computing, are helping federal, state and municipal government agencies develop new approaches to delivering better services to the public -- and more efficient tools to government employees.
Each year, InformationWeek Government sets out to identify the best examples of how agencies are applying IT in creative ways. After reviewing dozens of submissions, we selected 10 standout examples of IT innovation in government as InformationWeek's 2013 Government Innovators.
Their fresh approaches take many forms, from open-source applications that put new tools into the hands of government employees at substantial savings over past projects, to more sophisticated information services delivered to on-the-go citizens via mobile apps.
For example, the city of Chicago, like many cities, faces growing information demands. Yet the need to stretch resources and collaborate across departments has never been greater. The challenge was how to pull data from many disparate sources, define the relationships for using that data, and make it relevant to a diverse array of communities of interest. The solution: Chicago's Department of Innovation and Technology developed a low-cost, comprehensive situational awareness program called WindyGrid, pictured here.
The application presents a unified, real-time situational view of operations across a map of Chicago, including 911 and 311 service calls, asset locations, building information, tweets and other critical data, all available in one place. It provides city employees instantaneous deep-dive analysis for a specific location.
To support tactical and strategic predictive modeling, WindyGrid also provides a comprehensive analytics platform to understand the impact of changes in service delivery. Departments can use this to analyze historical patterns or model future changes to service delivery to determine what might result in improved outcomes and cost savings. Chicago personnel already have been able to make quicker, smarter decisions and do a better job of allocating resources.
WindyGrid uses a highly scalable MongoDB database with an ESRI map solution, already owned by the department, and industry-standard Java Web services. As a result, the department was able to create a prototype within a few months and deploy the application at a much lower cost than similar commercially developed systems, which the city estimates would likely have cost tens of millions of dollars.
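The "deep-dive analysis for a specific location" described above boils down to a geospatial lookup: given a point on the map, find every nearby 911 call, 311 request or other record. A minimal sketch of that idea appears below in Python with hypothetical data; the city's actual system uses MongoDB geospatial queries behind Java web services, and its real schema is not public.

```python
import math

# Hypothetical incident records -- illustrative only, not Chicago's schema.
INCIDENTS = [
    {"type": "311", "desc": "pothole",      "lat": 41.8827, "lon": -87.6233},
    {"type": "911", "desc": "traffic stop", "lat": 41.8787, "lon": -87.6403},
    {"type": "311", "desc": "streetlight",  "lat": 41.9484, "lon": -87.6553},
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def incidents_near(lat, lon, radius_km):
    """All incidents within radius_km of a point, nearest first."""
    scored = [(haversine_km(lat, lon, i["lat"], i["lon"]), i) for i in INCIDENTS]
    return [i for d, i in sorted(scored, key=lambda t: t[0]) if d <= radius_km]

# Everything within 2 km of a point near Millennium Park:
nearby = incidents_near(41.8826, -87.6226, 2.0)
```

In production, the distance filter would be a `$near` query against a geospatially indexed MongoDB collection rather than a scan over an in-memory list, but the shape of the question -- "what is happening within N kilometers of here?" -- is the same.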
Photo credit: Walter Mitchell
Go to the InformationWeek 500 - 2013 homepage
Few federal agencies do a better job of capturing the public imagination than NASA's Jet Propulsion Lab, which helped a worldwide audience share in the Curiosity Rover's "seven minutes of terror" before landing successfully on Mars. Behind the scenes was the clever use of cloud technology, social media and engineering to transmit real-time images from 150 million miles out in space, all on a tight budget.
JPL's IT team, anticipating Olympic-size audiences, migrated several applications to the Amazon Web Services (AWS) cloud, including the legacy content management system, the Mars public outreach websites and the Eyes on the Solar System website. But JPL IT had never implemented video streaming before and did not know how many people would watch. At the eleventh hour, it became apparent viewership would be massive, so JPL and AWS put together a cloud-based system capable of handling 80,000 requests per second, one that would ultimately stream 150 gigabits per second and deliver 150 terabytes of data over the few days of the entry-descent-and-landing event.
At its peak, the websites reached 8 million hits per minute. The cloud enabled a global audience to experience the marvels of Mars in near real time -- at the same moment JPL scientists did. Because of the cloud architecture, JPL had virtually unlimited computing and storage resources. Just as importantly, JPL was able to serve 10 to 100 times more traffic at one-tenth the cost of the Mars Exploration Rover landing nine years earlier.
The FBI launched a new virtual platform and data storage initiative called Distributed Application Virtual Environment (DAVE) in 2012 that is delivering big benefits. DAVE is a single platform that hosts over 100 distributed applications used for daily worldwide operations. It supports more than 1,500 virtual servers and resulted in the retirement of over 550 physical servers. The virtual platform is mirrored in two locations -- primary and backup -- and is replicated between sites every five minutes. A complete failover can be accomplished in less than 40 minutes with the touch of an icon.
DAVE provides a single storage and backup capability for FBI field offices. Copies of data are maintained at three different centralized tiered storage locations, resulting in redundancy at each level. The virtual platform and single-storage system is helping the FBI standardize hardware, simplify IT processes, reduce server costs and manage complicated operations and maintenance procedures at scattered locations. The DAVE development team, pictured here (from left), includes David Waters, Oscar Prestwood, Laura Cain, Daniel Meyers, Jason James, April Landers, Richard Butler and Gian Ventura.
Immediately following Superstorm Sandy, New York City's Department of Transportation (NYCDOT) sprang into action to quickly assess and document nearly $500 million in damages to city streets, sidewalks, bridges and other facilities so rebuilding could begin. The information was vital for repair crews and for supporting reimbursement requests to city, state and federal agencies.
NYCDOT employed its mobile-first application development strategy to create a responsive GIS Web app that makes this information readily available. Its interactive map organizes photos and damage reports by location among 44 data layers. The map resizes itself to fit the screen used to run it: smartphone, tablet or desktop. Because the app operates from the cloud, a browser is the only client software needed.
The application allowed NYCDOT field inspectors to use iPads to capture digital photos of damage with accurate embedded location data. It also allowed NYCDOT commissioner Janette Sadik-Khan to summon data from an interactive map on her iPad while walking the halls of Congress to advocate for storm relief funding for New York City.
The Department of Homeland Security operates more than 20 separate radio networks serving more than 120,000 front-line agents and officers. Most of these systems were deployed more than 20 years ago and often fail to provide sufficient coverage in remote locations. Many also don't meet federal mandates for encryption security and spectrum efficiency.
The Joint Wireless Program Management Office (JWPMO), chartered in April 2012 under DHS's U.S. Customs and Border Protection unit, in partnership with DHS Science and Technology (S&T) Directorate, took an innovative approach to tackling the problem. JWPMO teamed up with the Departments of Commerce, Justice and Interior to develop an enterprise-level broadband communications strategy and laid the groundwork to move from agency-owned-and-operated systems to the Nationwide Public Safety Broadband Network (NPSBN) and commercial subscriber-based services.
The NPSBN provides a crucial resource for DHS components to dramatically lower costs for building and maintaining wireless communications infrastructure. Without the effort, DHS was looking at spending $3.2 billion to upgrade its existing private radio systems department-wide.
The JWPMO team consists of representatives from across the Department of Homeland Security, including Cynthia Walters, acting executive director (seated at back right).
Fort Bragg, N.C., "Home of the Airborne and Special Operations Forces," equips, trains, rapidly deploys and sustains full-spectrum forces supporting combatant commanders. The Network Enterprise Center (NEC-FB) provides IT support to these diverse missions and recognized the need to modernize its IT capabilities in anticipation of continued data growth and Army plans to reduce its data center count from 260 to 65.
Working with Northrop Grumman Information Systems, OSC Edge, EMC Corporation, VMware, Cisco Systems and CDW-G, Fort Bragg's data center operation became the first to fully virtualize in line with the Army Data Center Consolidation Plan. The operation now features high-performance unified storage, continuous data replication with multiple recovery points, inline data de-duplication, high-speed backup to disk, and archiving. The unified system is designed to manage physical, virtualized and cloud environments with automated end-to-end recovery processes.
The team's innovative acquisition approach -- evaluating technology without vendors' names -- resulted in a 70% reduction in servers and 1000% increase in storage capacity, according to Fort Bragg officials.
The project team included (front, from left) Robert Johnson, Crystal Hewett, Patti Pirtle; (rear, from left) Robert Wood, Steve Smith, Brett Hershfeild, Gerry Rhiner, Bryan Strauch, Daniel Nestor, James Nious and Rob Underwood.
The Disability Insurance Branch of the California Employment Development Department (EDD) was using a decades-old mainframe system that was unable to accommodate new technology or the growing demands of California citizens. The department set out to build a new Web-based system that would automate the nation's largest state disability insurance program, responsible for handling more than a million claims and paying out more than $5 billion annually.
The resulting system significantly expands access to claimants, medical providers, employers and voluntary plan administrators. It also accelerates the delivery of benefits to claimants and improves the state's ability to detect and prevent fraud. Using service-oriented architecture models and business intelligence engines to help integrate the system with other state agencies, the EDD team developed an enterprise mainframe platform that can handle approximately 120 interfaces and more than 1,000 input parameters, allowing the state to adapt to changing needs without costly reprogramming.
Now, payments are authorized the same day for about 25% of claims. Also, half of the 3 million continued claim forms received annually by EDD can now be approved and benefit payments authorized without staff intervention, saving significant time and delivering a dramatically improved level of customer service.
The U.S. General Services Administration (GSA) has continued its multi-year Drive-to-the-Cloud initiative with a number of accomplishments over the past 18 months. Following its transition to a cloud-based email platform and a cloud-based collaboration/application platform, the GSA has continued to lead efforts in cloud security, shared-service cloud applications, mobile computing and social apps. Those efforts have paved the way for other federal agencies to adopt cloud computing as a viable and scalable option to support their business IT environments.
In the process, GSA's work has led to the consolidation of 1,700 legacy apps to fewer than 100; the development of 26 new enterprise business apps in less than six months; the reduction of app development time by 75%; and the reduction of the five-year average total cost of operations for apps by 90%. Meanwhile, GSA's leadership in creating a related set of policies -- a "Cloud Center of Excellence" -- is helping eliminate duplication through better coordination and data re-use across the federal government.
Michigan is one of just a handful of states to have consolidated information, communications and technology (ICT) services into a single agency over the past decade, and it is among the first to conduct a comprehensive assessment -- with the help of analyst firm Gartner -- of how the state does business. Rather than benchmarking against other U.S. state governments, the state compared itself with best-practice public- and private-sector enterprises worldwide.
The result: Michigan combined infrastructure and agency services with a focus on customer service excellence, strengthened its project and portfolio management, and created an enterprise service catalog. It also reportedly is the only state with an innovation management function supported by ICT investment and innovation funds. That's despite Michigan spending just 1% of operating expenses on ICT, versus an average of 3% among its peers.
The Obama administration's Digital Government Strategy, released a year ago in May by the Office of Management and Budget, is redefining the way the government thinks about the information it produces. The strategy is based on the notion that "all Americans should be able to access information from their Government anywhere, anytime, and on any device," says U.S. CIO Steve VanRoekel in a CIO Council blog.
Behind that strategy is an enlightened effort to treat data as a valuable national asset -- by releasing hundreds of government datasets and developing new ones via machine-readable formats. That effort gained new momentum when the President released a historic Executive Order and Open Data Policy making open and machine-readable the new default for government data.