The job of CIO in government has become more challenging and more visible. InformationWeek Government recently hosted the Government IT Leadership Forum, where more than a dozen CIOs and CTOs shared their strategies. Here are highlights from the event.

InformationWeek Staff, Contributor

July 2, 2010


The job of CIO in federal government has become more challenging and more visible. Uncle Sam's top IT decision makers are looking to secure systems, improve project performance, and deliver new services in an era of open government. On June 15, InformationWeek hosted the Government IT Leadership Forum in Washington, where more than a dozen CIOs and CTOs shared their strategies on these and other priorities. Here are highlights from the event.

Tackling The Tough Issues Head On

Federal CIO Vivek Kundra gave the opening keynote at InformationWeek's Government IT Leadership Forum. Following is an excerpt from his speech.

People can go online using consumer technologies, and they're able to conduct their day-to-day activities in a manner that's almost frictionless. Yet when it comes to dealing with their government, we tend to take them back a decade or two or three. This gap in technology, unfortunately, is a result of the federal government focusing on the wrong areas in terms of the investments we've been making.

Part of what we're trying to do in the Obama administration is bring the Darwinian pressure that's apparent and present in the consumer space to federal government. How do we innovate? How do we deploy technology? How do we make sure that it doesn't take years in terms of rolling out some of these innovations?

We've begun that journey. There are a number of success stories across federal government where great work is happening. We just announced a simple move by the Department of Treasury, essentially going paperless, saving hundreds of millions of dollars over the next five years. More importantly, it's also going to prevent fraud on an ongoing basis.

Across the federal government, people have tried to address this problem over the last 50 years. There have been OMB memos and legislation. It's not because of a lack of investment. We've spent as a federal government over $500 billion [on IT] over the last decade, yet too many times we end up with large-scale IT failures.

For far too long, what ends up happening is we throw good money after bad. When IT managers across the federal government begin to make investments, one of the challenges is that long procurement cycles and complexity lead managers to oversize the IT project. Once that project is oversized, new stakeholders are brought in, which leads to exponential complexity in requirements and definition.

Then, when the project isn't working well, you end up having more oversight, which generates more paperwork, and more resources are expended on overhead rather than on solving the root cause of the problem itself. We've seen this across the board, and what we've tried to do is convene the brightest minds across the country. The president in January invited top CEOs from the private sector to a summit at the White House to talk about how the government can apply some of the best practices of the private sector. A number of themes emerged; key among them was to simplify, and to set shorter time frames for deliverables.

As I've been looking at federal IT investments across the board, we've found investments where people have spent five to seven years essentially blueprinting, and what you end up with are architectural documents that nobody really implements. There's also a challenge when it comes to human capital: making sure we've got the right talent in federal government to manage these complex agreements and contracts well.

We decided to take this issue head-on from an execution perspective. When I came in, one of the first things I received was a document that contained over $27 billion of IT investments that were over budget or behind schedule. We decided to say, "We can't manage this when we look at these investments once a year."

Public Feedback

That's one of the reasons we decided to make sure that we were being transparent and open about how these investments are performing. We launched the IT Dashboard, and a lot of the CIOs I see in this room have your faces right next to [your] IT projects. What's been really useful is we've actually gotten the American people engaged, and they're giving us feedback on these investments and how they're performing.

That was step one, sort of the first brick in a foundation that we're trying to lay, but that's not sufficient. We moved forward and said, "We need to make sure that we've got essentially an office of analytics within OMB that's looking at these investments." We launched the TechStat sessions in January.

We recognized that there are a number of problems across federal government where we haven't been tough enough in making sure that we're holding ourselves and contractors accountable for results. So these TechStat sessions have unearthed a number of issues that we're addressing. We've seen agencies take on their IT investments, like Veterans Affairs, which went after 45 IT projects, halted those that weren't performing, and terminated 12 of them.

New Areas Of Focus

Across the board, what we're trying to do as we look forward is make sure that we address some of these persistent issues. As part of the 2012 budget process, there are three major areas that we've focused on.

No. 1 is around infrastructure. The president has committed to a net zero-growth policy when it comes to data center infrastructure. That policy is going to be reflected in the 2012 budget and ongoing years. And there's a shift toward cloud computing to make sure that we're deploying technology faster and cheaper, and that we're thinking through all the elements around security, privacy, data portability, and interoperability.

IT Dashboard

  • Launched in June 2009 as a way of exposing the status of federal IT projects

  • Provides data on 7,000 IT investments, including 800 deemed 'major'

  • Highlights projects in need of attention or of significant concern

The second area we're focused on is IT project management. CIOs are directed to review projects that are behind schedule or over budget before they're submitted for the 2012 budget process.

Third is cybersecurity. The State Department spent over $133 million over six years to generate paperwork reports. Unfortunately, that doesn't make the State Department more secure. What makes us more secure is real-time security monitoring--continuous monitoring--and acting on data. That's why agencies are directed, as part of the budgeting process, to make sure that their budgets reflect this presidential priority: investing in tools, not paperwork reports.

So those three key areas are going to be central to our federal IT strategy. We want to be able to close this technology gap, because our society and the public expect their federal government to be as user-friendly as the technologies they use in their day-to-day lives.

NASA Looks To Optimize Its Innovations

NASA's new CTO for IT, Chris Kemp, wants to more fully exploit the myriad technology innovations created by the space agency's researchers, scientists, and technologists. Two months into the new position, Kemp shared his strategy for channeling that innovation in new ways.

NASA CIO Linda Cureton announced Kemp's appointment as CTO for IT, a newly created position, in May. Kemp is responsible for NASA's enterprise architecture division and for introducing new and emerging technologies. He's also charged with forming a council of CTOs from NASA field centers and mission teams that will foster innovation across NASA. Kemp was previously CIO of NASA's Ames Research Center in Northern California. Before that, he worked for Escapia, a Web company, and Classmates.com.

Speaking at InformationWeek's Government IT Leadership Forum, Kemp said one of his goals in his new job is to connect the previously "disconnected pockets of innovation" at NASA, which spends close to $2 billion annually on IT and technology development.

NASA's 10 field centers must look for opportunities to establish ties with tech companies in their area, he said. Ames Research Center, for one, is near the offices of Google, Microsoft, and Yahoo. "It's a unique opportunity we should take advantage of," he said.

Kemp has experience creating such industry partnerships. As director of strategic business development at Ames, prior to serving as CIO there, he struck collaboration agreements with Google and Microsoft, resulting in NASA's high-resolution imagery of the moon, Mars, and other planets and stars being made available on the Web through Google Earth and Microsoft's WorldWide Telescope.

NASA must also find ways to accommodate a new "unconstrained" generation of employees who are mobile, always connected, and avid users of social media and crowdsourcing applications, Kemp said. "The challenge here is that the culture at NASA and a lot of other agencies isn't ready for this," he said.

Cloudy Future

At Ames, Kemp was project leader for a cloud computing pilot project, called Nebula, that's being expanded to Goddard Space Flight Center in Maryland. Nebula's hardware and software are housed in a mobile shipping container with a capacity of 16 petabytes of data storage and 12,000 CPU cores.

The Nebula project, however, caused political, budget, and personnel disruptions within NASA, and "it wasn't pleasant," Kemp said. His point is that disruptive technologies are just that--disruptive. It goes with the territory for CTOs, he said. Kemp favors "elegant" technologies over complex systems with many moving parts.

It's important for CIOs and CTOs to understand the "as is" state of IT infrastructure in assessing where emerging technologies will fit, Kemp said. "This is all driving toward a repeatable process for evaluating the current state of NASA's IT infrastructure and articulating our future road map," he added. That process involves the development of technology prototypes, followed by case studies, and then, where appropriate, investment.

Kemp plans to use metrics to gauge the effectiveness of IT pilot projects at NASA. He advocates completing pilots within three to four months and considers "failure" to be part of the process. "The idea is to fail fast," he said. "Movement is key."

-- John Foley

Progress In Intelligence Sharing

"I wish we knew what we know." That aphorism, attributed to the late Lewis Platt, former CEO of Hewlett-Packard, sums up the conclusions of a Senate Select Committee on Intelligence report, which found that the U.S. intelligence community had more than enough information to keep a would-be bomber from boarding a Detroit-bound commercial airliner last Dec. 25.

Intelligence sharing has been a priority of the U.S. government since 9/11, another occasion when the intelligence community failed to connect the dots, but technology, policy, and cultural barriers remain. Can the intelligence community evolve from a need-to-know culture--which has kept intelligence information in silos for generations--to a need-to-share approach?

IT leaders at three government agencies--the Department of Homeland Security, the Defense Intelligence Agency, and the CIA--offered signs of progress at InformationWeek's Government IT Leadership Forum, but they agreed that much remains to be done. Improving information sharing has been "a long slog," said Margie Graves, deputy CIO of DHS and formerly with the Transportation Security Administration.

Created after 9/11, DHS comprises 22 organizations, including TSA, Citizenship and Immigration Services, and Customs and Border Protection. Graves helped write the original DHS business plan for IT consolidation, five years ahead of the current government-wide data center consolidation initiative. DHS is consolidating 24 data centers into two and uniting 12 e-mail systems, with a goal of creating more manageable services and reducing costs.

From an intelligence perspective, however, consolidation is akin to creating larger haystacks, and it doesn't fundamentally address the problem of finding needles. For that, DHS has taken the lead in promoting the National Information Exchange Model, an XML-based data modeling and schema framework for information sharing among government agencies.

Originally spearheaded by the Department of Justice, NIEM is now being led by DHS with participation from DOJ, the FBI, and the Office of the Director of National Intelligence. Ten agencies are participating and another seven are under review.

NIEM is a starting point for shared metadata and data models that will enable intelligence agencies to map one database to another. Graves said NIEM has helped DHS create a "person-centric" view that standardizes the attributes of an individual, including name, date of birth, and place of birth. "Now Customs and Border Protection officers at a border crossing can look at a federated query that pulls from 13 separate data sets," Graves explained. Work is under way on a similar federated query that will let Citizenship and Immigration employees pull together information from 11 data sets.
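To make Graves' two ideas concrete--a person-centric record with standardized attributes, and a federated query that merges answers from separate data sets--here is a minimal sketch in Python. The element names and the in-memory "data sets" are illustrative assumptions, not actual NIEM schema or DHS systems.

    # Sketch: a NIEM-style, person-centric XML record plus a toy
    # federated query. Element names and data sets are hypothetical.
    import xml.etree.ElementTree as ET

    def person_record(name, dob, birthplace):
        """Build a person-centric record with standardized attributes."""
        person = ET.Element("Person")
        ET.SubElement(person, "PersonFullName").text = name
        ET.SubElement(person, "PersonBirthDate").text = dob
        ET.SubElement(person, "PersonBirthLocation").text = birthplace
        return person

    # Stand-ins for independent agency data sets sharing one schema.
    DATA_SETS = {
        "watchlist":    [("Jane Doe", "1970-01-01", "Springfield")],
        "visa_records": [("Jane Doe", "1970-01-01", "Springfield"),
                         ("John Roe", "1982-06-15", "Shelbyville")],
    }

    def federated_query(name):
        """Ask every data set the same question; merge the hits."""
        hits = []
        for source, rows in DATA_SETS.items():
            for full_name, dob, place in rows:
                if full_name == name:
                    record = person_record(full_name, dob, place)
                    record.set("source", source)
                    hits.append(record)
        return hits

    for rec in federated_query("Jane Doe"):
        print(rec.get("source"), ET.tostring(rec, encoding="unicode"))

Because every data set maps to the same standardized attributes, the caller issues one query and gets back one merged, cross-agency view--the property Graves describes.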

Access Management

Across the 16-agency intelligence community, obstacles to information sharing remain entrenched. Identity access management, a requirement for verifying users and authorizing access to restricted resources, continues to be one big challenge.

Keeping information from falling into the wrong hands has been a cornerstone of the intelligence community's need-to-know culture. The downside is that information remains in silos, making it impossible to connect the dots. Following the Intelligence Reform and Terrorism Prevention Act of 2004, it became obvious that without access management, the Defense Intelligence Agency would never be able to interoperate with its partners in the intelligence community, said Casey Henson, the DIA's CTO.

There's been progress on the IT front. In recent years, the DIA, National Reconnaissance Office, and National Security Agency have built and now share a security management system and an identity access management system. The systems can publish and consume Web services, so other agencies will be able to take advantage by adapting their architectures. "It's the three of us using it today, but within the next 12 to 18 months we'll probably double that and we could triple it within [another] six to 10 months," Henson said.
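Henson didn't detail how the shared systems work, but the general pattern is one identity and access management service that many agencies consume rather than each building its own. The Python sketch below illustrates that pattern under stated assumptions; the user IDs, clearance levels, and interface are hypothetical, not the actual DIA/NRO/NSA systems.

    # Sketch: one shared identity/access service consumed by several
    # agencies. All names, levels, and calls here are hypothetical.

    SHARED_DIRECTORY = {
        # user id -> (verified identity, clearance level)
        "analyst42": ("J. Analyst", "TS/SCI"),
        "liaison07": ("L. Liaison", "SECRET"),
    }

    def authenticate(user_id):
        """Shared service: verify a user's identity once, for everyone."""
        return SHARED_DIRECTORY.get(user_id)

    def authorize(user_id, required_clearance):
        """Shared service: check access to a restricted resource."""
        entry = authenticate(user_id)
        if entry is None:
            return False
        order = ["UNCLASSIFIED", "SECRET", "TS/SCI"]
        return order.index(entry[1]) >= order.index(required_clearance)

    # Any consuming agency makes the same calls--over Web services in
    # the real systems--instead of keeping its own silo of accounts.
    print(authorize("analyst42", "TS/SCI"))   # True
    print(authorize("liaison07", "TS/SCI"))   # False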

The technical barriers are the easiest to remove. A need-to-know culture keeps agencies focused on their own agendas.

"Every agency says, 'I have unique needs.' Then their IT providers say, 'I will give you the 100% solution for that need, but you have to give us all this money to create a unique solution,'" said Don Burke, the "doyen" of Intellipedia, an intelligence-community-wide wiki launched by the Office of the Director of National Intelligence in 2006. Intellipedia is an example of the kind of "systems of common concern" the intelligence community must encourage and support, Burke said.

More To Do

Siloed thinking is changing, if slowly, at DHS. As recently as three years ago, DHS component organizations would focus on promoting their own system as a focal point for intelligence sharing, but the conversation has turned to adopting frameworks and architectures that enable everybody to participate. "Getting to the point where we can talk about standards and interoperability and changing the architecture is a major step forward," Graves said.

In an example of what data sharing can do, the arrest of the Times Square bomber could be tied, in part, to a suspicious activity reporting initiative that now reaches across DHS agencies, Graves said.

The Office of the Director of National Intelligence is trying to promote cross-agency collaboration, too. For more than 18 months, it has been convening monthly meetings of the 16 intelligence agency CIOs.

Yet every terrorist incident serves as a reminder of the urgent need to do more. The fallout from the failed Christmas Day bombing and the damning Senate report included the resignation in May of Dennis Blair, director of national intelligence.

In nominating retired Air Force general and DIA veteran James Clapper to fill that post, President Obama emphasized the need to analyze and share intelligence more effectively, and to act on it. Said Obama: "Our intelligence community needs to work as one integrated team that produces quality, timely and accurate intelligence."

-- Doug Henschen

FISMA Meets Continuous Monitoring

Federal agencies are fed up with the FISMA compliance process, complaining that it's outdated and expensive.

They have a point. Compliance with FISMA--the Federal Information Security Management Act--means collecting loads of data on systems and devices and submitting lengthy reports to auditors. At InformationWeek's Government IT Leadership Forum, federal CIO Vivek Kundra outlined a plan to move away from paper-based FISMA compliance and invest in continuous-monitoring technology.

I can't help wondering, though, if the "choice" between FISMA compliance and continuous monitoring is a false one, having seen time and again that compliance efforts can force a wholesale shift toward better security. At the same time, no security environment--compliant or not--is safe without some form of continuous monitoring.

During a forum session on cybersecurity, Ron Ross, project leader for the National Institute of Standards and Technology's FISMA implementation project, emphasized that the newest guidelines will include continuous monitoring as a core component.

"Continuous monitoring isn't a strategy. It's a tactic," Ross said. "It's part of a risk management framework."

Other risk management steps include selecting the right set of controls and making sure those controls are implemented correctly, he said. "In order to make continuous monitoring effective in what it's intended to do, you've got to be monitoring the right stuff," Ross added. "And the stuff you put in has to be effective. Those controls really are what provide the strength to withstand cyberattacks."
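Ross's sequence--select the right controls, implement them correctly, then monitor them continuously--can be sketched in a few lines of Python. The two controls and their checks below are hypothetical stand-ins, not NIST-specified controls.

    # Sketch: continuous monitoring as one step in a risk management
    # framework. Controls and checks are hypothetical stand-ins.
    import time

    def patches_current():
        return True   # stand-in for querying a patch management system

    def firewall_baseline_intact():
        return True   # stand-in for diffing firewall config vs. baseline

    # Steps 1-2: the selected controls, each paired with the check that
    # verifies it was implemented and is still in place.
    CONTROLS = {
        "software patched": patches_current,
        "firewall baseline intact": firewall_baseline_intact,
    }

    def monitor(cycles=1, interval_seconds=0):
        """Step 3: re-check every control on an ongoing schedule."""
        for i in range(cycles):
            for name, check in CONTROLS.items():
                print(("OK" if check() else "ALERT"), "-", name)
            if i + 1 < cycles:
                time.sleep(interval_seconds)

    monitor()

The point of the sketch is the structure, not the checks: the monitoring loop only has value if the controls it watches were chosen and implemented well in the first place, which is Ross's argument.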

What Ross is saying is that a new emphasis on monitoring shouldn't be seen as an abandonment of the compliance effort, or as an either/or choice in IT security. Any effective strategy for agencies will have to include both FISMA compliance and continuous monitoring, even if one occasionally takes priority over the other.

-- Tim Wilson

Project Management Push

The Obama administration is increasing pressure on CIOs in federal government to improve IT performance. Its approach--including the IT Dashboard and hands-on project reviews--is leading to changes in the way projects get managed.

The IT Dashboard puts "community pressure" on CIOs and other agency officials, said Education Department CIO Danny Harris, during a session titled "Feet-To-The-Fire Project Management" at InformationWeek's Government IT Leadership Forum.

Agencies have long had internal IT investment review boards and vetted their budgets with the Office of Management and Budget, but IT project management wasn't out in the open as it is now. Federal CIO Vivek Kundra launched the Web-based IT Dashboard in June 2009, exposing the ongoing performance of federal IT projects. Kundra's office is also running metrics-driven TechStat sessions where the federal CIO and agency CIOs talk through troubled projects. And, beginning with fiscal 2012, CIOs are required to review projects that are behind schedule or over budget before they're submitted to the budget process.

Such efforts are forcing changes in the way IT projects are viewed. If an initiative is deemed "a dog," the decision must be made to end it and reallocate those resources, Harris said. The IT Dashboard has made CIOs accountable to agency secretaries, and secretaries to the public. "It doesn't get any more powerful than that," said Harris.

Agency CIOs such as Roger Baker of Veterans Affairs and Jerry Williams of Housing and Urban Development are now taking the concept further with agency-specific IT dashboards. CIOs are also attending one another's TechStat sessions and sharing best practices.

Kundra indicated more can be done, pointing to areas "where we haven't been tough enough" in holding agencies and contractors accountable for results.

-- J. Nicholas Hoover
