By winning a $600 million CIA contract, Amazon Web Services has positioned itself as IBM's equal in building secure systems and its superior in building large-scale, elastic ones.

Charles Babcock, Editor at Large, Cloud

November 19, 2013

6 Min Read

The recently concluded court case between IBM and Amazon over a $600 million CIA cloud computing contract left unanswered several interesting questions about the bidding process and the US Court of Federal Claims decision that came down decisively in Amazon Web Services' favor.

But there's no question about one result: Amazon has changed the terms of the discussion about building secure, large-scale systems.

One of the still-open questions: Who was the third bidder for the CIA contract, right up until the agency decided it had just two viable bids?

"After reducing the competitive range to AWS, IBM, and a third offeror, the agency conducted written and oral discussions," the Court of Federal Claims wrote in recounting the facts of the case. It did not name that third bidder. Was it Microsoft? Google? Rackspace? More likely, it was Terremark, which has extensive experience with government data-security requirements.

[ Want to learn more about security in the Amazon cloud? See Amazon EC2 Gains Key ISO Security Certification. ]

Another question: What did the request for proposal mean by "Scenario 5," which demanded the ability to process 100 terabytes of data? This question formed the basis for IBM's protest and claim of prejudice in the bidding process. IBM interpreted Scenario 5 to mean one pass through a given 100 terabytes of data in a 36-hour period. The other bidders had assumed the request was for repeated processing of 100 terabytes in a continuous fashion. That assumption seems much more realistic to me, given the amount of communications and data the CIA is trying to handle. Why would IBM assume the agency needed a single look at any given 100 terabytes?
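The gap between the two readings is easy to quantify. A single pass through 100 terabytes in 36 hours implies roughly 0.8 GB/s of sustained throughput for a day and a half; the continuous reading implies provisioning that kind of capacity year-round. A back-of-the-envelope sketch in Python (assuming decimal terabytes; the RFP's exact definitions aren't public):

```python
TB = 10**12  # decimal terabyte -- an assumption; the RFP's unit isn't specified

data = 100 * TB                 # 100 TB per processing run
single_pass_window = 36 * 3600  # one run in 36 hours (IBM's revised reading)

# Sustained throughput needed for a single 36-hour pass
rate_bytes_per_s = data / single_pass_window
print(f"{rate_bytes_per_s / 10**9:.2f} GB/s")  # ~0.77 GB/s

# Continuous reading: back-to-back 36-hour runs over a year
passes_per_year = (365 * 24 * 3600) / single_pass_window
print(f"{passes_per_year:.0f} passes/year")    # ~243 runs
```

The hardware sized for one 36-hour burst and the hardware sized to repeat that burst roughly 243 times a year are very different bids, which is why the single-pass interpretation produced a lower price.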

Actually, it didn't assume that until the revised part of the bidding process. It initially assumed there would be multiple passes through a requested analysis of 100 terabytes, and, like the other bidders, it formulated a proposal accordingly. These proposals were discussed "in written and oral communications" between the agency and bidders, the court wrote. After the discussion, each bidder revised its proposal to make its best offer. Only after this process was exhausted did IBM switch its approach and submit a bid based on a single pass through the data.

"IBM suddenly deviated from its initial Scenario 5 proposal," the court wrote. "IBM changed its solution from one that processed 100 TB of data continuously throughout the year to one that performed a single 100 TB processing run in a 36-hour period, thereby reducing its price."

During the discussion phase, the CIA raised no objection with IBM over its proposed continuous processing. Indeed, the company's approach was consistent with that of the other bidders. Because the CIA didn't prompt IBM to reconsider that part of its bid, it "effectively conveyed to IBM that it had correctly followed the Scenario 5 instructions," the court wrote. In a move the court found too much to take, IBM then altered its approach and claimed prejudice because its Scenario 5 bid hadn't been considered on the same grounds as everyone else's.

I am tempted, like the court, to say IBM's Federal Systems division has operated in the nation's capital for too long. Perhaps it's overdue for a hiatus in Topeka, Kan., Sacramento, Calif., or, better yet, Lodi, Calif., where it might find both the time and introspection to reconsider some of its sophisticated ways. But it's hard to come away from this bidding process without reaching another overall conclusion: Amazon Web Services winning the CIA contract is tantamount to a small, knobby-kneed horse named Seabiscuit winning the American Horse of the Year title against well-bred competition.

Amazon had to convince the CIA to give it a 10-year contract to build and operate a private x86 cloud that must function as one of the most secure clouds in the world. IBM had more overall experience in the field of security than Amazon, but the two companies received equal "pass" ratings in the contract proposal.

IBM would probably win a competition based on a mixed mainframe-Unix-x86 environment, but that perhaps is part of the point. The CIA cloud will doubtless be x86 only. That architecture is well known to a wide body of hackers and intruders -- everyone from script kiddies to Russian bank hackers and Chinese espionage artists. Amazon's business is exposed every day to attempted intrusion and gross violation, and it has built out one of the best layered defenses in the world.

Under Amazon's model, a workload arrives under enterprise control, with the enterprise imposing its own user identification and authentication. If privacy is important, then by default the workload gets an Amazon-provisioned firewall, where the default setting is closed ports unless the customer specifically directs that they be opened. A firewall at the Xen hypervisor running an Amazon Machine Image inspects traffic from virtual machines to see whether the VMs are talking to authorized resources or trying to talk to neighboring virtual machines with which they have no business.
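The default-closed posture described above can be illustrated with a toy default-deny filter. This is a sketch of the principle only, not the actual EC2 security-group API:

```python
class SecurityGroup:
    """Toy model of a default-deny firewall: every port is closed
    until the customer explicitly opens it."""

    def __init__(self):
        self.open_ports = set()  # nothing is open by default

    def open_port(self, port):
        # The customer must specifically direct that a port be opened
        self.open_ports.add(port)

    def allows(self, port):
        return port in self.open_ports


sg = SecurityGroup()
print(sg.allows(22))   # False: closed until explicitly opened
sg.open_port(443)
print(sg.allows(443))  # True: only what the customer opened
```

The design choice matters: a default-deny stance means a misconfiguration fails closed, leaving a service unreachable, rather than failing open and exposing it.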

Hosts are guarded by Amazon's own identity and access management system, and I suspect a monitoring system watchdogs all login activity, prepared to sound the alarm at patterns that indicate an intrusion attempt. Most of Amazon's security is automated. I suspect a few skilled eyes watch the audits of the access management system around the clock for any exceptions or errors that might indicate someone is trying to take advantage of a host or customer VM, and likewise for the intrusion detection system.
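A watchdog of the kind speculated about above might, in its simplest form, count failed logins per source and raise an alarm past a threshold. This is purely illustrative; Amazon's actual monitoring internals are not public, and the threshold here is an assumption:

```python
from collections import Counter

FAIL_THRESHOLD = 3  # assumed cutoff for raising an alarm


def scan_log(events, threshold=FAIL_THRESHOLD):
    """Return the set of sources whose failed-login count meets the threshold.

    `events` is a list of (source, succeeded) pairs.
    """
    failures = Counter(src for src, ok in events if not ok)
    return {src for src, count in failures.items() if count >= threshold}


log = [("10.0.0.5", False), ("10.0.0.5", False), ("10.0.0.9", True),
       ("10.0.0.5", False), ("10.0.0.9", False)]
print(scan_log(log))  # {'10.0.0.5'}: three failures trips the alarm
```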

The CIA believes it has high-volume work that can best be accomplished in a cloud computing architecture, and that Amazon is the right party to build and operate that cloud. The agency's cloud will no doubt have every security measure applied to EC2 and then some. In the process, Amazon is going to gain unparalleled expertise in large-scale x86 private cloud security. What it learns will inevitably spill over to its more prosaic operations in EC2.

Chances are, when Amazon offers private cloud operations in its infrastructure in the future, people will be much more prepared to drop their notions that their own datacenter is safe but the public cloud is not. Instead, they may wish they could make their datacenter as safe as they now perceive the public cloud provider's to be.


About the Author(s)

Charles Babcock

Editor at Large, Cloud

Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld and former technology editor of Interactive Week. He is a graduate of Syracuse University where he obtained a bachelor's degree in journalism. He joined the publication in 2003.
