I'm beginning to think the unanswered question about cloud computing is integration. Sure it's easy to provision servers with a workload in the cloud, assign storage, even create an instant recovery system in a neighboring availability zone. But after that, what can you connect it to?
Integration was the problem in the previous cycle of computing. That was when everything was under one roof. What's to keep it from being an even bigger issue in the cloud era? Talend and Jitterbit may be part of the answer, since the cloud seems to like open source code, and the two of them have sustained a prolonged output of open source connectors and adapters. Still, you need the expertise to work with all of those moving parts.
It seems to me the cloud itself has to serve as part of the solution. It's not enough to merely duplicate everything we did inside the data center again outside in the cloud. That sounds so Sterling Commerce/Progress Software/Iona-ish. Oh, that's right, Progress acquired Iona three years ago.
The workload in the cloud is going to have the same need to connect to a particular database (whether a standard relational system or a non-standard structure), to other applications, and to a multitude of data-generating sources. One way to describe the shortcomings of the current environment, however, is to imagine developing an application in the cloud that will need to connect to the mainframe. There are no mainframe services in the x86-based cloud. How are you going to test the application to know whether it really taps into the mainframe's Customer Information Control System (CICS)?
John Michelsen, chief scientist of cloud development, virtualization, and testing software provider iTKO, calls this "the wires hanging out of the cloud" problem. You've developed software that you need to test, but you can't give it a realistic run because there's no equivalent to a mainframe system in the cloud environment. In a test, your application issues a call to the mainframe's Information Management System (IMS), but the wire over which the call went out is disconnected, hanging loose, unable to let the software complete its function. Michelsen wouldn't be Michelsen if he weren't sure he's got the answer, which is: specialized modules that can mimic mainframe functions in the cloud, allowing the application to function as if it were attached to the target system.
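The service-virtualization idea Michelsen describes can be sketched in a few lines: a lightweight stand-in replays plausible responses for the mainframe transactions the application calls, so the "wire" is no longer hanging loose during a test. Everything here is invented for illustration; it is not iTKO's actual product API.

```python
# A minimal sketch of a "virtual service": a stub that mimics a
# mainframe transaction's responses so a cloud-hosted application can
# be tested without a real CICS/IMS endpoint. All names are
# hypothetical, chosen only to illustrate the technique.

class VirtualMainframeService:
    """Replays canned request/response pairs recorded from (or modeled
    on) the real system."""

    def __init__(self):
        self._responses = {}

    def record(self, transaction_id, response):
        # In a real tool, these pairs would be captured from live traffic
        # against the actual mainframe, then replayed in the cloud.
        self._responses[transaction_id] = response

    def call(self, transaction_id, payload=None):
        if transaction_id not in self._responses:
            raise LookupError(f"no stubbed response for {transaction_id!r}")
        return self._responses[transaction_id]

# The application under test talks to the stub exactly as it would
# talk to the real transaction system.
stub = VirtualMainframeService()
stub.record("CUST-LOOKUP", {"customer": "ACME", "status": "active"})
print(stub.call("CUST-LOOKUP"))
```

The point is not the stub itself but where it runs: inside the same cloud environment as the application, standing in for a system that cannot be moved there.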
But I'm looking for a more generic solution. Why can't the cloud help solve the problem that arises as it starts to run more and more enterprise workloads?
First came the immense libraries of application adapters and connectors, from the likes of Informatica, Tibco, Progress Software, and Sterling Commerce. Then the enterprise service bus proved to be a great invention for translating exchanges between running applications, using a messaging structure that was already resident in the enterprise to activate connections rather than build each one point-to-point separately.
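Why the bus beat point-to-point integration comes down to simple arithmetic: with n applications, wiring each one to every other takes on the order of n*(n-1) one-way links, while a bus needs only one connection per application. A toy calculation (illustrative, not any vendor's API):

```python
# Point-to-point vs. bus integration: how many links each approach
# needs as the number of applications grows.

def point_to_point_links(n_apps):
    # every application wired directly to every other application
    return n_apps * (n_apps - 1)

def bus_links(n_apps):
    # every application wired once, to the bus
    return n_apps

for n in (5, 10, 20):
    print(f"{n} apps: {point_to_point_links(n)} point-to-point links "
          f"vs {bus_links(n)} bus links")
```

At 20 applications the difference is 380 links versus 20, which is why translating exchanges at a shared messaging layer became the standard approach.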
Why can't we build a service bus that connects everything to everything, using the Internet as the connective tissue? At one time, Grand Central Communications, the firm started by Cnet founder Halsey Minor, promised to do this, but it was too far ahead of its time and had too little ammunition to throw at a huge, ongoing business problem.
On the other hand, if you sit down with Gaurav Dhillon, co-founder of Informatica, you realize he's thinking along Internet-as-service-bus lines. The TCP/IP network is the message bus, and an Internet server can act as an integration hub, not just for many applications at one company but for hundreds of applications at thousands of companies. To do this, the adapters and connectors need to be carefully cataloged as to version and release, and the difficulties of reaching a connector or forging a custom one must be minimized.
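The cataloging requirement Dhillon describes can be sketched as a simple registry: each connector is published to the hub by application name and version, and clients always fetch one known to work. The class and method names below are hypothetical, meant only to make the idea concrete.

```python
# A toy connector catalog for an Internet integration hub. Each
# connector is tracked by (application, version); clients ask the hub
# for the latest published release. Illustrative names only -- this is
# not SnapLogic's actual interface.

class ConnectorHub:
    def __init__(self):
        # (app_name, version) -> connector metadata
        self._catalog = {}

    def publish(self, app_name, version, download_url):
        # Application suppliers bring their own connectors to the hub.
        self._catalog[(app_name, version)] = {"url": download_url}

    def latest(self, app_name):
        versions = [v for (a, v) in self._catalog if a == app_name]
        if not versions:
            raise LookupError(f"no connector published for {app_name!r}")
        newest = max(versions)  # naive ordering; a real hub would parse versions
        return newest, self._catalog[(app_name, newest)]

hub = ConnectorHub()
hub.publish("Salesforce CRM", "1.0", "https://example.com/sfdc-1.0")
hub.publish("Salesforce CRM", "1.1", "https://example.com/sfdc-1.1")
print(hub.latest("Salesforce CRM"))
```

The interesting design question is the second half of the sketch: the hub only becomes valuable if suppliers publish into it, which is exactly the app-store dynamic Dhillon invokes next.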
The firm pursuing this goal is SnapLogic, and while it has generated 63 or so connectors to the most commonly used applications -- PeopleSoft, SAP, Siebel Systems, Salesforce.com CRM -- Dhillon is less focused on what his firm can accomplish by itself in building the library than on what application suppliers might accomplish if they brought their own connectors to his hub.
The Apple App Store model "is absolutely the right model. We are implementing a (previously) rigid idea in a new way," he says. Everyone goes to one place to get software known to work with the device they're using. That software can be updated through a central source, and each version can be tracked and made available as the customer needs it. If enough people use the hub, apps start to flow into it as the best distribution point.
All of this sounds a little optimistic. The business world is infinitely more varied than the family of technical lookalikes known as iOS devices. If Dhillon has made it as easy as he says to produce "snaps," or links between an enterprise application and his hub, then this will prove an experiment worth watching.
In the meantime, I'll maintain that if the cloud has generated an integration problem, then somewhere in the cloud lies part or most of the solution.