
Q&A: Tom Rosamilia At IBM's IMPACT 2009 Conference

Talking about cloud computing and real-time BI via complex event processing (CEP) with Tom Rosamilia, General Manager of the Application & Integration Middleware Division in the IBM Software Group, at IBM's IMPACT 2009 conference.

While at IBM's IMPACT 2009 conference, I asked Tom for his take on the cloud computing initiative within IBM and for his thoughts on real-time BI.

CS: I have to admit that coming out with a private cloud appliance was a great idea at the perfect time. I'm amazed at how the "pundits" talk about public clouds as something everyone should be doing, but having worked primarily in financial services and healthcare, I know for a fact that anyone who has to deal with Sarbanes-Oxley (SOX) or HIPAA compliance will never even consider exposing data on a public cloud they have no control over. What more can you tell our readers about IBM's cloud computing initiatives and this new private cloud appliance?

TR: A lot of our clients wanted a better way to deal with their datacenters, virtualized environments in particular. We've been working with our clients for the last 10 years on a set of best practices for deploying WebSphere. If a customer has a challenge, they come to us and say "something is wrong," and we go in and tell them "your configuration is not quite right" or "you don't have the latest patches applied." Our customers may not have a rigorous rollout process in place, what we call a "change control process," so they may not be aware that a new release was rolled out on one set of servers while a second set was still running a previous release. When it comes to change control management, our clients wanted to ensure that the latest patch levels on the OS or the app server were applied, and that the correct application versions were running where they needed to be. So we thought: what if we could somehow freeze-dry an image that's good for "now" - however long "now" is - and then load it out to all the different servers, with full control over what is running out there? Being able to do this with certainty, as well as quickly, is what we now have in the WebSphere CloudBurst Appliance - a way to really give people the benefits of cloud computing, but in a private environment.
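
The "freeze-dry and load out" idea is easiest to picture as a drift check against a single golden manifest. Below is a minimal Python sketch of that change-control comparison; the component names, version strings, and the get_installed_versions() helper are illustrative assumptions, not an IBM or WebSphere API:

# Compare each server against one "frozen" golden-image manifest and flag
# any drift - the situation Rosamilia describes, where one set of servers
# runs a new release while another still runs the old one.
GOLDEN_IMAGE = {
    "os_patch_level": "RHEL5.3-2009-04",  # hypothetical values
    "websphere": "7.0.0.3",
    "app_version": "orders-2.4.1",
}

def get_installed_versions(host):
    """Placeholder: in practice this would query an agent or inventory DB."""
    raise NotImplementedError

def find_drift(hosts):
    """Return {host: {component: (expected, actual)}} for servers that drifted."""
    drift = {}
    for host in hosts:
        actual = get_installed_versions(host)
        diffs = {component: (expected, actual.get(component))
                 for component, expected in GOLDEN_IMAGE.items()
                 if actual.get(component) != expected}
        if diffs:
            drift[host] = diffs
    return drift

Dispensing the same frozen image to every server makes find_drift() come back empty by construction, which is the certainty and speed the appliance is after.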

At last year's IBM IMPACT conference, we had some ideas on what we wanted to do, and our clients and a couple of partners told us "not exactly." We took their advice, modified our plan two months after the conference, and cranked out this appliance in 10 months' time. It's nice to have an appliance form factor that we already know how to build, where we don't have to figure out how to get the right height, weight, and power. We can reuse that to do whatever we need; we just have to figure out what to put on it to give it one purpose versus another. The appliance model gave us a rapid time to market for our clients.

Private clouds are not getting enough attention today, but take XML/SOAP/REST-based Web Services as an analogy. In 2001, all the rage was that "Web Services are going to take over the world." No one would ever write code again; you would just call different services and combine them into whatever was needed. I struggled with that, primarily because a business model was never created to handle the requirements for security - ways of doing authentication, or guaranteed delivery. There is a set of things I would be willing to create Web Services for, but that's a little different from writing something I would bet my business on. So what happened, quite interestingly, was that Web Services became all the rage - not because they were freely available publicly, but because they provided a standard and efficient way to call my own partners, or to internally reuse business processes from applications that were already written. I think the same analogy applies to cloud computing: it was initially all about public clouds, but private clouds are becoming more evolved.

CS: We have evolved from client-server computing, to n-tier distributed computing, to business processes being exposed as services that can be mashed up together in composite applications. Today, event processing allows us to monitor the flow between the services, and take action on the events taking place. Do you see more of your customers moving to a real-time BI model from the traditional BI model, considering that Cognos is another one of IBM's business units?

TR: There will be a confluence of traditional BI and event processing - Business Event Processing, in our case. I've seen this coming for the last two years. It's not here yet - not because it isn't possible, but because all the parties that need to make it happen have completely different mindsets. I've had the privilege of moving from the DB2 team to the WebSphere team, and I can tell you that our data team thinks differently than our transactions team. The same can be said about customers. On one hand, you have the data warehouse group, which is looking at expediting the ETL process and making it more efficient. On the other, you have the group dealing with transactions occurring in real time; they care about what is happening now, and to them what happened five minutes ago is old news. As of right now, the two worlds have not merged. Our customers need to think along those lines as well - combining historical data with real-time event data is extremely powerful. Some innovation in human behavior has to occur for real-time BI to fully develop.
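
As a concrete picture of that merge, here is a minimal Python sketch that enriches a live transaction stream with a historical baseline from the warehouse side, so a decision can be made on the event itself rather than after the next ETL run. The data shapes, field names, and threshold are assumptions for illustration:

# Pretend this aggregate came from the traditional BI / warehouse side:
# average transaction amount per customer, computed by the nightly ETL.
HISTORICAL_BASELINE = {"cust-001": 120.0, "cust-002": 45.0}

def flag_spikes(events, threshold=5.0):
    """Yield events whose amount exceeds `threshold` times the customer's norm."""
    for event in events:
        baseline = HISTORICAL_BASELINE.get(event["customer_id"])
        if baseline is not None and event["amount"] > threshold * baseline:
            yield {**event, "alert": "spend spike vs. historical baseline"}

# Usage: any iterable of transaction events can stand in for the stream.
for alert in flag_spikes([{"customer_id": "cust-001", "amount": 900.0}]):
    print(alert)

The point of the sketch is the join itself: the real-time side acts on each event as it arrives, while the historical side supplies the context that makes the event meaningful.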
