Convergence is happening all over the collaboration space: audio, video, and data conferencing are converging; VoIP, PSTN, and computer networks are converging; synchronous and asynchronous collaboration technologies are converging and evolving into on-demand collaboration tools. But none of these is the convergence I'm about to cover. Collaborative convergence is the need for enterprises to consolidate their myriad collaborative applications and services into one that works for the whole organization, and does so securely and effectively.
Oracle may be a bellwether for convergence and standardization throughout the enterprise. Oracle had 12 different RTC (real-time collaboration) systems in use before the company decided to build and standardize on its own RTC tool. Over the last decade, most large organizations have experimented with collaboration technologies, initially with asynchronous and then synchronous technologies (in the new millennium). All of the organizations I’ve interviewed have used collaborative technologies (IM, web conferencing, or web meetings) at some point over the last year, and many of these organizations were regular users of these and asynchronous collaboration technologies (like SharePoint, eRoom or Groove).
One of the things we often note when doing multiple interviews with the same organization is that different groups in the organization use different RTC technologies. For example, the training group might be using Centra or Adobe’s Breeze, while the sales group uses Linktivity’s WebDemo, and the marketing group is using WebEx or InterWise. This collaborative tool proliferation has not gone unnoticed by IT departments -- more and more they have had to support users on these applications.
In Oracle's case, the company developed its own RTC tool, and thanks to some insightful work by the development group, got it adopted across the enterprise in reasonably short order and with a high level of success.
Strategies for Corporate Collaborative Consolidation
I see IT's move to consolidate collaboration tools as a sign of a maturing collaboration market. There seem to be two strategies for collaborative consolidation in an organization.
The first is for IT to evaluate a number of collaboration technologies (hopefully including the ones currently being used by the enterprise) and, based on this evaluation (which weighs a variety of factors including cost, politics, and which vendors are already well known to IT, like Oracle, Microsoft, IBM, etc.), mandate which collaboration tools the corporation will use (such an evaluation at Target Corporation is a two-year process). I don't see this as the optimal strategy for consolidation, not because of technical factors, but because of behavioral and social factors.
The second and, fortunately, more common strategy for consolidation begins when someone in IT realizes that the enterprise is spending a lot of money on collaboration services and tools and reports it to the CIO. The CIO asks an IT person to go out to the enterprise, identify all of the stakeholders for all of the different applications, and interview them on why they chose that application (business needs), how it is being used, how many are using it, how effective a solution it is, and what is critical about that software for the processes it supports (critical features and functions).
Armed with this information, the IT person or, in many cases, an outside consultant (like my firm) completes the interviews and writes a report on the current state of collaboration in the organization. This gives an initial assessment, or as we call it, a "collaborative snapshot," of the organization.
If the CIO is wise, he/she will ask the IT person or consultant to develop a plan for consolidation. If the IT person or consultant is wise, they will convene a roundtable of all of the stakeholders identified in the report and let them know the corporate consolidation goal. This usually provokes a period of discussion by the roundtable, the result of which is that a number of the stakeholders see that other collaborative tools can provide the functionality they need without too much (or any) training or change in their business processes.
If the roundtable leader is lucky, it becomes clear to all members that one solution will predominate and fulfill most of the needs of each of the stakeholders. Through a political process, the leader can often get consensus and buy-in from all of the stakeholders. In some cases, we have seen two systems approved. In one case, we saw the whole company move (over the course of a year) to one system except for one department, which insisted on keeping an in-house system because they claimed they needed 100% uptime (by contract), while the system the rest of the company was moving to only had a 99.7% uptime metric. In-house systems, while very specific to the needs of the company, are often expensive to develop and maintain. Within two years of the initial move, and working with the collaboration vendor, they were able to get the uptime to an acceptable level, and the last group in the company abandoned the in-house system and moved over to this vendor's system early in 2005. Two years is not an unusual amount of time for a collaboration consolidation in a large enterprise.
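To make an uptime gap like that one concrete, an availability percentage translates directly into allowed downtime per year. A quick back-of-the-envelope sketch (the 99.7% figure comes from the case above; the 99.99% line is purely illustrative):

```python
HOURS_PER_YEAR = 24 * 365  # ignoring leap years

def annual_downtime_hours(uptime_pct: float) -> float:
    """Hours of allowed downtime per year at a given uptime percentage."""
    return HOURS_PER_YEAR * (1 - uptime_pct / 100)

print(round(annual_downtime_hours(99.7), 1))   # 26.3 -- more than a full day per year
print(round(annual_downtime_hours(99.99), 1))  # 0.9 -- under an hour per year
```

Seen this way, the department's objection was not unreasonable: 99.7% uptime allows over a day of outage per year, which matters for a contractually bound support operation.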
I have seen this second strategy prove more successful, not for technology reasons or even cost, but for political and behavioral reasons. The roundtable process gives people a chance to express their opinions and to understand other technology options and their tradeoffs. In addition, people feel they have some say in the outcome and some ownership of the process, and you get a higher level of "buy-in." The result of this ownership is usually a much higher and less painful adoption rate for the collaboration technologies. The overall cost of implementing the technology across the organization is often lower, and there are champions for the technology all over the organization (all of the roundtable members).
The champions for the technology not only help educate their groups (and other groups) about the new technology but, even more importantly, can help explain the benefits of the technology in the context of each group's specific business processes. They can explain that salespeople can now conduct a web demo with two clicks and that the demos can be recorded and played back later (if needed). They can explain to the training people that the RTC software can link to their current LMS, and that it offers additional virtual classroom features in an upcoming release from the vendor. In many cases, the technical outcome of both strategies is the same: often Microsoft, Oracle, IBM/Lotus, WebEx, or Adobe/Macromedia is selected as the corporate collaboration vendor because its other tools are already in use in the enterprise and those vendors have long-standing relationships with IT.
When we get a call to help an organization (usually a group in IT) with this, we talk about the four stages of the process:
Stage 1: Assessment. This includes not only the work defined above in the "collaborative snapshot," but also some of the tools we have developed to give us a (numerical) idea of how important various collaborative features and functions are to each group in the organization. In addition to the technology, we also look at how the culture supports collaborative behavior, what the economic impact of the collaborative technologies is, and how these tools are seen politically.
Stage 2: Requirements and Vendor Selection. Once the assessment is concluded, the data is analyzed and a report is developed for IT management. As part of this report, a requirements matrix is developed, with each feature/function prioritized based on the scores it got from each of the business and IT groups assessed. Since the features are rated on a 1-10 scale, features scoring 7 or above are often put into a shorter requirements list and given to four or five of the vendors whose tools are currently in use in the enterprise, as well as a few other vendors that might have unique solutions or use a newer technology or architecture (e.g., vendors that use VoIP, or are Web 2.0 oriented, etc.). Since we are familiar with many of the major vendors in the collaboration space and their product road maps, we can shorten this requirements and vendor selection process, and also keep the vendors from pestering our IT clients.
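The scoring-and-threshold step can be sketched in a few lines. The feature names and scores below are invented for illustration, not data from any engagement:

```python
# Hypothetical feature scores (1-10), averaged across the groups assessed.
feature_scores = {
    "web demo / screen sharing": 9.2,
    "session recording": 7.8,
    "LMS integration": 7.1,
    "whiteboarding": 5.4,
    "polling": 4.9,
}

THRESHOLD = 7  # features rated 7 or above make the short requirements list

requirements = sorted(
    (name for name, score in feature_scores.items() if score >= THRESHOLD),
    key=lambda name: -feature_scores[name],  # highest-priority features first
)
print(requirements)
```

The short list that comes out of a filter like this is what goes to the candidate vendors; features below the threshold become "nice to haves" rather than requirements.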
Stage 3: Strategic Pilot Project. In Stage 1 when we are doing the assessment, we are on the lookout for various groups that might be good candidates for a pilot project. In one instance we talked with the manager of a support team that had 200+ people in the same location and another 20 scattered around the world.
There were a number of criteria we established for pilot projects using collaborative technologies: the manager had to be technically savvy, a good leader (with good communications skills), and very interested in IM (the technology we were testing); the group had to be bounded, at between 100-300 users; the group had to be heavy users of the technology (IM); and the technology being tested had to be part of a critical process for the group, ideally one used every day.
The support group in question met most of these criteria and proved to be a good group for the IM pilot, which lasted 30 days (with this whole stage taking about 60 days).
As part of the pilot, we gathered as many metrics on IM usage from the target group as possible before allowing access to the new IM technology. In addition, we gave training to half the group and the other half we just let figure the technology out (after all, IM is pretty simple and should be easy enough for anyone to use). By doing this we were able to tell from the pilot how critical training might be to the rest of the organization.
The goal of the pilot project, strategically, was threefold: one, to have a big success with the technology and publicize that success to the organization (it is easier to build on success than failure); two, to understand the best way to offer the technology so that high adoption rates occur; and three, to understand the priority of this technology for each group in the organization and roll it out to those groups in order of priority.
Stage 4: Implementation and Adoption. The timeline for this stage can vary widely from as little as a few months to years. Often it depends on the size of the organization, how important the collaboration technology is, the degree of support the project has from management, and a variety of other factors. The goal in implementation is to do a strategic rollout of the technology along with the necessary training and education so that those groups that get the technology know how to use it and where to use it. This often means showing a use case or example using a critical process for each group on how the collaborative technology can be used and what the benefits are.
For the support group I have been using as an example, only half got training, but because support people are very technically savvy, the difference between the group that got training and the group that did not was not very significant. Showing a support person how they could integrate all of the IM clouds on their desktop (Yahoo, MSN, AOL, etc.) into one buddy list and still do presence detection is just showing a feature. It was important to show them how a specific IM tool can be used for internal IM (new policy), and how the IM clouds could be used with customers and suppliers (outside the firewall) more securely (both through policy and technology). It was just as important in this case (and most cases) to educate the group on new IM security and use policies as it was to educate them on the technology.
As the technology gets rolled out to different groups, and the number of users scales up, it is important to monitor the response times of the technologies to make sure that they do scale and that there is not a degradation in service quality or service reliability (e.g., IM fails in the middle of a conversation). Often both these factors are tied to the enterprise’s network bandwidth, QoS, firewalls, and other infrastructure factors. It is important to talk with those in IT responsible for the corporate infrastructure prior to the rollout to make sure that the enterprise infrastructure can support the use of these tools and to also project expected usage over the next year or two to make sure infrastructure growth is keeping pace.
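As a minimal illustration of that kind of monitoring, the sketch below measures how long it takes to open a connection to a service, which is one cheap proxy for degradation as usage scales. The host, port, and 250 ms threshold are hypothetical examples, not values from any engagement:

```python
# Minimal connect-latency probe -- a sketch, not a monitoring product.
import socket
import statistics
import time

def probe_latency(host: str, port: int, attempts: int = 5) -> float:
    """Median time (ms) to open a TCP connection to the service."""
    samples = []
    for _ in range(attempts):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass  # connection established; close immediately
        samples.append((time.perf_counter() - start) * 1000)
    return statistics.median(samples)

# e.g., alert if the IM server's connect latency degrades past a baseline
# (hypothetical host and handler):
# if probe_latency("im.example.internal", 5222) > 250:  # ms
#     notify_infrastructure_team()
```

Trending a number like this per group as the rollout proceeds gives IT an early warning that bandwidth, QoS, or firewall capacity is falling behind usage.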
Although other analyst firms in this space recommend 10-step programs, those are essentially the same as the program outlined above, with each step a bit more granular. It is important when doing collaborative convergence to look at the process holistically and give equal importance to people, process, and technology. The technology is often the focus of this type of project because it is the most visible and tangible, but people and process really account for about 80% of the success of any collaborative technology (which merely enables them), and it is critical not to underestimate these other -- and less tangible -- factors.