Even vanilla hosted collaboration systems need to be put through their paces with each successive upgrade.
So your company has decided it wants to launch a vanilla hosted collaboration or social business software system--meaning you aren't planning to customize it. You've presented the business case, but your executive team denies your line-item request for someone to test the system each time you upgrade it to a new version. The software vendor does its own QA before it rolls out an upgrade, right? You haven't done any customizations. What could go wrong? This is social software--everyone knows how to use Twitter and Facebook.
Everyone on your team needs to understand that this mentality is a recipe for disaster. All software has bugs, and not all of them will be found prior to launch, regardless of how thorough the vendor's QA testing is. More important, the vendor's developers and QA people aren't using the software in your environment. Your organization undoubtedly has unique use cases, and you will be the one hearing your employees and community members scream if something they love or need breaks during an upgrade.
Putting out these fires will leave your community managers with less time to get involved with their members--to help them with training, for instance--leading to slower adoption and potential project failure. And, yes, people use Facebook and Twitter without a tech support team, but that doesn't mean they don't get frustrated by those platforms. You want your business colleagues to have reliable software, and to have someone to turn to when it doesn't work as they expect.
It's critical that your company hire someone to lead your own rounds of QA on each upgrade. Make sure that person understands testing, and that he or she will be allowed to work with community managers and members to perform the tests. I've noticed that it's easy to make assumptions when testing, or to always perform steps in the exact same way. That approach can cause you to miss problems that users are sure to encounter. When we've had several people working together, we've been better prepared to help our community members post-upgrade.
If your management team or vendor suggests that the release notes should suffice, push back. You need to see the bugs "in action" and determine what the impact will be on your audience. For example, do the benefits of the new features and the old bugs being fixed outweigh the pain of anything new that's being introduced? If so, putting that trade-off into words for your members as part of your upgrade plan will help reduce everyone's frustration.
Perhaps there are easy workarounds for any shortcomings, which you can document or train people on. Or maybe, after working with your vendor, you find that the bugs are known issues and decide it's worth waiting for the next point release.
Make sure your business case includes the resources necessary for you to QA test your system upgrades. As you investigate possible platforms, also estimate how frequently each vendor releases upgrades, and gauge where your community falls on the early adopter versus laggard scale. If you've done all of these things in advance of launching your community, your pool of potential community managers will understand that you've got their backs.
Next up: writing a test script.
Tracy Maurer is an Enterprise 2.0 system administrator, evangelist, community manager, and trainer at United Business Media, the parent company of InformationWeek, the Enterprise 2.0 Conference, and a range of leading business technology media brands. Contact her at email@example.com.