Satisfaction Insurance For Your Social Business Platform
Writing and rewriting test scripts may be a tedious task, but it forces you to have a plan and constantly improve upon it.
In my previous column, I discussed the need to put your company's social business platform through rigorous testing with each successive upgrade. Doing so will help your community put new features to use, react quickly to challenges that members encounter, and let them know you're responsive to their needs.
So how do you actually put this advice into action? I've been surprised to learn that even seasoned IT pros often don't know how. And even when they do, getting started can seem daunting--writing a test script is a tedious analytical task.
Think of test-script creation as satisfaction insurance. It forces you to have a plan, work the plan, and then improve upon it for next time. One of the most important pieces of advice I can offer is to find someone using the same system who's willing to share his or her test scripts or plans. That's exactly what I did, and it was a huge help in getting me rolling. I didn't have to reinvent the whole process, just tweak it so that it worked better for me. It also added maturity to my tests, keeping me from making beginner's mistakes.
To start, I took that base script and thought about our community. Many questions came to mind that helped me shape not only the script itself, but also the overall testing plan:
--Were the tasks representative of how I work in the community?
--Were the tasks representative of how other community members work?
--Did I need to address other components or behaviors core to my company's usage?
--Were there tasks or scenarios that weren't relevant and could be eliminated?
--Which scenarios would require the assistance of other people? Could I play some of those other roles by using my own test user accounts?
--Which tasks required the user to have specific permissions? And should some of the tasks be viewed with more than one permission set?
I went back through the script at this point and added the details from my answers. In some places, that meant fleshing out which user permission levels would be needed to test a certain task. In other places, I added reminders of the kinds of things to look for--the things I knew had broken during past upgrades.
I also added notes reminding me to wait for feedback from another tester, as well as when to check that an email had been received and contained the correct information. I then checked to see if the different sets of tasks were grouped and ordered logically to reduce jumping around to accomplish precursor tasks.
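To make that concrete, here's a rough sketch of what one slice of a script like this might look like if you captured it as data instead of prose. Everything in it is hypothetical--the task names, the fields (permission_sets, precursors, watch_for, cross_checks), and the little ordering check are just one way to express the ideas above, not features of any particular product:

    # Hypothetical sketch of test-script tasks captured as data. All names and
    # fields are illustrative; adapt them to however you actually keep your script.
    tasks = [
        {
            "name": "Create the test group",
            "permission_sets": ["community manager"],
            "precursors": [],
            "watch_for": [],
            "cross_checks": [],
        },
        {
            "name": "Subscribe a second test account to the group",
            "permission_sets": ["member"],
            "precursors": ["Create the test group"],
            "watch_for": [],
            "cross_checks": [],
        },
        {
            "name": "Post a blog entry to the group",
            "permission_sets": ["member", "community manager"],  # repeat once per set
            "precursors": ["Subscribe a second test account to the group"],
            "watch_for": ["tags are saved correctly (broke during a past upgrade)"],
            "cross_checks": [
                "wait for feedback from the second tester",
                "confirm the notification email arrived and contains the right link",
            ],
        },
    ]

    # Flag any task whose precursor appears later in the script (or not at all),
    # which would force jumping around during the actual test run.
    seen = set()
    for task in tasks:
        for precursor in task["precursors"]:
            if precursor not in seen:
                print(f'Reorder: "{task["name"]}" expects "{precursor}" to come first.')
        seen.add(task["name"])

Even if your script lives in a spreadsheet or a wiki page, thinking in terms of those columns makes the permission sets, precursor steps, and cross-tester checks explicit instead of leaving them in your head.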
Now it was time to review the release notes. This took a few passes. First, I needed to determine which new features and bug fixes would apply to how we were using the system. For example, we don't use LDAP, so any fixes or changes for that aren't relevant for me.
Second, I needed to determine which elements would be visible to our users or have a direct impact on the way we use the administrative tools. During this part of the process, I also made separate notes on which items needed to be communicated and/or documented for our employees.
Third, I figured out where in my current script to add all of these items, keeping in mind precursor tasks and logical task groupings.
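If it helps to see those passes spelled out, here's a purely illustrative sketch of the triage in the same spirit as the earlier one; the release-note items and the flags on them are made up:

    # Made-up release-note items, triaged with the passes described above.
    release_notes = [
        {"item": "LDAP sync fix", "relevant": False,
         "user_visible": False, "needs_docs": False},
        {"item": "New @mention autocomplete", "relevant": True,
         "user_visible": True, "needs_docs": True},
        {"item": "Bulk-delete for spam accounts", "relevant": True,
         "user_visible": False, "needs_docs": False},
    ]

    add_to_script = [n["item"] for n in release_notes if n["relevant"]]
    announce_or_document = [n["item"] for n in release_notes
                            if n["relevant"] and (n["user_visible"] or n["needs_docs"])]

    print("Add to the test script:", add_to_script)
    print("Communicate/document for employees:", announce_or_document)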
At the same time, I was checking to make sure that all of the existing tasks in the script were still necessary. Sometimes I'll leave a task in for an upgrade or two after a bug fix, to make sure that once something is fixed, it stays fixed. Pruning is very important if you don't want your script to grow so large that you never finish the test, or get so bored testing that you get sloppy.
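One way to keep that pruning mechanical--again, just a sketch with made-up fields--is to note which release prompted each regression check and how many clean upgrades it should survive before it's retired:

    # Hypothetical pruning rule: keep a regression check for a couple of clean
    # upgrades after the fix, then retire it so the script doesn't grow forever.
    regression_checks = [
        {"name": "Tags survive editing a blog post",
         "added_for_release": "4.5", "clean_upgrades_since_fix": 2, "keep_for": 2},
        {"name": "Group invitations send email",
         "added_for_release": "4.6", "clean_upgrades_since_fix": 0, "keep_for": 2},
    ]

    retire = [c["name"] for c in regression_checks
              if c["clean_upgrades_since_fix"] >= c["keep_for"]]
    print("Safe to prune from the script:", retire)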
With all of this work behind me, I did a dry run with my script and validated that it all made sense. Did I forget an obvious step? Did I account for all of the elements included in a new feature? Was I testing not just the expected path, but also alternate paths?
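That last question is easier to answer if each task notes which path it exercises. Here's a tiny, hypothetical check along the lines of the earlier sketches; the feature names and fields are invented for the example:

    # Illustrative coverage check: which features are tested only on the
    # expected ("happy") path?
    path_coverage = [
        {"feature": "blog posts", "path": "happy"},
        {"feature": "blog posts", "path": "alternate"},  # e.g. posting without permission
        {"feature": "group invitations", "path": "happy"},
    ]

    for feature in sorted({t["feature"] for t in path_coverage}):
        paths = {t["path"] for t in path_coverage if t["feature"] == feature}
        if paths == {"happy"}:
            print("Only the expected path is covered for:", feature)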
Of course, even all of this careful planning doesn't keep each upgrade from bringing with it a list of things I missed. I get plenty of opportunities to learn from mistakes and improve my script for the next set of tests. The challenge is to force myself to repeat the steps for each release, even the small ones.
Tracy Maurer is an Enterprise 2.0 system administrator, evangelist, community manager, and trainer at United Business Media, the parent company of InformationWeek, the Enterprise 2.0 Conference, and a range of leading business technology media brands. Contact her at [email protected].
Attend Enterprise 2.0 Boston to see the latest social business tools and technologies. Register with code CPBJEB03 and save $100 off conference passes or for a free expo pass. It happens June 20-23. Find out more.