Bad User Experiences: How Analytics Can Improve App Testing

Not doing any user experience testing on your enterprise apps? Do it now. Want to make sure your apps are successfully used by your employees? Consider applying usability analytics to your evaluation process. Here's how.

Jonathan Feldman, CIO, City of Asheville, NC

October 1, 2015

5 Min Read
(Image: wgmbh/iStockphoto)


Those of us working in enterprise IT write custom apps so customers and employees will use them to great business benefit. But if we want people to use our apps, they must be useful. If we want useful apps, we must focus on usability testing and customer experience testing.

Indeed, I believe that your organization cannot possibly become a digital organization without focusing on usability. The question is: Will the next step in usability for your enterprise require the next step of analytics?

As usual, digital business examples help to make the picture clear. Because of analytics, Amazon knows what you want before you know it. So does Target. But analytics isn't only for point-of-sale experiences. Sometimes "conversion" means something other than getting someone to buy something.

Sometimes it simply means that someone likes using your application. Sometimes it means creating something that makes it easier for your employees to do their jobs. Sometimes conversion means that someone did what you wanted them to do, like use a knowledge base to answer their question (cheap) instead of calling your global help line (expensive).

Using an application event analytics platform to test usability means that you can ask questions like, "How many people close the window after 10 seconds?" and "How much does the mouse move prior to users finding what they want?"
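Questions like these reduce to simple aggregations over raw event records. Here's a minimal sketch, in Python, of answering the first one from a log of session open/close events; the record layout and field names are illustrative, not tied to any particular analytics platform.

```python
# Hypothetical session events: each record carries a session id, an event
# name, and a timestamp in seconds. The schema is made up for this example.
from collections import defaultdict

events = [
    {"session": "a", "event": "open",  "t": 0.0},
    {"session": "a", "event": "close", "t": 6.5},
    {"session": "b", "event": "open",  "t": 0.0},
    {"session": "b", "event": "close", "t": 42.0},
    {"session": "c", "event": "open",  "t": 0.0},
    {"session": "c", "event": "close", "t": 3.1},
]

def early_exit_rate(events, threshold=10.0):
    """Fraction of sessions closed within `threshold` seconds of opening."""
    times = defaultdict(dict)
    for e in events:
        times[e["session"]][e["event"]] = e["t"]
    sessions = [s for s in times.values() if "open" in s and "close" in s]
    early = [s for s in sessions if s["close"] - s["open"] <= threshold]
    return len(early) / len(sessions)

print(early_exit_rate(events))  # 2 of the 3 sessions close within 10 seconds
```

The mouse-movement question works the same way: aggregate pointer-distance events per session instead of timestamps. The hard part, as the next paragraph notes, is choosing the question, not computing the answer.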

Naturally, as with all analytics, the magic happens with forming the right questions and analyzing the data correctly. Oh no, data science again?!

Why yes, data science. Again. Sorry. But here's the great news: Some software-as-a-service offerings can at least take away the sting of "instrumenting" your app or website. That is, instead of building your own data platform, your developers only need to add a line or two of code, and the events of interest will start being tracked.
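The "line or two of code" pattern looks roughly like the sketch below. The endpoint URL, project ID, and field names here are hypothetical placeholders, not any vendor's real API; a real SDK (Keen, Segment, and similar services) hides the HTTP call behind a single function.

```python
# Minimal sketch of one-line event instrumentation. Everything here
# (COLLECT_URL, the payload fields) is a hypothetical stand-in for what
# a vendor SDK would provide, not a real API.
import json
import time

COLLECT_URL = "https://analytics.example.com/projects/PROJECT_ID/events"

def track(event_name, properties):
    """Build the JSON payload an SDK would POST to the collection endpoint."""
    payload = {
        "event": event_name,
        "properties": properties,
        "timestamp": time.time(),
    }
    # In production this would be an HTTP POST to COLLECT_URL (via the
    # vendor SDK or urllib.request); here we just return the serialized body.
    return json.dumps(payload)

# The developer-facing cost is one line per event of interest:
track("import_completed", {"platform": "windows", "duration_sec": 38})
```

Once calls like that are sprinkled at the events you care about, the analysis side (counts, rates, funnels) happens in the service's dashboard rather than in your codebase.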

[Do you have what it takes to achieve digital transformation? See True Digital Organizations Put People First.]

There is a certain despair in some enterprises that we can never make data do our bidding in the ways that giants such as Google, Amazon, and Target do. There's a notion that these companies are unicorns, that it's too hard for your busy, financially constrained enterprise -- an organization that, by the way, doesn't have the right kind of support at the top -- to be able to meaningfully use data.

No. That is so not true. Amazon, Google, and Target are also busy organizations. They are large organizations, so you know, without even visiting them, that there is a certain amount of cultural dysfunction they must overcome on a daily basis.

Just. Like. You.

So, as a social experiment, let's go to the experiences of a startup. You know they have fewer resources than your enterprise. The only difference between their organization and yours is that their leadership may be more visionary about creating a digital company than your bosses are. Maybe. (Based on my experience with a variety of startups, I will tell you that startups, like everyone else, are composed of a population distribution: There are plenty of idiot CEOs. So, it's not always true that startups have digital leadership, even when the startup is super techie.)

I chatted recently with Aytekin Tank, the CEO of JotForm. JotForm makes SaaS so that your company doesn't have to commit the digital felony of distributing Word documents or PDFs to gather information from employees and customers.

Tank says that before JotForm instrumented the features it deployed, it was difficult to know where to focus efforts. The company now uses Keen.io's analytics services, and "it's almost too easy to use."

More to the point, these analytics provide data about app use that is impossible to know otherwise.

For example, when a competitor shut down, JotForm created an import tool to help the competitor's customers easily migrate to JotForm. The team instrumented the migration process: When a customer was fully migrated, that counted as a "conversion."

The conversion data showed something strange: Windows PC users, as opposed to Mac users, had a relatively low final conversion rate. Was it something about the user interface on Windows?

The team theorized about which feature was difficult to use and tried several changes, continuing to track conversion rates as they modified the user interface. One of the fixes (in the way that files were converted) lifted conversion from 54% to rates of 85% and better than 90%. Success!
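Spotting a gap like that is just a group-by over conversion events. Here's an illustrative recomputation in Python; the records and field names are made up for the example, not JotForm's actual data.

```python
# Made-up conversion events, one per migrating customer. Each record notes
# the customer's platform and whether the migration fully completed.
records = [
    {"platform": "windows", "converted": False},
    {"platform": "windows", "converted": True},
    {"platform": "mac",     "converted": True},
    {"platform": "mac",     "converted": True},
    {"platform": "windows", "converted": False},
    {"platform": "mac",     "converted": False},
]

def conversion_by_platform(records):
    """Conversion rate (completed / attempted) per platform."""
    totals, wins = {}, {}
    for r in records:
        p = r["platform"]
        totals[p] = totals.get(p, 0) + 1
        wins[p] = wins.get(p, 0) + (1 if r["converted"] else 0)
    return {p: wins[p] / totals[p] for p in totals}

print(conversion_by_platform(records))
```

With the rates split by platform, the Windows shortfall stands out immediately, and rerunning the same computation after each UI change shows whether a fix actually moved the number.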

JotForm also uses its analytics tool to determine which of its software features never get used. Startups would be a lot better off if developers had more actionable data about "zombie" features: features that consume a lot of developer time and money but are never used.

This is not a small thing in the enterprise, where developers mimic what they see on "big ERP" screens: complexity, with everything but the kitchen sink thrown in. Zombie features not only cost resources to maintain that could be better spent elsewhere; they also annoy and confuse users.

That's ultimately what I'm driving at when I ask whether the next step in usability for your enterprise will require the next step of analytics. Don't get me wrong. I think that if you do only one thing with usability, simple user experience testing should be it. But, if you're already doing that, adding analytics could bring significant value.

What are your usability testing strategies? How much user research do you perform before unleashing a home-grown app on your employees? What are your biggest pain points when it comes to usability testing? Tell us all about it in the comments section below.

About the Author

Jonathan Feldman

CIO, City of Asheville, NC

Jonathan Feldman is Chief Information Officer for the City of Asheville, North Carolina, where his business background and work as an InformationWeek columnist have helped him to innovate in government through better practices in business technology, process, and human resources management. Asheville is a rapidly growing and popular city; it has been named a Fodor top travel destination, and is the site of many new breweries, including New Belgium's east coast expansion. During Jonathan's leadership, the City has been recognized nationally and internationally (including the International Economic Development Council New Media, Government Innovation Grant, and the GMIS Best Practices awards) for improving services to citizens and reducing expenses through new practices and technology.  He is active in the IT, startup and open data communities, was named a "Top 100 CIO to follow" by the Huffington Post, and is a co-author of Code For America's book, Beyond Transparency. Learn more about Jonathan at Feldman.org.
