Harnessing Social Media for Rapid Usability Testing

This post is part of a series called UX Foundations.

There are many ways to perform usability tests, all of which will get you feedback, let you make alterations, and gradually iterate toward a solution. But these methods often take time. Let’s look at an alternative process that can dramatically speed things up: leveraging social media.

My company is a small startup. Being so small allows us to apply the lean startup philosophy, which entails always building the minimum viable feature. That doesn’t mean we’re trying to cheap out; it means we want to get well-built features into our customers’ hands quickly, gather feedback, and make them better, all in a short period of time.

Fitting Tests into Lean Workflow

Usability testing is the process of putting your product in front of a real person and trying to learn from watching them use it. This is true whether your product is a mobile app, a website, desktop software, or something else. By watching a person try to use your product, you can learn a lot. The vast majority of the time, our customers are people who don’t think like us, or use computers and mobile devices the way that we would.

Lean UX values low-fidelity deliverables, like paper prototypes, collaborative brainstorming and design, immediate feedback from users and colleagues, and people and ideas over polished deliverables.

Lean UX provides a whole new set of solutions to the age-old problem of how to communicate better with your users. But it’s not without new challenges of its own!

If we’ve promised customer X a feature within the next 48 hours and I want to get some quick validation before we code and ship it, a brief social usability test can get me feedback within the next 10-15 minutes.

Compared to the slower process of calling or emailing existing contacts, or trying to recruit people some other way, this is a quick win.

New Problems

So let’s say your organization is successfully agile, and you’re iterating along. Your team is shipping features like crazy, and you’re able to keep your customers up to date on what’s coming up and when to expect it.

You’re even able to quickly implement things that customers request. Transparency, speed, new features! Your customers love you. You’re enjoying a level of communication and trust with your customers that might not have been possible previously.

But now you have a new problem. Everyone, even the most enthusiastic customers, will get tired of being asked questions every week. In fact, sometimes they may not even have an opinion. There’s a reason that companies pay people to take surveys and participate in studies.

Lean UX can lead to tester burnout pretty quickly if you’re not careful.

Two Cautionary Tales

For example, at my organization, we have a wonderful customer. We’ll call him George (his name isn’t really George). When he first signed up for his account, he was so excited about our product and our company, and he volunteered to be in my merry band of usability testers.

I talk to him at least once a month, sometimes more frequently than that, whether via video chat, click test, or emailed survey. At first he loved taking calls and talking about everything. He would talk my ear off for an hour, giving phenomenally detailed feedback, but after a while he got a lot less chatty.

Almost a year later, he doesn’t always respond to my emails right away. This is because I took too much advantage of his enthusiasm to participate, and he got burned out.

Here’s another one. Once, I worked with an organization that was extremely survey-happy. We routinely sent out surveys of over 50 questions (I should stipulate that these surveys covered more than usability, but the lesson still stands).

Our customers were highly motivated to respond, but over time, we started to realize that the email address we were sending the surveys from was getting blacklisted. People were hitting delete, or even flagging our messages as spam.

With Great Power Comes Great Responsibility

Even the most accommodating and eager of your company’s fanboys and fangirls has limited time and a limited attention span. We can combat this by constantly recruiting new testers, and that recruiting can be conducted socially just as much as the tests themselves can.

We can also combat this by knowing when to use our customer testers, and when we just need An Opinion. The time of your volunteer testers is a most precious resource, so don’t abuse it.

Enter Social Media

Contacting your clients via email or phone generally takes a little longer. Because we value their precious time, we always make sure to work around their busy schedules.

That means I can’t very well call my friend George up and say, “Take this click test RIGHT NOW; I want to get it coded by 2 o’clock.” Never mind that George is in London and was eating dinner with his kids. I have designs to test!

When you recruit testers from social media, you’ve got a pool of self-selected people who are ready to go right then! If they’re busy, they won’t click the link, but if they have a moment to spare, you get near instantaneous feedback.

I’ve posted tests to Twitter and filled my target sample size of twenty answers within ten minutes. Obviously, that speed will vary, depending on how bored people on Twitter are. You will probably get better feedback by posting a test at 1pm on a weekday than at 3am on a Saturday.

How to Do It

Social usability testing is basically just regular usability testing with a twist. There are two types that I conduct on an ongoing basis.

Type 1: Testing to Solve a Particular Problem

When conducting a test to solve a particular problem, here are some things to keep in mind. Short click tests or surveys, with a limited scope or a very specific outcome objective, work well in this format.

Do:

  • Keep it short
  • Have one or two specific metrics you’re trying to test (like time to complete)
  • Have a clear idea of what an answer looks like (for anyone who’s done user testing, you know that this is not always a given. In fact, sometimes it’s impossible.)

Don’t:

  • Send out nebulous problems or vaguely worded questions. This isn’t the right way to suss out the answers to life’s persistent questions. Those are things better addressed in person.

Here’s a use case for this type of test.

Short Click Test

Once upon a time, I was ever so slightly tweaking a sidebar in our app. I needed to add a couple of extra items to it, so I wanted to group everything more logically.

Inevitably, the smallest changes bring out the strongest opinions, and the team was divided in short order. So I put together a click test with mockups of the different versions, with the goal of finding out which one had the shortest completion time. Then I emailed it to my formal list of usability test volunteers. And I got no response for over an hour. I was aiming to get this particular design to the developers pretty quickly, so I decided to send it out over Twitter.

I don’t have the most Twitter followers in the world, but I do have somewhere between 300 and 400, and several of my friends were kind enough to retweet, so I had a large and varied group of people volunteering to do this click test. They were certainly different people than I would have been able to contact one on one via email. I reached my target sample size of thirty respondents in under thirty minutes.
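
If you’re curious what capturing that “shortest completion time” metric involves under the hood, here’s a minimal sketch in TypeScript. It isn’t taken from any particular testing tool: the element IDs, the variant names, and the /api/clicktest-results endpoint are all invented for illustration. The page shows one randomly chosen mockup and records the time from load to the first click on the target area.

```typescript
// Minimal click-test sketch: show one mockup variant, time the first click
// on the target area, and send the result to a (hypothetical) endpoint.

type ClickResult = {
  variant: string;        // which sidebar mockup the tester saw
  msToComplete: number;   // time from page load to the target click
  completedAt: string;    // ISO timestamp ("1pm weekday" vs "3am Saturday")
};

const variants = ['sidebar-grouped-a', 'sidebar-grouped-b'];
const variant = variants[Math.floor(Math.random() * variants.length)];
const start = performance.now();

// Assumes an <img id="mockup"> and a clickable #target-area exist on the page.
document.getElementById('mockup')!.setAttribute('src', `/mockups/${variant}.png`);

document.getElementById('target-area')!.addEventListener('click', () => {
  const result: ClickResult = {
    variant,
    msToComplete: Math.round(performance.now() - start),
    completedAt: new Date().toISOString(),
  };

  // Fire-and-forget; testers recruited from Twitter won't wait around.
  void fetch('/api/clicktest-results', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(result),
  });
}, { once: true });
```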

Survey

I will frequently send out surveys of two or three questions, asking things like:

  • “Which word do you like better?”
  • “Which button do you like better?”
  • “Which icon do you like better?”

These types of surveys allow for someone to give a brief opinion, without thinking too much. That’s exactly the sort of knee-jerk feedback that you would want for small tweaks, and it’s the sort of feedback you’re likely to get via social media.
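
To make the “brief opinion, no overthinking” idea concrete, here’s a small sketch of what such a micro-survey might look like as plain data, with a trivial tally. The question wording, options, and answer format are illustrative only, not tied to any particular survey tool.

```typescript
// A two-question, forced-choice survey kept deliberately tiny so a tester
// can answer on a knee-jerk basis. All names and options are made up.

interface SurveyQuestion {
  prompt: string;
  options: [string, string];   // exactly two choices keeps the decision quick
}

const microSurvey: SurveyQuestion[] = [
  { prompt: 'Which word do you like better?', options: ['Archive', 'Hide'] },
  { prompt: 'Which button do you like better?', options: ['Option A', 'Option B'] },
];

// Each answer is an array of chosen option indexes, one per question.
// With questions this small, "analysis" is just counting votes per option.
function tally(answers: number[][]): number[][] {
  return microSurvey.map((_, qIndex) =>
    [0, 1].map(opt => answers.filter(a => a[qIndex] === opt).length)
  );
}
```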

Type 2: Social Media as a Feedback Channel

The second main use of social usability testing is as an ongoing, open feedback channel. Examples include blog posts inviting comments and ideas, a ‘Feedback’ tab or button in the app or on the website, or just generally making it known that you’re open to and soliciting feedback, whether via Twitter, Facebook, or another channel.

Type 2 is much less specific. In fact, I would almost group it in with regular metrics gathering, except that it’s really much more interactive. There’s usually an opportunity to engage with the customer and gather further information about their feedback.

My favorite example of this is a ‘Feedback’ tab inside an app or posted on a website. It’s a passive way of soliciting feedback, and also very unstructured. The user can click somewhere and tell us something about what they think.
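
As a rough illustration of how little such a tab needs to be, here’s a minimal browser sketch in TypeScript. The /api/feedback endpoint is hypothetical, and a real widget would use a proper form instead of window.prompt, but the idea is the same: one click, one free-text box, no structure imposed on the user.

```typescript
// Minimal "Feedback" tab sketch: a fixed button that opens a free-text box
// and posts whatever the user writes to a hypothetical endpoint.

const tab = document.createElement('button');
tab.textContent = 'Feedback';
tab.style.cssText = 'position:fixed;right:0;top:50%;transform:rotate(-90deg);';

tab.addEventListener('click', () => {
  const message = window.prompt('What do you think? Anything helps.');
  if (!message) return;

  void fetch('/api/feedback', {               // hypothetical endpoint
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      message,
      page: location.pathname,                // where the feedback came from
      at: new Date().toISOString(),
    }),
  });
});

document.body.appendChild(tab);
```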

In order to conduct these tests, you’re probably going to need some tools. We’ll be covering such tools in a coming tutorial, so stay tuned.

Workflow

We’ve covered framing the problem and mentioned that we’ll need tools to help us. How about actually running this thing? Here’s the good part: it’s super easy. But you do need to keep an eye on any test in progress. Easy isn’t a free license to ignore it and leave it to brew on its own, because if there are any problems, that can be very counterproductive. Plus, you don’t want to miss out on any conversations that might get sparked by the distribution of the test!

Decisions to Make

  • Which social media channel will be best for this test?
  • Which tool is best suited to this test?
  • How long will you wait for responses before you switch to an alternate plan?

How you’ll recruit testers will be determined by the type of question you’re trying to answer, as well as by which social networks your organization participates in.

If you really do need people with a certain level of familiarity with your industry lingo, you might want to post it on LinkedIn, for example.

If you want as many responses as possible as quickly as possible, Twitter is probably a better choice. And there’s no law against cross-posting!

Usually I post it on our company Twitter feed, which cross-posts to Facebook at the same time, for the maximum possible response. Sometimes I’ll post it from my personal Twitter account as well.
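
If you’d rather script that announcement than paste the link by hand, here’s a hedged sketch using the Twitter v2 create-tweet endpoint. It assumes you already have a user-context OAuth 2.0 access token with write permission; in practice, posting from the Twitter and Facebook apps (or whatever scheduling tool your company already uses) works just as well.

```typescript
// Sketch: push a test link out programmatically via the Twitter v2 API.
// Assumes a user-context OAuth 2.0 access token with write permission.

async function announceTest(testUrl: string, accessToken: string): Promise<void> {
  const response = await fetch('https://api.twitter.com/2/tweets', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${accessToken}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      text: `Got two minutes? We're testing a small design change: ${testUrl}`,
    }),
  });

  if (!response.ok) {
    throw new Error(`Tweet failed: ${response.status}`);
  }
}
```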

Also, keep in mind that you might be able to post it on your website or application. This might vary depending on your users and your organization.

Keep an Eye

Wherever you decide to post it, once you do, you need to keep an eye on things. If there was a typo, a broken link, or any sort of problem with the test, you’re going to hear about it quickly. If there’s one thing people love to do on social media, it’s complain.

I always try to have a colleague take the test first before I post it, but sometimes these things slip through the cracks, especially if you’re moving quickly.

Other problems to look out for include confusion around using the testing software. For example, I once had some people on Twitter who were confused about the note-leaving functionality in a tool. The tool explains how to use it at the beginning of each test, but people don’t always read, and if it’s not obvious, there’s going to be confusion.

You may also get the side effect of bonus feedback: someone sees that you’re looking for testers, but instead of taking the test, they start tweeting about how they like or don’t like X about your product.

Do have your support team on standby; requesting targeted feedback can have a way of stirring the pot, as it were.

The Aftermath

When the results finish coming in, if you’ve got a well-defined question, you should have a clear answer that doesn’t require any interpretation.
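
For a click test like the sidebar example above, that clear answer can be as mechanical as comparing median completion times per variant. Here’s a small sketch; the ClickResult shape mirrors the earlier click-test example and is purely illustrative.

```typescript
// Group raw click-test results by variant and compare median completion time.

type ClickResult = { variant: string; msToComplete: number };

function medianTimeByVariant(results: ClickResult[]): Record<string, number> {
  const byVariant: Record<string, number[]> = {};
  for (const r of results) {
    (byVariant[r.variant] ??= []).push(r.msToComplete);
  }

  const medians: Record<string, number> = {};
  for (const [variant, times] of Object.entries(byVariant)) {
    const sorted = [...times].sort((a, b) => a - b);
    const mid = Math.floor(sorted.length / 2);
    medians[variant] = sorted.length % 2
      ? sorted[mid]
      : (sorted[mid - 1] + sorted[mid]) / 2;
  }
  return medians;
}

// e.g. { 'sidebar-grouped-a': 2100, 'sidebar-grouped-b': 3400 } -> A wins.
```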

At my company, particularly in the case of feature click tests, everyone has usually been looking over my shoulder to see how their favorite is doing.

When the winner is announced, there’s some dancing and cheering, and usually a YouTube video of some kind, or at the bare minimum an animated GIF. But if your organization is less invested in your test results, you might consider packaging them in a brief report to distribute among stakeholders.

I mentioned earlier how interactive this method of testing can be. Make sure that anyone who reaches out with comments or questions gets a response, or you’ll have a lot less trust and interest the next time you need some helpers to do a test.

Take the time to follow up with each one personally, if you can. In my experience people usually think it’s cool when they get to talk to the UX team (or the people who are just generally doing UX).

Review

So, let’s review. This is my workflow, in a nutshell. I have no doubt that every organization is different and you will need to make adjustments, but I think this covers the basics of what needs to happen when you’re running a test in this medium.

  1. Define a problem. If you don’t know what information you’re trying to get, whatever your testers give you in the results probably won’t be that helpful.
  2. Phrase the question. A poorly designed question almost forces a non-helpful answer.
  3. Define your audience. Don’t make George flag you as spam.
  4. Pick your weapon. Decide what tool works best for the task you’re trying to do.
  5. Send it out there!
  6. Follow up.

Ready to go out and try it?

Don’t be put off by the worry that you won’t get any responses. At the bare minimum, tweet it to me @aarahkahak, and I WILL take your test, I promise. And likewise, whenever you see someone with a survey or click test out there, be a pal and take the five minutes to do it.
