We spoke about many different subjects during the meeting, including how researchers synthesize data. One researcher had attended a workshop put on by Dana Chisnell (Dana's blog) and she talked about the method she learned. At the workshop they performed an "affinity diagramming exercise" where the group:
1) Brainstormed on post-it notes
2) Grouped similar post-its together (stacked in a column, order doesn't matter)
3) Labeled each group with a theme heading written on a different-colored post-it
4) Voted (votes written directly on the theme headings), tallied up votes, and then prioritized each group
Before brainstorming, they made sure to have a clear focus question, grounded in observation and real data, not opinions. Another very interesting part of the exercise was that there was no talking! Everyone silently went to the wall and paired post-its together, then found pairs that went together, and so on. Then everyone silently voted on the different themes, which is how the prioritization was decided. I think that's a good idea: without the talking, things surely moved along more quickly and there was less influencing going on. It sounded like a great workshop. I'd be interested to attend one myself.
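The tally-and-prioritize step at the end of the exercise can be sketched in a few lines. This is just an illustration with made-up themes, post-its, and vote counts, not anything from the actual workshop:

```python
# Hypothetical example data: themes from the labeling step, the post-its
# grouped under each, and the votes written on each theme heading.
groups = {
    "Navigation": ["can't find search", "menu too deep"],
    "Onboarding": ["signup confusing", "too many steps", "no tutorial"],
    "Performance": ["slow page load"],
}
votes = {"Navigation": 4, "Onboarding": 7, "Performance": 2}

# Prioritize: the theme with the most votes comes first.
prioritized = sorted(groups, key=lambda theme: votes[theme], reverse=True)
print(prioritized)  # ['Onboarding', 'Navigation', 'Performance']
```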
We also talked a bit about the researchers' favorite tools for different tasks. One that I thought was really cool was the Livescribe pen. It's a pen that also records audio, and it links what you write to the audio as you write it. You write in a special notebook; say you're listening to a talk at a conference, and you write down the subject of the talk when it starts. Later, to hear the audio from that talk, you simply tap your pen on the title you wrote down. The note is linked to the time signature of the recording, so the audio starts playing from the moment you wrote it. If that all sounds confusing, just visit the website linked above. The user experience with the pen is pretty interesting as well. If you want to e-mail yourself your notes, you write "email" on the paper and then tap your pen on each page you want to send. Every function you want to perform, you write or draw. I could see this being extremely useful when interviewing a user! I want one.
The group spoke about different methods for videotaping users and remote user testing. Here are the different methods discussed:
- One of the researchers, who studies the usability of a mobile app, just puts his iPhone on a tripod. He sets it up close to the user so that the image shows the user holding the mobile device from her own vantage point.
- Hugging a laptop is another method for seeing what's happening on a mobile device. The user turns the laptop's camera away from herself and "hugs it." She then holds the mobile device out in front of her so that the laptop camera records what she's doing.
- There's an app out there called Reflection, which mirrors what is happening on the screen of your iPhone, iPad, or Mac. It's, as you may have guessed, only for Apple products right now. Also, you don't see the user's gestures, just what is happening on the screen, and the devices must be connected over Wi-Fi.
- ScreenFlow can be used for recording what is happening on a screen.
- For remote user testing, GoToMeeting is an online meeting tool that can be used.
- Join.Me is another meeting tool that's good for user testing; it lets you share your screen.
I'll leave you with a picture of me hugging a laptop.
And here's the laptop taking a picture of me holding my iPhone, which is also taking a picture of me hugging my laptop. So meta!