Wizard of Oz prototype

After the video prototype, the class split into groups to create a behavioral prototype, also known as a Wizard of Oz prototype.  This type of prototype uses whatever methods the team has on hand to simulate the product under test, usually while keeping the test's participants oblivious to the fact that someone 'behind the curtain' is actually controlling the prototype.  It is a common way to get feedback on a product's initial design and functionality when building a higher-fidelity prototype would be too expensive.

For our project we were told to create a behavioral prototype of either a speech-to-text system, a wearable posture-detection system, or a gesture recognition system.  We chose the last, in the form of a gesture-controlled music app.  To simulate the sensing, we took advantage of Spotify Premium, which lets you remotely control playback on a mobile device from a laptop.  Using that feature, we were able to drive all the basic gesture controls: play, pause, next song, and previous song.  The setup was also affordable and easy to put together: all we used were a laptop and a tablet with Spotify Premium installed.
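For context, we drove the player by hand through the Connect feature built into Spotify's own apps, but the same remote-control effect could in principle be scripted against Spotify's Web API, which exposes play, pause, next, and previous endpoints to Premium accounts.  Here is a minimal sketch in Python, assuming you already hold a valid OAuth access token with the user-modify-playback-state scope; the token value and the gesture names are placeholders, not anything from our actual test:

```python
import requests

# Hypothetical sketch: our test used the Connect feature in Spotify's
# own apps, not a script. The token below is a placeholder and must be
# an OAuth token with the 'user-modify-playback-state' scope.
ACCESS_TOKEN = "YOUR_OAUTH_TOKEN"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
PLAYER = "https://api.spotify.com/v1/me/player"

def play():
    # Resume playback on the user's currently active device (the tablet).
    requests.put(f"{PLAYER}/play", headers=HEADERS)

def pause():
    requests.put(f"{PLAYER}/pause", headers=HEADERS)

def next_track():
    requests.post(f"{PLAYER}/next", headers=HEADERS)

def previous_track():
    requests.post(f"{PLAYER}/previous", headers=HEADERS)

# The 'wizard' watches the participant and fires the matching command,
# e.g. mapping the gestures we tested to these functions:
GESTURE_TO_COMMAND = {
    "wave": play,            # also used for pause, depending on state
    "swipe_right": next_track,
    "swipe_left": previous_track,
}
```

In our test Hailey played this wizard role through the Spotify app's own controls rather than code, but the effect on the tablet is the same.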

We next assigned roles to each of the group members for the test.  I was enlisted as the test's moderator, with Juan Cai taking on filming duties.  We introduced Rashmi Srinivas and 'Hailey' Xiaochen Yu to our participants as notetakers, while in reality Hailey was watching the participant's gestures and controlling the tablet's Spotify app from her laptop.  We set up our group and equipment in an alcove at UW's Allen Library and began our recruiting efforts.

Watch our video here:

Rashmi and I began by walking through the library to see if we could find anybody, but quickly realized that anyone already there was probably busy and wouldn't want to be disturbed, so we instead set up camp by the library's entrance to recruit people on their way in or out.  Finding someone fairly quickly, we introduced the team as a joint HCDE and CSE group (not actually a lie) that had created a gesture recognition system running in the background of Spotify, and began our first round of testing.

After setting up the user, we faced a surprise challenge.  It turned out that Rashmi and I had taken long enough to find a participant that the connection between the laptop and the tablet had timed out, so when Hailey tried to send the tablet the 'Play' command, the sound came from her laptop instead and a connection request dialog box appeared on the tablet.  There was a moment of confusion for everyone present, with the user asking if the volume on the tablet was turned down.  I soon realized what had happened and took the tablet from the user, making the Pause gesture (for both the 'device' and Hailey) before confirming the connection to the laptop.  Explaining this away as a bug in our system, I put the tablet back in front of the user and got on with the test proper.

From there, our first test went very well.  I had the user run some very simple play/pause tasks as well as a skip-forward and a skip-back command.  At the end of the test he seemed impressed with our 'app,' leading us to believe he was fooled despite the technical issues at the start.  For good measure we recruited a second participant, whom we found and tested successfully, fooling her completely, with only one interesting problem.

Looking back at the tests, we realized that one thing we should have done was train the users on the gestures before sitting them down in front of the device.  For the most part the gesture descriptions were enough for the participants to use the device successfully; however, our second participant's Play/Pause wave looked remarkably similar to the swipe used to skip forward a song, forcing Hailey to make a snap decision about which command to send the tablet.  Had we trained the user, that call would never have come up; as it was, we treated the motion as the Play/Pause command and kept responding to it that way for the rest of the test for the sake of consistency.

Thank you for reading!