User Testing: Cloud Based Video Recording Service

Concept and usability testing for a new cloud streaming storage service.

OVERVIEW
DURATION
2 Weeks

ROLE
Researcher
Moderator

PROBLEM

The purpose of this study was to evaluate the usability of a prototype of a new cloud recording service iOS mobile application. The application saves video from video streaming services to a device.

The primary points of concern centered around user comprehension of the steps it takes to create a downloadable recording. The servers must record, in real time, video from one of these sources, and only afterwards is it available for the user to download. The process is very similar to a traditional DVR, but it was expected to be unfamiliar in the mobile form factor and with streaming services, which typically offer content instantly. Additionally, recording failures are not uncommon with the current backend implementation.

The research goals were to evaluate how users believed the service worked and how they understood and reacted to errors, along with more traditional usability concerns such as visual presentation and UI navigation.

 

PROCESS

This project was done in collaboration with a member of the product team and fit into their product cycle. The team had already done a significant amount of work and created prototypes for testing, which is where I fit in. My suggestions had a key influence on the final software.

 

METHODS

PROTOTYPE

The evaluation prototype was created with the prototyping software Pixate and ran on an iPad Mini. A task list was created in conjunction with a few “golden path” navigation flows; in other words, the prototype did not carry all the functionality of the expected final product, but it supported the tasks required to explore the research questions.

 

PARTICIPANTS

Participants were screened to represent the target market for the application as defined by the company.

  • 22 - 36 years old
  • Uses a media streaming service such as Netflix, Hulu, Spotify, etc.
  • Owns a compatible device (iOS: iPhone, iPad)

 

PROCEDURE

Following background questions, a description of the application was verbally given to each participant. This description essentially reflected what they might see in an app store listing or a web ad. It purposefully mentioned the cloud recording process but did not go into detail.

Next, participants were instructed to review the initial screen of the application and verbally evaluate their impressions and expectations. Once complete, a series of four tasks and scenarios was presented, representing real use cases for the app.

Finally, participants were invited to share any final thoughts, and concluding discussions took place.

 

RESULTS

INITIAL IMPRESSIONS

Participants felt comfortable recognizing the familiar networks that were immediately displayed by the interface. They universally noticed the navigation bar second, followed by slight confusion about the vacant black space on the right-hand side, highlighted below. The prevailing feelings were those of familiarity and comfort. The main negative impressions concerned the navigation bar, whose small size made it difficult to read.

Participants weren't sure of the purpose of this empty space.


TASK 1

The first task was beginning a download of the Netflix TV show Master of None. Participants had no issues with this task; it was completed quickly and easily. They shared the sentiment that the flow was intuitive and the UI was comfortable. They also shared a challenge in interpreting what “added to queue” meant (the message received once the episode was paid for). One participant noted that the full episode description under the show might give too much away should spoilers be a concern.

Unfamiliarity with the mental model of "recording" led to difficulties in understanding the queue.


TASK 2

The second task was checking the status of the same Netflix show. Half of the participants did not follow the expected path. One participant was irritated that they could not press the “in queue” button to reach their queue. Evaluating the queue page proved difficult for all participants; none was entirely comfortable with the information being presented to them. The most common issues, highlighted in a screenshot, were:

  • Problems understanding if the videos underneath the “currently in queue” have been downloaded or have yet to be downloaded
  • Problems understanding if this queue is for downloading or recording to the cloud

Continued misunderstanding of how the system functions led to difficulty in navigation.


TASK 3

Task three had participants finding the content they had downloaded to their device and playing the movie Whiplash. Three participants navigated to their library, chose to show the content on their device, and played the movie. Two more played it through the notification section, which was successful, but they missed navigating to all of the content on their device. Two, en route to playing the movie, navigated to their account, assuming they would be able to see all of the content they owned. A major source of confusion arose immediately before playing the movie: a “download” icon exists next to the “play” icon. Participants who noticed this icon began second-guessing whether they had the content at all.

The system misrepresented its state by placing a "download" button on a piece of media that had already been downloaded.


TASK 4

The final task was reacting to an error message. The participants were told they received an email from the service notifying them of a failed recording attempt. The goal was to find the notification and gauge reactions to both the error and being offered a recording credit for their failed download.

Although the error message indicated that the system had attempted to record the video three times, the majority of participants believed it was still possible to obtain the video by trying again themselves. None of the participants were confident they understood what had led to the error. All speculated at some point that the error had occurred due to a problem with their own network, representing a misunderstanding of the app's process. All participants indicated that if this error occurred with any regularity, they would uninstall the app.

Errors were a large cause of frustration and confusion.


RECOMMENDATIONS

These are the recommendations I put forth for the product, which were incorporated into its release version.
 

NAVIGATION BAR

  • Different icons should be used for “queue” and “notifications”. The current icons are a source of confusion.
     

SIZE

  • The navigation bar should be enlarged; its current size makes the text hard to read. A larger bar may also address the issues with icon misinterpretation.
  • Most touch targets may be too small; increasing the size of menus and lists may result in a better experience.
     

QUEUE PAGE

  • Explicitly indicating “recording to cloud” instead of “recording” could alleviate some general confusion and facilitate a clearer mental model of the app.
  • Changing “6 / 32” to “recorded 6 of 32 minutes” may resolve confusion over what “6 / 32” means; an animated progress indicator may also do this on its own. Should the progress bar visibly move, it may spark understanding in a user’s mental model.
     

PLAY CONTENT PAGE

  • Remove the download icon from content that has already been downloaded.
     

ERROR MESSAGE

  • Incorporate a recommendation to the user to “try again later”.
  • Be transparent in attributing the failure to the streaming provider (e.g., “Netflix has moved this content; it will be available at a later date”). Allow scheduling a retry (e.g., in 3 hours, in one day).
  • An error in downloading one episode was perceived as a huge roadblock when trying to download an entire TV season. Make the “failed” status visible anywhere the episode information can be viewed (e.g., queue, library, discover).