Improving ApplyBC user experience

Knowledge gathered from ApplyBC user testing will soon lead to a better user experience. BCcampus conducted the first formal usability tests for our post-secondary application service at the Vancouver location in January. We wanted to identify errors and critical problems and gauge how difficult it is for an applicant to complete two forms: the ApplyBC.ca common form and the institutional form. We also wanted to understand the cause of any difficulties encountered, gather feedback and suggestions from test participants, and suggest improvements to the user experience for consideration in future ApplyBC development.

First off, we recruited participants through Facebook, Twitter, and personal connections, based on a few criteria: they must not have used ApplyBC before, they must have their own laptops, and they must be available during the times we had booked the testing room. As recommended by Jakob Nielsen, a leading web usability consultant, we aimed to recruit five participants; in total we had six: two high school students, two university students, and two BCcampus colleagues. We tested one participant at a time, with each session lasting approximately one hour.

During this hour, participants went through the application process according to a specified scenario: using ApplyBC, apply via an institutional form for any program, as if that institution were their preferred choice. Once participants finished the scenario, we had them fill out a brief questionnaire about ease of use, and then interviewed them about their overall experience and their opinions of the forms.

The scenario was the most important part of the session: it let us observe any critical errors that could prevent a user from completing the form and identify other, non-critical issues. To get an idea of how intuitive the form is, we watched whether participants needed some form of help, such as asking questions or clicking on the help boxes. To capture this information, we recorded the sessions on video and encouraged participants to think aloud as they used the application. We noted their comments on data collection forms (printed screenshots of both the ApplyBC common form and the institutional form).

To sum up our results by the numbers:

  • 20 minutes: average time to complete the application
  • 13 minutes: fastest time to complete the application
  • 30 minutes: slowest time to complete the application
  • 2: critical errors observed in the ApplyBC common form
  • 10: non-critical errors observed in the common form
  • 11: questions asked about the common form
  • 38: suggestions regarding the common form
  • 0: critical errors in the institutional form
  • 4: non-critical errors in the institutional form
  • 6: questions asked about the institutional form
  • 12: suggestions made about the institutional form

The two critical errors in the common form happened with a single participant: the first was an Internet Explorer script warning that caused the browser to crash. The second occurred when the participant filled in his personal information a second time: a “Username already exists” error popped up, and the expected behaviour, which is to detect that the user’s information is already on file and offer that information, did not occur.
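For illustration only, the sketch below (in TypeScript, using hypothetical names such as Applicant, findApplicant, prefillForm, and createApplicant, none of which come from ApplyBC itself) shows the kind of check the expected behaviour implies: look the applicant up before creating a new record, and offer the information already on file rather than raising an error.

    // Minimal sketch of the expected duplicate-account behaviour; all names
    // here are hypothetical and do not reflect ApplyBC's actual code.
    interface Applicant {
      username: string;
      firstName: string;
      lastName: string;
      email: string;
    }

    // Hypothetical helpers standing in for the service's persistence and UI layers.
    declare function findApplicant(username: string): Applicant | undefined;
    declare function prefillForm(existing: Applicant): void;
    declare function createApplicant(data: Applicant): void;

    function submitPersonalInfo(data: Applicant): void {
      const existing = findApplicant(data.username);
      if (existing) {
        // Expected: recognise the returning applicant and offer the information
        // already on file, instead of showing "Username already exists".
        prefillForm(existing);
      } else {
        createApplicant(data);
      }
    }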

The other, non-critical errors varied: missed mandatory fields, information typed that differed from what was asked for, and a link that led to a “Page not found” error. The questions asked fell into a few types: questions about the scenario itself (“The school [that I have to apply for] was … ?”), what-ifs (“What if I went to more than one institution?”), and questions about fields that did not clearly state what information was being asked for (ambiguities).

Testers gave us many suggestions for features that could make completing the tasks easier, as well as design recommendations. Despite these concerns, participants also had many compliments for the existing application (for instance, they liked the progress bar at the top).

Overall, each participant completed the form, though not without some obstacles; all participants had input about the design of the application; and we found few performance problems when transitioning between the two forms.

We had two weeks to carefully prepare, recruit, and execute the testing, and we consider the study a success. The sessions themselves went smoothly: participants were eager to start, all documents were prepared, set-up was efficient, and we obtained all the data we needed and more.

You can see all of the data from the study for yourself (except, for privacy reasons, the video recordings).

Gracey Mesina is a co-op student from the SFU School of Interactive Arts and Technology: http://gracelle.carbonmade.com/