
4. User testing procedure

Test script

Our prototype was a redesign of a single task within the Indeed Jobs app, split into two pathways: the first was to find an interesting or relevant job post using the filters, and the second was to do the same using the chatbot we created. As such, our test script included two tasks designed to test these two pathways, full details of which can be found here.

Method of testing

We performed our usability testing as moderated remote tests, conducted over Skype and Zoom and recorded through the meeting software. Where a note-taker was unavailable for a session, another team member reviewed the recording and took notes to capture points of difficulty and other observations.

Testers

We recruited our participants from personal contacts and our classmates, whose ages ranged from 25 to 35. We asked our testers to use the “think-aloud” method to give us insight into their thought processes and observations throughout. We encouraged honest feedback and reassured them that any faults discovered were positive, as they allowed us to progress our design.

Our first round of testing consisted of 5 participants and our second round consisted of 4 participants. The recorded sessions and associated notes can be found here.

Measurement methods

For Round 1 of testing, we did not ask testers to fill out a System Usability Scale (SUS) questionnaire after their sessions; instead, we measured user satisfaction through observations, tester comments and notes. For Round 2, we asked testers to fill out a SUS questionnaire generated using Microsoft Forms, which 3 participants completed. The results of this SUS questionnaire can be seen in full here. Overall, the results indicate a high level of usability; however, it is difficult to gauge the improvement in usability from these results alone, as we have no comparable scores for the original app or our first prototype.
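
For reference, SUS responses are conventionally converted to a 0–100 score by rescaling the odd (positively worded) and even (negatively worded) items and multiplying the sum by 2.5. The sketch below illustrates that standard calculation only; the responses in it are placeholders, not our participants' actual answers.

```python
# Minimal sketch of standard SUS scoring, assuming the usual
# 10-item questionnaire answered on a 1-5 scale.

def sus_score(responses):
    """Convert one participant's 10 SUS responses (1-5) into a 0-100 score."""
    total = 0
    for i, r in enumerate(responses, start=1):
        if i % 2 == 1:           # odd items are positively worded
            total += r - 1
        else:                    # even items are negatively worded
            total += 5 - r
    return total * 2.5

# Hypothetical responses (not real data): average the per-participant scores.
participants = [
    [4, 2, 5, 1, 4, 2, 5, 2, 4, 1],
    [5, 1, 4, 2, 4, 1, 4, 2, 5, 2],
    [3, 2, 4, 2, 4, 3, 4, 2, 4, 2],
]
scores = [sus_score(p) for p in participants]
print(sum(scores) / len(scores))  # mean SUS score out of 100
```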

Results used to refine the prototypes

The results of our first round of testing allowed us to refine our second prototype to be more functional and less surprising than the first. The main changes included:

  • Removing GIFs from the chatbot, as they did not fit the desired tone of voice and were described by one participant as “a bit random”.
  • Enlarging the text of the onboarding procedure and reducing the number of steps and the amount of text within it, as one participant struggled to read it and simply skimmed it.
  • Adding a delay before the bot's responses. In our first prototype, two clicks were required to make the bot's response appear; in one test this caused a task failure and ended the session, as the tester was left stumped.
  • Introducing a carousel for job results within the chatbot function.