Scenario-based Checklists

Characteristics

Applicable stages: design, code, test, and deployment.
Personnel needed for the evaluation:
Usability experts: 1 or 3
Software developers: 0
Users: 0
Usability issues covered:
Effectiveness: Yes
Efficiency: Yes
Satisfaction: No
Can be conducted remotely: Yes
Can obtain quantitative data: No

Overview

The inspection is done along three scenarios: novice use, expert use, and error handling. For each scenario, a checklist is provided that describes the issues to be checked, along with instructions for the inspection process. Each inspector works on only one scenario. Usually three inspectors are needed to inspect a system, one for each of the three scenarios, but a single inspector can also use this technique by carrying out three inspection sessions, one per scenario. Inspectors can be human factors engineers or software developers. One moderator (a human factors engineer) is needed to prepare the inspection materials (user profile, task cases), arrange the inspection, and collect the inspection results. This technique can be applied during the following development stages: design, code, test, and deployment.

Procedure

Prepare User Profile and Task Cases

The moderator needs to prepare the user profile and task cases and provide them to the evaluators. The tasks are presented differently to evaluators working on different scenarios. For novice use, each task is followed by the steps that are most easily learned for accomplishing the task. For expert use and error handling, each task is also followed by another set of steps that accomplish the task most efficiently, if such a set exists. These steps for accomplishing each task are provided by the designers or developers of the system.

If one evaluator uses the technique to evaluate all three scenarios, he/she should do novice use first, expert use second, and error handling last.

Inspection Instructions for Novice Use

  1. Read the user profile to understand the user characteristics.
  2. Read through these instructions to understand the evaluation task, and eliminate items that do not apply to the current system.
  3. Start the inspection by going through the following steps, checking each applicable item. Write down any task domain knowledge or computer knowledge that the system requires of users in order to understand or use it but that is not listed in the user profile.
  4. If a user may need to install the system, install the system and check question Q1.1.
    Q1.1: How easy is it to install the software?
    • Whether there is an installation program that requires as little effort as possible from the user.
  5. If the user may not have any experience in using the input devices, check question Q1.2 at the first appearance of the system.
    Q1.2: Does the system provide good support for the most-novice users?
    • When the system starts, whether there is information on the interface to help the most-novice users understand how to use the system.
  6. Go through the task cases. For each task, check questions Q1.4 and Q1.5. For each new screen/session, check question Q1.3.
    Q1.3: How understandable is the interface itself and its content?
    • Whether the interface has a meaningful caption indicating what it is and how it relates to other interfaces.
    • Whether the interface presents the objects/data that the user is interested in, using meaningful visual cues or spatial arrangement to represent the semantic information of the objects/data.
    • Whether the relationship among the objects on the interface is consistent with that in the user's knowledge.
    Q1.4: How easy is it to specify an object or an action?
    • Instructional information for novice users
      1. Not available on-line.
      2. In the on-line tutorial but is not retrievable by function.
      3. In the on-line tutorial that is retrievable by function.
      4. Always has clear instruction/affordance about what the user needs to do for a certain function.
    • When needed, whether the information about actions, objects, command formats, or data formats is presented directly on the interface.
    Q1.5: How easy is it to understand the outcome of an action?
    • Promptness of system feedback.
      1. No feedback.
      2. Only gives feedback when an action is complete.
      3. Gives feedback when an action is complete, and constantly gives progress information when the action takes more than 5 seconds (see the sketch after this list).
    • Whether a user can perceive the feedback without changing the current focus of attention.
    • Whether or not the representation of an object changes promptly when its representation-related content is changed.
    • Whether or not the representation of an object disappears promptly from the interface when the object is deleted.
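
To make the highest level of the feedback metric in Q1.5 concrete, the following is a minimal sketch, in Python, of the behavior being rewarded: the system confirms completion of an action and, once the action has been running for more than 5 seconds, keeps reporting progress. The export operation, its chunk size, and the message wording are hypothetical illustrations, not part of the checklist.

```python
import time

def export_records(records, chunk_size=100):
    """Hypothetical long-running action: export records in chunks,
    reporting progress once the action has run for more than 5 seconds."""
    start = time.monotonic()
    exported = 0
    for i in range(0, len(records), chunk_size):
        chunk = records[i:i + chunk_size]
        time.sleep(0.01)  # stand-in for the real export work
        exported += len(chunk)
        if time.monotonic() - start > 5:
            # Constant progress information once the action exceeds 5 seconds.
            percent = 100 * exported // len(records)
            print(f"Exporting... {exported}/{len(records)} records ({percent}%)")
    # Explicit feedback when the action is complete.
    print(f"Export finished: {len(records)} records exported.")

if __name__ == "__main__":
    export_records(list(range(1000)))
```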

Inspection Instructions for Expert Use

  1. Read through these instructions (including the questions and metrics) to understand the evaluation task, and eliminate items that do not apply to the current system.
  2. If you have not used the system before, use the system freely for a while to get familiar with the system.
  3. Go through the task cases. For each task, do the following:
  4. Think about the most efficient way to do it at the conceptual level and write down the conceptual steps.
  5. Read through the actions provided with the task and answer question Q2.1.
    Q2.1: Does the system effectively support users grasping the most efficient way of doing tasks, if it is not the only way to do them?
    • The way that information about the most efficient techniques is provided on the interface.
      1. Not available on-line.
      2. In the on-line tutorial or occasional tips.
      3. Present in some way each time the user uses the less efficient technique.
  6. Complete the task by carrying out the actions provided, and check questions Q2.2, Q2.3, Q2.4, and Q2.5.
    Q2.2: Is the interface designed to best facilitate the efficient completion of tasks?
    • Whether the system does not cause the user any extra work for the task.
    • Whether the system always provides default values when possible.
    • Whether cut-and-paste always works.
    • Whether the system is designed in a way that no frequent switching between keyboard input and mouse input is needed for tasks.
    Q2.3: How easy is it to use the system for the tasks?
    • Whether no difficult/stressful actions are involved in the task. Difficult/stressful actions include, but are not limited to:
      • Motor: double-clicking a mouse button, holding down a mouse button, pressing combination keys
      • Perceptual: reading blinking text, text in a small font, or text in hard-to-read colors
      • Cognitive: having to recall things presented in earlier stages for use in the current stage
    Q2.4: How does the layout of the user interface facilitate efficient use? (for screen-based interfaces)
    • Whether the window space is effectively used so that the amount of vertical and horizontal scrolling is minimized.
    • Whether the widgets are positioned in a way that mouse movement for the task is minimized.
    • Whether multiple windows are arranged in a way that one won't clutter another when they are both needed for a task.
    • Whether the use of unmovable, unresizable, or modal dialog boxes is minimized.
    Q2.5: How appropriate is the system feedback presented?
    • Whether there is no trivial system message that requires unnecessary user action.
    • When working with a large data space, the extent to which the system provides information as to where the current position is relative to the entire space (see the sketch after this list).
      1. No such information provided.
      2. Such information is provided when user initiates a command.
      3. Such information is provided constantly and dynamically.
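
To make the Q2.5 position metric concrete, here is a minimal sketch, in Python, of level 3 of the metric: the interface constantly and dynamically shows where the current position is relative to the entire data space. The record browser, its navigation command, and the status-bar wording are hypothetical illustrations, not part of the checklist.

```python
class RecordBrowser:
    """Hypothetical browser over a large list of records."""

    def __init__(self, records):
        self.records = records
        self.index = 0

    def position_indicator(self):
        """Status text shown permanently on the interface, e.g. in a status bar."""
        total = len(self.records)
        percent = 100 * (self.index + 1) // total
        return f"Record {self.index + 1} of {total} ({percent}%)"

    def move(self, offset):
        """Navigation command; the indicator is refreshed after every move."""
        self.index = max(0, min(len(self.records) - 1, self.index + offset))
        print(self.position_indicator())  # updated constantly and dynamically

if __name__ == "__main__":
    browser = RecordBrowser([f"row {i}" for i in range(1200)])
    browser.move(+599)   # -> "Record 600 of 1200 (50%)"
    browser.move(-100)   # -> "Record 500 of 1200 (41%)"
```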

Inspection Instructions for Error Handling

  1. Read through these instructions (including questions and metrics) to understand the evaluation task, and eliminate items that do not apply to the current system.
  2. If you have not used the system before, use the system freely for a while to get familiar with the system.
  3. If the user will use the system to do constructive work, generate an abnormal system termination while doing such work, and check the following:
    Q3.1: To what extent is the system error-corrective?
    • Whether the system saves the user's work periodically in case the system crashes.
    • When the system restarts after crashing, whether the system informs the user how to retrieve the most recent backup if one exists (see the autosave sketch after this list).
  4. Go through the task cases. Try to generate some user errors, such as entering data of the wrong type or wrong format, accessing the floppy disk drive when it is not ready, etc. Check the following questions and metrics:
    Q3.2: To what extent is the system error-preventive?
    • Whether invalid operations/commands are disabled.
    • Whether the system generates a message for destructive actions whose effects are not easy to recover from.
    • Whether the objects and actions are unambiguously presented.
    • Whether the system verifies data entered by a user for format, range, etc.
    • Whether the system maintains internal consistency so that users who follow the system's conventions won't make unexpected mistakes.
    • Whether the system complies with the conventions of the user's task domain knowledge and previous computer experience.
    Q3.3: To what extent is the system error-informative?
    • Whether the error message uses the user's language to explain what is wrong.
    • Whether the error message is context-sensitive.
    • Whether the error message informs the user what to do to recover from the error.
    • When the user's action does not cause any effect, whether the system generates a message telling the user why the action has no effect in this situation.
    Q3.4: To what extent is the system error-corrective?
    • Whether the system always has clearly marked exits.
    • Whether actions are made reversible whenever possible.
    • Whether the system provides an UNDO facility that can undo multiple steps of concrete operations (see the undo sketch after this list).
    • Whether each on-going operation can be canceled by the user.
    • When the user enters an invalid data entry, whether the system presents some valid data entries similar to what the user has entered and lets the user confirm.
  5. After going through the task cases, write down any inconsistencies you noticed in the user interface's presentation of objects and actions.
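
To make Q3.1 concrete, here is a minimal sketch, in Python, of the behavior being checked: the user's work is saved periodically, and after an abnormal termination the system tells the user how to retrieve the most recent backup. The file name, autosave interval, and recovery message are hypothetical illustrations, not part of the checklist.

```python
import json
import os
import time

BACKUP_FILE = "document.autosave.json"   # hypothetical backup location
AUTOSAVE_INTERVAL = 60                   # seconds; hypothetical value

class Editor:
    def __init__(self):
        self.content = ""
        self._last_save = time.monotonic()
        # On startup, check for a backup left behind by a crash.
        if os.path.exists(BACKUP_FILE):
            print(f"A backup from a previous session was found in {BACKUP_FILE}. "
                  "Use File > Recover to restore it.")

    def edit(self, text):
        self.content += text
        # Periodic autosave, so a crash loses at most AUTOSAVE_INTERVAL seconds of work.
        if time.monotonic() - self._last_save >= AUTOSAVE_INTERVAL:
            self.autosave()

    def autosave(self):
        with open(BACKUP_FILE, "w") as f:
            json.dump({"content": self.content, "saved_at": time.time()}, f)
        self._last_save = time.monotonic()

if __name__ == "__main__":
    editor = Editor()
    editor.edit("some constructive work")
    editor.autosave()   # force a save for the demonstration
```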
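
To make the multi-step UNDO item of Q3.4 concrete, here is a minimal sketch, in Python, of an undo stack: every reversible operation records its own inverse, so UNDO can be repeated to walk back through several steps. The text buffer and its method names are hypothetical illustrations, not part of the checklist.

```python
class TextBuffer:
    def __init__(self):
        self.text = ""
        self._undo_stack = []          # one entry per undoable operation

    def insert(self, s):
        self.text += s
        # Record the inverse operation so this step can be undone later.
        self._undo_stack.append(lambda n=len(s): self._truncate(n))

    def _truncate(self, n):
        self.text = self.text[:-n] if n else self.text

    def undo(self):
        if self._undo_stack:
            self._undo_stack.pop()()   # undo the most recent step
        else:
            print("Nothing to undo.")  # informative rather than silent

if __name__ == "__main__":
    buf = TextBuffer()
    buf.insert("Hello, ")
    buf.insert("world!")
    buf.undo()                         # removes "world!"
    buf.undo()                         # removes "Hello, "
    print(repr(buf.text))              # -> ''
```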