Interviews

Applicable stages: design, code, test, and deployment.
Personnel needed for the evaluation:
Can be conducted remotely: No
Can obtain quantitative data: No
Overview
In this technique, human factors engineers formulate questions about the product based on the issues of interest. They then interview representative users, asking these questions in order to gather the desired information. Interviews are good at obtaining detailed information, as well as information that can only be obtained through the interactive process between the interviewer and the user.
Procedure
In an evaluation interview, an interviewer reads the questions to the user, the user replies verbally, and the interviewer records those responses. Interviewing methods include unstructured interviewing and structured interviewing.
Unstructured interviewing methods are used during the earlier stages of usability evaluation. The objective of the investigator at this stage is to gather as much information as possible concerning the user's experience. The interviewer does not have a well-defined agenda and is not concerned with any specific aspects of the system. The primary objective is to obtain information on procedures adopted by users and on their expectations of the system.
Structured interviewing has a specific, predetermined agenda with specific questions to guide and direct the interview. Structured interviewing is more of an interrogation than unstructured interviewing, which is closer to a conversation.
When holding an interview, the following guidance should be followed:
- Always record the interview. Taking notes is often a distraction to the subject, who will have to restrain him/herself from looking to see what is being written.
- Phrase the questions in an open or neutral way. Also, encourage the user to reply with full sentences, rather than a simple "yes" or "no". For example, ask, "What do you think of this feature?" and not "Did you like this new feature?"
- Begin with less demanding topics and move to more complex issues.
- Ask questions to reveal more information, not to confirm the investigator's beliefs.
- Include instructions about the answer. For example, answers can range from lengthy descriptions, to briefer explanations, to identification or simple selection, to a simple "yes" or "no".
- Do not try to explain to a subject why the system behaved in a particular way. Do not justify the design decision.
- Avoid using jargon. Use terms that the subjects can understand.
- Do not ask leading questions. A leading question implies that a situation exists and influences the direction of the response. For example, "How did that poorly designed dialog affect you?"
- Do not agree or disagree with the user; remain neutral.
- Use probes to obtain further information after the original question is answered (especially during the earlier stages of usability testing). Probes are used to encourage the subjects to continue speaking, or to guide their response in a particular direction so a maximum amount of useful information is collected. Types of probes include:
- Addition probe encourages more information or clarifies certain responses from the test users. Either verbally or nonverbally the message is, "Go on, tell me more," or "Don't stop."
- Reflecting probe, by using a nondirective technique, encourages the test user to give more detailed information. The interviewer can reformulate the question or synthesize the previous response as a proposition.
- Directive probe specifies the direction in which a continuation of the reply should follow without suggesting any particular content. A directive probe may take the form of "Why is that (the case)?"
- Defining probe requires the subject to explain the meaning of a particular term or concept.