
Chapter Six

Evaluation Techniques and Universal Design

Chapter Content

1. Evaluation Techniques (What is evaluation?, Goals of evaluation, Choosing an evaluation method)
2. Universal Design (Introduction, Universal design principles, Multi-modal interaction, Designing for diversity)

Evaluation Techniques - Introduction (1.1)

• Evaluation tests the usability, functionality and acceptability of an interactive system.
• Evaluation may take place in the laboratory or in the field.
• Some approaches are based on expert evaluation:
– analytic methods
– review methods
– model-based methods
• Some approaches involve users:
– experimental methods
– observational methods
– query methods
Evaluation Techniques – What is Evaluation (1.2)

Evaluation
– tests usability and functionality of system
– occurs in laboratory, field and/or in collaboration
with users
– evaluates both design and implementation
– should be considered at all stages in the design life
cycle

Evaluation Techniques - Goals of Evaluation (1.3)

• Assess extent of system functionality
– The system’s functionality is important in that it must accord with the user’s requirements.
• Assess effect of interface on user
– This includes considering aspects such as how
easy the system is to learn, its usability and the
user’s satisfaction with it.
• Identify specific problems
– These may be aspects of the design which, when
used in their intended context, cause unexpected
results, or confusion amongst users.
Evaluation Techniques - Choosing an Evaluation Method
(1.4)

Evaluation Through Expert Analysis

• Three approaches to expert analysis include:
– Cognitive Walkthrough
– Heuristic Evaluation
– Review-based evaluation

Cognitive Walkthrough
Proposed by Polson et al.
– evaluates design on how well it supports user in
learning task
– usually performed by expert in cognitive psychology
– expert ‘walks through’ design to identify potential
problems using psychological principles
– forms used to guide analysis

• For each task walkthrough considers
– what impact will interaction have on user?
– what cognitive processes are required?
– what learning problems may occur?

• Analysis focuses on goals and knowledge: does the design lead the user to generate the correct goals?
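A minimal illustration of how forms can guide this analysis. The four questions are the ones commonly quoted for the (streamlined) cognitive walkthrough of Wharton et al.; the WalkthroughStep structure and the example task are invented here purely as a sketch, not Polson et al.'s original forms.

```python
# Illustrative per-step cognitive walkthrough record (sketch only).
from dataclasses import dataclass, field

QUESTIONS = (
    "Will the user try to achieve the right effect?",
    "Will the user notice that the correct action is available?",
    "Will the user associate the correct action with the desired effect?",
    "If the correct action is taken, will the user see that progress is being made?",
)

@dataclass
class WalkthroughStep:
    action: str                                  # action the interface requires
    answers: dict = field(default_factory=dict)  # question -> (ok, note)

    def record(self, question: str, ok: bool, note: str = "") -> None:
        self.answers[question] = (ok, note)

    def problems(self):
        """Questions answered 'no' flag likely learning problems."""
        return [q for q, (ok, _) in self.answers.items() if not ok]

# Example: one step of a hypothetical 'save document' task
step = WalkthroughStep(action="Select 'Export...' from the File menu")
step.record(QUESTIONS[0], True)
step.record(QUESTIONS[1], False, "users expect 'Save as', not 'Export'")
print(step.problems())
```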

Heuristic Evaluation
• Proposed by Nielsen and Molich.
• usability criteria (heuristics) are identified
• design examined by experts to see if these are
violated
• Example heuristics
– system behaviour is predictable
– system behaviour is consistent
– feedback is provided
• Heuristic evaluation ‘debugs’ design.
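A minimal sketch of collating findings from several expert evaluators. The 0-4 severity scale follows Nielsen's common convention; the findings and the aggregation method are invented for illustration only.

```python
from collections import defaultdict
from statistics import mean

# (evaluator, heuristic violated, severity 0-4, note) -- illustrative data
findings = [
    ("E1", "feedback is provided", 3, "no confirmation after delete"),
    ("E2", "feedback is provided", 4, "no confirmation after delete"),
    ("E1", "system behaviour is consistent", 2, "OK/Cancel order varies"),
    ("E3", "system behaviour is predictable", 1, "menu reorders itself"),
]

# Group severities by heuristic
by_heuristic = defaultdict(list)
for _, heuristic, severity, _ in findings:
    by_heuristic[heuristic].append(severity)

# Rank violated heuristics by mean severity to prioritise redesign work
for heuristic, severities in sorted(by_heuristic.items(),
                                    key=lambda kv: mean(kv[1]), reverse=True):
    print(f"{heuristic}: mean severity {mean(severities):.1f} "
          f"({len(severities)} report(s))")
```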
Review-based Evaluation
• Results from the literature used to support or refute parts of design.
• Care needed to ensure results are transferable to new design.

Model-based Evaluation
• Cognitive models used to filter design options,
e.g. GOMS prediction of user performance.
• Design rationale can also provide useful evaluation information.
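A sketch of a Keystroke-Level Model (KLM) prediction, the simplest member of the GOMS family. The operator times below are the commonly cited approximate values from Card, Moran and Newell; the example task is invented, so treat the result as a rough estimate, not a measured figure.

```python
# Approximate KLM operator times in seconds (commonly cited values)
OPERATOR_TIME = {
    "K": 0.28,   # keystroke (average typist)
    "P": 1.10,   # point with mouse
    "B": 0.10,   # mouse button press or release
    "H": 0.40,   # home hands between keyboard and mouse
    "M": 1.35,   # mental preparation
}

def predict_seconds(operators: str) -> float:
    """Sum operator times for a task written as a string of KLM codes."""
    return sum(OPERATOR_TIME[op] for op in operators)

# Example: think, point at a menu, click, move to keyboard, type 4 characters
task = "M" + "P" + "BB" + "H" + "M" + "KKKK"
print(f"Predicted task time: {predict_seconds(task):.2f} s")
```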
Evaluating through User Participation
Laboratory studies
• Advantages:
– specialist equipment available
– uninterrupted environment
• Disadvantages:
– lack of context
– difficult to observe several users cooperating
• Appropriate
– if system location is dangerous or impractical
– for constrained single user systems to allow controlled manipulation of use
Field Studies
• Advantages:
– natural environment
– context retained (though observation may alter it)
– longitudinal studies possible
• Disadvantages:
– distractions
– noise
• Appropriate
– where context is crucial
– for longitudinal studies
Evaluating Implementations
Experimental evaluation
• controlled evaluation of specific aspects of
interactive behaviour
• evaluator chooses hypothesis to be tested
• a number of experimental conditions are
considered which differ only in the value of some
controlled variable.
• changes in behavioural measure are attributed to
different conditions
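A minimal sketch of analysing such an experiment: task completion times under two interface conditions, compared with an independent-samples t-test. The data are invented for illustration and scipy is assumed to be available.

```python
from scipy import stats

# Completion times in seconds under two interface designs (made-up data)
design_a = [41.2, 38.5, 45.0, 39.9, 43.1, 40.4, 44.7, 42.0]  # control
design_b = [35.1, 33.8, 37.9, 36.4, 34.2, 38.0, 35.7, 36.9]  # new layout

# Is the difference in mean completion time attributable to the
# controlled variable (the interface design)?
result = stats.ttest_ind(design_a, design_b)
print(f"mean A = {sum(design_a)/len(design_a):.1f} s, "
      f"mean B = {sum(design_b)/len(design_b):.1f} s")
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```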
Observational Methods
Think Aloud
• user observed performing task
• user asked to describe what they are doing and why, what they
think is happening, etc.
• Advantages
– simplicity - requires little expertise
– can provide useful insight
– can show how system is actually used
• Disadvantages
– subjective
– selective
– act of describing may alter task performance
Cooperative Evaluation
• variation on think aloud
• user collaborates in evaluation
• both user and evaluator can ask each other
questions throughout
• Additional advantages
– less constrained and easier to use
– user is encouraged to criticize system
– clarification possible

Protocol Analysis
• paper and pencil – cheap, limited to writing speed
• audio – good for think aloud, difficult to match with other protocols
• video – accurate and realistic, needs special equipment, obtrusive
• computer logging – automatic and unobtrusive, large amounts of data
difficult to analyze
• user notebooks – coarse and subjective, useful insights, good for
longitudinal studies
• Mixed use in practice.
• audio/video transcription difficult and requires skill.
• Some automatic support tools available
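A minimal sketch of the computer-logging option: interaction events are appended to a file with timestamps, then summarised offline to tame the volume of data. The event names and fields are invented for illustration.

```python
import json
import time
from collections import Counter

LOG_FILE = "session.log"

def log_event(event: str, **details) -> None:
    """Append one timestamped interaction event as a JSON line."""
    record = {"t": time.time(), "event": event, **details}
    with open(LOG_FILE, "a") as fh:
        fh.write(json.dumps(record) + "\n")

def summarise(path: str = LOG_FILE) -> Counter:
    """Count event types -- a first pass over a large, unobtrusive log."""
    with open(path) as fh:
        return Counter(json.loads(line)["event"] for line in fh)

# Example usage during a session:
log_event("menu_open", menu="File")
log_event("error_dialog", message="file not found")
print(summarise())
```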

Automated Analysis
• Workplace project
• Post task walkthrough
– user reflects on actions after the event
– used to fill in intention
• Advantages
– analyst has time to focus on relevant incidents
– avoid excessive interruption of task
• Disadvantages
– lack of freshness
– may be post-hoc interpretation of events

Post-task Walkthroughs
• transcript played back to participant for
comment
– immediately → fresh in mind
– delayed → evaluator has time to identify questions
• useful to identify reasons for actions and
alternatives considered
• necessary in cases where think aloud is not
possible
• Query Techniques
– Interviews
– Questionnaires (see the scoring sketch below)
• Physiological methods
– Eye tracking
– Physiological measurement
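As a concrete questionnaire example, the sketch below scores the System Usability Scale (SUS), a widely used 10-item usability questionnaire. The scoring rule (odd items contribute response minus 1, even items contribute 5 minus response, total multiplied by 2.5) is the standard one; the responses shown are invented for illustration.

```python
def sus_score(responses):
    """responses: 10 answers on a 1-5 scale, in questionnaire order."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5   # yields a score on a 0-100 scale

# Example: a fairly positive respondent (invented answers)
print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))
```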

Factors distinguishing evaluation techniques
• the stage in the cycle at which the evaluation is carried
out
• the style of evaluation
• the level of subjectivity or objectivity of the technique
• the type of measures provided
• the information provided
• the immediacy of the response
• the level of interference implied
• the resources required
Universal Design - Introduction (2.1)

• Universal design is about designing systems so that they can be used by anyone in any circumstance.
• Multi-modal systems are those that use more than one
human input channel in the interaction. These systems
may, for example, use: speech, non-speech sound,
touch, handwriting and gestures.
• Universal design means designing for diversity,
including:
– people with sensory, physical or cognitive impairment
– people of different ages
– people from different cultures and backgrounds.

Universal Design - Universal Design Principles (2.2)

• equitable use
• flexibility in use
• simple and intuitive to use
• perceptible information
• tolerance for error
• low physical effort
• size and space for approach and use

Universal Design - Multi-modal Interaction (2.3)

Multi-modal vs. Multi-media

• Multi-modal systems
– use more than one sense (or mode) of interaction
e.g. visual and aural senses: a text processor may speak the words as well
as echoing them to the screen

• Multi-media systems
– use a number of different media to communicate
information
e.g. a computer-based teaching system: may use video, animation, text
and still images: different media all using the visual mode of interaction;
may also use sounds, both speech and non-speech: two more media, now
using a different mode
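An abstract sketch of the text-processor example above: the same words are echoed to the screen (visual mode) and handed to a speech callback (aural mode). The speak_fn here is a hypothetical placeholder; a real system would plug in an actual text-to-speech engine.

```python
from typing import Callable

class MultiModalEcho:
    """Echo words on two channels: screen (visual) and speech (aural)."""
    def __init__(self, speak_fn: Callable[[str], None]):
        self.speak_fn = speak_fn

    def echo(self, word: str) -> None:
        print(word, end=" ", flush=True)   # visual channel
        self.speak_fn(word)                # aural channel

def fake_speech_engine(word: str) -> None:
    pass  # placeholder: a real implementation would call a TTS engine

editor = MultiModalEcho(speak_fn=fake_speech_engine)
for word in "universal design benefits everyone".split():
    editor.echo(word)
print()
```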
Universal Design - Designing for Diversity (2.4)

Users with Disabilities


• visual impairment
– screen readers, SonicFinder
• hearing impairment
– text communication, gesture, captions
• physical impairment
– speech I/O, eyegaze, gesture, predictive systems (e.g. Reactive keyboard)
• speech impairment
– speech synthesis, text communication
• dyslexia
– speech input, output
• autism
– communication, education
• Age groups
– older people e.g. disability aids, memory aids,
communication tools to prevent social isolation
– children e.g. appropriate input/output devices,
involvement in design process
• Cultural differences
– influence of nationality, generation, gender, race, sexuality,
class, religion, political persuasion etc. on interpretation of
interface features
– e.g. interpretation and acceptability of language, cultural
symbols, gesture and colour
Summary

• Evaluation is an integral part of the design process and should take place throughout the design life cycle. Its aim is to test the functionality and usability of the design and to identify and rectify any problems.
• A design can be evaluated before any
implementation work has started, to minimize the
cost of early design errors.
• The choice of evaluation method is largely
dependent on what is required of the evaluation.
Summary

• Universal design is about designing systems that are accessible by all users in all circumstances, taking account of human diversity in disabilities, age and culture.
• Universal design helps everyone – for example, designing a
system so that it can be used by someone who is deaf or hard
of hearing will benefit other people working in noisy
environments or without audio facilities.
• For any design choice we should ask ourselves whether our
decision is excluding someone and whether there are any
potential confusions or misunderstandings in our choice.
• What are 5 usability factors?
– Ease of learning
– Efficiency of use
– Memorability
– Error frequency and severity
– Subjective satisfaction
