
CSE

LAB
COGNITIVE SYSTEMS ENGINEERING LABORATORY
Rogier Woltjer
<rogwo@ida.liu.se>
Division of Human-Centered Systems
Department of Computer and Information Science
Linköping University
BIKS1 4OKT06 F6&7
(with thanks to Erik Hollnagel and Yu-Hsing Huang)
Cognitive Systems Behaviour
in Complex Environments:
Accident Models and Risk Analysis
Accidents: Why? How?
Definition of accident
An accident is an unexpected event with an unwanted outcome.
(Unexpected event) AND (Unwanted outcome) → Accident
Hollnagel (2004)
Prevention of accidents
Normal operation → unexpected event → unwanted outcome → accident.
If the unexpected event is prevented (reduce the probability that the event happens), or the unwanted outcome is prevented (reduce the consequences of the event), the accident is avoided.
Anatomy of an Accident
Normal condition AND unexpected event → abnormal condition
Abnormal condition AND failure of control → loss of control
Loss of control AND lack of defence → accident
Rasmussen & Jensen (1973)
Cause of road accidents
Green & Senders (2003)
Study | # accidents | % human error
National Safety Council (1974) | - | 85%
Finnish Insurance Information Center (1974) | 1,193 | 89%
English study (cited in Sabey and Staughton, 1975) | 2,130 | 95%
Perchonok (1972) | 670 | 88%
Treat et al. (1977) | 2,258 | 92.6%
Cause of road accidents (Treat et al., 1977; 2,258 road accidents)
Driver (92.6%): improper lookout (23.1%), excessive speed (16.9%), inattention (15.0%), improper evasive action (13.3%), internal distraction (9.0%)
Environment (33.8%): view obstructions (12.1%), slick roads (9.8%), transient hazards (5.2%), design problems (4.8%), control hindrances (3.8%)
Vehicle (12.6%): braking systems (5.2%), tires and wheels (4.0%), communications systems (1.7%), steering systems (1.0%), body and doors (0.7%)
Changes to attributed cause types
[Chart: percentage of attributed causes, 1960-2000. The share attributed to technology declines over time, while the shares attributed to human factors and, later, the organisation increase.]
Hollnagel (2002)
Evolving concept of cause
From simple causality to complex coincidences, the causes attributed to an accident/event have grown to include: technical failures, software failures, human error (operation, maintenance, design, management), violations, heuristics, information processes, cognitive functions, latent failure conditions, organisational failures, pathogenic organisations, safety culture, barriers, quality management, resources, and others.
Hollnagel (2002)
Accident prevention approach
Collect data → analyze data → select remedy → apply remedy → monitor.
Founded on a basic personal philosophy of accident occurrence and prevention (principles, beliefs) and a fundamental approach to accident prevention (safety management), applied both to long-term safety management considerations and safety programming and to short-term safety management problems and considerations.
Heinrich et al. (1980)
Model - classification - method
Data (observations, event reports) are analysed using a method and a classification scheme, yielding conclusions. The method describes how the classification takes place; the model describes the internal structure of the classification scheme.
Hollnagel (1998)
Accident analysis & prevention
Analysis → probable causes → cost-benefit analysis → corrective action → prevention.
Accident model
Analysis works from effect to cause (probable causes); prevention works from cause to effect (corrective action, via cost-benefit analysis). The accident model links the two.
Causality assumption
Every cause has an effect; every event (effect) has a prior cause.
Cause → Effect: if we know what the cause is, we can look for its effect.
Effect → Cause: if we can see what the effect is, we can find out what its cause was.
Hollnagel (2002)
Why do we need an accident model?
Accident analysis: accidents are the complex result of multiple, interacting factors. To make sense of this, an accident model is required. (What should we look for?)
Accident prevention: an accident model is an abstraction that describes how accidents can occur and therefore also how they can be prevented. (What can we do about it?)
Simple, linear cause-effect model
Assumption: Accidents are the (natural) culmination
of a series of events or circumstances, which occur
in a specific and recognisable order.
Consequence:
Accidents are prevented by finding and eliminating
possible causes.
Safety is ensured by improving the organisation's capability to respond.
The occurrence of a preventable injury is the natural
culmination of a series of events or circumstances, which
invariably occur in a fixed and logical order.
One is dependent on another and one follows because of
another, thus constituting a sequence that may be compared
with a row of dominoes placed on and in such alignment in
relation to one another that the fall of the first domino
precipitates the fall of the entire row.
Domino model (Heinrich, 1930)
Dominoes: social environment & ancestry → fault of person → unsafe act / mechanical & physical hazards → accident → injury.
Domino theory
1. Ancestry and social environment
2. Fault of person
3. Unsafe act and/or unsafe mechanical or physical condition
4. Accident
5. Injury
Removal of the middle domino breaks the chain.
Heinrich et al. (1980)
Event and cause
[Diagram: a chain of events A → B → C leads from an unexpected event to the accident, with a cause attributed to each event.]
"Human error"
Hollnagel (1998)
A response is compared against a criterion: if the response is within limits, it is a correct response; if not, it is a "human error". Error is here an externalised category.
Sequential accident models
Over time, a normally functioning system suffers a human or technical failure; the resulting component failure culminates in the accident.
Accident analysis focuses on the component failure; accident prevention focuses on component reliability.
Human erroneous action
Hollnagel (1998)
Was the intention correct? If not, the erroneous action is a mistake. If the intention was correct but the execution was not, it is a slip; with correct intention and correct execution, the action is correct. Error is here an internalised category.
Consequences of a simple, linear model
Basic principle: causality (single or multiple causes).
Purpose of analysis: find specific causes and cause-effect links.
Typical reaction: eliminate causes and links; improve responses.
[Diagram: a normal development is diverted by an unexpected event (cause) through a chain of events to the accident.]
Complex, linear cause-effect model
Assumption: Accidents result from a combination of
active failures (unsafe acts) and latent conditions
(hazards).
Consequence:
Accidents are prevented by strengthening barriers
and defences.
Safety is ensured by keeping track of performance
indicators.
"Swiss cheese" model (Reason)
Accidents are seen as the result of interrelations between real-time unsafe acts by front-line operators and latent conditions (weakened defences). Some holes in the defences are due to active failures; other holes are due to latent conditions. A loss occurs when a hazard passes through aligned holes.
HAE model
Host - Agent - Environment (with weakened defences).
Event and factors
[Diagram: the unexpected event and the accident arise from combinations of multiple factors (A, B, C, E, H) rather than a single causal chain.]
Consequences of a complex, linear model
Basic principle: hidden dependencies.
Purpose of analysis: combinations of unsafe acts and latent conditions.
Typical reaction: strengthen barriers and defences; improve observation (of indicators).
[Diagram: latent conditions combine with an unexpected event; a barrier may stop the normal development from becoming an accident.]
Non-linear accident model
Assumption: Accidents result from unexpected
combinations (resonance) of normal performance
variability.
Consequence:
Accidents are prevented by monitoring and damping
variability.
Safety requires constant ability to anticipate future
events.
Normal Accident Theory (Perrow, 1984)
Accidents are a normal state of complex systems.
Two dimensions in the evaluation of a system: complexity and coupling.
Accident prevention: the failure of individual components is not the target; the aim is to understand the properties of the system.
Sharp end - blunt end
Sharp-end factors work here and now, at the local workplace (unsafe acts). Blunt-end factors are removed in space and time: management, company, regulator, government, morals and social norms.
Hollnagel (2002)
The spear
Government
Regulators
Company
Management
Operational staff
Work actions
Accident
Everybody's blunt end is someone else's sharp end.
Roberts (2001)
System output
The system aims to keep its output at a reference value and within an acceptable zone. System output fluctuates about the reference. An accident occurs when the output crosses the boundary of the acceptable zone.
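The control view above can be sketched as a boundary check. The reference value, tolerance, and output trace below are illustrative, not taken from the slides.

```python
def within_acceptable_zone(output, reference=100.0, tolerance=10.0):
    """True while the system output stays inside the acceptable zone."""
    return abs(output - reference) <= tolerance

# A fluctuating output trace; the spike at index 3 crosses the boundary.
trace = [101.2, 97.8, 104.5, 113.9, 99.1]
violations = [i for i, y in enumerate(trace) if not within_acceptable_zone(y)]
print(violations)  # [3]
```

In this framing, "accident" is not a special event type but simply the first step at which the check fails.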
Multiple factors and multiple chains
Accident
Chains of events are hindsight
Accident causation
Accidents are
caused by a
coincidence among
events, rather than a
sequence of
failures.
The events that
combine into the
accident can be
due to normal
performance
variability, as well
as proper failures.
Contributing domains include regulators, equipment, tasks, environment, monitoring, and people.
Hollnagel (2002)
Consequences of a non-linear model
Basic principle: dynamic dependency, functional resonance.
Purpose of analysis: close couplings and complex interactions.
Typical reaction: monitor & control performance variability; improve anticipation.
Systemic accident models
Over time, sharp-end and blunt-end factors, latent system conditions, and common conditions combine with system performance variability in a normally functioning system; the accident emerges from this variability.
Functional Resonance Accident Model
Contributors that can combine into an incident or accident: latent conditions (e.g. design with unanticipated consequences, limited maintenance, unclear indications, impaired or missing barriers, lax safety culture), technological glitches and failures (inadequate maintenance, design flaws and oversights), and human performance variability (local optimisation / ETTO, incapacity).
Prevention and protection
Prevention (control barriers): active or passive barrier functions that prevent the initiating event from occurring.
Protection (safety barriers): active barrier functions that deflect consequences.
Protection (boundaries): passive barrier functions that minimise consequences.
[Diagram: initiating event / failure mode (incorrect action) leading towards the accident.]
Hollnagel (2002)
Epidemiological accident models
Over time, a normally functioning system with latent system conditions and local conditions experiences a human erroneous action; a barrier failure lets it develop into an accident.
Accident analysis focuses on the barrier failure; accident prevention focuses on barrier reliability.
Iceberg model of accidents
Hollnagel (2002)
Accidents (most visible) sit atop incidents, near-misses, and unsafe acts (most frequent): the visibility of events increases towards the top of the iceberg, their frequency towards the bottom.
Accident models
Sequential accident model — search principle of accident analysis: specific causes and well-defined links; goal of accident analysis: eliminate or contain causes.
Epidemiological accident model — search principle: carriers, barriers, and latent conditions; goal: strengthen defences and barriers.
Systemic accident model — search principle: functional dependencies and common conditions; goal: monitor & control performance variability.
Hollnagel (2002)
Conclusions
Accident model determines analyses and responses
Root cause, shaping factors or coincidence
Event based or system based
Elimination, improvement or monitoring
The misleading simplicity of human error
Human performance is inherently variable - but not
unreliable
Variability reflects work conditions
Performance deviations have positive and negative
consequences: errors as an opportunity for learning
CSE is a systems approach for analysing, evaluating, and designing complex systems
How to achieve safety?
Safety is freedom from accidents or losses. (Leveson, 1995)
Absence of failures: stay inside the envelope of safe performance; risks are identified and controlled.
Performance variability management: imagination, identification, assessment, modification; monitoring: detection and recovery.
Hollnagel (2004)
Definition of risk
From the status quo, stochastic developments lead to probable accidents P_1, P_2, ..., P_n with unwanted outcomes S_1, S_2, ..., S_n:

R = Σ_{i=1}^{n} P_i · S_i
Risk (R) is a combination of the likelihood of probable accidents
(P) and the severity of the potential consequences (S).
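The definition above can be computed directly as an expected loss. The scenario probabilities and severities below are invented for illustration.

```python
def total_risk(outcomes):
    """R = sum over i of P_i * S_i (likelihood times severity)."""
    return sum(p * s for p, s in outcomes)

# Illustrative numbers only: (probability per year, severity as cost).
scenarios = [(0.01, 1_000_000), (0.1, 50_000), (0.5, 1_000)]
print(round(total_risk(scenarios)))  # 15500
```

Note that a rare severe scenario and a frequent mild one can contribute the same amount to R, which is exactly why the risk matrix later in these slides separates the two dimensions again.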
Risk of traffic accidents
Past risks (statistics, number of accidents on a road): facts, known with high certainty.
Present risks: estimated from traffic density, driving style, and vehicle technology.
Future risks: estimates from traffic models, given certain assumptions.
Hollnagel (2004)
Levels of performance evaluation
Level 1: accident studies (statistics) — long delays, dependent on accident model.
Level 2: incident studies — higher event rate; data collection may be costly.
Level 3: performance measurements — measurements of single events or cases.
Moving from level 1 to level 3 trades validity for sensitivity; at each level a model is required ("Model?").
Hollnagel (2004)
Risk reduction
Identification: identify context (workplace, organisation, target/purpose, demands, resources, scenarios, tasks, activities, personnel, disturbances) and identify risks.
Assessment: analyse risks (probability, potential, frequencies, failure modes, failure types) and evaluate risks (estimates, priorities, risk themes).
Remediation: treat risks (policies, defences, monitoring, procedures, communication).
Hollnagel (2004)
Technological risk identification
Identify hazards (HAZOP, HAZID) → assess risks: probability, consequence (fault tree, top-down; event tree, bottom-up; HRA; expert judgment; statistics) → specify controls (barriers, performance monitoring) → re-assess risks (ALARP, direct assessment; qualitative, quantitative) → tolerable?
Strong technical influence / bias: assumes that hazards can be clearly identified, and requires that the system structure is known.
Hollnagel (2004)
HAZOP - hazards and operability analysis
Objective: identify all hazards resulting from potential malfunctions in a process.
Analyse each step in the process using HAZOP guide words and determine how each deviation could happen. Can the condition be detected? Are the consequences hazardous? Can the consequences be prevented? Is prevention cost-effective?

Guide word — meaning:
No or None — the negation of the intention (e.g. no flow)
More — a quantitative increase (e.g. high pressure)
Less — a quantitative decrease (e.g. low pressure)
As well as — in addition to (e.g. impurity)
Part of — a qualitative decrease (e.g. only one or two components present)
Reverse — the opposite of the intention (e.g. backflow)
Other than — complete substitution (e.g. wrong material)
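The guide-word step can be sketched as mechanically pairing each guide word with a process parameter to enumerate candidate deviations; the parameter name is illustrative, and the guide words are those listed above.

```python
# The seven HAZOP guide words from the table above.
GUIDE_WORDS = ["No or None", "More", "Less", "As well as",
               "Part of", "Reverse", "Other than"]

def deviations(parameter):
    """Enumerate candidate deviations for one process parameter."""
    return [f"{word}: {parameter}" for word in GUIDE_WORDS]

for d in deviations("flow"):
    print(d)
```

Each generated deviation ("More: flow", "Reverse: flow", ...) is then examined by the team for cause, consequence, detection, and prevention, as the questions above describe.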
HAZOP - hazards and operability analysis
Type of deviation — typical problems:
More pressure — blockage, valve closed, high ambient temperature, etc.
Less temperature — heat loss, leak, imbalance of input and output, etc.

Deviation | Cause | Consequence | Existing controls | Possible action
More pressure | valve closed | overpressure | none | high-pressure alarm
Less temperature | leak | release to atmosphere | none | gas detector
Bottom-up risk analysis
Event tree (early 1970s): the analysis starts from a specific initiating event and tries systematically to assess the outcomes of possible combinations of subsequent successes and failures at each branch point (Event 1, Event 2, Event 3, Event 4 → Outcome: success or failure).
Hollnagel (2004)
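A minimal sketch of the branching logic: each branch point succeeds with some probability or fails with the complement, and outcome probabilities multiply along each path. The two barrier events and their success probabilities are invented for illustration.

```python
from itertools import product

# P(success) per branch point; values are illustrative, not from the slides.
barriers = {"Event 1": 0.9, "Event 2": 0.8}

def path_probabilities(barriers):
    """Probability of every success/failure path through the event tree."""
    probs = {}
    for path in product([True, False], repeat=len(barriers)):
        p = 1.0
        for ok, p_succ in zip(path, barriers.values()):
            p *= p_succ if ok else 1 - p_succ
        probs[path] = p
    return probs

probs = path_probabilities(barriers)
print(round(sum(probs.values()), 9))  # 1.0 -- the paths partition all outcomes
```

The all-failures path, `probs[(False, False)]`, corresponds to the accident sequence in which every subsequent barrier fails.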
Event tree analysis
IEE (2004)
Top-down risk analysis
Fault tree (orig. 1961): the analysis starts from the selected top event (a previously defined hazard) and tries systematically to find all the (logical) combinations of basic events that could lead to it.
Hollnagel (2004)
Fault tree components
AND (conjunction): if B and C are true, then A is true.
OR (disjunction): if B or C is true, then A is true.
Example: Flooding = AND(pumps do not work, water level continues to rise).
Example: Signal is missed = OR(operator is inattentive, signal/noise ratio is too low).
Hollnagel (2004)
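The AND/OR gates can be evaluated recursively over a small tree structure. The trees below mirror the flooding and missed-signal examples on this slide; the truth values assigned to the basic events are invented.

```python
def evaluate(node, facts):
    """Evaluate a fault tree: a node is a basic-event name or (gate, children)."""
    if isinstance(node, str):  # basic event: look up its truth value
        return facts[node]
    gate, children = node
    results = [evaluate(c, facts) for c in children]
    return all(results) if gate == "AND" else any(results)

flooding = ("AND", ["pumps do not work", "water level continues to rise"])
signal_missed = ("OR", ["operator inattentive", "signal/noise ratio too low"])

facts = {"pumps do not work": True, "water level continues to rise": True,
         "operator inattentive": False, "signal/noise ratio too low": True}
print(evaluate(flooding, facts))       # True
print(evaluate(signal_missed, facts))  # True
```

Because gates nest, the same function handles arbitrarily deep trees, which is what step 6 of the analysis procedure relies on.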
Steps in fault tree analysis
1. Identify the top event.
2. Identify first-level events.
3. Link the events to the top event by a logic gate.
4. Identify next-level events.
5. Link the events to the previous-level events by a logic gate.
6. Repeat steps 4 and 5 until all basic events are identified.
A basic event indicates the limit of analytical resolution.
Collision at intersection
IEE (2004)
Risk level
Unacceptable region: the risk must be eliminated or contained at any cost / will be eliminated or contained if not too costly.
Tolerable or As Low As Reasonably Practicable (ALARP) region: the risk should, or may, be eliminated, contained, or otherwise responded to; the risk is only undertaken if a benefit is required.
Broadly acceptable region: should be assessed when feasible; no need to demonstrate ALARP in detail.
Hollnagel (2004)
IATA risk matrix
Rows: likelihood or probability. Columns: severity / scope of damage.
Severity definitions: Insignificant (no or minor injury or negligible damage); Minor (minor injury or minor property damage); Moderate (serious but non-permanent injuries or significant property damage); Critical (permanent disability or occupational illness or major property damage); Catastrophic (may cause death or loss of property).

Likelihood | Insignificant | Minor | Moderate | Critical | Catastrophic
Often | Medium | High | Substantial | Substantial | Substantial
Occasionally | Medium | High | High | Substantial | Substantial
Possible | Small | Medium | High | High | Substantial
Unlikely | Small | Medium | Medium | High | High
Practically impossible | Small | Small | Small | Medium | Medium

Risk classes:
Small — safety is largely guaranteed.
Medium — safety is partially guaranteed; normal protective measures are required.
High — safety is not ensured; protective measures are urgently required.
Substantial — safety is not ensured; enhanced protective measures are urgently required.
Hollnagel (2004)
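A matrix like the one above reduces to a simple lookup. The cell values follow the reconstructed table; treat the exact layout as an assumption recovered from the slide.

```python
SEVERITIES = ["Insignificant", "Minor", "Moderate", "Critical", "Catastrophic"]

# Rows ordered from most to least likely; cells follow the table above.
MATRIX = {
    "Often":                  ["Medium", "High", "Substantial", "Substantial", "Substantial"],
    "Occasionally":           ["Medium", "High", "High", "Substantial", "Substantial"],
    "Possible":               ["Small", "Medium", "High", "High", "Substantial"],
    "Unlikely":               ["Small", "Medium", "Medium", "High", "High"],
    "Practically impossible": ["Small", "Small", "Small", "Medium", "Medium"],
}

def risk_class(likelihood, severity):
    """Look up the qualitative risk class for a likelihood/severity pair."""
    return MATRIX[likelihood][SEVERITIES.index(severity)]

print(risk_class("Possible", "Critical"))  # High
```

The point of the encoding is that the judgment lives in the table, agreed once by the organisation, while individual assessments become mechanical lookups.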
Socio-technical risk identification
System model: task analysis; functional model; goals-means model.
Failure modes: HAZOP list; MTO list; phenotypes.
Possible antecedents (causes): interface design; work organisation.
Likelihood, consequence, and possibilities for detection: accident statistics; experience; brainstorming.
All considered within the context (performance conditions).
Hollnagel (2004)
Hierarchical task analysis
Example: an ATM withdrawal decomposed into subtasks: begin; insert card; enter PIN code (enter four digits, push Enter); select type of transaction; enter amount; prepare transaction; complete transaction (remove card, remove money).
Hollnagel (2004)
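The task hierarchy above can be represented as nested tuples of (task, subtasks); the grouping of subtasks is an assumption recovered from the slide's diagram.

```python
# (task name, list of subtasks); leaves have an empty subtask list.
hta = ("Use ATM", [
    ("Begin", []),
    ("Insert card", []),
    ("Enter PIN code", [("Enter four digits", []), ("Push Enter", [])]),
    ("Select type of transaction", []),
    ("Enter amount", []),
    ("Prepare transaction", []),
    ("Complete transaction", [("Remove card", []), ("Remove money", [])]),
])

def leaf_tasks(node):
    """Collect the lowest-level (observable) actions of the hierarchy."""
    name, children = node
    if not children:
        return [name]
    return [t for c in children for t in leaf_tasks(c)]

print(leaf_tasks(hta))
```

The leaf tasks are exactly the level at which failure modes (the guide-word phenotypes of the next slides) are usually applied.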
Template for risk analysis
Columns: activity / function | failure mode / deviation | consequence | probability / likelihood | possibility of detection (yes/no; how; when) | mitigating actions (M, T, or O).
Notes: activities should be described on the same level of detail; the failure mode can be found using guide words, e.g. phenotypes; identification must be systematic; the consequence should be described as clearly as possible.
Hollnagel (2004)
Human and systemic failure modes
Human failure mode Systemic failure mode
Timing Action performed too
early or too late
Position reached too early or too late.
Equipment not working as required.
Duration Action performed too
briefly or for too long
Function or system state held too briefly or for
too long.
Distance Object/control moved too
short or too far
System or object transported too short or too far
Speed Action performed too
slowly or too fast
System moving too slowly or too fast
Equipment not working as required.
Direction
Action performed in the
wrong direction
System or object (mass) moving in the wrong
direction
Force / power
/ pressure
Action performed with
too little or too much
force
System exerting too little or too much force.
Equipment not working as required.
System or component having too little or too
much pressure or power.
Object Action on wrong object Function targeted at wrong object
Sequence Two or more actions
performed in wrong order
Two or more functions performed in the wrong
order.
Quantity /
volume
None System/object contains too little or too much or
is too light or too heavy.
Hollnagel (2004)
Performance conditions (context)
Availability (personnel, equipment)
Training and preparation (competence)
Communication quality
HMI and operational support
Availability of procedures and methods
Working conditions
Number of goals & conflict resolution
Available time (time pressure)
Circadian rhythm, stress
Team collaboration (commitment)
Organisation quality
Each condition is rated on a scale from very good to very bad.
Hollnagel (2004)
Barrier Systems and Functions
Erik Hollnagel, 2004
[Table of barrier systems and their functions not recovered; only the category "Incorporeal" survived extraction.]
FRAM functional unit (module)
Erik Hollnagel, 2006
Input: that which is used or transformed to produce the output; constitutes the link to previous functions.
Output: that which is produced by the function; constitutes links to subsequent functions.
Resource: that which is needed or consumed by the function to process input (e.g., matter, energy, hardware, software, manpower).
Control: that which supervises or adjusts a function; can be plans, procedures, guidelines, or other functions.
Precondition: system conditions that must be fulfilled before a function can be carried out.
Time: time available; this can be a constraint but can also be considered as a special kind of resource.
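The six aspects map naturally onto a small data structure. The class name and the example contents below are illustrative, not part of FRAM itself.

```python
from dataclasses import dataclass, field

@dataclass
class FramFunction:
    """One FRAM functional unit with its six aspects."""
    name: str
    input: list = field(default_factory=list)         # links from previous functions
    output: list = field(default_factory=list)        # links to subsequent functions
    resource: list = field(default_factory=list)      # what the function consumes
    control: list = field(default_factory=list)       # plans, procedures, guidelines
    precondition: list = field(default_factory=list)  # conditions that must hold first
    time: list = field(default_factory=list)          # time available (constraint/resource)

brake = FramFunction("Brake vehicle",
                     input=["obstacle detected"],
                     output=["speed reduced"],
                     resource=["braking system"],
                     control=["driving procedures"],
                     precondition=["foot on pedal"],
                     time=["time to collision"])
print(brake.output)  # ['speed reduced']
```

Linking one function's output to another's input, precondition, or control is what lets a FRAM model trace how variability propagates between functions.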
Exercise
Read the US Highway Accident Case
Apply the viewpoints of the three accident models that
were discussed
Which contributing factors can the model identify?
Which contributing factors do you miss in the model?
Is there enough information for investigation according to each
model?
Which model do you think the investigators had in mind?
Further Reading