
Data management for asset management decision making

Dr. M. Hodkiewicz, August 2006


University of Western Australia
Learning Outcomes
• After this session you will be able to:
– Define what makes ‘quality’ data
– Define data requirements for specific
reliability and maintenance metrics
– Justify and manage improvements in the quality of
data using the ‘8 Step Plan’
– Develop a business plan to support dedication of
appropriate resources to measuring and improving
data quality

Hodkiewicz, UWA – “AM data quality” 2


Session Content
• Introduction and background
• Approaching the data quality issue
• The 8 Step Plan
• Roles in the data quality process
• Changing data fields
• Case Studies 1 and 2.



Text Resources (1 of 2)
[1] Hodkiewicz, Kelly, Sikorska & Gouws (2006) A framework to assess
data quality for reliability variables, WCEAM, Australia.
[2] Wang, R. Y. & Strong, D. M. (1996) Beyond accuracy: What data
quality means to consumers. Journal of Management Information
Systems, 12(4).
[3] Lee, Y. W. & Strong, D. M. (2004) Knowing-why about data
processes and data quality. Journal of Management Information
Systems, 20(3).
[4] Hodkiewicz, Coetzee, Dwight & Sharp (2006) The importance of
knowledge management to the asset management process,
Business Briefings: Oil and Gas Processing Review 2006, 43-45.
[5] Hodkiewicz & Forrester (1999) Optimizing a preventative
maintenance program – a case study, Meeting of MESA (IEAust) –
The Next Generation of Maintenance, Perth WA.



Text Resources (2 of 2)
[6] Sandtorv, H. A., Hokstad, P. & Thompson, D. W. (1996) Practical
experiences with a data collection project: the OREDA project.
Reliability Engineering and System Safety, 51.
[7] Redman (2005) Availability modelling of a wastewater treatment
plant, UWA Honours thesis.
[8] Jager, P. & Bertsche, B. (2004) A new approach to gathering
failure behaviour information about mechanical components based
on expert knowledge, Annual Reliability and Maintainability
Symposium, 2004.
[9] Prusak, L. (1997) Knowledge in Organisations, Butterworth-
Heinemann, Boston, MA.



Background
• Engineering Asset Managers rely on data to assist
with and support decisions.
• There has been little published work on how to
assess the quality of maintenance data and whether it
is fit for purpose to support decisions.
• The general data problem is compounded by
ambiguity over who is responsible for assuring data
quality and the appropriate use of the data.



Simple definition

• Quality data is ….

• Therefore, you need to know its context!



Attributes of Data quality
• The quality of the data is dependent on various
attributes including [3]
– timeliness,
– accuracy,
– relevance,
– completeness, and
– accessibility



Current status
• There is currently no widely accepted methodology
for
– (i) assessing the quality of maintenance-related
data,
– (ii) confirming that data is ‘fit-for-purpose’ or
– (iii) identifying and managing changes to the data
collection, storage and use system.



What do we know?
1. There are three key elements required for
data quality [2]

– knowing-what,

– knowing-how, and

– knowing-why.



What do we know?
2. There are three roles associated with the data
process [3]:
– data collectors,
– custodians, and
– consumers.

3. Data quality is highly dependent on data
collectors knowing-why [3].



‘Who’ is ‘what’ in the maintenance context?
• The data collector is predominantly a maintenance or
production technician.
• The data custodians are the IT support staff
responsible for managing various enterprise
databases and systems.
• The main data consumers are
– (i) reliability engineers who use the data during the
determination of long-term maintenance
strategies, and
– (ii) maintenance engineers and maintenance
supervisors who use the data when addressing
day-to-day maintenance issues.
Approaching the data quality
issue
How do you know you have a maintenance data
quality problem?
• Some clues ….
– Locating data for simple exercises, such as calculating
MTTF, requires access to numerous data sources.
– Data from different sources is contradictory.
– Electronic data is not readily available in a useable form.
– Results of calculations do not agree with ‘experience’.
– Data often has to be retrieved verbally from
experienced personnel.
– Data is either not available or not at sufficient depth for
analysis.



Other clues?



Players in the data process



The Basic Steps

This will give you a feel for the problem, but not a
defensible case for spending large amounts of time
and money fixing it…



Addressing the problem of data quality

• Identify current performance/status
• Identify desired goals
• List and group actions to close the gap between
desired goals and current status
• Develop a business case
• Identify stakeholders and agree on an action plan to
close the gap



The 8 Step Plan
Why develop the 8 Step
Data Quality Improvement
Plan?

To promote a business-centric view of data quality.

From [1]



Aims of the ‘8 step plan’
• Ensure businesses are collecting the right data.
• Build defensible cases for improving bad quality
data.
• Identify where improvements will add the most value.
• Objectively quantify improvements.
• Highlight the business risks of bad quality data.
• Identify where systems and processes are hindering
rather than helping ensure quality data.



Step 1: Define the
business need

• Look for words like:


– Measure
– Quantify
– Improve
– Reduce
– Control



Step 2: Identify pertinent
metrics

• Common RAM metrics:


– MTBF
– MTTF
– MTTR
– Cost to repair
• How often do these
need to be measured?
• What level of accuracy
is required?



Step 3: Determine data
requirements

• Identify metric variables


• Locate the data source
• Identify the data fields
• How is this measured?



What is meant by ‘metric variable’?

• Metric variables are all data or variables that are
required to calculate the metric.



Example of data required for MTTF?
• Step 3: Identify data required
• MTTF = Total number of bearing operating hours /
Total number of bearing failures
• In a perfect system each bearing failure event would
have:
• (a) a single unique designator classifying its failure
completely (e.g. pump bearing failure), and
• (b) the age of the bearing at the time of failure.

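The ‘perfect system’ record above can be sketched as a small data structure; the field names and codes here are illustrative, not taken from any particular CMMS:

```python
from dataclasses import dataclass

@dataclass
class BearingFailureEvent:
    designator: str   # unique code classifying the failure (e.g. pump bearing failure)
    age_hours: float  # age of the bearing at the time of failure

def mttf(events):
    """MTTF = total operating hours / total number of failures.

    With no suspensions, the total operating hours is simply the sum
    of the bearing ages at failure.
    """
    return sum(e.age_hours for e in events) / len(events)

events = [BearingFailureEvent("PUMP-BRG-FAIL", 400.0),
          BearingFailureEvent("PUMP-BRG-FAIL", 600.0)]
print(mttf(events))  # 500.0
```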


Example of data required for MTTF?



Who is involved?
• Who is involved in the data collection process in this
example?
• (Roles will vary in different plants)
• The operator/ production technician records the date the
pump is removed from and returned to service, the
functional location of the pump, and the functional failure
code. This code identifies ‘why’ the unit was removed
from service.
• The maintainer identifies the maintainable item that is
the cause of the equipment being removed from service,
in this case, the bearing. Further information is then
required to describe the failure; example codes are
provided in ISO 14224.
Vital missing data …. Suspensions
• A suspension describes the situation in which the
bearing is replaced, while still functioning, in the
course of a pump overhaul. The recording of
suspensions is vital for the accurate estimation of
MTTF.



Example …. Suspensions

• For example, if one pump operating for 1000 hours
experiences 2 bearing failures and has a further 2
functioning bearings replaced during replacement of
mechanical seals, the true MTTF (accounting for
suspensions) is 1000/2 = 500 hours.
• If the suspensions are incorrectly counted as failures,
the calculated MTTF is 1000/4 = 250 hours. Omitting
the suspension/failure distinction can therefore result
in a dramatic under-estimation of the bearing MTTF.

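The arithmetic above can be checked in code; a minimal sketch of the estimator, assuming each bearing removal is simply tagged as a failure or a suspension:

```python
def mttf_hours(total_hours, events):
    """Estimate MTTF as total operating hours / number of true failures.

    events: a list of "failure" or "suspension" tags, one per bearing removal.
    Suspensions contribute operating hours but are not counted as failures.
    """
    failures = sum(1 for e in events if e == "failure")
    return total_hours / failures

events = ["failure", "failure", "suspension", "suspension"]
print(mttf_hours(1000, events))  # 500.0 (suspensions handled correctly)
print(1000 / len(events))        # 250.0 (suspensions miscounted as failures)
```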


MTTF example discussion
• The previous slides illustrate that the calculation can:
– Require access to information collected by
multiple people
– Require access to information stored in different
places
– Require information that is not always available
(e.g. the failure/suspension distinction)
• It can also require data from other departments, e.g.
cost data from accounting.



Sources of data
• Life cycle cost data draws on many sources:
engineering design data, CMMS data, management
planning data, reliability data, accounting data,
logistic support data, customer/market data,
production data and construction data.
• Adapted from Blanchard, ‘Systems Engineering and
Analysis’, 2006.



Step 4: Analyse data quality



Understanding the data attributes
• Is the data accurate? (Intrinsic data quality)
• Is the data appropriate to the business need?
(Contextual data quality)
• Is the data represented appropriately?
(Representational data quality)
• Is the data accessible but secure?
(Accessibility data quality)



Step 4a: Examine the data first
• Investigation of the data may reveal issues such as:
– Inaccurate failure date.
– Nonsense free-text data (for example “Broken” as
a fault description).
– Incorrectly selected options.
– Prevalence of “other” or “none of the above”
selections in restricted entry fields.
– Missing data.
– Referential integrity problems.
– Inappropriate selection of functional location level.

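Some of these checks can be automated in a first-pass audit of the work order records; a minimal sketch, where the garbage terms and field names are illustrative assumptions rather than codes from a real CMMS:

```python
# Hypothetical work-order records; field names are illustrative.
GARBAGE_TEXT = {"broken", "fault", "n/a", "?", ""}

def audit(work_orders):
    """Return a list of (record index, issue) pairs for suspect records."""
    issues = []
    for i, wo in enumerate(work_orders):
        desc = wo.get("fault_description", "").strip().lower()
        if desc in GARBAGE_TEXT:
            issues.append((i, "nonsense or empty free-text description"))
        if wo.get("failure_code") == "other":
            issues.append((i, "'other' selected in restricted entry field"))
        if wo.get("failure_date") is None:
            issues.append((i, "missing failure date"))
    return issues

orders = [
    {"fault_description": "Broken", "failure_code": "B21", "failure_date": "2006-03-01"},
    {"fault_description": "bearing seized", "failure_code": "other", "failure_date": None},
]
print(audit(orders))  # flags the "Broken" text, the "other" code and the missing date
```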


Step 4(b): List data quality
categories (from [2])
Intrinsic (Does the data have any inherent quality in its own right?):
– Believability: real and credible
– Accuracy: accurate, correct, reliable; errors can be easily identified
– Objectivity: unbiased and objective
– Reputation: reputation of the data source and data
Contextual (Is the data appropriate to the business need?):
– Relevancy: applicable to the task at hand, usable
– Timeliness: age of the data is appropriate to the task at hand
– Completeness: breadth, depth and scope of information contained in the data
– Appropriate amount of data: quantity and volume of available data is appropriate
– Value-added: data gives a competitive edge, adds value to the operation



Quality categories (from [2])

Representational (Is the data represented appropriately?):
– Interpretability: data are in an appropriate language and units, and data definitions are clear
– Ease of understanding: easily understood, clear, readable
– Representational consistency: consistently formatted and represented; data are compatible with the previous format
– Concise representation: concise, well organized, appropriate format of the data
Accessibility (Is the data accessible but secure?):
– Accessibility: accessible, retrievable, speed of access
– Access security: access to data can be restricted, data is secure



Develop Measurement philosophy
• Measurement of the quality of your data against
these attributes requires the development of a series
of questions. (Scoring system)
• The structure and nature of the questions should be
dependent on the data variable under investigation
and its context.
• Wherever possible, questions should be designed to
elicit a quantitative response.
• Answers can then be viewed individually and/or as
weighted sums, resulting in an overall score of data
quality for that particular field.

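The weighted-sum scoring described above might be sketched as follows; the questions, answers and weights here are illustrative:

```python
# Each entry is (question, answer_is_yes, weight); weights are illustrative.
answers = [
    ("The data is correct",            True,  3),
    ("I know who collected the data",  False, 1),
    ("Data is sufficiently current",   True,  2),
    ("Data is complete",               False, 2),
]

def quality_score(answers):
    """Weighted fraction of 'yes' answers: an overall score between 0 and 1."""
    total = sum(w for _, _, w in answers)
    achieved = sum(w for _, yes, w in answers if yes)
    return achieved / total

print(round(quality_score(answers), 3))  # 0.625
```

Answers can still be inspected individually; the weighted sum simply condenses them into one score for the field.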


Example questions for MTTF
example (1 of 2)
Questions for the Functional location (FL) field in the MTTF calculation, each marked Y/N:
Intrinsic – Accuracy:
– The data is correct
– The data is accurate
Intrinsic – Reputation:
– I know who collected the data
– I know who entered the data
– Is the selection of FL affected by who collects the data?
Contextual – Relevancy:
– Data in this field is relevant for this analysis
– Data in this field is appropriate for this analysis
– Data is sufficiently current for our work
Contextual – Completeness:
– Data is complete
– Are FL data definitions clear to all users (collector, custodian and user)?



Example questions for MTTF
example (2 of 2)
Questions for the Functional location (FL) field in the MTTF calculation, each marked Y/N:
Representational – Concise representation:
– Is the information easily retrievable?
Representational – Ease of understanding:
– Are drop-down menus well structured?
– If there have been changes in the FL structure as the CMMS has been upgraded, is there a translation between the old and new systems?
Representational – Representational consistency:
– Is each FL entered in exactly the same format?
– Is the maximum number in any one drop-down menu less than 8?
Accessibility – Accessibility:
– Is the information easily accessible?
– Is the information easily obtainable?
Accessibility – Access security:
– Is there a system to prevent unauthorized changes to the system?



Step 5: Develop strategies

• Look for aggregation.


• Identify biggest
vulnerabilities.
• Discuss problems with
data collectors.
• May have to draw a line
in the sand with respect
to old data.



MTTF example continued
• We identify issues with missing data and with fields being
incorrectly filled in. To address these issues we decide on the
following measures:
– Auditing of data at the time of collection.
– Establishing a process for data vetting.
– Careful selection of classifications; avoid presenting codes.
– Reduce manual data entry; source date data from other,
automated sources (e.g. purchasing system).
– Data entry close to the job in time and space.
– Training.
– Introduce a workflow/process to identify and manage
exceptions such as selecting “other”.



Step 6: Implement
changes

• Need short term wins.


• Prioritize changes
based on:
– Complexity of
change required
– Aggregated effect of
error
• Trying to rectify the
“sins of the past” is to
be undertaken with
caution.



Assessing proposed changes

Selection of proposed changes (Efficacy; Cost/Difficulty):
– Audit: alter IT systems to check for known garbage values (High; Medium)
– Vetting: two week audit drive on work order data (Medium; Low)
– Identify data owners and ‘goal’ them on data quality (High; Very High)

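One way to make the prioritisation explicit is to score each proposed change by efficacy per unit of cost/difficulty; a sketch, with the ordinal ratings mapped to illustrative numbers:

```python
# Map the ordinal ratings to illustrative numeric scores.
EFFICACY = {"Low": 1, "Medium": 2, "High": 3}
COST = {"Low": 1, "Medium": 2, "High": 3, "Very High": 4}

changes = [
    ("Audit: check for known garbage values", "High", "Medium"),
    ("Vetting: two week audit drive",         "Medium", "Low"),
    ("Identify data owners and 'goal' them",  "High", "Very High"),
]

# Rank by efficacy per unit of cost/difficulty (higher is better).
ranked = sorted(changes, key=lambda c: EFFICACY[c[1]] / COST[c[2]], reverse=True)
for name, eff, cost in ranked:
    print(f"{name}: {EFFICACY[eff] / COST[cost]:.2f}")
```

Under these (arbitrary) scales the low-cost vetting drive comes out first, consistent with the "need short term wins" advice in Step 6.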


Step 7: Assess changes

• Repeat analysis
process with same
questions to assess
success of change.
• This will identify where
to focus future effort.



Step 8: Establish review
periods

• Business needs change


over time.
• Reviews are required to
determine if data needs
are current.
• Initial focus may lose
impetus over time.



Summary
• Advantages of this ‘context-oriented’ approach include:
• (1) a structured prescriptive methodology that can be
applied to many situations,
• (2) a direct link between data quality and the business
process,
• (3) identification of all data inputs and their associated
data collector, data storage format and process.

• We recommend that more thought and attention be
given to the process of collecting, storing and using
data for maintenance and reliability decisions.



Discussion
Roles in the data quality process



Common observations
• Inappropriate data is collected and stored
• The role of the data collector is not valued.
• The data collector is not engaged in the decision
process
• No ‘value’ is placed on ‘good’ data.



Research findings
• Work by Lee and Strong [3] has shown that data
collectors with “why-knowledge” about the data
production process contribute to producing better
quality data.
• Overall, improving the knowledge of data collectors is
more critical than that of data custodians.
• Knowing-why is gained from experience and
understanding of the objectives and cause-effect
relationships underlying activities (knowing-what) and
procedures (knowing-how) involved in work
processes in organisations [3].



Implications
• It is important for the collectors of data required for
reliability analysis (the maintainers and operators) to
understand
• why they need to collect the data, and
• how it will be used.
• The solution is to identify the variables on which key
decisions are made, identify the data required for
these variables, and communicate both the process
and the results to the data collectors.



Potential benefits
• Unless the data collectors appreciate the process
and know how, why and by whom the data is being
used, they will have no motivation to collect quality
data.
• The more direct and timely the feedback to the data
collector about the data’s quality and its direct effect
on the business need, the more influential this
feedback process becomes.



Recommendations

• Data collectors need to know ‘why’ and ‘how’.


• In other words, they should know what their
data is used for, and be given feedback on
outcomes following the analysis based on the
collected data.
• Short-term feedback on errors is
fundamental.



Changing the data fields and their links



Recommendations
• Careful thought needs to go into:
– Design and use of functional location structures.
– Design of links to related fields internal and external to the
main database.
– Identifying all major users of specific data fields.
– Identifying the data attributes and location for each data field
required for calculating reliability variables.
• There are a great many users of the same maintenance and
failure data, and considerable care should be taken before
any changes are made to the data collection, data structure
or data storage processes.



Knowledge management
Knowledge, data and information
• Knowledge about a specific equipment’s function,
condition, capability and relationship to the system, is
vitally important to asset managers.
• Knowledge is distinctly different from data or
information.
• Data is captured directly from monitoring a variable.
• Information is data that is organised and placed in
context, thus given meaning, with the objective of
facilitating meaningful managerial and/or technical
action in optimising the benefit obtained from the
system.



What is knowledge?

• Knowledge is information combined with
experience, context, interpretation and
reflection that permits making predictions,
causal associations or prescriptive decisions
about what to do [9].



The KM Process
• The knowledge management process involves the
following basic steps:
– identification,
– capture,
– validation and storage,
– retrieval and use for decision-making.



Some challenges with KM
• Data collection processes that are not targeted at
collecting the data required for decision making.
• Data validation
• Data storage and access.
• Data granularity
• Lack of context.
• Loss of experienced personnel

• For more information, refer to [4]



Why is KM important?
• Reduce the number of ‘subjective’ decisions relative
to ‘knowledge-based’ decisions based on ‘quality’
data.

Concept map developed by Pascual & Hodkiewicz



Case Study 1
Current status
• PM work orders had insufficient information and poor
instructions
• Numerous PM work orders were generated but only a
fraction of the tasks completed
• The PM work orders were not being used to generate
scheduled maintenance tasks
• Unscheduled breakdowns of equipment that had
recently been inspected indicated that the PM was
not adequately targeting all aspects of the
equipment’s operation



Desired status

• Produce PM work order instructions to include
– Technician and Supervisor knowledge
– Historical failure information
– Design modifications
– Operating expectations
– Predictive Maintenance practices
• Structure the PM work order forms to capture
– Technician knowledge
– Equipment status/ condition
– Planner required information



The new PM
• Itemize all parts of the inspection
• Clearly defined tasks
• Minimize writing and paperwork
• Yes/No for task completion
• Work request tracking
• Acceptable ranges for measured values
• Safety procedures/Lock out instructions
• Information for Planner



Example: PM Inspection conveyor
Title: Inspection 003 conveyor drive sheaves and belts – quarterly
Equipment: Drive belts/sheaves – 003 conveyor
PM Code: GF1040IMM
Work type: PM inspection
Equip. status: Downday
Frequency: Monthly
Craft: Mechanic
No. of craft: 1
Time: 1 x 2 hours
Planner information: Check that the Electrical planner does not schedule the
003 belt scale PM GF1040IEM at the same time, as this requires the belt to
be running.
Job description (each step marked Job Complete Yes/No and Work Order submitted Yes/No):
– Remove the guard. Visually inspect the sheaves and belts for wear and cracks.
– Check sheaves and belts with a sheave gauge (5V).
– Check alignment of the sheaves with a straight edge.
– Check belt tension.
Other information: Belt type 4xR5V-18000, Powerband; warehouse stock number 314200024.
Comments:



Operations PM work order
Checks recorded separately for PUMP A and PUMP B, against the listed targets:
– Tails line local pressure gauge reading (target 80-125 psi)
– Gland water line pressure gauge reading
– Gland line overpressure, i.e. the difference between the gland water line
pressure and the tails line pressure (target MIN 10-15 psi)
– Is the gland water flow to the pump adequate? (Yes/No)
– Is the pump packing leaking mud? (Yes/No)
– Are there any leaks in the piping between the tails box and the main tails
line trench? (Yes/No)
– Are the belts noisy or do they smell of burned rubber? (Yes/No)
– Cleaned mud from motor fins and fan end? (Yes/No)
– Are there any fault alarms on the variable speed drives? (Yes/No)
– Are the inside and outside tails sump pumps in good condition? Do NOT
run them dry. (Yes/No for the inside and outside tails sumps)
Comments: ___________________________



Predictive Maintenance Work
Order
Title: Weekly PM mill vibration and temperature readings
Equipment: Grinding Mills 1 & 2
Equip. status: Running
Frequency: Weekly
Craft: Mechanic
Job description (each task marked Job Complete Yes/No and Work Order submitted Yes/No):
1. Collect vibration route. Collect the Mills vibration route and download the
results to the predictive maintenance computer. Generate an exceptions
report for the Planner and Engineer. Record comments and work requests
submitted.
2. Examine gear teeth with strobe light. Use the strobe light to freeze the
teeth. Is the lube pattern even across the pinion face?
3. Inspect pinion bearing seals for leaks. If the bearings are covered in mud,
write a work request to clean them on the next downday.
4. Collect temperature readings on the pinion. Use the infrared temperature
gun to record temperatures on the mill pinion bearings and gear. Use the
mill temperature report sheet to record the readings and attach it to the
completed Work Order.
Comments:



Implementation
• New PM code
• Tracking implementation
• Training for new operator sheets
• Feedback
• Set up review



Work orders generated from PMs
Number of Work Orders Generated from PM inspections
[Bar chart comparing the number of work orders generated from PM
inspections prior to the PM review, 6 months after the review, and 12 months
after the review; scale 0-400.]


PM’s as a % of Scheduled Work Orders
Number of PM Work Orders as % of Scheduled Work Orders
[Bar chart comparing PM work orders as a percentage of scheduled work
orders prior to the PM review, 6 months after the review, and 12 months after
the review; scale 0-20%.]



Other Results

• Improved planning through an increased planning horizon to:
– schedule repairs
– order parts
• Less time spent looking for information
• Improved equipment history
• Increased Operations involvement/awareness
• Team approach – improved inter-group communication
• Significant increase in number of work orders generated
from PMs



Discussion – Case study 1
• The project focused on:
– identifying ‘data’ requirements
– making it easy for the data collector to record the
necessary data
– involving the data collector in the decision process



Case Study 2
• Availability simulation model for a refinery waste
water treatment plant.
• The project required data collection to establish the
reliability of each type of equipment in order to
construct a reliability block diagram (RBD) and
perform simulation.
• Data collection took >2/3 of project man-hours.



Data collection for waste water modeling – Case Study (from [7])
• Extraction: work orders are extracted from the source CMMS.
• Primary filtering: non-failure work orders are rejected.
• Secondary filtering: manual selection rejects the remaining non-failure
records, leaving a failure data set.
• Cleansing: data cleansing rejects outliers, leaving a usable data set of
196 records.
• Grouping: the data are grouped into 46 failure data sets.
• Distribution fitting: a 2-parameter Weibull distribution is fitted to each
set, producing the distributed failure data sets.
Review results (from [7])
• The following information was NOT available in the CMMS
– Failure code
– Failed item
– Failure type
– Cause of failure
• Job descriptions and comments are vague
• Difficult to determine what failed
• Difficult to extract data from CMMS to reliability analysis
software package.
• Component failures were coded to the wrong
equipment/functional location.
• Multiple work orders exist for the same job.



Review of data from 6 sets

Failure data set description | η before | η after | β before | β after
Concrete Structure | 3930 | 4015 | 20 | 1.00
Pressure vessel | 669 | 609 | 0.74 | 1.80
Pump type 10 failure | 274 | 387 | 0.70 | 1.00
Pump type 13 failure | 3109 | 3245 | 40.39 | 8.00
Equipment type 3 failure | 869 | 1330 | 0.63 | 1.00
Equipment type 5 failure | 275 | 423 | 0.59 | 1.06

Case Study (from [7])

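As a sanity check on parameters like these, the MTTF implied by a 2-parameter Weibull fit can be computed as η·Γ(1 + 1/β); a small sketch using values from the table above:

```python
import math

def weibull_mttf(eta, beta):
    """MTTF of a 2-parameter Weibull distribution: eta * Gamma(1 + 1/beta)."""
    return eta * math.gamma(1 + 1 / beta)

# For beta = 1 (purely random failures) the Weibull reduces to an
# exponential, so MTTF equals eta, e.g. the cleaned concrete-structure set:
print(weibull_mttf(4015, 1.00))  # 4015.0
# For beta < 1 (as in several sets before review) MTTF exceeds eta:
print(round(weibull_mttf(669, 0.74), 1))
```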


Recommendations
Data | Current Source | Ultimate Source
Date of failure | CMMS | CMMS
Failure Code | CMMS | CMMS
Failed Item | N/A | CMMS
Failure Type | N/A | CMMS
Root Cause | N/A | CMMS
Repair Time | Personnel | CMMS and labour records
Maintenance | CMMS and Personnel | CMMS and Personnel


Data record recommendations (1of 3)
• DATA SHEET
• Identification
• ID: J1
• Equipment: Single stage centrifugal pump
• Quantity: 3
• Description: WWTP inlet pump
• Function: Transfers water from inlet sump to the API separator
• Operation Mode: One: constant
• Two: on/off when required
• Three: standby
• Dependencies: 2 of 3 required
• Maintenance
• Current maintenance policy: run to failure
• Preventative maintenance: weekly oil and grease check
• Inspections: weekly vibration test
Case Study (from [7])



Data record recommendations (2 of 3)
Wear characteristic:
– Wear in/wear out/random: Random
– Beta: 1.079
– Eta/MTTF (days): 1937
– Gamma: 0
– Data source: Maximo
Corrective maintenance:
– Mean time to repair (days): 30 (Normal distribution, standard deviation 5;
data source: Planner)
– Condition after repair: good as new (data source: Author)
Inspection:
– Inspection interval (days): 7
– Mean task time (days): 0.01 (Normal distribution, standard deviation 0.005;
data source: Planner)
– Status during inspection: Online
– Condition after task: Intermediate
– Age reduction factor (%): 80
– Fault detection probability (%): 70 (data source: Author)
Case Study (from [7])



Discussion – Case study 2
• Data collection was onerous.
• However once complete the equipment data forms
are a knowledge source for a variety of projects.
• The availability model has been used to justify capital
equipment purchase to relieve a bottle neck in the
plant.
• The project highlighted to management the problems
with the data collection and storage process.



Workshop review
Attributes of Data quality
• The quality of the data is dependent on various
attributes including [3]
– timeliness,
– accuracy,
– relevance,
– completeness, and
– accessibility



Importance of data collectors
• It is important for the collectors of data required for
reliability analysis (the maintainers and operators) to
understand
• why they need to collect the data, and
• how it will be used.
• The solution is to identify the variables on which key
decisions are made, identify the data required for
these variables, and communicate both the process
and the results to the data collectors.



Remember
• (1) Data quality can affect the quality of decisions that
significantly affect business outcomes.
• (2) It is vital to identify all required data inputs and their
associated data collector, data storage format and
process.
• (3) Data quality is highly dependent on data collectors
knowing-why, so involve them in the process.

• Management attention needs to be given to the process
of collecting, storing and using data for maintenance
and reliability decisions.



AND …..

• Quality data is ….



Group discussion on data quality

Comments please
End
Thank you
