
Instrumentation Calibration,

Design and Techniques

Instrumentation and instrument


What is instrumentation?
In general, instrumentation can be defined as the art and science of measurement and/or control.
It is achieved by using an instrument.

Instrumentation based on industrial application:
It is the application of instruments for the purpose of measuring, observing, transmitting, indicating, recording, monitoring, and controlling any industrial process variable.

What is an instrument?
An instrument is any device used directly or indirectly to accomplish an objective or task.
In instrumentation, an instrument is any sensing, measuring, transmitting, indicating, or controlling device associated with a process or system.
Ex. Measuring body temperature using a thermometer.

Instrument application categories and functional divisions
Factory automation instruments
Plant safety or safeguarding instruments
Product quality monitoring/control instruments
Environmental condition monitoring/control instruments
Process variable measurement and control instruments

Implementing instrumentation
How is instrumentation implemented?
1. Single or stand-alone system
2. Complex system

Instrumentation system
An instrumentation system is an arrangement of two or more instruments connected together to perform a unified task.
Each instrument operates independently according to its specific task; failure of one member instrument means failure of the entire instrument system.
This is the simplest form of instrumentation.

Automation
Automation is a system concept that utilizes an instrumentation system to perform a certain task or sequence of operations automatically, without human intervention.
Both the quantity of production and the quality and durability of produced goods are greatly improved.

AUTOMATED PROCESS
An automated process is a process or sequence of production activities done in an automatic manner.
TYPES OF AUTOMATED PROCESS
Highly mechanized process
Chemical and physical process

Maintenance

What is Maintenance?
All actions necessary for retaining an item in, or restoring it to, a serviceable condition, including servicing, repair, modification, overhaul, inspection and condition verification.
Keeps systems and equipment in working order.
Repairs the equipment after FAILURE.

Question?
Why do we need maintenance?
What are the costs of doing maintenance?
What are the costs of not doing maintenance?
What are the benefits of maintenance?
How can maintenance increase the profitability of a company?

Purpose of Maintenance
Maximize the performance of production equipment efficiently and regularly
Prevent breakdowns or failures
Minimize production losses from failures
Increase the reliability of the operating systems

Principal Objectives in Maintenance
To achieve product quality and customer satisfaction through adjusted and serviced equipment
Maximize useful life of equipment
Keep equipment safe and prevent safety hazards
Minimize frequency and severity of interruptions
Maximize production capacity through high utilization of facility

Maintenance Objectives
Must be consistent with the goals of
production (cost, quality, delivery,
safety)
Must be comprehensive and include
specific responsibilities

Maintenance Costs
Cost to replace or repair
Losses of output
Delayed shipment
Scrap and rework

Failure
Failure is the inability to produce work in an appropriate manner.
Equipment/machine failure on the production floor: worn-out bearings, pumps, pressure leaks, broken shafts, overheated machines, etc.
Equipment failure in the office: failure of power supply, air-conditioning system, computer network, photocopy machine
Vehicle failure: brakes, transmission, engine, cooling system

Types of Failure
Functional Failure
The inability to meet the specified performance standard
Potential Failure
A physical condition which indicates that the failure process has started
Hidden Failure
Failure that is not apparent until the function is attempted

Current Maintenance Strategies
Fix it when it fails, or run until failure
Time based (calendar time or running time)
Condition based

Types of Modern Maintenance
Maintenance may be classified into four categories
(some authors prefer three categories: scheduled and preventive maintenance are merged):
Corrective or Breakdown maintenance
Scheduled maintenance
Preventive maintenance
Predictive (Condition-based) maintenance

Corrective or Breakdown Maintenance
Corrective or breakdown maintenance implies that repairs are made after the equipment has failed and can no longer perform its normal function.
Quite justified in small factories where:
Downtimes are non-critical and repair costs are less than for other types of maintenance
Financial justification for scheduling is not felt

Disadvantages of Corrective Maintenance
Breakdowns generally occur at inappropriate times, leading to poor and hurried maintenance
Excessive delays in production & reduced output
Faster plant deterioration
Increased chance of accidents and less safety for both workers and machines
More spoilt materials
Direct loss of profit
Cannot be employed for equipment regulated by statutory provisions, e.g. cranes, lifts and hoists

Scheduled Maintenance
Scheduled maintenance is a stitch-in-time procedure and incorporates:
inspection
lubrication
repair and overhaul of equipment
If neglected, it can result in breakdown.
Generally followed for:
overhauling of machines
changing of heavy equipment oils
cleaning of water and other tanks, etc.

Preventive Maintenance (PM)
Principle: Prevention is better than cure
Procedure: Stitch-in-time
It locates weak spots of machinery and equipment and provides them with periodic/scheduled inspections and minor repairs to reduce the danger of unanticipated breakdowns.

Advantages of PM
Advantages:
Reduces breakdowns and thereby downtime
Less odd-time repair and reduced overtime for crews
Greater safety of workers
Lower maintenance and repair costs
Fewer stand-by equipment and spare parts
Better product quality and fewer reworks and scraps
Increases plant life
Increases chances to get production incentive bonus

Predictive (Condition-based) Maintenance
In predictive maintenance, machinery conditions are periodically monitored, and this enables the maintenance crews to take timely actions, such as machine adjustment, repair or overhaul.
It makes use of human senses and other sensitive instruments, such as audio gauges, vibration analyzers, amplitude meters, pressure, temperature and resistance strain gauges, etc.

Predictive Maintenance (Contd.)
Unusual sounds coming out of rotating equipment predict trouble
An excessively hot electric cable predicts trouble
A simple hand touch can point out many unusual equipment conditions and thus predict trouble

Effective Instrumentation
Maintenance Approach
Locating the real cause of a problem can
be the most difficult part of the
troubleshooting process. But taking a logical
approach helps ensure a successful result.

Factors that Could Influence the Effectiveness of an Instrumentation System Maintenance:
Familiarity with the process
Proper understanding of the problem
Proper evaluation of visible symptoms
Knowledge in the application of different measurement, process control and maintenance fundamentals
Knowledge in the proper use of hand tools and equipment
Use of a systematic maintenance approach

Effective Maintenance Approach:
1. Understand properly the extent of the problem based on given facts, data and symptoms.
2. Start troubleshooting by first using the Elimination by Deduction method. If the cause of the problem is clearly identified, perform corrective action at once to solve the problem.
3. Continue troubleshooting by applying the Elimination by Input/Output Test method.
4. Apply Root Cause Analysis (RCA).

Elimination by Deductive Approach
Troubleshooting by eliminating one component from the other components in a loop by deduction or logical thinking.


Guidelines in using Deductive Approach:
1. Analyze the extent of the problem based on given facts or symptoms.
2. Come up with a probability per element based on given facts and decide which element is most likely to cause the problem.
3. Rectify the problem if already possible.
4. Apply Root Cause Analysis (RCA).

Elimination by Input/Output Test or Cause & Effect Method
Troubleshooting by applying an input and monitoring the output per loop component based on the element's input/output relationship table.

Guidelines in using Input/Output Test:
1. Establish the details of each loop component of a given control loop.
2. Using the degree of probability based on the result of your Deductive Approach, perform the Input/Output Test.
3. Rectify problems encountered while doing the input/output test.
4. Apply Root Cause Analysis (RCA).

Significant use of Input/Output Test or Cause & Effect
The Input/Output Test, if properly administered, is a very effective tool in identifying equipment functional, potential and hidden failures.
Any failure identified during the test could trigger appropriate maintenance action/s.

Applying Root Cause Analysis (RCA)

What is a Root Cause Analysis?
It is a systematic approach to maintenance problem analysis. It emphasizes the main cause or root cause of the problem, not just a temporary solution.
This concept can be well implemented by considering the following questions:
Is the problem clearly identified and understood based on given symptoms?
Does the corrective action done really correct the problem?

Other factors that need to be considered in troubleshooting an instrumentation system problem:
Loop configuration / system integrity
Instrument type, installation, calibration & physical condition
Environmental conditions

In general, the following simple guide questions will help an instrument technician perform effective maintenance:
What is the problem?
What do we think caused the problem?
What evidence do we have about the causes?
What solution(s) do you have in mind?
How will the solution(s) eliminate the cause of the problem?

Correcting Instrument Output Response:
What do you think will be your course of action if, after doing an input/output test, the actual measured values are significantly different from the desired values?
Do Adjustment

Why is Instrument Calibration Necessary?
The successful operation of any automated industrial process depends on the accuracy and performance of each instrument in the measurement and control loop.
Instrument calibration helps to ensure that a process operates efficiently and safely within plant specifications and produces a product of optimum quality.

Example
An instrument technician is conducting an input/output test of an I/P converter shown in the figure. Input is 4-20 mA from TRCA and output is 3-15 psi.
The resulting As Found I/O Table is shown below:

Findings
Based on the As Found: I/O Test Table below, the I/P shows an error of 0.2 psi at every test point.
In order to eliminate the error, the instrument sensitivity was adjusted.
After adjustment, another input/output test was conducted, and the result is shown in the I/O Table on the right. This time the error in psi per test point is 0.
The process of adjusting out the error is what is called CALIBRATION.
The table containing the data before calibration is called the As Found: I/O Test Table; the table containing the data after calibration is called the As Left: I/O Test Table.
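A minimal sketch of this I/O test, assuming an ideal linear 4-20 mA to 3-15 psi relationship; the +0.2 psi as-found offset and the five test points follow the example above.

```python
# Sketch of the I/P converter I/O test above. Assumes an ideal linear
# 4-20 mA -> 3-15 psi relationship; the +0.2 psi "As Found" offset is
# the error quoted in the example.

def ideal_output_psi(ma):
    """Ideal I/P output (psi) for a 4-20 mA input mapped to 3-15 psi."""
    return 3.0 + (ma - 4.0) / 16.0 * 12.0

for pct in (0, 25, 50, 75, 100):
    ma = 4.0 + pct / 100.0 * 16.0
    ideal = ideal_output_psi(ma)
    as_found = ideal + 0.2  # reading before the sensitivity adjustment
    print(f"{pct:3d}%  {ma:5.1f} mA  ideal {ideal:5.1f} psi  "
          f"as-found {as_found:5.1f} psi  error {as_found - ideal:+.1f} psi")
```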

Calibration
Calibration is an insurance policy that verifies the accuracy of test instruments.
Calibration is the act of checking and verifying the accuracy of a measurement instrument by comparison with a reference standard. Properly calibrated instruments perform to the manufacturer's published specifications.
Regularly calibrating measurement instruments ensures the accuracy of measurements that are relied upon during design and manufacturing test.

Why Calibrate?
While most instruments that are evaluated
and calibrated normally pass the test, instrument
performance can change over time. There are
several factors that can contribute to this change
including drift, normal wear and tear, lack of
proper maintenance, user error, and improper
use and abuse of equipment. Regular calibration
ensures that test and measurement instruments
are operating at a known performance level.

Why is Calibration Required?
By doing a proper calibration procedure and through proper interpretation of the calibration results, instrument errors can be identified and corrected.
Calibration is required by law.

Instrument Calibration Cycles
Calibration is not a one-time occurrence. Instruments must be calibrated periodically to ensure specified performance.
Each instrument requires a specific interval between calibrations. This interval is determined by the instrument's owner and is often based on the manufacturer's recommendations.
The original equipment manufacturer's (OEM) calibration intervals are typically based on conservative performance for the average user. For best results, the instrument owner should use several additional factors in determining the optimal calibration interval, including: the required accuracy for the application vs. the instrument's specified accuracy, and the business impact of using an out-of-tolerance (OOT) instrument.

Calibration Quality System
The International Organization for Standardization (ISO) is comprised of representatives from various national organizations and has 162 member countries. ISO develops standards for industry and trade. Partnering with ISO-registered calibration providers ensures that the provider follows standard practices.
ISO 9000 is a family of standards that provide a framework for managing an organization's processes and a set of standardized requirements for a quality management system.
ISO/IEC 17025 is a standard used by testing and calibration laboratories. Laboratories implement the requirements of ISO/IEC 17025 to provide assurance of their technical competence.

Calibration according to Legal Metrology
Calibration, according to R.A. 9236 of 2003 (Republic Act No. 9236, the National Metrology Act of 2003), is a set of operations establishing, under specified conditions, the relationship between values indicated by a measuring instrument or measuring system, or values represented by a material measure, and its corresponding known values of measure.

Instrument Calibration Block Diagram
Input Measurement Standard (IMS) → Unit Under Test (UUT) → Output Measurement Standard (OMS)
b − a = c
Note: IMS & OMS are commonly known as CALIBRATORS.

When is Calibration Required?
1. Over a period of time
2. Change in process parameters
3. Change in environmental conditions
4. Change in instrument mounting position
5. Before installation of a new instrument
6. After any instrument repair
7. When process verification is required
8. Governmental regulation (i.e. RA 9236)
9. Other reasons deemed necessary

How often is an instrument calibrated?
By practice, the frequency of calibration depends upon the classification of the instrument:
Critical: An instrument which, if not conforming to specification, could potentially compromise product or process quality and safety. (Typically twice yearly)
Non-critical: An instrument whose function is not critical to product or process quality, but whose function is more of operational significance. (Typically yearly)
Reference Only: An instrument whose function is not critical to product quality and not of operational significance; it is typically calibrated only when required.

Cost and Risk of Not Calibrating
Calibration can be easily ignored, or cycles extended beyond their recommended time frame, which may increase operational or regulatory compliance risk. Neglecting routine calibration schedules can lead to quality and regulatory issues, increased downtime, and increased expenses.
If a company is unable to meet its customer or regulatory requirements, it introduces significant risk of business interruption, loss of operating privileges, or compromised public safety.
When compared with the significant business risks associated with non-compliance, calibration is a small investment.

Types of Calibration
Workshop, Laboratory or Bench Calibration
Calibration utilizing ideal conditions such as room temperature, humidity, room pressure, vibration, etc.
A bench calibration is performed in the shop on the bench with power supplied from an external source. It may be performed upon receipt of new instruments prior to installation. This provides assurance that the instrument received is undamaged. It also allows configuration and calibration in a favourable environment.
Advantages:
1. Instrument is removed, cleaned and inspected.
2. Calibration is done under ideal conditions.
3. Fixed calibration set-up and utilities.
Disadvantages:
1. Problems may be encountered during pull-out and installation.
2. Zero adjustment usually required after installation to compensate for field ambient operating conditions.

Typical Bench Calibration Set-up

Standards or Calibrators

Field Calibration
Calibration utilizing actual field conditions such as field ambient temperature, barometric pressure, vibration, utilities, position, etc.
Field calibrations are performed in-situ or in-place, as installed. The instrument being calibrated is not removed from the installed location. Field calibration may be performed after installation to ensure proper connections and configuration. Periodic calibrations are more likely to be performed in the field.
Advantages:
1. May save calibration time.
2. May identify and allow troubleshooting of installation problems.
3. Done in actual field ambient operating conditions.
Disadvantages:
1. Loop elements' performance/condition may not be individually checked.
Field Calibration

Characteristics of Calibration
1. Compliance to the required accuracy ratio of standards
2. Traceability of calibration standards
3. Uncertainty of measurements
4. Compliance to ISO 17025

1. Accuracy Ratio of Standards
This term describes the relationship between the accuracy of the calibration standard and the accuracy of the UUT.
A good rule of thumb is to ensure an accuracy ratio of 4:1. This means that the accuracy of the calibration standard is four times better than the accuracy of the UUT.
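A tiny sketch of this rule of thumb; the 0.5% and 0.1% accuracy figures are assumed values for illustration.

```python
# Check the 4:1 accuracy-ratio rule of thumb. The UUT and standard
# accuracies below are assumed values, not from the slides.
uut_accuracy = 0.5    # UUT accuracy, % of span (assumed)
std_accuracy = 0.1    # calibration standard accuracy, % of span (assumed)

ratio = uut_accuracy / std_accuracy
print(f"accuracy ratio = {ratio:.0f}:1 ->",
      "OK" if ratio >= 4 else "standard not accurate enough")
```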

Calibration Standards

The Importance of Calibration Standards (Calibrators):
To determine whether a measurement is accurate and precise, it must be compared to a known STANDARD. A measurement standard is one that has been established as a model.
Instruments that are used as measurement standards (calibrators) are calibrated according to internationally accepted standards (primary standards). These certified standard instruments are then used to calibrate test equipment (secondary standards). Test equipment is, in turn, used to calibrate process instruments.

Instrument Calibration Block Diagram
Input Measurement Standard (IMS) → Unit Under Test (UUT) → Output Measurement Standard (OMS)
a = 50%, b = 50.1%, c = 0.1%
b − a = c
Standards or Calibrators
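The arithmetic of this block diagram, as a minimal sketch using the values shown:

```python
# Block-diagram error calculation: the IMS applies input a, the OMS reads
# output b, and the UUT error is c = b - a (values from the diagram above).
a = 50.0     # % of span applied by the IMS
b = 50.1     # % of span read by the OMS
c = b - a    # UUT error
print(f"c = {c:.1f}% of span")
```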

What is a Calibration Standard?
A Calibration Standard is an internationally accepted and traceable instrument or material used as a reference in calibrating instruments.

Classification of Calibration Standards
1. Primary Reference Standard or Material
- Directly traceable to international standards.
- A standard which has the highest metrological quality in a specified field.
2. Secondary or Certified Reference Standard or Material
- Traceable only to the manufacturer's reference standards.
- One whose value is fixed by comparison with a primary standard.

2. Traceability of Standards
All calibrations should be performed traceable to a nationally or internationally recognized standard.
Traceability is defined by ANSI/NCSL Z540-1-1994 as "the property of a result of a measurement whereby it can be related to appropriate standards through an unbroken chain of comparisons."

Calibration Traceability
Hierarchy of Calibration Standards & Traceability (ISA)
Traceability: the property of a result of a measurement relating to appropriate standards, generally national or international, through an unbroken chain of comparisons.

In the Philippines, calibration is legally supported under R.A. 9236, the National Metrology Act of 2003.
It is an act establishing a National Measurement Infrastructure System (NMIS) for standards and measurements, and for other purposes.
National Metrology Laboratory of the Philippines (NMLPHIL)

3. Uncertainty of Measurements
Uncertainty analysis is required for calibration labs conforming to ISO 17025 requirements.
Uncertainty analysis is performed to evaluate and identify factors associated with the calibration equipment and process instrument that affect the calibration accuracy.

Measurement Uncertainty
Why Measure?
The objective of a measurement is to determine the value of the measurand, i.e., the value of the particular quantity to be measured.

Measurement Errors, Effects and Corrections:
In general, a measurement has imperfections that give rise to an error in the measurement result.
Commonly, an error is classified into three types, namely: random error, systematic error and spurious error.

Random Error (Accuracy Error)
Random errors are unavoidable errors which are introduced into the measurement process at random or by chance. The effects of such variations, known as random effects, give rise to variations in repeated observations of the measurand.
Although it is not possible to compensate for the random error of a measurement result, it can usually be reduced by increasing the number of observations.

Systematic Error (Bias or Precision Error)
Systematic error, like random error, can't be eliminated, but it too can often be reduced. If the influence of a systematic error, known as the systematic effect, can be quantified, and if it is significant in size relative to the required accuracy of the measurement, a correction factor or bias can be applied. After the correction, the expected value of the error arising from the systematic effect is zero.

Spurious Error
Spurious errors are errors, such as human mistakes or instrument malfunction, which invalidate a measurement. Such errors can't be treated with statistical analysis, and the measurement should be discarded.

Measurement and Measurand
In general, the result of a measurement is only an approximation or estimate of the value of the measurand and thus is complete only when accompanied by a statement of the uncertainty of that estimate.

What is UNCERTAINTY?
Based on the definition under 2.2 of the ISO Guide to the Expression of Uncertainty in Measurement (GUM), the word "uncertainty" means DOUBT, and thus in its broadest sense "uncertainty of measurement" means doubt about the validity of the result of a measurement.

Measurement Uncertainty Concept
Measured Value = 99.89 mBar ± uncertainty of measurement

Common Sources of Uncertainty
Environmental conditions
Personal bias in reading values
Finite instrument resolution
Calibration of standards
Rounding of measurements
Methods & procedures of measurement
Stability of power supply, etc.

Ishikawa (fishbone) diagram: sources of measurement error, grouped by branch: Environment (temperature, pressure, vibration & others); Equipment (bias, resolution, stability & others); Man (accuracy, error, direct or inferred reading & others); Measurement method & procedure; and Measurement system & utilities (connection & wire resistance, stability of utilities & others).

Uncertainty can be expressed in terms of the following:
1. Standard uncertainty: ui
2. Combined uncertainty: uc
3. Expanded uncertainty: U = uc(k)

Methods of Evaluating Standard Uncertainty (ui) Components:
1. Type A Evaluation (of uncertainty) is the method of evaluating uncertainty by the statistical analysis of a series of observations. In this case, the standard uncertainty is the experimental standard deviation of the mean that follows from an averaging procedure.
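A minimal sketch of a Type A evaluation; the repeated readings below are made-up illustrative values.

```python
# Type A evaluation: the standard uncertainty is the experimental standard
# deviation of the mean of repeated observations. Readings are illustrative.
import statistics

readings = [99.87, 99.91, 99.88, 99.90, 99.89]   # repeated observations
s = statistics.stdev(readings)                   # experimental std deviation
u_a = s / len(readings) ** 0.5                   # std deviation of the mean
print(f"mean = {statistics.mean(readings):.3f}, u_A = {u_a:.4f}")
```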

2. Type B Evaluation (of uncertainty) is the method of evaluating uncertainty by means other than the statistical analysis of a series of observations. In this case the evaluation of the standard uncertainty is based on some pool of information, such as:
previous measurement data;
experience with or general knowledge of the behavior and properties of relevant materials and instruments;
manufacturer's specifications;
data provided in calibration and other certificates;
other relevant information.

Sample: Calculating Standard Uncertainty (ui) of Flowmeter and Proving Tank in terms of Standard Deviation (STDV):
Ten validation runs were conducted on a flowmeter calibration system using a 3000 batch size. Results were tabulated and the STDV of the flowmeter and the proving tank were calculated. Results are shown in the table on the left.
ui of flowmeter is 3.13
ui of proving tank is 1.05

The Combined Standard Uncertainty (uc):
The combined standard uncertainty of a measurement result, suggested symbol uc, is taken to represent the estimated standard deviation of the result. It is obtained by combining the individual standard uncertainties ui, whether arising from a Type A or a Type B evaluation, using the usual method for combining standard deviations (root-sum-square).

Calculating the Combined Standard Uncertainty (uc) of Flowmeter and Proving Tank:
ui of flowmeter is 3.13 and ui of proving tank is 1.05
uc = √[(ui of flowmeter)² + (ui of proving tank)²] = √[(3.13)² + (1.05)²]
Combined uncertainty (uc) = 3.3

Expanded Uncertainty (U):
A quantity defining an interval about the result of a measurement that may be expected to encompass a large fraction of the distribution of values that could reasonably be attributed to the measurand. The expanded uncertainty, denoted by U, is obtained by multiplying the combined standard uncertainty uc by a coverage factor k. Thus: U = uc(k)
TRUE measurement = observed measurement ± U

Calculating the Expanded Uncertainty (U) of Flowmeter and Proving Tank:
Expanded uncertainty (U) = uc(k) = 3.3 × 2 = ±6.6
Where:
uc = combined standard uncertainty
k = coverage factor
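The full uncertainty chain of this example, as a short sketch using the slide's ui values:

```python
# Combine the standard uncertainties in quadrature (root-sum-square), then
# expand with coverage factor k = 2, as in the flowmeter example above.
import math

u_flowmeter = 3.13
u_proving_tank = 1.05

u_c = math.hypot(u_flowmeter, u_proving_tank)   # sqrt(3.13**2 + 1.05**2)
k = 2                                           # ~95% coverage (normal dist.)
U = k * u_c
print(f"u_c = {u_c:.2f}, U = +/-{U:.1f}")       # u_c = 3.30, U = +/-6.6
```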

Statement of Uncertainty of Measurement in Certificates
In calibration certificates, the complete result of the measurement, consisting of the estimate y of the measurand and the associated expanded uncertainty U, shall be given in the form (y ± U).
To this, an explanatory note must be added, which in the general case should read as follows:
"The reported expanded uncertainty of measurement is stated as the standard uncertainty of measurement multiplied by the coverage factor k = 2, which for a normal distribution corresponds to a coverage probability of approximately 95% level of confidence."

Level of Confidence:
Most expanded uncertainty calculations are based on a coverage factor (k = 2) and a confidence level of 95% (1 chance in 20 that the value of the measurand lies outside the interval).

4. Compliance to ISO 17025
Calibration technical requirements per PNS ISO/IEC 17025:2000:
Human factors (personnel)
Environmental conditions
Test & calibration methods and method validation
Test/calibration equipment
Traceability
Handling of test and calibration items

Overall benefits of having regular instrument calibration:
Enhances production efficiency
Enhances product quality assurance
Increases plant safety
Reduces production cost
Improves profit margin

Instrument Range and Span
Range and Span
To fully understand the concept of calibration, it is essential to understand the range and span of an instrument.
Range is the set of values (LRV & URV) over which a measurement can be made without changing the instrument's sensitivity.
Span is the distance (or difference) between the upper range value (URV) and the lower range value (LRV).

Difference between Instrument Measuring Range and Calibration Range
Measuring Range refers to the instrument's measuring capability.
Calibration Range refers to the range over which the instrument is calibrated to produce a scaled output.

Instrument Accuracy,
Precision and Gain

Accuracy in Calibration
Instruments are calibrated to make them accurate within the manufacturer's specifications.
Accurate calibration is therefore an essential factor in instrument performance.

Ways of Determining Instrument Accuracy:
1. Manufacturer's specifications
2. By calculation (calculated)

1. As a percent of output span
Example:
A pressure transmitter has an output span of 50 psi. It measures an actual tank pressure of 25 psig but reads 26 psi. In this case, the transmitter is accurate within 1 psi, or 2% of span.
Accuracy (%) = (MV − TV)/Span × 100

Example of accuracy
calculation (% of span):

2. As a percent of measured value (MV)
Example:
A pressure transmitter has an output span of 50 psi. It measures an actual tank pressure of 25 psig but reads 26 psi. In this case, the transmitter accuracy is 4% of measured value.
Accuracy (%) = (MV − TV)/MV × 100

Example of accuracy calculation (% of measured value):
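A quick sketch of both accuracy calculations, using the example's numbers:

```python
# Accuracy as % of span and as % of measured value, using the example:
# 50 psi span, actual pressure 25 psig, reading 26 psi.
span = 50.0
actual = 25.0
reading = 26.0
error = reading - actual                 # 1 psi

pct_of_span = error / span * 100         # 2% of span
pct_of_value = error / actual * 100      # 4%, matching the example
# Note: dividing by the 25 psig actual value reproduces the quoted 4%;
# dividing by the 26 psi reading would give about 3.85%.
print(f"{pct_of_span:.1f}% of span, {pct_of_value:.1f}% of measured value")
```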

What is an Accurate Instrument?
An accurate instrument is an instrument whose output always falls within the manufacturer's specifications every time an input is applied.

Sample: Accurate Instrument
Manufacturer's accuracy statement is +/- 0.5% of span
Control Chart (21-PT-020)
FR Cement Corporation

Meaning and Importance of Instrument Precision
Precision is another important factor in instrument performance. A precise instrument will produce the same output every time it receives an identical input. A transmitter that produces the same output signal from a constant input is precise.

Sample: Precise Instrument
Manufacturer's accuracy statement is +/- 0.5% of span
Control Chart (21-PT-020)
FR Cement Corporation


Meaning and Importance of Gain in Calibration
The level of accuracy to which an instrument can be calibrated is partially dependent on another factor known as gain.
Gain refers to the amount of output change for each increment of input change. It is a key factor in determining how accurately an instrument can be calibrated.

Gain Calculation
Transmitter Gain is calculated by dividing
the output span by the input span.

Sample Gain Calculations
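A minimal sketch of the gain formula, reusing the earlier I/P converter's ranges (4-20 mA in, 3-15 psi out) as the example:

```python
# Transmitter gain = output span / input span, using the I/P converter
# ranges from the earlier example (4-20 mA input, 3-15 psi output).
input_span = 20.0 - 4.0      # mA
output_span = 15.0 - 3.0     # psi
gain = output_span / input_span
print(f"gain = {gain:.2f} psi/mA")   # 0.75 psi per mA
```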

Calibration Procedures

Calibration Procedures
Calibration procedure refers to the way or manner in which calibration is carried out in relation to the instrument's input/output relationship. It can be either a 5-point or a 10-point input/output relationship.

5-point Calibration Procedure
A calibration procedure which utilizes 5 input and 5 output test points. This is the most widely used calibration procedure. Test points commonly used are 0, 25, 50, 75 and 100% of the input and output span.

5-point Input and Output Relationship Table
Example: Direct-acting electronic temperature controller with a calibration range of 100-500 °C
10-point Calibration Procedure
A calibration procedure which utilizes 10 input and 10 output test points. Test points commonly used are 0, 25, 50, 75, 100, 75, 50, 25 and 0% of the input and output span, taken upscale and then downscale.
This procedure is used to determine the instrument error known as hysteresis.

10-point Input and Output Relationship Table
Example: Direct-acting electronic temperature controller with a calibration range of 100-500 °C
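A short sketch of how a 10-point run exposes hysteresis; the upscale/downscale readings below are made-up illustrative values.

```python
# Hysteresis check from a 10-point run: compare the output at the same
# test point going upscale vs. downscale. Readings are illustrative, % span.
upscale = {25: 25.1, 50: 50.2, 75: 75.2}     # increasing input
downscale = {25: 25.4, 50: 50.6, 75: 75.5}   # decreasing input

for pct in (25, 50, 75):
    hyst = downscale[pct] - upscale[pct]
    print(f"{pct:3d}%: hysteresis = {hyst:+.1f}% of span")
```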

Basic Steps in Calibrating an Instrument:
1. Identify the type of UUT to be calibrated and record all necessary information required for the calibration job.
2. Identify and prepare the appropriate IMS, OMS and utilities required for the calibration job.
3. Set up the calibration system.
4. Calibrate the UUT with reference to the UUT's maintenance manual or the user's established Work Instruction.
5. Evaluate/correct instrument error.
6. Do housekeeping.
7. Finalize the Calibration Certificate.

Typical Bench Calibration Set-up

Types of Calibration Error

Instrument error can occur due to a variety of


factors: drift, environment, electrical supply, addition of
components to the output loop, process changes, etc.
Since a calibration is performed by comparing or
applying a known signal to the instrument under test,
errors are detected by performing a calibration. An
error is the algebraic difference between the indication
and the actual value of the measured variable.

Types of Calibration Error
1. Linear errors
Zero shift
Span error
Zero and span error
2. Non-linear error
3. Hysteresis

Zero Shift (Linear)
A zero shift refers to a situation in which an instrument's signal output is consistently higher or lower than would be expected throughout the input span.
It can also be described as a situation where the instrument's output is consistent with the inputs provided but starts at a point too high or too low on the output scale.

Span Error (Linear)


Span error is another type of
instrument error. The readings for an
instrument with span error either do
not represent 100% of the output
span or the output span does not
match the input span.

Zero Shift and Span Error Combination (Linear)
Both zero shift and span error can occur in the same instrument. In such a case, an input/output graph produces a signal line that agrees with neither the origin nor the angle of the ideal line.

Non-linear Error
Non-linearity is a condition in which an instrument outputs signals that do not match inputs between the upper and lower limits of the span. Severe non-linearity is not a simple adjustment problem and may require instrument repair. If the magnitude of the non-linear error is unacceptable and it cannot be adjusted out, the instrument must be replaced.

Non-linear Error

Hysteresis
Hysteresis is another common instrument problem. In this case, the instrument produces different signals depending upon the direction of the input procedure: moving up or down through the input range produces different output signals.

Hysteresis

Hysteresis

The Calibration Certificate

A Calibration Certificate/Report must at least contain the following elements, per 5.10.2 of PNS ISO/IEC 17025:2000:
A title
Name and address of the laboratory where the calibration was carried out
Certificate identification
Name and address of the client
Identification of the method used
Unit identification

Date unit received & calibrated
Traceability, uncertainty & environmental conditions
Test results & units of measurement
Findings & observations
Statement that the results relate only to the item calibrated
Signatures authorizing the certificate

In addition, calibration certificates shall include the following, where necessary for the interpretation of calibration results:
1. When an instrument for calibration has been adjusted or repaired, the calibration results before and after adjustment shall be reported.
2. A calibration certificate or label shall not contain any recommendation on the calibration interval except where agreed with the client.

3. When calibration work has been contracted out, the laboratory performing the work shall issue the calibration certificate to the contracting laboratory.
4. The format of the calibration certificate shall be designed to accommodate the data and to minimize the possibility of misunderstanding.
5. When it is necessary to issue a completely new calibration certificate, it shall be uniquely identified and shall contain a reference to the original that it replaces.
6. Calibration certificates are part of the controlled documents in a company's quality management system.
Important Notes Regarding Certificates:
1. Hard copies of calibration certificates should also include the page number and total number of pages.
2. It is highly recommended to include a statement specifying that the test report or calibration certificate shall not be reproduced, except in full, without written approval from the issuing laboratory.
3. Calibration certificates must be controlled and considered legal documents.

Introduction to
Validation/Verification Concept

What is Validation/Verification?
Validation/verification is the process of simulating an instrument with a known input and comparing the result to a calibration tolerance. If the difference is within the specified tolerance, no action shall be taken. However, if it is not, calibration must be performed.
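A minimal sketch of this decision; the applied/observed values and the tolerance are illustrative.

```python
# Validation/verification decision: compare the deviation of the observed
# output from the applied input against the calibration tolerance.
def needs_calibration(applied, observed, tolerance):
    """True if |observed - applied| exceeds the calibration tolerance."""
    return abs(observed - applied) > tolerance

a, b = 50.0, 50.3   # applied input and observed output, % of span
tol = 0.2           # calibration tolerance, % of span (assumed)
print("Calibrate" if needs_calibration(a, b, tol)
      else "No action (within tolerance)")
```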

Characteristics of a Validation/Verification
1. Compliance to the required accuracy ratio of standards
2. Traceability of calibration standards
3. Uncertainty of measurements
4. Compliance to ISO 17025
5. Acceptable tolerance

5. Acceptable Tolerance
Every validation/verification should be performed to a specified tolerance. The terms tolerance and accuracy are often used incorrectly. The definitions for each are as follows:
Accuracy: The ratio of the error to the full-scale output, expressed in % of span, or the ratio of the error to the output, expressed in % of reading.
Tolerance: Permissible deviation from a specified value. May be expressed in measurement units, % of span, or % of reading.

Validation Tolerance
Validation tolerance should not be based on the manufacturer's accuracy statement only. It should also include the following:
Requirements of the process
Capability of available test equipment
Consistency with similar instruments at your facility

Validation/Verification Block Diagram
Apply input a, read output b; error c = b − a.
Is the difference c more than the calibration tolerance? Yes → Calibrate; No → Quit.

Validation Acceptance Criteria:
Error is within the given calibration tolerance.
The types and magnitude of error indicated by the instrument are acceptable for the application.
Measurement uncertainty is known/defined.
Other relevant criteria specified by the process application.

5-point Validation/Verification Curve
[Chart: measured values plotted against the acceptable calibration tolerance of +/- 0.2% of span]

Quality Management System and Instrument Calibration
Quality Management System
A Quality Management System is a part of the organization's management system that focuses on the achievement of results in relation to the product quality objectives.

QMS and Instrument Calibration
Since an accurately calibrated instrument contributes not only to the safety of any automated industrial process but also greatly to efficiency and product quality, most plant Quality Management Systems (QMS) treat calibration as one of their main concerns.
The ISO 9000 QMS standard, specifically item 7.6, requires regular testing and calibration of any process instrument whose malfunction could affect product quality.

Instrument calibration standards shall all be in accordance with PNS ISO/IEC 17025:2000, "General requirements for the competence of testing and calibration laboratories."
