
Part 1

General Instrumentation
(1 Hour)
At the end of this chapter, you should be able to:
explain units and quantities in instrumentation and measurement.
discuss and calculate various types of error in measurement.
explain the meaning of some terms in the instrumentation field.
Instrumentation is defined as the art and science of measurement and control of process variables within a production or manufacturing area.
Simply put, instrumentation is the science of precise measurement and control of physical processes.
Instrument - a device or mechanism used to determine the present value of a quantity.
Quantity          Symbol   Unit       Abbreviation
Length            l        meter      m
Mass              m        kilogram   kg
Time              t        second     s
Temperature       T        kelvin     K
Electric current  I        ampere     A
Quantity      Symbol   Unit      Abbreviation
emf/voltage   V        volt      V
charge        Q        coulomb   C
resistance    R        ohm       Ω
capacitance   C        farad     F
inductance    L        henry     H
Standards are defined in 4 categories:
international standards.
primary standards.
secondary standards.
working standards.
International standards are defined by international agreements. These standards are maintained at the International Bureau of Weights and Measures in Paris, France. They are periodically evaluated and checked by absolute measurements in terms of the fundamental units of physics. They represent certain units of measurement to the closest possible accuracy attainable by the science and technology of measurement, and are used for comparison with primary standards.
Primary standards, also known as national standards, are maintained at institutions in various countries around the world, such as the National Bureau of Standards in Washington, D.C., and SIRIM in Malaysia. The primary standards are not available for use outside the national laboratories. Their principal function is to calibrate and verify the secondary standards.
Secondary standards are used as the basic reference standards by measurement and calibration laboratories in industry. Each industrial laboratory is completely responsible for its own secondary standards. Each laboratory sends its secondary standards to the national standards (primary standards) laboratory for calibration. After calibration, the secondary standards are returned to the industrial users with certification and are checked periodically.
Working standards are the principal tools of a measurement laboratory and the lowest level of standards. They are used to check and calibrate the instruments used in the laboratory or to make comparison measurements in industrial applications. For example, the standard resistors, capacitors, and inductors usually found in an electronics laboratory are classified as working standards.
Error - the deviation of a reading or set of readings from the expected value of the measured variable.
Measurement is the process of comparing an unknown quantity with an accepted standard quantity.
There are various types of error in measurement:
I. absolute error
II. gross error
III. systematic error
IV. random error
V. limiting error
Absolute error.
Absolute error may be defined as the difference between the expected value of the variable and the measured value of the variable, or

$$e = Y_n - X_n$$

where
e = absolute error
$Y_n$ = expected value
$X_n$ = measured value

To express the error as a percentage:

$$\%\,\text{error} = \frac{e}{Y_n} \times 100$$

To derive the relative accuracy, A:

$$A = 1 - \left|\frac{Y_n - X_n}{Y_n}\right|$$

and the percent accuracy, a (needed in Example 1 below), follows as

$$a = 100\% - \%\,\text{error} = A \times 100\%$$
Example 1.
The expected value of the voltage across a resistor is 5.0 V. However, measurement yields a value of 4.9 V.
Calculate:
a) absolute error
b) % error
c) relative accuracy
d) % accuracy
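A minimal Python sketch (not part of the original notes; the variable names are illustrative) that works Example 1 through the formulas above:

```python
# Worked solution for Example 1 using the absolute-error formulas.
Yn = 5.0  # expected value (V)
Xn = 4.9  # measured value (V)

e = Yn - Xn                   # a) absolute error: e = Yn - Xn
pct_error = e / Yn * 100      # b) % error = (e / Yn) * 100
A = 1 - abs((Yn - Xn) / Yn)   # c) relative accuracy
pct_accuracy = A * 100        # d) % accuracy = 100% - % error

print(f"a) absolute error    = {e:.2f} V")            # 0.10 V
print(f"b) % error           = {pct_error:.1f} %")    # 2.0 %
print(f"c) relative accuracy = {A:.3f}")              # 0.980
print(f"d) % accuracy        = {pct_accuracy:.1f} %") # 98.0 %
```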
Gross errors are generally the fault of the person using the instruments.
Errors may also occur due to incorrect adjustment of instruments and computational mistakes.
These errors cannot be treated mathematically.
The complete elimination of gross errors is not possible, but one can minimize them. One of the basic gross errors that occurs frequently is the improper use of an instrument. The error can be minimized by taking proper care in reading and recording the measurement parameter.
Systematic errors are due to problems with instruments, environmental effects, or observational errors.
Instrument errors: instrument errors may be due to friction in the bearings of the meter movement, incorrect spring tension, improper calibration, or faulty instruments. Instrument error can be reduced by proper maintenance, use, and handling of instruments.
Environmental errors: environmental conditions in which instruments are used may cause errors. Subjecting instruments to harsh environments such as high temperature, pressure, or humidity, or strong electrostatic or electromagnetic fields, may have detrimental effects, thereby causing error.
Observational errors: observational errors are those errors introduced by the observer. The two most common observational errors are probably the parallax error introduced in reading a meter scale and the error of estimation when obtaining a reading from a meter scale.
Random errors are generally the accumulation of a large number of small effects. They may be of real concern only in measurements requiring a high degree of accuracy. Such errors can only be analyzed statistically.
Manufacturers of instruments state that an instrument is accurate within a certain percentage of a full-scale reading. For example, a voltmeter may be accurate within 2% at full-scale deflection. This specification is called the limiting error.
However, with readings less than full scale, the limiting error will increase. Therefore, it is important to obtain measurements as close as possible to full scale.
Example 2
A 300-V voltmeter is specified to be accurate within 2% at full scale. Calculate the limiting error when the instrument is used to measure a 120-V source.

The magnitude of the limiting error is
2/100 × 300 V = 6 V
Therefore, the limiting error at 120 V is
6 V/120 V × 100 = 5%
(reading < full scale, so the limiting error increased)
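The same calculation as a short Python sketch (illustrative, not part of the original notes):

```python
# Limiting error of a 300-V voltmeter, accurate within 2% at full scale,
# when used to read a 120-V source.
full_scale = 300.0    # full-scale range (V)
accuracy_pct = 2.0    # specified accuracy at full scale (%)
reading = 120.0       # measured value (V)

limit_v = accuracy_pct / 100 * full_scale  # magnitude of limiting error: 6.0 V
limit_pct = limit_v / reading * 100        # limiting error at the reading: 5.0 %

print(f"limiting error = {limit_v:.1f} V = {limit_pct:.1f} % of reading")
```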
Precision of measurement
A quantitative, or numerical, indication of the closeness with which a repeated set of measurements of the same variable agrees with the average of the set of measurements:

$$\text{Precision} = 1 - \left|\frac{X_n - \bar{X}_n}{\bar{X}_n}\right|$$

where
$X_n$ = the value of the nth measurement
$\bar{X}_n$ = the average of the set of n measurements
Example 3
Table 1.1 gives the set of 10 measurements that were recorded in the laboratory. Calculate the precision of the 6th measurement.
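Table 1.1 itself is not reproduced in these notes, so the Python sketch below uses hypothetical readings purely to illustrate the method:

```python
# Precision of the 6th measurement in a set of 10 readings.
# The readings are hypothetical stand-ins for Table 1.1.
readings = [98, 101, 102, 97, 101, 100, 103, 98, 106, 99]

avg = sum(readings) / len(readings)  # average of the set (100.5 here)
x6 = readings[5]                     # the 6th measurement (index 5)

precision = 1 - abs(x6 - avg) / avg  # Precision = 1 - |Xn - avg| / avg
print(f"average = {avg}, precision of 6th measurement = {precision:.3f}")
```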
1) Arithmetic mean/average

$$\bar{x} = \frac{x_1 + x_2 + x_3 + \dots + x_n}{n} = \frac{1}{n}\sum_{i=1}^{n} x_i$$

where
n = total number of pieces of data
$x_n$ = the value of the nth measurement
$x_i$ = set of numbers

2) Deviation
the difference between each piece of data and the arithmetic mean:

$$d_n = x_n - \bar{x}$$

The algebraic sum of the deviations of a set is zero:

$$d_{tot} = d_1 + d_2 + \dots + d_n = 0$$

3) Average deviation (D)
- indicates the precision of a measuring instrument
- high D: low precision
- low D: high precision

$$D = \frac{|d_1| + |d_2| + \dots + |d_n|}{n}$$

4) Standard deviation
the degree to which the values vary about the average value:

$$S = \sqrt{\frac{\sum_{i=1}^{n}(x_i - \bar{x})^2}{n-1}} = \sqrt{\frac{\sum_{i=1}^{n} d_i^2}{n-1}} \quad \text{for } n < 30$$

$$S = \sqrt{\frac{\sum_{i=1}^{n} d_i^2}{n}} \quad \text{for } n \ge 30$$
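A short Python sketch tying the four statistics together; the readings are hypothetical, chosen only to exercise the formulas above:

```python
import math

# Hypothetical set of readings (n = 10).
x = [98, 101, 102, 97, 101, 100, 103, 98, 106, 99]
n = len(x)

mean = sum(x) / n                    # 1) arithmetic mean
d = [xi - mean for xi in x]          # 2) deviations (their algebraic sum is ~0)
D = sum(abs(di) for di in d) / n     # 3) average deviation
# 4) standard deviation: divide by n-1 for n < 30, by n for n >= 30
S = math.sqrt(sum(di ** 2 for di in d) / ((n - 1) if n < 30 else n))

print(f"mean = {mean}, D = {D}, S = {S:.3f}")
```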
Some terms and definitions are as below:
Error - the deviation of a reading or set of readings from the expected value of the measured variable.
Accuracy - the degree of exactness of a measurement compared to the expected value.
Precision - a measure of consistency, or repeatability, of measurements.
Instrument - a device or mechanism used to determine the present value of a quantity.
Sensitivity - the ratio of the change in output (response) of the instrument to a change of input or measured variable.
Expected value - the most probable value we should expect to obtain.
Deviation - the difference between any piece of data in a set of numbers and the arithmetic mean of the set of numbers.
Measurement - a process of comparing an unknown quantity with an accepted standard quantity.
Standard - an instrument or device having a recognized permanent (stable) value that is used as a reference.
Resolution - the smallest change in a measured variable to which an instrument will respond.