Unit 2 Mod 2 Uncertainty in Measurements

Precision vs Accuracy

Accuracy is the degree of closeness of a measured or calculated quantity to its actual (true) value. Precision is the closeness of repeated values to each other. The results of a calculation or measurement can be accurate but not precise, precise but not accurate, neither, or both. A measurement system or computational method is called valid if it is both accurate and precise.

Errors made in measurements can be either random or systematic. Random errors can usually be minimised because they are related to experimental technique, e.g. reading the value of a burette or pipette. Random errors generally produce a spread of data around the mean; this spread is the variability, measured by the standard deviation of the data.

Systematic error (or bias) is more difficult to eliminate, since it is usually caused by the equipment itself. Systematic error can arise if the equipment is not regularly calibrated, if the environment changes, or if improper observational technique is used. Systematic errors cause the mean of the measurements to differ significantly from the true value. They are constant errors that act in one direction, i.e. the results are consistently either greater or less than the true value.

In analytical chemistry, a calibration curve is a general method for determining the concentration of a substance in an unknown sample by comparing the unknown to a set of standard samples of known concentration. The calibration curve is a plot of how the instrumental response, the so-called analytical signal, changes with the concentration of the analyte (the substance to be measured).
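To make the calibration-curve idea concrete, here is a minimal sketch in Python. The standard concentrations, signals, and the unknown's reading are made-up illustrative numbers, not data from this module.

```python
import numpy as np

# Hypothetical standards: concentrations (mg/L) and their measured signals
conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
signal = np.array([0.02, 0.21, 0.39, 0.62, 0.80, 1.01])

# Fit a straight line through the standards: signal = slope * conc + intercept
slope, intercept = np.polyfit(conc, signal, 1)

# Invert the line to read off the concentration of an unknown sample
unknown_signal = 0.55
unknown_conc = (unknown_signal - intercept) / slope
print(f"slope = {slope:.4f}, intercept = {intercept:.4f}")
print(f"unknown concentration ~ {unknown_conc:.2f} mg/L")
```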

Most analytical instruments produce a signal even when a blank (matrix without analyte) is analyzed. This signal is referred to as the noise level.

LOD: the limit of detection is the lowest quantity of a substance that can be distinguished from the absence of that substance (a blank value) within a stated confidence limit (generally 1%).

LOQ: the limit of quantification. Just because we can distinguish a signal from noise does not mean that we can necessarily know how much of the material there actually is. The LOQ is the limit above which we can reasonably tell the difference between two different values.

Note: the gradient (slope) of the calibration curve indicates the sensitivity of the device. The higher the gradient, the more sensitive the device.
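The noise level, LOD, and sensitivity can be tied together numerically. A widely used convention (not stated in this module, so treat it as an assumption here) estimates LOD as 3 times the standard deviation of the blank divided by the calibration slope, and LOQ as 10 times that ratio. The blank readings and slope below are made-up numbers.

```python
import numpy as np

# Hypothetical replicate measurements of a blank (matrix without analyte)
blank_signals = np.array([0.011, 0.013, 0.009, 0.012, 0.010, 0.014, 0.011])

slope = 0.098  # assumed sensitivity from the calibration curve (signal per mg/L)
sigma_blank = np.std(blank_signals, ddof=1)  # noise level: sample std dev of the blank

# Common convention: LOD = 3*sigma/slope, LOQ = 10*sigma/slope
lod = 3 * sigma_blank / slope
loq = 10 * sigma_blank / slope
print(f"LOD ~ {lod:.3f} mg/L, LOQ ~ {loq:.3f} mg/L")
```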
Example

It is often difficult to understand the concept of a detection limit; the following example may help to clarify some of the concepts defined above. Suppose you are at an airport with lots of noise from jets taking off. If the person next to you speaks softly, you will probably not hear them: their voice is less than the LOD. If they speak a bit louder, you may hear them, but you cannot be certain of what they are saying and there is still a good chance you will miss it: their voice is >LOD but <LOQ. If they speak louder still, you can understand them and act on what they are saying, with little chance of mishearing: their voice is then >LOD and >LOQ. Likewise, their voice may stay at the same loudness while the noise from the jets is reduced, allowing their voice to become >LOD. Detection limits depend on both the signal intensity (the voice) and the noise (the jets).

Mean

In mathematics, an average, or central tendency, of a data set is a measure of the "middle" or "expected" value of the data set. There are many different descriptive statistics that can be chosen as a measure of the central tendency of the data items.

Standard deviation

The sample standard deviation is given by:

s = √( Σ(xᵢ − x̄)² / (n − 1) )

where x̄ is the mean, the xᵢ are the individual values, and n is the number of data points.

Example

The owner of the Ches Tahoe restaurant is interested in how much people spend at the restaurant. He examines 10 randomly selected receipts for parties of four and writes down the following data:

44, 50, 38, 96, 42, 47, 40, 39, 46, 50

He calculated the mean by adding the values and dividing by 10:

mean = 492 / 10 = 49.2

The sum of the squared deviations from the mean is 2599.6. Dividing by n − 1 gives

2599.6 / (10 − 1) = 2599.6 / 9 = 288.84

and taking the square root gives

√288.84 ≈ 17.0

Therefore the sample standard deviation ≈ 17.0.
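A few lines of Python can verify this worked example; the statistics module's stdev function uses the same n − 1 divisor as the formula above.

```python
import statistics

receipts = [44, 50, 38, 96, 42, 47, 40, 39, 46, 50]

mean = statistics.mean(receipts)  # 492 / 10 = 49.2
s = statistics.stdev(receipts)    # sample standard deviation (divides by n - 1)
print(f"mean = {mean}, sample standard deviation = {s:.2f}")
```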

Calibration is the comparison between two measurements: one of known magnitude or correctness, made or set with one device, and another measurement made in as similar a way as possible with a second device.

Calibration Technique of a Pipette

One of the easiest ways of calibrating a pipette is gravimetric calibration: you measure the weight of water the pipette dispenses and convert that weight into a volume using a physical property of water, its density:

Volume (mL) = Weight of water dispensed (mg) / density of water (mg/mL)

The density of water is a well-known constant at a given temperature, so the mass of the dispensed sample provides an accurate indication of the volume dispensed. Remember that the density of water is temperature dependent.
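A minimal sketch of the gravimetric calculation, working in grams and mL (the units only need to be consistent). The density table holds approximate handbook values for pure water; the dispensed masses are made-up examples.

```python
# Approximate handbook densities of pure water (g/mL) at selected temperatures
WATER_DENSITY_G_PER_ML = {20.0: 0.99821, 21.0: 0.99799, 22.0: 0.99777, 25.0: 0.99704}

def dispensed_volume_ml(mass_g: float, temp_c: float) -> float:
    """Convert a weighed mass of water (g) into the volume dispensed (mL)."""
    return mass_g / WATER_DENSITY_G_PER_ML[temp_c]

# Made-up example: five nominal 1.000 mL deliveries weighed at 22 degrees C
masses_g = [0.9981, 0.9975, 0.9969, 0.9984, 0.9972]
volumes = [dispensed_volume_ml(m, 22.0) for m in masses_g]
mean_volume = sum(volumes) / len(volumes)
print(f"mean delivered volume = {mean_volume:.4f} mL")
```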

Worksheet 1.
