You can think of accuracy and precision in terms of hitting a bull's-eye. Hitting the target accurately means the marks land close to the center, even if they are scattered on different sides of it. Hitting the target precisely means the marks are closely spaced, even if they land far from the center. Measurements that are both accurate and precise are repeatable and very near the true value.
Accuracy
There are two common definitions of accuracy. In math, science, and engineering, accuracy refers to how close a measurement is to the true value.
The ISO (International Organization for Standardization) applies a stricter definition, in which accuracy refers to a measurement that is both true and consistent. Under the ISO definition, an accurate measurement has neither systematic error nor random error. Essentially, the ISO advises that accurate be used only for measurements that are both close to the true value and repeatable.
Precision
Precision is how consistent results are when measurements are repeated. Precise values differ from each other because of random error, which is a form of observational error.
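To make the two ideas concrete, here is a minimal Python sketch with made-up numbers. It models each reading as the true value plus a systematic error (bias) and a random error (noise): the bias determines accuracy, while the size of the noise determines precision.

```python
import random

# A minimal sketch with made-up numbers: model each reading as
# true value + systematic error (bias) + random error (noise).
# The bias determines accuracy; the size of the noise determines precision.
TRUE_VALUE = 50.0   # e.g. the mass of a standard sample, in grams
BIAS = -2.4         # systematic error: every reading comes out ~2.4 g low
NOISE = 0.1         # random error: readings scatter by roughly +/-0.1 g

def measure():
    """Simulate one reading from a precise but inaccurate scale."""
    return TRUE_VALUE + BIAS + random.uniform(-NOISE, NOISE)

readings = [round(measure(), 1) for _ in range(4)]
print(readings)   # tightly clustered near 47.6 -- precise, but not accurate
```

Shrinking the bias toward zero makes the simulated scale accurate; shrinking the noise makes it precise.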
Examples
You can think of accuracy and precision in terms of a basketball player. If the player always makes a basket, even though he strikes different portions of the rim, he has a high degree of accuracy. If he doesn't make many baskets but always strikes the same portion of the rim, he has a high degree of precision. A player whose free throws always make the basket the exact same way has a high degree of both accuracy and precision.
Experimental measurements provide another example of precision and accuracy. You can tell how close a set of measurements is to a true value by averaging them. If you measure the mass of a 50.0-gram standard sample and get values of 47.5, 47.6, 47.5, and 47.7 grams, your scale is precise but not very accurate. The average of your measurements is 47.6 grams, which is lower than the true value, yet the readings are consistent with one another. If a second scale gives you values of 49.8, 50.5, 51.0, and 49.6 grams, it is more accurate than the first one but not as precise: the average is 50.2 grams, but there is a much larger spread between the readings. The more precise scale would be better to use in the lab, provided you correct for its error. In other words, it's better to calibrate a precise instrument than to use an imprecise yet accurate one.
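If you want to check these numbers yourself, here is a short Python sketch that computes the average (which reflects accuracy) and the spread (which reflects precision) of the two sets of readings above:

```python
from statistics import mean, stdev

true_value = 50.0
scale_a = [47.5, 47.6, 47.5, 47.7]   # precise, but not accurate
scale_b = [49.8, 50.5, 51.0, 49.6]   # more accurate, but less precise

for name, readings in [("scale A", scale_a), ("scale B", scale_b)]:
    avg = mean(readings)
    # The average error reflects accuracy; the standard deviation and
    # range reflect precision.
    print(f"{name}: mean = {avg:.1f} g, error = {avg - true_value:+.1f} g, "
          f"stdev = {stdev(readings):.2f} g, "
          f"range = {max(readings) - min(readings):.1f} g")

# Prints roughly:
#   scale A: mean = 47.6 g, error = -2.4 g, stdev = 0.10 g, range = 0.2 g
#   scale B: mean = 50.2 g, error = +0.2 g, stdev = 0.64 g, range = 1.4 g
```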
Mnemonic to Remember the Difference
An easy way to remember the difference between accuracy and precision is:
- ACcurate is Correct (or Close to real value)
- PRecise is Repeating (or Repeatable)
Accuracy, Precision, and Calibration
Do you think it's better to use an instrument that records accurate measurements or one that records precise measurements? If you weigh yourself on a scale three times and each time the number is different yet close to your true weight, the scale is accurate. Even so, it might be better to use a scale that is precise, even if it is not accurate: all the measurements would be very close to each other and "off" from the true value by about the same amount. This is a common issue with scales, which often have a "tare" button to zero them.
While scales and balances might let you tare or adjust them so measurements are both accurate and precise, many instruments require calibration. A good example is a thermometer. Thermometers often read more reliably within a certain range and give increasingly inaccurate (but not necessarily imprecise) values outside that range. To calibrate an instrument, record how far its measurements are from known or true values. Keep a record of the calibration to ensure proper readings. Many pieces of equipment require periodic calibration to ensure accurate and precise readings.
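As a simple illustration, here is a minimal Python sketch of a constant-offset calibration with hypothetical readings: compare the instrument against known reference values, record the average offset, and subtract it from later raw readings. Real calibrations, especially for thermometers, may need a range-dependent correction rather than a single constant.

```python
# A minimal constant-offset calibration sketch with hypothetical readings:
# compare the instrument against known reference values, record the average
# offset, and subtract it from later raw readings.
reference_values = [0.0, 25.0, 50.0, 100.0]    # known standards (e.g. degrees C)
instrument_reads = [1.8, 27.0, 52.2, 102.0]    # what the thermometer actually showed

offsets = [read - ref for read, ref in zip(instrument_reads, reference_values)]
correction = sum(offsets) / len(offsets)       # average systematic error

def calibrated(raw_reading):
    """Apply the recorded correction to a raw reading."""
    return raw_reading - correction

print(f"correction: {correction:+.1f}")        # roughly +2.0
print(f"{calibrated(38.5):.1f}")               # raw 38.5 becomes roughly 36.5
```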