Calibration of instruments

Calibration of a measuring instrument is the process in which the readings obtained from the instrument are compared with sub-standards in the laboratory at several points along the scale of the instrument. From the readings of the instrument and of the sub-standards, a curve is plotted. If the instrument is accurate, its readings will match those of the sub-standard. If the measured values deviate from the standard values, the instrument is adjusted (calibrated) so that it gives the correct values.

All new instruments have to be calibrated against some standard at the very beginning. For a new instrument the scale is marked as per the sub-standards available in the laboratories, which are kept especially for this purpose. After continuous use over long periods of time an instrument sometimes loses its calibration or its scale gets distorted; in such cases the instrument can be calibrated again if it is still in good, reusable condition.

Even if the instruments in a factory are working in good condition, it is always advisable to calibrate them from time to time to avoid wrong readings of highly critical parameters. This is especially important in companies where very high precision jobs are manufactured with high accuracy.

Purpose of instrument calibration

Calibration refers to the act of evaluating and adjusting the precision and accuracy of measurement equipment. Instrument calibration is intended to eliminate or reduce bias in an instrument’s readings over a range for all continuous values.

Precision is the degree to which repeated measurements under unchanged conditions show the same result.

Accuracy is the degree of closeness of measurements of a quantity to that quantity's true value.
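
As a small illustration of the difference, the sketch below computes both quantities from hypothetical repeated readings of a reference whose true value is known; all numbers are made up for illustration.

```python
# Minimal sketch: precision vs. accuracy for repeated readings of a
# reference standard whose true value is known (hypothetical numbers).
import statistics

true_value = 10.000                                    # known reference value
readings = [10.012, 10.011, 10.013, 10.012, 10.010]    # repeated measurements

precision = statistics.stdev(readings)          # spread of the repeated readings
bias = statistics.mean(readings) - true_value   # offset of the mean from the true value

print(f"precision (std dev): {precision:.4f}")
print(f"bias (accuracy error): {bias:.4f}")
# A small spread with a large bias means the instrument is precise
# but inaccurate until it is calibrated.
```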

For this purpose, reference standards with known values for selected points covering the range of interest are measured with the instrument in question. Then a functional relationship is established between the values of the standards and the corresponding measurements. There are two basic situations:

Instruments which require correction for bias: The instrument reads in the same units as the reference standards. The purpose of the calibration is to identify and eliminate any bias in the instrument relative to the defined unit of measurement. For example, optical imaging systems that measure the width of lines on semiconductors read in micrometers, the unit of interest. Nonetheless, these instruments must be calibrated to values of reference standards if line width measurements across the industry are to agree with each other.

Instruments whose measurements act as surrogates for other measurements: The instrument reads in different units than the reference standards. The purpose of the calibration is to convert the instrument readings to the units of interest. An example is densitometer measurements that act as surrogates for measurements of radiation dosage. For this purpose, reference standards are irradiated at several dosage levels and then measured by radiometry. The same reference standards are measured by densitometer. The calibrated results of future densitometer readings on medical devices are the basis for deciding if the devices have been sterilized at the proper radiation level.
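
As a rough illustration of this second case, the sketch below fits a straight line between hypothetical dosage levels and densitometer readings and then inverts it to convert a future reading into an estimated dosage. The numbers, the linear model and the helper density_to_dosage are assumptions for illustration, not real dosimetry data.

```python
# Hedged sketch of the "surrogate" case: reference standards irradiated at
# known dosage levels are read with a densitometer, a straight line is fitted,
# and future densitometer readings are converted back to dosage.
import numpy as np

dosage_kGy = np.array([5.0, 10.0, 15.0, 20.0, 25.0])    # known reference dosages
density    = np.array([0.42, 0.81, 1.23, 1.60, 2.02])   # densitometer readings

# Fit density as a linear function of dosage: density = slope * dosage + offset
slope, offset = np.polyfit(dosage_kGy, density, 1)

def density_to_dosage(reading: float) -> float:
    """Convert a densitometer reading to an estimated dosage (inverse of the fit)."""
    return (reading - offset) / slope

print(density_to_dosage(1.40))   # estimated dosage for a future reading
```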

Steps for correcting the instrument for bias

The calibration method is the same for both situations stated above and requires the following basic steps:

  • Selection of reference standards with known values to cover the range of interest.
  • Measurements on the reference standards with the instrument to be calibrated.
  • Establishment of a functional relationship between the measured and known values of the reference standards (usually a least-squares fit to the data), called the calibration curve (see the sketch after this list).
  • Correction of all measurements by the inverse of the calibration curve.
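
A minimal sketch of these four steps for an instrument that reads in the same units as the standards is given below, assuming illustrative reference values and a straight-line least-squares calibration curve; the data and the helper correct() are hypothetical.

```python
# Sketch of the basic calibration steps with a linear calibration curve.
import numpy as np

# Step 1: reference standards with known values covering the range of interest.
known = np.array([0.0, 25.0, 50.0, 75.0, 100.0])

# Step 2: readings of the instrument on those standards (illustrative values).
measured = np.array([0.8, 26.1, 51.0, 76.2, 101.1])

# Step 3: calibration curve, measured = a * known + b (least-squares fit).
a, b = np.polyfit(known, measured, 1)

def correct(reading: float) -> float:
    """Step 4: correct a raw reading by the inverse of the calibration curve."""
    return (reading - b) / a

print(correct(63.4))   # bias-corrected value for a new reading
```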

Calibration Disciplines

There are many calibration disciplines, each having different types of calibrators and calibration references. Common calibration disciplines include but are not limited to:

  • Electrical
  • Radio frequency (RF)
  • Temperature
  • Humidity
  • Pressure
  • Flow
  • Dimensional

When are instruments calibrated

Before major critical measurements

Before any measurement that requires highly accurate data, send the instruments out for calibration and keep them unused until the test.

After major critical measurements

Sending the instrument for calibration after the test helps the user decide whether the data obtained were reliable. Also, when an instrument is used for a long time, its condition will change.

After an event

The event here refers to anything that happens to the instrument, for example something hitting the instrument or any kind of accident that might affect its accuracy. A safety check is also recommended.

When observations appear questionable

When you suspect that the data are inaccurate due to instrumental errors, send the instrument for calibration.

Per requirements

Some experiments require calibration certificates. Check the requirements first before starting the experiment.

Indicated by manufacturer

Every instrument will need to be calibrated periodically to make sure it can function properly and safely. Manufacturers will indicate how often the instrument will need to be calibrated.

How Calibration of the Instruments is done

All measuring instruments for the measurement of length, pressure, temperature, etc. should be calibrated against some standard scale at regular intervals, as specified by the manufacturer. There are different methods or techniques of calibration, which are applied depending on whether it is a routine calibration or one for a special purpose where highly accurate calibration of the instruments is desired. In many cases different methods of calibration are applied to the individual instruments. No matter what type of calibration is being done, all of them are carried out in the laboratory.

The calibration of the instrument is done in the laboratory against sub-standard instruments, which are used very rarely and only for this purpose. These sub-standards are kept in a highly controlled, air-conditioned atmosphere so that their scale does not change with external atmospheric changes.

To maintain the accuracy of the sub-standards, they are checked periodically against a standard kept in metrological laboratories under highly secure, safe, clean and air-conditioned conditions. Finally, the standards themselves can be checked against absolute measurements of the quantity that the instruments are designed to measure.

Here is the procedure for the calibration of mechanical instruments:

  1. Firstly, the readings obtained from the scale of the instrument are compared with the readings of the sub-standard, and a calibration curve is formed from the obtained values. In this procedure the instrument is fed with known values (obtained from the sub-standard), which are detected by the transducer part of the instrument. The output of the instrument is observed and compared against the original value of the sub-standard.

  2. A single-point calibration is good enough if the system has been shown to be linear (that is, the instrument readings vary linearly with the sub-standard values); if it is not, readings have to be taken at multiple points (see the sketch after this list).

  3. In most cases a static input is applied to the instrument, and its dynamic response is inferred from the static calibration.
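
To illustrate the single-point case mentioned in step 2, the sketch below derives one scale factor from a single sub-standard value, assuming the instrument responds linearly through zero; the values and the helper correct() are hypothetical.

```python
# Single-point calibration sketch: one known input fixes the scale factor,
# assuming a linear response through zero (hypothetical numbers).
known_value = 50.0          # value fed from the sub-standard
instrument_reading = 51.3   # what the instrument actually shows

scale = known_value / instrument_reading   # single-point correction factor

def correct(reading: float) -> float:
    """Apply the single-point scale factor to a raw reading."""
    return reading * scale

print(correct(75.0))
# If readings at other points still deviate after this correction, the
# response is not linear and a multi-point calibration curve is needed.
```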

In some instruments it is not feasible to introduce the input quantity for calibration purposes, as in bonded strain gauges. In such cases spot calibration is done by the manufacturer. The procedure applied differs for the different types of such instruments and shall be discussed with the individual instruments.
