Difference between Calibration & Ranging; What are Up-tests and Down-tests in Calibration?

Calibration and ranging are two tasks related to establishing an accurate correspondence between an instrument's input signal and its output signal.

Calibration ensures that the instrument measures or controls the real-world variable accurately. Ranging, simply defined, establishes the required relationship between an instrument's input and its output.

Calibration vs. Ranging:


Calibrating an instrument means checking and adjusting its response (if necessary) so that its output corresponds accurately to its input over a defined range. To do this, the instrument must be exposed to a real input stimulus of precisely known quantity.

For a pressure instrument, this means subjecting the gauge, indicator, or transmitter to known fluid pressures and comparing its response to those known pressure values. Without comparing an instrument's response to known physical stimuli, one cannot perform a true calibration.
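The comparison step above can be sketched in code. This is a minimal, hypothetical example (the applied pressures and recorded milliamp readings are invented for illustration, not taken from the article) showing how each known pressure is compared against the ideal linear 4-20 mA output to compute calibration error as a percent of span:

```python
# Sketch of a five-point calibration check for a 0-200 PSI, 4-20 mA
# pressure transmitter. Test data below is hypothetical example data.

LRV, URV = 0.0, 200.0        # lower and upper range values, PSI
OUT_LO, OUT_HI = 4.0, 20.0   # output at LRV and URV, mA
SPAN = OUT_HI - OUT_LO       # 16 mA output span

def ideal_output(psi):
    """Ideal linear 4-20 mA output for a given applied pressure."""
    return OUT_LO + (psi - LRV) / (URV - LRV) * SPAN

def error_percent_of_span(psi, measured_ma):
    """Calibration error at one test point, as percent of output span."""
    return (measured_ma - ideal_output(psi)) * 100.0 / SPAN

# Known applied pressures paired with the transmitter's actual readings
test_points = [(0.0, 4.02), (50.0, 8.05), (100.0, 12.01),
               (150.0, 15.97), (200.0, 19.99)]

for psi, ma in test_points:
    print(f"{psi:6.1f} PSI: measured {ma:5.2f} mA, "
          f"error {error_percent_of_span(psi, ma):+.3f} % of span")
```

The key point the code illustrates: without the known applied pressure (the "real input stimulus"), there is nothing to compute the error against.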


Ranging an instrument means setting its lower and upper range values so that it responds to changes in input with the desired sensitivity.

For example, a pressure transmitter set to a range of 0 to 200 PSI (0 PSI = 4 mA output; 200 PSI = 20 mA output) could be re-ranged to respond over a scale of 0 to 150 PSI (0 PSI = 4 mA; 150 PSI = 20 mA).
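The effect of re-ranging can be shown with the article's own numbers. This short sketch (the function name is my own, not a standard API) applies the linear 4-20 mA scaling under both ranges and shows how the same applied pressure produces a different output after re-ranging:

```python
# How re-ranging changes the input-to-output mapping of a transmitter.

def psi_to_ma(psi, lrv, urv):
    """Linear 4-20 mA output for a pressure within [lrv, urv]."""
    return 4.0 + (psi - lrv) / (urv - lrv) * 16.0

# Original range: 0 to 200 PSI -> 100 PSI is mid-scale
print(psi_to_ma(100.0, 0.0, 200.0))   # 12.0 mA

# After re-ranging to 0 to 150 PSI, the same 100 PSI reads higher
print(psi_to_ma(100.0, 0.0, 150.0))   # ~14.67 mA
```

Note that only the range endpoints changed; the instrument's underlying response to pressure was not adjusted, which is exactly why a digital transmitter can be re-ranged without recalibration.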

In analog instruments, re-ranging could (generally) only be accomplished by re-calibration, since the same adjustments served both purposes. In digital instruments, calibration and ranging are typically separate adjustments (i.e. a digital transmitter can be re-ranged without performing a full recalibration), so understanding the distinction is essential.

Up-tests and Down-tests

Calibration tables record both up-going and down-going calibration points in order to document hysteresis and deadband errors. Note the table below, showing a transmitter with 0.313 percent peak hysteresis:
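The calculation behind such a table can be sketched as follows. The up-test and down-test readings here are hypothetical example data (not the article's table); peak hysteresis is simply the largest difference between the up-going and down-going outputs at the same input, expressed as a percent of span:

```python
# Computing peak hysteresis from up-test and down-test calibration data.
# Readings below are hypothetical example data.

SPAN_MA = 16.0  # 4-20 mA output span

# (input as % of range, up-test reading mA, down-test reading mA)
readings = [
    (0,    4.00,  4.03),
    (25,   8.01,  8.06),
    (50,  12.02, 12.05),
    (75,  16.01, 16.03),
    (100, 20.00, 20.00),
]

# Largest up/down disagreement at any single test point, in mA
peak_hysteresis_ma = max(abs(down - up) for _, up, down in readings)

# Express as percent of the 16 mA output span
peak_hysteresis_pct = peak_hysteresis_ma * 100.0 / SPAN_MA
print(f"Peak hysteresis: {peak_hysteresis_pct} % of span")
```

With this example data the peak disagreement is 0.05 mA at the 25% point, i.e. roughly 0.31% of span, comparable in magnitude to the 0.313% figure the article cites.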

It is important not to overshoot any of the test points during such a directional calibration test. If you overshoot a test point while setting up one of the instrument's input conditions, simply "back up" the test stimulus and re-approach the test point from the same direction as before. Unless the value of each test point is approached from the correct direction, the recorded data cannot be used to measure hysteresis or deadband.