What is meant by "linearity" in an instrumentation system, and why is it important?

Linearity is the capacity of a sensor to respond uniformly to changes in the measured variable over its whole range. If you have encountered older-style pressure gauges, you may have noticed that they frequently feature scales with uneven division sizes; these uneven divisions compensate for non-linearity in the bourdon tube's flexing. The same principles apply to electronic devices, whether they are discrete devices such as proximity sensors or process instruments such as flowmeters.

The linearity of an instrument's output refers to how closely its output curve approaches a straight line. In practice, the specification is really a measure of the output's nonlinearity.

The first step in determining linearity is plotting the average output curve. This curve is generated by averaging the output values observed over two or more successive cycles of the instrument's input.

This technique allows linearity to be evaluated independently of deadband and hysteresis effects. The assumption is that, if there were no deadband or hysteresis error, the true output curve would be a single line lying halfway between the upscale and downscale values.

The average output curve is then compared with a straight line drawn between the output values at the lower and upper range value inputs. The instrument's linearity is expressed as the maximum deviation of the average output curve from this line, as sketched below.
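The following is a minimal sketch of that procedure in Python. The input span, output range, and readings are hypothetical (a 0-100 % input driving a 4-20 mA output); the point is simply to show the averaging of the upscale and downscale passes and the comparison against the end-point line.

```python
# Hypothetical calibration data: one upscale and one downscale pass.
inputs    = [0, 25, 50, 75, 100]               # % of input span
upscale   = [4.00, 8.05, 12.10, 16.05, 20.00]  # mA, increasing-input pass
downscale = [4.02, 8.15, 12.20, 16.10, 20.00]  # mA, decreasing-input pass

# Averaging the two passes removes deadband and hysteresis from the comparison.
average = [(u + d) / 2 for u, d in zip(upscale, downscale)]

# Straight line between the outputs at the lower and upper range values.
x0, xn, y0, yn = inputs[0], inputs[-1], average[0], average[-1]
line = [y0 + (yn - y0) * (x - x0) / (xn - x0) for x in inputs]

# Report the worst-case deviation in mA and as a percentage of output span.
max_dev = max(abs(a - l) for a, l in zip(average, line))
print(f"Maximum deviation: {max_dev:.3f} mA "
      f"({100 * max_dev / (yn - y0):.2f} % of output span)")
```

The deviation reported by a sketch like this depends on where the reference line is placed, which is exactly what the three specifications below define.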

As a performance specification, linearity should be stated as one of the following terms:

Independent linearity,

Terminal-based linearity,

Zero-based linearity.

Independent linearity

When the term linearity is used without further qualification, independent linearity is implied.

Independent linearity is the maximum deviation of the average output curve from a straight line that is positioned so as to minimize that maximum deviation.
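A small sketch of that idea, using the same hypothetical averaged readings as above: for any fixed slope, the intercept that minimizes the maximum deviation sits midway between the largest and smallest residuals, and the resulting maximum deviation is a convex function of the slope, so a simple ternary search finds the best-fit line.

```python
inputs  = [0, 25, 50, 75, 100]               # % of input span (hypothetical)
average = [4.01, 8.10, 12.15, 16.08, 20.00]  # averaged up/down readings, mA

def max_dev_for_slope(m):
    # Best intercept splits the residual band in half, so the achievable
    # maximum deviation for slope m is half the residual spread.
    residuals = [y - m * x for x, y in zip(inputs, average)]
    return (max(residuals) - min(residuals)) / 2

lo, hi = 0.0, 1.0                            # slope bracket, mA per % of input
for _ in range(100):                         # ternary search on a convex function
    m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
    if max_dev_for_slope(m1) < max_dev_for_slope(m2):
        hi = m2
    else:
        lo = m1

m = (lo + hi) / 2
print(f"Independent linearity: +/-{max_dev_for_slope(m):.3f} mA "
      f"(best-fit slope {m:.4f} mA/%)")
```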

Terminal-based linearity

Terminal-based linearity is the maximum deviation of the average output curve from a straight line that is positioned to coincide with the average output curve at the lower and upper range values.
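This is the same end-point line used in the generic sketch earlier, restated here under its formal name. A minimal sketch with the same hypothetical readings:

```python
inputs  = [0, 25, 50, 75, 100]               # % of input span (hypothetical)
average = [4.01, 8.10, 12.15, 16.08, 20.00]  # averaged up/down readings, mA

# Reference line forced through the average outputs at both range ends.
x0, xn, y0, yn = inputs[0], inputs[-1], average[0], average[-1]
slope = (yn - y0) / (xn - x0)

deviation = max(abs(y - (y0 + slope * (x - x0)))
                for x, y in zip(inputs, average))
print(f"Terminal-based linearity: +/-{deviation:.3f} mA")
```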

Zero-based linearity

Zero-based linearity is the maximum deviation of the average output curve from a straight line that coincides with the average output curve at the lower range value and whose slope is chosen to minimize that maximum deviation.
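A sketch of zero-based linearity with the same hypothetical readings: the reference line is pinned to the average output at the lower range value, and only its slope is adjusted. As with the independent case, the maximum deviation is convex in the slope, so a ternary search is enough.

```python
inputs  = [0, 25, 50, 75, 100]               # % of input span (hypothetical)
average = [4.01, 8.10, 12.15, 16.08, 20.00]  # averaged up/down readings, mA
x0, y0  = inputs[0], average[0]              # the line must pass through this point

def max_dev(m):
    return max(abs(y - (y0 + m * (x - x0))) for x, y in zip(inputs, average))

lo, hi = 0.0, 1.0                            # slope bracket, mA per % of input
for _ in range(100):                         # ternary search on a convex function
    m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
    if max_dev(m1) < max_dev(m2):
        hi = m2
    else:
        lo = m1

m = (lo + hi) / 2
print(f"Zero-based linearity: +/-{max_dev(m):.3f} mA (slope {m:.4f} mA/%)")
```

The three figures generally differ for the same data: the independent value is the smallest, because its line has the most freedom, while the zero-based and terminal-based lines are progressively more constrained.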

Example of linearity:

Take a temperature sensor as an example. A thermocouple, RTD, or thermistor converts temperature to a voltage or resistance, and that voltage or resistance must track variations in temperature. A device with good linearity changes its voltage or resistance by the same amount per degree of temperature change over its entire range.

For a type K thermocouple, for instance, a 10 °C change near the center of its measurement range (around 500 °C) produces a shift of about 0.427 mV. If the device were truly linear, the change in voltage per degree would be constant over the whole measurement range. In reality it is not: a 10 °C change at the low extreme (-250 °C) produces only about 0.017 mV, while at the opposite extreme of 1,370 °C a 10 °C change corresponds to about 0.340 mV.
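A minimal arithmetic check using only the figures quoted above: express each 10 °C change as a sensitivity in mV/°C and compare. A truly linear device would show the same sensitivity at every point in its range.

```python
# Voltage changes for a 10 degC step, taken from the values quoted in the text.
readings = {
    "low extreme (about -250 degC)":   0.017,  # mV per 10 degC
    "mid-range (about 500 degC)":      0.427,
    "high extreme (about 1370 degC)":  0.340,
}

for region, delta_mv in readings.items():
    print(f"{region}: {delta_mv / 10:.4f} mV/degC")

# The sensitivity varies by more than a factor of 20 across the range,
# which is why thermocouple inputs are normally linearized in the
# transmitter or indicator rather than read as a raw millivolt signal.
```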