Smart transmitters are now standard in industrial instrumentation. These devices offer built-in diagnostics, greater accuracy, and the ability to communicate digitally with host devices to report a variety of parameters.
The only calibration adjustments available on an analog transmitter are the “zero” and “span” settings. On a smart transmitter we instead establish lower and upper range values (LRV and URV), but it is also possible to calibrate the analog-to-digital and digital-to-analog converter circuits independently of that ranging.
Smart instruments always provide a means to calibrate both the ADC and DAC circuits: calibrating the ADC ensures the microprocessor “sees” a correct representation of the applied stimulus, and calibrating the DAC ensures the microprocessor’s output value is accurately converted to a DC current. This calibration function is called digital trim.
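To visualize where these two trims act, here is a minimal Python sketch of the signal chain inside a generic smart transmitter; all function names and trim constants are illustrative assumptions, not any vendor's actual firmware:

```python
# Minimal sketch of a generic smart transmitter's signal chain.
# All names and trim constants are illustrative assumptions,
# not any vendor's actual firmware.

def sensor_trim(raw_reading, gain=1.0, offset=0.0):
    """Sensor (ADC) trim: correct the raw reading so the
    microprocessor 'sees' the true applied stimulus."""
    return gain * raw_reading + offset

def pv_to_percent(pv, lrv, urv):
    """Ranging: express the process variable as percent of span."""
    return (pv - lrv) / (urv - lrv) * 100.0

def percent_to_ma(percent, dac_gain=1.0, dac_offset=0.0):
    """Output (DAC) trim: correct the ideal 4-20 mA conversion."""
    ideal_ma = 4.0 + 0.16 * percent
    return dac_gain * ideal_ma + dac_offset

# 0-100 PSI range, perfectly trimmed: 50 PSI applied -> 12 mA out.
pv = sensor_trim(50.0)
print(percent_to_ma(pv_to_percent(pv, lrv=0.0, urv=100.0)))  # 12.0
```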
Example: calibration of a pressure transmitter
Suppose we have an intelligent pressure transmitter ranged 0 to 100 PSI with an analog output range of 4 to 20 mA, but the pressure sensor has become fatigued after years of use, so an actual applied pressure of 100 PSI produces a signal that the analog-to-digital converter interprets as only 96 PSI.
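To make the arithmetic concrete, here is a hedged sketch of how a two-point (zero and gain) sensor trim could correct that error, assuming the firmware applies a simple linear correction; the trim points and formula are illustrative:

```python
# Hypothetical two-point sensor trim for the fatigued sensor above,
# assuming a linear correction: corrected = gain * raw + offset.

low_applied,  low_raw  = 0.0,   0.0    # 0 PSI applied, ADC reports 0 PSI
high_applied, high_raw = 100.0, 96.0   # 100 PSI applied, ADC reports 96 PSI

gain   = (high_applied - low_applied) / (high_raw - low_raw)  # ~1.0417
offset = low_applied - gain * low_raw                         # 0.0

print(gain * 96.0 + offset)  # ~100.0: full-scale reading restored
print(gain * 48.0 + offset)  # ~50.0: midscale corrected as well
```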
A HART communicator (or any digital host device) allows us to monitor the smart transmitter microprocessor’s process variable (PV) and analog output (AO) registers. To test the transmitter, we compare the real input and output values against trusted calibration standards.
The following example shows a differential pressure transmitter with a sensor (analog-to-digital) calibration error:
- The calibration standard for the pressure input to the transmitter is a digital pressure gauge.
- The calibration standard for the current output is a digital multimeter (DMM).
Technicians often use a hand-held HART communicator to re-set the LRV and URV to any new values desired by operations personnel, without having to re-verify the calibration by applying known physical stimuli to the instrument: simply connect the HART communicator, open the range settings, and change the LRV or URV.
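Re-ranging changes only the LRV/URV endpoints of the 4-20 mA mapping; it does not alter the trims. A quick illustration of the underlying math (a generic formula, not actual HART protocol code):

```python
# Re-ranging: the same applied pressure maps to a different current
# once the LRV/URV endpoints change. Values are illustrative.

def pressure_to_ma(p, lrv, urv):
    return 4.0 + 16.0 * (p - lrv) / (urv - lrv)

print(pressure_to_ma(50.0, lrv=0.0, urv=100.0))  # 12.0 mA (0-100 PSI range)
print(pressure_to_ma(50.0, lrv=0.0, urv=200.0))  # 8.0 mA (re-ranged 0-200 PSI)
```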
Sensor Trim and Output Trim:
Both calibration standards, the digital pressure gauge on the input and the DMM on the output, are shown in the figure above. If the digital multimeter does not read the expected value, we know immediately from the pressure gauge and multimeter readings that this transmitter has some kind of calibration error.
Comparing the PV and AO displays of the HART communicator with our calibration standards reveals more about the nature of this error: suppose we see that the AO value matches the multimeter while the PV value does not match the digital pressure gauge.
This tells us the calibration error lies within the sensor (input) of the transmitter and not within the DAC (output). Thus, the correct calibration procedure to perform on this errant transmitter is a sensor trim.
Conversely, if the PV value agrees with the digital pressure gauge while the AO value does not agree with the digital multimeter, the calibration error lies within the digital-to-analog converter (DAC) of the transmitter and not within the sensor (input). Thus, the correct calibration procedure to perform on that transmitter is an output trim.
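The diagnosis in these two cases reduces to a simple decision table. The sketch below, with hypothetical tolerances and function names, classifies which trim is needed by comparing the two calibration standards against the transmitter's PV and AO registers:

```python
# Hypothetical diagnostic helper: compare the pressure standard (gauge)
# against the PV register, and the current standard (DMM) against the
# AO register. Tolerances are illustrative, not from any standard.

def diagnose(gauge_psi, pv_psi, dmm_ma, ao_ma, tol_psi=0.5, tol_ma=0.05):
    sensor_ok = abs(pv_psi - gauge_psi) <= tol_psi  # PV matches input standard?
    output_ok = abs(ao_ma - dmm_ma) <= tol_ma       # AO matches output standard?
    if sensor_ok and output_ok:
        return "no trim needed"
    if not sensor_ok and output_ok:
        return "sensor trim (ADC error)"
    if sensor_ok and not output_ok:
        return "output trim (DAC error)"
    return "both sensor trim and output trim"

# Fatigued-sensor example: 100 PSI applied, PV reads 96 PSI,
# and the AO register agrees with the DMM (both 19.36 mA).
print(diagnose(gauge_psi=100.0, pv_psi=96.0, dmm_ma=19.36, ao_ma=19.36))
# -> sensor trim (ADC error)
```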