Why is 4 mA used as the lower value in the 4-20 mA standard, and not 3 mA or 5 mA?
The lower range value is 4.0 mA because that is what the standard specifies: ANSI/ISA-50-1975 (2002), originally adopted, as the date in the title shows, in 1975.
The creation of the standard lagged the development and adoption of the 4-20 mA range by at least a decade, meaning most of the major players in the process industry had effectively settled on 4-20 mA by the time the standard was written. The main holdout and competitor to 4-20 mA was the 10-50 mA range used by Foxboro, which did have an advantage where more loop power was needed, say to drive split-ranged valve positioners in series.
The reason for a 'live zero', a minimum value above zero, is that 2-wire field instruments use up to 3.5 mA to power the electronics that make the measurement and to control the regulated output. The fact that a live zero also flags an open loop or wire break when the current drops to zero is a convenient diagnostic, but it was not the main reason for a live zero; the ability to power 2-wire field instruments was.
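A rough sketch of that current budget makes the point. The 3.5 mA figure is from the answer above; the regulation margin and the terminal-voltage figure are illustrative assumptions, not numbers from the standard:

```python
# Hypothetical 2-wire transmitter budget at the bottom of the range.
# ELECTRONICS_MA is the figure cited above; TERMINAL_V is an assumption.
LOOP_MIN_MA = 4.0        # live zero: lowest current the loop ever carries
ELECTRONICS_MA = 3.5     # current consumed by the transmitter's own electronics
TERMINAL_V = 12.0        # assumed minimum voltage across the transmitter terminals

margin_ma = LOOP_MIN_MA - ELECTRONICS_MA
power_mw = ELECTRONICS_MA * TERMINAL_V   # mA x V = mW available to the device

print(f"Regulation margin at zero signal: {margin_ma:.1f} mA")   # 0.5 mA
print(f"Power available to the electronics: {power_mw:.1f} mW")  # 42.0 mW
# With a true zero (0 mA at 0% of span) the margin would be negative:
# the device would have no current at all to run on at the bottom of scale.
```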
The 1:5 ratio of 4:20 followed the ratio of the then-standard pneumatic range of 3-15 PSI. The same reasoning behind a minimum of 3 PSI applies: pneumatic controls required an ongoing, continuous consumption of air (the bleed through the flapper valve), so zero pressure would have made it impossible to use air to make the measurement. A live zero was necessary to make the measurement and control the regulated output.
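Both ranges map a measurement linearly onto a span with a live zero, so converting percent-of-span to either signal is a one-line formula. A minimal sketch (the function names are mine, for illustration only):

```python
def percent_to_ma(pct: float) -> float:
    """Map 0-100% of span onto the 4-20 mA range (live zero at 4 mA)."""
    return 4.0 + 16.0 * pct / 100.0

def percent_to_psi(pct: float) -> float:
    """Map 0-100% of span onto the 3-15 PSI range (live zero at 3 PSI)."""
    return 3.0 + 12.0 * pct / 100.0

for pct in (0, 25, 50, 75, 100):
    print(f"{pct:3d}%  ->  {percent_to_ma(pct):5.1f} mA   {percent_to_psi(pct):5.1f} PSI")
# Both ranges share the same 1:5 ratio between bottom and top of scale
# (4:20 == 3:15 == 1:5), so any percent of span converts identically.
```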
A second serious consideration in the adoption of 4-20 mA was the magnitude of circuit energy involved, which is low enough to be handled through Intrinsic Safety concepts and engineering for use in hazardous areas.
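To give a rough sense of the energy levels involved, here is a back-of-the-envelope worst-case power calculation. The 24 V supply voltage is an assumption on my part (though a common loop supply), and whether a given loop qualifies as intrinsically safe depends on the barriers and certified apparatus, not on this arithmetic alone:

```python
# Worst-case steady-state loop power at full scale, assuming a 24 V supply.
SUPPLY_V = 24.0          # assumed loop supply voltage
FULL_SCALE_A = 0.020     # 20 mA at the top of the range

power_w = SUPPLY_V * FULL_SCALE_A
print(f"Worst-case loop power: {power_w:.2f} W")   # 0.48 W
# Under half a watt is a small enough energy budget that an IS barrier
# limiting voltage and current can keep the circuit below ignition levels.
# A 10-50 mA loop carries 2.5x the current, and so 2.5x the power,
# at the same supply voltage.
```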
The industry effectively adopted 4-20 mA before the standard existed because it could power field instruments over 2 wires, fit the accepted 1:5 ratio, offered a live zero, and could be made intrinsically safe (with the associated apparatus).