What is the difference between calibration range and full scale range?

The full scale range defines the lowest and highest values that can be measured. For example, if you specify a pressure gauge as 0 - 100 bar, the full scale range is 100 bar: the lowest pressure that can be measured is 0 bar and the highest is 100 bar. At the same time, you can calibrate the pressure gauge at any point, or over any range, within the full scale range, depending on the actual process pressure that will be measured. For example, if you have a 0 - 100 bar pressure gauge and you expect the measured process pressure to be about 50 bar, you can calibrate the gauge over a calibration range close to that value, for example 40 - 60 bar.

However, for process pressure gauges the most common practice is to calibrate over the full scale range; calibration over a narrow range within the full scale range is more typical for transmitters than for gauges.
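To make the distinction concrete, here is a minimal Python sketch (not part of the original answer). The PressureGauge class and its full_scale, calibration_range, and is_within_calibration names are hypothetical and only model the relationship described above: the calibration range must sit inside the full scale range, and a process value can be checked against the calibrated span.

```python
from dataclasses import dataclass


@dataclass
class PressureGauge:
    # Hypothetical model of the concepts above, not a real instrument API.
    full_scale: tuple[float, float]          # e.g. (0.0, 100.0) bar
    calibration_range: tuple[float, float]   # e.g. (40.0, 60.0) bar

    def __post_init__(self) -> None:
        lo, hi = self.full_scale
        cal_lo, cal_hi = self.calibration_range
        # The calibration range must lie within the full scale range.
        if not (lo <= cal_lo < cal_hi <= hi):
            raise ValueError("calibration range must fall within the full scale range")

    def is_within_calibration(self, pressure: float) -> bool:
        """True if a process pressure falls inside the calibrated span."""
        cal_lo, cal_hi = self.calibration_range
        return cal_lo <= pressure <= cal_hi


# Example from the text: a 0 - 100 bar gauge calibrated around an expected
# process pressure of about 50 bar.
gauge = PressureGauge(full_scale=(0.0, 100.0), calibration_range=(40.0, 60.0))
print(gauge.is_within_calibration(50.0))   # True  -> inside the calibrated span
print(gauge.is_within_calibration(90.0))   # False -> still measurable, but outside calibration
```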