With the increasing use of energy and its rising cost, the sustained accuracy of energy meters is more important now than ever before, writes Bal Mukund Vyas, DGM of Yadav Measurements, Udaipur (India).
In many countries, the use of energy also has indirect commercial implications: for example, the tax levied on a manufacturing organisation may be estimated from its electricity consumption. Consumption recorded by meters is also used to estimate carbon emissions.
Why does a meter become inaccurate with time? How can the inaccuracy be minimised?
The durability of the voltage and current sensors and of the power supply is an important factor in achieving sustained accuracy in electricity meters. The most popular voltage sensors are resistive potential dividers, because of their low cost, their linearity and their immunity to magnetic fields.
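The divider's role can be illustrated with a short sketch. This is not from the article; the component values (a 1 Mohm mains-facing resistor and a 1 kohm low-side resistor) are assumed, typical examples only.

```python
# Illustrative sketch: a two-resistor potential divider scales the
# 230 V mains down to a level the meter's ADC can sample directly.

def divider_output(v_mains, r_high, r_low):
    """Voltage across the low-side resistor of a two-resistor divider."""
    return v_mains * r_low / (r_high + r_low)

# Assumed values: 1 Mohm high-voltage (mains-facing) resistor and a
# 1 kohm low-side resistor give a ratio of roughly 1/1001.
v_out = divider_output(230.0, 1_000_000.0, 1_000.0)
print(f"{v_out:.3f} V")  # about 0.230 V at the ADC input
```

The meter's firmware multiplies the sampled value by the nominal divider ratio to recover the mains voltage, which is why any drift in the divider resistors feeds straight into the reading.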
This resistive divider is connected directly to the mains voltage being measured, so the mains-facing resistor must be a high-voltage part; typically a high-ohmic, high-voltage metal-glaze resistor is used. Any change in the value of this resistor directly affects the accuracy of the meter. Its resistance can change due to loading, pulsed loading, ESD and environmental cycling. The typical ageing specification of such resistors is shown in Table 1.
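The link between resistor drift and metering error can be sketched as follows. This is an assumed illustration, not the article's own calculation: the meter is calibrated with the nominal divider ratio, so once the mains-facing resistor ages, the indicated voltage is off by the ratio between the aged and calibrated dividers.

```python
# Illustrative sketch: metering error caused by ageing of the
# mains-facing (high-side) divider resistor. Values are assumed.

def ratio(r_high, r_low):
    """Divider ratio: fraction of mains voltage seen by the ADC."""
    return r_low / (r_high + r_low)

def voltage_error_percent(r_high_nominal, r_low, drift_percent):
    """Percent error in the indicated voltage after the high-side
    resistor drifts by drift_percent from its calibrated value."""
    r_high_aged = r_high_nominal * (1 + drift_percent / 100.0)
    # The meter still applies the calibration-time ratio, so the
    # indicated voltage scales by ratio(aged) / ratio(nominal).
    return (ratio(r_high_aged, r_low) / ratio(r_high_nominal, r_low) - 1) * 100.0

# Assumed example: the 1 Mohm resistor drifts up by 0.5 %.
err = voltage_error_percent(1_000_000.0, 1_000.0, 0.5)
print(f"{err:.3f} %")  # about -0.497 %: the meter under-reads
```

Since recorded energy is proportional to voltage, a drift of this size translates almost one-for-one into roughly half a percent of unbilled energy, which is why the ageing figures in Table 1 matter.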
These specifications indicate that a new resistor may change its value by a significant amount after the meter is installed in the field. When a new meter is calibrated during manufacture, the resistor is new; any change in its resistance after calibration increases the error in the energy recorded by the meter. One method of minimising this is to pre-burn the resistors before the meters are manufactured. Current sensors are usually...