Class of accuracy in electrical measurements
In electrical engineering, the class of accuracy is a figure that represents the error tolerance of a measuring device.
Measurement
In electrical engineering, quantities such as current or voltage are measured by devices called ammeters, voltmeters, multimeters, etc. The ammeter is connected in series with the load, so the same current flows through the load and the ammeter. The voltmeter is connected in parallel with the load, so the voltage across the terminals of the load is equal to the voltage across the terminals of the voltmeter. Ideally, the measuring device should not affect the circuit parameters; i.e., the internal impedance of the ammeter should be zero (no voltage drop across the ammeter) and the internal impedance of the voltmeter should be infinite (no current through the voltmeter). In practice, however, ammeters have a low but non-zero impedance and voltmeters have a high but finite internal impedance. Thus the measured quantities are somewhat altered by the measurement itself.
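As a rough illustration of this loading effect, the sketch below computes how a voltmeter with finite internal resistance changes the voltage it is trying to read. The component values (source EMF, series resistance, load, and voltmeter resistance) are assumed purely for illustration and are not taken from the article.

<pre>
# Illustration of voltmeter loading; all component values are assumed.
# A source E with series resistance Rs feeds a load R; a voltmeter with
# finite internal resistance Rv is connected in parallel with the load.

E  = 10.0         # source EMF in volts (assumed)
Rs = 1_000.0      # series/source resistance in ohms (assumed)
R  = 10_000.0     # load resistance in ohms (assumed)
Rv = 1_000_000.0  # voltmeter internal resistance in ohms (assumed)

# True load voltage with an ideal (infinite-impedance) voltmeter
v_true = E * R / (Rs + R)

# With the real voltmeter, R and Rv act in parallel
R_par = R * Rv / (R + Rv)
v_measured = E * R_par / (Rs + R_par)

print(f"true load voltage:     {v_true:.4f} V")
print(f"measured load voltage: {v_measured:.4f} V")
print(f"relative error:        {(v_true - v_measured) / v_true:.4%}")
</pre>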
Example
Let E be the EMF of the source, R the resistance of the load, and r the resistance of the ammeter. The current through the load is I
- <math>I=\frac{E}{R}</math>
When the ammeter is connected in series with the load, the current <math>I_2</math> is
- <math>I_2=\frac{E}{R+r}</math>
The difference introduced by the measuring device, for an ammeter resistance much smaller than the load resistance (<math>r \ll R</math>), is then
- <math>\Delta I = I-I_2 \approx\frac{E\cdot r}{R^2} </math>
The ratio of the difference to the actual value is [1]
- <math>\frac{\Delta I}{I}\approx\frac{r}{R} </math>
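To make the formulas above concrete, the sketch below compares the exact relative error <math>\Delta I / I</math> with the approximation <math>r/R</math> for an assumed EMF, load resistance, and ammeter resistance (the numbers are illustrative only).

<pre>
# Numerical check of the ammeter example above; E, R and r are assumed values.
E = 10.0    # EMF of the source in volts (assumed)
R = 100.0   # load resistance in ohms (assumed)
r = 0.5     # ammeter resistance in ohms (assumed)

I  = E / R          # current without the ammeter
I2 = E / (R + r)    # current with the ammeter in series

delta_I = I - I2
print(f"exact relative error: {delta_I / I:.4%}")   # about 0.4975%
print(f"approximation r / R:  {r / R:.4%}")         # 0.5%
</pre>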
Class of accuracy
Professional measuring devices are labelled with a class of accuracy. This figure is the inherent error of the measuring device expressed as a percentage of the full-scale deflection. For example, a class of accuracy of 2 means an error of up to 2 volts on an instrument whose full-scale reading is 100 volts.[2]
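The arithmetic behind that definition is simple; the hypothetical helper below (not part of any standard library) spells it out. Because the tolerance is referred to full scale, the relative error of a reading grows as the reading moves toward the low end of the scale.

<pre>
def max_error(accuracy_class: float, full_scale: float) -> float:
    """Absolute error tolerance implied by a class of accuracy,
    taken as a percentage of the full-scale value (hypothetical helper)."""
    return accuracy_class / 100.0 * full_scale

# A class 2 voltmeter with a 100 V full-scale range may be off by up to 2 V,
# regardless of where on the scale the actual reading falls.
print(max_error(2, 100))   # -> 2.0
</pre>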