A voltmeter is used to measure a known voltage of 100 V. Forty percent of the readings are within 0.5 V of the true value. Estimate the standard deviation for the meter. What is the probability of an error of 0.75 V?
A step-by-step explanation would help a lot. Thank you.
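One way to work this out, assuming the meter errors are normally distributed with zero mean: 40% of readings within ±0.5 V means Φ(0.5/σ) = 0.70 for the standard normal CDF Φ, which fixes σ; the 0.75 V question is then answered from the same distribution (interpreting it as the probability a reading deviates by at most, or more than, 0.75 V). A sketch using Python's standard library:

```python
from statistics import NormalDist

nd = NormalDist()  # standard normal distribution

# 40% of readings fall within +/-0.5 V of the true value, so
# P(-0.5/sigma <= Z <= 0.5/sigma) = 0.40, i.e. Phi(0.5/sigma) = 0.70.
z = nd.inv_cdf(0.70)   # z such that Phi(z) = 0.70, about 0.524
sigma = 0.5 / z        # estimated standard deviation, about 0.953 V
print(f"sigma = {sigma:.3f} V")

# For a 0.75 V error, compute both interpretations:
z75 = 0.75 / sigma
p_within = 2 * nd.cdf(z75) - 1  # P(|error| <= 0.75 V)
p_exceed = 1 - p_within         # P(|error| >  0.75 V)
print(f"P(|error| <= 0.75 V) = {p_within:.3f}")
print(f"P(|error| >  0.75 V) = {p_exceed:.3f}")
```

This gives σ ≈ 0.953 V, a probability of about 0.57 that a reading lies within 0.75 V of the true value, and about 0.43 that the error exceeds 0.75 V. The same numbers follow from a printed standard-normal table: z = 0.524 for an area of 0.20 on each side of the mean, then z = 0.75/0.953 ≈ 0.787 for the second part.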