The Richter scale converts seismographic readings into numbers that measure the magnitude of an earthquake according to the function M(x) = log(x / x0), where x0 = 10^(-3).
Two earthquakes differ by 0.1 when measured on the Richter scale. How would their seismographic readings differ at a distance of 100 km from the epicenter?
The given answer is that the earthquake of greater magnitude has a seismographic reading 10^0.1 ≈ 1.26 times that of the lesser earthquake, but I have no idea how to get there.
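One way to see it: subtracting the two magnitudes makes x0 drop out, since M(x2) - M(x1) = log(x2/x0) - log(x1/x0) = log(x2/x1). So a difference of 0.1 on the scale means log(x2/x1) = 0.1, i.e. x2/x1 = 10^0.1 ≈ 1.26. A short Python sketch (the reading x1 = 5.0 is an arbitrary illustrative choice; the ratio is independent of it) checks the arithmetic:

```python
import math

# Richter formula M(x) = log10(x / x0), with x0 = 10^(-3)
x0 = 10 ** -3

def magnitude(x):
    """Richter magnitude for a seismographic reading x."""
    return math.log10(x / x0)

# Arbitrary reading for the weaker quake, and one 10^0.1 times larger
x1 = 5.0
x2 = x1 * 10 ** 0.1

print(magnitude(x2) - magnitude(x1))  # ~0.1 (magnitudes differ by 0.1)
print(x2 / x1)                        # ~1.26 (ratio of the readings)
```

Because the magnitudes only differ by the log of the ratio of readings, the 1.26 factor holds at any fixed distance, 100 km included.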