The mean amount of time an (unnamed) computer server is down is 18.3 minutes, with a standard deviation of 5.7 minutes.
Assuming we know nothing about the shape of the data set, at least what percentage of the downtimes will fall between 8.325 and 28.275 minutes?
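
Since nothing is assumed about the distribution's shape, Chebyshev's theorem gives the lower bound. A worked sketch (the symbols \mu, \sigma, and k are standard Chebyshev notation, not part of the original statement; note the interval is symmetric about the mean, since 18.3 - 8.325 = 28.275 - 18.3 = 9.975):

\[
k = \frac{28.275 - \mu}{\sigma} = \frac{28.275 - 18.3}{5.7} = 1.75,
\qquad
1 - \frac{1}{k^2} = 1 - \frac{1}{1.75^2} \approx 0.6735.
\]

So at least about 67.35% of the downtimes lie between 8.325 and 28.275 minutes.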