John was trying to figure out how much his investment might be worth in 30 years. If he invested $250,000 for 30 years at a 10% annual rate, he would have $4,362,351. After thinking it over, John, Mike, and Steven agreed that it is unlikely he could find an investment that pays exactly those terms for 30 years. They felt that if the money were invested in stocks, the return might be higher than 10% in some years and lower in others.

To account for this potential variability in returns, John and his friends came up with a plan: they assume he could find an investment that produces an annual return of 17.5% seventy percent of the time and a return (really, a loss) of -7.5% thirty percent of the time. Mike felt certain this meant John could still expect his $250,000 investment to grow to $4,362,351 in 30 years, since the expected annual return works out to the same 10%: 0.7(17.5%) + 0.3(-7.5%) = 12.25% - 2.25% = 10%.

Steven thought Mike was wrong. He said John would see a 17.5% return in 70% of the 30 years, or 0.7(30) = 21 years, and a -7.5% return in 30% of the 30 years, or 0.3(30) = 9 years. So John should have $250,000(1 + 0.175)^21 (1 - 0.075)^9 = $3,664,467 after 30 years. But that's $697,884 less than what Mike says John should have.

So what do you think? Who is right: Mike, Steven, or neither? And why?
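One way to check the arithmetic, and to see both figures emerge from the same assumptions, is a short simulation. The sketch below (a hypothetical illustration, not part of the original problem) verifies Mike's and Steven's numbers directly, then simulates many random 30-year paths in which each year independently returns +17.5% with probability 0.7 and -7.5% with probability 0.3:

```python
import random
import statistics

def fv(principal, rate, years):
    """Future value of a lump sum compounded at a constant annual rate."""
    return principal * (1 + rate) ** years

principal = 250_000

# Mike's view: a steady 10% every year for 30 years.
mike = fv(principal, 0.10, 30)                     # ~= $4,362,351

# Steven's view: 21 years at +17.5% and 9 years at -7.5%
# (the order of the years doesn't matter; multiplication commutes).
steven = principal * 1.175 ** 21 * 0.925 ** 9      # ~= $3,664,467

# The expected annual return is the same 10% in both views.
expected_rate = 0.7 * 0.175 + 0.3 * (-0.075)       # = 0.10

# Monte Carlo: simulate random 30-year paths under the stated probabilities.
random.seed(1)

def simulate_path():
    value = principal
    for r in random.choices([0.175, -0.075], weights=[0.7, 0.3], k=30):
        value *= 1 + r
    return value

paths = [simulate_path() for _ in range(100_000)]
print(f"Mike's figure:          ${mike:,.0f}")
print(f"Steven's figure:        ${steven:,.0f}")
print(f"mean terminal value:    ${statistics.mean(paths):,.0f}")
print(f"median terminal value:  ${statistics.median(paths):,.0f}")
```

Comparing the simulated mean and median against the two figures is a useful way to think about what each man's calculation actually measures.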