Two algorithms, Alg1 and Alg2, solve a problem of size n. Alg1 runs in n^2 microseconds and Alg2 runs in 100 n log n microseconds. Developing Alg2 requires 4 hours of programming time plus 2 minutes of CPU time. If programmers are paid 20 dollars per hour and CPU time costs 50 dollars per minute, how many times must a problem of size 500 be solved using Alg2 to justify its development cost?
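The arithmetic behind the question can be sketched as follows. This is a minimal sketch, not a definitive solution: it assumes the stated development figures (4 h programming + 2 min CPU) belong to Alg2, that the payoff is the CPU-time saved per run relative to Alg1, and it leaves the logarithm base as a parameter, since the problem does not specify one (with base 2, Alg2 is actually slower than Alg1 at n = 500, so there is no finite break-even).

```python
import math

PROG_RATE = 20         # dollars per programmer-hour (from the problem)
CPU_COST_PER_MIN = 50  # dollars per CPU-minute (from the problem)

def alg1_us(n: int) -> float:
    """Alg1 running time in microseconds: n^2."""
    return float(n ** 2)

def alg2_us(n: int, log=math.log2) -> float:
    """Alg2 running time in microseconds: 100 n log n (log base is an assumption)."""
    return 100 * n * log(n)

def breakeven_runs(n: int, dev_cost: float, log=math.log2):
    """Runs of size n needed for Alg2's per-run CPU savings to cover dev_cost.

    Returns None when Alg2 is not actually faster at this n (no payoff)."""
    saved_minutes = (alg1_us(n) - alg2_us(n, log)) / 1e6 / 60
    saved_dollars = saved_minutes * CPU_COST_PER_MIN
    if saved_dollars <= 0:
        return None
    return math.ceil(dev_cost / saved_dollars)

# Development cost as stated: 4 h programming + 2 min CPU = 80 + 100 = 180 dollars
dev_cost = 4 * PROG_RATE + 2 * CPU_COST_PER_MIN

print(alg1_us(500))                               # 250000.0 microseconds per run
print(alg2_us(500))                               # ~448289 with log base 2: slower!
print(breakeven_runs(500, dev_cost))              # None: base-2 Alg2 never pays off
print(breakeven_runs(500, dev_cost, math.log10))  # finite count with log base 10
```

With a base-10 logarithm, Alg2 runs in about 0.135 s versus Alg1's 0.25 s, saving roughly 9.6 cents of CPU cost per run, so on the order of two thousand runs recover the 180-dollar development cost; the exact count depends entirely on which base the problem intends.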