Assume a network transmits 1024-byte packets, each containing a 128-byte header and a four-byte checksum. A workstation on the network is guaranteed to be able to transmit at least one packet every x time units.
(1) Determine the maximum amount of time, as a function of x, required (based on these factors) to transfer a 3 MB file from a server to a workstation.
(2) Determine the effective transfer rate from the server to the workstation.
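The arithmetic behind both questions can be sketched as follows. This is a hedged sketch under two common textbook assumptions that the problem leaves implicit: the 128-byte header and 4-byte checksum are counted inside the 1024-byte packet (so each packet carries 1024 − 132 = 892 data bytes), and 3 MB means 3 × 2^20 bytes. Under other conventions (header outside the packet, or 1 MB = 10^6 bytes) the packet count changes.

```python
import math

PACKET_SIZE = 1024                    # total packet size in bytes
HEADER = 128                          # header bytes per packet
CHECKSUM = 4                          # checksum bytes per packet
PAYLOAD = PACKET_SIZE - HEADER - CHECKSUM   # 892 data bytes per packet

FILE_SIZE = 3 * 2**20                 # 3 MB, assuming 1 MB = 2^20 bytes

# (1) Each packet carries at most PAYLOAD data bytes, so the file
# needs ceil(FILE_SIZE / PAYLOAD) packets; at one packet every x
# time units, the worst-case transfer time is (packet count) * x.
packets = math.ceil(FILE_SIZE / PAYLOAD)
print(f"packets needed  = {packets}")            # -> 3527
print(f"max time        = {packets} * x time units")

# (2) Effective rate: useful file bytes delivered per elapsed time,
# i.e. FILE_SIZE / (packets * x) bytes per time unit.
rate_per_x = FILE_SIZE / packets
print(f"effective rate  = {rate_per_x:.1f} bytes per x time units")
```

Note that the effective rate (about 892 bytes per x) is below the raw packet rate of 1024 bytes per x, because 132 of every 1024 bytes are header and checksum overhead rather than file data.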