Description: Suppose that the number of instructions executed between page faults is directly proportional to the number of page frames allocated to a program. If the available memory is doubled, the mean interval between page faults is also doubled. Further, consider that a normal instruction takes 1 microsecond, but if a page fault occurs, it takes 2,001 microseconds. If a program takes 60 seconds to run, during which time it incurs 15,000 page faults, how long would it take to run if twice as much memory were available? (A) 60 sec (B) 30 sec (C) 45 sec (D) 10 sec
Answer: C
Explanation: With the original memory, the total runtime is the instruction time plus the page-fault overhead (each fault adds 2,001 − 1 = 2,000 µs on top of the instruction itself):
T = N_instr × 1 µs + 15,000 × 2,000 µs = 60 s
The fault overhead accounts for 15,000 × 2,000 µs = 30,000,000 µs, so
N_instr × 1 µs = 60,000,000 µs − 30,000,000 µs = 30,000,000 µs
N_instr = 30,000,000
The number of instructions does not change when memory is doubled, but the mean interval between page faults doubles, so the program incurs only 7,500 faults:
T = 30,000,000 × 1 µs + 7,500 × 2,000 µs = 30 s + 15 s = 45 s
Doubling the memory therefore doesn't mean that the program runs twice as fast: the runtime drops from 60 s to 45 s, a performance increase of 25%.
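As a sanity check, here is a minimal Python sketch that reproduces the arithmetic above (the variable names are my own illustration, not part of the original question):

US_PER_INSTR = 1          # a normal instruction costs 1 us
US_PER_FAULT = 2000       # extra overhead per page fault: 2,001 - 1 us
TOTAL_US = 60_000_000     # original runtime: 60 s
FAULTS = 15_000           # original page-fault count

# Recover the instruction count from the original run.
n_instr = (TOTAL_US - FAULTS * US_PER_FAULT) // US_PER_INSTR   # 30,000,000

# Doubled memory doubles the fault interval, halving the fault count.
new_runtime_us = n_instr * US_PER_INSTR + (FAULTS // 2) * US_PER_FAULT
print(new_runtime_us / 1_000_000)   # 45.0 seconds -> option (C)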