Suppose that a 10-Mbps wireless station is transmitting 50-byte frames one immediately after the other.
i) How many frames is it transmitting per second?
ii) If the probability of a frame being damaged (having at least one wrong bit) is 0.004, approximately how many frames will be damaged in one hour?



Answer:

The data rate is given in bits, not bytes: 10 Mbps means 10'000'000 bits per second. Each 50-byte frame is 50 × 8 = 400 bits long, so if it were transmitting 1-bit frames the answer would be 10'000'000 frames; because each frame is 400 bits, we must divide 10'000'000 by 400.

Therefore, the answer to part (i) is: 25'000 frames/second

If there are 25'000 frames/second, then there are 90'000'000 frames/hour (we get that by multiplying 25'000 by 3'600, the number of seconds in an hour).

If the probability of a damaged frame is 0.004, then (multiplying 0.004 by 90'000'000) we get the answer to part (ii): 360'000 damaged frames/hour.
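
For anyone who wants to check the arithmetic, here is a minimal Python sketch of the same calculation (illustrative only, not part of the original answer; the variable names are made up):

    # Sketch of the calculation above (illustrative only).
    LINK_RATE_BPS = 10_000_000       # 10 Mbps = 10,000,000 bits per second
    FRAME_SIZE_BITS = 50 * 8         # a 50-byte frame is 400 bits
    ERROR_PROBABILITY = 0.004        # probability that a frame is damaged
    SECONDS_PER_HOUR = 3_600

    frames_per_second = LINK_RATE_BPS / FRAME_SIZE_BITS        # 25,000
    frames_per_hour = frames_per_second * SECONDS_PER_HOUR     # 90,000,000
    damaged_per_hour = frames_per_hour * ERROR_PROBABILITY     # 360,000

    print(f"(i)  {frames_per_second:,.0f} frames/second")
    print(f"(ii) {damaged_per_hour:,.0f} damaged frames/hour")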
A second answer, from AL2006:
8 bits = 1 byte

10 Mbps = 10/8 = 1.25 Mbytes per second

1 frame = 50 bytes = 50 × 8 = 400 bits

Frame rate = 10,000,000 / (8 × 50) = 10,000,000 / 400 = 25,000 frames per second

========================

If each frame has an error probability of 0.004, then 0.004 × 25,000 = 100 frames per second are damaged.

1 hour = 3,600 seconds.

At 100 damaged frames per second, that is 100 × 3,600 = 360,000 damaged frames per hour.
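
The same result follows from AL2006's ordering of the steps (damaged frames per second first, then scaled up to an hour); a short illustrative Python check, not code from the answer:

    # Following AL2006's steps (illustrative check).
    frame_rate = 10_000_000 / (8 * 50)             # 25,000 frames per second
    damaged_per_second = 0.004 * frame_rate        # 100 damaged frames per second
    damaged_per_hour = damaged_per_second * 3_600  # 360,000 damaged frames per hour
    print(damaged_per_second, damaged_per_hour)    # 100.0 360000.0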