kickureface
Having some trouble with this one:
A signal travels via optical fiber from LA to Chicago, 3000 km away. The signal is 10^6 bytes sent at 10^8 bits/sec.
a. How long is the signal, in seconds?
b. If the first signal bit leaves LA at t=0, when does it reach Chicago?
c. When does the last signal bit reach Chicago?
Part a is simple: some division and unit conversion.
Parts b and c throw me off. Am I supposed to know how fast signal bits travel? And how much separation is there between the first and last bits?
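For what it's worth, a sketch of the arithmetic. The key is that b and c need a propagation speed for light in fiber, which the problem presumably supplies; the value below (~2×10^8 m/s, i.e. roughly 2/3 of c) is only the common textbook figure and is an assumption here. Part a is the transmission time (bits divided by rate); part b is pure propagation delay; part c is transmission time plus propagation delay, since the last bit leaves the transmitter only after the whole signal has been clocked out.

```python
# Hypothetical worked numbers for the three parts.
# prop_speed is an ASSUMED textbook value (~2e8 m/s in fiber);
# use whatever speed your course/problem actually specifies.

signal_bits = 10**6 * 8        # 10^6 bytes converted to bits
rate = 10**8                   # link rate in bits/sec
distance = 3000 * 1000         # 3000 km in meters
prop_speed = 2 * 10**8         # m/s in fiber (assumption)

transmit_time = signal_bits / rate              # part a: 0.08 s
prop_delay = distance / prop_speed              # one-way delay: 0.015 s
first_bit_arrival = 0 + prop_delay              # part b: 0.015 s
last_bit_arrival = transmit_time + prop_delay   # part c: 0.095 s

print(transmit_time, first_bit_arrival, last_bit_arrival)
```

So the "separation" between first and last bits isn't extra information you're missing; it's exactly the answer to part a, since the last bit enters the fiber one signal-duration after the first.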