@7:45, the propagation time is distance/speed = 100 km / 1000 km per hour = 0.1 hour = 6 minutes. The transmission time per packet (per car) is 1 minute (given). So the total time for one packet to get from A to B is 6 + 1 = 7 minutes. Previously it took 12 minutes for all the packets to get from A to B, but under the new conditions it takes only 7 minutes for ONE packet to get from A to B. So the answer is YES: packets will actually reach B before all of them have even left A.
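The arithmetic above can be sketched in a few lines of Python (the numbers are the ones given in the comment: 100 km between booths, 1000 km/h car speed, 1 minute of service per car):

```python
# Toll-booth/caravan analogy: a car is a packet, a booth is a router.
distance_km = 100        # distance between the two toll booths
speed_kmph = 1000        # propagation speed of a car (a bit on the link)
transmission_min = 1     # service (transmission) time per car at a booth

# Propagation delay: distance / speed, converted from hours to minutes.
propagation_min = distance_km / speed_kmph * 60   # 0.1 h -> 6 min

# Total delay for one packet: transmission at the booth + propagation.
total_min = propagation_min + transmission_min

print(propagation_min, total_min)  # 6.0 7.0
```

Since 7 minutes is less than the time for the whole caravan to clear the first booth, the first cars reach B while later cars are still at A.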
Yes, this is correct. I noticed the same thing just now. It's 6 minutes (the time taken by one car to travel from the first toll booth to the second) + 1 minute (service time at a booth), which is 7 minutes. But Mr. Kenan said 10 minutes + 1 minute = 11 minutes, which is not correct. I guess that was an oversight on his part, but the analogy itself is correct.
Once you study computer science for a few years, it becomes obvious just how many YouTube tutorial makers have very little idea of what they're talking about. Either that, or they're just copying from someone else.
And, if so, how would you calculate the total bandwidth of the client-to-server connection? Would it be similar to throughput, where Rs would cause the total bandwidth to decrease?
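On the throughput part of the question: in the standard two-link model this course uses, end-to-end throughput is limited by the bottleneck link, i.e. min(Rs, Rc), where Rs is the server-side rate and Rc the client-side rate. A minimal sketch, assuming that model (the example rates here are made up):

```python
def throughput_mbps(rs_mbps: float, rc_mbps: float) -> float:
    """End-to-end throughput of a two-link server->client path.

    The slower of the two links (the bottleneck) limits the rate,
    so a small Rs does indeed drag the whole connection down.
    """
    return min(rs_mbps, rc_mbps)

print(throughput_mbps(2.0, 10.0))  # Rs = 2 Mbps is the bottleneck -> 2.0
```

So yes: if Rs is the smaller rate, it caps the achievable throughput regardless of how fast the client-side link is.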
Unfortunately, I've taken this class twice for my Master's, with two different professors. Between them and this video, it seems like the only way anyone teaches this class is by basically reading the book to you. I can't tell if it's the density of the material or the laziness of the professors, but I haven't found someone who actually teaches the concepts; they just regurgitate the textbook.