I understand the difference between bandwidth, latency, and throughput. What I wanted to ask is: why does bandwidth have time in its unit?
I get that if packets were cars on a highway, latency would be how long each car takes to reach the other end (set by the speed of the lanes), bandwidth would be the number of lanes, and throughput would be the number of cars that actually got through in a given timespan.
In the hose-and-water analogy, bandwidth would be the diameter of the hose. But with networks, bandwidth isn't just a number of 'lanes' or a particular size; it's a quantity per unit of time (Gb/s, Mb/s, etc.).
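To make my confusion concrete, here's a quick sketch (Python, with made-up numbers) of how I'd compute each quantity:

```python
# Hypothetical numbers for illustration only.
link_bandwidth_bps = 1_000_000_000   # advertised link capacity: 1 Gb/s

bytes_transferred = 600_000_000      # 600 MB actually moved...
elapsed_seconds = 8.0                # ...over 8 seconds

# Throughput: what actually got through, per unit of time.
throughput_bps = (bytes_transferred * 8) / elapsed_seconds
print(f"throughput = {throughput_bps / 1e9:.2f} Gb/s")  # 0.60 Gb/s

# Both quantities come out in bits per second, which is exactly
# what confuses me: the "size of the pipe" and the "water that
# actually flowed" share the same unit.
```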
Why is this? And if both come out in bits per second, how is bandwidth not just throughput?