09-19-2025 07:59 AM - edited 09-19-2025 08:00 AM
I have a question I'm hoping somebody can explain in layman's terms for me regarding the inner workings of line rate speeds.
I've attached a very basic drawing of an example camera network. A PoE camera connects via Ethernet to a local switch at 1 Gbps and passes 3 Mbps worth of data > the local switch connects to a distro switch over a 1 Gbps trunk > the distro connects back to the core via a 10 Gbps trunk, and recording servers also connect to the core with 10 Gbps links.
Now, as that 3 Mbps camera passes data back to the recorder, each time it goes to the next switch, I would assume that the speed gets faster - my understanding is line rate is the fastest maximum speed at which data can pass. So 1 Gbps > 1 Gbps > 10 Gbps > 10 Gbps in this example. But what happens on the way back to a viewing client when the speeds go from 10 Gbps down to 100 Mbps? Assuming the client is only viewing a handful of cameras and not overtaxing the connection, is there going to be buffering as the speed drops at each connection?
In my head, I'm imagining the 10 Gbps links as a major highway, the 1Gbps link as an exit from that and then the 100 Mbps as a smaller road off of that. If that is an appropriate illustration, then what happens at these exits? Is there buffering by dropping to links with slower speeds? Internally, how does the switch handle these transitions? Ultimately, I'm just trying to better understand what's happening in the background of my network.
Any help would be greatly appreciated. Thank you
09-19-2025 12:45 PM - edited 09-20-2025 06:30 AM
In your specific case, assuming none of the interfaces are otherwise congested, the video stream may effectively flow at 3 Mbps, both up and down.
For "up", if the camera is generating a 3 Mbps stream, it will effectively flow at 3 Mbps.
For "down", server could transmit it at 10g, but if a video client app is involved, the client is likely interacting with the server, and having the server effectively send the video stream at 3 Mbps, or possibly less (if a lower resolution has been requested).
However, if the video client just wants a particular chunk of the (pre-recorded) video stream, then it would likely be sent at effectively 100 Mbps (the speed of the slowest link toward the client).
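A back-of-the-envelope sketch of those two cases, using assumed numbers (a hypothetical 60-second recorded clip at 3 Mbps, pulled over an otherwise idle 100 Mbps client link):

```python
# Rough sketch: live viewing vs. bulk download of the same video.
# Assumed numbers (not from the original post): a 60-second recorded clip
# encoded at 3 Mbps, pulled over an otherwise idle 100 Mbps client link.

stream_mbps = 3.0
clip_seconds = 60
clip_megabits = stream_mbps * clip_seconds     # 180 Mb of video data

# Live viewing: the server paces delivery at the encode rate, so watching
# 60 seconds of video takes ~60 seconds no matter how fast the links are.
live_view_seconds = clip_seconds

# Bulk download of the pre-recorded chunk: limited by the slowest link in
# the path, which here is the client's 100 Mbps connection.
client_link_mbps = 100.0
bulk_download_seconds = clip_megabits / client_link_mbps

print(f"Live viewing:  ~{live_view_seconds} s (paced at {stream_mbps} Mbps)")
print(f"Bulk download: ~{bulk_download_seconds:.1f} s (fills the 100 Mbps link)")
```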
As a side note, laymen often misunderstand bandwidth. Using your highway metaphor, all the highways have the same speed limit. However, some highways allow vehicles with greater carrying capacity, so vehicles with greater carrying capacity can deliver more cargo in less time.
Your 3 Mbps camera stream, on a gig link, is carried by a vehicle with gig capacity, but uses only 0.3% of that vehicle's capacity. When it transfers to a 10g link, it consumes only 0.03% of the 10g capacity.
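If it helps to see the arithmetic, here is a minimal sketch using the example numbers above:

```python
# Link utilization of a 3 Mbps camera stream (numbers from the example above).
stream_mbps = 3.0

for link_name, link_mbps in [("1 Gbps", 1_000), ("10 Gbps", 10_000)]:
    utilization_pct = stream_mbps / link_mbps * 100
    print(f"{link_name} link: {utilization_pct:.2f}% used by the 3 Mbps stream")
# Prints roughly 0.30% and 0.03%.
```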
Effectively, the (live) 3 Mbps video data stream would take about the same time to record on the server across 10 Mbps, 100 Mbps, 1 Gbps, or 10 Gbps links.
However, pre-recorded video, if transferred as quickly as possible (not being watched in real time), would take one-tenth the time across 100 Mbps vs. 10 Mbps; gig would take 1/100 the time, and 10g would take 1/1000 the time to transfer.
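A rough sketch of that scaling (the 1 GB clip size is just an assumed, illustrative number):

```python
# Bulk transfer time vs. link speed for a pre-recorded clip.
# Assumed number (not from the original post): a 1 GB clip, i.e. ~8,000 megabits,
# moved over an otherwise idle path.
clip_megabits = 8_000.0

for link_name, link_mbps in [("10 Mbps", 10), ("100 Mbps", 100),
                             ("1 Gbps", 1_000), ("10 Gbps", 10_000)]:
    seconds = clip_megabits / link_mbps
    print(f"{link_name}: ~{seconds:,.1f} s")
# 800 s, 80 s, 8 s, 0.8 s -- each 10x in bandwidth cuts the transfer time by 10x.
```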
If you use telnet (literally) from one side of the world to the other, then while typing, unless you're a very fast typist, you will see little difference between running across 64 Kbps vs. 10g end-to-end. However, if you do something that commands a display of lots of output, you should see a huge difference.
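For a feel of why interactive typing barely notices bandwidth, here is a rough sketch; the packet and output sizes are assumed round numbers, and propagation delay is ignored:

```python
# Serialization time only -- ignores propagation delay, which is what actually
# dominates a single keystroke on a round-the-world path.
# Assumed sizes (illustrative only): a 64-byte telnet keystroke packet and a
# 1 MB burst of command output.

def serialization_ms(size_bytes: float, link_bps: float) -> float:
    """Time to clock size_bytes onto a link of link_bps, in milliseconds."""
    return size_bytes * 8 / link_bps * 1000

keystroke_bytes = 64
big_output_bytes = 1_000_000

for link_name, link_bps in [("64 Kbps", 64_000), ("10 Gbps", 10_000_000_000)]:
    print(f"{link_name}: keystroke ~{serialization_ms(keystroke_bytes, link_bps):.5f} ms, "
          f"big output ~{serialization_ms(big_output_bytes, link_bps):.1f} ms")
# Keystroke: ~8 ms vs. ~0.00005 ms -- both lost in the noise next to round-trip latency.
# Big output: ~125,000 ms (over two minutes) vs. ~0.8 ms -- a huge difference.
```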