09-21-2007 05:26 AM - edited 03-05-2019 06:37 PM
Hi all,
I am doing some basic calculations for the time it should take to transfer a file over my network. Apparently it's been a while since I had to do math, as I am doubting myself!
I just wondered how long it would take to transfer my 90 GB file over a network connection that I knew to be 800 Mbps.
So I did the following:
1 Mbps = 0.125 MBps
800 Mbps * 0.125 = 100 MBps
So that's 100 MB a second, correct?
1000 MB = 1 GB
90,000 MB = 90 GB
At that rate it should take roughly 900 seconds to transfer the 90 GB file, or 15 minutes. Is this right?
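Here is the same arithmetic as a minimal Python sketch (the variable names are my own, and it uses decimal units throughout, i.e. 1 GB = 1000 MB):

```python
# Ideal (no-overhead) transfer time for a 90 GB file over an 800 Mbps link.
file_size_gb = 90
link_mbps = 800

file_size_mb = file_size_gb * 1000       # 90 GB -> 90,000 MB (decimal units)
link_mbytes_per_s = link_mbps * 0.125    # 800 Mbps -> 100 MBps (8 bits per byte)

seconds = file_size_mb / link_mbytes_per_s
print(f"{seconds:.0f} s ({seconds / 60:.0f} minutes)")  # 900 s (15 minutes)
```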
TIA,
R
09-21-2007 07:20 AM
Your basic calculations are correct for the time to transmit 90 GB across 800 Mbps. In practice, due to frame/packet overhead, and depending on how well the TCP stack deals with the BDP (bandwidth-delay product), the actual transmission time might be considerably longer than the calculated value.
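As a rough illustration of the overhead point, the same calculation can be repeated with a protocol-efficiency factor. The 94% figure below is an assumption (a common ballpark for TCP/IPv4 over Ethernet with 1500-byte frames), not a measured value; the real number depends on the MTU, TCP options, and how the stack handles the BDP:

```python
# Same estimate with an assumed protocol-efficiency factor.
# ~94% is an assumed ballpark for TCP/IPv4 over standard Ethernet frames;
# actual efficiency varies with MTU, TCP options, and BDP handling.
efficiency = 0.94
effective_mbytes_per_s = 800 * 0.125 * efficiency   # ~94 MBps instead of 100

seconds = 90_000 / effective_mbytes_per_s
print(f"{seconds:.0f} s ({seconds / 60:.1f} minutes)")  # ~957 s (~16.0 minutes)
```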
09-21-2007 07:32 AM
Thanks for your reply. I do realize that this is a perfect-world scenario with no overhead and such. I know from actual transfers that it never works out this way in practice, but just for the sake of the math, I gave it a go! Thanks again.