Calculation for max TCP throughput for a given RTD?
11-19-2003 01:49 AM - edited 03-02-2019 11:49 AM
Hello,
I have been searching for a while for a formula that tells me the time needed to download a file of a given size from a remote server.
I know it depends on the TCP window size, the RTD, and packet loss.
But what is the equation to calculate the download time?
Here are the facts:
- File size: 200 Mb
- RTD to server: 250 ms
- Access-line speed: 2 Mbps
- TCP window size: 65 KB
- No packet loss
How do I calculate how long it will take before the file is local?
Thanks,
Diederik
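
For reference, here is the standard back-of-the-envelope calculation, as a sketch under stated assumptions: a single TCP stream with no loss can have at most one window of unacknowledged data in flight per round trip, so throughput is capped by the window size divided by the RTD, and also by the line rate, whichever is lower.

$$\text{throughput} = \min\!\left(\frac{\text{window size}}{\text{RTD}},\ \text{line rate}\right) = \min\!\left(\frac{65535 \times 8\ \text{bit}}{0.25\ \text{s}},\ 2\ \text{Mbps}\right) \approx \min(2.10,\ 2.00)\ \text{Mbps} = 2\ \text{Mbps}$$

$$t_{\text{download}} = \frac{\text{file size}}{\text{throughput}} = \frac{200 \times 8\ \text{Mbit}}{2\ \text{Mbps}} = 800\ \text{s} \approx 13.3\ \text{min}$$

Here the 65 KB window is assumed to mean 65535 bytes (the TCP maximum without window scaling), and the 200 Mb file is assumed to mean 200 megabytes; if it means 200 megabits, the time drops to 100 s. Since the window-limited rate (about 2.10 Mbps) slightly exceeds the line rate, the 2 Mbps access line, not the window, is the bottleneck in this case.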
12-15-2003 03:31 PM
Below are links to help you understand TCP/IP and the methods used to calculate throughput.
- Understanding TCP/IP
- Network Performance: http://www.astro.caltech.edu/~pls/papers/acts-report/node10.html
- Traffic Measurement Analysis: http://www.soi.wide.ad.jp/class/20020032/materials_for_student/07/kjc-day2.pdf
- Using TTCP to test throughput
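
A minimal executable version of the same calculation, as a sketch (the unit assumptions, 65 KB taken as 65535 bytes and 200 Mb taken as megabytes, are mine, not from the thread):

```python
# Sketch: estimate max single-stream TCP throughput and download time
# from window size, RTD, and line rate. Assumes no packet loss and no
# TCP window scaling; the units below are assumptions, not from the thread.

window_bytes = 65535        # 65 KB window, taken as 65535 bytes
rtd_s = 0.250               # round-trip delay: 250 ms
line_bps = 2_000_000        # access-line speed: 2 Mbps
file_bytes = 200 * 10**6    # 200 Mb file, assumed to mean 200 megabytes

# At most one full window can be in flight per round trip.
window_limited_bps = window_bytes * 8 / rtd_s

# Effective throughput is the lower of the window limit and the line rate.
throughput_bps = min(window_limited_bps, line_bps)

download_s = file_bytes * 8 / throughput_bps

print(f"Window-limited throughput: {window_limited_bps / 1e6:.2f} Mbps")
print(f"Effective throughput:      {throughput_bps / 1e6:.2f} Mbps")
print(f"Estimated download time:   {download_s:.0f} s ({download_s / 60:.1f} min)")
```

Run as-is, this prints roughly 2.10 Mbps window-limited, 2.00 Mbps effective, and about 800 s (13.3 min). The window (65535 bytes) is just above the line's bandwidth-delay product (2 Mbps × 0.25 s = 62500 bytes), which is why the line rate, not the window, ends up as the limiting factor here.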
