05-19-2004 12:48 AM - edited 03-02-2019 03:47 PM
In an FTP server environment where the servers are meant to be accessed by external users downloading applications.
Logically, a show interface s0/0 should show an input rate lower than the output rate. However, I encountered an output rate lower than the input rate, which does not make sense in my environment.
Does anyone have any clue what has gone wrong?
05-19-2004 04:49 AM
I am not clear whether you really mean rate or whether you are talking about the packet and byte counters.
If you do mean rate, then consider that it is a measurement over a fairly short period of time, and it is possible that incoming requests have been high while the outgoing responses have not (possibly invalid requests, congestion on the server, or something else slowing its responses).
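For context on why the rate can look skewed over a short window: the rates in show interface are not instantaneous readings but decayed averages over a load interval (5 minutes by default), recomputed from the byte counters. A minimal illustrative sketch of that kind of exponentially weighted average follows; the function name and the 5-second sampling in the example are assumptions for illustration, not Cisco's exact implementation:

```python
import math

def ewma_rate(prev_rate_bps, bytes_delta, interval_s, time_constant_s=300.0):
    """Update an exponentially decayed bit-rate estimate.

    prev_rate_bps:   previous smoothed rate (bits/sec)
    bytes_delta:     bytes counted since the last sample
    interval_s:      seconds elapsed since the last sample
    time_constant_s: decay constant (300 s, matching a 5-minute load interval)
    """
    instant_bps = bytes_delta * 8 / interval_s       # raw rate this interval
    decay = math.exp(-interval_s / time_constant_s)  # weight given to the old value
    return prev_rate_bps * decay + instant_bps * (1 - decay)

# A short burst followed by quiet: the smoothed rate rises, then decays slowly,
# so the displayed rate lags well behind what the link is doing right now.
rate = 0.0
for delta in [1_000_000, 1_000_000, 0, 0, 0]:  # bytes per 5-second sample
    rate = ewma_rate(rate, delta, 5.0)
```

Because the displayed value lags the traffic like this, a brief imbalance between requests and responses can keep the two directions' rates apart for several minutes.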
I wonder if you are really asking about the input and output counters. These accumulate over time.
How long has the router been running, and when, if ever, have the counters on the interface been reset?
One possible explanation of what you are seeing is that the output counter reached its maximum and rolled over.
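If the counters turn out to be the concern, note that 32-bit interface counters wrap at 2^32, so a delta taken naively across a wrap looks as if the output direction fell far behind. A small sketch of wrap-tolerant delta arithmetic (the function name is hypothetical, for illustration):

```python
COUNTER_MAX = 2 ** 32  # 32-bit interface/SNMP counters wrap at this value

def counter_delta(old, new, modulus=COUNTER_MAX):
    """Bytes counted between two samples, tolerating a single counter wrap."""
    return (new - old) % modulus

# Naive subtraction goes hugely negative across a wrap; the modular delta
# recovers the true count, assuming at most one wrap between samples.
old, new = COUNTER_MAX - 100, 50
naive = new - old            # nonsense negative value
wrapped = counter_delta(old, new)  # 150 bytes actually counted
```

This only holds if samples are taken often enough that the counter cannot wrap more than once between them.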
05-19-2004 05:16 PM
hi
I mean the data rate (bits per second). The packet and byte counters are cumulative, whereas the data rate is near real time.
The output bit rate is constantly lower than the input bit rate. Are you suggesting that congestion on the servers, and the resulting retransmission requests, is causing this abnormal observation?
Thanks