Monitoring the circuit with MRTG or a similar tool would be a good way of doing this. To work out the average utilization, you could do the following: Average_Input = Total_Input (in BITS) / Time (in SEC), and Avg_Out = Total_Out (BITS) / Time (SEC).
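The formula above can be sketched as a small helper, keeping in mind that interface counters report bytes while rates are quoted in bits (the function name and example numbers are mine, not from any monitoring tool):

```python
def avg_bps(total_bytes: int, elapsed_seconds: int) -> float:
    """Average rate in bits per second from a byte counter and elapsed time."""
    total_bits = total_bytes * 8        # counters are in bytes; 1 byte = 8 bits
    return total_bits / elapsed_seconds

# Example: 1 GB transferred over one hour works out to roughly 2.2 Mbps
print(avg_bps(1_000_000_000, 3600))
```

The same function works for the output direction; just feed it the output byte counter instead.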
Last clearing of "show interface" counters 11w3d
Input queue: 0/75/0 (size/max/drops); Total output drops: 2051
Queueing strategy: weighted fair
Output queue: 0/1000/64/2046 (size/max total/threshold/drops)
Conversations 0/81/256 (active/max active/max total)
Reserved Conversations 0/0 (allocated/max allocated)
30 second input rate 33000 bits/sec, 3 packets/sec
30 second output rate 7000 bits/sec, 2 packets/sec
53836304 packets input, 2893045321 bytes, 0 no buffer
Received 0 broadcasts, 1384 runts, 3 giants, 0 throttles
1424 input errors, 2 CRC, 0 frame, 0 overrun, 0 ignored, 16 abort
58226253 packets output, 1732843968 bytes, 0 underruns
Avg_Input = 2893045321 bytes / 11w3d
Avg_Input = 23144362568 bits / 80 days (2893045321 bytes x 8 bits per byte), so
Avg_Input = 23144362568 bits / 6912000 seconds... Therefore
Avg_Input = 3348 bits-per-second (bps) (Hehe, this is on a T3; the customer is paying us for a T3 they use at an average of 3.3 kbps!)
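The arithmetic above can be reproduced directly from the counters in the output, a minimal sketch (variable names are mine):

```python
total_bytes = 2893045321            # "53836304 packets input, 2893045321 bytes"
total_bits = total_bytes * 8        # counters are in bytes; 1 byte = 8 bits

days = 11 * 7 + 3                   # uptime "11w3d" = 80 days
seconds = days * 24 * 60 * 60       # 6,912,000 seconds

avg_input_bps = total_bits / seconds
print(f"{total_bits} bits / {seconds} s = {avg_input_bps:.0f} bps")
```

This prints about 3348 bps, matching the figure above.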
This is purely an average. They could have been idle for 11 weeks and 2 days and then passed all of the traffic in the last day. Without historical data there is no way of knowing...
At first, I was wondering where you got the value "23144362568", as it's not part of the output. Then I realized it is in bits.
The original output is in bytes, and 1 byte equals 8 bits, so we need to multiply the byte count by 8 to get a value in bits:
2893045321 bytes x 8 = 23144362568 bits.
But if you're too lazy to calculate by hand, an online converter will do it for you.