Hi,
when we issue the command "show int gig 1/0", the output includes this line:
MTU 1500 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Now, gig 1/0 is a 1 Gigabit Ethernet interface,
which means it can send 1 billion bits per second (1,000,000,000 bits/s).
The interface delay is 10 microseconds, and a microsecond is 10^-6 seconds,
so 10 usec is 10^-5 seconds. If you divide one second by this number,
1/10^-5 = 100,000 bits/s (100 kbit/s),
which does not match the actual interface speed,
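Just to spell out the arithmetic I am doing, here is a quick sketch in Python. The two input values are taken straight from the show interface line above; treating the reciprocal of the delay as a bit rate is my own assumption, and that assumption is exactly what I am asking about.

# values copied from the "show int gig 1/0" output above
bw_bits_per_sec = 1_000_000 * 1_000   # BW 1000000 Kbit/sec -> 1,000,000,000 bits/s
delay_sec = 10e-6                     # DLY 10 usec -> 1e-5 seconds

# my (assumed) interpretation: invert the delay and read it as a rate
reciprocal_of_delay = 1 / delay_sec   # 1 / 1e-5 = 100,000 per second
print(reciprocal_of_delay)            # 100000.0 -- nowhere near 1,000,000,000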
So what does the interface delay value actually mean?