Use of I/G bit and U/L bit in MAC address

Experts,

I'd like to get your thoughts regarding the MAC address I/G bit and U/L bit.

I understand that I/G bit = 0 means unicast and I/G bit = 1 means multicast/broadcast.

What is the use of that?

1) Does the switch solely look at that bit, and if it is "1", flood the frame out of all ports?

How about end devices?

2) What is the use of knowing the U/L bit for a switch or end device? How is it going to impact the forwarding decision?

Thanks in advance

3 Replies

Joseph W. Doherty
Hall of Fame

Possibly difficult to fully appreciate today, but when Ethernet standards, like MAC addressing, were being worked out, hardware capabilities and performance were much, much more limited, which led to design specifications that made it easier for the hardware/software of the time to process data.

So, for #1, if a switch can make a processing decision based on a single bit, it usually does.  Whether a current switch or host needs this would depend on the device and its developers.

For #2, it's unlikely a device "cares", at least regarding normal forwarding decisions.  (From what I was just reading, the local bit is often not actually set on locally assigned addresses, so since it cannot be counted on for accuracy, it's probably just ignored.)
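For reference, here's a minimal sketch (Python, purely my own illustration; the function name and sample addresses are made up) of how those two bits are read out of the first octet of a MAC address:

# I/G is the least significant bit of the first octet; U/L is the next bit up.
def mac_bits(mac: str) -> dict:
    first_octet = int(mac.split(":")[0], 16)
    return {
        "group": bool(first_octet & 0b01),  # I/G: 1 = multicast/broadcast, 0 = unicast
        "local": bool(first_octet & 0b10),  # U/L: 1 = locally administered, 0 = OUI-assigned
    }

print(mac_bits("01:00:5e:00:00:01"))  # {'group': True, 'local': False}  IPv4 multicast MAC
print(mac_bits("ff:ff:ff:ff:ff:ff"))  # {'group': True, 'local': True}   broadcast
print(mac_bits("02:42:ac:11:00:02"))  # {'group': False, 'local': True}  locally administered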

Hi @Joseph W. Doherty, you made a good point about the evolution of technology. I can see why they have that bit. Thanks for sharing your thoughts!

BTW, as to using bits for forwarding processing, consider IPv4's classful address scheme: if the first bit is a zero, it's a Class A network; if not, and the second bit is a zero, it's a Class B network; if not, and the third bit is a zero, it's a Class C network.
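To make that concrete, a quick sketch (Python, just as an illustration; the function name is mine) of that leading-bit test on the first octet:

def ipv4_class(first_octet: int) -> str:
    if (first_octet & 0b10000000) == 0:   # leading bit 0    -> Class A
        return "A"
    if (first_octet & 0b01000000) == 0:   # leading bits 10  -> Class B
        return "B"
    if (first_octet & 0b00100000) == 0:   # leading bits 110 -> Class C
        return "C"
    return "D/E"                          # multicast / reserved ranges

print(ipv4_class(10), ipv4_class(172), ipv4_class(192))  # A B C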

I recall the first computer I learned machine programming on, back in the late 70's, was the DEC PDP-8, an architecture from the 60's.  To examine individual bits, you had to load up the accumulator and rotate the bits, as only the link register bit could be tested for zero or one status.  The 802.3 standard was first published in 1983, and since it may have been worked on in the late 70's, it could very well have considered architectures like the PDP-8's in its design specifications.
