02-27-2012 10:18 PM - edited 07-03-2021 09:40 PM
Hi,
Just a basic query. I know the difference between the 802.11a and 802.11g standards, but one point confuses me.
Both standards are based on OFDM and provide a maximum data rate of 54 Mbps. Their features are almost identical; the only difference is the frequency band they operate in, i.e. 5 GHz and 2.4 GHz, respectively.
My question is: if they have so many similarities, why are there two different standards, a and g? Is it only because the frequency bands are different?
Please clarify this doubt for me.
02-28-2012 01:05 AM
The difference is just the bands. Keep in mind that you have only 3 non-overlapping channels on 2.4 GHz, while you have 12 on 5 GHz in the US; that count varies by country. Also, the lower the frequency, the less prone the signal is to attenuation, which is why 2.4 GHz gives you more coverage.
Sent from Cisco Technical Support iPhone App
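The coverage point above can be quantified with the standard free-space path-loss formula, FSPL = 20·log10(4πdf/c). A minimal sketch (the 10 m distance is just an illustrative assumption, not from the thread):

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Compare the two bands at the same distance (assumed 10 m):
loss_24ghz = fspl_db(10, 2.4e9)  # ~60.0 dB
loss_5ghz = fspl_db(10, 5.0e9)   # ~66.4 dB
print(round(loss_5ghz - loss_24ghz, 1))  # ~6.4 dB extra loss at 5 GHz
```

The gap depends only on the frequency ratio (20·log10(5.0/2.4) ≈ 6.4 dB), so at any distance the 5 GHz signal arrives roughly 6.4 dB weaker than 2.4 GHz under free-space conditions, which matches the answer's point that lower frequencies give more coverage.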