Hi wireless folks,
We know that the actual frequency range needed for the transmitted signal is known as the signal bandwidth. What determines the bandwidth of a signal? It seems like it depends on the physical nature of the wave itself: the longer the wavelength, the more frequency range would be needed to transmit that particular signal. If so, the question then becomes: what determines the wavelength?
Another possible answer would be the transmission type:
- DSSS requires 22MHz channel width
- OFDM requires 20MHz channel width
Not sure which one is correct, any ideas?
You are touching on some pretty complicated topics and I highly recommend the CWNA textbook for some in-depth background reading.
I had written some stuff regarding wavelengths and some of their properties, but then decided to keep this a bit simpler if possible.
You are trying to work out the bandwidth, and really there are a few simple factors involved.
1) Frequency - There are two frequency bands used in wireless, 2.4GHz and 5GHz. 2.4GHz can reach further (longer wavelength), but 5GHz offers a cleaner signal and can reach higher speeds.
2) Channel Width - Both bands can do varying channel widths. Ignoring DSSS, which is a legacy modulation type, OFDM uses 20MHz for all the common protocols (802.11g, 802.11n, 802.11ac). By default each network is 20MHz wide, but in the 2.4GHz band you can bond two channels together to form a 40MHz channel. This is not advisable, as there is not much room in that band and you basically end up 'talking' over each other, so performance is terrible. 5GHz can bond 2, 4 or even 8 channels to get very high speeds, but most people only bond 2 (40MHz), as that offers high speeds whilst not overlapping with anyone else and causing headaches.
3) Protocol - Each protocol offers its own rates, as they are technologies in themselves and require support from both the AP and the device using the network. 802.11n is the most common and is available on the 2.4 and 5GHz bands. 802.11ac is the latest protocol and is 5GHz only. This protocol should be utilised wherever possible!
4) Half Duplex - Whatever throughput/bandwidth figures are put up by vendors, e.g. 1.3 Gigabit for 802.11ac or even 6.9 Gigabit for 802.11ac wave 2, are immediately halved because wireless is a shared, half-duplex medium. In reality speeds rarely get close to that, but with 802.11ac wave 1 you can at least achieve speeds in excess of 500Mbit/s, which is pretty damn good.
5) Density - One of the key factors with wireless, as AP, interferer and client density all contribute to the overall 'cell' capability and, together with the factors above, determine the available bandwidth in each area.
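To make the half-duplex point concrete, here's a quick back-of-the-envelope sketch in Python. The PHY rate used (1300 Mbit/s for 802.11ac wave 1) is just the illustrative vendor figure from point 4, and the simple halving factor is a rule of thumb, not an exact model:

```python
# Rough estimate of usable Wi-Fi throughput from an advertised PHY rate.
# The halving factor is a rule of thumb for the shared, half-duplex
# medium; real results depend on density, interference, etc.

def usable_throughput(phy_rate_mbit, half_duplex_factor=0.5):
    """Halve the advertised rate to account for half duplex."""
    return phy_rate_mbit * half_duplex_factor

# 802.11ac wave 1 advertised at ~1300 Mbit/s:
print(usable_throughput(1300))  # 650.0 Mbit/s upper bound
```

In practice you'd land somewhere below that ceiling, which lines up with the ~500Mbit/s real-world figure mentioned above.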
That doesn't really answer your question fully but may provide an insight into the complexity of the topic you're bringing up?
Thank you Ric for the detailed information. As you mentioned, OFDM uses 20MHz for all the common protocols, but why was it 20MHz? Can you dig more into that? 40MHz came from channel aggregation, which I would treat as an effect of the enhancement technology. But I want to understand why it defaults to 20MHz, not 10MHz or 30MHz, what was behind it?
Appreciate your sharing in advance!
Sorry I think my wording was a bit poor there.
Wireless is only available in specific increments, which used to be 20MHz or 40MHz, but 802.11ac expanded this to 80 or 160MHz. Note, all channels are around the 20MHz mark, so when a larger increment is used it's actually bonding two or more channels together.
802.11n - 20 or 40MHz
802.11ac - 20, 40, 80 or 160MHz
So obviously 10 and 30MHz aren't available options for wireless! 20MHz will always be the default, because using anything wider could impact your network until you know you have the channels available. The 5GHz band has approximately 23 channels available for use, depending on what country you are in. If you bond to 40MHz this immediately drops to 11, and if you go to 80 or 160MHz you start severely limiting your network if there is high density. This infographic is quite useful:
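The drop in channel count as you bond wider is just integer division. A small sketch, using the ~23-channel figure above (which varies by regulatory domain):

```python
# How channel bonding shrinks the pool of non-overlapping 5GHz channels.
# base_channels=23 is the approximate count cited above; the exact
# number depends on your country's regulatory domain.

def bonded_channel_count(base_channels=23, width_mhz=20):
    """Each bonded channel consumes width_mhz / 20 base 20MHz channels."""
    return base_channels // (width_mhz // 20)

for width in (20, 40, 80, 160):
    print(f"{width}MHz: {bonded_channel_count(width_mhz=width)} channels")
# 20MHz: 23, 40MHz: 11, 80MHz: 5, 160MHz: 2
```

So at 160MHz you're down to only a couple of non-overlapping channels, which is why wide bonding gets painful in high-density deployments.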