I have a Cisco 5520 controller with more than 300 access points joined to it. I found that most of the access points (around 250 out of 300) are running at the maximum power level (level 1) on the 5 GHz band only, while on the 2.4 GHz band the power levels are within normal range. Power is set to auto.
I checked random clients attached to those access points and found that they are getting a very good signal, with RSSI better than -60 dBm and SNR above 30 dB.
Coverage hole detection values are all set to the default of -80 dBm.
Any idea what might cause the access points to radiate at power level 1 on the 5 GHz band only?
So far (this is a new implementation) nothing has been reported in terms of slowness or bad performance, but it is a bit confusing to me why the access points are working at the highest power level, given that the distance between access points is not that great.
First question I have is: what wireless requirements was this designed to? Was there a requirement for -60 dBm RSSI?
When I design wireless, I normally design at power levels around 25 mW (based on the clients I am using). I then use RF profiles to limit the power the APs can auto-tune to, to one or two levels above and below this, after I have verified the deployment.
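To make the RF-profile idea concrete: Cisco APs step their transmit power in roughly 3 dB increments per power level (level 1 is max, and each higher level roughly halves the power). A small sketch, assuming a 25 mW design target and a window of two levels either side (the function names and the 3 dB step are illustrative assumptions, not controller commands):

```python
import math

def mw_to_dbm(mw):
    """Convert milliwatts to dBm: dBm = 10 * log10(mW)."""
    return 10 * math.log10(mw)

def power_window(design_mw, levels_around=2, step_db=3):
    """Hypothetical helper: given a design target in mW, return the
    (min, max) dBm window an RF profile would allow if auto-tuning is
    limited to `levels_around` power levels above and below the target,
    assuming ~3 dB per Cisco power level."""
    design_dbm = mw_to_dbm(design_mw)
    return (design_dbm - levels_around * step_db,
            design_dbm + levels_around * step_db)

lo, hi = power_window(25, levels_around=2)
print(f"Design target: {mw_to_dbm(25):.1f} dBm")   # ~14.0 dBm
print(f"RF profile window: {lo:.1f} to {hi:.1f} dBm")  # ~8.0 to ~20.0 dBm
```

So a 25 mW (about 14 dBm) design with a two-level window would map to TX power min/max of roughly 8 and 20 dBm in the RF profile.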
DCA and TPC will be setting the power to power level 1, as the APs are not seeing their 5 GHz neighbours close enough.
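The neighbour dependency is the key point here. As a rough sketch of TPC v1 behaviour (simplified, not the controller's exact algorithm): the AP aims for its third-loudest neighbour to hear it at the TPC threshold (default -70 dBm), and with fewer than three audible neighbours it has no basis to reduce power and stays at max. The function and clamp below are illustrative assumptions:

```python
def tpc_recommended_power(tx_max_dbm, neighbor_rssi_dbm, threshold_dbm=-70):
    """Simplified sketch of Cisco TPCv1: target power so the
    third-loudest neighbour hears this AP at the TPC threshold.
    `neighbor_rssi_dbm` is a hypothetical list of neighbour RSSIs."""
    if len(neighbor_rssi_dbm) < 3:
        # Fewer than three 5 GHz neighbours heard -> no reduction,
        # AP stays at max power (power level 1)
        return tx_max_dbm
    third = sorted(neighbor_rssi_dbm, reverse=True)[2]
    desired = tx_max_dbm + (threshold_dbm - third)
    # Clamp to a plausible range of power levels (~3 dB steps)
    return max(min(desired, tx_max_dbm), tx_max_dbm - 6 * 3)

# Only two 5 GHz neighbours heard -> stays at max:
print(tpc_recommended_power(17, [-75, -82]))       # 17
# Three neighbours, third heard at -60 dBm -> power backs off:
print(tpc_recommended_power(17, [-55, -58, -60]))  # 7
```

On 2.4 GHz the same APs hear each other much more easily (better propagation), which is consistent with 2.4 GHz power looking normal while 5 GHz sits at level 1.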
The wireless design was done for -67 dBm during the predictive survey, and the APs were also modelled at the lowest power levels (10 mW) during the survey.
"DCA and TPC will be setting the power to power level 1 as it is not seeing the 5GHz neighbors close enough."
So given that no performance issues have been reported, would you consider that acceptable from a design perspective? And what would be the best power levels for the APs to guarantee there is no interference and no coverage holes?