Right now the fastest speed everyone receives is around 144 Mbps. I could change the channel width from 20 MHz to 40 MHz, which should double the pipe, but I want to know if it's even necessary. When I look at my laptop's stats from within the controller, this is what I am seeing for received and transferred bytes.
This means I'm not even touching a meg on the receiving end. Is it accurate to conclude from this information that we don't need more than 144 Mbps? Is there another way to tell?
You have to remember that data rates are not throughput.
Wireless is a shared medium and half duplex in nature. Each device requires a clear channel before it can transmit, and each frame must be acknowledged (or covered by a block ack).
It then comes down to the data rate the client can connect at, based on its number of spatial streams, RSSI and SNR.
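As a rough illustration of how those factors combine, here is a sketch using per-stream rates from the 802.11n MCS tables (short guard interval assumed; the function name and structure are mine, not from any vendor tool). The advertised data rate is simply the per-stream rate times the number of spatial streams:

```python
# Approximate 802.11n PHY data rates (Mbps) per spatial stream,
# taken from the standard MCS tables with short guard interval.
# These are signalling rates, not real-world throughput.
HT20_SGI = [7.2, 14.4, 21.7, 28.9, 43.3, 57.8, 65.0, 72.2]      # MCS 0-7, 20 MHz
HT40_SGI = [15.0, 30.0, 45.0, 60.0, 90.0, 120.0, 135.0, 150.0]  # MCS 0-7, 40 MHz

def phy_rate(mcs_per_stream: int, streams: int, width_mhz: int) -> float:
    """Return the 802.11n PHY rate in Mbps: per-stream rate x stream count."""
    table = HT20_SGI if width_mhz == 20 else HT40_SGI
    return table[mcs_per_stream] * streams

# Two spatial streams at the top MCS on a 20 MHz channel:
print(phy_rate(7, 2, 20))  # → 144.4 (the 144 Mbps rate seen in the question)
print(phy_rate(7, 2, 40))  # → 300.0 (what doubling to 40 MHz would advertise)
```

This is why the laptop in the question reports 144 Mbps: it is the ceiling for two streams on a 20 MHz channel, before any of the overheads discussed above.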
Whilst increasing the channel width to 40 MHz (do not do this on 2.4 GHz networks) gives you a bigger pipe, it reduces the number of non-overlapping channels. In a high-density environment (or even a multi-tenanted building) this can cause co-channel interference and adversely affect the performance of the wireless devices.
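To see the channel-count cost concretely, here is a small sketch (the channel list is the common US/FCC 5 GHz 20 MHz set; exact availability varies by regulatory domain, and the bonding rule below is a simplification of the standard's channel pairing):

```python
# Common US (FCC) 5 GHz 20 MHz channel numbers.
channels_20 = [36, 40, 44, 48, 52, 56, 60, 64, 100, 104, 108, 112,
               116, 120, 124, 128, 132, 136, 140, 144, 149, 153, 157, 161, 165]

# 40 MHz bonding pairs each lower 20 MHz channel with the one 4 numbers
# above it (36+40, 44+48, ...), so the usable channel count roughly halves.
channels_40 = [(c, c + 4) for c in channels_20
               if (c // 4) % 2 == 1 and c + 4 in channels_20]

print(len(channels_20), len(channels_40))  # → 25 12
```

Every step up in width halves the pool again, which is why dense deployments usually stay at 20 MHz.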
It is worth reading this blog on wireless throughput: http://divdyn.com/wi-fi-throughput/
It will show that, on average, it is possible to get better per-client throughput using 20 MHz wide channels.
As for what the client is actually using, this varies depending on what they are doing, so it is always best to plan for what they might use.
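Because of the half-duplex medium, contention, and ack overhead described above, a common rule of thumb (an assumption for planning, not a measurement) is that usable throughput lands around 50-60% of the PHY data rate. A minimal sketch:

```python
# Rule-of-thumb sketch: usable Wi-Fi throughput as a fraction of the
# PHY data rate. The 0.55 efficiency factor is a planning assumption;
# real numbers depend on contention, frame sizes, and retries.
def estimated_throughput(phy_rate_mbps: float, efficiency: float = 0.55) -> float:
    return phy_rate_mbps * efficiency

print(estimated_throughput(144.4))  # ≈ 79 Mbps usable from a 144.4 Mbps rate
```

So even at the 144 Mbps rate in the question, the client has far more usable headroom than the byte counters in the controller suggest it needs.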
Right now the fastest speed everyone receives is around 144 Mbps.
I'd like to know which wireless NIC is being used. 144 Mbps smells like a 2x2:2 NIC on a 20 MHz channel: two spatial streams at the top MCS with a short guard interval tops out at 144.4 Mbps.