Why do WiFi routers do such a bad job of channel selection?

Wi-Fi congestion, especially in the 2.4GHz range, is a serious problem in some areas. It is widespread enough that there are many guides to choosing a less congested channel, e.g. https://www.howtogeek.com/197268/how-to-find-the-best-wi-fi-channel-for-your-router-on-any-operating-system/

Given that most routers default to automatically choosing their channel and the hardware seems capable of detecting conflicting networks, why don't they do a better job of channel selection?


Solution 1:

The failure of Wi-Fi APs to pick 2.4GHz channels well comes down to a small handful of issues:

  • Most only pick a channel at boot time, but a channel that was good when the AP was last rebooted may have become a poor choice days, weeks, or months later.
  • Most do not want to delay booting by spending long enough to truly evaluate every channel, so they use poor heuristics like "just pick the channel where we see the fewest APs", which doesn't necessarily correlate with which channel will provide the best throughput and reliability. Even worse, these oversimplified heuristics can lead an AP to choose a channel that partially overlaps the channels other APs are on, so the APs interfere with each other without being able to cooperate the way they could if they were on the exact same channel. (A minimal sketch of this failure mode follows this list.)
  • Most don't even have the spectrum analyzer hardware necessary to truly evaluate the RF interference on each channel; they have Wi-Fi radios and focus on interference from other Wi-Fi devices, and are fairly ignorant of interference caused by non-Wi-Fi devices such as Bluetooth, microwave ovens, cordless phones, wireless subwoofers, baby monitors, wireless cameras, and more.
  • Creating an AP that has the hardware and the algorithms to choose channels well not just at boot, but to keep re-evaluating the channel choices later, and change channels when there would be benefit to do so, is both expensive and fraught with potential interop problems. Not all clients are great at honoring channel switch announcements from the AP, so an AP that changes channels on the fly risks having clients fall off the network every time it does so.
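
To make the heuristics point concrete, here is a minimal sketch in Python of the "fewest visible APs wins" approach and how it misfires. The scan data and names are invented for illustration; this is not any vendor's actual algorithm:

    from collections import Counter

    # (channel, rssi_dbm) for each AP a hypothetical router hears at boot:
    scan = [(1, -40), (1, -75), (6, -45), (6, -80), (11, -50)]

    counts = Counter(ch for ch, _ in scan)
    naive = min(range(1, 12), key=lambda ch: counts[ch])
    print(naive)  # prints 2: an "empty" channel -- but it partially
                  # overlaps the strong -40 dBm AP on channel 1, so both
                  # now interfere without being able to coordinate the
                  # way two APs sharing the exact same channel can.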

Solution 2:

The overarching problem here is that the 2.4GHz band is completely saturated in any moderately populated area. In addition, there are at most 14 channels available, depending on the country. Out of those 14, only 3 channels don't overlap and interfere with each other, and even that is only true if the device uses 20MHz of bandwidth rather than the 40MHz bandwidth available on some access points.

All properly configured Wi-Fi routers should use only channel 1, 6, or 11 at 20MHz bandwidth. Channels are spaced just 5MHz apart while a 20MHz transmission spans roughly 10MHz on either side of its center, so an access point stomps on the signals of any nearby access points up to four channels above and four channels below its own. It's worse if it's on 40MHz bandwidth.
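
The overlap follows directly from the channel arithmetic. A short illustrative snippet (the center-frequency formula is the standard one for the 2.4GHz band; the helper names are mine):

    # Channel c (1-13) in the 2.4GHz band is centered at 2407 + 5*c MHz;
    # channel 14 is a special case at 2484 MHz. A 20MHz-wide transmission
    # spans +/-10 MHz around that center.
    def center_mhz(ch):
        return 2484 if ch == 14 else 2407 + 5 * ch

    def overlaps(ch_a, ch_b, width_mhz=20):
        # Two channels collide when their center separation is smaller
        # than the transmission width.
        return abs(center_mhz(ch_a) - center_mhz(ch_b)) < width_mhz

    print([overlaps(a, b) for a, b in [(1, 6), (6, 11), (1, 11)]])
    # [False, False, False] -- 1, 6, 11 are mutually clear at 20MHz
    print(overlaps(1, 4))                # True: centers only 15 MHz apart
    print(overlaps(1, 6, width_mhz=40))  # True: 40MHz channels collide
                                         # even with centers 25 MHz apart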

When access points on the same channel can hear each other, they cooperate and share the airtime: each senses the channel and defers until the other has finished transmitting. If two access points use nearby but different channels, they cannot coordinate; they stomp on each other, and each collision results in lost data.

Unfortunately, most modern Wi-Fi routers, for simplicity, default to auto-channel selection. However, they do not adhere to the 1, 6, or 11 rule. Instead, they use a proprietary algorithm that is probably based on the usage of each channel. This causes severe and unavoidable interference with nearby networks, practically rendering the 2.4GHz band useless in some areas. In addition, the auto-channel selection usually happens only during a reboot, or rarely at all, so the channel choice can quickly become stale as nearby access points also jump channels and compete to find the “cleanest” channel. To make things worse, the selection is based on what the AP hears, not what the client hears, and the client may be closer to a different set of APs.
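
For contrast, here is a hedged sketch of what a less naive auto-selection loop could look like: score only the non-overlapping channels, weight stronger interferers more heavily, and re-evaluate periodically rather than once at boot. `scan_24ghz` and `switch_channel` are hypothetical stand-ins for platform-specific driver calls, not a real API:

    import time

    NON_OVERLAPPING = (1, 6, 11)

    def rescan_loop(scan_24ghz, switch_channel, interval_s=3600):
        current = None
        while True:
            scan = scan_24ghz()  # hypothetical: -> [(channel, rssi_dbm), ...]
            def load(cand):
                # Sum linear power from every AP whose 20MHz channel
                # overlaps the candidate (centers less than 25 MHz apart).
                return sum(10 ** (rssi / 10)
                           for ch, rssi in scan if abs(ch - cand) < 5)
            best = min(NON_OVERLAPPING, key=load)
            if best != current:
                switch_channel(best)  # ideally announced to clients via a
                current = best        # Channel Switch Announcement (CSA)
            time.sleep(interval_s)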

So, the problem is not the selection mechanism, but the fact that the 2.4GHz band is just completely saturated. Not only by Wi-Fi access points, but also by cordless phones, microwave ovens, Bluetooth, baby monitors, wireless cameras, and any number of other technologies.

The answer is to use the 5GHz band. There are dozens of 5GHz channels available, none of which overlap with the others if the standard 20MHz bandwidth setting is used. This means that all devices using the 5GHz band can cooperate with each other without interfering. Unfortunately, Wireless-N and especially Wireless-AC allow for wider channels (40MHz, 80MHz, even 160MHz), which overlap, in an attempt to provide greater throughput. So even in the 5GHz band you should be conscious of channel overlap and choose your settings wisely, rather than relying on auto-channel selection.
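
The same center-frequency arithmetic shows why: 5GHz channel centers sit at 5000 + 5 × channel MHz, and the standard 20MHz channels are numbered 4 apart, i.e. exactly 20MHz apart, so neighbors touch but don't overlap. A small illustrative snippet:

    # The common UNII-1 20MHz channels:
    for ch in (36, 40, 44, 48):
        print(ch, 5000 + 5 * ch, "MHz")
    # 36 5180 MHz
    # 40 5200 MHz   <- a full 20 MHz from channel 36: no overlap at
    # 44 5220 MHz      20MHz width, unlike the 2.4GHz band's 5 MHz spacing
    # 48 5240 MHz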

In a densely populated area, the use of wide channels will provide little, if any benefit and could actually make things worse.

Solution 3:

Just adding a visual representation of the 2.4GHz congestion versus the 5GHz band to the already excellent answers.

I live in a European capital with strong Internet and Wi-Fi market penetration.

Furthermore, most local ISPs also enable an extra roaming SSID/network by default on their routers/modems/CPE, so there are often at least two SSIDs per home/neighbor. Keep in mind that besides the APs broadcasting signals, the clients transmit too.

As an example, just listening with an ordinary notebook without any amplification, from a fixed point in my bedroom and without walking around the home, I can see at least 136 SSIDs (around 70-90 APs). It is not a stretch to suspect that approximately 200 devices (APs plus clients) are broadcasting around me in the 2.4GHz band.
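
If you want to reproduce this kind of count yourself, here is one hedged way to do it on a Linux notebook with NetworkManager. The nmcli invocation is standard; the simple parsing below ignores the edge case of SSIDs containing escaped colons:

    import subprocess
    from collections import Counter

    # Terse output: one "SSID:CHAN" line per visible network.
    out = subprocess.run(
        ["nmcli", "-t", "-f", "SSID,CHAN", "device", "wifi", "list"],
        capture_output=True, text=True, check=True).stdout

    ssids, chans = set(), Counter()
    for line in out.strip().splitlines():
        ssid, _, chan = line.rpartition(":")
        if ssid:                  # skip hidden networks (empty SSID)
            ssids.add(ssid)
        chans[chan] += 1

    print(len(ssids), "distinct SSIDs")
    print(chans.most_common(5))   # the most crowded channels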

Compare the graph on the left, showing the 2.4GHz band, with the one on the right, showing the 5GHz band.

[Image: Wi-Fi analyzer graphs, 2.4GHz band (left) vs 5GHz band (right)]