2.4GHz wifi is not suitable for two big reasons: interference and low bandwidth. In any kind of suburban or city environment, and sometimes even in rural areas, 2.4GHz wifi will be congested with other networks, microwaves, other appliances, etc., causing massive speed degradation or fluctuations. The range of 2.4GHz is just too large for all the equipment that uses it in today's world. In my previous apartment complex, for example, my phone could see 35 distinct 2.4GHz wifi networks, while only 3 at most can operate without interfering with each other. In that same building I could only see 13 5GHz networks. Which brings me to the second issue: bandwidth.

2.4GHz, at least here in the US, only has three channels that will not interfere with each other: 1, 6, and 11. If anyone puts their network between these three channels, it will knock out both the channel below and the channel above; channel 3 would interfere with both channels 1 and 6, for example. By going up to 5GHz you get many more free channels, fewer networks competing for those channels, and wider channels, allowing for much higher throughput. 2.4GHz allows 40MHz-wide channels, which in isolation would offer roughly 400mbps, but you will never see that in the real world.
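The overlap rule above is easy to check numerically. A minimal sketch, assuming the usual simplified model of the 2.4GHz band (20MHz-wide channels centered at 2407 + 5 × channel MHz for US channels 1 through 11):

```python
# Simplified model of 2.4 GHz wifi channel overlap (US channels 1-11).
# Assumption: 20 MHz-wide channels centered at 2407 + 5 * channel MHz;
# two channels overlap when their centers are less than 20 MHz apart.

def center_mhz(channel: int) -> int:
    """Center frequency of a 2.4 GHz wifi channel in MHz."""
    return 2407 + 5 * channel

def overlaps(ch_a: int, ch_b: int) -> bool:
    """True if two distinct 20 MHz channels overlap in spectrum."""
    return ch_a != ch_b and abs(center_mhz(ch_a) - center_mhz(ch_b)) < 20

# Channel 3 collides with everything from 1 through 6:
print([ch for ch in range(1, 12) if overlaps(3, ch)])  # → [1, 2, 4, 5, 6]

# Channels 1, 6, and 11 are far enough apart to coexist:
print(overlaps(1, 6), overlaps(6, 11), overlaps(1, 11))  # → False False False
```

Under this model the only way to fit three non-overlapping networks is exactly the 1/6/11 plan, which is why anyone parked on channel 3 degrades two of them at once.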

Personally, I think OEMs should just stop including it, or have it disabled by default and only enable it in an “advanced settings” area.

Edit: I am actually really surprised at how unpopular this opinion appears to be.

  • shortwavesurferOP · 10 months ago

    That is a fair point, and mobile devices are going to be hit hardest by that since they have such small batteries. But laptops, desktops, and the like would be just fine, since they are constantly connected to a power source and can use a card with higher transmit power.