2.4GHz WiFi is not suitable for two big reasons: interference and low bandwidth. In any suburban or city environment, and sometimes even in rural areas, the 2.4GHz band is congested with other networks, microwaves, and other appliances, causing massive speed degradation or fluctuations. The range of 2.4GHz is simply too large for the amount of equipment that uses it in today’s world. In my previous apartment complex, for example, my phone could see 35 distinct 2.4GHz networks, even though at most 3 can operate without interfering with each other. In that same building I could only see 13 5GHz networks. Which brings me to the second issue: bandwidth.

2.4GHz, at least here in the US, only has three channels that do not interfere with each other: 1, 6, and 11. If anyone puts their network between those channels, it knocks out both the one below and the one above; channel 3, for example, would interfere with both channels 1 and 6. By going up to 5GHz you get many more free channels, fewer networks competing for those channels, and wider channels allowing for much higher throughput. 2.4GHz does allow 40MHz-wide channels, which in isolation would offer ~400Mbps, but you will never see that in the real world.
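
To make the overlap concrete, here is a quick sketch in Python, assuming the standard 2.4GHz channel layout (centers at 2412 + 5·(n − 1) MHz, with each transmission roughly 20MHz wide):

```python
# 2.4GHz channel n is centered at 2412 + 5*(n - 1) MHz, but each
# transmission is ~20MHz wide, so channels 5 apart are the closest
# that can coexist cleanly.
SIGNAL_WIDTH_MHZ = 20

def center_mhz(ch: int) -> int:
    return 2412 + 5 * (ch - 1)

def overlaps(a: int, b: int) -> bool:
    # Two channels interfere when their centers sit closer
    # together than one signal width.
    return abs(center_mhz(a) - center_mhz(b)) < SIGNAL_WIDTH_MHZ

print(overlaps(3, 1), overlaps(3, 6))   # True True   (channel 3 hits both)
print(overlaps(1, 6), overlaps(6, 11))  # False False (1/6/11 stay clean)
```

With the older 22MHz 802.11b mask the math is even less forgiving, which is how 1/6/11 became the rule in the first place.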

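As for where that ~400Mbps figure comes from, it falls out of the 802.11n PHY math. A minimal sketch, assuming a 2-spatial-stream radio on a 40MHz channel with short guard interval; note the 256-QAM case is a non-standard vendor extension on 2.4GHz, often marketed as “TurboQAM”:

```python
# 802.11n, 40MHz channel: 108 data subcarriers, 3.6us symbols (short GI).
DATA_SUBCARRIERS = 108
SYMBOL_TIME_US = 3.6
CODING_RATE = 5 / 6

def phy_rate_mbps(bits_per_subcarrier: int, streams: int) -> float:
    # Bits per OFDM symbol, per stream, divided by symbol time
    # gives bits per microsecond, i.e. Mbps.
    bits_per_symbol = DATA_SUBCARRIERS * bits_per_subcarrier * CODING_RATE
    return bits_per_symbol * streams / SYMBOL_TIME_US

print(phy_rate_mbps(6, 2))  # 64-QAM,  2 streams -> 300.0 (standard MCS 15)
print(phy_rate_mbps(8, 2))  # 256-QAM, 2 streams -> 400.0 (vendor extension)
```

And those are raw PHY rates; actual throughput lands well below them even on an empty channel, which is why you will never see that number in practice.
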
Personally, I think OEMs should just stop including it, or have it disabled by default and only enable it in an “advanced settings” area.

Edit: I am actually really surprised at how unpopular this opinion appears to be.

  • shortwavesurferOP

    The max power a 5GHz access point puts out is 1 watt, where the max on 2.4GHz is 0.3 watts. You are right, though: you do have to do a better job of centrally locating the access point in your home to get the best performance from it, because otherwise one side of the house will have good WiFi and the other side will have weak WiFi or none at all.

    Edit: Another benefit of that is that if somebody wants to crack your WiFi network, they have to be physically closer to your house to do so. On something like 60GHz, where the signal doesn’t leave the room you’re in, it’s basically as secure as Ethernet, because an intruder would have to break into your house to crack your network.
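
    A rough back-of-the-envelope for both points, using free-space path loss only (no walls, no 60GHz oxygen absorption, and 5.8GHz standing in for the 5GHz band, so treat it as illustrative):

    ```python
    import math

    def fspl_db(distance_m: float, freq_ghz: float) -> float:
        # Free-space path loss: 20*log10(d_km) + 20*log10(f_GHz) + 92.45 dB
        return (20 * math.log10(distance_m / 1000)
                + 20 * math.log10(freq_ghz) + 92.45)

    for f in (2.4, 5.8, 60.0):
        print(f"{f:>4} GHz over 10 m: {fspl_db(10, f):.1f} dB loss")
    #  2.4 GHz over 10 m: 60.1 dB loss
    #  5.8 GHz over 10 m: 67.7 dB loss
    # 60.0 GHz over 10 m: 88.0 dB loss
    ```

    Higher frequency means more loss over the same distance, which is why 5GHz needs the extra transmit power and careful placement, and why 60GHz barely leaves the room once walls are added on top.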

      • shortwavesurferOP

        That is a fair point, and mobile devices are going to be hit hardest by that, since they have such small batteries. Laptops and desktops would be just fine, though, since they are constantly connected to a power source and can use a card with higher transmit power.