2.4GHz WiFi is not suitable for two big reasons: interference and low bandwidth. In any kind of suburban or city environment, and sometimes even in rural areas, 2.4GHz WiFi will be congested with other networks, microwaves, and other appliances, causing massive speed degradation or fluctuations. The range of 2.4GHz is simply too large for all the equipment that uses it in today’s world. In my previous apartment complex, for example, my phone could see 35 distinct 2.4GHz WiFi networks, while at most 3 can operate without interfering with each other. In that same building I could only see 13 5GHz networks. Which brings me to the second issue: bandwidth.

At least here in the US, 2.4GHz has only three channels (1, 6, and 11) that will not interfere with each other. If anyone puts their network between these three channels, it knocks out both the channel below and the channel above; channel 3, for example, would interfere with both channels 1 and 6. By going up to 5GHz you get many more free channels, fewer networks competing for those channels, and wider channels allowing for much higher throughput. 2.4GHz allows 40MHz-wide channels, which in isolation would offer ~400Mbps, but you will never see that in the real world.
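
For the curious, here is a rough sketch of that overlap math in Python. The channel spacing and the ~20MHz width are the standard 802.11 values (legacy DSSS uses 22MHz, but the conclusion is the same):

    # Why only channels 1, 6 and 11 coexist: centers sit 5 MHz apart,
    # but each channel is ~20 MHz wide, so near neighbors overlap.
    CHANNEL_WIDTH_MHZ = 20

    def center_mhz(ch: int) -> int:
        """Center frequency of 2.4 GHz channels 1-13."""
        return 2407 + 5 * ch

    def overlaps(a: int, b: int) -> bool:
        return abs(center_mhz(a) - center_mhz(b)) < CHANNEL_WIDTH_MHZ

    print(overlaps(1, 6))   # False - 25 MHz apart, safely separated
    print(overlaps(1, 11))  # False
    print(overlaps(3, 1))   # True  - channel 3 stomps on channel 1...
    print(overlaps(3, 6))   # True  - ...and on channel 6 at the same time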

Personally, I think OEMs should just stop including it, or have it disabled by default and only enable it in an “advanced settings” area.

Edit: I am actually really surprised at how unpopular this opinion appears to be.

  • skatrek47

    I think it’s a fair opinion, but a lot of “cheap” IoT devices only support 2.4GHz, so I do have both networks set up in my house for that reason…

    • Weirdmusic@lemmy.world

      Yeah, I’m guessing that (if anything) 2.4GHz will be relegated to IoT device setup & control and little else

    • shortwavesurfer@monero.townOP

      IoT devices should support 5 GHz, and at least for me personally, if a device doesn’t support it, I don’t buy it. Which also means that I have no IoT devices. LOL. My alarm system only supports 2.4 GHz, but it also has a cellular radio, so it has never been connected to Wi-Fi in the time I’ve owned it.

      • Max-P@lemmy.max-p.me

        Why would you refuse to buy IoT devices, when demanding 5 GHz just makes them more expensive, use more battery, and have less range? Like why, what does it give you to not have a 2.4 GHz network? It’s not like it’ll interfere with the 5 GHz network.

        Like sure, the 2.4 GHz spectrum is pretty crowded and much slower. But at this point that’s pretty much all that’s left on 2.4GHz: low-bandwidth, battery-powered devices scattered around your house, on the exterior walls, and all the way across the yard.

        It’s the ideal spectrum to put those devices on: it’s dirt cheap (they all seem to use ESP8266 or ESP32 chips; lots of Espressif devices on the IoT network), it uses less power, it goes through walls better, and all it needs to get through is that the button has been pressed. I’m not gonna install an extra AP or two when 2.4 reaches fine, just so that a button can make my phone ring and a bell go ding dong, or so a camera can stream at bitrates you could push over dialup internet.
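
        To give a sense of how little code and bandwidth that takes, here’s a minimal MicroPython sketch of the button idea. The SSID, password, URL and GPIO pin are made-up placeholders, and it assumes a firmware build that bundles the urequests module:

            import time
            import network
            import urequests  # bundled in many MicroPython builds
            from machine import Pin

            # Join the 2.4 GHz network; it's the only band these chips have.
            wlan = network.WLAN(network.STA_IF)
            wlan.active(True)
            wlan.connect("iot-ssid", "hunter2")  # placeholder credentials
            while not wlan.isconnected():
                time.sleep_ms(100)

            button = Pin(0, Pin.IN, Pin.PULL_UP)  # placeholder pin, active low
            while True:
                if button.value() == 0:
                    # The entire payload: "the button was pressed".
                    urequests.get("http://192.168.1.10/doorbell").close()
                    time.sleep(1)  # crude debounce
                time.sleep_ms(50)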

        Phones and laptops? Yeah, they’re definitely all on 5 GHz. If anything I prefer my IoT on 2.4, because then I can make my 5 GHz network WPA3 and 11ac/11ax only, so I don’t have random IoT devices running at 11n speeds slowing down my 5 GHz network.

        • shortwavesurfer@monero.townOP

          But cameras on 5GHz could stream very high quality 4K video directly to your phone or whatever; on 2.4GHz that would be a lot more likely to buffer and skip.

          • Max-P@lemmy.max-p.me

            My best camera does 1080p at 150kbit/s H264. Most “4K” cameras have such shit encoding that they’re nowhere near exceeding what 2.4 GHz can provide. And if I were to spend money on a nice 4K camera that actually streams real 4K, I would also invest in making it run over PoE, because streaming that would chew through battery like there’s no tomorrow and it needs a power source anyway, and the footage would go to an NVR to store it all on a RAID array.
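
            As a back-of-envelope check (the throughput and bitrate figures below are rough assumptions, not measurements), even an honest 4K stream fits within what 2.4 GHz can carry, at least before congestion eats the margin:

                usable_24ghz_mbps = 40  # assumed real-world 802.11n throughput

                streams_mbps = {
                    "1080p, heavily compressed (like mine)": 0.15,
                    "typical consumer '4K' camera": 6,
                    "genuinely good 4K H264": 20,
                }
                for name, mbps in streams_mbps.items():
                    print(f"{name}: {mbps} Mbit/s = "
                          f"{mbps / usable_24ghz_mbps:.0%} of the 2.4 GHz budget")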

            And if that had to happen I’d just put it on a dedicated 5 GHz network, because I want to keep the good bandwidth for the devices that need it, like the TV, phones and laptops. Devices on older WiFi standards slow down the network because they use more airtime to send data at lower rates, so fast devices get less airtime to send data at high rates.
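
            The airtime effect is easy to put numbers on; the PHY rates here are just illustrative (one 11n stream versus a 2x2 11ac link):

                payload_bits = 10 * 8 * 10**6  # 10 MB from each device

                for name, rate_mbps in [("old 11n device", 65), ("11ac laptop", 866)]:
                    seconds = payload_bits / (rate_mbps * 10**6)
                    print(f"{name}: {seconds:.2f} s of airtime")
                # The slow device holds the channel ~13x longer for the same
                # data, and nobody else can transmit while it does.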

            Using the tech that best fits the need is more important than trying to get everything on the latest and greatest. Devices need to be worthy of being granted access to my 5 GHz networks.

            • shortwavesurfer@monero.townOP

              Slicing channels into resource units (as in OFDMA) solves some of this, and when you go up in frequency like that you can put more antennas in the same physical space, so you can have something like 16-transmit, 16-receive MIMO to combat those airtime issues.
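
              A rough sketch of the antenna-density point, using the usual half-wavelength rule of thumb for element spacing:

                  C = 3e8  # speed of light, m/s

                  for ghz in (2.4, 5.0):
                      spacing_cm = (C / (ghz * 1e9)) / 2 * 100  # lambda/2 in cm
                      row_cm = 15 * spacing_cm                  # 16 elements -> 15 gaps
                      print(f"{ghz} GHz: lambda/2 = {spacing_cm:.1f} cm, "
                            f"16-element row = {row_cm:.0f} cm")
                  # About 45 cm at 5 GHz; 2.4 GHz needs roughly twice that.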

              • Max-P@lemmy.max-p.me

                Yes but that’s expensive and only part of newer WiFi standards, and almost nothing implements it. Most devices barely even support basic MIMO.

                The point remains that I won’t go replace lightbulbs just so they run 5 GHz WiFi. It’s dumb and pointless and just generates a ton of completely unnecessary and avoidable e-waste, just to avoid using a network band nobody cares about anymore.

                In an ideal world, yes, everything would already be 11ax on the 6GHz spectrum. But this is the real world, a world where 10-20 year old WiFi devices still connect to 2.4 GHz networks, are still useful and, most importantly, still work perfectly fine. WiFi 11n chips are dirt cheap; why should we have to add an extra 5-10 bucks to a lightbulb just so it’s on a modern WiFi standard, when all it needs to receive is an RGBA value to know what color and how bright it should be? At that point it’s an economics problem, not a tech problem. Those devices couldn’t even max out 11n if they wanted to anyway; they barely handle a web server.

      • WallEx@feddit.de

        If your IoT devices go on 5GHz you will soon have the same bandwidth/airtime problem as with 2.4, because if you’re not using WiFi 6E you won’t be able to have many clients talking at the same time.

        How many devices do you have connected via WiFi?

          • WallEx@feddit.de

            Yeah, then maybe don’t assume that everyone has as few as you do, especially counting IoT or home automation. That seems to be the core problem with this discussion. If you are living with other people you might have that many clients even without the IoT devices.

          • Nommer

            Jesus. Run a wire to some of those devices. Wireless should only be a last resort.

            • shortwavesurfer@monero.townOP

              They are not IoT devices. They are phones and laptops. I don’t own any IoT devices. And networks can handle that many devices easily.

              • Nommer

                I didn’t say they were IoT…

      • bigredgiraffe@lemmy.world

        Well, 5GHz requires more power, has less range, and needs its own antenna, so for microcontrollers that only need range and low bandwidth to send sensor updates, especially battery-powered ones, it’s pretty pointless. 5GHz can also have its own issues in cities if there’s heavy use of the DFS bands, and it’s worse at traversing reinforced concrete.
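
        The range difference follows straight from free-space path loss; here’s a quick sketch of the frequency-dependent term (real walls make the gap bigger):

            import math

            def fspl_db(distance_m: float, freq_hz: float) -> float:
                # Free-space path loss: 20*log10(d) + 20*log10(f) - 147.55 dB
                return (20 * math.log10(distance_m)
                        + 20 * math.log10(freq_hz) - 147.55)

            d = 10.0  # meters; the gap is the same at any distance
            delta = fspl_db(d, 5.0e9) - fspl_db(d, 2.4e9)
            print(f"5 GHz loses {delta:.1f} dB more than 2.4 GHz")  # ~6.4 dB
            # Under a quarter of the received power, before wall losses,
            # which also hit 5 GHz harder.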

        Also, a 2.4GHz radio can sometimes support other protocols like Zigbee, BT, and BLE, which can be used for other functions.

        For what it’s worth, I have probably 50 WiFi devices, and the majority of them are 2.4GHz sensors, switches, and other low-bandwidth devices, and I don’t have any issues, even when living in an apartment complex. If you are having issues you might need different hardware, more access points, or something.

        Anyway, all that to say that 2.4GHz definitely still has a lot of utility today.