• MajorHavoc@programming.dev

      What’s he smoking?

      Whatever he’s smoking, its strength rating is at least: “make it seem like a good idea to call employees back from remote work despite remote work facilitation being the one thing we sell”.

      So that’s gotta be some strong stuff.

      • David Gerard@awful.systemsOPM

        the important thing about Zoom is that it was the lucky winner of the pandemic. Could have been Google Meet, could have been any of their other competitors, but somehow everyone just converged on Zoom.

        • mountainriver@awful.systems

          Having worked in an IT department in 2020, I can say it wasn’t just random. Zoom was stable for large meetings and scaled pretty smoothly up to a thousand participants. It was also a standalone product and had better moderator tools.

          MS Teams often ran into problems somewhere around 50 to 80 participants. Google Meet worked better, but its maximum was way lower than Zoom’s (250?). I tried a couple of other competitors, but none matched up (including Jitsi, unfortunately).

          So if you were in an IT department at an organization that needed large meetings and were looking for a quick solution that could also handle them, Zoom was the best choice in 2020. And big organisations’ choices mean everyone has to learn that software, so soon enough everyone knows how to use Zoom.

          They were in the right place, had the better product, and gained a dominant position. And now they are tossing all that away. C’est la late stage capitalism!

          • zogwarg@awful.systems

            Also according to my freelance interpreter parents:

            Compared to other major tools, Zoom was also one of the few not-too-janky solutions for setting up simultaneous interpreting with a separate audio track for the interpreters’ output.

            Other tools would require big kludges (separate meeting rooms, etc…), were unlikely to work for all participants across organizations, or required clunky consecutive translation.

          • V0ldek@awful.systems

            MS Teams often ran into problems somewhere around 50 to 80 participants

            As an honourable mention, MS Teams is also uncontrollable, overblown jank that

            • doesn’t work in a browser despite being built on Electron
            • is complete shite on Android, despite being built on Electron
            • barely works on Windows, thanks to being built on Electron but despite the fact that it’s built by the Windows people

            And even on its best behaviour it randomly loses messages while eating up way more CPU and RAM than could possibly be justified for a glorified IRC UI.

            No wonder Zoom won out over that one; if you tried to use Teams in 2020, you barely could.

    • VirtualOdour

      Custom hardware designed with AI pipelines in mind, similar to how GPU architecture solved a lot of render issues through how memory can be accessed and which operations are prioritized. The idea people have been talking about is basically the LLM on one part of the chip with other NNs beside it that can modify its biases - basically setting the ‘mood’ and focusing things as the answer is created, which should help enable creativity in some areas while locking it out in others. Coding, for example, requires creativity in structure or variable names but needs to be very factual about function names or mathematical operations.
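
      To make that a bit more concrete, here’s a rough PyTorch-style sketch of the “side network nudging the LLM’s biases” idea - everything in it (the class names, the sizes, the single toy layer) is invented for illustration, not taken from any real chip or model:

      ```python
      # Toy illustration only: a small "mood" network emits a per-feature
      # offset that gets added on top of a stand-in transformer layer's bias.
      import torch
      import torch.nn as nn

      class MoodNet(nn.Module):
          """Side network: reads a context signal ("be creative" vs "be
          literal") and emits an offset for the main model's biases."""
          def __init__(self, ctx_dim: int, hidden_dim: int):
              super().__init__()
              self.net = nn.Sequential(
                  nn.Linear(ctx_dim, 32),
                  nn.Tanh(),
                  nn.Linear(32, hidden_dim),
              )

          def forward(self, ctx: torch.Tensor) -> torch.Tensor:
              return self.net(ctx)

      class ToyBlock(nn.Module):
          """Stand-in for one layer of the main model whose output can be
          nudged by the mood vector at generation time."""
          def __init__(self, hidden_dim: int):
              super().__init__()
              self.proj = nn.Linear(hidden_dim, hidden_dim)

          def forward(self, x: torch.Tensor, mood: torch.Tensor) -> torch.Tensor:
              # The mood offset is simply added to the layer's own learned
              # bias, loosening or tightening different feature directions.
              return torch.relu(self.proj(x) + mood)

      hidden_dim, ctx_dim = 64, 8
      block, mood_net = ToyBlock(hidden_dim), MoodNet(ctx_dim, hidden_dim)

      hidden_state = torch.randn(1, hidden_dim)    # pretend LLM hidden state
      context = torch.randn(1, ctx_dim)            # pretend "mood" signal
      out = block(hidden_state, mood_net(context)) # modulated forward pass
      print(out.shape)                             # torch.Size([1, 64])
      ```

      The hardware pitch, as I understand it, would be putting the mood network and that bias addition physically next to the main model’s weights so the modulation is cheap per token - but that part is pure speculation on my end.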

      I think it’s very unlikely to be the way things go, based on progress with pure LLMs and LLM architecture, but maybe in the future it’ll turn out to be a more efficient way of solving the problem, especially with AI-designed chips.