• InvertedParallax@lemm.ee · 17 points · 3 months ago

    I’m considering it, but only just. My 5800X is good enough for most gaming, which is GPU-bound anyway, and I run a dual-Xeon rig for my workstation.

    Zen 2 through 4 took care of a lot of the demand; we all have 8–16 cores now, so what else could they give us?

    • twoface · 5 points · 3 months ago

      I have a 5900x and honestly don’t see any need for an upgrade anytime soon.

      A new CPU would maybe give me like 10 fps more in games, but a new GPU would do more. And I don’t think the CPU will be a bottleneck in the next few years.

      • InvertedParallax@lemm.ee · 4 points · 3 months ago

        Even beyond that, short of something like Blender, Windows just can’t handle that kind of horsepower; it’s not designed for it, and the UI bogs down fairly fast.

        Linux, otoh, I find can eat as much CPU as you throw at it, though many graphics applications start bogging down the X server for me.

        So I have a Windows machine with the best GPU but a passable CPU, and a Linux box with a decent workstation GPU and insane CPU power.

          • InvertedParallax@lemm.ee · 1 point · 3 months ago

            Meh, Windows is not nearly as configurable as Linux; some things you just can’t change.

            NFS beats SMB into a cocked hat.
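
            For the curious, a client-side NFS mount is just one fstab line (the hostname, export path, and mount point here are made-up examples, not anything from this thread):

            ```
            # /etc/fstab — mount an NFS export over NFSv4.2
            fileserver:/export/home  /mnt/home  nfs  rw,hard,vers=4.2  0  0
            ```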

            You end up spending more time in a terminal on Linux, because you’re not just dealing with your own machine; you’re always connecting to other machines and using their resources to do things. Yeah, a terminal on Windows makes a difference, and I ran Cygwin for a while, but it’s still not clean.

            Installing software on Windows sucks: you either have to download installers by hand, or deal with the few things that go through a store. Not that building from source is much better, but on Linux most stuff comes from distro repos now.

            Once I got LXC containers though (actually, once I tried FreeBSD) I lost my Windows tolerance. Being able to construct an effectively new “OS” with a few keystrokes is incredible: install programs there, even graphical ones, with no trace on your main system. Windows just has no answer.
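
            That workflow is roughly this, sketched with the LXD `lxc` CLI (the image and package names are arbitrary examples):

            ```shell
            # Launch a throwaway Debian container (image is fetched on first use)
            lxc launch images:debian/12 sandbox

            # Install software inside it, even graphical apps, without touching the host
            lxc exec sandbox -- apt-get update
            lxc exec sandbox -- apt-get install -y gimp

            # When you're done, delete it; the host system is untouched
            lxc delete --force sandbox
            ```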

            Also, Plasma is an awesome DE.

            • Mihies@programming.dev · 2 points · 3 months ago

              Ah, ok, I thought you were talking about Windows not being able to run the CPU at full speed. But yes, it’s certainly a different OS with ups and downs.

              • InvertedParallax@lemm.ee · 1 point · 3 months ago

                Well, it can’t run multithreaded jobs at full speed.

                Exhibit A: The latest AMD patch for multicore scheduling across NUMA.

    • floofloof@lemmy.ca (OP) · 5 points · 3 months ago

      They do still seem to be making advances in single-core performance, but whether it matters to most people is a different question. Most people aren’t using software that would benefit that much from these generation-to-generation performance improvements. It’s not going to be anywhere near as noticeable as when we went from 2 or 4 cores to 8, 16, 24, etc.

      • InvertedParallax@lemm.ee · 5 points · 3 months ago

        Single-thread is really hard: we’ve basically saturated our L1 working-set size, and adding more doesn’t help much. Trying to extend the vector length just makes physical design harder, and that reduces clock speed. The predictors are pretty good, and Apple finally kicked everyone up the ass to widen out-of-order execution like they should have.

        Also, software still kind of sucks. It’s better than it was, but we need to improve it; the bloat is just barely being handled by silicon gains.

        Flash was the epochal change. Maybe we get some new form of hybrid storage, but that doesn’t seem likely right now. Apple might do it to cut costs while preserving performance; actually, yeah, I can see them trying to have their cake and eat it too.

        Otherwise I don’t know; we need a better way to deal with GPUs. There’s nothing else that can move the needle, except true heterogeneous core clusters, but I haven’t been able to sell that to anyone so far; they all think it’s a great idea that someone else should do.

        • floofloof@lemmy.ca (OP) · 3 points · edited · 3 months ago

          > Also, software still kind of sucks. It’s better than it was, but we need to improve it; the bloat is just barely being handled by silicon gains.

          The incentives are all wrong for this, except in FOSS. It’s never going to be a priority for Microsoft, because everyone is used to the (lack of) speed of Windows, and “now a bit faster!” isn’t a great marketing line. And hardware companies, which need to keep shifting new boxes, have no interest in software that stops bogging each generation down. So we end up stuck with proprietary bloatware everywhere.

          • naturlychee@lemm.ee · 4 points · 3 months ago

            “what intel gives, microsoft takes away”

            It dates from the mid-90s and is still relevant.

            • InvertedParallax@lemm.ee · 2 points · 3 months ago

              Let’s be fair: MS was vastly outrunning Intel for a long time, and it’s only slowed down recently. Now the problem isn’t single-thread bloat so much as an absolute lack of multicore scaling for almost all applications except some games, and even then Windows fights as hard as it possibly can to stop you, as AMD just proved yet again.

          • InvertedParallax@lemm.ee · 1 point · 3 months ago

            Yes, mostly the applications aren’t there; if you need real CPU power (or GPU, for that matter), you’re running Linux or in the cloud.

            But we’re reaching a point where the desktop either gets relegated to the level of an embedded terminal (i.e. an ugly tablet, before it’s dropped altogether) or makes the leap to a genuine compute tool, and I fear we’re going to see the former.