• 2xsaiko@discuss.tchncs.de · 6 days ago

    This sounds like getting either visual artifacts from predicting a frame that wasn’t rendered yet or higher latency from waiting for the next one so you can interpolate between them.
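
    To put rough numbers on the latency half of that trade-off, here's a minimal back-of-the-envelope sketch in Python. The 60 fps base render rate is an assumption for illustration, not something from this thread:

    ```python
    # Rough latency sketch for interpolation-based frame generation.
    # Assumption (not from the thread): a 60 fps base render rate.

    base_fps = 60
    frame_time_ms = 1000 / base_fps  # ~16.7 ms per real rendered frame

    # Interpolation needs the *next* real frame before it can show anything in
    # between, so each real frame is held back by roughly one frame time.
    added_latency_ms = frame_time_ms

    print(f"Base frame time: {frame_time_ms:.1f} ms")
    print(f"Extra latency from waiting to interpolate: ~{added_latency_ms:.1f} ms")

    # Extrapolation ("predicting a frame that wasn't rendered yet") avoids the
    # wait, but the predicted frame can contain artifacts where the guess is wrong.
    ```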

    I’d rather turn the resolution down if your exorbitantly priced hardware can’t even run a game smoothly.

  • Kyrgizion@lemmy.world · 8 days ago

    No thanks, I’ll be using my 2080ti until it croaks and then buy AMD (or secondhand). Nvidia has more than enough money, they don’t need mine anymore.

    • Demdaru@lemmy.world · 6 days ago

      Well, I'm still sitting on a 750… All these posts about the newest gen are just abstractly funny to me xD

    • somedev@aussie.zone · 7 days ago

      Yep, I'll never buy a brand-new GPU series at launch ever again. Building a gaming PC has always been expensive, but now it just seems offensively so.

  • Viri4thus@feddit.org · 7 days ago

    So they copied AFMF. If it's anything like their driver-level upscaler (NIS), it's going to be shit, because they released it to tick a box rather than invest actual development time into making it functional, unlike the competition, which, together with the FOSS community, has turned its upscaler into a mainstay of the not-rich-people toolset everywhere.

    Fuck you, NVIDIA.

    • Poopfeast420@discuss.tchncs.de · 7 days ago

      Maybe I’m misunderstanding you, but

      unlike the competition, which, together with the FOSS community, has turned its upscaler into a mainstay of the not-rich-people toolset everywhere

      I don’t think that has worked out too well for AMD so far. It feels like they just do the bare minimum, release and hope someone else finishes the job.

      FSR is open source, and the only noteworthy thing I remember is No Man's Sky on Switch with a surprisingly good custom implementation. Nobody else bothered.

      • Viri4thus@feddit.org · 7 days ago

        Everyone and their mother uses FSR on edge devices and can customise it to their liking. In fact, it's part of the Steam Deck toolset for making games run at acceptable frame rates. I'd call that a win. Conversely, DLSS and similar tech has only accomplished making games run like shit and depriving us of non-blurry images, unless you use it to superscale, which is basically MSAA at that point…

        I should be the biggest fan of DLSS, given I have an edge device with an NVIDIA GPU; however, I mostly use AMD tech on that (now) underpowered GPU, because Nvidia is rent seeking and I don't want to pay them rent. Make of that what you will; I'm not interested in being dragged into a fanboyism war. I have both vendors' products, and there's one I use predominantly because it's open, and another I dislike because, instead of pushing the industry forward, it promotes shit game optimisation and hobbles game engines through MDF bribery, making gaming less accessible to everyone.

        • WolfLink · 6 days ago

          I have a 30-series card, and in my experience DLSS is better now than when it first came out. The original DLSS was pretty bad; it made things noticeably blurry. Current DLSS I'm fine with keeping on the "Quality" or maybe even "Balanced" setting.
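
          For context on what those presets mean in terms of internal render resolution, here's a quick sketch using the commonly cited DLSS scale factors (approximate, and they can vary by game and DLSS version; the 1440p target is just an example, not from this thread):

          ```python
          # Approximate DLSS preset scale factors (commonly cited; may vary per title).
          presets = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}
          output_w, output_h = 2560, 1440  # example 1440p target, purely illustrative

          for name, scale in presets.items():
              w, h = round(output_w * scale), round(output_h * scale)
              print(f"{name:>11}: renders at ~{w}x{h}, upscaled to {output_w}x{output_h}")
          ```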

          I’ve tried FSR but DLSS seems to be much better quality for similar performance.

          I've only found one game with FSR frame gen to try (Marvel Rivals, specifically), but something went horribly wrong: it added a lot of latency and frame-rate instability. Tbh, Marvel Rivals is probably not the kind of game you want to be using frame gen with anyway.

          Nvidia is rent seeking and I don’t want to pay them rent.

          There seems to be this attitude like you are supposed to upgrade your GPU every year. I don’t intend to upgrade my GPU until it either breaks or really can’t keep up with modern games. Neither is happening any time soon.

          • Viri4thus@feddit.org · 6 days ago

            There seems to be this attitude like you are supposed to upgrade your GPU every year. I don’t intend to upgrade my GPU until it either breaks or really can’t keep up with modern games. Neither is happening any time soon.

            Precisely. NVIDIA tries its best to always give you less for more. The latest is selling an xx70-class card for 1160€ and calling it a 5080.

            As for frame gen, I'm not a fan; its use case isn't me or anyone else in the sub-1000€ price bracket. It's useful when you're CPU limited and want to play single-player games at the max refresh rate of your shiny new 540 Hz monitor.
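
            To illustrate that point with made-up but plausible numbers (nothing here is from the thread), a quick sketch:

            ```python
            # Frame generation multiplies *presented* frames, but input latency still
            # tracks the base render rate. The numbers are illustrative assumptions only.

            base_fps = 135        # hypothetical CPU-limited render rate
            framegen_factor = 2   # 2x frame generation

            presented_fps = base_fps * framegen_factor
            base_frame_time_ms = 1000 / base_fps

            print(f"Presented: ~{presented_fps} fps on a high-refresh monitor")
            print(f"Input is still sampled every ~{base_frame_time_ms:.1f} ms "
                  f"(i.e. at {base_fps} fps), so it won't feel like {presented_fps} fps")
            ```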

            For everyone else, like 3060 owners such as myself, DLSS works well if you're playing a marketing darling like Cyberpunk. Use it on S.T.A.L.K.E.R. 2 and it's a blurry mess at anything below 1080p input resolution (XeSS works best here for me, and for plenty of other people, better than DLSS). Frame gen (even just 2x) doesn't solve the problem the overwhelming majority of people have, which is playing the latest games at the best possible visual quality with acceptable frame rates. I didn't invest in a PC to play The Witcher 3 (DLSS is broken there on older-gen cards) with PS4 levels of latency!? Plus DLSS, as of late, has TERRIBLE frame pacing in most games. Perhaps it's the limited framebuffer!?

            That doesn't change the fact that games are horribly optimised nowadays. God of War ran on a PS4, but I need a power plant to run STALKER or Alan Wake at 60 fps without it looking like a disco party with all the shimmering. Thx NVIDIA.

        • Poopfeast420@discuss.tchncs.de · 7 days ago

          uses FSR on edge devices and can customise it to their liking

          Everybody can customize it, but nobody does, that’s my point.

          Conversely, DLSS and similar tech has only accomplished making games run like shit and depriving us of non-blurry images

          What? How is it doing that? Also, why is DLSS the reason for blurry images, but FSR is better?

          Nvidia is rent seeking and I don’t want to pay them rent

          What are you even talking about?

          I'm not interested in being dragged into a fanboyism war

          It looks like you’ve already joined the fanboy war on the side of AMD.

          another I dislike because, instead of pushing the industry forward, it promotes shit game optimisation and hobbles game engines through MDF bribery, making gaming less accessible to everyone.

          Case in point. And more conspiracies.

          • Viri4thus@feddit.org · 7 days ago

            Everybody can customize it, but nobody does, that’s my point.

            Literally this week

            What? How is it doing that? Also, why is DLSS the reason for blurry images, but FSR is better?

            just one of the many games where the implementation was shit. FSR is better because it’s free and open

            What are you even talking about?

            Sauce

            Sauce 2

            Sauce 3

            Nvidia has been further weaponising the patent system, with great success, by offering grants to universities under the condition that the IP is the sole property of NVIDIA. They have completely hobbled the competitive environment around parallel processing, making it almost impossible for non-juggernauts to enter the fray.

            It looks like you’ve already joined the fanboy war on the side of AMD

            I’m fanboying for FOSS, open software and open standards that foster competition and new ideas.

            Almost every single game made with NV support has been plagued by performance issues. That's before we mention their recent app which is problematic.

            They hold a monopoly and are antithetical to FOSS, despite the latest push motivated by propelling AI deployment on consumer desktop Linux to increase sales.

            Perhaps you should re-examine why you're so antithetical to criticism of NVIDIA; maybe you've merged your definition of self with the brand, and that's why you react so emotionally when someone points out how detrimental an organisation NVIDIA is.

            • Poopfeast420@discuss.tchncs.de · 7 days ago

              Literally this week

              So you don’t get what I meant, ok. You can mod DLSS into games that don’t support it, that doesn’t mean it’s customizable. I was talking about the actual FSR upscaling implementation. Basically, no one cares that FSR is open source because no devs are putting any work into it to make it better. So in practice, this isn’t a positive.

              just one of the many games where the implementation was shit. FSR is better because it’s free and open

              Like I said, FSR being open source doesn’t mean anything, because nobody is working on it, AMD included. FSR in Dead Space was a blurry mess, so by your logic it’s garbage.

              I’m fanboying for FOSS, open software and open standards that foster competition and new ideas.

              In theory yes, but not how AMD is doing it. When they try to catch up with NVIDIA tech, they do the bare minimum and hope someone else finishes the work, which doesn’t work.

              That’s before we mention their recent app which is problematic.

              Does that mean AMD is terrible, because they had problems with their drivers? Or when Adrenalin causes problems?

              Almost every single game made with NV support has been plagued by performance issues.

              Got any examples that are NVIDIA's fault? I remember the tessellation stuff in Crysis. Anything more recent that isn't just "the game runs bad, so it must be NVIDIA's fault"?

              Perhaps you should re-examine why you're so antithetical to criticism of NVIDIA; maybe you've merged your definition of self with the brand, and that's why you react so emotionally when someone points out how detrimental an organisation NVIDIA is.

              I'm not against criticism of NVIDIA. I'm against baseless accusations that are often just conspiracy theories without any evidence. It mostly just comes down to "NVIDIA bad".

              That’s why I’m saying you are an AMD fanboy. A lot of the examples you mentioned can also apply to AMD, but you just ignore them, and it doesn’t matter to you. AMD supports FOSS, so they can’t do wrong and are the good guys.

              • Viri4thus@feddit.org · 7 days ago

                So you don’t get what I meant, ok. You can mod DLSS into games that don’t support it, that doesn’t mean it’s customizable. I was talking about the actual FSR upscaling implementation. Basically, no one cares that FSR is open source because no devs are putting any work into it to make it better. So in practice, this isn’t a positive.

                Here: go to the forks, then the commits, and you'll see a wide community, including non-AMD staff.

                Like I said, FSR being open source doesn’t mean anything, because nobody is working on it, AMD included. FSR in Dead Space was a blurry mess, so by your logic it’s garbage.

                Yes, t’was trash in Dead Space, however, it’s free so it cost me nothing to have trash rather than a kidney.

                In theory yes, but not how AMD is doing it. When they try to catch up with NVIDIA tech, they do the bare minimum and hope someone else finishes the work, which doesn’t work.

                TressFX and several other technologies have been open and industry leading. Apple's Metal, DX12 and Vulkan itself benefited from AMD's push for Mantle around the BF4 release. People have a convenient habit of forgetting the FOSS initiatives that end up resulting in pearls like Proton, DXVK, et al.

                Does that mean AMD is terrible, because they had problems with their drivers? Or when Adrenalin causes problems?

                Dunno, I'm using the FOSS drivers. How's Nouveau doing with the 5xxx series? Regarding Adrenalin problems, fuck yeah it makes AMD terrible; people paid a boatload of cash for hardware that doesn't do what was advertised. Same for NVIDIA or Intel.

                Got any examples, that are NVIDIAs fault? I remember the Tessellation stuff in Crysis. Anything more recent, that isn’t just game runs bad, so it must be NVIDIAs fault?

                Partners

                Devs

                Partners

                I'm also happy you chose to ignore the ongoing EU anticompetition suit that I referenced (that one goes beyond AI and includes the bundling of technologies).

                Naturally, that would defeat your entire attempt at arguing I'm an AMD fanboy, thus jeopardising your "ironclad" protection of your self-identification with NVIDIA.

                Edit: here we go, best tech company in the world

                Also amazing to see the voting trends when US Americans wake up…

                • Poopfeast420@discuss.tchncs.de · 7 days ago

                  Naturally, that would defeat your entire attempt at arguing I'm an AMD fanboy, thus jeopardising your "ironclad" protection of your self-identification with NVIDIA.

                  No, you simply are an AMD fanboy, as you've shown in basically all of your comments here.

                  Yes, t’was trash in Dead Space, however, it’s free so it cost me nothing to have trash rather than a kidney.

                  Again, you're ignoring the examples that contradict your argument. NVIDIA has a bad DLSS implementation, so they're bad; AMD doesn't matter, they can't do wrong.

                  Here: go to the forks, then the commits, and you'll see a wide community, including non-AMD staff.

                  Ok, but if you look at these examples, a lot of the time AMD is just dumping the code, sometimes doing a few commits, and then nothing. Like I said multiple times, it seems like AMD is doing the minimum amount of work and hoping others finish it.

                  Then the tons of forks, what do they matter if nothing comes of them? I asked you which dev worked on FSR 1 or 2 to improve it and released something.

                  TressFX and several other technologies have been open and industry leading. Apple's Metal, DX12 and Vulkan itself benefited from AMD's push for Mantle around the BF4 release. People have a convenient habit of forgetting the FOSS initiatives that end up resulting in pearls like Proton, DXVK, et al.

                  Finally, one single example where AMD possibly didn't just do the minimum amount of work and hope others do the rest (although who knows how much of it was DICE, but let's give AMD the benefit of the doubt). I also don't really care about your other examples like Proton or DXVK. I never said FOSS is bad or something. This is just about AMD and how they (seemingly) handle their open-source stuff.

                  As for your EVGA or Apple examples, you are constantly expanding the scope of the argument. First, it's about how DLSS ruins game visuals. Then it's NVIDIA basically sabotaging games and crippling their performance. Now you're bringing past business partners into the mix, because you can't answer the initial question.

                  Please show me where I said NVIDIA has never done anything wrong and Jensen is a saint. Yes, NVIDIA can be a shitty and greedy company. They might be a pain to work with. They do shitty stuff, basically because they're the top dog right now. However, you are basically blaming everything that's wrong in the world on them, and when asked for anything to back it up, you talk nonsense.

                  Next, your Assassin's Creed example. Ubisoft and NVIDIA are pointing fingers at each other, but you think it must be all NVIDIA's fault? And even if they are solely responsible, that's two examples of NVIDIA causing problems with games in what, 30 years? Doesn't seem too bad to me, unless you've got anything else.

                  I'm also happy you chose to ignore the ongoing EU anticompetition suit that I referenced (that one goes beyond AI and includes the bundling of technologies).

                  Hey, just like you’re ignoring something I said, I do the same thing. Call me when that suit is done.

                  Imma be honest, I don't know why you brought NVIDIA into this. I just criticized AMD and FSR, but you couldn't just show me examples of how I'm wrong; you mostly had to tell me that NVIDIA is the devil.