• @[email protected]
    link
    fedilink
    English
    287 months ago

    Nvidia isn’t the only game in town. AMD (and to an extent Intel) usually offers much better value at these mid-range (and dare I say “low-end” at like $200) price points.

    And while Nvidia probably still sells more GPUs than AMD (for whatever reason, there are actually people out there buying 4060 (Ti) cards), it’s not like AMD doesn’t sell any cards. The 7800 XT was priced very well from AMD’s standpoint because it sat right at the edge of what people considered genuinely solid price-to-performance. It probably sold, and still sells, quite well.

    • @QuantumSparkles · 5 points · 7 months ago

      Can anyone give me a suggestion for what cards I should be looking at to get a little better than PS5 graphics without breaking the bank? It’s been a while since I worked on my last PC and I’m really lost these days.

      • @sugar_in_your_tea · 8 points · 7 months ago

        This article claims your baseline should be:

        • RX 6600 XT - I have the 6650 XT and I think this is fair
        • RX 7600
        • A750
        • RTX 3060
        • RTX 2070 Super - from this LTT forum post

        Those should all be about as good or a little better than the PS5.

        That said, your mileage may vary, because games for console may be better tuned for console hardware than for PC, even if the hardware is equivalent. So maybe go up a step to be safe. If you want ray tracing (RTX), go NVIDIA; otherwise AMD or Intel will probably offer better value.

        I paid a little over $200 for my RX 6650XT, so expect to pay $200-300 to match or slightly exceed the PS5.

        • @QuantumSparkles · 1 point · 7 months ago

          Thank you for this very well put together response! I definitely have an idea of what to look for now.

        • @[email protected]
          link
          fedilink
          English
          17 months ago

          This list sounds about right. The whole “but it’s optimized for one console” thing is a pretty moot point nowadays as well. Sure, crappy ports exist, but solid ports perform on par with consoles on similar hardware specs.

    • @[email protected]
      link
      fedilink
      English
      27 months ago

      I have nothing against AMD but the power consumption is outstandingly high. My room is already hot enough.

      • @[email protected]
        link
        fedilink
        English
        12
        edit-2
        7 months ago

        While current Nvidia cards are certainly more efficient, RDNA3 still improves efficiency over RDNA2, which itself was actually more efficient than Ampere (mostly due to Ampere being based on the Samsung 8nm process).

        A 7800 XT is more efficient than both a 6800 XT and an RTX 3080, with the RTX 4070 being the most efficient in this performance ballpark.

        I feel like you’re blowing this way out of proportion.

        • @[email protected]
          link
          fedilink
          English
          07 months ago

          What is the right proportion? The 7800 XT uses 25% more power than the 4070 (250 W vs. 200 W). It seems outstanding to me.
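
          For reference, the arithmetic behind that 25% figure:

          $$\frac{250\,\mathrm{W}}{200\,\mathrm{W}} = 1.25 \;\Rightarrow\; +25\%$$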

          • @sugar_in_your_tea · 9 points · 7 months ago

            Are you measuring power actually used, or are you just looking at TDP figures on the marketing material? You can’t directly compare those marketing numbers on products from different gens, much less different companies.

            To really understand what’s going on, you need to look at something like watts per frame.
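
            As a rough sketch of that metric (the card names and all numbers below are made-up placeholders, not benchmark results):

            ```python
            # "Watts per frame": measured average board power divided by
            # average FPS in the same scene. Lower is better.

            def watts_per_frame(avg_power_w: float, avg_fps: float) -> float:
                return avg_power_w / avg_fps

            # Hypothetical measurements from the same game, settings, and scene:
            card_a = watts_per_frame(avg_power_w=250.0, avg_fps=100.0)  # 2.50
            card_b = watts_per_frame(avg_power_w=200.0, avg_fps=80.0)   # 2.50

            # A card that draws more total watts can still be just as
            # efficient if it also delivers proportionally more frames.
            print(f"card_a: {card_a:.2f} W/frame, card_b: {card_b:.2f} W/frame")
            ```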

            • @[email protected]
              link
              fedilink
              English
              27 months ago

              I’m getting the numbers from GamersNexus’ power consumption chart from their review of the card.

            • @[email protected]
              cake
              link
              fedilink
              English
              -27 months ago

              The numbers here are the maximum wattage used, if I recall correctly. So most of the time when you’re gaming, it’s probably going to be close to those numbers.

              • @sugar_in_your_tea · 3 points · 7 months ago

                No, it’s TDP, like with CPUs. So a 200 W GPU needs a cooler rated to dissipate 200 W worth of thermal load (and that’s not an exact science; AMD and NVIDIA calculate it differently). The actual power usage can be higher than that under full load, and it can be lower during normal, sustained usage.

                So the wattage rating doesn’t really tell you much about expected power usage unless you’re comparing two products from the same product line (e.g. RX 6600 and 6700), and sometimes between generations from the same company (e.g. 6600 and 7600), and even then it’s just a rough idea.
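
                A toy illustration of that point (every wattage below is invented, not a real spec):

                ```python
                # Rated TDP vs. measured average gaming draw (all values invented).
                cards = {
                    "card_x": {"tdp_w": 200, "measured_w": 230},
                    "card_y": {"tdp_w": 220, "measured_w": 190},
                }
                for name, c in cards.items():
                    delta = c["measured_w"] - c["tdp_w"]
                    print(f"{name}: rated {c['tdp_w']} W, "
                          f"measured {c['measured_w']} W ({delta:+d} W)")
                # Ranking by rated TDP and ranking by measured draw disagree
                # here, which is why reviewers measure actual power instead.
                ```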

          • @[email protected]
            link
            fedilink
            English
            27 months ago

            You think a 50 W difference will noticeably heat up your room? You must have a tiny room, then, or the difference will hardly be measurable.
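
            Back-of-envelope, assuming a combined heat-loss coefficient of roughly 50 W/K for a small, ventilated room (that value is a rough assumption and varies a lot with insulation and airflow): at steady state the extra heat input is balanced by extra loss, so

            $$\Delta T_\text{steady} \approx \frac{P}{K} = \frac{50\,\mathrm{W}}{50\,\mathrm{W/K}} \approx 1\,\mathrm{K}$$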

            • @[email protected]
              link
              fedilink
              English
              17 months ago

              It is already hot enough that I don’t want to add more heat to it. Also yes I have a tiny room.