• @sugar_in_your_tea
    7 months ago

    Are you measuring power actually used, or are you just looking at TDP figures on the marketing material? You can’t directly compare those marketing numbers on products from different gens, much less different companies.

    To really understand what’s going on, you need to look at something like watts per frame.
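    A watts-per-frame comparison is easy to sketch. The numbers below are made up purely for illustration; real values would come from measured power draw and benchmarked FPS (GamersNexus-style testing), not from the TDP on the box.

```python
# Hypothetical cards with illustrative (not real) measured numbers.
cards = {
    "Card A": {"watts": 200, "fps": 100},
    "Card B": {"watts": 150, "fps": 90},
}

for name, d in cards.items():
    # watts / (frames per second) = joules per frame
    joules_per_frame = d["watts"] / d["fps"]
    print(f"{name}: {joules_per_frame:.2f} J per frame")
```

    Here the nominally "hotter" 200 W card could still be the less efficient one per frame rendered, which is exactly what the raw wattage figure hides.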

    • @[email protected]
      link
      fedilink
      English
      27 months ago

      I’m getting the numbers from GamersNexus’ power consumption chart from their review of the card.

    • @[email protected]
      cake
      link
      fedilink
      English
      -27 months ago

      The numbers here are the maximum wattage the card will draw, if I recall correctly. So most of the time when you're gaming, it's probably going to be close to those numbers.

      • @sugar_in_your_tea
        7 months ago

        No, it’s TDP, like with CPUs. So a 200W GPU needs a cooler rated to dissipate 200W worth of thermal load (and that rating isn’t standardized; AMD and NVIDIA calculate it differently). The actual power usage can be higher than that under full load, and it can be lower during normal, sustained usage.

        So the wattage rating doesn’t really tell you much about expected power usage unless you’re comparing two products from the same product line (e.g. RX 6600 and 6700), and sometimes between generations from the same company (e.g. 6600 and 7600), and even then it’s just a rough idea.
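        To make that concrete: because vendors derive TDP differently, the card with the lower TDP on paper can draw more power at the wall. The numbers below are invented solely to illustrate the ranking flip, not taken from any real review.

```python
# Illustrative (made-up) figures: rated TDP vs. measured draw under load.
cards = [
    {"name": "Card X", "tdp_w": 200, "measured_w": 230},
    {"name": "Card Y", "tdp_w": 220, "measured_w": 210},
]

# Rank the cards two ways; the orderings disagree.
by_tdp = [c["name"] for c in sorted(cards, key=lambda c: c["tdp_w"])]
by_measured = [c["name"] for c in sorted(cards, key=lambda c: c["measured_w"])]
print("By rated TDP:     ", by_tdp)
print("By measured draw: ", by_measured)
```

        Within one product line the two orderings usually agree, which is why same-family comparisons are the only place the TDP figure is even roughly trustworthy.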