The amount of carbon this has undoubtedly put into our atmosphere is really my main concern. Yes, I know you can do hacky workarounds to fix this, but how many of their customers did that? Roughly none. What a waste of our planet's resources.

  • NotWhatMyNameIs@alien.topB · 11 months ago

    AMD doesn’t seem to be able to get past their issues with memory clocks at idle with multiple (especially asymmetric) display configurations. My Vega 64 did this when I first got it; after about a year of driver updates they fixed it, and my whole system would draw about 65 W at idle.

    My 6900 XT has always done it (running 2x identical FreeSync 60 Hz UHD monitors and an LG UHD TV), with no difference if I unplug the TV. The combination of that, my 5950X seemingly being worse at idling than the 2700X and 3900XT CPUs I was running before, and my replacing my mobo with one with a lot more integrated hardware has now left my system idling at ~150 W. It sucks, but I just try not to leave it on when I’m not using it.

    Funnily enough, with the 6900 XT being so much more efficient than Vega was, my system actually draws about 100 W less when playing FFXIV at 4K, which TBH is what I do most of the time, although Starfield reminded me what a space heater this thing can be when it’s pushed!

    TBH I don’t know if the grass is greener on the green side, because I haven’t owned an Nvidia card since the GTX 970.

  • IrrelevantLeprechaun@alien.topB · 11 months ago

    I literally haven’t had this issue for months now. Whatever is wrong with yours isn’t AMD’s fault, because it’s been solved for ages for the vast majority.

  • JasonMZW20@alien.topB · 11 months ago

    The amount of carbon emitted depends on your region’s power mix. At night, when solar is unavailable, yes, it might be higher, but honestly, 100 W is nothing (0.1 kW). If your PC isn’t put to sleep or turned off, and instead idles most of the day, those are entirely your emissions.

    Level 3 EV chargers at dedicated stations draw up to 200kW (50/100kW vehicles are common now), so imagine when 95% of the world population has EVs. We’re rapidly heading toward disaster without the necessary infrastructure and no one wants to hear it.

    Humans breathe out CO2 as well (and expel methane, an even stronger GHG), and there’s 8+ billion of us doing that 24/7. We’re carbon-based life, so carbon is always going to be emitted. The issue is that we deforested and paved over our carbon sinks because we’re shortsighted anytime money is involved.

    Also, good luck solving your issue.

    • dudeimsupercereal@alien.topOPB · 11 months ago

      Yeah, you put it in better perspective than anybody. You’re right, it’s incredibly minuscule, but I think there’s value in caring about every Wh or kWh. A little bit everywhere goes a long way.

    • DJGloegg@alien.topB · 11 months ago

      So fuck people who don’t have FreeSync, right?

      I have my monitors lose signal randomly when freesync is on. 🤷

    • bobalazs69@alien.topB · 11 months ago

      Mine does 5 W idle with 3x 1080p monitors.

      The trick was to reduce the 165 Hz to 60 Hz…

      • pcdoggy@alien.topB · 10 months ago

        That’s not a good ‘solution’ - what if someone wants to game at 120 Hz with variable refresh rate?

  • Electrical-Bobcat435@alien.topB · 11 months ago

    OP, these posts inevitably lead to tips, “what worked for me”, a restatement of the problem, and digress from there.

    I for one agree with you 100%. There are NOT user-side solutions for a great many of us.

    AMD also asks too much of the user. I know I must have logged well over 200 hours on the idle power problem since I bought in February.

  • koordy@alien.topB · 11 months ago

    Considering my PC is on usually like 12-16 h a day (I work from home) and my 4090 eats about 17 W at idle (1440p 240 Hz + 1080p 240 Hz, or 4K 120 Hz), I wonder if it has already worked out cheaper than if I’d gotten a 7900 XTX instead.

    • ImYmir@alien.topB · 11 months ago

      Doing a little calculator check, you probably only saved around 50 to 100 bucks, depending on how expensive electricity is in your area. Of course this number will only increase over time. Looking back, I’d probably recommend a 4080 over a 7900 XTX to my brother. He can’t even run his at full speed because of the massive heat it produces (400 W at full load) and the idle power use.

      • Keldonv7@alien.topB · 11 months ago

        Electricity in the EU can easily be $0.40 per kWh, depending on the country. If we assume an 80 W difference, 14 h per day, and $0.40/kWh, you get about $140 per year.
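        For anyone plugging in their own numbers, the estimate above works out like this. The helper name and all three inputs (wattage delta, daily hours, price) are just illustrative assumptions to adjust:

```python
def annual_cost(extra_watts: float, hours_per_day: float,
                price_per_kwh: float) -> float:
    """Yearly electricity cost of an extra idle power draw."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# 80 W extra draw, 14 h/day, $0.40/kWh
cost = annual_cost(80, 14, 0.40)
print(f"${cost:.2f} per year")  # roughly $163/year
```

        That lands in the same ballpark as the figure above; with only ~2 h of actual idle per day, the same formula gives about $23.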

        • BOLOYOO@alien.topB · 11 months ago

          But you don’t sit 14 h per day doing nothing, do you? Only idle (doing literally nothing), browsing, and watching movies count toward this power draw, so it may be at most 8 h if you work from home, but maybe 2 h if you use your PC primarily for gaming, and then the cost would be ~$20. It’s still wasted money and it shouldn’t happen, but realistically it’s nowhere near $140.

  • Pazret@alien.topB · 11 months ago

    I also had the same issue. I could get 7-10 W on dual monitors, but with 3 monitors it never went below 100 W.

    I changed to a 4090 and my idle is 25-30 W with 3 monitors.

  • mdiz1@alien.topB · 11 months ago

    What triggers this high usage?

    I use a 3440x1440 LCD screen at 165 Hz

    ASRock Phantom Gaming OC 7900 XTX

    Idle power draw ranges between 10-30 W

    I don’t get the high usage you mentioned

      • pcdoggy@alien.topB · 10 months ago

        Because a dual monitor setup is pretty common… it sounds like it happens with any setup of more than one monitor when at least one of them runs above 60 Hz? I dunno if resolution impacts anything, but if people have more than one monitor these days, it seems at least one is higher than 1080p.

      • DeBlackKnight@alien.topB · 11 months ago

        I’ve got the triple threat: 3 monitors at different resolutions and refresh rates - 1440p 144 Hz, 3440x1440 165 Hz, and 4K 120 Hz. Idle power draw is 120 W. The fan doesn’t shut off unless ambient temp falls below 72°F/~22°C.