Bought a new PC, and I was measuring its power consumption out of curiosity. I noticed something weird (to me): when the PC is off (in fact, I completely disconnected the PSU and ran the same test), there is a noticeable current flowing in the power cable to the PSU (0.15 A).

Further measurements showed a power factor of (almost) zero, and I can actually measure a capacitance of 2 uF across the PSU's AC input.

I did the same test on an older PC I have, and there is no such current or capacitance. So what would be the reason for a capacitor across the mains input of a PSU?

PS: the PSU is a Thermaltake Toughpower GF A3 1050W

Edit: I found some official measurements for this specific PSU: https://www.cybenetics.com/evaluations/psus/2249/ , which list around 40 VA of apparent power in standby, by design.

  • litchralee

    How were you measuring the current in the power cable? Is this with a Kill-o-watt device or perhaps with a clamp meter and a line splitter?

    As for why there is a capacitor across the mains input, a switching DC power supply like an ATX PSU draws current in a fairly jagged fashion. So to stabilize the input voltage, and to prevent the switching noise from propagating through the mains and radiating everywhere, some capacitors are placed across the AC lines. This is a large oversimplification, though, as the type and values of these capacitors are the subject of careful design.

    Since a capacitor charges and discharges based on the voltage across it, and because AC power changes voltage “polarity” at 50 or 60 Hz, the flow of charge into and out of the capacitor will be measurable as a small current.
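
    To put a rough number on it (a minimal Python sketch, not a measurement; the 230 V / 50 Hz mains is my assumption, the 2 uF is the value you measured):

      import math

      # Hedged sketch: reactive current drawn by a capacitor placed across the mains.
      # 230 V / 50 Hz is an assumed European supply; 2 uF is the measured value.
      V_RMS = 230.0   # assumed mains voltage, volts RMS
      FREQ = 50.0     # assumed mains frequency, Hz
      C = 2e-6        # measured input capacitance, farads

      # Capacitive reactance X_C = 1 / (2*pi*f*C); current I = V / X_C
      x_c = 1.0 / (2 * math.pi * FREQ * C)
      i_rms = V_RMS / x_c
      print(f"Reactance: {x_c:.0f} ohm, reactive current: {i_rms * 1000:.0f} mA")
      # -> about 1592 ohm and 145 mA, close to the 0.15 A you measured with the PSU off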

    Your choice of measuring instrument will affect how precisely you can measure this apparent power, which will in turn affect how your instrument reports the power factor. It can also be that the current in question includes some of the standby current for keeping the PSU's logic ICs in a ready state for when the computer starts up; that would also explain why the power factor isn't exactly zero.
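
    For a sense of scale (a minimal sketch; the ~1 W of real standby draw is an assumed, purely illustrative figure, and 230 V is an assumed mains voltage):

      # Hedged sketch: why the power factor is tiny but not exactly zero.
      V_RMS = 230.0       # assumed mains voltage, volts RMS
      I_RMS = 0.15        # measured standby current, amps RMS
      P_STANDBY = 1.0     # assumed real standby power, watts (illustrative)

      apparent_power = V_RMS * I_RMS              # volt-amperes
      power_factor = P_STANDBY / apparent_power   # PF = real power / apparent power
      print(f"Apparent power: {apparent_power:.1f} VA, power factor: {power_factor:.3f}")
      # -> about 34.5 VA and a PF of roughly 0.03: small, but not exactly zero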

    • mattreb@feddit.itOP

      How were you measuring the current in the power cable? Is this with a Kill-o-watt device or perhaps with a clamp meter and a line splitter?

      I measured the current both with a line splitter plus clamp meter and cross-checked it with an in-line meter. For the power factor, since I don't have any actual instrument to measure it and I just needed a ballpark figure to distinguish real consumption from a capacitor, I used this DIY method: https://www.giangrandi.org/electronics/cosphi/cosphi.shtml , which gave 0.04 (as a rough approximation).
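
      As a sanity check on those numbers (a quick Python sketch; the 230 V mains voltage is an assumption on my part):

        # Hedged sketch: real power implied by the measured PF and current.
        V_RMS = 230.0    # assumed mains voltage, volts RMS
        I_RMS = 0.15     # measured current, amps RMS
        PF = 0.04        # power factor from the DIY method above

        real_power = PF * V_RMS * I_RMS   # P = PF * V * I
        print(f"Estimated real standby power: {real_power:.1f} W")
        # -> roughly 1.4 W: almost all of the 0.15 A is reactive, not real consumption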

      As for why there is a capacitor across the mains input […]

      I have the basics of how a switching power supply works, but I was asking because it seemed weird to me that commercial appliances didn't take any stand-by measures to avoid “keeping the wires warm”… is this the norm?

      • litchralee

        commercial appliances didn’t take any stand-by measures to avoid “keeping the wires warm”

        Generally speaking, the amount of standby current attributable to the capacitors has historically paled in comparison to the much higher standby draw of the active electronics inside. The One Watt Initiative is one program that shed light on “vampire draw” and set a tangible target for what an appliance’s standby power should look like: 1 Watt.

        A rather infamous example of profligate standby power was TV set-top boxes, rented from the satellite or cable TV company, at some 35 Watts. Because these weren’t owned by customers, so-called free-market principles couldn’t apply and consumers couldn’t “vote with their feet” for less power-hungry set-top boxes. And the satellite/cable TV companies didn’t care, since they weren’t the ones paying for the electricity to keep those boxes powered. Hence, a perverse scenario where power was being actively wasted.

        It took both carrots (e.g. EnergyStar labels) and sticks (e.g. EU and California legislation) to change this sordid situation. But to answer your question in the modern day, where standby power is now mostly kept around 1 Watt or lower, it all boils down to design tradeoffs.

        For most consumer products, a physical power-switch has gone the way of the dodo. The demand is for products which can turn “off” but can start up again at a moment’s notice. Excellent electronics design could achieve low-power consumption in the milliwatts, but this often entails an entirely separate circuit and supply which is used to wake up the main circuit of the appliance. That’s extra parts and thus more that can go wrong and cause warranty claims. This is really only pursued if power consumption is paramount, such as for battery-powered devices. And even with all that effort, the power draw will never be zero.

        So instead, the more common approach is to reuse the existing supply and circuitry, but try to optimize it when not in active operation. That means accepting that the power supply circuitry will have some amount of always-on draw, and that the total appliance will have a standby power draw which is deemed acceptable.

        I would also be remiss if I didn’t mention the EU directives since 2013 that mandate particular power-factor targets, which for most non-motor appliances can only be met with active components, i.e. Active Power Factor Correction (Active PFC). While not strictly addressing standby power, this is an example of a measure undertaken to avoid the heating caused by apparent power, both locally and through the grid.
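
        To illustrate why apparent power matters for heating (a minimal sketch; the 500 W load, 0.6 power factor and 0.2 ohm wiring resistance are purely illustrative values, not data about any real appliance):

          # Hedged sketch: for the same real power, a worse power factor means
          # more current and therefore more I^2*R heating in the wiring.
          V_RMS = 230.0    # assumed mains voltage, volts RMS
          P_LOAD = 500.0   # real power drawn by the appliance, watts (illustrative)
          R_LINE = 0.2     # resistance of the supply wiring, ohms (illustrative)

          for pf in (1.0, 0.6):
              i_rms = P_LOAD / (V_RMS * pf)   # line current at this power factor
              loss = i_rms ** 2 * R_LINE      # I^2*R heating in the wiring
              print(f"PF {pf}: current {i_rms:.2f} A, wiring loss {loss:.2f} W")
          # -> at PF 0.6 the same 500 W needs ~67% more current and ~2.8x the wiring loss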

    • mattreb@feddit.itOP

      Thanks, I’ll have a look. It’s a universal-mains power supply with no voltage switch.