The video dissects a USB-C cable marked with a 10A rating even though there is no such rating in the standard.

It would be interesting to know what this is meant for, as I've never seen a device with such a rating.

    • Thorry84@feddit.nl · 2 months ago

      Is it? I think USB 3.2 only goes up to 20V 5A for 100 watts of power. I don’t think 10A is in the spec.

      This cable also turns into a heater at 10A, so I don’t think it can do it for long. You need a pretty thick cable for 10A, 3 times as big as you need for 5A. So cheap Chinese cables won’t do 10A even if they use copper instead of CCA.
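
      To put rough numbers on that heating point, here's a quick sketch in Python. The 0.2 ohm round-trip resistance is an assumed figure for a thin, cheap cable, purely for illustration; real cables vary a lot.

      ```python
      # Resistive heating in the cable itself: P = I^2 * R.
      # The 0.2 ohm round-trip resistance is an assumed illustrative figure,
      # not a measured value for any particular cable.
      CABLE_RESISTANCE_OHMS = 0.2

      def cable_heat_watts(current_amps, resistance_ohms=CABLE_RESISTANCE_OHMS):
          """Power dissipated in the cable: P = I^2 * R."""
          return current_amps ** 2 * resistance_ohms

      for amps in (3, 5, 10):
          print(f"{amps:>2} A -> {cable_heat_watts(amps):.1f} W lost as heat in the cable")
      # Doubling the current from 5 A to 10 A quadruples the heat
      # (5.0 W -> 20.0 W), which is why the cable needs much more copper.
      ```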

      • tia@lemmy.world (OP) · 2 months ago

        Pretty much what I expected, but I don't think they care too much about that.

      • Gadg8eer · 1 month ago

        I know it’s not my specialty, but isn’t an amp and a watt related? I’ve seen cables that conduct anywhere from 40w (Steam Deck) up to 240w (cables on Alibaba, but don’t expect a low price for them, and above 140w they’re much thicker than usual by design or you’re being scammed with false advertising).

        • beastlykings · 1 month ago (edited)

          Amps and watts are related, but you need voltage to complete the equation.

          Wattage, which is total power delivered, is derived by multiplying volts times amps.

          So the Steam Deck can charge using 40 watts. A quick Google says that its max charging voltage is 15 volts, and its max charging amperage is 2.6 amps.

          15 times 2.6? 39. That’s basically 40 watts.

          Now, the Deck can also use 12v, 9v, and even 5v in a pinch. But that’s only 31, 23, and 13 watts respectively. So you’re likely not going to be able to charge and play at the same time unless your charger is capable of 15 volts at 2.6 amps.
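
          The same arithmetic as a quick sketch, if it helps (the 2.6 amp figure is just the Googled number above, not something verified against an official spec):

          ```python
          # P = V * I, worked for the Steam Deck numbers above.
          MAX_CURRENT_AMPS = 2.6  # the Googled figure from above, assumed here

          for volts in (15, 12, 9, 5):
              watts = volts * MAX_CURRENT_AMPS
              print(f"{volts:>2} V x {MAX_CURRENT_AMPS} A = {watts:.0f} W")
          # 15 V -> 39 W (the "basically 40 watts" case)
          # 12 V -> 31 W, 9 V -> 23 W, 5 V -> 13 W
          ```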

          Now for 240w, if you wanted to do that at 15 volts, you’d need 16 amps! 15 volts times 16 amps is 240 watts. At 12, 9, and 5 volts you’d need 20, 27, and 48 amps respectively. You’d need a pair of cables bigger than your thumb to carry that many amps without overheating.

          That’s why in the USB C PD spec, 240w is only possible if your device can accept 48 volts.

          48 volts times 5 amps is 240w.

          In fact the max amperage is 5 amps, and that requires a special cable with a special chip to prove it won’t melt. So at lower voltages, 36 and 28, the max wattage is 180 and 140 respectively.
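
          Here's that math both ways, as a small sketch using the voltage steps above:

          ```python
          # Both sides of P = V * I for 240 W charging.
          TARGET_WATTS = 240
          MAX_CABLE_AMPS = 5  # needs the specially marked cable described above

          # Amps required to hit 240 W at ordinary PD voltages:
          for volts in (15, 12, 9, 5):
              print(f"240 W at {volts:>2} V needs {TARGET_WATTS / volts:.0f} A")

          # Maximum wattage at the higher voltages with the 5 A limit:
          for volts in (28, 36, 48):
              print(f"{volts} V x {MAX_CABLE_AMPS} A = {volts * MAX_CABLE_AMPS} W")
          # 28 V -> 140 W, 36 V -> 180 W, 48 V -> 240 W
          ```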

          That’s the beauty of electricity, and the relationship between voltage and amperage. The more power(watts) you want to deliver, the more amps you need. But, if you can increase the voltage, you can deliver the same power for less amperage.

          That’s why overhead power lines are high voltage, 12,000 volts for residential lines in the USA.

          A standard home has 240 volt service, split into two 120 volt “phases”. The maximum amperage at the main breaker is usually 100 or 200 amps.

          240 volts times 200 amps is 48,000 watts!

          Now, once that gets on the overhead lines, the amperage is much less.

          48,000 watts divided by 12,000 volts? 4 amps. Max out the panel with the washer, dryer, stove, water heater, everything, and you could carry that entire household load over a single USB C cable at 4 amps, if you could get the voltage up to 12,000 volts.
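
          Spelled out as a quick sketch (numbers only, purely hypothetical):

          ```python
          # A maxed-out 240 V / 200 A panel carried at distribution-line voltage.
          PANEL_VOLTS = 240
          PANEL_AMPS = 200
          LINE_VOLTS = 12_000  # typical US residential distribution voltage

          household_watts = PANEL_VOLTS * PANEL_AMPS   # 48,000 W
          line_amps = household_watts / LINE_VOLTS     # 4 A

          print(f"Household load: {household_watts:,} W")
          print(f"At {LINE_VOLTS:,} V that is only {line_amps:.0f} A")
          ```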

          You can’t of course, because electricity gets really jumpy at those voltages, so you’d need two USB C cables and you’d have to hold them several feet away from each other and the ground or anything else conductive. But still, you could do it, and that’s amazing!

          Sorry for the big write up. Thanks for coming to my TED Talk. There are pamphlets at the doors.