“Never plug extension cords into extension cords” is probably the most common piece of electrical-related advice I’ve ever heard. But if you have, say, 2 x 2 m long extension cords, and you plug one into the other, why is that considered a lot more unsafe than just using a single 4 or 5 meter cord?
Does it just boil down to that extra connection creating another opportunity for the prongs to slip out and cause a spark or short circuit? Or is there something else happening there?
For that matter - why aren’t super long extension cords (50 or more meters) considered unsafe? Does that also just come down to a matter of only having 2 connections versus 4 or more on a daisy chained cord?
Followup stupid question: is whatever causes piggybacked extension cords to be considered unsafe actually that dangerous, or is it the sort of thing that gets parroted around and misconstrued/blown out of proportion? On a scale from “smoking 20 packs of cigarettes a day” to “stubbing your toe on a really heavy piece of furniture”, how dangerous would you subjectively rate daisy chaining extension cords, assuming it was only 1 hop (2 extension cords, no more), and was kept under 5 or 10 metres?
I’m sure there’s probably somebody bashing their head against a wall at these questions, but I’m not trying to be ignorant, I’m just curious. Thank you for tolerating my stupid questions
The longer the distance, the larger the diameter of the wire you need, due to resistance/heat.
Typically, extension cords are going to be manufactured with the thinnest wire they can get away with based on the safety requirements, in order to save on materials cost.
So plugging 2 short cords together might cover the same distance as 1 longer cord, but the longer cord will use thicker wire to maintain the proper margin of safety.
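To put rough numbers on that (everything below is an assumption for illustration: the conductor sizes, the lengths, and the 10 A load), here’s a quick Python sketch comparing the I²R heat in two thin daisy-chained cords against one longer cord built with thicker wire:

```python
# Rough sketch: heat wasted in two thin daisy-chained cords vs one thicker long cord.
# Cross-sections and the 10 A load are assumed values, not from the comment above.
RESISTIVITY_CU = 1.68e-8  # ohm*m, copper at room temperature

def cord_resistance(length_m, cross_section_mm2):
    """Round-trip resistance of a two-conductor cord (current goes out and back)."""
    area_m2 = cross_section_mm2 * 1e-6
    return RESISTIVITY_CU * (2 * length_m) / area_m2

current_a = 10.0

thin_chain = 2 * cord_resistance(2, 1.0)   # two 2 m cords with 1.0 mm^2 conductors (assumed)
thick_single = cord_resistance(5, 2.5)     # one 5 m cord with 2.5 mm^2 conductors (assumed)

for name, r in (("2 x 2 m @ 1.0 mm^2", thin_chain), ("1 x 5 m @ 2.5 mm^2", thick_single)):
    print(f"{name}: {r * 1000:.0f} mOhm, {current_a ** 2 * r:.1f} W lost as heat")
```

With those assumed figures the daisy chain wastes roughly twice the heat of the single thicker cord, even though it covers less distance.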
Distance by itself would be no different from a single cord of the same total length.
However, connection points are areas of localized resistance where the connectors meet, and that can create dangerous hot spots.
That said, those aren’t really the problem here:
The practical, human problem here is what matters. Connectors come loose, and that is what makes them dangerous. Most of this thread is treating the question like a paper test problem, when in reality other factors outweigh the “under ideal circumstances” analysis.
Said another way: each cable is given the minimum copper and insulation it needs for its length.
As soon as you plug two together, the total resistance is greater than either cord was designed for, and you’re relying on the safety margin.
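A minimal sketch of why the junction is the real worry, with every figure assumed for illustration: the heat generated along the cords is spread over metres of cable, while whatever heat the junction generates is concentrated in a single plug.

```python
# All resistances and the load current below are assumed, illustrative values.
def heat_w(resistance_ohm, current_a):
    return current_a ** 2 * resistance_ohm

current_a = 10.0
r_cord = 0.07           # ohms per 2 m cord, round trip (assumed)
r_contact_good = 0.005  # ohms for a clean, tight plug/socket junction (assumed)
r_contact_bad = 0.5     # ohms for a loose or corroded junction (assumed)

print(f"heat spread along two cords: {heat_w(2 * r_cord, current_a):.1f} W (over ~4 m of cable)")
print(f"heat at a good junction:     {heat_w(r_contact_good, current_a):.1f} W")
print(f"heat at a loose junction:    {heat_w(r_contact_bad, current_a):.1f} W, all in one small spot")
```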
I’ve run a full DJ setup with speakers, a mixer, soundboard, laptop, etc. off a single line of 6-8 daisy-chained extension cords more times than I can count.
…uh…how have I never learned of this.
Honestly that’s probably not a huge electrical load.
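For what it’s worth, a very rough ballpark (these wattages are my assumptions, not the poster’s actual gear):

```python
# Assumed, ballpark wattages for a small DJ rig; none of these are measured figures.
rig_w = {
    "powered speakers (2)": 2 * 300,
    "mixer": 30,
    "soundboard": 40,
    "laptop": 65,
}
total_w = sum(rig_w.values())
print(f"~{total_w} W total, i.e. roughly {total_w / 230:.1f} A at 230 V or {total_w / 120:.1f} A at 120 V")
```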
DJs really will find any excuse to tell you they’re a DJ
Edit: you wrote and deleted a super unhappy message so just wanted to say sorry, I was just being goofy not trying to rag on you
deleted by creator
Because you ain’t loud enough lol
The longer the wire, the more surface area it has to dissipate heat, so no, you don’t need the wire to be thicker.
Not sure if you’ve ever used fuse wire before. It’s what was used before capsule fuses and breakers. Essentially, if too much current goes through it, it will melt, breaking the circuit as protection. The thicker the fuse wire, the more current it can pass through without melting. The length of the wire doesn’t come into it. 1cm of 10 amp fuse wire will melt at the same current as 1 meter of 10 amp fuse wire.
Yes, maximum carried current is indeed invariant of length. I explained the math behind it in https://lemm.ee/comment/17115060
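A small numeric sketch of why (the copper resistivity is real; the 10 A and 1.0 mm² figures are assumed): total heat grows with length, but the watts per metre, which is what actually sets the wire’s temperature, stay the same.

```python
RESISTIVITY_CU = 1.68e-8  # ohm*m, copper

def heat_per_metre(current_a, cross_section_mm2):
    r_per_metre = RESISTIVITY_CU / (cross_section_mm2 * 1e-6)  # ohms per metre of conductor
    return current_a ** 2 * r_per_metre                        # watts per metre

# Same wire, same current, wildly different lengths: the W/m never changes.
for length_m in (0.01, 1, 50):
    w_per_m = heat_per_metre(10, 1.0)
    print(f"{length_m:>6} m: {w_per_m * length_m:8.3f} W total, {w_per_m:.2f} W per metre")
```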
But unless it’s coiled up on the ground, the longer cable also has more area to dissipate heat, so the longer cable doesn’t change anything here. The heat output will be the same for any section of the cable no matter how much more cable there is on either side of it.
The only thing the extra resistance would affect is the voltage drop at the end device. But voltage drop varies wildly anyway, so you are unlikely to see a meaningful difference caused by a few extension cords (unless you’re already starting from a bad case, like the far end of an apartment building’s wiring).
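As a back-of-the-envelope example (the supply voltage, load current and resistances below are all assumed values):

```python
# Assumed figures only: how much of the total voltage drop the cords actually cause.
supply_v = 230.0       # or 120.0 in North America
current_a = 10.0
r_house_wiring = 0.4   # ohms from the outlet back to the panel/transformer (assumed)
r_extensions = 0.13    # two short extension cords in series (assumed)

drop_cords_v = current_a * r_extensions
drop_total_v = current_a * (r_house_wiring + r_extensions)
print(f"appliance sees about {supply_v - drop_total_v:.0f} V; "
      f"only {drop_cords_v:.1f} V of the drop comes from the extension cords")
```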
Resistance of a cable is (resistivity × length)/(πr²), so the resistance increases with length, which is why longer extension cords are designed with thicker wire to reduce resistance. Power grids are voltage-stabilized, so the voltage drop will be negligible, but it takes more power to push current down the daisy chain, producing more heat.
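Plugging numbers into that formula (copper resistivity is the standard value; the radii and lengths are assumed for illustration):

```python
import math

RESISTIVITY_CU = 1.68e-8  # ohm*m, copper

def resistance(length_m, radius_mm):
    return RESISTIVITY_CU * length_m / (math.pi * (radius_mm * 1e-3) ** 2)

# Doubling the length doubles the resistance; doubling the radius cuts it to a quarter.
print(f"{resistance(4, 0.5):.3f} ohm")   # 4 m of 0.5 mm radius conductor  -> ~0.086 ohm
print(f"{resistance(8, 0.5):.3f} ohm")   # twice the length                -> ~0.171 ohm
print(f"{resistance(8, 1.0):.3f} ohm")   # twice the length, twice radius  -> ~0.043 ohm
```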
And the temperature difference to ambient is

temperature difference = thermal resistivity × dissipated power / (2 × π × radius × length)

If you plug dissipated power = resistance × current² and the resistance formula into it, you will see that the temperature difference is invariant of length.
The longer wire (being also thicker) has less resistance and is therefore wasting less power as heat; that’s where the voltage drop is going.
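A quick symbolic check of that length-cancellation claim, as a sketch using sympy (the symbol names are mine):

```python
import sympy as sp

# resistivity, length, radius, current, thermal resistivity
rho, L, r, I, k = sp.symbols("rho L r I k", positive=True)

R = rho * L / (sp.pi * r ** 2)        # resistance of the conductor
P = R * I ** 2                        # power dissipated as heat
dT = k * P / (2 * sp.pi * r * L)      # temperature rise over ambient

print(sp.simplify(dT))                # I**2*k*rho/(2*pi**2*r**3): the length L cancels out
```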
Sure, most of the time it’s fine if you know what you’re doing, but that’s why it’s general wisdom and not a hard rule, like “don’t put metal in the microwave”: it’s said to protect people who have no idea what they’re doing or why the saying exists.
I have no idea whether this really is the answer but it seems like the most plausible answer.