cross-posted from: https://lemmy.world/post/22892955

The Prius Prime is a dual-fuel vehicle: it can run 100% on electricity, 100% on gasoline, or a computer-managed blend of the two. This gives me a great opportunity to directly compare an EV drivetrain against an ICE drivetrain in the same car.

  • Toyota computer claims 3.2 mi/kWh.

  • Kill A Watt meter (https://en.wikipedia.org/wiki/Kill_A_Watt) claims 2.2 mi/kWh.

  • Assume an additional ~1.7% loss in the wires if you wish (120V drops to 118V during charging, so 2V is dropped across the resistance of my home's wiring).

  • Level 1 charger at home (known to be less efficient).

  • Toyota computer claims 53 miles per gallon (US gallon).

  • I have not independently verified my car's gallon usage.

  • 295 miles driven total: sometimes EV, sometimes gasoline, sometimes both.

  • 30F to 40F (-1C to 4.5C) in my area this past week.

  • Winter-blend fuel.

  • 12.5 miles per electricity dollar (home charging at 17.1¢/kWh).

  • 17.1 miles per gasoline dollar ($3.10 per gallon at my last fill-up).

Feel free to ask questions about my tests. The main takeaway is that L1 charging is so inefficient that gasoline in my area is cheaper than electricity. Obviously the prices of gasoline and electricity vary significantly from area to area, so feel free to plug my numbers in and calculate the costs for your own area.
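If you want to plug in your own local rates, the arithmetic is just two divisions. Here's a small Python sketch using the numbers from the bullets above (the result lands slightly above my 12.5 mi/$ bullet, which comes from rounding in the underlying meter readings):

```python
# Cost-per-mile comparison using my measured numbers.
# Swap in your own local prices to simulate your area.
miles_per_kwh = 2.2        # wall-measured (Kill A Watt), includes L1 charger losses
price_per_kwh = 0.171      # $/kWh, my home electricity rate
miles_per_gallon = 53.0    # Toyota computer readout
price_per_gallon = 3.10    # $/gallon, winter blend

ev_miles_per_dollar = miles_per_kwh / price_per_kwh
gas_miles_per_dollar = miles_per_gallon / price_per_gallon

print(f"EV:  {ev_miles_per_dollar:.1f} miles per dollar")
print(f"Gas: {gas_miles_per_dollar:.1f} miles per dollar")
```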

There are also substantial efficiency losses in cold weather, which are well acknowledged by the EV community. The Prius Prime (and most other EVs) runs a heater to keep the battery conditioned in the winter, spending precious electricity on battery conditioning rather than miles. Gasoline engines do not have this problem and remain just as efficient in the winter.


I originally wrote this post for /c/cars, but I feel like EVs come up often enough here on /c/technology that maybe you all would be interested in my tests as well.

  • sploosh@lemmy.world · 3 days ago
    Nice info! A few notes:

    Volts are not energy; volts are one component of electrical power. The drop in voltage is caused by the additional load the charger puts on the circuit. Energy losses come out as heat: if something gets hot while it's doing its job and its job is not to heat things, it is losing energy somewhere. I'd bet the charger warms up in use, and that loss is likely greater than the ~1.75% you attributed to the voltage sag.

    L1 charging is less efficient for a few reasons, but the biggest gain from L2+ chargers is time: triple the charge rate gets you a full battery in an afternoon rather than overnight. You can then parlay that into cost savings by timing your charge for off-peak hours and filling up on cheaper electricity, if that's available in your area.
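    To put a rough number on that off-peak point, here's a hypothetical sketch. The pack size and off-peak rate below are placeholder assumptions for illustration, not figures from this thread; only the 17.1¢/kWh rate comes from the original post:

```python
# Hypothetical off-peak savings example. pack_kwh and offpeak_rate are
# made-up placeholders; peak_rate is the home rate quoted in the post.
pack_kwh = 10.0          # assumed usable pack size, for illustration only
peak_rate = 0.171        # $/kWh, the rate from the original post
offpeak_rate = 0.10      # $/kWh, assumed off-peak rate

cost_peak = pack_kwh * peak_rate          # full charge at the daytime rate
cost_offpeak = pack_kwh * offpeak_rate    # same charge, scheduled off-peak
savings = cost_peak - cost_offpeak

print(f"Full charge: ${cost_peak:.2f} peak vs ${cost_offpeak:.2f} off-peak "
      f"(saves ${savings:.2f})")
```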

    Electric cars do have an easily measurable range drop in cold weather for the reasons you outlined. Gas cars really shine in cold weather because not only do they not perform measurably worse in terms of range when the thermometer dips, they also make their own heat, ALL THE TIME. In an electric car you have to use battery power to heat both the battery and the cabin, which is yet another drain on range. But that combustion heat is a loss in the summer, when you're not using it: like the waste from your charger, except this time it also comes with emissions.

    • SuperSpruce@lemmy.zip · 2 days ago

      Not to mention, a naturally aspirated IC engine actually makes more power in cold weather because the colder intake air is denser and carries more oxygen per intake stroke. (This is the same reason an intercooler increases the power output of a tuner car.)

    • dragontamer@lemmy.world (OP) · 2 days ago
      Oh yes.

      But a wire's resistance is purely ohmic (or damn near a pure resistance; there's very little inductance or capacitance to speak of).

      So a voltage sag from 120V (unloaded) to 118V across the 11 amps my charger pulls means we can calculate the wiring resistance as 2V / 11A ≈ 0.18 ohms.

      The remaining 118V reaching the charger and the car is the usable power. After all, the current is constant (11 amps).

      Power = Voltage * Current.

      With 2V * 11A being wire waste, that's 22W lost in my home's wires and roughly 1298W delivered to the actual charger.

      Additional losses clearly show up later (winter, battery heater, air conditioner, etc.). But the 22W of wire heat out of the 1320W drawn at my power meter (where my electric bill is calculated) is the ~1.7% loss for this element.


      So that ~1.7% is literally the loss in my home's wires, before accounting for the charger itself heating up.
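      The arithmetic above, written out as a short Python sketch (all inputs come from this thread; treat it as an illustration of the ohmic-loss estimate, not a precise measurement):

```python
# Ohmic wire-loss estimate from the voltage sag observed during L1 charging.
v_unloaded = 120.0   # volts at the outlet before charging starts
v_loaded = 118.0     # volts while the charger is pulling current
amps = 11.0          # L1 charger draw

v_drop = v_unloaded - v_loaded            # 2 V dropped across the house wiring
wire_resistance = v_drop / amps           # ~0.18 ohms of wiring resistance
wire_loss_w = v_drop * amps               # 22 W wasted as heat in the wires
total_draw_w = v_unloaded * amps          # 1320 W billed at the meter
delivered_w = v_loaded * amps             # 1298 W reaching the charger
loss_pct = 100.0 * wire_loss_w / total_draw_w  # ~1.7% lost before the charger

print(f"Wire resistance: {wire_resistance:.2f} ohms")
print(f"Wire loss: {wire_loss_w:.0f} W ({loss_pct:.1f}% of total draw)")
```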

      It sounds like you know all this, but I just wanted to clarify and double-check.