Amid the transition to an electrified transportation sector, efforts to decarbonize the U.S. power grid are evident in the planned additions and retirements of utility-scale electricity generating capacity.
You’re comparing GW (nameplate capacity: how much a plant can generate at a given instant) with GWh (the amount actually generated over the course of a year), which makes things look a lot worse than they are.
It’s still not enough; I’m expecting the rate of decarbonization to pick up as the factories to support it are finished.
Oh thanks. Glad I was misreading something.
The amount generated is more relevant. Stating capacity is misleading: of course higher generation implies higher capacity.

A 1GW solar plant will produce (random numbers) 4GWh over roughly 4 full-power hours per day (say 10GWh across 10 hours on a summer day and 2GWh across 2 hours on a winter day, so 4 is a rough annual average, again, random…). Over a year that’s 4 × 365 = 1460GWh at 100% uptime, less if there is downtime. A fossil plant, for example: a 1GW plant produces 24GWh per day, that’s 8760GWh per year, but only at 100% year-long uptime; with 20% downtime it’s only 7008GWh, and so on. Stating capacity is misleading; show us production graphs so we can judge how much was actually generated in a given area per year.
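The arithmetic above can be sketched in a few lines; the function name and the 4-hour/uptime figures are the comment’s own made-up illustrative numbers, not real plant data:

```python
# Toy illustration of the arithmetic above: annual energy (GWh) from
# nameplate capacity (GW), equivalent full-power hours per day, and uptime.

def annual_generation_gwh(capacity_gw: float,
                          full_power_hours_per_day: float,
                          uptime: float = 1.0) -> float:
    """Energy generated in a year, in GWh, at the given average duty cycle."""
    return capacity_gw * full_power_hours_per_day * 365 * uptime

# 1 GW solar at ~4 full-power hours/day (the comment's invented average):
solar = annual_generation_gwh(1.0, 4.0)                      # 1460.0 GWh
# 1 GW fossil plant running flat out, then with 20% downtime:
fossil_full = annual_generation_gwh(1.0, 24.0)               # 8760.0 GWh
fossil_real = annual_generation_gwh(1.0, 24.0, uptime=0.8)   # ~7008 GWh
```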
Sure… but don’t use capacity to compare directly with the amount generated. Mixing units like that in a comparison tends to produce nonsense answers.
Capacity measures capacity.
Demanding people talk about average power when they’re talking about capacity is idiotic.
If you have a 1GW load to run for two hours at midnight on a winter day, a 1GWac solar array is useless. So is a 200MW fossil fuel plant that generates the same average power over the year. If you need to run 1GW of air conditioning during the hottest hours of the year, the solar array is pretty good, but the 200MW thermal plant is still useless.
A 200MW OCGT peaker that runs for 2 hours in California during summer is fully replaced by a 200MW, 2-hour battery array, because it will never be short of energy to charge.
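The point in the two comments above is that equal *average* power does not mean equal ability to serve a given load window. A toy hour-by-hour sketch, with entirely invented profile numbers:

```python
# Toy sketch (all numbers invented): a 1 GWac solar array and a 200 MW
# flat-output thermal plant have similar average output, but serve very
# different load windows.

# Hour-by-hour output over one day, in MW (hour 0 = midnight).
solar_mw = [0] * 7 + [500, 800, 1000, 1000, 1000, 1000, 1000, 800, 500] + [0] * 8
thermal_mw = [200] * 24

def can_serve(load_mw: float, window_hours: range, profile: list) -> bool:
    """True if the profile meets the load in every hour of the window."""
    return all(profile[h] >= load_mw for h in window_hours)

midnight_window = range(0, 2)    # 1 GW load, midnight to 2am
midday_window = range(11, 13)    # 1 GW of air conditioning at midday

can_serve(1000, midnight_window, solar_mw)    # False: sun is down
can_serve(1000, midnight_window, thermal_mw)  # False: only 200 MW of capacity
can_serve(1000, midday_window, solar_mw)      # True
can_serve(1000, midday_window, thermal_mw)    # False
```

Neither resource is "better" in the abstract; adequacy depends on when the load occurs, which is exactly why a single capacity or average-power number can mislead.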
The industry standard measure is watts.
Pretending a gas peaker is the same as a coal plant whilst pearl clutching that capacity is being discussed is disingenuous nonsense.
A lot of infrastructure developments are nearing completion too, which makes it easier to integrate new renewable projects. Hopefully we’ll see an increase in usage tied to generation among industries that can best make use of power at peak times, as a way of stabilising the grid: when it’s windy they make hydrogen or extract carbon from the air using the excess energy, then turn off when generation falls. The more this replaces traditional constant-draw systems, the easier and more productive it is to add renewables to the grid, especially at scale.
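The flexible-load idea above can be sketched as a dispatch rule; the wind and demand figures are invented for illustration:

```python
# Toy sketch (invented numbers) of the flexible-load idea above: a hydrogen
# electrolyser that runs only when wind output exceeds the rest of demand,
# soaking up the surplus and shutting off as generation falls.

def electrolyser_dispatch(wind_mw: list, other_demand_mw: list,
                          electrolyser_mw: float) -> list:
    """Per-hour electrolyser draw: absorb surplus wind, capped at its rating."""
    return [min(max(w - d, 0), electrolyser_mw)
            for w, d in zip(wind_mw, other_demand_mw)]

wind = [900, 700, 400, 100]     # a windy evening tailing off, MW
demand = [500, 500, 500, 500]   # rest-of-grid demand, flat for simplicity

electrolyser_dispatch(wind, demand, electrolyser_mw=300)  # [300, 200, 0, 0]
```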
A similar thing is likely to emerge with electric cars, e-bikes, and other battery devices: smart meter tariffs that let people charge only when the grid has power to spare and prices are lower. If these are paired with home solar and localised generation, it could really help take the pressure off long-distance transmission lines.
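The tariff idea above amounts to a simple price-threshold rule; the prices and threshold here are made up for illustration:

```python
# Hypothetical sketch of the smart-tariff idea above: a charger that only
# draws power in hours when the grid price is below a user-set threshold.

def charging_hours(prices: list, threshold: float) -> list:
    """Indices of the hours in which the charger would draw power."""
    return [h for h, p in enumerate(prices) if p < threshold]

# Invented half-day of prices, cheap overnight when wind is spare:
prices = [3, 2, 2, 4, 9, 12, 15, 14, 10, 8, 4, 3]
charging_hours(prices, threshold=5)  # [0, 1, 2, 3, 10, 11]
```

A real tariff would add constraints (the car must be full by morning, for instance), but the core mechanism is just shifting flexible demand into the cheap hours.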