> The decrease is almost entirely due to gains in lighting efficiency in households...
The article is an interesting treatment of how lighting is getting more efficient and well worth a read. But pedantically zooming in on this one throwaway phrase for a second... this is a misinterpretation of the data on 2 levels.
1) The (badly labelled) graph seems to be displaying a very very slight linear uptrend for "residential".
2) Energy is literally the first example of where we expect to see Jevons paradox [0]. If its use is going down, that is because energy is getting more expensive in real terms. If the only trend here was lighting getting more efficient, households on aggregate would find ways to use more electricity because it is extremely fungible.
By default the proper way to interpret the data (if for the sake of argument I say what I would interpret as a slight uptrend is actually a downtrend) is that electricity is getting more expensive in real terms. The impact that has on living standards is cushioned somewhat by improvements in lighting efficiency. But if electricity costs were steady and lighting efficiency improved, we'd expect to see an increase in electricity use.
That depends on the demand elasticity of the thing in question. If everyone already had about all the lights they wanted in their homes (a somewhat reasonable assumption, give or take a proportion of families turning off lights religiously), making them cheaper wouldn't make you go out and buy more lights -- certainly not enough to make up for the cheaper cost.
Stated somewhat differently, when you switched from incandescents to LEDs, did you add more or less than 8x the total lumens to your home? I'm guessing quite a bit less. My apartment is a little small, but I couldn't manage 8x if I doubled the brightness in every room and always left every light on. No lighting efficiency improvements are going to convince me to go beyond that threshold.
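To put rough numbers on that 8x point (the 60 W / 10 W / 800 lm figures below are typical values I'm assuming, not from the thread):

```python
# Back-of-envelope: how much light would you have to add for an LED
# swap's efficiency gain to be fully "eaten" by extra lighting?
# Assumed figures: a 60 W incandescent produces ~800 lm; an
# equivalent-brightness LED draws ~10 W.
INCANDESCENT_W, LED_W, LUMENS = 60, 10, 800

efficiency_gain = INCANDESCENT_W / LED_W       # watts per lumen ratio
breakeven_lumens = LUMENS * efficiency_gain    # lumens needed to match old draw

print(f"Efficiency gain: {efficiency_gain:.0f}x")
print(f"To draw the same power as one old bulb you'd need {breakeven_lumens:.0f} lm")
print(f"That's {efficiency_gain:.0f} bulbs' worth of light instead of one")
```

In other words, to keep your lighting power draw constant you'd have to install six times the light you had before, per bulb replaced.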
The point about households on aggregate finding ways to use more electricity doesn't apply, because it's not power those houses now have in comparative excess, but rather dollars. You don't have to buy as much power to do the things you want, so you have more dollars and can choose to spend them on additional electricity consumption or massively inflated rent or a vacation or whatever. The _unit cost_ of electricity (note that Jevons paradox applies to unit costs, not substituted goods or alternative sources of income or savings) didn't change by making lighting more efficient (and, as you've noted, has gotten more expensive), so you're no more incentivized than you already were to spend those dollars on electricity consumption instead of other things you could buy.
I think this just doesn't come to terms with how much more efficient modern lights are?
I remember when folks were resisting LED lights at the start. Folks would literally promote turning off the lights earlier to save energy. Remember back when making sure the lights were out was a big deal?
Turns out, 60-100 watts down to 10 is just ridiculously hard to come to terms with. Turning off the lights early just doesn't compete. Not even close.
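A quick comparison makes this concrete (the hours-per-day figure is an illustrative assumption, not from the thread):

```python
# Turning a 60 W incandescent off one hour early vs. swapping it
# for a 10 W LED that runs 5 hours a day. Both savings are in Wh/day.
incandescent_w, led_w = 60, 10
hours_per_day = 5

save_turn_off_early = incandescent_w * 1                   # one fewer hour of use
save_led_swap = (incandescent_w - led_w) * hours_per_day   # lower draw all evening

print(f"Turn off 1h early: {save_turn_off_early} Wh/day")
print(f"Swap to LED:       {save_led_swap} Wh/day")
```

Under these assumptions the bulb swap saves roughly four times what the turn-it-off-early discipline does, and it requires no ongoing vigilance.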
This also ignores how much more efficient other things are. Televisions would be an amusing one. It isn't as dramatic, sure, but it is about a quarter of the energy?
> I remember when folks were resisting LED lights at the start.
At the start, LEDs were horrible. There were early versions where you could point your cell phone camera in video mode at the lights and see them blinking. We did that just to prove that some of us could see these lights strobing. Now their blink rates are so high that you'll never see them, except maybe with the highest of high-frame-rate cameras that mere mortals will never own.
I would be curious as to which was worse between the early LEDs and CFLs. I hated both.
To be sure! Early low-flow toilets were also pretty bad. Modern ones are a whole new thing, though.
It's a lot like battery technology. It progressed a ton with people not realizing it. If you said I could get a battery powered lawn mower, I'd have assumed you meant as a toy not even a decade ago.
I feel the same way about all of the generative AI stuff that keeps getting posted here. Some of it is just laughable at how bad it is that you have to seriously ask why would they show it off to people. Maybe in the next decade, we'll find the road out of uncanny valley.
There's a risk of people thinking the early stuff is so bad that it must be a joke, but the people working on it have to promote what they're doing in hopes of name recognition/funding. Sometimes it pays off to be first with low quality that matures; other times it's a death knell.
Decreasing time for lights on has a linear impact on power usage. Increasing the efficiency of the lights has a constant factor impact on power usage. It is a big constant, but stopping particularly egregious wastes of electricity can overcome it. That said, both at once is the best of both worlds.
I get what you are aiming at, but strictly, both have a linear relationship here?
More, you can only cut out so much of the time, and since you can run modern lights for the full day before you burn the energy you would save by turning off the lights an hour early, it is kind of silly at a personal level to think that will work out. (City level and larger, sure?)
Pulling it back to this article, the assertion is that getting more efficient with energy use for lights would just be offset by us using more lights. Which, I think it is fair to say that we do. Considerably more lights, in many instances. It doesn't completely negate the energy savings, but largely because of how big that energy savings was.
As far as the integral is concerned, the rate is a constant while time is a linear variable, so the improvement from LEDs is a constant time improvement and the improvement from running the lights less is a linear time improvement. Otherwise, we are in agreement.
That said, I do think that there is an incentive to spend the energy savings on things other than lighting.
But the power use is also constant over time t, such that any reduction in time use is effectively a constant drop in cost. You are literally shrinking either the width (time) or the height (electricity use) of the curve, as it were.
For your point to be stronger, you would have to show that using lights for longer makes them less efficient, such that you increase the unit cost of electricity over the time variable. Which may indeed be the case. The naive model, though, is just Cost = (electrical use) * (time), and reducing time is the same as reducing electrical use via efficiency.
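The naive model in question can be written out directly, which makes the symmetry clear: over a single period, shrinking the height (power) and shrinking the width (time) of the rectangle are the same kind of operation.

```python
# The naive model from the discussion: Energy = power * time.
# Shrinking power (efficiency) shrinks the rectangle's height;
# shrinking hours (behavior) shrinks its width.
def energy_wh(power_w: float, hours: float) -> float:
    return power_w * hours

base = energy_wh(60, 10)               # 60 W bulb, 10 h
more_efficient = energy_wh(10, 10)     # LED swap: shrink the height
less_runtime = energy_wh(60, 10 - 1)   # turn off 1 h early: shrink the width

print(base, more_efficient, less_runtime)  # 600 100 540
```

The "linear in time" distinction only matters when you integrate over a growing horizon; within any fixed day, both levers just rescale the same product.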
The Jevons effect has nothing to do with how efficient something gets. Lightbulbs could be free to build & operate and it'd still trigger Jevons. Electricity use won't drop unless it got more expensive to produce electricity. Or, ironically, unless electric goods suddenly become much less efficient for some reason.
It is built into the laws of supply and demand. If you want to argue that the Jevons effect won't apply you basically have to argue that the supply/demand curves are funky. In this case there is no reason to believe they are.
> The Jevons effect has nothing to do with how efficient something gets.
Why do you say that? From the first sentence of the link you posted: “In economics, the Jevons paradox occurs when technological advancements make a resource more efficient to use (thereby reducing the amount needed for a single application); however, as the cost of using the resource drops, if the price is highly elastic, this results in overall demand increases causing total resource consumption to rise.”
It has everything to do with efficiency, as higher efficiency is what makes it a paradox. It’s not even interesting, let alone a paradox, if we ignore efficiency and talk about lower prices leading to increased demand.
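Whether the rebound overwhelms the efficiency gain hinges on that elasticity clause. A toy constant-elasticity demand curve (all numbers illustrative, not from the thread) shows the crossover:

```python
# Sketch: whether an efficiency gain raises total resource use depends
# on the price elasticity of demand for the *service* (light), not on
# efficiency alone. Toy constant-elasticity demand curve.
def resource_use(efficiency: float, elasticity: float, k: float = 1.0) -> float:
    # Cost per unit of service falls as 1/efficiency, so demand for the
    # service is q = k * (1/efficiency)**(-elasticity) = k * efficiency**elasticity.
    service_demanded = k * efficiency ** elasticity
    return service_demanded / efficiency  # resource needed to supply it

for e in (0.5, 1.0, 2.0):
    before = resource_use(1.0, e)
    after = resource_use(8.0, e)  # ~8x efficiency gain, as with LEDs
    print(f"elasticity={e}: resource use goes {before:.2f} -> {after:.2f}")
```

With elasticity above 1, total consumption rises (Jevons); below 1, it falls. So the paradox is a possible outcome of efficiency gains, not an inevitable one.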
Why would the demand curve for lighting not be “funky”?
If someone offered to pay for all of your home lighting, no matter how much power your lighting took, how much lighting would you get? Presumably a bounded amount! But in this case, the price you are paying for power for lighting is zero. So, reducing the energy use per amount of lighting would presumably not increase the amount of lighting you use beyond this limit, and therefore would decrease the energy you use for lighting.
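That thought experiment amounts to a saturating demand curve, which is easy to sketch (the cap and the price figures are made-up illustrative numbers):

```python
# Sketch of the thought experiment: demand for home lighting saturates.
# Even at near-zero marginal cost you'd only want so many lumens, so past
# that point efficiency gains translate directly into less energy use.
SATURATION_LM = 10_000  # assumed cap on lumens anyone wants at home

def lumens_wanted(price_per_lm: float) -> float:
    # Toy demand curve: want more light as it gets cheaper, but capped.
    unconstrained = 100 / max(price_per_lm, 1e-9)
    return min(unconstrained, SATURATION_LM)

# Once saturated, raising efficacy just lowers the wattage drawn.
for efficacy in (15, 30, 120):             # lumens per watt
    price_per_lm = 0.001 / efficacy        # illustrative energy price
    watts = lumens_wanted(price_per_lm) / efficacy
    print(f"{efficacy} lm/W -> {watts:.0f} W")
```

At every efficacy level here the household is already at its lumen cap, so each efficiency improvement shows up entirely as reduced power draw rather than as rebound.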
The unrealistic assumptions - the argument started by assuming that resources are unlimited. Most of economics is expected to break down from that starting point. The point of all the models is theorising about how people will distribute limited resources.
It is like talking about the market for air or trying to measure its supply/demand curves. They are completely degenerate because it is too abundant for anything meaningful to be said.
The unrealistic assumption of someone offering to pay the part of your electricity bill that comes from lighting, no matter how big? Ok, yes, that’s clearly unrealistic,
but, the cost of this presumably wouldn’t be like, totally insane, right?
Of course, I don’t mean that in realistic scenarios that the amount of energy used for lighting would decrease directly by the proportion given by the increase in efficiency of light-per-power, which would only happen if you had exactly no demand for more lighting than you have even if it was free,
but that doesn’t mean the energy used would necessarily increase, just that it wouldn’t decrease as much as it would if you consumed the same amount of lighting.
I mean, the Jevons paradox is that increased efficiency can lead to increased use? No?
So, the idea would be that being able to more efficiently use electricity for running lights might still see us use more electricity for lights as we use them in more and more places. And, at a personal level, that kind of tracks. We don't hesitate to put lights in places that we used to accept as dark.
You certainly see this with televisions. The dramatic increase in efficiency afforded by new television technologies has seen us both start having screens everywhere, and in larger televisions.
Is Jevons’ applicable here? People only have a fixed square footage in their house that needs to be lit, and often negative utility to having rooms lit all of the time.
If electricity were cheaper you might turn the lights up higher instead of balancing cost vs. comfort, wouldn't use the eco-mode on the dishwasher that occasionally results in dirty dishes, would probably not think twice about washing clothes at 40 °C instead of 30 °C, maybe use a dryer instead of clothes racks blocking the living room for a day, use the more comfortable tankless water heater, properly preheat the oven to get the results you want, and so on... the list is endless.
But electricity often costs upwards of 30 cents/kWh nowadays, so you avoid doing all those comfy things. 'cause they're expensive.
My power is still $0.11/kWh... I haven't turned off my Christmas lights in 3 years. There are huge swaths of the US where power is still (relatively) dirt cheap and nobody thinks twice about the heavy-soil function on the dishwasher or leaving landscaping lights on.
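For a sense of scale (the 50 W figure is my assumption for a strand or two of LED Christmas lights, not from the post):

```python
# Cost of ~50 W of LED Christmas lights running 24/7 at $0.11/kWh.
watts, price_per_kwh = 50, 0.11
hours_per_year = 24 * 365

kwh_per_year = watts * hours_per_year / 1000
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/yr -> ${cost_per_year:.2f}/yr")
```

Call it a few dollars a month, which is exactly why nobody at those rates bothers flipping the switch.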
If I have to spend more than 30% of my monthly budget on power, I will not be taking cold showers or living in the cold. Consuming energy replaces other hobbies. High energy prices have been normalized, at least in my state. Same with gas. People had to stop caring, or leave.
- Purposefully made with line-frequency (60 Hz) refresh, which means it's actually constantly blinking. You can see this with LED Christmas lights by moving them.
- Packed with liquid capacitors designed to fail well before the rest of the board.
- Thermals run too hot and fry the diodes or the rectifier ICs.
- Wrong colors sold everywhere. A proper 4000 K 90+ CRI LED is hard to find and more expensive. The two most often available are 2700 K (yellow) and 5000 K (blue).
Why do you want 4000K? Incandescent/halogen are 2700K/3000K, and sunlight is 5000K. 4000K is fluorescent, and I don't think anyone wants lighting to look like that. That said, if that's what you want, Lowes has it (technically 3500K at CRI 90) https://www.lowes.com/pd/GE-Pro-60-Watt-EQ-A19-Bright-White-...
Also, only the absolute cheapest LEDs have noticeable flicker. Decent ones have big enough capacitors to make it unnoticeable.
[0] https://en.wikipedia.org/wiki/Jevons_paradox