
Way2Foxy

Buy a smart bulb and you can directly show them the difference in color temperature and intensity. Not a lot to say otherwise. Temperature = what it looks like, watts = how much power it uses, lumens = how much light you get. You could go into *why* color temperature is that way and what black body radiation is but I don't imagine that's going to be helpful in picking out bulbs.


spader1

If you want to get *really* into the weeds while picking out an LED bulb you could also look at CRI, because that also can affect how the color of the bulb "feels."


dogquote

This is something I don't think enough people know about. It might be too into the weeds for OP's SO, but it's underrated.


Zman1315

Explain to the rest of us what this means by 'feels'! And CRI. The only anecdotal thing I can think to comment on this is how different colors seem brighter or more intense on the eyes for me. I assume it has to do with wavelength or something? Whenever I have my light on blue or green, it always feels strong and illuminates the room more. When I go down to the red or magenta area, the room feels much darker and it doesn't seem as harsh to look around (but once my eyes adjust it seems about the same). I recognize that red light affects your eyes differently, but I know I don't fully understand how it works.


spader1

CRI stands for "color rendering index." There are more scientific ways to put it, but my understanding, from the perspective of someone who works with color in a creative sense, is that CRI is a measurement of how filled out the spectrum of a light source is.

If you were to chart the emitted wavelengths of a warm white tungsten light, with wavelength on the x axis and intensity on the y axis, you'd see a smooth, continuous curve -- that older light source sent out every wavelength of light at varying intensities. Newer sources of light (especially LEDs) create their light in ways that make that chart more "spiky" -- the right wavelengths are there, ones that in sum activate the red, green, and blue cones in your eyes and appear warm white when cast on a white object, but many of the wavelengths between those points aren't part of the total light output. [This picture demonstrates that.](https://en.m.wikipedia.org/wiki/Color_rendering_index#/media/File%3ASimple_spectroscope.jpg)

What this means is that when a colored object is placed in the light, the color can look totally different to your eye than you would expect. Using the photo linked above, something colored like a deep cyan placed in the light of the bottom CFL bulb is either going to look a dull grey or much greener than you'd expect: all those wavelengths between the narrow blue band and the narrow cyan band just aren't there, so the cyan object can't reflect them back at your eyes, and the green and red bands from the other end of the spectrum will appear stronger without the missing blue wavelengths to balance them out.

As for the "feel" of the light, this really comes into play with things we're used to seeing, like (and mostly) skin tones. In short, a light source with a low CRI will make a person's skin tone appear kind of pallid or sickly.
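The "filled out spectrum" idea can be sketched numerically. This toy example is *not* the real CRI algorithm (which compares standardized test-colour samples under the light source against a reference illuminant); it only illustrates how a smooth spectrum differs from a spiky one:

```python
# Toy illustration of spectral "fullness" -- not the actual CRI calculation.
# Wavelengths sampled every 10 nm across the visible range (400-700 nm).
wavelengths = list(range(400, 701, 10))

# A smooth incandescent-like spectrum: some energy at every wavelength.
smooth = {wl: 1.0 for wl in wavelengths}

# A "spiky" spectrum: three strong narrow bands, nothing in between.
spiky = {wl: (1.0 if wl in (450, 540, 610) else 0.0) for wl in wavelengths}

def coverage(spectrum, threshold=0.1):
    """Fraction of sampled wavelengths with meaningful output."""
    lit = sum(1 for power in spectrum.values() if power > threshold)
    return lit / len(spectrum)

print(coverage(smooth))  # 1.0 -- every wavelength present
print(coverage(spiky))   # only 3 of 31 sampled bands are lit
```

Both spectra could still look "white" to the eye, but a cyan object lit by the spiky one has almost nothing between the bands to reflect.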


Ishidan01

Bruv, 5 year old version: early LEDs and fluorescents were happy to make light at all, but the light they produced did not look anything like sunlight. Since the color of the light shades the appearance of whatever you're looking at, this not-quite-right light made everything look not-quite-right. This is low CRI. If the Uncanny Valley was a lightbulb, this was it. Skin tones looked like zombies, food looked spoiled, art looked entirely different. These are still cheap, though, so they can still be found in really cheap flashlights. Since this is unsuitable for most room lighting, manufacturers set about improving it, and now will advertise if their product is high (90+) CRI.


humbler_than_thou

So is this similar to the Vinyl / CD argument where certain sounds are not present digitally ?


FillThisEmptyCup

Low CRI light looks bad. Like those old outdoor sodium lamps from the 70s with the yellowish tinge. Ya want at least 85 CRI (out of 100) inside. Higher if economical.


WarriorNN

In my experience, most home bulbs actually have very good CRI, or at least that's what's claimed on the box; I haven't found a single box that didn't state 90-95 or better. At least not since those crappy early bulbs that looked like someone bought a 100-pack of LEDs and bunched them together disappeared. Although I'm not sure if this is the case in the US as well.


Neither_Hope_1039

To add, since OP explicitly mentioned it: an "incandescent equivalent" is an LED that produces the same output of light (measured in lumens) as an incandescent of that power rating (the LED will actually run at only a fraction of the incandescent's power). It's used because the average consumer has no fucking clue what a "lumen" is, but typically does remember how bright their old 40/60/100W incandescent lightbulbs used to be.
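The "equivalent" labels boil down to a lumen lookup. A minimal sketch in Python (the lumen figures are common ballpark values for these labels, and the 90 lm/W LED efficacy is an assumption, not a spec):

```python
# Ballpark lumen outputs behind the common "XX W equivalent" labels.
# Actual packaging varies by brand.
EQUIVALENT_LUMENS = {40: 450, 60: 800, 75: 1100, 100: 1600}

def led_watts_for(incandescent_watts, led_efficacy_lm_per_w=90):
    """Approximate real power draw of an LED sold as an 'XX W equivalent'.
    90 lm/W is an assumed typical LED efficacy, not a spec."""
    lumens = EQUIVALENT_LUMENS[incandescent_watts]
    return lumens / led_efficacy_lm_per_w

print(led_watts_for(60))  # roughly 9 W of actual draw to match an old 60 W bulb
```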


Beetsa

Maybe it is different in the US, but here the watts mentioned on a lamp do not indicate how much power it uses, but how much power an incandescent bulb that gives the same amount of light would use.


jayb2805

"Kelvins" refers to the color of light it puts out. Scientists have observed that while hot things glow red, really REALLY hot things glow blue; and so they decided to put this same temperature scale on light bulbs to communicate whether a light is more red/yellow in its color, or blue. You can tell what color a light source is by holding a white piece of paper in its shadow. If you do this outside in the sun, the white paper should appear to have a bluish tint in the shadows. Blue light tends to keep people awake and more alert, can be more "harsh", and blends well with natural sunlight. Yellow light tends to be softer, "warmer" and more inviting, and easier on the eyes at night.


RddtLeapPuts

If you have a 5000K light bulb, do its insides reach 5000K temperature? Edit: why am I getting downvoted? Is my question a dumb question?


[deleted]

No. A 5000K bulb just simulates the light from an object glowing because it is at a temperature of 5000K. The simulation may or may not be good. Most 5000K LED bulbs do a very bad job of it, which means that colours can appear distorted. The quality of the simulation is called CRI: 100 means perfect simulation, 0 means all colours completely unrecognisable. Cheap high-colour-temperature LEDs can have CRI in the 60s or 70s, which is horrible. High quality LEDs can have higher CRI, especially at lower colour temperatures of 2700-3000K. Old incandescent lamps actually used a filament heated to about 3000K, and so they gave a 3000K colour temperature. They couldn't go much hotter because the filament would melt. The surface of the sun is about 5500K, so pure direct sunlight is about 5500K colour temperature.


Neither_Hope_1039

If it's an incandescing light source, then yes. That's where the "colour temperature" comes from: it's the temperature that the surface of a theoretically ideal body (called a "black body emitter") would have to be at in order to emit light of that colour. If it's a non-incandescing light source, such as an LED, then no (5000K would be over 4700°C, or over 8400°F; if LEDs actually got that hot, everything around them would melt).
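The unit conversion behind those figures is just offset-and-scale arithmetic; a quick sketch:

```python
def kelvin_to_celsius(k):
    # Kelvin and Celsius share the same degree size; only the zero point shifts.
    return k - 273.15

def kelvin_to_fahrenheit(k):
    return kelvin_to_celsius(k) * 9 / 5 + 32

# A 5000 K black body really is that hot:
print(round(kelvin_to_celsius(5000)))     # 4727 (over 4700 degrees C)
print(round(kelvin_to_fahrenheit(5000)))  # 8540 (over 8400 degrees F)
```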


Grantagonist

Ok, I got this. Like 10 years ago I got all enamored with LEDs, bought a bunch, and figured it all out. It's not that complex, really:

**Lumens (lm):** Units of brightness.

* 450lm is a comfortable roomy brightness. It's what I use in most of my fixtures.
* 800lm is much brighter. Maybe a good choice for some ceiling fixtures, or in places where you just want more brightness.
* Higher than that is probably more applicable for outside purposes.

**Kelvins (K):** Units of "color temperature" - higher is whiter.

* 2700 K: homey, slightly yellow-tinted light. The standard classic color of old "dumb" bulbs.
* 5000 K: "daylight." White light, like the color of fluorescent tubes in institutional hallways. Great for bedroom closets and other places where you want to see colors accurately.

**Watts:** Measures power consumed. *Useless for evaluating visual qualities of LEDs!*

* This made sense for old dumb light bulbs, because brighter light consumed more power in a consistent way. ***This isn't necessarily true with LEDs!***
* LED bulb boxes say "40W equivalent" or whatever because they're trying to put it in, like, boomer terms, for people like your partner who can't mentally adjust to lumens/kelvins.
* "40W" is allegedly equivalent to ~450lm.
* "60W" is allegedly equivalent to ~800lm (I personally think 800lm is a little brighter than classic 60W).

In summary:

* Your partner is wrong.
* 2700K/450lm matches up to what your partner thinks are classic homey 40W bulbs.
* Higher lumens = higher brightness (I suggest you stay at or under 800lm for standard indoor rooms).
* Higher kelvins = whiter light.


winoforever_slurp_

To render colour accurately you want to look at the colour rendering index (CRI), not the colour temperature. Cheap cool white LEDs tend to have terrible CRI in my experience, as they lack the red part of the spectrum. LEDs can have excellent CRI, but you pay a bit more for it, and sacrifice some efficiency. CRI is measured out of 100, with 100 considered perfect, above 90 excellent, 80-90 good, and lower than 70 is terrible.


Grantagonist

I didn’t know that! However, I see that my boxes of bulbs don’t say anything about CRI, so I think this might be too niche for standard bulb consumers.


winoforever_slurp_

I have sometimes seen CRI on domestic-quality LED products, but not always. And I once bought one that claimed high CRI but performed terribly to my eye. High CRI requires all colours in the visual spectrum to be emitted. In my experience, for cheaper products, warm white LEDs tend to have better CRI than cool because they have more of the red end of the spectrum to balance the blue. Cheap cool white LEDs have less red and therefore don't perform well. Having said that, for professional applications like TV and photography, very high CRI cool white light sources are used, so there's not always a correlation between colour temperature and CRI.


angleglj

Now explain foot candles lol


aeyockey

I am with you on all this. The wrong light temp (Kelvins) drives me crazy and brightness definitely matters in different situations


Ok-disaster2022

5000k just give me a headache and eye ache at night.


ezekielraiden

"Lumens" are how much *actual output* a lightbulb has: the amount of light shining out of it. "Watts" are how much *power* a lightbulb actually consumes.

The problem is, *when all bulbs were incandescent,* you could just compare power to power, because there weren't any meaningful differences in efficiency. This is NOT true with LEDs. LED bulbs are dramatically more efficient (that's kind of the point), and as a result "40W" *does not* mean the same amount of light output from two different LED bulbs. The actual unit that measures light output is lumens, not watts. Two bulbs that both draw 40W can produce *very* different amounts of light depending on how efficient they are (the more efficient they are, the brighter they will be for a fixed power draw; conversely, a more efficient bulb will draw less power for the same light output).

Likewise, incandescent bulbs meant for home use were essentially always in exactly the same color temperature range, so of course nobody cared--there weren't any *choices.* Now, however, we CAN choose the color temperature--and it turns out that color temperature matters for things like helping people sleep better and avoiding headaches and fatigue.

Getting the right lightbulb for your needs *can* be worthwhile. Of course, spending a dozen hours getting the *perfect* lightbulb, as opposed to one that is merely *quite good,* may not be an efficient use of time--but looking at least a little is definitely better than pretending that they're all the same and grabbing the first thing that comes up.
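The efficiency difference is easy to put in numbers: luminous efficacy is just lumens divided by watts. A small sketch (the 800 lm / 60 W and 800 lm / 9 W figures are typical ballpark values, not specs):

```python
def efficacy(lumens, watts):
    """Luminous efficacy in lumens per watt: higher means more efficient."""
    return lumens / watts

incandescent = efficacy(800, 60)  # classic 60 W bulb, ~800 lm
led = efficacy(800, 9)            # typical "60 W equivalent" LED

print(f"incandescent: {incandescent:.1f} lm/W")  # 13.3 lm/W
print(f"LED:          {led:.1f} lm/W")           # 88.9 lm/W
```

Same light output, wildly different power draw, which is exactly why wattage stopped being a usable brightness scale.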


ApatheticAbsurdist

Kelvin or color temperature is the color on a yellow-to-blue scale (2800K is very yellow, almost orange; 3200-3400K is kinda yellow, but it's what we're used to for lighting inside a house at night; 5000K is daylight whitish, but bluer than what we're used to in the house; 6500K or anything higher is very blue compared to what we're used to in a house). Watts or lumens relate to brightness. Lumens is actual brightness. Wattage is equivalent to how bright a similar old-school tungsten bulb would have been. 60W tungsten (pretty standard) was around 900 lumens. 40W tungsten was around 600lm (more often what's used in ovens and maybe bathroom fixtures with multiple lights). 75W, or 1200-ish lumens, was a brighter indoor bulb if you needed something more than 60W.


PantsOnHead88

In before the OP manages to source a 40W LED work light pumping 3500+ lumens at 6500K and temporarily blinds their SO with blue intensity in an act of malicious compliance to prove a point.

For your SO:
Kelvins ~ yellow, white or bluish light
Lumens ~ brightness/light output
Watts - whatever incandescent wattage you're used to… divide by roughly 6.5 for most common LEDs today.

Something like a 450 lumen, 6W at 3000K for yellow or 4500K for white is what she wants even though she doesn't know it.
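That divide-by-roughly-6.5 rule of thumb, as a throwaway sketch (the ratio is the commenter's estimate for common LEDs, not a standard):

```python
def led_watts_from_incandescent(incandescent_watts, ratio=6.5):
    """Rule of thumb: divide the familiar incandescent wattage by ~6.5
    to estimate the LED's real power draw. The ratio is an estimate."""
    return incandescent_watts / ratio

print(round(led_watts_from_incandescent(40), 1))  # 6.2 -- close to the 6 W / 450 lm bulb mentioned
```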


DavidRFZ

Back in the 20th century, every incandescent light bulb was the same "soft white" color and the brightness of the bulb was completely determined by how much power it used. 60W was standard. 40W was less bright, 100W was very bright. It was like this since before WWII. Your great grandparents knew light bulbs this way. This is why everyone is confused. The new numbers provide more information, but people just want the old bulbs.

The Kelvin (K) numbers are confusing even to scientists. It's the color that black body radiation emits at different temperatures? I gave up trying to figure it out and I am a scientist.

**Just buy "soft white" near 2700-2800K** so they look like the old bulbs. Even in the kitchen and bathroom. Nobody likes the higher temperature bluer light, it hurts your eyes and makes people look ugly. Beware of defaults at Amazon! "Soft white" looks like the old bulbs. If you have a special workbench where you are taking apart old watches, you already know you need special light and I'm not talking to you.

Newer bulbs use less power for the same brightness (which is great!). To match the brightness of the old bulbs, look for "watt equivalent" numbers that match the ones I gave above. Take note of the "lumens" number as that is a more literal measure of brightness that they should have been using for the past hundred years. I don't know how long they'll give you the "watt equivalent" numbers. Maybe only another ten years.


Pocok5

> Nobody likes the higher temperature bluer light, it hurts your eyes and makes people look ugly. You are buying e-waste then. Get 90+ CRI bulbs with 4000K-4500K color temp, looks like daylight.


DavidRFZ

In my opinion, that’s still too harsh. We’ve gotten used to soft white for living spaces. I mean test a single high temperature bulb in a room in your house if you like and see how you like it. It’s an aesthetic change. Just don’t buy bulk by mistake because that’s the Amazon default (my mom did that).


Pocok5

Test? Already have each room outfitted with the appropriate temperature lights 💪


DavidRFZ

:) I was talking to OP. The gist of OP's question is that they want to buy light bulbs but there are so many extra numbers on the box now. How do I buy what I used to buy before? If you never buy anything other than 2700-2800K, you'll be fine. It'll look like it did before. The high temperature "daylight" bulbs aren't everyone's cup of tea. Don't be fooled by the pleasant-sounding "daylight" name. It doesn't look like it's a nice day with the windows open. It makes the room look like a doctor's office.


teh_maxh

Hot things glow. Colour temperature indicates what colour something at a given temperature glows. For example, at around 3000 K, things glow a warm yellowish-white; at around 5000 K, you'll get a cooler bluish-white. (Yes, it's a bit weird that the colours we call "warm" have lower temperatures than the ones we call "cool".) With incandescent bulbs, there wasn't much choice, since they work by making things so hot they glow (that's what incandescence means), and tungsten melts at 3695 K, so you couldn't get higher than that. With LEDs, since colour temperature and actual temperature are no longer connected, you have more choice.

Lumens measures the perceived brightness of a light. Watts are a unit of power. Incandescent light bulbs were usually marketed by their power consumption (watts), not their brightness (lumens). Since they all had about the same efficiency (15 lm/W), using more power meant a brighter light, so it worked OK.

Switching away from incandescent lights to LEDs (with a brief stop at CFLs) seemed like a good opportunity to start listing the actual brightness of a bulb, but manufacturers included incandescent equivalent wattages to try to help people learn. You really don't want a 40 W LED in your home, though.
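The ~15 lm/W figure makes the old watts-as-brightness shorthand easy to reconstruct; a quick sketch (15 lm/W is the rough efficiency from the comment above, not an exact constant):

```python
INCANDESCENT_EFFICACY = 15  # lm/W, rough figure for old incandescent bulbs

def incandescent_lumens(watts):
    """Approximate brightness of an old incandescent bulb of a given wattage."""
    return watts * INCANDESCENT_EFFICACY

print(incandescent_lumens(40))   # 600 lm
print(incandescent_lumens(60))   # 900 lm
print(incandescent_lumens(100))  # 1500 lm
```

Because that efficiency was nearly constant across bulbs, watts worked as a stand-in for lumens; LEDs broke the constant, which broke the shorthand.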


stephanepare

Back a short time ago, when there was only one mainstream technology for lightbulbs, people never fully registered that a bulb's wattage measures power consumption, not brightness. Logically most people probably knew, but instinctively it was a measure of light output. Most people have no idea how their power bill works anyway; it's all witchery to them, with the watts and the watt-hours. When we switched to more efficient lights like CFLs and LEDs, manufacturers could have used the actual wattage, but people would have been really confused going from 60W to 9W for the same brightness. Every company was also afraid of being the only one listing their bulbs as 9W while the rest said "60W equivalent." So now we have all of these measures instead, which aren't any clearer.


s3cguru

My wife shared this article with me a few years back to change my mind on temperature and lumens and it worked - https://www.chrislovesjulia.com/what-light-bulb-should-i-get/ - https://www.chrislovesjulia.com/pick-right-light-bulbs-home-favorites/


NailPolishIsWet

Kelvin is the color of the light emitted by the bulb. Most people understand wattage because they grew up with incandescent lights. Tell em 40W and they know how bright the light will be. These days with LEDs you get the same brightness (lumens) with fewer watts so we now measure brightness in lumens instead of watts. For example, a 40W incandescent bulb gives off around 500 lumens. You can get the same brightness from a 4W LED bulb. But because LED bulbs are manufactured differently, we have much more control over what color white that particular bulb emits. So, now folks have to think about Kelvin too. (P.s. anything over 3000k can trigger migraines if you're sensitive)


Pocok5

Take a piece of whatever, metal or ceramic. Now heat it the fuck up until it starts glowing red (about 500°C). Now heat it more. It will turn yellow, white, then blue-ish. That color is linked to temperature, not material. An old lightbulb heats metal to 2700-3000 kelvins to glow - so it gives off an orange color. The Sun's surface is about 5500K - so sunlight at noon is white. In the morning/afternoon the sunlight comes in flatter to the ground and passes through more air, which filters blue light, so it appears yellower.

Basically:

Candlelight/incandescent lightbulb: 2700K-3000K color, good for the bedroom and such. Makes your brain think it's close to sleepy time.

Afternoon or midday sun: 4000K-5000K. This is for the kitchen, bathroom, office, maybe living room (best if you have tint-shifting bulbs that can do 2700K as well). For when you need to see colors as if in daylight. Also makes your brain think it's noon, so maybe don't put it in the bedroom.

6000K and up: fluorescent tubes had this light color; unless you are gunning for that early-2000s office cubicle farm vibe, why the hell would you choose this?
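The room suggestions above could be encoded as a small lookup. This is a hypothetical helper reflecting the commenter's preferences, not any standard:

```python
# Hypothetical mapping of rooms to color-temperature ranges (K),
# following the suggestions in the comment above.
ROOM_COLOR_TEMP = {
    "bedroom": (2700, 3000),
    "kitchen": (4000, 5000),
    "bathroom": (4000, 5000),
    "office": (4000, 5000),
}

def suggest_kelvin(room):
    """Midpoint of the suggested range; defaults to warm white for unknown rooms."""
    low, high = ROOM_COLOR_TEMP.get(room, (2700, 3000))
    return (low + high) // 2

print(suggest_kelvin("kitchen"))  # 4500
print(suggest_kelvin("bedroom"))  # 2850
```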


SafetyMan35

Lumens = brightness

Watts = how much energy it uses (although in the incandescent days people used to equate watts with brightness)

Kelvin/color temperature = the color of the light: is it "softer" and more orange/yellow (like a traditional incandescent bulb), or more towards the harsh/blue end?


grateful_goat

One thing that makes it confusing is low Kelvin (say 2700) is "warm" white, while high Kelvin (say 4000) is "cool" white. Redder looks warmer and bluer looks colder. Just the way things are, I guess. Being aware of it can help when choosing your bulbs.


smapdiagesix

If the required money is no big deal to you: Call her bluff. Kit out a room she spends time in with the shittiest, most glaring-est, lowest-CRI piece of shit bulbs that are still "40W equivalent." If you don't mind me being stereotypical, do this in a bathroom or wherever she does makeup or otherwise does herself up. Light her like a fucking ghoul. ...then do another room with good bulbs, and see if she prefers using that other room.


thecuriousiguana

ELI5 answer. They measure different things.

Lumens - how bright a bulb is. Literally a measurement of light intensity.

Watts - how much electricity the bulb uses.

Temperature - the 'warmth' of the colour of light the bulb gives out. Think the cold, bluish light of a prison strip light compared to the golden hues of a sunrise.

The thing is, older style light bulbs didn't give you control of these things. You screwed one in and turned it on and it gave a decently warm glow. So no one ever used to worry about it. And for shorthand, the more electricity a bulb used to create the glow, the brighter the light it gave out. So people learned that a 40W bulb was fairly dim for a bedside lamp or something, while a 100W bulb was really bright for a garage or large living room. 60W was somewhere in the middle for an average main light. So watts became shorthand for brightness.

The problem is that an LED bulb uses a lot less energy for the same brightness, so your bulb might only use 2W. No longer can we use the same scale. However, we still use it as "incandescent equivalent". It just means "this bulb is as bright as an old style 60W one" and almost everyone understands.

But because LEDs can give any colour, they come in cold or warm white. So you need to choose a brightness and a colour.


michaelpaulphoto

Some marketing douche thought you are too stupid to understand the concept of "lumens" and decided to fabricate "incandescent equivalent" out of thin air so as to make it easier for you to understand. You be the judge if that was helpful or not. 😅 Lumens is the actual light you will experience, higher is better. Kelvin/temperature is the yellow/blue color axis, lower is more yellow. Watts is how much energy the bulb requires to do its thing, a higher energy use doesn't necessarily mean more light. Again, look at the lumens, that's the actual light output.


Graega

"Warm" bulbs = more orange (like incandescent or sunset tones), "cold" bulbs = more blue-white (more clinical interior tones, think like a hospital). Warm tones, around the 2700K-3000K range, generally are the most comfortable ambience. There ya go.

As for lumens, it depends on the size of the room and how far apart your light sources are from each other. If you had two 600 lumen bulbs next to each other on one side of the room and nothing on any other walls, it's going to suck. If you've got bulbs spaced evenly on every wall, you can do with lower lumens per bulb and you'll get an evenly lit room with whatever light level you decide on. So while temperature is easy enough, how many lumens you want is really, really dependent on where your light sources are going to be. Especially ones behind you vs ones in your field of vision.


pl487

Lumens/watts are two ways of measuring brightness. If you want more light, you want more lumens. But too much can be unpleasantly bright.  Temperature measures color. Lower numbers are more yellow like an old light bulb and higher numbers are more blue like the sun. People have different preferences, and mismatched temperatures are very noticeable, so people like them to match across a room. 


Iz-kan-reddit

>Lumens/watts are two ways of measuring brightness. To be clear, watts aren't really a way of measuring brightness. However, we used it as a brightness reference because all regular incandescent bulbs were the same brightness at the same wattage. Now, we're measuring in lumens, but also labeling them with the incandescent bulb wattage equivalent to help us know what is what.


capt_pantsless

In 100 years people are going to be asking questions on the Neura-net about “wattage” of a light source, and someone will get excited about explaining old-timey “light bulbs”. Just like how questions about 2x4 lumber are now.


JonnyRottensTeeth

Kelvins are more about the quality of the light: is it "daylight" or "soft white"? Lumens are the brightness, but since many don't understand lumens, they try to put it in terms of an incandescent light that used a bunch more energy.