What is a lumen and why should I care?

Let’s get this out of the way right up front. I’m not talking about Lumon, the creepy fictional company from Apple TV+’s Severance. I’m talking about a lumen, a common measurement that you’ve probably wondered about occasionally.

As usual, Wikipedia is useless

Look, I use Wikipedia, and I know you do too. One of the best things about it is that it’s all about facts. They don’t allow editorializing of any kind. That’s great, but it means you get absolutely no context to go along with what they write. Here’s a perfect example:

The lumen (symbol: lm) is the unit of luminous flux, a measure of the total quantity of visible light emitted by a source per unit of time, in the International System of Units (SI). It is a standard internationally defined by the CIE.[2] Luminous flux differs from power (radiant flux) in that radiant flux includes all electromagnetic waves emitted, while luminous flux is weighted according to a model (a “luminosity function”) of the human eye’s sensitivity to various wavelengths; this weighting is standardized by the CIE and ISO.[3]

Look, that’s all true. I don’t deny it. But does it tell you why you should care about lumens? Not in the least. Does it tell you why lumens are a good way to measure anything? Nope. I guess that’s good, because it leaves room for bloggers like me. I get to write the story that helps you understand what a lumen really does for you, and that’s really the question you’re asking when you want to know about lumens, right?

How much light. That’s the point

A lumen is a measure of how much light is put out by something that glows. The more lumens, the more light. It’s a good measure because it’s pretty absolute. A 500 lumen blue light puts out just as much light as a 500 lumen red light or white light, and because lumens are already weighted for how sensitive your eyes are to different colors, those lights should even look roughly equally bright. Most importantly, a 500 lumen red light from a laser puts out the same total amount of light as a 500 lumen red light from a bulb or an LED. So, if you want to know how bright something is, and you don’t care what kind of thing it is, you use lumens. That wasn’t always the case.
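If you’re curious how that weighting works in practice, here’s a minimal sketch. It uses the standard 683 lumens-per-watt peak conversion for single-wavelength light and rounded, approximate values of the eye-sensitivity curve, so treat the numbers as illustrative rather than gospel.

```python
# Converting radiant watts to lumens for idealized single-color lights:
#   lumens = 683 * V(wavelength) * radiant_watts
# where V() is the eye-sensitivity (luminosity) curve, peaking at 1.0 near 555 nm.
# The V values below are rounded approximations.
V = {"green 555 nm": 1.0, "red 630 nm": 0.27, "blue 470 nm": 0.09}

target_lumens = 500.0
for color, v in V.items():
    radiant_watts = target_lumens / (683.0 * v)
    print(f"{color}: about {radiant_watts:.2f} W of light energy for {target_lumens:.0f} lm")
```

The point: to hit the same lumen number, the blue light has to pump out far more actual light energy, because lumens bake your eye’s sensitivity right into the measurement.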

You still remember watts and you probably wish we still used them.

In the 20th century, light was largely measured in watts. A watt is not a measure of light at all. It’s a measure of electrical power, the rate at which energy is being used at any given moment. Watts are the numbers you get when you multiply volts by amps. I try to explain those things in this article, and they’re not really important here.
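Just to spell that multiplication out, here’s a tiny sketch. The voltage and current figures are hypothetical round numbers, not measurements from any real bulb.

```python
# Electrical power (watts) is voltage multiplied by current.
volts = 120.0   # typical US household line voltage
amps = 0.5      # current drawn by a hypothetical old-style bulb
watts = volts * amps
print(f"{watts:.0f} watts")  # prints "60 watts"
```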

For about 100 years, light output was measured in watts even though watts don’t measure light. The reason is simple. There was really only one kind of light bulb. Sure, later in the century you got HID and arc lamps and that stuff, but for your average joe or jane, you were using incandescent light bulbs. These are the bulbs that were perfected by Edison in the late 1800s. Incandescent bulbs work simply by passing electricity through a filament that doesn’t carry it very well. Instead of the electricity flowing through easily like it would on a wire, it gets changed into light and heat. The amount of electricity that gets changed into light is predictable and consistent. So, the only thing you really needed to think about was how much power the light bulb drew.

It got… complicated.

First, fluorescent bulbs came into play. Fluorescent bulbs are much more efficient. They use high-voltage electricity to excite a gas inside a tube, which makes the tube’s phosphor coating glow. This creates a lot less waste heat, so you can use fewer watts for the same light output. But fluorescent bulbs didn’t work in regular light sockets, so it didn’t matter that their wattage numbers were different.

Then came CFLs (compact fluorescent bulbs) and LEDs. These overtook incandescent bulbs because they could be up to 10 times more efficient. CFLs have all the hardware required to step up the voltage built right into the bulb. LEDs run on low-voltage direct current, so they need hardware to step down and regulate the power that reaches them.

All of a sudden, labeling a bulb by its power requirement made no sense. Yeah, the boxes would say something like “60 watt equivalent” for a 10-watt LED. But this only takes you so far.
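To make that “60 watt equivalent” label concrete, here’s a quick sketch comparing lumens per watt. The figures are typical retail numbers used purely for illustration; any real bulb’s box will list its own ratings.

```python
# Rough lumens-per-watt comparison using typical retail figures
# (illustrative only, not from any specific product).
bulbs = {
    "60 W incandescent": {"lumens": 800, "watts": 60},
    "13 W CFL":          {"lumens": 800, "watts": 13},
    "10 W LED":          {"lumens": 800, "watts": 10},
}

for name, spec in bulbs.items():
    efficacy = spec["lumens"] / spec["watts"]
    print(f"{name}: {spec['lumens']} lm, about {efficacy:.0f} lm per watt")
```

All three put out roughly the same 800 lumens, which is exactly why the LED gets sold as a “60 watt equivalent” even though it sips a sixth of the power.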

Enter the lumen

The lumen is a much better way to describe light output. A 4,000 lumen flashlight would be precisely the same brightness no matter what kind of light source it used. That’s the way it should be.

Lumens are used for flashlights, light bulbs, headlights, and even video projection systems. They’re an essential way to compare the light output of two completely different things. When it comes to display technology, there are so many different ways to get a picture on a screen that you need a common yardstick like lumens to make sense of it all.

All of a sudden, lumens give you a level playing field. You’re still going to care how much energy something uses, because you’ll pay for that energy. But when it comes to understanding how bright something is, you don’t have to care what kind of thing is creating the light. That’s a massive leap forward in being able to compare things.

What about nits?

A nit is the egg of a head louse. Ewwww. But that’s not the kind of nit I’m talking about. You will often see TV screen brightness measured in nits instead of lumens. The two aren’t a simple one-for-one swap, though. A lumen measures the total light a source puts out, while a nit (one candela per square meter) measures how much of that light comes off each bit of screen area, so converting between them depends on the size of the screen. (The 3.426 figure you’ll sometimes see quoted is really the factor for converting foot-lamberts, another brightness unit, into nits.)

Generally, TV screens are measured in nits because nits describe the light that actually comes off the face of the screen, not the light put out by the light source behind it. Today’s televisions use either some sort of backlight or edge light (in the case of traditional LED LCD TVs) or a matrix of small LEDs located all over the back of the screen (in the case of local-dimming LED LCD TVs and QLEDs). In front of that light sits an LCD panel with color filters that controls the color and brightness the viewer actually sees.

In the case of all LCD TVs, though, the brightness put out by the backlight, which could be measured in lumens, is not the brightness that the TV delivers into the room. So, nits are used. Even though a nit is just another way of measuring brightness, it’s generally used to describe the effective brightness of the thing you’re looking at, not the raw output of the light source inside it. It’s just a custom, not really a requirement.
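If you want to see how total output and per-area brightness relate, here’s a small sketch using the standard formula for an idealized, perfectly diffuse (gain 1) projection screen: nits are roughly lumens divided by (pi times the screen area in square meters). The projector and screen numbers are made up for the example.

```python
import math

# For an idealized, perfectly diffuse (gain-1) screen:
#   nits = lumens / (pi * screen_area_in_square_meters)
# Hypothetical numbers, just to show the arithmetic.
projector_lumens = 2000.0
screen_area_m2 = 2.0   # a hypothetical screen of about two square meters

nits = projector_lumens / (math.pi * screen_area_m2)
print(f"{nits:.0f} nits")  # about 318 nits
```

Spread the same lumens over a bigger screen and the nits drop; shrink the screen and they climb. That’s why the two units answer different questions.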

What about OLEDs?

OLED TVs, which are slowly becoming more common in larger sizes, use organic LEDs in which each pixel makes its own light and shines it straight out to the viewer, with no separate backlight behind the panel. That’s how they can look so bright and pure. In an OLED, the brightness of the light source is essentially the brightness that the TV puts out, because there’s no backlight being dimmed and filtered on its way to your eyes.

It’s possible that once almost all TVs use OLED, lumens will be used to describe their brightness. I personally doubt it, since there’s really no benefit to it compared to using nits.

This article is brought to you by Solid Signal. Shop at Solid Signal for everything you need to live your best digital life. If you have questions, call us at 888-233-7563 or fill out the form below.

About the Author

Stuart Sweet
Stuart Sweet is the editor-in-chief of The Solid Signal Blog and a "master plumber" at Signal Group, LLC. He is the author of over 10,000 articles and longform tutorials including many posted here. Reach him by clicking on "Contact the Editor" at the bottom of this page.