How Important is CRI When Evaluating Lighting?
I was in a very interesting meeting on Friday. I was presenting a track head selected by a lighting designer. He wanted to look at a ceramic metal halide track head, and we were happy to show him. I asked in an offhanded way…
Why do you prefer ceramic metal halide, when the world is turning to LEDs?
The designer explained very honestly that he had evaluated many, many LED track heads, and for the product he was lighting he felt no LED product was ready. I’m not in the business of telling specifiers what they should and shouldn’t prefer, so I didn’t argue, but the conversation branched off into a discussion of CRI. What CRI are these LEDs or those LEDs? This CDM lamp carries a CRI of 83, but another has a CRI of 90. All leading to one question – how important is CRI really?
First, what is CRI?
Let’s start from the beginning…
CRI is short for Color Rendering Index. It is a theoretical scale from 0 to 100 meant to express how well a light source displays colors back to your eye. The sun at noon on a clear day is considered a perfect 100 on the scale. Most common light sources fall somewhere between 80 CRI and 100. Here are some typical values:
- Linear Fluorescent: 80-85 CRI
- Ceramic Metal Halide: 80 CRI typical, 90 CRI possible
- Halogen/Incandescent: 100 CRI
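The typical values above can be sketched as a quick lookup. This is just an illustration – the source names and the idea of checking a fixture against a minimum CRI are my own framing, and the ranges simply restate the list:

```python
# Typical CRI ranges from the list above (illustrative lookup, not a spec).
TYPICAL_CRI = {
    "linear_fluorescent":   (80, 85),    # common range
    "ceramic_metal_halide": (80, 90),    # 80 typical, 90 possible
    "halogen_incandescent": (100, 100),  # perfect score on the scale
}

def can_meet_cri(source: str, minimum: int) -> bool:
    """Return True if the source's best-case CRI reaches the minimum."""
    low, high = TYPICAL_CRI[source]
    return high >= minimum

print(can_meet_cri("linear_fluorescent", 90))    # False
print(can_meet_cri("ceramic_metal_halide", 90))  # True
```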
Most reputable LED manufacturers create LED products that rate at least an 80 CRI. Better lighting manufacturers produce fixtures which rate over 90 CRI. I don’t want to get into the weeds on the science behind how CRI is calculated, but it is a flawed metric. How can I say that?
The First Reason CRI Is a Flawed Metric – The Sock Test
Get out two pairs of dress socks – one navy blue and one black. Got them? Great. Now put them on the counter, then use just a single incandescent light bulb to try and tell them apart. Odds are they will both appear to be black to you. That’s because incandescent light has a relatively low amount of blue within its makeup, meaning it can’t reflect the dark blue back to your eye. Yet it gets a perfect 100 CRI. This is a serious flaw of the scale.
The Second Reason CRI Is a Flawed Metric – Color Temperature Doesn’t Matter
More important than my sock test, CRI does not account for color temperature in its overall rating. That means that fluorescent tubes with a greenish hue or orange high-pressure sodium streetlights could theoretically get a high CRI rating even though they dramatically change our perception of color.
There is, in fact, quite a bit of controversy over CRI within the lighting industry. A competing metric called the Color Quality Scale (CQS) has been proposed, and while some manufacturers have submitted products for testing, it has not become an industry standard at this point. While there is large-scale agreement that the CRI metric is flawed, we continue to use it. Why? Because we can’t agree on a better metric. CQS has its detractors, no other alternative has come to the fore, and so we’re stuck with what we have.
So How Do We Use CRI?
Think of CRI as a qualifying metric. If the appearance of color is important to your design, then look for products with a 90 CRI or better – not because that guarantees perfect rendering, but because it signals that the manufacturer can control the quality of their chips and cares about the quality of the light they create. Much like MPG on a car, actual color rendering may vary based on all kinds of conditions, but if a given fixture can’t get above an 80 CRI, that tells you color rendering was sacrificed as a design feature. The manufacturer may have traded chip-set color quality for lower cost or higher brightness, and not every application needs perfect CRI. Also remember that it is the combination of CRI and color temperature that will really tell the story of how the light is perceived by the human eye.
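The qualifying-metric idea above can be sketched as a simple first-pass filter. The fixture names, CRI values, and lumen figures below are invented purely for illustration:

```python
# Hypothetical fixture catalog; names and values are made up for illustration.
fixtures = [
    {"name": "Track Head A", "cri": 82, "lumens": 1200},
    {"name": "Track Head B", "cri": 92, "lumens": 950},
    {"name": "Track Head C", "cri": 95, "lumens": 800},
]

def qualify(fixtures, min_cri=90):
    """Use CRI as a first-pass filter, not a final verdict on color quality."""
    return [f for f in fixtures if f["cri"] >= min_cri]

shortlist = qualify(fixtures)
print([f["name"] for f in shortlist])  # ['Track Head B', 'Track Head C']
```

Note that the filter only narrows the field: a 92 CRI fixture at 3000K and one at 5000K will still render colors very differently, which is why CRI and color temperature have to be read together.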