Thursday, August 13, 2015

Lumens and brightness and all that

I am writing this post mainly for myself. I can't afford to test my software with a real HDR screen like the 4000 cd/m^2 (luminance) SIM2 display, which is around $30k. Mike Herf suggested I try a projector aimed at a wall to create an image about the size of the SIM2. Good idea!

I have always found photometry (light as measured for humans) and radiometry (light as measured for physics) confusing. For rendering you can just use radiance, and once you forget its definition it doesn't impact your programming much. But where my confusion becomes most deadly is when I buy a projector, because the constants matter a lot there. Fortunately, I don't have to do that very often. First let's start with lumens. A lumen measures luminous flux: the total amount of perceivable light coming from a source. (Candela, by contrast, is lumens per steradian.) When you see a projector, it usually says what it cranks out in lumens. For example, here is a screen shot from Amazon today:

[Amazon product screenshot: a projector listed with "Brightness: 800 lumens"]
So its "brightness" is 800 lumens. Ignore the term "brightness"; it has so many inconsistent formal and informal meanings that it's not useful. But the 800 lumens is objective. So what is a "lumen"? It's the "perceivable" amount of power the projector puts out. You plug the projector in and it consumes, say, 500 Watts of electrical power. Some of that is lost to heat and other waste, and maybe 300 Watts comes out as photons. If that light is all ultraviolet and you can't see it, that is zero lumens. Lumens are just the Watts times a wavelength-dependent constant "of human usefulness". A very nice figure shows this:
A single Watt of light can produce up to 683 lumens.   Figure from a nice post on green lasers.
The 683 lm/W is a historical constant chosen for backward compatibility with the old units from when this stuff was all candles. So our made-up 500W projector, if every one of those Watts somehow came out as 555 nm green light, could in principle produce 683x500 lumens. That is of course the peak value for the projector. For a hypothetical laser-based projector, cranking up the green would be the brightest output, but for most projectors peak output just means a white screen.
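
To make the wavelength dependence concrete, here is a minimal Python sketch for monochromatic light. The Gaussian below is my rough stand-in for the real CIE photopic curve V(lambda), not the tabulated data, so treat the numbers as illustrative:

```python
import math

def lumens_monochromatic(watts: float, wavelength_nm: float) -> float:
    """Approximate lumens for monochromatic light of a given power.

    lumens = 683 * V(lambda) * watts, where V is the photopic luminous
    efficiency function. A Gaussian centered at 555 nm stands in for
    the real CIE V(lambda) table here, so values are only ballpark.
    """
    v_lambda = math.exp(-((wavelength_nm - 555.0) ** 2) / (2.0 * 45.0 ** 2))
    return 683.0 * v_lambda * watts

print(lumens_monochromatic(1.0, 555.0))  # 683 -- the peak, green light
print(lumens_monochromatic(1.0, 650.0))  # ~74 -- red is far less "useful"
print(lumens_monochromatic(1.0, 350.0))  # ~0 -- ultraviolet: zero lumens
```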

But we are graphics people. We want luminance. So how "bright" (note the almost meaningless term, but I don't have a better one!) is it? For emitting screens like phones and TVs this can be complicated, because the luminance varies with viewing angle, as visualized in this image:
Different phones have their brightness fall off differently with viewing angle. (Figure from an article that goes into this in more detail.)

This falloff is usually by design, so that most of the light is sent toward a user looking straight at the device. Not surprisingly, this is especially true of mobile devices, which rarely have more than one viewer. Mike Herf tells me that his measurements usually show a falloff roughly modeled by a Phong-like cosine function with an exponent around 4 to 5. So when phones or TVs report their luminance, they usually mean straight-on viewing.
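
As a concrete version of that falloff model, here is a minimal Python sketch. The on-axis luminance and the exact exponent are assumptions for illustration:

```python
import math

def screen_luminance(on_axis_nits: float, view_angle_deg: float,
                     exponent: float = 4.5) -> float:
    """Phong-like falloff: L(theta) = L0 * cos(theta)^n.

    exponent ~4-5 per Mike Herf's measurements; 4.5 is my pick.
    """
    theta = math.radians(view_angle_deg)
    return on_axis_nits * math.cos(theta) ** exponent

# A hypothetical 600-nit phone viewed 30 degrees off axis:
print(screen_luminance(600.0, 30.0))  # ~314 nits -- about half the brightness
```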

It makes sense for a projector to talk about lumens and not luminance, because the luminance depends on the screen distance: move the projector back and the same lumens spread over a bigger image. What's the conversion? It depends on the screen, because the screen will absorb some light, and it will vary with angle like the phone screens in the picture above.

Let's assume a perfect Lambertian screen with albedo 1.0 that catches all of the projector's light. Because it's Lambertian, it will be dimmer at normal viewing incidence than many real screens. The unit of luminance is lumens per steradian per square meter, which is candela per square meter (a candela is a lumen per steradian). This is sometimes called nits. If the screen is A square meters, the per-square-meter part is almost easy. But what's up with that steradian? As a Lambertian surface, the screen sends a different amount of power in each direction, yet we know its luminance is constant with direction, so something detail-oriented and confusing is at work. We could go dig into the first principles (my favorite is Pat Hanrahan's chapter in the classic ray tracing book), or we could recall that a lovely formula used all the time in radiosity (for radiometry) also applies in photometry (it's all just scaling factors):

Luminance of a diffuse surface = (exiting luminous power) / (PI * area)

So for a Lambertian surface, our life is simple. Take the lumens of the projector and divide by PI times the area. For a one square meter screen, that is approximately dividing by 3. I once lived in Indiana, so I will say that is exact.
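
That formula is simple enough to write down directly. Here is a tiny Python sketch (the function name is mine, and it assumes the ideal Lambertian, albedo 1.0 screen above):

```python
import math

def lambertian_nits(projector_lumens: float, screen_area_m2: float) -> float:
    """Luminance (cd/m^2, i.e. nits) of an ideal Lambertian screen:
    luminance = (exiting luminous power) / (pi * area)."""
    return projector_lumens / (math.pi * screen_area_m2)

# The 800-lumen Amazon projector filling a one square meter screen:
print(lambertian_nits(800.0, 1.0))  # ~255 nits
```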

What is the area of that SIM2 screen I want to emulate? It's a 47" screen, and that usually means the diagonal. It's 16:9, so the screen dimensions are approximately 41" by 23", which is about 943 square inches, or about 0.6 square meters according to a web calculator. Since I want to be conservative, I can round that up to 1.0 square meter, but also pretend my screen is perfectly white. So to get SIM2 brightness (4000 nits) at SIM2 size, I need a 12,000 lumen projector. If I can paint my wall to 90% albedo and use the real 0.6 square meter area and the real 3.14(etc) PI, I need about an 8,500 lumen projector. It's hard to find relatively low-cost projectors (like $1000-ish) over 5000 lumens, so this is not a happy number. But it's close enough that I should shop.
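
Inverting the Lambertian formula gives the required lumens directly. This sketch just reproduces the two estimates above (with the real PI rather than the Indiana one, so the conservative number lands a bit above 12,000):

```python
import math

def required_lumens(target_nits: float, screen_area_m2: float,
                    albedo: float) -> float:
    """Projector lumens needed to hit target_nits on a Lambertian screen."""
    return target_nits * math.pi * screen_area_m2 / albedo

# Conservative: 1.0 m^2 screen, perfectly white (albedo 1.0):
print(required_lumens(4000.0, 1.0, 1.0))  # ~12,566 lumens
# Realistic: the 0.6 m^2 47-inch screen, 90% albedo wall paint:
print(required_lumens(4000.0, 0.6, 0.9))  # ~8,378 lumens
```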
