Note: this post is adapted from an answer I wrote for the Computer Graphics StackExchange beta, which was shut down a couple of weeks ago. A dump of all the CGSE site data can be found on Area 51.
In computer graphics, we deal a lot with various radiometric units, such as flux, irradiance, and radiance, which quantify light in various ways. But there’s a whole other set of units for light, called photometric units, that also show up sometimes. It’s important to understand the relationship between radiometry and photometry, and when it’s appropriate to use one or the other.
Each radiometric unit has a corresponding photometric unit, in which flux (energy per unit time, aka power) is replaced with luminous flux. Luminous flux is a perceptual quantity rather than a physical one. It’s a unit that weights each wavelength of light according to how strongly it’s perceived by the human visual system—in other words, how strong of a perception of brightness it creates relative to other colors, in a “typical” human, as determined by experiments on a large number of participants.
So, a given amount of radiant flux (physical light power) produces different amounts of luminous flux depending on where it falls in the spectrum. Green light produces the greatest luminous flux per watt; red and blue light produce less; and light outside the visible spectrum produces no luminous flux at all. Technically, luminous flux is the integral of the physical spectral power distribution (power per unit wavelength) multiplied by the CIE photopic luminosity function, which quantifies how sensitive we are to each wavelength of light.
Luminous flux is measured in lumens. At 555 nm, the wavelength to which our eyes are most sensitive, one lumen is equal to 1/683 of a watt. At other wavelengths, it takes more than 1/683 of a watt to produce one lumen, since our eyes are less sensitive to those wavelengths. Perceptually, one lumen is a pretty dim light: a common candle emits about 10 lumens, and a typical light bulb on the order of 1000 lumens.
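The integral above can be sketched numerically. This is only an illustration: the real CIE photopic luminosity function V(λ) is a tabulated curve, and the single Gaussian below is a rough stand-in for it, not the actual data.

```python
import numpy as np

# Rough single-Gaussian stand-in for the CIE photopic curve V(lambda).
# The real function is tabulated; this approximation peaks at 1.0 at 555 nm
# and falls off toward the edges of the visible spectrum.
def photopic_approx(wavelength_nm):
    return np.exp(-0.5 * ((wavelength_nm - 555.0) / 42.0) ** 2)

def luminous_flux_lm(wavelengths_nm, spectral_power_w_per_nm):
    """Lumens = 683 * integral of SPD(lambda) * V(lambda) d(lambda),
    computed here with the trapezoidal rule."""
    y = spectral_power_w_per_nm * photopic_approx(wavelengths_nm)
    dx = np.diff(wavelengths_nm)
    return 683.0 * float(np.sum(0.5 * (y[:-1] + y[1:]) * dx))

# 1 W of nearly monochromatic light at 555 nm: 0.5 W/nm over a 2 nm band.
wl = np.array([554.0, 555.0, 556.0])
spd = np.array([0.5, 0.5, 0.5])
print(luminous_flux_lm(wl, spd))  # close to 683 lumens
```

Shifting the same 1 W band toward red or blue makes V(λ), and therefore the lumen count, drop, which is exactly the wavelength dependence described above.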
The radiometric units and their corresponding photometric units are:
| Radiometric quantity | Unit | Photometric quantity | Unit |
|---|---|---|---|
| Radiant flux | watt (W) | Luminous flux | lumen (lm) |
| Radiant intensity | W/sr | Luminous intensity | candela (cd) = lm/sr |
| Irradiance | W/m² | Illuminance | lux = lm/m² |
| Radiance | W/(m²·sr) | Luminance | nit = cd/m² = lm/(m²·sr) |
Note that each photometric unit is arrived at by simply swapping lumens for watts, so each row of the table has the same relationship that lumens and watts do. For instance, a luminance of one nit at 555 nm corresponds to a radiance of 1/683 watt per square meter per steradian.
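That correspondence is just a division by the 683 lm/W peak efficacy, as a one-line sketch shows (valid only for monochromatic 555 nm light; for any other spectrum the conversion factor depends on the spectrum's shape):

```python
# Peak luminous efficacy: at 555 nm, 1 W of radiant power is 683 lm.
K_M = 683.0  # lm/W

def nits_to_radiance_555nm(nits):
    # Luminance in nits [lm/(m^2*sr)] -> radiance [W/(m^2*sr)].
    # Only valid for monochromatic light at 555 nm; broadband spectra
    # need the full integral against V(lambda).
    return nits / K_M

print(nits_to_radiance_555nm(683.0))  # 1.0 W/(m^2*sr)
```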
So, when should we use radiometric or photometric units in graphics?
If we only rendered monochrome images, we could use either one interchangeably (up to an overall conversion factor). However, with RGB images, it’s important to recognize that our display devices behave more radiometrically than photometrically. A red pixel value of 255 and a green pixel value of 255 both result in about equal amounts of radiant flux (watts) being generated by a pixel on a screen—not equal amounts of luminous flux. By the same token, digital cameras capture pixel values that correspond to radiant flux, not luminous flux.
That’s why we need luma coefficients, which weight the green channel most heavily, to get a perceptually accurate result when converting an image to grayscale or computing the brightness of a pixel; and it also means that rendering RGB images proceeds more naturally in radiometric units than in photometric ones. Hence the emphasis on radiometry in physically-based rendering textbooks and papers.
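A minimal sketch of such a grayscale conversion, assuming linear-light RGB with Rec. 709 primaries (the coefficients come from integrating each primary's spectrum against the luminosity function, which is why green dominates):

```python
# Relative luminance of a linear-light Rec. 709 / sRGB pixel.
# Green carries the largest weight because our eyes are most
# sensitive in that part of the spectrum.
REC709_LUMA = (0.2126, 0.7152, 0.0722)

def relative_luminance(r, g, b):
    kr, kg, kb = REC709_LUMA
    return kr * r + kg * g + kb * b

# Equal radiometric amounts of red and green are perceived very differently:
print(relative_luminance(1.0, 0.0, 0.0))  # 0.2126
print(relative_luminance(0.0, 1.0, 0.0))  # 0.7152
```

Note that these weights only give a perceptually meaningful answer when applied to linear values; gamma-encoded pixel values should be linearized first.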
However, there are places where photometric units come in handy. The brightness of light bulbs in real life is usually quoted in lumens, and the brightness of screens is commonly measured in nits. When building a realistic game world it can be useful to have the game engine accept lumens and/or nits as intensity values for light sources, so that artists can set up in-game lights with reference to real-world ones.
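One common convention for wiring this up (a sketch, not any particular engine's API): treat the artist-supplied lumen value as total luminous flux, and divide by the emitter's solid angle to get the intensity the renderer actually uses. For an isotropic point light that solid angle is the full sphere, 4π steradians.

```python
import math

def lumens_to_candela_isotropic(lumens):
    # Luminous flux (lm) -> luminous intensity (cd) for a point light
    # radiating equally in all directions: divide by 4*pi steradians.
    return lumens / (4.0 * math.pi)

# A roughly 800 lm bulb (a typical 60 W incandescent) as a point light:
print(lumens_to_candela_isotropic(800.0))  # about 63.7 cd
```

A final scale by the peak efficacy (or a spectrum-dependent factor) then brings the value into the radiometric units the renderer integrates in.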