Color pictures

Filters or sensors?

We ponder a photographer’s desire for a monochrome camera.

Our photographer follows certain blogs daily, or nearly so, and one of these bloggers (whom he deeply respects for his experience, skill and honesty) wishes for an affordable monochrome digital camera.  Leica does make a monochrome digital, but it’s rather expensive (a hallmark of that brand).  Now, all digital cameras have a “B&W” setting, and certainly any digital picture can be converted to monochrome after the fact.  But the photo-blogger argues that for him, if his camera can take color pictures, he will see color pictures to take; if (in the old days) his camera had black-and-white film in it, he would see black-and-white pictures to take.  (This is the main point; we refer you to the post for his full argument.)

Our photographer has other reasons for also desiring a monochrome digital.  The sensor of a digital camera is an array of millions of tiny units, each one sensitive to light.  Most color sensors use Bayer arrays, in which a pattern of colored filters is placed over these units; for every four, two have green filters, one blue and one red.  This matches the color sensitivity of the human eye (which peaks in the green).  Each photosite thus records only one color, so a picture taken with the sensor is separated into a blue channel, a green channel and a red channel; the two values missing at each pixel are interpolated from neighboring photosites (a step called demosaicing) before the channels are combined to form the color picture.  There are variations on this technique; we’ve described some of them, and a bit more about capturing color images, in an essay linked from here.  Our photographer’s point is that he’s forced to use the filters that come with the sensor.  While he can adjust the relative weights in processing, the captured colors are fixed.
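To make the mosaic concrete, here is a minimal sketch in Python with NumPy of pulling the three channels out of a raw frame.  It assumes an RGGB filter layout (the exact arrangement varies by sensor) and simply averages the two green photosites rather than demosaicing properly:

```python
import numpy as np

def split_bayer_rggb(raw):
    """Split a raw sensor frame into R, G, B channels, assuming
    an RGGB Bayer layout repeated across the sensor:
        R G
        G B
    Each channel comes back at half the linear resolution; a real
    camera interpolates (demosaics) the missing values instead.
    """
    r  = raw[0::2, 0::2].astype(float)   # red photosites
    g1 = raw[0::2, 1::2].astype(float)   # green photosites, even rows
    g2 = raw[1::2, 0::2].astype(float)   # green photosites, odd rows
    b  = raw[1::2, 1::2].astype(float)   # blue photosites
    return r, (g1 + g2) / 2.0, b         # average the two greens

# A toy 4x4 "raw" frame standing in for real sensor data.
raw = np.arange(16, dtype=np.uint16).reshape(4, 4)
r, g, b = split_bayer_rggb(raw)
print(r.shape, g.shape, b.shape)   # each channel is (2, 2)
```

Whatever weights are applied to the three channels afterward, the light each photosite saw has already been filtered; that is the fixed choice our photographer objects to.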

Our astronomer gives an example from the era of film.  The color sensitivity of different color films varied.  The light from a low-pressure sodium vapor streetlight is emitted almost entirely in one narrow region of the spectrum (the sodium D lines, near 589 nm); in the famous film Kodachrome, this fell mostly in a gap between the region recorded by the red layer and that recorded by the green layer.  So taking a picture on Kodachrome by low-pressure sodium light was almost impossible.  We know of no one who would actually want to do that, but the point stands: our photographer feels that with a digital color sensor he may be missing something, and he certainly doesn’t have the color control that he has with black-and-white film and his box of color filters.

Does it actually matter?  We see color because of three separate pigments in our eyes, so in a sense we are already limited to a specific set of filters.  And how we perceive color can vary enormously with context.  A room lit by fluorescent light looks white while we are in it, but the same room, seen through a window from outside in the sunlight (especially toward sunset), looks distinctly green.  Perhaps we’re not actually missing very much by not having an affordable monochrome digital camera.

But our astronomer and our photographer now and then look at catalogs of monochrome digital sensors designed for astronomy, and sets of filters designed for all sorts of science, and dream of putting something together on their own.
