Yes, I'm serious – some of the Light L16's sensors will indeed be black and white (aka monochrome)! It was mentioned several times in interviews and articles between the announcement and the start of shipping that some of the sensors would be black and white (though I couldn't find out how many – Light has told me they'll let us know before shipping starts), but it wasn't really treated as a big deal. The more I learn, though, the more I consider it a big deal.

So why?

Short answer – more light, less noise, and increased sharpness!

Long answer:

**Before I get into that, note that Light has also told me they are going to call these “panchromatic” sensors.**

We’re so used to color sensors and color images that it seems strange to think that some of the sensors in the Light L16 would be black and white (ok, panchromatic LOL), but the more I learn, the more it makes sense in a computational imaging world. We want more light from our sensors/cameras and we want sharpness!

In a talk called “Gathering Light” (see the video here – jump to 36:00 for this segment), co-founder Rajiv Laroia said this about the sensor modules:

“Also, one more thing we can do is increase the low light performance of the camera. Again, we’re capturing redundant pictures. We do not need to capture every picture in color.

It turns out that for the eye, the luminance information is more important than the chrominance information. If we do not put a color filter pattern on the sensor, then for a given exposure time we can collect three times more light, because we don’t filter 2/3 of the light out at every pixel.

We still get the color information we need because we’re taking multiple pictures, so by using sensors that don’t have Bayer patterns on them (some of the sensors, not all of them), we can further enhance the low light performance of our camera.

We’re actually working with our sensor manufacturer to produce sensors for us that don’t have the pattern. We have a hard time convincing people to build stuff for us but I think after the latest announcement, it will make it a little easier for us to get that done.”

The first time I heard about a pure B&W digital sensor was when RED announced they were going to produce a B&W camera back in 2012. Here’s a little snippet from their page:

“Unlike color sensors, monochrome sensors capture all incoming light at each pixel—regardless of color. Each pixel therefore receives up to 3X more light, since red, green and blue are all absorbed.

This translates into a 1-1.5 stop improvement in light sensitivity, and is why the standard ISO speed of the EPIC-M MONOCHROME camera has increased from 800 to 2000. This can also improve the appearance of noise when shooting under artificial light or other color temperatures which differ substantially from daylight.”
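If you want to sanity-check those numbers, the math is straightforward. Here's a quick sketch in Python using only the figures quoted above:

```python
# Quick sanity check of the numbers quoted above (a sketch, not sensor science).
import math

# A Bayer color filter blocks roughly 2/3 of the light at each pixel,
# so removing it lets each pixel collect about 3x the light.
light_gain = 3.0
print(f"theoretical gain: {math.log2(light_gain):.2f} stops")  # ~1.58 stops

# RED's real-world numbers: base ISO 800 (color) vs ISO 2000 (monochrome).
iso_color, iso_mono = 800, 2000
print(f"measured gain:    {math.log2(iso_mono / iso_color):.2f} stops")  # ~1.32 stops
```

So the theoretical ceiling is about 1.6 stops, and RED's quoted 1 to 1.5 stop improvement lands right under it.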

Another B&W sensor showed up in the Digital Bolex video camera – a company which unfortunately shut down in 2016. I didn’t really give the RED and Digital Bolex B&W cameras much thought at the time because I felt they had limited usage – I mean, I love B&W movies, but not every movie should be B&W, right?

But then, when I started learning about the Light L16 and saw the several references to B&W sensors, I started to really wonder why – especially since nothing Light has said is about delivering B&W images.

So now I understand… we get some great benefits! Especially with computational imaging – increased light gathering, increased sharpness, and a reduction in noise. Now we’re talking.

Dan Llewellyn over at maxmax.com has made a business out of turning color sensors into B&W ones – which is a crazy interesting process because he shaves off the color layer on sensors!

“Why use a monochrome camera?  Because an equivalent monochrome camera will always take a much sharper image than a color camera because resolution is dependent on the color content in the picture.  With a monochrome camera, you lose color resolution and gain spatial resolution.  Converting a color sensor to monochrome is a technically challenging process.  After taking the camera apart to remove the image sensor, we have to remove the ICF/AA stack, sensor coverglass and then about 5 microns of the microlenses and Color Filter Array (CFA) that have been photolithography printed on the surface of the sensor in epoxy.

To understand why a monochrome sensor will have higher resolution than a color sensor, you have to understand how a color camera sensor works.  We have a rather technical explanation for those so inclined.  For the artistic types that aren't interested in the science, please just consider the two pictures below.”

(Note: the left image is from a color sensor and the right is the same scene shot with a B&W sensor)

Images courtesy Dan @ maxmax.com
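To make Dan's point concrete, here's a minimal sketch (mine, not maxmax.com's process – assuming numpy) of what a Bayer color sensor actually records versus a monochrome one:

```python
# A minimal sketch of why a color sensor gives up spatial resolution:
# behind a Bayer filter, each pixel records only ONE of R, G, or B, and
# the two missing values must be interpolated (demosaiced) from neighbors.
import numpy as np

def bayer_mosaic(img_rgb):
    """Simulate an RGGB Bayer sensor: keep one color sample per pixel."""
    h, w, _ = img_rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = img_rgb[0::2, 0::2, 0]  # red sites
    mosaic[0::2, 1::2] = img_rgb[0::2, 1::2, 1]  # green sites
    mosaic[1::2, 0::2] = img_rgb[1::2, 0::2, 1]  # green sites
    mosaic[1::2, 1::2] = img_rgb[1::2, 1::2, 2]  # blue sites
    return mosaic

def mono_capture(img_rgb):
    """A monochrome sensor records full-resolution luminance at every pixel."""
    return img_rgb @ np.array([0.299, 0.587, 0.114])  # BT.601 luma weights
```

The color camera has to rebuild three full channels from that one-sample-per-pixel mosaic, and that interpolation step is where fine detail gets smeared. The monochrome sensor never interpolates anything.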

I asked Dan why he goes to all that work to shave the color layer off sensors, and whether there were large B&W sensors on the market. He replied:

“Phase 1 recently released a 101 megapixel monochrome camera.  You can buy large monochrome sensors from sources like Dalsa, but they are very expensive.  There are some small monochrome sensor options, but it is mostly for stuff like machine vision cameras.  Leica makes a monochrome camera, but it is also expensive and I have heard they actually lose money on it because of low volumes.  Since we can modify some of the consumer digital SLR sensors to become monochrome, we can fulfill a niche that is too small for the large companies.”

I also asked about astrophotography, because I know there are B&W sensors in use there (though I never really thought about them for everyday usage):

“In the astronomy world, it is common that they want large pixels because that translates to larger photon wells which means better dynamic range.  Also, stars emit light at frequencies relating to their gas composition, so a hydrogen star emits strongly on the H-Alpha line of 656nm.  For a color sensor, only 1 in 4 pixels can see that frequency, so they often use a monochrome sensor.  To take a color picture, they will sometimes use filters and then stack the pictures in post.  A red, green, blue and clear filter set is sometimes used.  It could be that the monochrome sensor is used to get better luminance information.”
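The "larger photon wells" point is easy to put numbers on. Here's a rough worked example – the electron counts are illustrative assumptions, not specs from any particular sensor:

```python
# Rough illustration of why bigger photon wells mean more dynamic range.
# Electron counts below are assumptions for illustration, not real specs.
import math

read_noise_e = 5.0                    # read-noise floor, in electrons
for full_well_e in (20_000, 80_000):  # small pixel vs large pixel
    stops = math.log2(full_well_e / read_noise_e)
    print(f"{full_well_e:>6} e- well -> ~{stops:.1f} stops of dynamic range")
# 20000 e- well -> ~12.0 stops
# 80000 e- well -> ~14.0 stops
```

Quadrupling the well depth buys two extra stops between the darkest and brightest detail a pixel can hold.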

If you’re interested in knowing a LOT more about the process, please visit Dan’s page. I also found these instructions for doing the dirty work of removing the color layers of a sensor – but good golly don’t try this at home! And don’t come yelling at me if you ruin your sensor 🙂

So that’s all well and good for a DSLR sensor, but what about the sensors in the Light L16? Well, Rajiv Laroia has said on occasion that the Light L16 will have monochrome sensors, but he hasn’t said how many. I do recall (I think it was in the “Gathering Light” video) him saying that initially the sensor manufacturers were not interested in creating monochrome sensors but were eventually convinced. And if you consider that smartphones are now shipping with multiple cameras (see the iPhone 7+ and others), the market for monochrome sensors has just exploded.

I’ll note that the glory of monochrome sensors appears to be spreading along with the computational imaging functionality. Here’s a story about the Qualcomm Clear Sight Dual Camera technology that was written in September 2016.

“Our latest innovation is called Qualcomm Clear Sight, powered by Qualcomm Spectra ISP (Image Signal Processor), and comes straight out of the engineering labs. The results will astound you, as this technology is designed to mimic the attributes of the human eye. Clear Sight is engineered to give your photos improved dynamic range, sharpness, and less noise in low light.

The human eye is a great analogy because your eyes contain cells called “cones” and “rods.” Cones are great at capturing color, but require well-lit environments, while rods excel in capturing light in low-light conditions, but don’t capture as much color. Clear Sight is designed to mimic cones and rods to give you the best of both worlds, producing an image that has optimal contrast and brightness.

Clear Sight features two cameras, each with its own lens and image sensor. Like your eyes, the lenses have identical focal length (meaning they see the same distance). But each camera has a different image sensor: one color image sensor (to mimic cones), and a separate black and white image sensor (which can absorb more light, to mimic rods).”
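Qualcomm hasn't published the actual Clear Sight algorithm, but the basic mono-plus-color fusion idea can be sketched in a few lines. This hypothetical example (assuming numpy and two already-aligned frames with values in 0–1) keeps the chrominance from the color sensor and swaps in the cleaner luminance from the B&W sensor:

```python
# Hypothetical mono + color fusion sketch (NOT Qualcomm's actual algorithm).
# Assumes two perfectly aligned frames with float values in [0, 1].
import numpy as np

def fuse(color_rgb, mono_luma):
    # Split the color frame into luma and chroma (BT.601 weights).
    r, g, b = color_rgb[..., 0], color_rgb[..., 1], color_rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cr, cb = r - y, b - y  # chroma differences

    # Throw away the noisy luma and keep the B&W sensor's cleaner one.
    y = mono_luma

    # Rebuild RGB from the fused luma plus the original chroma.
    r2 = y + cr
    b2 = y + cb
    g2 = y - (0.299 / 0.587) * cr - (0.114 / 0.587) * cb
    return np.clip(np.stack([r2, g2, b2], axis=-1), 0.0, 1.0)
```

That's the rods-and-cones trade the article describes: color comes from the "cones" sensor, brightness detail from the "rods" sensor.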

Note that the article specifically mentions the Snapdragon 820 – the same processor inside the Light L16!

“Clear Sight technology is supported by the premium Qualcomm Snapdragon 820 and 821 processors. Snapdragon processors feature a technology core dedicated to cameras—an Image ISP we call Qualcomm Spectra. The Qualcomm Spectra ISPs inside of these Snapdragon processors are engineered to merge and quickly process data simultaneously from both cameras to create a high quality photograph, even in very low lighting, by using algorithms to blend the images captured from each of the two cameras intelligently.”

Doesn’t that sound exactly like the computational imaging going on inside the Light L16? It sure does to me!

So now we all know a bit more about what makes the Light L16 so special (even though we don’t have our hands on them yet!).

(cover photo credit: snap from the RED article)
