
“Magic is the only explanation”
It’s the genius Steven Sasson whom we have to thank for the advances in digital camera sensor technology. Steven is an American electrical engineer who invented the first self-contained (portable) digital camera back in 1975 while working for Kodak (ironically, given the company’s film heritage); it used a charge-coupled device (CCD) image sensor. FYI, it wasn’t until 1988 that Fuji unveiled the first fully digital camera aimed at consumers.
We’re going to do our best to simplify the process, but if you’re keen to understand the wizardry in detail, you’ll need to be well-versed in electromagnetic radiation itself, how focus works within a lens, the photovoltaic effect, the Bayer filter array, and something called pixel quantisation too. So, without further ado, the most succinct version we could come up with to explain one of the most complex, misunderstood, groundbreaking technological advances in recent history is:
*DRUM ROLL PLEASE*
The process of making a digital image from light
1. Light is converted into electrical energy
A digital camera sensor (located right behind the lens) has millions of tiny light-sensitive pixels on a wafer-thin piece of silicon. When the sensor is exposed to light (by clicking the shutter button), each pixel converts the energy carried by the incoming photons into electrical energy, a process known as the photovoltaic effect (see the first sketch after this list).
2. Pixels interpret the light as greyscale
The light-sensitive pixels can only interpret light in monochrome (black and white). If you think of the camera sensor as a grid of pixels, each pixel is assigned a number depending on the brightness and intensity of the light it receives (sketched in code after the list).
3. RGB colour filters translate the image
To translate that monochrome grid of numbers into a colour image, the camera needs to know how much red, green and blue (RGB) light was reflected from the scene. It does this with a mosaic of coloured filters above the pixels, known as the Bayer filter array: each filter lets only one colour of light through to the pixel beneath it, and the two missing colour values for every pixel are then estimated from its neighbours (see the demosaicing sketch below).
4. The data is then stored as a file on a memory card
To view the image, the processed file is stored on your device’s memory card, and your devices (camera, laptop, phone etc.) display the data gathered by the sensor exactly as the file instructs them to (see the final sketch below).
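To make step 1 a little more concrete, here’s a minimal Python sketch of a pixel collecting charge in proportion to the photons that hit it. The quantum efficiency and full-well capacity below are made-up numbers for illustration, not the specs of any real sensor.

```python
import numpy as np

# Toy model of step 1: each pixel converts incoming photons into charge.
rng = np.random.default_rng(seed=0)

photons = rng.poisson(lam=2000, size=(4, 4))   # photons hitting each pixel during the exposure
quantum_efficiency = 0.5                        # fraction of photons that free an electron (illustrative)
full_well = 5000                                # maximum charge a pixel can hold (illustrative)

electrons = np.minimum(photons * quantum_efficiency, full_well)
print(electrons)  # charge collected per pixel, ready to be read out
```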
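Step 2 amounts to quantisation: the analogue charge read out of each pixel is turned into a plain number, with brighter pixels getting bigger numbers. The 12-bit range and full-well figure here are assumptions for illustration only.

```python
import numpy as np

# Toy model of step 2: charge is read out and quantised into a digital number.
def quantise(electrons, full_well=5000, bits=12):
    levels = 2 ** bits - 1
    # Brighter pixels (more charge) map to larger numbers.
    return np.round(electrons / full_well * levels).astype(np.uint16)

electrons = np.array([[0, 1250], [2500, 5000]], dtype=float)
print(quantise(electrons))
# [[   0 1024]
#  [2048 4095]]
```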
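Step 3, the Bayer filter array and the interpolation (demosaicing) that follows it, is sketched next. The RGGB layout is the standard Bayer pattern; the tiny 4x4 uniform grey “scene” and the crude neighbour-averaging are simplifications for illustration.

```python
import numpy as np

# Toy model of step 3: each pixel records only one colour through its Bayer
# filter; the two missing colours are estimated from neighbouring pixels.

def bayer_masks(h, w):
    """Boolean masks saying which colour each pixel of an RGGB sensor records."""
    r = np.zeros((h, w), dtype=bool); r[0::2, 0::2] = True
    b = np.zeros((h, w), dtype=bool); b[1::2, 1::2] = True
    g = ~(r | b)
    return r, g, b

def demosaic(samples, mask):
    """Fill in missing samples of one colour by averaging measured neighbours."""
    h, w = samples.shape
    out = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            ys = slice(max(y - 1, 0), min(y + 2, h))
            xs = slice(max(x - 1, 0), min(x + 2, w))
            patch, known = samples[ys, xs], mask[ys, xs]
            out[y, x] = patch[known].mean()
    return out

# A uniform grey scene: every point reflects equal red, green and blue.
scene = {"r": np.full((4, 4), 100.0), "g": np.full((4, 4), 100.0), "b": np.full((4, 4), 100.0)}
r_mask, g_mask, b_mask = bayer_masks(4, 4)

# What the sensor actually records: one colour sample per pixel.
mosaic = scene["r"] * r_mask + scene["g"] * g_mask + scene["b"] * b_mask

# Reconstruct a full-colour image, channel by channel.
rgb = np.dstack([demosaic(mosaic * m, m) for m in (r_mask, g_mask, b_mask)])
print(rgb[0, 0])  # ~[100. 100. 100.] -- the grey scene is recovered
```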
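Finally, step 4: the processed values are written to a file that any viewer knows how to read. The sketch below writes a plain-text PPM image purely because it needs no extra libraries; a real camera would write a JPEG or a proprietary raw file to the memory card.

```python
import numpy as np

# Toy model of step 4: store the processed pixel values in a standard image file.
def save_ppm(path, rgb):
    h, w, _ = rgb.shape
    with open(path, "w") as f:
        f.write(f"P3\n{w} {h}\n255\n")                     # header: format, size, max value
        for row in rgb.astype(int):
            f.write(" ".join(str(v) for v in row.flatten()) + "\n")

rgb = np.full((4, 4, 3), 100)    # stand-in for the demosaiced image from the previous sketch
save_ppm("photo.ppm", rgb)       # open this file in an image viewer to see the result
```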
So that’s how it’s done – a little photovoltaic sorcery
The conversion of light into electrical energy, interpreted pixel by pixel. Who knew? The breakthrough in camera sensor technology has had an irreversible impact on society. Digital cameras have opened up the world for all to see, with barely a corner of it left visually undocumented.
Social networks like Instagram have given content creators a sustainable income from their photographic expertise, helped along by the rise of smartphone photography accessories like Moment lenses and DJI’s mobile stabiliser. Sensor technology will only continue to improve, with the best smartphone cameras in 2018 already taking photographs that are hard to distinguish from those shot on professional cameras.