Sensor Pixel Size as a Determinant of Digital Camera Image Quality

Revision 0.22
Copyright © 2007-2011 by .
All rights reserved.

The purpose of this web page is to explain the concept of sensor element size and to provide some examples from actual digital cameras, so that you can get an idea of which digital cameras are more likely to offer the best image quality.


A digital camera sensor is what senses an image's light and converts it into an electronic image. It consists of an array of millions of pixel sensors. Typically each pixel sensor is made up of sensors for component colors, normally red, green, and blue. These sensors are usually adjacent to each other, but in some cameras they are stacked on top of one another.

What really matters in determining digital camera image quality ("IQ") is actually not how many megapixels the camera has, but rather the size of each of these microscopic pixels. In this article, I measure pixel size in square micrometers (µm²).

The size of a pixel directly impacts how much noise an image will have in low light, and in some cases even in daylight. The bigger the pixel is, the lower the noise because more photons can reach a bigger pixel sensor.

To illustrate, suppose you have two cameras that both have a sensor that is 1/1.8" in size (since the numerator of this designation is always 1, the size can be quoted as the single number 1.8), but different numbers of pixels.

  • The one with more pixels crams more pixels into the same space so they must be smaller than in the other camera.
  • If the same number of photons is reaching each sensor per second, then fewer photons will be reaching each individual pixel in the higher-megapixel camera.

Therefore it initially appears that the camera with fewer megapixels, and therefore larger pixels, will have better image quality with less noise.

This is a good reason to be skeptical about the marketing hype and salespeople who claim or imply that it is better to purchase higher megapixel cameras.

Of course, pixel size is not the sole determinant of image quality. There are others:

  1. Sensor quality differs between sensor chip manufacturers.

  2. Sensor quality varies within the range of products from any one sensor manufacturer. Every manufacturer has a high-end line and a low-end line.

  3. Furthermore, any one camera maker (e.g. Canon) may use sensor chips from different manufacturers across its camera models. So it might put a cheap sensor from one manufacturer in a cheap camera, but a pricey sensor from another manufacturer in its high-end camera.

  4. Newer sensor technology is generally better than older technology, because chip manufacturers improve the optical and electrical characteristics of their sensors every year.
    So while the pixels of the Canon A80's 4MP sensor have a large area of 10 µm², their quality could be inferior to that of a sensor made 3 years later that uses half the area per pixel.
    Thus the technology's generation matters.

  5. Optics can play a big role at high megapixel counts. Not all lenses are equally finely ground, and some high-MP cameras (particularly those with over 12 megapixels) are sold with lenses that are just barely sufficient to resolve that many pixels.
  6. Lastly, whether the sub-pixels (red, green and blue) are adjacent or stacked is a big factor in image sharpness. Stacked sub-pixels are used in Foveon™ sensors; adjacent sub-pixels are used in Bayer sensors.

Sensor geometry


Now let's talk about the geometry of the sensors themselves. A sensor's overall width and height typically have a 4:3 ratio. Each pixel in a sensor has 4 color-specific sensor elements -- two green elements, one blue and one red -- placed adjacently (except in the Foveon sensor, where they are stacked). Both pixels and color-specific sensor elements are square.

1/X designation

For cheaper cameras, the sensor size is always expressed as 1/X", where X is a number that varies by sensor.

For expensive cameras, manufacturers instead usually give the width and height in millimeters, or the name of a standard sensor size, e.g. APS-C.

The 1/X" format is an old way of describing sensor sizes, devised for Vidicon television camera tubes. It means that the diagonal of the 4:3 sensor is two thirds of 1/X inches. The reasons behind this arcane standard are best left to the history books, and not bothered with here.

Determining pixel dimensions from sensor width & height

If you are told the actual dimensions of the sensor, determining pixel area is simple:
Area of entire sensor (in mm²) = width in mm × height in mm
Area of entire sensor (in µm²) = 1,000,000 × area in mm²
Area of one pixel (in µm²) = area of sensor in µm² / number of pixels
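The arithmetic above can be sketched as a small Python function (the function name is my own, for illustration):

```python
def pixel_area_um2(width_mm, height_mm, total_pixels):
    """Pixel area in square micrometers (µm²) from sensor dimensions.

    width_mm, height_mm: sensor dimensions in millimeters
    total_pixels: the *actual* (not effective) pixel count
    """
    sensor_area_mm2 = width_mm * height_mm          # area in mm²
    sensor_area_um2 = 1_000_000 * sensor_area_mm2   # 1 mm² = 1,000,000 µm²
    return sensor_area_um2 / total_pixels

# Example: a 36 x 24 mm full-frame sensor with 12.9 million pixels
print(round(pixel_area_um2(36, 24, 12_900_000), 1))  # → 67.0
```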

Determining pixel dimensions from 1/X size

While often it is possible to learn the width and height of a sensor from a company's camera manual or specification sheet, sometimes all you can get quickly is the 1/X" value.

To determine 4:3 sensor width and height from 1/X", let's solve this equation:

0.667 / X = sqrt( (4a)² + (3a)² )
Or simply...
0.444 / X² = 16a² + 9a²
And from this we get...
0.444 / X² = 25a²
And then this...
a = sqrt( 0.444 / (25X²) )
And like so...
a = 0.667 / (5X)
And finally...
Width = 4a = 4 × 0.667 / (5X)
Height = 3a = 3 × 0.667 / (5X)
...but we need the total area, too:
Area = width × height = 12a² = 0.21333 / X²
Plus we need to convert to metric!
Width in micrometers = 25,400 × width in inches
Height in micrometers = 25,400 × height in inches
Area in µm² = 645,160,000 × area in square inches
So, the final equation is:
Area of the entire sensor in µm² = 137,630,000 / X²
The area occupied by one pixel is:
Pixel area = area of sensor in µm² / Y
...where Y = total pixels

Camera manufacturers typically provide two numbers for the total pixels in a camera, e.g. 10MP effective and 10.3MP actual. Use the actual number of pixels in the equation above, i.e. the higher of the two numbers usually specified.
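The whole 1/X computation can be sketched in a few lines of Python (function names are my own, and a 4:3 sensor is assumed, per the derivation above):

```python
def sensor_area_um2_from_x(x):
    """Sensor area in µm² for a 4:3 sensor described as 1/X inches."""
    return 137_630_000 / x**2

def pixel_area_from_x_um2(x, actual_pixels):
    """Pixel area in µm²; use the *actual* pixel count, not the effective one."""
    return sensor_area_um2_from_x(x) / actual_pixels

# Example: a 1/1.8" sensor with 8.2 million actual pixels
print(round(pixel_area_from_x_um2(1.8, 8_200_000), 1))  # → 5.2
```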

Point and shoot pixel area values

Camera                  Sensor size  Megapixels  Pixel area (µm²)  Pixel density (MP/cm²)
Canon Powershot A630    1/1.8"       8.2MP       5.2               19.3
Casio Exilim Z1080      1/1.75"      10.3MP      4.36
Canon Powershot G7      1/1.8"       10.3MP      4.1
Canon Powershot A640    1/1.8"       10.3MP      4.1
Canon Powershot A650    1/1.7"       12.4MP      3.8
Casio Exilim Z1200      1/1.7"       12.39MP     3.8
Canon Powershot G9      1/1.7"       12.4MP      3.8
Nikon Coolpix 5100      1/1.72"      12.43MP     3.74
Canon Powershot A570IS  1/2.5"       7.1MP       3.1
Canon Powershot A720IS  1/2.5"       8.3MP       2.6 (!!)
Canon Powershot A80     1/1.8"       4.1MP       10.0

Special case of a 16:9 Panasonic camera

The Panasonic Lumix DMC-LX2 is the same camera as the Leica D-LUX 3. Both manufacturers describe the sensor, which has a "widescreen" 16:9 format, as a 1/1.65", even though that standard was invented for old-time television cameras and assumes a 4:3 shape. I was unable to find the manual online to get the real dimensions. The Panasonic version of this camera generally does not review well, perhaps because the odd-dimensioned sensor is a new and unusual creation that has not been refined yet.

Camera                   Sensor size  Megapixels  Pixel area (µm²)
Panasonic Lumix DMC-LX2  1/1.65"      10.2MP      5.0

Bayer digital SLR pixel area values

Camera                              Sensor size                 Megapixels  Pixel area (µm²)  Pixel density (MP/cm²)
Nikon D700                          36 x 24 mm CMOS sensor      12.9MP      67.0              1.5
Nikon D3                            36 x 24 mm CMOS sensor      12.9MP      67.0              1.5
Nikon D70                           23.7 x 15.6 mm CCD sensor   6.3MP       58.7              1.7
Nikon D40                           23.7 x 15.5 mm CCD sensor   6.3MP       58.7              1.7
Canon EOS 350D / Digital Rebel XT   22.2 x 14.8 mm CMOS sensor  8.2MP       40.1              2.5
Nikon D60                           23.6 x 15.8 mm CCD sensor   10.8MP      34.6              2.9
Nikon D80                           23.6 x 15.6 mm CCD sensor   10.8MP      34.5              2.9
Nikon D40X                          23.6 x 15.6 mm CCD sensor   10.8MP      34.5              2.9
Canon EOS 400D / Digital Rebel XTi  22.2 x 14.8 mm CMOS sensor  10.5MP      31.3              3.2
Canon EOS 40D                       22.2 x 14.8 mm              10.5MP      31.3              3.2
Nikon D90                           23.6 x 15.8 mm CMOS sensor  12.9MP      28.9              3.5
Canon EOS 500D / Digital Rebel T1i  22.3 x 14.9 mm CMOS sensor  15.1MP      22.0              4.5
Canon EOS 7D                        22.3 x 14.9 mm CMOS sensor  18MP        18.5              5.4

Foveon digital SLR pixel area values

Foveon sensors have some natural advantages over the Bayer format. Whereas a Bayer pixel's area is a mosaic, divided between two green sensors (½ the area combined), a red sensor and a blue sensor (¼ the area each), the Foveon's entire pixel area is available to each color because the color sensors are stacked.

Therefore Foveon color sensors receive more light than Bayer elements do given the same area. Or to put that differently, a Bayer sensor would have to be larger than a Foveon for its pixel sensors to receive the same amount of light.
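A quick arithmetic sketch of this difference, using an arbitrary pixel area of 60 µm² for both designs:

```python
PIXEL_AREA_UM2 = 60.0  # an arbitrary example pixel area

# Bayer: the pixel's area is split among the color elements
bayer_green = 0.50 * PIXEL_AREA_UM2  # two green elements, half the area combined
bayer_red   = 0.25 * PIXEL_AREA_UM2
bayer_blue  = 0.25 * PIXEL_AREA_UM2

# Foveon: stacked layers, so each color sees the full pixel area
foveon_each = PIXEL_AREA_UM2

print(bayer_red, foveon_each)  # → 15.0 60.0
```

So for equal pixel size, each Foveon color layer collects light over 4× the area of a Bayer red or blue element, and 2× that of the green pair.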

In addition, because all sensor colors are at one photosite, Foveon sensors naturally produce sharper images than Bayer sensors, even those with 2 times as many pixels.

There is a wrinkle in the stacking approach, however. The sensor is made of silicon, which is only translucent. In other words, it passes some light but absorbs some as well. Consequently, the deeper into the silicon you go, the fewer photons arrive. Furthermore, different wavelengths penetrate to different depths. Blue penetrates silicon the shallowest, so the blue detector is the top and thinnest layer. Green penetrates deeper, but not as deeply as red, so green is the next layer below blue, and red is the deepest and thickest layer.

My experiments in infrared photography using the Sigma DP1 showed that the Foveon sensor is highly sensitive to these long wavelengths despite fewer photons reaching the red layer.

As far as I know, the DP1 and DP2 use the same sensor; however, there are varying reports as to how many pixels each has. So here are two calculations:

Camera            Sensor size     Megapixels  Pixel area (µm²)  Pixel density (MP/cm²)
Sigma DP2 / DP2s  20.7 x 13.8 mm  4.7MP       60.8              1.6
Sigma DP1         20.7 x 13.8 mm  4.8MP       59.5              1.6

