Pixels vs DPI. How to compare?

5 replies
smarty
Offline
Joined: Oct 29 2001

Hi,

I am looking at getting a CMOS digital camera that gives me a resolution of 3.17 megapixels with a sensor head that is 2056 x 1544 pixels. The specifications also tell me that it has 12 bits per pixel.

How does one convert from this to dots per inch? Presumably there is a scaling factor, or is there a standard equation you can use?

Also, what is the equation for working out how many megabytes this will be per image assuming no compression?

Thanks in advance; I realise this is a basic question. Mart.

Alan Roberts at work
Offline
Joined: May 6 1999

It's a very basic question. And there's no answer. If you want to print your pictures at 1 pixel per inch or 1000 per inch, that's entirely up to you. The image resolution is defined by the number of pixels, not by the dpi.

So, your 2k by 1.5k picture will scale to whatever size you want to print it. My stills camera (Olympus C3030) is 2k by 1.5k (3.3m pixels, but that's another story) and I can happily print it at full A4 size. Cropped down to 1k by 0.75k it will still, just about, print to A4, although pixellation is visible if you look closely.

The image dimensions in pixels aren't directly related to the number of cells in the CCD, because it all depends on the pattern of RGB-sensitive cells on it. There may well be 3.3 million of 'em, but they are split between red, green and blue. In the usual Bayer layout about 50% are green and 25% each red and blue, because we're not equally sensitive to colour resolution. The best arrangements manage it quite well, such that 3.3 million cells make about 2k by 1.5k and give good pictures.
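If it helps to see the counting, here's a rough Python sketch of the usual Bayer 2x2 layout (that exact pattern is an assumption on my part; cameras vary):

```python
# Rough sketch of the common Bayer colour-filter layout (assumed here;
# other patterns exist). Each 2x2 block of sensor cells is filtered:
#   G R
#   B G
# so green gets half the cells, red and blue a quarter each.
width, height = 2056, 1544
total_cells = width * height          # ~3.17 million cells

green_cells = total_cells // 2        # 50% green
red_cells   = total_cells // 4        # 25% red
blue_cells  = total_cells // 4        # 25% blue

print(f"total: {total_cells:,}")
print(f"green: {green_cells:,}  red: {red_cells:,}  blue: {blue_cells:,}")
# The camera interpolates ("demosaics") these single-colour samples
# into a full-resolution RGB image of the same width x height.
```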

mdoragh
Offline
Joined: Dec 5 2000

Hope you can follow what I am saying here. Writing things down isn't my strong point, so I hope it makes enough sense:

Alan is right: the dots per inch (dpi) is whatever you choose. BUT there are some rules of thumb which work well. Basically it depends what quality you need. In theory, higher dpi = higher quality, but there are limits beyond which you won't see any improvement.

Basically the perceived quality depends on your proposed output medium, so you need different dpi for similar quality on different media. If it is for use on a computer screen only, then you will only require about 75dpi. Beyond this the detail will be invisible unless you zoom in close (or have an exceptionally detailed monitor).

If you (like most people) want to print it out on your printer, then it will depend on how good your printer is. Normally a claimed 600dpi printer can only manage about 200dpi of high-quality colour, as it needs to use a grid of dots to make the right shade. This figure, I believe, is called the printer's lpi (lines per inch). My old Hewlett Packard printer claimed to be 600dpi, but its booklet explained that a picture only needed 200dpi because the printer used a grid (presumably 3x3) to make each shade correctly. So that printer was 200lpi, which meant it only needed about 200dpi in the source picture.
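Here's that grid arithmetic as a couple of lines of Python (remember the 3x3 grid is my guess, not a published figure for any particular printer):

```python
# A printer's advertised resolution gets divided by the side length
# of the dither grid it uses to mix each shade (assumed 3x3 here).
claimed_dpi = 600     # printer's advertised resolution
grid_side   = 3       # assumed dots per side used to mix each shade

effective_lpi = claimed_dpi / grid_side
print(effective_lpi)  # 200.0 -> source pictures only need ~200dpi
```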

I have seen Alan list 250dpi elsewhere as a good rule of thumb. Basically 200dpi is your minimum, and you might get a better result from a higher dpi on a newer printer. Beware that the printer's maximum quality isn't the dpi listed (e.g. 28,800 x 14,400 dpi); for a printer like that, 320dpi will probably be the highest it can actually resolve in full colour.

So to find out how big your camera will allow you to print a picture, you divide your total dots by your dots per inch, which leaves the size in inches.

So your 2056 dots / 200dpi = about 10 inches,
and 1544 dots / 200dpi = just less than 8 inches.

Or to think of it the other way round: if you want 200dpi and a 10 inch picture, then you need about 2000 pixels (dots): 200 dots in each of the 10 inches.
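The same sums in a few lines of Python, assuming the 200dpi rule of thumb:

```python
# Print size = pixels / dpi, in each direction.
pixels_wide, pixels_high = 2056, 1544
target_dpi = 200                      # rule-of-thumb print resolution

print(f"{pixels_wide / target_dpi:.1f} x {pixels_high / target_dpi:.1f} inches")
# -> 10.3 x 7.7 inches, roughly A4

# Or the other way round: pixels needed for a given print size.
wanted_inches = 10
print(wanted_inches * target_dpi)     # 2000 pixels
```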

So approximately, you can print A4-size pictures without losing ANY quality, because at sizes smaller than that the printer will probably be the limiting factor. Larger than A4 can be done happily with only minor losses in quality.

Just for reference: at work we have a 1.3 megapixel camera. We regularly print the pictures at A4 size and find them acceptable. You can even push it to A3, but then there are very noticeable pixels in it; viewed at a distance (as large pictures usually are) it looks fine though.

The maths for raw image sizes depends on whether your 12 bits are the total, or per channel, e.g. red, green and blue at 12 bits each = 36 bits.

But again, it's simple maths.
"X pixels" x "Y pixels" x "total colour depth in bits" / 8 = raw size in bytes, excluding the header overheads, which should be small. (The divide by 8 converts from bits back to bytes.)

So in your case: 2056 x 1544 x 12 / 8 = 4,761,696 bytes,
or if it is 12 bits per channel: 2056 x 1544 x 36 / 8 = 14,285,088 bytes.
Thinking about this in word-picture form: each pixel in the picture needs 12 bits of data to describe what colour it is. If you know how many pixels there are, you just multiply that by the depth in bits to find the file size in bits; dividing by 8 converts to bytes.
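And the same sum as Python, covering both readings of the 12-bit spec:

```python
# Raw size = width x height x bits-per-pixel / 8 (bits -> bytes),
# ignoring the small file-header overhead.
width, height = 2056, 1544

for bits_per_pixel in (12, 36):       # 12 bits total, or 12 per channel
    size_bytes = width * height * bits_per_pixel // 8
    print(f"{bits_per_pixel} bits/pixel: {size_bytes:,} bytes "
          f"({size_bytes / 2**20:.1f} MB)")
# 12 bits/pixel: 4,761,696 bytes (4.5 MB)
# 36 bits/pixel: 14,285,088 bytes (13.6 MB)
```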

Good luck in deciphering this....

Mike

smarty
Offline
Joined: Oct 29 2001

Thanks, guys. I am bogged down in specifications, and the simple stuff evades you at the best of times! I appreciate your time and effort in assisting.

Alan Roberts at work
Offline
Joined: May 6 1999

Yep, that's all about right. 250dpi in the print will make the pixels smaller than you can see, whatever the size of the picture. You can work it out from there.

Steve Allen
Offline
Joined: May 4 1999

Smarty,

Try taking a look at this website; it certainly helped me understand the relationship between pixels, dpi, and getting a good print or web-based picture. The URL is www.scantips.com. I understood it, so it can't be too difficult. The website is mainly about scanning photos, but the explanation fits digital cameras as well.

cheers

Steve Allen