Tips for Mac users

resolution, part 3

tvalleau

Bruce asked some questions (below) that can easily be answered now, and not really as a detour, either. (The answer to part two of his question, about paper, will come separately.)

But, as to how the eye/brain reacts to dpi/ppi…

First: dots per inch (DPI) is strictly a printing term, and should not be (but often is) used when discussing computer monitors. Frankly, it's hardly a major issue, since in casual conversation we generally know whether we're discussing a computer screen or a book… and in those practical terms, the two pretty much mean the same thing. (It's just kinda nice to use the correct words…)

Generally speaking, however, pixels refer to transmitted light, and dots refer to reflected light. So, you’ll have pixels with a digital camera and a scanner and a monitor, but your photo prints, books and magazines (which you can’t see in the dark because there is no light to reflect off the page) are “dots.”

First, let's look at your monitor. It's probably right around 100 PPI. (I'm going to limit this to LCDs for convenience.) That number is fixed. (The iPod Touch is 160 PPI, which is why that screen is so easy on the eyes despite its small size.)

So, what it really comes down to is the native resolution of your monitor, and it has almost nothing to do with the “resolution” of the picture. (The other thing is how far away you are from an image when you see it, but I’ll get into that later.)

OK: new word – 'resolution.' Strictly speaking, a digital image has no resolution in and of itself. It simply has dimensions: 640 x 480; 3000 x 4000; and so on. "Resolution" is "resolving power," and a photo takes it on only in comparison to something else, such as the size of the subject of the photo, or the photo as shown on a screen or printed in a book.

Let me get the first one out of the way, er.. first. If you take a photograph of a penny with a macro lens, so that the penny fills the entire image, then a sensor that has 12,000,000 "pixels" (photosites) will resolve more detail than a sensor with 300,000. You've divided the image up into many more discrete points of information. Thus the 12MP camera can be said to have greater resolution than a camera that takes a 640 x 480 (1/3MP) image.

But the photos -themselves- in either case, do not have “resolution” – just dimensions.

Now you can start talking about resolution in the second sense when you start specifying size (in much the same sense that we specified the size of the penny object).

And that’s where the PER INCH as in DPI or PPI comes in.

Let’s work with the 3000 x 4000 12MP image. That’s a fixed size. It never changes in this example.

If you decide to print that image out at 300 dpi, then the width of the printed image will be 3000/300 or 10 inches. Each inch of the resulting print will have 300 of the original 3000 pixels allotted to it.

Thus the resolution of that -print- is 300 dpi.

If you decided to print that very same file at 150 dpi, then the resulting print would be 20 inches wide; and if printed at 75 dpi, it would be 40 inches wide.

But the resolution at 75 dpi is 1/4 what it is at 300 dpi. Bigger print; lower resolution.
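
(If you'd rather see that arithmetic spelled out, here's a tiny Python sketch of it – nothing in it beyond the numbers already used in this example.)

IMAGE_WIDTH_PX = 3000  # the 12MP example image is 3000 x 4000 pixels

def print_width_inches(width_px: int, dpi: int) -> float:
    """Width of the print when width_px pixels are laid down at `dpi` dots per inch."""
    return width_px / dpi

for dpi in (300, 150, 75):
    print(f"{dpi} dpi -> {print_width_inches(IMAGE_WIDTH_PX, dpi):.0f} inches wide")
# 300 dpi -> 10 inches, 150 dpi -> 20 inches, 75 dpi -> 40 inches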

And now we get to viewing distance, and the human eye. Take that first 10″ print and put it on the wall 10 feet away from you. Does all that exquisite detail do you any good? Nope: it's lost because you are too far away to see it in that little print. But put that 75 dpi, 40″ wide print on the same wall, and suddenly you can see things you never saw before. It will look wonderfully detailed… yet viewed up close it will look terrible, and only the 10″ one will look good held in your hands.

If you’ve ever seen one of those JumboTron stadium-sized monitors up close, you’ll find that each dot/pixel/point is about 1/2″ in size. Yep: 2 dpi. But that’s fine: no one ever looks at it from 2 feet away; 200′ is more like it.

How does that work? Hold your thumb and forefinger about an inch apart, at arm's length. Put a ruler just touching them. How many 1/4″ marks can you see between your fingers? Should be four. That's an effective "marks per inch" of four. Now give that ruler to a friend, have them stand 10 feet away, and look at the ruler between your fingers again. How many 1/4″ marks can you see now? Should be about 48. That's an effective "marks per inch" of 48.

(Thus the apparent resolution of an image has to do with how far away you are from it. The ruler didn’t change; the one-inch space didn’t change. To see the same resolution (that is 4 marks per inch) your ruler at 10 feet would need marks every three inches, not every 1/4 inch.)
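
(And if you'd like to fiddle with that scaling, here's a small sketch; the only assumption is the one the example already makes – move the ruler N times farther away and N times as many marks fit into the same one-inch gap between your fingers.)

def effective_marks_per_inch(marks_per_inch: float, distance_ratio: float) -> float:
    # Moving the ruler `distance_ratio` times farther away packs that many more
    # marks into the same visual angle your one-inch finger gap covers.
    return marks_per_inch * distance_ratio

print(effective_marks_per_inch(4, 12))  # 48.0 -- the example's ruler moved out to ten feet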

So, when someone specifies a photo as "…your file should be 1000 x 2000 at 300 dpi," they have not got a clue what they are talking about. A digital image file is always specified by its dimensions, and if they specify a DPI, it -only- makes sense if they specify the "I" – the inches – as well. "5 x 7 inches @ 300 dpi" is an image that is 1500 x 2100 pixels.
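
(Same arithmetic, run the other way – from inches-plus-dpi back to pixels:)

def pixels_needed(width_in: float, height_in: float, dpi: int) -> tuple:
    """Pixel dimensions required to print width_in x height_in inches at the given dpi."""
    return round(width_in * dpi), round(height_in * dpi)

print(pixels_needed(5, 7, 300))  # (1500, 2100) -- the "5 x 7 @ 300 dpi" example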

So: armed with that info, let’s look at your question about why a nice, small jpg looks fine on your screen, but terrible when printed out.

A very common size on the web is 320 x 240, which on your monitor will make an image about 3 inches wide and just under 2.5 inches high. Now, part of the reason that looks fine is that you're seeing transmitted light: it is just like shining a flashlight into your eyes. That obscures/blurs detail. And part is, as you suggested, because your mind fills in a lot. Part can be skillful trickery (read: psychology) by the image creator (sharpening does not add any detail; it just makes you think it's there). (Your TV probably has a resolution of about half that of your monitor – around 50 ppi – and that's for HDTV!)

On the other hand, in reflected light an image printed at 100 dpi just doesn't cut the mustard. You want something at least 150 and likely higher (288–360 for a photograph). You can get that out of your 320 x 240 image, of course… just tell your printer to print it at 300 dpi. And the resulting photo will be about 1 inch wide and 4/5 of an inch high, and will look plenty detailed… if you like postage-stamp sized photos.

So what if you love that 320 x 240 picture and want to print it out at 7 inches wide? Well, you can if you set the dpi to 45. And it will look pretty terrible… (Unless, of course you look at it from about 10 feet away…)
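
(Both of those trade-offs are just the same division again; a quick sketch:)

WIDTH_PX, HEIGHT_PX = 320, 240

# Print it at 300 dpi: postage-stamp size, but plenty of detail per inch.
print(WIDTH_PX / 300, HEIGHT_PX / 300)  # ~1.07 x 0.8 inches

# Force a 7-inch-wide print instead: the density has to drop.
print(WIDTH_PX / 7)  # ~45.7 dpi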

Or, you can tell your software to "enlarge it," but the simple fact is that no software can make up details from something that is not there in the first place. To print that 320 x 240 at 300 dpi and 7 inches wide, those 320 existing pixels have to magically become 2100 pixels. Where do the extra 1780 pixels come from? Thin air, is the answer. The software can try various tricks (some pretty sophisticated, frankly) but really it only has one original pixel for every 6.5 it's trying to make up.
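
(If you want to watch the software try, here's a rough sketch using the Pillow imaging library – the file names are placeholders I made up, and Lanczos resampling is just one of those "tricks." It still has only about one real pixel for every 6.5 it hands back along the width.)

from PIL import Image

src = Image.open("small_photo.jpg")               # assume a 320 x 240 original
big = src.resize((2100, 1575), Image.LANCZOS)     # ~6.5x upscale; the new pixels are interpolated guesses
big.save("enlarged_for_print.jpg", dpi=(300, 300))  # 2100 px / 300 dpi = 7 inches wide

print(2100 / 320)  # ~6.56 -- output pixels per original pixel along the width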

The result is an image that looks, er, odd, to say the least. (Unless, as I said, you look at it from across the room.)

In sum, then, the one-word answer to your question as to why one looks good and the other doesn't is: "resolution."

In fact, if you want to see it in action, take a 640 x 480 photo and put it up on your computer screen, and also on an iPod Touch. You’ll instantly see that the iPod image looks much better, because the screen resolution is so much higher. If you make them each the same physical size (use a ruler) the detail will be lost on your computer monitor because it can resolve only 100 ppi, while the iPod can resolve to 160 ppi.
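
(In numbers – the 100 and 160 ppi figures are the ones used above:)

WIDTH_PX = 640
MONITOR_PPI, IPOD_PPI = 100, 160               # the figures used in this post

ipod_width_in = WIDTH_PX / IPOD_PPI            # 4.0 inches: the image at native size on the iPod
monitor_px_used = ipod_width_in * MONITOR_PPI  # only 400 monitor pixels fit in those same 4 inches

print(ipod_width_in, monitor_px_used)  # 4.0  400.0 -- 240 pixels' worth of detail the monitor can't show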

(Which again demonstrates the aspect of this that is, as you noted, in the realm of the mind: what you get used to seeing. Another way to see the difference, side by side, is to compare a photographic print with a reproduction in a book. The book is likely to be about 150 dpi, while the photo may be as high as 1440 or 2880. You can see the 'dots' in the book, but you cannot see them in the photo with an unaided eye.)

Perhaps that will help clear up the confusion I hear in your statement "…even if it has only been reduced to say 150 or 300 dpi or ppi…" I hope you now see that you've not 'reduced' the image at all. The image is still the same size it always was, and in fact, if you want it printed at the same size you see on the screen, you will have to use some fancy software to -enlarge- the image (not reduce it) so that it can print at 150 or 300 dpi. If you don't do that, you're just taking a tiny square pixel and blowing it up into a large square block.

Out of energy… off to spend some time with my wife.

Tracy

On Nov 29, 2009, at 6:26 PM, Bruce wrote: [snip]

Your comment below reminds me to ask you, when you get to it (in a few days?), to please clarify an underlying aspect that I have never been clear about. That is the difference between print standards and screen viewing standards. (By the way, let’s continue to discuss this at the not-TOO-technical level we have been using so far here.)

I think what I am asking about is mostly a matter of human visual perception (or if you prefer, of brain integration of visual information). In short, why does an image look just fine to me on a monitor at a "display" resolution reduced down to 72 or 96 dpi or ppi or whatever? Yet even a total amateur can be dissatisfied by printing out the same image, even if it has only been reduced to say 150 or 300 dpi or ppi. Why should the paper printing process be so closely examined by our eyes and brains, while a monitor image can get away with so much less information and still be pleasing?

Or am I perhaps even fooling myself with this observation?

But it seems to me that this needs to be addressed before much can be said about the needed ppi, dpi, lpi, or whatever on printing.
