Sep 09, 2013

How many ppi before it is beyond the realm of human perception?

Apple really got people to pay attention to the ppi (pixels per inch) of screens with the brilliantly named Retina displays, but when do we reach the point where we lowly humans can no longer see an improvement? There has to be a physiological limit in there somewhere.

Chris had a good answer, but here's an interesting article on pixel density.

Pixel density

"Pixels per centimeter (ppcm), Pixels per inch (PPI) or pixel density is a measurement of the resolution of devices in various contexts: typically computer displays, image scanners, and digital camera image sensors.

Ppcm can also describe the resolution, in pixels, of an image to be printed within a specified space. Note, the unit is not square centimeters. For instance, a 100×100 pixel image that is printed in a 1 cm square has a resolution of 100 pixels per centimeter (ppcm). Used in this way, the measurement is meaningful when printing an image. It has become commonplace to refer to PPI as DPI, which is incorrect because PPI always refers to input resolution. Good quality photographs usually require 300 pixels per inch, at 100% size, when printed onto coated paper stock, using a printing screen of 150 lines per inch (lpi). This delivers a quality factor of 2, which delivers optimum quality.

The lowest acceptable quality factor is considered to be 1.5, which equates to printing a 225ppi image using a 150 lpi screen onto coated paper.[citation needed] Screen frequency is determined by the type of paper that the image is to be printed on. An absorbent paper surface, uncoated recycled paper for instance, will allow the droplets of ink to spread (dot gain), and so requires a more open printing screen. Input resolution can therefore be reduced in order to minimise file size without any loss in quality, as long as the quality factor of 2 is maintained. This is easily determined by doubling the line frequency.

For example, printing on an uncoated paper stock often limits the printing screen frequency to no more than 120 lpi, therefore, a quality factor of 2 is achieved with images of 240 ppi."
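The quality-factor rule in the quoted passage is simple arithmetic: required input resolution = screen frequency (lpi) × quality factor. A minimal sketch of that relationship (the function name is mine, not from the article):

```python
def required_ppi(lpi, quality_factor=2.0):
    """Input resolution needed for a given printing screen frequency."""
    return lpi * quality_factor

# 150 lpi on coated stock at the optimum quality factor of 2
print(required_ppi(150))        # 300.0 ppi
# 150 lpi at the lowest acceptable quality factor of 1.5
print(required_ppi(150, 1.5))   # 225.0 ppi
# 120 lpi on uncoated stock, quality factor of 2
print(required_ppi(120))        # 240.0 ppi
```

These three results match the figures in the quoted article: 300, 225, and 240 ppi.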

In and of itself, ppi means little to me once it passes the point of obvious pixelation. There are other things that matter, such as contrast, color saturation, and accuracy. It also matters how far away the display is. For example, think of those giant jumbotron displays at sports events. When you are 100 yards away, they can almost look like an HD TV. When you are 18” away, they look like a bunch of individual lights.


As for when it is beyond the capability of the average human eye to perceive additional pixel density, there is some disagreement. Most experts (whatever that means) say that the 300-500 PPI range is about the limit of the human eye, while others, such as the blog Christopher cited, claim a much, much higher density is required.


According to this blog:


"If a healthy adult brings any display screen or printed paper or whatever 4 inches (100 mm) from his or her face, the maximum resolution he/she can see at is 2190 ppi/dpi."


That's nearly 10 times sharper than the 239 PPI delivered by the Chromebook Pixel, the sharpest display of any laptop on the market. I tested one for two weeks and it was fantastic. It's hard to believe a picture could be much sharper, or that our eyes could perceive it, but that 2190 figure suggests it's possible.
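The distance dependence can be made concrete with a little trigonometry. A common rule of thumb puts normal (20/20) visual acuity at about one arcminute per pixel; the blog's 2190 ppi figure implies a much finer acuity, roughly 0.4 arcminutes. This sketch is my own illustration, not from either source:

```python
import math

def max_perceptible_ppi(distance_inches, arcmin_per_pixel=1.0):
    """Pixel density beyond which adjacent pixels subtend less than
    the given angular resolution and blur together for the viewer."""
    pixel_size = distance_inches * math.tan(math.radians(arcmin_per_pixel / 60))
    return 1 / pixel_size

# Phone held 12 inches away, assuming 20/20 (one arcminute) acuity
print(round(max_perceptible_ppi(12)))          # ~286 ppi
# 4 inches away, with the finer ~0.4 arcminute acuity the blog implies
print(round(max_perceptible_ppi(4, 0.4)))      # ~2149 ppi
# A jumbotron viewed from 100 yards (3600 inches)
print(round(max_perceptible_ppi(3600), 2))     # under 1 ppi
```

The one-arcminute assumption lands close to Apple's ~300 ppi Retina claim at typical phone viewing distances, and the jumbotron case shows why those displays look fine from the stands but resolve into individual lights up close.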
