The 4K display, with roughly 4,000 horizontal pixels, became common around 2014. Less than a decade later, 8K screens arrived, doubling that figure. But at what point do extra pixels stop improving what we can actually see?
Producing and powering higher-resolution screens consumes significant resources, so understanding where visual benefits level off is both an economic and environmental concern.
The limits of screen resolution
To explore this, researchers from the University of Cambridge and Meta Reality Labs examined how much detail the human eye can truly detect. Their study, published in Nature Communications, updates one of optometry’s oldest tools, the Snellen chart, for the digital age.
The Snellen chart, familiar to anyone who has had an eye test, was developed more than 160 years ago to assess how clearly people could read letters from a distance. However, our daily habits have shifted dramatically since then. Most of us now spend more time focusing on screens than on printed pages, prompting scientists to question whether this nineteenth-century method still accurately reflects the modern visual experience.
Pixels per degree: testing the limits of the human eye
Instead of printed letters, the researchers created a digital display designed to test how the human eye perceives fine patterns on screens. They measured clarity in pixels per degree (PPD), the number of pixels visible within a single degree of vision. Unlike total pixel count, which measures overall screen resolution, PPD provides a direct measure of how sharp an image appears to a viewer.
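As an illustration of the idea, PPD can be estimated from a display's horizontal resolution, physical width, and viewing distance. The function below is a hypothetical sketch of that geometry (the name and parameters are not from the study):

```python
import math

def pixels_per_degree(h_resolution, screen_width_m, viewing_distance_m):
    """Approximate pixels per degree (PPD) at the centre of the screen.

    One degree of visual angle at distance d spans roughly
    2 * d * tan(0.5 degrees) on the screen surface; dividing that span
    by the physical width of one pixel gives pixels per degree.
    """
    pixel_pitch = screen_width_m / h_resolution            # width of one pixel
    span_per_degree = 2 * viewing_distance_m * math.tan(math.radians(0.5))
    return span_per_degree / pixel_pitch

# Example: a 55-inch 4K panel (~1.22 m wide, 3840 horizontal pixels)
# viewed from 2.5 m works out to roughly 135-140 PPD.
ppd = pixels_per_degree(3840, 1.22, 2.5)
```

Note that PPD rises with viewing distance: the same panel packs more pixels into each degree of vision the farther away you sit, which is why total resolution alone says little about perceived sharpness.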
Volunteers viewed a range of screen patterns in both grayscale and color while the researchers gradually moved the images closer or farther away. Participants identified the point at which the individual lines could no longer be distinguished from one another. The study also tested peripheral vision by displaying patterns toward the sides of the visual field. Results showed that human eyesight typically detects detail at around 60 PPD, equivalent to the standard 20/20 vision benchmark, but actual performance often exceeded this level depending on color and context.
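Given a threshold like 60 PPD, the same geometry can be inverted to ask at what distance a particular display reaches it. The helper below is a rough sketch under the same simplifying assumptions (flat screen, on-axis viewing); the function name and example figures are illustrative, not from the study:

```python
import math

def distance_for_ppd(h_resolution, screen_width_m, target_ppd):
    """Viewing distance (metres) at which the display's centre
    delivers the target pixels-per-degree value.

    Derived from: 2 * d * tan(0.5 deg) = target_ppd * pixel_pitch.
    """
    pixel_pitch = screen_width_m / h_resolution
    return target_ppd * pixel_pitch / (2 * math.tan(math.radians(0.5)))

# Distance at which a 55-inch 4K panel (~1.22 m wide) hits the
# 60 PPD benchmark: roughly 1.1 m. Sit closer than that and the
# pixel grid falls within the eye's resolving limit.
d = distance_for_ppd(3840, 1.22, 60)
```

Matching the study's finer grayscale limit of about 94 PPD would require sitting farther back still, which is one way to see why extra resolution pays off only at close range.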
Read it in black and white
Interestingly, participants could perceive slightly finer detail in grayscale than in color. On average, the limit for black-and-white images was around 94 PPD, while red and green averaged near 89 PPD, and blue and yellow dropped to about 53 PPD. The researchers suggest that this occurs because the human brain is less efficient at processing color detail than light-and-dark contrast.
This difference was particularly noticeable in peripheral vision, where our eyes are less sensitive to color variation. A clearer understanding of these visual thresholds is valuable for today’s display technologies and the next generation of virtual and augmented reality systems.
By identifying how much resolution the human eye can meaningfully detect, manufacturers can design screens that balance image quality, cost, and energy use. Ultimately, these insights can help ensure that display improvements serve real visual benefits rather than simply adding unnecessary pixels.

