With 120 million monochrome and 5 million color receptors, the human eye and brain can do what even our most advanced cameras cannot, according to computer graphics pioneer Tim Sweeney of Epic Games.
With an effective resolution of about 30 megapixels, the human eye gathers information at roughly 72 frames per second, which explains why many gamers debate whether frame rates above 70 are needed in games at all.
To match the limits of the eye's visual fidelity, a display would need a resolution of 2560×1600 at a 30-degree field of view, or 8000×4000 at a 90-degree FOV. (Those figures compare with 2048×1536 for the new iPad's "retina" display.)
That 2560×1600 resolution is what we see today on modern 30-inch LCD panels, but 8000×4000 is about 16x the pixel count of current HDTVs.
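Those ratios are easy to sanity-check with a quick pixel count (assuming 1920×1080 for "current HDTVs", which the article does not state explicitly):

```python
panel = 2560 * 1600   # modern 30-inch panel: ~4.1 megapixels
target = 8000 * 4000  # 90-degree FOV target: 32 megapixels
hdtv = 1920 * 1080    # assumed 1080p HDTV: ~2.1 megapixels

print(f"target vs HDTV: {target / hdtv:.1f}x")  # ~15.4x, i.e. the "about 16x" cited
```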
According to the Nyquist sampling theorem, which describes how much information is required to faithfully reproduce a signal at a given resolution, game engines would need to process about 40 billion triangles per second to reach perceptual perfection at 8000×4000.
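To get a feel for what that triangle budget means, here is some back-of-envelope arithmetic from the figures above (not Sweeney's actual derivation): dividing 40 billion triangles per second across an 8000×4000 frame, at the 72 fps cited earlier, works out to roughly 17 triangles per pixel per frame.

```python
triangles_per_sec = 40e9        # Nyquist-derived target from the article
pixels_per_frame = 8000 * 4000  # 32 megapixels at the 90-degree FOV resolution
fps = 72                        # the eye's frame rate cited above

pixel_samples_per_sec = pixels_per_frame * fps
print(triangles_per_sec / pixel_samples_per_sec)  # ~17.4 triangles per pixel per frame
```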
Currently, the fastest GPU for triangle processing can handle 2.8 billion per second, and Sweeney claims we are only a factor of 50x away from that goal. That gap could plausibly be closed within another two generations of GPU architecture, which lends some credence to those who say the end is near.
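Note that the two throughput figures as stated imply a raw gap of about 14x rather than 50x; the larger figure presumably accounts for more than triangle rate alone. As a rough sketch (assuming the gap is closed evenly across two generations, which the article does not spell out), each generation would need to multiply throughput by the square root of the gap:

```python
current = 2.8e9  # fastest GPU today, triangles per second
target = 40e9    # Nyquist-derived target from the article

gap = target / current  # ~14.3x raw triangle-throughput gap
per_gen = gap ** 0.5    # ~3.8x speedup per generation, if closed in two generations
print(f"{gap:.1f}x gap, {per_gen:.1f}x needed per generation")
```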