The Tech That Will Push VR to the Limits of the Human Eye

Big tech is eager to get us excited about the coming of the metaverse, but today’s virtual reality hardware is a long way from meeting their ambitious goals. One of the biggest challenges is building better displays with far more pixels per inch, but experts say new materials and designs are on the way.

Silicon Valley is betting billions of dollars that the internet is about to undergo its biggest shift since the advent of the smartphone. Soon, the thinking goes, most people will be accessing the web via wearable headsets that transport us into virtual worlds rather than by tapping on a touchscreen.

Today, though, virtual and augmented reality are still fairly rudimentary. While companies like Meta, Microsoft, Google, and Magic Leap are already selling virtual and augmented reality headsets, they have found limited use cases so far, and the experiences they offer still fall well short of the high-definition standards we have come to expect from digital entertainment.

One of the biggest limitations is current display technology. In a VR headset, screens sit just a few centimeters in front of our eyes, so they need to pack a huge number of pixels into a very small space to approach the definition you might expect from the latest 4K TV.

That’s impossible with today’s displays, but in a perspective published last week in Science, researchers from Samsung and Stanford University say that emerging technologies could soon get us close to the theoretical limit of pixel density, ushering in powerful new VR headsets.

Efforts to boost the performance of displays are complicated by the fact that doing so competes directly with another crucial goal: making them smaller, cheaper, and more energy-efficient. Today’s devices are bulky and unwieldy, limiting how long they can be worn and the contexts in which they can be used.

A major reason headsets are so large today is the array of optical elements they contain and the need to keep sufficient space between those elements and the displays to focus light properly. New compact lens designs and metasurfaces—nanostructured films with unique optical properties—have allowed some miniaturization in this area, but the authors say this approach is likely reaching its limits.

Novel designs like holographic lenses and “pancake lenses,” which bounce light back and forth between pieces of plastic or glass, could cut the lens-to-display distance by a factor of two to three. But each of those bounces dims the image, a loss that has to be made up by more powerful and efficient displays.
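
To get a feel for the penalty, here is a minimal back-of-the-envelope sketch. The per-surface efficiencies are made-up placeholders rather than figures from the paper or any real lens design; the point is simply that losses multiply with every optical interaction.

```python
# Why folded optics demand brighter displays: every pass through a partially
# reflective or polarizing surface keeps only a fraction of the light.
# The per-surface efficiencies below are illustrative placeholders.

def remaining_brightness(surface_efficiencies):
    """Fraction of the display's light that survives a chain of optical surfaces."""
    fraction = 1.0
    for efficiency in surface_efficiencies:
        fraction *= efficiency
    return fraction

# e.g. three passes/bounces at 50-70 percent efficiency each
print(remaining_brightness([0.5, 0.7, 0.7]))  # ~0.245, so the panel needs ~4x the brightness
```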

Better displays are also needed to solve another important limitation of today’s devices: resolution. Ultra-HD TV displays can achieve pixel densities of around 200 pixels per degree (PPD) at distances of around 10 feet, far in excess of the roughly 60 PPD that the human eye can distinguish. But as VR displays are at most an inch or two from the viewer’s eyes, they can only achieve around 15 PPD.

To match the resolution limits of the human eye, VR displays need to squeeze between 7,000 and 10,000 pixels into each inch of display, say the authors. For context, the latest smartphone screens manage only around 460 pixels per inch.
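
As a rough illustration of where figures like that come from, the arithmetic below converts an angular-resolution target into a linear pixel density. The field of view and panel width are assumptions chosen for the example, not numbers from the Science perspective.

```python
# Back-of-the-envelope conversion from pixels per degree (PPD) to pixels per
# inch (PPI) for a near-eye display. Field of view and panel width are
# illustrative assumptions.

def required_ppi(target_ppd, fov_degrees, panel_width_inches):
    """Pixels per inch needed for a panel spanning `fov_degrees` of view to
    deliver `target_ppd` pixels per degree."""
    total_pixels = target_ppd * fov_degrees   # pixels needed across the whole field of view
    return total_pixels / panel_width_inches  # packed into the panel's width

# The eye resolves roughly 60 PPD; assume ~120 degrees of horizontal field of
# view shown on a panel about an inch wide (microdisplay scale).
print(required_ppi(60, 120, 1.0))  # 7200.0 -- in the 7,000-10,000 PPI ballpark
```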

Despite the size of that gap, though, there are already clear paths towards closing it. At present, most VR headsets use separate red, green, and blue organic light-emitting diodes (OLEDs), which are hard to make more compact due to their manufacturing process. But an alternative approach that adds colored filters to white OLEDs could make it possible to achieve 60 PPD.

Relying on filtering has its own challenges, as it reduces the efficiency of the light source, resulting in lower brightness or higher power consumption. But an experimental OLED design known as a “meta-OLED” could get around this trade-off by combining the light source with nanopatterned mirrors that exploit the phenomenon of resonance to emit light only at a particular frequency.
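
As a simplified picture of the resonance idea (and not the specific meta-OLED structure described in the paper), an optical cavity reinforces only the wavelengths that fit a whole number of half-waves between its mirrors, so its dimensions select which color escapes. The cavity length and refractive index below are illustrative guesses.

```python
# Standard Fabry-Perot resonance condition: a cavity of optical length n * L
# reinforces wavelengths satisfying m * wavelength = 2 * n * L for integer m.
# Dimensions are illustrative, not taken from the meta-OLED work.

def resonant_wavelengths_nm(cavity_length_nm, refractive_index, max_mode=3):
    """Wavelengths (nm) reinforced by the cavity, for modes m = 1..max_mode."""
    return [2 * refractive_index * cavity_length_nm / m for m in range(1, max_mode + 1)]

# A ~160 nm cavity with refractive index ~1.7 resonates near 544 nm (green) in
# its first-order mode, so green light is emitted preferentially.
print(resonant_wavelengths_nm(160, 1.7))  # [544.0, 272.0, 181.33...]
```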

Meta-OLEDs could potentially achieve pixel densities of more than 10,000 pixels per inch, approaching the physical limits set by the wavelength of light. They could also be more efficient and offer better color definition than previous generations of OLEDs. However, despite keen interest from display technology companies, the technology is still nascent and likely further from commercialization.

The most likely near-term innovation in displays, say the authors, is one that exploits a quirk of human biology. The eye can only distinguish 60 PPD in the central region of the retina known as the fovea, with significantly lower sensitivity in the periphery.

If eye movements can be tracked accurately enough, then only the patch of the display the user is actually looking at needs to be rendered at the highest definition. The eye and head tracking this requires adds extra complexity to headset designs, but the authors say it is probably the innovation that will arrive soonest.
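
In practice this technique, commonly called foveated rendering, boils down to choosing a shading rate for each region of the display based on its angular distance from the tracked gaze point. The thresholds, shading rates, and eye-to-display distance in the sketch below are illustrative assumptions, not values from any shipping headset.

```python
import math

# Sketch of foveated rendering: render each tile of the display at a resolution
# that depends on how far it sits from the tracked gaze point. All numeric
# values here are illustrative assumptions.

def shading_rate(angle_from_gaze_deg):
    """Fraction of full resolution at which to render a tile."""
    if angle_from_gaze_deg <= 5:      # foveal region: the eye resolves ~60 PPD here
        return 1.0
    elif angle_from_gaze_deg <= 20:   # near periphery
        return 0.5
    else:                             # far periphery: acuity falls off sharply
        return 0.25

def angle_from_gaze(gaze_xy, tile_xy, eye_to_display_inches=2.0):
    """Angle in degrees between the gaze point and a tile center, both given as
    (x, y) positions in inches on the display plane."""
    offset = math.hypot(tile_xy[0] - gaze_xy[0], tile_xy[1] - gaze_xy[1])
    return math.degrees(math.atan2(offset, eye_to_display_inches))

# A tile half an inch from where the user is looking sits about 14 degrees
# off-axis, so it can be rendered at half resolution.
angle = angle_from_gaze((0.0, 0.0), (0.3, 0.4))
print(round(angle, 1), shading_rate(angle))  # 14.0 0.5
```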

It’s important to remember that better displays are far from the only problem that must be solved if VR is to become widely commercialized. In particular, powering these headsets raises complicated challenges around battery capacity and the ability to dissipate heat from onboard electronics.

Also, the display technologies discussed by the researchers are primarily relevant to VR and not AR, whose headsets are likely to rely on very different optical technology that doesn’t obscure the wearer’s view of the real world. Either way, though, it seems that while more immersive virtual experiences are likely still some way off, the road map to get us there is well in place.

Image Credit: Harry Quan / Unsplash 

Edd Gent
http://www.eddgent.com/
I am a freelance science and technology writer based in Bangalore, India. My main areas of interest are engineering, computing and biology, with a particular focus on the intersections between the three.