

Optics and photonics

A virtual tour of virtual reality

31 Jan 2018 Margaret Harris

By Margaret Harris at Photonics West in San Francisco

“How many people in this room are wearing smart glasses today?”

When Bernard Kress, a photonics expert and optical architect on Microsoft’s HoloLens smart-glasses project, posed this question at the Photonics West trade show, he had reason to expect a decent response. He was, after all, speaking at a standing-room-only session on virtual reality, augmented reality and mixed reality (VR, AR and MR), and the audience was packed with tech-friendly, early-adopter types who had come specifically because they’re interested in such devices. Surely, someone in the audience would put up their hand.

But in the end, the only hand that went up belonged to one of the speakers: Thad Starner, a computing expert at Georgia Tech who has been using smart glasses for more than a decade. And in a way, that pretty much sums up the state of VR/AR/MR today: despite recent progress, it still hasn’t reached a mass consumer market. Indeed, several of the session’s speakers – notably Robert Schultz, head of optics R&D at the VR/AR development firm Vuzix – said as much, pointing out that virtual reality is basically at the same stage as mobile phones were in, oooh, 1985 or thereabouts.

That said, the overall thrust of the session was that – like the “brickphones” carried by Wall Street types in the Gordon Gekko era – today’s VR and AR devices are the advance guard of a technological revolution. Certainly, the appetite is there. When Starner asked who would wear a display that looked and felt like a normal pair of eyeglasses, and had the same functionality as a smartphone, he got a forest of raised hands.

Getting to that point will require progress in several areas, but one that kept coming up during the session was the need to improve the way VR devices deal with depth cues. Most displays operate at a single, fixed focal distance, which means that everything you see through a headset appears to be in focus. That might sound like a good thing, but as Doug Lanman of Oculus Research pointed out, “Vision scientists will tell you it’s a bug.” Our eyes just don’t see the world that way, and when they’re presented with it in a virtual format, the result is a cocktail of nausea, dizziness and headaches.

Developers and optics experts from a wide range of companies – including start-ups such as Lemnis Tech and Avegant as well as established players like Microsoft, Google and Intel – are exploring various ways of solving this problem, which is known as the vergence-accommodation conflict. Between talks, I had the pleasure of trying out the Lemnis Tech device, which uses adaptive optics to generate natural focusing cues. The hardware part of their solution involves cameras and infrared LEDs that capture information about where the wearer is looking; the software part takes this information and uses it to, in effect, turn the headset into a varifocal system. As a result, when I looked through their headset at a simulated version of the Earth and the Sun, I could feel the system adjusting to my eye position, giving me a more realistic visual experience.
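To make the varifocal idea concrete, here is a minimal sketch of the kind of calculation such a system might perform. This is a hypothetical illustration, not Lemnis Tech’s actual software: it assumes the eye tracker reports how far each eye has rotated inward, uses simple triangulation to estimate the fixation distance, and clamps that distance to a range the adjustable optics could plausibly support.

```python
import math

IPD_M = 0.063  # assumed interpupillary distance in metres (hypothetical value)

def vergence_distance(left_angle_deg, right_angle_deg):
    """Estimate fixation distance from the inward rotation of each eye.

    Angles are measured from straight-ahead; both eyes rotate inward
    (toward the nose) when fixating a near object.
    """
    vergence = math.radians(left_angle_deg + right_angle_deg)
    if vergence <= 0:
        return float("inf")  # eyes parallel: looking at optical infinity
    # Half the interpupillary baseline over the tangent of half the
    # vergence angle gives the distance to the fixation point.
    return (IPD_M / 2) / math.tan(vergence / 2)

def set_focal_plane(distance_m, min_m=0.25, max_m=10.0):
    """Clamp the estimate to the range the varifocal optics can reach."""
    return max(min_m, min(distance_m, max_m))

# Each eye rotated about 1.8 degrees inward: fixation is roughly 1 m away
d = vergence_distance(1.8, 1.8)
focus = set_focal_plane(d)
```

In a real headset this loop would run continuously, with the infrared cameras feeding fresh gaze angles and the optics re-focusing each frame, so that accommodation and vergence stay in agreement.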

I also tried out a range of other devices, including Digilens’ heads-up display for motorcyclists, uSens’ hand-tracking headset and ODG’s augmented-reality glasses. Of these, I found ODG’s the most comfortable to wear – although the comparison isn’t really a fair one, since the four devices (and their potential applications) are so different. I do, however, think it’s fair to say that no-one has solved the other key challenge facing VR developers, which is to make compact, near-eye displays that resemble normal glasses. The buzzword for this during the talks was “form factor”; my own preferred term is “not making users look like total dorks”.

As the above video shows, that goal is still pretty remote, and reaching it won’t be easy. According to Schultz, the keys to making a truly wearable device include eliminating stray light, “ghost” reflections and unwanted diffraction effects; reducing weight; improving processing power and battery life; designing an intuitive interface; incorporating WiFi and Bluetooth; and – above all – paying attention to style.

Ticking all of those boxes is going to require a host of innovations. But then, you could have said the same thing about turning a 1980s brickphone into today’s sleek smartphones. As another speaker, Tish Shute of Huawei USA, put it, “We are entering an era when computing will become integrated with human perception.” There’s a brave new world out there somewhere, and VR will undoubtedly be a part of it.

Copyright © 2018 by IOP Publishing Ltd and individual contributors