Facebook is going to need to find a killer app to get people to buy the Oculus Rift. But it also needs to deal with a queasy problem.
Attendees play a video game wearing Oculus Rift virtual reality headsets at the Intel booth at the International Consumer Electronics Show (CES) in Las Vegas. The Associated Press
Facebook's Oculus Rift virtual reality headset — and the VR industry in general — still have a technical problem to solve as they push to turn the technology mainstream: Using it makes some people feel sick.
In early versions of headsets like the Oculus Rift, simulation sickness — a phenomenon in which people become nauseated while playing a game on a VR headset — emerged as an unpleasant but not unexpected issue. Despite describing it as a “magical” experience overall, David Helgason, the CEO of gaming software company Unity, told BuzzFeed News that the first version of the developer kit he used made him feel sick.
Historically, similar issues have emerged in flight simulators and other immersive computer systems that can produce sensory mismatches. If someone feels like they are walking around while wearing a VR headset, but their body is not actually moving, the result can be a feeling similar to motion sickness.
VR technology is certainly improving, with each new version producing a higher-quality experience. But it's hard to tell exactly what the impact will be until the devices are in use among a large consumer base, according to some game developers and executives.
As part of a larger story about the state of VR as the industry races to widespread adoption, BuzzFeed News interviewed Dr. Nik Blevins, chief of the Division of Otology and Neurotology and an expert in vertigo at Stanford Health Care, to discuss the phenomenon.
Dr. Blevins has also researched the use of virtual reality in surgery simulation. Here's an edited transcript of our interview.
Can you tell us a little bit about simulation sickness?
Dr. Nik Blevins: If you look at this from a balance standpoint, at why people get carsick, it's really a discrepancy between all of the senses about how much you're moving and in what direction. There are a number of senses that give us an idea of where we are in space: one is our inner ear, two is our vision. Our eyes tell us where we move. Our inner ears give us information about linear and rotational acceleration. Then we have other senses: touch, what you feel on your skin; joint position, like the joints in your neck; your muscles that change when you move in space. You have to integrate all of that in your central nervous system, and you have to take the appropriate action given the cumulative info provided by those senses.
The problem is, some people, much more so than others, can be very intolerant of mismatches in the sensory info that comes in. For example, when people get sick reading in the back of a car, it's because their inner ears are feeling every bump and curve, but their eyes, touch, and muscles aren't registering that same motion. So your brain is trying to make sense of these disparate sensory inputs, and it makes you sick. That sensory mismatch can be intolerable: some people are wired in such a way that any slight sensory mismatch provokes really horrible sickness, and some people can tolerate huge mismatches and not blink an eye.
What about in virtual reality environments?
NB: In these VR environments, we're very good at providing some very compelling sensory inputs. We're moving in a certain environment, but we can't match all the sensations that go along with that. If you put on VR goggles and you're in an environment that says you're on a roller coaster, the other senses say you're standing in a lab. It's the same mismatch you would get if you were reading in the back of a car, or below deck on a boat. That's really the challenge: We don't have a way in our VR interfaces to match all of the senses that are required to have a unified picture of an artificial environment.
Is there a way to tune down the VR experience in such a way that it doesn't impact that sensory mismatch as much?
NB: It's kind of a paradox as I understand it. Usually what we're manipulating in VR environments is vision. We're pretty good at presenting a compelling 3D moving environment. You want to make that compelling, you want to make that so real that it overcomes the fact that you're standing in a still room. You can turn down the sense of motion in your vision, but you're doing that at the expense of the realism of the experience of the VR environment.
I don't think there's a way to make that VR environment real and compelling without the risk of having it out of phase, or out of sync, with your other sensory inputs. I think there's a bit of a paradox: If you turn down the motion in the visuals, you are necessarily gonna reduce the realism of the experience.
What about as the technology improves?
NB: Again, some people it's gonna have a big impact on, and some people it's not. It's hard to tell what your audience is. The better the virtual environment is, the less disturbing sensory input you'll have. Having high refresh rates with good graphic quality that meets the 3D expectations of your visual system is going to help. What I've found, from personal experience using immersive technology in our lab, is that if the parallax and the stereoscopic presentation of the environment are off, even without a lot of motion, you'll feel disconnected from the environment. That leads to more risk; your brain can find discrepancies between your right and left eye. In general, the better the visual fidelity, the less likely you are to have bothersome input.
I think from a physiological standpoint, the way to get around it would be to provide additional sensory input from the other senses, but that's a difficult thing to do technically.
Would the problem be resolved if an additional sense was stimulated in the experience?
NB: I don't think we really know the answer to that; different people show such variable responses to these mismatched sensory stimuli. I see this every day when I see patients: people who have an otherwise small inner ear problem on one side who are just incapacitated by it, because they can't cope with one ear being off. And we see some people who have lost an ear completely who are hardly bothered.
In some ways, sensory input that's just a little bit different from what you expect can have a bigger effect than input that's really different. If it's close and you can't put it together, sometimes it's worse than something that's far off that you can ignore. Because we're working with people who are always wired differently, it's hard to say one is always gonna be better than the other.
You can see that in some of the simulated roller coasters or flight sims that have some inner ear stimulation associated with them; in many ways that can induce additional sickness rather than mitigate it, because you're stimulating the inner ear and the eyes, but you're not quite stimulating the ears correctly. You can't simulate all the inputs to the inner ear; you're substituting one sense for another. For linear acceleration, the ride or the simulator is tipped rather than moved in a linear way, and there are subtle differences to that which can really evoke more of a sense of dissonance in your sensory input.