Experiences in VR development

October 16, 2016

With the release of the PlayStation VR last week, I thought I’d just take a few minutes to talk about why VR is such an exciting technology and design space, and some of my experiences working in the field.

A leap forward

Firstly, it’s thrilling to see how far virtual reality has come just in the last few years. When I started working at the University of Guelph’s CDRU back in early 2012, we were running a fairly straightforward VR simulation. It consisted of a single street in a generic-looking neighbourhood, with a stream of oncoming traffic from the left. Our test subjects would stand on the sidewalk and try to cross the street safely.

Our equipment was quite limited compared to what’s available today. Positional tracking was handled by a series of expensive cameras mounted on the walls around the simulation room, and they were prone to malfunctioning on the regular. The headset itself had a relatively low-resolution display, was quite heavy, and was tethered to a backpack and a genuine tangle of cables. When a participant in our study walked across the street, a research assistant had to walk with them, carrying all of the cables. When the participant reached the other side of the street and turned to walk back, they had to turn in a particular direction to avoid getting tangled.

Today, the PlayStation VR setup consists of a single PlayStation Camera placed in front of your television, plus HDMI and USB connections to the PlayStation 4. Best of all, you don’t need a research assistant to follow you around your living room!

The brain buys in

Perhaps more importantly, the technology available now (the Oculus Rift, HTC Vive, PlayStation VR) is much more immersive than what we were using at the CDRU back then. One of the biggest limitations of our old headset was its narrow field of view, or FOV (the amount of the world visible at any given moment): less than 60 degrees horizontally, compared to a natural human FOV of close to 180 degrees. The result felt like tunnel vision, meaning you could really only see what was more or less directly in front of you.

In comparison, the PlayStation VR has a horizontal FOV of 100 degrees, while the HTC Vive and Oculus Rift sport a horizontal FOV of 110 degrees. That’s still more limited than natural human vision, but it’s significantly more immersive than what we had available even a few short years ago.
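To make those numbers a little more concrete: most rendering code is configured with a vertical FOV and an aspect ratio rather than a horizontal FOV, so it’s handy to know how to convert between the two. Here’s a minimal sketch in Python; the per-eye aspect ratios are my own illustrative assumptions, not the exact optics of any particular headset.

    import math

    def vertical_fov(horizontal_fov_deg, aspect_ratio):
        # Convert a horizontal FOV (in degrees) to the equivalent vertical FOV,
        # given the aspect ratio (width / height) of the image shown to one eye.
        h = math.radians(horizontal_fov_deg)
        v = 2 * math.atan(math.tan(h / 2) / aspect_ratio)
        return math.degrees(v)

    # Illustrative aspect ratios only; real headsets report their exact
    # projection parameters through their SDKs.
    print(vertical_fov(60, 4 / 3))        # ~47 degrees, roughly our old headset
    print(vertical_fov(110, 1080 / 1200)) # ~116 degrees for a 1080x1200 per-eye panel

The takeaway is simply that a wider horizontal FOV fills far more of your natural vision, which goes a long way towards killing that tunnel-vision effect.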

When we got our hands on the Oculus Rift DK2, what impressed me the most was how quickly the brain seems to “buy in” to the virtual environment and convince you that you’re actually there. In one memorable episode, the newest member of our lab’s team tried on the Rift for the first time, and when he accidentally walked into a tree in our simulation, he threw his hands out in front of him to brace for the impact, knocking a full cup of coffee all over our equipment.

One issue that VR users might find themselves running into (no pun intended) on account of this immersion is safety. Since you’re so fully immersed in the virtual world, with no view of where you are in the real one, you could easily find yourself walking into a table, tripping over a couch, or suffering some other mishap. Once, while helping to calibrate our simulation, a research assistant walked face-first into a wall. Luckily, the only thing injured was her pride, but from then on we had an additional research assistant whose sole job was to make sure nobody walked into anything. The PlayStation VR has a smaller workable tracking area (3m by 2m) than we had available, but users will still have to make sure their space is clear of hazards (or hire an assistant).
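This is roughly the problem that boundary systems like the HTC Vive’s Chaperone try to solve: track where the headset is in the room and warn you as you approach the edge of the space you’ve cleared. Here’s a toy sketch of the idea; the play-area size and warning margin below are made up for illustration, not taken from any headset’s spec.

    # Warn the user when the tracked headset position gets close to the edge
    # of a rectangular play area. Dimensions and margin are illustrative only.
    PLAY_AREA_WIDTH = 3.0   # metres, left-right
    PLAY_AREA_DEPTH = 2.0   # metres, front-back
    WARNING_MARGIN = 0.4    # start warning this far from an edge

    def near_boundary(x, z):
        # x and z are metres from the centre of the play area.
        half_w = PLAY_AREA_WIDTH / 2
        half_d = PLAY_AREA_DEPTH / 2
        return (abs(x) > half_w - WARNING_MARGIN or
                abs(z) > half_d - WARNING_MARGIN)

    print(near_boundary(1.2, 0.0))  # True: 1.2 m from centre in a 3 m-wide area
    print(near_boundary(0.2, 0.3))  # False: comfortably inside the cleared space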

Wrap

In short, it’s great to see this fantastic technology becoming available to more people, and I’m looking forward to seeing it catch on. I’ve got another post coming with some design considerations for developing VR applications, including some issues that we ran into at the CDRU.