This past weekend, at the local Baltimore VR meetup, I got the chance to try out two devices I’d been aching to get my hands on for quite some time: the HTC Vive and the Leap Motion system. Although I’ve been using the Oculus Rift DK2 and Samsung Gear VR for some time now, this was my first chance to try a physical VR controller. While the HOTAS system I bought specifically for Elite: Dangerous is something I can touch, it has only a rough equivalent in the virtual world; its placement doesn’t precisely match up.
I started the night by putting on an HTC Vive headset and playing a VR game called Budget Cuts. The game puts you in the role of a hapless first-day employee, searching for your job application while trying not to get murdered by robots with guns. It’s about as silly and interesting as it sounds, but the first really cool discovery I made came after putting on the headset and asking for the motion controllers.
[A shot of Budget Cuts, in which a robot is about to have a very bad day.]
I expected to need the person running the station to hand me the controllers, being now blind to the physical world, but that’s not what happened. Instead, thanks to the tracking inherent in the Vive’s Lighthouse system, I looked down and saw the physical controllers sitting on the ground, inside the virtual world of Budget Cuts. An object I could touch was represented in real-time in virtual reality!
I knelt down and reached for the HTC Vive controllers, half-expecting my hands to clip through them. Yet Budget Cuts uses in-game models for the HTC Vive controllers that match quite well with the physical version. I had no problems picking up this virtual/physical object and using it to play the game.
The virtual model matched the physical model in my hands, and this was shockingly immersive. Everything felt real – the shape, the heft – and simply holding and moving these physical controllers, and seeing their movements reflected in VR, dramatically increased my sense of presence. Rather than simply viewing this virtual world, I was now an active participant within it.
[The consumer version of the HTC Vive’s motion controllers]
Budget Cuts does a lot of things right – such as using a “portal gun” mechanic to allow you to travel distances much longer than the space you’ve set aside for VR, and making your other controller into a vacuum that sucks up knives (to kill robots, of course!) and pulls aside grates – but I think the best decision they made was to accurately represent the HTC Vive controllers in game, including the shape.
As strange as it sounds, using my vacuum controller to suck out a virtual grate (which then remained stuck to the business end of the vacuum until released) felt really immersive, as did sucking up knives. Basically, by treating the physical HTC Vive controller as a “bridge” between the physical world and the virtual one, I bought that these virtual objects were, in fact, quite real.
Had I tried to pick up those knives with my hands, I’d have caught nothing but air. Yet by sucking the knives onto the business end of a vacuum controller I could physically hold in my hand, I felt as if those knives were real, tangible objects. The vacuum controller was real; therefore, so were the knives.
There was a good line of people waiting to try the Vive, so I only got to play for a few minutes (eventually, I limp-wristed a knife throw and got shot by a robot), but even those few minutes were probably the most immersive experience I’ve had in VR, simply thanks to the Vive’s controllers. Everything felt intuitive, and that, combined with the fact that the Vive allowed me to walk about freely in addition to ducking, crouching, and leaning, made the world of Budget Cuts feel very, very real.
The next station I went to had a Leap Motion camera hooked up to an Oculus Rift, and having seen the Leap Motion: Orion demo on YouTube, I was really excited to try it out. My initial thought was that actually being able to see my hands in VR and use them to manipulate objects would be even more immersive than the Vive. This is why I was so surprised when the opposite turned out to be true.
To start, the tracking on the Leap Motion was excellent. I simply raised my real hands before me, and wireframe hands (or what was really more like bones) appeared in front of me, rendered in real time. Not all motions were tracked (oddly, the Leap Motion would detect me flexing my pinky, but not my index finger) but I could easily wave, give a thumbs up, turn my hands palms up or palms down, and, in general, feel like the virtual environment was actually representing my real, floating hands.
The Orion demo lets you do a number of interesting things: you start by batting around blocks, then picking them up and stacking them. Eventually, you can create new blocks (as seen in this video), stack them, and even turn gravity off and on. All of it was very cool, but after playing the Vive, none of it really felt immersive. After playing with the system for a while, I figured out why: the objects I was interacting with had no physical presence. They were literally ghosts, offering no physical feedback.
[This is how your actual hands look in a VR Headset with a Leap Motion Camera]
Quite often, I tried to put down a block and couldn’t, because I couldn’t feel it. It should have felt like an incredibly light paper box stuck to my fingers, yet I didn’t even have that much sensation. Throughout the simulation, I never could settle on how tightly I should clench my fingers to pick something up, or how wide I should open them to release it. It was frustrating, rather than immersive.
While actually seeing my hands represented was incredibly cool, as a method for interacting with virtual objects, it was inconsistent and difficult to master. Obviously, we’re only seeing the nascent stages of technology like the Leap Motion, but the lack of physical feedback is a real problem. It just didn’t work as well as the Vive’s hand controllers, and the sense of presence I’d expected simply wasn’t there.
In summary, it seems to me that the approach taken by Vive and Oculus (hand controllers ship with the Vive, and will arrive for the Oculus Rift later this year) is by far superior to interacting with virtual reality using your own bare hands. Having a single level of separation between your physical sensations and the virtual environment (the “bridge” of the physical controller) helps tremendously in maintaining presence. It keeps you from failing to touch intangible objects.
As VR experiences continue to evolve, I think it will behoove developers incorporating motion controllers to build the virtual models for their input devices to match the physical models. I simply hadn’t encountered anything as convincing as seeing those virtual Vive controllers on the virtual floor of that room in Budget Cuts, then reaching down and physically picking up what I saw in the virtual world. It sells the experience.
While it absolutely makes sense for developers to change details such as texture (you could make the controllers metallic, or rusted, or formed of glass) and to add virtual embellishments to them (such as a long gun barrel you don’t actually touch), I now believe matching the virtual model of the controller you see in game as closely as possible to the physical model you hold is ideal for maintaining presence.
I’ve always planned to only buy one VR solution. With both the Oculus Rift and HTC Vive now announced, I made the decision to go for the Vive, and I doubt I’ll regret it. Even my short experience at the Baltimore VR meetup was incredibly fun. I really think Vive has the perfect recipe for immersive experiences, because it allows you to walk around and ships with motion controllers at launch.
I also have no doubt the Oculus Touch controllers are amazing, and I suspect that if Oculus developers follow these same rules (representing those controllers accurately in virtual reality) they’ll provide the same massive boost to presence that the Vive controllers do. For now, however, I’m simply surprised to say that the most immersive VR experience involves a physical controller, and not using your own hands.
I think this just goes to show that, as game developers, we’re still learning how best to develop for VR, and I hope other games follow the lead of Budget Cuts and accurately represent the physical controllers that ship with the Vive or Rift. It seems like the best way to sell interactions with otherwise intangible objects.