The Standard VR Cockpit, and Why I Want It

One of the most common types of games released for VR thus far has been the “cockpit simulator”: games where you drive a car, fly a spaceship, or pilot a giant mech. These experiences are ideal for current VR because they mimic what users are doing in real life – sitting in a chair and looking around at the environment. In earlier posts on VR I’ve talked about the three elements that immerse players in virtual environments – the Bubble, the Proxy, and the Controller – and today I want to delve a bit more into the Controller and why I’d like to see developers settle on a standard VR cockpit.

Two game development studios, Frontier and Tammeka Games, have each recently released games users can experience in virtual reality – Elite: Dangerous and Radial-G. Both games render your body (the Proxy) in a virtual cockpit (the Bubble) where you control your racing pod or spaceship, and both look great – however, no single physical controller setup will match both cockpits. Each game has its cockpit and controllers configured differently, and as a VR user, the first thing I want to do is set up my physical controllers to match the virtual environment as closely as possible.

In Elite: Dangerous, the cockpit uses a HOTAS setup, with the Proxy grasping a flight stick in their right hand and a throttle in their left. In Radial-G, the Proxy instead holds two throttles, one in each hand. This means I, as a user, would need to either purchase two controller setups to mimic these cockpits in the real world or, at minimum, reconfigure my physical setup when switching from game to game.


Development of VR experiences and controllers is still very much in the “Wild West” phase – everyone is working on their own iterations, and this can (and should) continue as the technology becomes more mainstream. However, if developers settled on a unified cockpit design that a controller manufacturer could then reproduce and sell in physical form, VR users would have an easy way to mirror the appearance of each virtual environment in the physical world. Being able to touch and manipulate your spaceship, car, or racing pod’s controls would go a long way toward increasing presence, and I’d love to see someone tackle this as VR goes mainstream.

I’ve previously mentioned the subtle effectiveness of the desk demo that ships with the Oculus Rift and how it establishes presence by so closely mirroring physical reality (you, in a room, in a computer chair, looking at a desk). That simulation grows even more immersive when you can match the height of the virtual desk to the height of your physical desk at home. You can then place your hands on the virtual surface of the desk and encounter a hard surface – your physical desk – right where your eyes in virtual reality tell you there should be one. When this happens, presence takes a huge jump. The world feels vastly more real because you can touch it, and this makes your VR experience all the more compelling.

The same principle applies to cockpit simulators. If, after I fire up the game and put on a VR headset, I can see a joystick and then grasp it in my hand, my sense of presence jumps by an order of magnitude. If I can also see my manipulation of a real-world object (such as that joystick) reflected in the virtual environment, that sense of presence persists. This sort of experience is easily achievable with current technology.

The Oculus Rift uses two systems to track the position of the user’s viewpoint in VR – a sensor package in the headset (which determines your head’s pitch, yaw, and roll) and an IR camera, which evaluates the positions of individual infrared dots on the headset and uses this data to adjust your viewpoint as you lean forward or back. All of this translates readily to physical controllers: the same kind of data could be passed to a simulation and rendered, just as the user’s viewpoint is rendered now.
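To make that concrete, here is a minimal sketch (in Python, with entirely hypothetical names – no real SDK is being quoted) of the kind of pose update a tracked controller could pass to a simulation: orientation from its own onboard sensors, position from the camera’s view of its IR dots.

```python
from dataclasses import dataclass

@dataclass
class ControllerPose:
    """One tracking update a physical controller might send to a simulation.

    Orientation comes from sensors inside the device (like the headset's);
    position comes from an external IR camera watching dots on its shell.
    All names here are hypothetical, not any real SDK's API.
    """
    pitch: float  # degrees, from onboard sensors
    yaw: float
    roll: float
    x: float      # meters, from the IR camera's view of the dots
    y: float
    z: float

def blend_pose(sensor_angles, camera_position):
    """Combine the two tracking sources into a single pose update."""
    pitch, yaw, roll = sensor_angles
    x, y, z = camera_position
    return ControllerPose(pitch, yaw, roll, x, y, z)

# A joystick tilted 15 degrees forward, sitting half a meter from the camera:
pose = blend_pose((15.0, 0.0, 0.0), (0.0, -0.2, 0.5))
```

Each frame, the simulation would render the virtual stick at `pose`’s position and orientation, exactly as it already does for the headset’s viewpoint.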

Why not build a joystick that can detect its pitch, yaw, and roll as you use it? Why not build a steering wheel that tracks its roll as you turn it? And once you have those, why not paint small IR dots on both that the Rift’s IR camera can use to place them within the virtual environment? This would make even modular setups possible, where the simulation detects and dynamically positions your physical controls based on where the IR camera finds them in your real-world environment.
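The sensing half of that joystick is cheap to build today. As a sketch (my own illustration, not any product’s firmware): a three-axis accelerometer alone gives pitch and roll via the standard tilt-sensing formulas, since gravity’s direction relative to the sensor determines both angles – yaw would need a gyroscope or magnetometer on top of that.

```python
import math

def tilt_from_accelerometer(ax, ay, az):
    """Estimate pitch and roll (in degrees) from 3-axis accelerometer
    readings given in g units.

    Standard tilt-sensing math: gravity's direction relative to the sensor
    fixes pitch and roll. Yaw cannot be recovered from gravity alone.
    """
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# A stick sitting level (gravity straight down its z axis) reads as flat:
pitch, roll = tilt_from_accelerometer(0.0, 0.0, 1.0)  # both ~0 degrees
```

A stick in a spring-centered gimbal could stream these angles to the simulation on every poll, alongside whatever the IR camera reports about its position.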

A setup like this would allow a simulation to place your controller where it actually resides – bolted to your desk – in relation to your viewpoint. Taking it a step further, the IR camera could even detect which IR dots on a steering wheel are not visible and use those hidden dots to place your hands on the wheel in virtual reality (the presumption being that if the dots are covered, it is because your hands are obscuring them). Sensors within the joystick or wheel could detect its pitch, yaw, and roll and reflect those in the simulation in real time.
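The covered-dots idea fits in a few lines. Assuming the simulation knows each dot’s position on the wheel and which dots the camera currently sees (everything below is a hypothetical sketch, not a real tracking API), a virtual hand can be placed at the centroid of the hidden dots:

```python
def infer_hand_position(dot_positions, visible_ids):
    """Guess where a hand rests on a wheel from which IR dots are hidden.

    dot_positions: dict mapping dot id -> (x, y, z) location on the wheel.
    visible_ids: set of dot ids the IR camera can currently see.
    Per the post's presumption: covered dots are covered by a hand, so the
    hand is placed at the centroid of the hidden dots.
    """
    hidden = [p for did, p in dot_positions.items() if did not in visible_ids]
    if not hidden:
        return None  # every dot is visible: no hand on the wheel
    n = len(hidden)
    return tuple(sum(axis) / n for axis in zip(*hidden))

# Dots 2 and 3 on the wheel's right side are hidden from the camera:
wheel = {1: (-0.15, 0.0, 0.0), 2: (0.15, 0.05, 0.0), 3: (0.15, -0.05, 0.0)}
hand = infer_hand_position(wheel, visible_ids={1})
# hand lands on the right side of the wheel, between the two hidden dots
```

A real implementation would need to rule out other occluders and smooth the estimate over time, but the core inference really is this simple.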

A controller that tracked this data and passed it to the simulation could, when synced with the Rift’s IR camera, create a wheel or joystick users can grasp and manipulate in VR in real time. You’ll see the wheel turn or the joystick shift within the simulation as you move it in the physical world.

This is why I’d like to see a universal cockpit for VR simulations or, at least, several standard setups from which I can choose. As a user, I could then buy the “Rift cockpit” with a physical joystick, throttle, and steering wheel that the Rift’s IR camera could detect and position so that their placement in the simulation mirrors the real world. I could buy one VR cockpit and have it match any number of different virtual experiences. Grasping and manipulating controls in real life, and seeing that manipulation mirrored in these simulations, would make VR more immersive and more fun.

While getting many different game studios to agree on a universal cockpit setup is a daunting prospect, we have seen companies agree on standards before (OpenGL and DirectX are two examples). Even if we never agree on a universal layout for a virtual cockpit, creating controllers that can send data about their states and positions to a simulation, and using the Rift’s IR camera to place them within the virtual environment, would be amazing. It would be a great way to solve one of the most noticeable problems with VR right now – the inability to touch and manipulate objects within the simulation – at least with regard to cockpit simulators.

