Going “Hands On” With VR Controllers

This past weekend, at the local Baltimore VR meetup, I got the chance to try out two devices I’d been aching to get my hands on for quite some time: the HTC Vive and the Leap Motion system. Although I’ve been using the Oculus Rift DK2 and Samsung Gear VR for some time now, this was my first chance to try a physical VR controller. While the HOTAS system I bought specifically for Elite: Dangerous is something I can touch, it has only a rough equivalent in the virtual world; its placement doesn’t precisely match up.

I started the night by putting on an HTC Vive headset and playing a VR game called Budget Cuts. The game puts you in the role of a hapless first day employee, searching for your job application while trying not to get murdered by robots with guns. It’s about as silly and interesting as it sounds, but the first really cool discovery I made came after putting on the headset and asking for the motion controllers.

[A shot of Budget Cuts, in which a robot is about to have a very bad day.]

Now blind to the physical world, I expected to need the person running the station to hand me the controllers, but that’s not what happened. Instead, thanks to the tracking inherent in the Vive’s Lighthouse system, I looked down and saw the physical controllers sitting on the ground, inside the virtual world of Budget Cuts. An object I could touch was represented in real time in virtual reality!

I knelt down and reached for the HTC Vive controllers, half-expecting my hands to clip through them. Yet Budget Cuts uses in-game models for the HTC Vive controllers that match quite well with the physical version. I had no problems picking up this virtual/physical object and using it to play the game.

The virtual model matched the physical model in my hands, and this was shockingly immersive. Everything felt real – the shape, the heft, the weight – and simply holding and moving these physical controllers, and seeing their movements reflected in VR, dramatically increased my sense of presence. Rather than simply viewing this virtual world, I was now an active participant within it.

[ The consumer version of the HTC Vive’s Motion Controllers ]

Budget Cuts does a lot of things right – such as using a “portal gun” mechanic to let you travel distances far greater than the space you’ve set aside for VR, and turning your other controller into a vacuum that sucks up knives (to kill robots, of course!) and pulls aside grates – but I think the best decision they made was to accurately represent the HTC Vive controllers in game, including the shape.

As strange as it sounds, using my vacuum controller to suck out a virtual grate (which then remained stuck to the business end of the vacuum until released) felt really immersive, as did sucking up knives. Basically, by treating the physical HTC Vive controller as a “bridge” between the physical world and the virtual one, I bought that these virtual objects were, in fact, quite real.

Had I tried to pick up those knives with my hands, I’d have caught nothing but air. Yet by sucking the knives onto the business end of a vacuum controller I could physically hold in my hand, I felt as if those knives were real, tangible objects. The vacuum controller was real; therefore, so were the knives.

There was a long line of people waiting to try the Vive, so I only got to play for a few minutes (eventually, I limp-wristed a knife throw and got shot by a robot), but even those few minutes were probably the most immersive experience I’ve had in VR, simply thanks to the Vive’s controllers. Everything felt intuitive, and that, combined with the fact that the Vive allowed me to walk about freely in addition to ducking, crouching, and leaning, made the world of Budget Cuts feel very, very real.

The next station I went to had a Leap Motion camera hooked up to an Oculus Rift, and having seen the Leap Motion: Orion demo on YouTube, I was really excited to try it out. My initial thought was that actually being able to see my hands in VR and use them to manipulate objects would be even more immersive than the Vive. That’s why I was so surprised when the opposite turned out to be true.

To start, the tracking on the Leap Motion was excellent. I simply raised my real hands before me, and wireframe hands (really more like bones) appeared in front of me, rendered in real time. Not every motion was tracked (oddly, the Leap Motion would detect me flexing my pinky, but not my index finger), but I could easily wave, give a thumbs up, turn my hands palms up or palms down, and, in general, feel like the virtual environment was actually representing my real, floating hands.

The Orion demo lets you do a number of interesting things: you start by batting around blocks, then picking them up and stacking them. Eventually, you can create new blocks (as seen in this video), stack them, and even turn gravity off and on. All of it was very cool, but after playing the Vive, none of it really felt immersive. After playing with the system for a while, I figured out why: the objects I was interacting with had no physical presence. They were ghosts, offering no physical feedback.

[This is how your actual hands look in a VR Headset with a Leap Motion Camera]

Quite often, I tried to put down a block and couldn’t, because I couldn’t feel it. It was like having an incredibly light paper box stuck to my fingers, yet I didn’t even have that sensation. Throughout the simulation, I never could settle on how tightly I should clench my fingers to pick something up, or how wide I should open them to release it. It was frustrating, rather than being immersive.

While actually seeing my hands represented was incredibly cool, as a method for interacting with virtual objects, it was inconsistent and difficult to master. Obviously, we’re only seeing the nascent stages of technology like the Leap Motion, but the lack of physical feedback is a real problem. It just didn’t work as well as the Vive’s hand controllers, and the sense of presence I’d expected simply wasn’t there.

In summary, it seems to me that the approach taken by Vive and Oculus (hand controllers ship with the Vive, and will arrive for the Oculus Rift later this year) is by far superior to interacting with virtual reality using your own bare hands. Having a single level of separation between your physical sensations and the virtual environment (the “bridge” of the physical controller) helps tremendously in maintaining presence. It keeps you from failing to touch intangible objects.

As VR experiences continue to evolve, I think it will behoove developers incorporating motion controllers to build virtual models for their input devices that match the physical ones. I had never encountered anything as convincing as seeing those virtual Vive controllers on the virtual floor of that room in Budget Cuts, then reaching down and physically picking up what I saw in the virtual world. It sells the experience.

While it absolutely makes sense for developers to change details such as texture (you could make the controllers metallic, or rusted, or formed of glass) and to add virtual attachments to them (such as a long gun barrel you don’t actually touch), I now believe matching the virtual model of the controller you see in game as closely as possible to the physical model you hold is ideal for maintaining presence.
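As a sketch of that principle, here’s some illustrative Python pseudocode (not any real engine’s API; every name and value below is hypothetical): the in-game controller entity copies the tracked pose of the physical device every frame, while cosmetic details like texture and attachments are free to vary.

```python
# Illustrative sketch, not a real engine API: the virtual
# controller mirrors the tracked pose of the physical one each
# frame, so the object you see is the object you can grab.

class VirtualController:
    def __init__(self, texture="plastic", attachment=None):
        self.texture = texture          # free to re-skin (metal, rust, glass)
        self.attachment = attachment    # e.g. a gun barrel you never touch
        # Keep the shape matched to the physical device, so reaching
        # for the rendered model finds the real one.
        self.shape = "vive_wand"
        self.pose = ((0.0, 0.0, 0.0), (0.0, 0.0, 0.0, 1.0))

    def update(self, tracked_position, tracked_rotation):
        """Mirror the physical controller's pose in the virtual world."""
        self.pose = (tracked_position, tracked_rotation)

# One frame of tracking data (position in meters, rotation quaternion):
wand = VirtualController(texture="rusted_metal", attachment="vacuum_nozzle")
wand.update((0.2, 0.9, -0.3), (0.0, 0.0, 0.0, 1.0))
print(wand.pose[0])  # (0.2, 0.9, -0.3)
```

The point of the sketch is simply that shape stays fixed while everything cosmetic is a parameter.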

I’ve always planned to only buy one VR solution. With both the Oculus Rift and HTC Vive now announced, I made the decision to go for the Vive, and I doubt I’ll regret it. Even my short experience at the Baltimore VR meetup was incredibly fun. I really think Vive has the perfect recipe for immersive experiences, because it allows you to walk around and ships with motion controllers at launch.

I also have no doubt the Oculus Touch controllers are amazing, and I suspect that if Oculus developers follow these same rules (representing those controllers accurately in virtual reality) they’ll provide the same massive boost to presence that the Vive controllers do. For now, however, I’m simply surprised to say that the most immersive VR experience involves a physical controller, and not using your own hands.

I think this just goes to show that, as game developers, we’re still learning how best to develop for VR, and I hope other games follow the lead of Budget Cuts and accurately represent the physical controllers that ship with the Vive or Rift. It seems like the best way to sell interactions with otherwise intangible objects.

Why a Standalone Character Creator Would Be Awesome

When I first started working to promote my books, I focused on the things that made me, personally, more interested in a story. One of those things, as with all entertainment, is artwork of the characters. I could describe my characters with words, sure, but there’s something special about having a picture of them. Having visual representations of characters has always, for me, made them feel more real.

[Sera Valence, as rendered in Black Desert]

I was fortunate that the Internet makes it easy to find talented creative folks, many of whom don’t live in the same state or even the same country I do. First, I found the talented Greg Taylor to do my book covers (his cover for Demonkin is particularly epic) and, for my author website, I found another talented artist in Jin Kim, who also, coincidentally, does contract work for the videogame industry (my day job).

I contracted Jin to create black and white images of my characters and loved the results, but professional art, as with the other costs of indie publishing (hiring an editor, buying advertising space, booking convention flights and stays, and so on) is expensive, and unless you have a day job to support your author aspirations, paying for art can be hard to justify. So what other options do authors have?

[Kara Honuron, as sketched by Jin Kim]

Interestingly, the ability to create unique, visually striking characters has been around for decades now – in videogames. Games going all the way back to the original EverQuest and forward to the latest Mass Effect have provided detailed customization options that allow you to create a unique avatar, which is then rendered using the game’s graphics engine. These days, such images have become truly striking.

[Jyllith Malconen, as rendered in Black Desert]

This image and all the other color images in this post were made in the character creator for Black Desert, a popular MMORPG coming soon to the US. I did absolutely nothing to this image in Photoshop or anywhere else. It’s a straight screenshot from the game, and it looks stunning. Better yet, creating this required nothing more (from me) than selecting some options and tweaking some sliders.

Naturally, with a snow day on the horizon, the first thing I did with Black Desert’s character creation tool was to try to recreate, as closely as possible, the characters from my books in glorious CGI. Even in cases where I didn’t recreate Jin’s sketches precisely, I still feel like I was able to get the “feel” about right.

[Sketches of my characters, with similar shots of them in Black Desert]

Better yet, the CGI artwork makes so many of the more subtle details clear. Kara’s orange eyes. The fact that Tania is blind. Jyllith’s striking red hair. All of these graphical details come out far more vividly in color artwork, and this artwork is completely computer generated. Each unique avatar took maybe 20 minutes to create.

[Trell, as rendered in Black Desert]

While Black Desert includes one of the most flexible and gorgeous character creators I’ve ever seen (you can currently download it here and create your own characters, absolutely free!), the concept of creating customized characters using a toolset built by programmers and artists is nothing new. With the rise of indie publishing and the number of people publishing their own work, there’s now increased demand for quality artwork for book covers and promotion. It makes me wonder if a properly robust character creation system, one that generates copyright-free images, could meet that demand.

[Kara Tanner, as rendered in Black Desert]

Many computer-generated image (CGI) tools already exist, of course (3ds Max, Maya, and Poser are examples), but the barrier to entry is steep, with some (such as 3ds Max) costing thousands of dollars and requiring a significant amount of artistic training before you can generate anything remotely professional looking. Worse yet, these tools require quality 3D models and textures to produce polished artwork. Hopefully this won’t offend any independent authors out there, but I can spot a “Poser cover” a mile away, and those covers don’t look professional at all.

[Tania, as rendered in Black Desert]

So why haven’t character creators like this become more freely available, independent of the games for which they’re designed? It seems like a no-brainer: charge people a small fee for a toolset that allows them to create character images this striking on their home computer, simply by tweaking sliders and selecting options. It seems ideal for traditional and independent authors, roleplayers, tabletop gamers, and a huge market of nerds.

[Byn Meris, as rendered in Black Desert]

Better yet, since Black Desert’s character creator was released, even those who might not be dedicated gamers or roleplayers have joined the fun, recreating celebrities in the engine (as seen here) or creating truly monstrous, nightmare-inducing abominations by tweaking the sliders WRONG (as seen here). People did the same with Fallout 4’s character creator and many other character tools. So why aren’t there already a dozen reasonably priced character creation tools available for use by anyone?

[Aryn Locke, as rendered in Black Desert]

The simplest answer is that, like all game development these days, creating the artwork available in these tools (and the tools themselves) is expensive – in fact, far more expensive than even something like 3ds Max once you add up all the developer salaries. The reason these character creators are so easy to use (for us) is that dozens of artists toiled away for weeks or months to create a huge library of high-quality art that’s also used in the game. Talented programmers and UI designers then created an interface that allows us to “mix and match” this art into gorgeous images, dynamically rescaling models in real time.
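As a rough sketch of what those sliders do under the hood (illustrative Python with made-up vertex data, not Black Desert’s actual implementation), a single slider typically blends a base mesh toward a sculpted morph target:

```python
# Hypothetical morph-target blending: each slider moves every
# vertex of the base mesh some fraction of the way toward a
# sculpted target mesh. The vertex data here is invented.

def apply_morph(base_vertices, target_vertices, slider):
    """Blend each vertex toward the morph target by the slider amount (0.0-1.0)."""
    return [
        tuple(b + (t - b) * slider for b, t in zip(bv, tv))
        for bv, tv in zip(base_vertices, target_vertices)
    ]

base = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]       # two jaw vertices
wide_jaw = [(0.0, 0.0, 0.0), (1.4, 0.0, 0.0)]   # a "jaw width" target
blended = apply_morph(base, wide_jaw, 0.5)       # slider at 50%
# The second vertex moves halfway from x=1.0 toward x=1.4.
```

The expensive part isn’t this blend – it’s sculpting hundreds of high-quality targets for it to blend between.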

[Jair Deymartin, as rendered in Black Desert]

Sadly, as much as I would love to see a character creation suite as powerful as Black Desert’s released for general use, I just don’t think there’s enough demand for it. Traditional publishers already have the money to contract professional artists to create their book covers, and indie publishers (and others who might be interested in quality CGI artwork, like roleplayers and tabletop gamers) aren’t a big enough market to justify the development cost of such a tool, at least as a standalone software package.

As striking as these images are, using them to create book covers would almost certainly run afoul of copyright law, so for the moment, as great as they might look, they’re stuck in the same realm as fan art of copyrighted stories: fine, so long as you don’t try to sell it.

Despite this, I hold out hope that one day some enterprising company or Kickstarter will take a route similar to Heroforge and the other 3D-printed miniature makers, creating a toolset that brings truly high-quality CGI artwork to the masses. For the time being, however, we can at least continue to play in Black Desert.

And, at least unofficially, bring the characters from our heads to gorgeous CGI life.

First Impressions of the Gear VR

Serendipity

Two weeks ago, my wife and I finally decided to upgrade our cellphones, which we do every three or four years. We don’t skimp on essential electronics we plan to keep for extended periods of time, and so bought ourselves a pair of Samsung Galaxy S6 phones. As you’d expect, they’re really nice phones!

Gear01-Phones

A week ago, at Capclave, one of my fantasy short stories won 2nd place in the Baltimore Science Fiction Society’s short story contest. The prize for 2nd place was an invitation to Capclave 2015, an invitation to Balticon 2016, and … $100. Which was awesome, but will also soon become important.

Two days ago, the Samsung Gear VR (basically, a super nice Google Cardboard-style VR headset designed exclusively for the Samsung Galaxy S6) went on sale on Amazon … for $100. I already have an Oculus DK2 and know what VR is like, so I’d never have bought a Gear VR otherwise. But the sale, plus the recent short story prize, plus hearing good things about the Gear VR initiated an impulse buy.

So, thanks to a once every four years phone upgrade, winning a short story contest, and 50% off on Amazon, I impulse bought a Gear VR and snapped my new phone inside. Here are my first impressions.

VR is Super Clear

VR, both for movies and videogames, is super clear on the Gear VR. It provides a sneak preview of what VR will look like on the final consumer versions of upcoming VR headsets, which is to say, amazing. This additional clarity makes the biggest difference in 3D movies, and I think passive 360 viewing experiences (like safaris and concerts) will be the bread and butter of “casual” VR adopters. The accessible hook.

Gear02-Headset

Even with the lower quality of VR movies available on the Gear VR at launch, flying over a city in a helicopter (and being able to look straight down) is now an awe-inspiring and memorable experience. Also, when gaming, even small UI elements are super crisp and easy to read. Which is great. Finally, there is no stutter, since all VR experiences are designed to fall within the Galaxy S6’s specifications.

Not Being Tethered to a Personal Computer Opens Up New Play Mechanics

When you think of peripherals to make VR more immersive, many come to mind: HOTAS flight sticks and throttles, Sixense motion trackers, steering wheels and pedal sets with force feedback, the Buttkicker. What don’t most people think of when considering VR peripherals? The humble swivel chair.

Gear03-SwivelChair

The game that proved this for me was Anshar Wars. That such a simple game, with a single level and perhaps four mechanics, blew me away is a testament to smart, simple design. The best game ideas are massively intuitive and immediately fun, and Anshar Wars proves the swivel chair mechanic works.

The game is simple. You watch your fighter fly out of your mothership (third person – you’re in a “chase cam” behind your fighter) and then enemy fighters attack your mothership. Your velocity is constant … you are always flying forward. To guide your ship and aim your crosshair, you just … look where you want to go. To put that alien ship in your crosshairs? Look at it. To thread the needle between two asteroids? Look between them. Look up to fly up, down to fly down, and finally, the best part.

To bank 180 and chase the enemy fighter that just blew by you, you swivel your chair around.

In concept, this seems silly. In practice, it’s awesome. For the thirty minutes straight I played Anshar Wars, I must have looked (to the average observer) rather ridiculous. Looking up and down, using measured presses of my feet to rotate my chair left and right at varying speeds. Yet in VR, I was flying in loops. Zooming around asteroids, locking on and firing missiles, and blowing past and then quickly banking around to evaporate enemy fighters. At least until I flew into that asteroid and went boom.
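As a guess at the general technique (illustrative Python, not Anshar Wars’ actual code), the look-to-steer loop might simply nudge the ship’s heading toward the head’s forward vector every frame, so a chair swivel eventually brings the ship all the way around:

```python
import math

# Hypothetical look-to-steer loop: each frame, blend the ship's
# heading a fraction of the way toward the gaze direction, then
# renormalize. Constants are illustrative.

def normalize(v):
    mag = math.sqrt(sum(c * c for c in v)) or 1.0
    return tuple(c / mag for c in v)

def steer(heading, gaze, turn_rate=0.1):
    """Nudge the heading toward the gaze direction by turn_rate."""
    gaze = normalize(gaze)
    blended = tuple(h + (g - h) * turn_rate for h, g in zip(heading, gaze))
    return normalize(blended)

heading = (0.0, 0.0, 1.0)   # flying straight ahead
gaze = (1.0, 0.0, 0.0)      # the player swivels to look right
for _ in range(100):        # one steer step per rendered frame
    heading = steer(heading, gaze)
# After enough frames, the ship has banked around to face the gaze.
```

Because the turn is gradual, looking between two asteroids threads the needle rather than snapping the ship around.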

Gear04-AnsharWars

The swivel chair mechanic is something that simply won’t ever work with a “wired” headset (like the Oculus Rift, HTC Vive, or Sony Morpheus) because the wire is going to get tangled. Because the Gear VR has NO wires – it’s just strapped to my face – I can spin any direction as far as I want as often as I want and never have any problems. Thus, the humble swivel chair goes from the thing that makes your butt hurt after a long gaming session to an input device as integral to gameplay as a keyboard or controller.

Never would have guessed!

The Features of a “Final” Consumer Device

The Oculus DK2 is a dev kit, not a consumer product, and technically, Samsung claims the Gear VR is not a consumer product either. Yet it already incorporates a number of useful features I feel must be in the final consumer version of the big boy headsets. These include:

Built-in Touchpad

This is one of the best features of the Gear VR. It has a touchpad and “Back” button built into the side of the headset. For my X-Men fans, remember how Cyclops would touch the side of his visor to unleash optic blasts? Well, that’s pretty much what it feels like to interact with the Gear VR, and the touchpad is the controller for many games (sidenote: Someone make a Gear VR game where I’m Cyclops).

Look at Menus to Select Them

I first noticed this mechanic in the excellent Titans of Space, and called it out as the ideal way to navigate menus in VR. Well, Oculus apparently agrees. Every menu button within the Gear VR highlights when you “look” at it (you have a crosshair that shows you exactly where you are looking, in VR, and can look past the crosshair when not using it). Clicking the touchpad when looking at an option selects it.
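A minimal sketch of how gaze selection like this can work (illustrative Python with simple vector math, not any real Oculus API): cast the head’s forward direction at the buttons and highlight whichever one falls within a small cone.

```python
import math

# Hypothetical gaze-selection sketch: a button is "highlighted"
# when the angle between the head's forward vector and the
# direction to the button falls under a small threshold.

def normalize(v):
    mag = math.sqrt(sum(c * c for c in v)) or 1.0
    return tuple(c / mag for c in v)

def angle_between(a, b):
    dot = sum(x * y for x, y in zip(normalize(a), normalize(b)))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

def gazed_button(head_pos, head_forward, buttons, max_angle=5.0):
    """Return the name of the button the user is looking at, or None."""
    best, best_angle = None, max_angle
    for name, pos in buttons.items():
        to_button = tuple(p - h for p, h in zip(pos, head_pos))
        a = angle_between(head_forward, to_button)
        if a < best_angle:
            best, best_angle = name, a
    return best

buttons = {"Play": (0.0, 0.0, 2.0), "Quit": (1.0, 0.0, 2.0)}
print(gazed_button((0, 0, 0), (0, 0, 1), buttons))  # Play
# A tap on the touchpad would then select the highlighted button.
```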

Pass Through Camera

Simple, but super useful! Without removing the headset, a quick menu selection allows me to activate the camera on my phone. I then see the real world through my phone’s camera, with the video image projected inside the headset. Thus, I can pause my game to grab a drink off my desk, check on what the dog is eating (that she probably shouldn’t be eating) and talk to my wife without removing the headset.

Focus Wheel

This huge win is a wheel on top of the Gear VR, similar to that on top of a pair of binoculars. Scrolling it left or right subtly moves the plastic sheet behind the goggles strapped to your face forward or back to bring your phone’s split screen into focus. In early headsets like the DK2, getting the focus right is troublesome and requires putting the headset on, booting up an app in VR (no “Oculus Home Menu” for the DK2), pulling the headset off, adjusting, and repeating. With the focus wheel, you just don the headset, adjust it like a pair of binoculars, and focus in seconds. Intuitive, efficient and easy.

Wireless VR!

The consumer versions of the Oculus Rift and Sony Morpheus (seated experiences) and the HTC Vive (standing experience – possibly) will have massive cables running from each headset to a computer. So, while you can tiptoe around these, you must be careful not to get your cables twisted. The Gear VR is entirely wireless (you wear the computer!) and the feeling of freedom is vastly superior. Swivel chair!

Downsides

Your Phone May Literally Melt

Obviously, there are tradeoffs for wireless VR freedom. First, your phone gets super hot when used for VR. And by super hot, I mean the Oculus app you use for VR literally includes a function that measures the heat level of your phone and, when your phone is approaching its melting point, shuts off what you’re doing. It then displays a prompt along the lines of “Your phone is too hot. Please allow your phone to cool before continuing your VR experience”. You can’t play again until your phone cools down.


Heat tracking and application shutdown is an integral component of the Gear VR; it’s actually called out in the instruction manual. This suggests that Oculus and Samsung know a computer in a tiny plastic case, running lots of calculations inside another plastic case, gets hot, and that they can’t fix it. Your phone overheating is inevitable, and occurs after anywhere from thirty minutes to an hour in VR. This won’t be a problem with the Rift, Vive, or Morpheus headsets, but it is a problem with current-generation phones playing games (or even movies) for extended periods. It limits VR playtime.
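The guard the app implements probably looks something like this minimal sketch (illustrative Python; the threshold and the temperature probe are invented for illustration, not Samsung’s actual values):

```python
# Hypothetical thermal guard: poll the device temperature and
# suspend the VR session once it crosses a safety threshold.

SAFE_LIMIT_C = 45.0  # made-up safety threshold, in Celsius

def thermal_guard(read_temperature_c):
    """Poll the device temperature and decide whether to suspend VR."""
    if read_temperature_c() >= SAFE_LIMIT_C:
        return "suspend: Your phone is too hot. Please allow it to cool."
    return "continue"

print(thermal_guard(lambda: 47.2))  # suspend message
print(thermal_guard(lambda: 38.5))  # continue
```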

You Are a Head on a Flagpole

The biggest thing missing from the Gear VR is what the Oculus DK2 does beautifully: moving your head with your body. With the Gear VR (as with the Oculus DK1) you can look up, down, left, right, and so on, but if you straighten, slouch, or lean, your view doesn’t change; your head remains “locked” to default X, Y, Z coordinates. This ruins VR and gets nauseating very quickly unless you force yourself not to move while using the Gear VR, because what you “see” in VR doesn’t match what your body is doing.
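The difference boils down to which parts of the tracked head pose get applied to the camera. A hypothetical sketch (illustrative Python, not either SDK’s real API):

```python
# Hypothetical sketch contrasting 3-DoF tracking (Gear VR) with
# 6-DoF tracking (DK2). Function names and values are invented.

def gear_vr_camera(default_pos, head_rotation, head_offset):
    # 3-DoF: only orientation is applied; body movement
    # (slouching, leaning) is ignored, so the view stays pinned.
    return default_pos, head_rotation

def dk2_camera(default_pos, head_rotation, head_offset):
    # 6-DoF: positional tracking offsets the camera, so leaning
    # is reflected in what the user sees.
    pos = tuple(d + h for d, h in zip(default_pos, head_offset))
    return pos, head_rotation

lean = (0.0, -0.1, 0.2)   # the user slouches down and forward
pos3, _ = gear_vr_camera((0.0, 1.7, 0.0), "rotation_quat", lean)
pos6, _ = dk2_camera((0.0, 1.7, 0.0), "rotation_quat", lean)
# pos3 is unchanged (the mismatch that causes nausea);
# pos6 follows the body.
```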

Like any other sort of motion sickness, moving your torso too much or too often and not seeing that reflected in VR can quickly nauseate you to the point of quitting. Most people shouldn’t have a problem if they have a high-backed chair and sit straight against it for the entire time, but this requires discipline. Most people instinctively slouch or shift while sitting, and the Gear VR can’t account for this.

Despite this, games like Anshar Wars play beautifully in the Gear VR because of their clever incorporation of swivel chairs. So long as you press your back to the chair and spin the chair, not your body, you can play intuitively for extended periods with no motion sickness. So there’s that!

Summary

The Gear VR, and other devices like it, feel like the equivalent of a handheld game console. If a powerful computer with an HTC Vive is your PS4, then a Samsung Gear VR is your PS Vita: lighter, self-contained, easy to transport, limited by battery, and designed for shorter, snappier experiences on the road. In exchange for giving up computing power and longer play times, you gain portability and ease of use.

In my opinion, that’s a decent tradeoff. I think the best thing about the Gear VR and similar devices will be VR evangelism. The Gear VR is something you can take anywhere, adjust to a new user in seconds, and use to initiate those who’ve never tried VR into the fold. People who use it will “get” VR quickly.

The next generation of VR headsets will be, in my opinion, wireless versions of the Oculus Rift, HTC Vive, or Sony Morpheus, or Gear VR-like devices that track torso movement and don’t threaten to melt during use. If you already have a compatible phone, the Gear VR is a fun supplement to your home VR setup.

A Day in the Life of a Space Bounty Hunter

Those who read my prior post, A Day in the Life of a Space Trucker, might be interested to know I’m still flying, and I’ve graduated from running cargo to shooting criminals out of the sky. Elite: Dangerous with a flight stick and VR helmet remains one of the most immersive experiences out there, and as you’d expect, blowing other ships up in virtual space is entertaining. For those who haven’t gotten to try VR yet, experiences like this are what you can expect when the first consumer headsets ship in late 2015.

[ My virtual cockpit. In my basement. Because I take my games seriously. ]

So what’s it like to be a bounty hunter in Elite? Pretty fun! At the moment, one of the most efficient ways to find criminals and earn bounties is to fly out to your local system’s Nav Beacon, a hub for ships coming and going from other systems. Some of these ships, inevitably, have pilots flagged as Wanted, meaning they did something the system authority considers bad.

Note: All screenshots that follow were taken in the Oculus Rift, which is the reason for the odd resolution. Each is essentially “half” a shot, what would be presented to my left eye.

[ Sitting in the cockpit of my Viper, ready for take off. ]

What horrible crime did they commit? Who knows! I’m paid to shoot them down, not debate criminal law. Though, just for reference, in the world of Elite: Dangerous there is one penalty for every infraction, even minor ones such as scratching paint or stealing food. Death. Fiery, laser-induced death.

My typical gameplay session involves hopping into my Viper space superiority fighter (putting on my Oculus Rift DK2), leaving my local station, jumping into Supercruise, and dropping out at a Nav Beacon.

[ Cruising around from the dark side of a planet as the sun begins to rise. Gorgeous. ]

I then fly around scanning ships as they arrive, almost like a traffic cop zapping passing vehicles with a radar gun. Except if my radar gun flags you as “speeding”, I will most likely try to kill you. So it’s a bit different.

[ Time to find out who’s been naughty or nice. ]

If a ship comes up “Wanted”, it’s time for me to go to work. I can attack that ship unprovoked without becoming Wanted myself, and blowing it up earns me credits I can use to improve my ship and weapons, thus allowing me to blow up more ships. It’s like the Circle of Life, except with Death.

[ Let’s see if Mahilda has been a naughty girl. ]

Because I’m a mercenary (and not a “space cop”) I’m not required to engage every lawbreaker that enters my crosshairs, and I often don’t, even if they are Wanted. Knowing who to engage and who to leave well enough alone are critical calculations for profitable bounty hunting. Engaging a ship that’s more heavily armored and armed than me, or a wing of ships when I’m flying on my own, costs ammo, may damage my ship, and may even end the Circle of Death (with my death), which means I’m paying a big chunk of insurance to get a new ship. I’m here to make credits, not spend them, so how do I make my decision?

[ It seems she’s a law-abiding citizen. Good for her! ]

My first point of data is the Wanted pilot’s combat rank, which I see when I scan them. Every pilot has a combat rank (ranging from “Mostly Harmless” to “Elite”) which, for AI, tells you how hard they are to defeat, and, for players, just tells you they’ve blown up a lot of other ships. I can drop a “Mostly Harmless” pilot in seconds, while a Master or above pilot in a good ship may prove challenging.

Other factors in my decision include the presence or lack of System Authority Vessels (Elite’s overzealous “space cops”), who will aid me in taking out dangerous criminals or groups of criminals if I engage nearby, and the estimated worth of a bounty. Blowing up tiny cargo haulers (in addition to being laughably easy) yields next to no bounty, and really, what could a cargo hauler have really done that was so horrible to warrant death? He probably stole some food or something. For his sick spouse and kids.
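Those calculations could be summarized in a hypothetical decision function (illustrative Python; the ranks list, bounty threshold, and backup rule are my own inventions, not Elite’s):

```python
# Hypothetical engage-or-pass decision for bounty hunting.
# Ranks, thresholds, and the backup rule are invented.

RANKS = ["Mostly Harmless", "Novice", "Competent", "Expert", "Master", "Elite"]

def should_engage(wanted, target_rank, bounty_cr, authority_nearby,
                  my_rank="Expert", min_bounty_cr=5000):
    if not wanted:
        return False   # clean pilots are off-limits
    if bounty_cr < min_bounty_cr:
        return False   # tiny bounties aren't worth the ammo
    outclassed = RANKS.index(target_rank) > RANKS.index(my_rank)
    if outclassed and not authority_nearby:
        return False   # too risky without the space cops' help
    return True

print(should_engage(True, "Mostly Harmless", 24000, False))  # True
print(should_engage(True, "Master", 60000, False))           # False
print(should_engage(True, "Master", 60000, True))            # True
```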

[ This guy honestly isn’t worth my time. ]

In addition, in the chaos of a multi-ship engagement (with multiple “Clean” pilots engaging a “Wanted” pilot) it’s not uncommon for a stray shot from a friendly pilot to tag me or another Clean ship, resulting in an instant “Wanted” flag for the poor soul. I generally let those pilots go, out of common courtesy if nothing else. Remember how I said Elite’s space cops are “overzealous”? This is where they prove it.

If a friendly ship accidentally tags me in a multi-ship dogfight, they’re immediately flagged Wanted and, as mentioned, there is only one penalty for scratching another ship’s paint with a stray shot. Death!

[ Tomas Ekeli is wanted. Tomas Ekeli is about to have a very bad day. ]

Shooting down people who’ve previously helped me take down criminal pilots leaves a bad taste in my mouth (not to mention these ships are worth next to nothing, with tiny bounties) so I leave them to system authority and hope they have the sense to jump to Supercruise before they’re blown up.

[ Tomas is already busy shooting at a law-abiding citizen. I slide in behind him and deploy hardpoints. ]

Should I choose to engage, I usually get to start the fight, because pirates rarely attack a fighter with no valuable cargo. I drop into the enemy fighter’s six, open my hardpoints, and say hello by unloading everything I have, inflicting as much damage as possible before they react and evade.

10 - Saying Hello

[ Please allow me to introduce myself. ]

Poor pilots often become fireworks before escaping my crosshairs, while skilled ones may evade and give me a real fight. Dogfights among equally maneuverable fighters and skilled pilots in Elite: Dangerous often turn into two fighters flying in an endless loop (imagine a wheel with a “north” and “south” point spinning endlessly) with occasional shot opportunities.

11 - Attempting Loop

[ Tomas is attempting a loop. Spoiler: It doesn’t work. ]

Assuming neither of us changes direction or breaks off and engages our afterburner (in hopes of getting enough distance to turn and get a shot) this can go on for a while. This type of dogfight is where a headset like the Oculus Rift proves its worth – I can constantly look “up”, out of the top of my cockpit, and track an enemy ship even when it’s not in front of me. So that’s cool!

12 - I Can See You

[ I can still see you, Tomas. ]

Upon taking heavy damage, an enemy pilot may disengage and attempt to jump into Supercruise. This is a clean getaway unless I have a Frame Shift Wake Scanner, which allows me to track them into Supercruise and follow them. Since I’m primarily facing simulated pilots who exist to ensure I have fun, these dogfights usually end with the enemy ship exploding and a bounty credited to my ship. Because it wouldn’t be fun if every AI pilot jumped to Supercruise when you were about to kill them, would it?

13 - Bad Day Ends

[ Tomas’ bad day ends in fireworks. Of death. ]

Ultimately, the credits I earn from shooting down criminals are just a promise; they aren’t mine until I return to a local station and “cash in” my bounty vouchers.

14 - Back to Station

[ I have to get back here alive to get paid. ]

This is why bounty hunters must be careful not to push their luck, especially when they’ve accrued a sizable bounty voucher. If an enemy ship manages to destroy you, all those promised credits vanish, and you’re on the hook for the insurance cost of your ship. As your ship and the components you’ve bought for it improve, this becomes very expensive very quickly.

15 - Reward

[ Thank you for risking your life to keep us safe. Here’s some credits. ]

I’ve yet to face a human pilot in Elite: Dangerous (playing on the “lawful” side of the universe) but given my prior experiences in PvP, I imagine this will be an even more pulse-pounding experience. I also imagine the chances either I or my opponent will attempt to disengage at a certain damage threshold would be much higher, which makes hunting human pilots a less attractive proposition for a hard-working bounty hunter. Especially when I can lose all my credits along with my ship.

So, if I do someday encounter a Wanted player, I might engage just for the novelty of it, but after trying it a few times, I’d probably leave them well enough alone unless they shot at me or something. The risk/reward calculation just isn’t there, especially when AI pilots are much easier to kill.

16 - Random Beauty Shot

[ It’s a beautiful game. ]

Ultimately, after nearly twenty hours of flying and blowing up Wanted (AI) criminals, I’m still enjoying my experience with VR in Elite: Dangerous, which is an encouraging sign that properly designed VR games can have a long shelf life. Blowing up naughty pirates and escaping Interdiction by their angry friends remains a great virtual experience, and one I hope many people will be sharing with me in December 2015 and early 2016, when the first VR headsets become available to consumers.

17 - Heading Back for More

[ I’m NOT the Law. I’m like the Law’s cousin. Or roommate. ]

Until then, know that CMDR Captain Sunshine will be keeping the space lanes clean of lawbreakers (so long as the criminals are easy to kill) and ensuring that criminals don’t escape justice (unless they’re in a wing with other criminals – or an Anaconda – or an Elite Viper pilot – or I don’t feel like it at the moment). If none of these things is true, however, look out, lawbreakers! Justice is coming for you! (maybe)

Why VR is Actually Here (This Time)

Ever since I got my Oculus Dev Kit 2 (DK2) I’ve been experimenting with demos, speculating about development, and following many projects closely. I’ve also been talking to people at conventions about why virtual reality is actually happening, and why it didn’t all the previous times we said it was. Yet it wasn’t until playing Elite: Dangerous with a high-tech flight stick that I could truly and confidently say “Yes, we’ve nailed VR this time. It’s actually here.”

Dev Kit 2

Yes, VR technology is still in its infancy, similar to homebrew computers in the 70s and 80s. A group of extremely motivated early adopters are now playing with prototype tech and software, putting it through its paces and pushing its limits. But just as the homebrew computers those hobbyists soldered together decades ago led to the cheap plug-and-play machines now available at your local Best Buy, VR headsets and experiences will become increasingly common over the next decade.

The Oculus Rift is constantly mentioned as the singular device that will allow us all to experience virtual reality as it was meant to be experienced. To the average person, the Rift has become synonymous with VR. However, as impressive as the Rift is, it was really just the first hardware to hit the mark. As with TVs and computers, it is now just one hardware platform among many that will offer the same (now proven) experience. Today you can buy TVs from twenty different manufacturers, and VR headsets are already going down this same route. We already have giants like Sony and Microsoft producing their own headsets along with dozens of other small manufacturers. The design is known and it works.

We didn’t get here on our own. It was only in the last decade or so that technology advanced to the point where the components we needed for true VR existed. These include:

  • Gyroscopes and sensors powerful enough to match our virtual view to our physical movements, yet cheap enough to be mass produced
  • Smartphone displays that are sharp enough to show high resolution images, yet tiny enough to hold in our hands
  • Advances in computer processing power and graphics technology that allow us to render all these images on powerful computers that fit inside a phone, and
  • Software integration that reduces latency, making “lag” related to head tracking all but non-existent. Previously, even a tiny delay between when you turned your head in the physical and virtual worlds caused motion sickness.

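To put rough numbers on that last bullet, here’s a back-of-the-envelope motion-to-photon budget. Every timing below is an illustrative assumption of mine, not a measured value; the only grounded figure is the roughly 20 millisecond total delay commonly cited as the comfort threshold for head tracking.

```python
# Rough motion-to-photon latency budget for VR head tracking, showing why
# the whole pipeline has to be fast, not just any one component.
# All per-stage timings are illustrative assumptions.

DISPLAY_HZ = 75                      # a DK2-class panel refresh rate
frame_time_ms = 1000 / DISPLAY_HZ    # ~13.3 ms just to scan out one frame

sensor_read_ms = 1.0   # IMU sampling and sensor fusion (assumed)
render_ms = 4.0        # GPU time to render the frame (assumed)

total_ms = sensor_read_ms + render_ms + frame_time_ms
print(round(total_ms, 1))  # comfortably under the ~20 ms comfort target
```

Even with generous assumptions, the display refresh alone eats most of the budget, which is why the software tricks that shave the remaining milliseconds mattered so much.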
As with the first TVs, now that we have proven it is possible to build a working VR headset (with the developers at Oculus and Valve leading the charge) the hardware is coming, from a number of manufacturers, in all shapes and forms and sizes. A long road remains, but it is a road we are now traveling. This is why, unlike all the prior times developers and hardware enthusiasts have claimed “We’ve made virtual reality!” this time, it actually happened.

This is where we swing back around to Elite: Dangerous, and how it was the first VR experience that proved to me, beyond a doubt, that we’re living in a world where VR works. Below, some graphics.

EliteDangerousCockpit

The screenshot above shows the cockpit of your first spaceship, a Sidewinder, as it appears in the world of Elite. You can see your instrument panels, your joystick and throttle, your wraparound viewscreen, and space. Beautiful, beautiful space.

SAMSUNG

This shot is my home setup: an Oculus Rift DK2, a Saitek X-52 Pro, and my desk (with a bonus cameo from my dog). That hardware, plus the desk and a chair, is all I need to fly a spaceship against other spaceships in Elite, traveling through deep space to meet new people and make them explode. Why does this experience work? I’ll break it down.

Before moving forward, if you haven’t, go check out my post on The Three Components of VR Game Design. Here’s how all these components work together in Elite.

First, the Bubble. It’s my Sidewinder, and even as a starter ship it’s a pretty badass bubble. The spaceship interior is detailed and interesting enough to be convincing, yet large enough that I never clip through a console or cockpit glass. Moreover, it is interactive. When I look left, navigation and trip plotting windows pop into existence, floating for me to peruse. When I look right, I see another pop up window floating where I can adjust power to my ship’s systems, lower and raise my landing gear, and dozens of other operations.

Better yet, I can access all these functions directly from the dozens of buttons on my controller (more on that later). When I’m in a dogfight and an enemy fighter flies above me, I simply look up, out the top of my cockpit glass, and track it as I throttle down and pull back my stick to bring it into my sights. I never fumble for a keyboard or a mouse.

Next, the Proxy. Elite nails this one, and experiences going forward will only improve. I have a body in Elite. It’s in my spaceship and my head is attached to it. When I look down, I see my body (clad in a spacesuit, of course). I see my arms stretching out in that cockpit, just as my real arms are doing, and I see my gloved hands gripping a joystick and throttle — just like the joystick and throttle I’m gripping in real life. My virtual body moves like my real one, and my suit is not unlike one I’d be wearing in Tron. It’s my body and it moves like I do (so long as I don’t let go of the sticks!). Elite even has a built-in option to choose a male or female body (only visible in VR) so it works great for everyone.

Finally, the Controller. This is where the Saitek X-52 (and the way Frontier’s devs cleverly integrated it into Elite) comes into play. The joystick and throttle on my desk look very similar to the joystick and throttle in my virtual cockpit. I’ve placed them in similar locations on my real world desk. And when I hold them, in the real world, I see my hands and arms doing the same in the virtual world. Better yet, I feel them.

I grip a physical joystick and throttle and see my virtual body grip virtual sticks inside my headset. When I pull back on my joystick in the real world, my virtual body pulls back on its joystick in my Sidewinder. When I push the throttle on my desk forward, my virtual hand throttles up in-game. I can even see my thumb move to click a button in my cockpit when I click it in real life. There’s lag, mind you, but it’s still incredibly cool.

Is this experience perfect? Absolutely not. As I’ve said, VR is still developing. The resolution of the Oculus DK2 is not amazing (the next version, commonly referred to as the “consumer version”, has already improved it). There is lag involved in matching up my manipulation of my real world joystick and throttle to those in game (on average, about half a second) which is a limitation of my hardware and software setup. And I’m not running a $2,500 computer. My rig is firmly middle ground by today’s standards, and so while space looks neat, it’s not as jaw-droppingly awesome as these screenshots (just a sample of the massive universe you can explore in Elite).

But the nuts and bolts are there. The Bubble sells. My Proxy matches and mirrors me. And the Controller is in my hands. For those who spent way too much time in arcades in the 80s (raises hand) it’s not unlike playing Afterburner, or flight simulators that have been available to aviators for decades now – just much, much better. I know I’m not flying a spaceship. This isn’t The Matrix. But I am inside a thoroughly impressive simulated spaceship that looks ten times better than any real contraption I could ever actually fit in my home. Flying a spaceship in Elite is incredibly fun and very, very convincing.

And this is just the start. Computers will continue to get more powerful. Graphics more impressive. Headsets lighter. Displays clearer. And input devices are coming — so many input devices — that will allow us to manipulate virtual worlds in the way we’ve all seen in movies like The Lawnmower Man. Just YouTube some videos from Sixense or PrioVR to see what I mean.

Elite: Dangerous isn’t the only VR experience that proves VR is actually here. It’s simply the first experience where I managed to cobble all its components together. And it is dependent on a great many things — having the proper hardware, setting it up in a specific physical configuration, and actively tiptoeing around the current technology’s many limitations. Yet it is real. And I’m playing it.

Very soon, I hope, all of you will be doing the same thing.

An Argument for Fanfiction (from a Game Developer)

Here’s a confession. I used to write fanfiction. For the one person reading this who doesn’t know what that is, it’s when an author writes a story set in another (usually successful) author’s universe. There’s a slew of “fanfics” out there on the Internet from all sorts of people, and there is a huge community that reads and creates new stories in existing worlds with characters they already love.

Forgotten Messiah

My fanfic wasn’t my first foray into writing (that’s a trilogy of novels that will never be published because they’re terribly cliche) but it was certainly my most successful. Unsatisfied with the vague ending of the excellent Square RPG Final Fantasy 7, I wrote a sweeping 70,000 word novel that took place shortly after the game’s closing cutscene. It wrapped up loose plot threads, introduced new threads and a new villain, and continued the epic journey Square started in a way I found fun.

Then it turned out people actually liked it.

After it was hosted on RPGamer (a popular site for many fan creations, including fanfiction) the results were encouraging. At least 400 people tossed me e-mails telling me how much they had enjoyed my story, made encouraging comments on plot points and characters they liked, and also gave me the best feedback a writer can ask for: simply “I enjoyed reading this”.

Then administrators of other sites started asking if they could host my fanfic, and off it went around the Internet. The positive response to that first fanfic was what convinced me to seriously tackle an original novel set in a universe of my own design. It’s what inspired the first draft (one of eight) of my first published fantasy novel. I learned a great deal about the writing process by writing that single fanfic.

Was it my best work? Absolutely not. I was still learning to be a writer and continue to learn to this day. But my desire to see Final Fantasy 7 continue motivated me to create a complex story with interwoven plot lines, a diverse cast, an interesting villain, enjoyable action, and a satisfying conclusion.

If the experienced developers and writers at Square hadn’t first produced the world, characters, and plot, I’d never have written that fanfic, nor would I have had that terrific opportunity to learn and improve as a writer. Square generously provided me with a fully realized world and well-developed characters so I could concentrate on the basics: learning to tell a good story.

Here’s the parallel. My entry into my career as a professional game designer mirrors my foray into fanfiction. My first real game design experience (what most refer to as “modding”) was for LucasArts’ original X-Wing. I got an editor (XMB) off my local BBS and created a brand new 12 mission campaign for the game. I uploaded it to that same BBS. Three people downloaded it and thought it was awesome.

X-Wing

My brand new X-Wing campaign was, essentially, X-Wing “fanfiction” created in its game engine. LucasArts gave me a fully realized world with existing game mechanics, enemies with scriptable AI, and a good variety of existing mission objectives, and all I had to do was design and script my missions. From there I branched out to build new content for Doom and Neverwinter Nights and Quake and dozens more games, improving with each new project. Each experience allowed me to further hone my skills until I reached the level of a professional game developer and, eventually, shipped my first AAA title.

Could I, as a newbie developer, have created a new X-Wing campaign if I’d had to program a graphics and physics engine from scratch, script the AI, design the UI, and model the ship assets in 3D? Absolutely not. I simply didn’t know enough yet to pull that off but that didn’t matter. XMB enabled me to take an existing engine with game mechanics designed by experienced developers and simply learn “how to tell a good story”.

In the same vein, fanfiction is the modding of the writing world. Just like people who play X-Wing, or Doom, or Skyrim, become inspired to create their own content (new textures, new missions, and even new game mechanics) people who write stories set in the worlds of Harry Potter or Westeros or Pern are modding books. They are learning to be writers like modders learn to be game developers, by building on the work of experienced writers and learning one skill at a time. We should let new authors do that and even encourage them.

When a new author writes fanfiction, the groundwork for their story is already complete. The barrier to entry is low. A talented and experienced author (who has spent years honing their craft) has created a world, characters, and mechanics that are clear and compelling. New authors can focus on whatever skill they want to develop – writing dialogue, creating interesting situations, or building on existing ideas with new ones – in an existing playground. They don’t have to make a game engine from scratch.

Writing a good novel is hard, and learning to do everything at once (worldbuilding, characterization, dialogue, mechanics, and everything else) is a daunting task. It’s much easier to build on a solid foundation and learn all the different elements of professional writing at your own pace. This is why I think even prominent authors should tolerate fanfiction written in their worlds by their fans, even if they (understandably) don’t read it.

The benefit of inspiring more people to write and giving them a venue to do so outweighs any perceived drawbacks of having fanfiction on the Internet. Fanfics, like game mods, are created by fans, shared among fans, and completely free ways to inspire new authors. Just like we encourage prospective game developers to mod games, we should encourage prospective authors to write fanfiction.

They’ll get better with our help. They’ll use the starting point we created to learn and grow as writers. If we’re lucky, the fanfic writers of today will become the authors of exciting original worlds tomorrow.

Thoughts on Rift Experiences – Part 2 of 3

Edit: I’m adding some additional notes to this write up, which is written about a demo version of Radial-G that is more than a month old.

  • Radial-G has been approved through Steam Greenlight, funded, and Tammeka Games now aims to release its first full version in November 2014.
  • The team at Tammeka Games has already implemented many improvements to the demo I played and more are on the way, including some of those suggested here.

Original Post:

I’d originally planned to post my thoughts on Radial-G and Titans of Space several weeks ago, but the arrival of my daughter (a bit earlier than planned!) understandably set these back a few weeks. Everyone is doing fine and I’m slowly transitioning back into blogging and editing between feedings and nap time and listening to an unreasonable amount of lung practice. This week I’m going to write up my thoughts on Radial-G and how it uses the components of VR design I established several weeks ago.

As with my prior post on Ocean Rift, I want to again state that I’m reviewing a demo, not a finished product, and focusing specifically on how it hits the three elements of VR game design I’ve called out.

Radial-G was developed by Tammeka Games and you can check out its website here, where you can learn more about the game and team. It’s currently on Steam Greenlight and in development. My impressions come from playing the original demo (released with the Kickstarter) more than a month back and may not reflect the experience of playing the most recent demo. However, I still want to examine this experience in regards to my thoughts on VR game design.

Radial-G

This demo was recommended heavily on the Oculus Reddit and, after playing it, I can certainly see why. The developers took a relatively simple concept (riding a rollercoaster, a common VR demo) and took it a step further, allowing you to control the speed of the coaster and maneuver it around a central track, kind of like the classic arcade game Tempest (for those of you who go back that far). The game places you in a capsule attached to a central rail and tasks you with getting the fastest time around the “track” by hitting speed boosts and avoiding hazards which slow you down.

This was the first experience I played on the Oculus Rift that actually felt like a game and the developers did an excellent job of taking a simple concept and implementing it well. Highlights:

  • Looking around inside my capsule/spaceship. It really did feel like being inside a metal capsule attached to a long rail.
  • Hitting a spinning wall hazard for the first time. I unconsciously leaned my entire body to the side to avoid it and then flinched when I hit it dead-on.
  • Getting my first really good lap time and cruising at a fast and uninterrupted pace. Radial-G does a great job of starting you off with a task that is easy to learn but rewards even incremental improvement in an understandable way – a faster lap time.

I did experience Presence in this demo (though it was intermittent) and while it was not as strong as it could have been due to some technical issues, it was still a fun experience.

The Bubble

Radial-G took the correct approach in creating the Bubble. You spend the experience inside a racing “capsule” that completely encloses you as you speed down the track. Being inside a craft like this makes it believable that you aren’t feeling wind in your hair or on your face, and speeding along the track and spinning loops around it imparts a sense of motion that (almost) matches doing this in real life.

Where the Bubble could use improvement is in technical issues. It was not obvious how to recalibrate the Rift to place my head in the “center” of the capsule, and so I spent some time with my head stuck in the seat or in the ceiling (breaking Presence). The capsule itself also didn’t fully “sell” the experience of racing around a track. It felt too smooth and sterile, and I wanted it to feel more white-knuckle.

The Proxy

Unfortunately, the version of Radial-G which I played provides no Proxy, and I couldn’t see my legs, body, or hands. The sense of being a disembodied “ghost” inside the capsule fights with Presence, and while I did obtain Presence when not paying attention to the capsule interior, the lack of any sort of body was a constant reminder that I wasn’t really there.

So, while I could feel Presence when I was looking out into the environment or down the track (which I was doing most of the time) looking down at myself or behind me broke Presence immediately. As with other games of this sort, I think Radial-G would benefit from adding a body in a flight suit (even a stationary one) so that I at least had some sort of anchor for my disembodied head.

The Controller

The demo suggests a standard Xbox controller, and this was a really good choice. The developers were conscious of how disorienting it can be to move in VR and did a great job of creating simple, intuitive controls that are difficult to get wrong. Since I’m “flying” a capsule attached to a track, the analog controls felt very natural, and the movements were simple. The Right Trigger accelerated, the Left Trigger braked, and the Left Stick slid me left or right around the looped track.

While these controls aren’t binary, they are very easy to use because each control corresponds to a single direction of movement. By depressing the Right Trigger, I accelerate (move forward) or don’t. By pressing the Left Stick, I slide directly left or right around the central tube. It’s impossible to get pointed in the “wrong” direction or get turned around, and by attaching me to a round tube, the developers created a clean and fun way to speed about what would normally be a complex and disorienting track without ever getting lost or frustrated.
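That constrained control model can be captured in a tiny per-frame update. This is a hypothetical sketch of the scheme described above, not Tammeka’s actual code (every name and constant here is my own invention): forward speed is clamped so you can never reverse, and the lateral slide simply wraps around the tube.

```python
# Sketch of a rail-locked racer's control model: the capsule has only two
# degrees of freedom (forward speed, and angular position around the tube),
# so no input can ever point you the "wrong" way.
# All constants are illustrative, not taken from Radial-G.

MAX_SPEED = 100.0   # arbitrary top speed, units/second
ACCEL = 40.0        # speed gained per second at full right trigger
BRAKE = 60.0        # speed lost per second at full left trigger
SLIDE_RATE = 180.0  # degrees slid around the tube per second at full stick

def step(speed, angle, right_trigger, left_trigger, stick_x, dt):
    """Advance capsule state by one frame.

    speed: current forward speed; angle: position around the tube in degrees;
    triggers are in [0, 1], stick_x in [-1, 1]; dt is frame time in seconds.
    """
    speed += (right_trigger * ACCEL - left_trigger * BRAKE) * dt
    speed = max(0.0, min(MAX_SPEED, speed))           # clamp: no reversing
    angle = (angle + stick_x * SLIDE_RATE * dt) % 360.0  # wrap around tube
    return speed, angle
```

Because the slide angle wraps modulo 360 and speed never goes negative, there is no combination of inputs that leaves you lost or facing backward, which is exactly why the scheme is so hard to get wrong.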

The control scheme for Radial-G and how it interacts with the game elements was top notch and one of the best things about the demo.

How Can Radial-G Better Establish Presence?

A few improvements on the technical side would help (and given the demo I played is more than a month old, these may already have happened). Offering the user an easy way to recalibrate the position of the Oculus Rift would be a good start, ensuring I’m always at the optimal point inside the capsule, and not stuck in a seat or in the ceiling.

Second, adding a Proxy (a human body in a race suit) would increase my sense of Presence and anchor me inside the racing capsule. If this Proxy was animated (with a hand pressing forward on an accelerator, and perhaps side to side as I turned) this would also improve the experience.

I’d also like to see more feedback inside the capsule itself. Screeching and rattling when I speed along a rough portion of the track. Perhaps a bit of shaking or bucking in the capsule itself when I pull a number of Gs in a turn. Right now the capsule feels very static and almost too solid to sell the experience. I’d expect it to feel a bit more rickety or, at least, affected by the fast speeds and tight turns I’m pulling.

Finally, adding distortion or blackness at the edges of my vision as I pull tight turns (similar to what fighter pilots experience when pulling a number of Gs) would be a great addition. Another demo used this with great success and it could really add to Radial-G’s immersiveness.

Summary

Radial-G is a great example of a straightforward, well-designed, and playable game experience that works perfectly with the Oculus Rift. It’s easy to learn and control, rewards even incremental player improvement, and does a relatively good job of establishing Presence during play. As it is polished and improved I’m sure it will turn into a truly impressive VR experience.

Next Post

Next up, I’ll be providing the same thoughts on Titans of Space, a virtual tour of our solar system and perhaps the best and most immersive VR experience I’ve tried with the Oculus Rift.