Quantifying and identifying VR motion sickness causes, in order to solve them.

The comments Brendan Iribe, the Oculus CEO, made at the Web Summit in Dublin today were interesting not only for the warning he gave to other companies, but for the problems Oculus still face.

“We’re a little worried about some of the bigger companies putting out product that isn’t quite ready. That elephant in the room is disorientation and motion sickness,” he said.

By bigger companies he means, of course, Sony. “Don’t poison the well here”, he warned, but I think the bigger danger could come from Oculus developers rather than Sony. The PS4 is a known variable, as the Sony headset will be when it’s released, and developers will be more comfortable with the Sony development ecosystem by the time their HMD is made available. If I decide to develop a VR game for Sony, well, that decision isn’t really mine. I have to apply to the development program, buy a dev kit for $2,500 (assuming I’m accepted) and then persuade them to give (or sell) me a Morpheus HMD. Even if I manage to create (what I regard as) a great game, it’s likely that Sony will veto any VR title that doesn’t create a good experience for users.

Oculus, on the other hand, is totally open: I can cheaply buy a DK1/DK2 (and even this is optional) and create whatever I like. I’m not arguing against this openness, since the low barrier to entry is a great way of encouraging new demos and games, but poorly executed code is far more likely to create the ‘disorientation and motion sickness’ that Iribe is so concerned about.

It’s often estimated that perhaps 10% of the population are susceptible to simulator sickness, and while Oculus are attempting to address this by recommending developers follow their ‘best practices’ guide, knowing what will cause nausea in some people and not others is still a black art. Generally the trigger is some kind of movement, either unexpected or something that throws the vestibular system off. If you have sub-millimeter positional tracking and are rendering at a locked frame rate of 75–90 Hz, 99.9% of people will be fine when they’re not moving and just looking around. Figuring out exactly what the problem is once the user begins to move needs a more methodical approach.

From 6:30 to about 9 minutes in, Tom Forsyth talks about how even Oculus are still not sure why people get sick and what specifically causes it in different people. Now if the eggheads at Oculus are still figuring this out, what chance do regular developers have? We can follow the best practices and our users might still get sick when faced with the wrong type of stairs :p

So here are my three simple suggestions for tackling the whole ‘disorientation and motion sickness’ issue that Oculus, and all the VR companies, still face.

Demos

For most first-time Rift owners, the first thing loaded is usually the standard ‘Demo Scene’ with a simple desk, a plant and a lamp… it’s simple, effective, non-nauseating… and quite boring. Now a ‘boring demo’ is fine for showing ‘VR virgins’ what the Rift can do, but you’ll have a hard time convincing them that they also need to go out and buy an HMD and a good PC just so they can look at a desk. Most people jump into the Tuscany demo from here, but sadly even that can cause nausea in some people. A better option would be for Oculus to release the demos they recently showed with the Crescent Bay prototype, or something similar: give Rift owners a well-constructed suite of demos known to run well and guarantee the first-time user a great experience. Sadly, too many Rift owners seem to enjoy throwing first-time users into far too intense or scary experiences. Dreadhalls or the rollercoaster demos are great fun, but you might be showing the Rift to someone who hasn’t really played a video game since Pac-Man, and you run the risk of giving them immediate motion sickness, a delayed nightmare or even a face-plant into concrete.

A user’s first experience should be fun, safe and non-nauseating; give them something unpleasant and you might put them off for a long time. We need some awesome introductory experiences to amaze VR virgins, not make them ill.

If Oculus want a cheap way of finding some great new intro VR experiences, they could run a competition and give a few CV1 headsets away as prizes.

Training and testing

After giving someone a ‘nice’ experience, give them a disguised ‘motion sickness’ test. We build a carefully designed level, perhaps laid out as a museum or art gallery, that the user can explore while we test their comfort level. They move around the various floors looking at the exhibits, but at each intersection or ‘level’ we ask them to rate their comfort on a scale from 1 to 10. Assuming they stay comfortable, we begin to alter the parameters of the test: walking speed, comfort-mode turning, stairs, blink transitions and the like. The aim is not really to make users sick but to identify the point at which it happens.

This gives Oculus a chance not only to test users but to train them in the (yet to be determined) new standards of movement, UI and the like. It also gives you a chance to teach users to recognise nausea and what causes it, and to reassure them that it’s temporary and improves with exposure.

This gives us a standardised, repeatable test that we can use to strip away, and hopefully identify, the causes of simulator sickness. If we also anonymously gather the user’s age, sex, IPD, height and perhaps glasses prescription, plus the computer specification and frame rate, we have an easy way to quickly and scientifically test hundreds of thousands of people, and their hardware, and look for patterns. It also lets content creators identify which experiences are most likely to affect their users, so they can either alter them for wider comfort or warn users that a certain game might make them sick.
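To make the logging concrete, here’s a minimal sketch of what one anonymized record might look like, appended as a CSV row each time the user gives a rating. Every field and name here is my own invention for illustration, not any real Oculus telemetry format.

```cpp
// Hypothetical anonymized comfort-test record; one row is appended each
// time the user rates their comfort at a checkpoint in the museum level.
#include <fstream>
#include <string>

struct ComfortSample {
    std::string sessionId;  // random per-session token, no personal identity
    int age;                // demographics, gathered with consent
    char sex;               // 'm', 'f', 'x'
    float ipdMm;            // interpupillary distance, millimetres
    float heightCm;
    std::string gpu;        // hardware specification
    float avgFps;           // measured over the segment just completed
    std::string testParam;  // e.g. "walk_speed_2.0", "stairs", "blink_turn"
    int comfort;            // the 1-10 rating the user just gave
};

void appendSample(const ComfortSample& s, const std::string& path) {
    std::ofstream out(path, std::ios::app);
    out << s.sessionId << ',' << s.age << ',' << s.sex << ','
        << s.ipdMm << ',' << s.heightCm << ',' << s.gpu << ','
        << s.avgFps << ',' << s.testParam << ',' << s.comfort << '\n';
}
```

Aggregate a few hundred thousand rows like this and patterns such as ‘stairs at high walking speed are fine above 75 FPS but toxic below it’ should fall out of fairly simple analysis.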

It’s also understood that simulator sickness can be mitigated through repeated exposure, so after a few weeks playing one game a user’s susceptibility might be reduced; the test can then be run again and their comfort remeasured. Now we can tell the user that their comfort has improved by, say, 20% overall or in certain areas, and that they would probably handle “GTA VR” with no problems, whereas three weeks before they wouldn’t have lasted 5 minutes in it. Perhaps some games push particular nausea triggers, and these could be used to acclimatise users to the effect: if a user can’t do ‘stairs’, they can train on a demo that uses gently sloping ramps instead, improving their tolerance for virtual stairs.

So Oculus, give us a nice training and test mode and we’ll give you the data to pinpoint exactly what makes some of us sick, letting you nail down a solution to motion sickness.

Reporting

I think it would also be incredibly useful to bake a ‘report nausea’ feature into the SDK, one that sends some screenshots and FPS graphs back to Oculus. This would let Oculus and developers identify the elements of games, demos and practices that cause problems, and find fixes; developers often fail to spot the problem areas themselves, and this would help pinpoint them. It could be dynamic, so you could ‘power through’ something that affects you temporarily while noting that it’s affecting you, or as simple as a ‘Nausea Quit’ button in the menu.
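I have no idea what the real SDK internals look like, so treat this as a purely hypothetical shape for such a hook; none of these functions exist in the actual Oculus SDK.

```cpp
// Hypothetical 'report nausea' hook; nothing here is real Oculus SDK API.
#include <cstdio>

enum NauseaSeverity { NAUSEA_MILD = 1, NAUSEA_MODERATE, NAUSEA_SEVERE };

// Called from a standard 'Nausea Quit' menu button or a hotkey. A real
// implementation would capture a screenshot, the last few seconds of
// frame timings and the scene name, then queue an anonymous upload.
void vrReportNausea(NauseaSeverity severity, bool userQuit) {
    std::printf("nausea report: severity=%d quit=%d\n", severity, userQuit);
}

// The dynamic 'power through' variant: the user flags that something is
// affecting them but keeps playing, and telemetry is annotated until the
// flag is cleared.
void vrSetDiscomfortFlag(bool active) {
    std::printf("discomfort flag: %d\n", active ? 1 : 0);
}
```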


So: nice demos, a training-and-test game, and better ways of reporting what sometimes makes us ill. Three easy things.


Comments are always welcome, or join in the argument on Reddit 🙂

Wired 2014 Startup pitch for the motion simulator…

I might put this on the front page; it sums up everything nicely in 398 words…


Since the invention of the classic six-degree-of-freedom platform sixty years ago, motion simulators haven’t changed much. They remain massive, ugly and slightly dangerous. Despite advances in materials and engineering, they still require large, powerful and expensive motors to lift a platform to simulate movement. This keeps motion simulators stuck in the realm of the expensive niche product. After all, who has £5,000 to £20,000 lying around?

With the dawning age of virtual reality, gamers will no longer be content to peer at a monitor screen; they’ll begin to demand a more visceral experience to match the feeling of being inside another world that the Rift provides. The market for the first affordable motion simulator is potentially massive.

What is needed is a new design that gives the user not only an affordable, modular device but one that is safe, stylish and smart.

This is the philosophy behind the Feel Three Motion Simulator for Oculus Rift.

Instead of lifting a chair in a fixed rectangular frame, we use interlocking aluminium hexagonal and pentagonal panels, curved to form a hemisphere. They’re light, cheap and very strong. This design lets users decide their own range of pitch and roll, since they decide the number of panels. 360° yaw is also built in, and as John Carmack said recently, “Swivel chair VR is going to be kind of a big deal”. If you want to go all out, build a fully enclosed sphere and roll and loop all day. The platform can even be turned into a snowboard or hang-glider simulator, and if you’re not a hardcore gamer you can just fill it with cushions and chill while you gently sway to the latest demo.

A motion simulator disguised as a chic sofa? Your wife will love it!

An electronic gyroscope on the platform works out where you are 1,000 times a second, and the Oculus Rift camera is mounted to the platform. The base drives the sphere with revolutionary ‘omnichains’, which allow three degrees of rotation simultaneously, and an optional heave module adds low-power vertical movement using counterweights. The whole kit is tiny when unassembled, and since it’s modular you can buy the basic platform and build your own unique design.

And the price? Slightly more than a decent gaming PC.

A cool, safe, affordable motion simulator for the masses. It’s time.

You feelin’ it?

Open sourced, gloveless, 6DOF hand/arm/finger tracking + gun for Virtual Reality for less than $50…

Leap Motion had the right idea when they recently released a mount for the Oculus Rift. The idea is that the sensor looks out from the front of your face, giving you a cool way to put your hands inside VR. It was a good idea, but fundamentally flawed by the limited range of the device: tracking beyond 30 cm is just too much for the Leap Motion to handle reliably. Still, it’s a step in the right direction. Where else can you find a solution that tracks fingers and hands with excellent accuracy for less than $25 (second hand)?

Hand tracking really is key to a more immersive VR experience, a problem that the people at Perception Neuron used to garner a sweet $570k on Kickstarter recently. But gloves? In 2014? It’s a cool set of kit, but there are so many parts to break, wires to cut and pieces to snag, and you look somewhat foolish wearing the full setup. And a $200 minimum investment? Ouch!

So what do we really need, at a bare minimum? To be able to see our dominant hand (preferably both) in VR space reliably. If we can track our lower arm accurately, we can pretty much track the rest of the arm too: the elbow is a pretty simple joint, and approximating the upper arm isn’t hard if you know where your wrist is and where it’s pointing. The VR guns that are appearing more and more don’t give us that information, but guns are easy; fingers are the hard part. Showing your hand and fingers, which 99% of the world automatically look for when they first don a Rift, is really what we’re always going to want to do.
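To illustrate the elbow point, here’s a minimal two-bone solve of the kind I mean, assuming we know (or guess) a shoulder anchor and the two bone lengths; the hint vector picks which way the elbow bends, and pointing it roughly downwards matches a relaxed arm.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
float len(Vec3 a) { return std::sqrt(dot(a, a)); }
Vec3 norm(Vec3 a) { float l = len(a); return {a.x / l, a.y / l, a.z / l}; }

// Estimate the elbow position from shoulder and wrist positions with the
// law of cosines (classic two-bone IK). 'bendHint' must not be parallel
// to the shoulder->wrist line; {0,-1,0} works for a relaxed arm.
Vec3 solveElbow(Vec3 shoulder, Vec3 wrist,
                float upperLen, float foreLen, Vec3 bendHint) {
    Vec3 toWrist = wrist - shoulder;
    float d = len(toWrist);
    if (d > upperLen + foreLen) d = upperLen + foreLen;  // fully extended
    // Angle between the upper arm and the shoulder->wrist line.
    float cosA = (upperLen * upperLen + d * d - foreLen * foreLen)
               / (2.0f * upperLen * d);
    cosA = std::fmax(-1.0f, std::fmin(1.0f, cosA));
    float sinA = std::sqrt(1.0f - cosA * cosA);
    Vec3 axis = norm(toWrist);
    // Component of the hint perpendicular to the shoulder->wrist line.
    Vec3 side = norm(bendHint - axis * dot(bendHint, axis));
    return shoulder + axis * (upperLen * cosA) + side * (upperLen * sinA);
}
```

Feed it the wrist position from the ball tracker (below) and the wrist direction from the gyro, and the whole arm falls out of two measurements.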

The main issue is cost. It’s mostly a solved problem if you have the money, but unless you can do it cheaply no one will adopt it and you’ll end up with early-adopter blues. Nintendo Power Glove, anyone?

So, who wouldn’t want to be able to see their hand and fingers move accurately in front of their face while in the Oculus Rift? Everyone, right? Now, who wants that for less than $50?

It’s easy if you know what to hack together 🙂

Once again, it’s a PROTOTYPE! It’s not perfect, nothing ever is, but show me another way to track your hand in VR, move around and shoot for less than $50 and I (and everyone else) will be very happy! I threw this together in a couple of days; if you like it, tell me! If you don’t like it… well, go make your own bloody controller!! :p

And if Oculus are watching, please give us access to the ‘skeleton’ of the camera in the SDK. We know you’re working on your own controller, but you never know, someone else might come up with a better solution, and it would be a shame for it to die on the vine because it’s impossible to support easily.

Here is a quick video, some pics, a breakdown of the hardware and how you can hack the software together to make it work.

[Image: IMG_2939]

[Image: IMG_2941]

Note: The MPU isn’t attached, and the nunchuck isn’t plugged in.

Hardware

The hand tracker is actually quite simple. I’m going to break down each part, what it does and why it’s needed.

Cuff

[Image: wrist cuff]

This is the base of the prototype; everything hangs on the wrist cuff. In its current incarnation it’s a little rough and could be more comfortable. A later version will have foam padding for comfort and a quick-release buckle, or velcro, for a snug fit. It will also need a box for the Arduino and gyro. Ideally it would also contain a small USB 3.0 hub so the Leap Motion and the Arduino can talk to the PC over a single lead; that raises the price a little, for convenience. We also have the option of adding a few more buttons to the cuff, which can be activated by the off hand.

(The tubes on the side are for support: plastic can be quite weak when printed in this orientation, and the tubes let us insert a 3 mm length of filament for strength.)


Wii Nunchuck

Why reinvent the wheel? The nunchuck can be found for as little as $3 online; it has a joystick and two trigger buttons, as well as a 3-axis accelerometer for simple motion detection. It’s cheap, reliable and easy to replace. It also has convenient Arduino libraries just waiting to be used.

Ideally we would be able to connect and disconnect two nunchucks, for use in both hands, although using only one is perfectly fine. The cable is long enough to allow use in the off-hand, so the user could move with one hand and aim/‘finger shoot’ with the other. It has a slightly inconvenient proprietary plug, but this is easily adapted with a $1 gizmo from eBay.

[Image: nunchuck]
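If you’d rather skip the libraries and see what’s actually going on, here’s a minimal Arduino sketch that talks to the nunchuck directly over I2C. It uses the widely documented ‘unencrypted’ init sequence; the register values come from community documentation, so treat it as a starting point.

```cpp
// Minimal Wii Nunchuck reader over I2C, no library needed.
#include <Wire.h>

const uint8_t NUNCHUCK_ADDR = 0x52;

void setup() {
  Serial.begin(115200);
  Wire.begin();
  // 'New style' init: disables the encryption the original Wii used.
  Wire.beginTransmission(NUNCHUCK_ADDR);
  Wire.write(0xF0); Wire.write(0x55);
  Wire.endTransmission();
  Wire.beginTransmission(NUNCHUCK_ADDR);
  Wire.write(0xFB); Wire.write(0x00);
  Wire.endTransmission();
}

void loop() {
  // Request the 6-byte state report.
  Wire.beginTransmission(NUNCHUCK_ADDR);
  Wire.write(0x00);
  Wire.endTransmission();
  delay(1);
  Wire.requestFrom(NUNCHUCK_ADDR, (uint8_t)6);
  if (Wire.available() >= 6) {
    uint8_t joyX = Wire.read(), joyY = Wire.read();
    uint8_t accX = Wire.read(), accY = Wire.read(), accZ = Wire.read();
    uint8_t misc = Wire.read();          // buttons + accel low bits
    bool z = !(misc & 0x01);             // buttons are active-low
    bool c = !(misc & 0x02);
    Serial.print(joyX); Serial.print(' '); Serial.print(joyY);
    Serial.print(z ? " Z" : " -"); Serial.println(c ? " C" : " -");
    (void)accX; (void)accY; (void)accZ;  // accelerometer unused here
  }
  delay(10);
}
```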

Arduino

A fully formed computer for $3? How can we refuse? This forms the brains of the unit: it reads the gyro sensor and interprets the nunchuck signals. Using a Pro Micro we can also emulate a USB joystick with no extra drivers. Handy!
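For the driver-free joystick part, one common route is the open-source ArduinoJoystickLibrary for ATmega32U4 boards; here’s a minimal sketch assuming that library is installed, with the nunchuck values stubbed out.

```cpp
// Pro Micro (ATmega32U4) as a native USB HID joystick, assuming the
// open-source ArduinoJoystickLibrary is installed.
#include <Joystick.h>

Joystick_ Joystick;  // default descriptor: X/Y axes plus buttons

void setup() {
  Joystick.begin();
}

void loop() {
  // Stub values; wire in the real nunchuck reads (0-255) here.
  uint8_t joyX = 128, joyY = 128;
  bool z = false, c = false;
  Joystick.setXAxis(joyX * 4);   // scale 0-255 up to the default 0-1023
  Joystick.setYAxis(joyY * 4);
  Joystick.setButton(0, z);
  Joystick.setButton(1, c);
  delay(10);
}
```

Windows then sees a standard game controller, so anything that takes joystick input works with no custom driver.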

Gyroscope/accelerometer/magnetometer

This tiny miracle on a chip provides a mass of information a thousand times a second. Since it’s attached to the cuff, we can accurately measure which way your wrist is pointing. They’re also only $8 each.
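As a sketch of the idea: read the accelerometer and gyro over I2C and blend them with a complementary filter to get a stable pitch and roll. The register addresses are the documented MPU-6050/9150 ones; the axis mapping will depend on how the board actually sits on the cuff, and yaw needs the magnetometer, which I’ve left out for brevity.

```cpp
// Pitch/roll from an MPU-9150 via I2C with a complementary filter.
#include <Wire.h>
#include <math.h>

const uint8_t MPU_ADDR = 0x68;
float pitch = 0, roll = 0;
unsigned long lastMicros = 0;

int16_t read16() {
  int16_t hi = Wire.read();
  return (hi << 8) | Wire.read();
}

void setup() {
  Serial.begin(115200);
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B); Wire.write(0x00);   // PWR_MGMT_1: wake the device
  Wire.endTransmission();
  lastMicros = micros();
}

void loop() {
  // Burst-read accel (6 bytes), temperature (2) and gyro (6) from 0x3B.
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, (uint8_t)14);
  int16_t ax = read16(), ay = read16(), az = read16();
  read16();                                        // skip temperature
  int16_t gx = read16(), gy = read16(); read16();  // skip gyro Z

  float dt = (micros() - lastMicros) / 1e6f;
  lastMicros = micros();

  // Accel gives an absolute but noisy tilt; gyro is smooth but drifts.
  // Blend: 98% integrated gyro rate, 2% accelerometer tilt.
  // (131 LSB per deg/s at the default +/-250 deg/s gyro range.)
  float accPitch = atan2(ay, az) * 180.0f / M_PI;
  float accRoll  = atan2(-ax, az) * 180.0f / M_PI;
  pitch = 0.98f * (pitch + (gx / 131.0f) * dt) + 0.02f * accPitch;
  roll  = 0.98f * (roll  + (gy / 131.0f) * dt) + 0.02f * accRoll;

  Serial.print(pitch); Serial.print(' '); Serial.println(roll);
  delay(5);
}
```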

Female Arm

[Image: female arm]

A simple printed arm that attaches to the cuff with a couple of bolts. It provides some basic cable management too.

Male Arm

[Image: male arm]

Another printed part, used to adjust the total length of the arm. A bolt locks it to the female arm and allows adjustment.

Leap Holder

[Image: leap holder]

The fourth and final printed part. This holds the Leap Motion sensor, which points at the hand and provides constant hand and finger tracking. It also has some holes for wires and needs two bolts to attach to the arm.

Leap Motion

Another miracle in a small package. You can find them second hand online now for $25. It has fairly mature drivers and an SDK for use with games. By mounting it on the wrist we can overcome its problems with range and free it from the desk. The Leap gives a better experience than putting on sweaty gloves, for a fraction of the price. It also has zero moving parts, so there is nothing to break.

Ping Pong Ball

This is attached to the end of the leap holder and has a hole cut in it. Illuminated from the inside, it gives us a cheap source of positional information. 25c.

LED and Wires

This gives the camera, the eleventh and final component, something to look at in the darkness. <$1.

PS3 Camera (or equivalent)

The camera tells the PC where your arm is in space, just as the Sony Move does. We can track the ball quite easily with open-source software, and if we put an IR filter on the camera and an infrared LED inside the ball we can cut out a lot of the tracking processing. Available on eBay for $5.
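Here’s roughly what that side could look like with OpenCV: threshold out the glowing ball (with an IR filter fitted it’s the brightest thing in frame), take the blob’s centre for X/Y, and estimate depth from its apparent radius via similar triangles. The constants are placeholders that would need calibrating per camera.

```cpp
// Track a glowing ping-pong ball as the brightest blob and estimate
// depth from its apparent radius. Constants are placeholders to calibrate.
#include <opencv2/opencv.hpp>
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    cv::VideoCapture cap(0);             // PS3 Eye, given a suitable driver
    const float BALL_RADIUS_MM = 20.0f;  // ping-pong ball
    const float FOCAL_PX = 540.0f;       // camera-specific, from calibration

    cv::Mat frame, gray, mask;
    while (cap.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        // With an IR filter, the LED-lit ball saturates the sensor, so a
        // fixed threshold isolates it cheaply.
        cv::threshold(gray, mask, 200, 255, cv::THRESH_BINARY);

        std::vector<std::vector<cv::Point>> contours;
        cv::findContours(mask, contours, cv::RETR_EXTERNAL,
                         cv::CHAIN_APPROX_SIMPLE);
        if (!contours.empty()) {
            // Assume the largest bright blob is the ball.
            auto ball = std::max_element(contours.begin(), contours.end(),
                [](const std::vector<cv::Point>& a,
                   const std::vector<cv::Point>& b) {
                    return cv::contourArea(a) < cv::contourArea(b);
                });
            cv::Point2f center; float radiusPx;
            cv::minEnclosingCircle(*ball, center, radiusPx);
            // Similar triangles: the blob shrinks linearly with distance.
            float depthMm = FOCAL_PX * BALL_RADIUS_MM / radiusPx;
            std::printf("x=%.0f y=%.0f depth=%.0fmm\n",
                        center.x, center.y, depthMm);
        }
        cv::imshow("mask", mask);
        if (cv::waitKey(1) == 27) break;  // Esc quits
    }
    return 0;
}
```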


3D Printing Alternatives:

If you don’t have access to a 3D printer you could always make a leap holder from some wood, or better still use ‘friendly plastic’. You melt the pellets in boiling water and mould them into shape; this stuff is perfect for a project like this, and it’s very cheap (and reusable).

Bill of Materials

  1. Printed Cuff
  2. Printed female arm
  3. Printed male arm
  4. Printed Leap Holder
  5. Arduino $3 (new)
  6. MPU-9150 $8 (new)
  7. Wii Nunchuck $3 (new)
  8. Leap Motion $25 (second hand)
  9. PS3 Eye camera $5 (second hand)
  10. Ping Pong ball 25c
  11. LED + 40cm wire + 10 ohm resistor $1
  12. 5 bolts and washers ~$1
  13. Extra cuff buttons 25c each (optional)
  14. Wii nunchuck adapter $1 (optional)
  15. Cuff Velcro $1 (optional)
  16. Cuff comfort foam $1 (optional)
  17. Short pieces of wire to connect the MPU and arduino
  18. USB lead for arduino (on hand)
  19. 5 meter USB 3.0 extender for the Leap Motion $7 (optional; see notes)
  20. 5v motor for cuff vibration $1 (optional)

Total: $46.25

or $58 with optional extras.

Notes: The cuff is ideally printed in nylon, since this is more flexible than ABS or PLA and should last longer. The Leap can be positioned to face the palm or the back of the hand, although the latter, while more convenient, might be less accurate. It’s also possible that if an optimal position can be found we could reduce the printed parts to a single combined arm and cuff, removing the need for the bolts and improving the appearance. A USB 2.0 lead can be used for the Leap Motion instead of the extender, but this lowers the data rate and may affect accuracy.

Software

This is a work in progress… but we can break it down into four distinct areas, all of which have open-source software available.

Finger Tracking: Sign up with Leap here for their SDK (minimal example below)

Position Tracking: Choose from two open computer vision projects, SimpleCV or OpenCV

Rotation Tracking: It’s a work in progress over at Arduino

Nunchuck libraries: Tim Teatro’s, or check the Arduino site
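For the finger-tracking piece, polling the Leap is only a few lines. Here’s a minimal sketch against the 2014-era v2 C++ SDK (check the current docs, the API may have moved on); positions come back in millimetres relative to the device, i.e. relative to the wrist mount here.

```cpp
// Poll the Leap and print palm and fingertip positions.
#include <cstdio>
#include <chrono>
#include <thread>
#include "Leap.h"

int main() {
    Leap::Controller controller;
    while (true) {
        Leap::Frame frame = controller.frame();  // latest tracking frame
        for (Leap::Hand hand : frame.hands()) {
            Leap::Vector palm = hand.palmPosition();
            std::printf("palm: %.1f %.1f %.1f\n", palm.x, palm.y, palm.z);
            for (Leap::Finger finger : hand.fingers()) {
                Leap::Vector tip = finger.tipPosition();
                std::printf("  finger %d tip: %.1f %.1f %.1f\n",
                            (int)finger.type(), tip.x, tip.y, tip.z);
            }
        }
        // Don't spin flat out; the device delivers ~100 frames a second.
        std::this_thread::sleep_for(std::chrono::milliseconds(10));
    }
    return 0;
}
```

From there it’s a matter of transforming these wrist-relative positions into world space using the cuff’s orientation (from the MPU) and position (from the ball tracker).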

Throwing together a new prototype for 6DOF hand tracking…

Oculus Connect was pretty good, but I was really hoping for an input device reveal. I think a lot of other people were too, since we’re in danger of some really expensive solutions becoming the ‘standard’. STEM and PrioVR are pretty cool, but they’re not only expensive but massive overkill (or underkill) for VR input. The solution that wins will be the one that lets you see your hands and tracks your fingers. Being able to hold a ‘gun’ would also be very useful. Anyway, I have a few ideas for achieving this, and cheaply, which is always good.

The seed was planted long ago 🙂

Just like the Rift itself, the best way to do something new is to put together existing technology, open source it and let people play with the idea. I’m printing the parts of a proof of concept now, which also means it should be pretty easy for people to try the idea and improve it.

Who wouldn’t want to print 4 simple parts, put together some simple electronics, install a program and have hand/finger tracking for less than $100?

Can anyone point me in the direction of an IR camera tracking solution on GitHub…?

Slightly frustrated

Well, I’ve spent far too much time trying to get a simple sphere to rotate properly in Unity. You would think it would be a simple matter, but it’s not as simple as I thought. I was hoping it could be done mostly with a single configurable joint, but there are so many options in there that it’s a nightmare to work out. I think I have it mostly figured out, though I’ll still need a simple script to map the joypad input to the right rotation axis and limit the rotation, otherwise it will go spinning off like crazy.

I’m also thinking of making a simple curved-screen version for a projector. A version for iRacing, with an iPad behind the wheel for the car telemetry, would be really cool.

I’m also hoping to show a simple surfing/boarding version using the inverse kinematics built into Unity. Watching C-3PO standing on a board and shifting his balance as it moves would be pretty cool. I can make a fairly realistic paragliding version too.

As soon as I get the rotation working I’ll put out a VR version for the Rift. I’m less concerned about the frame rate now that I know how easy it is to hide and show the various elements with a simple script.

A few ideas for the VR demo:

1. Changing the view to the inside of the sphere.

2. Simple rotating view of the panel components floating in space.

3. Using a hinge for motion/rotation instead of a collider, for better control and a smoother appearance. It also lessens the CPU work.

4. Ragdoll C-3PO in a sofa version.

5. Set it inside the bridge of a Star Destroyer.

6. R2-D2 and training bot props for flavour :p

7. Enable/disable various elements to increase the FPS and allow you to examine the models a little better.

8. Some simple controls… preferably with a controller thumbstick.

9. A playback demo that everyone could relate to… having the attack on the Death Star from Star Wars playing on a virtual screen while the sphere moves in time with Luke’s X-wing :p


It would be nice to get this version out before the conference on Thursday… (and to build my sphere)…