Human vision spans about 210°. We can see nearly 180° side to side, and by moving our eyes we can perceive slightly behind our head, which was useful for noticing predators sneaking up on us… The DK1 from Oculus had a 110° FOV; the DK2 is about 100°. This was due to a few reasons, but it’s generally expected the FOV will increase in the next iteration. A low FOV tends to pull you out of the experience; no one wants that blinkered feeling you get in a diving mask. When I was a boy these were made from black rubber, so the tunnel effect was really pronounced; modern masks are made from clear silicone. Your view is still blocked, but since light passes through you can still sense movement in your outer peripheral vision. Much nicer.
Some creative people took this idea and turned it around, creating LED strips that sample an average light reading from the picture and project it onto the wall behind your TV. Ambilight and Lightpack are of limited use, but a cool idea. It didn’t take long for someone to suggest putting one inside an Oculus Rift to increase the perception of a wide FOV, but HMDs are quite a bit too small for this to work well. It also seems Apple has a patent on this idea, but my following suggestions are not the same.
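The core of the Ambilight/Lightpack trick is simple to sketch. Here is a toy Python version, purely illustrative: the frame is a small 2-D list of (r, g, b) tuples rather than a real video capture, and `edge_average` is a hypothetical helper name, not any vendor’s API.

```python
# Toy sketch of the Ambilight/Lightpack idea: sample the border pixels
# of a frame and average them into a single colour for an LED strip.
# `frame` here is a hand-built 2-D list of (r, g, b) tuples; a real
# implementation would sample the actual video output.

def edge_average(frame, strip=2):
    """Average the colour of the outermost `strip` rows/columns."""
    h, w = len(frame), len(frame[0])
    samples = [
        frame[y][x]
        for y in range(h)
        for x in range(w)
        if y < strip or y >= h - strip or x < strip or x >= w - strip
    ]
    n = len(samples)
    return tuple(sum(c[i] for c in samples) // n for i in range(3))

# toy frame: a red border around a black centre
frame = [[(255, 0, 0)] * 8 for _ in range(6)]
for y in range(2, 4):
    for x in range(2, 6):
        frame[y][x] = (0, 0, 0)

print(edge_average(frame, strip=2))  # -> (255, 0, 0): only the red edge is sampled
```

A real strip would run this per-frame and per-side, but the principle, averaging what is already at the edge of the image, is all there is to it.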
Still, the idea is quite good: most of our visual acuity is concentrated in a roughly 6° arc, and the amount of detail we can perceive drops off away from the center of the retina. A method of putting extra light into our peripheral vision would be really nice, especially if we can do it for ‘free’.
But how? I thought of a couple of possibilities, but please remember this is just idle speculation, a thought experiment. Although if Oculus want to hire me to try them out, they had better hurry up before I apply to the HAXLR8R program 🙂
Well, first we would need white (or probably neutral grey) borders around the screen to allow for some reflectivity.
A fairly simple idea is a clear plastic rectangle that fits around the edge of the lens and refracts a small portion of the image out onto those reflective borders. We might sacrifice 2° of ‘real’ FOV to create the impression of an extra 10° or so on all four borders, per eye. This ‘lens’ might have to be precisely aligned, however, and we might need to increase the luminance at the edge of the screen to compensate for the light lost as it’s refracted.
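How much such a strip would bend the light is just Snell’s law. A quick sketch, assuming the strip is acrylic (PMMA, refractive index ≈ 1.49, an assumed value) and light exits into air:

```python
import math

# Snell's law: n1 * sin(theta1) = n2 * sin(theta2).
# Assumed values: clear acrylic (PMMA) n1 ≈ 1.49, air n2 = 1.0.
def refracted_angle(theta_in_deg, n1=1.49, n2=1.0):
    """Angle (degrees) of a ray leaving the plastic, or None on TIR."""
    s = n1 / n2 * math.sin(math.radians(theta_in_deg))
    if s > 1.0:
        return None  # total internal reflection: the ray never exits
    return math.degrees(math.asin(s))

print(refracted_angle(20.0))  # ~30.6 degrees: a modest bend outward
print(refracted_angle(45.0))  # None: past the critical angle (~42 degrees)
```

The total-internal-reflection case is actually the interesting one here: above the critical angle the strip behaves like a light pipe, which is what would carry the sampled light sideways onto the border.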
Look at the simple example I drew in Inventor. The right side shows the normal eye -> lens -> screen path. The version on the left, however, has a strip of clear plastic around the edges. Part of the light is now refracted out onto the HMD’s plastic side panels. We lose a little detail but make the Rift feel less constricted and enclosed.
Another idea would require some minor changes to the HMD casing but could produce a much more impressive effect. We would lose NO viewable area but could gain a really bright peripheral effect with some clever design engineering.
Consider that the current DK2 wastes a huge amount of screen real estate. This is the nature of the design and not a massive flaw, but we’re throwing away pixel light that could be used in our peripheral vision.
Each of the eight corners displays only black. Instead we could cover these areas with translucent plastic that bounces the pixels’ light out to the edges of the screen, much as fiber optics relocate light. Rather than wasting this potential light we can add it to the experience, and a 110° FOV could be perceived as perhaps 130°, bringing us one step closer to full immersion.
Here is a very crude example. The top of the right screen is sampled; the corner is illuminated and reflected into the top border. This doesn’t have to be quite so coarse: we could split the areas into smaller bands to improve the effect.
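The ‘smaller bands’ refinement can be sketched too. Instead of one averaged colour per border, split the sampled edge row into N segments so the bounced light roughly tracks the image. Again this is a toy: `band_colours` is a hypothetical helper, and the edge row is hand-built test data.

```python
# Split an edge row of (r, g, b) pixels into `bands` segments and
# average each one, so each band of the border gets its own colour.

def band_colours(edge_row, bands=4):
    """Average an edge row of pixels into `bands` colour segments."""
    n = len(edge_row)
    out = []
    for b in range(bands):
        seg = edge_row[b * n // bands:(b + 1) * n // bands]
        out.append(tuple(sum(p[i] for p in seg) // len(seg) for i in range(3)))
    return out

# toy edge row: red on the left half, blue on the right half
top_row = [(255, 0, 0)] * 4 + [(0, 0, 255)] * 4
print(band_colours(top_row, bands=2))  # -> [(255, 0, 0), (0, 0, 255)]
```

With two bands, movement on one side of the screen shows up on that side of the border only, which should read as motion in the periphery rather than a uniform glow.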