Developer kit for the Oculus Rift - the first truly immersive virtual reality headset for video games.

9,522 backers pledged $2,437,429 to help bring this project to life.

Building a Sensor for Low Latency VR

We want to kick off the New Year by sharing a few details on the Oculus sensor’s hardware and software that enable immersive, low-latency head tracking. Parts of this update dig deep into the technical aspects of the sensor, so strap in! For the game designers out there, remember that a thorough and passionate understanding of sensor fusion is not required to build great VR games with the Rift, although it certainly can’t hurt ;-).

Every Millisecond Counts

When it comes to latency, every millisecond counts in making the experience as immersive as possible. There’s little argument that perfect virtual reality requires zero latency, but minimizing latency (especially in the display hardware) continues to be a major challenge.

Michael Abrash at Valve recently shared a fantastic engineering post on the technical challenges around latency for AR and VR:

At Oculus, we’re constantly researching ways to reduce real and perceived latency in the Rift. This means evaluating the latest hardware and developing creative solutions in software. One source of latency we decided to tackle early on was our motion sensor.

The New Oculus Sensor

The original Oculus Rift prototypes used a sensor that was readily available on the market, but ultimately we decided to develop our own sensor hardware to achieve an optimal experience. With the new Oculus sensor, we support sampling rates up to 1000 Hz, which reduces the time between the player’s head movement and the game engine receiving the sensor data to roughly 2 milliseconds.

The increased sampling rate also reduces orientation error by providing a denser dataset to integrate over, keeping the player’s real-world movements more closely in sync with the game.

Sensor Fusion

The Oculus sensor includes a gyroscope, accelerometer, and magnetometer. When the data from these devices is fused, we can determine the orientation of the player’s head in the real world and synchronize the player’s virtual perspective in real-time. The Rift’s orientation is reported as a set of rotations in a right-handed coordinate system, as follows:

The process of combining the sensor data from all three devices into something useful is called “sensor fusion.” For those of you interested in learning more, we recommend this Google tech talk:

The gyroscope, which reports the rate of rotation (angular velocity) around the X, Y, and Z axes in radians/second, provides the most valuable data for head-orientation tracking. By constantly accumulating angular-velocity samples over time, the Oculus SDK can determine the orientation of the Rift relative to where it began.
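To make that accumulation step concrete, here is a minimal dead-reckoning sketch (illustrative only, not the Oculus SDK's actual implementation): each angular-velocity sample is converted into a small rotation and composed into a running unit quaternion.

```cpp
#include <cmath>

// Minimal gyro dead-reckoning sketch; illustrative only, not the
// Oculus SDK's implementation. Orientation is a unit quaternion that
// accumulates one small rotation per angular-velocity sample.
struct Quat { double w, x, y, z; };

// Hamilton product: compose rotation b after rotation a.
Quat mul(const Quat& a, const Quat& b) {
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

// Advance orientation q by one gyro sample (rad/s) over dt seconds.
Quat integrateGyro(Quat q, double wx, double wy, double wz, double dt) {
    double rate = std::sqrt(wx*wx + wy*wy + wz*wz); // total angular speed
    if (rate < 1e-12) return q;                     // no rotation this step
    double angle = rate * dt;                       // rotation this interval
    double s = std::sin(angle / 2.0) / rate;        // axis scale factor
    Quat step = { std::cos(angle / 2.0), wx * s, wy * s, wz * s };
    return mul(q, step);
}
```

At 1000 samples per second, dt is one millisecond, so each per-step rotation stays tiny and the accumulated orientation tracks the head closely; note that any bias in the samples accumulates too, which is exactly the drift problem described below.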

Although the gyroscope provides orientation relative to the starting point, it leaves us with two challenges: it can’t provide the original orientation of the headset and it’s subject to a small amount of drift over time (imagine re-orienting your head back to perfect center but in-game you’re now looking slightly left or right).

These are obviously significant issues for any VR game with a fixed reference point (i.e., a game with a cockpit, where your head’s orientation does not affect the position of whatever car/plane/mech you’re piloting). Fortunately, we can leverage the accelerometer to estimate the “down” vector and the magnetometer to measure the strength and direction of the magnetic field. Combined, these allow for correction of drift in all three axes.
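As a toy illustration of drift correction on a single axis, a complementary filter blends the fast-but-drifting gyro estimate with the slow-but-stable gravity estimate from the accelerometer. This is a deliberately simplified sketch (the function name, axis convention, and blend factor are made up for illustration); the Rift's actual fusion covers all three axes and uses the magnetometer for yaw.

```cpp
#include <cmath>

// Toy one-axis complementary filter; illustrative only. The gyro term
// tracks fast motion, while the accelerometer's gravity estimate
// slowly pulls the pitch estimate back toward truth, cancelling drift.
double fusePitch(double pitch,                     // current estimate (rad)
                 double gyroRate,                  // pitch rate (rad/s)
                 double ax, double ay, double az,  // accelerometer, in g
                 double dt) {
    // Fast path: integrate the gyro for low-latency response.
    double gyroPitch = pitch + gyroRate * dt;
    // Slow path: estimate pitch from the direction of gravity.
    double accelPitch = std::atan2(-ax, std::sqrt(ay*ay + az*az));
    // Blend: mostly gyro (responsive), a little accel (drift-free).
    const double alpha = 0.98;
    return alpha * gyroPitch + (1.0 - alpha) * accelPitch;
}
```

With the headset held still and level, any accumulated gyro error decays away step by step, while during fast motion the gyro term dominates and latency stays low.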

Making Developers' Lives Easy...

If all of this drift correction and sensor fusion business seems like a lot of work, don’t worry! In addition to raw data, the Oculus SDK provides a SensorFusion class that takes care of the details, returning orientation data as rotation matrices, quaternions, or Euler angles. The SDK also includes a complete C++ “Oculus Room” example that demonstrates many different player input schemes integrated with Oculus head tracking.
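To give a sense of what the SensorFusion class saves you from writing, here is a standalone sketch of one such conversion, from a quaternion to yaw/pitch/roll Euler angles. The rotation order and axis conventions below are illustrative, not necessarily the SDK's.

```cpp
#include <cmath>

// Standalone quaternion -> Euler sketch (Z-Y-X rotation order); the
// axis conventions are illustrative, not necessarily the Oculus SDK's.
struct Euler { double yaw, pitch, roll; }; // all in radians

Euler quatToEuler(double w, double x, double y, double z) {
    Euler e;
    // Roll: rotation about the X axis.
    e.roll = std::atan2(2.0 * (w*x + y*z), 1.0 - 2.0 * (x*x + y*y));
    // Pitch: rotation about the Y axis (clamped to asin's domain so
    // floating-point noise near +/-90 degrees cannot produce NaN).
    double s = 2.0 * (w*y - z*x);
    s = std::fmax(-1.0, std::fmin(1.0, s));
    e.pitch = std::asin(s);
    // Yaw: rotation about the Z axis.
    e.yaw = std::atan2(2.0 * (w*z + x*y), 1.0 - 2.0 * (y*y + z*z));
    return e;
}
```

Euler angles are the easiest to reason about, but they suffer from gimbal lock near straight-up/straight-down pitch, which is why quaternions or rotation matrices are usually the better choice inside an engine.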

If you’re using Unreal or Unity, it'll be even easier, as you can enable Oculus head tracking right out of the box. We’re also working on detailed documentation and tutorials to make the integration process into proprietary engines as painless as possible.

So, did all those sensor fusion details get you excited? Good news...

We’re Hiring!

Oculus is looking for the best and brightest engineers to help build the future of virtual reality. If you have what it takes to help develop the next-generation of gaming, let us know!

We’re located in sunny Irvine, California, just a few miles from Disneyland and the beach. You can always find an up-to-date list of available positions with detailed requirements at

Here are a few of the Jedi we're looking for:

Senior Software Engineer

As a senior engineer, you will lead and contribute to a range of Oculus projects, including the Oculus SDK, our game engine integrations (Unreal, Unity), VR gameplay samples, internal and end-user facing tools, 3D authoring tool integrations (Maya, Max, CAD), 3D video playback, firmware, and multi-platform driver development. We’re also researching positional tracking (optical and otherwise), kinematics, and alternative controller input, so there are plenty of challenges to dig into.

Great experience would include multi-threading, multi-platform development, device drivers, 3D graphics, GPU programming, game engines, algorithm development, sensors and filters, computer vision, and a passion for VR.

Senior Mechanical Engineer

As a mechanical engineer at Oculus, your designs will leave a footprint in VR history! You’ll help lead the design and engineering of Oculus products, including the Rift. You’ll also be responsible for building and testing mechanical models and translating industrial design into tooling drawings ready for manufacturing.

An ideal hire for this position has previous experience designing, building, and shipping a successful consumer electronics device (bonus points if it’s video game related!).

Senior Hardware Design and RTL Engineer

As a senior hardware design and RTL engineer at Oculus, you’ll be creating the next-generation of hardware designed specifically for virtual reality. You’ll be developing custom virtual reality focused designs on FPGAs, including display controllers and low-latency stereo camera interfaces. You’ll also work with the engineering team to identify and bring to life practical solutions to virtual reality problems.

We love to see candidates with experience designing, implementing, and synthesizing high speed SerDes blocks for LVDS, TMDS, and similar interfaces using Verilog or VHDL. Hardware hackers wanted! We’re always interested in seeing cool FPGA projects.

You can apply for these positions and many others by visiting or emailing us at

See you in the game!

-- Palmer and the Oculus team


    1. Creator Tomas Pokorny on February 27, 2013

Neck muscles move the head. What about using Thalmic’s muscle activity sensors, like MYO uses for arm motions? I know it would need another device around the neck, but it could be another accessory for a better user experience.

    2. Creator Matt Hargett on January 9, 2013

Does the C++ SDK use pure virtual classes to ease unit-level testing, and avoid the forced-inheritance models of some other SDKs?

I'd really like to be able to use mockitopp to unit test my classes that interact with the SDK instead of having to wrap everything in my own classes/interfaces from the get-go. We *hate* how Unity and Android force you into depending on concrete types that you then have to inject in various hacky ways that complicate/slow down the unit test build.

    3. Creator thegameveda on January 5, 2013

@Palmer: many thanks for your reply; this is what I love about HMDs. In my adventures I have been in touch with the CEO of SMD, the people at Zeiss, and now yourself. This is a field where developers are genuinely interested in customer input. The reason we need to hold a view stable with the head tracker is primarily to aim through scopes in FPS games (e.g. maintaining a bead on a target, be it Doom 3 or Far Cry 3); in the comments to my videos, cursor jitter (technical term) is one of the main concerns. The other use is to navigate Windows icons, tiles, and apps, select switches etc. in cockpits, track a stationary object while strafing at speed around it in an FPS, or maintain a lock on a target in a flight sim when I paint targets in my Arma 2 flight video. Getting games designed with VR in mind is key when you release the dev kits into the wild... that's when we should start seeing (!) the true potential of the Rift (and yes, that was another vain attempt to push the release forward). Many thanks for getting back to me.

    4. Creator thegameveda on January 5, 2013

@Mario Basiola: Hi, the Zeiss emulates the mouse AND also has 3 axes for developers to mess with. The roll axis exists; I have been using it in games that support it, including for tilting the view in Doom 3. Here is the relevant info from the manual. There is no pretending!

RE your comments about HMDs: I have been using HMDs on a daily basis for around 1.5 years, from the HMZ-T series to the SMD ST1080 to the Cinemizer OLED; check out my whole experience here. What I have learned is that comparing resolutions and FOVs (on paper) will get you nowhere. HMDs are complex devices to design and no one has perfected one yet; e.g. it is no use having a 100 ft screen that you can only see the middle third of clearly through mangled optics, which is what you get if you move whilst using the HMZ-T1. How Palmer gets a clear, undistorted FOV across 90 degrees at this res, comfort, optics, ergonomics, portability, compatibility with the widest range of hardware and software, and display choice all come into play. I go into more detail in my now almost 1.5 years' worth of coverage of the state of HMDs and what is wrong or right with them in my experience, including the age-old FOV and res argument. I would like to include more on the Rift, but as of yet it does not exist (another subtle dig to get Palmer to send mine quicker). I own and use all the devices; what I would like people to understand is that actual use is very different from paper stats, and that all the manufacturers could learn from feedback about each other's products, because it is not just the hardware that is evolving; the end user's expectations, as I can see from your comments, are too. If you are expecting VR this year or next, it won't happen, but that does not mean we cannot push Palmer and co closer towards it through some positive feedback... and test a few games along the way. Psyche!

    5. Creator Mario Basiola on January 4, 2013

Also, about the tracking using mouse emulation: granted, it supports far more games, but it's limited to the head's 'pitch' and 'yaw' axes alone. It pretends the 'roll' axis doesn't exist. Proper VR head tracking needs all 3 axes.

    6. Creator Mario Basiola on January 4, 2013

Looking at the hardware specs for the Zeiss Cinemizer OLED, I'm not impressed.

Zeiss Cinemizer:
Resolution: 870x500 (0.43 Mpixel) per eye
FOV: 30 degrees

Oculus Rift:
Resolution: 640x800 (0.51 Mpixel) per eye
FOV: 90 degrees

      Good headtracking is very important for VR. And as described in the latest Oculus update, a fast quality tracker is included in the devkit.

FOV is, however, the number one most important factor for a good immersive effect. The Zeiss FOV of only 30 is far from good enough. A FOV that low is also known as the "looking through toilet paper tubes" effect.

The design of the Zeiss HMD is good, but that isn't important at all. It's what is going on inside that really matters.

In Oculus' shoes, I wouldn't worry too much about Zeiss. Seems they are light-years behind, marketing the same crap Vuzix does.

    7. Creator Oculus on January 4, 2013

      Hi Thibaut Girka,

      There is indeed no distortion in the picture, good catch! We are still working on distortion shaders for our new optics and display, and happened to have it turned off in the picture.


    8. Creator Oculus on January 4, 2013

      Hi thegameveda,

The tracker in the Rift is actually 5x faster than the Cinemizer sensor, so it should be up to your performance standards. Note, however, that being able to hold a view as stable as an unmoving mouse is not a desirable characteristic for a VR device. The only way to achieve that is to heavily filter the data and add a deadzone for movement, which increases the latency and reduces the sensitivity of the tracking. If a particular game needs that ability, they can certainly add it, but we are going to provide the highest performance we possibly can.

      Mouse emulation is not something we plan on doing, mainly because it cannot provide an experience even close to natively integrated head tracking. Mouse emulation is higher latency, drifts, and it cannot provide any roll to the camera. The only way to get a great VR experience is for the game to be designed with VR in mind.


    9. Creator thegameveda on January 4, 2013

Great to have some news. I have been testing a prototype of the Zeiss head tracker here and here. I can perceive no latency when using this device. I have several thousand PC and console games to test at present with the head tracker. All the videos are recordings of actual gameplay using the head tracker alone. I find the ability to hold an on-screen cursor as dead as a mouse left alone on a desk while using the head tracker is vital; check my aiming in FPS, TPS, and sim games using head tracking. I never expected to be able to play games with a head tracker before the Zeiss, which requires zero drivers or calibration. I hope Palmer and co are not lost in the endless upping of tech specs that all us gamers get lost in. The Rift needs to release to have an effect. I cannot wait to test the Rift, but I am worried it will be superseded by competitors; you must find a way to emulate mouse control driver-free like the Zeiss and allow other HMD uses that do not require games or apps to be Rift-enabled. Other HMDs are plug and play and portable too, and here right now. Sometimes the simplest solutions are best. It may be cool in-house to compare tech site scores for latency, but I think for commercial success, perceived latency by Joe Public is more relevant; as a gamer I perceive zero latency using the Zeiss head tracker, and I am able to state that as fact because I have the device on my head right now. We are all rooting for you guys to succeed, so please get those dev kits out soon so we can spread the word and get back to you with real-world use outside tech labs.

    10. Creator Ken Sturgis on January 4, 2013

The addition of a 9DOF IMU is amazing news; I was worried you would need a camera to correct yaw drift. Can I ask which chip you used for the IMU? The only 9DOF chip I know is the Bosch Sensortec BMX055. Is there any worry about magnetic fields being unstable inside buildings, making the magnetometer useless?

    11. Creator Ryan McClelland on January 4, 2013

      Looks like they are filing a patent based on the style of the image illustrating the coordinate system.

    12. Creator John Arne Birkeland on January 4, 2013

Nothing secret about the math behind IMU sensor fusion for leveling. Search for information about Extended Kalman Filters or Direction Cosine Matrices to get some pointers. There are even plenty of open-source solutions used for autopilots in remote-controlled hobby drones, like for example over at

    13. Creator gonggeer on January 4, 2013

Timmmm, I think anything deeper may be a trade secret?

    14. Creator Timmmm on January 4, 2013

I'm not sure I would describe any of that as "deep technical aspects". In fact, it doesn't say at all what makes the Rift sensor different. You have a highly technical audience here. No need to dumb down!

      Sounds like it's going well though, so carry on.

    15. Creator ET3D on January 4, 2013

      Thanks for the fascinating insight into tracking. I read the Michael Abrash post yesterday, so I enjoyed reading your take on the subject.

    16. Creator Martin on January 4, 2013

      Thank you for keeping us up to date!

      What is Nirav working on in the last photo? Could it be a prototype magnetic sine generator for positional tracking in the consumer version? ;)

    17. Creator Thibaut Girka on January 4, 2013

      This is not about the sensor, but... It's hard to say due to the small size of the picture, but I can't see any distortion on the computer screen. Does it mean it doesn't have to be done in software anymore?

    18. Creator WizDish Ltd. on January 4, 2013

      The open and honest way in which you are tackling these difficult problems is very impressive. Well done guys.

    19. Creator Lithon on January 4, 2013

      Thank you! Awesome post! This is how companies should react to other companies!