Wireless, modular motion tracking system for the most natural and intuitive interaction with video games, virtual reality, and more.
This project was successfully funded on October 12, 2013.
SixenseVR SDK Demo Applications
Our recent updates have focused on hardware, but now we want to share more details on the important work our software team has been doing to prepare for the launch of the STEM System. As we first discussed in Update #35, we are creating the SixenseVR SDK to let developers easily incorporate full-body presence and natural, intuitive user interactions into virtual reality and other virtual worlds. As many of you know, achieving this with existing hardware can be very challenging even for experienced developers.
Together, the STEM System and the SixenseVR SDK make it easy for any developer to deliver presence in VR games and applications without writing any code. Below we have included videos and descriptions of a few example VR interactions developed with the SixenseVR SDK. Backers will be given access to the full source for these sample projects, complete with customizable prefabs that you can use in any game.
Shooting Gallery (Advanced Weapon Mechanics)
Our Shooting Gallery, built with the SixenseVR SDK, provides an example of common game interactions with a focus on fully simulated weapon mechanics. The STEM System lets you wield a gun in each hand naturally and use either eye to look down the sights. We have also added interactions for ejecting the magazine, picking up another, and inserting it into the gun, all done naturally with your hands. You can even inspect the magazine to see how many bullets are left before reloading. With the STEM System and the SixenseVR SDK, instead of aiming and firing through scripted, hand-animated actions on a joystick and buttons as in traditional games, you have a simulation so natural that you forget you are playing a game. Even a complex task like reloading can be performed realistically, challenging the player to learn how a weapon operates in order to reload more quickly.

The player can draw guns from holsters on the waist or a shotgun strapped to the back, reaching for each naturally. We are adding more complex two-handed weapon interactions, such as letting the player operate the slide with the off hand. Imagine cocking a pistol or a pump-action shotgun, drawing a bow and arrow, or pulling the pin from a grenade and tossing it. We will provide customizable prefabs for these behaviors, and we are excited to see what developers do with them.
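To make the reload flow concrete, here is a minimal sketch of the state a simulated pistol might track through eject, inspect, and insert. This is plain Python for illustration only; all names are hypothetical and it is not the SixenseVR SDK's actual API:

```python
from enum import Enum, auto

class WeaponState(Enum):
    MAGAZINE_LOADED = auto()
    MAGAZINE_EJECTED = auto()

class Pistol:
    """Toy model of the reload flow: eject, inspect, insert."""

    def __init__(self, rounds=12):
        self.state = WeaponState.MAGAZINE_LOADED
        self.magazine_rounds = rounds

    def eject_magazine(self):
        # In the demo the ejected magazine becomes a separate object the
        # player can catch and inspect to see the remaining round count.
        if self.state is WeaponState.MAGAZINE_LOADED:
            rounds = self.magazine_rounds
            self.magazine_rounds = 0
            self.state = WeaponState.MAGAZINE_EJECTED
            return rounds  # rounds left in the ejected magazine
        return None

    def insert_magazine(self, rounds):
        # Inserting requires physically bringing a magazine to the grip.
        if self.state is WeaponState.MAGAZINE_EJECTED:
            self.magazine_rounds = rounds
            self.state = WeaponState.MAGAZINE_LOADED
            return True
        return False

    def fire(self):
        if self.state is WeaponState.MAGAZINE_LOADED and self.magazine_rounds > 0:
            self.magazine_rounds -= 1
            return True
        return False
```

The point of the explicit state machine is that "reload faster" becomes a matter of the player physically mastering the sequence, not pressing a button.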
Lightsaber (One-to-One Tracking)
For almost every child (and, in our case, many adults), becoming a lightsaber-wielding Jedi is a fantasy. The lightsaber, or any sword, is the true test of a one-to-one motion tracking system in VR: it requires low latency and near-perfect tracking of both position and orientation between your physical and virtual selves as you perform swipes, slashes, blocks, and counters. The STEM System and the SixenseVR SDK make this experience possible, and also make it easy for developers to create these types of applications quickly. The experience is a fantasy, but it feels natural because of the near-perfect hand-eye coordination the SixenseVR SDK provides.
Back Office (Easily Built Common Interactions)
This video demonstrates some of the relatively mundane day-to-day interactions in our virtual “back office” that can easily be re-created in VR with the SixenseVR SDK. Our goal is to make all of our environments fully interactive and completely intuitive to manipulate in VR, so that an untrained user needs no instructions or prompts. You can turn on a ceiling fan by hitting the wall switch with your hand, smash delicate glassware and electronics, pick up a flashlight and point the beam wherever you want, or even get creative and stop the ceiling fan by jamming a keyboard into the spinning blades. We are working to ensure that any object rigged for physics simulation behaves as you would expect when touched or grabbed, and we will provide Unity prefabs and Unreal Blueprints for a wide range of common interactive objects you might need for your game design.
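One common design detail behind objects "behaving as you would expect": while held, a prop typically follows the tracked hand directly, and on release it inherits the hand's velocity so it can be thrown rather than dropping dead. An engine-agnostic sketch of that handoff, in plain Python with hypothetical names (not the SDK's API):

```python
class Hand:
    """Tracked hand: position is updated from the motion controller each frame."""

    def __init__(self):
        self.position = (0.0, 0.0, 0.0)
        self.prev_position = (0.0, 0.0, 0.0)
        self.held = None

    def velocity(self, dt):
        # Finite-difference velocity from the last two tracked positions.
        return tuple((p - q) / dt for p, q in zip(self.position, self.prev_position))

class PhysicsProp:
    def __init__(self):
        self.velocity = (0.0, 0.0, 0.0)
        self.kinematic = False  # True while held: the hand drives the pose directly

def grab(hand, prop):
    # Take the prop out of the physics simulation and attach it to the hand.
    prop.kinematic = True
    hand.held = prop

def release(hand, dt):
    # Return the prop to the simulation with the hand's tracked velocity,
    # so a released object flies naturally.
    prop = hand.held
    prop.kinematic = False
    prop.velocity = hand.velocity(dt)
    hand.held = None
    return prop
```

In Unity or Unreal the same idea maps onto the engine's rigid-body flags and velocity assignment; the sketch only shows the shape of the logic.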
Here are some of the features that we have planned for the SixenseVR SDK:
- Leg, arm, torso, and physical prop tracking. Place STEM Controllers or STEM Packs wherever you like to experiment with interactions that use your entire body, or even other types of controllers.
- User avatar adjustment control panel. When allowed by the game/application designer, users can override the gender and proportions of their avatar to match their own body, for a much more natural sense of body awareness.
- Two-handed interactions with various types of virtual objects.
- Two-way collisions with the environment, allowing other objects to affect the avatar’s body.
- Multi-user support. Simple interface for efficiently synchronizing a user’s body pose over a network, enabling nuanced face-to-face interaction with subtle body language cues across distant physical locations.
- Additional game engines: Unreal Engine 4, Source SDK and CryEngine.
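As a rough illustration of what efficiently synchronizing a body pose over a network can involve, here is a sketch that packs each bone's position plus an orientation quaternion quantized to 16-bit integers into a compact binary message. This is an illustrative example in plain Python, not the SDK's actual wire format:

```python
import struct

# One bone = position (3 floats) + quaternion quantized to four int16s
# in [-1, 1], cutting the quaternion from 16 bytes down to 8.
BONE_FORMAT = "<3f4h"
QUANT = 32767.0

def pack_pose(bones):
    """bones: list of (x, y, z, qx, qy, qz, qw) tuples, one per tracked bone."""
    out = [struct.pack("<B", len(bones))]  # bone count header
    for x, y, z, qx, qy, qz, qw in bones:
        q = [int(round(c * QUANT)) for c in (qx, qy, qz, qw)]
        out.append(struct.pack(BONE_FORMAT, x, y, z, *q))
    return b"".join(out)

def unpack_pose(data):
    (count,) = struct.unpack_from("<B", data, 0)
    bones, offset = [], 1
    for _ in range(count):
        x, y, z, qx, qy, qz, qw = struct.unpack_from(BONE_FORMAT, data, offset)
        bones.append((x, y, z, qx / QUANT, qy / QUANT, qz / QUANT, qw / QUANT))
        offset += struct.calcsize(BONE_FORMAT)
    return bones
```

Keeping each bone to 20 bytes means a full-body pose fits comfortably in a single small packet per frame, which is what makes subtle body language practical to stream between distant players.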
We’re looking forward to seeing what you all can do with it. We'll have more videos soon.