Wireless, modular motion tracking system for the most natural and intuitive interaction with video games, virtual reality, and more.
Created by Sixense

2,383 backers pledged $604,978 to help bring this project to life.

SixenseVR SDK Demo Applications


Our recent updates have focused on hardware, but now we want to give you more details on the important work our software team has been doing to prepare for the launch of the STEM System. As we first discussed in Update #35, we are creating the SixenseVR SDK to enable developers to easily incorporate full-body presence and natural, intuitive user interactions within virtual reality and other virtual worlds. As many of you know, this can be very challenging even for experienced developers working with existing hardware.

Together, the STEM System and the SixenseVR SDK make it easy for any developer to deliver presence in VR games and applications without writing any code. Below are videos and descriptions of a few example VR interactions developed with the SixenseVR SDK. Backers will be given access to the full source for these sample projects, complete with customizable prefabs that you can use in any game. 

Shooting Gallery (Advanced Weapon Mechanics)

Our Shooting Gallery, built with the SixenseVR SDK, demonstrates common game interactions with a focus on fully simulated weapon mechanics. The STEM System lets you wield a gun in each hand naturally and use either eye to look down the sights. We have also added interactions for ejecting the magazine, picking up a fresh one, and inserting it into the gun, all performed naturally with your hands. You can even inspect a magazine to see how many bullets remain before loading it. Instead of the scripted, hand-animated aim-and-fire actions of traditional joystick-and-button games, the STEM System and the SixenseVR SDK give you a simulation so natural that you forget you are playing a game. Even a complex task like reloading can be performed realistically, rewarding players who learn how a weapon actually operates with faster reloads. The player can draw guns from holsters on the waist or a shotgun strapped to the back, reaching for each naturally. We are adding more complex two-handed weapon interactions, such as letting the player operate the slide with the off hand. Imagine cocking a pistol or a pump-action shotgun, using a bow and arrow, or pulling the pin from a grenade and tossing it. We will provide customizable prefabs for these behaviors, and we are excited to see what developers can do with them. 
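To make the reload flow concrete, here is a minimal sketch of that interaction modeled as explicit state rather than a canned animation. The class and method names here are our own illustration, not the SixenseVR SDK API:

```python
# Hypothetical sketch: a gun whose magazine is a physical object the
# player can eject, inspect, and reinsert, as in the Shooting Gallery demo.
class Magazine:
    def __init__(self, rounds):
        self.rounds = rounds  # visible when the player inspects the magazine

class Gun:
    def __init__(self, magazine):
        self.magazine = magazine

    def eject_magazine(self):
        """Player hits the release: the magazine leaves the gun as an object."""
        mag, self.magazine = self.magazine, None
        return mag

    def insert_magazine(self, magazine):
        """Player slides a magazine into the empty well."""
        if self.magazine is None:
            self.magazine = magazine
            return True
        return False

    def fire(self):
        if self.magazine and self.magazine.rounds > 0:
            self.magazine.rounds -= 1
            return True
        return False  # dry click: time to reload
```

In the actual demo each of these steps is driven by tracked hand motion rather than a method call, but the underlying state transitions are the same.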

Lightsaber Demo

For almost every child (and, in our case, many adults), becoming a lightsaber-wielding Jedi is a fantasy. The lightsaber, or sword, is the true test of a one-to-one motion tracking system in VR: it demands low latency and near-perfect tracking of both position and orientation between your physical and virtual self as you perform swipes, slashes, blocks, and counters. The STEM System and the SixenseVR SDK make this experience possible, and also make it easy for developers to create these types of applications quickly. The experience is a fantasy, but it feels natural because of the near-perfect hand-eye coordination that the SixenseVR SDK provides. 

Back Office (Easily Built Common Interactions)

This video demonstrates some of the relatively mundane day-to-day interactions in our virtual “back office” that can easily be re-created in VR with the SixenseVR SDK. Our goal is to make all of our environments fully interactive and completely intuitive to manipulate, so that an untrained user needs no instructions or prompts. You can turn on a ceiling fan by hitting the wall switch with your hand, smash delicate glassware and electronics, pick up a flashlight and point the beam wherever you want, or even get creative and stop the ceiling fan by jamming a keyboard into the spinning blades. We are working to ensure that any object rigged for physics simulation behaves as you would expect when touched or grabbed, and we will provide Unity prefabs and Unreal Blueprints for a wide range of common interactive objects you might need for your game design. 
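The "touch or grab anything" behavior above boils down to a simple per-frame loop: on a grab gesture, attach the nearest free physics object within reach of the tracked hand; while held, the object follows the hand. This is a minimal sketch under our own assumptions (the `Hand`/`PhysicsObject` names and the grab radius are illustrative, not SDK types):

```python
# Hypothetical sketch of proximity-based grabbing for physics-rigged objects.
from dataclasses import dataclass
from typing import Optional

GRAB_RADIUS = 0.15  # metres: assumed reach threshold for a grab

@dataclass
class PhysicsObject:
    name: str
    position: tuple  # (x, y, z) in world space
    held: bool = False

@dataclass
class Hand:
    position: tuple
    holding: Optional[PhysicsObject] = None

def distance(a, b):
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

def try_grab(hand, objects):
    """On a grab gesture: attach the nearest free object within reach."""
    candidates = [o for o in objects if not o.held
                  and distance(hand.position, o.position) <= GRAB_RADIUS]
    if candidates:
        target = min(candidates, key=lambda o: distance(hand.position, o.position))
        target.held = True
        hand.holding = target
    return hand.holding

def update_held(hand):
    """Each frame: a held object simply follows the tracked hand."""
    if hand.holding:
        hand.holding.position = hand.position
```

A real implementation would hand the held object back to the physics engine on release (with the hand's velocity, so thrown objects fly), but the attach/follow core is this simple.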

Here are some of the features that we have planned for the SixenseVR SDK: 

  • Leg, arm, torso, and physical prop tracking. Place STEM Controllers or STEM Packs wherever you like to experiment with new kinds of interaction that use your entire body, or even with other types of controllers. 
  • User avatar adjustment control panel. When allowed by the game/application designer, users can override the gender and proportions of their avatar to match their own body, for a much more natural sense of body awareness. 
  • Two-handed interactions with various types of virtual objects. 
  • Two-way collisions with the environment, allowing other objects to affect the avatar’s body. 
  • Multi-user support. Simple interface for efficiently synchronizing a user’s body pose over a network, enabling nuanced face-to-face interaction with subtle body language cues across distant physical locations. 
  • Additional game engines: Unreal Engine 4, Source SDK and CryEngine. 
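As a rough illustration of the multi-user feature above, synchronizing a body pose efficiently means sending each tracked point as a position plus an orientation quaternion in a compact, fixed-size packet. The joint list and byte layout below are our assumptions for the sketch, not the SDK's actual wire format:

```python
# Hedged sketch: packing a five-point body pose into a fixed-size packet.
import struct

JOINTS = ["head", "left_hand", "right_hand", "hips", "chest"]  # assumed points
POSE_FMT = "<" + "3f4f" * len(JOINTS)  # per joint: xyz position + xyzw quaternion

def pack_pose(pose):
    """pose: {joint: (x, y, z, qx, qy, qz, qw)} -> bytes."""
    values = []
    for joint in JOINTS:
        values.extend(pose[joint])
    return struct.pack(POSE_FMT, *values)

def unpack_pose(data):
    flat = struct.unpack(POSE_FMT, data)
    return {joint: tuple(flat[i * 7:(i + 1) * 7])
            for i, joint in enumerate(JOINTS)}
```

At 7 float32 values per joint, a full five-point pose is only 140 bytes, small enough to send every frame even over a modest connection, which is what makes subtle body-language cues practical across the network.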

We’re looking forward to seeing what you all can do with it. We'll have more videos soon. 


Comments

    1. Creator Peter Tiedemann on August 15, 2014

      @Sean:
      As someone who has backed both STEM and several other tracking solutions, and spent quite a lot of time understanding the details of each, I find it a bit arrogant to so easily brush aside the competing products and patronize their backers. I do believe STEM has the best chance of becoming the least "exotic" of these options (i.e. more mainstream), but the other products do offer things that STEM doesn't (and vice versa).

    2. Creator Daniel Ervik on August 14, 2014

      Thanks for a great update!
      How is the integration with Omni going?

      Ideally I would love to integrate as few SDKs as possible, so if you will control this, like I assume you control the DK2 SDK, it would be an awesome feature.

      I believe that the Oculus, Omni and STEM are a perfect match :)

    3. Creator Robertus on August 14, 2014

      Looks promising

      Any contact already with Virtual Worlds providers like Linden Labs from SecondLife, Open Sim, High Fidelity ?

    4. Creator Sean Concannon on August 14, 2014

      Fantastic, I refused to back all other input devices because I knew Sixense has the best system available. Anyone who argues differently clearly hasn't done their homework.

    5. Creator Richard Fleming on August 14, 2014

      Whenever the 'Control VR' people were asked, they claimed STEM and Control VR are incompatible (and gave reasons which don't appear to make much sense) - I and others who have enquired are interested in simultaneously using both systems - their hand tracking looks awesome, but they only track hands, arms & chest. Have you talked to them (I understand they are the enemy) about this? Also it would be great if there were some integrated way to link these devices to the software they provide input to - are you talking?

    6. Creator Richard Fleming on August 14, 2014

      Also - any word on makeVR?

    7. Creator Richard Fleming on August 14, 2014

      You've previously said there is insufficient bandwidth for more than 5 sensors at a time - I guess there is no chance you are working towards pushing that to more sensors?

    8. Creator Carl Kenner on August 14, 2014

      What's the license on the SDK?
      I need to make open source software with STEM support.

    9. Creator Robert Silva on August 13, 2014

      Great to see real examples and progress. thank you

    10. Creator Paul R. Dillinger on August 13, 2014

      It looks amazing guys! I received my DK2 yesterday and I can't wait to build! Is there any ETA on when things might start shipping? Original estimate was Oct. 2013, but we all know Kickstarter shipping estimates are like weather predictions (rarely correct until they're about to happen). It would be nice to know what quarter you "currently" expect to start shipping things. Thanks!

    11. Creator Phil McCarty on August 13, 2014

      When do you anticipate releasing these sample projects? If I remember correctly, you said they would be backwards compatible with the Razer Hydra, so, in theory, we could start developing projects for them now, right?

    12. Creator Michael Banks on August 13, 2014

      Nice update.... sure would prefer at least one non-violent example... Perhaps mechanised robotic arms or precision actions such as building, sculpting , sports... but hey that's what we're here for... Right :) to develop awesome experiences

    13. Creator SaltyBrains on August 13, 2014

      That's some impressive stuff right there.

    14. Creator Michael Bylehn on August 13, 2014

      I love how you can see in-game where the dock is located so you can just put them down without removing your HMD.