Over the past few months, we've been hard at work adding new capabilities to Structure Sensor & SDK, as well as making it compatible with even more devices. Here's a brief summary of what's new, starting with a foray into mixed reality.
Announcing Bridge Engine for mixed reality
Many of the earliest adopters of Structure Sensor (including Kickstarter backers like you, of course) have gravitated towards 3D scanning as their go-to application.
Advanced mixed and augmented reality have been in the works at Occipital since our Kickstarter campaign. We're thrilled to reveal the next step towards fulfilling the vision for mixed reality that we laid out in our video: Bridge Engine.
Bridge Engine is a powerful tool for creating cutting-edge mixed reality applications with Structure Sensor. To get things rolling, we recently hosted a private beta where a handful of developers got early access to Bridge Engine to see what they could create. After just a few days, the results were amazing. Here's what they created:
We'd like to thank all of the developers who participated in the Bridge Engine Beta.
The team here at Occipital also put together a Bridge Engine demo, which we revealed at this year's CES – complete with an animated rabbit that casts realistic shadows on real objects, and real-world walls that fly away to reveal a virtual world. All with rock-solid tracking.
The new iPad Pros are here – and so are new brackets for Structure Sensor
With a newer, more powerful A9X chip and a 12-megapixel camera on the 9.7" version, the new iPad Pros make the Structure Sensor user experience faster and smoother than ever before. We tested 3D scanning on an iPad Air 2 vs. the new 9.7" iPad Pro. The result? 50% faster mesh updates on the iPad Pro.
To attach Structure Sensor to the new Pros, we've created an innovative new bracket that works with both the 12.9” and 9.7” iPad Pro models. It’s 3D printed from durable nylon on professional machines for maximum precision, and it attaches firmly with strong-yet-removable adhesive pads. A clever slide-in design lets you attach and remove Structure Sensor in seconds, so you can transport your iPad Pro with the sensor detached.
The iPad Pro bracket is available now at the Structure Sensor accessories store. Don't have an iPad Pro? Brackets are available for all recent iPad models: Air and Air 2, mini 2, 3 and 4.
Structure Sensor in the news
Structure Sensor continues to spark the interest of journalists worldwide. Most recently, it was featured in the Wall Street Journal (for 3D scanning), Forbes (for augmented reality) and PC Gamer (with our friends at Uraniom).
With Unbounded Tracker, any Structure SDK developer can integrate the unbounded positional tracking first seen in S.T.A.R. Ops into their own app. As an added bonus, Unbounded Tracker also lets you record motion paths.
StructureUnityUBT takes many of the features of Unbounded Tracker and brings them into Unity, along with four sample scenes to get you started. Check out the video above to see how to use the new Unity plugin.
Scanner Sample 1.5: Because Scanner is a sample app in the Structure SDK, decisions about adding features and functionality to it are grounded in how they expand developers' options when building apps on top of it. The latest improvements give Scanner access to high-resolution color textures, adding photorealism to people and objects captured with it. Download Scanner 1.5 from the App Store now, or build your own app with the latest Structure SDK to access the same high-fidelity scanning.
Skanect 1.8: Similar to Scanner 1.5, Skanect 1.8 offers higher resolution color textures - and not just of new models, but of existing models already captured with older versions of Skanect. To learn more about these updates, see our blog post about the launch of Skanect 1.8 here.
From January 6th to the 10th, a majority of the Occipital team converged on Las Vegas for our first ever booth at the Consumer Electronics Show (CES).
While we've previously released features that make the Structure Sensor an excellent mobile 3D scanner, we decided to show something completely new at CES 2015. We demonstrated how you can now use the Structure Sensor for six degrees of freedom (6DoF) positional tracking for games and more.
Not sure what that means? Think of it this way: virtual reality systems, for instance, track your head movements on three axes: pitch (nodding your head up and down), yaw (rotating your head from left to right) and roll (tilting your head to one side or the other). However, what if you wanted to stand up and walk around, and have your real world movements translated into movements in a virtual world? That ability to translate your position in three-dimensional space, in addition to rotating your head on those three axes, adds up to six degrees of freedom.
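For the mathematically inclined: a 6DoF pose is just a rigid transform, combining a 3-axis rotation with a 3-axis translation. Here's a minimal, hypothetical Python sketch of that idea (this is not Structure SDK code; function names are our own for illustration):

```python
import math

def rotation_yaw(theta):
    """3x3 rotation matrix about the vertical (yaw) axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, 0.0, s],
            [0.0, 1.0, 0.0],
            [-s, 0.0, c]]

def apply_pose(rotation, translation, point):
    """Transform a 3D point by a 6DoF pose: rotate, then translate."""
    return [sum(rotation[i][j] * point[j] for j in range(3)) + translation[i]
            for i in range(3)]

# Turn your head 90 degrees (yaw) while walking 2 meters forward (+z):
pose_rotation = rotation_yaw(math.pi / 2)
pose_translation = [0.0, 0.0, 2.0]
print(apply_pose(pose_rotation, pose_translation, [1.0, 0.0, 0.0]))
```

A rotation-only (3DoF) tracker would use just the first half of that transform; adding the translation is what lets your real steps move you through the virtual space.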
So what was it like to experience this in person? With our booth centered on a gameplay zone, we handed visitors an iPad with a Structure Sensor, which transported them in time and space from the CES booth to a fictional facility 700 years in the future. Once there, visitors found that every movement they made in the real world was mirrored exactly in the virtual world they observed through the iPad. Visitors were even able to interact in advanced ways: pouring fluids and firing a laser blaster.
Unbounded Positional Tracking
While there are devices (such as VR headsets) that now offer rudimentary 6DoF positional tracking, most require the user to stay within a constrained area. The Structure Sensor eliminates those spatial constraints, and allows positional tracking to travel with the user as they move. We call it unbounded positional tracking. Just imagine using this new unbounded positional tracking within a virtual reality headset.
We didn’t develop this brand-new demo only for CES. Soon, you’ll also be able to download this experience - we call it S.T.A.R. Ops - to your own iPad to try with your Structure Sensor. For the Structure SDK developer community, the unbounded positional tracking inside S.T.A.R. Ops will soon be added to the Structure SDK.
The Air 2. The mini 3. The iPhone 6 and 6 Plus. The Structure Sensor will be ready for them all.
It's that time of year again - time for Apple's latest introduction of new iOS devices. A few weeks ago, the world was introduced to the new iPhone 6 and iPhone 6 Plus. And earlier this month, we learned more about the latest iPads, the iPad Air 2 and the iPad mini 3.
As you might have guessed from last year's mid-Kickstarter-campaign launch of new Structure Sensor brackets for the iPad Air and iPad mini with Retina display, we've already been preparing for the new iPads: we've designed a new bracket to fit the iPad Air 2, and we've confirmed that the iPad mini 2 bracket is a perfect fit for the new iPad mini 3. We've also begun exploring the enhanced performance that the upgraded components in the iPad Air 2 will make possible for Structure Sensor.
If you're planning on upgrading to the new iPad Air 2, you'll be able to pre-order a new bracket soon. But there's more to the story than just a new bracket. For the complete details, visit our iPad Air 2 page.
Think you've got what it takes to create a great case to attach a Structure Sensor to a new iPhone? Together with our friends at Shapeways, we're putting up $1,000 in prizes for the best designs from our community. To learn more, visit the contest page - we're looking forward to getting all your entries!
Come hack with us next week in Vancouver!
From November 8th to November 9th, we're co-hosting a 3D Hackathon in Vancouver along with Microsoft's Kinect for Windows Team! Are you in the Pacific Northwest with a great idea for the next depth-powered app? Build it live with our expert help! Registration is open now, and it's limited to the first 100 signups. There are only a few spaces left, so sign up now if you'd like to participate!
Back Nimble Sense
Our friends at Nimble VR (formerly known as 3Gear Systems) launched their own Kickstarter yesterday for the Nimble Sense, a compact depth camera optimized for bringing your hands into virtual reality systems like the Oculus Rift. We've had the opportunity to test it ourselves, and the technology is pretty incredible - low latency, high dexterity and a big step forward in the overall VR experience. Pledges start at just $99, so head over to their campaign now to check it out.
In the backer update last week, we celebrated the one year anniversary of Structure Sensor’s Kickstarter launch, launched Calibrator, and added color support to Skanect and the Scanner sample app. It's been fun seeing all the color scans you've shared. Keep them coming!
This week we've got even more to share with you, starting with the biggest update yet to the Structure SDK.
Available Now: Structure SDK 0.3
Over the last few months, our team has been working extremely hard to deliver the latest version of the Structure SDK. Today we're happy to announce the availability of Structure SDK 0.3, which is packed with a number of features many of you have been waiting for.
In the last backer update, we announced Calibrator. This week, in Structure SDK 0.3, you can now access Calibrator's precision-alignment to do new things with your sensor.
Once calibration is complete, new SDK features that rely on that color alignment are unlocked. The first of those is the inclusion of STColorizer, which can add per-vertex color to an STMesh. This feature is used in the Scanner sample app (available as of today in source form) to include per-vertex coloring. Open the Scanner sample source code and start building your own app with color scanning. Another new feature in the Scanner sample is optional .zip compression of OBJ files to reduce file size.
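Conceptually, per-vertex colorization works by projecting each mesh vertex into the calibrated color image and sampling the pixel it lands on. The sketch below illustrates that idea in Python with a simple pinhole camera model; it is not STColorizer's actual implementation or API, and all names and parameters are our own:

```python
def project_to_pixel(vertex, fx, fy, cx, cy):
    """Project a 3D vertex (camera frame, z > 0) with a pinhole model."""
    x, y, z = vertex
    return int(fx * x / z + cx), int(fy * y / z + cy)

def colorize_vertices(vertices, color_image, fx, fy, cx, cy):
    """Assign each vertex the RGB value of the pixel it projects onto."""
    height, width = len(color_image), len(color_image[0])
    colors = []
    for vertex in vertices:
        u, v = project_to_pixel(vertex, fx, fy, cx, cy)
        if 0 <= u < width and 0 <= v < height:
            colors.append(color_image[v][u])
        else:
            colors.append((0, 0, 0))  # vertex projects outside the image
    return colors
```

This is exactly why Calibrator matters: if the intrinsics and depth-to-color alignment are off, every vertex samples the wrong pixel and colors smear across the mesh.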
Perhaps the most significant new feature in Structure SDK 0.3 is a new color-enhanced mode for tracking. Whereas the only tracking mode previously available relied solely on the Structure Sensor's depth camera, the new color-enhanced tracking mode offers robust tracking in environments where geometry alone may not provide enough data - such as a room with flat walls.
The new tracker was built to enable us to launch a completely new sample app, available for the first time in the Structure SDK: the Room Capture sample app. Room Capture uses Calibrator for precise depth and color alignment, relies on the new color tracker for robust tracking, and offers UV texturing for color.
In many ways, the Room Capture sample app is one of the most sophisticated apps ever created for the iPad (even though it’s just a sample), as it brings a completely new set of possibilities to mobile devices.
With Room Capture, you’ll be able to glimpse into a future (that you get to help create!) where we’ll be able to put a highly accurate model of where we live in our pocket. We’ll never again have to guess how much paint or carpet we’ll need when we’re remodeling. We’ll never again wonder if a new couch will fit in the living room. And we’ll change the way we preserve and revisit our memories of spaces.
Of course, that's still the future, but we're off to a great start. Room Capture will let you start experiencing a number of fun and useful things right now. It lets you capture room-sized, scale-accurate textured 3D environments. And once you capture your first room, you’ll be able to virtually fly through it to view it from a number of different angles. You can even float to the very top to look at the space you’ve captured from a completely new viewpoint.
You can zip and export the model you just captured to open it in a different program (like Meshlab) for further editing and manipulation. And you can finally throw away your tape measure: within the Room Capture sample app, you can touch two points anywhere on the room model you have captured to measure the distance between them.
Of course, what is most exciting about Room Capture is that it is a sample app complete with source code, included in the Structure SDK, so you can create new applications that use Room Capture as a foundation. We know that many of you develop software and apps for the architecture, engineering and construction industries, and that this is exactly what you’ve been waiting for. We can’t wait to see what you’ll build!
It’s been spectacular to see developers building great new apps that extend what you can do with your Structure Sensor. We’d like to highlight two apps to emerge from developers:
Model Cluster - Created by Will Powell (a Beta backer), Model Cluster is a simple-to-use object scanning app that lets you capture, annotate, color and share 3D models using the Structure Sensor. Our favorite feature is that you can choose fun effects that make scans appear glossy or metallic. We’ve tried it ourselves here in the office, and it is a pleasure to use. Thanks for creating a great app, Will! Download Model Cluster from the App Store.
Gestoos - Gestoos lets you use natural gestures interpreted by your Structure Sensor to control things like iTunes, YouTube, your computer’s volume control and even connected devices like Philips Hue lightbulbs! You can learn more about Gestoos at their website here: http://www.gestoos.com/ or you can download it from the Mac App Store. Requires USB Hacker Cable.
Are you working on a new app using the Structure SDK? Let us know!
Pacific Northwest Backers: We're co-sponsoring a depth-powered hackathon with Microsoft.
We're excited to announce that we're joining forces with Microsoft's Kinect for Windows team to co-sponsor the IEEE Vancouver Kinect and Structure Sensor Hackathon. This two-day event will take place from November 8th to November 9th, 2014 in Vancouver at the BCIT School of Computing and Academic Studies. The event is limited to 100 participants, and registration opens on October 15th. Members of Occipital's engineering team will be on hand to provide guidance and answer questions.
We hope that all of our backers in the area will register to create awesome Structure SDK and Kinect SDK powered apps at the Hackathon. Perhaps one of you will be adventurous enough to create a hybrid Kinect/Structure app - maybe even with sample code from the new Room Capture sample app?