LazeeEye? Seriously? The name "LazeeEye" is a portmanteau of "laser" and "eye," indicating that your phone's camera (a single "eye") is being augmented with a second, "laser eye" - thus bestowing depth perception via stereo vision, i.e., letting your smartphone camera see in 3D just like you can!
What is a 3D camera? It is a device that captures the 3D structure of a scene. Most cameras are 2D: their images are a projection of the scene onto the camera's imaging plane, so any depth information is lost. A 3D camera, however, also captures the depth dimension (in addition to the standard 2D data).
What can you do with a 3D picture?
Capture models of objects or people for 3D printing or CAD modeling
Make absolute 3D measurements from the photo - for construction and remodeling, interior design, clothes shopping, etc.
Remove objects or people outside a given depth - eliminate "photo bombers," remove the background scene from photos, replace the background scene
Change the angle or lighting of the photo after the fact
More easily perform a variety of photo editing ("photo-shopping") effects, with the aid of the image depth channel
Implement augmented reality games, or play existing augmented reality games
Much, much more - just search the web to see what people do with 3D sensing, and imagine how these applications could translate to or enhance mobile device apps
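To make the "remove objects outside a given depth" idea concrete, here is a minimal sketch of depth-threshold masking. The function name and the exact API are ours for illustration, not the app's actual interface; the idea is simply that every color pixel has a matching depth value, so a single comparison selects what to keep.

```python
import numpy as np

# Hypothetical helper (not LazeeEye's actual API): black out everything
# farther than a chosen depth, e.g. photo bombers or the background.
def remove_beyond(color_rgb: np.ndarray, depth_m: np.ndarray, max_depth_m: float = 2.0) -> np.ndarray:
    """Return a copy of the color image with far pixels blacked out."""
    out = color_rgb.copy()
    out[depth_m > max_depth_m] = 0   # boolean depth mask selects the far pixels
    return out

# Tiny 1x3 image: subject at 1.5 m, photo bomber at 3.0 m, wall at 5.0 m
color = np.array([[[255, 0, 0], [0, 255, 0], [0, 0, 255]]], dtype=np.uint8)
depth = np.array([[1.5, 3.0, 5.0]])
clean = remove_beyond(color, depth, max_depth_m=2.0)
```

Because the depth image is pixel-aligned with the color image, effects like background replacement are just a different fill on the same mask.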
Additionally, by analogy with stitching 2D panoramas, it is possible to stitch together multiple 3D views of an object for improved resolution and model quality. Here's an example of a multi-view stitched model:
Why should someone get LazeeEye? Because it is simple; because it is cheap. It is an add-on module that can work with any smartphone - all you need is a camera and some processing horsepower (iPhone or Android), and the LazeeEye stereo vision app can form the 3D picture. The module is roughly the size of a pencil (although our existing prototype is a little larger) and, once mass-produced, is projected to cost less than the current DIY reward level. Sure, we all know 3D camera phones are on the horizon (although none are available yet); but will you be able to get one for even under $500, let alone under $100? This is the kind of price point that can truly bring this technology to *everyone* - which, in turn, can drive a virtuous circle of innovations, applications, and business in the emerging world of 3D mobile.
Which version should I get? By popular demand, we've decoded the pledge reward levels into this infographic to help you better understand the options:
The figure above shows the notional components of LazeeEye and how it integrates with existing phone hardware. Also shown is an example real-world scene and its computed depth image. Below is one simple way to visualize depth in a 3D image, using a "wiggle GIF":
The apparent motion is related to the depth of each point or object in the scene - giving a sense of the 3D scene structure to a person viewing the image even on a clunky old 2D computer screen!
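For the curious, here is an illustrative sketch of how a wiggle frame can be synthesized from a depth image (names and constants are ours, not the app's internals): shift each pixel horizontally in inverse proportion to its depth, so nearer points appear to move more.

```python
import numpy as np

# Illustrative sketch: build the second "wiggle" frame by shifting pixels
# horizontally, with nearer pixels (smaller depth) shifting farther.
def parallax_frame(image: np.ndarray, depth_m: np.ndarray, strength_px: float = 6.0) -> np.ndarray:
    h, w = depth_m.shape
    out = np.zeros_like(image)          # holes left by shifted pixels stay black
    for y in range(h):
        for x in range(w):
            shift = int(round(strength_px / max(depth_m[y, x], 0.1)))
            out[y, min(w - 1, x + shift)] = image[y, x]
    return out

# 1x4 intensity strip: the near pixel (1 m) jumps 6 px, far pixels (6 m) jump 1 px
strip = np.array([[10, 20, 30, 40]], dtype=np.uint8)
depth = np.array([[1.0, 6.0, 6.0, 6.0]])
frame = parallax_frame(strip, depth)
```

Alternating the original and the shifted frame in a looping GIF produces exactly the apparent motion described above.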
How does LazeeEye work? The enabling technology behind LazeeEye is active stereo vision, where (by analogy with human stereo vision) one "eye" is your existing smartphone camera and passively receives incoming light, while the other "eye" actively projects light outwards onto the scene, where it bounces back to the passive eye. The projected light is patterned in a way that is known and pre-calibrated in the smartphone; after snapping a photo, the stereo vision software on the phone can cross-reference this image with its pre-calibrated reference image. After finding feature matches between the current and reference image, the algorithm essentially triangulates to compute an estimate of the depth. It performs this operation for each pixel, ultimately yielding a high-resolution depth image that matches pixel-for-pixel with the standard 2D color image (equivalently, this can be considered a colored 3D point cloud). Note that LazeeEye also performs certain temporal modulation "magic" (the details of which we're carefully guarding as a competitive advantage) that boosts the observed signal-to-noise ratio, allowing the projected pattern to appear much brighter against the background.
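The triangulation step can be sketched in a few lines. For a rectified stereo pair, depth follows z = f * b / d, where f is the focal length in pixels, b the camera-to-projector baseline, and d the per-pixel disparity between the captured image and the calibrated reference pattern. The constants below are hypothetical, and LazeeEye's actual calibration and matching are more involved:

```python
import numpy as np

# Hypothetical calibration constants for illustration only
FOCAL_LENGTH_PX = 2800.0   # camera focal length, in pixels
BASELINE_M = 0.04          # camera-to-laser-projector baseline, in meters

def disparity_to_depth(disparity_px: np.ndarray) -> np.ndarray:
    """Triangulate per-pixel depth (meters) from stereo disparity (pixels).

    Depth z = f * b / d; zero disparity corresponds to a point at infinity.
    """
    depth = np.full(disparity_px.shape, np.inf)
    valid = disparity_px > 0
    depth[valid] = FOCAL_LENGTH_PX * BASELINE_M / disparity_px[valid]
    return depth

# A feature matched with 56 px disparity sits at 2800 * 0.04 / 56 = 2.0 m
depths = disparity_to_depth(np.array([56.0, 112.0, 0.0]))
```

Running this over every pixel is what yields the dense depth image that matches the color image pixel-for-pixel.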
The image above shows an early, but functional, prototype of LazeeEye. For both backer rewards and mass manufacture, a smoother look-and-feel with likely smaller components will be used (as per the trend below):
LazeeEye in Popular Tech Press
See what others are saying!
By connecting to existing services and software partners, LazeeEye can have a more immediate and broad impact out-of-the-box.
For example, SketchFab is a popular 3D model sharing site with hundreds of thousands of users; LazeeEye will export directly to SketchFab, allowing you to easily share your 3D pictures and models with anyone using a modern web browser. For an example model, see top of this page: "Do Girls Dream in 3D?"
On the computer vision front, Heuristic Labs will be working closely with the surging start-up vision.ai to accomplish 3D reconstruction (multi-view stitching) and object detection algorithms using the power of cloud computing - of course, all designed to work seamlessly with data coming out of the LazeeEye 3D camera.
We're also exploring partnerships with end-user applications, such as the nifty 3D body scanning system (e.g., for precise clothing measurements) produced by Size Stream.
How is LazeeEye different from existing, similar products? In short, it is smaller and it is cheaper... MUCH cheaper. It uses existing components that are already mass-manufactured (i.e., CHEAP) for the laser illuminator, and - this is the key insight - it utilizes the low-power/light-weight processing and high-resolution camera that already exists in millions of smartphones, so YOU DON'T HAVE TO BUY ANOTHER PHONE! Just upgrade the one you have, for pennies on the dollar. For a partial list of specific comparisons to related technologies, see below.
Project Tango (https://www.google.com/atap/projecttango/) was recently announced by Google. It requires an entirely new phone with a custom processing chip (selling price not yet disclosed, but not likely to be cheaper than existing flagship smartphones at ~$500) and focuses on mapping applications; whereas LazeeEye uses your existing smartphone, costs 10x less, and focuses on 3D photo snapshot and object modeling applications.
The Microsoft Kinect (http://en.wikipedia.org/wiki/Kinect) uses a custom system-on-chip image processor, costs ~$150, is much larger/heavier/power-hungry than a smartphone, must be tethered to a PC, and is focused on skeleton-tracking applications.
The original Microsoft Kinect is built around a PrimeSense sensor, so the previous compare/contrast bullet also applies to PrimeSense's newer, smaller Capri module; also note that PrimeSense was recently acquired by Apple.
2D photos constitute a fundamentally lossy representation of our 3D world, making automated perception (required for advanced photo editing and effects), as well as size measurements, difficult or impossible.
Above, we show a project plan for developing LazeeEye, delivering rewards to backers, and our post-project plans. Of particular note are the June and December milestones, when we'll be shipping prototype units and the developer library API, respectively.
Although the rewards we're offering in this campaign can't reflect the economies-of-scale unit price under mass production, we believe there is still tremendous value in being early supporters, backing our project, receiving a prototype of one of these cool devices, capturing scenes in all their 3D glory, using the device's data in your own software app or artistic projects, generally becoming the envy of your friends and family... and, most importantly, proudly counting yourself amongst those who helped bootstrap the 3D revolution!
Risks and challenges
UPDATE: to remove risk and uncertainty for backers, we promise a full refund at any time (if you've already received your device, simply return it in working order).
The main risk to completing the project is not achieving the accuracy, resolution, or robustness to lighting conditions needed to meet all use cases. Currently, we have built a proof-of-concept prototype in hardware and have implemented the depth estimation routine on a PC; we've tested the algorithm on real sensor data collected from this prototype. Early results are encouraging, instilling confidence that our technique can do roughly as well as more-expensive or less-mobile competitors. In any case, we are certain that we will be able to deliver a functioning prototype to backers - the hardest part has been done already, and now we just need to tweak some parameters to maximize performance.
Note that backers may rest assured that we know what we're doing, that this is not our first rodeo, and that we can deliver: Heuristic Labs comprises decades of aggregate experience in these technologies, members having worked widely as research scientists, engineers, and designers at institutions such as NASA, MIT, Microsoft Research, IBM Research, Maya Design, Carnegie Mellon University, and others. See our full bio (link at upper right) for more details on the Heuristic Labs story.
Absolutely! The app will provide export to several open data formats, such as STL and PLY, that can be imported into your 3D printing application. Note that, depending on the object/scene complexity, an aggregate model stitched from multiple views at different angles will give much better results; see below for multi-view stitching details.
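To illustrate what export to an open format involves, here is a minimal sketch of writing a colored point cloud as ASCII PLY, one of the formats mentioned. The function name and call are ours for illustration, not the app's actual API:

```python
# Hypothetical sketch: export (x, y, z) points with (r, g, b) colors to ASCII PLY,
# a format readable by most 3D printing and modeling tools.
def write_ply(path, points, colors):
    """Write a colored point cloud as an ASCII PLY file."""
    lines = [
        "ply",
        "format ascii 1.0",
        f"element vertex {len(points)}",
        "property float x", "property float y", "property float z",
        "property uchar red", "property uchar green", "property uchar blue",
        "end_header",
    ]
    for (x, y, z), (r, g, b) in zip(points, colors):
        lines.append(f"{x} {y} {z} {r} {g} {b}")
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

# One red point, two meters in front of the camera
write_ply("cloud.ply", [(0.0, 0.0, 2.0)], [(255, 0, 0)])
```

Since the 3D picture is a colored point cloud, export is essentially just serializing that per-pixel data into the header-plus-vertices layout above.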
Absolutely! We are using Class 3R laser modules as a component (equivalent to an off-the-shelf laser pointer), and we introduce no focusing optics (in fact, a de-focusing or beam spread occurs, making the beam even more eye-safe). That means the FDA, which regulates laser products, deems it eye-safe for normal usage - i.e., just don't stare directly into it with a magnifying glass for many hours, and you'll be fine. The same applies across all colors we are offering. More info here: http://en.wikipedia.org/wiki/Laser_safety
Absolutely! ... kind of. The 3D data provided by the basic app is definitely appropriate for injecting into a multi-view stitching solution. These exist already, so we plan to allow export of the data to an open format that can be read by existing options. However, these all run on a desktop computer, so we are also investigating: (1) multi-view stitching as a cloud service accessible directly by your smartphone; and (2) advanced algorithms that can run directly on the smartphone.
[insert term stronger than "absolutely"]! In fact, this was the inspiration for LazeeEye - we had built a drone UAV with a smartphone as its brain, and another - bulky - 3D sensor as its eyes. We realized acutely the redundancy, inefficiency, cost, and lack of portability that existing 3D sensors forced on us in this situation... and LazeeEye was born! Further contemplation led us to the conclusion that - hey - EVERYONE could use this (not just robots).
Certainly. To pledge for multiple rewards (several of the same or different), simply add together the amounts for each individual reward, then increase your pledge to this total. When the project ends, you will be sent a survey allowing you to list the reward levels and quantities for which you pledged.
Not enormously. Each color will have the same output power, but a couple minor factors are worth considering: (1) shorter wavelengths will have tighter beam patterns, meaning higher resolution; (2) indoor lighting tends to be more toward the red end of the spectrum, meaning LazeeEye will have greater max range for shorter wavelengths; and (3) typical smartphone cameras tend to have more sensitivity for shorter wavelengths.
The main difference (on our end) is the cost of the laser module; on your end, green and blue lasers are rarer in consumer use, making you that much cooler if you wield these colors of laser!
Yes, you read our minds! We are currently pursuing investment from all these venues to complement this Kickstarter campaign, which targets a smartphone version for everyday use by everyone. The software stack will have a common interface, so feel free to grab a LazeeEye prototype here for cheap, get a feel for how it will work in your industry, and send us a message letting us know your desired use case so we can better meet your future needs.
The "Support the Team" Special. If you don't need a 3D camera or can't spare the change, this backing level still lets you show your support; in exchange, you'll get a signature 3D photo of the team as a high-tech thank you.
The LazeeEye DIY e-Kit. This kit is perfect for tinkerers, makers, and for early birds who want to get started snapping 3D photos ASAP. Contents include CAD model (ready for 3D printing), parts list, in-depth instructions, calibration tool, and the BlinkSnap 3D photo-snapping app (for Android or iPhone). Just get your parts, print or make your phone's specific bracket, run through the calibration routine, and start snapping in 3D!
The LazeeEye DIY Hardware Kit. Ideal for the DIY-er on a time budget, this kit is identical to the e-Kit, but we'll get all the hardware components together, package them up nicely, and send them to you with a little bow! Included laser is red (wavelength 650 nanometers).
The LazeeEye Standard Edition (second batch). Red laser illuminator (wavelength 650 nanometers), universal phone mount, calibration tool, and the BlinkSnap 3D photo-snapping app (for Android or iPhone).
The LazeeEye Standard Edition (third batch). Red laser illuminator (wavelength 650 nanometers), universal phone mount, calibration tool, and the BlinkSnap 3D photo-snapping app (for Android or iPhone).
The LazeeEye Pro Edition. Same as Standard Edition, but includes a green laser (wavelength 532 nanometers), a green-light camera filter, and in-app toggling of the laser illuminator power via an audio-port attachment.
The LazeeEye Elite Edition. Same as Pro Edition, but includes a blue laser (wavelength 405 nanometers), blue light filter, and demo app for simple editing of 3D photos (change perspective, remove background by thresholding depth, etc.)
The LazeeEye Elite Edition (second batch). Same as Pro Edition, but includes a blue laser (wavelength 405 nanometers), blue light filter, and demo app for simple editing of 3D photos (change perspective, remove background by thresholding depth, etc.)
The LazeeEye Developer Edition. If you'd like to use LazeeEye's 3D capabilities in your own custom app or project, this one is for you. You get the Elite Edition, plus software development kit (3D imaging and editing library, API documentation, code samples, and yes, even source code).
LazeeEye DSLR Edition. For use with DSLR cameras to achieve the highest 3D image quality (HDR mode and flash-sync trigger required). The improvement in 3D image quality over the LazeeEye smartphone editions is akin to the 2D image-quality improvement a DSLR offers over a smartphone. Includes post-processing software for desktop PC (Windows, Mac, or Linux) for image formation and multi-view stitching. LazeeEye is mounted and triggered via the standard hot shoe; an additional, optional C-mount optical filter is provided; blue laser (405 nanometers).
LazeeEye Jedi Edition. Ultra-powerful laser from Wicked Lasers. Includes everything from the Elite Edition (demo apps, audio jack trigger, optical filter). Default is blue (http://www.wickedlasers.com/arctic) at 700mW (140x more powerful than other LazeeEye editions, for roughly 12x greater range and improved performance against bright backgrounds); you can choose any other color/power option from Wicked Lasers and pay the difference (even dual-wielding two lasers is possible). For expert users only (seriously, these are not eye-safe; they require appropriate eyewear and safety precautions, and they are not to be used to image people or animals).
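A quick sanity check on the "roughly 12x greater range" figure: the returned laser power falls off with the square of the range, so usable range scales with the square root of the power ratio. (The ~5 mW baseline below is our assumption for the standard Class 3R module.)

```python
import math

power_ratio = 140                    # 700 mW Jedi laser vs. the ~5 mW standard module
range_gain = math.sqrt(power_ratio)  # inverse-square falloff => range ~ sqrt(power)
# sqrt(140) is about 11.8, i.e. roughly 12x the range
```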
Collaborative "Standing on the Shoulders of Giants" Design Pow-wow. For those who wish to use the LazeeEye or its underlying technology in their own projects/products, the team will spend up to one work week discussing, brainstorming, and helping to design your cool idea.
Custom Sensor Design. If you have a business or product (existing or planned) that requires 3D sensing, then we can help you analyze your application's specific requirements and thereby optimize the design of a custom structured light 3D sensing system for you. Up to one work week, w/ optional extension thereafter.