Funded! This project was successfully funded on July 20, 2012.

Make Laser Radar (LIDAR) an inexpensive gadget for every robot developer to have

This project is to develop an inexpensive lidar from affordable components: a laser pointer, a small single-board computer running Linux, and a webcam. Unlike more expensive lidars that measure the time between emitted and returned light, this lidar will use software to calculate the angles and distances to the reflected spots and output a serial signal with the XYZ coordinates of the reflected points relative to the camera's 0,0,0 position.

These funds will help produce precision-machined parts and buy the components required to make this lidar precise. Even with the foam prototype the error was no more than 2%; for example, at a distance of 50 inches it could be off by about 1 inch.

Currently the prototype works reliably between 1 ft and 16 ft, but running MATLAB on an embedded computer is not the way to go, so I'm writing concise C code to capture images, recognize the reflected spots, and triangulate the distances to them.
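The image-processing part of that C code boils down to finding the bright laser spot in each frame. Below is a minimal sketch of that spot-recognition step, assuming an already-captured 640x480 grayscale frame and a simple brightest-pixel search with a brightness threshold; these choices are illustrative, not the project's actual code (on Linux the frames themselves would typically be grabbed from the webcam through something like V4L2).

    #include <stdio.h>
    #include <stdint.h>

    #define WIDTH  640
    #define HEIGHT 480
    #define MIN_BRIGHTNESS 200   /* assumed threshold for a laser spot */

    /* Find the brightest pixel in a grayscale frame; returns 1 if a spot
     * brighter than the threshold was found and stores its coordinates. */
    static int find_laser_spot(const uint8_t *frame, int *spot_x, int *spot_y)
    {
        int best = -1;
        for (int y = 0; y < HEIGHT; y++) {
            for (int x = 0; x < WIDTH; x++) {
                int v = frame[y * WIDTH + x];
                if (v > best) {
                    best = v;
                    *spot_x = x;
                    *spot_y = y;
                }
            }
        }
        return best >= MIN_BRIGHTNESS;
    }

    int main(void)
    {
        static uint8_t frame[WIDTH * HEIGHT];   /* would come from the webcam */
        frame[100 * WIDTH + 355] = 255;         /* fake a laser spot for the demo */

        int x, y;
        if (find_laser_spot(frame, &x, &y))
            printf("laser spot at pixel (%d, %d)\n", x, y);
        else
            printf("no laser spot found\n");
        return 0;
    }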

FAQ

  • Since childhood we have had great ideas about how to make robots do interesting, amazing things. Those who actually get into robotics see how frustrating it is to use the same old weak hardware and processors, the same unreliable sonic rangefinders, the same toy servos. I want to bring more capabilities to inventors, engineers, and designers who may be starting on a small budget and don't have $1,600 for the most basic lidar on the market.

  • Current lidars have sophisticated lasers and detectors that are very expensive. They measure the time between when the laser is emitted and when its reflection is received; those numbers are in billionths of a second, and knowing the speed of light they can calculate the distance to the object. My lidar will work on a different principle, which is not new: the laser points the same way as the camera. If the laser is reflected by a very near object, the spot appears in the camera's "peripheral vision"; if it is reflected far away, it appears almost at the center of the camera's view. By knowing which pixel is "shining" in the camera, I can calculate (triangulate) the distance to that reflected spot (see the short sketch after this FAQ).

  • Robots with one camera have no depth perception. There are ways to make a robot know where it is located in space so it won't hit anything: complex software for a single camera, two cameras for stereoscopic vision plus some powerful software, or a lidar. Lidar (Light Detection and Ranging) lets a robot see its surroundings in 3D more reliably and simply. The hardware can be as simple as a processor for a toy robot, and the software can also be quite simple. Both of these allow the lidar to be inexpensive and widely available.

  • Both the Kinect and the Neato vacuum cleaner's lidar are products in their own right, built for their own purposes. I want to create a new product made specifically for robot developers, where all you have to do is plug a USB cable into your robot and read XYZ coordinates in character or another form.

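To make the triangulation described in the FAQ concrete, here is a minimal C sketch of the geometry, assuming the laser is mounted parallel to the camera's optical axis at a known baseline: the farther the object, the closer the spot sits to the image center, and the range is roughly the baseline times the focal length (in pixels) divided by the spot's pixel offset from center. The baseline, focal length, and optical-center values below are illustrative assumptions, not the project's actual calibration; the usual pinhole-camera relations then give the XYZ coordinates relative to the camera that the device is meant to output.

    #include <stdio.h>
    #include <math.h>

    #define BASELINE_IN  3.0    /* laser-to-camera offset in inches (assumed) */
    #define FOCAL_PX     700.0  /* focal length in pixels (assumed calibration) */
    #define CENTER_X     320.0  /* optical center of a 640x480 webcam (assumed) */
    #define CENTER_Y     240.0

    /* Range along the optical axis, from the laser spot's image column.
     * A spot near the image edge means a nearby object; a spot near the
     * image center means a distant one. */
    static double range_from_pixel(double spot_x)
    {
        double offset_px = fabs(spot_x - CENTER_X);
        if (offset_px < 1.0)
            return -1.0;        /* too near the center: beyond usable range */
        return BASELINE_IN * FOCAL_PX / offset_px;
    }

    int main(void)
    {
        /* Example spot, 210 px right of center: with these assumed numbers
         * the range works out to 10 inches. */
        double sx = CENTER_X + 210.0, sy = CENTER_Y + 35.0;
        double z = range_from_pixel(sx);

        /* Standard pinhole back-projection to XYZ relative to the camera. */
        double x = z * (sx - CENTER_X) / FOCAL_PX;
        double y = z * (sy - CENTER_Y) / FOCAL_PX;
        printf("point: X=%.1f Y=%.1f Z=%.1f (inches)\n", x, y, z);
        return 0;
    }

With the same assumed numbers, a spot only 14 pixels from the center would correspond to a range of about 150 inches (12.5 ft), which is why the usable range ends where the spot gets too close to the image center.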
76 backers
$7,557 pledged of $2,400 goal
  • Pledge $10 or more

    26 backers

    You'll be included in the backers' credits on the upcoming website, receive periodic updates on the project, and get a thank-you email from me.

  • Pledge $25 or more

    13 backers

    All of the above, plus a Personal Certificate of a Supporter of the Future and a Thank You Letter.

  • Pledge $60 or more

    6 backers

    All of the above, plus a laser kit that was used in the project.

  • Pledge $200 or more

    1 backer

    All of the above, plus a single-board computer that was used in the project.

  • Pledge $300 or more

    20 backers

    All of the above, plus a fully functional LIDAR.

Funding period

- (30 days)