A sleek, multi-channel, wireless headset that monitors your brain activity and translates EEG into meaningful data you can understand.
4,459 backers pledged $1,643,117 to help bring this project to life.

Developer How-To Sneak Peek: Key Bindings

Posted by Tan Le (Creator)

The Emotiv Insight allows you to control a keyboard through bindings and emulation software. In short, users can map detections (such as facial expressions or mental commands) to specific keys or keystrokes. The software packaged with the headset provides a simple way to set this up, and multiple command recognitions can be active at once.
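
For developers curious what such a mapping boils down to, here is a minimal sketch of a detection-to-keystroke table together with the sort of key emulation that could sit behind it. Everything in it is an illustrative assumption: the detection names, the chosen keys, and the use of the Win32 SendInput call are not taken from the Emotiv software, whose internals are not described in this update.

    #include <windows.h>

    #include <map>
    #include <string>

    // Illustrative bindings: detection name -> Windows virtual-key code.
    // (Names and keys are made-up examples, not Emotiv's configuration format.)
    static const std::map<std::string, WORD> kBindings = {
        {"smile", 'A'},        // facial expression -> letter key
        {"push",  VK_SPACE},   // mental command    -> spacebar
        {"blink", VK_RETURN},  // facial expression -> Enter
    };

    // Emit one press-and-release of the key bound to a detection, if any.
    static void SendBoundKey(const std::string& detection) {
        auto it = kBindings.find(detection);
        if (it == kBindings.end()) return;

        INPUT in[2] = {};
        in[0].type = INPUT_KEYBOARD;
        in[0].ki.wVk = it->second;           // key down
        in[1] = in[0];
        in[1].ki.dwFlags = KEYEVENTF_KEYUP;  // key up
        SendInput(2, in, sizeof(INPUT));
    }

    int main() {
        // In a real program this would be driven by the headset's detection
        // events; here we simply simulate one detection firing.
        SendBoundKey("push");
        return 0;
    }

Because several recognitions can be active at once, a single dispatch table like this can hold facial expressions and mental commands side by side.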

The SDK also features an option that lets the user specify which application the keystrokes are sent to. This opens up a whole new way to interface and interact with devices. We can see many ways in which this can be used: from assisted typing to controller interfaces, the possibilities in the world of BCI are numerous. Want to talk to an Arduino using just your mind? Simply connect it to your computer, set up a virtual serial port, and use the Emotiv Insight’s key binding software to send characters directly to the Arduino! It’s that simple!
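
To make the Arduino example concrete: once the bound characters reach a serial terminal (or any program writing to the virtual serial port), the Arduino only has to read what arrives. Below is a minimal sketch of that receiving side; the characters 'p' and 'n', the 9600 baud rate, and the LED behaviour are assumptions chosen for illustration, not anything defined by the Emotiv key binding software.

    // Arduino side: react to characters arriving over the (virtual) serial port.
    const int LED_PIN = 13;

    void setup() {
      Serial.begin(9600);          // must match the baud rate of the sending side
      pinMode(LED_PIN, OUTPUT);
    }

    void loop() {
      if (Serial.available() > 0) {
        char c = Serial.read();
        if (c == 'p') {            // e.g. a "push" mental command bound to 'p'
          digitalWrite(LED_PIN, HIGH);
        } else if (c == 'n') {     // e.g. a "neutral" state bound to 'n'
          digitalWrite(LED_PIN, LOW);
        }
      }
    }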

We’d love to hear your thoughts on how you’ll be able to use the Emotiv Insight!


Comments

    1. Chenglong Yu

      I'd like to know how the Key Bindings process is going. Will it be free to all users?

    2. Murray Macdonald

      Not all things are applications with keyboard inputs, for example a Windows Service. Ideally there would be an option to configure the driver to send Windows event messages. I have redrawn this dialog to include the required parameters for this (Handle, Message, WParam, LParam), but I cannot upload it here. Do you have an email address I should send it to? Also, how do I use this interface to send the sequence CTRL-T, ALT-Y, F8? I think you may need a lightweight parser syntax to support such sequences. A delay capability would also be handy in some situations, a la CTRL-T, ALT-Y, DELAY(100ms), F8. (See the sketch after the comments for how these window-message parameters map onto the Win32 API.)

    3. Ronnie

      @Bojan P - I never had the pleasure of seeing or using the Oculus Rift, and had actually never even heard of it until I got on Kickstarter. It would be great to be able to move the mouse with the Insight, but my question was about binding keystrokes. If I read correctly, we would need the research or dev edition to actually send the keystrokes we bind to an application.

      Quote: "The SDK also features an option that allows the user to specify which application the keystrokes are sent to."

      So, as for the $199 and $229 pledges... we can bind all we want, but won't be able to send it to an application... That doesn't make sense to me.

    4. Gtb

      First of all, I am a mechanical designer working with 3D CAD software. I would be thrilled to be able to control the 3D CAD virtual environment with the Insight. Of course this means controlling the 3D spacemouse (multiple movements at one time) as well as moving the mouse and controlling the clicks with the headset. (Is that the reason why you chose the inertial sensor as a stretch goal?) In my opinion, this would create a massive revolution in the 3D design world, since there are so many people who, like myself in the past, suffer from RSI complaints.
      @Tan Le:
      Am I correct in assuming that the keystrokes currently only go to the active application?
      I agree with Ronald McGinnis Jr that everybody should be able to decide where the keystrokes go. Could you elaborate on this issue, as well as on moving the mouse with the headset?
      Good luck getting to the first stretch goal!

    5. Bojan Potočnik

      @Ronald McGinnis Jr - if the Insight has a gyroscope and accelerometer implemented, we will be able to move the mouse with our heads (like the Oculus Rift, which has an MPU-9150 9-axis motion sensor for detecting movement: the magnetometer compensates for gyroscope drift and the gyroscope compensates for magnetometer error). And then you can bind your "thoughts" to clicks. That would actually be really nice to try out; theoretically you wouldn't need to use a mouse any more.

    6. Guy Sheffer

      Is 20 ms a good response time for someone who has been trained?

    7. Oleg Magrisso (Superbacker)

      This is amazing! I'm so looking forward to getting my hands on one of these! The applications and possibilities seem endless! As well as making keystrokes with your thoughts, do you think it would be possible to expand this even further to words, as in writing text messages? (I'm just really curious to know how far this can be pushed!)

    8. Ronnie

      I think users who get the $199 and $229 pledges should be able to decide where the keystrokes are sent...

      I might be confused, but if I can type a word, what's the purpose of doing so if I can't select the program it goes to?

    9. Ronnie

      What would be the purpose of sending mouse clicks if we can't move the mouse with the headset?

    10. Ronnie

      I was hoping for a video preview, but this is awesome!!! I plan on using my bindings to send common words I usually type, or to perform an action on my computer or run a script...

    11. Tan Le (Creator)

      @DK82- Thanks for sharing your story. We've seen many applications of this technology to help improve the lives of people with disabilities. It's an incredible feeling to be able to touch their lives in this small way and to witness the joy that it brings to them.

      @Gary- More wine indeed! LOL

      @Douglas- We can't wait to make Emotiv Insight with all the added features! We've got a ways to go to meet the stretch goal though. Please share and help promote! Together we can make this happen! Thanks for the support and enthusiasm! Your energy keeps us going!

    12. DK82

      I have been wanting to get my hands on one to develop a robotic prosthetic. I can't imagine what it would be like to lose a limb, but I know people who have, or who were born without some. I would love to help them regain something so precious that was lost, or that they never had.

      My brother has lost much of his mental capacity due to falls from seizures, and I hope this may be a way to give him a new lease on life. This is a godsend, and I pray that it will bring new hope for many.

      The applications are endless: new, quicker interfaces with technology; a cheaper research tool for neuroscience and psychology; possibly a new way to communicate with younger children and disabled individuals; possibly even something that will allow us to understand animals' thought processes better. Keep those synapses firing; who knows in what ways we can change the world with this technology.

    13. Gary Chappell

      I'm imagining (pun intended) some opportunities to utilize Emotiv for a new breed of security interfaces, where brain activity replaces keystrokes for authentication. Beyond just thinking causing characters to be sent (which still might be intercepted by keyloggers), perhaps there are other modalities which would be harder to intercept and/or translate. Needs more cogitation, or more wine... ;-)

    14. Douglas Livingstone

      You are tantalizing us with all these new features; bring it on!
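
A note on Murray Macdonald's suggestion in comment 2: the four parameters he lists (handle, message, WParam, LParam) correspond directly to the arguments of the Win32 PostMessage call. The sketch below only illustrates that correspondence; the window title and the application-defined message number are placeholders, it is not part of the Emotiv software, and a plain Windows Service would still need a window or message loop of its own to receive such messages.

    #include <windows.h>

    int main() {
        // Locate the target window by its title (a placeholder title here;
        // a configuration dialog could let the user supply the handle directly).
        HWND target = FindWindowA(nullptr, "My Target Window");
        if (target == nullptr) return 1;

        // The four values the proposed dialog would expose:
        UINT   message = WM_APP + 1;  // an application-defined message number
        WPARAM wParam  = 42;          // example payload
        LPARAM lParam  = 0;

        // Post the message asynchronously to the target window's queue.
        PostMessageA(target, message, wParam, lParam);
        return 0;
    }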