EyeNav

EyeNav is an eye-control interface for people with ALS.
EyeNav uses a live video stream from a camera mounted on a glasses frame and pointed directly at the user's eye. The video is processed with a computer vision algorithm that translates eye movements into keystrokes (up/down/left/right).

EyeNav was inspired by the eyeWriter project and uses the same low-cost hardware (a webcam on plastic glasses).
What makes EyeNav different is that instead of direct, mouse-like cursor manipulation, which can be tiresome, navigation is based on discrete, keyboard-like gestures. A rough sketch of the idea follows.
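The sketch below is not EyeNav's actual code, just a minimal illustration of the pipeline described above: find the dark pupil blob in each frame with OpenCV, and when its centroid drifts far enough from a calibrated rest position, fire a discrete up/down/left/right gesture. The camera index, threshold, and dead-zone values are illustrative assumptions.

```python
import cv2

CAMERA_INDEX = 0        # assumed index of the glasses-mounted webcam
PUPIL_THRESHOLD = 40    # grayscale cutoff for the dark pupil (assumed)
DEAD_ZONE = 15          # pixels the pupil must move before a gesture fires

def pupil_center(gray):
    """Return (x, y) of the largest dark blob, or None if nothing is found."""
    blurred = cv2.GaussianBlur(gray, (7, 7), 0)
    _, mask = cv2.threshold(blurred, PUPIL_THRESHOLD, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def gesture(center, rest):
    """Map the pupil's offset from its rest position to a discrete direction."""
    dx, dy = center[0] - rest[0], center[1] - rest[1]
    if abs(dx) < DEAD_ZONE and abs(dy) < DEAD_ZONE:
        return None
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

def main():
    cap = cv2.VideoCapture(CAMERA_INDEX)
    rest = None  # calibrated "looking straight ahead" position
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        center = pupil_center(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
        if center is None:
            continue
        if rest is None:
            rest = center  # first detection doubles as a crude calibration
            continue
        direction = gesture(center, rest)
        if direction:
            print(direction)  # a real system would inject an arrow keystroke here
    cap.release()

if __name__ == "__main__":
    main()
```

In the real interface the printed direction would be injected as an actual keystroke, so the eye drives navigation the way arrow keys do rather than steering a cursor.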

The project was conceived, born, and took its first steps in 12 hours, during a hackathon for people with disabilities.

project page
