The five members of eyeCU

Team members:

  •  Nick Bertrand
  •  Arielle Blum
  •  Mike Mozingo
  •  Armeen Taeb
  •  Khashi Xiong

The goal of our project is to design and implement a low-cost human-computer interface that allows its user to control a computer cursor with eye movements. This technology has many applications; however, the focus of this system is to enable individuals with limited mobility to interact easily with technology. The system design employs a head-mounted unit with a video camera to capture the position and motion of the user's gaze. The device processes the images collected by the camera in real time to generate the corresponding cursor movement, which is transmitted wirelessly to the computer.
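One common first step in such a pipeline is locating the pupil in each grayscale frame, for instance as the centroid of the darkest pixels. The sketch below illustrates that idea only; the actual image-processing method used by the system is not specified here, and the function name and threshold are illustrative.

```python
def pupil_center(frame, threshold=50):
    """Return the (row, col) centroid of pixels darker than `threshold`,
    or None if no pixel qualifies. `frame` is a 2-D list of gray values."""
    rows, cols, count = 0, 0, 0
    for r, line in enumerate(frame):
        for c, value in enumerate(line):
            if value < threshold:
                rows += r
                cols += c
                count += 1
    if count == 0:
        return None  # no dark region found in this frame
    return (rows / count, cols / count)

# Tiny synthetic 5x5 frame: a dark "pupil" cross at the center.
frame = [
    [200, 200, 200, 200, 200],
    [200, 200,  10, 200, 200],
    [200,  10,  10,  10, 200],
    [200, 200,  10, 200, 200],
    [200, 200, 200, 200, 200],
]
print(pupil_center(frame))  # -> (2.0, 2.0)
```

A real implementation would run this (or a more robust detector, such as a Hough circle fit) on each camera frame before computing cursor motion.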

We have two levels of goals for eye tracking. Level one uses 'eye gestures' to control cursor movement: when the eye looks left, for example, the cursor moves left and stops when the eye returns to center. Our high-level goal is a more intuitive set of commands in which the cursor follows the position of the user's gaze on the computer display. This mode will require additional image processing to determine where the eye is looking and to produce the corresponding cursor motion on the display. If time permits, we will develop software so that the eye tracker can interface with common computer applications.
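The level-one gesture mode described above can be sketched as a simple mapping from the pupil's offset relative to its calibrated resting position to a cursor velocity, with a dead zone that stops the cursor while the eye is near center. The function name, dead-zone size, and speed below are illustrative assumptions, not taken from the actual system.

```python
def gesture_velocity(offset_x, offset_y, dead_zone=5, speed=3):
    """Map a pupil offset (pixels from the calibrated center) to a
    (dx, dy) cursor velocity. Inside the dead zone the cursor stops."""
    dx = 0 if abs(offset_x) <= dead_zone else (speed if offset_x > 0 else -speed)
    dy = 0 if abs(offset_y) <= dead_zone else (speed if offset_y > 0 else -speed)
    return dx, dy

print(gesture_velocity(12, 0))  # eye looks right -> (3, 0)
print(gesture_velocity(0, 0))   # eye at center  -> (0, 0), cursor stops
```

The gaze-following mode would instead map the estimated gaze point directly to an absolute cursor position, which is why it needs the additional image processing mentioned above.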