July 24, 2009

Hands-free vision-based interface for computer accessibility

"Hands-free vision-based interface forcomputer accessibility",
Javier Varona, Cristina Manresa-Yee, Francisco J. Perales, JNCA08'

To bring disabled users to new technologies, the paper presents a hands-free vision-based interface. The system is initialized by tracking facial features over a few frames to build the user model. No special lighting or static background is required, but the head should face the camera (any rotation must be avoided) during initialization. The nose and eye regions are selected for tracking based on their color distributions; asymmetry among the nose feature points can reduce tracking precision. The eyes and eyebrows are likewise located by color, so wearing glasses can cause errors under certain lighting conditions. A weighting function is applied according to the distance between each pixel and the eye center. Tracking is then performed with the mean-shift algorithm, and a linear regression over recent positions smooths the estimated trajectory. For facial-gesture recognition, winks are taken into consideration: if the (vertical) iris contours are detected in the image, the eye is considered open; otherwise it is considered closed.
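The tracking-and-smoothing step above can be sketched in a few lines. This is a minimal NumPy illustration, not the authors' implementation: the probability map stands in for a color back-projection of the tracked feature, and the window size, iteration count, and smoothing horizon are illustrative assumptions.

```python
import numpy as np

def mean_shift(prob_map, window, n_iters=10, eps=0.5):
    """Shift a search window toward the mode of a probability map.

    prob_map -- 2-D array, e.g. a color back-projection of the nose model
    window   -- (row, col, height, width) of the current search window
    """
    r, c, h, w = window
    for _ in range(n_iters):
        patch = prob_map[r:r + h, c:c + w]
        total = patch.sum()
        if total == 0:          # window fell on an empty region; stop
            break
        rows, cols = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
        # displacement of the weighted centroid from the window center
        dr = (rows * patch).sum() / total - (patch.shape[0] - 1) / 2
        dc = (cols * patch).sum() / total - (patch.shape[1] - 1) / 2
        r = max(0, min(int(round(r + dr)), prob_map.shape[0] - h))
        c = max(0, min(int(round(c + dc)), prob_map.shape[1] - w))
        if abs(dr) < eps and abs(dc) < eps:
            break
    return r, c, h, w

def smooth_position(history, k=5):
    """Smooth the newest position by a linear fit over the last k samples."""
    pts = np.asarray(history[-k:], dtype=float)
    t = np.arange(len(pts))
    return tuple(np.polyval(np.polyfit(t, pts[:, i], 1), t[-1])
                 for i in range(pts.shape[1]))
```

The regression step acts as a simple predictor-smoother: jitter in individual mean-shift estimates is averaged out by the line fit, while steady head motion is preserved.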

There are two possible forms of mouse replacement for hands-free computer accessibility. One directly maps the nose position onto the screen. The other uses relative head motion, which has a predictable tendency and is not as sensitive to the tracking accuracy.
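The contrast between the two mappings can be made concrete with a short sketch. The screen and camera resolutions and the gain factor below are illustrative assumptions, not values from the paper.

```python
# Hypothetical resolutions, chosen only for illustration.
SCREEN_W, SCREEN_H = 1920, 1080   # display size in pixels
FRAME_W, FRAME_H = 640, 480       # camera frame size in pixels

def absolute_mapping(nose_x, nose_y):
    """Direct mapping: each nose position corresponds to one screen point,
    so any tracking jitter is magnified by the frame-to-screen scale."""
    return (nose_x * SCREEN_W / FRAME_W, nose_y * SCREEN_H / FRAME_H)

def relative_mapping(cursor, nose, prev_nose, gain=4.0):
    """Joystick-style mapping: head motion moves the cursor incrementally,
    so a small tracking error produces a small cursor drift, not a jump."""
    dx = (nose[0] - prev_nose[0]) * gain
    dy = (nose[1] - prev_nose[1]) * gain
    x = min(max(cursor[0] + dx, 0), SCREEN_W - 1)
    y = min(max(cursor[1] + dy, 0), SCREEN_H - 1)
    return (x, y)
```

With absolute mapping a one-pixel tracking error moves the cursor three pixels at these resolutions; with relative mapping the same error only nudges the cursor by the gain, which is why the relative mode is less sensitive to tracking accuracy.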

More head and facial gestures are planned to improve the system.