HandSight: Supporting Everyday Activities through Touch-Vision

We are building and evaluating a new system called HandSight, which aims to support activities of daily living (ADLs) for people with severe visual impairments by sensing and feeding back non-tactile information about the physical world as it is touched. HandSight consists of tiny cameras and micro-haptic actuators worn on one or more fingers, computer vision algorithms to support inference and recognition, and a smartwatch for processing, power, and speech output. We have two high-level goals: first, to develop the basic building blocks of an extensible HandSight platform that can support a range of ADL applications; and second, to explore and demonstrate the potential of HandSight through three proof-of-concept applications: reading, dressing, and technology access. In the first year of funded work, we have focused largely on the first goal, including designing and iterating on physical form factors, developing computer vision algorithms to extract attributes of the physical world and to support on-body interaction for mobile technology accessibility, and experimenting with haptic feedback options to guide the user's finger and hand.
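
To make the sensing-and-feedback architecture concrete, the minimal Python sketch below illustrates one way such a pipeline could be structured: frames from a finger-mounted camera pass through a computer vision stage, and the result drives both haptic and speech feedback. This is a hypothetical illustration only; every class and method name is an assumed stand-in, not HandSight's actual implementation.

"""Illustrative sketch of a sense-recognize-feedback loop.
All components are hypothetical stand-ins for the hardware and
algorithms described above."""

import time
from dataclasses import dataclass


@dataclass
class Frame:
    """A single image from the finger-mounted camera (stubbed as raw bytes)."""
    timestamp: float
    pixels: bytes


class FingerCamera:
    """Stand-in for a tiny camera worn on the finger."""
    def capture(self) -> Frame:
        return Frame(timestamp=time.time(), pixels=b"\x00" * 64)


class Recognizer:
    """Stand-in for the computer vision stage (e.g., text or texture recognition)."""
    def classify(self, frame: Frame) -> str:
        # A real system would run OCR or texture classification here.
        return "plain-surface"


class HapticActuator:
    """Stand-in for a micro-haptic actuator used to guide the finger."""
    def pulse(self, intensity: float) -> None:
        print(f"[haptic] pulse at intensity {intensity:.2f}")


class SpeechOutput:
    """Stand-in for smartwatch text-to-speech output."""
    def say(self, message: str) -> None:
        print(f"[speech] {message}")


def run_pipeline(steps: int = 3) -> None:
    """Capture, recognize, and feed back results for a few iterations."""
    camera, recognizer = FingerCamera(), Recognizer()
    haptics, speech = HapticActuator(), SpeechOutput()
    for _ in range(steps):
        frame = camera.capture()
        label = recognizer.classify(frame)
        haptics.pulse(0.5)  # e.g., confirm that a surface was sensed
        speech.say(f"Detected: {label}")
        time.sleep(0.1)


if __name__ == "__main__":
    run_pipeline()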

Duration: August 2014 - September 2018
Funder: U.S. Department of Defense (Other)
Total Award Amount: $992,821

Principal Investigator: Jon Edward Froehlich

Additional Investigator: Ramalingam Chellappa