Project Description

2013–Present
HandSight augments the sense of touch to help people with visual impairments more easily access the physical and digital information they encounter in their daily lives. The project is still at an early stage, but the envisioned system will consist of tiny CMOS cameras and micro-haptic actuators mounted on one or more fingers, computer vision and machine learning algorithms to support fingertip-based sensing, and a smartwatch for processing, power, and speech output. Potential use cases include reading or exploring the layout of a newspaper article or other printed document, identifying colors and visual textures when getting dressed in the morning, and even performing taps or gestures on the palm or other surfaces to control a mobile phone.

Publications

Applying Transfer Learning to Recognize Clothing Patterns Using a Finger-Mounted Camera

Lee Stearns, Leah Findlater, Jon E. Froehlich

Design of an Augmented Reality Magnification Aid for Low Vision Users

Leah Findlater, Jon E. Froehlich, Lee Stearns

TouchCam: Realtime Recognition of Location-Specific On-Body Gestures to Support Users with Visual Impairments

Lee Stearns, Uran Oh, Leah Findlater, Jon E. Froehlich

Recognizing Clothing Colors and Visual Textures Using a Finger-Mounted Camera: An Initial Investigation

Chuan Chen, Leah Findlater, Jon E. Froehlich, Alexander Medeiros, Lee Stearns

Investigating Microinteractions for People With Visual Impairments and the Potential Role of On-Body Interaction

Uran Oh, Lee Stearns, Alisha Pradhan, Jon E. Froehlich, Leah Findlater

Acceptance Rate: 26.2% (33 / 126)

Evaluating Wrist-Based Haptic Feedback for Non-Visual Target Finding and Path Tracing on a 2D Surface

Jonggi Hong, Alisha Pradhan, Jon E. Froehlich, Leah Findlater

Augmented Reality Magnification for Low Vision Users with the Microsoft Hololens and a Finger-Worn Camera

Victor De Souza, Leah Findlater, Jon E. Froehlich, Lee Stearns, Jessica Yin

Localization of Skin Features on the Hand and Wrist From Small Image Patches

Lee Stearns, Uran Oh, Bridget Cheng, Leah Findlater, David Ross, Rama Chellappa, Jon E. Froehlich

Evaluating Haptic and Auditory Directional Guidance to Assist Blind People in Reading Printed Text Using Finger-Mounted Cameras

Lee Stearns, Ruofei Du, Uran Oh, Catherine Jou, Leah Findlater, David Ross, Jon E. Froehlich

Evaluating Angular Accuracy of Wrist-based Haptic Directional Guidance for Hand Movement

Jonggi Hong, Lee Stearns, Tony Cheng, Jon E. Froehlich, David Ross, Leah Findlater

Supporting Everyday Activities for Persons with Visual Impairments Through Computer Vision-Augmented Touch

Leah Findlater, Lee Stearns, Ruofei Du, Uran Oh, David Ross, Rama Chellappa, Jon E. Froehlich

The Design and Preliminary Evaluation of a Finger-Mounted Camera and Feedback System to Enable Reading of Printed Text for the Blind

Lee Stearns, Ruofei Du, Uran Oh, Yumeng Wang, Rama Chellappa, Leah Findlater, Jon E. Froehlich

Talks

HandSight: A Touch-Based Wearable System to Increase Information Accessibility for People with Visual Impairments

Aug. 1, 2018 | PhD Defense, Computer Science

University of Maryland, College Park

Making with a Social Purpose

April 6, 2017 | Lecture Series at the Laboratory for Telecommunication Sciences

LTS Auditorium, College Park, MD

Interactive Computational Tools for Accessibility

Nov. 7, 2016 | Diversity in Computing Summit 2016

College Park, Maryland
