Design and Preliminary Evaluation of an Augmented Reality Interface Control System for a Robotic Arm
Abstract
Despite advances in the capabilities of robotic limbs, their clinical use by patients with motor disabilities remains limited by inadequate user control. Our Johns Hopkins University Applied Physics Laboratory (APL) team and collaborators designed an augmented reality (AR) control interface that uses noninvasive eye-tracking technology to accept multiple levels of user input to a robotic limb, enhancing user control. Our system enables either direct control of 3-D endpoint position, gripper orientation, and gripper aperture, or supervisory control of several common tasks, leveraging computer vision and intelligent route-planning algorithms. This system enables automation of several high-frequency movements (e.g., grabbing an object) that are typically time-consuming and require high degrees of precision. Supervisory control can increase movement accuracy and robustness while decreasing the demands on user input. We conducted a pilot study in which three subjects with Duchenne muscular dystrophy completed a pick-and-place motor task with the AR interface using both the traditional direct and the newer supervisory control strategies. The pilot study demonstrated the effectiveness of AR interfaces and the utility of supervisory control in reducing completion time and cognitive burden for certain necessary, repeatable prosthetic control tasks. Future goals include generalizing the supervisory control modes to a wider variety of objects and activities of daily living and integrating the capability into wearable headsets with mixed reality capabilities.