Immersive Learning: Teaching American Sign Language in Virtual Reality

Researcher(s)

  • Benjamin Le, Computer Science, University of Delaware

Faculty Mentor(s)

  • Leila Barmaki, Computer & Information Sciences, University of Delaware

Abstract

Being deaf or hard of hearing can result in exclusion through communication barriers, leading to feelings of alienation. Sign language allows people who are deaf or hard of hearing to learn, socialize, and work like anyone else, bridging gaps on an individual level and fostering inclusivity. Our study combines immersive Virtual Reality (VR) environments with Artificial Intelligence (AI) to enhance the learning experience and retention of American Sign Language (ASL). We are also testing how distance affects engagement by creating close and far scenes for each condition. In a pilot study (N=8), participants in and outside the lab were asked to sign phrases either to an AI avatar in a recreated conference room or in a room with a TV displaying instructions. Results favored both the close and far conference-room scenes with the AI avatar: the close scene showed higher speed and accuracy on test scores, and the far scene showed longer engagement times. Our aim is to provide an engaging and responsive way to learn ASL by leveraging the potential of VR and AI to create a safe, enjoyable environment that encourages learning and retention.