According to the World Health Organization (2023), an estimated 2.2 billion people live with a visual impairment. These individuals face unique challenges in interacting and communicating with others: non-verbal cues, such as facial expressions, can be difficult to perceive, which can lead to misunderstandings in social situations.
To support this community, students from the Universidad Nacional de San Agustín de Arequipa (National University of San Agustín of Arequipa) collaborated with the Centro de Rehabilitación para Ciegos Adultos (Rehabilitation Center for Blind Adults) to create a device that identifies human emotions and relays that information to the user. The assistance system combines three components: a processing unit, a portable wristband, and a mobile application.
The system works as follows: the processing unit, equipped with a camera and an AI-based detection model (a convolutional neural network, or CNN), captures real-time images of the user's surroundings and searches them for human faces. When a face is detected, the system checks whether it matches a face registered in the user's app. If it does not, the image is discarded. If it does, the CNN analyzes the facial expression, and the portable wristband delivers haptic feedback to the user corresponding to the detected emotion. The entire system is designed to run on a long-lasting rechargeable battery, making it safe and efficient.
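For readers curious about how such a detection-and-feedback loop fits together, the following is a minimal Python sketch of the pipeline described above, not the team's actual code. It uses OpenCV's stock Haar-cascade face detector for illustration; the face-registration check, the CNN emotion classifier, the wristband interface, and the vibration patterns are all hypothetical stand-ins, since the article does not detail those components.

```python
import cv2

# Hypothetical emotion-to-vibration encoding (durations in ms);
# the source does not specify the actual patterns.
HAPTIC_PATTERNS = {
    "happy":     [200],            # one short pulse
    "sad":       [600],            # one long pulse
    "angry":     [100, 100, 100],  # three quick pulses
    "surprised": [100, 400],       # short pulse, then long
}


def is_registered(face_img, registered_faces) -> bool:
    """Placeholder for the app's registration check: the real system
    would compare the face against the user's registered contacts."""
    return False  # stub


def classify_emotion(face_img) -> str:
    """Placeholder for CNN inference over the cropped face."""
    return "happy"  # stub


def send_haptic_pattern(pattern) -> None:
    """Placeholder for the wristband's vibration interface."""
    print(f"vibrate: {pattern}")  # stub


def run_pipeline(registered_faces) -> None:
    """Capture frames, find faces, and relay registered faces' emotions."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    camera = cv2.VideoCapture(0)

    while True:
        ok, frame = camera.read()
        if not ok:
            continue

        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
            face = frame[y:y + h, x:x + w]

            # Unregistered faces are discarded, as described above.
            if not is_registered(face, registered_faces):
                continue

            # The CNN classifies the expression; the wristband relays it.
            emotion = classify_emotion(face)
            if emotion in HAPTIC_PATTERNS:
                send_haptic_pattern(HAPTIC_PATTERNS[emotion])
```

One practical consequence of this ordering is that the heavier CNN inference only runs on faces the user has chosen to register, which keeps the per-frame workload low.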
Ultimately, this system aims to help create a more equitable society, one that offers visually impaired individuals the opportunity to forge meaningful connections and develop a sense of belonging.
This project was made possible by $4,160 in funding from the Jon C. Taenzer Memorial Fund.