Navigating the world with a visual impairment poses unique challenges, as everyday environments are often not designed for full accessibility. Traditional mobility aids, while helpful, offer no dynamic sensing or real-time feedback about the surrounding environment.
To address this gap, a team from the SRM Institute of Science and Technology student branch, in collaboration with India Grama Munnetra Iyakkam, developed “Echo-Vision” – an optical device that uses deep learning and computer vision to analyze surroundings and alert users to obstacles.
The device integrates three main technologies: LiDAR sensors, haptic feedback, and GPS navigation. The LiDAR sensors scan the environment to create a detailed map of nearby obstacles. This data is translated into haptic feedback, delivered through vibrations in the glasses that vary based on proximity. The GPS navigation system provides real-time directions through text-to-speech.
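The distance-to-vibration mapping described above can be sketched in a few lines. The following is a minimal illustration, not the Echo-Vision firmware: the function names, the 4-meter range, and the linear intensity ramp are all assumptions chosen for clarity.

```python
# Hypothetical sketch: map LiDAR distance readings to haptic vibration
# intensity. Closer obstacles produce stronger vibrations. Thresholds,
# names, and the linear ramp are illustrative assumptions only.

def vibration_intensity(distance_m: float, max_range_m: float = 4.0) -> float:
    """Return a vibration strength in [0.0, 1.0]; closer means stronger."""
    if distance_m <= 0:
        return 1.0   # obstacle at the sensor: full strength
    if distance_m >= max_range_m:
        return 0.0   # beyond sensing range: no feedback
    return 1.0 - distance_m / max_range_m  # linear ramp in between

def frame_to_motors(nearest_by_direction: dict[str, float]) -> dict[str, float]:
    """Reduce one scan frame (nearest obstacle per direction) to per-motor
    vibration strengths, e.g. one motor each for left, front, and right."""
    return {d: vibration_intensity(dist) for d, dist in nearest_by_direction.items()}
```

For example, `frame_to_motors({"left": 1.0, "front": 0.5, "right": 5.0})` would drive the front motor hardest and leave the right motor off, since the right-side obstacle lies beyond the assumed 4 m range.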
This solution aims to empower visually impaired individuals to navigate the world with greater confidence and independence, ultimately enhancing their quality of life.
This project was made possible by $910.00 in funding from EPICS in IEEE.
