Giving a Voice to the Voiceless

LOCATION: Bangalore, India

PROJECT LAUNCHED: 2012

PROJECT LEADS: IEEE Student Members at RNS Institute of Technology

EPICS IN IEEE FUNDING: $1,705 USD


PROJECT OVERVIEW & UPDATES

People with speech and hearing impairments have used sign language for hundreds of years. Signing, which uses combinations of hand movements and body language to convey meaning, helps people communicate in almost 150 different nations and cultures.

But one huge barrier has long stood in the way: most people who can hear and speak do not understand sign language. As a result, people with speech and hearing impairments are often unable to handle routine activities independently, whether conducting simple transactions at the bank or post office, asking for directions in an unfamiliar place or describing physical symptoms to a physician.

In 2012, three IEEE Student Members at RNS Institute of Technology in Bangalore, India, set out to change that. With an initial grant of $1,305 from EPICS in IEEE, the trio proposed designing an inexpensive portable device that would translate basic sign language into text and audio in real time, bridging the communication gap between people unable to speak or hear and those who can. Educators at RV Integrated School for the Hearing Impaired, also in Bangalore, were enthusiastic about participating in the project.

To document basic hand gestures used in sign language, the team next enlisted 10 high school science students, four of them girls, from Sri Aurobindo Memorial School. Working together, the university and high school students documented the hand gestures of the hearing-impaired students as they signed words and phrases. To do this, they assembled a system comprising a computer, two monochromatic cameras, three infrared LEDs and a gaming motion controller. Software from Leap Motion enabled the controller to detect and track hand and finger movements. Then, after compensating for background objects, such as a person's head, and for ambient lighting, the software analyzed the images to construct a 3-D representation of what the controller saw.
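The team's capture code isn't published, but the recording step can be illustrated with a short Python sketch. Everything here is hypothetical: the `frame` structure stands in for the 3-D palm and fingertip positions a motion controller's SDK might report, and `read_frames()` stands in for its polling loop.

```python
import json
import numpy as np

def normalize(frame):
    """Convert raw 3-D landmarks into a pose vector that is
    invariant to where the hand sits above the controller."""
    palm = np.array(frame["palm"])         # (x, y, z) of the palm center
    tips = np.array(frame["fingertips"])   # 5 x 3 fingertip positions
    rel = tips - palm                      # positions relative to the palm
    scale = np.linalg.norm(rel, axis=1).max() or 1.0
    return (rel / scale).flatten()         # 15-element pose vector

def record_template(label, frames):
    """Average several captured frames of one sign into a template."""
    poses = np.stack([normalize(f) for f in frames])
    return {"label": label, "pose": poses.mean(axis=0).tolist()}

# Example: document the sign for "water" from a short capture session.
# `read_frames()` is a placeholder for the SDK's frame-polling loop.
# templates = [record_template("water", read_frames(seconds=2))]
# json.dump(templates, open("signs.json", "w"))
```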

Finally, a tracking layer in the software sent the motion information to a controller, which identified the sign. The corresponding word was then displayed on an LCD screen and played aloud as audio.
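The matching-and-output step might be sketched as follows. This again is an illustration, not the team's code: it reuses the hypothetical `normalize()` helper from the sketch above and substitutes the open-source `pyttsx3` text-to-speech library for the prototype's speaker output.

```python
import json
import numpy as np
import pyttsx3  # stand-in for the device's audio playback

def classify(frame, templates, threshold=0.4):
    """Match a live pose vector to the nearest recorded sign template;
    `threshold` is an arbitrary rejection distance for this sketch."""
    pose = normalize(frame)  # normalize() as defined in the capture sketch
    best = min(templates,
               key=lambda t: np.linalg.norm(pose - np.array(t["pose"])))
    dist = np.linalg.norm(pose - np.array(best["pose"]))
    return best["label"] if dist < threshold else None

templates = json.load(open("signs.json"))
engine = pyttsx3.init()

# On the prototype the text would go to the LCD; here we just print it.
# for frame in read_frames():          # placeholder polling loop
#     word = classify(frame, templates)
#     if word:
#         print(word)                  # LCD display step
#         engine.say(word)             # audio playback step
#         engine.runAndWait()
```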

New IEEE Student Members have taken over the project, as the original team members have all graduated. The current team requested and received an additional $400 to complete the project, and the final prototype will be a simple handheld device affordable to almost anyone.

The original team envisioned that the device would instill new dignity and confidence in people with speech and hearing impairments. The ability to communicate in real time is also expected to open up many new opportunities for employment. And that will be music, as well as words, to everyone's ears.


PROJECT PARTNERS

  • RV Integrated School for the Hearing Impaired
  • Sri Aurobindo Memorial School
  • RNS Institute of Technology
