BU professors seek ASL dictionary

Communication sciences and disorders majors struggling to learn American Sign Language may soon get a helping hand from two Boston University researchers.

A new video-based ASL dictionary will try to solve one of the classic problems of translation: how to identify a word if there is no way to know the English equivalent.

The remedy is simple enough in French or Spanish, where students can run to a dictionary and sound out the pronunciation.

However, when ASL students see a teacher make a sign they do not know, their options are slim.

BU professors Carol Neidle and Stan Sclaroff head the project that may change all that, with the help of a $90,000 grant from the National Science Foundation, to solve the trick of translating the gesticulated into the articulated.

Neidle and her associates are in the process of creating a computer program that would use a digital video camera to recognize signs.

Students will also be able to select a sign from a library of video clips to find a definition.

While the program will not help face-to-face interaction, it will allow someone to look up a sign they don’t understand.

Junior TV/video major Chelsea Phillips, who learned ASL in her hometown of Atlanta, said the way the signs were performed was different at Emerson. The regional differences made it even more difficult to look up a sign if she missed the meaning.

“Sign language is all about motion and emotion,” she said. “You can’t get that in a book.”

However, it may take longer than the grant’s allotted three years to perfect the technology. Home and classroom implementation may be still further off.

“Speech technology research has been going on for a long time,” Neidle said. “ASL technology research is something we are just beginning.”

Since ASL has no written form, there is room for misinterpretation. Facial expressions and body language are often used to convey a point, making a two-dimensional representation inaccurate.

Sarah Wolford, a communication sciences and disorders major, said there are a few options for students who need to identify a sign, but added that she’s had little difficulty in the past.

“It wasn’t a big problem,” the sophomore said. “It would make it easier for people to learn the signs on their own.”

Nancy Vincent, the American Sign Language Coordinator for the Department of Communication Sciences and Disorders, who is also deaf, said through a telephone interpreter that regular dictionaries available to students are not always reliable, and the best resource is usually other people.

“They can look up when they don’t know the sign,” Vincent said. “However, many times students check with the teacher to make sure if it is correct or not.”

The project’s goal for the new system is not only the classroom setting. The creators said they intend the technology to become integrated into the deaf and hearing communities.

“It has real practical applications for deaf people,” Neidle said. “When deaf kids are born into a hearing family and the kid goes off and learns a sign, the parents can look it up.”