


Ross Brooks on September 10, 2013.

Researchers in Japan are developing a system that recognizes sign language and automatically converts it into Japanese characters, requiring only a commercial motion sensor such as Microsoft's Kinect.
Mizuho Information & Research Institute Inc and Chiba University aim to improve communications between hearing-impaired people and normal listeners, with plans for a prototype to be available in October 2013, and a full-fledged version in 2014.
The system uses four steps to achieve its goal:
  1. Senses the movements of the signer's arms (wrists, elbows, etc.)
  2. Compares the movements with motion data for each word
  3. Automatically estimates the meaning of the movements
  4. Displays Japanese characters on a monitor in real time
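The matching steps above can be sketched in code. This is a minimal, hypothetical illustration (not the actual Mizuho/Chiba system): it assumes each word's motion data is a short sequence of 2-D wrist positions and matches an observed movement against the templates with dynamic time warping, which tolerates signers moving faster or slower than the stored reference.

```python
import math

# Hypothetical per-word motion templates: each word maps to a short
# sequence of (wrist_x, wrist_y) positions. Real Kinect data would
# track many joints over many frames; this is an illustrative toy model.
TEMPLATES = {
    "hello": [(0.0, 1.0), (0.2, 1.2), (0.4, 1.0)],
    "thanks": [(0.0, 0.5), (0.0, 0.8), (0.0, 1.1)],
}

def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two joint trajectories.

    Allows the observed movement to be compressed or stretched in time
    relative to the stored template."""
    n, m = len(seq_a), len(seq_b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(seq_a[i - 1], seq_b[j - 1])
            # Best of: skip a template frame, skip an observed frame,
            # or advance both.
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

def recognize(observed):
    """Steps 2-3: compare the observed movement with each word's motion
    data and return the closest-matching word."""
    return min(TEMPLATES, key=lambda w: dtw_distance(observed, TEMPLATES[w]))

# Step 4 would then display the matched word's Japanese characters
# on a monitor.
print(recognize([(0.0, 1.0), (0.1, 1.1), (0.2, 1.2), (0.4, 1.0)]))  # → hello
```

In practice the per-word motion data would be learned from recordings of native signers, and a probabilistic model rather than nearest-template matching would handle variation between signers.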
Mizuho Information & Research Institute will be responsible for the application of the system, while Chiba University will offer a technique to recognize sign languages and prepare motion data for each word.
If the system reaches fruition, it could provide an ingenious new way for hearing-impaired people and normal listeners to interact, without having to spend months, or possibly years, learning sign language. This is especially true for any kind of web-based communication, where relying on microphones is impractical.
Read more at PSFK



Stephanie Feyne | Authenticity: The Impact of a Sign Language Interpreter’s Choices

August 27, 2013

Stephanie presented Authenticity: The Impact of a Sign Language Interpreter's Choices at StreetLeverage – Live 2013 | Atlanta, GA. Her talk explored how the choices made by sign language interpreters affect the perception of Deaf people, and how interpreters can present a more "authentic" representation of someone's message.
You can find the PPT deck for her presentation here.


(The examples in this article are of female interpreters and male Deaf individuals in order to accommodate the gendered demands of English pronouns. This may or may not reflect the actual identities of the people involved.)
In this presentation I will be discussing the concept of “authenticity” during interpretation – what it means and why I use this term.
We interpreters know we are responsible for the transmission of the content of speakers’ messages. An additional responsibility is to express the manner in which one person speaks, which allows the other participant to get a glimpse of who the person is.
Last month a Deaf teacher was presenting in front of a group of hearing children. I was interpreting for him. He told them to copy his notes from the board. I interpreted that in the first person, “copy what I wrote…”
A first grade girl spun her head towards me in disbelief. “You didn’t write anything!” she exclaimed. I agreed with her, that I hadn’t, but I then explained that our job as interpreters is to say what the Deaf person said. She thought about this for a second and replied, “Oh, you’re pretending to be him.”
That struck me as a profound statement. And, of course, she was absolutely correct! That’s exactly what we interpreters do – we take on the identity of the Deaf person as we represent their message so that the hearing person knows who they are.

We speak not "FOR" the Deaf party but "AS" the Deaf party. Our utterances are expressed in the first person: "I don't understand my homework," "I want to work for your company," "My daughter is sick."
Read the full article at Street Leverage 




MLK’s “I Have a Dream” Speech, in ASL

Richard Bailey (GRS’13) (right) with John Thornton, a CAS professor of history and African American Studies program director of graduate studies, at Commencement weekend 2013. Photo by Christopher Robinson
By Leslie Friday

Earlier this year, the Boston Landmarks Orchestra was searching for an American Sign Language interpreter to translate Martin Luther King, Jr.’s “I Have a Dream” speech. The orchestra was planning a concert commemorating the speech’s 50th anniversary and approached Christopher Robinson, a staff interpreter at BU’s Disability Services, about the job. But Robinson had a better idea: why not place native ASL users on stage and base their interpretations on an official ASL translation of the speech?
The BLO liked the idea—the only problem was that no official translation existed. Robinson had a solution for that as well: he suggested Richard Bailey, for whom he regularly interpreted African American studies courses.
Bailey (GRS’13), a native ASL user who is biracial, was studying the writings and speeches of Martin Luther King, Jr. (GRS’55, Hon.’59) as part of his master’s level research on identity and representation. He was willing to create an official translation, but there was a catch: it would have to be recorded, and Bailey was camera-shy.

Read the full article at BU Today




5 Easy Career Enhancers for Sign Language Interpreters

July 10, 2013
What makes up a successful career as a sign language interpreter? Logically, it depends on who is asked. Regardless of what is ultimately determined to be the magic ingredients, the interpreters who are most successful and satisfied in their work are those who consistently seek out opportunities to grow as professionals.
While this growth may seem possible only over time, and time is certainly part of it, I believe there are steps one can take to establish a foundation for success.
Below you will find 5 simple steps that will add an important level of polish to your career.

 1.  A Pro bono Injection.

Commit to accepting pro bono assignments. Notice I didn’t say volunteer? This commitment consciously moves us past the concerns for payment and terms and reconnects us with the fundamental reason we signed up to do this work—supporting people.
There is tremendous satisfaction in knowing your work as a sign language interpreter has made a difference. Pro bono work will rewarm the goo inside, which will do wonders for your perspective on the work and your role in it.
Pro bono grants perspective.

Read the full article at Street Leverage 



Hawaiian researchers confirm distinctive island sign language

March 3, 2013

Researchers from the University of Hawaii announced the confirmation of a unique sign language, distinct from American Sign Language (ASL), on March 1. The team will formally unveil their findings at the 3rd International Conference on Language Documentation and Conservation on March 3.

"Hawaiian, the indigenous language of this state, has been brought back from the brink of extinction… But what we didn't know until very recently is that Hawaii is home to a second highly endangered language that is found nowhere else in the world."
– William O'Grady, linguistics professor at the University of Hawaii
Researchers interviewed about half of the 40 people thought to communicate in Hawaii Sign Language, most of whom are in their 80s. Linguists videotaped 19 elderly deaf people and two adult children of deaf parents using the sign language.



10 Reasons Why "That Deaf Guy" Web Comic Is Awesome

That Deaf Guy is a web comic written by deaf cartoonist Matt Daigle and his wife Kay. Here are just ten reasons why the comic is awesome and should be read by deaf and hearing people alike.

1. It de-mysticizes sign language.


2. It gives shoutouts to other awesome deaf people, like rap artist Sean Forbes.


3. It contains practical information for everyday life.




Designing a City for the Deaf

Most cities aren’t designed for deaf people. Sidewalks are frequently too narrow or too crowded for deaf persons engaged in a conversation that requires so-called “signing space.” Public benches are often set in rows or squares, limiting the ability of the deaf to create the “conversation circles” and open sight lines that they require. Urban landscapes are so visually stimulating that they hinder communication among people who rely on visual cues. And light fixtures may be too dim or shine directly into signers’ eyes.

These things don’t just make a deaf person’s life more challenging; they can make it dangerous. In January, three deaf people were struck by a vehicle and seriously injured in Olathe, Kansas, as they left a deaf cultural event. The same thing happened to a deaf man last year in Sacramento.
In 2009, Deaf411, a public relations firm serving the deaf community, released a report on Deaf-Friendly Cities in the U.S., saluting places like Washington, D.C., Chicago, Seattle, Raleigh, and Denver for their efforts to accommodate the deaf or hard of hearing. But for every city on the list, countless others—including San Francisco, St. Louis, Atlanta, and Philadelphia—did not make the cut.
Now Gallaudet University in Washington, D.C., the nation’s leading institution for the deaf and hard of hearing, has produced a set of so-called DeafSpace Guidelines that address those aspects of the urban environment that inhibit communication and mobility among those who communicate with their hands. In doing so, architects and design researchers have used technology to gather information on how deaf people use public spaces and modify them to meet their needs. Campus officials say that the guidelines have already begun a dialogue that they hope will have an impact on urban development nationwide.
“The clarity with which a deaf person communicates relates to the clarity and clutter of what’s around them,” says Hansel Bauman, director of campus design and planning at Gallaudet, who led the multiyear effort to create the DeafSpace Guidelines. “Space becomes an essential part of how you communicate.”



Sign language that African Americans use is different from that of whites

Carolyn McCaskill remembers exactly when she discovered that she couldn’t understand white people. It was 1968, she was 15 years old, and she and nine other deaf black students had just enrolled in an integrated school for the deaf in Talladega, Ala.
When the teacher got up to address the class, McCaskill was lost.
“I was dumbfounded,” McCaskill recalls through an interpreter. “I was like, ‘What in the world is going on?’ ”
The teacher’s quicksilver hand movements looked little like the sign language McCaskill had grown up using at home with her two deaf siblings and had practiced at the Alabama School for the Negro Deaf and Blind, just a few miles away. It wasn’t a simple matter of people at the new school using unfamiliar vocabulary; they made hand movements for everyday words that looked foreign to McCaskill and her fellow black students.
So, McCaskill says, “I put my signs aside.” She learned entirely new signs for such common nouns as “shoe” and “school.” She began to communicate words such as “why” and “don’t know” with one hand instead of two as she and her black friends had always done. She copied the white students who lowered their hands to make the signs for “what for” and “know” closer to their chins than to their foreheads. And she imitated the way white students mouthed words at the same time as they made manual signs for them.