That Deaf Guy AKA Ken Bolingbroke

The following was written by a Yahoo! engineer to help his co-workers understand his deafness. He wrote it to answer the many questions he gets and to make meetings more productive. It was originally published on the Yahoo! Accessibility Lab’s web site.


The Background

I was diagnosed with a hearing loss when I was four years old. I’d had the hearing loss all along; they just didn’t figure it out until then. I had no hearing at all in my right ear, and a progressive loss in my left ear that gradually got worse over the years until I could no longer hear at all by the time I was 17. While I still had some hearing, I was able to hear better with a hearing aid that amplified sound, but once my hearing loss became too severe, hearing aids could no longer help me.

When I was 18, I got an early, experimental cochlear implant that enabled me to hear again, not quite as well as I could hear several years earlier with a hearing aid, but enough that I could understand speech with the aid of lipreading. This device never received FDA approval, and I stopped using it in 1994, then had it removed in 2000.

From 1994 to 2009, I had no usable hearing at all. Strictly speaking, I could still hear in my left ear, but only very loud noises. VERY loud noises, like, say, a running jet engine... if I were close enough. For example, a few years ago, there was a fire drill at Hewlett Packard. The alarms were loud enough that many people were covering their ears with their hands. I still could not hear them at all.

In September 2009, I got a new cochlear implant in my left ear. In February 2010, I got another one in my right ear, but this one was particularly complicated because of the two prior surgeries on that ear, so it doesn’t work nearly as well as the one in my left ear. Additionally, this last surgery messed up my sense of balance and left me with severe dizziness that lasted for several weeks and, at this writing, still hasn’t entirely abated. I can get especially dizzy when standing up after sitting for an extended period. So if you see me staggering awkwardly down the hall, I’m not necessarily drunk.

Incidentally, the ability to locate sounds requires two ears. Up until I was 17, I could hear only in my left ear, and later, only in my right ear. So until February, I had never been able to hear in both ears, and hence, I was never able to locate the direction of sounds. Now that I can actually hear in both ears, in theory I should be able to do that, but so far, that ability hasn’t manifested.

Why am I deaf?

When doctors finally figured out that I had a hearing loss, they could only speculate as to why. I was the eighth of ten children, and I’m the only one with a hearing loss. So they assumed that it was a result of circumstances around the time of my birth — I was born not breathing, I got sick, I was given antibiotics, etc. — and doctors speculated that one of these could have caused my hearing loss. However, several years ago, one of my brothers had a deaf son. And then a deaf daughter. And after another daughter with normal hearing, a second deaf daughter came along. At some point along the way, the first two were given genetic tests and tested positive for a syndrome that typically causes a progressive hearing loss, worse in one ear than the other. I only learned this last year. (This article was originally written several years ago.)

My deaf nephew and nieces have the same kind of cochlear implant I got. My youngest deaf niece got her implant just a few weeks ago.

So what can I hear now?

cochlear implant

Oddly, the implant I received in 2009 is far more advanced than the one I got back in 1988. With this new implant, I’ve been hearing better than I’ve ever heard in my life, hearing things I’ve never heard before. The technology is absolutely amazing.

However! I am still severely hearing impaired. Despite hearing things I’ve never heard before, it’s still not normal hearing.

First, this is still relatively new to me. When I first turned on my left ear in September 2009, everything was just random noise, with voices sounding no different than a car engine. I had to learn how to hear all over again, and I’m still learning; over time, I may be able to understand speech better than I can now.

Second, there are fundamental limits. I have 16 electrodes implanted in my left cochlea, and only 15 of them work. In my right cochlea, there are only 12 functioning electrodes. The entire range of possible sound frequencies has to be split across these 12 or 15 electrodes. They do some innovative programming to make adjacent electrodes work together to create an extra “virtual” electrode between them, increasing the overall range possible. But it’s still a fairly limited range.
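
To give a rough sense of what that splitting means, here is a small illustrative Python sketch. It is not the implant’s actual signal processing; the 200 Hz to 8 kHz range, the logarithmic spacing, and the simple geometric-midpoint “virtual” channels are all assumptions for illustration.

import math

def electrode_centers(n_electrodes, low_hz=200.0, high_hz=8000.0):
    # Spread center frequencies logarithmically across an assumed audible band.
    step = (math.log(high_hz) - math.log(low_hz)) / (n_electrodes - 1)
    return [math.exp(math.log(low_hz) + i * step) for i in range(n_electrodes)]

def with_virtual_channels(centers):
    # Add a "virtual" channel at the geometric midpoint of each adjacent pair.
    out = []
    for a, b in zip(centers, centers[1:]):
        out.extend([a, math.sqrt(a * b)])
    out.append(centers[-1])
    return out

physical = electrode_centers(15)            # 15 working electrodes (my left ear)
channels = with_virtual_channels(physical)  # 15 physical + 14 virtual = 29 channels
print(len(physical), len(channels))         # even so, still a coarse grid of pitches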

This means that I’m not able to hear the difference between sounds that are close together in frequency. For example, it’s very difficult for me to hear the difference between the ‘k’, ‘t’, and ‘p’ sounds. I can barely hear the ‘s’ sound at all.

More generally, I’m not able to hear high-pitched sounds as well as low-pitched sounds. So I can’t understand higher female voices as well as I can understand low male voices, for example.

Because that’s a limitation of the technology, I may never be able to improve on it. Or perhaps the technology itself will improve enough to help me hear the differences between similar sounds.

How does that apply to practical situations?

I generally drive with my radio set to a public radio station, so I can practice listening to speech without any visual cues. Mostly, I don’t understand anything. Occasionally, I can understand particularly long words, and then that gives me a clue, and I start picking up parts of other phrases. So for example, I might catch the word “economy”, and then I know it’s probably a news report on the current state of the economy, and that gives me a clue as to what to listen for and I start catching phrases about Congress discussing some bill, or some such. And then the topic changes, and I’m lost again.

I’m told that radio people are selected for their clear speech, so presumably that’s an ideal situation. And even from people with that presumably clear speaking ability, I can just barely understand occasional words and phrases. As a point of comparison, I have not yet been able to understand anything my teenage children say, but I’m told that they mumble to each other (they normally use sign language with me and their mother).

If you’ve met me, you know that I can generally understand more of what you say in person than I do from the radio. That’s because when I have visual cues handy, lipreading provides extra clues to help me figure out what you’re saying. For example, if you were to say “cat” or “pat”, since I have a hard time hearing the difference between the ‘k’ and ‘p’ sounds, I would not be able to hear which word you said, but the way your lips move to say “cat” is very different from the way they move to say “pat”, so I can identify the word by the complementary combination of what I hear and what I lipread.

And no, I can’t just lipread, because lipreading is a lot more ambiguous than my hearing. For example, with lipreading, the ‘m’, ‘b’, and ‘p’ sounds all look the same (but sound very different!). So I cannot lipread the difference between “bat”, “mat”, and “pat”. But I can hear the difference, so that’s where I need to both hear and lipread to be able to understand speech.

Ken Bolingbroke at Yahoo!

And even then, it’s still an iffy deal. If there’s extra noise, like other people speaking or a loud fan, or whatever, that negates my ability to hear. For that matter, my ability to understand is so sensitive that I can be thrown off just by someone whose voice is changed by a sore throat. I’m finding that just the wind blowing into my implants’ microphones is enough to drown out your voice.

Meetings are especially difficult for me. The main speaker isn’t standing face-to-face with me, so lipreading is more difficult. If the speaker turns to face the whiteboard or someone at the opposite side of the room, then I can’t lipread at all. And worse, when someone somewhere at the table starts speaking, I have to look around until I can figure out who is speaking (see above, as to why I can’t locate the direction of sound), and then hope it’s someone facing me … and in the meantime, I can’t understand anything they’re saying. And if they’re not facing me at all, then I’m still lost.

Beyond that, at this point, since listening to speech is a new thing for me after a 15-year hiatus with no hearing at all, it’s surprisingly exhausting. It’s a strain for me to listen and try to understand speech. It seems odd to think of it this way, perhaps, but you’re hearing every day, 24 hours a day (even when you’re sleeping, you’re still hearing), all your life, so it’s ingrained for you. But I’ve only been hearing again for a few months, and I turn off my implants every night, sometimes for longer stretches, so it’s remarkably tiring. These last few months with my restored hearing, any time I have a particularly heavy day of listening, I find that it wipes me right out.

Ouch! That’s bad, what do we do?

In meetings, it would greatly help me if you could provide a written agenda of what will be discussed. When I know the context, it’s much easier to follow what’s being said. Give me a heads-up when you change the topic, so I’m not trying to shoehorn your speech into the prior context.

Face me when you’re speaking. If I’m not looking at your face when you’re talking, then I’m not understanding anything you’re saying. This is especially difficult when multiple people are speaking. If you could raise your hand until you have my attention before you speak, that would ensure I have the best possible chance of understanding what you say.

If you take notes, share them with me after the meeting. Listening to speech takes my entire focus, so if I try to take notes, I lose focus and lose track of what’s being said. And if you have any action items that involve me, make sure that I have them, preferably in writing.

What about sign language?

Well sure, I know ASL. But who else does? Not enough people around here to make it useful for me. And my pet peeve is that interpreting doesn’t do a good job of conveying the technical jargon used in IT. Case in point: at the open house where I applied for this job, I had an ASL interpreter helping me out, which was a good thing, because the crowd noise pretty much negated my hearing in that situation. But as I was talking to someone in the Search group, the interpreter signed “D – B – M”, which completely confused me, because it didn’t fit the context at that moment. We paused and worked out that what was said was “Debian.” The interpreter heard “Dee bee enn”, and incidentally, ‘n’ and ‘m’ are another pair of sounds that are very close on the sound frequency scale, so she thought it was ‘m’, and that’s how you get from “Debian” to “DBM”. So ASL isn’t really useful for interpreting technical discussions, although it does come in handy for general conversation.
