Parents - Case Studies: Exploring Different Perspectives on Educational Interpreting
Learning to Use Interpreters After Progressive Hearing Loss: Codeswitching
By Susan Dickinson, Teacher of the Deaf and Deaf Adult
At age five, I was diagnosed with a moderate hearing loss that gradually worsened until I became profoundly deaf in my late teens. The process of learning new information then took a radical shift away from being a passive activity. As a hard of hearing child during my grade school years, I vividly recall how easy it was to sit back and effortlessly take in new information. Learning from my teachers just naturally 'happened' whenever their words flooded the air around me. Whether I was thinking about being first to line up for recess, picking at the wood splinter on my chair, or playing with a rubber band in my desk, I was constantly taking in information. Even when I chose not to pay attention, my mind was still able to hold lessons for later retrieval. When people lose their sense of hearing, they become acutely aware of how much they took their ability to learn new information for granted. When ears and brain are working together, the passive listener can subconsciously take in broad concepts, be exposed to new terms, and still know when it is time to tune in or respond.
In the absence of usable hearing, one must rely on vision to learn, and this is a totally different, labor-intensive process. Reading sign language and comprehending the speaker's message requires work; the moment the mind drifts or the eyes gaze away, the information is lost. In addition to attending carefully to the interpreter, one must also develop codeswitching and discourse mapping skills in order to retain information. By codeswitching I am referring to the process of changing language modalities: messages conveyed via a visually based language (ASL) as opposed to an auditorily based language (spoken, fingerspelled, and printed English). Deaf students must constantly and rapidly switch back and forth among a variety of linguistic representations. They are presented with a plethora of printed materials in class (written on the board, on a handout, in a book, on an overhead) and must quickly shift gears to read sign language from the interpreter (some of which is fingerspelled English). They may codeswitch yet again if they glance back and forth to see spoken English on the lips of the teacher.
To further compound the cognitive burden, students need to organize the information they are receiving. Discourse mapping relates to identifying pragmatics, or the purpose of the message. The listener needs to discover a method of organizing the information for later retrieval. While codeswitching and discourse mapping are taking place, there is also a great deal of self-talk occurring to decode and repair misreads. It is not surprising that it is virtually impossible for a deaf student using an interpreter to access 100% of the message. We are only able to pick up parts of two languages (with competing visual and auditory bases) via a variety of representations while constantly decoding and relating these parts to the whole message. No wonder we get so tired!