`
(12) United States Patent
`Shridhar et al.
`
`(10) Patent No.:
`(45) Date of Patent:
`
`US 7,929,722 B2
`Apr. 19, 2011
`
(54) HEARING ASSISTANCE USING AN
`EXTERNAL COPROCESSOR
`
`(75)
`
`Inventors: Vasant Shridhar, Royal Oak, MI (US);
`Duane Wertz, Byron, MI (US);
`Malayappan Shridhar, West
`B1°°mfie1d=M1(US)
`
(73) Assignee: Intelligent Systems Incorporated, West
Bloomfield, MI (US)

( * ) Notice: Subject to any disclaimer, the term of this
patent is extended or adjusted under 35
U.S.C. 154(b) by 381 days.
`
(21) Appl. No.: 12/273,389
`
`(22)
`
`Filed;
`
`NOV_ 13, 2008
`
`2
`,
`,
`6,021,207 A
`6,035,050 A
`6,041,129 A
`6,058,197 A
`$1
`6:424:722 B1
`6,449,662 B1
`6,556,686 B1
`6,684,063 B2
`6,816,600 B1
`6,851,048 B2
`6,888,948 B2
`5395345 132
`
`IZii1i1_3erII1aI1iet a1~
`s ige et a .
`2/2000 Puthuff et a1.
`3/2000 Weinfuitner et al.
`3/2000 Adelman
`5/2000 Delage
`1
`i
`iléeda
`7/E002 Ha;::Ii11S:taé1.'
`9/2002 Armitage
`4/2003 Weidner
`1/2004 Berger et a1.
`11/2004 Jakob et al.
`2/2005 Ar
`"t
`5/2005 Herélefnagfeli
`5/3005 Bye et 31,
`(Continued)
`
`OTHER PUBLICATIONS
`
`PCT Search Report for PCT Application No. PCT/US2009/053480,
`mailed Mar. 10, 2010 (8 pages).
`
`(65)
`
`Prior Publication Data
`US 2010/0040248 A1
`Feb. 18,2010
`
Related U.S. Application Data
`
(Continued)

Primary Examiner — Brian Ensey
Attorney, Agent, or Firm — Lee & Hayes
`
`(60)
`
`(51)
`
`lfgoxéigiiignal application No. 61/188,840, filed onAug.
`’
`.
`Int CL
`(2006.01)
`H04R 25/00
`(200601)
`H04M 1/00
`(52) U.s. Cl.
`...................... 381/314; 381/312; 455/569.1
`(58) Field of Classification Search ................ .. 381/312,
`.
`381/314, 315, 455/569.1
`See application file for Complete Search history
`
`'
`
`(56)
`
`References Cited
`U.S. PATENT DOCUMENTS
`
4,918,737 A      4/1990  Luethi
5,390,254 A      2/1995  Adelman
5,479,522 A     12/1995  Lindemann et al.
5,710,819 A      1/1998  Topholm et al.
5,721,783 A      2/1998  Anderson
`
(57) ABSTRACT
Techniques are described for enhancing a hearing assist
device using one or more coprocessor devices. The hearing
assist device uses a handshaking protocol to detect and pair
with the one or more coprocessor devices. The hearing assist
device is capable of stand-alone signal processing in the
absence of coprocessor devices. In one embodiment, the
hearing assist device directs processing of a signal to the
coprocessor device when the coprocessor is detected. In
another embodiment, the hearing assist device detects a
coprocessor device and uses the coprocessor device to
supplement signal processing performed by the hearing assist
device. In yet another embodiment, the hearing assist device
communicates with a plurality of coprocessor devices and the
work of processing the signal is shared amongst the devices
according to a respective functionality of each device.
`
`31 Claims, 6 Drawing Sheets
`
[Front-page drawing — FIG. 2 reproduced: schematic 200 of Hearing Assist Device 102, showing sensor 202, communication interface 212, converter 204, processor 206, memory 208 storing a signal processing algorithm 210, handshaking module 214, functionality comparing module 216, processor switching module 110, and a stimulator.]
`
`
`
`
U.S. PATENT DOCUMENTS
6,938,124 B2 *   8/2005  Rust et al. .................. 711/114
6,954,535 B1    10/2005  (name illegible)
6,975,739 B2    12/2005  (name illegible)
6,978,155 B2    12/2005  Berg
7,054,957 B2     5/2006  Armitage
7,257,372 B2 *   8/2007  Kaltenbach et al. ......... 455/569.1
7,283,842 B2    10/2007  Berg
7,292,698 B2    11/2007  Niederdrank et al.
(one entry illegible)
2005/0058313 A1      3/2005  Victorian et al.
2006/0039577 A1 *    2/2006  Sanguino et al. ........... 381/315
(entries partially illegible)
2007/0239294 A1     10/2007  Brueckner et al.
2007/0282394 A1     12/2007  Segel et al.
2008/0107278 A1      5/2008  Roeck et al.
`OTHER PUBLICATIONS
`
`Written Opinion for PCT Application No. PCT/US2009/053480,
`
`* cited by examiner
`
`
`
[Sheet 1 of 6 — FIG. 1: system 100 of hearing assist devices 102(a) through 102(n), each with a processor switching module 110, in communication with coprocessor devices 104(a) through 104(m), each with a handshaking module 112, via a wired communication interface 106 and/or a wireless communication interface 108.]
`
[Sheet 2 of 6 — FIG. 2: schematic 200 of hearing assist device 102, showing sensor 202, converter 204, processor 206, memory 208 storing a signal processing algorithm 210, communication interface 212, handshaking module 214, functionality comparing module 216, processor switching module 110, and a stimulator.]
`
`
`
[Sheet 3 of 6 — FIG. 3: schematic 300 of coprocessor device 104, showing sensor 302, converter 304, processor 306, processing algorithm 310, communication interface 312, handshaking module 112, and other coprocessor functionality.]
`
`
`
[Sheet 4 of 6 — FIG. 4: flowchart of process 400 — detect a coprocessor device; detect a signal; compare a functionality of the coprocessor device (processor speed, processor load, processor capability, memory capacity, memory capability, signal processing algorithm, enhancement of a signal processing algorithm, sensor capability, strength of communication signal) to a functionality of a hearing assist device; direct the signal to at least one of the hearing assist device or the coprocessor device based on availability of the coprocessor device, user input, and/or a determination that the coprocessor device has a necessary and/or superior functionality; then either process the signal at the hearing assist device, or send the signal to the coprocessor device for processing and receive a processed signal from the coprocessor device.]
`
`
`
[Sheet 5 of 6 — FIG. 5a: flowchart of process 500 — process a signal with a hearing assist device 102; if a coprocessor device is detected, compare the coprocessor device to the hearing assist device (functionality absent, functionality enhanced); if an additional coprocessor device is detected, compare the additional coprocessor device to the coprocessor device and to the hearing assist device; then decide whether to direct the signal to the hearing assist device (continued in FIG. 5b).]
`
`
`
[Sheet 6 of 6 — FIG. 5b: continuation of process 500 — decide whether to direct the signal to the coprocessor device and/or an additional coprocessor device; process the signal with the selected device(s), either in parallel with the hearing assist device and/or coprocessor device (integrating the resulting signals) or in series.]
`
`
`
`1
`HEARING ASSISTANCE USING AN
`EXTERNAL COPROCESSOR
`
`RELATED APPLICATION
`
This application claims the benefit of U.S. Provisional
Application No. 61/188,840, filed Aug. 13, 2008.
`
`TECHNICAL FIELD
`
The subject matter of this disclosure relates to a hearing
enhancement device, and more specifically, to a hearing
enhancement device capable of functioning together with a
coprocessor device.
`
`BACKGROUND
`
`Historically, hearing aids assisted people with hearing loss
`by providing sound amplification. Typically, hearing aids
`include microphones to detect external sound, a processor to
`amplify the detected sound, a battery, and a speaker to present
`amplified sound to a user. Many hearing aids presently trans-
`late the detected sound into a digital signal and use a digital
`signal processor (DSP) to process the signal. The DSP can
`manipulate the signal by applying signal processing algo-
`rithms stored on the hearing aid to improve the quality of the
`amplified sound.
`Wearers of hearing aids desire increasingly smaller sized
`devices to improve comfort and personal appearance. How-
`ever, the small size of hearing aids limits functionality. This
`form-factor constraint is apparent in short battery life, low
`powered processors, and weak signal processing algorithms.
`Sound processing is limited due to the constraints imposed by
`the small size of hearing aids. For example, much of the
`processing power of current hearing aids is devoted to reduc-
`ing feedback, and thus, remaining processing power is unable
`to run powerful signal processing algorithms.
`It is desirable to maintain the hearing aid as a small device
that is placed in or on the ear of a user. It is also desirable
for hearing aid users to have a device which is portable, always
present, and able to produce high-quality amplified sound.
`Even with increases in processor power and component min-
`iaturization, hearing aid users still have many complaints
`about the capabilities of current hearing aids. Therefore,
`methods and devices that provide improved signal processing
`and function within the existing form-factor constraints
`would have considerable utility.
`
`SUMMARY
`
Most of the form-factor limitations of conventional hearing
`aids can be overcome by coupling a hearing aid to an external
`coprocessor device. Since the coprocessor device is not
`required to be placed in or near the ear, it is possible for the
`coprocessor device to have a powerful processor with greater
`functionality than a stand-alone hearing assist device. By
`sending a signal detected at the hearing assist device out to a
coprocessor for processing, it is possible to realize the benefits of
`a small hearing assist device, without sacrificing signal pro-
`cessing power.
In one aspect, the hearing assist device has a processor and
`a memory to store signal processing algorithms. Thus the
`hearing assist device is able to process signals (e.g., audio
`signals converted into electronic form) without a coprocessor
`device. In order to communicate with the coprocessor device,
`the hearing assist device may also include a communication
`interface to communicate with the coprocessor device, and a
`
`10
`
`15
`
`20
`
`25
`
`30
`
`35
`
`40
`
`45
`
`50
`
`55
`
`60
`
`65
`
`
`handshaking module to receive information regarding a func-
`tionality of the coprocessor device via the communication
`interface. In some instances the coprocessor device may have
`different capabilities than the hearing assist device, so a func-
`tionality comparing module in the hearing assist device com-
`pares the functionality of the coprocessor device to a func-
`tionality of the hearing assist device. Since there may be
`instances in which the hearing assist device will provide
`better signal processing and other instances in which the
`coprocessor device would be a superior processor, a proces-
`sor switching module in the hearing assist device may direct
`the signal for at least partial processing to a processor in either
`(or both) of the hearing assist device or the coprocessor
`device. The processed signal is then returned to the hearing
`assist device (if processed by a coprocessor device) and pre-
sented to a user by means such as a speaker on the hearing
`assist device.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`The detailed description is described with reference to the
`accompanying figures. In the figures, the use of the same
`reference numbers in different figures indicates similar or
`identical
`items. These drawings depict only illustrative
`embodiments of the invention and are not, therefore, to be
`considered to be limiting of its scope.
`FIG. 1 illustrates a system of a plurality of hearing assist
`devices in communication with a plurality of coprocessor
devices in accordance with one illustrative embodiment of the
`
`present disclosure.
`FIG. 2 is a schematic view of an illustrative hearing assist
`device usable in the system of FIG. 1.
`FIG. 3 is a schematic view of an illustrative coprocessor
`device usable in the system of FIG. 1.
FIG. 4 is a flowchart of an illustrative process for directing a
`signal for processing in accordance with an embodiment of
`the present disclosure.
FIG. 5 is a flowchart of an illustrative process for directing a
`signal for processing in accordance with another embodiment
`of the present disclosure.
`
`DETAILED DESCRIPTION
`
This disclosure describes techniques by which the form-
`factor constraints inherent in hearing aids are overcome by
`leveraging the processing power of an additional processor,
`such as a coprocessor, which does not suffer from the same
`form-factor constraints. Processing power superior to that
`provided by conventional hearing aids has become ubiquitous
`in modern societies in the form of mobile phones, personal
`digital assistants, electronic music players, desktop and lap-
`top computers, game consoles, television set-top-boxes, auto-
`mobile radios, navigation systems, and the like. Any of these
`devices may function as a coprocessor, while continuing to
`perform the primary functions of each respective device. The
`coprocessor may also be a device specially designed to func-
`tion together with a hearing aid.
`Permanent coupling to the coprocessor device, however,
`requires that a hearing aid user always bring a coprocessor
`device if he or she desires to benefit from the hearing aid. The
`bulk of a coprocessor device may be undesirable when, for
`example, engaged in sports. Operation of the coprocessor
`device may even be prohibited at times such as while on an
`airplane or near sensitive medical equipment. In such situa-
`tions the hearing aid user may desire whatever benefit the
`hearing aid can provide even if enhanced processing of the
`coprocessor device is not available. Thus, it is desirable to
`
`
`
`US 7,929,722 B2
`
`3
`have a hearing aid that will function as a stand-alone-device in
`the absence of a coprocessor device, and provide enhanced
`functionality if and when a coprocessor is available.
`In some embodiments, the hearing aid provides sound
`enhancement to a user with diminished hearing capacity.
`However, in other embodiments, the methods and devices of
`the present disclosure enhance the hearing abilities of a user
`with or without impaired hearing. For example, appropriate
`signal processing algorithms used together with the subject of
the present disclosure may allow a soldier to distinguish the
`snap of a twig from other sounds in a forest, or allow a
`mechanic to detect a grating of gears inside a noisy engine.
`Accordingly, devices of the present disclosure are referred to
`as hearing assist devices to encompass devices used to
`enhance sound for users with or without hearing impairment.
`FIG. 1 illustrates a system 100 of a plurality of hearing
`assist devices 102(a) to 102(n) in communication with a
`plurality of coprocessor devices 104(a) to 104(m). A commu-
`nication interface between the hearing devices 102 and the
`coprocessor devices 104 may be wired 106 and/or wireless
`108. The wired communication interface 106 may include,
`but is not limited to, controller-area network, recommended
`standard-232, universal serial bus, stereo wire, IEEE 1394
`serial bus standard (FireWire) interfaces, or the like. The
`wireless communication interface 108 may include, but is not
limited to, Bluetooth, IEEE 802.11x, AM/FM radio signals,
`wireless wide area network (WWAN) such as cellular, or the
`like.
`
`Each hearing assist device 102 may include a processor
`switching module 110 to manage routing of signals amongst
`the processors of the hearing assist device 102 and one or
`more of the coprocessor devices 104. The coprocessor
`devices may include a handshaking module 112 to facilitate
communication between the hearing assist device 102 and the
`coprocessor device 104,
`including sending information
describing a functionality of the coprocessor device 104 to the
`hearing assist device 102 as part of the handshaking.
`Flexibility inherent in the system 100 of the present dis-
`closure allows one hearing assist device 102 to communicate
`with zero to m coprocessor devices 104. Moreover, the hear-
`ing assist device 102 may dynamically add or drop coproces-
`sor devices 104 on the fly. The hearing assist device 102
`functions as a stand-alone device when zero coprocessor
`devices 104 are present. The hearing assist device 102(a)
`may,
`for example, communicate only with coprocessor
`device 104(a) via the wired communication interface 106. In
`other embodiments, hearing assist device 102(a) may com-
`municate with a first coprocessor device 104(a) via the wired
`communication interface 106 and a second coprocessor
`device 104(m) via the wireless communication interface.
`Many other communication paths are covered within the
`scope of the present disclosure including a hearing assist
`device 102 communicating with more than two coprocessor
`devices 104 through any combination of wired and/or wire-
`less communication interfaces.
`It is also envisioned that, in some embodiments, more than
`one hearing assist device 102 may communicate with a copro-
`cessor device. For example, hearing assist device 102(a) and
`hearing assist device 102(n) may both communicate with
`coprocessor device 104(m) via two wireless communication
`interfaces 108. The two hearing assist devices, 102(a) and
`102(n), may represent devices placed in a right ear and a left
`ear of a single user. The two hearing assist devices 102(a) and
`102(n) may alternatively represent devices worn by two dif-
`ferent users. Many other communication paths are covered
`within the scope of the present disclosure, including multiple
`users each wearing one or two hearing assist devices 102 and
`
`5
`
`10
`
`15
`
`20
`
`25
`
`30
`
`35
`
`40
`
`45
`
`50
`
`55
`
`60
`
`65
`
`4
`
`all of the hearing assist devices 102 using a coprocessor
`device 104 through a plurality of wired and/or wireless com-
`munication interfaces.
`
`Any combination of multiple hearing assist devices 102 in
`communication with single or multiple coprocessor devices
`104 is also within the scope of the present disclosure. For
`example, hearing assist device 102(a) may be connected to
`coprocessor device 104(a) via a wired communication inter-
`face 106 and to coprocessor device 104(m) via a wireless
`communication interface 108. While at the same time, hear-
`ing assist device 102(n) may also be connected to coprocessor
`device 104(m) via a wireless communication interface.
`The hearing assist devices 102 may also be able to com-
`municate with other hearing assist devices either directly (not
`shown) or via a coprocessor device 104 such as hearing assist
device 102(a) communicating with hearing assist device
102(n) via coprocessor device 104(m). Thus, a given hearing
`assist device 102 may stand alone and communicate with no
other devices, it may communicate with one or more copro-
cessor devices 104, it may communicate with one or more
`other hearing assist devices 102, or it may communicate with
`the one or more coprocessor devices 104 and one or more
`other hearing assist devices 102.
`The coprocessor devices 104 may also be able to commu-
`nicate with other coprocessor devices (not shown). The
`coprocessor devices 104 may also communicate with a server
`110. In some embodiments, the server 110 may be a network
`server connected to a network such as the Internet. Commu-
`
`nication between the coprocessor devices 104 and the server
`110 may be wired or wireless. In some embodiments, not
`shown, a coprocessor device 104 may be a component of a
`larger computing device and the server may be another com-
`ponent of the same larger computing device. Thus, a given
coprocessor device 104 may communicate with one or more
hearing assist devices 102, and/or with one or more other
coprocessor devices 104, and/or with one or more servers
`110.
`
`Hearing Assist Device
`FIG. 2 shows a schematic view 200 of the hearing assist
`device 102 of FIG. 1. The hearing assist device 102 includes
`a sensor 202 configured to detect energy in the form of sound
`waves. This sensor may be a microphone or any other device
`capable of detecting sound. The hearing assist device 102
`may also include a converter 204 configured to convert the
`detected energy of the sound waves into a signal. The signal
`may be an analog signal, a digital signal, or a signal in any
`other form that is capable of undergoing processing. The
`signal is processed by a processor 206 of the hearing assist
`device 102. In some embodiments, the processor 206 is a
`digital signal processor (DSP). The hearing assist device 102
`also includes a memory 208 which may be configured to store
`signal processing algorithms 210. Depending on the exact
`configuration and type of hearing assist device 102, the
`memory 208 may be volatile (such as random access memory
`(RAM)), non-volatile (such as read only memory (ROM) and
`flash memory), or some combination of the two. The signal
`processing algorithms 210 may include, but are not limited to,
`echo cancellation, noise reduction, directionality, speech pro-
`cessing, pitch-shifting, signal separation, audio compression,
sub-band processing, language translation, user customized
`hearing profiles, and feedback reduction algorithms as well as
`audiologist customizations. The hearing assist device 102
`also includes a communication interface 212 which may pro-
`vide a communicative connection via a wired or wireless
`communication interface to coprocessor devices 104 or other
`hearing assist devices 102.
`
`
`
`The handshaking module 214 of the hearing assist device
`102 may be configured to receive information describing a
`functionality of the coprocessor device 104 via the commu-
nication interface 212. Examples of specific functionalities of
`the coprocessor device are described below. In some embodi-
`ments, the handshaking module 214 may also send informa-
`tion describing a functionality ofthe hearing assist device 102
`to the coprocessor device 104. By using the handshaking
module 214 to mediate initial communications between the
hearing assist device 102 and the coprocessor device 104, the
hearing assist device 102 is able to do more than merely open
`a communication channel to passively await a transfer of data.
`The handshaking module 214 allows for exchange of infor-
`mation describing the functionalities of the hearing assist
`device 102 and the coprocessor device 104, such that com-
`municative connections will be made only if necessary and
`only to the extent necessary to provide an enhanced process-
`ing to the hearing assist device 102.
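The handshaking exchange described above can be sketched as a simple message exchange. The class name, message fields, and pairing policy below are illustrative assumptions, not a protocol specified in the patent.

```python
# Illustrative sketch of the handshaking exchange: each device advertises its
# functionality, and a connection is made only if the peer offers something
# this device lacks. All names and message fields are hypothetical.

class HandshakingModule:
    def __init__(self, own_functionality):
        # Functionality this device reports during handshaking,
        # e.g. the signal processing algorithms it carries.
        self.own_functionality = set(own_functionality)

    def build_hello(self):
        # Message sent when a communication channel is first opened.
        return {"type": "HELLO", "functionality": sorted(self.own_functionality)}

    def should_pair(self, peer_hello):
        # Pair only if the peer provides a functionality absent here, so a
        # connection is made "only to the extent necessary".
        offered = set(peer_hello["functionality"])
        return bool(offered - self.own_functionality)

hearing_aid = HandshakingModule(["noise_reduction"])
laptop = HandshakingModule(["noise_reduction", "pitch_shifting", "signal_separation"])

assert hearing_aid.should_pair(laptop.build_hello()) is True
assert laptop.should_pair(hearing_aid.build_hello()) is False
```

Under this toy policy the hearing aid pairs with the laptop (which offers pitch shifting and signal separation), while the laptop gains nothing from the hearing aid and would decline.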
`Once the functionalities of the hearing assist device 102
`and the coprocessor device 104 are known, then a function-
`ality comparing module 216 may compare the functionality
`of the coprocessor device 104 to the functionality of the
hearing assist device 102. If multiple coprocessor devices 104
`are available, the functionality comparing module 216 may
`compare the functionality of each coprocessor device 104 to
`each other and/or to the functionality of the hearing assist
`device 102. In some embodiments, the functionality may be a
`signal processing algorithm. The functionality comparing
`module 216 may determine that a signal processing algorithm
`on one of the coprocessor devices 104 provides a signal
`processing functionality absent from the hearing assist device
`102 and also absent from other coprocessor devices 104. For
`example, a laptop computer functioning as a coprocessor
`device may have a pitch-shifting signal processing algorithm
`which all other devices in the system lack. In such a situation
`it may be desirable to process the signal at the laptop com-
`puter to benefit from the pitch-shifting ability of the copro-
`cessor 104.
`
`Even if a same general type of signal processing function-
`ality is present on both the hearing assist device 102 and the
`coprocessor device 104, that signal processing ftmctionality
`may be enhanced on the coprocessor device 104. For
`example, both the hearing assist device 102 and the copro-
`cessor device 104 may include versions of a signal processing
`algorithm for signal separation, but the specific algorithm on
`the coprocessor device 104 may, for example, provide greater
`signal separation than the algorithm on the hearing assist
`device 102. Thus, the signal processing functionality present
`on the coprocessor device 104 is enhanced as compared to the
`signal processing functionality on the hearing assist device
`102 because of the enhanced signal processing algorithm
(e.g., the greater signal separation algorithm) available on the
`coprocessor device 104.
`Enhanced signal processing functionality may also be
`achieved when two devices have identical signal processing
`algorithms but one device provides an enhanced processing
`capability. For example, the processor power or memory
`available on the coprocessor device 104 may allow that
`coprocessor device 104 to provide enhanced processing capa-
`bility as compared to the hearing assist device 102 even when
`both devices use the same signal processing algorithm.
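One way to sketch the comparison performed by the functionality comparing module 216 is to classify each coprocessor algorithm as absent from the hearing assist device or enhanced relative to it. The algorithm names and numeric capability scores are invented for illustration.

```python
# Hypothetical sketch of the functionality comparison: each device reports
# its algorithms with a capability score (higher = more capable).

def compare_functionality(hearing_aid, coprocessor):
    """Classify each coprocessor algorithm as 'absent' (the hearing aid
    lacks it) or 'enhanced' (both have it, but the coprocessor's version
    scores higher)."""
    result = {"absent": [], "enhanced": []}
    for algo, score in coprocessor.items():
        if algo not in hearing_aid:
            result["absent"].append(algo)
        elif score > hearing_aid[algo]:
            result["enhanced"].append(algo)
    return result

aid = {"signal_separation": 1, "feedback_reduction": 3}
laptop = {"signal_separation": 5, "pitch_shifting": 2}

report = compare_functionality(aid, laptop)
assert report["absent"] == ["pitch_shifting"]
assert report["enhanced"] == ["signal_separation"]
```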
`When multiple coprocessor devices 104 are present, the
`functionality comparing module 216 may compare a signal
`processing functionality of one coprocessor device 104 to
`another coprocessor device 104 to determine if any of the
coprocessor devices 104 have a signal processing functional-
`ity absent from the other devices. The functionality compar-
`
`5
`
`10
`
`15
`
`20
`
`25
`
`30
`
`35
`
`40
`
`45
`
`50
`
`55
`
`60
`
`65
`
`6
`ing module 216 may also determine if any of the plurality of
`coprocessor devices 104 has an enhanced signal processing
`functionality (either in terms of a superior algorithm or in
`terms of processing power or memory) as compared to the
`other coprocessor devices 104.
`The hearing assist device 102 also includes a processor
`switching module 110 configured to direct the signal for at
`least partial processing to the processor 206 of the hearing
`assist device 102 and/or a processor of the coprocessor
`device. Given the coprocessor devices 104 available to the
`hearing assist device 102 at any point in time, the function-
`ality comparing module 216 will determine which processor
`or combination of processors can provide desired signal pro-
`cessing functionality for the needs of the user of the hearing
`assist device 102. The desired signal processing functionality
may be determined in advance by an audiologist or manu-
facturers of the hearing assist device. In some embodiments the
`desired signal processing functionality may be determined by
the user (e.g., manual selection) or by the coprocessor device
(e.g., if the coprocessor device is a car radio, then the desired
signal processing functionality includes correction for road
`and engine noise). The processor switching module 110 may
then dynamically switch processing of the signal based on the
`comparisons performed by the functionality comparing mod-
`ule 216.
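A minimal sketch of the switching decision follows, assuming a simple policy in which the signal is directed to each device that covers some still-unmet part of the desired processing. The policy and names are assumptions, not the patent's method.

```python
# Illustrative routing decision for the processor switching module: direct
# the signal to every device that covers an unmet desired functionality.

def route_signal(desired, local_algos, coprocessors):
    """Return the device(s) that should process the signal.
    `coprocessors` maps a device name to the algorithms it offers."""
    targets = []
    remaining = set(desired)
    # Prefer on-device processing for anything the hearing aid can do.
    if remaining & set(local_algos):
        targets.append("hearing_aid")
        remaining -= set(local_algos)
    # Send whatever remains to coprocessors that can handle it.
    for name, algos in coprocessors.items():
        if remaining & set(algos):
            targets.append(name)
            remaining -= set(algos)
    return targets

targets = route_signal(
    desired=["feedback_reduction", "road_noise_correction"],
    local_algos=["feedback_reduction"],
    coprocessors={"car_radio": ["road_noise_correction"]},
)
assert targets == ["hearing_aid", "car_radio"]
```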
`
`In some embodiments, the signal may be processed in
`series by the processor switching module 110 directing the
`signal to the processor 206 of the hearing assist device 102
`and/or processors of one or more coprocessor devices 104.
`For example, a sound detected by the sensor 202 and con-
`verted to a signal by the converter 204 may be initially pro-
`cessed at the processor 206, sent to a first coprocessor via the
`communication interface 212 for additional processing, sent
`from the first coprocessor to a second coprocessor for further
`processing, and finally received from the second coprocessor
via the communication interface 212. One benefit of process-
`ing the signal in series is that the processing by the first
`coprocessor device can be taken into account by the second
`coprocessor. In other embodiments, the signal may be pro-
`cessed in parallel by the processor 206 of the hearing assist
`device 102 and/or processors of one or more coprocessor
`devices 104. When processed in parallel, the signal may be
`processed substantially simultaneously by a plurality of pro-
`cessors and then the respective processed signals may be
`integrated into one signal at the hearing assist device 102 by
`an integrator (not shown). One possible benefit of processing
`the signal in parallel is that latency of signal processing by the
`coprocessors is minimized.
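The series and parallel arrangements described above can be illustrated with toy processing stages; the gain and offset stages and the averaging integrator are invented for illustration, with the signal modeled as a list of samples.

```python
# Toy illustration of series vs. parallel signal processing.

def process_in_series(signal, stages):
    # Each stage sees the previous stage's output, as when a first
    # coprocessor's result is forwarded to a second coprocessor.
    for stage in stages:
        signal = stage(signal)
    return signal

def process_in_parallel(signal, stages, integrate):
    # Each stage sees the original signal; an integrator on the hearing
    # assist device combines the results into one signal.
    outputs = [stage(signal) for stage in stages]
    return integrate(outputs)

def double(samples):          # toy gain stage
    return [2 * x for x in samples]

def offset(samples):          # toy offset stage
    return [x + 1 for x in samples]

def average(outputs):         # toy integrator: sample-wise mean
    return [sum(vals) / len(vals) for vals in zip(*outputs)]

assert process_in_series([1, 2], [double, offset]) == [3, 5]
assert process_in_parallel([1, 2], [double, offset], average) == [2.0, 3.5]
```

The asserts show the trade-off the text describes: in series each stage builds on the last, while in parallel the stages run independently and their outputs are merged afterward.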
`Ultimately, the processed signal is presented to the user of
`the hearing assist device. The hearing assist device 102
`includes a stimulator configured to stimulate an auditory
`nerve of a user. The stimulator may take any form which
`directly or indirectly induces the auditory nerve to generate an
`electrical signal that is perceived by the user as representing
`sound. In some embodiments the stimulator may be a speaker.
`In other embodiments the stimulator may be a device, such as
`a cochlear implant, that acts directly on the auditory nerve.
`While the hearing assist device 102 is shown and described
`as having certain hardware and software modules, it should be
`understood that all modules may be implemented as appro-
`priate in hardware, software,
`firmware, or combinations
`thereof. If implemented by software, the software may reside
`on memory associated with any component of the hearing
assist device 102, standalone memory provided in connection
`with the hearing assist device 102, a remote memory storage
`device, removable/nonremovable memory, a combination of
`the foregoing, or any other combination of one or more pro-
`
`
`
`cessor-readable media. While the hearing assist device 102 is
shown as having certain modules, it should be understood that
`in some embodiments, one or more of the modules could be
`combined or omitted entirely.
`Coprocessor Device
`FIG. 3 shows a schematic view 300 of the coprocessor
`device 104 of FIG. 1. The coprocessor device 104 may
`include a sensor 302, similar to the sensor 202 of the hearing
`assist device 102 of FIG. 2. In some embodiments, the sensor
302 may provide additional information used at least in part in
`processing the signal. For example, a microphone on the
coprocessor device 104 may detect ambient noise and reduce
the ambient noise so that the user can hear voices with enhanced
clarity. The sensor 302 may also be used to enhance process-
`ing of directionality. The coprocessor device 104 may also
`include a converter 304 that may be similar to the converter
`204 of the hearing assist device 102 of FIG. 2.
`Coprocessor device 104 includes a processor 306 config-
`ured to process a signal. In one embodiment the signal may be
`a signal received from a hearing assist device 102. In another
`embodiment the signal may be a signal received from another
`coprocessor 104. In yet another embodiment the signal may
`be a signal from the converter 304.
`Coprocessor device 104 also includes a memory 308 con-
`figured to store signal processing algorithms. The signal pro-
`cessing algorithms 310 may include, but are not limited to,
`echo cancellation, noise reduction, directionality, pitch shift-
`ing, signal separation, audio compression, sub-band process-
`ing, language translation, user customized hearing profiles,
`and feedback reduction algorithms as well as audiologist
`customizations.
`
The coprocessor device 104 also includes a communica-
tion interface 312 similar to the communication interface 212
of the hearing assist device 102 of FIG. 2. The communica-
`tion interface 312 may be configured to send a signal pro-
`cessed by a processing module 314 (described below) to a
`hearing assist device 102 or another coprocessor device 104.
`In some embodiments the communication interface 312 may
`receive an indication of a functionality of the hearing assist
`device and/or an indication of a desired processing for the
`signal. The coprocessor device 104 may provide more signal
`processing functionality than required by a user. Rather than
`simply applying all possible processing to a signal, the indi-
cation of the desired processing for the signal may instruct the
`processor 306 as to which signal processing functionalities to
`apply. When a coprocessor device 104 is processing signals
`from a plurality of hearing assist devices 102, the indications
`of the functionality of the respective hearing assist devices
`102 and the desired processing for the respective signals may
`allow the coprocessor device 104 to provide appropriate pro-
`cessing for each signal.
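A sketch of that selective application follows, assuming the indication of desired processing is simply an ordered list of requested functionalities; the registry of toy transforms is hypothetical.

```python
# Hypothetical sketch: a coprocessor applies only the processing steps named
# in the indication sent along with each hearing assist device's signal.

AVAILABLE = {
    "noise_reduction": lambda s: [x - 1 for x in s],  # toy transform
    "amplify": lambda s: [2 * x for x in s],          # toy transform
}

def process_for_device(signal, desired):
    # Apply only the requested functionalities, in the requested order,
    # rather than every algorithm the coprocessor has available.
    for name in desired:
        signal = AVAILABLE[name](signal)
    return signal

assert process_for_device([3, 4], ["noise_reduction", "amplify"]) == [4, 6]
assert process_for_device([3, 4], ["amplify"]) == [6, 8]
```

Keyed on the per-signal indication, the same coprocessor can serve several hearing assist devices with different processing for each.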
`In some situations the coprocessor device 104 may initially
`lack a signal processing functionality required by the user. In
`one embodiment, the communication interface 312 may be
`configured to send a signal from the coprocessor device 104
`to a server 110 in order to access additional signal processing
`functionality available on the server. For example, the server
`110 may function similar to, or make use of, an ITUNES®
`server by receiving requests for signal processing algorithms
`(instead of songs) from one or many coprocessor devices 104
`(instead of MP3 players). ITUNES® is available from Apple
`Corporation, of Mountain View, Calif. The coprocessor
`device 104 may be preconfigured with the address and access
`information for server 110, or the address and/or access infor-
`mation may be provided to the coprocessor device 104 along
`with the signal from the communication interface 312 of the
hearing assist device. If multiple servers 110 are available the
`
`10
`
`15
`
`20
`
`25
`
`30
`
`35
`
`40
`
`45
`
`