US 20110058056A1

(19) United States
(12) Patent Application Publication    (10) Pub. No.: US 2011/0058056 A1
     Lindahl et al.                     (43) Pub. Date: Mar. 10, 2011
(54) AUDIO ALTERATION TECHNIQUES

(75) Inventors: Aram Lindahl, Menlo Park, CA (US); Kelvin Chiu, Mountain View, CA (US)

(73) Assignee: APPLE INC., Cupertino, CA (US)

(21) Appl. No.: 12/556,380

(22) Filed: Sep. 9, 2009

Publication Classification

(51) Int. Cl.
     H04N 5/228 (2006.01)
     H04N 5/60  (2006.01)
     H04N 7/12  (2006.01)
     G06K 9/00  (2006.01)

(52) U.S. Cl. ......... 348/222.1; 348/738; 375/240.16; 382/100; 348/E05.122; 375/E07.076; 348/E05.031

(57) ABSTRACT

A method of altering audio output from an electronic device based on image data is provided. In one embodiment, the method includes acquiring image data and determining one or more characteristics of the image data. Such characteristics may include sharpness, brightness, motion, magnification, zoom setting, and so forth, as well as variation in any of the preceding characteristics. The method may also include producing audio output, wherein at least one characteristic of the audio output is determined based on one or more of the image data characteristics. Various audio output characteristics that may be varied based on the video data characteristics may include, for instance, pitch, reverberation, tempo, volume, filter frequency response, added sound effects, or the like. Additional methods, devices, and manufactures are also disclosed.
[Front-page figure: handheld device screens (210) showing a "Photo Albums" list, including Camera Roll (1) and Wallpapers (2), and an image viewing screen (214).]
Exhibit 1014
Page 01 of 29
Samsung v. Zophonos
IPR2026-00083
Exhibit 1014

Patent Application Publication    Mar. 10, 2011  Sheet 1 of 18    US 2011/0058056 A1

[FIG. 1: block diagram of electronic device 10, showing power source 26, networking device 24, storage 20, memory 18, I/O ports 12, input structures 14, audio input device 40, GPS device 38, processor(s) 16 with CPU/GPU, motion sensing device 36, image processing logic 30, audio processing logic 32, imaging subsystem 34, and display 28.]

Patent Application Publication    Mar. 10, 2011  Sheet 2 of 18    US 2011/0058056 A1

[FIG. 2: front view of handheld electronic device 50, with enclosure 52, display 28, input structures 14, and a home screen of application icons (Text, Calendar, Camera, Stocks, Maps, Weather, Clock, Calculator, Notes/Settings, iTunes, Phone, Mail, Safari, iPod).]

Patent Application Publication    Mar. 10, 2011  Sheet 3 of 18    US 2011/0058056 A1

[FIG. 3: rear view of the handheld electronic device 50 of FIG. 2, showing enclosure 52 and input structures 14.]

Patent Application Publication    Mar. 10, 2011  Sheet 4 of 18    US 2011/0058056 A1

[FIG. 4: perspective view of an electronic device of FIG. 1 in the form of a computer.]

Patent Application Publication    Mar. 10, 2011  Sheet 5 of 18    US 2011/0058056 A1

[FIG. 5 (flowchart): receive image data (video image, still image, slideshow, or other) (92); process image data (94); determine image characteristics (96); encode image data (98); alter audio (100); store encoded image data and altered audio (102); output image data and altered audio (104).]

Patent Application Publication    Mar. 10, 2011  Sheet 6 of 18    US 2011/0058056 A1

[FIG. 6: block diagram of image-data processing and audio alteration. A camera, stored images, or other source provides image data to an image signal processor and an encoder, which supply image parameters (sharpness, brightness, color, other) and encoding data (motion, prediction error, frame type, other) to audio processing logic 32. The audio processing logic, drawing on an audio synthesizer, stored audio, or audio input device 40, applies audio effects (pitch, tempo, filter sweep, equalizer, chorus, delay, other effects) based on the image characteristics, motion data from motion sensing device 36, and position data from GPS device 38, and supplies altered audio data to an audio output device; encoded image data is sent to the display.]

Patent Application Publication    Mar. 10, 2011  Sheet 7 of 18    US 2011/0058056 A1

[FIG. 7: screens of a media player application on the device of FIG. 2, including play lists (e.g., Genius, Recently Added, Recently Played, Top 25) and video playback.]

Patent Application Publication    Mar. 10, 2011  Sheet 8 of 18    US 2011/0058056 A1

[FIG. 8: screens of an imaging application on the device of FIG. 2 for acquiring live image data (206).]

Patent Application Publication    Mar. 10, 2011  Sheet 9 of 18    US 2011/0058056 A1

[FIG. 9: screens of an image viewing application on the device of FIG. 2, including a Photo Albums list (210) and image display (214).]

Patent Application Publication    Mar. 10, 2011  Sheet 10 of 18    US 2011/0058056 A1

[FIG. 10 (flowchart): receive image data (226); determine motion characteristic (228); if vertical motion (230), alter pitch of audio (232); if horizontal motion (234), alter volume of audio (236); if change in total motion (238), alter tempo of audio (240); store altered audio (242); output altered audio (244).]

Patent Application Publication    Mar. 10, 2011  Sheet 11 of 18    US 2011/0058056 A1

[FIG. 11: plot of an audio property (e.g., pitch) over time as a feature 202 moves vertically across video frames 198 captured at successive times.]

Patent Application Publication    Mar. 10, 2011  Sheet 12 of 18    US 2011/0058056 A1

[FIG. 12: plot of an audio property (e.g., volume) over time as a feature 202 moves horizontally across video frames 198.]

Patent Application Publication    Mar. 10, 2011  Sheet 13 of 18    US 2011/0058056 A1

[FIG. 13: plot of an audio property (280) over time based on zoom characteristics of video frames 198 containing feature 202.]

Patent Application Publication    Mar. 10, 2011  Sheet 14 of 18    US 2011/0058056 A1

[FIG. 14 (flowchart 286): receive brightness data (288); adjust audio filter frequency response (290); filter audio (292); output filtered audio (294); store filtered audio (296).]

[FIG. 15 (flowchart 300): receive sharpness data (302); set audio filter parameters (304); filter audio (306); output filtered audio (308); store filtered audio (310).]

Patent Application Publication    Mar. 10, 2011  Sheet 15 of 18    US 2011/0058056 A1

[FIG. 16 (flowchart 316): receive image color data (318); change audio filter (320); filter audio (322); output filtered audio (324); store filtered audio (326).]

[FIG. 17 (flowchart 330): receive image data (332); detect image feature (334); apply audio effect (336); output audio (338); store audio (340).]

Patent Application Publication    Mar. 10, 2011  Sheet 16 of 18    US 2011/0058056 A1

[FIG. 18 (flowchart 350): receive motion data (352); receive position data (354); apply audio effect (356); output audio (358); store audio (360).]

Patent Application Publication    Mar. 10, 2011  Sheet 17 of 18    US 2011/0058056 A1

[FIG. 19 (flowchart 370): determine speed (372); if speed is increasing (374), increase music tempo (376); if speed is decreasing (378), decrease music tempo (380); otherwise continue monitoring speed (382).]

Patent Application Publication    Mar. 10, 2011  Sheet 18 of 18    US 2011/0058056 A1

[FIG. 20: top view of the device of FIG. 2, depicting rotation of the device to effect audio alteration.]

AUDIO ALTERATION TECHNIQUES

BACKGROUND

[0001] 1. Technological Field

[0002] The present disclosure relates generally to altering audio signals for playback on a device and, more particularly, to techniques for altering audio based on image data and other non-audio data.

[0003] 2. Description of the Related Art

[0004] This section is intended to introduce the reader to various aspects of art that may be related to various aspects of the present disclosure, which are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. Accordingly, it should be understood that these statements are to be read in this light, and not as admissions of prior art.
[0005] In recent years, the growing popularity of digital media has created a demand for digital media player devices, which may be portable or non-portable. Further, convergence of electronic devices has resulted in the combination of an increasing number of functionalities into single electronic devices. For example, whereas cell phones, media players, personal organizers, cameras, and gaming systems were once provided only as separate electronic systems with their own unique capabilities, it is now possible to use a single device to make telephone calls, play audio and/or video media, maintain contact information, capture images, and play electronic games, among other functionalities.

[0006] With respect to media playback, some electronic devices provide for playback of audio data, video data, or both to a user. For example, music or other audio files may be stored on an electronic device and may be output to a user on demand. Further, electronic devices may also provide for storage and reproduction of image files, such as photographs, slideshows, and video images. While such audio files and image files may be transferred to the electronic device from some other device or the Internet, they may also or instead be acquired directly by the electronic device. For instance, the electronic device may include a microphone and a camera, allowing a user to capture audio and image data (e.g., still images and video images). In addition to media playback, electronic devices may also output audio associated with games, telephone calls, system operation, and the like.

SUMMARY

[0007] A summary of certain embodiments disclosed herein is set forth below. It should be understood that these aspects are presented merely to provide the reader with a brief summary of these certain embodiments and that these aspects are not intended to limit the scope of this disclosure. Indeed, this disclosure may encompass a variety of aspects that may not be set forth below.

[0008] The present disclosure generally relates to techniques for altering audio based on non-audio data. For example, in certain disclosed embodiments, an electronic device may alter an audio output based on image characteristics of processed image data, such as characteristics identified by an image signal processor or a video encoder. Various audio effects may be applied to alter an audio stream based on the image characteristics. In certain embodiments, these audio effects may include variation of one or more of pitch, tempo, frequency range, volume, reverb, or timbre based on the image characteristics. In other embodiments, audio output may be altered based also or instead on motion data, position data, or the like.

[0009] Various refinements of the features noted above may exist in relation to the presently disclosed embodiments. Additional features may also be incorporated in these various embodiments as well. These refinements and additional features may exist individually or in any combination. For instance, various features discussed below in relation to one or more of the illustrated embodiments may be incorporated into any of the above-described embodiments alone or in any combination. Again, the brief summary presented above is intended only to familiarize the reader with certain aspects and contexts of embodiments of the present disclosure without limitation to the claimed subject matter.

BRIEF DESCRIPTION OF THE DRAWINGS

[0010] Various aspects of this disclosure may be better understood upon reading the following detailed description and upon reference to the drawings, in which:

[0011] FIG. 1 is a block diagram depicting components of an electronic device that may provide audio alteration functionalities in accordance with aspects of the present disclosure;

[0012] FIG. 2 is a front view of a handheld electronic device in accordance with aspects of the present disclosure;

[0013] FIG. 3 is a rear view of the handheld electronic device of FIG. 2 in accordance with aspects of the present disclosure;

[0014] FIG. 4 is a perspective view of an electronic device of FIG. 1 in the form of a computer in accordance with aspects of the present disclosure;

[0015] FIG. 5 is a flowchart generally depicting a method for altering audio based on one or more image characteristics in accordance with aspects of the present disclosure;

[0016] FIG. 6 is a block diagram depicting the processing of image data and the alteration of an audio signal by components of an electronic device in accordance with aspects of the present disclosure;

[0017] FIG. 7 depicts a plurality of screens that may be displayed on the electronic device of FIG. 2 during execution of a media player application that provides for video playback functions in accordance with aspects of the present disclosure;

[0018] FIG. 8 depicts a plurality of screens that may be displayed on the electronic device of FIG. 2 during execution of an imaging application that may be utilized for acquiring live image data in accordance with aspects of the present disclosure;

[0019] FIG. 9 depicts a plurality of screens that may be displayed on the electronic device of FIG. 2 during execution of an image viewing application that may be utilized for viewing images stored on the device of FIG. 2 in accordance with aspects of the present disclosure;

[0020] FIG. 10 is a flowchart representative of one embodiment in which audio is altered in response to one or more motion characteristics of received image data in accordance with aspects of the present disclosure;

[0021] FIG. 11 illustrates the variation of an audio property, such as pitch, based on vertical motion in video data in accordance with aspects of the present disclosure;

[0022] FIG. 12 illustrates the variation of an audio property, such as volume, based on horizontal motion in video data in accordance with aspects of the present disclosure;
[0023] FIG. 13 illustrates the variation of an audio property based on zoom characteristics of video data in accordance with aspects of the present disclosure;

[0024] FIG. 14 is a flowchart generally depicting a method for altering audio based on image brightness data in accordance with aspects of the present disclosure;

[0025] FIG. 15 is a flowchart generally depicting a method for altering audio based on image sharpness data in accordance with aspects of the present disclosure;

[0026] FIG. 16 is a flowchart generally depicting a method for altering audio based on image color data in accordance with aspects of the present disclosure;

[0027] FIG. 17 is a flowchart generally depicting a method for altering audio based on detection of a feature of interest in image data, in accordance with aspects of the present disclosure;

[0028] FIG. 18 is a flowchart generally depicting a method for altering audio based on one or both of motion data or position data in accordance with aspects of the present disclosure;

[0029] FIG. 19 is a flowchart generally depicting a method for altering audio based on a determined speed characteristic in accordance with aspects of the present disclosure; and

[0030] FIG. 20 is a top view of the device of FIG. 2, generally depicting rotation of the device to effect audio alteration in accordance with aspects of the present disclosure.

DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS

[0031] One or more specific embodiments are described below. In an effort to provide a concise description of these embodiments, not all features of an actual implementation are described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.

[0032] When introducing elements of various embodiments described below, the articles "a," "an," and "the" are intended to mean that there are one or more of the elements. The terms "comprising," "including," and "having" are intended to be inclusive and mean that there may be additional elements other than the listed elements. Moreover, while the term "exemplary" may be used herein in connection with certain examples of aspects or embodiments of the presently disclosed subject matter, it will be appreciated that these examples are illustrative in nature and that the term "exemplary" is not used herein to denote any preference or requirement with respect to a disclosed aspect or embodiment. Additionally, it should be understood that references to "one embodiment," "an embodiment," "some embodiments," and the like are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the disclosed features.

[0033] As noted above, the present application is generally directed to techniques for altering an audio signal based on non-audio data, such as image data. In some embodiments, image data may be processed by an image signal processor, and various image characteristics or metrics, such as brightness, sharpness, and color statistics, may be provided to audio processing logic. Further, in some embodiments, image data may be encoded by a video encoder, which may provide additional image characteristics or metrics to the audio processing logic. Such additional image characteristics may be related to the encoding process, and may include, for example, motion vectors calculated by the encoder and encoding prediction errors.

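As a concrete illustration of the kind of metrics described above, the sketch below computes two simple per-frame statistics that an image signal processor might hand to audio processing logic. The patent does not give any formulas, so the function name, the mean-luminance "brightness," and the gradient-based "sharpness" proxy are all assumptions chosen for clarity.

```python
# Illustrative sketch only: "brightness" (mean luminance) and "sharpness"
# (mean absolute horizontal gradient) are assumed stand-ins for the image
# characteristics the disclosure mentions.

def image_metrics(frame):
    """Compute simple per-frame statistics of the kind an image signal
    processor might pass to audio processing logic.

    `frame` is a 2-D list of grayscale pixel values in [0, 255].
    """
    rows, cols = len(frame), len(frame[0])

    # Mean luminance over all pixels.
    brightness = sum(sum(row) for row in frame) / (rows * cols)

    # Crude sharpness proxy: mean absolute difference between
    # horizontally adjacent pixels.
    grad_total = sum(
        abs(row[x + 1] - row[x]) for row in frame for x in range(cols - 1)
    )
    sharpness = grad_total / (rows * (cols - 1))

    return {"brightness": brightness, "sharpness": sharpness}
```

A real implementation would run on the ISP or encoder hardware and would likely also report color statistics and motion vectors, as the paragraph above notes.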
[0034] The audio processing logic may then alter audio based on one or more of the received image characteristics. For example, the altering of audio by the audio processing logic may include generating a synthesized sound, varying aspects of the synthesized sound, and/or varying aspects of other audio data, such as that provided via an audio input device or stored in a memory. Aspects of the audio that may be varied include pitch, tempo, frequency response, equalization levels, volume, various additional processing effects, and so forth. In additional embodiments, audio may also or instead be altered by an electronic device based on motion of the device or location data.

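One way the mapping described in this paragraph could be sketched is below: brightness steers a low-pass filter cutoff and a normalized motion estimate scales tempo. The parameter names, value ranges, and formulas are hypothetical; the patent does not prescribe any specific mapping.

```python
# Hypothetical mapping from image characteristics to audio parameters;
# ranges and formulas are assumptions for illustration only.

def audio_parameters(metrics, base_tempo=120.0):
    """Derive audio-alteration parameters from image metrics.

    `metrics` may contain 'brightness' in [0, 255] and 'motion' in [0, 1]
    (e.g., a normalized total-motion estimate derived from encoder
    motion vectors).
    """
    brightness = metrics.get("brightness", 128.0)
    motion = min(max(metrics.get("motion", 0.0), 0.0), 1.0)

    # Brighter scenes open up a low-pass filter (cutoff in Hz).
    cutoff_hz = 200.0 + (brightness / 255.0) * (8000.0 - 200.0)

    # More motion raises the tempo, capped at +50% of the base.
    tempo_bpm = base_tempo * (1.0 + 0.5 * motion)

    return {"cutoff_hz": cutoff_hz, "tempo_bpm": tempo_bpm}
```

The same pattern extends naturally to the other aspects listed (pitch, equalization levels, volume, and so forth) by adding further metric-to-parameter rules.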
[0035] With these foregoing features in mind, a general description of electronic devices that may provide such audio alteration functionality is provided below. By way of example, FIG. 1 is a block diagram illustrating an electronic device, referred to by reference number 10, which may be configured to implement the above-discussed audio alteration techniques, in accordance with one embodiment of the present disclosure. Electronic device 10 may be any type of electronic device that includes capabilities for processing audio data and/or image data, which may include still images (e.g., pictures) or moving images (e.g., video). For instance, electronic device 10 may be a portable media player, a mobile phone, a laptop computer, a desktop computer, or the like. By way of example only, electronic device 10 may be a portable electronic device, such as a model of an iPod® or iPhone® available from Apple Inc. of Cupertino, Calif. In another embodiment, electronic device 10 may be a desktop or laptop computer, including a MacBook®, MacBook® Pro, MacBook Air®, iMac®, Mac® Mini, or Mac Pro®, also available from Apple Inc. In further embodiments, electronic device 10 may be a model of an electronic device from another manufacturer that is capable of processing image and/or audio data. As will be discussed further below, electronic device 10 may include circuitry or logic (e.g., audio processing logic 32) configured to process audio data in response to one or more device operation events, which may include image-related events, motion-related events, or location-related events, to name just a few.

[0036] As shown in FIG. 1, electronic device 10 may include various internal and/or external components which contribute to the function of device 10. Those of ordinary skill in the art will appreciate that the various functional blocks shown in FIG. 1 may include hardware elements (including circuitry), software elements (including computer code stored on a computer-readable medium) or a combination of both hardware and software elements. It should further be noted that FIG. 1 is merely one example of a particular implementation and is intended to illustrate the types of components that may be present in electronic device 10. For example, in the presently illustrated embodiment, these components may include input/output (I/O) ports 12, input structures 14, one or more processors 16, memory device 18, non-volatile storage 20, networking device 24, power source 26, display 28, image processing logic 30, and audio processing logic 32. Electronic device 10 may additionally include imaging subsystem 34, motion sensing device 36, positioning device 38, and audio input device 40 (e.g., a microphone), all of which may facilitate alteration of audio data in accordance with the presently disclosed techniques.

[0037] With regard to each of the illustrated components, I/O ports 12 may include ports configured to connect to a variety of external devices, such as headphones, or other electronic devices, such as computers, printers, projectors, external displays, modems, docking stations, and so forth. I/O ports 12 may support any interface type, such as a universal serial bus (USB) port, an IEEE-1394 port, and/or an AC/DC power connection port. In one embodiment, I/O ports 12 may include a proprietary port from Apple Inc. that may function to charge power source 26 (which may include one or more rechargeable batteries) of device 10, or transfer data between device 10 and an external source.

[0038] Input structures 14 may provide user input or feedback to processor(s) 16. For instance, input structures 14 may be configured to control one or more functions of electronic device 10, applications running on electronic device 10, and/or any interfaces or devices connected to or used by electronic device 10. By way of example only, input structures 14 may include buttons, sliders, switches, control pads, keys, knobs, scroll wheels, keyboards, mice, touchpads, and so forth, or some combination thereof. In one embodiment, input structures 14 may allow a user to navigate a graphical user interface (GUI) displayed on display 28. Further, in certain embodiments, input structures 14 may include a touch sensitive mechanism provided in conjunction with display 28. In such embodiments, a user may select or interact with displayed interface elements via the touch sensitive mechanism.

[0039] Processor(s) 16 may include one or more microprocessors, such as one or more "general-purpose" microprocessors, one or more application-specific processors (ASICs), or a combination of such processing components, which may control the general operation of electronic device 10. For example, processor(s) 16 may include one or more instruction set processors (e.g., RISC), graphics processors, audio processors and/or other related chipsets. In the illustrated embodiment, processor(s) 16 may include graphics processing unit (GPU) 42, which may operate in conjunction with image processing logic 30 to output image data to display 28.

[0040] Programs or instructions executed by processor(s) 16 may be stored in any suitable manufacture that includes one or more tangible, computer-readable media at least collectively storing the executed instructions or routines, such as, but not limited to, the memory devices and storage devices described below. Also, these programs (e.g., an operating system) encoded on such a computer program product may also include instructions that may be executed by processor(s) 16 to enable device 10 to provide various functionalities, including those described herein.

[0041] For example, instructions or data to be processed by processor(s) 16 may be stored in memory 18, which may include a volatile memory, such as random access memory (RAM); a non-volatile memory, such as read-only memory (ROM); or a combination of RAM and ROM devices. Memory 18 may store firmware for electronic device 10, such as a basic input/output system (BIOS), an operating system, various programs, applications, or any other routines that may be executed on electronic device 10, including user interface functions, processor functions, image acquisition functions, audio alteration functions, media playback functions, and so forth. In addition, memory 18 may include one or more frame buffers for buffering or caching image data.

[0042] The illustrated components may further include other forms of computer-readable media, such as non-volatile storage device 20, which may be utilized for persistent storage of data and/or instructions. Non-volatile storage 20 may include flash memory, a hard drive, or any other optical, magnetic, and/or solid-state storage media. By way of example, non-volatile storage 20 may be used to store data files, such as image data and audio data. For instance, in some embodiments, the image data that is processed by image processing logic 30 prior to being output to display 28 may be a still image file (e.g., picture) or a video file stored in storage device 20.

[0043] The components depicted in FIG. 1 further include network device 24, which may be a network controller or a network interface card (NIC). For example, network device 24 may provide for network connectivity over any wireless 802.11 standard or any other suitable networking standard, such as a local area network (LAN), a wide area network (WAN), such as an Enhanced Data Rates for GSM Evolution (EDGE) network or a 3G data network (e.g., based on the IMT-2000 standard), or the Internet. In certain embodiments, network device 24 may provide for a connection to an online digital media content provider, such as the iTunes® service, available from Apple Inc., through which a user may access, stream, or download digital audio or video to electronic device 10, which may then be played back and processed in accordance with the present techniques.

[0044] Display 28 may be used to display image data, which may include stored image data (e.g., picture or video files stored in storage device 20), streamed image data (e.g., from network device 24), as well as live captured image data (e.g., via imaging subsystem 34). Additionally, display 28 may display various images generated by the device 10, including a GUI for an operating system or other application. Display 28 may be any suitable display such as a liquid crystal display (LCD), plasma display, or an organic light emitting diode (OLED) display, for example. In one embodiment, display 28 may be provided in conjunction with a touch screen that may function as part of a control interface for device 10.

[0045] As mentioned above, electronic device 10 may include image processing logic 30 and audio processing logic 32, which may be configured to process image data and audio data, respectively. Such image and audio data may be captured by electronic device 10 (e.g., by a camera and microphone of electronic device 10), or may be received from another source and stored in electronic device 10. In various embodiments, audio processing logic 32 provides for the alteration of audio data that is to be output to a user via electronic device 10. As will be discussed in greater detail below, such audio alteration may be based on image data (which may be acquired via imaging subsystem 34 or in some other manner), motion events (e.g., provided via motion sensing device 36), location events (e.g., provided via positioning device 38), or some combination thereof.

[0046] Imaging subsystem 34 may be configured to capture still or moving images. For instance, imaging subsystem 34 may include one or more image capture devices, such as cameras having one or more image sensors. Imaging subsystem 34 may also include an image signal processor (ISP), which may be part of processor(s) 16. As will be appreciated, the ISP may process data acquired via the image sensors to generate a digital representation of the captured data, which may be displayed and/or stored on device 10. As will be discussed further below, in some embodiments, alteration of audio to be output to a user may be based on image characteristics (e.g., brightness level, sharpness level, color statistics, etc.) from imaging subsystem 34.

[0047] Motion sensing device 36 may be any device configured to measure motion or acceleration experienced by device 10, such as an accelerometer or a gyroscope. In one embodiment, motion sensing device 36 may be a three-axis accelerometer that includes a sensing element and an integrated circuit interface for providing the measured acceleration and/or motion data to processor(s) 16. Motion sensing device 36 may be configured to sense and measure various types of motion including, but not limited to, velocity, acceleration, rotation, and direction, any or all of which may be used as a basis for altering audio output by electronic device 10.

[0048] Electronic device 10 also includes positioning device 38. By way of example, positioning device 38 may be a GPS system, such as an Assisted GPS (A-GPS) system. Positioning device 38 may be configured to determine the geographic coordinates of device 10. Additionally, positioning device 38 may further determine course and velocity parameters from variation in the geographic coordinates. In one embodiment, audio processing logic 32 may alter audio output based on such data from positioning device 38.
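Deriving velocity from variation in geographic coordinates, as described above, amounts to dividing the great-circle distance between successive position fixes by the elapsed time. A sketch using the haversine formula follows; a real positioning subsystem would also filter noisy fixes, and the function name here is illustrative.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius in meters

def speed_mps(lat1: float, lon1: float,
              lat2: float, lon2: float,
              dt_seconds: float) -> float:
    """Estimate ground speed (m/s) from two GPS fixes taken dt_seconds apart.

    Uses the haversine formula for the great-circle distance between
    (lat1, lon1) and (lat2, lon2), given in decimal degrees.
    """
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2.0) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2.0) ** 2)
    distance = 2.0 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    return distance / dt_seconds
```

Course can be derived similarly from the bearing between successive fixes; either quantity could then feed the audio alteration logic.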
[0049] Additionally, electronic device 10 includes audio input device 40, which may be configured to receive audio signals. In one embodiment, audio input device 40 may include one or more audio receivers, such as microphones. Audio received via audio input device 40 may be stored in device 10, and may be altered in accordance with the present techniques.

[0050] Referring now to FIG. 2, electronic device 10 is illustrated in the form of portable handheld electronic device 50, which may be a model of an iPod® or iPhone® available from Apple Inc. It should be understood that while the illustrated handheld device 50 is generally described in the context of a portable digital media player and/or cellular phone, additional embodiments of handheld device 50 may incorporate additional functionalities, such as a camera, a portable gaming platform, a personal data organizer, or some combination thereof. Thus, depending on the functionalities provided by handheld electronic device 50, a user may listen to music, play video games, take pictures, and place telephone calls while moving freely with handheld device 50.

[0051] In the depicted embodiment, handheld device 50 includes enclosure 52, which may function to protect the interior components from physical damage and shield them from electromagnetic interference. Enclosure 52 may be formed from any suitable material or combination of materials, such as plastic, metal, or a composite material, and may allow certain frequencies of electromagnetic radiation to pass through to wireless communication circuitry (e.g., network device 24) within device 50.

[0052] As shown in the present embodiment, enclosure 52 includes user input structures 14 through which a user may interface with handheld device 50. For instance, each input structure 14 may be configured to control one or more respective device functions when pressed or actuated. By way of example, one or more of input structures 14 may be configured to invoke a “h
