(19) United States
(12) Patent Application Publication     (10) Pub. No.: US 2009/0089288 A1
     Petersen                           (43) Pub. Date: Apr. 2, 2009

(54) SYSTEM AND METHOD FOR FILTERING CONTENT ON A MOBILE DEVICE BASED ON CONTEXTUAL TAGGING
(75) Inventor:   Steven L. Petersen, Los Gatos, CA (US)

     Correspondence Address:
     CONCERT TECHNOLOGY AND WITHROW & TERRANOVA
     100 REGENCY FOREST DRIVE, SUITE 160
     CARY, NC 27518 (US)
(73) Assignee:   CONCERT TECHNOLOGY CORPORATION, Durham, NC (US)

(21) Appl. No.:  11/362,835

(22) Filed:      Sep. 27, 2007
Publication Classification

(51) Int. Cl.
     G06F 7/06   (2006.01)
(52) U.S. Cl. ..................................................... 707/9
(57)                         ABSTRACT

A system and method of contextually filtering content presented to a user on a mobile device based on contextual tagging. The user controls how content will be filtered by the mobile device by creating contextual tags and associating or tagging content with the contextual tags. The contextual tag includes a contextual behavior that is either satisfied or not based on the current context of the mobile device. During operation, content accessible to the mobile device is searched to determine which contextual tags are met based on the current context of the mobile device. Content tagged with contextual tags whose behavior is currently met based on the current context of the mobile device is filtered and presented to the user. This allows the automatic presentation of a more manageable subgroup of content to the user on the mobile device based on the current context of the mobile device.
Page 1 of 23
SNAP EXHIBIT 1007
Patent Application Publication    Apr. 2, 2009    Sheet 1 of 12    US 2009/0089288 A1

[FIG. 1: exemplary mobile device system and network architecture 10, including a server and a server database 26]
Patent Application Publication    Apr. 2, 2009    Sheet 2 of 12    US 2009/0089288 A1

[FIG. 2: flowchart of the overall contextual filtering process — establish one or more contextual tags and associated behaviors (30); associate contextual tags with content accessible by the mobile device (32); when done establishing contextual tags/tagging content (34), update the current context of the mobile device (36); scan the next content accessible to the mobile device; if the behavior(s) of the contextual tag associated with the content are met based on the current context, allow the content to be presented on the mobile device (46), otherwise filter the content from presentation on the mobile device (44); repeat until done scanning the content accessible on the mobile device]
Patent Application Publication    Apr. 2, 2009    Sheet 3 of 12    US 2009/0089288 A1

[FIGS. 3A and 3B: exemplary song file listings displayed on the mobile device before (FIG. 3A) and after (FIG. 3B) contextual filtering is activated]
Patent Application Publication    Apr. 2, 2009    Sheet 4 of 12    US 2009/0089288 A1

[FIG. 4: exemplary architecture of the mobile device 12, including user interface components, applications, tag management, network connectivity, and content storage]
Patent Application Publication    Apr. 2, 2009    Sheet 5 of 12    US 2009/0089288 A1

[FIG. 5: exemplary tag management architecture and exemplary contextual tags created using it, each tag carrying a name, a logical expression, a behavior, and initial context state data]
Patent Application Publication    Apr. 2, 2009    Sheet 6 of 12    US 2009/0089288 A1

[FIG. 6: flow diagram of creating a location-based contextual tag based on the mobile device's location]
Patent Application Publication    Apr. 2, 2009    Sheet 7 of 12    US 2009/0089288 A1

[FIG. 7: flow diagram of a user explicitly creating a location-based contextual tag for content]
Patent Application Publication    Apr. 2, 2009    Sheet 8 of 12    US 2009/0089288 A1

[FIG. 8: flow diagram of the mobile device implicitly creating a location-based contextual tag for content]
Patent Application Publication    Apr. 2, 2009    Sheet 9 of 12    US 2009/0089288 A1

[FIG. 9A: exemplary tag table containing examples of contextual tags created and associated with content]
Patent Application Publication    Apr. 2, 2009    Sheet 10 of 12    US 2009/0089288 A1

[FIG. 9B: exemplary tag table (continued)]
Patent Application Publication    Apr. 2, 2009    Sheet 11 of 12    US 2009/0089288 A1

[FIG. 10: flow diagram of evaluating a location-based contextual tag to determine if its location-based behavior is satisfied based on the current context location of the mobile device]
Patent Application Publication    Apr. 2, 2009    Sheet 12 of 12    US 2009/0089288 A1

[FIG. 11: flow diagram of evaluating a location-based contextual tag to determine if its location-based behavior is satisfied using a hypothetical location provided by the user]
US 2009/0089288 A1                                                Apr. 2, 2009
SYSTEM AND METHOD FOR FILTERING CONTENT ON A MOBILE DEVICE BASED ON CONTEXTUAL TAGGING

FIELD OF THE INVENTION

[0001] The present invention relates to a system and method of filtering content, including but not limited to multimedia content, on a mobile device based on contextual tagging. Content is filtered based on whether a current context of a “contextually aware” mobile device satisfies the contextual behavior defined in a contextual tag associated with the content.
BACKGROUND OF THE INVENTION

[0002] The development of small form factor, large memory capacity hard drives and other memory devices has facilitated the growth of mobile devices for accessing and playing digital media. Mobile devices are particularly useful because they facilitate convenient “on-the-go” access of digital media for their users. Media content is stored in local memory in the mobile device for access by the user when desired. An example of such a mobile device is the Apple® iPod® media player. The Apple® iPod® media player provides gigabytes of memory storage. Media software applications, such as Apple® iTunes® for example, are executed on a user’s computer to store and manage the user’s media library and facilitate downloading of desired media content to local memory in mobile devices.
[0003] Given the plethora of media content available, users may not have all desired media content stored on their mobile device. Thus, many mobile devices are increasingly being equipped with wireless communication capabilities. Wireless communications allow media devices to access media content not stored locally on the mobile device. Short-range wireless communication allows users to share media content with other users. Many manufacturers are also adding cellular communication capabilities to mobile devices so that media players can access media content over cellular networks from remote service providers. An example of such a mobile device is the Apple® iPhone®, which combines a cellular phone and media player into one mobile device.
[0004] Because of the plethora of media content available to users of mobile devices, both from locally stored and remotely accessed content, it is increasingly important to provide filtering capabilities. Without filtering, users may have to navigate through large and unmanageable media file listings to find desired media content. Filtering capabilities allow content to be provided to users in more manageable subgroups. To provide filtering, media content can be tagged with one or more static criteria that delineate the content in some manner. For example, if the media content are audio files, the audio files may include a genre tag. If an audio file is of a “Comedy” genre, the media item may be tagged with a “Comedy” genre tag in this example. Thus, if the user of the mobile device only wants to access audio files in the “Comedy” genre, the mobile device can consult the genre tag of the audio files to provide only those files having a “Comedy” genre tag.
[0005] One disadvantage of such filtering systems is that they use static criteria and are thus non-intelligent. The filtering criterion provided by the tag does not adapt to changes in the environment or context of the mobile device. For example, some media items tagged with a “Comedy” genre tag may be appropriate for some contexts such as home, but not for others such as a work place. Other media items may also be tagged with the “Comedy” genre tag, but may be appropriate for either home or work use. In such systems, media items tagged with “Comedy” genre tags would be filtered equally. Thus, the user may not be able to filter effectively based on the presence of the “Comedy” genre tag, because this filter may include media items that are both appropriate and inappropriate for a particular environment or context of the mobile device. If the mobile device could determine which “Comedy” media items were appropriate for which contexts on an individualized basis, the user could effectively use the “Comedy” genre filter without fear of a contextually inappropriate selection being made.
SUMMARY OF THE INVENTION
[0006] The present invention is a system and method of contextually filtering content presented to a user on a mobile device based on contextual tagging. The user controls how content will be filtered by the mobile device during operation by creating contextual tags and associating or tagging content with the contextual tags. The contextual tag includes a defined contextual behavior. The contextual behavior is an expression that is either satisfied or not based on the current context of the mobile device, a set of logical rules that apply to the current context, and an optional initial context. In this manner, the user controls the context that must exist for the mobile device in order for particular tagged content to be presented during operation. The user may use contextual tags to tag content deemed appropriate for certain contexts, but inappropriate for others. The mobile device is equipped to be “context aware.” The mobile device may use a sensed context to define the initial context of a contextual tag when created, as well as the current context of the mobile device during operation. The context of the mobile device can be any condition or surrounding able to be sensed by the mobile device, including the user’s interaction with the mobile device, that can change and can be sensed or determined.
[0007] During operation, after contextual tags have been created and assigned to content by the user, content is searched to determine which items have contextual tags whose behavior is satisfied based on the current context of the mobile device. Content tagged with contextual tags whose contextual behaviors are satisfied is presented to the user. This means the particular content was previously designated by the user to be presented based on the current context of the mobile device. Content tagged with contextual tags whose contextual behaviors are not satisfied based on the current context of the mobile device is filtered and not presented to the user. In this manner, the present invention facilitates managing, and automatically being presented with, a more manageable subgroup of content on the mobile device based on the context of the mobile device from the user’s perspective. This is opposed to solely filtering content based on static criteria that do not adapt or change based on the context of the mobile device.
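The filtering pass described in this paragraph can be sketched in code. This is an illustrative sketch only, not the patented implementation; the names (`filter_content`, the tag dictionaries, the context keys) are hypothetical and the behavior is modeled as a simple predicate over the current context.

```python
# Illustrative sketch of the contextual filtering pass described above.
# All names are hypothetical; the patent does not specify an implementation.

def filter_content(content_items, current_context):
    """Return only the items whose contextual tags are all satisfied
    by the mobile device's current context."""
    presented = []
    for item in content_items:
        tags = item.get("tags", [])
        # Untagged content is left visible in this sketch; a real device
        # could equally choose to hide it.
        if all(tag["behavior"](current_context) for tag in tags):
            presented.append(item)
    return presented

# Example: a "work" tag whose behavior is satisfied only at the work location.
work_tag = {"name": "work", "behavior": lambda ctx: ctx.get("location") == "office"}
library = [
    {"title": "quarterly-report.pdf", "tags": [work_tag]},
    {"title": "comedy-show.mp3", "tags": []},
]

at_work = filter_content(library, {"location": "office"})
at_home = filter_content(library, {"location": "home"})
```

With the device's context at the office, both items pass; at home, only the untagged item remains, which mirrors the "more manageable subgroup" behavior described above.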
[0008] For example, the contextual behavior defined by a contextual tag may be location based. A location-based contextual tag may include a contextual behavior defined as the mobile device being at or in close proximity to a specified location as the initial context. How close is decided by the user via a logical expression assigned to the contextual tag that defines the desired behavior. For example, the desired behavior assigned by the user may be that the mobile device must be located within ten miles of a work place for the contextual behavior to be satisfied. When the detected current location of the mobile device (i.e., the current context) indicates that the mobile device is located within ten miles of the work place in this example (i.e., the initial context), the behavior will be satisfied. The mobile device would then make any content tagged with this location-based contextual tag available to the user.
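The ten-mile location behavior in the preceding paragraph could be evaluated with a great-circle distance check. The following is a sketch under stated assumptions: the patent does not prescribe a distance formula, and the function names and coordinates are invented for illustration.

```python
import math

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in miles between two points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 3958.8 * 2 * math.asin(math.sqrt(a))  # 3958.8 = mean Earth radius in miles

def location_behavior_satisfied(initial, current, radius_miles=10.0):
    """True when the current location (current context) lies within
    radius_miles of the initial (tagged) location -- the tag's logical
    expression in this example."""
    return miles_between(initial[0], initial[1], current[0], current[1]) <= radius_miles

# Hypothetical work place and two device positions.
work = (35.79, -78.78)
nearby = (35.82, -78.75)    # a few miles from the work place
far_away = (36.80, -76.30)  # well over ten miles away
```

Calling `location_behavior_satisfied(work, nearby)` reflects the satisfied case; `far_away` reflects the filtered case.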
[0009] The user can establish and associate contextual tags having any desired behavior with the content. A contextual tag may include only one contextual behavior. Alternatively, a contextual tag may include more than one contextual behavior to form a composite contextual behavior. Contextual behavior included in contextual tags can be based on any contextual attribute(s) that can be sensed by the mobile device. As examples, the contextual attributes could include conditions such as the location of the mobile device, time of day, day of week, date, velocity, acceleration, direction of travel, weather, amount of sunlight, proximity of the mobile device to other users, state or data of applications running on the mobile device, or combinations thereof. For contextual attributes that require sensing of the mobile device’s external environment or surroundings, one or more context sensors or other hardware components, which may be associated with the mobile device, may be used to determine the current context of the mobile device. In this manner, the mobile device is “context aware.” Contextual behaviors can also be based on the context of the user and/or their interaction with the mobile device. For example, the user may establish contextual tags for “home” and “work” behaviors. Content assigned a contextual tag associated with “home” behavior may not be appropriate for a “work” context, and vice versa.
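A composite contextual behavior, as described above, can be modeled by combining several single-attribute behaviors. This sketch is illustrative only; the combinator and the attribute names are assumptions, and the patent leaves open how constituent behaviors are combined (AND is used here).

```python
# Sketch of a composite contextual behavior: the tag is satisfied only
# when every constituent behavior is satisfied. Names are illustrative.

def at_home(ctx):
    return ctx.get("location") == "home"

def is_weekend(ctx):
    return ctx.get("day_of_week") in ("Saturday", "Sunday")

def composite(*behaviors):
    """Combine several behaviors with logical AND into one behavior."""
    return lambda ctx: all(b(ctx) for b in behaviors)

weekend_at_home = composite(at_home, is_weekend)

sat_home = weekend_at_home({"location": "home", "day_of_week": "Saturday"})
mon_home = weekend_at_home({"location": "home", "day_of_week": "Monday"})
```

A tag built on `weekend_at_home` would expose its content only when both the location and day-of-week attributes of the current context match.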
[0010] The contextual tags may be established in data structures stored in association with the mobile device. These data structures may be implemented using object-oriented design (OOD) principles. OOD may be particularly well suited since it defines methods and attributes so as to associate behavior with data. For example, when a user desires to create a contextual tag, a tag factory object may be called upon to create a contextual tag object from a tag class. The tag factory may also be called upon to allow the user to create and associate one or more behavior objects with contextual tag objects. A contextual tag object does not contain any behavior evaluations. Instead, the one or more behavior objects associated with a contextual tag object are called upon. The behavior evaluations in the behavior objects are separated from the contextual tag objects to support decoupling, thus allowing easier reuse of behavior objects by other contextual tag objects. If the one or more contextual behavior objects associated with a contextual tag object are satisfied by the current context according to rules and state attributes in the behavior objects, the content tagged with the contextual tag will be made accessible by the mobile device to the user.
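The object-oriented arrangement described above — a tag factory creating tag objects that delegate evaluation to reusable behavior objects — can be sketched as follows. The class and method names are hypothetical; only the decoupling of tags from behavior evaluations is taken from the text.

```python
# Sketch of the object-oriented design described above: behavior objects
# are kept separate from tag objects so they can be reused across tags.
# All class and method names are hypothetical.

class Behavior:
    """Holds the rules and state attributes; evaluates a given context."""
    def __init__(self, rule):
        self.rule = rule

    def is_satisfied(self, context):
        return self.rule(context)

class ContextualTag:
    """Contains no behavior evaluations itself; delegates to behaviors."""
    def __init__(self, name, behaviors):
        self.name = name
        self.behaviors = behaviors

    def is_satisfied(self, context):
        return all(b.is_satisfied(context) for b in self.behaviors)

class TagFactory:
    """Creates tag objects and behavior objects on the user's behalf."""
    def create_behavior(self, rule):
        return Behavior(rule)

    def create_tag(self, name, behaviors):
        return ContextualTag(name, behaviors)

factory = TagFactory()
near_work = factory.create_behavior(lambda ctx: ctx.get("location") == "work")
# The same behavior object is reused by two different tags (decoupling).
work_tag = factory.create_tag("work", [near_work])
meetings_tag = factory.create_tag("meetings", [near_work])
```

Because `near_work` lives outside either tag, editing its rule once would change the evaluation of every tag that reuses it, which is the benefit of the decoupling noted above.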
[0011] In an alternative embodiment, the mobile device may allow the user to manually force or override the current context even if the forced context does not naturally exist based on the current context of the mobile device. This allows a user to force the mobile device to filter content contextually based on the context desired by the user as opposed to the natural context sensed by the mobile device. For example, the user may want to access all content contextually tagged with a work location contextual tag while the user is on vacation. Instead of the user having to retag the content designated for a work context, the user can simply override the current context of the mobile device to force a work location context as the current context.
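The override embodiment above amounts to substituting a user-supplied context for the sensed one. A minimal sketch, with invented names and a stubbed sensor, might look like this:

```python
# Sketch of the manual-override embodiment: a forced context, when set,
# takes the place of the sensed (natural) context. Names are illustrative
# and the sensor is a stand-in for real hardware (GPS, clock, etc.).

class DeviceContext:
    def __init__(self):
        self.forced = None  # context forced by the user, if any

    def sense(self):
        # Stand-in for the device's real sensors.
        return {"location": "beach"}

    def current(self):
        """Context used for filtering: the forced one if set, else sensed."""
        return self.forced if self.forced is not None else self.sense()

ctx = DeviceContext()
natural = ctx.current()             # sensed context while on vacation
ctx.forced = {"location": "work"}   # user forces a work location context
overridden = ctx.current()
```

With the override in place, the filtering pass would evaluate work-tagged content as if the device were at the work location, with no retagging required.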
[0012] In another embodiment, the mobile device may be directed to implicitly contextually tag content without the user having to explicitly assign contextual tags. This allows the user to later recall content based on making a selection from previous contexts in which the user browsed and/or accessed content. For example, as a user accesses content in a normal fashion, the mobile device may automatically and silently, in the background and unknown to the user, contextually tag the content. If the user desires to later recall specific content, but can only remember the context in which the content was previously accessed, the user can review and select contextual tags assigned by the mobile device to recall content.
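Implicit tagging, as described above, can be sketched as silently recording the context at each access and later recalling by context. The storage scheme and function names below are assumptions made for illustration only.

```python
# Sketch of implicit contextual tagging: as the user accesses content,
# the device silently records the context of access so the content can
# later be recalled by that context. Names are hypothetical.

access_log = {}  # title -> list of contexts in which it was accessed

def access(title, current_context):
    """Play or browse content, implicitly tagging it with the context."""
    access_log.setdefault(title, []).append(dict(current_context))
    return title

def recall_by_context(predicate):
    """Recall titles whose recorded access contexts match a predicate."""
    return sorted(t for t, ctxs in access_log.items()
                  if any(predicate(c) for c in ctxs))

access("song-a.mp3", {"location": "gym"})
access("song-b.mp3", {"location": "home"})
gym_titles = recall_by_context(lambda c: c["location"] == "gym")
```

Here the user who only remembers "the song I played at the gym" can recall it by selecting the implicitly recorded gym context.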
[0013] The mobile device employed by the present invention may be any type of mobile device, including but not limited to a cellular phone, a personal digital assistant (PDA), or a portable media player, as examples. The mobile device may or may not have communication capability. Communication capabilities may include wired communication, wireless communication, or both. If the mobile device has communication capability, content and/or the context of the mobile device, which is used to determine if the contextual behaviors of contextual tags for filtering are satisfied, can both be obtained from a remote system, such as a central content server for example.

[0014] Those skilled in the art will appreciate the scope of the present invention and realize additional aspects thereof after reading the following detailed description of the preferred embodiments in association with the accompanying drawing figures.
BRIEF DESCRIPTION OF THE DRAWING FIGURES

[0015] The accompanying drawing figures incorporated in and forming a part of this specification illustrate several aspects of the invention, and together with the description serve to explain the principles of the invention.
[0016] FIG. 1 illustrates an exemplary mobile device system and network architecture according to one embodiment of the present invention;

[0017] FIG. 2 is a flowchart illustrating an exemplary overall process of establishing contextual tags and filtering content based on whether the current context of the mobile device satisfies the contextual behavior of contextual tags associated with the content, according to one embodiment of the present invention;

[0018] FIG. 3A illustrates an exemplary song file listing displayed on a mobile device before contextual filtering is activated, according to an embodiment of the present invention;

[0019] FIG. 3B illustrates an exemplary song file listing displayed on a mobile device after contextual filtering is activated and applied to the song listing illustrated in FIG. 3A, according to an embodiment of the present invention;

[0020] FIG. 4 illustrates an exemplary mobile device architecture that may be employed to allow the mobile device to be “contextually aware” and filter contextually tagged content based on the context of the mobile device, according to one embodiment of the present invention;

[0021] FIG. 5 illustrates an exemplary tag management architecture and exemplary tags created using the tag management architecture;

[0022] FIG. 6 is a flow diagram illustrating an example of creating a location-based contextual tag for filtering content based on the mobile device’s location, according to one embodiment of the present invention;

[0023] FIG. 7 is a flow diagram illustrating an example of a user explicitly creating a location-based contextual tag for content, according to one embodiment of the present invention;

[0024] FIG. 8 is a flow diagram illustrating an example of the mobile device implicitly creating a location-based contextual tag for content, according to one embodiment of the present invention;

[0025] FIGS. 9A and 9B illustrate an exemplary tag table containing examples of contextual tags created and associated with content, according to one embodiment of the present invention;

[0026] FIG. 10 is a flow diagram illustrating an example of evaluating a location-based contextual tag to determine if its location-based behavior is satisfied based on the current context location of the mobile device, according to one embodiment of the present invention; and

[0027] FIG. 11 is a flow diagram illustrating an example of evaluating a location-based contextual tag to determine if its location-based behavior is satisfied using a hypothetical location provided by the user, according to one embodiment of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0028] The embodiments set forth below represent the necessary information to enable those skilled in the art to practice the invention and illustrate the best mode of practicing the invention. Upon reading the following description in light of the accompanying drawing figures, those skilled in the art will understand the concepts of the invention and will recognize applications of these concepts not particularly addressed herein. It should be understood that these concepts and applications fall within the scope of the disclosure and the accompanying claims.
[0029] The present invention is a system and method of contextually filtering content presented to a user on a mobile device based on contextual tagging. The user controls how content will be filtered by the mobile device during operation by creating contextual tags and associating or tagging content with the contextual tags. The contextual tag includes a defined contextual behavior. The contextual behavior is an expression that is either satisfied or not based on the current context of the mobile device, a set of logical rules that apply to the current context, and an optional initial context. In this manner, the user controls the context that must be present for the mobile device in order for particular tagged content to be presented during operation. The user may use contextual tags to tag content deemed appropriate for certain contexts, but inappropriate for others. The mobile device is equipped to be “context aware.” The mobile device may use a sensed context to define the initial context of a contextual tag when created, as well as the current context of the mobile device during operation. The context of the mobile device can be any condition or surrounding able to be sensed by the mobile device, including the user’s interaction with the mobile device, that can change and can be sensed or determined.
[0030] During operation, after contextual tags have been created and assigned to content by the user, content is searched to determine which items have contextual tags whose behavior is satisfied based on the current context of the mobile device. Content tagged with contextual tags whose contextual behaviors are satisfied is presented to the user. This means the particular content was previously designated by the user to be presented based on the current context of the mobile device. Content tagged with contextual tags whose contextual behaviors are not satisfied based on the current context of the mobile device is filtered and not presented to the user. In this manner, the present invention facilitates managing, and automatically being presented with, a more manageable subgroup of content on the mobile device based on the context of the mobile device from the user’s perspective. This is opposed to solely filtering content based on static criteria that do not adapt or change based on the context of the mobile device.
[0031] For example, the contextual behavior defined by a contextual tag may be location based. A location-based contextual tag may include a contextual behavior defined as the mobile device being at or in close proximity to a specified location as the initial context. How close is decided by the user via a logical expression assigned to the contextual tag that defines the desired behavior. For example, the desired behavior assigned by the user may be that the mobile device must be located within ten miles of a work place for the contextual behavior to be satisfied. When the detected current location of the mobile device (i.e., the current context) indicates that the mobile device is located within ten miles of the work place in this example (i.e., the initial context), the behavior will be satisfied. The mobile device would then make any content tagged with this location-based contextual tag available to the user.
[0032] The user can establish and associate contextual tags having any desired behavior with the content. A contextual tag may include only one contextual behavior. Alternatively, a contextual tag may include more than one contextual behavior to form a composite contextual behavior. Contextual behavior included in contextual tags can be based on any contextual attribute(s) that can be sensed by the mobile device. As examples, the contextual attributes could include conditions such as the location of the mobile device, time of day, day of week, date, velocity, acceleration, direction of travel, weather, amount of sunlight, proximity of the mobile device to other users, state or data of applications running on the mobile device, or combinations thereof. For contextual attributes that require sensing of the mobile device’s external environment or surroundings, one or more context sensors or other hardware components, which may be associated with the mobile device, may be used to determine the current context of the mobile device. In this manner, the mobile device is “context aware.” Contextual behaviors can also be based on the context of the user and/or their interaction with the mobile device. For example, the user may establish contextual tags for “home” and “work” behaviors. Content assigned a contextual tag associated with “home” behavior may not be appropriate for a “work” context, and vice versa.
[0033] FIG. 1 illustrates an exemplary system and network architecture 10 for mobile devices that may employ a contextual filtering system and method according to embodiments of the present invention. The mobile device architecture 10 in FIG. 1 will be described in conjunction with FIG. 2, which is a flowchart illustrating an exemplary contextual filtering process. The contextual filtering process involves contextually tagging content with contextual tags. The mobile device filters content based on whether the current context of the mobile device satisfies the contextual behavior of the contextual tag associated with the content. As illustrated in FIG. 1, an exemplary platform for a mobile device 12 is provided. The mobile device 12 includes an output device 14, such as a display, and an input device 16, such as input keys, to allow a user 18 to control, interact with, and be presented content on the mobile device 12. The mobile device 12 may include a cellular handset with an accompanying cellular antenna 20 for cellular communications. However, the mobile device 12 can be any type of device adapted to store and manage digital content, including but not limited to a personal data assistant (PDA), a personal media player (PMP), or other handheld device. The mobile device 12 can include multimedia-related functionality, including searching, organizing, browsing, previewing, rendering, and/or sharing/transferring content. In one embodiment of the present invention, one or more mobile devices 12 may be included that participate in a wirelessly-connected network 22. However, it should be understood that the present invention does not require the mobile device 12 to be networked. Further, the mobile device 12 may be connected to a wired network.
[0034] Before any content can be tagged, the user 18 first establishes one or more contextual tags (step 30, FIG. 2). Because contextual tags are defined by contextual behaviors, the user 18 associates one or more contextual behaviors with the contextual tags (step 30). A contextual behavior may be a logical expression that can be determined as either being satisfied or not based on the current context of the mobile device 12. An initial context or state may also be associated with the contextual tag. The initial context may be used as part of the contextual behavioral expression to determine if the current context of the mobile device 12 satisfies the contextual behavior. For example, if the contextual behavior is location-based, the desired location in order for the behavior to be satisfied is stored as an initial context or location. The initial location (i.e., init
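The location-based example above — a tag storing a desired location as its initial context, evaluated against the device's current context to decide which content to present — can be sketched as follows. This is an illustrative sketch only, with invented names; it is not the implementation claimed in the application:

```python
from dataclasses import dataclass
from typing import Any, Dict, List

# Current context of the device, e.g. {"location": "home"}.
Context = Dict[str, Any]

@dataclass
class LocationTag:
    name: str
    initial_location: str  # the desired location stored with the tag

    def is_satisfied(self, context: Context) -> bool:
        # The behavior is a logical expression: satisfied when the
        # device's current location matches the tag's initial location.
        return context.get("location") == self.initial_location

@dataclass
class Content:
    title: str
    tags: List[LocationTag]

def filter_content(library: List[Content], context: Context) -> List[Content]:
    # Present only content tagged with at least one contextual tag
    # whose behavior is satisfied by the current context.
    return [c for c in library if any(t.is_satisfied(context) for t in c.tags)]

home = LocationTag("home", initial_location="home")
work = LocationTag("work", initial_location="work")
library = [
    Content("family photos", [home]),
    Content("quarterly report", [work]),
]

current = {"location": "home"}
print([c.title for c in filter_content(library, current)])  # ['family photos']
```

Filtering this way yields the smaller, context-appropriate subgroup of content the abstract describes: only items whose tags are currently met are presented.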
