(19) United States
(12) Patent Application Publication    (10) Pub. No.: US 2009/0089288 A1
(43) Pub. Date: Apr. 2, 2009

Petersen

US 20090089288A1
(54) SYSTEM AND METHOD FOR FILTERING
     CONTENT ON A MOBILE DEVICE BASED
     ON CONTEXTUAL TAGGING
(75) Inventor:    Steven L. Petersen, Los Gatos, CA (US)
Correspondence Address:
CONCERT TECHNOLOGY AND WITHROW & TERRANOVA
100 REGENCY FOREST DRIVE, SUITE 160
CARY, NC 27518 (US)
(73) Assignee:    CONCERT TECHNOLOGY CORPORATION, Durham, NC (US)
(21) Appl. No.:    11/862,835

(22) Filed:        Sep. 27, 2007
Publication Classification

(51) Int. Cl.
     G06F 7/06            (2006.01)
(52) U.S. Cl. ........................................ 707/9
(57)                    ABSTRACT
A system and method of contextually filtering content presented to a user on a mobile device based on contextual tagging. The user controls how content will be filtered by the mobile device by creating contextual tags and associating or tagging content with the contextual tags. The contextual tag includes a contextual behavior that is either satisfied or not based on the current context of the mobile device. During operation, content accessible to the mobile device is searched to determine which contextual tags are met based on the current context of the mobile device. Content tagged with contextual tags whose behavior is currently met based on the current context of the mobile device is filtered and presented to the user. This allows the automatic presentation of a more manageable subgroup of content to the user on the mobile device based on the current context of the mobile device.
[Front-page drawing: excerpt of FIG. 1 showing a server 26 and a server database]

SNAP EXHIBIT 1007
Page 1 of 23

`

[Sheet 1 of 12, FIG. 1: exemplary mobile device system and network architecture, showing a server 26 and a server database]

[Sheet 2 of 12, FIG. 2: flowchart of the contextual filtering process. Recoverable steps: establish one or more contextual tags and associated behaviors (30); associate contextual tags with content accessible by mobile device (32); done establishing contextual tags/tagging content? (34); update current context of mobile device (36); scan next content accessible to mobile device; if the behavior(s) of the contextual tag associated with the content are not met based on the current context, filter out the content from presentation on the mobile device (44), otherwise allow the content to be presented on the mobile device; repeat while more content is accessible on the mobile device]
`
`

`

[Sheet 3 of 12, FIGS. 3A and 3B: exemplary song file listings (e.g., symphonies and piano sonatas) displayed on a mobile device before (FIG. 3A) and after (FIG. 3B) contextual filtering is applied]
`
`
`
`

`

[Sheet 4 of 12, FIG. 4: exemplary mobile device architecture. Recoverable blocks: device 12, tag table, tag management, applications, content database/storage (non-volatile), network connectivity, and user interface components]
`
`

`

[Sheet 5 of 12, FIG. 5: exemplary tag management architecture. Recoverable elements: a simple tag (74) with a name (string); a contextual tag (76, 82) with a behavior (78, 80) having a name (string), IS EXPRESSED BY (boolean), EVALUATED (boolean), rules (logical statements), and an initial context (sensor data); and a contextual tag (84) with a complex (composite) behavior (86)]
`
`
`

`

[Sheet 6 of 12, FIG. 6: flow diagram of creating a location-based contextual tag, involving the application, tag management, the tag table, and hardware sensors (steps 100-112). Recoverable steps include creating a new contextual tag object and a new behavior object, requesting and returning the current (GPS) location, setting the initial state, returning the ID of the behavior, and inserting the newly created contextual tag into the tag table]
`
`
`
`
`
`
`
`
`
`

`

[Sheet 7 of 12, FIG. 7: flow diagram of a user explicitly creating a location-based contextual tag for content, involving the UI, application, database, tag management, and content (steps 120-136). Recoverable steps include the user requesting and retrieving content, requesting that the content be tagged, creating the location-based contextual tag, and contextually tagging the content with it]
`
`
`
`
`
`
`
`
`
`

`

[Sheet 8 of 12, FIG. 8: flow diagram of implicitly creating a location-based contextual tag for multimedia content while filling a database browser (steps 140-154). Recoverable steps include the user requesting playback of specific content, querying for and returning the content, beginning playback, requesting creation of a location-based contextual tag, returning the ID of the created tag, and contextually tagging the content with the newly created location-based contextual tag]
`
`
`
`
`
`
`
`
`

`

[Sheet 9 of 12, FIG. 9A: exemplary tag table (68) with columns for tag name (160, 162), behavior data type (164), initial state (166), and rules (168), plus child behaviors (170). Recoverable example contextual tags include SEASON, WORK, LOCATION, PARK, and DO NOT DISTURB; recoverable example behaviors compare sensor readings against initial states, e.g., GPS coordinates (current location is within 500 feet of the initial location), dates (current and initial dates are in December, January, and February (winter); March, April, and May (spring); June, July, and August (summer); or September, October, and November (autumn)), an altimeter reading (current and initial readings are above 10,000 feet), and a light sensor reading]
`
`
`
`
`
`

`

[Sheet 10 of 12, FIG. 9B: exemplary child behaviors 2 (172) and 3 (174) of a composite behavior, with rules, data types, and initial states. Recoverable details include a day-of-week behavior (e.g., current day is Sunday between 9 AM and 5 PM), a time behavior with no initial state (e.g., current time is between 10 PM and 6 AM), and a mobile phone profile setting behavior satisfied when the current and initial settings are both "SILENT"]
`
`
`
`

`

[Sheet 11 of 12, FIG. 10: flow diagram of evaluating a location-based contextual tag via an application using the current location, involving the application, tag management, the tag table, the contextual tag, the behavior, and hardware sensors (steps 180-200). Recoverable steps include searching for a particular contextual tag, requesting whether the behavior is satisfied, retrieving the current context (location), determining whether the behavior is satisfied, and returning the result of whether the contextual tag is satisfied]
`
`
`
`
`

`

[Sheet 12 of 12, FIG. 11: flow diagram of evaluating a location-based contextual tag via an application using a hypothetical location, involving the application, tag management, the tag table, the contextual tag, the behavior, and hardware sensors (steps 210-230). Recoverable steps include searching for a contextual tag expressed with a given location, retrieving and returning recently used context data, determining whether the behavior is satisfied based on the given location rather than the sensed one, and returning whether the behavior is satisfied]
`
`
`
`

`

US 2009/0089288 A1                                            Apr. 2, 2009
`
SYSTEM AND METHOD FOR FILTERING
CONTENT ON A MOBILE DEVICE BASED
ON CONTEXTUAL TAGGING
`
FIELD OF THE INVENTION

[0001] The present invention relates to a system and method of filtering content, including but not limited to multimedia content, on a mobile device based on contextual tagging. Content is filtered based on whether a current context of a "contextually aware" mobile device satisfies the contextual behavior defined in a contextual tag associated with the content.
BACKGROUND OF THE INVENTION

[0002] The development of small form factor, large memory capacity hard drives and other memory devices has facilitated growth of mobile devices for accessing and playing digital media. Mobile devices are particularly useful because they facilitate convenient "on-the-go" access of digital media for their users. Media content is stored in local memory in the mobile device for access by the user when desired. An example of such a mobile device is the Apple® iPod® media player. The Apple® iPod® media player provides gigabytes of memory storage. Media software applications, such as Apple® iTunes® for example, are executed on a user's computer to store and manage the user's media library and facilitate downloading of desired media content to local memory in mobile devices.
[0003] Given the plethora of media content available, users may not have all desired media content stored on their mobile device. Thus, many mobile devices are increasingly being equipped with wireless communication capabilities. Wireless communications allow media devices to access media content not stored locally on the mobile device. Short-range wireless communication allows users to share media content with other users. Many manufacturers are also adding cellular communication capabilities to mobile devices so that media players can access media content over cellular networks from remote service providers. An example of such a mobile device is the Apple® iPhone®, which combines a cellular phone and a media player in one mobile device.
[0004] Because of the plethora of media content available to users of mobile devices, both from locally stored and remotely accessed content, it is increasingly important to provide filtering capabilities. Without filtering, users may have to navigate through large and unmanageable media file listings to find desired media content. Filtering capabilities allow content to be provided to users in more manageable subgroups. To provide filtering, media content can be tagged with one or more static criteria that delineate the content in some manner. For example, if the media content consists of audio files, the audio files may include a genre tag. If an audio file is of a "Comedy" genre, the media item may be tagged with a "Comedy" genre tag in this example. Thus, if the user of the mobile device only wants to access audio files in the "Comedy" genre, the mobile device can consult the genre tags of the audio files to provide only those files having a "Comedy" genre tag.
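The static tag filtering described in this background paragraph can be sketched in a few lines. This is an illustrative reading only, not code from the disclosure; the function and field names are hypothetical:

```python
# Sketch of static, non-contextual filtering: each file carries a fixed
# genre tag, and filtering is a simple match against that tag.
# All names here are hypothetical illustrations.

def filter_by_genre(audio_files, genre):
    """Return only the files whose static genre tag matches."""
    return [f for f in audio_files if f.get("genre") == genre]

library = [
    {"title": "Stand-Up Set", "genre": "Comedy"},
    {"title": "Symphony No. 5", "genre": "Classical"},
    {"title": "Office Humor", "genre": "Comedy"},
]

comedy = filter_by_genre(library, "Comedy")
```

Note that the filter has no notion of context: both "Comedy" items match equally, which is exactly the disadvantage paragraph [0005] describes.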
[0005] One disadvantage of such filtering systems is that they use static criteria and are thus non-intelligent. The filtering criterion provided by the tag does not adapt to changes in the environment or context of the mobile device. For example, some media items tagged with a "Comedy" genre tag may be appropriate for some contexts, such as home, but not for others, such as a work place. Other media items may also be tagged with the "Comedy" genre tag but be appropriate for either home or work use. In such systems, media items tagged with "Comedy" genre tags would be filtered equally. Thus, the user may not be able to filter effectively based on the presence of the "Comedy" genre tag, because this filter may include media items that are both appropriate and inappropriate for a particular environment or context of the mobile device. If the mobile device could determine which "Comedy" media items were appropriate for which contexts on an individualized basis, the user could effectively use the "Comedy" genre filter without fear of a contextually inappropriate selection being made.
SUMMARY OF THE INVENTION
[0006] The present invention is a system and method of contextually filtering content presented to a user on a mobile device based on contextual tagging. The user controls how content will be filtered by the mobile device during operation by creating contextual tags and associating or tagging content with the contextual tags. The contextual tag includes a defined contextual behavior. The contextual behavior is an expression that is either satisfied or not based on the current context of the mobile device, a set of logical rules that apply to the current context, and an optional initial context. In this manner, the user controls the context that must exist for the mobile device in order for particular tagged content to be presented during operation. The user may use contextual tags to tag content deemed appropriate for certain contexts but inappropriate for others. The mobile device is equipped to be "context aware." The mobile device may use a sensed context to define the initial context of a contextual tag when created, as well as the current context of the mobile device during operation. The context of the mobile device can be any condition or surrounding able to be sensed by the mobile device, including the user's interaction with the mobile device, that can change and can be sensed or determined.
[0007] During operation, after contextual tags have been created and assigned to content by the user, content is searched to determine which items have contextual tags whose behavior is satisfied based on the current context of the mobile device. Content tagged with contextual tags whose contextual behaviors are satisfied is presented to the user. This means the particular content was previously designated by the user to be presented based on the current context of the mobile device. Content tagged with contextual tags whose contextual behaviors are not satisfied based on the current context of the mobile device is filtered and not presented to the user. In this manner, the present invention facilitates managing and automatically being presented with a more manageable subgroup of content on the mobile device based on the context of the mobile device from the user's perspective. This is opposed to solely filtering content based on static criteria that do not adapt or change based on the context of the mobile device.
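The operation-time scan described above can be sketched as a short loop. This is a minimal illustration of the described behavior, not the patented implementation; every name (the function, the tag dictionaries, the context keys) is a hypothetical stand-in:

```python
# Sketch of the operation-time filtering loop: evaluate the behavior of each
# item's contextual tags against the device's current context, and present
# only items whose behaviors are all satisfied.

def filter_content(content_items, current_context):
    presented = []
    for item in content_items:
        tags = item.get("contextual_tags", [])
        # An untagged item is left visible in this sketch; that is a design
        # choice of the example, not something the description specifies.
        if all(tag["behavior"](current_context) for tag in tags):
            presented.append(item)
    return presented

# A behavior is an expression over the current context that is either
# satisfied or not.
work_tag = {"behavior": lambda ctx: ctx["location"] == "work"}
home_tag = {"behavior": lambda ctx: ctx["location"] == "home"}

items = [
    {"title": "Quarterly Report", "contextual_tags": [work_tag]},
    {"title": "Comedy Special", "contextual_tags": [home_tag]},
]

at_work = filter_content(items, {"location": "work"})
```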
`
[0008] For example, the contextual behavior defined by a contextual tag may be location based. A location-based contextual tag may include a contextual behavior defined as the mobile device being at or in close proximity to a specified location as the initial context. How close is decided by the user via a logical expression assigned to the contextual tag that defines the desired behavior. For example, the desired behavior assigned by the user may be that the mobile device must be located within ten miles of a work place for the contextual behavior to be satisfied. When the detected current location of the mobile device (i.e., the current context) indicates that the mobile device is located within ten miles of the workplace in this example (i.e., the initial context), the behavior will be satisfied. The mobile device would then make any content tagged with this location-based contextual tag available to the user.
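A location-based behavior of this kind might be sketched as follows. The distance formula, radius, and coordinates are illustrative assumptions (the disclosure does not specify how proximity is computed):

```python
# Sketch of a location-based behavior: satisfied when the device's current
# location is within a user-chosen radius of the initial-context location
# stored with the tag. Uses the haversine great-circle approximation.
import math

def distance_miles(a, b):
    """Approximate great-circle distance between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 3959.0 * 2 * math.asin(math.sqrt(h))  # Earth radius ~3959 miles

def within_radius(initial_location, radius_miles):
    """Build a behavior satisfied when the current location is close enough."""
    def behavior(current_context):
        return distance_miles(current_context["location"],
                              initial_location) <= radius_miles
    return behavior

workplace = (35.79, -78.78)            # hypothetical initial context
behavior = within_radius(workplace, 10.0)

nearby = behavior({"location": (35.80, -78.79)})   # about a mile away
far_away = behavior({"location": (36.50, -78.78)}) # tens of miles away
```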
`
[0009] The user can establish and associate contextual tags having any desired behavior with the content. A contextual tag may include only one contextual behavior. Alternatively, a contextual tag may include more than one contextual behavior to form a composite contextual behavior. Contextual behavior included in contextual tags can be based on any contextual attribute(s) that can be sensed by the mobile device. As examples, the contextual attributes could include conditions such as the location of the mobile device, time of day, day of week, date, velocity, acceleration, direction of travel, weather, amount of sunlight, proximity of the mobile device to other users, state or data of applications running on the mobile device, or combinations thereof. For contextual attributes that require sensing of the mobile device's external environment or surroundings, one or more context sensors or other hardware components associated with the mobile device may be used to determine the current context of the mobile device. In this manner, the mobile device is "context aware." Contextual behaviors can also be based on the context of the user and/or their interaction with the mobile device. For example, the user may establish contextual tags for "home" and "work" behaviors. Content assigned a contextual tag associated with "home" behavior may not be appropriate for a "work" context, and vice versa.
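A composite contextual behavior combining several sensed attributes could be sketched with simple combinators. The combinator names and context keys are hypothetical; the description only says that multiple behaviors may be combined:

```python
# Sketch of composite behaviors: a tag may combine several child behaviors
# (location, day of week, time of day, ...) into one expression.

def all_of(*behaviors):
    """Composite behavior satisfied only when every child is satisfied."""
    return lambda ctx: all(b(ctx) for b in behaviors)

def any_of(*behaviors):
    """Composite behavior satisfied when at least one child is satisfied."""
    return lambda ctx: any(b(ctx) for b in behaviors)

at_work = lambda ctx: ctx["location"] == "work"
weekday = lambda ctx: ctx["day"] in {"Mon", "Tue", "Wed", "Thu", "Fri"}
daytime = lambda ctx: 9 <= ctx["hour"] < 17

# "Work hours" behavior: at the workplace, on a weekday, during the day.
work_hours = all_of(at_work, weekday, daytime)

ctx = {"location": "work", "day": "Tue", "hour": 10}
```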
[0010] The contextual tags may be established in data structures stored in association with the mobile device. These data structures may be implemented using object-oriented design (OOD) principles. OOD may be particularly well suited since it defines methods and attributes so as to associate behavior with data. For example, when a user desires to create a contextual tag, a tag factory object may be called upon to create a contextual tag object from a tag class. The tag factory may also be called upon to allow the user to create and associate one or more behavior objects with contextual tag objects. A contextual tag object does not contain any behavior evaluations. Instead, the one or more behavior objects associated with a contextual tag object are called upon. The behavior evaluations in the behavior objects are separated from the contextual tag objects to support decoupling, thus allowing easier reuse of behavior objects by other contextual tag objects. If the one or more contextual behavior objects associated with a contextual tag object are satisfied by the current context according to the rules and state attributes in the behavior objects, the content tagged with the contextual tag will be made accessible by the mobile device to the user.
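The object-oriented structure described in this paragraph (a tag factory creating tag objects, with evaluation delegated to separate, reusable behavior objects) can be sketched as below. Class and method names are illustrative, not taken from the disclosure:

```python
# Sketch of the OOD structure: the tag holds no evaluation logic; it
# delegates to Behavior objects, which can be shared across tags.

class Behavior:
    """Holds the rules and state; tags themselves contain no evaluations."""
    def __init__(self, rule, initial_context=None):
        self.rule = rule                      # logical expression over context
        self.initial_context = initial_context

    def is_satisfied(self, current_context):
        return self.rule(current_context, self.initial_context)

class ContextualTag:
    def __init__(self, name, behaviors):
        self.name = name
        self.behaviors = behaviors            # evaluation is delegated

    def is_satisfied(self, current_context):
        return all(b.is_satisfied(current_context) for b in self.behaviors)

class TagFactory:
    def create_behavior(self, rule, initial_context=None):
        return Behavior(rule, initial_context)

    def create_tag(self, name, behaviors):
        return ContextualTag(name, behaviors)

factory = TagFactory()
near_home = factory.create_behavior(
    lambda ctx, init: ctx["location"] == init, initial_context="home")
# The same Behavior object is reused by two different tags (decoupling).
movies_tag = factory.create_tag("movies", [near_home])
games_tag = factory.create_tag("games", [near_home])
```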
[0011] In an alternative embodiment, the mobile device may allow the user to manually force or override the current context even if the forced context does not naturally exist based on the current context of the mobile device. This allows a user to force the mobile device to filter content contextually based on the context desired by the user as opposed to the natural context sensed by the mobile device. For example, the user may want to access all content contextually tagged with a work location contextual tag while the user is on vacation. Instead of the user having to retag the content designated for a work context, the user can simply override the current context of the mobile device to force a work location context as the current context.
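One way to model this override, as an assumed sketch rather than the disclosed design, is a context provider in which a user-forced value takes precedence over the sensed one:

```python
# Sketch of the manual-override embodiment: a forced context, when set,
# takes precedence over the sensed context during behavior evaluation.

class ContextProvider:
    def __init__(self, sense_fn):
        self.sense_fn = sense_fn   # reads the device's real sensors
        self.override = None       # user-forced context, if any

    def force(self, context):
        self.override = context

    def clear(self):
        self.override = None

    def current(self):
        return self.override if self.override is not None else self.sense_fn()

provider = ContextProvider(lambda: {"location": "beach"})  # on vacation
work_behavior = lambda ctx: ctx["location"] == "work"

sensed_ok = work_behavior(provider.current())  # sensed context: not satisfied
provider.force({"location": "work"})           # user forces a work context
forced_ok = work_behavior(provider.current())  # now satisfied, no retagging
```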
`
[0012] In another embodiment, the mobile device may be directed to implicitly contextually tag content without the user having to explicitly assign contextual tags. This allows the user to later recall content by making a selection from previous contexts in which the user browsed and/or accessed content. For example, as a user accesses content in a normal fashion, the mobile device may automatically contextually tag the content, silently in the background and unknown to the user. If the user desires to later recall specific content, but can only remember the context in which the content was previously accessed, the user can review and select contextual tags assigned by the mobile device to recall the content.
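The implicit-tagging embodiment can be sketched as a background recorder keyed by content identifier. The class, its methods, and the context keys are hypothetical illustrations of the described idea:

```python
# Sketch of implicit tagging: each time content is accessed, the device
# silently records the context, so content can later be recalled by the
# context in which it was used.

class ImplicitTagger:
    def __init__(self):
        self.tags = {}  # content id -> list of contexts observed at access

    def on_access(self, content_id, current_context):
        """Called in the background whenever content is accessed."""
        self.tags.setdefault(content_id, []).append(dict(current_context))

    def recall(self, **context_query):
        """Return content ids accessed in a context matching the query."""
        return sorted(
            cid for cid, contexts in self.tags.items()
            if any(all(ctx.get(k) == v for k, v in context_query.items())
                   for ctx in contexts))

tagger = ImplicitTagger()
tagger.on_access("song-1", {"location": "gym", "day": "Sat"})
tagger.on_access("song-2", {"location": "office", "day": "Mon"})

gym_songs = tagger.recall(location="gym")  # recall by remembered context
```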
`
[0013] The mobile device employed by the present invention may be any type of mobile device, including but not limited to a cellular phone, a personal digital assistant (PDA), or a portable media player, as examples. The mobile device may or may not have communication capability. Communication capabilities may include wired communication, wireless communication, or both. If the mobile device has communication capability, the content and/or the context of the mobile device, which is used to determine if the contextual behaviors of contextual tags for filtering are satisfied, can both be obtained from a remote system, such as a central content server for example.
[0014] Those skilled in the art will appreciate the scope of the present invention and realize additional aspects thereof after reading the following detailed description of the preferred embodiments in association with the accompanying drawing figures.
BRIEF DESCRIPTION OF THE DRAWING FIGURES

[0015] The accompanying drawing figures incorporated in and forming a part of this specification illustrate several aspects of the invention, and together with the description serve to explain the principles of the invention.
[0016] FIG. 1 illustrates an exemplary mobile device system and network architecture according to one embodiment of the present invention;
[0017] FIG. 2 is a flowchart illustrating an exemplary overall process of establishing contextual tags and filtering content based on whether the current context of the mobile device satisfies the contextual behavior of contextual tags associated with the content, according to one embodiment of the present invention;
[0018] FIG. 3A illustrates an exemplary song file listing displayed on a mobile device before contextual filtering is activated, according to an embodiment of the present invention;
[0019] FIG. 3B illustrates an exemplary song file listing displayed on a mobile device after contextual filtering is activated and applied to the song listing illustrated in FIG. 3A, according to an embodiment of the present invention;
[0020] FIG. 4 illustrates an exemplary mobile device architecture that may be employed to allow the mobile device to be "contextually aware" and filter contextually tagged content based on the context of the mobile device, according to one embodiment of the present invention;
[0021] FIG. 5 illustrates an exemplary tag management architecture and exemplary tags created using the tag management architecture;
[0022] FIG. 6 is a flow diagram illustrating an example of creating a location-based contextual tag for filtering content based on the mobile device's location, according to one embodiment of the present invention;
[0023] FIG. 7 is a flow diagram illustrating an example of a user explicitly creating a location-based contextual tag for content, according to one embodiment of the present invention;
[0024] FIG. 8 is a flow diagram illustrating an example of the mobile device implicitly creating a location-based contextual tag for content, according to one embodiment of the present invention;
[0025] FIGS. 9A and 9B illustrate an exemplary tag table containing examples of contextual tags created and associated with content, according to one embodiment of the present invention;
[0026] FIG. 10 is a flow diagram illustrating an example of evaluating a location-based contextual tag to determine if its location-based behavior is satisfied based on the current context location of the mobile device, according to one embodiment of the present invention; and
[0027] FIG. 11 is a flow diagram illustrating an example of evaluating a location-based contextual tag to determine if its location-based behavior is satisfied using a hypothetical location provided by the user, according to one embodiment of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0028] The embodiments set forth below represent the necessary information to enable those skilled in the art to practice the invention and illustrate the best mode of practicing the invention. Upon reading the following description in light of the accompanying drawing figures, those skilled in the art will understand the concepts of the invention and will recognize applications of these concepts not particularly addressed herein. It should be understood that these concepts and applications fall within the scope of the disclosure and the accompanying claims.
[0029] The present invention is a system and method of contextually filtering content presented to a user on a mobile device based on contextual tagging. The user controls how content will be filtered by the mobile device during operation by creating contextual tags and associating or tagging content with the contextual tags. The contextual tag includes a defined contextual behavior. The contextual behavior is an expression that is either satisfied or not based on the current context of the mobile device, a set of logical rules that apply to the current context, and an optional initial context. In this manner, the user controls the context that must be present for the mobile device in order for particular tagged content to be presented during operation. The user may use contextual tags to tag content deemed appropriate for certain contexts but inappropriate for others. The mobile device is equipped to be "context aware." The mobile device may use a sensed context to define the initial context of a contextual tag when created, as well as the current context of the mobile device during operation. The context of the mobile device can be any condition or surrounding able to be sensed by the mobile device, including the user's interaction with the mobile device, that can change and can be sensed or determined.
[0030] During operation, after contextual tags have been created and assigned to content by the user, content is searched to determine which items have contextual tags whose behavior is satisfied based on the current context of the mobile device. Content tagged with contextual tags whose contextual behaviors are satisfied is presented to the user. This means the particular content was previously designated by the user to be presented based on the current context of the mobile device. Content tagged with contextual tags whose contextual behaviors are not satisfied based on the current context of the mobile device is filtered and not presented to the user. In this manner, the present invention facilitates managing and automatically being presented with a more manageable subgroup of content on the mobile device based on the context of the mobile device from the user's perspective. This is opposed to solely filtering content based on static criteria that do not adapt or change based on the context of the mobile device.
[0031] For example, the contextual behavior defined by a contextual tag may be location based. A location-based contextual tag may include a contextual behavior defined as the mobile device being at or in close proximity to a specified location as the initial context. How close is decided by the user via a logical expression assigned to the contextual tag that defines the desired behavior. For example, the desired behavior assigned by the user may be that the mobile device must be located within ten miles of a work place for the contextual behavior to be satisfied. When the detected current location of the mobile device (i.e., the current context) indicates that the mobile device is located within ten miles of the workplace in this example (i.e., the initial context), the behavior will be satisfied. The mobile device would then make any content tagged with this location-based contextual tag available to the user.
[0032] The user can establish and associate contextual tags having any desired behavior with the content. A contextual tag may include only one contextual behavior. Alternatively, a contextual tag may include more than one contextual behavior to form a composite contextual behavior. Contextual behavior included in contextual tags can be based on any contextual attribute(s) that can be sensed by the mobile device. As examples, the contextual attributes could include conditions such as the location of the mobile device, time of day, day of week, date, velocity, acceleration, direction of travel, weather, amount of sunlight, proximity of the mobile device to other users, state or data of applications running on the mobile device, or combinations thereof. For contextual attributes that require sensing of the mobile device's external environment or surroundings, one or more context sensors or other hardware components associated with the mobile device may be used to determine the current context of the mobile device. In this manner, the mobile device is "context aware." Contextual behaviors can also be based on the context of the user and/or their interaction with the mobile device. For example, the user may establish contextual tags for "home" and "work" behaviors. Content assigned a contextual tag associated with "home" behavior may not be appropriate for a "work" context, and vice versa.
[0033] FIG. 1 illustrates an exemplary system and network
architecture 10 for mobile devices that may employ a contextual
filtering system and method according to embodiments
of the present invention. The mobile device architecture 10 in
FIG. 1 will be described in conjunction with FIG. 2, which is
a flowchart illustrating an exemplary contextual filtering
process. The contextual filtering process involves contextually
tagging content with contextual tags. The mobile device
filters content based on whether the current context of the
mobile device satisfies the contextual behavior of the
contextual tag associated with the content. As illustrated in FIG. 1,
an exemplary platform for a mobile device 12 is provided.
The mobile device 12 includes an output device 14, such as a
display, and an input device 16, such as input keys, to allow a
user 18 to control, interact with, and be presented content on the
mobile device 12. The mobile device 12 may include a
cellular handset with an accompanying cellular antenna 20 for
cellular communications. However, the mobile device 12 can
be any type of device adapted to store and manage digital
content, including but not limited to a personal data assistant
(PDA), a personal media player (PMP), or other handheld
device. The mobile device 12 can include multimedia-related
functionality, including searching, organizing, browsing,
previewing, rendering, and/or sharing/transferring content. In
one embodiment of the present invention, one or more mobile
devices 12 may be included that participate in a wirelessly-
connected network 22. However, it should be understood that
the present invention does not require the mobile device 12 to
be networked. Further, the mobile device 12 may be
connected to a wired network.
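The filtering step described above, in which the device keeps only content whose tag behavior is satisfied by the current context, can be sketched as a simple loop over tagged items. This is a minimal sketch of the general technique, not the patented implementation; TaggedContent, filter_content, and the sample data are all hypothetical.

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict, List

# Hypothetical context snapshot; keys are illustrative only.
Context = Dict[str, Any]

@dataclass
class TaggedContent:
    """A content item tagged with one contextual behavior
    (a predicate over the current device context)."""
    title: str
    behavior: Callable[[Context], bool]

def filter_content(items: List[TaggedContent], ctx: Context) -> List[TaggedContent]:
    """Return only the items whose contextual behavior is satisfied
    by the device's current context, hiding the rest from the user."""
    return [item for item in items if item.behavior(ctx)]

library = [
    TaggedContent("quarterly_report.pdf", lambda c: c["location"] == "office"),
    TaggedContent("vacation_photos", lambda c: c["location"] == "home"),
]

# The device currently senses that it is at the office, so only
# office-appropriate content is presented.
visible = filter_content(library, {"location": "office"})
print([item.title for item in visible])  # ['quarterly_report.pdf']
```

Re-running the same loop whenever the sensed context changes is what yields the smaller, context-appropriate subgroup of content the abstract describes.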

[0034] Before any content can be tagged, the user 18 first
establishes one or more contextual tags (step 30, FIG. 2).
Because contextual tags are defined by contextual behaviors,
the user 18 associates one or more contextual behaviors with
the contextual tags (step 30). A contextual behavior may be a
logical expression that can be determined as either being
satisfied or not based on the current context of the mobile
device 12. An initial context or state may also be associated
with the contextual tag. The initial context may be used as part
of the contextual behavioral expression to determine if the
current context of the mobile device 12 satisfies the
contextual behavior. For example, if the contextual behavior is
location-based, the desired location in order for the behavior to be
satisfied is stored as an initial context or location. The initial
location (i.e., initial context) may be a hard-coded location that
is not programmable by the user 18. For example, the initial
location may be a known place or location. Alternatively, the
initial location (i.e., initial context) may be programmable.
The initial location may be directly set by the user 18, or
directed by the user 18 to be set by the context sensed by the
mobile device 12 when the contextual tag is created. In any of
these cases, this allows the mobile device 12 during operation
to