(10) Patent No.: US 8,780,130 B2
Morris
(45) Date of Patent: Jul. 15, 2014

715/784, 788, 792, 794-804; 717/107-109;
717/113-114; 707/705, 758
See application file for complete search history.
`
`
`
`
Int. Cl.
G09G 5/00   (2006.01)
G09G 5/02   (2006.01)
G06F 3/041  (2006.01)
G06F 15/16  (2006.01)
G06F 17/00  (2006.01)
G06F 3/00   (2006.01)
G06F 3/048  (2013.01)
G06F 9/44   (2006.01)
U.S. Cl.
USPC ......... 345/581; 345/173; 345/589; 345/630;
709/203; 709/218; 715/200; 715/204; 715/275;
715/700; 715/788; 717/107; 717/114
Field of Classification Search
USPC ....... 345/581, 589, 619, 360, 501, 538, 156,
345/168, 172-173, 175, 179, 620, 623-625,
345/629; 709/201-203, 217-218, 219;
715/200, 273, 275, 700, 733-734, 751,
715/760, 763-764, 864, 860-861, 843, 845,
715/851, 855-856, 828, 829, 830, 833, 781,
`
METHODS, SYSTEMS, AND COMPUTER
PROGRAM PRODUCTS FOR BINDING
ATTRIBUTES BETWEEN VISUAL
COMPONENTS
`
`Inventor: Robert Paul Morris, Raleigh, NC (US)
`
`Assignee: Sitting Man, LLC, Raleigh, NC (US)
`
Notice: Subject to any disclaimer, the term of this
patent is extended or adjusted under 35
U.S.C. 154(b) by 634 days.

Appl. No.: 12/956,008

Filed: Nov. 30, 2010
`
`Prior Publication Data
`
`US 2012/0133662 A1
`
`May 31, 2012
`
`
(56) References Cited

U.S. PATENT DOCUMENTS
`
7,080,088 B1*     7/2006  Lau ......................... 1/1
2003/0018609 A1*  1/2003  Phillips et al. .......... 707/1
2006/0271853 A1* 11/2006  Marcos et al. ........... 715/700
2006/0294470 A1* 12/2006  Marcos et al. ........... 715/764
2009/0254610 A1* 10/2009  Arthursson .............. 709/203
2010/0037154 A1*  2/2010  Marcos et al. ........... 715/760

OTHER PUBLICATIONS
`
Microsoft Developer Network, “Object Binding Sample,” Jul. 15,
2010, http://msdn.microsoft.com/en-us/library/8e36eeyx%28v=vs.
90%29.aspx, last accessed Mar. 20, 2014.
Microsoft Developer Network, “Object Binding in Visual Studio,”
2010, http://msdn.microsoft.com/en-us/library/ms233815(v=vs.
100).aspx, last accessed Mar. 20, 2014.
`
`* cited by examiner
`
Primary Examiner - Wesner Sajous
`
(74) Attorney, Agent, or Firm - Patrick E. Caldwell,
The Caldwell Firm, LLC
`
(57) ABSTRACT
Methods and systems are described for binding attributes
between visual components. A first visual component, includ-
ing a first presentation space for presenting first data by an
operating first application, is detected. Binding information,
for the first application, is identified that specifies a mapping
between a first visual attribute of the first visual component
and a second visual attribute of a second visual component
including a second presentation space for presenting second
data by a second application. A change to the first visual
attribute is detected. In response to the detection of the
change, change information is automatically sent to change
the second visual attribute according to the mapping.
`45 Claims, 11 Drawing Sheets
`
Detect a first visual component including a first presentation space for
presenting first data by an operating first application

Identify binding information, for the first application, that specifies a
mapping between a first visual attribute of the first visual component
and a second visual attribute of a second visual component including
a second presentation space for presenting second data by a second
application

Detect a first change to the first visual attribute

Send change information automatically, in response to detecting the
first change, to change the second visual attribute according to the
mapping
`MICROSOFT CORP. EX. 1008
`Page 1 of 28
`
`
`
[FIG. 1 (Sheet 1 of 11): block diagram of an exemplary hardware device included in and/or otherwise providing an execution environment.]
`
`
`
[FIG. 2 (Sheet 2 of 11): flow diagram]
Detect a first visual component including a first presentation space for
presenting first data by an operating first application
Identify binding information, for the first application, that specifies a
mapping between a first visual attribute of the first visual component
and a second visual attribute of a second visual component including
a second presentation space for presenting second data by a second
application
Detect a first change to the first visual attribute
Send change information automatically, in response to detecting the
first change, to change the second visual attribute according to the
mapping
`
`
`
[FIG. 3 (Sheet 3 of 11): arrangement of components: UI Monitor, Binding Director, Binding Monitor, Change Director.]
`
`
`
[FIG. 4a (Sheet 4 of 11): Execution Environment 401a, including Application 403a with Presentation Controller 435a and UI Element Handler 433a; UI Monitor 402a; Binding Director 404a; Binding Monitor 406a; Change Director; Input Driver 441a; Network Stack 411a; Application Protocol Component 413a.]
`
`
`
[FIG. 4b (Sheet 5 of 11): Execution Environment 401b, including Network Application Agent 405b with Content Handler 431b and UI element handler; Content Manager 415b; UI Monitor 402b; Binding Director 404b; Binding Monitor 406b; Change Director; Presentation Controller 435b; GUI Subsystem 437b; Graphics Subsystem 439b; Input Driver 441b; Network Stack 411b; Application Protocol Component 413b.]
`
`
`
[FIG. 4c (Sheet 6 of 11): Execution Environment 401c, including First Application 403-1c with Presentation Controller and UI Element Handler 433-1c; Second Application 403-2c with Presentation Controller 435-2c and UI Element Handler; UI Monitor 402c; Binding Director 404c; Binding Monitor 406c; Change Director 408c; GUI Subsystem; Graphics Subsystem 439c; Input Driver 441c; Network Stack 411c; Application Protocol Component 413c.]
`
`
`
[FIG. 4d (Sheet 7 of 11): Execution Environment 401d, including Network Application 403d with Controller 417d, View Subsystem, Model Subsystem 419d, Template Engine 423d, Template Database, and Model Database 421d; UI Monitor 402d; Binding Director 404d; Binding Monitor 406d; Change Director 408d; Network Application Platform 409d; Network Stack 411d; Application Protocol Component 413d.]
`
`
`
[FIG. 5 (Sheet 8 of 11): network diagram with a User Node and an Application Provider Node.]
`
`
`
[FIG. 6a (Sheet 9 of 11): Display Presentation Space 602a containing a "First App" window 604-1a with menus (File, Edit, View, Bookmarks, Tools, Help), a location bar 610-1a showing http://mysite.OoOT.com, a Presentation Space 608-1a, and a "Second App" window 606-1a.]
`
`
`
[FIG. 6b (Sheet 10 of 11): Display Presentation Space 602b containing a "First App" window 604-1b with menus (File, Edit, View, Bookmarks, Tools, Help) and Presentation Space 608-1b, a "2nd" window 604-2b, and a "3rd" window 604-3b.]
`
`
`
[FIG. 7 (Sheet 11 of 11): binding information 702.]

<visual-binding>
  <application path="c:\apps\editors\htmlEdit.exe" id="editor" \>
  <application path="c:\utilities\search\filesearch.exe" id="navigator" \>

  <attribute-binding id="editor.main.state" op-id="init" >
    <bind id="navigator.main.state" op-id="init" param="c:\\docs\\web" \>
  </attribute-binding>

  <attribute-binding id="editor.main.size" op-id="max" symmetric="TRUE">
    <bind id="navigator.main.size" op-id="min" \>
  </attribute-binding>

  <attribute-binding id="editor.main.size" op-id="change" symmetric="TRUE">
    <bind id="navigator.main.size" op-id="opposite" \>
  </attribute-binding>

  <attribute-binding id="editor.main.size" op-id="resize" symmetric="FALSE">
    <bind id="navigator.main.location" op-id="empty-space" \>
  </attribute-binding>

  <attribute-binding id="editor.main.location" op-id="change" symmetric="TRUE">
    <bind id="navigator.main.location" op-id="empty-space" \>
  </attribute-binding>

  <attribute-binding id="editor.main.state" op-id="on-focus" symmetric="TRUE">
    <bind id="navigator.main.location" op-id="empty-space" \>
  </attribute-binding>

  <attribute-binding id="editor.main.transparency" op-id="change" >
    <bind id="navigator.main.transparency" op-id="match" \>
  </attribute-binding>
</visual-binding>

Fig. 7
`
`
`
`
Objects and advantages of the present invention will
become apparent to those skilled in the art upon reading this
description in conjunction with the accompanying drawings,
in which like reference numerals have been used to designate
like or analogous elements, and in which:
FIG. 1 is a block diagram illustrating an exemplary hard-
ware device included in and/or otherwise providing an execu-
tion environment in which the subject matter may be imple-
mented;
FIG. 2 is a flow diagram illustrating a method for binding
attributes between visual components according to an aspect
of the subject matter described herein;
FIG. 3 is a block diagram illustrating an arrangement of
components for binding attributes between visual compo-
nents according to another aspect of the subject matter
described herein;
FIG. 4a is a block diagram illustrating an arrangement of
components for binding attributes between visual compo-
nents according to another aspect of the subject matter
described herein;
FIG. 4b is a block diagram illustrating an arrangement of
components for binding attributes between visual compo-
nents according to another aspect of the subject matter
described herein;
FIG. 4c is a block diagram illustrating an arrangement of
components for binding attributes between visual compo-
nents according to another aspect of the subject matter
described herein;
FIG. 4d is a block diagram illustrating an arrangement of
components for binding attributes between visual compo-
nents according to another aspect of the subject matter
described herein;
FIG. 5 is a network diagram illustrating an exemplary
system for binding attributes between visual components
according to another aspect of the subject matter described
herein;
FIG. 6a is a diagram illustrating a user interface presented
via a display according to another aspect of the subject matter
described herein;
FIG. 6b is a diagram illustrating a user interface presented
via a display according to another aspect of the subject matter
described herein; and
FIG. 7 is an illustration of binding information according to
another aspect of the subject matter described herein.
DETAILED DESCRIPTION
`
METHODS, SYSTEMS, AND COMPUTER
PROGRAM PRODUCTS FOR BINDING
ATTRIBUTES BETWEEN VISUAL
COMPONENTS
`
`BACKGROUND
`
While some applications can be used alone, some applica-
tions are used together. Often there is no integration and/or
cooperation between or among applications used at the same
time by a user. Even in application suites, cooperation is
limited to features that ease data sharing between or among
applications in a particular application suite. For example,
documents often include both text and media such as images
from pictures, graphs, and drawings. Word processors pro-
vide rich feature sets for creating and editing text, but provide
relatively weak or no features for creating and editing other
forms of data. As a result, users work on text for a document
in a word processor, images in an image editor, and drawings
using a drawing tool such as a computer aided design (CAD)
tool. Users spend significant time managing the user inter-
faces of these various applications in order to access the data
desired in the application desired.
`Accordingly, there exists a need for methods, systems, and
`computer program products for binding attributes between
`visual components.
`
`SUMMARY
`
The following presents a simplified summary of the dis-
closure in order to provide a basic understanding to the reader.
This summary is not an extensive overview of the disclosure
and it does not identify key/critical elements of the invention
or delineate the scope of the invention. Its sole purpose is to
present some concepts disclosed herein in a simplified form
as a prelude to the more detailed description that is presented
later.
Methods and systems are described for binding attributes
between visual components. In one aspect, the method
includes detecting a first visual component including a first
presentation space for presenting first data by an operating
first application. The method further includes identifying
binding information, for the first application, that specifies a
mapping between a first visual attribute of the first visual
component and a second visual attribute of a second visual
component including a second presentation space for present-
ing second data by a second application. The method still
further includes detecting a first change to the first visual
attribute. The method additionally includes, in response to
detecting the first change, automatically sending change
information to change the second visual attribute according to
the mapping.
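The four operations recited above can be sketched in ordinary code. The following Python sketch is illustrative only; the class and function names are hypothetical and do not appear in the specification:

```python
# Illustrative sketch of the claimed method; all names are hypothetical.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class VisualComponent:
    # A visual component with a presentation space, modeled here as a
    # dictionary of visual attributes (size, location, transparency, ...).
    attributes: Dict[str, object] = field(default_factory=dict)

@dataclass
class Binding:
    # Binding information: maps a first visual attribute to a second
    # visual attribute of another component through a transform.
    source_attr: str
    target: VisualComponent
    target_attr: str
    transform: Callable[[object], object]

def apply_change(first: VisualComponent, bindings: List[Binding],
                 attr: str, value: object) -> None:
    """Detect a change to the first visual attribute and automatically
    send change information according to the mapping."""
    first.attributes[attr] = value          # the detected first change
    for b in bindings:
        if b.source_attr == attr:           # mapping applies to this attribute
            b.target.attributes[b.target_attr] = b.transform(value)

editor = VisualComponent({"size": "normal"})
navigator = VisualComponent({"size": "normal"})
# When the editor window is maximized, minimize the navigator window.
bindings = [Binding("size", navigator, "size",
                    lambda v: "min" if v == "max" else v)]
apply_change(editor, bindings, "size", "max")
```

Maximizing the editor here automatically sets the navigator's size attribute to "min", mirroring the max/min binding illustrated later in FIG. 7.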
Further, a system for binding attributes between visual
components is described. The system includes an execution
environment including an instruction processing unit config-
ured to process an instruction included in at least one of a user
interface monitor component, a binding director component,
a binding monitor component, and a change director compo-
nent. The system includes the user interface monitor compo-
nent configured for detecting a first visual component includ-
ing a first presentation space for presenting first data by an
operating first application. The system further includes the
binding director component configured for identifying bind-
ing information, for the first application, that specifies a map-
ping between a first visual attribute of the first visual compo-
nent and a second visual attribute of a second visual
component including a second presentation space for present-
ing second data by a second application. The system still
further includes the binding monitor component configured
for detecting a first change to the first visual attribute. The
system additionally includes the change director component
configured for, in response to detecting the first change, auto-
matically sending change information to change the second
visual attribute according to the mapping.
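Binding information of the kind the binding director component identifies is illustrated in FIG. 7 as markup. As a sketch only, such markup could be read with Python's standard xml.etree module; the element and attribute names below follow FIG. 7, while the function name and the trimmed sample document are illustrative:

```python
# Sketch: extracting attribute mappings from binding information of the
# form shown in FIG. 7 (element names from the figure; code hypothetical).
import xml.etree.ElementTree as ET

BINDING_INFO = """
<visual-binding>
  <attribute-binding id="editor.main.size" op-id="max" symmetric="TRUE">
    <bind id="navigator.main.size" op-id="min" />
  </attribute-binding>
  <attribute-binding id="editor.main.transparency" op-id="change">
    <bind id="navigator.main.transparency" op-id="match" />
  </attribute-binding>
</visual-binding>
"""

def parse_bindings(xml_text):
    """Return (source id, source op, target id, target op) tuples, one
    per <bind> element nested in an <attribute-binding> element."""
    root = ET.fromstring(xml_text)
    mappings = []
    for ab in root.findall("attribute-binding"):
        for bind in ab.findall("bind"):
            mappings.append((ab.get("id"), ab.get("op-id"),
                             bind.get("id"), bind.get("op-id")))
    return mappings

mappings = parse_bindings(BINDING_INFO)
```

Each tuple pairs a first visual attribute and operation with the second visual attribute to change, which is the mapping a binding monitor would consult when a change is detected.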
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`One or more aspects of the disclosure are described with
`reference to the drawings, wherein like reference numerals
`are generally utilized to refer to like elements throughout, and
`wherein the various structures are not necessarily drawn to
`scale. In the following description, for purposes of explana—
`tion, numerous specific details are set forth in order to provide
`a thorough understanding of one or more aspects of the dis—
`closure. It may be evident, however, to one skilled in the art,
`that one or more aspects of the disclosure may be practiced
`with a lesser degree of these specific details.
`
`
`
`
`
`
In other instances, well-known structures and devices are shown in
`block diagram form in order to facilitate describing one or
`more aspects of the disclosure.
`An exemplary device included in an execution environ-
`ment that may be configured according to the subject matter
`is illustrated in FIG. 1. An execution environment includes an
`arrangement of hardware and, optionally, software that may
be further configured to include an arrangement of compo-
nents for performing a method of the subject matter described
herein. An execution environment includes and/or is other-
`wise provided by one or more devices. An execution environ-
`ment may include a virtual execution environment including
`software components operating in a host execution environ-
`
`ment. Exemplary devices included in or otherwise providing
`suitable execution environments for configuring according to
`the subject matter include personal computers, notebook
`computers, tablet computers, servers, handheld and other
`mobile devices, multiprocessor devices, distributed devices
`and/or systems, consumer electronic devices, routers, com-
`munication servers, and/or other network-enabled devices.
`Those skilled in the art will understand that the components
`illustrated in FIG. 1 are exemplary and may vary by particular
`execution environment.
`FIG. 1 illustrates hardware device 100 included in execu—
`tion environment 102. FIG. 1 illustrates that execution envi—
ronment 102 includes instruction-processing unit (IPU) 104,
such as one or more microprocessors; physical IPU memory
`106 including storage locations identified by addresses in a
`physical memory address space of IPU 104; persistent sec-
`ondary storage 108, such as one or more hard drives and/or
flash storage media; input device adapter 110, such as a key or
`keypad hardware, a keyboard adapter, and/or a mouse
`adapter; output device adapter 112, such as a display and/or
`an audio adapter for presenting information to a user; a net-
`work interface component, illustrated by network interface
adapter 114, for communicating via a network such as a LAN
`and/or WAN; and a communication mechanism that couples
`elements 104-114, illustrated as bus 116. Elements 104-114
may be operatively coupled by various means. Bus 116 may
`comprise any type of bus architecture, including a memory
`bus, a peripheral bus, a local bus, and/or a switching fabric.
`IPU 104 is an instruction execution machine, apparatus, or
`device. Exemplary IPUs include one or more microproces-
`sors, digital signal processors (DSPs), graphics processing
`units, application-specific integrated circuits (ASICs), and/or
`field programmable gate arrays (FPGAs). In the description
of the subject matter herein, the terms “IPU” and “processor”
`are used interchangeably. IPU 104 may access machine code
`instructions and data via one or more memory address spaces
in addition to the physical memory address space. A memory
`address space includes addresses identifying locations in a
`processor memory. The addresses in a memory address space
`are included in defining a processor memory. IPU 104 may
`have more than one processor memory. Thus, IPU 104 may
`have more than one memory address space. IPU 104 may
`access a location in a processor memory by processing an
`address identifying the location. The processed address may
be in an operand of a machine code instruction and/or may be
`identified in a register or other portion of IPU 104.
`FIG. 1 illustrates virtual IPU memory 118 spanning at least
`part of physical IPU memory 106 and at least part of persis-
`tent secondary storage 108. Virtual memory addresses in a
`memory address space may be mapped to physical memory
`addresses identifying locations in physical IPU memory 106.
`An address space for identifying locations in a virtual proces-
`sor memory is referred to as a virtual memory address space;
`its addresses are referred to as virtual memory addresses; and
`
`its IPU memory is referred to as a virtual IPU memory or
`virtual memory. The terms “IPU memory” and “processor
`memory” are used interchangeably herein. Processor
`memory may refer to physical processor memory, such as
`IPU memory 106, and/or may refer to virtual processor
`memory, such as virtual IPU memory 118, depending on the
`context in which the term is used.
`Physical IPU memory 106 may include various types of
`memory technologies. Exemplary memory technologies
`include static random access memory (SRAM) and/or
`dynamic RAM (DRAM) including variants such as dual data
`rate synchronous DRAM (DDR SDRAM), error correcting
`code synchronous DRAM (ECC SDRAM), and/or RAM-
`BUS DRAM (RDRAM). Physical IPU memory 106 may
`include volatile memory as illustrated in the previous sen—
`tence and/or may include nonvolatile memory such as non—
`volatile flash RAM (NVRAM) and/or ROM.
`Persistent secondary storage 108 may include one or more
`flash memory storage devices, one or more hard disk drives,
`one or more magnetic disk drives, and/or one or more optical
`disk drives. Persistent secondary storage may include remov-
`able media. The drives and their associated computer-read-
`able storage media provide volatile and/or nonvolatile storage
`for computer—readable instructions, data structures, program
`components, and other data for execution environment 102.
Execution environment 102 may include software compo-
nents stored in persistent secondary storage 108, in remote
storage accessible via a network, and/or in a processor
memory. FIG. 1 illustrates execution environment 102
including operating system 120, one or more applications
122, and other program code and/or data components illus-
trated by other libraries and subsystems 124. In an aspect,
`some or all software components may be stored in locations
`accessible to IPU 104 in a shared memory address space
`shared by the software components. The software compo-
`nents accessed via the shared memory address space are
`stored in a shared processor memory defined by the shared
`memory address space. In another aspect, a first software
`component may be stored in one or more locations accessed
by IPU 104 in a first address space and a second software
`component may be stored in one or more locations accessed
`by IPU 104 in a second address space. The first software
`component is stored in a first processor memory defined by
`the first address space and the second software component is
`stored in a second processor memory defined by the second
`address space.
Software components typically include instructions
executed by IPU 104 in a computing context referred to as a
“process”. A process may include one or more “threads”. A
“thread” includes a sequence of instructions executed by IPU
104 in a computing sub-context of a process. The terms
“thread” and “process” may be used interchangeably herein
when a process includes only one thread.
Execution environment 102 may receive user-provided
information via one or more input devices illustrated by input
device 128. Input device 128 provides input information to
other components in execution environment 102 via input
device adapter 110. Execution environment 102 may include
an input device adapter for a keyboard, a touch screen, a
microphone, a joystick, a television receiver, a video camera,
a still camera, a document scanner, a fax, a phone, a modem,
a network interface adapter, and/or a pointing device, to name
a few exemplary input devices.
Input device 128 included in execution environment 102
may be included in device 100 as FIG. 1 illustrates or may be
external (not shown) to device 100. Execution environment
102 may include one or more internal and/or external input
`
`
`
`
`devices. External input devices may be connected to device
`100 via corresponding communication interfaces such as a
`serial port, a parallel port, and/or a universal serial bus (USB)
`port. Input device adapter 110 receives input and provides a
`representation to bus 116 to be received by IPU 104, physical
`IPU memory 106, and/or other components included in
`execution enviromnent 1 02.
Output device 130 in FIG. 1 exemplifies one or more output
devices that may be included in and/or may be external to and
operatively coupled to device 100. For example, output
device 130 is illustrated connected to bus 116 via output
device adapter 112. Output device 130 may be a display
device. Exemplary display devices include liquid crystal dis-
plays (LCDs), light emitting diode (LED) displays, and pro-
jectors. Output device 130 presents output of execution envi-
ronment 102 to one or more users. In some embodiments, an
input device may also include an output device. Examples
include a phone, a joystick, and/or a touch screen. In addition
to various types of display devices, exemplary output devices
include printers, speakers, tactile output devices such as
motion producing devices, and other output devices produc-
ing sensory information detectable by a user.
`A device included in or otherwise providing an execution
`environment may operate in a networked environment com—
`municating with one or more devices via one or more network
`interface components. The terms “communication interface
`component” and “network interface component” are used
`interchangeably herein. FIG. 1 illustrates network interface
adapter (NIA) 114 as a network interface component included
`in execution environment 102 to operatively couple device
`100 to a network. A network interface component includes a
`network interface hardware (NIH) component and optionally
`a software component. The terms “network node” and “node”
`in this document both refer to a device having a network
`interface component for operatively coupling the device to a
`network.
`Exemplary network interface components include network
`interface controller components, network interface cards,
`network interface adapters, and line cards. A node may
`include one or more network interface components to inter—
`operate with a wired network and/or a wireless network.
Exemplary wireless networks include a BLUETOOTH net-
work, a wireless 802.11 network, and/or a wireless telephony
network (e.g., a cellular, PCS, CDMA, and/or GSM network).
`Exemplary network interface components for wired networks
`include Ethernet adapters, Token-ring adapters, FDDI adapt-
`ers, asynchronous transfer mode (ATM) adapters, and
`modems of various types. Exemplary wired and/or wireless
`networks include various types of LANs, WANs, and/or per—
sonal area networks (PANs). Exemplary networks also
`include intranets and internets such as the Internet.
`The terms “device” and “node” as used herein refer to one
`or more devices and nodes, respectively, providing and/or
`otherwise included in an execution environment unless
`clearly indicated otherwise.
The components of a user interface are generically referred
to herein as user interface elements. More specifically, visual
components of a user interface are referred to herein as visual
`interface elements. A visual interface element may be a visual
`component of a graphical user interface (GUI). Exemplary
`visual interface elements include windows, textboxes, slid-
ers, list boxes, drop-down lists, spinners, various types of
menus, toolbars, ribbons, combo boxes, tree views, grid
`views, navigation tabs, scrollbars, labels, tooltips, text in vari—
ous fonts, balloons, dialog boxes, and various types of button
`controls including check boxes and radio buttons. An appli-
`cation interface may include one or more of the elements
`
`listed. Those skilled in the art will understand that this list is
not exhaustive. The terms “visual representation”, “visual
component”, and “visual interface element” are used inter-
`changeably in this document. Other types of user interface
`elements include audio output components referred to as
`audio interface elements, tactile output components referred
`to as tactile interface elements, and the like.
A visual component may be presented in a two-dimen-
`sional presentation where a location may be defined in a
`two-dimensional space having a vertical dimension and a
`horizontal dimension. A location in a horizontal dimension
`may be referenced according to an X-axis and a location in a
`vertical dimension may be referenced according to a Y-axis.
`In another aspect, a visual component may be presented in a
`three—dimensional presentation where a location may be
`defined in a three—dimensional space having a depth dimen—
`sion in addition to a vertical dimension and a horizontal
`dimension. A location in a depth dimension may be identified
according to a Z-axis. A visual component in a two-dimen-
`sional presentation may be presented as if a depth dimension
`existed, allowing the visual component to overlie and/or
`underlie some or all of another visual component.
An order of visual components in a depth dimension is
herein referred to as a “Z-order”. The term “Z-value” as used
herein refers to a location in a Z-order, or an order of visual
components along a Z-axis. A Z-order specifies the front-to-
back ordering of visual components in a presentation space. A
visual component with a higher Z-value than another visual
component may be defined as being on top of or closer to the
front than the other visual component.
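The ordering rule above can be sketched minimally in Python (names and Z-values illustrative):

```python
# Z-order sketch: a higher Z-value places a visual component closer to
# the front of the presentation space.
def front_to_back(components):
    """Sort (name, z_value) pairs from frontmost to backmost."""
    return sorted(components, key=lambda c: c[1], reverse=True)

stack = [("toolbar", 2), ("dialog", 5), ("desktop", 0)]
order = front_to_back(stack)
# The dialog has the highest Z-value, so it overlies the other components.
```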
A “user interface (UI) element handler” component, as the
term is used in this document, includes a component config-
ured to send information representing a program entity for
presenting a user-detectable representation of the program
entity by an output device, such as a display. A “program
entity” is an object included in and/or otherwise processed by
an application or executable program component. The user-
detectable representation is presented based on the sent infor-
mation. The sent information is referred to herein as “presen-
tation information”. Presentation information may include
data in one or more formats. Exemplary formats include
image formats such as JPEG, video formats such as MP4,
markup language data such as HTML and other XML-based
markup, and/or instructions such as those defined by various
script languages, byte code, and/or machine code. For
example, a web page received by a browser from a remote
application provider may include hypertext markup language
(HTML), ECMAScript, and/or byte code for presenting one
or more user interface elements included in a user interface of
the remote application. Components configured to send infor-
mation representing one or more program entities for present-
ing particular types of output by particular types of output
devices include visual interface element handler components,
audio interface element handler components, tactile interface
element handler components, and the like.
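For illustration only, a visual interface element handler producing markup-language presentation information might look as follows; the class and function names are hypothetical and not from the specification:

```python
# Sketch of a visual interface element handler sending presentation
# information (an HTML fragment) that represents a program entity.
import html

class ButtonEntity:
    """A program entity: an object processed by an application."""
    def __init__(self, label):
        self.label = label

def visual_element_handler(entity):
    """Return presentation information for the entity as HTML markup."""
    return "<button>{}</button>".format(html.escape(entity.label))

markup = visual_element_handler(ButtonEntity("Save & Exit"))
```

The returned markup is presentation information: data from which an output device can present a user-detectable representation of the entity.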
A representation of a program entity may be stored and/or
otherwise maintained in a presentation space. As used in this
document, the term “presentation space” refers to a storage
region allocated and/or otherwise provided for storing pre-
sentation information, which may include audio, visual, tac-
tile, and/or other sensory data for presentation by and/or on an
output device. For example, a buffer for storing an image
and/or text string may be a presentation space. A presentation
space may be physically and/or logically contiguous or non-
contiguous. A presentation space may have a virtual as well as
a physical representation. A presentation space may include a
`
storage location in a processor memory, secondary storage, a
memory of an output adapter device, and/or a storage medium
of an output device. A screen of a display, for example, is a
presentation space.
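As the passage notes, a buffer for an image and/or text string may itself serve as a presentation space. A toy text-mode sketch (dimensions and names illustrative):

```python
# A flat bytearray used as a presentation space: a storage region that
# holds presentation information for a 40x10 character display.
WIDTH, HEIGHT = 40, 10
space = bytearray(b" " * (WIDTH * HEIGHT))

def draw_text(buf, x, y, text):
    """Store text at location (x, y) of the presentation space."""
    offset = y * WIDTH + x
    buf[offset:offset + len(text)] = text.encode("ascii")

draw_text(space, 2, 1, "Hello")
row = space[WIDTH:2 * WIDTH].decode("ascii")  # second row of the space
```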
`As used herein, the term “program” or “executable” refers
`to any data representation that may be translated into a set of
`machine code instructions and optionally associated program
`data. Thus, a program component or executable component
`may include an application, a shared or non—shared library,
`and a system command. Program representations other than
`machine code include object code, byte code, and source
`code. Object code includes a set of instructions and/or data
`elements that either are prepared for linking prior to loading
or are loaded into an execution environment. When in an
execution environment, object code may include references
`resolved by a linker and/or may include one or more unre—
solved references. The context in which this term is used will
make clear the state of the object code when it is relevant.
`This definition can include machine code and virtual machine
`code, such as JavaTM byte code.
`As used herein, an “addressable entity” is a portion of a
`program, specifiable in a programming language in source
`code. An addressable entity is addressable in a program com-
`ponent translated from the source code in a compatible execu-
tion environment. Examples of addressable entities include
`variables, constan