US008780130B2

(12) United States Patent
Morris

(10) Patent No.: US 8,780,130 B2
(45) Date of Patent: Jul. 15, 2014
`
(54) METHODS, SYSTEMS, AND COMPUTER PROGRAM PRODUCTS FOR BINDING ATTRIBUTES BETWEEN VISUAL COMPONENTS

(75) Inventor: Robert Paul Morris, Raleigh, NC (US)

(73) Assignee: Sitting Man, LLC, Raleigh, NC (US)

( * ) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 634 days.

(21) Appl. No.: 12/956,008

(22) Filed: Nov. 30, 2010

(65) Prior Publication Data
US 2012/0133662 A1, May 31, 2012
`
(51) Int. Cl.
G09G 5/00 (2006.01)
G09G 5/02 (2006.01)
G06F 3/041 (2006.01)
G06F 15/16 (2006.01)
G06F 17/00 (2006.01)
G06F 3/00 (2006.01)
G06F 3/048 (2013.01)
G06F 9/44 (2006.01)
(52) U.S. Cl.
USPC ......... 345/581; 345/173; 345/589; 345/630; 709/203; 709/218; 715/200; 715/204; 715/275; 715/700; 715/788; 717/107; 717/114
(58) Field of Classification Search
USPC ......... 345/581, 589, 619, 360, 501, 538, 156, 345/168, 172-173, 175, 179, 620, 623-625, 345/629; 709/201-203, 217-218, 219; 715/200, 273, 275, 700, 733-734, 751, 715/760, 763-764, 864, 860-861, 843, 845, 715/851, 855-856, 828, 829, 830, 833, 781, 715/784, 788, 792, 794-804; 717/107-109, 717/113-114; 707/705, 758
See application file for complete search history.
`
(56) References Cited

U.S. PATENT DOCUMENTS

7,080,088 B1* 7/2006
2003/0018609 A1* 1/2003 ......... 715/700
2006/0271853 A1* 11/2006 Marcos et al. ......... 715/764
2008/0294470 A1 12/2008 Marcos et al. ......... 709/203
2009/0254610 A1 10/2009 Arthursson
2010/0037154 A1* 2/2010 Marcos et al. ......... 715/760

OTHER PUBLICATIONS

Microsoft Developer Network, "Object Binding Sample," Jul. 15, 2010, http://msdn.microsoft.com/en-us/library/8e36eeyx%28v=vs.90%29.aspx, last accessed Mar. 20, 2014.
Microsoft Developer Network, "Object Binding in Visual Studio," 2010, http://msdn.microsoft.com/en-us/library/ms233815(v=vs.100).aspx, last accessed Mar. 20, 2014.
`
* cited by examiner

Primary Examiner: Wesner Sajous
(74) Attorney, Agent, or Firm: Patrick E. Caldwell, Esq.; The Caldwell Firm, LLC
`
(57) ABSTRACT

Methods and systems are described for binding attributes between visual components. A first visual component, including a first presentation space for presenting first data by an operating first application, is detected. Binding information, for the first application, is identified that specifies a mapping between a first visual attribute of the first visual component and a second visual attribute of a second visual component including a second presentation space for presenting second data by a second application. A change to the first visual attribute is detected. In response to the detection of the change, change information is automatically sent to change the second visual attribute according to the mapping.

45 Claims, 11 Drawing Sheets
`
Detect a first visual component including a first presentation space for presenting first data by an operating first application.

Identify binding information, for the first application, that specifies a mapping between a first visual attribute of the first visual component and a second visual attribute of a second visual component including a second presentation space for presenting second data by a second application.

Detect a first change to the first visual attribute.

Send change information automatically, in response to detecting the first change, to change the second visual attribute according to the mapping.
`MICROSOFT CORP. EX. 1006
`Page 1 of 28
FIG. 1 (Sheet 1 of 11): block diagram illustrating exemplary hardware device 100 included in and/or otherwise providing execution environment 102.
FIG. 2 (Sheet 2 of 11): flow diagram of the method for binding attributes between visual components:
Detect a first visual component including a first presentation space for presenting first data by an operating first application.
Identify binding information, for the first application, that specifies a mapping between a first visual attribute of the first visual component and a second visual attribute of a second visual component including a second presentation space for presenting second data by a second application.
Detect a first change to the first visual attribute.
Send change information automatically, in response to detecting the first change, to change the second visual attribute according to the mapping.

FIG. 3 (Sheet 3 of 11): block diagram of an arrangement of components: UI Monitor 402, Binding Director 404, Binding Monitor 406, and Change Director 408.

FIG. 4a (Sheet 4 of 11): block diagram of Execution Environment 401a, including Application 403a with Presentation Controller 435a and UI Element Handler 433a; UI Monitor 402a; Binding Director 404a; Binding Monitor 406a; Change Director; GUI Subsystem 437a; Graphics Subsystem 439a; Input Driver 441a; Network Stack 411a; and Application Protocol Component 413a.

FIG. 4b (Sheet 5 of 11): block diagram of Execution Environment 401b, including Input Driver 441b; GUI Subsystem 437b; Graphics Subsystem 439b; Presentation Controller 435b; Change Director; UI Element Handler; Binding Director 404b; Binding Monitor 406b; Network Application Agent 405b; UI Monitor 402b; Content Handler 431b; Content Manager 415b; Application Protocol Component 413b; and Network Stack 411b.

FIG. 4c (Sheet 6 of 11): block diagram of Execution Environment 401c, including Graphics Subsystem 439c; GUI Subsystem 437c; Binding Monitor 406c; Binding Director 404c; Change Director 408c; UI Monitor 402c; Input Driver 441c; Network Stack 411c; First Application 403-1c with UI Element Handler 433-1c and a Presentation Controller; Second Application 403-2c with Presentation Controller 435-2c and a UI Element Handler; and Application Protocol Component 413c.

FIG. 4d (Sheet 7 of 11): block diagram of Execution Environment 401d, including Network Stack 411d; Application Protocol Component 413d; Network Application Platform 409d; Controller 417d; View Subsystem 424d; UI Monitor 402d; Change Director 408d; Model Database 421d; Binding Monitor 406d; Binding Director 404d; Template Engine 423d; Model Subsystem 419d; and Network Application 403d with Template Database 425d and Template(s).

FIG. 5 (Sheet 8 of 11): network diagram of an exemplary system including a User Node and an Application Provider Node operatively coupled via a network.

FIG. 6a (Sheet 9 of 11): user interface presented via a display, showing Display Presentation Space 602a with a First App window (File, Edit, View, Bookmarks, Tools, Help menus and a location bar) including Presentation Space 608-1a, and a Second App window including Presentation Space 608-2a presenting operations OpA, OpB, ... OpN in elements 614-1a ... 614-1n and 614-2a ... 614-2n.

FIG. 6b (Sheet 10 of 11): user interface presented via a display, showing Display Presentation Space 602b with a First App window (File, Edit, View, Bookmarks, Tools, Help menus) including Presentation Space 608-1b, a 2nd App window 604-2b, and a 3rd App window 604-3b.

FIG. 7 (Sheet 11 of 11): an illustration of binding information 702 (reference numerals omitted):

<visual-binding>
  <application path="c:\apps\editors\htmlEdit.exe" id="editor" />
  <application path="c:\utilities\search\filesearch.exe" id="navigator" />
  <attribute-binding id="editor.main.state" op-id="init">
    <bind id="navigator.main.state" op-id="init" param="c:\\docs\\web" />
  </attribute-binding>
  <attribute-binding id="editor.main.size" op-id="max" symmetric="TRUE">
    <bind id="navigator.main.size" op-id="min" />
  </attribute-binding>
  <attribute-binding id="editor.main.size" op-id="change" symmetric="TRUE">
    <bind id="navigator.main.size" op-id="opposite" />
  </attribute-binding>
  <attribute-binding id="editor.main.size" op-id="resize" symmetric="FALSE">
    <bind id="navigator.main.location" op-id="empty-space" />
  </attribute-binding>
  <attribute-binding id="editor.main.location" op-id="change" symmetric="TRUE">
    <bind id="navigator.main.location" op-id="empty-space" />
  </attribute-binding>
  <attribute-binding id="editor.main.state" op-id="on-focus" symmetric="TRUE">
    <bind id="navigator.main.location" op-id="empty-space" />
  </attribute-binding>
  <attribute-binding id="editor.main.transparency" op-id="change">
    <bind id="navigator.main.transparency" op-id="match" />
  </attribute-binding>
</visual-binding>

Fig. 7
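Binding information of the form illustrated in FIG. 7 can be read with an ordinary XML parser. The following is an illustrative sketch only, not part of the disclosure; it assumes the XML-like format shown in the figure and uses Python's standard xml.etree.ElementTree module:

```python
# Sketch: parse attribute bindings like those in FIG. 7 into simple tuples.
import xml.etree.ElementTree as ET

BINDING_XML = """
<visual-binding>
  <application path="c:\\apps\\editors\\htmlEdit.exe" id="editor" />
  <application path="c:\\utilities\\search\\filesearch.exe" id="navigator" />
  <attribute-binding id="editor.main.size" op-id="max" symmetric="TRUE">
    <bind id="navigator.main.size" op-id="min" />
  </attribute-binding>
</visual-binding>
"""

def parse_bindings(xml_text):
    """Return (source attr, source op, target attr, target op, symmetric) tuples."""
    root = ET.fromstring(xml_text)
    bindings = []
    for ab in root.findall("attribute-binding"):
        symmetric = ab.get("symmetric", "FALSE") == "TRUE"
        for bind in ab.findall("bind"):
            bindings.append((ab.get("id"), ab.get("op-id"),
                             bind.get("id"), bind.get("op-id"), symmetric))
    return bindings

print(parse_bindings(BINDING_XML))
# [('editor.main.size', 'max', 'navigator.main.size', 'min', True)]
```

A change director could walk such tuples to decide which second visual attribute to update when a bound first attribute changes.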

METHODS, SYSTEMS, AND COMPUTER PROGRAM PRODUCTS FOR BINDING ATTRIBUTES BETWEEN VISUAL COMPONENTS

BACKGROUND

While some applications can be used alone, some applications are used together. Often there is no integration and/or cooperation between or among applications used at the same time by a user. Even in application suites, cooperation is limited to features that ease data sharing between or among applications in a particular application suite. For example, documents often include both text and media such as images from pictures, graphs, and drawings. Word processors provide rich feature sets for creating and editing text, but provide relatively weak or no features for creating and editing other forms of data. As a result, users work on text for a document in a word processor, images in an image editor, and drawings using a drawing tool such as a computer aided design (CAD) tool. Users spend significant time managing the user interfaces of these various applications in order to access the data desired in the application desired.

Accordingly, there exists a need for methods, systems, and computer program products for binding attributes between visual components.

SUMMARY

The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not an extensive overview of the disclosure, and it does not identify key/critical elements of the invention or delineate the scope of the invention. Its sole purpose is to present some concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.

Methods and systems are described for binding attributes between visual components. In one aspect, the method includes detecting a first visual component including a first presentation space for presenting first data by an operating first application. The method further includes identifying binding information, for the first application, that specifies a mapping between a first visual attribute of the first visual component and a second visual attribute of a second visual component including a second presentation space for presenting second data by a second application. The method still further includes detecting a first change to the first visual attribute. The method additionally includes, in response to detecting the first change, automatically sending change information to change the second visual attribute according to the mapping.

Further, a system for binding attributes between visual components is described. The system includes an execution environment including an instruction processing unit configured to process an instruction included in at least one of a user interface monitor component, a binding director component, a binding monitor component, and a change director component. The system includes the user interface monitor component configured for detecting a first visual component including a first presentation space for presenting first data by an operating first application. The system further includes the binding director component configured for identifying binding information, for the first application, that specifies a mapping between a first visual attribute of the first visual component and a second visual attribute of a second visual component including a second presentation space for presenting second data by a second application. The system still further includes the binding monitor component configured for detecting a first change to the first visual attribute. The system additionally includes the change director component configured for, in response to detecting the first change, automatically sending change information to change the second visual attribute according to the mapping.

BRIEF DESCRIPTION OF THE DRAWINGS

Objects and advantages of the present invention will become apparent to those skilled in the art upon reading this description in conjunction with the accompanying drawings, in which like reference numerals have been used to designate like or analogous elements, and in which:

FIG. 1 is a block diagram illustrating an exemplary hardware device included in and/or otherwise providing an execution environment in which the subject matter may be implemented;

FIG. 2 is a flow diagram illustrating a method for binding attributes between visual components according to an aspect of the subject matter described herein;

FIG. 3 is a block diagram illustrating an arrangement of components for binding attributes between visual components according to another aspect of the subject matter described herein;

FIG. 4a is a block diagram illustrating an arrangement of components for binding attributes between visual components according to another aspect of the subject matter described herein;

FIG. 4b is a block diagram illustrating an arrangement of components for binding attributes between visual components according to another aspect of the subject matter described herein;

FIG. 4c is a block diagram illustrating an arrangement of components for binding attributes between visual components according to another aspect of the subject matter described herein;

FIG. 4d is a block diagram illustrating an arrangement of components for binding attributes between visual components according to another aspect of the subject matter described herein;

FIG. 5 is a network diagram illustrating an exemplary system for binding attributes between visual components according to another aspect of the subject matter described herein;

FIG. 6a is a diagram illustrating a user interface presented via a display according to another aspect of the subject matter described herein;

FIG. 6b is a diagram illustrating a user interface presented via a display according to another aspect of the subject matter described herein; and

FIG. 7 is an illustration of binding information according to another aspect of the subject matter described herein.
DETAILED DESCRIPTION

One or more aspects of the disclosure are described with reference to the drawings, wherein like reference numerals are generally utilized to refer to like elements throughout, and wherein the various structures are not necessarily drawn to scale. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of one or more aspects of the disclosure. It may be evident, however, to one skilled in the art, that one or more aspects of the disclosure may be practiced with a lesser degree of these specific details.
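The detect, identify, detect-change, propagate flow and the four components named in the summary can be sketched as a minimal in-process illustration. This code is not part of the patent disclosure; every class, attribute, and operation name below is invented for illustration:

```python
# Sketch of the four summarized components: a binding director holding the
# mappings, a binding monitor detecting attribute changes, and a change
# director sending change information to the bound component.

class VisualComponent:
    def __init__(self, app_id, attrs):
        self.app_id = app_id
        self.attrs = dict(attrs)

class BindingDirector:
    """Identifies binding information (mappings) for an application."""
    def __init__(self):
        # (source app id, attribute name) -> (target, target attribute, op)
        self.mappings = {}
    def add_binding(self, source, attr, target, target_attr, op):
        self.mappings[(source.app_id, attr)] = (target, target_attr, op)

class ChangeDirector:
    """Sends change information to change the second visual attribute."""
    def apply(self, target, target_attr, op, value):
        target.attrs[target_attr] = op(value)

class BindingMonitor:
    """Detects a change to the first visual attribute and triggers propagation."""
    def __init__(self, director, changer):
        self.director, self.changer = director, changer
    def on_change(self, source, attr, value):
        source.attrs[attr] = value
        mapping = self.director.mappings.get((source.app_id, attr))
        if mapping:
            target, target_attr, op = mapping
            self.changer.apply(target, target_attr, op, value)

editor = VisualComponent("editor", {"size": "normal"})
navigator = VisualComponent("navigator", {"size": "normal"})
director = BindingDirector()
# Binding: when the editor's size becomes "max", the navigator's becomes "min".
director.add_binding(editor, "size", navigator, "size",
                     lambda v: "min" if v == "max" else v)
monitor = BindingMonitor(director, ChangeDirector())
monitor.on_change(editor, "size", "max")
print(navigator.attrs["size"])  # min
```

The sketch mirrors the FIG. 7 example in which maximizing an editor window minimizes a bound navigator window.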
`
In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing one or more aspects of the disclosure.

An exemplary device included in an execution environment that may be configured according to the subject matter is illustrated in FIG. 1. An execution environment includes an arrangement of hardware and, optionally, software that may be further configured to include an arrangement of components for performing a method of the subject matter described herein. An execution environment includes and/or is otherwise provided by one or more devices. An execution environment may include a virtual execution environment including software components operating in a host execution environment. Exemplary devices included in or otherwise providing suitable execution environments for configuring according to the subject matter include personal computers, notebook computers, tablet computers, servers, handheld and other mobile devices, multiprocessor devices, distributed devices and/or systems, consumer electronic devices, routers, communication servers, and/or other network-enabled devices. Those skilled in the art will understand that the components illustrated in FIG. 1 are exemplary and may vary by particular execution environment.

FIG. 1 illustrates hardware device 100 included in execution environment 102. FIG. 1 illustrates that execution environment 102 includes instruction-processing unit (IPU) 104, such as one or more microprocessors; physical IPU memory 106 including storage locations identified by addresses in a physical memory address space of IPU 104; persistent secondary storage 108, such as one or more hard drives and/or flash storage media; input device adapter 110, such as a key or keypad hardware, a keyboard adapter, and/or a mouse adapter; output device adapter 112, such as a display and/or an audio adapter for presenting information to a user; a network interface component, illustrated by network interface adapter 114, for communicating via a network such as a LAN and/or WAN; and a communication mechanism that couples elements 104-114, illustrated as bus 116. Elements 104-114 may be operatively coupled by various means. Bus 116 may comprise any type of bus architecture, including a memory bus, a peripheral bus, a local bus, and/or a switching fabric.

IPU 104 is an instruction execution machine, apparatus, or device. Exemplary IPUs include one or more microprocessors, digital signal processors (DSPs), graphics processing units, application-specific integrated circuits (ASICs), and/or field programmable gate arrays (FPGAs). In the description of the subject matter herein, the terms "IPU" and "processor" are used interchangeably. IPU 104 may access machine code instructions and data via one or more memory address spaces in addition to the physical memory address space. A memory address space includes addresses identifying locations in a processor memory. The addresses in a memory address space are included in defining a processor memory. IPU 104 may have more than one processor memory. Thus, IPU 104 may have more than one memory address space. IPU 104 may access a location in a processor memory by processing an address identifying the location. The processed address may be in an operand of a machine code instruction and/or may be identified in a register or other portion of IPU 104.

FIG. 1 illustrates virtual IPU memory 118 spanning at least part of physical IPU memory 106 and at least part of persistent secondary storage 108. Virtual memory addresses in a memory address space may be mapped to physical memory addresses identifying locations in physical IPU memory 106. An address space for identifying locations in a virtual processor memory is referred to as a virtual memory address space; its addresses are referred to as virtual memory addresses; and
its IPU memory is referred to as a virtual IPU memory or virtual memory. The terms "IPU memory" and "processor memory" are used interchangeably herein. Processor memory may refer to physical processor memory, such as physical IPU memory 106, and/or may refer to virtual processor memory, such as virtual IPU memory 118, depending on the context in which the term is used.
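The mapping of virtual memory addresses to physical IPU memory described above can be illustrated with a toy page-table sketch. The page size, table contents, and behavior are hypothetical and not part of the disclosure:

```python
# Sketch: translating a virtual address to a physical address via a page table.
PAGE_SIZE = 4096

# Hypothetical page table: virtual page number -> physical frame number.
# A page with no frame resides only in persistent secondary storage.
page_table = {0: 7, 1: 3, 2: None}

def translate(vaddr):
    """Translate a virtual address, or report a page fault for unmapped pages."""
    vpn, offset = divmod(vaddr, PAGE_SIZE)
    frame = page_table.get(vpn)
    if frame is None:
        return "page fault"  # would be serviced from secondary storage
    return frame * PAGE_SIZE + offset

print(translate(4100))  # page 1, offset 4 -> 3 * 4096 + 4 = 12292
print(translate(8200))  # page 2 -> page fault
```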
Physical IPU memory 106 may include various types of memory technologies. Exemplary memory technologies include static random access memory (SRAM) and/or dynamic RAM (DRAM) including variants such as dual data rate synchronous DRAM (DDR SDRAM), error correcting code synchronous DRAM (ECC SDRAM), and/or RAMBUS DRAM (RDRAM). Physical IPU memory 106 may include volatile memory as illustrated in the previous sentence and/or may include nonvolatile memory such as nonvolatile flash RAM (NVRAM) and/or ROM.
Persistent secondary storage 108 may include one or more flash memory storage devices, one or more hard disk drives, one or more magnetic disk drives, and/or one or more optical disk drives. Persistent secondary storage may include removable media. The drives and their associated computer-readable storage media provide volatile and/or nonvolatile storage for computer-readable instructions, data structures, program components, and other data for execution environment 102.
Execution environment 102 may include software components stored in persistent secondary storage 108, in remote storage accessible via a network, and/or in a processor memory. FIG. 1 illustrates execution environment 102 including operating system 120, one or more applications 122, and other program code and/or data components illustrated by other libraries and subsystems 124. In an aspect, some or all software components may be stored in locations accessible to IPU 104 in a shared memory address space shared by the software components. The software components accessed via the shared memory address space are stored in a shared processor memory defined by the shared memory address space. In another aspect, a first software component may be stored in one or more locations accessed by IPU 104 in a first address space and a second software component may be stored in one or more locations accessed by IPU 104 in a second address space. The first software component is stored in a first processor memory defined by the first address space and the second software component is stored in a second processor memory defined by the second address space.
Software components typically include instructions executed by IPU 104 in a computing context referred to as a "process". A process may include one or more "threads". A "thread" includes a sequence of instructions executed by IPU 104 in a computing sub-context of a process. The terms "thread" and "process" may be used interchangeably herein when a process includes only one thread.
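The process/thread distinction above can be illustrated with a short sketch, not part of the disclosure, in which two threads execute within the shared memory of a single process:

```python
# Sketch: two threads of one process incrementing a shared counter.
import threading

counter = 0
lock = threading.Lock()

def worker(n):
    global counter
    for _ in range(n):
        with lock:  # threads share the process's memory, so updates need a lock
            counter += 1

threads = [threading.Thread(target=worker, args=(1000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 2000
```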
Execution environment 102 may receive user-provided information via one or more input devices illustrated by input device 128. Input device 128 provides input information to other components in execution environment 102 via input device adapter 110. Execution environment 102 may include an input device adapter for a keyboard, a touch screen, a microphone, a joystick, a television receiver, a video camera, a still camera, a document scanner, a fax, a phone, a modem, a network interface adapter, and/or a pointing device, to name a few exemplary input devices.

Input device 128 included in execution environment 102 may be included in device 100 as FIG. 1 illustrates or may be external (not shown) to device 100. Execution environment 102 may include one or more internal and/or external input
devices. External input devices may be connected to device 100 via corresponding communication interfaces such as a serial port, a parallel port, and/or a universal serial bus (USB) port. Input device adapter 110 receives input and provides a representation to bus 116 to be received by IPU 104, physical IPU memory 106, and/or other components included in execution environment 102.

Output device 130 in FIG. 1 exemplifies one or more output devices that may be included in and/or may be external to and operatively coupled to device 100. For example, output device 130 is illustrated connected to bus 116 via output device adapter 112. Output device 130 may be a display device. Exemplary display devices include liquid crystal displays (LCDs), light emitting diode (LED) displays, and projectors. Output device 130 presents output of execution environment 102 to one or more users. In some embodiments, an input device may also include an output device. Examples include a phone, a joystick, and/or a touch screen. In addition to various types of display devices, exemplary output devices include printers, speakers, tactile output devices such as motion producing devices, and other output devices producing sensory information detectable by a user.

A device included in or otherwise providing an execution environment may operate in a networked environment communicating with one or more devices via one or more network interface components. The terms "communication interface component" and "network interface component" are used interchangeably herein. FIG. 1 illustrates network interface adapter (NIA) 114 as a network interface component included in execution environment 102 to operatively couple device 100 to a network. A network interface component includes a network interface hardware (NIH) component and optionally a software component. The terms "network node" and "node" in this document both refer to a device having a network interface component for operatively coupling the device to a network.

Exemplary network interface components include network interface controller components, network interface cards, network interface adapters, and line cards. A node may include one or more network interface components to interoperate with a wired network and/or a wireless network. Exemplary wireless networks include a BLUETOOTH network, a wireless 802.11 network, and/or a wireless telephony network (e.g., a cellular, PCS, CDMA, and/or GSM network). Exemplary network interface components for wired networks include Ethernet adapters, Token-ring adapters, FDDI adapters, asynchronous transfer mode (ATM) adapters, and modems of various types. Exemplary wired and/or wireless networks include various types of LANs, WANs, and/or personal area networks (PANs). Exemplary networks also include intranets and internets such as the Internet.

The terms "device" and "node" as used herein refer to one or more devices and nodes, respectively, providing and/or otherwise included in an execution environment unless clearly indicated otherwise.

The components of a user interface are generically referred to herein as user interface elements. More specifically, visual components of a user interface are referred to herein as visual interface elements. A visual interface element may be a visual component of a graphical user interface (GUI). Exemplary visual interface elements include windows, textboxes, sliders, list boxes, drop-down lists, spinners, various types of menus, toolbars, ribbons, combo boxes, tree views, grid views, navigation tabs, scrollbars, labels, tooltips, text in various fonts, balloons, dialog boxes, and various types of button controls including check boxes and radio buttons. An application interface may include one or more of the elements
listed. Those skilled in the art will understand that this list is not exhaustive. The terms "visual representation", "visual component", and "visual interface element" are used interchangeably in this document. Other types of user interface elements include audio output components referred to as audio interface elements, tactile output components referred to as tactile interface elements, and the like.

A visual component may be presented in a two-dimensional presentation where a location may be defined in a two-dimensional space having a vertical dimension and a horizontal dimension. A location in a horizontal dimension may be referenced according to an X-axis and a location in a vertical dimension may be referenced according to a Y-axis. In another aspect, a visual component may be presented in a three-dimensional presentation where a location may be defined in a three-dimensional space having a depth dimension in addition to a vertical dimension and a horizontal dimension. A location in a depth dimension may be identified according to a Z-axis. A visual component in a two-dimensional presentation may be presented as if a depth dimension existed, allowing the visual component to overlie and/or underlie some or all of another visual component.

An order of visual components in a depth dimension is herein referred to as a "Z-order". The term "Z-value" as used herein refers to a location in a Z-order, or an order of visual components along a Z-axis. A Z-order specifies the front-to-back ordering of visual components in a presentation space. A visual component with a higher Z-value than another visual component may be defined as being on top of or closer to the front than the other visual component.
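As a toy illustration of Z-order (the component names and Z-values are invented, not from the disclosure), visual components can be sorted front-to-back by Z-value:

```python
# Sketch: a Z-order is the front-to-back ordering of components by Z-value.
components = [
    {"name": "editor window", "z": 2},
    {"name": "tooltip", "z": 5},
    {"name": "desktop", "z": 0},
]

def front_to_back(comps):
    """Higher Z-value means closer to the front of the presentation space."""
    return sorted(comps, key=lambda c: c["z"], reverse=True)

print([c["name"] for c in front_to_back(components)])
# ['tooltip', 'editor window', 'desktop']
```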
A "user interface (UI) element handler" component, as the term is used in this document, includes a component configured to send information representing a program entity for presenting a user-detectable representation of the program entity by an output device, such as a display. A "program entity" is an object included in and/or otherwise processed by an application or executable program component. The user-detectable representation is presented based on the sent information. The sent information is referred to herein as "presentation information". Presentation information may include data in one or more formats. Exemplary formats include image formats such as JPEG, video formats such as MP4, markup language data such as HTML and other XML-based markup, and/or instructions such as those defined by various script languages, byte code, and/or machine code. For example, a web page received by a browser from a remote application provider may include hypertext markup language (HTML), ECMAScript, and/or byte code for presenting one or more user interface elements included in a user interface of the remote application. Components configured to send information representing one or more program entities for presenting particular types of output by particular types of output devices include visual interface element handler components, audio interface element handler components, tactile interface element handler components, and the like.
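A visual interface element handler of the kind described above can be sketched as follows. The class and method names are invented for illustration, and the presentation information produced here happens to be HTML markup, one of the exemplary formats named in the text:

```python
# Sketch: a UI element handler sends presentation information (HTML markup)
# representing a program entity (here, a button with a label).
import html

class ButtonEntity:
    """A program entity: an object processed by an application."""
    def __init__(self, label):
        self.label = label

class VisualElementHandler:
    """Produces presentation information for an output device from an entity."""
    def presentation_info(self, entity):
        return "<button>%s</button>" % html.escape(entity.label)

handler = VisualElementHandler()
print(handler.presentation_info(ButtonEntity("Save & Close")))
# <button>Save &amp; Close</button>
```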
A representation of a program entity may be stored and/or otherwise maintained in a presentation space. As used in this document, the term "presentation space" refers to a storage region allocated and/or otherwise provided for storing presentation information, which may include audio, visual, tactile, and/or other sensory data for presentation by and/or on an output device. For example, a buffer for storing an image and/or text string may be a presentation space. A presentation space may be physically and/or logically contiguous or non-contiguous. A presentation space may have a virtual as well as a physical representation. A presentation space may include a
memory of an output adapter device, and/or a storage medium of an output device. A screen of a display, for example, is a presentation space.
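The buffer example above can be sketched as a simple pixel store. This is an illustrative sketch under assumed structure (a row-major width × height buffer); the class and method names are not from the patent.

```python
# Hypothetical sketch of a presentation space as a storage region for
# visual presentation information: a width x height pixel buffer.

class PresentationSpace:
    def __init__(self, width, height, fill=0):
        self.width = width
        self.height = height
        # Logically contiguous, row-major buffer of pixel values.
        self.buffer = [fill] * (width * height)

    def set_pixel(self, x, y, value):
        self.buffer[y * self.width + x] = value

    def get_pixel(self, x, y):
        return self.buffer[y * self.width + x]

space = PresentationSpace(4, 3)
space.set_pixel(2, 1, 255)
print(space.get_pixel(2, 1))  # 255
print(len(space.buffer))      # 12
```

A text-string buffer, a screen framebuffer, or a region of output-adapter memory could each play the same role.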
As used herein, the term “program” or “executable” refers to any data representation that may be translated into a set of machine code instructions and optionally associated program data. Thus, a program component or executable component may include an application, a shared or non-shared library, and a system command. Program representations other than machine code include object code, byte code, and source code. Object code includes a set of instructions and/or data elements that either are prepared for linking prior to loading or are loaded into an execution environment. When in an execution environment, object code may include references resolved by a linker and/or may include one or more unresolved references. The context in which this term is used will make clear the state of the object code when it is relevant. This definition can include machine code and virtual machine code, such as Java™ byte code.
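The distinction between source code and byte code can be illustrated with Python's standard-library tooling; this is a sketch of the general idea, not of any mechanism described in the patent.

```python
# Illustrative sketch (not from the patent): source code translated into
# byte code, one program representation other than machine code.
import dis

source = "x = 1 + 2"
code_obj = compile(source, "<example>", "exec")  # source code -> byte code

print(type(code_obj).__name__)  # code

# The byte-code instructions in the code object can be inspected:
instructions = [ins.opname for ins in dis.get_instructions(code_obj)]
print(len(instructions) > 0)  # True
```

A virtual machine then translates such byte code to machine code instructions (or interprets it), matching the definition of a program above.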
As used herein, an “addressable entity” is a portion of a program, specifiable in a programming language in source code. An addressable entity is addressable in a program component translated from the source code in a compatible execution environment. Examples of addressable entities
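The list of examples is cut off in this copy of the document. Commonly, named portions of source code such as variables, functions, and classes are addressable by identifier; the following sketch (with hypothetical names) illustrates the general idea only.

```python
# Illustrative sketch (not from the patent): addressable entities are
# named portions of a program specified in source code, addressable by
# identifier in a compatible execution environment.

counter = 0            # a variable is an addressable entity

def increment(n):      # a function is an addressable entity
    return n + 1

class Widget:          # a class (type) is an addressable entity
    pass

# At run time, the identifiers resolve to the entities they address.
print(increment(counter))   # 1
print(Widget.__name__)      # Widget
```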
