(12) United States Patent
Mackey et al.

(10) Patent No.: US 6,567,767 B1
(45) Date of Patent: May 20, 2003

US006567767B1

(54) TERMINAL SERVER SIMULATED CLIENT PERFORMANCE MEASUREMENT TOOL

(75) Inventors: Kyle Joseph James Mackey, Aliso Viejo, CA (US); Lauren Ann Cotugno, Dove Canyon, CA (US); Joseph Kuoweng Chan, Mission Viejo, CA (US); Matthew James Hengst, Lake Forest, CA (US); Mark Douglas Whitener, Aliso Viejo, CA (US)

(73) Assignee: Unisys Corporation, Blue Bell, PA (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 171 days.

(21) Appl. No.: 09/664,100
(22) Filed: Sep. 19, 2000

(51) Int. Cl.7 ..................... G04F 15/173; G04F 10/00
(52) U.S. Cl. .......................... 702/186; 702/182
(58) Field of Search ................ 702/186, 182, 187; 709/203, 219, 224

(56) References Cited

U.S. PATENT DOCUMENTS

6,041,352 A  *  3/2000  Burdick et al. .......... 709/224
6,070,190 A  *  5/2000  Reps et al. ............. 709/224
6,108,700 A  *  8/2000  Maccabee et al. ......... 709/224
6,178,449 B1 *  1/2001  Forman et al. ........... 709/224
6,202,036 B1 *  3/2001  Klein et al. ............ 702/178
6,215,774 B1 *  4/2001  Knauerhase et al. ....... 370/252
6,308,211 B1 * 10/2001  Rosborough et al. ....... 709/224

* cited by examiner

Primary Examiner—Marc S. Hoff
Assistant Examiner—Paul Kim
(74) Attorney, Agent, or Firm—Alfred W. Kozak; Mark T. Starr; Lise A. Rode

(57) ABSTRACT

A performance measurement tool is developed to measure the performance of a terminal server servicing multiple Clients who operate on remote systems in a farm of multiple PCs. Test scripts that simulate actual operating conditions are run on each PC Client-User over a sequence of time which varies the number of concurrently active PC Client-Users from a small number of Users to a very large number of concurrent Users. During this test period, a record is kept of designated simulated-user-initiated actions, such as log-on times, time to open various application programs, and character entries, thus to determine acceptable operating configurations.

3 Claims, 5 Drawing Sheets
[Front-page drawing: reduced view of FIG. 1, the network of Terminal Server Edition Client groups (NTW01-NTW120, 15 clients each) connected through 100 Mbit hubs and a switch to the back-end, monitor and control, and test servers.]

Google Exhibit 1060
Google v. VirtaMove
U.S. Patent          May 20, 2003          Sheet 1 of 5          US 6,567,767 B1

[FIG. 1: Network of Terminal Server Edition Clients 10 in eight groups of 15 PCs (10a-10h, NTW01-NTW120), each group connected through a 100 Mbit hub (9a-9h) to a 100 Mbit switch 12, which links to the back-end servers (FILE & PRINT, IIS, EXCHANGE, PDC; 2 PROC each), the MONITOR (16m) and CONTROL (16c) servers, and the ES2045 test server 18.]

U.S. Patent          May 20, 2003          Sheet 2 of 5          US 6,567,767 B1

[FIG. 2: Simulated client processes. Two typical client PCs (X, Y) each run SM Client and TS Client software; the Control Server 16 feeds the test script 16ts to each SM Client, while keyboard strokes and graphical updates pass between the TS Clients and their client spaces in the Test Server 18.]

U.S. Patent          May 20, 2003          Sheet 3 of 5          US 6,567,767 B1

[FIG. 3: Flow chart of simulated client execution (A1-A8): start the SM Client, import the test script, initialize the timer function, read each script command, start or stop the timer, wait for the specified graphical update, execute keystroke commands, and exit on End Test.]

U.S. Patent          May 20, 2003          Sheet 4 of 5          US 6,567,767 B1

[FIG. 4: Timer function program. Start Timer Function (B1): open the log file if not already open (B2, B3) and cache the start data (B4). Stop Timer Function (C1): calculate the response time (C2), get the action label and user ID (C3, C4), and write to the log file (C5).]

U.S. Patent          May 20, 2003          Sheet 5 of 5          US 6,567,767 B1

[FIG. 5: Prior-art Performance Monitor graph plotting the number of users in active sessions against the percentage of total possible processor utilization, sampled at 15-second intervals (e.g., 112 and 136 users).]
US 6,567,767 B1

TERMINAL SERVER SIMULATED CLIENT PERFORMANCE MEASUREMENT TOOL
CROSS-REFERENCE TO RELATED APPLICATIONS

This case is related to a co-pending application, U.S. Ser. No. 09/661,101, entitled "TERMINAL SERVER DATA FILE EXTRACTION AND ANALYSIS APPLICATION", which is incorporated herein by reference.
BACKGROUND OF THE INVENTION
In the situation involving networks where numerous client terminals are connected to a test server, it is desirable to obtain information as to the actual execution times for accessing programs, and also, for example, information about the time required for executing different parts of available programs.

Thus, a performance measurement system for defining and measuring user-relevant response times at remote client stations which are serviced by a terminal server is of great importance in evaluating the status of a network of users and terminal servers.
Performance data produced by the interaction of the client-users and the terminal server is collected and subsequently logged. Once the data is logged, it may then be accessed and collected by an Administrator in order to evaluate the system responses involved in the network.

Performance tools are used to measure the performance of the test server in regard to its availability for servicing the various and multiple clients. A resultant state of the system may be determined in order to evaluate the total resource utilization of the system. Such a determination may eventually discover which resources cause slowdowns or bottlenecks in system performance, and once identified, these resources can be upgraded to improve system performance.

Another useful purpose for evaluating computer performance is what is called "application tuning", which focuses on particular user applications or situations in order to determine how to improve system performance regarding a particular application.

Another use for performance tools is troubleshooting, to help determine why system performance may be degrading without any immediately apparent reason.

In many situations, so-called performance tools have generated too much information, making it difficult for an operator-user to fully comprehend the nature of what is happening. If a system gathers and logs huge amounts of information, it requires large memory resources for data logging, and the result is often very difficult to analyze, in addition to taking considerable processing power to generate this information and then to present it in a form that is useful to a user.

It is always a problem to identify when the performance of a system has been degraded beyond acceptable limitations. Many of the earlier attempts at such analysis provided only indirect information regarding the end-user's performance expectations, in addition to requiring extraordinary administration and management efforts in the system to develop the required information. Many of the earlier systems were influenced by the test environment characteristics and did not provide feedback for the actual client sessions under test. As a result, this necessitated the opening of additional terminal server connections, which were administratively time-consuming and caused significant additional CPU overhead.
FIG. 5 is an illustration of one type of earlier performance measurement which was only partially useful, as it provided very limited information regarding processor utilization, which limited the user's ability to evaluate the conditions of operation.

The presently described system and method will be seen to measure and collect the response times for any variety of designated actions initiated by terminal server scripts. The method will be seen to call a timer utility before and after each designated action, such as logging on, opening applications, and typing of characters.

Then, by noting the response times involved during a sequence of different operating conditions (from a small number of concurrent client-users over to a large number of concurrent client-users), it is then possible to determine the acceptable and non-acceptable operating limits for the entire system.
SUMMARY OF THE INVENTION

In a system where multiple client-users are connected via hubs and switches to a back-end server, there is provided a method whereby a program is provided to measure, collect, and analyze the response times for any variety of designated actions initiated by terminal server scripts, and whereby the calling of a timer utility, before and after designated actions, will provide information on the resulting response times.

Because the timing components are added to existing client scripts, rather than adding new scripts, the timing components are minimally intrusive to the load on the server under test. The timing component provides data for all the sessions involved in any given test, and the timer is integrated into sessions that are already part of the test environment.

Actions can be designated for measurement such that the resulting data can be analyzed and is directly relevant to the end-user's real-world performance expectations. The resulting data is minimally influenced by any test environment characteristics. To achieve this, the administrator modifies existing scripts by adding commands that execute strictly on the Remote Client PC.

The accumulated response times are saved to the Client PC-User terminals.
`
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a drawing of a system network showing multiple simulated clients connected through a switching device to a set of back-end servers, monitor and control servers, and a test server;

FIG. 2 is a sketch showing relationships between two typical Client-Personal Computers (x, y) in the multitude of Terminal Server Clients, as related to the Control Server and Test Server;

FIG. 3 is a flow chart showing the steps involved in executing applications on each simulated client;

FIG. 4 is a flow chart of the timer function program to calculate response times and record results in the timer log file;

FIG. 5 is a graph showing the number of simulated client users (G—G) while also indicating another graph (R—R) which shows the percent utilization of the total possible utilization for each point of the number of active Client-Users.
GLOSSARY OF RELEVANT TERMS

ACTION LABEL: An action label is a descriptive name that an administrator assigns to an action that he/she has decided to measure using the Terminal Server Simulated Client Performance Measurement Tool. In Table II of co-pending U.S. Ser. No. 09/661,101, there are six action labels for discrete actions that were designated within the TS Scripts for measurement. These action labels are: Connect, Excel, Outlook, Cmd Prompt, Explorer, and Word.
BACK END SERVERS: Servers that form the infrastructure for the test environment. Various applications and user functionality are supported by these servers with the intent of modeling a real-world environment. These include the Internet Information Server (IIS), the Primary Domain Controller (PDC), the Exchange Server, and the File and Printer Server, plus a Monitor, Control and Test Server.
CLIENT SCRIPT: A looping list of keyboard input that is fed from the TS (Terminal Server) Client Software to the Test Server in order to mimic real user input. The script sends the appropriate keyboard sequence to log into a Windows session, to open Excel, Outlook, Internet Explorer, and Word, and to perform common real-world actions within each application (i.e., creating graphs, printing, sending email, browsing web pages). StartTimer and StopTimer calls are inserted into these scripts before and after designated activities.
CONTROL: The Control Server station (CONTROL) controls the creation, distribution, and execution of the scripts. It also manages the remote Client PCs as they execute the scripts and timer functions.
EXCHANGE: A server that hosts and allows email services.
GRAPHICAL UPDATE: When a user or simulated user delivers some form of input (pressing the letter "k" on the keyboard) to a Terminal Server Client Session, the "k" input is first sent to the Terminal Server over the network. The Terminal Server decides what should happen when this "k" is input. If this "k" input changes what should be displayed on the screen of the Terminal Server Client Session, then the Terminal Server sends that graphical update over the network to the Client Session. If a text editor such as Microsoft Word was open when the "k" was submitted, then the corresponding graphical update would add the letter "k" to the appropriate location in the Terminal Server Client window.
IIS: Internet Information Server. Hosts the internet sites that are browsed in the course of the Client Script loop.
LOG FILE: Synonymous with Timer Log File.
MONITOR: The monitor server station (MONITOR) captures all the Performance Monitor data from the test server and stores the associated logs. This monitoring is done remotely in order to minimize the performance impact on the server under test.
PDC: Primary Domain Controller. This is the system that authenticates user logons for the entire testing domain, including the TS Clients who attempt to gain access to the test server.
REMOTE CLIENT PC: A variety of desktop PCs can be used as a remote Client PC. This remote system runs the terminal server client (TS Client). These systems host the client component of the TS connection, the SM Client, and the log files.
SM CLIENT: (Simulated Client) The application which takes a client script and feeds it to the TS (Terminal Server) Client. This process occurs strictly on the remote client PC and therefore does not contribute to the load/stress on the test server.
TERMINAL SERVER CLIENT: A Terminal Server Client is an application that runs on a remote client PC. It receives desktop graphics from the test server and sends user-initiated mouse movements and keyboard strokes to the test server.
TERMINAL SERVER EDITION: A special version of the NT4 Microsoft Operating System that incorporates a multi-user kernel and allows numerous simultaneous client sessions. The Windows 2000 equivalent does not require a special edition of the operating system, but instead requires an additional "Terminal Services" component to be installed.
TERMINAL SERVICES: Although the process is virtually transparent to the user, terminal services gives remote users the capability to run the Windows desktop and applications from a central server. A small application, the Terminal Server Client, is installed on the Remote Client PC. The Terminal Server Client sends mouse movements and keystrokes and receives the corresponding graphical updates from a central test server.
TEST SERVER: This server is the focus of the testing environment. It runs the Terminal Services enabling operating system (NT4 Server Terminal Server Edition, or Windows 2000 Server with the Terminal Services component enabled). The test server receives mouse movements and keyboard input over the network, sent from the TS (Terminal Server) Clients which are executing on the remote Client PCs. The test server hosts the client desktops and applications, sending the resulting graphical updates to the TS Client. The test server is commonly referred to as a central server because it "centralizes" the execution of Windows desktops and applications, similar to the way a mainframe works.
TIMER: A unit providing clock time in milliseconds with a resolution equal to the machine cycle speed.
TIMER DYNAMIC LIBRARY: A dynamic linked library piece of the WTS Timer Utility containing multiple programmatic procedures that can be called/initiated from another file. The Client Scripts are modified with calls to this library.
TIMER LIBRARY: Synonymous with Timer Dynamic Library.
TIMER LOG FILE: A file created during the execution of a timer-modified Client script. This file details the results of the timing actions, with the name of the action measured, the time in milliseconds the action took to execute, and the time/date of the entry to the log file.
USERID: Each TS (Terminal Server) Client has a unique identifier.
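As an illustration of the TIMER LOG FILE entries just described, one plausible line layout is sketched below. The comma-separated format and the function name are assumptions for illustration; the patent fixes only the fields (action name, elapsed milliseconds, and the time/date of the entry), not the file layout.

```python
from datetime import datetime

def format_log_entry(action_label, user_id, elapsed_ms, entry_time):
    """One hypothetical timer log line: the measured action's name, the TS
    Client's unique USERID, the elapsed milliseconds, and the entry date/time."""
    return (f"{action_label},{user_id},{elapsed_ms:.0f} ms,"
            f"{entry_time:%Y-%m-%d %H:%M:%S}")

line = format_log_entry("Connect", "NTW01", 1234.0,
                        datetime(2000, 9, 19, 12, 0, 0))
# line == "Connect,NTW01,1234 ms,2000-09-19 12:00:00"
```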
`
DESCRIPTION OF PREFERRED EMBODIMENT

FIG. 1 shows the general overall environment of modules and units which are involved in the computing architecture for a Thin-Client/Server set of installations. The test server 18 contains a terminal server enabled operating system (NT4 Terminal Server Edition or Windows 2000 Server with Terminal Services enabled). Terminal Services functionality is made possible using three components, which are (i) the Terminal Server Operating System, (ii) the Remote Desktop Protocol, and (iii) the Terminal Server Client. With the addition of the SM Client (10xsm, FIG. 2) and test script (16ts, FIG. 2) components, imaginary users can be created to simulate the work load of real users.

As seen in FIG. 1, there is indicated a PC farm designated as Terminal Server Edition Clients 10. Here there are indicated eight groups of PCs designated 10a, 10b . . . thru 10g, 10h. Each item represents a group of 15. Each group of PCs is connected through a 100 megabit HUB designated as 9a, 9b, . . . 9g, 9h. Also, the series of HUBs are connected to a 100 megabit switch 12.
The Terminal Server Client software in FIG. 2 (10xtsc, 10ytsc) runs on a range of devices with varying operating systems and enables the users to gain seamless access to 32-bit applications. The Remote Desktop Protocol (RDP) is used to communicate between the client PCs 10 and the test server 18. This component involves a network protocol that connects the client PCs and the test server over the network.

As will be seen in FIG. 1, the testing environment is equipped with 8 sets of 15 PCs 10a, 10b, . . . 10g, 10h. With, for example, 120 total PCs, each running one set of Terminal Server client connections, the testing environment can simulate 120 user connections to the test server 18, while each PC (in the Client Group 10) is capable of running multiple Terminal Server connections.
It is important to know the performance and load capabilities of the Terminal Services Operating System, which is installed on test servers, shown as item 18 in FIG. 1. This is of considerable value in order to enable designers to plan and size the deployment of Thin-Client/Server solutions.

The test server 18 of FIG. 1 is designed to deliver reliable performance and scalability to as many Terminal Server Clients 10 as possible without sacrificing optimal performance. The concept of "optimal performance" is defined as performance that allows the Thin-Client architecture to remain transparent to the user.

In FIG. 1 the Test Server 18 is set up as a test server for running either the Microsoft Windows NT Server 4.0 Terminal Server Edition or Windows 2000 Server with Terminal Services enabled, and is configured with the Office 2000 Suite of applications. The test network of FIG. 1 also provides a monitor (16m) and control (16c) station 16. The monitor station (16m) captures all the performance monitor data concerning the test server (18) and stores the associated logs. This monitoring is done remotely in order to minimize the performance impact on the server 18 under test.

The control station in 16 controls the creation, distribution, and execution of the scripts. It also manages the remote clients 10 as they execute the scripts. The Monitor-Control servers 16 and Test Server 18 are seen connected to the 100 megabit switch 12.

Now, additionally connected to the 100 megabit switch 12 is a set of Backend Servers 14 which are set up to simulate a real-world environment. These include a Primary Domain Controller (PDC), a Microsoft Exchange Server 5.5 (EXCHANGE), a Microsoft Internet Information Server 4.0 (IIS), and a Microsoft Windows NT Server 4.0 used for file and printer sharing (FILE&PRINT).
BENCHMARK PROCEDURES: Experimental operations indicated that "simulated" Office 2000 user scripts would take approximately thirty minutes to loop through Outlook, Word, Access, Excel, and Internet Explorer 5 at a typing speed of forty-eight words per minute. These scripts are designed to match the typical work patterns of real-world users. Tests were made to stress the server under test 18 by logging on simulated Terminal Server clients that were running these scripts.

The number of concurrent clients was gradually increased while the scripts were cycled through the various applications. Thus multiple test runs were conducted, and additional sets of Performance Monitor (PERFMON) log files were produced to verify reproducibility.
BENCHMARK MEASUREMENTS: Using the Microsoft Performance Monitor, performance counters were collected on all the available objects and counters. The counters for Processor Usage, Active Sessions, and Processor Queue Length are activated, and a recording is made of the percent of total processor usage for each period related to the number of active-session Client-users. The performance data thus reflects the stress on the server 18 under test, which influences the end-user performance. This is indicated in FIG. 5.

To evaluate end-user performance, timer components are inserted into the test scripts before and after a designated action. For example, timed actions can include (i) log-on time "Connect"; (ii) time to open applications; and (iii) character delay while typing.
DEFINING OPTIMAL PERFORMANCE: Optimal performance is the point at which a server is loaded with the maximum number of clients possible without user performance degrading beyond a predetermined limitation. During testing, timer logs are created to measure the delays for completing certain actions from the user's point of view. Ideally, the limitations on delays are determined with the intent of providing a transparent solution to the user, that is to say, so that the user could not distinguish that the applications are running on a centralized server, such as server 18.
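The determination just described, finding the largest client load whose measured delays stay within the predetermined limitation, can be sketched as follows. The function name and the sample data are invented for illustration; real values would come from the timer logs of actual test runs.

```python
def optimal_client_count(samples, limit_ms):
    """samples: (active_clients, measured_response_ms) pairs summarized from
    the timer logs; returns the largest client count still within the limit."""
    within = [clients for clients, ms in samples if ms <= limit_ms]
    return max(within) if within else 0

# Invented timer-log summary: response delay grows with concurrent users.
samples = [(15, 250.0), (60, 400.0), (105, 900.0), (120, 2600.0)]
best = optimal_client_count(samples, limit_ms=1000.0)  # -> 105
```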
Table I below shows one possible configuration for a test server undergoing tests (this is just one possible configuration for Test Server 18).
TABLE I

Server Test Configuration

System    Processor               Cache            Memory       Disk           Network
(Col. 1)  (Col. 2)                (Col. 3)         (Col. 4)     (Col. 5)       (Col. 6)

ES2045    Four Intel Xeon         L2 Cache: 2 MB   2 GB         External       One Network Interface
          processors at 550 MHz   per processor    Memory       disk array     Card with 100 MB access

The first column shows the system designation, while the second column shows the processors involved as four Intel XEON processors. Column 3 indicates the cache as an L2 Cache having two megabytes per processor. Column 4 shows the memory utilized as being 2 gigabytes of memory. Column 5 shows the disk as being an external disk array, while column 6 shows the network as involving one network interface card with one hundred megabytes of access.
In order to determine the optimal performance of a terminal server solution, it is required that Test Server 18 be placed under a significant load using Thin-Client simulation. Microsoft provides a scripting engine and language to be used for such testing (the SM Client and associated Testing Scripts). So, in order to "simulate" user operations, virtual Thin-Clients are launched and user applications are applied within the given session. Then realistic user scenarios are constructed by combining both application and task sequences in the scripts.

The Office 2000 user simulation script was developed for measuring the performance of Office 2000 in a Terminal Server environment. By modifying the scripts to time-record the desired actions, the Terminal Server Client Measurement Tool measures and saves data regarding the delays involved in executing these actions.
FIG. 2 is a sketch illustrating the inter-relationships of the modules involved in the simulated Client process. Assuming, for example, that there are 120 clients as was indicated in FIG. 1, then a typical Client PC-X shows a block designated 10x which indicates several sets of software which reside in Client 10x. One portion of software is designated Terminal Server Client (TS Client) 10xtsc. This piece of software receives information from another set of software designated SM Client, 10xsm.

Likewise, another Personal Computer, designated as another typical Client PC-Y, is another typical PC residing in the overall farm of 120 PCs. This client is designated 10y. Again, there are two distinct groups of software in the Client Y, and these are designated as the Terminal Server Client (TS Client) 10ytsc, which is fed from the package of software designated SM Client 10ysm.

As will be further seen in FIG. 2, there is a Control Server 16 which utilizes a Test Script portion of software designated 16ts. This test script is fed to each of the SM (Simulated) Clients in the Client farm and, in this case, FIG. 2 shows the test script being transmitted to the SM Client 10xsm and also to the SM Client 10ysm. Thus, the test scripts in the Control Server 16 are fed to the SM Client software for each and every one of the PCs. Subsequently then, the SM Client software is fed to the Terminal Server Client software in each one of the various PCs in the farm 10.

Connected from the Client-PC 10x, it will be seen that each keyboard stroke is provided from the TS Client 10xtsc over to the Client X Space 18x, and the Client X Space 18x feeds back the corresponding graphical update information to the TS Client 10xtsc.
Likewise, in the other typical Client PC-Y, designated 10y, the TS Client 10ytsc will feed an enumeration of keyboard strokes over to the Client Y Space designated 18y, and the Client Y will feed back the corresponding graphical updates to the TS Client 10ytsc. The Client X Space 18x and the Client Y Space 18y, which are typical of the spaces provided for each and every one of the active TS Client sessions, are all located in the test server 18.

A flowchart of the simulated client execution operations is shown in FIG. 3. This flow will occur for each and every one of the simulated clients 10a, 10b, 10c, . . . 10g, 10h.

At step A1, there is the initiation or start of the SM Client software, such as 10xsm and 10ysm, etc. (FIG. 2), which will then look for a specific test script for each and every one of the multiple client PCs.

Then, at step A2, there is an importation of the test script from the control server 16 over to each of the various SM Clients, such as 10xsm, 10ysm, etc.

At step A3, there is an initialization of the timer function, which will be later seen in FIG. 4.

Step A4 of FIG. 3 is a multiple decision block from where the process sequence can go to Exit (A4E); to step A5 (B); to step A6 (C); to step A7 (wait for graphical update); or to step A8 (execute keystroke command).
At step A4 of FIG. 3, there will be initiated the reading of the next script command from the Control Server 16, where the Terminal Server (TS) client is turned on, followed by a simulated user log-on where the Simulated Client (SM Client) application provides a user-name and password.

As the commands are fed from the Simulated Client to the Terminal Server Client, the TS Client (10x) (10y) sends a series of keystrokes to the client space (such as 18x, 18y of FIG. 2) in the Test Server 18. After this, at the same time, there will be a "start" of the timer command at step A5, FIG. 3, which indicates the marker B continuing to FIG. 4 for the timer function operation.

Simultaneously, at step A7, a Wait command is initiated in order to receive a specified graphical update, which was seen in the earlier FIG. 2, whereby the Client X Space 18x provides a graphical update to the TS Client 10xtsc, and also the Client Y Space 18y provides the graphical updates to the TS Client 10ytsc.

After the graphical update has been completed, there is initiated at step A6 the "Stop Timer" command, which is later seen in its operative steps through reference marker C onto the timer function program of FIG. 4.

At step A8, there is an execution of the script command which, when executed, will return back to step A4 in order to read the next script command.

The script commands will continue to be read until an "End Test" command is encountered, and the program will exit at step A4E.
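The A1-A8 control flow just walked through can be modeled in miniature as below. This is only an illustrative sketch: the command names and the stubbed graphical-update wait are assumptions, since the real SM Client drives an actual Terminal Server session rather than an in-process loop.

```python
import time

class Timer:
    """Stand-in for the timer function of FIG. 4: start caches a time,
    stop returns the elapsed response time in milliseconds."""
    def __init__(self):
        self._cache = {}
    def start(self, label):
        self._cache[label] = time.monotonic()
    def stop(self, label):
        return (time.monotonic() - self._cache.pop(label)) * 1000.0

def run_script(commands):
    """commands: (name, argument) pairs mimicking steps A4-A8 of FIG. 3."""
    timer, log = Timer(), []
    for cmd, arg in commands:          # A4: read next script command
        if cmd == "end_test":          # A4E: exit
            break
        if cmd == "start_timer":       # A5: marker B, FIG. 4
            timer.start(arg)
        elif cmd == "stop_timer":      # A6: marker C, FIG. 4
            log.append((arg, timer.stop(arg)))
        elif cmd == "wait_update":     # A7: wait for graphical update
            pass                       # stubbed; no real test server here
        elif cmd == "keystroke":       # A8: execute keystroke command
            pass                       # stubbed keystroke to the TS Client
    return log

script = [("start_timer", "Connect"), ("keystroke", "logon"),
          ("wait_update", None), ("stop_timer", "Connect"),
          ("end_test", None)]
results = run_script(script)  # one ("Connect", elapsed_ms) entry
```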
Now referring to FIG. 4, there is seen the timer function program which has been initiated at step B from FIG. 3 and at step C from FIG. 3. The purpose here is to measure the time period involved between the request for an action and the graphical update indicating the complete execution of that desired action. Now referring to reference marker B, the first step, at step B1, is the start of the timer function, which collects the current time from the Remote Client PC.

At step B2, a decision block is utilized to question whether the timer log file is open or not. If the log file is not open (NO), then the sequence proceeds to step B3 to open the log file. Likewise, if the log file is open (YES) at step B2, then the sequence proceeds to step B4 so that the start data (the date and time in milliseconds) can be cached, placing the time when the process started into cache memory.

Then, continuing on with FIG. 4 at the reference marker C (from FIG. 3), at step C1 there is a stop action for the timer function, which collects the current time at that instant.

Next, at step C2, a calculation is made of the response time for that particular action for that particular TS Client Session, by calculating the difference between the stop time and the cached "start time", which is referred to as the "response time". The process then continues by obtaining the action label, user ID, and response time for that particular PC client, and this information, at step C5, is placed into the timer log file of FIG. 4.
`
As was indicated in FIG. 3, the start and the stop timer functions are called for every single designated action performed within the simulated client script, for each and every one of the multiple number of TS Client sessions. Therefore, an entry is made to the log file for every single designated script action for each and every one of the TS Client sessions.
FIG. 5 is an illustration of a prior art type of graphical analysis which was derived in the Windows Operating System, and illustrates the limited information available in contrast to the greater amount of specific information available in the presently described performance measurement tool. Here, only a few limited varieties of information will be available, since such a prior art system only showed utilization and the number of active users.
In the present enhanced performance tool, it is now possible to view the test server 18 to see what the client users are getting in their simulated operations, which information is all logged into the Timer Log File.
Thus, the present system captures log-on time, time to access an application, and other specific details regarding each client-user that were previously not available in such graphs as that shown in FIG. 5.
The timer Log information in the present application can be printed out and also displayed in graphical format, as is illustrated in the co-pending application, U.S. Ser. No. 09/661,101, entitled "Terminal Server Data File Extraction and Analysis Application". The Y axis or ordinate of FIG. 5 shows the number of active users, which ranges in this case on the graph from 0 to 200 users. The X axis or abscissa is an illustration of the percentage of total poss
