Secure channels and access control require mechanisms to distribute cryptographic keys, but also mechanisms to add and remove users from a system. These topics are covered by what is known as security management. In a separate section, we discuss issues dealing with managing cryptographic keys, secure group management, and handing out certificates that prove the owner is entitled to access specified resources.
9.1 INTRODUCTION TO SECURITY
We start our description of security in distributed systems by taking a look at some general security issues. First, it is necessary to define what a secure system is. We distinguish security policies from security mechanisms, and take a look at the Globus wide-area system, for which a security policy has been explicitly formulated. Our second concern is to consider some general design issues for secure systems. Finally, we briefly discuss some cryptographic algorithms, which play a key role in the design of security protocols.
9.1.1 Security Threats, Policies, and Mechanisms
Security in a computer system is strongly related to the notion of dependability. Informally, a dependable computer system is one that we justifiably trust to deliver its services (Laprie, 1995). As mentioned in Chap. 7, dependability includes availability, reliability, safety, and maintainability. However, if we are to put our trust in a computer system, then confidentiality and integrity should also be taken into account. Confidentiality refers to the property of a computer system whereby its information is disclosed only to authorized parties. Integrity is the characteristic that alterations to a system's assets can be made only in an authorized way. In other words, improper alterations in a secure computer system should be detectable and recoverable. Major assets of any computer system are its hardware, software, and data.
Another way of looking at security in computer systems is that we attempt to protect the services and data it offers against security threats. There are four types of security threats to consider (Pfleeger, 2003):

1. Interception
2. Interruption
3. Modification
4. Fabrication
The concept of interception refers to the situation that an unauthorized party has gained access to a service or data. A typical example of interception is where
communication between two parties has been overheard by someone else. Interception also happens when data are illegally copied, for example, after breaking into a person's private directory in a file system.

An example of interruption is when a file is corrupted or lost. More generally, interruption refers to the situation in which services or data become unavailable, unusable, destroyed, and so on. In this sense, a denial of service attack, by which someone maliciously attempts to make a service inaccessible to other parties, is a security threat that classifies as interruption.
Modifications involve unauthorized changing of data or tampering with a service so that it no longer adheres to its original specifications. Examples of modification include intercepting and subsequently changing transmitted data, tampering with database entries, and changing a program so that it secretly logs the activities of its user.
Fabrication refers to the situation in which additional data or activity are generated that would normally not exist. For example, an intruder may attempt to add an entry into a password file or database. Likewise, it is sometimes possible to break into a system by replaying previously sent messages. We shall come across such examples later in this chapter.

Note that interruption, modification, and fabrication can each be seen as a form of data falsification.
Simply stating that a system should be able to protect itself against all possible security threats is not the way to actually build a secure system. What is first needed is a description of security requirements, that is, a security policy. A security policy describes precisely which actions the entities in a system are allowed to take and which ones are prohibited. Entities include users, services, data, machines, and so on. Once a security policy has been laid down, it becomes possible to concentrate on the security mechanisms by which a policy can be enforced. Important security mechanisms are:
1. Encryption
2. Authentication
3. Authorization
4. Auditing
Encryption is fundamental to computer security. Encryption transforms data into something an attacker cannot understand. In other words, encryption provides a means to implement data confidentiality. In addition, encryption allows us to check whether data have been modified. It thus also provides support for integrity checks. Authentication is used to verify the claimed identity of a user, client, server, host, or other entity. In the case of clients, the basic premise is that before a service starts to perform any work on behalf of a client, the service must learn the client's identity (unless the service is available to all).
Typically, users are authenticated by means of passwords, but there are many other ways to authenticate clients. After a client has been authenticated, it is necessary to check whether that client is authorized to perform the action requested. Access to records in a medical database is a typical example. Depending on who accesses the database, permission may be granted to read records, to modify certain fields in a record, or to add or remove a record.

Auditing tools are used to trace which clients accessed what, and in which way. Although auditing does not really provide any protection against security threats, audit logs can be extremely useful for the analysis of a security breach, and for subsequently taking measures against intruders. For this reason, attackers are generally keen not to leave any traces that could eventually lead to exposing their identity. In this sense, logging accesses sometimes makes attacking a riskier business.
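To make the interplay between these mechanisms concrete, consider the following minimal Python sketch. It is not taken from any particular system; the in-memory user table, the password handling, and the action names are invented for illustration. A service first authenticates a client, then checks whether that client is authorized for the requested action, and writes an audit record for every decision.

    import hashlib
    import hmac
    import logging

    logging.basicConfig(level=logging.INFO, format="%(asctime)s AUDIT %(message)s")

    # Invented in-memory user table: a password hash plus the granted permissions.
    USERS = {
        "alice": {"pw_hash": hashlib.sha256(b"secret").hexdigest(),
                  "permissions": {"read_record", "modify_record"}},
    }

    def authenticate(user, password):
        """Verify the claimed identity of the client (here simply by password)."""
        entry = USERS.get(user)
        if entry is None:
            return False
        given = hashlib.sha256(password.encode()).hexdigest()
        return hmac.compare_digest(given, entry["pw_hash"])  # constant-time comparison

    def authorize(user, action):
        """Check whether the authenticated client may perform the requested action."""
        return action in USERS[user]["permissions"]

    def request(user, password, action):
        if not authenticate(user, password):
            logging.info("denied %s: authentication failed", user)
            return "access denied"
        if not authorize(user, action):
            logging.info("denied %s: not authorized for %s", user, action)
            return "access denied"
        logging.info("granted %s: %s", user, action)  # auditing: record who accessed what
        return "performing " + action + " for " + user

    print(request("alice", "secret", "modify_record"))
    print(request("alice", "secret", "remove_record"))

In a real system the password check would be replaced by a proper authentication protocol, and the audit log itself would have to be protected against tampering.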
Example: The Globus Security Architecture
The notion of a security policy, and the role that security mechanisms play in distributed systems for enforcing such policies, is often best explained by taking a look at a concrete example. Consider the security policy defined for the Globus wide-area system (Chervenak et al., 2000). Globus is a system supporting large-scale distributed computations in which many hosts, files, and other resources are simultaneously used for doing a computation. Such environments are also referred to as computational grids (Foster and Kesselman, 2003). Resources in these grids are often located in different administrative domains that may be located in different parts of the world.

Because users and resources are vast in number and widely spread across different administrative domains, security is essential. To devise and properly use security mechanisms, it is necessary to understand what exactly needs to be protected, and what the assumptions are with respect to security. Simplifying matters somewhat, the security policy for Globus entails the following eight statements, which we explain below (Foster et al., 1998):
1. The environment consists of multiple administrative domains.
2. Local operations (i.e., operations that are carried out only within a single domain) are subject to a local domain security policy only.
3. Global operations (i.e., operations involving several domains) require the initiator to be known in each domain where the operation is carried out.
4. Operations between entities in different domains require mutual authentication.
5. Global authentication replaces local authentication.
6. Controlling access to resources is subject to local security only.
7. Users can delegate rights to processes.
8. A group of processes in the same domain can share credentials.
Globus assumes that the environment consists of multiple administrative domains, where each domain has its own local security policy. It is assumed that local policies cannot be changed just because the domain participates in Globus, nor can the overall policy of Globus override local security decisions. Consequently, security in Globus will restrict itself to operations that affect multiple domains.

Related to this issue is that Globus assumes that operations that are entirely local to a domain are subject only to that domain's security policy. In other words, if an operation is initiated and carried out within a single domain, all security issues will be handled using local security measures only. Globus will not impose additional measures.
The Globus security policy states that requests for operations can be initiated either globally or locally. The initiator, be it a user or a process acting on behalf of a user, must be locally known within each domain where that operation is carried out. For example, a user may have a global name that is mapped to domain-specific local names. How exactly that mapping takes place is left to each domain, as illustrated in the sketch below.
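A minimal sketch of such a mapping, assuming a simple per-domain lookup table, is given here. The domain names, the format of the global name, and the local account names are invented for this example; Globus itself leaves the mechanism entirely to each domain.

    # Invented per-domain tables mapping a global name to a domain-specific local name.
    GLOBAL_TO_LOCAL = {
        "domain-a.example.org": {"/O=Grid/CN=Alice": "alice"},
        "domain-b.example.org": {"/O=Grid/CN=Alice": "a.liskov"},  # different local account
    }

    def local_name(domain, global_name):
        """Resolve the initiator's global name to its local account in the given domain.

        A KeyError means the initiator is not known locally, in which case the
        global operation may not be carried out in this domain (statement 3)."""
        return GLOBAL_TO_LOCAL[domain][global_name]

    print(local_name("domain-b.example.org", "/O=Grid/CN=Alice"))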
An important policy statement is that operations between entities in different domains require mutual authentication. This means, for example, that if a user in one domain makes use of a service from another domain, then the identity of the user will have to be verified. Equally important is that the user will have to be assured that he is using the service he thinks he is using. We return to authentication extensively later in this chapter.

The above two policy issues are combined into the following security requirement. If the identity of a user has been verified, and that user is also known locally in a domain, then he can act as being authenticated for that local domain. This means that Globus requires that its systemwide authentication measures are sufficient to consider a user as already authenticated for a remote domain (where that user is known) when accessing resources in that domain. Additional authentication by that domain should not be necessary.
Once a user (or a process acting on behalf of a user) has been authenticated, it is still necessary to verify the exact access rights with respect to resources. For example, a user wanting to modify a file will first have to be authenticated, after which it can be checked whether or not that user is actually permitted to modify the file. The Globus security policy states that such access control decisions are made entirely locally within the domain where the accessed resource is located.

To explain the seventh statement, consider a mobile agent in Globus that carries out a task by initiating several operations in different domains, one after another. Such an agent may take a long time to complete its task.
Figure 9-1. The Globus security architecture. Protocol 1: creation of a user proxy. Protocol 2: allocation of a resource by the user in a remote domain. Protocol 3: allocation of a resource by a process in a remote domain. Protocol 4: making a user known in a remote domain.
To avoid having to communicate with the user on whose behalf the agent is acting, Globus requires that processes can be delegated a subset of the user's rights. As a consequence, by authenticating an agent and subsequently checking its rights, Globus should be able to allow an agent to initiate an operation without having to contact the agent's owner.
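What such delegation might look like is sketched below as a proxy credential that carries only a subset of the owner's rights and a limited lifetime. The data structure and field names are invented for this illustration and do not correspond to the actual Globus credential format.

    import time
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ProxyCredential:
        owner: str            # the user on whose behalf the agent acts
        rights: frozenset     # subset of the owner's rights delegated to the agent
        expires_at: float     # delegation is valid only for a limited period

    def delegate(owner, owner_rights, subset, lifetime_s):
        """Hand out a credential covering a subset of the owner's rights."""
        if not set(subset) <= set(owner_rights):
            raise ValueError("cannot delegate rights the owner does not have")
        return ProxyCredential(owner, frozenset(subset), time.time() + lifetime_s)

    def may_perform(cred, right):
        """Decide on the credential alone, without contacting the owner."""
        return time.time() < cred.expires_at and right in cred.rights

    cred = delegate("alice", {"read", "write", "allocate"}, {"read", "allocate"}, 3600)
    print(may_perform(cred, "allocate"))  # True
    print(may_perform(cred, "write"))     # False: this right was not delegated

The important point is that the agent can be checked against the credential alone, so the owner need not be contacted for every operation.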
As a final policy statement, Globus requires that groups of processes running within a single domain and acting on behalf of the same user may share a single set of credentials. As will be explained below, credentials are needed for authentication. This statement essentially opens the road to scalable solutions for authentication by not demanding that each process carries its own unique set of credentials.

The Globus security policy allows its designers to concentrate on developing an overall solution for security, assuming each domain enforces its own security policy. Globus concentrates only on security threats involving multiple domains. In particular, the security policy indicates that the important design issues are the representation of a user in a remote domain, and the allocation of resources from a remote domain to a user or his representative. What Globus therefore primarily needs are mechanisms for cross-domain authentication, and for making a user known in remote domains.
For this purpose, two types of representatives are introduced. A user proxy is a process that is given permission to act on behalf of a user for a limited period of time. Resources are represented by resource proxies. A resource proxy is a process running within a specific domain that is used to translate global operations on a resource into local operations that comply with that particular domain's security policy. For example, a user proxy typically communicates with a resource proxy when access to that resource is required.
The Globus security architecture essentially consists of entities such as users, user proxies, resource proxies, and general processes. These entities are located in domains and interact with each other. In particular, the security architecture defines four different protocols, as illustrated in Fig. 9-1 [see also Foster et al. (1998)].
The first protocol describes precisely how a user can create a user proxy and delegate rights to that proxy. In particular, in order to let the user proxy act on behalf of its user, the user gives the proxy an appropriate set of credentials.
The second protocol specifies how a user proxy can request the allocation of a resource in a remote domain. In essence, the protocol tells a resource proxy to create a process in the remote domain after mutual authentication has taken place. That process represents the user (just as the user proxy did), but operates in the same domain as the requested resource. The process is given access to the resource subject to the access control decisions local to that domain.
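The following sketch mimics the flow of this second protocol in plain Python. The class names, the way mutual authentication is represented, and the local policy check are deliberate simplifications invented for illustration; they are not the actual Globus interfaces.

    class UserProxy:
        """Acts on behalf of a user and holds the user's delegated credential."""
        def __init__(self, global_name, credential):
            self.global_name = global_name
            self.credential = credential

        def authenticate(self, peer):
            return True  # placeholder: a real protocol would verify the resource proxy

    class ResourceProxy:
        """Runs inside the remote domain and enforces that domain's local policy."""
        def __init__(self, domain, local_policy):
            self.domain = domain
            self.local_policy = local_policy  # local access control decisions

        def authenticate(self, peer):
            return peer.credential is not None  # placeholder for mutual authentication

        def allocate(self, user_proxy):
            # Step 1: mutual authentication between user proxy and resource proxy.
            if not (self.authenticate(user_proxy) and user_proxy.authenticate(self)):
                raise PermissionError("mutual authentication failed")
            # Step 2: the decision to grant access is made by the local policy only.
            if not self.local_policy(user_proxy.global_name):
                raise PermissionError("local access control denies the request")
            # Step 3: create a process in this domain that represents the user.
            return "process for %s created in %s" % (user_proxy.global_name, self.domain)

    proxy = ResourceProxy("domain-b.example.org",
                          local_policy=lambda name: name == "/O=Grid/CN=Alice")
    print(proxy.allocate(UserProxy("/O=Grid/CN=Alice", credential="delegated-cred")))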
A process created in a remote domain may initiate additional computations in other domains. Consequently, a protocol is needed to allocate resources in a remote domain as requested by a process other than a user proxy. In the Globus system, this type of allocation is done via the user proxy, by letting a process have its associated user proxy request the allocation of resources, essentially following the second protocol.
The fourth and last protocol in the Globus security architecture is the way a user can make himself known in a domain. Assuming that a user has an account in a domain, what needs to be established is that the systemwide credentials as held by a user proxy are automatically converted to credentials that are recognized by the specific domain. The protocol prescribes how the mapping between the global credentials and the local ones can be registered by the user in a mapping table local to that domain.
Specific details of each protocol are described in Foster et al. (1998). The important issue here is that the Globus security architecture reflects its security policy as stated above. The mechanisms used to implement that architecture, in particular the protocols mentioned above, are common to many distributed systems and are discussed extensively in this chapter. The main difficulty in designing secure distributed systems is not so much caused by security mechanisms, but by deciding on how those mechanisms are to be used to enforce a security policy.
In the next section, we consider some of these design decisions.
9.1.2 Design Issues
`
A distributed system, or any computer system for that matter, must provide security services by which a wide range of security policies can be implemented. There are a number of important design issues that need to be taken into account when implementing general-purpose security services. In the following pages, we discuss three of these issues: focus of control, layering of security mechanisms, and simplicity [see also Gollmann (2006)].
Focus of Control
When considering the protection of a (possibly distributed) application, there are essentially three different approaches that can be followed, as shown in Fig. 9-2. The first approach is to concentrate directly on the protection of the data that is associated with the application. By direct, we mean that irrespective of the various operations that can possibly be performed on a data item, the primary concern is to ensure data integrity. Typically, this type of protection occurs in database systems in which various integrity constraints can be formulated that are automatically checked each time a data item is modified [see, for example, Doorn and Rivero (2002)].

The second approach is to concentrate on protection by specifying exactly which operations may be invoked, and by whom, when certain data or resources are to be accessed. In this case, the focus of control is strongly related to access control mechanisms, which we discuss extensively later in this chapter. For example, in an object-based system, it may be decided to specify for each method that is made available to clients which clients are permitted to invoke that method. Alternatively, access control methods can be applied to an entire interface offered by an object, or to the entire object itself. This approach thus allows for various granularities of access control.
A third approach is to focus directly on users by taking measures by which only specific people have access to the application, irrespective of the operations they want to carry out. For example, a database in a bank may be protected by denying access to anyone except the bank's upper management and people specifically authorized to access it. As another example, in many universities, certain data and applications are restricted to be used by faculty and staff members only, whereas access by students is not allowed. In effect, control is focused on defining the roles that users have, and once a user's role has been verified, access to a resource is either granted or denied. As part of designing a secure system, it is thus necessary to define the roles that people may have, and to provide mechanisms to support role-based access control (a small sketch of this approach follows Fig. 9-2). We return to roles later in this chapter.
`
Figure 9-2. Three approaches for protection against security threats. (a) Protection against invalid operations. (b) Protection against unauthorized invocations. (c) Protection against unauthorized users.
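As a rough illustration of the role-based approach in (c), the following sketch grants or denies access based solely on the invoker's verified role, irrespective of the operation. The roles, role assignments, and protected resources are made-up examples.

    # Invented role assignments and per-resource role requirements.
    USER_ROLES = {"carol": "upper_management", "dave": "student", "erin": "faculty"}
    REQUIRED_ROLES = {
        "bank_database": {"upper_management"},
        "grade_records": {"faculty", "staff"},
    }

    def access_allowed(user, resource):
        """Grant or deny access purely on the basis of the user's (verified) role,
        irrespective of the operation the user wants to carry out."""
        return USER_ROLES.get(user) in REQUIRED_ROLES.get(resource, set())

    print(access_allowed("carol", "bank_database"))  # True
    print(access_allowed("dave", "grade_records"))   # False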
Layering of Security Mechanisms
An important issue in designing secure systems is to decide at which level security mechanisms should be placed. A level in this context is related to the logical organization of a system into a number of layers. For example, computer networks are often organized into layers following some reference model, as we discussed in Chap. 4. In Chap. 1, we introduced the organization of distributed systems consisting of separate layers for applications, middleware, operating system services, and the operating system kernel. Combining the layered organization of computer networks and distributed systems leads roughly to what is shown in Fig. 9-3.

In essence, Fig. 9-3 separates general-purpose services from communication services. This separation is important for understanding the layering of security in distributed systems and, in particular, the notion of trust. The difference between trust and security is important. A system is either secure or it is not (taking various probabilistic measures into account), but whether a client considers a system to be secure is a matter of trust (Bishop, 2003). Security is technical; trust is emotional. In which layer security mechanisms are placed depends on the trust a client has in how secure the services are in a particular layer.
Figure 9-3. The logical organization of a distributed system into several layers.
As an example, consider an organization located at different sites that are connected through a communication service such as Switched Multi-megabit Data Service (SMDS). An SMDS network can be thought of as a link-level backbone connecting various local-area networks at possibly geographically dispersed sites, as shown in Fig. 9-4.
Figure 9-4. Several sites connected through a wide-area backbone service.
Security can be provided by placing encryption devices at each SMDS router, as also shown in Fig. 9-4. These devices automatically encrypt and decrypt packets that are sent between sites, but do not otherwise provide secure communication between hosts at the same site. If Alice at site A sends a message to Bob at site B, and she is worried about her message being intercepted, she must at least trust the encryption of intersite traffic to work properly. This means, for example, that she must trust the system administrators at both sites to have taken the proper measures against tampering with the devices.
Now suppose that Alice does not trust the security of intersite traffic. She may then decide to take her own measures by using a transport-level security service such as SSL. SSL stands for Secure Sockets Layer and can be used to securely send messages across a TCP connection. We will discuss the details of SSL in Chap. 12 when discussing Web-based systems. The important thing to observe here is that SSL allows Alice to set up a secure connection to Bob.
All transport-level messages will be encrypted, and at the SMDS level as well, but that is of no concern to Alice. In this case, Alice will have to put her trust in SSL. In other words, she believes that SSL is secure.
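What relying on transport-level security looks like in practice is sketched below using Python's standard ssl module, which implements TLS, the modern successor of SSL. The peer host is just a publicly reachable example site, not part of the Alice and Bob scenario above.

    import socket
    import ssl

    # The TLS library authenticates the server against trusted certificates and
    # encrypts everything sent over the socket; the application simply uses it.
    hostname = "example.org"                 # stand-in for Bob's site
    context = ssl.create_default_context()   # verifies the server certificate by default

    with socket.create_connection((hostname, 443)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=hostname) as tls_sock:
            print("negotiated:", tls_sock.version())  # e.g., TLSv1.3
            tls_sock.sendall(b"HEAD / HTTP/1.0\r\nHost: example.org\r\n\r\n")
            print(tls_sock.recv(80))          # the reply arrives over the secure channel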
In distributed systems, security mechanisms are often placed in the middleware layer. If Alice does not trust SSL, she may want to use a local secure RPC service. Again, she will have to trust this RPC service to do what it promises, such as not leaking information or properly authenticating clients and servers.

Security services that are placed in the middleware layer of a distributed system can be trusted only if the services they rely on to be secure are indeed secure. For example, if a secure RPC service is partly implemented by means of SSL, then trust in the RPC service depends on how much trust one has in SSL. If SSL is not trusted, then there can be no trust in the security of the RPC service.
Distribution of Security Mechanisms
Dependencies between services regarding trust lead to the notion of a Trusted Computing Base (TCB). A TCB is the set of all security mechanisms in a (distributed) computer system that are needed to enforce a security policy, and that thus need to be trusted. The smaller the TCB, the better. If a distributed system is built as middleware on an existing network operating system, its security may depend on the security of the underlying local operating systems. In other words, the TCB in a distributed system may include the local operating systems at various hosts.

Consider a file server in a distributed file system. Such a server may need to rely on the various protection mechanisms offered by its local operating system. Such mechanisms include not only those for protecting files against accesses by processes other than the file server, but also mechanisms to protect the file server from being maliciously brought down.
Middleware-based distributed systems thus require trust in the existing local operating systems they depend on. If such trust does not exist, then part of the functionality of the local operating systems may need to be incorporated into the distributed system itself. Consider a microkernel operating system, in which most operating-system services run as normal user processes. In this case, the file system, for instance, can be entirely replaced by one tailored to the specific needs of a distributed system, including its various security measures.

Consistent with this approach is to separate security services from other types of services by distributing services across different machines depending on the amount of security required. For example, for a secure distributed file system, it may be possible to isolate the file server from clients by placing the server on a machine with a trusted operating system, possibly running a dedicated secure file system. Clients and their applications are placed on untrusted machines.

This separation effectively reduces the TCB to a relatively small number of machines and software components. By subsequently protecting those machines against security attacks from the outside, overall trust in the security of the distributed system can be increased.
Preventing clients and their applications from directly accessing critical services is what is followed in the Reduced Interfaces for Secure System Components (RISSC) approach, as described in Neumann (1995). In the RISSC approach, any security-critical server is placed on a separate machine isolated from end-user systems using low-level secure network interfaces, as shown in Fig. 9-5. Clients and their applications run on different machines and can access the secured server only through these network interfaces.
Figure 9-5. The principle of RISSC as applied to secure distributed systems.
`
Simplicity
Another important design issue related to deciding in which layer to place security mechanisms is that of simplicity. Designing a secure computer system is generally considered a difficult task. Consequently, if a system designer can use a few simple mechanisms that are easily understood and trusted to work, the better it is.

Unfortunately, simple mechanisms are not always sufficient for implementing security policies. Consider once again the situation in which Alice wants to send a message to Bob, as discussed above. Link-level encryption is a simple and easy-to-understand mechanism to protect against interception of intersite message traffic. However, much more is needed if Alice wants to be sure that only Bob will receive her messages. In that case, user-level authentication services are needed, and Alice may need to be aware of how such services work in order to put her trust in them. User-level authentication may therefore require at least a notion of cryptographic keys and an awareness of mechanisms such as certificates, despite the fact that many security services are highly automated and hidden from users.
In other cases, the application itself is inherently complex and introducing security only makes matters worse. An example application domain involving complex security protocols (as we discuss later in this chapter) is that of digital payment systems. The complexity of digital payment protocols is often caused by the fact that multiple parties need to communicate to make a payment.
In these cases, it is important that the underlying mechanisms that are used to implement the protocols are relatively simple and easy to understand. Simplicity will contribute to the trust that end users will put into the application and, more importantly, will contribute to convincing the designers that the system has no security holes.
`
9.1.3 Cryptography
Fundamental to security in distributed systems is the use of cryptographic techniques. The basic idea of applying these techniques is simple. Consider a sender S wanting to transmit a message m to a receiver R. To protect the message against security threats, the sender first encrypts it into an unintelligible message m', and subsequently sends m' to R. R, in turn, must decrypt the received message into its original form m.

Encryption and decryption are accomplished by using cryptographic methods parameterized by keys, as shown in Fig. 9-6. The original form of the message that is sent is called the plaintext, shown as P in Fig. 9-6; the encrypted form is referred to as the ciphertext, illustrated as C.
Figure 9-6. Intruders and eavesdroppers in communication. The sender encrypts the plaintext P with encryption key EK; the receiver decrypts with decryption key DK. A passive intruder only listens to the ciphertext C; an active intruder can alter or insert messages.
To describe the various security protocols that are used in building security services for distributed systems, it is useful to have a notation to relate plaintext, ciphertext, and keys. Following the common notational conventions, we will use C = EK(P) to denote that the ciphertext C is obtained by encrypting the plaintext P using key K. Likewise, P = DK(C) is used to express the decryption of the ciphertext C using key K, resulting in the plaintext P.
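The notation can be made concrete with a small sketch using a symmetric cipher. The third-party cryptography package and its Fernet construction are used here merely as a stand-in for EK and DK; any cipher with a shared key K would do.

    # Requires the third-party package: pip install cryptography
    from cryptography.fernet import Fernet

    K = Fernet.generate_key()   # the key K shared between sender and receiver
    P = b"attack at dawn"       # the plaintext P

    C = Fernet(K).encrypt(P)          # C = EK(P): unintelligible without K
    assert Fernet(K).decrypt(C) == P  # P = DK(C): decryption recovers the plaintext

    print(C)  # to an eavesdropper this looks like random data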
Returning to our example shown in Fig. 9-6, while transferring a message as ciphertext C, there are three different attacks that we need to protect against, and for which encryption helps. First, an intruder may intercept the message without either the sender or receiver being aware that eavesdropping is happening.
Of course, if the transmitted message has been encrypted in such a way that it cannot be easily decrypted without having the proper key, interception is useless: the intruder will see only unintelligible data. (By the way, the fact alone that a message is being transmitted may sometimes be enough for an intruder to draw conclusions. For example, if during a world crisis the amount of traffic into the White House suddenly drops dramatically while the amount of traffic going into a certain mountain in Colorado increases by the same amount, there may be useful information in knowing that.)

The second type of attack that needs to be dealt with is that of modifying the message. Modifying plaintext is easy; modifying ciphertext that has been properly encrypted is much more difficult, because the intruder will first have to decrypt the message before he can meaningfully modify it. In addition, he will also have to properly encrypt it again, or otherwise the receiver may notice that the message has been tampered



