`
`UNITED STATES DEPARTMENT OF COMMERCE
`United States Patent and Trademark Office
`Address: COMMISSIONER FOR PATENTS
`P.O. Box 1450
`Alexandria, Virginia 22313-1450
`
APPLICATION NO.: 18/033,197
FILING DATE: 04/21/2023
FIRST NAMED INVENTOR: Yuki NAKAI
ATTORNEY DOCKET NO.: 1946-1912
CONFIRMATION NO.: 1920

Paratus Law Group, PLLC
1765 Greensboro Station Place
Suite 320
Tysons Corner, VA 22102

EXAMINER: KALAPODAS, DRAMOS
ART UNIT: 2487
NOTIFICATION DATE: 01/23/2025
DELIVERY MODE: PAPER
`
`Please find below and/or attached an Office communication concerning this application or proceeding.
`
`The time period for reply, if any, is set in the attached communication.
`
`PTOL-90A (Rev. 04/07)
`
`
`
Office Action Summary

Application No.: 18/033,197
Applicant(s): NAKAI et al.
Examiner: DRAMOS KALAPODAS
Art Unit: 2487
AIA (FITF) Status: Yes
`
`
`
-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --
`Period for Reply
`
A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).
`
`Status
`
`
`
1) ☒ Responsive to communication(s) filed on 12/04/2024.
   ☐ A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) ☒ This action is FINAL.    2b) ☐ This action is non-final.
3) ☐ An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) ☐ Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.
`
Disposition of Claims*
5) ☒ Claim(s) 1, 3-5 and 7-20 is/are pending in the application.
   5a) Of the above claim(s) ___ is/are withdrawn from consideration.
6) ☐ Claim(s) ___ is/are allowed.
7) ☒ Claim(s) 1, 3-5 and 7-20 is/are rejected.
8) ☐ Claim(s) ___ is/are objected to.
9) ☐ Claim(s) ___ are subject to restriction and/or election requirement.
* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.
`
`
Application Papers
10) ☐ The specification is objected to by the Examiner.
11) ☒ The drawing(s) filed on 04/21/2023 is/are: a) ☒ accepted or b) ☐ objected to by the Examiner.
    Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
    Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).
`
Priority under 35 U.S.C. § 119
12) ☒ Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
    Certified copies:
    a) ☒ All    b) ☐ Some**    c) ☐ None of the:
    1. ☒ Certified copies of the priority documents have been received.
    2. ☐ Certified copies of the priority documents have been received in Application No. ___.
    3. ☐ Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
** See the attached detailed Office action for a list of the certified copies not received.
`
Attachment(s)
1) ☒ Notice of References Cited (PTO-892)
2) ☐ Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b), Paper No(s)/Mail Date ___
3) ☐ Interview Summary (PTO-413), Paper No(s)/Mail Date ___
4) ☐ Other: ___

U.S. Patent and Trademark Office
`
`PTOL-326 (Rev. 11-13)
`
`Office Action Summary
`
`Part of Paper No./Mail Date 20250122
`
`
`
`Application/Control Number: 18/033,197
`Art Unit: 2487
`
`Page 2
`
`DETAILED ACTION
`
`Notice of Pre-AlA or AIA Status
`
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
`
`Claim Status
`
2. Claims 1, 3-5, and 7-20 are currently pending. Claims 2 and 6 are cancelled.
`
`Response to Arguments
`
3. Applicant's arguments with respect to claims 1, 3-5, and 7-20 have been considered but are moot in view of the new ground(s) of rejection. The arguments raised in the Remarks are directed to the amended claims.
`
`Claim Rejections - 35 USC § 103
`
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

    A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
`
`
`
`
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

The applied reference to Hattori has a common inventor with the instant application.
`
4. Claims 1, 3-5, 7-10 and 14-20 are rejected under 35 U.S.C. 103 as being obvious over Hironori Hattori et al. (hereinafter Hattori) (US 2019/0089886), having a publication date of Mar. 21, 2019, in view of Takashi Adachi et al. (hereinafter Adachi) (JP 2019-180017 A).
`
Re Claim 1. (Currently Amended) Hattori discloses an information processing device (an information processing apparatus, Abstract) comprising:

a control unit configured to perform image-capture control on a basis of image-capture rendering information indicating a change in captured image and image-capture target information indicating an image-capture target (a camera control system performing image-capturing by a camera-based information system, e.g., based on camerawork per Fig. 2 element (31), imaging a subject(s), i.e., at specific target position information obtained from element (32), corresponding to various condition changes, to generate images of the captured subject, Abstract, or at Fig. 11 and Pars. [0001, 0006, 0013, 0019, 0022-0024]),

wherein the image-capture target information includes information indicating a person being the image-capture target (the image-capture target information includes camerawork information setting the composition such that the range in which players exist, i.e., persons, is contained inside the image, Par. [0181-0185] and Fig. 14, or Par. [0199], Fig. 17, obviously being that the indicated "players" contained in the images are "persons" which are considered targets within the controlled camera information system, Par. [0181, 0189]),

the image-capture target information includes information indicating an image-capture target range pertaining to the person being the image-capture target (the image-capture target information includes camerawork information setting the composition such that the entire range, Par. [0181], in which players exist, i.e., persons, is contained inside the image, i.e., being in the image-capture range, Par. [0181-0185] and Fig. 14, or Par. [0199], Fig. 17, or control information generation section 211 detecting the subject from a captured image, Par. [0224]),

the image-capture target range being which part of the person is contained in the captured image (the image capture of the subject, i.e., target or person, has the camerawork information specifying that the full body or the upper part of the body is contained in the image, Fig. 6, Par. [0105-0110], according to specific coordinates of the full body position of the subject 151 in the image F111, Par. [0113, 0115], or per Fig. 11, Par. [0117], Fig. 12, and Fig. 13, Par. [0170-0172], etc.), and

the control unit is implemented via at least one processor (an information processing apparatus, Par. [0013], and method performed by a processor of the processing apparatus, Par. [0022], and CPU 1001, Fig. 23, Par. [0238]).
`
`
`
`
In an analogous art, Adachi teaches, the control unit is implemented via at least one processor (processor, Pg. 5, per Fig. 12 the CPU 1200, Lin. 9-19).

One of ordinary skill would have found it obvious before the effective filing date of the invention to consider the suggestion made by Hattori at Par. [0206], indicating the selection of a single subject from an image comprising a plurality of persons, and be persuaded to seek other art(s) teaching the subject matter in more detail, as disclosed in the analogous art to Adachi, hence deeming the combination predictable.
`
`2. (Canceled).
`
Re Claim 3. (Currently Amended) Hattori and Adachi disclose the information processing device according to claim [[2]]1.

Hattori teaches that, wherein the person being the image-capture target is a person selected from among candidates (citing: "when a group in the image including multiple artists treated as a subject, the composition may be controlled on the basis ... such that only a subject including the artists who are currently singing is contained in the image" is selected, Par. [0206]).
`
Adachi also teaches, wherein the person being the image-capture target is a person selected from among candidates (the person is selected from an image by estimating information of a person existing in the area where the image contains a plurality of persons over a wider area, Abstract and Figs. 5 and 6, Pg. 10, as highlighted in the attached patent document renumbered copy, citing: "Note that when the estimation area determination unit 404 in the present embodiment determines a plurality of estimation areas including at least one person among the plurality of persons detected by the detection unit 403, the human face area is included in the estimation area.").
`
Re Claim 4. (Currently Amended) Hattori and Adachi disclose the information processing device according to claim [[2]]1.

Adachi teaches, wherein the person being the image-capture target is a person recognized in a designated area of an image-capture target space (the person is selected from an image by estimating information of a person existing in the area where the image contains a plurality of persons over a wider area, Abstract and Figs. 5 and 6, Pg. 10, highlighted).
`
Re Claim 5. (Original) Hattori and Adachi disclose the information processing device according to claim 1.

Hattori teaches, wherein the image-capture target information includes information indicating a position in an image-capture target space (target position of subject 151, at Par. [0113, 0117, 0119-0127], Fig. 7).
`
`6. (Canceled).
`
Re Claim 7. (Currently Amended) Hattori and Adachi disclose the information processing device according to claim [[6]]1.

Hattori teaches, wherein the information indicating the image-capture target range corresponds to information indicating a range of positions relative to the person in which an image of an instrument is captured in a case where the instrument is in use by the person being the image-capture target (information on objects being carried by people, at least at Par. [0058]).
`
Re Claim 8. (Original) Hattori and Adachi disclose the information processing device according to claim 1.

Hattori teaches, wherein the image-capture control includes at least any one of pan control, tilt control, or zoom control for a real camera (within a first camera, the real manual camera has Pan-Tilt-Zoom (PTZ) capabilities, as do the other cameras, Pars. [0090, 0103, 0124-0128, 0157-0158, 0170-0172, 0182-0187]).
`
Re Claim 9. (Original) Hattori and Adachi disclose the information processing device according to claim 1.

Adachi teaches, wherein the image-capture control includes image-clipping control for an image captured by a real camera (citing from Pg. 11: "At this time, the attribute estimation unit 406 performs the following process so as not to estimate attribute information redundantly. For example, the attribute estimation unit 406 cuts out the estimated area 604c from the captured image 604, and estimates attribute information for a person included in the extracted image of the estimated area 604c. Alternatively, the storage unit 408 may store the position information of the person whose attribute information has been estimated by the attribute estimation unit 406. For example, when estimating the attribute information of the estimated area 604c in the captured image, a person who has already estimated attribute information from the position information stored in the storage unit 408 may be excluded from the processing target.").
`
Re Claim 10. (Original) Hattori and Adachi disclose the information processing device according to claim 1.

Hattori teaches, wherein the image-capture rendering information includes rendering information regarding zoom (camerawork information includes zoom information, Par. [0113], from PTZ cameras at Pars. [0090, 0103, 0124-0128, 0157-0158, 0170-0172, 0182-0187]).
`
Re Claim 14. (Currently Amended) Hattori and Adachi disclose the information processing device according to claim 1, further comprising:

Hattori teaches the device, a selection reception unit configured to receive selection of information to be applied to the image-capture control from among a plurality of types of information for at least any one of the image-capture rendering information or the image-capture target information, wherein the selection reception unit is configured to perform display control of a selection reception screen for receiving selection of the information to be applied (further allowing the user to enter commands to a display device, Par. [0239]),

wherein the selection reception unit is implemented via at least one processor (an information processing apparatus, Par. [0013], and method performed by a processor of the processing apparatus, Par. [0022]).

Adachi teaches, the display (including an imaging control user interface on a display device, citing from Pg. 4: "FIG. 1 is a diagram showing a system configuration according to the present embodiment. The image processing apparatus 100 is an apparatus that executes image processing to be described later. The image processing apparatus 100 is realized by, for example, a personal computer in which a program for realizing an image processing function described later is installed. The display device 101 is connected to the image processing device 100, and is a display device for a user to browse data output by image processing to be described later, a UI (user interface), and the like."),

the control unit is implemented via at least one processor (Pg. 5, per Fig. 12 the CPU 1200, Lin. 9-19).
`
Re Claim 15. (Currently Amended) Hattori and Adachi disclose the information processing device according to claim 14.

Adachi teaches, wherein a plurality of cameras to be subject to the image-capture control is provided, and the selection reception unit is further configured to perform, as the display control of the selection reception screen, display control of a screen capable of receiving the selection for each of the cameras (the same display and user interface in Adachi would have been considered obvious to be applied to the multiple cameras in Hattori, performing the same functions as disclosed, citing: "The display device 101 is connected to the image processing device 100, and is a display device for a user to browse data output by image processing to be described later, a UI (user interface), and the like." at Pg. 4 and FIG. 1).
`
Re Claim 16. (Currently Amended) Hattori and Adachi disclose the information processing device according to claim 15, further comprising:

Hattori teaches, the device having, an operation screen display control unit configured to perform, as the image-capture control, display control of an operation panel on which an execution instruction operation element of control based on the image-capture rendering information or the image-capture target information selected on the selection reception screen is deployed (a user input and operation command, Par. [0238], Fig. 23, a display panel, Par. [0239], an image control panel in Figs. 12-13, Par. [0169]),

wherein the selection reception unit is capable of receiving a plurality of sets of the selection for each of the cameras (at the reception sections 12 and 36, Par. [0288, 0298], Fig. 1, Fig. 2, 15, Par. [0056-0065, 0071-0072, 0097], or information reception section 91, Par. [0079-0081, 0304]), [[and]]

Adachi teaches, the operation screen display control unit is further configured to perform control to display the operation panel for each of the sets (citing: "The display device 101 is connected to the image processing device 100, and is a display device for a user to browse data output by image processing to be described later, a UI (user interface), and the like." at Pg. 4 and FIG. 1), and

the operation screen display control unit is implemented via at least one processor (processor, Pg. 5, per Fig. 12 the CPU 1200, Lin. 9-19).
`
Re Claim 17. (Currently Amended) Hattori and Adachi disclose the information processing device according to claim 1.

Hattori teaches, the information processing device functioning as a part of an image processing system capable of switching an output image among images captured by a plurality of cameras, wherein the control unit is further configured to perform, on a basis of control information regarding the switching of the output image, the image-capture control based on the image-capture rendering information (a switching method on the basis of information control of the camerawork information, Par. [0187, 0189-0198, 0200-0206, 0208], Fig. 15, or Fig. 19, etc.).
`
Re Claim 18. (Currently Amended) Hattori and Adachi disclose the information processing device according to claim 17.

Hattori teaches, wherein the control unit is further configured to start, in response to selection of the output image or a next output candidate image from among the images captured by the plurality of cameras, the control for an image-capture source camera of the selected captured image (Par. [0203, 0206, 0210], Fig. 10).
`
`
`
`
Re Claim 19. (Currently Amended) This claim represents the information processing method comprising causing an information processing device to perform image-capture control implemented by the apparatus of claim 1; hence it is rejected on the same mapped evidence mutatis mutandis.

Re Claim 20. (Currently Amended) This claim represents the memory storing the program code readable by a computer device, per Hattori (the memory ROM 1002, RAM 1003 in Figs. 2 or 23, Pars. [0065, 0238], considered non-transitory structures), the program causing the computer device to execute each and every limitation of apparatus claim 1; hence it is rejected on the same mapped evidence mutatis mutandis.
`
5. Claims 11-13 are rejected under 35 U.S.C. 103 as being obvious over Hattori and Adachi in view of Jun Hirano (hereinafter Hirano) (US 9,167,134).
`
Re Claim 11. (Original) Hattori and Adachi disclose the information processing device according to claim 10.

Hirano teaches, wherein the image-capture rendering information corresponds to information including at least any one of a zoom speed, a zoom start angle of view, or a zoom end angle of view (indicating the zoom speed, at Fig. 6, Col. 8, Lin. 48-55).

Though Hattori and Adachi teach the zoom and angle of view adjustment, they do not teach the zoom speed performed in a specific image area comprising a subject. One of ordinary skill would have found it obvious before the effective filing date of the invention to seek similar processing techniques teaching the use of such selection based on the information obtained from other analogous art disclosing the subject matter in more detail, as found in the art to Hirano, hence deeming the combination predictable.
`
Re Claim 12. (Original) Hattori and Adachi disclose the information processing device according to claim 1.

Hirano teaches, wherein the image-capture rendering information includes rendering information regarding camera shake (a camera shaking control, hence information regarding the shake, Col. 9, Lin. 31-43).

Although Hattori and Adachi teach the zoom and angle of view adjustment, they do not teach the zoom speed or the camera shaking information. One of ordinary skill would have found it obvious before the effective filing date of the invention to seek similar processing techniques and to associate the camera stabilization teachings during the use of such functions based on the information obtained from other analogous art disclosing the subject matter in detail, as found in the art to Hirano, hence deeming the combination predictable.
`
Re Claim 13. (Original) Hattori and Adachi disclose the information processing device according to claim 1.

Hirano teaches, wherein the image-capture rendering information includes rendering information regarding focus (a camera control and adjustment based on focus information, Col. 1, Lin. 22-23).

In Hattori and Adachi, the teaching of the zoom and angle of view adjustment does include the focus information, which one of ordinary skill would have found obvious to consider as an extrinsic camera function associated with the above PTZ controls, and would have been encouraged before the effective filing date of the invention to seek similar processing techniques and to associate the camera PTZ teachings with the focus information during the use of such functions, identified from other analogous art disclosing the focus information found in the art to Hirano, hence deeming the combination predictable.
`
Conclusion

6. THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DRAMOS KALAPODAS, whose telephone number is (571) 272-4622. The examiner can normally be reached on Monday-Friday, 8am-5pm.
`
`
`
`
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, David Czekaj, can be reached on 571-272-7327. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
`
`/DRAMOS KALAPODAS/
`
DRAMOS KALAPODAS
`
`Primary Examiner
`
`Art Unit 2487
`
`