www.uspto.gov

UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS
P.O. Box 1450
Alexandria, Virginia 22313-1450

APPLICATION NO.: 18/015,630
FILING DATE: 01/11/2023
FIRST NAMED INVENTOR: Tsuyoshi ISHIKAWA
ATTORNEY DOCKET NO.: 1946-1873
CONFIRMATION NO.: 1930

Paratus Law Group, PLLC
1765 Greensboro Station Place
Suite 320
Tysons Corner, VA 22102

EXAMINER: OKEBATO, SAHLU
ART UNIT: 2625
NOTIFICATION DATE: 11/07/2024
DELIVERY MODE: PAPER

Please find below and/or attached an Office communication concerning this application or proceeding.

The time period for reply, if any, is set in the attached communication.

PTOL-90A (Rev. 04/07)

Office Action Summary

Application No.: 18/015,630
Applicant(s): ISHIKAWA et al.
Examiner: SAHLU OKEBATO
Art Unit: 2625
AIA (FITF) Status: Yes

-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --

Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).

Status

1) [X] Responsive to communication(s) filed on 09/23/2024.
   [ ] A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) [X] This action is FINAL.    2b) [ ] This action is non-final.
3) [ ] An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) [ ] Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.

Disposition of Claims*

5) [X] Claim(s) 1-21 is/are pending in the application.
   5a) Of the above claim(s) ___ is/are withdrawn from consideration.
6) [ ] Claim(s) ___ is/are allowed.
7) [X] Claim(s) 1-21 is/are rejected.
8) [ ] Claim(s) ___ is/are objected to.
9) [ ] Claim(s) ___ are subject to restriction and/or election requirement.

* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.

Application Papers

10) [ ] The specification is objected to by the Examiner.
11) [ ] The drawing(s) filed on ___ is/are: a) [ ] accepted or b) [ ] objected to by the Examiner.
    Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
    Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).

Priority under 35 U.S.C. § 119

12) [X] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
    Certified copies:
    a) [X] All    b) [ ] Some**    c) [ ] None of the:
    1. [X] Certified copies of the priority documents have been received.
    2. [ ] Certified copies of the priority documents have been received in Application No. ___.
    3. [ ] Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
    ** See the attached detailed Office action for a list of the certified copies not received.

Attachment(s)

1) [X] Notice of References Cited (PTO-892)
2) [ ] Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b) Paper No(s)/Mail Date ___
3) [ ] Interview Summary (PTO-413) Paper No(s)/Mail Date ___
4) [ ] Other: ___

U.S. Patent and Trademark Office
PTOL-326 (Rev. 11-13)    Office Action Summary    Part of Paper No./Mail Date 20241104

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

    A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-21 are rejected under 35 U.S.C. 103 as being unpatentable over Park, US PGPUB 20140336669, in view of Miller, US Patent 9767616.

As to claim 1, Park discloses an information processing apparatus, comprising:

a real object detector configured to detect a real object in a real space in which a part of a body is detected ([0059] For example, in a three-dimensional space composed of x-axis, y-axis, and z-axis, an object has at least three degrees of freedom to determine a spatial position of the object (the position in each axis) and three degrees of freedom to determine a spatial posture of the object (the rotation angle in each axis)), and

a body part detector configured to detect a position and a pose of the part of the body in relation to the real object ([0061] As the operator moves the hands while wearing the haptic glove 400, a depth sensor 150 recognizes the shape, position, posture, gesture, and motion of the haptic glove 400, to display an image displayed on the display 120 combined with a representation of the haptic glove 400 in an overlay manner); and

a drive signal generator configured to generate a drive signal based on a positional relationship between the real object and the part of the body ([0062] The operator may control the operation of the surgical tool 230 of the slave system 200 by moving hands while observing the image of the haptic glove 400 that is overlaid on the image displayed on the display 120),

the drive signal being supplied to a tactile sense providing mechanism that is worn on the part of the body ([0061] The haptic glove 400 is a glove that can be wearable on hands of the operator),

wherein the positional relationship used to generate the drive signal is determined according to the type of the real object and one or more shape characteristics of the real object ([0058] The end effector represents a portion that comes into direct contact with the hands of the operator (S), and may be provided in the form of a pencil and a stick, but the shape of the end effector is not limited thereto), and

wherein the real object detector, the body part detector, and the drive signal generator are each implemented via at least one processor ([0073] the master system 100 may further include a signal processor (not shown) that receives biomedical information transmitted from the slave system 200, processes the received biomedical information, and outputs the processed biomedical information to the display 120).

Park does not specifically disclose determining a type of the real object.

However, in the same endeavor, Miller discloses determining a type of the real object (The method of claim 1, further comprising: identifying a first real object of a first type; recognizing the real object of the first type with a first object recognizer that recognizes the first type of object, claim 14).

Therefore, it would have been obvious to one of ordinary skill in the art to modify the disclosure of Park to further include Miller's object identification number in order to improve and fix errors and missing information in geometric maps, with the intention of activating desired information effectively.

As to claim 19, Park discloses a tactile sense providing system, comprising: a tactile sense providing mechanism configured to be worn on a part of a body ([0061] The haptic glove 400 is a glove that can be wearable on hands of the operator); and

an information processing apparatus that includes circuitry configured to detect a real object in a real space in which the part of the body is detected ([0059] For example, in a three-dimensional space composed of x-axis, y-axis, and z-axis, an object has at least three degrees of freedom to determine a spatial position of the object (the position in each axis) and three degrees of freedom to determine a spatial posture of the object (the rotation angle in each axis)),

detect a position and a pose of the part of the body in relation to the real object ([0061] As the operator moves the hands while wearing the haptic glove 400, a depth sensor 150 recognizes the shape, position, posture, gesture, and motion of the haptic glove 400, to display an image displayed on the display 120 combined with a representation of the haptic glove 400 in an overlay manner), and

generate a drive signal based on a positional relationship between the real object and the part of the body, the drive signal being supplied to the tactile sense providing mechanism (signal processing unit 122, fig. 8, wherein the signal processing unit 122 has a function to process signals related to the operation of the master device 10 based on a signal inputted from the communication unit 110),

wherein the positional relationship used to generate the drive signal is determined according to the type of the real object and one or more shape characteristics of the real object ([0058] The end effector represents a portion that comes into direct contact with the hands of the operator (S), and may be provided in the form of a pencil and a stick, but the shape of the end effector is not limited thereto).

Park does not specifically disclose determining a type of the real object.

However, in the same endeavor, Miller discloses determining a type of the real object (The method of claim 1, further comprising: identifying a first real object of a first type; recognizing the real object of the first type with a first object recognizer that recognizes the first type of object, claim 14).

Therefore, it would have been obvious to one of ordinary skill in the art to modify the disclosure of Park to further include Miller's object identification number in order to improve and fix errors and missing information in geometric maps, with the intention of activating desired information effectively.

As to claim 20, Park discloses a non-transitory computer-readable storage medium having embodied thereon a program, which when executed by an information processing apparatus of a computer causes the information processing apparatus to execute a method,

the method comprising: detecting a real object in a real space in which a part of a body is detected (tactile sensation presenting units 130, fig. 9);

detecting a position and a pose of the part of the body in relation to the real object ([0061] As the operator moves the hands while wearing the haptic glove 400, a depth sensor 150 recognizes the shape, position, posture, gesture, and motion of the haptic glove 400, to display an image displayed on the display 120 combined with a representation of the haptic glove 400 in an overlay manner); and

generating a drive signal based on a positional relationship between the real object and the part of the body ([0066] If the actual surgical tool 230 is imaged (e.g., videoed, photographed, etc.) through the endoscope 220 as shown in FIG. 4, a portion on which the virtual surgical tool overlaps the actual surgical tool 230 may represent the actual surgical tool 230),

the drive signal being supplied to a tactile sense providing mechanism that is worn on the part of the body ([0061] As the operator moves the hands while wearing the haptic glove 400, a depth sensor 150 recognizes the shape, position, posture, gesture, and motion of the haptic glove 400, to display an image displayed on the display 120 combined with a representation of the haptic glove 400 in an overlay manner),

wherein the positional relationship used to generate the drive signal is determined according to the type of the real object and one or more shape characteristics of the real object ([0058] The end effector represents a portion that comes into direct contact with the hands of the operator (S), and may be provided in the form of a pencil and a stick, but the shape of the end effector is not limited thereto).

Park does not specifically disclose determining a type of the real object.

However, in the same endeavor, Miller discloses determining a type of the real object (The method of claim 1, further comprising: identifying a first real object of a first type; recognizing the real object of the first type with a first object recognizer that recognizes the first type of object, claim 14).

Therefore, it would have been obvious to one of ordinary skill in the art to modify the disclosure of Park to further include Miller's object identification number in order to improve and fix errors and missing information in geometric maps, with the intention of activating desired information effectively.

As to claim 2, the combination of Park and Miller discloses the information processing apparatus according to claim 1. The combination further discloses that the drive signal generator generates the drive signal further according to the type (Miller, The method of claim 1, further comprising: identifying a first real object of a first type; recognizing the real object of the first type with a first object recognizer that recognizes the first type of object, claim 14).

As to claim 3, the combination of Park and Miller discloses the information processing apparatus according to claim 2. The combination further discloses that the drive signal generator is further configured to determine whether the real object and the part of the body are in contact with each other, based on the positional relationship between the real object and the part of the body, and wherein the drive signal generator generates the drive signal according to a result of the determination (Park, [0061] As the operator moves the hands while wearing the haptic glove 400, a depth sensor 150 recognizes the shape, position, posture, gesture, and motion of the haptic glove 400, to display an image displayed on the display 120 combined with a representation of the haptic glove 400 in an overlay manner).

As to claim 4, the combination of Park and Miller discloses the information processing apparatus according to claim 3. The combination further discloses that when the part of the body is brought into contact with the real object, the drive signal generator generates the drive signal such that the tactile sense providing mechanism causes a tactile sense to occur (Park, [0061] As the operator moves the hands while wearing the haptic glove 400, a depth sensor 150 recognizes the shape, position, posture, gesture, and motion of the haptic glove 400, to display an image displayed on the display 120 combined with a representation of the haptic glove 400 in an overlay manner).

As to claim 5, the combination of Park and Miller discloses the information processing apparatus according to claim 4. The combination further discloses that the drive signal generator generates the drive signal according to a force of pressing performed by the part of the body on the real object (Park, [0076] in addition to the synchronization of the motion of the haptic glove 400 with the motion of the surgical tool 230 using the depth sensor 150, a vibrator 420 and a sensation applier 430 of the haptic glove 400 feedbacks a sensation, such as a force or a torque, that is applied to the surgical tool 230 during a surgery to the operator. This will be described in detail, after the slave system 200 is described).

As to claim 6, the combination of Park and Miller discloses the information processing apparatus according to claim 3. The combination further discloses that when the part of the body is moved with respect to the real object in a state in which the part of the body is in contact with the real object, the drive signal generator generates the drive signal such that the tactile sense providing mechanism causes a tactile sense to occur (Park, [0107] The sensation applier 430 may be embodied as a tactile display to transmit the sensation to fingers of the operator).

As to claim 7, the combination of Park and Miller discloses the information processing apparatus according to claim 6. The combination further discloses that the drive signal generator generates the drive signal according to at least one of a distance or a speed of movement of the part of the body with respect to the real object (Miller, the processor applies a transformation to the created user-centric light map, wherein the transformation reduces an error corresponding to a distance between the user and a virtual object to be presented to the user; col. 14, lines 36-41).

As to claim 8, the combination of Park and Miller discloses the information processing apparatus according to claim 3. The combination further discloses that when the real object and the part of the body are moved in a state of remaining in contact with each other, the drive signal generator generates the drive signal such that the tactile sense providing mechanism causes a tactile sense to occur (Park, [0061] As the operator moves the hands while wearing the haptic glove 400, a depth sensor 150 recognizes the shape, position, posture, gesture, and motion of the haptic glove 400, to display an image displayed on the display 120 combined with a representation of the haptic glove 400 in an overlay manner).

As to claim 9, the combination of Park and Miller discloses the information processing apparatus according to claim 8. The combination further discloses that the drive signal generator generates the drive signal according to at least one of a distance or a speed of movement of the real object (Miller, the processor applies a transformation to the created user-centric light map, wherein the transformation reduces an error corresponding to a distance between the user and a virtual object to be presented to the user; col. 14, lines 36-41).

As to claim 10, the combination of Park and Miller discloses the information processing apparatus according to claim 8. The combination further discloses that the drive signal generator generates the drive signal according to a portion, in the real object, with which the specific part is in contact (Park, [0010] a plurality of vibrators on a first surface of the haptic glove, the plurality of vibrators configured to apply vibrations; at least one pressure sensor at a finger part of a second surface of the haptic glove opposite to the first surface).

As to claim 11, the combination of Park and Miller discloses the information processing apparatus according to claim 1. The combination further discloses that the drive signal generator determines whether a first object and a second object are in contact with each other, the first object being an object of which a position relative to the part of the body is fixed, the second object being an object of which a position relative to the part of the body is not fixed, the drive signal generator generates the drive signal according to a result of the determination, and at least one of the first object or the second object is the real object (Park, [0077] The plurality of robot arms 210 may be fixedly coupled to a body 201 and supported by the body 201. In this case, the number of surgical tools 230 and the robot arms 210 that are used at one time may be determined depending on a diagnosis method, a surgery method, and a spatial limitation of a surgery room).

As to claim 12, the combination of Park and Miller discloses the information processing apparatus according to claim 11. The combination further discloses that the first object and the second object are the real objects (Miller, in one or more embodiments, the augmented reality display system further comprises a first object recognizer, wherein the first object recognizer is configured to recognize a subset of a type of an object recognized by a second object recognizer, wherein the first object recognizer is run on data that has already been run through the second object recognizer; col. 5, lines 29-33).

As to claim 13, the combination of Park and Miller discloses the information processing apparatus according to claim 11. The combination further discloses a virtual object generator configured to generate a virtual object in a real space, wherein the first object is the real object, wherein the second object is the virtual object, and wherein the virtual object generator is implemented via at least one processor (Miller, in one or more embodiments, a light associated with the modified one or more virtual objects resembles that of real objects in an ambient environment of the user; col. 13, lines 62-65).

As to claim 14, the combination of Park and Miller discloses the information processing apparatus according to claim 11. The combination further discloses a virtual object generator configured to generate a virtual object in a real space, wherein the first object is the virtual object, wherein the second object is the real object, and wherein the virtual object generator is implemented via at least one processor (Miller, in one or more embodiments, the processor selects one or more additional virtual objects to project to the user based at least in part on the at least detected property of the ambient light; col. 12, lines 40-43).

As to claim 15, the combination of Park and Miller discloses the information processing apparatus according to claim 11. The combination further discloses that the drive signal generator generates the drive signal according to at least one of a distance or a speed of movement of a point of contact of the first object and the second object (Miller, the processor applies a transformation to the created user-centric light map, wherein the transformation reduces an error corresponding to a distance between the user and a virtual object to be presented to the user; col. 14, lines 36-41).

As to claim 16, the combination of Park and Miller discloses the information processing apparatus according to claim 11. The combination further discloses that the part of the body is a hand, and the body part detector detects the position and the pose of the hand based on output from a sensor that is worn on the hand (Miller, in one or more embodiments, the at least one characteristic pertains to the user's hands; col. 17, lines 46-47).

As to claim 17, the combination of Park and Miller discloses the information processing apparatus according to claim 1. The combination further discloses an object information acquiring section configured to acquire information related to the real object, wherein the drive signal generator generates the drive signal further according to the information, and wherein the object information acquiring section is implemented via at least one processor (Park, [0066] If the actual surgical tool 230 is imaged (e.g., videoed, photographed, etc.) through the endoscope 220 as shown in FIG. 4, a portion on which the virtual surgical tool overlaps the actual surgical tool 230 may represent the actual surgical tool 230).

As to claim 18, the combination of Park and Miller discloses the information processing apparatus according to claim 1. The combination further discloses that the tactile sense providing mechanism is an oscillation generation mechanism that is capable of generating oscillation, and the drive signal generator generates, as the drive signal, a waveform of the oscillation generated by the oscillation generation mechanism (Miller, a number of piezoelectric actuators may control an oscillation (e.g., frequency, amplitude) of the tip; col. 28, lines 7-10).

As to claim 21, the combination of Park and Miller discloses the information processing apparatus according to claim 1. The combination further discloses that the drive signal generator generates the drive signal further based on a positional relationship between the real object and a virtual object (Park, [0021] In some example embodiments, the master system may comprise: a depth sensor configured to sense position, shape, posture, gesture, or motion of the haptic glove).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

Lim et al., US PGPUB 20130211418, discloses an apparatus and method for tactile feedback. The tactile feedback apparatus may include a position measurement unit to measure a position of a mechanical link, and a tactile feedback unit to transmit a tactile sensation based on at least one of a position of the mechanical link and a rotation angle of the mechanical link.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SAHLU OKEBATO whose telephone number is (571) 270-3375. The examiner can normally be reached Mon - Fri 8:00 - 5:00.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, WILLIAM BODDIE, can be reached on 571-272-0666. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SAHLU OKEBATO/
Primary Examiner, Art Unit 2625
11/6/2024
