`
UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS
P.O. Box 1450
Alexandria, Virginia 22313-1450
`
APPLICATION NO.: 18/002,663
FILING DATE: 12/21/2022
FIRST NAMED INVENTOR: Atsushi IZUMIHARA
ATTORNEY DOCKET NO.: 19970US01
CONFIRMATION NO.: 1699

Xsensus / Sony
100 Daingerfield Road, Suite 402
Alexandria, VA 22314

EXAMINER: CRAWFORD, JACINTA M
ART UNIT: 2617
NOTIFICATION DATE: 12/19/2024
DELIVERY MODE: ELECTRONIC
`
`Please find below and/or attached an Office communication concerning this application or proceeding.
`
`The time period for reply, if any, is set in the attached communication.
`
`Notice of the Office communication was sent electronically on above-indicated "Notification Date" to the
`following e-mail address(es):
Xdocket@XSensus.com
Xsensuspat@XSensus.com
anaquadocketing@Xsensus.com
`
`PTOL-90A (Rev. 04/07)
`
`
`
`
`
Disposition of Claims*
5) [X] Claim(s) 1-20 is/are pending in the application.
   5a) [ ] Of the above claim(s) ___ is/are withdrawn from consideration.
6) [ ] Claim(s) ___ is/are allowed.
7) [X] Claim(s) 1-20 is/are rejected.
8) [ ] Claim(s) ___ is/are objected to.
9) [ ] Claim(s) ___ are subject to restriction and/or election requirement.
* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.
`
Application Papers
10) [X] The specification is objected to by the Examiner.
11) [X] The drawing(s) filed on 21 December 2022 is/are: a) [X] accepted or b) [ ] objected to by the Examiner.
    Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
    Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).
`
Priority under 35 U.S.C. § 119
12) [X] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
    Certified copies:
    a) [X] All   b) [ ] Some**   c) [ ] None of the:
    1. [X] Certified copies of the priority documents have been received.
    2. [ ] Certified copies of the priority documents have been received in Application No. ___.
    3. [ ] Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
    ** See the attached detailed Office action for a list of the certified copies not received.
`
Attachment(s)
1) [X] Notice of References Cited (PTO-892)
2) [X] Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b)
   Paper No(s)/Mail Date ___
3) [ ] Interview Summary (PTO-413)
   Paper No(s)/Mail Date ___
4) [ ] Other: ___

U.S. Patent and Trademark Office
PTOL-326 (Rev. 11-13)     Office Action Summary     Part of Paper No./Mail Date 20241208

Office Action Summary

Application No.: 18/002,663
Applicant(s): IZUMIHARA, Atsushi
Examiner: JACINTA M CRAWFORD
Art Unit: 2617
AIA (FITF) Status: Yes
`
`
-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --

Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).
`
Status

1) [X] Responsive to communication(s) filed on 21 December 2022.
   [ ] A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) [ ] This action is FINAL.   2b) [X] This action is non-final.
3) [ ] An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) [ ] Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.
`
`
`
`Application/Control Number: 18/002,663
`Art Unit: 2617
`
`
`DETAILED ACTION
`
1. This action is in response to communications: Preliminary Amendment filed December 21, 2022.
`
2. Claims 1-20 are pending in this case. No claims have been newly amended, added, or cancelled. This action is made non-final.
`
`Notice of Pre-AIA or AIA Status
`
3. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
`
`Priority
`
4. Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
`
`Information Disclosure Statement
`
5. The information disclosure statement (IDS) submitted on December 21, 2022 was filed on the filing date of the application, December 21, 2022. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
`
`
`
`
`Drawings
`
6. The drawings were received on December 21, 2022. These drawings are accepted.
`
`Specification
`
7. The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
`
`Claim Rejections - 35 USC § 101
`
8. 35 U.S.C. 101 reads as follows:

   Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
`
Claim 19 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claim(s) does/do not fall within at least one of the four categories of patent eligible subject matter. Claim 19 is directed to “a program,” which is merely a set of instructions capable of being executed by a computer; thus the program itself is not a process. Additionally, the program does not define any structural and functional interrelationships between the program and other claimed elements of a computer which permit the program’s functionality to be realized; thus the program itself is also not a machine, manufacture, or composition of matter. Therefore, a “program” is non-statutory.
`
`
`
`
`Claim Rejections - 35 USC § 112
`
9. The following is a quotation of 35 U.S.C. 112(b):

   (b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

   The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
`
10. Claim 18 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 18 recites “...the GUI...,” which lacks antecedent basis. Claim 18 depends upon claim 17, which does not recite a “GUI”; however, claims 12-14 each recite a “GUI,” and it is therefore unclear from which claim claim 18 is intended to depend. For prior art purposes, claim 18 is considered to depend from claim 17, but should recite “...a graphical user interface (GUI)....”
`
`Claim Rejections - 35 USC § 103
`
11. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
`
`
`
`
12. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

   A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
`
13. Claim(s) 1-3, 10, 11, 19, and 20 is/are rejected under 35 U.S.C. 103 as being unpatentable over KWON et al. (US 2016/0349849).
`
As to claim 1, KWON et al. disclose an information processing method (e.g. process of Figure 5, further elaborated in Figure 9), which is executed by a computer system (Figure 1, eyewear terminal 100), comprising a control step of controlling setting of a virtual viewpoint of a user with respect to a real space (e.g. controlling the position, direction, orientation, speed, and/or path of a drone 1000 in a physical environment (real space) via user gesture(s), and further controlling the capturing of an image by a camera of the drone from its respective position, direction, orientation, speed, and/or path, the captured image transmitted and displayed to the eyewear terminal 100 as a virtual image to be viewed by the user, thus considered a “virtual viewpoint”) (e.g. Figure 4 illustrates drone 1000, which may be controlled via a control unit 180 of the eyewear terminal 100, and data (e.g. captured images) may be exchanged between the drone 1000 and the eyewear terminal 100 via a wireless communication unit 110 (e.g. [0148] and [0149]), the drone 1000 comprising at least one camera 1121 (e.g. [0156]), and in response to a request (e.g. gesture) from the eyewear terminal 100, the control unit of the drone may transmit a preview image, e.g. as visual information 300, received through the camera 1121 to the eyewear terminal 100 through the wireless communication unit 110, which is displayed on the eyewear terminal 100 as a virtual image 400 (e.g. [0159] thru [0161]); Figure 10A and associated text, e.g. [0243] thru [0251], note a user performing gestures which in turn cause the control unit 180 to control the drone 1000 to be moved in a particular direction, e.g. one or more of 810a’ thru 810f’, where the direction of the drone 1000 may be defined with respect to the camera 1121 provided in the drone 1000), and setting of an operable range in which an operation relating to the real space can be executed by the user with respect to the real space (e.g. controlling the camera 1121 of the drone 1000, and further controlling operations, e.g. scaling, of the preview image, e.g. as visual information 300, captured by the camera 1121 of the drone 1000 in real space via user gesture(s) performed at the eyewear terminal 100, to be transmitted and displayed to the eyewear terminal 100 as a virtual image to be viewed by the user) (e.g. Figures 13A, 13B, 14A, and 14B, [0287] thru [0311], note performing user gestures, e.g. pinch-in or pinch-out gestures, where when any one of the pinch-in gesture and the pinch-out gesture is applied, the control unit 180 may control the camera 1121 provided in the drone 1000 to perform any one of a zoom-out function or a zoom-in function to change the preview image received through the camera 1121).

As noted above, the claimed “virtual viewpoint” is considered to be at least a position (and/or direction, orientation, speed, path) of the drone in a physical environment set by the user, where an image may be captured at that position and further transmitted and displayed as a virtual image to the user via the eyewear terminal, and the operable range may be the scale or range in which the virtual image is displayed, e.g. zoomed-in or zoomed-out, thus yielding predictable results without changing the scope of the invention.
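For illustration only, the following is a minimal sketch of the gesture-driven control flow the cited KWON passages describe: a movement gesture repositions the drone (setting the "virtual viewpoint"), a pinch gesture changes the camera zoom (the scale of the "operable range"), and the resulting preview is what the eyewear terminal would display as a virtual image. All names used here (Gesture, Drone, EyewearTerminal, handle_gesture) are hypothetical assumptions for illustration and are not taken from the reference or the claims.

```python
# Hypothetical sketch of the control flow described in the cited KWON passages.
# All names are illustrative assumptions; none come from the reference itself.
from dataclasses import dataclass

@dataclass
class Gesture:
    kind: str                            # e.g. "move", "pinch_in", "pinch_out"
    direction: tuple = (0.0, 0.0, 0.0)   # movement vector for "move" gestures

class Drone:
    """Stands in for drone 1000: holds a pose (the 'virtual viewpoint') and a camera zoom."""
    def __init__(self):
        self.position = [0.0, 0.0, 0.0]
        self.zoom = 1.0  # camera zoom level (the 'operable range' scale)

    def move(self, direction):
        # First gesture type: adjust the drone's position in real space.
        self.position = [p + d for p, d in zip(self.position, direction)]

    def set_zoom(self, factor):
        # Second gesture type: zoom the camera in or out (pinch-out / pinch-in).
        self.zoom = max(0.1, self.zoom * factor)

    def capture_preview(self):
        # Placeholder for the preview image captured at the current pose and zoom.
        return {"position": tuple(self.position), "zoom": self.zoom}

class EyewearTerminal:
    """Stands in for eyewear terminal 100: interprets gestures and displays the preview."""
    def __init__(self, drone):
        self.drone = drone

    def handle_gesture(self, gesture):
        if gesture.kind == "move":
            self.drone.move(gesture.direction)   # sets the virtual viewpoint
        elif gesture.kind == "pinch_out":
            self.drone.set_zoom(1.25)            # zoom in: change operable-range scale
        elif gesture.kind == "pinch_in":
            self.drone.set_zoom(0.8)             # zoom out
        # The captured preview would be displayed as a virtual image on the terminal.
        return self.drone.capture_preview()

terminal = EyewearTerminal(Drone())
print(terminal.handle_gesture(Gesture("move", (1.0, 0.0, 2.0))))
print(terminal.handle_gesture(Gesture("pinch_out")))
```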
`
`
`
`
As to claim 2, KWON et al. disclose the control step sets a first virtual viewpoint within the real space and sets a second virtual viewpoint different from the first virtual viewpoint within the operable range (e.g. as illustrated in Figure 9, a first gesture may control movement of the drone 1000, including position, direction, orientation, speed, and/or path of the drone 1000 in real space, where a second gesture may control the camera of the drone 1000, e.g. zoom-in and zoom-out functions).
`
As to claim 3, KWON et al. disclose the control step changes a scale of the operable range (e.g. as noted in claims 1 and 2 above, the user may perform a gesture to control the camera of the drone 1000, which performs scaling functions, e.g. zoom-in and zoom-out functions).
`
As to claim 10, KWON et al. disclose the control step changes a scale of the operable range on a basis of a position of the virtual viewpoint (e.g. as illustrated in Figure 9, a first gesture may control movement of the drone 1000, including position, direction, orientation, speed, and/or path of the drone 1000 in real space, where a second gesture may control the camera of the drone 1000, e.g. zoom-in and zoom-out functions, the second gesture different from the first gesture, thus may be considered based on the first gesture, which sets the “virtual viewpoint” as noted in claim 1).
`
As to claim 11, KWON et al. disclose the control step sets a position of the virtual viewpoint on a basis of a scale of the operable range (e.g. further regarding Figure 9, [0225] and [0026] note, in the case the user gesture is a second gesture, controlling the camera of the drone 1000, which includes changing (deforming) the image, where changing (deforming) the image may include changing of a display position (distance, coordinates, direction) where the image is formed, and a display scheme).
`
As to claim 19, KWON et al. disclose a program ([0429] notes the present invention may be implemented as computer readable code in a medium in which a program is recorded), which causes a computer system (e.g. Figure 1, eyewear terminal 100) to perform the information processing method as outlined in claim 1. Please see the rejection and rationale of claim 1 above.
`
As to claim 20, KWON et al. disclose an information processing system ([0429], computer system), comprising: a mobile object that moves in a real space (e.g. drone 1000); and an information processing apparatus (Figure 1, eyewear terminal 100) including a control unit (e.g. controller 180) that performs the information processing method as outlined in claim 1. Please see the rejection and rationale of claim 1 above.
`
14. Claim(s) 4-9 and 12-18 is/are rejected under 35 U.S.C. 103 as being unpatentable over KWON et al. (US 2016/0349849) as applied to claim 1 above, and further in view of Taylor et al. (US 2018/0094931).
`
As to claim 4, KWON et al. do not disclose, but Taylor et al. disclose, a detection step of detecting a candidate plane for setting the virtual viewpoint or the operable range from the real space ([0037] notes steering assist system 100 receives flight data for the UAV 110 (drone) as it flies along a flight path and performs maneuvers, from sensors 106, [0038] notes steering assist system 100 determines a current position of the virtual UAV model in the virtual world model or the position of the UAV 110 in the real world using GPS tracker 105 which connects to a GPS satellite, [0039] thru [0043] note the steering assist system 100 determines a predicted trajectory of the virtual UAV model within the virtual world model, and [0044] notes steering assist system 100 then determines a navigation suggestion for the UAV based on the predicted trajectory and capability parameters for the UAV 110, where the navigation suggestions of the predicted trajectory may be considered a “candidate plane”).
`
It would have been obvious to one of ordinary skill in the art at the time of the invention to further modify KWON et al.’s system and method of controlling a virtual viewpoint or operable range of a mobile object, such as a drone, with Taylor et al.’s method of providing navigation suggestions of a predicted trajectory, e.g. detecting a candidate plane, of a mobile object as steering assistance for piloting drones to avoid collisions, damages, or destruction (see [0004] thru [0007] of Taylor et al.).
`
As to claim 5, KWON et al. modified with Taylor et al. disclose the control step sets a position separated from the detected candidate plane by a predetermined distance as the virtual viewpoint (e.g. KWON, as noted in claim 1, the user may control the movement, including position, direction, orientation, speed, and/or path, of the drone 1000 in real space via user gestures; modified with Taylor, as noted in claim 4, the navigation suggestion may be based on the predicted trajectory, which is not the current position, thus separated from the navigation suggestion).
`
`
`
`
As to claim 6, KWON et al. modified with Taylor et al. disclose the control step sets a position separated from the detected candidate plane by a predetermined distance as a reference point that is a reference of the operable range (e.g. KWON modified with Taylor, see claim 5, where the user may be considered to set a reference point, e.g. a virtual viewpoint, via the user gestures as further noted in claim 2).
`
As to claim 7, KWON et al. modified with Taylor et al. disclose the control step controls the virtual viewpoint or the operable range on a basis of the set reference point (e.g. KWON, as noted in claims 2 and 3, the user may control the movements or camera of the drone).
`
As to claim 8, KWON et al. modified with Taylor et al. disclose a setting step of setting a position of the virtual viewpoint and a scale of the operable range on a basis of a size of the real space, wherein the control step makes changes to the set position of the virtual viewpoint and the set scale of the operable range with the reference point as a reference (e.g. KWON, see claims 2, 3, 10, and 11).
`
As to claim 9, KWON et al. modified with Taylor et al. disclose the detection step detects the candidate plane on a basis of a predetermined axis of the real space (e.g. modified with Taylor, as noted in claim 4 above, the navigation suggestion for the UAV is based on the predicted trajectory and capability parameters for the UAV 110, where the predicted trajectory may be considered to include a predetermined axis).
`
`
`
`
As to claim 12, KWON et al. disclose controlling the virtual viewpoint and the operable range to the user (see claim 1), but do not disclose, while Taylor et al. disclose, a presentation step of presenting a graphical user interface (GUI) capable of controlling the virtual viewpoint and the operable range to the user (e.g. [0049] thru [0051] note GPU 164 displays a graphical user interface (GUI) on a head-mounted display (HMD), the GPU 164 outputs a graphical representation of the UAV (drone) flight path corresponding to a selected section, and the GPU 164 may also output a graphical representation of controller inputs, e.g. display a two-dimensional or three-dimensional virtual model representation of a controller device that mimics the looks and control movement of the physical controller device 150, e.g. further display joystick levers, buttons, sliders, etc. on the virtual model that move according to the controller inputs as the physical controller device 150 would move).
`
It would have been obvious to one of ordinary skill in the art at the time of the invention to further modify KWON et al.’s system and method of controlling a virtual viewpoint or operable range of a mobile object, such as a drone, with Taylor et al.’s method of providing a graphical user interface (GUI) for assisting in controlling a mobile object such that the user may view and gain experience in piloting the mobile object to avoid collisions, damages, or destruction (see [0004] thru [0007] and [0051] of Taylor et al.).
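As an illustration of the GUI arrangement described in the cited Taylor passages (a rendered flight path plus a virtual controller that mirrors the physical controller's inputs on the HMD), the following is a minimal, hypothetical data-model sketch. The names (HmdGuiState, ControllerInput, update) are assumptions for illustration only and do not come from the reference.

```python
# Hypothetical sketch only: a minimal data model for the kind of HMD GUI the cited
# Taylor passages describe. Names are illustrative, not taken from the reference.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ControllerInput:
    left_stick: Tuple[float, float] = (0.0, 0.0)    # joystick deflection, -1..1
    right_stick: Tuple[float, float] = (0.0, 0.0)
    buttons: dict = field(default_factory=dict)

@dataclass
class HmdGuiState:
    flight_path: List[Tuple[float, float, float]] = field(default_factory=list)
    virtual_controller: ControllerInput = field(default_factory=ControllerInput)

    def update(self, drone_position, physical_input):
        # Append the latest drone position to the displayed flight path and mirror
        # the physical controller's current inputs on the virtual controller model.
        self.flight_path.append(drone_position)
        self.virtual_controller = physical_input

gui = HmdGuiState()
gui.update((0.0, 0.0, 1.0), ControllerInput(left_stick=(0.2, 0.9)))
print(len(gui.flight_path), gui.virtual_controller.left_stick)
```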
`
As to claim 13, KWON et al. modified with Taylor et al. disclose the presentation step presents a virtual viewpoint image obtained when the user views the real space from the virtual viewpoint (e.g. KWON, as noted in claim 1, the preview image, e.g. as visual information 300, received through the camera 1121 is transmitted to and displayed on the eyewear terminal 100 as a virtual image 400), and the GUI is capable of setting a first virtual viewpoint within the real space and setting a second virtual viewpoint different from the first virtual viewpoint within the operable range (e.g. KWON, as noted in claim 2, the user may control movement and the camera of the drone 1000 via user gestures; modified with Taylor, as noted in claim 12, the user may be provided a GUI displayed on the HMD (e.g. eyewear terminal) to control the UAV (drone), thus set different virtual viewpoints).
`
As to claim 14, KWON et al. modified with Taylor et al. disclose the GUI is capable of setting the candidate plane within the operable range (modified with Taylor, see claims 4, 12, and 13).
`
As to claim 15, KWON et al. disclose the real space created by a sensor (e.g. as noted in claim 1, drone 1000 comprises at least one camera 1121 for capturing an image of the physical environment), but do not disclose, while Taylor et al. disclose, the real space is a three-dimensional map created by a sensor (modified with Taylor, [0029] notes steering assist system 100 receives physical space data in real-time and includes sensor data from sensors 106 on the UAV (drone) and/or external sensors 120 placed around the flight area, the steering assist system 100 creates a virtual world in real-time using simultaneous localization and mapping (SLAM), [0030] notes the steering assist system 100 creates a virtual world model to represent the flight area, e.g. by mapping the physical space data with a physics engine 166, and additionally creates a virtual UAV model to represent the UAV in the virtual world model, and [0031] notes the physics engine 166 creates a three-dimensional mesh representing a virtual world, the mesh being a collection of vertices, edges, and faces that define the shape of a polyhedral object for use in three-dimensional modeling).
`
`
`
`
It would have been obvious to one of ordinary skill in the art at the time of the invention to further modify KWON et al.’s system and method of capturing images via a mobile object, e.g. a drone, with Taylor et al.’s system and method of capturing images via a mobile object, e.g. a drone, and further creating a virtual world model to represent the flight area to further create a simulation of a predicted trajectory to be used to generate navigation suggestions as steering assistance for piloting drones to avoid collisions, damages, or destruction (see [0004] thru [0007] of Taylor et al.).
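For illustration of the kind of sensor-built three-dimensional map discussed above, the following is a minimal, hypothetical sketch that quantizes raw sensor points into a voxel occupancy grid and answers a free-space query. The cited reference is described as using SLAM and a physics-engine mesh; the names below (build_occupancy_map, is_free) are assumptions for illustration only.

```python
# Hypothetical sketch only: a toy voxel occupancy map standing in for the kind of
# three-dimensional world model described as being built from sensor data.
from collections import defaultdict

def build_occupancy_map(points, voxel_size=0.5):
    """Quantize 3D sensor points (x, y, z) into occupied voxel cells."""
    occupied = defaultdict(int)
    for x, y, z in points:
        cell = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        occupied[cell] += 1
    return occupied

def is_free(occupancy, point, voxel_size=0.5):
    """Check whether a candidate position falls in an unoccupied cell."""
    cell = tuple(int(c // voxel_size) for c in point)
    return occupancy.get(cell, 0) == 0

# Example: a few sensor returns, then a free-space query a navigation step might make.
sensor_points = [(1.2, 0.4, 2.0), (1.3, 0.5, 2.1), (4.0, 4.0, 1.0)]
occupancy = build_occupancy_map(sensor_points)
print(is_free(occupancy, (3.0, 0.0, 1.0)))  # True: no sensor return in that cell
```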
`
As to claim 16, KWON et al. modified with Taylor et al. disclose the control step changes a scale of the operable range on a basis of the three-dimensional map created by the sensor (e.g. KWON, as noted in claim 3, the user may control the camera of the drone 1000, which performs scaling functions, e.g. zoom-in and zoom-out functions; modified with Taylor, [0066] notes camera 205 may include a lens system which may further include multiple moveable lenses that can be adjusted to manipulate the focal length and/or field of view (e.g. zoom) of the lens system, where the rendering implementing the virtual world model would reflect the adjustments, e.g. zooming functions, as controlled).
`
As to claim 17, KWON et al. modified with Taylor et al. disclose the sensor is mounted on a mobile object (KWON, as noted in claim 1, drone 1000 comprises at least one camera 1121 for capturing an image of the physical environment; modified with Taylor, [0029] notes steering assist system 100 receives physical space data in real-time and includes sensor data from sensors 106 on the UAV (drone), where [0037] notes sensors 106 may include at least one camera, an infra-red (IR) sensor, a Light Detection and Ranging (LiDAR) sensor, a proximity sensor, a radar, a sonar sensor, or other such sensors; see Figure 2, which illustrates the UAV (drone) may include multiple sensors including at least one camera 205, a position sensor 235, and one or more environmental sensors).
`
As to claim 18, KWON et al. modified with Taylor et al. disclose the GUI is capable of generating a path, along which the mobile object moves, by an operation of the user (modified with Taylor, [0049] and [0050] note GPU 164 displays a graphical user interface (GUI) on a head-mounted display (HMD), and the GPU 164 outputs a graphical representation of the UAV (drone) flight path corresponding to a selected section).
`
`Conclusion
`
15. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
`
Da Veiga et al. (US 10,139,631) disclose a system and method of controlling a movement of a camera of a mobile object via user head movements detected via a head-mounted display.
`
Tamanaha et al. (US 2019/0088025) disclose a display device, e.g. head-mounted display, for receiving video data, location data, and position data from an unmanned aerial vehicle (UAV); the display device converts a first set of coordinates from the location and position data to a second set of coordinates of a virtual world model, and the user of the display device provides a selection of an AR object at the display device that is also used to control the UAV.
`
Smolyanskiy et al. (US 2017/0251180) disclose a system and method of collaborative camera viewpoint control for interactive telepresence; viewing devices, e.g. head-mounted displays, receive video of an environment from different viewpoints, where the video of the environment from a selected one of the viewpoints is displayable to users of the viewing devices.
`
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JACINTA M CRAWFORD whose telephone number is (571) 270-1539. The examiner can normally be reached 8:30 a.m. to 4:30 p.m.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, King Y. Poon, can be reached on (571) 272-7440. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
`
/JACINTA M CRAWFORD/
`Primary Examiner, Art Unit 2617
`
`