UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS
P.O. Box 1450
Alexandria, Virginia 22313-1450

Application No.: 17/906,642
Filing Date: 09/19/2022
First Named Inventor: SHO OGURA
Attorney Docket No.: SYP335258US01
Confirmation No.: 7859

CHIP LAW GROUP
505 N. LAKE SHORE DRIVE
SUITE 250
CHICAGO, IL 60611

Examiner: SHEDRICK, CHARLES TERRELL
Art Unit: 2646

Notification Date: 12/19/2024
Delivery Mode: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding.

The time period for reply, if any, is set in the attached communication.

Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es):

docketing@chiplawgroup.com
eofficeaction@appcoll.com
sonydocket@evalueserve.com

PTOL-90A (Rev. 04/07)

Disposition of Claims*

5) ☒ Claim(s) 1-17 is/are pending in the application.
   5a) Of the above claim(s) ___ is/are withdrawn from consideration.
6) ☐ Claim(s) ___ is/are allowed.
7) ☒ Claim(s) 1-8 and 10-17 is/are rejected.
8) ☒ Claim(s) 9 is/are objected to.
9) ☐ Claim(s) ___ are subject to restriction and/or election requirement.

* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.

Application Papers

10) ☐ The specification is objected to by the Examiner.
11) ☒ The drawing(s) filed on 9/19/22 is/are: a) ☒ accepted or b) ☐ objected to by the Examiner.
    Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
    Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).

Priority under 35 U.S.C. § 119

12) ☒ Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
    Certified copies:
    a) ☒ All   b) ☐ Some**   c) ☐ None of the:
       1. ☒ Certified copies of the priority documents have been received.
       2. ☐ Certified copies of the priority documents have been received in Application No. ___.
       3. ☐ Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
** See the attached detailed Office action for a list of the certified copies not received.

Attachment(s)

1) Notice of References Cited (PTO-892)
2) Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b), Paper No(s)/Mail Date ___
3) ☐ Interview Summary (PTO-413), Paper No(s)/Mail Date ___
4) ☐ Other: ___

U.S. Patent and Trademark Office
PTOL-326 (Rev. 11-13)

Office Action Summary

Application No.: 17/906,642
Applicant(s): OGURA, SHO
Examiner: CHARLES T SHEDRICK
Art Unit: 2646
AIA (FITF) Status: Yes

-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --

Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133). Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).

Status

1) ☒ Responsive to communication(s) filed on 9/19/22.
   ☐ A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) ☐ This action is FINAL.   2b) ☒ This action is non-final.
3) ☐ An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) ☐ Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.

Part of Paper No./Mail Date 20241003

Application/Control Number: 17/906,642
Art Unit: 2646
Page 2

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim 17 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The language of the claim raises a question as to whether the claim is directed merely to an abstract idea that is not tied to a technological art, environment, or machine which would result in a practical application producing a concrete, useful, and tangible result to form the basis of statutory subject matter under 35 U.S.C. 101.

Claim 17 claims the non-statutory subject matter of a computer program. Data structures not claimed as embodied in a computer-readable medium are descriptive material per se and are not statutory because they are not capable of causing functional change in the computer. See, e.g., Warmerdam, 33 F.3d at 1361, 31 USPQ2d at 1754 (claim to a data structure per se held nonstatutory). Therefore, since the claimed program is not tangibly embodied in a physical medium and encoded on a computer-readable medium, the Applicant has not complied with 35 U.S.C. 101.

Claim Rejections - 35 USC § 102

1. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless —

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

2. Claims 1, 4-7, 11-14 and 16-17 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Iwakiri, US Patent Pub. No. 2018/0352215 A1.

Consider Claims 1 and 16-17, Iwakiri teaches an information processing device comprising: a display processing unit that performs processing of displaying a screen indicating, by filtering, camerawork information corresponding to input information of a user among a plurality of pieces of camerawork information (e.g., see gray area 0071, which reads on filtering), as a camerawork designation screen that receives designation operation of camerawork information that is information indicating at least a movement trajectory of a viewpoint in a free viewpoint image (e.g., see at least 0045: "A camera path of the virtual camera 801 is set in advance by the operator, and when a specific scene is detected, the camera path is selected to thereby generate a virtual viewpoint image. The virtual camera 801 is an example of a virtual imaging apparatus." See also 0054-0055 and 0070-0072).

Consider Claim 4, Iwakiri teaches wherein the display processing unit performs processing of displaying information obtained by visualizing a movement trajectory of the viewpoint on the camerawork designation screen (e.g., see gray area 0071, which reads on filtering).

Consider Claim 5, Iwakiri teaches wherein the display processing unit performs processing of displaying, on the camerawork designation screen, camera arrangement position information indicating arrangement positions of a plurality of cameras that performs imaging for generating a free viewpoint image (e.g., see start and end position to generate virtual viewpoint, 0054-0055).

Consider Claim 6, Iwakiri teaches wherein the display processing unit performs processing of displaying, on the camerawork designation screen, start point arrangement position information and end point arrangement position information indicating respective positions of a camera serving as a movement start point and a camera serving as a movement end point of the viewpoint among the plurality of cameras (e.g., see start and end position to generate virtual viewpoint, 0054-0055).

Consider Claim 7, Iwakiri teaches wherein the display processing unit performs processing of displaying the start point arrangement position information and the end point arrangement position information, and arrangement position information of cameras other than the camera serving as the movement start point and the camera serving as the movement end point among the plurality of cameras in different modes (e.g., see start and end position to generate virtual viewpoint, 0054-0055 and 0060; the use of viewpoints from multiple cameras, 0070-0072).

Consider Claim 11, Iwakiri teaches wherein the display processing unit performs processing of displaying a target that defines a line-of-sight direction from the viewpoint on the camerawork designation screen (e.g., see at least 0054-0055, 0060 and 0070-0072).

Consider Claim 12, Iwakiri teaches the claimed invention further comprising: a camerawork editing processing unit that updates information on the position of the target in

camerawork information according to a change in the position of the target on the camerawork designation screen (e.g., see at least 0054-0055, 0060 and 0070-0072).

Consider Claim 13, Iwakiri teaches wherein the display processing unit performs processing of displaying an image obtained by observing a three-dimensional space from the viewpoint on the camerawork designation screen (e.g., see at least 0054-0055, 0060 and 0070-0072).

Consider Claim 14, Iwakiri teaches wherein the display processing unit performs processing of displaying an image obtained by rendering a virtual three-dimensional model of a real space as an image obtained by observing a three-dimensional space from the viewpoint (e.g., see at least 0054-0055, 0060 and 0070-0072).

Claim Rejections - 35 USC § 103

1. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

2. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

3. This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

4. Claims 2-3 are rejected under 35 U.S.C. 103 as being unpatentable over Iwakiri, US Patent Pub. No. 2018/0352215 A1, in view of Hooper et al., US Patent Pub. No. 2010/0333026, hereinafter "Hooper".

Consider Claim 2, Iwakiri teaches the claimed invention except wherein the display processing unit performs processing of filtering and displaying camerawork information according to a keyword as the input information on the camerawork designation screen.

In analogous art, Hooper teaches that each keyword may have a group of related keywords or may be used independently of such a group. Once one or more keywords are entered/selected, the result set 302 is filtered based on the keyword(s) (e.g., by displaying only those results having such a keyword in their description or as an attribute; or alternatively by displaying only those elements from facets having such a keyword associated with it) (e.g., see at least 0052).

Therefore, it would have been obvious to a person of ordinary skill in the art before the invention was made to try filtering and displaying camerawork information to arrive at the claimed invention wherein the display processing unit performs processing of filtering and

displaying camerawork information according to a keyword as the input information on the camerawork designation screen for the purpose of identifying a viewpoint in a scene.

Consider Claim 3, Iwakiri teaches the claimed invention except wherein filtering condition information indicating a filtering condition of camerawork information is displayed on the camerawork designation screen, and the display processing unit performs processing of filtering and displaying the camerawork information according to the filtering condition indicated by the selected filtering condition information as the input information.

In analogous art, Hooper teaches that each keyword may have a group of related keywords or may be used independently of such a group. Once one or more keywords are entered/selected, the result set 302 is filtered based on the keyword(s) (e.g., by displaying only those results having such a keyword in their description or as an attribute; or alternatively by displaying only those elements from facets having such a keyword associated with it) (e.g., see at least 0052).

Therefore, it would have been obvious to a person of ordinary skill in the art before the invention was made to try filtering and displaying camerawork information to arrive at the claimed invention wherein filtering condition information indicating a filtering condition of camerawork information is displayed on the camerawork designation screen, and the display processing unit performs processing of filtering and displaying the camerawork information according to the filtering condition indicated by the selected filtering condition information as the input information, for the purpose of identifying a viewpoint in a scene.

5. Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Iwakiri, US Patent Pub. No. 2018/0352215 A1, in view of Kazuhiro, WO 2019171834 A1.

Consider Claim 8, Iwakiri teaches the claimed invention except wherein the display processing unit performs processing of displaying information obtained by visualizing the moving speed of the viewpoint on the camerawork designation screen.

In analogous art, Kazuhiro teaches, in figure 8, an explanatory diagram showing an example of a visual field image including an image representing the moving amount of the virtual viewpoint or the moving speed of the virtual viewpoint according to the first embodiment. The visual field image V2 shown in FIG. 8 includes an image vol representing the movement amount of the virtual viewpoint or the movement speed of the virtual viewpoint. For example, an image vol representing the movement amount of the virtual viewpoint or the movement speed of the virtual viewpoint is displayed at the center of the visual field image V2. For example, the image vol representing the movement amount of the virtual viewpoint or the movement speed of the virtual viewpoint is an image representing the movement amount of how much the virtual viewpoint is moving and the current movement speed of the virtual viewpoint. The visual field image V1 including the image vol representing the movement amount of the virtual viewpoint or the movement speed of the virtual viewpoint is displayed on the display unit 120 when the user moves the position of the virtual viewpoint, for example.

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date to try visualizing the moving speed of the viewpoint on the camerawork designation screen for the purpose of facilitating display and control.

6. Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Iwakiri, US Patent Pub. No. 2018/0352215 A1, in view of Kinoshita et al., US Patent Pub. No. 2012/0287282 A1, hereinafter "Kinoshita".

Consider Claim 10, Iwakiri teaches the claimed invention except wherein the display processing unit performs processing of displaying information obtained by visualizing a field of view from the viewpoint on the camerawork designation screen.

In analogous art, Kinoshita teaches an image generating unit 3 and a navigation communication unit 42, serving as a synthetic image providing means and a model image providing means in the present invention, including: an output unit 42a that outputs, to the navigation device 20 (a display device in the present invention), image information corresponding to a synthetic image generated by the image generating unit 3 or a model image indicating a field-of-view range from a virtual viewpoint of the synthetic image; and a reception unit 42b that receives information input by a user from the display 21 having a touch panel function or from the operation unit 22. Herein, a model image refers to an image in which a plurality of possible viewpoint positions of virtual viewpoints which can be selected by a user for an image of a model vehicle imitating a real vehicle are indicated, and the viewpoint positions can be changed by a user using a viewpoint position change icon. Further, the viewpoint position can be changed to a certain viewpoint position of virtual viewpoint by using the display 21 having a touch panel function, or the operation unit 22. Hereinafter, a synthetic image along with a model image refers to image information, in at least 0036.

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date to try wherein the display processing unit performs processing of displaying

information obtained by visualizing a field of view from the viewpoint on the camerawork designation screen for the purpose of displaying images.

7. Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Iwakiri, US Patent Pub. No. 2018/0352215 A1, in view of Koyama et al., US Patent Pub. No. 2019/0191146 A1, hereinafter "Koyama".

Consider Claim 15, Iwakiri teaches the claimed invention except wherein the display processing unit performs processing of displaying information notifying a camera in which a change in the field of view has been detected among the plurality of cameras.

In analogous art, Koyama teaches that event detector 202a detects occurrence of a predetermined event that can be a reason for performing the camera calibration on one of cameras 100 included in image capturing devices 10A to 10N, based on the capturing circumstance information that is provided from image capturing devices 10A to 10N. An event that can be a reason for performing the camera calibration is an event that causes camera 100 to move or has a high possibility of the movement, or an event that has a high possibility of enabling the camera calibration with high accuracy. More specific examples will be described later in description of action of multiple viewpoint image capturing system 1000. In a case of detecting the occurrence of such an event, event detector 202a determines whether to perform the camera calibration. In a case of determining to perform the camera calibration, event detector 202a outputs camera calibration information that indicates the camera calibration to be performed to, for example, camera calibration instructing unit 202b. Alternatively, the camera calibration information may be output to the display device included in user interface 500 to be presented to a user. The camera calibration information contains, for example, camera 100 on which the camera calibration is to be performed (or one of image capturing devices 10A to 10N

including the camera) and details of the event that is the reason for performing the camera calibration, in at least 0082.

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date to try wherein the display processing unit performs processing of displaying information notifying a camera in which a change in the field of view has been detected among the plurality of cameras for the purpose of displaying images.

Allowable Subject Matter

8. Claim 9 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

9. Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHARLES TERRELL SHEDRICK, whose telephone number is (571) 272-8621. The examiner can normally be reached 8AM-5PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Lester G. Kincaid, can be reached at (571) 272-7922. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CHARLES T SHEDRICK/
Primary Examiner, Art Unit 2646