`
`UNITED STATES DEPARTMENT OF COMMERCE
`United States Patent and Trademark Office
`Address: COMMISSIONER FOR PATENTS
`P.O. Box 1450
`Alexandria, Virginia 22313-1450
`
APPLICATION NO.: 17/782,522
FILING DATE: 06/03/2022
FIRST NAMED INVENTOR: AKIRA YOSHIDA
ATTORNEY DOCKET NO.: SYP333848USO01
CONFIRMATION NO.: 5337

CHIP LAW GROUP
505 N. LAKE SHORE DRIVE
SUITE 250
CHICAGO, IL 60611

EXAMINER: JEAN, FRANTZ B
ART UNIT: 2454

NOTIFICATION DATE: 02/10/2025
DELIVERY MODE: ELECTRONIC
`
`Please find below and/or attached an Office communication concerning this application or proceeding.
`
`The time period for reply, if any, is set in the attached communication.
`
`Notice of the Office communication was sent electronically on above-indicated "Notification Date" to the
`following e-mail address(es):
`
docketing@chiplawgroup.com
eofficeaction@appcoll.com
sonydocket@evalueserve.com
`
`PTOL-90A (Rev. 04/07)
`
`
`
`
`
Disposition of Claims*
5) ☒ Claim(s) 1-3 and 6-20 is/are pending in the application.
   5a) ☐ Of the above claim(s), ___ is/are withdrawn from consideration.
6) ☐ Claim(s) ___ is/are allowed.
7) ☒ Claim(s) 1-3 and 6-20 is/are rejected.
8) ☐ Claim(s) ___ is/are objected to.
9) ☐ Claim(s) ___ are subject to restriction and/or election requirement.
* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.
`
`
Application Papers
10) ☐ The specification is objected to by the Examiner.
11) ☐ The drawing(s) filed on ___ is/are: a) ☐ accepted or b) ☐ objected to by the Examiner.
    Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
    Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).
`
Priority under 35 U.S.C. § 119
12) ☒ Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
    Certified copies:
    a) ☐ All    b) ☐ Some**    c) ☒ None of the:
    1. ☐ Certified copies of the priority documents have been received.
    2. ☐ Certified copies of the priority documents have been received in Application No. ___.
    3. ☐ Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
    ** See the attached detailed Office action for a list of the certified copies not received.
`
Attachment(s)
1) ☒ Notice of References Cited (PTO-892)
2) ☐ Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b); Paper No(s)/Mail Date ___
3) ☐ Interview Summary (PTO-413); Paper No(s)/Mail Date ___
4) ☐ Other: ___

U.S. Patent and Trademark Office    PTOL-326 (Rev. 11-13)    Office Action Summary    Part of Paper No./Mail Date 20250127

Office Action Summary

Application No.: 17/782,522
Applicant(s): YOSHIDA et al.
Examiner: FRANTZ B JEAN
Art Unit: 2454
AIA (FITF) Status: Yes
`
`
-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --

Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).
`
Status

1) ☒ Responsive to communication(s) filed on 11 October 2024.
   ☐ A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) ☐ This action is FINAL.    2b) ☒ This action is non-final.
3) ☐ An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) ☐ Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.
`
`
`
`
DETAILED ACTION
`
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

The allowance filed on 12 September 2024 has been withdrawn in light of new prior art.

Claims 1-3 and 6-20 are pending in the application. Claims 4-5 have been cancelled.
`
Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) ELEMENT IN CLAIM FOR A COMBINATION.—An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
`
`
`
`
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;

(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and

(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
`
`
`
`
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “discrimination unit, estimation unit, video output control unit, and audio output control unit” in claim 1 and its dependent claims 2-3 and 6-19.

Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.

If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
`
Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claim limitation “discrimination unit, estimation unit, video output control unit, and audio output control unit” invokes 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. However, the written description fails to disclose the corresponding structure, material, or acts for performing the entire claimed function and to clearly link the structure, material, or acts to the function. Therefore, the claim is indefinite and is rejected under 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph.
`
Applicant may:

(a) Amend the claim so that the claim limitation will no longer be interpreted as a limitation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph;

(b) Amend the written description of the specification such that it expressly recites what structure, material, or acts perform the entire claimed function, without introducing any new matter (35 U.S.C. 132(a)); or

(c) Amend the written description of the specification such that it clearly links the structure, material, or acts disclosed therein to the function recited in the claim, without introducing any new matter (35 U.S.C. 132(a)).

If applicant is of the opinion that the written description of the specification already implicitly or inherently discloses the corresponding structure, material, or acts and clearly links them to the function so that one of ordinary skill in the art would recognize what structure, material, or acts perform the claimed function, applicant should clarify the record by either:

(a) Amending the written description of the specification such that it expressly recites the corresponding structure, material, or acts for performing the claimed function and clearly links or associates the structure, material, or acts to the claimed function, without introducing any new matter (35 U.S.C. 132(a)); or

(b) Stating on the record what the corresponding structure, material, or acts, which are implicitly or inherently set forth in the written description of the specification, perform the claimed function.

For more information, see 37 CFR 1.75(d) and MPEP §§ 608.01(o) and 2181.
`
`
`
`
Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless —

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claim(s) 1, 3, 7, 9-15, 17, and 19-20 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by prior art JP5618043, hereinafter PA8043.

N.B.: The translated version of the prior art JP5618043 comprises seven embodiment sections, which will be referred to by the examiner because the translated version lacks numbered paragraphs and columns.
`
As per claim 1, PA8043 teaches an information processing device, comprising: a discrimination unit configured to discriminate a gazing point of a user who views video and audio (see fourth embodiment section, “the video type discrimination unit 34”); an estimation unit configured to estimate sounding coordinates at which a sound image of an object gazed by the user is generated, wherein the sounding coordinates are estimated based on a video stream, an audio stream, and a discrimination result of the discrimination unit (see fourth embodiment section, “the sound source position estimation unit 5”; it must be noted that sounding coordinates are merely the determination of the sound image to be localized in “object-based audio”); a video output control unit configured to control an output of the video stream (a video output control unit is inherent in PA8043 in order to produce the video; see the third through fifth embodiment sections, which discuss a “video object”); and an audio output control unit configured to control an output of the audio stream to generate the sound image at the sounding coordinates (see fourth embodiment section, Figs. 1, 9-10; “sound source position estimation unit”).
`
`
`
`
As per claim 3, PA8043 teaches the information processing device according to claim 1, wherein the estimation unit is further configured to estimate the sounding coordinates to generate the sound image of a specific object, and the specific object serves as a sound source in the video stream (sounding coordinates; see third embodiment).

4. (Canceled)

5. (Canceled)

As per claim 7, PA8043 teaches the information processing device according to claim 1, wherein the video output control unit is further configured to perform rendering of the video based on a result of discrimination of a gazing degree of the user (see the third through fifth embodiment sections, which discuss a “video object”).

As per claim 9, PA8043 teaches the information processing device according to claim 7, wherein the rendering includes at least one of a framing process or a zooming process of the video (the third embodiment discusses a video frame from the video signal).

As per claim 10, PA8043 teaches the information processing device according to claim 7, wherein the video output control unit is further configured to perform the rendering based on a result of tracking the object gazed by the user (the first embodiment section discusses sound source position estimation and coordinates, which imply tracking).

As per claim 11, PA8043 teaches the information processing device according to claim 7, wherein the estimation unit is further configured to estimate the sounding coordinates based on the video subjected to at least one of a framing process or a zooming process (see the first embodiment in regard to video framing).

As per claim 12, PA8043 teaches the information processing device according to claim 9, wherein at least one of the framing process or the zooming process of the video is performed stepwise or at a specific speed to a target value (see the sixth embodiment in regard to video framing and speed).

As per claim 13, PA8043 teaches the information processing device according to claim 1, wherein the video stream is a stream of three-dimensional (3D) video (see the first embodiment section in regard to a video stream of 3D or three-dimensional video).
`
`
`
`
As per claim 14, PA8043 teaches the information processing device according to claim 13, wherein the estimation unit is further configured to generate the sound image based on a 3D depth or a direction of 3D display of a specific object, and the specific object serves as a sound source included in the video stream (see embodiment sections 1 and 3).

As per claim 15, PA8043 teaches the information processing device according to claim 13, wherein the estimation unit is further configured to generate the sound image based on a 3D depth and a direction of 3D display of a specific object, and the specific object serves as a sound source included in the video stream (see embodiment sections 1 and 3).

As per claim 17, PA8043 teaches the information processing device according to claim 1, further comprising: an acquisition unit configured to acquire related information of the object discriminated based on a feature of the object corresponding to the gazing point (see fourth embodiment section); and a related information output control unit configured to control an output of the acquired related information, wherein the video output control unit is further configured to control to output the related information together with the video stream (see the third to fifth embodiment sections).

As per claim 19, PA8043 teaches the information processing device according to claim 1, wherein the audio stream includes meta information of object-based audio (see the fourth and fifth embodiment sections).

As per claim 20, it is the method corresponding to claim 1 and contains similar limitations. Therefore, it is rejected under the same rationale.
`
Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
`
`
`
`
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.

2. Ascertaining the differences between the prior art and the claims at issue.

3. Resolving the level of ordinary skill in the pertinent art.

4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 2, 6, 8, 16, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over JP5618043, hereinafter PA8043.

As per claim 2, PA8043 does not teach the information processing device according to claim 1, wherein the estimation unit is further configured to estimate the sounding coordinates based on a machine learning model. Using a machine learning model is a standard practice, and its implementation cannot be considered as an invention. A skilled artisan before the effective filing date of the claimed invention would use a machine learning model to generate automatic events, feedback, and more.

As per claim 6, PA8043 does not teach the information processing device according to claim 1, wherein the discrimination unit is further configured to discriminate the gazing point of the user based on a machine learning model. Using a machine learning model is a standard practice, and its implementation cannot be considered as an invention. A skilled artisan before the effective filing date of the claimed invention would use a machine learning model to generate automatic events, feedback, and more.

As per claim 8, PA8043 teaches the information processing device according to claim 7, wherein the video output control unit is further configured to perform the rendering of the video (rendering a video is implicit in PA8043; see third embodiment section).
`
`
`
`
However, PA8043 does not discuss a machine learning model. Using a machine learning model is a standard practice, and its implementation cannot be considered as an invention. A skilled artisan before the effective filing date of the claimed invention would use a machine learning model to generate automatic events, feedback, and more.

As per claim 16, PA8043 teaches the information processing device according to claim 1, further comprising: a display unit configured to perform three-dimensional (3D) display (implicit in PA8043; see embodiment sections 1 and 3). However, PA8043 does not discuss binocular parallax. It must be noted that 3D display based on binocular parallax is equal to the maximum visible by a human viewer, which is achieved by a sufficiently small sub-pixel size for that purpose without the width of the image spectrum being larger than the size of the corresponding viewing slit.

As per claim 18, PA8043 teaches the information processing device according to claim 17, wherein the estimation unit is further configured to (see fourth embodiment section): extract the feature of the object, or acquire the related information (see seventh embodiment section). However, PA8043 does not discuss a machine learning model. Using a machine learning model is a standard practice, and its implementation cannot be considered as an invention. A skilled artisan before the effective filing date of the claimed invention would use a machine learning model to generate automatic events, feedback, and more.
`
Any inquiry concerning this communication or earlier communications from the examiner should be directed to FRANTZ B JEAN, whose telephone number is (571) 272-3937. The examiner can normally be reached 8-5, M-F.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
`
`
`
`
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Glenton B. Burgess, can be reached at 571-272-3949. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
`
`/FRANTZ B JEAN/
`Primary Examiner, Art Unit 2454
`
`