UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS
P.O. Box 1450
Alexandria, Virginia 22313-1450
www.uspto.gov

Application No.: 17/943,283
Filing Date: 09/13/2022
First Named Inventor: Yasushi OKUMURA
Attorney Docket No.: 116335-1469349-001060US
Confirmation No.: 3106

Sony / Kilpatrick Townsend & Stockton LLP
Mailstop: IP Docketing - 22
1100 Peachtree Street
Suite 2800
Atlanta, GA 30309

Examiner: ROSARIO, DENNIS
Art Unit: 2676
Notification Date: 11/21/2024
Delivery Mode: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding.

The time period for reply, if any, is set in the attached communication.

Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es):

KTSDocketing2@kilpatrick.foundationip.com
ipefiling@kilpatricktownsend.com
scea_patent_docket@Playstation.Sony.com

PTOL-90A (Rev. 04/07)
Office Action Summary

Application No.: 17/943,283
Applicant(s): OKUMURA et al.
Examiner: DENNIS ROSARIO
Art Unit: 2676
AIA (FITF) Status: Yes

-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --

Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).

Status

1) [X] Responsive to communication(s) filed on 9/13/2022.
   [ ] A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) [ ] This action is FINAL.   2b) [X] This action is non-final.
3) [ ] An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) [ ] Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.

Disposition of Claims*

5) [X] Claim(s) 1-8 is/are pending in the application.
   5a) Of the above claim(s), ___ is/are withdrawn from consideration.
6) [ ] Claim(s) ___ is/are allowed.
7) [X] Claim(s) 1-8 is/are rejected.
8) [ ] Claim(s) ___ is/are objected to.
9) [ ] Claim(s) ___ are subject to restriction and/or election requirement.

* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.

Application Papers

10) [ ] The specification is objected to by the Examiner.
11) [X] The drawing(s) filed on 9/13/2022 is/are: a) [X] accepted or b) [ ] objected to by the Examiner.
    Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
    Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).

Priority under 35 U.S.C. § 119

12) [ ] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
    Certified copies:
    a) [ ] All   b) [ ] Some**   c) [ ] None of the:
    1. [ ] Certified copies of the priority documents have been received.
    2. [ ] Certified copies of the priority documents have been received in Application No. ___.
    3. [ ] Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
    ** See the attached detailed Office action for a list of the certified copies not received.

Attachment(s)

1) [X] Notice of References Cited (PTO-892)
2) [ ] Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b), Paper No(s)/Mail Date ___
3) [ ] Interview Summary (PTO-413), Paper No(s)/Mail Date ___
4) [X] Other: See Continuation Sheet

U.S. Patent and Trademark Office
PTOL-326 (Rev. 11-13)   Office Action Summary   Part of Paper No./Mail Date 20241112
Continuation Sheet (PTOL-326)

Application No. 17/943,283

Continuation of Attachment(s) 4) Other:
SEARCH history, 33 pages;
Workspace Notes, 1 page;
Search Illustrated, 8 pages;
Web Search History, 1 page; and
17/943,283 Search History, 4 pages.
Application/Control Number: 17/943,283
Art Unit: 2676

DETAILED ACTION
Claim Interpretation

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term "means" or "step" or a term used as a substitute for "means" that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;

(B) the term "means" or "step" or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word "for" (e.g., "means for") or another linking word or phrase, such as "configured to" or "so that"; and

(C) the term "means" or "step" or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word "means" (or "step") in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word "means" (or "step") in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word "means" (or "step") are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word "means" (or "step") are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word "means," but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are:

Re claims 1 and 3:

"an obtaining section configured to obtain position information...;
a deriving section configured to derive a length...; and
a production control section configured to perform production" in claim 1.

"an obtaining section configured to obtain position information...;
A deriving section configured to derive a direction...; and
A production control section configured to perform production" in claim 3.

Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification (page 4 & figure 2) as performing the claimed function, and equivalents thereof:
[FIG. 2 (sheet 2/3): block diagram showing a control section including an obtaining section, a deriving section 34, and a production control section 36]

The elements described as functional blocks performing various processes in FIG. 2 can each include a circuit block, a memory, or another large-scale integrated (LSI) circuit or central processing unit (CPU) in terms of hardware, and are implemented by a program loaded in a memory or the like in terms of software. Hence, it is to be understood by those skilled in the art that these functional blocks can be implemented in various forms by only hardware, only software, or combinations of hardware and software, and are not limited to one of the forms. In the information processing device 10, the estimating section 20 and the control section 30 may be implemented by the same processor and may be implemented by separate processors.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Re claims 1, 3, 7, 8:

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is in view of applicant's disclosure, page 8:

"The present technology has been described above on the basis of the embodiment. According to the embodiment, the production control section 36 can perform production automatically on the basis of a production parameter derived by the deriving section 34. The present embodiment is illustrative, and it is to be understood by those skilled in the art that combinations of constituent elements and processing processes of the embodiment are susceptible of various modifications and that such modifications also fall within the scope of the present technology."
Re 3. (Original) An information processing device comprising:

an obtaining section configured to obtain position information of a plurality of parts of a photographing target;

a deriving section configured to derive a direction¹ connecting² two parts to each other; and

a production control section configured to perform production on a basis of the derived direction.

Re 4. (Original) The information processing device according to claim 1, wherein the (output) production control section performs³ (A) light production and/or (B) sound production.

¹ direction: the act or an instance of directing. (Dictionary.com)
² present participial adjective modifying nouns (section and/or direction), wherein present participle is defined: a participial form of verbs (connect) used adjectivally when the action it describes is contemporaneous with that of the main verb ("to perform") of a sentence and also used in the formation of certain compound tenses. In English this form ends in -ing, wherein adjective is defined: a word imputing a characteristic to a noun (section and/or direction) or pronoun (Dictionary.com)
³ Markush element of alternatives follow: A and/or B
Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-8 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step zero: Broadest Reasonable Interpretation, above.

Step 2A: The claim(s) recite(s) "obtain... information... derive a length". This mental judicial exception is not integrated into a practical application because the additional elements ("a production control section configured to perform production"⁴; computer stuff via 35 USC 112(f)) with the exception is not improving technology or a technical field, such as an apparatus-platform Theater⁵ (applicant's fig. 1).

Step 2B: The claim(s) does/do not include additional elements that are sufficient to amount to significantly more ("novel" "lighting", disclosure, pg. 8, ll. 10-12) than the judicial exception because the additional elements with the exception are conventional: see applicant's disclosure, BACKGROUND and SUMMARY, pages 1, 2, and thus need no explanation and thus are not significantly more than the mental exception.

⁴ production: 1. (Broadest Reasonable Interpretation) the act of producing; creation; manufacture. 8. (non-Broadest Reasonable Interpretation) the organization and presentation of a dramatic entertainment (Dictionary.com)
⁵ stage Theater A. the platform on which the actors perform in a theater. B. this platform with all the parts of the theater and all the apparatus back of the proscenium. (Dictionary.com)
Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless —

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claim(s) 1, 2, 6 and 3 and 7 and 8 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by MAKOTO et al. (JP 2020-109556) with Machine Translation (22 pages).

Re 1. Makoto discloses (Original) An information processing device comprising:

an obtaining ("CPU") section configured to obtain ("positional") position information of a plurality of (body) parts (and a "predetermined part" of a person, figure 2) of a photographing ("player") target;

a deriving ("CPU") section configured to derive a ("video") length (fig. 11: "L") between two (body) parts; and

a production ("CPU") control section configured to perform production (via a "video image" or "output production system": fig. 1 producing an "output video", i.e., "output target" or "the output": fig. 21) on a basis of the derived ("size"-"based") length (via Machine Translation, 22 pages:
Page 11 of 22:
The distance estimation device 2 described above includes, for example, a processor such as a CPU (Central Processing Unit), a memory as temporary storage, and a non-volatile storage device (EEPROM or hard disk). By reading the program stored in the storage device into the memory and executing the program, the processor such as the CPU causes the orientation estimation unit 21, the control unit 22, the image identification units 23a to 23d, the positional relationship estimation unit 24, and the distance estimation unit 25 to function.

Page 3 of 22:
FIG. 6 is a block diagram showing a configuration example of the distance estimation device 2 in this embodiment. The distance estimation device 2 includes an orientation estimation unit 21, a control unit 22, image identification units 23a to 23d, a positional relationship estimation unit 24, and a distance estimation unit 25.

Page 19 of 22:
The distance estimation device according to claim 1, wherein the distance estimation unit estimates the distance between the first specific person and the second specific person using the estimated positional relationship between the first specific person and the second specific person, the ratio between the image length of the predetermined portion of the first specific person and the image length of the predetermined portion of the second specific person, the actual length of the predetermined part of the first specific person, and the actual length of the predetermined part of the second specific person.

Page 7 of 22:
Here, the identification target identified by the image identification units 23a to 23d is, for example, the player (specific person) in the video and the attribute of the player. The attributes of a player are not only the name and age of each player, but also the team or university to which the player belongs, the role (a pitcher or a fielder in the case of baseball, an offense or defense in the case of soccer), etc. In the following description, the identification target may be described as a class for convenience. Further, the category is a type of feature amount used for identifying the player and the attribute of the player. Typical categories include, for example, the face of the player, the uniform worn by the player, the uniform number, the letters written on the tag and the number, and the like.
Page 9 of 22:
The distance estimation unit 25 estimates the distance between at least two or more identification targets (specific persons) using the positional relationship estimated by the positional relationship estimation unit 24 and the length of the object on the video.

[Figure 11]
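The ratio-based estimation described in the quoted passages (pages 9, 11, and 19 of the Machine Translation) can be illustrated with a short sketch. This is a minimal Python illustration, not Makoto's actual implementation; the per-person pixel-to-meter scale, the averaging of the two scales, and all names are assumptions:

```python
from dataclasses import dataclass


@dataclass
class Person:
    # Pixel coordinates of the person's reference point in the frame.
    x_px: float
    y_px: float
    # Image length (pixels) and actual length (meters) of a predetermined
    # body part, e.g. top of the head to the midpoint of the neck ("L" in fig. 11).
    part_len_px: float
    part_len_m: float

    @property
    def scale_m_per_px(self) -> float:
        # Meters represented by one pixel at this person's depth in the scene.
        return self.part_len_m / self.part_len_px


def estimate_distance_m(a: Person, b: Person) -> float:
    # Average the two per-person scales, then convert the pixel
    # separation between the two reference points into meters.
    scale = (a.scale_m_per_px + b.scale_m_per_px) / 2.0
    dx = (a.x_px - b.x_px) * scale
    dy = (a.y_px - b.y_px) * scale
    return (dx * dx + dy * dy) ** 0.5
```

The sketch assumes both persons are at comparable depth; Makoto additionally uses the estimated positional relationship and face orientation, which are omitted here.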

Page 2 of 22:
FIG. 1 is a block diagram showing a configuration example of a video production system according to this embodiment. The video production system includes a plurality of cameras 1a to 1d and a distance estimation device 2.

[Figure 1: cameras 1a to 1d feeding the distance estimation device 2]
Page 13 of 22:
FIG. 20 is a diagram showing an example of the calculated likelihood for each identification target (class). In addition, in the example of FIG. 20, in addition to the university name of the identified player, in order to improve the accuracy, the classes of the motorcycle and the audience are provided. In the example of FIG. 20, the person A has the highest probability of being an audience member, the person B has the highest probability of being a college X player, the person C has the highest probability of being a college Y player, and the person D has the highest certainty of being a university Z player. Here, if the threshold value is 0.7, the image identifying unit 23a determines that the person A belongs to the audience class, the person B belongs to the university X class, the person C belongs to the university Y class, and the person D belongs to the university Z class. Then, the position information on the image of the person B (for example, coordinate information specifying the attention area) and the university name X are output. Similarly, the positional information on the image of the person C (for example, coordinate information for specifying the attention area) and the university name Y are output. Similarly, the position information (for example, coordinate information that specifies the attention area) of the person D on the video and the university name Z are output. It should be noted that the person A is excluded from the output target because it is not a player, but it does not prevent the output.

[Figure 21]
Page 8 of 22:
In the present embodiment, regarding the identified specific person, as shown in FIG. 11, the distance of the straight line connecting the top of the head to the midpoint of the neck is set to L, and an attention area of L×2L is set downward from the center of the neck of the person. ... Then, the positional relationship estimation unit 24 estimates the positional relationship between the specific persons based on the size of the attention area as an index of the size of the identified specific person on the image.).

Re 2. (Original), Makoto discloses The information processing device according to claim 1, wherein the deriving section derives a length between two (body) parts (such as head and feet) not adjacent to each other (via fig. 11:

[Figure 11: area of attention, L×2L]).

Re 6. (Original), Makoto discloses The information processing device according to claim 1, wherein the production control section performs video production (producing said output target of fig. 21).
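The attention-area construction in the quoted Page 8 passage can be sketched as follows. This is a minimal illustration assuming image coordinates with y increasing downward and a region horizontally centered on the neck; the function and argument names are hypothetical:

```python
def attention_area(head_top, neck_mid):
    """Return (x, y, width, height) of an L x 2L attention area set
    downward from the neck midpoint, per the quoted passage (fig. 11).

    head_top, neck_mid: (x, y) pixel coordinates, y increasing downward.
    """
    hx, hy = head_top
    nx, ny = neck_mid
    # L = straight-line distance from the top of the head to the neck midpoint.
    L = ((hx - nx) ** 2 + (hy - ny) ** 2) ** 0.5
    # Width L, height 2L, horizontally centered on the neck, extending downward.
    return (nx - L / 2.0, ny, L, 2.0 * L)
```

The resulting box size then serves, as the passage states, as an index of the person's apparent size on the image.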

Re 3. (Original), Makoto discloses An information processing device comprising:

an obtaining (CPU) section configured to obtain (positional) position information of a plurality of parts of a photographing (player) target;

a deriving (CPU) section (fig. 6: 22: "Control Unit") configured to derive a ("face") direction⁶ connecting⁷ (as shown by the electrical lines in fig. 6 or connecting "between" "targets": fig. 21) two parts (fig. 6: 21, 23a, 23b, 23c, 23d, 24, 25 or body parts comprised by people) to each other; and

a production (CPU) control section configured to perform (video) production (producing the targeted output) on a basis of the derived direction (since the "distance" is based on "face" "orientation" via:

[Figure 6: camera 1a to 1d footage input to the image identification section (image identification units 23a to 23d)]

⁶ direction: the act or an instance of directing. (Dictionary.com)
⁷ participial adjective modifying nouns (section and/or direction)
Page 5 of 22:
In the present embodiment, the image identifying unit 23a is an image identifying unit that has been learned to identify a player using a video image of the player captured from the front. In other words, the image of the player facing the front of the image (the direction in which the player's face is visible) is learned as teacher data, and the unit is used to identify, from a video, a player facing the front of the image (the direction in which the player's face is visible). It should be noted that the teacher data does not include only the video in which the player faces directly in front of the image (the direction in which the player's face is visible); video in which the player faces in a left-right direction close to the front (for example, 45 degrees left and right) may be added to the teacher data.

Page 9 of 22:
The distance estimation unit 25 estimates the distance between at least two or more identification targets (specific persons) using the positional relationship estimated by the positional relationship estimation unit 24 and the length of the object on the video.

[Figure 21: d2(D2)]
Page 15 of 22:
In the present embodiment, the distance between specific persons is estimated by estimating the orientation of the specific person in the video and its positional relationship, and thus the distance between specific persons can be estimated with high accuracy. Also, since the positional relationship of the specific person on the video can be known, it is possible to estimate the distance between the first player (specific person) and each player (specific person) and the distance between the first player and the third player.).

Claims 7 and 8 are rejected similarly to claim 1.
Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Regarding inquiry 4, see Suggestions.

Claim(s) 4, 5 is/are rejected under 35 U.S.C. 103 as being unpatentable over MAKOTO et al. (JP 2020-109556) with Machine Translation (22 pages) in view of NORIYUKI (JP 2019-033868 A) with Machine Translation (68 pages).

Re 4. (Original), Makoto teaches The information processing device according to claim 1, wherein the (output) production control section performs light production and/or sound production.
Makoto does not teach "light production and/or sound production". Noriyuki teaches "light production" or "lighting" "production" via Machine Translation (MT), pg. 43 of 68:

The production system 600 is a system that produces a performance production by controlling video, sound, lighting, and the like in accordance with the movement of each performer.

Since Makoto suggests other video production systems, via "[FIG. 1] is a block diagram showing a configuration example of a video production system according to the embodiment.", one of skill in production can make Makoto's be as Noriyuki's and recognize that the change is predictable or looked forward to since the change enhances a sports broadcast production via "lighting effects"⁸, Noriyuki, MT: pg. 45 of 68:

The illuminating device 612 includes, for example, various lights and light sources. The lighting device 612 performs various lighting effects⁸ based on the DMX data from the effect control device 605 and the DMX controller 611.

Re 5. (Original), Makoto with Noriyuki teach The information processing device according to claim 4, wherein the (output target) production control section performs light production and/or sound production by controlling a lighting apparatus and/or a sound apparatus (by "controlling... sound, lighting" via said Noriyuki:

The production system 600 is a system that produces a performance production by controlling video, sound, lighting, and the like in accordance with the movement of each performer.).

⁸ effects: lighting, sounds, etc., to accompany and enhance a stage, film, or broadcast production (Dictionary.com)
Suggestions

Applicant's disclosure states:

"The production control section 36 may adjust the irradiation directions of light of the lighting apparatuses 2 according to the derived direction vector D. For example, the production control section 36 may control the movable unit of each lighting apparatus 2 such that the plurality of lighting apparatuses 2 apply light to one point (cross mark on a dotted line in FIG. 7) on a half straight line obtained by extending the direction vector D in a direction from the starting point to the end point. The performer can thereby brightly illuminate one point in the direction from the left elbow to the left hand. A distance from the end point of the direction vector D to the one point on the half straight line may be a predetermined distance, or may be set at predetermined times the length of the direction vector. This light production enables the performer to freely manipulate the position irradiated by the lighting apparatuses 2 in the live venue so that a novel live performance ..."
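The light-aiming geometry described in the quoted disclosure can be sketched as follows. This is a minimal illustration of extending the direction vector D beyond its end point; the 3-D coordinate representation and the `times` parameter name are assumptions, not applicant's implementation:

```python
def light_target_point(start, end, times=1.0):
    """Point on the half straight line extending direction vector D
    (start -> end) beyond the end point.

    `times` follows the passage's "predetermined times the length of the
    direction vector": the target lies `times` * |D| past the end point.
    start, end: (x, y, z) positions, e.g. left elbow and left hand.
    """
    sx, sy, sz = start
    ex, ey, ez = end
    # Direction vector D from the starting point to the end point.
    dx, dy, dz = ex - sx, ey - sy, ez - sz
    return (ex + times * dx, ey + times * dy, ez + times * dz)
```

Each lighting apparatus's movable unit would then be steered so its beam passes through the returned point.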
Thus the lack of the disclosed "direction vector D" in the claims is an indication of obviousness under 35 USC 103.

Note that these suggestions are not provided with respect to overcoming 35 USC 101, 112, 102 and/or 103. These suggestions are mainly provided to seek out advantages in the disclosure regardless of 35 USC 101, 112, 102 and/or 103.
`Application/Control Number: 17/943,283
`Art Unit: 2676
`
`Page 21
`
`Conclusion
`
`The prior art “nearest to the subject matter defined inthe claims” (MPEP 707.05}
`
`made of record and notrelied upon is considered pertinent to applicant's disclosure.
`
`The following table lists several references that are relevant to the subject matter
`
`claimed and disclosed in this Application. The references are not relied on by the
`
`Examiner, but are provided to assist the Applicant in responding to this Office action.
`
(JP 2012-065819 A), in the Search Illustrated, pg. 4: teaches ... person's body to the tip-part of a hand (fig. 1) for popping video-game balloons (fig. 2) as the closest to the claimed "length between two parts".

KENSAKU et al. (JP 2020-204890), IDS cited, in the Search Illustrated, page 8: Kensaku teaches a bone-vector as the closest to applicant's disclosed novel light-adjusting vector discussed in the Suggestions.

Palm (JP 2000-503177 & US 5,748,199), in the Search Illustrated, pages 1, 2: Palm teaches distances connecting points A, B, C in figure 75, A, B as the closest to the claimed "length between two parts".

`
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DENNIS ROSARIO whose telephone number is (571) 272-7397. The examiner can normally be reached Monday-Friday, 9AM-5PM EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Henok Shiferaw, can be reached on 571-272-4637. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).

If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

`
/DENNIS ROSARIO/
Examiner, Art Unit 2676

/Henok Shiferaw/
Supervisory Patent Examiner, Art Unit 2676