`
`UNITED STATES DEPARTMENT OF COMMERCE
`United States Patent and Trademark Office
`Address: COMMISSIONER FOR PATENTS
`P.O. Box 1450
`Alexandria, Virginia 22313-1450
`
`17/794,138
`
`07/20/2022
`
`Shingo TSURUMI
`
`1946-1774
`
`3513
`
`Para
`
`a
`
`Herons
`
`Paratus Law Group, PLLC
`1765 Greensboro Station Place
`Suite 320
`
`Tysons Corner, VA 22102
`
`ZHAO,LEI
`
`2668
`
`12/27/2024
`
`PAPER
`
`Please find below and/or attached an Office communication concerning this application or proceeding.
`
`The time period for reply, if any, is set in the attached communication.
`
`PTOL-90A (Rev. 04/07)
`
`
`
Office Action Summary

Application No.: 17/794,138
Applicant(s): TSURUMI, Shingo
Examiner: LEI ZHAO
Art Unit: 2668
AIA (FITF) Status: Yes

-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --
Period for Reply
`Period for Reply
`
A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).
`
`Status
`
`
`
1) [X] Responsive to communication(s) filed on 12/5/2024.
   [ ] A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) [X] This action is FINAL.
2b) [ ] This action is non-final.
3) [ ] An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) [ ] Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.
`
Disposition of Claims*

5) [X] Claim(s) 1-14 is/are pending in the application.
   5a) Of the above claim(s) ___ is/are withdrawn from consideration.
6) [ ] Claim(s) ___ is/are allowed.
7) [X] Claim(s) 1-14 is/are rejected.
8) [ ] Claim(s) ___ is/are objected to.
9) [ ] Claim(s) ___ are subject to restriction and/or election requirement.

* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.
`
`Application Papers
10) [ ] The specification is objected to by the Examiner.
11) [X] The drawing(s) filed on 7/20/2022 is/are: a) [X] accepted or b) [ ] objected to by the Examiner.
    Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
    Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).
`
`Priority under 35 U.S.C. § 119
12) [X] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
    Certified copies:
    a) [X] All   b) [ ] Some**   c) [ ] None of the:
    1. [X] Certified copies of the priority documents have been received.
    2. [ ] Certified copies of the priority documents have been received in Application No. ___.
    3. [ ] Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
** See the attached detailed Office action for a list of the certified copies not received.
`
`Attachment(s)
`
`1)
`
`Notice of References Cited (PTO-892)
`
`2) (J Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b)
`Paper No(s)/Mail Date
`U.S. Patent and Trademark Office
`
`3)
`
`4)
`
`(LJ Interview Summary (PTO-413)
`Paper No(s)/Mail Date
`(Qj Other:
`
`PTOL-326 (Rev. 11-13)
`
`Office Action Summary
`
`Part of Paper No./Mail Date 20241220
`
`
`
`Application/Control Number: 17/794,138
`Art Unit: 2668
`
`Page 2
`
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
`
`Response to Arguments
`
Applicant's arguments filed December 5, 2024 have been fully considered but they are not persuasive.

Regarding the rejection of claims 1-6, 12, and 14 as being anticipated by Goji (Japan Patent Pub. No.: JP 2016-170603 A) under 35 U.S.C. § 102(a)(1), the examiner maintains that the teachings of Goji disclose or suggest each and every limitation of claims 1-6, 12, and 14, arranged as recited in those claims. See the rejections for the individual claims cited below.

Regarding claims 1, 12, and 14:

(1) Applicant states that "Goji does not consider using an attribute of the object, a CPU load, or magnitude of movement of the object related to an importance level to determine a region suitable for tracking." The examiner disagrees with this statement. Goji teaches "the allocation of the tracking parts to the particles in the new tracking person may be randomly performed or may be performed at a predetermined ratio for each part such as head 30%, abdomen 20%, left shoulder 20%, and right shoulder 20%" ([0028]). Head, abdomen, left shoulder, and right shoulder are attributes of the object. It is also reasonable to construe that the ratio assigned to each region is related to an importance level of each region.

Regarding claims 2-6: claims 2-6 are dependent claims depending upon claim 1. See the rejections for the individual claims cited below.
`
`
`
`
Response to Amendment

The Amendment of December 5, 2024 overcomes the following rejection:

a. Rejection of claims 12 and 13 based on 35 U.S.C. § 101.
`
Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless —

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
`
Claims 1-6, 12, and 14 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Goji (Japan Patent Pub. No.: JP 2016-170603 A).

Regarding claim 1, Goji teaches an information processing device comprising: a control unit (tracking unit 4) configured to track an object in an image using images input in time series (The image capturing unit 2 is connected to the tracking unit 4, captures the monitoring space at predetermined time intervals, and sequentially outputs the captured time-series monitoring images to the tracking unit 4. [0017]), using a tracking result obtained by performing tracking in units of a tracking region corresponding to a specific part of the object (The tracking part setting unit 41 (part setting unit) determines a part to be tracked by the particle filter for the person set as the tracking target by the tracking person setting unit 40, generates the tracking part information 21 for each part, and stores the tracking part information 21 in the storage unit 3. [0025]), wherein the control unit determines a setting of the tracking region to be used for tracking in the units of the tracking region according to an importance level related to tracking based on a predetermined index (The allocation of the tracking parts to the particles in the new tracking person may be randomly performed or may be performed at a predetermined ratio for each part such as head 30%, abdomen 20%, left shoulder 20%, and right shoulder 20%. [0028]. It is reasonable to construe that the predetermined ratio for each region indicates an importance level of each region.), the predetermined index includes at least one of an attribute of the object, a CPU load, or magnitude of movement of the object, and the control unit is implemented via at least one processor (The allocation of the tracking parts to the particles in the new tracking person may be randomly performed or may be performed at a predetermined ratio for each part such as head 30%, abdomen 20%, left shoulder 20%, and right shoulder 20%. [0028]).
`
Regarding claim 2, Goji teaches the information processing device according to claim 1, wherein the control unit is further configured to extract a configuration element of the object in the image using the image (The parts can be selected by a method of using predetermined parts such as a head, a left shoulder, a right shoulder, a left hip, a right hip, and an abdomen [0025]), and detect the object in the image using an extraction result (A monitoring image input from an imaging part 2 is processed to detect a person in a monitoring space, the detected person is tracked by a particle filter, and a tracking result is output to an output part 5. [0022]).
`
`
`
`
Regarding claim 3, Goji teaches the information processing device according to claim 2, wherein the control unit is further configured to detect the object in the image for each predetermined number of frames (When the operation of the moving object tracing apparatus 1 is started, the imaging unit 2 acquires a monitoring image at a predetermined frame rate (for example, 7 fps) and outputs the monitoring image to the tracing unit 4. [0045]) equal to or larger than [[the]] a number of frames required to detect the object in the image (When the tracing process S2 is completed for all the persons to be tracked, the tracker 4 advances the process to step S4, and on the other hand, when there is an unprocessed person to be tracked, the tracker 4 continues the tracing process S2 (step S3). [0046]).
`
Regarding claim 4, Goji teaches the information processing device according to claim 1, wherein the control unit is further configured to track the object in the image by real-time processing for each frame (When the operation of the moving object tracing apparatus 1 is started, the imaging unit 2 acquires a monitoring image at a predetermined frame rate (for example, 7 fps) and outputs the monitoring image to the tracing unit 4. Every time a monitoring image is input (step S1), the tracing unit 4 repeats a series of processes of steps S1 to S5. [0045]; That is, in order to track a person in real time, it is necessary to complete processing for particles of all parts within a photographing cycle [0007]).
`
Regarding claim 5, Goji teaches the information processing device according to claim 1, wherein the control unit is further configured to perform tracking in the units of a tracking region, using one or more tracking regions (A moving object tracking device according to the present invention tracks a moving object in time-series images, and includes storage means for storing, for each of a plurality of parts constituting the moving object, an image feature, a past position, and a past tracking reliability indicating a likelihood of the past position. [0009]), and in a case of performing tracking in the units of a tracking region, using a plurality of tracking regions, the control unit is further configured to track the object in the image on a basis of the tracking result obtained by performing tracking in each of the tracking regions (As described above, according to the present invention, in tracking a moving object by setting a plurality of parts on the moving object and tracking each part using a particle filter, a group of particles, the total number of which per moving object is predetermined, is distributed to each tracking part in accordance with the likelihood thereof. [0058]).
`
Regarding claim 6, Goji teaches the information processing device according to claim 1, wherein the control unit is further configured to select a tracking region (for each of a plurality of parts constituting the moving object [0009]) to be used for tracking in the units of a tracking region from a plurality of candidates (FIG. 5 is a pattern diagram showing an example of a tracing part set in the person H1 at a certain time during tracing, and FIG. 6 is a pattern diagram showing an example of the tracing part information 21 corresponding to the tracing part shown in FIG. 5. For example, for the person H1, the tracked parts Rj (j = 1 to 10) are set with the tracked part IDs as j, and FIG. 5 shows a part of the tracked parts Rj. [0020]).
`
Claim 12 is drawn to a non-transitory computer-readable medium having embodied thereon a program, which when executed by a computer causes the computer to execute the method of using the corresponding apparatus as claimed in claim 1. Therefore, claim 12 corresponds to apparatus claim 1, and is rejected for the same reasons of anticipation as used above.
`
`
`
`
Method claim 14 is drawn to the method of using the corresponding apparatus claimed in claim 1. Therefore, method claim 14 corresponds to apparatus claim 1 and is rejected for the same reasons of anticipation as used above.
`
Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
`
Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Goji (Japan Patent Pub. No.: JP 2016-170603 A), in view of Hidetomo (Japan Patent Pub. No.: JP 2010-141668 A), hereinafter Hidetomo.

Regarding claim 8, Goji teaches all of the elements of the claimed invention as stated in claim 1 except for the following limitations as further recited. However, Hidetomo teaches the information processing device according to claim [[7]] 1, wherein the predetermined index further includes a background of the object (In addition, by deriving evaluation expressions such as Expression (4) and Expression (5) using Expression (13) to Expression (16), it is possible to evaluate the overlap between the foreground and the ellipse model and the overlap between the background and the ellipse model. For example, when a person who is a moving object is tracked, it is assumed that a part of the person is tracked (detected) as illustrated in FIG. 3a. The overlap between the person and the ellipse model is calculated using Equation (13), and is 1 when the ellipse model and the person completely match, and is 0 when there is no overlap at all. The same applies to the overlap between the background and the elliptic model. In the harmonic mean of both, when Fm > th (threshold value), the overlap of the foreground, the background, and the elliptic region is in the best state, and in other cases, the overlap can be ignored. [0052]).

It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Goji to incorporate the teachings of Hidetomo to include a background of the object in determining the predetermined index to improve object tracking accuracy.
`
Claims 9 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Goji (Japan Patent Pub. No.: JP 2016-170603 A), in view of Wei (US Patent Pub. No.: US 2006/0064396 A1), hereinafter Wei.

Regarding claim 9, Goji teaches all of the elements of the claimed invention as stated in claim 1 except for wherein the control unit is further configured to change a setting of a tracking region to be used for tracking in the units of a tracking region according to input information from a user.

Wei teaches a graphical user interface that can change a location for data processing according to input information from a user (For example, a data processing function may be applied to only a designated portion of the displayed data, which is determined, for instance, via a mouse click on a particular location of the display screen. [0030]).
`
`
`
`
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Goji to incorporate the teachings of Wei to change a setting of a tracking region according to input information from a user to improve tracking efficiency and throughput.
`
Regarding claim 13, Wei in the combination teaches the non-transitory computer-readable medium according to claim 12, wherein the method further comprises: displaying, on a display on which the image is displayed (Fig. 3), an icon (button 304) for prompting a user to change a setting of a tracking region to be used for tracking in the units of a tracking region (an actionable button 304 for activating an interactive lesion detection operation [0038]).
`
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Goji (Japan Patent Pub. No.: JP 2016-170603 A), in view of Shingo (Japan Patent Pub. No.: JP 2010-211485 A), hereinafter Shingo.

Regarding claim 10, Goji teaches all of the elements of the claimed invention as stated in claim 1 except for the following limitations as further recited. However, Shingo teaches wherein the control unit is further configured to identify whether or not a new object in the image is a predetermined object on a basis of a state represented by a plurality of configuration elements of each object (When a tracked human face is newly detected after being lost, a face feature amount of the detected human face is collated with a face feature amount of a past frame. As a result of the collation, if the face feature amounts match, the person is recognized as the same person, and double counting is prevented. [0011]).

It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Goji to incorporate the teachings of Shingo to identify whether or not a new object in the image is a predetermined object on a basis of a state represented by a plurality of configuration elements of each object in order to improve tracking accuracy.
`
Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Goji (Japan Patent Pub. No.: JP 2016-170603 A), in view of Ichiko (Japan Patent Pub. No.: JP 2017-212680 A), hereinafter Ichiko.

Regarding claim 11, Goji teaches all of the elements of the claimed invention as stated in claim 1 except for the following limitations as further recited. However, Ichiko teaches wherein the control unit is further configured to detect the object in the image using the image (Fig. 4(a), the subject P in image frame 601), and perform compensation processing of compensating for movement of the object from start of detection to completion of detection of the object in the image when tracking the object in the image (In the case of FIG. 4b, the analysis apparatus 400 determines that information for image analysis processing is insufficient in a section from the first frame 601 to the fourth frame 604. Therefore, for example, as illustrated in FIG. 4c, the analysis apparatus 400 acquires, from the camera, the third frame 603 among the images in the section from the first frame 601 to the fourth frame 604. [0023]. In this way, the analysis apparatus 400 requests distribution of a frame existing between frames used for image analysis processing for a section determined to be insufficient in information for image analysis processing. Upon receiving a new frame 603 from the camera as shown in the image sequence 600c, the analysis apparatus 400 performs image analysis processing again based on an image of the received new frame and a plurality of frames of the already received image sequence 600b. As a result, as illustrated in FIG. 4c, it is possible to appropriately track the person P and to improve the accuracy of the image analysis process. [0024]. Adding the third frame reads on "performs compensation processing of compensating for movement of the object from start of detection to completion of detection".).

It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Goji to incorporate the teachings of Ichiko to compensate for movement of the object when tracking the object to improve the accuracy of the image analysis process.
`
Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
`
Any inquiry concerning this communication or earlier communications from the examiner should be directed to LEI ZHAO whose telephone number is (703) 756-1922. The examiner can normally be reached Monday - Friday, 8:00 am - 5:00 pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, VU LE, can be reached on (571) 272-7332. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
`
`/LEI ZHAO/
`Examiner, Art Unit 2668
`
/VU LE/
Supervisory Patent Examiner, Art Unit 2668
`
`