`
`UNITED STATES DEPARTMENT OF COMMERCE
`United States Patent and Trademark Office
`Address: COMMISSIONER FOR PATENTS
`P.O. Box 1450
`Alexandria, Virginia 22313-1450
`
APPLICATION NO.: 17/757,451
FILING DATE: 06/15/2022
FIRST NAMED INVENTOR: MASAAKI ISHII
ATTORNEY DOCKET NO.: SYP333906USOLINT
CONFIRMATION NO.: 4765

CHIP LAW GROUP/INT
505 N. Lake Shore Drive
Suite 250
CHICAGO, IL 60611

EXAMINER: HYTREK, ASHLEY LYNN
ART UNIT: 2665
NOTIFICATION DATE: 11/22/2024
DELIVERY MODE: ELECTRONIC
`
`Please find below and/or attached an Office communication concerning this application or proceeding.
`
`The time period for reply, if any, is set in the attached communication.
`
`Notice of the Office communication was sent electronically on above-indicated "Notification Date" to the
`following e-mail address(es):
`
docketing@chiplawgroup.com
eofficeaction@appcoll.com
`
`PTOL-90A (Rev. 04/07)
`
`
`
`
`
Disposition of Claims*
5) Claim(s) 1-20 is/are pending in the application.
   5a) Of the above claim(s), ___ is/are withdrawn from consideration.
6) Claim(s) ___ is/are allowed.
7) Claim(s) 1-20 is/are rejected.
8) Claim(s) ___ is/are objected to.
9) Claim(s) ___ are subject to restriction and/or election requirement.
* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.
`
`
Application Papers
10) ( ) The specification is objected to by the Examiner.
11) The drawing(s) filed on 10/31/2024 is/are: a) (✓) accepted or b) ( ) objected to by the Examiner.
    Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
    Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).
`
Priority under 35 U.S.C. § 119
12) (✓) Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
    Certified copies:
    a) (✓) All   b) ( ) Some**   c) ( ) None of the:
    1. Certified copies of the priority documents have been received.
    2. Certified copies of the priority documents have been received in Application No. ___.
    3. Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
** See the attached detailed Office action for a list of the certified copies not received.
`
Attachment(s)
1) Notice of References Cited (PTO-892)
2) Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b), Paper No(s)/Mail Date ___
3) ( ) Interview Summary (PTO-413), Paper No(s)/Mail Date ___
4) ( ) Other: ___

U.S. Patent and Trademark Office    PTOL-326 (Rev. 11-13)    Part of Paper No./Mail Date 20241105
`
Office Action Summary
Application No.: 17/757,451
Applicant(s): ISHII, MASAAKI
Examiner: ASHLEY HYTREK
Art Unit: 2665
AIA (FITF) Status: Yes

-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --
`Period for Reply
`
A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133). Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).
`
`Status
`
1) Responsive to communication(s) filed on 06/15/2022.
   ( ) A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) ( ) This action is FINAL.   2b) (✓) This action is non-final.
3) ( ) An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) ( ) Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.
`
`
`
`
`DETAILED ACTION
`
`Notice of Pre-AIA or AIA Status
`
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 09/12/2022 has been considered and made of record by the examiner.
`
`Claim Interpretation
`
The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. — An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
`
`The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
`
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
`
`This application includes one or more claim limitations that do not use the word “means,” but are
`
`nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because
`
the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting
`
`sufficient structure to perform the recited function and the generic placeholder is not preceded by a
`
`
`
`
`structural modifier. Such claim limitation(s) is/are: “object detection unit that estimates” and “object
`
`recognition unit that recognizes” in claims 1 and 20. See FIGs. 14 and 26 for support.
`
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
`
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
`
`Claim Rejections - 35 USC § 101
`
`35 U.S.C. 101 reads as follows:
`
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
`
`Claim 20 is rejected under 35 USC 101 because the claimed invention is directed to non-statutory
`
`subject matter. Claim 20 claims the non-statutory subject matter of a computer program. Data structures
`
`not claimed as embodied in non-transitory computer-readable media are descriptive material per se and
`
`are not statutory because they are not capable of causing functional change in the computer. Therefore,
`
`since the claimed program is not tangibly embodied in a non-transitory physical medium and encoded on
`
a non-transitory computer-readable medium, the Applicants have not complied with 35 USC 101.
`
`Claim Rejections - 35 USC § 103
`
`
`
`
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
`
Claims 1-5, 7-8, and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over Barrois (US 2010/0310154 A1) in view of Kanokphan (JP2019191991A).

Consider claims 1, 19, and 20, Barrois discloses [Claim 19: An information processing method comprising causing] An information processing apparatus comprising (Abstract; information processing method; ¶3; camera system):

[Claim 20: A program that causes a computer to function (¶¶11-15; method carried out by computing device) as]
`
`
`
`
an object detection unit that estimates a presence area of an object and a number of objects on a basis of a captured image (Abstract; “A number and/or a location probability of at least one object, which is similar to the exemplary object, is determined in the image;” ¶3; “Object models are used to identify objects and to determine the three-dimensional location thereof”);

a signal processing unit that generates point cloud data from distance measurement information (¶8; “the point cloud is generated from two images by means of a stereo method”) acquired by a distance measuring sensor;

and an object recognition unit that recognizes the object by determining a point cloud, which is a target of clustering, in the point cloud data generated and a number of clusters on a basis of the presence area of the object and the number of the objects, which are estimated, and performing clustering on the point cloud data (¶8; “a clustering method is applied to the point cloud in order to identify points respectively belonging to one cluster, wherein model matching is subsequently carried out,” model matching is object recognition).
`
While Barrois discloses a signal processing unit that generates point cloud data from distance measurement information using a camera system that may record depth and a stereo method (Barrois ¶3), which commonly utilizes distance measurement information, he fails to specifically disclose a signal processing unit that generates point cloud data from distance measurement information acquired by a distance measuring sensor.

In related art, Kanokphan discloses a signal processing unit that generates point cloud data from distance measurement information acquired by a distance measuring sensor (Kanokphan ¶41; “the point cloud being acquired by a distance measurement sensor such as a LiDAR;” Kanokphan further discloses, “the target information estimation device 1 may be a smartphone equipped with the target information estimation program”).
`
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate the distance measurement sensor of Kanokphan into the information processing apparatus of Barrois to correctly estimate the orientation of an object using an acquired point cloud (Kanokphan ¶9). The method may be similarly applied to determine object direction (Kanokphan ¶113). Kanokphan further discloses, “a distance image generation means that projects object point data included in a target point cloud, which is a portion of the point cloud that corresponds to the object, onto a predetermined projection plane to generate a distance image related to the object point cloud” (Kanokphan ¶15).
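As a purely illustrative aside (not drawn from Barrois or Kanokphan), the combination described above can be pictured as follows: an image-based detector supplies estimated presence areas and an object count, and only the points inside those areas are clustered, with the cluster count set to the object count. The function name, the detection format, and the choice of k-means below are assumptions made for this sketch.

```python
import numpy as np
from sklearn.cluster import KMeans

def recognize_objects(cloud, detections):
    """Cluster only the points inside estimated presence areas, using the
    estimated number of objects as the number of clusters.

    cloud:      (N, 3) array of distance-sensor points (x, y, z).
    detections: list of dicts like {"area": (xmin, xmax, ymin, ymax)}, one per
                detected object, so len(detections) is the estimated count.
    Both structures are invented for illustration.
    """
    # Determine the point cloud that is the target of clustering:
    # keep only points falling inside at least one estimated presence area.
    mask = np.zeros(len(cloud), dtype=bool)
    for det in detections:
        xmin, xmax, ymin, ymax = det["area"]
        mask |= ((cloud[:, 0] >= xmin) & (cloud[:, 0] <= xmax) &
                 (cloud[:, 1] >= ymin) & (cloud[:, 1] <= ymax))
    target = cloud[mask]

    # Number of clusters = estimated number of objects.
    k = max(1, len(detections))
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(target)
    return target, labels
```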
`
Consider claim 2, Barrois, as modified by Kanokphan, discloses the claimed invention wherein the object detection unit further estimates a center position of the object (Kanokphan ¶120; “the "center position" of the object can be output not only as a two-dimensional position on a given xy plane, but also as the "center position" of the object in a three-dimensional xyz coordinate system for the point cloud”), and the object recognition unit further determines a starting point at which clustering starts on a basis of the center position of the object estimated (Barrois ¶14; “clustering method, the attention map is advantageously used to select suitable clusters for subsequent model matching ... attention map can advantageously be used to calculate an initial pose of the object model”; Kanokphan ¶¶56, 130).
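A minimal sketch of the kind of seeding described in claims 2 and 3 (assuming k-means; the array contents are stand-ins, not data from the references): the detector's estimated center positions serve as the starting points from which clustering begins.

```python
import numpy as np
from sklearn.cluster import KMeans

# Stand-in inputs: points already selected for clustering, plus the detector's
# estimated center position for each object (values invented for illustration).
points = np.random.rand(500, 3) * 10.0
est_centers = np.array([[2.0, 2.0, 0.5],
                        [7.5, 6.0, 0.6]])

# Start clustering from the estimated centers; the iteration then refines them.
km = KMeans(n_clusters=len(est_centers), init=est_centers, n_init=1).fit(points)
labels = km.labels_                    # cluster id per point
refined_centers = km.cluster_centers_  # recomputed object centers
```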
`
Consider claim 3, Barrois, as modified by Kanokphan, discloses the claimed invention wherein the object detection unit estimates a center position of the object using a method based on machine learning (Kanokphan FIG. 9, ¶104; R-CNN used in calculating orientation range; ¶131; “an object information estimation process that inputs an acquired point cloud and outputs the “type,” “orientation,” “circumscribing figure,” and “center position” of each object contained in the point cloud”), and the object recognition unit determines the starting point on a basis of the center position of the object estimated (Barrois ¶14; “clustering method, the attention map is advantageously used to select suitable clusters for subsequent model matching ... attention map can advantageously be used to calculate an initial pose of the object model”; Kanokphan ¶130).
`
`
`
`
Consider claim 4, Barrois, as modified by Kanokphan, discloses the claimed invention wherein the object detection unit estimates a center position of an object according to the method based on machine learning (Kanokphan FIG. 9, ¶104; R-CNN used in calculating orientation range; ¶131; “an object information estimation process that inputs an acquired point cloud and outputs the “type,” “orientation,” “circumscribing figure,” and “center position” of each object contained in the point cloud”).
`
Consider claim 5, Barrois, as modified by Kanokphan, discloses the claimed invention wherein the object detection unit estimates a center position of an object based on a movement space (Kanokphan ¶131; “The embodiment outlined above realizes an object information estimation process that inputs an acquired point cloud and outputs the “type,” “orientation,” “circumscribing figure,” and “center position” of each object contained in the point cloud”; ¶36; “a target orientation determination unit 117 that determines a "direction" related to the orientation of the target;” ¶¶31, 37).
`
Consider claim 7, Barrois, as modified by Kanokphan, discloses the claimed invention wherein the object detection unit estimates a number of the objects using a method based on machine learning (Barrois Abstract; “A number ... of at least one object ... is determined in the image using the attention map,”) and the object recognition unit determines the number of clusters on a basis of the number of the objects estimated (Kanokphan ¶101; “According to the above-described fast R-CNN classification model, target positions in the input range image can be individually detected as region candidates. Therefore, even when one object point cloud contains information on multiple objects, the object identifying unit 115a can detect and classify each object individually.” ¶56; “the sets of points belonging to each cluster may be treated as each partitioned target point cloud” [¶36; a "target point cloud" (target point group), which is a portion of the acquired "point cloud" that corresponds to the target]).
`
`
`
`
Consider claim 8, Barrois, as modified by Kanokphan, discloses the claimed invention wherein the object detection unit estimates a presence area of the objects using a method based on machine learning (Barrois Abstract; “A number and/or a location probability of at least one object, which is similar to the exemplary object, is determined in the image;” ¶3; “Object models are used to identify objects and to determine the three-dimensional location thereof”), and the object recognition unit determines a point cloud that is a target of the clustering on a basis of the presence area of the object estimated (Kanokphan ¶15; “a distance image generation means that projects object point data included in a target point cloud, which is a portion of the point cloud that corresponds to the object, onto a predetermined projection plane to generate a distance image related to the object point cloud; an object orientation range determination means that sets a plurality of orientation ranges that divide the orientations that the object can take, and determines the orientation range of the object included in the distance image from the generated distance image using a classifier that has been trained with the distance image and the correct orientation range of the object included in the distance image”).
`
Consider claim 18, Barrois, as modified by Kanokphan, discloses the claimed invention wherein the information processing apparatus is included in a mobile device (Kanokphan ¶45; “target information estimation device 1 may be a device or unit dedicated to target information estimation, but it may also be, for example, ... a smartphone equipped with the target information estimation program”).
`
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Barrois, in view of Kanokphan, as applied to claims 1-5, 7-8, and 18-20 above, and further in view of Maeda (US 2012/0002881 A1).
`
`
`
`
Consider claim 6, Barrois, as modified by Kanokphan, while disclosing wherein the object recognition unit performs the clustering using a hierarchical clustering method (Kanokphan ¶56; a kd-tree), fails to specifically disclose wherein the object recognition unit performs the clustering using a non-hierarchical clustering method.

In related art, Maeda discloses wherein the object recognition unit performs the clustering using a non-hierarchical clustering method (Maeda ¶78; k-means clustering method).
`
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate the non-hierarchical clustering method of Maeda into the teachings of Barrois, as modified by Kanokphan, to automatically generate clusters to sort objects (Maeda ¶78). In this method, a cluster feature that is representative of the cluster is automatically calculated (Maeda ¶78).
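For illustration only (the data and library choices are assumptions, not taken from Kanokphan or Maeda), the two families of clustering contrasted above can be sketched side by side: a hierarchical method builds a merge tree and cuts it into clusters, while a non-hierarchical method such as k-means partitions the points directly.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster  # hierarchical clustering
from sklearn.cluster import KMeans                      # non-hierarchical clustering

pts = np.random.rand(200, 3)   # stand-in point cloud
k = 3                          # assumed number of objects

# Hierarchical: agglomerate points into a dendrogram, then cut it at k clusters.
tree = linkage(pts, method="ward")
hierarchical_labels = fcluster(tree, t=k, criterion="maxclust")

# Non-hierarchical (k-means): iteratively re-estimate k centers and reassign points.
non_hierarchical_labels = KMeans(n_clusters=k, n_init=10).fit_predict(pts)
```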
`
Claims 9 and 11-13 are rejected under 35 U.S.C. 103 as being unpatentable over Barrois, in view of Kanokphan, as applied to claims 1-5, 7-8, and 18-20 above, and further in view of Liu (EP3121791B1).
`
Consider claim 9, Barrois, as modified by Kanokphan, fails to specifically disclose wherein the object detection unit determines a priority of the object on a basis of a type of the object or a position of the object.

In related art, Liu discloses wherein the object detection unit determines a priority of the object on a basis of a type of the object or a position of the object (Liu ¶20; “in a road scenario, nearby vehicles, pedestrian objects, or vehicle objects which are located right in front of a tracker, stably move, and continuously appear, may be regarded as the high-priority objects. Furthermore, for example, vehicle objects which are located at a middle or long distance and unstably move, shielded objects at a long distance, or objects that just appear in a screen may be regarded as non-high-priority objects.”; ¶28; “an analysis may be performed for a variation of at least one first feature of a position, a velocity, a distance, and an occupation region of the tracking object between two (or among three or more) continuous (or discontinuous) frames. If the variation is relatively small, it means that the position, the velocity, the distance and/or the occupation region of the tracking object is basically unchanged, namely stable; accordingly, the object may be initially selected as the high-priority object.”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate the determination of priority of the object on a basis of a type or position of the object of Liu into the teachings of Barrois, as modified by Kanokphan, to improve the technology of rapidly and accurately tracking a moving object, especially when multiple objects are in the frame (Liu ¶¶4, 5). Further, “in the present method, an accurate and stable tracking result can be obtained by determining the high-priority objects and the reliable high-priority objects” (Liu ¶21).
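A toy illustration of the kind of type- and position-based priority rule quoted from Liu (the thresholds, object categories, and data layout below are invented for this sketch, not taken from Liu):

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    kind: str            # e.g. "vehicle" or "pedestrian"
    distance_m: float    # range from the tracking sensor
    lateral_m: float     # offset from the tracker's forward axis
    stable: bool         # True if frame-to-frame feature variation is small

def is_high_priority(obj: TrackedObject,
                     near_m: float = 30.0,
                     front_halfwidth_m: float = 2.0) -> bool:
    """Nearby, stably moving vehicles/pedestrians roughly in front of the tracker
    are treated as high priority; distant, unstable, or peripheral objects are not."""
    relevant_kind = obj.kind in {"vehicle", "pedestrian"}
    in_front = abs(obj.lateral_m) <= front_halfwidth_m
    nearby = obj.distance_m <= near_m
    return relevant_kind and obj.stable and (nearby or in_front)

# Example: a close, stable pedestrian is prioritized; a distant, unstable car is not.
print(is_high_priority(TrackedObject("pedestrian", 12.0, 0.5, True)))   # True
print(is_high_priority(TrackedObject("vehicle", 80.0, 5.0, False)))     # False
```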
`
Consider claim 11, Barrois, as modified by Kanokphan and Liu, discloses the claimed invention wherein the object recognition unit excludes a point cloud in a presence area of an object separated by a predetermined distance or longer from a target of clustering (Barrois ¶12; “As a result of the model matching, points erroneously or falsely assigned to a cluster can be identified and eliminated. Likewise, points erroneously lying outside of the considered cluster, either isolated or in a different cluster, which points are called outliers, can be identified as belonging to the cluster under consideration, and the assignment can be corrected accordingly.”; Kanokphan ¶51; “(c) Furthermore, from the point cloud segments remaining in (a) above, point cloud segments whose distance to the detected “ground” is greater than or equal to a predetermined threshold are determined to be “smooth objects” and are excluded.”).
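Purely as an illustration of the exclusion recited in claim 11 (the data layout and the 50 m threshold are assumptions for this sketch, not values from the references):

```python
import numpy as np

def clustering_target(cloud, presence_areas, max_range_m=50.0):
    """Drop points belonging to presence areas farther than a predetermined
    distance before clustering. cloud is an (N, 3) array; presence_areas is a
    list of (xmin, xmax, ymin, ymax) boxes around detected objects."""
    keep = np.ones(len(cloud), dtype=bool)
    for xmin, xmax, ymin, ymax in presence_areas:
        center = np.array([(xmin + xmax) / 2.0, (ymin + ymax) / 2.0])
        if np.linalg.norm(center) > max_range_m:            # area is too far away
            in_area = ((cloud[:, 0] >= xmin) & (cloud[:, 0] <= xmax) &
                       (cloud[:, 1] >= ymin) & (cloud[:, 1] <= ymax))
            keep &= ~in_area                                 # exclude its points
    return cloud[keep]
```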
`
Consider claim 12, Barrois, as modified by Kanokphan and Liu, discloses the claimed invention wherein the object detection unit includes two or more detectors (Liu ¶4; camera, binocular camera) that detect objects with different priorities at different frequencies (Liu ¶77; “more system resources may be allocated to tracking for the reliable high-priority objects, compared to tracking for the objects to be tracked other than the reliable high-priority objects.”).

Consider claim 13, Barrois, as modified by Kanokphan and Liu, discloses the claimed invention wherein the object recognition unit recognizes an object with high priority at a higher frequency than a recognition frequency of an object with low priority (Liu ¶77; “more system resources may be allocated to tracking for the reliable high-priority objects, compared to tracking for the objects to be tracked other than the reliable high-priority objects.”).
`
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Barrois, in view of Kanokphan and Liu, as applied to claims 9 and 11-13 above, and further in view of Maeda.

Consider claim 10, Barrois, as modified by Kanokphan and Liu, fails to specifically disclose wherein the object recognition unit determines a point cloud that is a target of the clustering on a basis of a priority of the object.

In related art, Maeda discloses wherein the object recognition unit determines a point cloud that is a target of the clustering on a basis of a priority of the object (Maeda FIG. 29, #S1801a-END, ¶367; “an object priority evaluation unit evaluating an object priority for each object using a relative quantity of objects belonging to the relevant cluster along with the object”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate determining a point cloud that is a target of the clustering on a basis of a priority of the object of Maeda into the information processing method of Barrois, as modified by Kanokphan and Liu, to rank the objects/images by priority (Maeda FIG. 29). As stated in ¶3 of Maeda, “By ranking and displaying images accordingly, the user can more easily select a desired image by searching through highly-ranked images within the enormous quantity of images possessed by the user.”
`
`
`
`
Claims 14-16 are rejected under 35 U.S.C. 103 as being unpatentable over Barrois, in view of Kanokphan, as applied to claims 1-5, 7-8, and 18-20 above, and further in view of Fan (US 2019/0295266 A1).

Consider claim 14, while Barrois, as modified by Kanokphan, discloses the claimed invention wherein the object detection unit includes two or more detectors (Barrois ¶6; “stereoscopically recording two images of an area”) with different detection accuracies, and a comparison unit that compares detection results detected by the individual detectors (Barrois ¶6; “generating an attention map from at least one of the two images using the classifier”), and the comparison unit outputs any one of detection results of the individual detectors to the object recognition unit on a basis of a comparison result (Barrois ¶6; “determining at least one cluster of points in the point cloud by applying a clustering method to the plurality of points, the at least one cluster representing points belonging to the real object,”; matching).

Barrois, as modified by Kanokphan, fails to specifically disclose wherein the object detection unit includes two or more detectors with different detection accuracies.

In related art, Fan discloses wherein the object detection unit includes two or more detectors with different detection accuracies (Fan Abstract; 3D scanner; ¶¶33-37; LiDAR, Microsoft Kinect®).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate the detectors with different detection accuracies of Fan into the teachings of Barrois, as modified by Kanokphan, to register multiple point clouds together (Fan Abstract). As stated by Fan in ¶4, “In applications of multiple views combining several point clouds into a global consistent data set is typically required.”
`
Consider claim 15, Barrois, as modified by Kanokphan and Fan, discloses the claimed invention wherein in a case where the comparison result indicates mismatch (Barrois ¶15; false assignment), the object detection unit outputs a detection result of the detector with higher detection accuracy to the object recognition unit (Barrois FIG. 1, #PW', ¶15; “a feedback loop to the stereo method after correcting the false assignments”).

Consider claim 16, Barrois, as modified by Kanokphan and Fan, discloses the claimed invention wherein in a case where the comparison result indicates mismatch (Barrois ¶15; false assignment), the object recognition unit performs the clustering again on a basis of a detection result of the detector with higher detection accuracy (Barrois FIG. 1, #CL and CL', ¶15; “a feedback loop to the stereo method after correcting the false assignments”).
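As a rough sketch of the behavior recited in claims 14-16 (the detection format and the use of the object count as the comparison criterion are assumptions, not taken from Barrois or Fan): when two detectors of different accuracy disagree, the higher-accuracy result is passed on and clustering is redone.

```python
def select_detection(coarse, fine):
    """Compare the results of a lower-accuracy and a higher-accuracy detector.
    On mismatch, output the higher-accuracy (fine) result and flag re-clustering."""
    match = coarse["count"] == fine["count"]
    chosen = coarse if match else fine      # prefer the finer detector on mismatch
    recluster = not match                   # claim 16: perform clustering again
    return chosen, recluster

# Example: a fast camera-based detector vs. a slower, more accurate LiDAR detector.
coarse = {"count": 2, "areas": [(0, 1, 0, 1), (3, 4, 0, 1)]}
fine = {"count": 3, "areas": [(0, 1, 0, 1), (3, 4, 0, 1), (6, 7, 0, 1)]}
chosen, recluster = select_detection(coarse, fine)   # -> fine result, recluster True
```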
`
Claim 17 is rejected under 35 U.S.C. 103 as being unpatentable over Barrois, in view of Kanokphan and Fan, as applied to claim 14 above, and further in view of Liu.

Consider claim 17, Barrois, as modified by Kanokphan and Fan, while disclosing one or more detectors with different accuracies (Fan Abstract; 3D scanner; ¶¶33-37; LiDAR, Microsoft Kinect®), they fail to specifically disclose wherein a detection frequency of the detector with lower detection accuracy is higher than a detection frequency of the detector with higher detection accuracy.

In related art, Liu discloses wherein a detection frequency of the detector with lower detection accuracy is higher than a detection frequency of the detector with higher detection accuracy (Liu ¶77; “more system resources may be allocated to tracking for the reliable high-priority objects, compared to tracking for the objects to be tracked other than the reliable high-priority objects.”).
`
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate the system allocation method of Liu into the teachings of Barrois, as modified by Kanokphan and Fan, to allow system resources to be more efficiently allocated so that high-priority objects may be reliably tracked (Liu ¶61). Using a low-accuracy detector at a high frequency OR a high-accuracy detector at a low frequency is computationally efficient, as motivated in ¶61 of Liu: “For example, more system resources are allocated to tracking for the reliable high-priority objects, compared to tracking for the objects to be tracked other than the reliable high-priority objects; however, the present invention is not limited to such an example. In this way, the system resources can be rationally allocated, so that the reliable high-priority objects can be reliably tracked, and the tracking performance of other objects can be further optimized and improved based on tracking information of the reliable high-priority objects without increasing the system resources.”
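A toy scheduler illustrating the arrangement claim 17 recites (the frame rates and detector interfaces are invented for this sketch): the lower-accuracy detector runs every frame, while the higher-accuracy detector runs only every Nth frame.

```python
def run_detectors(frame_index, fast_detector, accurate_detector,
                  fast_every=1, accurate_every=5):
    """Run the low-accuracy detector at a higher frequency than the
    high-accuracy detector, so most frames cost only the cheap detection."""
    results = {}
    if frame_index % fast_every == 0:
        results["fast"] = fast_detector(frame_index)
    if frame_index % accurate_every == 0:
        results["accurate"] = accurate_detector(frame_index)
    return results

# Stand-in detectors that just report which frame they processed.
fast = lambda i: f"coarse detections @ frame {i}"
accurate = lambda i: f"precise detections @ frame {i}"
schedule = [run_detectors(i, fast, accurate) for i in range(10)]
# "accurate" fires only on frames 0 and 5; "fast" fires on every frame.
```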
`
`Relevant Prior Art
`
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Such art includes: Schmidt (3D Scene Segmentation and Object Tracking in Multiocular Image Sequences) and Gülağız (Comparison of Hierarchical and Non-Hierarchical Clustering Algorithms).
`
`Conclusion
`
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ASHLEY HYTREK whose telephone number is (703) 756-4562. The examiner can normally be reached M-F 7:30-4:30.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Edward Urban, can be reached on (571) 272-7899. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
`
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
`
`/ASHLEY L. HYTREK/
`Examiner, Art Unit 2665
`
`/EDWARD F URBAN/
`Supervisory Patent Examiner, Art Unit 2665
`
`