UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS
P.O. Box 1450
Alexandria, Virginia 22313-1450

APPLICATION NO.: 17/053,224
FILING DATE: 11/05/2020
FIRST NAMED INVENTOR: Brian A. Ellenberger
ATTORNEY DOCKET NO.: 82015US003
CONFIRMATION NO.: 4226

3M INNOVATIVE PROPERTIES COMPANY
PO BOX 33427
ST. PAUL, MN 55133-3427

EXAMINER: SHIN, SEONG-AH A
ART UNIT: 2659
NOTIFICATION DATE: 05/30/2023
DELIVERY MODE: ELECTRONIC
Please find below and/or attached an Office communication concerning this application or proceeding.

The time period for reply, if any, is set in the attached communication.

Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es):

LegalUSDocketing@mmm.com

PTOL-90A (Rev. 04/07)
Disposition of Claims*

5) [X] Claim(s) 1-20 is/are pending in the application.
    5a) Of the above claim(s) ___ is/are withdrawn from consideration.
6) [ ] Claim(s) ___ is/are allowed.
7) [X] Claim(s) 1-20 is/are rejected.
8) [X] Claim(s) 20 is/are objected to.
9) [ ] Claim(s) ___ are subject to restriction and/or election requirement.

* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.
Application Papers

10) [ ] The specification is objected to by the Examiner.
11) [ ] The drawing(s) filed on ___ is/are: a) [ ] accepted or b) [ ] objected to by the Examiner.
    Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
    Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).
Priority under 35 U.S.C. § 119

12) [ ] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
    Certified copies:
    a) [ ] All    b) [ ] Some**    c) [ ] None of the:
        1. [ ] Certified copies of the priority documents have been received.
        2. [ ] Certified copies of the priority documents have been received in Application No. ___.
        3. [ ] Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
** See the attached detailed Office action for a list of the certified copies not received.
Attachment(s)

1) [X] Notice of References Cited (PTO-892)
2) [ ] Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b), Paper No(s)/Mail Date ___
3) [ ] Interview Summary (PTO-413), Paper No(s)/Mail Date ___
4) [ ] Other: ___

U.S. Patent and Trademark Office    PTOL-326 (Rev. 11-13)    Office Action Summary    Part of Paper No./Mail Date 20230522
Office Action Summary

Application No.: 17/053,224
Applicant(s): Ellenberger et al.
Examiner: SEONG-AH A SHIN
Art Unit: 2659
AIA (FITF) Status: Yes
-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --

Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).
Status

1) [X] Responsive to communication(s) filed on 11/05/2020.
    [ ] A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) [ ] This action is FINAL.    2b) [X] This action is non-final.
3) [ ] An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) [ ] Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

1. Claims 1-20 are pending in this application.
Claim Objections

2. Claim 20 is objected to because of the following informalities:

In claim 20, "method" should be changed to --system--, consistent with the term used in parent claim 11.
Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

    A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
3. Claims 1, 4, 7-10, 11, 14, and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Jeong (US Pub. 2017/0256260) in view of Baughman et al. (US Pub. 2018/0336459).
Regarding claim 1, Jeong discloses a method performed by at least one computer processor executing computer program instructions stored on at least one non-transitory computer readable medium to execute a method, the method comprising:

receiving batch data asynchronously from a first data source; performing NLP on the batch data to produce batch NLP data ([0108]-[0117] "receive voice commands uttered by a plurality of users from a plurality of display devices corresponding to the respective users ... the control unit 170 may generate a training selection list ... and transmit the generated training selection list to the NLP server 500");

at a batch NLP module, generating a summarized NLP data model based on the batch NLP data in a first amount of time ([0108]-[0117] "receive, from the NLP server 500, a training result obtained by performing the natural language processing on the training selection list, and store the received training result in the NLP DB 177");

at a live NLP processor, after performing NLP on the batch data:

receiving first live data from a live data source ([0077]-[0084] Figs. 3 and 5, S111, S117, receiving text from a corresponding utterance from a user of the display device);

combining at least a first part of the summarized NLP data model with the first live data to produce first combined data ([0077]-[0084] Figs. 3 and 5, "check whether the text pattern has matched with a prestored voice recognition pattern so as to perform a function of the display device 100, corresponding to the text pattern. In an embodiment, the NLP DB 177 may store a corresponding relationship between functions of the display device 100 and voice recognition patterns corresponding thereto"); and

performing live NLP on the first combined data to produce first live NLP output in a second amount of time, [wherein the second amount of time is shorter than the first amount of time] ([0077]-[0084] Figs. 3 and 5, "if the text pattern matches with the prestored voice recognition pattern, the control unit 170 performs a function of the display device 100, corresponding to the matched voice recognition pattern (S115)").
Jeong does not explicitly teach the bracketed limitation; however, Baughman does explicitly teach it:

performing live NLP on the first combined data to produce first live NLP output in a second amount of time, [wherein the second amount of time is shorter than the first amount of time] ([0035]-[0046] "providing structured data for storage in live event data area 2122 and comment data area 2124. Manager system 110, running preparation and maintenance process 111 can periodically replicate data of live event data area 2122 into historical event data area 2121. Manager system 110 can structure records of live event data area 2122 and can structure data records of historical event data area 2121. As part of structuring data for use by historical event data area 2121, manager system 110 can vectorize records ... training a neural network 300 can include use of historical event data and can perform training using data of live event data area 2122").
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the method of performing a function of the display device corresponding to the voice command as taught by Jeong with the method of training data in real time with live event data as taught by Baughman, to avoid defining models reactively after a problem is experienced and tailoring them to the last experienced problem instead of to future problems that may arise with the data records (Baughman, [0003]).
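For orientation only, the two-stage pipeline recited in claim 1 (slow batch NLP that yields a summarized model, followed by faster live NLP that combines part of that model with incoming live data) can be sketched as follows. This is a minimal illustrative sketch, not the applicant's or the cited references' implementation; the function names (perform_nlp, build_summary_model, live_nlp) and the toy frequency-count "model" are hypothetical.

```python
import time

def perform_nlp(records):
    """Toy stand-in for an NLP pass: tokenize and count token frequencies."""
    counts = {}
    for text in records:
        for token in text.lower().split():
            counts[token] = counts.get(token, 0) + 1
    return counts

def build_summary_model(batch_data):
    """Batch stage: NLP over the whole corpus, summarized into a compact model."""
    batch_nlp_data = perform_nlp(batch_data)
    top = sorted(batch_nlp_data, key=batch_nlp_data.get, reverse=True)[:100]
    return {token: batch_nlp_data[token] for token in top}

def live_nlp(summary_model, utterance):
    """Live stage: combine only the relevant part of the summarized model with
    the live data, then score that small combination (hence the shorter time)."""
    tokens = utterance.lower().split()
    combined = {t: summary_model[t] for t in tokens if t in summary_model}
    return sum(combined.values())

batch = ["turn on the tv", "turn up the volume", "mute the tv"]

t0 = time.perf_counter()
model = build_summary_model(batch)         # first amount of time (batch)
t1 = time.perf_counter()
score = live_nlp(model, "turn on the tv")  # second, shorter amount of time (live)
t2 = time.perf_counter()

print(f"batch: {t1 - t0:.6f}s  live: {t2 - t1:.6f}s  score: {score}")
```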
Regarding claim 4, Jeong in view of Baughman discloses the method of claim 1, and Baughman further discloses:

wherein the combining comprises identifying portions of the summarized NLP data model that are relevant to the live data, and combining the identified portions of the summarized NLP data model with the live data ([0035]-[0046] "Training a neural network 300 can include use of historical event data and can perform training using data of live event data area 2122").
Regarding claim 7, Jeong in view of Baughman discloses the method of claim 1, and Baughman further discloses:

wherein the live data includes data representing human speech, and wherein the live NLP output comprises text representing the human speech ([0057]-[0059] receiving speech and transforming it to text by a speech-to-text server (STT server) 300 and a natural language server (NLP server) 500).
Regarding claim 8, Jeong in view of Baughman discloses the method of claim 1, and Jeong further discloses:

wherein the live data includes data representing human speech, and wherein the live NLP output comprises structured data including text representing the human speech and data representing concepts corresponding to the human speech ([0088]-[0091] the NLP server may analyze the intention of the user with respect to the text pattern through morpheme analysis, syntax analysis, speech act analysis, and dialog processing analysis).
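To make the claim 8 limitation concrete: a structured live-NLP output of this kind pairs the recognized text with concept annotations derived from analyses such as those Jeong lists. The record below is a hypothetical illustration; the field names and values are assumptions, not data from either reference.

```python
# Hypothetical structured live-NLP output: recognized text plus concepts.
live_nlp_output = {
    "text": "turn up the volume",   # text representing the human speech
    "concepts": {                   # data representing corresponding concepts
        "intent": "adjust_setting",  # e.g., from speech-act/dialog analysis
        "target": "volume",          # e.g., from morpheme/syntax analysis
        "direction": "increase",
    },
}
print(live_nlp_output["concepts"]["intent"])  # -> adjust_setting
```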
Regarding claim 9, Jeong in view of Baughman discloses the method of claim 1, and Jeong further discloses:
at the live NLP processor, after performing NLP on the batch data: receiving second live data from the live data source ([0035]-[0046] "providing structured data for storage in live event data area 2122 and comment data area 2124. Manager system 110, running preparation and maintenance process 111 can periodically replicate data of live event data area 2122 into historical event data area 2121. Manager system 110 can structure records of live event data area 2122 and can structure data records of historical event data area 2121. As part of structuring data for use by historical event data area 2121, manager system 110 can vectorize records"; [0100] S125, the voice recognition unit 171 of the display device 100 checks whether the voice command received in step S101 has been received again); and

combining at least a second part of the summarized NLP data model with the second live data to produce second combined data ([0100] performing a function of the display device 100, corresponding to the matched voice recognition pattern).
Jeong does not explicitly teach the following limitation; however, Baughman does explicitly teach it:

performing live NLP on the second combined data to produce second live NLP output in a third amount of time, wherein the third amount of time is shorter than the first amount of time ([0035]-[0046] "Manager system 110, running preparation and maintenance process 111 can periodically replicate data of live event data area 2122 into historical event data area 2121 ... Training a neural network 300 can include use of historical event data and can perform training using data of live event data area 2122").
Regarding claim 10, Jeong in view of Baughman discloses the method of claim 1, and Jeong further discloses:
wherein performing NLP on the first combined data comprises performing live NLP on a first portion of the first live data in the first combined data at a first time, and performing live NLP again on the first portion of the first live data in the first combined data at a second time in response to determining that the first live data have changed since the first time ([0035]-[0046] "Manager system 110, running preparation and maintenance process 111 can periodically replicate data of live event data area 2122 into historical event data area 2121 ... Training a neural network 300 can include use of historical event data and can perform training using data of live event data area 2122").
Regarding claims 11, 14, and 17-20: Claims 11, 14, and 17-20 are the corresponding system claims to method claims 1, 4, and 7-10. Therefore, claims 11, 14, and 17-20 are rejected using the same rationale as applied to claims 1, 4, and 7-10 above.
4. Claims 2, 3, 5, 6, 12, 13, 15, and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Jeong (US Pub. 2017/0256260) in view of Baughman et al. (US Pub. 2018/0336459), and further in view of Relangi et al. (US Pub. 2020/0372219).
Regarding claim 2, Jeong in view of Baughman discloses the method of claim 1. Jeong in view of Baughman does not explicitly teach, however Relangi does explicitly teach:

wherein the second amount of time is at least ten times shorter than the first amount of time ([0098] "generate tasks/recommendations based on the machine learning models, which may be updated at regular intervals (e.g., monthly) and based on real-time events and conditions").

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the method of performing a function of the display device corresponding to the voice command as taught by Jeong in view of Baughman with the method of training systems for labeling natural language quickly, to improve systems for training a chatbot, or other natural language system, to respond to a broad variety of queries and adapt to new queries not yet labeled in training data (Relangi, [0006], [0087]).
Regarding claim 3, Jeong in view of Baughman discloses the method of claim 1. Jeong in view of Baughman does not explicitly teach, however Relangi does explicitly teach:

wherein the second amount of time is at least one hundred times shorter than the first amount of time ([0098] "generate tasks/recommendations based on the machine learning models, which may be updated at regular intervals (e.g., monthly) and based on real-time events and conditions").
Regarding claim 5, Jeong in view of Baughman discloses the method of claim 1. Jeong in view of Baughman does not explicitly teach, however Relangi does explicitly teach:

wherein performing live NLP on the first combined data is performed in less than 5 seconds ([0051] performing the process and generating the result within 5 seconds by the NLP device).
Regarding claim 6, Jeong in view of Baughman discloses the method of claim 1.
Jeong in view of Baughman does not explicitly teach, however Relangi does explicitly teach:

wherein performing live NLP on the first combined data is performed in less than 1 second ([0051] performing the process and generating the result within a predetermined time by the NLP device).
Regarding claims 12, 13, 15, and 16: Claims 12, 13, 15, and 16 are the corresponding system claims to method claims 2, 3, 5, and 6. Therefore, claims 12, 13, 15, and 16 are rejected using the same rationale as applied to claims 2, 3, 5, and 6 above.
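As a quick aid for checking the quantitative timing limitations at issue in claims 2-3 and 5-6, the relations can be expressed directly. The sketch below uses assumed example measurements (first_time, second_time); these values are hypothetical and do not come from the application or the cited references.

```python
# Hypothetical timing check for the limitations of claims 2, 3, 5, and 6,
# with assumed batch ("first") and live ("second") times in seconds.
first_time, second_time = 3600.0, 2.5  # example values only

assert second_time <= first_time / 10   # claim 2: at least ten times shorter
assert second_time <= first_time / 100  # claim 3: at least one hundred times shorter
assert second_time < 5.0                # claim 5: live NLP in under 5 seconds
print(second_time < 1.0)                # claim 6 (under 1 second): False here
```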
Conclusion

5. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Please see attached form PTO-892.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SEONG-AH A. SHIN, whose telephone number is (571) 272-5933. The examiner can normally be reached 9 AM-3 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Pierre-Louis Desir, can be reached at 571-272-7799. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
Seong-ah A. Shin
Primary Examiner
Art Unit 2659

/SEONG-AH A SHIN/
Primary Examiner, Art Unit 2659