www.uspto.gov

UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS
P.O. Box 1450
Alexandria, Virginia 22313-1450

APPLICATION NO.: 17/053,224
FILING DATE: 11/05/2020
FIRST NAMED INVENTOR: Brian A. Ellenberger
ATTORNEY DOCKET NO.: 82015US003
CONFIRMATION NO.: 4226

3M INNOVATIVE PROPERTIES COMPANY
PO BOX 33427
ST. PAUL, MN 55133-3427

EXAMINER: SHIN, SEONG-AH A
ART UNIT: 2659
PAPER NUMBER:
NOTIFICATION DATE: 11/14/2023
DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding.

The time period for reply, if any, is set in the attached communication.

Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es):

LegalUSDocketing@mmm.com

PTOL-90A (Rev. 04/07)
Disposition of Claims*

[X] Claim(s) 1-20 is/are pending in the application.
    [ ] Of the above claim(s) ___ is/are withdrawn from consideration.
[ ] Claim(s) ___ is/are allowed.
[X] Claim(s) 1-20 is/are rejected.
[ ] Claim(s) ___ is/are objected to.
[ ] Claim(s) ___ are subject to restriction and/or election requirement.

* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.

Application Papers

[ ] The specification is objected to by the Examiner.
[ ] The drawing(s) filed on ___ is/are: a) [ ] accepted or b) [ ] objected to by the Examiner.
    Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
    Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).
Priority under 35 U.S.C. § 119

[ ] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
Certified copies:
    a) [ ] All    b) [ ] Some**    c) [X] None of the:
    1. [ ] Certified copies of the priority documents have been received.
    2. [ ] Certified copies of the priority documents have been received in Application No. ___.
    3. [ ] Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
** See the attached detailed Office action for a list of the certified copies not received.
Attachment(s)

1) [X] Notice of References Cited (PTO-892)
2) [ ] Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b), Paper No(s)/Mail Date ___
3) [ ] Interview Summary (PTO-413), Paper No(s)/Mail Date ___
4) [ ] Other: ___

U.S. Patent and Trademark Office
PTOL-326 (Rev. 11-13)    Office Action Summary    Part of Paper No./Mail Date 20231106
Office Action Summary

Application No.: 17/053,224
Applicant(s): Ellenberger et al.
Examiner: SEONG-AH A SHIN
Art Unit: 2659
AIA (FITF) Status: Yes
-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --

Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).

Status

1) [X] Responsive to communication(s) filed on 8/30/2023.
   [ ] A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) [X] This action is FINAL.
2b) [ ] This action is non-final.
3) [ ] An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) [ ] Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.
Application/Control Number: 17/053,224
Art Unit: 2659
Page 2

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

1. Claims 1-20 are pending in this application.
Response to Arguments

2. Regarding the rejection under 35 U.S.C. § 103: Applicant's amendment and arguments with respect to the rejections have been fully considered but are moot because the arguments do not apply to any of the references being used in the current rejection.
Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

    A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
3. Claims 1, 4, 7-10, 11, 14, and 17-20 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Jeong (US Pub. 2017/0256260) in view of Hayward et al. (US Pat. 11,783,422, provisional application filed on Apr. 3, 2018) and further in view of Baughman et al. (US Pub. 2018/0336459).
Regarding claim 1, Jeong discloses a method performed by at least one computer processor executing computer program instructions stored on at least one non-transitory computer readable medium to execute a method, the method comprising:

receiving batch data asynchronously from a [plurality of data sources, wherein data in at least one of the data sources in the plurality is in a different data format than data in the other of the data sources in the plurality]; performing NLP on the [normalized batch] data to produce batch NLP data ([0108]-[0117] "receive voice commands uttered by a plurality of users from a plurality of display devices corresponding to the respective users ... the control unit 170 may generate a training selection list ... and transmit the generated training selection list to the NLP server 500");

at a batch NLP module, generating a summarized NLP data model based on the batch NLP data in a first amount of time ([0108]-[0117] "receive, from the NLP server 500, a training result obtained by performing the natural language processing on the training selection list, and store the received training result in the NLP DB 177");

at a live NLP processor, after performing NLP on the batch data:

receiving first live data from a live data source ([0077]-[0084] Figs. 3 and 5, S111, S117, receiving text from a corresponding utterance from a user of the display device);

combining at least a first part of the summarized NLP data model with the first live data to produce first combined data ([0077]-[0084] Figs. 3 and 5, "check whether the text pattern has matched with a prestored voice recognition pattern so as to perform a function of the display device 100, corresponding to the text pattern. In an embodiment, the NLP DB 177 may store a corresponding relationship between functions of the display device 100 and voice recognition patterns corresponding thereto"); and

performing live NLP on the first combined data to produce first live NLP output in a second amount of time, [wherein the second amount of time is shorter than the first amount of time] ([0077]-[0084] Figs. 3 and 5, "if the text pattern matches with the prestored voice recognition pattern, the control unit 170 performs a function of the display device 100, corresponding to the matched voice recognition pattern (S115)").
Jeong does not explicitly teach, however, Hayward does explicitly teach, including the bracketed limitation:

receiving batch data asynchronously from a [plurality of data sources, wherein data in at least one of the data sources in the plurality is in a different data format than data in the other of the data sources in the plurality]; normalizing the batch data to produce normalized batch data; performing NLP on the [normalized batch] data to produce batch NLP data (Col. 8, lines 3-42, historical data 108 may include a plurality (e.g., thousands or millions) of electronic documents, parameters, and/or other information; Col. 19, line 31 to Col. 21, line 18, "artificial neural network (ANN) 300 which may be trained by ANN unit 150 of FIG. 1 or ANN training application 264 of FIG. 2 ... Multiple different types of ANNs may be employed, including without limitation, recurrent neural networks, convolutional neural networks, and deep learning neural networks ... wherein input data may be normalized by mean centering, to determine loss and quantify the accuracy of outputs").
Therefore, it would have been obvious to one of ordinary skill before the effective filing date of the claimed invention to incorporate the method of performing a function of the display device corresponding to the voice command as taught by Jeong with the method of normalizing data as taught by Hayward, to improve upon the user input handling experience by processing claims using machine learning techniques (Hayward, Col. 2, lines 13-15).
Jeong in view of Hayward does not explicitly teach, however, Baughman does explicitly teach, including the bracketed limitation:

performing live NLP on the first combined data to produce first live NLP output in a second amount of time, [wherein the second amount of time is shorter than the first amount of time] ([0035]-[0046] "providing structured data for storage in live event data area 2122 and comment data area 2124. Manager system 110, running preparation and maintenance process 111 can periodically replicate data of live event data area 2122 into historical event data area 2121. Manager system 110 can structure records of live event data area 2122 and can structure data records of historical event data area 2121. As part of structuring data for use by historical event data area 2121, manager system 110 can vectorize records ... training a neural network 300 can include use of historical event data and can perform training using data of live event data area 2122").

Therefore, it would have been obvious to one of ordinary skill before the effective filing date of the claimed invention to incorporate the method of performing a function of the display device corresponding to the voice command as taught by Jeong in view of Hayward with the method of training data in real-time with live event data as taught by Baughman, to avoid defining reactively after a problem is experienced and being tailored to address the last experienced problem instead of future problems that may arise with the data records (Baughman, [0003]).
Regarding claim 4, Jeong in view of Hayward and further in view of Baughman discloses the method of claim 1, and Baughman further discloses:

wherein the combining comprises identifying portions of the summarized NLP data model that are relevant to the live data, and combining the identified portions of the summarized NLP data model with the live data ([0035]-[0046] "Training a neural network 300 can include use of historical event data and can perform training using data of live event data area 2122").
Regarding claim 7, Jeong in view of Hayward and further in view of Baughman discloses the method of claim 1, and Baughman further discloses:

wherein the live data includes data representing human speech, and wherein the live NLP output comprises text representing the human speech ([0057]-[0059] receiving speech and transforming it to text by a speech-to-text server (STT server) 300 and a natural language server (NLP server) 500).
Regarding claim 8, Jeong in view of Hayward and further in view of Baughman discloses the method of claim 1, and Jeong further discloses:

wherein the live data includes data representing human speech, and wherein the live NLP output comprises structured data including text representing the human speech and data representing concepts corresponding to the human speech ([0088]-[0091] the NLP server may analyze the intention of the user with respect to the text pattern through morpheme analysis, syntax analysis, speech act analysis, and dialog processing analysis).
Regarding claim 9, Jeong in view of Hayward and further in view of Baughman discloses the method of claim 1, and Jeong further discloses:

at the live NLP processor, after performing NLP on the batch data: receiving second live data from the live data source ([0035]-[0046] "providing structured data for storage in live event data area 2122 and comment data area 2124. Manager system 110, running preparation and maintenance process 111 can periodically replicate data of live event data area 2122 into historical event data area 2121. Manager system 110 can structure records of live event data area 2122 and can structure data records of historical event data area 2121. As part of structuring data for use by historical event data area 2121, manager system 110 can vectorize records"; [0100] S125, the voice recognition unit 171 of the display device 100 checks whether the voice command received in step S101 has been again received); and

combining at least a second part of the summarized NLP data model with the second live data to produce second combined data ([0100] performing a function of the display device 100, corresponding to the matched voice recognition pattern).

Jeong does not explicitly teach, however, Baughman does explicitly teach:

performing live NLP on the second combined data to produce second live NLP output in a third amount of time, wherein the third amount of time is shorter than the first amount of time ([0035]-[0046] "Manager system 110, running preparation and maintenance process 111 can periodically replicate data of live event data area 2122 into historical event data area 2121 ... Training a neural network 300 can include use of historical event data and can perform training using data of live event data area 2122").
Regarding claim 10, Jeong in view of Hayward and further in view of Baughman discloses the method of claim 1, and Jeong further discloses:

wherein performing NLP on the first combined data comprises performing live NLP on a first portion of the first live data in the first combined data at a first time, and performing live NLP again on the first portion of the first live data in the first combined data at a second time in response to determining that the first live data have changed since the first time ([0035]-[0046] "Manager system 110, running preparation and maintenance process 111 can periodically replicate data of live event data area 2122 into historical event data area 2121 ... Training a neural network 300 can include use of historical event data and can perform training using data of live event data area 2122").
Regarding claims 11, 14, and 17-20: Claims 11, 14, and 17-20 are the corresponding system claims to method claims 1, 4, and 7-10. Therefore, claims 11, 14, and 17-20 are rejected using the same rationale as applied to claims 1, 4, and 7-10 above.
4. Claims 2, 3, 5, 6, 12, 13, 15, and 16 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Jeong (US Pub. 2017/0256260) in view of Hayward et al. (US Pat. 11,783,422, provisional application filed on Apr. 3, 2018), further in view of Baughman et al. (US Pub. 2018/0336459), and further in view of Relangi et al. (US Pub. 2020/0372219).
Regarding claim 2, Jeong in view of Hayward and further in view of Baughman discloses the method of claim 1.

Jeong in view of Hayward and further in view of Baughman does not explicitly teach, however, Relangi does explicitly teach:

wherein the second amount of time is at least ten times shorter than the first amount of time ([0098] "generate tasks/recommendations based on the machine learning models, which may be updated at regular intervals (e.g., monthly) and based on real-time events and conditions").

Therefore, it would have been obvious to one of ordinary skill before the effective filing date of the claimed invention to incorporate the method of performing a function of the display device corresponding to the voice command as taught by Jeong in view of Hayward and further in view of Baughman with the method of training systems for labeling natural languages quickly, to improve systems for training a chatbot, or other natural language system, to respond to a broad variety of queries and adapt to new queries not yet labeled in training data (Relangi, [0006], [0087]).
Regarding claim 3, Jeong in view of Hayward and further in view of Baughman discloses the method of claim 1.

Jeong in view of Hayward and further in view of Baughman does not explicitly teach, however, Relangi does explicitly teach:

wherein the second amount of time is at least one hundred times shorter than the first amount of time ([0098] "generate tasks/recommendations based on the machine learning models, which may be updated at regular intervals (e.g., monthly) and based on real-time events and conditions").
Regarding claim 5, Jeong in view of Hayward and further in view of Baughman discloses the method of claim 1.

Jeong in view of Hayward and further in view of Baughman does not explicitly teach, however, Relangi does explicitly teach:

wherein performing live NLP on the first combined data is performed in less than 5 seconds ([0051] performing the process and generating a result within 5 seconds by the NLP device).
Regarding claim 6, Jeong in view of Hayward and further in view of Baughman discloses the method of claim 1.

Jeong in view of Baughman does not explicitly teach, however, Relangi does explicitly teach:

wherein performing live NLP on the first combined data is performed in less than 1 second ([0051] performing the process and generating a result within a predetermined time by the NLP device).
Regarding claims 12, 13, 15, and 16: Claims 12, 13, 15, and 16 are the corresponding system claims to method claims 2, 3, 5, and 6. Therefore, claims 12, 13, 15, and 16 are rejected using the same rationale as applied to claims 2, 3, 5, and 6 above.
Conclusion

5. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Please see attached form PTO-892.

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SEONG-AH A. SHIN, whose telephone number is (571) 272-5933. The examiner can normally be reached 9 AM-3 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Pierre-Louis Desir, can be reached at 571-272-7799. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).

If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
Seong-ah A. Shin
Primary Examiner
Art Unit 2659

/SEONG-AH A SHIN/
Primary Examiner, Art Unit 2659