UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS
P.O. Box 1450
Alexandria, Virginia 22313-1450

APPLICATION NO.: 17/426,106
FILING DATE: 07/28/2021
FIRST NAMED INVENTOR: Masatoshi KOBAYASHI
ATTORNEY DOCKET NO.: 16199US01
CONFIRMATION NO.: 9394

Xsensus / Sony
100 Daingerfield Road, Suite 402
Alexandria, VA 22314

EXAMINER: MEINECKE DIAZ, SUSANNA M
ART UNIT: 3625
NOTIFICATION DATE: 11/27/2024
DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding.

The time period for reply, if any, is set in the attached communication.

Notice of the Office communication was sent electronically on above-indicated "Notification Date" to the following e-mail address(es):

Xdocket@XSensus.com
Xsensuspat@XSensus.com
anaquadocketing@Xsensus.com

PTOL-90A (Rev. 04/07)

Application No.: 17/426,106
Applicant(s): KOBAYASHI et al.
Examiner: SUSANNA M DIAZ
Art Unit: 3625
AIA (FITF) Status: Yes

Office Action Summary

-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --

Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).

Status

1) [X] Responsive to communication(s) filed on 11/18/2024.
   [ ] A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) [X] This action is FINAL.    2b) [ ] This action is non-final.
3) [ ] An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) [ ] Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.

Disposition of Claims*

5) [X] Claim(s) 1, 4-13 and 15-20 is/are pending in the application.
   5a) Of the above claim(s) ___ is/are withdrawn from consideration.
6) [ ] Claim(s) ___ is/are allowed.
7) [X] Claim(s) 1, 4-13 and 15-20 is/are rejected.
8) [ ] Claim(s) ___ is/are objected to.
9) [ ] Claim(s) ___ are subject to restriction and/or election requirement.

* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.

Application Papers

10) [ ] The specification is objected to by the Examiner.
11) [X] The drawing(s) filed on 7/28/2021 is/are: a) [X] accepted or b) [ ] objected to by the Examiner.
    Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
    Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).

Priority under 35 U.S.C. § 119

12) [X] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
    Certified copies:
    a) [X] All    b) [ ] Some**    c) [ ] None of the:
    1. [X] Certified copies of the priority documents have been received.
    2. [ ] Certified copies of the priority documents have been received in Application No. ___.
    3. [ ] Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
** See the attached detailed Office action for a list of the certified copies not received.

Attachment(s)

1) [ ] Notice of References Cited (PTO-892)
2) [ ] Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b), Paper No(s)/Mail Date ___
3) [ ] Interview Summary (PTO-413), Paper No(s)/Mail Date ___
4) [ ] Other: ___

U.S. Patent and Trademark Office
PTOL-326 (Rev. 11-13)    Office Action Summary    Part of Paper No./Mail Date 20241122

Application/Control Number: 17/426,106
Art Unit: 3625
Page 2

DETAILED ACTION

1. This final Office action is responsive to Applicant's amendment filed November 18, 2024. Claims 1 and 19-20 have been amended. Claims 2-3 and 14 are canceled. Claims 1, 4-13, and 15-20 are presented for examination.

Notice of Pre-AIA or AIA Status

2. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

3. Applicant's arguments filed November 18, 2024 have been fully considered but they are not persuasive.

Regarding the rejection under 35 U.S.C. § 101, "Applicant respectfully submits, the claims are not directed to purely mental steps as the claims recite the hardware components of circuitry and a display." (Page 8 of Applicant's response) As explained in the rejection, most of the recited claim limitations fall under the scope of at least one abstract idea. The hardware components are only generally applied to the abstract ideas and, thus, do not integrate the abstract ideas into a practical application or provide an inventive concept.

Applicant submits that the claims present limitations that are not well-understood, routine, and conventional (page 9 of Applicant's response); however, as support for this assertion, Applicant points to the amended language of claim 1, most of the details of which describe the various abstract ideas referenced in the rejection. Applicant does not show how the additional elements perform activity that is not well-understood, routine, and conventional.

On page 13 of the response, Applicant states, "Like Amdocs, Bascom, and Example 35, the present claims have a combination of steps that operate in a non-conventional and non-generic manner... Applicants thus respectfully submit that it is unambiguous, when reading the claims as a whole, that the claims involve a very specific technological environment. Upon a review of Applicants' claims, it is clear that the elements therein, when considered at least in combination, recite an inventive concept that is clearly 'significantly more' than any alleged abstract idea." Applicant cites most of the limitations of claim 1, but Applicant does not explicitly address how the additional elements are integrated in a non-conventional and non-generic manner, for example. Applicant has also failed to specifically address the Examiner's analysis presented in the rejection.

Regarding the art rejection, on page 15 of Applicant's response, Applicant argues the following:

The Examiner points out that content is targeted to a user based on the user's profile and the user's profile includes attributes of the user's emotion type. As explained in the rejection, a sequence of media types may be presented as content items targeted to a user (Hawthorne: ¶ 46), which implies that a combination of the available content items may be correlated to the user's profile and current emotional state. Hawthorne addresses the recited content items (i.e., a text, a still image, a video, a sound including a voice and music) as follows: "Here, each content item can be, but is not limited to, a media type of a (displayed or spoken) text (for a non-limiting example, an article, a quote, a personal story, or a book passage), a (still or moving) image, a video clip, an audio clip (for a non-limiting example, a piece of music or sounds from nature), and other types of content items from which a user can learn information or be emotionally impacted." (Hawthorne: ¶ 40) The user's profile is refined based on feedback provided by the user in response to content presented to him/her based on his/her emotional state at the time, as seen in ¶¶ 60 and 62 of Hawthorne:

[0060] In the example of FIG. 6, the flowchart 600 continues to block 608 where the retrieved content is customized based on the profile and/or the current emotional state of the user. Such customization reflects the user's preference as to what kind of content items he/she would like to be included in the content to fit his/her emotional state at the time, as well as how each of the items in the content is preferred to be presented to him/her....

[0062] In the example of FIG. 6, the flowchart 600 may optionally continue to block 612 where the user is enabled to provide feedback by rating and commenting on the content presented. Such feedback will then be used to update the profile of the user in order to make future content customization more accurate.

The cyclical manner in which feedback regarding content and one's emotional state may be used to update the user's profile (correlating appropriate content to emotional state) for improved accuracy of content customization based on the user's emotional state assists in estimating (e.g., refining and/or identifying) a user's emotion type (e.g., a user's profile, including emotion information) based on the various types of content (which may include a text, a still image, a video, and a sound including a voice and music). At the very least, Hawthorne discloses identifying a correlation between a user's emotion type and content, and Hawthorne suggests identifying a correlation between a user's emotion type and various types of respective content, since Hawthorne explains that available content may include a text, a still image, a video, and a sound including a voice and music (as discussed above). Applicant's Specification references the ability to input a combination of the various content items into an emotion calculation device, but the corresponding analysis is described more as identifying correlations between content items and emotion type (as seen in ¶¶ 111-112 and 121-144 of Applicant's Specification), which is commensurate in scope with Hawthorne's explicitly disclosed and suggested correlations between a user's emotion type and content items (which may include a text, a still image, a video, and a sound including a voice and music).

Claim Rejections - 35 USC § 101

4. 35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

5. Claims 1, 4-13, and 15-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter.

Claims 1, 4-13, and 15-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claimed invention is directed to determining a user's emotion type and presenting information based on the emotion type without significantly more.

Step 1: Statutory Category? Yes — The claims fall within at least one of the four categories of patent-eligible subject matter: Apparatus (claims 1, 4-13, 15-18), Process (claim 19), Article of Manufacture (claim 20).

Step 2A — Prong 1: Judicial Exception Recited? Yes — Aside from the additional elements identified in Step 2A — Prong 2 below, the claims perform the following:

[Claims 1, 19, 20] estimate a user's emotion type based on a combination of a text, a still image, a video, and a sound including a voice and music; acquire first content information regarding first content, wherein the first content is any of a product, a text, a still image, a video, a sound, and a combination of the product, the text, the still image, and the video; and calculate a matching frequency for the first content information for each of a plurality of emotion types of users that are associated with the users based on their preferences and characteristics, wherein the emotion types of the users include a conservative emotion and a unique emotion type; visualize and display matching information for comparing the matching frequency between the emotion types in a first display area, on a display; and estimate a category of the emotion type of the user based on the first content.

[Claim 4] display the emotion type of which the matching frequency is highest, as an optimal emotion type, in the first display area in close proximity to the matching information.

[Claim 5] wherein when the emotion type and the optimal emotion type included in the matching information are selected, display detailed information of the selected emotion type or optimal emotion type.

[Claim 6] acquire sense-of-values information of the user.

[Claim 7] estimate a category of the emotion type of the user based on the sense-of-values information.

[Claim 8] acquire at least one second content information regarding a second content different from the first content generated based on the first content information, and calculate a matching frequency for the second content information, for each of the plurality of the emotion types.

[Claim 9] display the matching frequency of the first content information in the first display area, and display the matching frequency of the second content information in a second display area close to the first display area.

[Claim 10] wherein when the first content is text, calculate a touching level indicating a level of the text touching a mind of the user.

[Claim 11] visualize and display the touching level.

[Claim 12] present the text to the user belonging to the emotion type according to an emotion value of the text based on the touching level.

[Claim 13] present optimal content that is optimal to the user based on sense-of-values information of the user.

[Claim 15] wherein when the touching level displayed on the display is selected by the user, score and display a degree at which each of words related to a plurality of predetermined genres is included in the text and an appearance frequency of the word.

[Claim 16] detect a timing for updating the emotion type to which the user is classified, based on the sense-of-values information.

[Claim 17] calculate a compatibility level between the emotion types.

[Claim 18] wherein when the first content is the product, acquire the sense-of-values information of the user for the product for each of the emotion types in a time-series manner, and display a temporal change of the sense-of-values information for the product for each of the emotion types.

Aside from the additional elements, the aforementioned claim details exemplify the abstract idea(s) of a mental process (since the details include concepts performed in the human mind, including an observation, evaluation, judgment, and/or opinion). As explained in MPEP § 2106.04(a)(1)(III), "[t]he courts consider a mental process (thinking) that 'can be performed in the human mind, or by a human using a pen and paper' to be an abstract idea. CyberSource Corp. v. Retail Decisions, Inc., 654 F.3d 1366, 1372, 99 USPQ2d 1690, 1695 (Fed. Cir. 2011)." The limitations reproduced above, as drafted, are a process that, under its broadest reasonable interpretation, covers performance of the limitations in the mind but for the recitation of generic computer components. That is, other than reciting the additional elements identified in Step 2A — Prong 2 below, nothing in the claim elements precludes the steps from practically being performed in the mind and/or by a human using a pen and paper. For example, but for the recitations of generic computer and other processing components (identified in Step 2A — Prong 2 below), the respectively recited steps/functions of the claims, as drafted and set forth above, are a process that, under its broadest reasonable interpretation, covers performance of the limitations in the mind and/or with the use of pen and paper. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind (and/or with pen and paper) but for the recitation of generic computer components, then it falls within the "Mental Processes" grouping of abstract ideas. Accordingly, the claims recite an abstract idea.

Aside from the additional elements, the aforementioned claim details exemplify a method of organizing human activity (since the details include examples of commercial or legal interactions, including advertising, marketing or sales activities or behaviors, and/or business relations, and managing personal behavior or relationships or interactions between people, including social activities, teaching, and following rules or instructions). More specifically, the evaluated process is related to determining a user's emotion type and presenting information based on the emotion type, which (under its broadest reasonable interpretation) is an example of marketing activities and managing personal behavior (i.e., organizing human activity); therefore, aside from the recitations of generic computer and other processing components (identified in Step 2A — Prong 2 below), the limitations identified in the more detailed claim listing above encompass the abstract idea of organizing human activity.

Various calculating steps are recited throughout the claims and these are examples of mathematical concepts.

Step 2A — Prong 2: Integrated into a Practical Application? No — The judicial exception(s) is/are not integrated into a practical application.

The apparatus and method claims include circuitry to perform the recited operations. The article of manufacture (e.g., CRM) claim includes a non-transitory computer-readable medium storing executable instructions, which when executed by circuitry, cause the circuitry to perform the recited operations.

The additional elements are recited at a high level of generality (i.e., as generic processing elements performing generic computer functions) such that the incorporation of the additional processing elements amounts to no more than mere instructions to apply the judicial exception(s) using generic computer components. There is no indication in the Specification that the steps/functions of the claims require any inventive programming or necessitate any specialized or other inventive computer components (i.e., the steps/functions of the claims may be implemented using capabilities of general-purpose computer components). Accordingly, the additional elements do not integrate the abstract ideas into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claims are directed to an abstract idea(s).

The claims as a whole merely describe how to generally "apply" the abstract idea(s) in a computer environment. The claimed processing elements are recited at a high level of generality and are merely invoked as a tool to perform the abstract idea(s). Simply implementing the abstract idea(s) on a general-purpose processor is not a practical application of the abstract idea(s); Applicant's specification discloses that the invention may be implemented using general-purpose processing elements and other generic components (Spec: ¶¶ 323-327).

The use of a processor/processing elements (e.g., as recited in all of the claims), such as circuitry, facilitates generic processor operations. The use of a memory or machine-readable media with executable instructions (e.g., as recited in the article of manufacture claim) facilitates generic processor operations.

The processing components presented in the claims simply utilize the capabilities of a general-purpose computer and are, thus, merely tools to implement the abstract idea(s). As seen in MPEP § 2106.05(a) and § 2106.05(f)(2), the court found that accelerating a process when the increased speed solely comes from the capabilities of a general-purpose computer is not sufficient to show an improvement in computer functionality and it amounts to a mere invocation of computers or machinery as a tool to perform an existing process (see FairWarning IP LLC v. Iatric Sys., 839 F.3d 1089, 1095, 120 USPQ2d 1293, 1296 (Fed. Cir. 2016)).

There is no transformation or reduction of a particular article to a different state or thing recited in the claims.

Additionally, even when considering the operations of the additional elements as an ordered combination, the ordered combination does not amount to significantly more than what is present in the claims when each operation is considered separately.

Step 2B: Claim(s) Provide(s) an Inventive Concept? No — The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception(s). As discussed above with respect to integration of the abstract idea(s) into a practical application, the use of the additional elements to perform the steps identified in Step 2A — Prong 1 above amounts to no more than mere instructions to apply the exceptions using a generic computer component(s). Mere instructions to apply an exception using a generic computer component(s) cannot provide an inventive concept. The claims are not patent eligible.

Claim Rejections - 35 USC § 103

6. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

7. Claims 1, 4-13, and 15-20 are rejected under 35 U.S.C. 103 as being unpatentable over Kim et al. (US 2013/0138684) in view of Albert et al. (US 2016/0239573) in view of Kreifeldt et al. (US 2016/0197967) in view of Hawthorne et al. (US 2010/0107075).

[Claim 1]
Kim discloses an emotion calculation device comprising:
units configured to:

estimate a user's emotion type (¶ 38 — "...it is determined whether emotional value information corresponding to the input music information can be searched for in a music emotion DB which stores information about emotional values including valence values and arousal values of respective pieces of music..." Since the music emotion database stores emotional value information corresponding to music information, it is understood that the database has somehow electronically acquired the first content information regarding the first content, i.e., the music) based on music (¶ 38 — The content may be music, which is an artistic and/or creative product; ¶ 99 — The music may have a corresponding album image (i.e., still image) and title information (i.e., text).);

acquire first content information regarding first content (¶ 38 — "...it is determined whether emotional value information corresponding to the input music information can be searched for in a music emotion DB which stores information about emotional values including valence values and arousal values of respective pieces of music..." Since the music emotion database stores emotional value information corresponding to music information, it is understood that the database has somehow electronically acquired the first content information regarding the first content, i.e., the music), wherein the first content is any of a product, a text, a still image, a video, a sound, and a combination of the product, the text, the still image, and the video (¶ 38 — The content may be music, which is an artistic and/or creative product; ¶ 99 — The music may have a corresponding album image (i.e., still image) and title information (i.e., text).); and

calculate a matching frequency for the first content information for each of a plurality of emotion types of users that are associated with the users based on their preferences and characteristics (¶ 67 — A calculation unit is used to assist in the search for the most relevant music; fig. 11, ¶¶ 48, 60-61, 63, 111, 116 — Multiple emotions corresponding to a user may be input as search conditions. There are multiple permutations of emotions that may be entered; ¶ 84 — "...it is possible to recommend songs in the light of the characteristics of the user by designating a search area in such a way as to adjust the value of the angle θ depending on the user's emotional musical tendencies, as shown in 'A' in FIG. 7."; ¶ 62 — "...in order to search for music having very high similarity in emotion to the input music, pieces of music for which m (closer to n) emotion ranks are identical to those of the input music may be searched for in the music emotion DB because similarity between pieces of music becomes higher when as many emotion ranks as possible are identical to those of the input music." Analyzing how many emotion ranks are similar to and/or match those of the music is an example of evaluating a matching frequency of the plurality of emotions to each of multiple contents; ¶ 40 — A degree of arousal and a measure of valence may also be evaluated; ¶ 110 — "Further, similarly to the emotions of music, since emotions felt in the same situation may differ for persons, preferred music that match themes may also differ for persons." In other words, multiple segments of users may be classified by the various permutations of emotions, moods, etc.); and

estimate a category of the emotion type of the user based on the first content (¶ 38 — "...it is determined whether emotional value information corresponding to the input music information can be searched for in a music emotion DB which stores information about emotional values including valence values and arousal values of respective pieces of music..." Since the music emotion database stores emotional value information corresponding to music information, it is understood that the database has somehow electronically acquired the first content information regarding the first content, i.e., the music; ¶ 38 — The content may be music, which is an artistic and/or creative product; ¶ 99 — The music may have a corresponding album image (i.e., still image) and title information (i.e., text)).

Kim implements the disclosed invention using an apparatus comprising various units (Kim: ¶ 12). While Kim alludes to the use of types of electronic devices that are commonly known to utilize circuitry, "such as smart phones, MP3 players, Personal Digital Assistants (PDA), computers, and digital audio equipment" (e.g., see ¶ 5 of Kim), Kim does not explicitly disclose the use of circuitry to execute the disclosed operations.

Like Kim, Albert evaluates human emotions in order to more effectively target an audience. In Albert's case, business solutions may be targeted to users based on their respective psychological type (Albert: ¶ 170). The underlying analysis in Albert may be performed using an Application-Specific Integrated Circuit (ASIC) processor (Albert: ¶ 51) and/or a peripheral integrated circuit element (Albert: ¶ 178), a display controller may comprise circuitry to generate audio and video signals for rendered content (Albert: ¶ 53), and a transceiver may comprise circuitry operable to perform communications (Albert: ¶ 55). The Examiner submits that it would have been obvious to one of ordinary skill in the art before the effective filing date of Applicant's invention to modify Kim to use circuitry to execute the disclosed operations in order to facilitate rapid execution of the respective claim operations while minimizing errors in calculations and analyses.

Kim and Albert do not explicitly disclose that units are configured to estimate a user's emotion type based on a combination of a text, a still image, a video, a sound including a voice and music. However, Hawthorne discloses that content may be customized to a user based on the user's profile and/or emotional state at the time (Hawthorne: ¶ 40). Hawthorne addresses the recited content items (i.e., a text, a still image, a video, a sound including a voice and music) as follows: "Here, each content item can be, but is not limited to, a media type of a (displayed or spoken) text (for a non-limiting example, an article, a quote, a personal story, or a book passage), a (still or moving) image, a video clip, an audio clip (for a non-limiting example, a piece of music or sounds from nature), and other types of content items from which a user can learn information or be emotionally impacted." (Hawthorne: ¶ 40) The user's profile is refined based on feedback provided by the user in response to content presented to him/her based on his/her emotional state at the time, as seen in ¶¶ 60 and 62 of Hawthorne:

[0060] In the example of FIG. 6, the flowchart 600 continues to block 608 where the retrieved content is customized based on the profile and/or the current emotional state of the user. Such customization reflects the user's preference as to what kind of content items he/she would like to be included in the content to fit his/her emotional state at the time, as well as how each of the items in the content is preferred to be presented to him/her....

[0062] In the example of FIG. 6, the flowchart 600 may optionally continue to block 612 where the user is enabled to provide feedback by rating and commenting on the content presented. Such feedback will then be used to update the profile of the user in order to make future content customization more accurate.

The cyclical manner in which feedback regarding content and one's emotional state may be used to update the user's profile (correlating appropriate content to emotional state) for improved accuracy of content customization based on the user's emotional state assists in estimating (e.g., refining and/or identifying) a user's emotion type (e.g., a user's profile, including emotion information) based on the various types of content (which may include a text, a still image, a video, and a sound including a voice and music). At the very least, Hawthorne discloses identifying a correlation between a user's emotion type and content, and Hawthorne suggests identifying a correlation between a user's emotion type and various types of respective content, since Hawthorne explains that available content may include a text, a still image, a video, and a sound including a voice and music (as discussed above). (It is noted that Applicant's Specification references the ability to input a combination of the various content items into an emotion calculation device, but the corresponding analysis is described more as identifying correlations between content items and emotion type (as seen in ¶¶ 111-112 and 121-144 of Applicant's Specification), which is commensurate in scope with Hawthorne's explicitly disclosed and suggested correlations between a user's emotion type and content items (which may include a text, a still image, a video, and a sound including a voice and music).) The user's profile may be updated based on the user's feedback and ratings for presented content items (Hawthorne: ¶ 55). A sequence of media types may be presented as content items targeted to a user (Hawthorne: ¶ 46), which implies that a combination of the available content items may be correlated to the user's profile and current emotional state. The Examiner submits that it would have been obvious to one of ordinary skill in the art