(19) United States
(12) Patent Application Publication    (10) Pub. No.: US 2017/0154626 A1
KIM et al.                             (43) Pub. Date: Jun. 1, 2017

(54) QUESTION AND ANSWER PROCESSING METHOD AND ELECTRONIC DEVICE FOR SUPPORTING THE SAME

(71) Applicant: SAMSUNG ELECTRONICS CO., LTD., Suwon-si (KR)

(72) Inventors: Dai Yong KIM, Suwon-si (KR); Dong Hyun YEOM, Bucheon-si (KR)

(21) Appl. No.: 15/360,050

(22) Filed: Nov. 23, 2016

(30) Foreign Application Priority Data

    Nov. 27, 2015 (KR) ........................ 10-2015-0167100

Publication Classification

(51) Int. Cl.
    G10L 15/22 (2006.01)

(52) U.S. Cl.
    CPC ............ G10L 15/22 (2013.01); G10L 15/265 (2013.01)

(57) ABSTRACT
An electronic device is provided. The electronic device includes a communication interface comprising communication circuitry and a processor configured to functionally connect with the communication interface, wherein the processor is configured to: obtain a voice signal, obtain context information associated with the user in connection with obtaining the voice signal, determine first response information corresponding to the voice signal if the context information meets a first condition, determine second response information corresponding to the voice signal if the context information meets a second condition, and send at least part of response information corresponding to the first response information or the second response information to an output device operatively connected with the electronic device or an external electronic device for the electronic device.
[Front-page drawing: block diagram of electronic device 100, showing a communication interface, audio device, input device, processor, display, sensor, question database, answer database, user information, and exercise module]

Amazon v. SoundClear
US Patent 11,069,337
Amazon Ex. 1009
`

`

Patent Application Publication    Jun. 1, 2017    Sheet 1 of 16    US 2017/0154626 A1

[FIG. 1: block diagram of an example electronic device operation environment; drawing text not machine-readable]
`
`

`

[FIG. 2 (Sheet 2 of 16): block diagram of an example processor 120, including a generation module]
`
`

`

[FIG. 3 (Sheet 3 of 16): flowchart of an example QA processing method; drawing text not machine-readable]
`
`

`

[FIG. 4 (Sheet 4 of 16): flowchart of an example method for generating additional information. 401: collect at least one piece of attributes information from a question or answer; 403: collect context information corresponding to the attributes information; 405: generate additional information based on the collected context information]
`
`

`

[FIG. 5 (Sheet 5 of 16): flowchart of another example method for generating additional information. 501: extract attributes information of at least one of a question and an answer; 503: select user information corresponding to the attributes information; 505: generate additional information based on the selected user information and the answer according to a specified rule]
`
`

`

[FIG. 6 (Sheet 6 of 16): flowchart of an example method for outputting a descriptive answer; drawing text not machine-readable]
`
`

`

[FIG. 7 (Sheet 7 of 16): flowchart (operations 701, 703, 705, 707) of another example method for outputting a descriptive answer via an external electronic device]
`
`

`

[FIG. 8 (Sheet 8 of 16): flowchart of another example method for outputting a descriptive answer. 801: collect answer address information; 803: output the address information; 805: determine whether the address information is selected; if so, output the information corresponding to the address information]
`
`

`

[FIG. 9 (Sheet 9 of 16): flowchart of an example information output method. 901: extract information for which text-to-speech (TTS) conversion is impossible; 903: synchronize the extracted information with the TTS conversion information; 905: output the extracted information to a specified device while outputting the TTS conversion information]
`
`

`

[FIG. 10A (Sheet 10 of 16): flowchart of another example QA processing method. 1001: determine whether a voice is collected; 1003: collect context information; 1005: analyze the voice information based on the context information; 1007: determine a short answer or a descriptive answer; 1009: output]
`
`

`

[FIG. 10B (Sheet 11 of 16): flowchart of another example QA processing method; drawing text not machine-readable]
`
`

`

[FIG. 11 (Sheet 12 of 16): diagram of an example form of an electronic device, including a communication unit, facing part, driving unit, display unit, power unit, sensor unit, and storage unit]
`
`

`

[FIG. 12 (Sheet 13 of 16): block diagram of example hardware components of an electronic device; drawing text not machine-readable]
`
`

`

[FIG. 13 (Sheet 14 of 16): block diagram of example software components of an electronic device; drawing text not machine-readable]
`
`

`

[FIG. 14 (Sheet 15 of 16): block diagram of an example program module. Kernel 1420: system resource manager 1421, device driver 1423. Middleware 1430: runtime library 1435; application manager 1441, window manager 1442, multimedia manager 1443, resource manager 1444, power manager 1445, database manager 1446, package manager 1447, connectivity manager 1448, notification manager 1449, location manager 1450, graphic manager 1451, security manager 1452, payment manager 1454. Application 1470: browser 1475, camera 1476, alarm 1477, media player 1482, album 1483, clock 1484, payment 1485]
`
`

`

[FIG. 15 (Sheet 16 of 16): diagram illustrating example voice recognition and output of an electronic device; drawing text not machine-readable]
`
`

`

US 2017/0154626 A1
Jun. 1, 2017

QUESTION AND ANSWER PROCESSING METHOD AND ELECTRONIC DEVICE FOR SUPPORTING THE SAME

CROSS-REFERENCE TO RELATED APPLICATION

[0001] This application is based on and claims priority under 35 U.S.C. § 119 to a Korean patent application filed on Nov. 27, 2015 in the Korean Intellectual Property Office and assigned Serial number 10-2015-0167100, the disclosure of which is incorporated by reference herein in its entirety.
TECHNICAL FIELD

[0002] The present disclosure relates generally to a question and answer (QA) processing method in an electronic device and an apparatus therefor.
BACKGROUND

[0003] An electronic device may include a QA system which supports processing answers to user question items. The QA system may receive a command such as a voice or text from a user and may analyze the received user command via a recognition engine. The QA system may extract a keyword suitable for the user command and may extract found contents corresponding to the extracted keyword.

[0004] As the QA system simply outputs a short answer to a command such as a voice or text input from the user, it may output an answer which does not correspond to the intention of the user. For example, since the QA system simply outputs only a specified answer to a user command, there may be a small amount of information to be obtained by the user, or the accuracy of the information to be obtained by the user may be low. Thus, it is difficult to provide efficient and accurate information. Also, since the QA system outputs only a specified answer, answers which do not correspond to user conditions may be provided.
SUMMARY

[0005] Aspects of the present disclosure address at least the above-mentioned problems and/or disadvantages and provide at least the advantages described below. Accordingly, an example aspect of the present disclosure is to provide a QA processing method for allowing a user to more easily recognize and understand an answer based on various output methods, and an electronic device for supporting the same.

[0006] Accordingly, another example aspect of the present disclosure is to provide a QA processing method for improving the accuracy of information to be obtained by a user by providing at least one of an answer and/or additional information, and an electronic device for supporting the same.

[0007] Accordingly, another example aspect of the present disclosure is to provide a QA processing method for conveniently providing an environment for verifying an answer by providing the answer via various electronic devices based on user conditions, and an electronic device for supporting the same.
[0008] In accordance with an example aspect of the present disclosure, an electronic device is provided. The electronic device may include a communication interface comprising communication circuitry and a processor configured to functionally connect with the communication circuitry, wherein the processor is configured to obtain a voice signal of a user, to obtain (or verify) context information associated with the user in connection with obtaining the voice signal, to determine first response information corresponding to the voice signal if the context information meets a first condition, to determine second response information corresponding to the voice signal if the context information meets a second condition, and to send at least part of response information corresponding to the first response information or the second response information to an output device operatively connected with the electronic device or an external electronic device for the electronic device.
[0009] In accordance with another example aspect of the present disclosure, an electronic device is provided. The electronic device may include an audio device comprising audio circuitry configured to obtain a voice signal, a memory configured to store a question or an answer associated with the voice signal, and a processor operatively connected with the audio device and/or the memory, wherein the processor is configured to convert the obtained voice signal into text, to search for a question corresponding to the converted text or an answer corresponding to the question based on data stored in the memory, to obtain attributes information of the found question or answer, to generate additional information based on the answer and the attributes information, and to output at least one of the answer or the additional information.
[0010] In accordance with another example aspect of the present disclosure, a question and answer (QA) processing method is provided. The QA processing method may include obtaining a voice signal in the electronic device, converting the obtained voice signal into text, searching for a question corresponding to the converted text or an answer corresponding to the question based on data stored in a memory operatively connected with the electronic device, obtaining attributes information of the found question or answer, generating additional information based on the answer and the attributes information, and outputting at least one of the answer or the additional information.
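The claimed processing flow (obtain a voice signal, convert it to text, look up a question/answer, generate additional information, output) can be sketched in a few lines of Python. This is purely an illustrative reconstruction: the function names, the dictionary layout, and the stubbed speech-to-text step are assumptions, not anything the specification prescribes.

```python
# Illustrative sketch of the QA processing method of paragraph [0010].
# All names are hypothetical; speech-to-text and the database are stubbed.

QUESTION_DB = {
    "what is the capital of france": {
        "answer": "Paris",
        "attributes": {"category": "geography", "length": "short"},
    },
}

def speech_to_text(voice_signal: bytes) -> str:
    """Stub STT: assume the signal decodes directly to lowercase text."""
    return voice_signal.decode("utf-8").lower()

def process_question(voice_signal: bytes) -> list[str]:
    # 1. Convert the obtained voice signal into text.
    text = speech_to_text(voice_signal)
    # 2. Search for a question corresponding to the converted text.
    entry = QUESTION_DB.get(text)
    if entry is None:
        return ["Sorry, no answer found."]
    # 3. Obtain attributes information of the found question/answer.
    attributes = entry["attributes"]
    # 4. Generate additional information from the answer and attributes.
    additional = f"(category: {attributes['category']})"
    # 5. Output at least one of the answer or the additional information.
    return [entry["answer"], additional]
```

A real implementation would replace `speech_to_text` with an actual recognizer and back `QUESTION_DB` with stores along the lines of the question database 131 and answer database 133 described below.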
[0011] Other aspects and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various example embodiments of the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS

[0012] The above and other aspects, features, and advantages of various example embodiments of the present disclosure will be more apparent from the following detailed description, taken in conjunction with the accompanying drawings, in which like reference numerals refer to like elements, and wherein:

[0013] FIG. 1 is a block diagram illustrating an example electronic device operation environment according to various example embodiments;

[0014] FIG. 2 is a block diagram illustrating an example of a processor according to an example embodiment;

[0015] FIG. 3 is a flowchart illustrating an example of a QA processing method according to an example embodiment;

[0016] FIG. 4 is a flowchart illustrating an example of a method for generating additional information according to an example embodiment;

[0017] FIG. 5 is a flowchart illustrating another example of a method for generating additional information according to an example embodiment;

[0018] FIG. 6 is a flowchart illustrating an example of a method for outputting a descriptive answer according to an example embodiment;

[0019] FIG. 7 is a flowchart illustrating another example of a method for outputting a descriptive answer according to an example embodiment;

[0020] FIG. 8 is a flowchart illustrating another example of a method for outputting a descriptive answer according to an example embodiment;

[0021] FIG. 9 is a flowchart illustrating an example of an information output method according to an example embodiment;

[0022] FIG. 10A is a flowchart illustrating another example of a QA processing method according to an example embodiment;

[0023] FIG. 10B is a flowchart illustrating another example of a QA processing method according to an example embodiment;

[0024] FIG. 11 is a diagram illustrating an example form of an electronic device according to an example embodiment;

[0025] FIG. 12 is a block diagram illustrating example hardware components of an electronic device according to an example embodiment;

[0026] FIG. 13 is a block diagram illustrating example software components of an electronic device according to an example embodiment;

[0027] FIG. 14 is a block diagram illustrating an example configuration of a program module according to various example embodiments; and

[0028] FIG. 15 is a diagram illustrating example voice recognition and output of an electronic device according to an example embodiment.

[0029] Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.
DETAILED DESCRIPTION

[0030] Various example embodiments of the present disclosure may be described with reference to the accompanying drawings. Accordingly, those of ordinary skill in the art will recognize that modifications, equivalents, and/or alternatives of the various example embodiments described herein can be variously made without departing from the scope and spirit of the present disclosure. With regard to the description of the drawings, similar elements may be marked by similar reference numerals.
[0031] In the disclosure disclosed herein, the expressions "have", "may have", "include" and "comprise", or "may include" and "may comprise" used herein indicate existence of corresponding features (e.g., elements such as numeric values, functions, operations, or components) but do not exclude presence of additional features.
[0032] In the disclosure disclosed herein, the expressions "A or B", "at least one of A or/and B", or "one or more of A or/and B", and the like used herein may include any and all combinations of one or more of the associated listed items. For example, the term "A or B", "at least one of A and B", or "at least one of A or B" may refer to all of the case (1) where at least one A is included, the case (2) where at least one B is included, or the case (3) where both of at least one A and at least one B are included.
[0033] The terms, such as "first", "second", and the like used herein may refer to various elements of various example embodiments, but do not limit the elements. Furthermore, such terms may be used to distinguish one element from another element. For example, "a first user device" and "a second user device" may indicate different user devices regardless of the order or priority thereof. For example, "a first user device" and "a second user device" indicate different user devices.
[0034] It will be understood that when an element (e.g., a first element) is referred to as being "(operatively or communicatively) coupled with/to" or "connected to" another element (e.g., a second element), it may be directly coupled with/to or connected to the other element, or an intervening element (e.g., a third element) may be present. On the other hand, when an element (e.g., a first element) is referred to as being "directly coupled with/to" or "directly connected to" another element (e.g., a second element), it should be understood that there is no intervening element (e.g., a third element).
[0035] According to the situation, the expression "configured to" used herein may be used as, for example, the expression "suitable for", "having the capacity to", "designed to", "adapted to", "made to", or "capable of". The term "configured to" must not mean only "specifically designed to" in hardware. Instead, the expression "a device configured to" may mean that the device is "capable of" operating together with another device or other components. For example, a "processor configured to perform A, B, and C" may refer, for example, to a dedicated processor (e.g., an embedded processor) for performing a corresponding operation or a generic-purpose processor (e.g., a central processing unit (CPU) or an application processor) which may perform corresponding operations by executing one or more software programs which are stored in a memory device.
[0036] Terms used in the present disclosure are used to describe various example embodiments and are not intended to limit the scope of the present disclosure. The terms of a singular form may include plural forms unless otherwise specified. Unless otherwise defined herein, all the terms used herein, which include technical or scientific terms, may have the same meaning that is generally understood by a person skilled in the art. It will be further understood that terms, which are defined in a dictionary and commonly used, should also be interpreted as is customary in the relevant related art and not in an idealized or overly formal manner unless expressly so defined herein in various example embodiments of the present disclosure. In some cases, even if terms are terms which are defined in the description, they may not be interpreted to exclude embodiments of the present disclosure.
[0037] An electronic device according to various example embodiments of the present disclosure may include at least one of smartphones, tablet personal computers (PCs), mobile phones, video telephones, e-book readers, desktop PCs, laptop PCs, netbook computers, workstations, servers, personal digital assistants (PDAs), portable multimedia players (PMPs), Motion Picture Experts Group (MPEG-1 or MPEG-2) Audio Layer 3 (MP3) players, mobile medical devices, cameras, wearable devices (e.g., head-mounted devices (HMDs) such as electronic glasses), electronic apparel, electronic bracelets, electronic necklaces, electronic appcessories, electronic tattoos, smart watches, and the like, but are not limited thereto.
[0038] According to another example embodiment, the electronic devices may be home appliances. The home appliances may include at least one of, for example, televisions (TVs), digital versatile disc (DVD) players, audios, refrigerators, air conditioners, cleaners, ovens, microwave ovens, washing machines, air cleaners, set-top boxes, home automation control panels, security control panels, TV boxes (e.g., Samsung HomeSync®, Apple TV®, or Google TV®), game consoles (e.g., Xbox® or PlayStation®), electronic dictionaries, electronic keys, camcorders, electronic picture frames, or the like, but are not limited thereto.
[0039] According to another example embodiment, the electronic device may include at least one of medical devices (e.g., various portable medical measurement devices (e.g., a blood glucose monitoring device, a heartbeat measuring device, a blood pressure measuring device, a body temperature measuring device, and the like), a magnetic resonance angiography (MRA) device, a magnetic resonance imaging (MRI) device, a computed tomography (CT) device, scanners, and ultrasonic devices), navigation devices, global positioning system (GPS) receivers, event data recorders (EDRs), flight data recorders (FDRs), vehicle infotainment devices, electronic equipment for vessels (e.g., navigation systems and gyrocompasses), avionics, security devices, head units for vehicles, industrial or home robots, automatic teller's machines (ATMs), points of sales (POSs), or internet of things devices (e.g., light bulbs, various sensors, electric or gas meters, sprinkler devices, fire alarms, thermostats, street lamps, toasters, exercise equipment, hot water tanks, heaters, boilers, and the like), or the like, but are not limited thereto.
[0040] According to another example embodiment, the electronic devices may include at least one of parts of furniture or buildings/structures, electronic boards, electronic signature receiving devices, projectors, or various measuring instruments (e.g., water meters, electricity meters, gas meters, or wave meters, and the like). In various embodiments, the electronic device may be one of the above-described various devices or a combination thereof, or the like, but is not limited thereto. An electronic device according to an example embodiment may be a flexible device. Furthermore, an electronic device according to an example embodiment may not be limited to the above-described electronic devices and may include other electronic devices and new electronic devices according to the development of technologies.
[0041] Hereinafter, an electronic device according to the various example embodiments may be described with reference to the accompanying drawings. The term "user" used herein may refer to a person who uses an electronic device or may refer to a device (e.g., an artificial intelligence electronic device) that uses an electronic device.
[0042] FIG. 1 is a block diagram illustrating an example electronic device operation environment according to various example embodiments.
[0043] Referring to FIG. 1, an electronic device operation environment 10 may include, for example, an electronic device 100 and/or at least one external electronic device 200. The electronic device 100 may receive an input, such as, for example, a user command (e.g., text or a voice) and may output an answer in response to analyzing the received command. In this operation, the electronic device 100 may send, for example, the answer or link information (e.g., uniform resource locator (URL) address information, sound quick response (QR) information, or the like) associated with obtaining the answer to the at least one external electronic device 200. The external electronic device 200 may output the response or the link information. The external electronic device 200 may output an answer in response to receiving an input (e.g., a user input) or a control signal of the electronic device 100 (e.g., a signal for requesting the external electronic device 200 to output a response associated with the link information). In connection with outputting the link information, the external electronic device 200 may receive and output the response from another device (e.g., another external electronic device, or a server which may provide the response) based on a communication module embedded therein. According to various example embodiments, the external electronic device 200 may include an audio device which may receive a voice signal. If receiving the voice signal, the external electronic device 200 may send the received voice signal to the electronic device 100.
[0044] In various example embodiments, the electronic device 100 may include a processor 120, a memory 130, an input device (e.g., including input circuitry) 150, a communication interface (e.g., including communication circuitry) 170, an output device (e.g., an audio device 110 and/or a display 160), a sensor 180 (e.g., a camera or an image sensor), and a bus 115. Additionally or alternatively, if the electronic device 100 is manufactured in a specified form (e.g., a robot), it may further include an exercise module 190. Also, the memory 130 may include, for example, a question database 131, an answer database 133, and/or user information 135.
[0045] The above-mentioned electronic device 100 may collect, for example, a command (e.g., a command of its user) based on the audio device 110, the input device 150, or the communication interface 170 and may output an answer to the collected command. In this operation, the electronic device 100 may collect context information associated with the user and may change a type of an answer to be output, a form of the answer to be output, and/or a device to which the answer will be output, based on the collected context information. The electronic device 100 may output at least one of a short answer or a descriptive answer via at least one specified device (e.g., the electronic device 100 or an external electronic device) based on the collected context information.
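The context-driven branching described above (one condition selecting one form of answer and output device, another condition selecting a different one) might look like the following sketch. The specific conditions ("driving", "display_available") are invented for illustration and do not come from the specification.

```python
# Hypothetical sketch: choose the answer type and the output device
# from collected context information, as paragraph [0045] describes.

def select_response(context: dict) -> dict:
    """Pick the answer form and target device based on context information."""
    # First example condition: the user is driving, so prefer a short
    # answer spoken through a speaker.
    if context.get("driving"):
        return {"type": "short", "device": "speaker"}
    # Second example condition: a display is available, so a longer
    # descriptive answer can be shown on screen.
    if context.get("display_available"):
        return {"type": "descriptive", "device": "display"}
    # Default: a short spoken answer on the device itself.
    return {"type": "short", "device": "speaker"}
```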
[0046] The memory 130 may include a volatile and/or non-volatile memory. The memory 130 may store, for example, a command or data associated with at least one other component of the electronic device 100. According to an example embodiment, the memory 130 may store software and/or a program (not shown). The program may include, for example, a kernel, a middleware, an application programming interface (API), and/or at least one application program (or "at least one application"), and the like. At least part of the kernel, the middleware, or the API may be referred to as an operating system (OS).
[0047] The kernel may control or manage, for example, system resources (e.g., the bus 110, the processor 120, or the memory 130, and the like) used to execute an operation or function implemented in the other programs (e.g., the middleware, the API, or the application program). Also, as the middleware, the API, or the application program accesses a separate component of the electronic device 100, the kernel may provide an interface which may control or manage system resources.
[0048] The middleware may play a role as, for example, a go-between such that the API or the application program communicates with the kernel to communicate data.
[0049] Also, the middleware may process one or more work requests, received from the application program, in order of priority. For example, the middleware may assign, to at least one of the at least one application program, a priority for using system resources (the bus 110, the processor 120, or the memory 130, and the like) of the electronic device 100. For example, the middleware may perform scheduling or load balancing for the one or more work requests by processing the one or more work requests in order of the priority assigned to the at least one of the at least one application program.
[0050] The API may be, for example, an interface in which the application program controls a function provided from the kernel or the middleware. For example, the API may include at least one interface or function (e.g., a command) for file control, window control, image processing, or text control, and the like.
[0051] The memory 130 may store, for example, an operating system (OS) and the like associated with operating the electronic device 100. Also, the memory 130 may store an analysis program associated with analyzing a user voice and a user command. According to an example embodiment, the memory 130 may store a question database 131 and/or an answer database 133 in connection with supporting a QA service.
[0052] The question database 131 may include, for example, data for a variety of short-answer questions or a variety of descriptive questions. In this regard, the question database 131 may include information mapped to questions on at least one short answer. Also, the question database 131 may include information mapped to questions on at least one descriptive answer. In this regard, the question database 131 may be operated in the form of being mapped to the answer database 133.
[0053] According to various example embodiments, the question database 131 may include attributes information of questions. The attributes information may include at least one item included in a main classification and at least one item included in a subclassification in connection with the items included in the main classification. For example, the question database 131 may include attributes information associated with "economy, politics, sports, game, movie, music, entertainment, life," or the like as an item of the main classification. Also, the question database 131 may include attributes information associated with subclassifications of each main classification. The attributes information associated with the subclassifications may include subclassifications according to a region-based classification and subclassifications according to a type-based classification. According to an example embodiment, the "economy" item of the main classification may include a region-based subclassification such as "household economy, local economy, or national economy". The "economy" item of the main classification may include a type-based subclassification such as "car, real estate, cash, or bond". According to various example embodiments, a subclassification of the "life" item of the main classification may include "food, exercise, study, leisure, health, or the like". The attributes information classified into the above-mentioned main classification or subclassification may vary in form of classification according to a system design scheme, a policy, or the like. For example, attributes information stored in the above-mentioned question database 131 may be used to determine attributes of a question analyzed by user voice analysis.
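As one possible rendering of this classification scheme (the nested layout and the helper function below are assumptions for illustration, not part of the disclosure), the main classification and its region- and type-based subclassifications could be held in a simple mapping:

```python
# Sketch of the attributes information in the question database 131:
# main-classification items, each carrying region-based and/or
# type-based subclassifications, per the examples in paragraph [0053].

ATTRIBUTES = {
    "economy": {
        "region": ["household economy", "local economy", "national economy"],
        "type": ["car", "real estate", "cash", "bond"],
    },
    "life": {
        "type": ["food", "exercise", "study", "leisure", "health"],
    },
}

def subclassifications(main_item: str) -> list[str]:
    """Flatten all subclassification items under one main-classification item."""
    subs = ATTRIBUTES.get(main_item, {})
    return [item for group in subs.values() for item in group]
```

As the specification notes, the concrete shape of this classification would vary with the system design scheme or policy.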
[0054] The answer database 133 may include, for example, a short answer and/or a descriptive answer to at least one question. The answer database 133 may include short answers mapped to at least one question stored in the question database 131 or descriptive answers mapped to at least one question. The answer database 133 may include attributes information about the short answers or the descriptive answers. The short answers or the descriptive answers may include information similar to the attributes information about classifications described in the question database 131. According to various example embodiments, attributes information about an answer/question may include a text length of the answer, information indicating whether a special letter is included, a category, or combinations thereof.
[0055] According to various example embodiments, the descriptive answer may include, for example, a time series answer or a non-time series answer of a certain length or more. The time series answer may include answers classified into at least one stage. In connection with the time series answer, the answer database 133 may include condition information. The condition information may include conditions for outputting a next-stage answer in the time series answers. According to an example embodiment, the condition information may include sensor information or voice information.
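A time-series answer with per-stage condition information, as described above, might be modeled like this. The recipe stages and event names are invented examples; the specification only says that the gating conditions may come from sensor or voice information.

```python
# Sketch: a time-series (staged) descriptive answer from the answer
# database 133. Each stage carries condition information that must be
# observed (e.g., a sensor or voice event) before the next stage is output.

RECIPE_ANSWER = [
    {"text": "Boil water in a pot.", "next_condition": "water_boiling"},
    {"text": "Add the noodles.", "next_condition": "user_says_done"},
    {"text": "Serve after 4 minutes.", "next_condition": None},
]

def play_stages(answer, events):
    """Return the stage texts that would be output given a stream of events."""
    out = []
    events = iter(events)
    for stage in answer:
        out.append(stage["text"])
        cond = stage["next_condition"]
        if cond is None:
            break  # final stage: nothing further to gate
        # Consume events until the gating sensor/voice event is observed.
        if not any(e == cond for e in events):
            break  # condition never observed; stop at this stage
    return out
```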
[0056] Further, the memory 130 may store, for example, the user information 135. The user information 135 may include information about an age of the user, information indicating whether he or she is married, information about his or her job, information about his or her position, information about his or her height, information about his or her weight, information about his or her children, information about his or her favorite food, information about his or her hobby, information about his or her specialty, and/or his or her health information. The above-mentioned user information 135 may be information input by the user. The health information and the like of the user may include information produced based on biometric information collected by the sensor 180.
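Combining the stored user information 135 with answer attributes, in the spirit of the rule-based generation shown in FIG. 5, could be sketched as follows. The rule table, field names, and sample values are hypothetical.

```python
# Sketch: generate additional information by matching an answer's
# attributes category against stored user information 135 according to
# a specified rule (cf. FIG. 5). All fields and rules are invented.

USER_INFO = {"hobby": "cycling", "favorite_food": "bibimbap", "age": 34}

RULES = {
    # attributes category -> user-information field used to personalize it
    "exercise": "hobby",
    "food": "favorite_food",
}

def additional_info(category: str, answer: str) -> str:
    """Append user-specific additional information to an answer, if a rule applies."""
    field = RULES.get(category)
    if field is None or field not in USER_INFO:
        return answer  # no applicable rule: output the answer as-is
    return f"{answer} (related to your {field}: {USER_INFO[field]})"
```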
[0057] According to various embodiments, the memory 130 may store sensor information collected by the sensor 180. The stored sensor information may include at least one piece of information associated with an activity or a form of the user or an environment around him or her. The sensor information may be used to collect face identification information of the user, his or her operation information (or movement information of the electronic device
