US 20170337921A1
(19) United States
(12) Patent Application Publication    (10) Pub. No.: US 2017/0337921 A1
     AOYAMA et al.                     (43) Pub. Date: Nov. 23, 2017

(54) INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM
(71) Applicant: SONY CORPORATION, Tokyo (JP)
(72) Inventors: Kazumi AOYAMA, Saitama (JP); Yoko ITO, Tokyo (JP)
(73) Assignee: SONY CORPORATION, Tokyo (JP)
(21) Appl. No.: 15/531,827
(22) PCT Filed: Nov. 26, 2015
(86) PCT No.: PCT/JP2015/083232
     § 371 (c)(1),
     (2) Date: May 31, 2017
(30) Foreign Application Priority Data
     Feb. 27, 2015 (JP) ................................ 2015-039138
`
`( 51 )
`
`( 52 )
`
`Publication Classification
`
`Int . Cl .
`GIOL 15 / 22
`( 2006 . 01 )
`G06F 17727
`( 2006 . 01 )
`GO6F 3 / 16
`( 2006 . 01 )
`GIOL 13 / 027
`( 2013 . 01 )
`GIOL 15 / 26
`( 2006 . 01 )
`U . S . CI .
`??? . . . . . . . . . . . . GIOL 15 / 22 ( 2013 . 01 ) ; GIOL 13 / 027
`( 2013 . 01 ) ; GIOL 15 / 26 ( 2013 . 01 ) ; G06F 3 / 165
`( 2013 . 01 ) ; G06F 17 / 2785 ( 2013 . 01 ) ; GIOL
`2015 / 223 ( 2013 . 01 )
`
`( 57 )
`ABSTRACT
`There is provided an information processing device to
`control response to a sound input in a preferred mode
`corresponding to
`a change in
`a situation or a user , the
`information processing device including : a control unit
`configured to control output of a response to speech of a user
`in accordance with acquired information regarding a speech
`state of the user .
`
[Representative drawing: flowchart. START → ACQUIRE SOUND INPUT (S101) → ANALYZE SOUND INPUT (S103) → DETECT SPEECH STYLE (S107), ESTIMATE EXTERNAL ENVIRONMENT (S200), ESTIMATE USER STATE (S300) → GENERATE RESPONSE COMPONENT (S105) → GENERATE RESPONSE PARAMETER (S109) → GENERATE RESPONSE CONTENT → END]
`
`-1-
`
`Amazon v. SoundClear
`US Patent 11,069,337
`Amazon Ex. 1010
`
`
`
Patent Application Publication          Nov. 23, 2017  Sheet 1 of 10          US 2017/0337921 A1

FIG. 1
[Drawing: user Ua speaks sound input c10a "CHECK MY SCHEDULE FOR TOMORROW AT 1 PM" and receives response c11a "... WITH MR. YAMADA"; user Ub speaks sound input c10b "WHAT TIME IS IT IN LONDON?" and receives response c11b.]
`
`
`
FIG. 2
[Block diagram of the functional configuration: labeled blocks include a control unit, storage unit (30), sound collection unit, imaging unit, sound analysis unit (sound section detection unit, sound feature extraction unit with speech speed and magnitude of sound, sound recognition unit, meaning analysis unit), image analysis unit, environmental sound analysis unit (environmental sound recognition unit, external environment estimation unit with volume of noise and degree of noise), speech style detection unit (expression style), user state estimation unit, response generation unit (response component generation unit, response parameter generation unit), and output unit; reference numerals include 10, 13, 30, 112, 113, 114, 131, 132.]
`
`
`
FIG. 3
[Diagram: analysis of meaning content of a sound input and acquisition of a response component.
SOUND INPUT: "CHECK MY SCHEDULE FOR TOMORROW AT 1 PM"
MEANING ANALYSIS: TASK: SCHEDULE CONFIRMATION; ARGUMENT / EXPRESSION: DATE-TIME "TOMORROW AT 1 PM"; SEARCH KEY: 2014/10/1 13:00
ACQUISITION OF RESPONSE COMPONENT (EXECUTION RESULT OF TASK): SEARCH KEY: 2014/10/1 13:00; TITLE: MEETING IN ROOM A; PARTICIPANT: TARO YAMADA]
`
FIG. 4
[Table: relation between posture of the user and the degree of composure.
POSTURE   | DEGREE OF COMPOSURE
SITTING   | +10
LYING     | +20
STANDING  | -10]
`
FIG. 5
[Table: relation between speech speed and the degree of composure.
SPEECH SPEED               | DEGREE OF COMPOSURE
LESS THAN 4 LETTERS/SECOND | +20
4 TO 6 LETTERS/SECOND      | (not legible)
7 OR MORE LETTERS/SECOND   | -20]
`
`
`
FIG. 6
[Table: relation between motion of the user (STOPPED / WALKING / RUNNING) and the degree of composure; only the value -20 is legible.]

FIG. 7
[Diagram: response parameters stored for a user. PERSONAL NAME DATA with columns FORMAL NAME: MAIKO TANAKA, YUKARI SUZUKI, TARO YAMADA and NAME (everyday expression): (not legible), YUKARI-CHAN, MR. YAMADA. Date-time expressions: WESTERN CALENDAR / JAPANESE CALENDAR; 12-HOUR CLOCK WITH AM AND PM.]
`
`
`
FIG. 8
[Flowchart: START → ACQUIRE SOUND INPUT (S101) → ANALYZE SOUND INPUT (S103) → DETECT SPEECH STYLE (S107), ESTIMATE EXTERNAL ENVIRONMENT (S200), ESTIMATE USER STATE (S300) → GENERATE RESPONSE COMPONENT (S105) → GENERATE RESPONSE PARAMETER (S109) → GENERATE RESPONSE CONTENT → END]
`
`
`
FIG. 9
[Flowchart: START → DETECT ARRIVAL DIRECTION OF TARGET SOUND (S201) → FORM FILTER WITH BLIND SPOT IN DIRECTION OF TARGET SOUND (S203) → APPLY FILTER TO INPUT SOUND (S205) → CALCULATE DEGREE OF NOISE (S207) → END]
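The noise-estimation flow in FIG. 9 suppresses sound arriving from the target (speech) direction and treats what remains as noise. The sketch below is a strongly simplified, hedged illustration: it assumes the input has already been decomposed into per-direction energy estimates, whereas the actual device would steer a null (blind spot) with a microphone-array filter. The function name and the energy-dictionary representation are assumptions.

```python
def degree_of_noise(direction_energy: dict, target_direction: float,
                    blind_spot_width: float = 30.0) -> float:
    """Toy version of S201-S207: fraction of energy not arriving
    from around the target direction."""
    # S203/S205: "apply a filter with a blind spot": discard energy
    # arriving from within the blind-spot window around the target.
    noise = sum(e for d, e in direction_energy.items()
                if abs(d - target_direction) > blind_spot_width / 2)
    total = sum(direction_energy.values())
    # S207: degree of noise as the non-speech fraction of total energy.
    return noise / total if total else 0.0

# Speech from 0 degrees, noise sources at 90 and 180 degrees.
print(degree_of_noise({0: 8.0, 90: 1.0, 180: 1.0}, target_direction=0))  # 0.2
```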
`
FIG. 10
[Flowchart: START → SPECIFY WEIGHT CORRESPONDING TO POSTURE OF USER (S311) → SPECIFY WEIGHT CORRESPONDING TO SPEECH SPEED (S321) → SPECIFY WEIGHT CORRESPONDING TO MOTION OF USER (S331) → CALCULATE DEGREE OF COMPOSURE → END]
`
`
`
FIG. 11
[Drawing: feedback of recognized states. The environment is recognized as SILENT or NOISY and the user as HURRIED; the device responds with messages such as "PLEASE TALK SLOWLY" and "PLEASE TALK IN SILENT PLACE".]

FIG. 12
[Drawing: further examples of the feedback of FIG. 11, with the messages "PLEASE TALK IN SILENT PLACE" and "PLEASE TALK SLOWLY".]
`
`
`
FIG. 13
[Diagram: English-language counterpart of FIG. 3.
SOUND INPUT → MEANING ANALYSIS: TASK: SCHEDULE CONFIRMATION; ARGUMENT / EXPRESSION: "tomorrow 1 o'clock afternoon"; SEARCH KEY: 2014/10/1 13:00
ACQUISITION OF RESPONSE COMPONENT (EXECUTION RESULT OF TASK): SEARCH KEY; TITLE: "Meeting at room A"; PARTICIPANT: Michael Smith]

FIG. 14
[Table: PERSONAL NAME DATA. NAME (everyday expression): Mickey, Betty, Kate; FORMAL NAME: Michael Smith, Elizabeth Green, Katharine McPhee.]
`
`
`
FIG. 15
[Drawing: English-language counterpart of the feedback of FIG. 11. The user is recognized as being in a hurry; the device responds with messages such as "Please talk slowly." and "Please talk in silent place."]

FIG. 16
[Drawing: English-language counterpart of the feedback of FIG. 12, with the messages "Please talk in silent place." and "Please talk slowly."]
`
`
`
FIG. 17
[Block diagram: example hardware configuration. PROCESSOR (901), MEMORY (903), STORAGE (905), MANIPULATION DEVICE (907), REPORT DEVICE (911), COMMUNICATION DEVICE (913), SOUND COLLECTION DEVICE (915), IMAGING DEVICE (917); reference numeral 919 is also shown.]
`
`
`
US 2017/0337921 A1                                     Nov. 23, 2017
`
`INFORMATION PROCESSING DEVICE ,
`INFORMATION PROCESSING METHOD ,
`AND PROGRAM
`
TECHNICAL FIELD
[0001] The present disclosure relates to an information processing device, an information processing method, and a program.
`
BACKGROUND ART
[0002] In recent years, sound synthesis technologies of converting text information into sounds, sound recognition technologies of recognizing content uttered by users as text information, and natural language processing technologies of recognizing content indicated by sentences have been developed. Therefore, interactive user interfaces (UIs) which are based on sound inputs and are configured to be able to manipulate various household electrical appliances such as television receivers or information devices such as personal computers have spread by applying these technologies and allowing users to execute dialogs with the devices by sounds. For example, Patent Literature 1 discloses an example of an information processing device capable of instructing a user to execute an intended operation through a dialog with the user.
CITATION LIST
Patent Literature
[0003] Patent Literature 1: JP 2005-3926A
DISCLOSURE OF INVENTION
Technical Problem
[0004] On the other hand, there are phrases used in dialogs that have the same meaning but are expressed using different phrases (that is, phrases expressed differently), such as "3 pm" and "15:00," and everyday phrases differ with each user. Therefore, when responses to sound inputs of users are output with expressions different from the everyday phrases used by the users, the users feel uncomfortable with the responses in some cases.
[0005] In addition, situations in which dialogs with users are executed (for example, user states or surrounding environments) may not normally be constant. In different situations, there are cases in which users may feel uncomfortable with responses that the users would feel were natural in other situations.
[0006] Therefore, for interactive user interfaces based on sound inputs, there are requests for realizing dialogs with users in more natural (that is, less uncomfortable) modes in accordance with changes in situations or users.
[0007] Accordingly, the present disclosure proposes an information processing device, an information processing method, and a program capable of controlling a response to a sound input in a preferred mode corresponding to a change in a situation or a user.
`
Solution to Problem
[0008] According to the present disclosure, there is provided an information processing device including: a control unit configured to control output of a response to speech of a user in accordance with acquired information regarding a speech state of the user.
[0009] Further, according to the present disclosure, there is provided an information processing method including: controlling, by a processor, output of a response to speech of a user in accordance with acquired information regarding a speech state of the user.
[0010] Further, according to the present disclosure, there is provided a program causing a computer to execute: controlling output of a response to speech of a user in accordance with acquired information regarding a speech state of the user.
Advantageous Effects of Invention
[0011] According to the present disclosure, as described above, it is possible to provide an information processing device, an information processing method, and a program capable of controlling a response to a sound input in a preferred mode corresponding to a change in a situation or a user.
[0012] Note that the effects described above are not necessarily limitative. With or in the place of the above effects, there may be achieved any one of the effects described in this specification or other effects that may be grasped from this specification.
BRIEF DESCRIPTION OF DRAWINGS
[0013] FIG. 1 is an explanatory diagram illustrating an overview of an information processing device according to an embodiment of the present disclosure.
[0014] FIG. 2 is a block diagram illustrating an example of a functional configuration of the information processing device according to the embodiment.
[0015] FIG. 3 is an explanatory diagram illustrating analysis of meaning content indicated by a sound input and an example of a process based on a result of the analysis.
[0016] FIG. 4 is an explanatory diagram illustrating an example of a relation between the degree of composure and a user state.
[0017] FIG. 5 is an explanatory diagram illustrating an example of a relation between the degree of composure and a user state.
[0018] FIG. 6 is an explanatory diagram illustrating an example of a relation between the degree of composure and a user state.
[0019] FIG. 7 is an explanatory diagram illustrating an example of a response parameter stored as continuous information.
[0020] FIG. 8 is a flowchart illustrating an example of the flow of a series of operations of the information processing device according to the embodiment.
[0021] FIG. 9 is a flowchart illustrating an example of an operation of the information processing device according to the embodiment.
[0022] FIG. 10 is a flowchart illustrating an example of an operation of the information processing device according to the embodiment.
[0023] FIG. 11 is an explanatory diagram illustrating an overview of the information processing device 1 according to a first modification example.
`
`-12-
`
`
`
`US 2017 / 0337921 A1
`
`Nov . 23 , 2017
`
`to a fries and the inform explanator
`
[0024] FIG. 12 is an explanatory diagram illustrating an overview of the information processing device 1 according to the first modification example.
[0025] FIG. 13 is an explanatory diagram illustrating an example of the information processing device according to the embodiment.
[0026] FIG. 14 is an explanatory diagram illustrating an example of the information processing device according to the embodiment.
[0027] FIG. 15 is an explanatory diagram illustrating an example of the information processing device according to the embodiment.
[0028] FIG. 16 is an explanatory diagram illustrating an example of the information processing device according to the embodiment.
[0029] FIG. 17 is a diagram illustrating an example of a hardware configuration of the information processing device according to the embodiment.
MODE(S) FOR CARRYING OUT THE INVENTION
[0030] Hereinafter, (a) preferred embodiment(s) of the present disclosure will be described in detail with reference to the appended drawings. In this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
[0031] Also, the description will be made in the following order.
[0032] 1. Overview
[0033] 2. Functional configuration
[0034] 3. Process
[0035] 4. Modification examples
[0036] 4.1. First modification example: feedback of various recognized states
[0037] 4.2. Second modification example: control example of response content according to individual recognition result and situation
[0038] 5. Example
[0039] 6. Hardware configuration
[0040] 7. Conclusion
1. OVERVIEW
[0041] First, an overview of an information processing device according to an embodiment of the present disclosure will be described, and a task of the information processing device according to the embodiment will be outlined with reference to FIG. 1. FIG. 1 is an explanatory diagram illustrating an overview of an information processing device 1 according to the embodiment.
[0042] As illustrated in FIG. 1, the information processing device 1 according to the embodiment is configured to use a sound produced by a user as input information (hereinafter referred to as a "sound input") by applying a so-called interactive user interface (UI) to execute various processes on the basis of the sound input. Specifically, the information processing device 1 recognizes content spoken by the user by acquiring the sound input from the user as acoustic information and analyzing the acoustic information on the basis of a sound recognition technology or a natural language processing technology. Then, the information processing device 1 executes various processes in accordance with the content recognized on the basis of the sound input and presents a natural sentence indicating the execution result as a sound (acoustic information) or text information (display information) to the user.
[0043] For example, in the example illustrated in FIG. 1, the information processing device 1 receives a sound input c10b "What time is it in London?" from a user Ub, confirms the time in London, and outputs response information c11b "It's 5:00 pm" as a sound on the basis of a result of the confirmation.
[0044] In this case, for example, on the basis of the analysis result of the sound input c10b, the information processing device 1 recognizes instruction content (that is, confirming the current time in London) indicated by the sound input c10b. Then, on the basis of a recognition result of the instruction content indicated by the sound input c10b, the information processing device 1 confirms the time in London, for example, by executing an application (for example, an application supplying a clocking function) for confirming the time in different countries. Then, on the basis of a confirmation result of the time, the information processing device 1 generates the response information c11b for presenting the confirmation result as a natural sentence and outputs the response information c11b as a sound.
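The flow just described (recognize the instruction, execute an application, and wrap the result in a natural sentence) can be sketched minimally as follows. The helper name `respond_to_sound_input` and the callable standing in for the clocking application are hypothetical illustrations, not the device's actual interfaces.

```python
def respond_to_sound_input(utterance: str, clock_app) -> str:
    """Toy version of the flow in paragraphs [0043]-[0044]."""
    if utterance.strip().lower() == "what time is it in london?":
        # Recognized instruction: confirm the current time in London
        # by executing the clocking application.
        t = clock_app("London")
        return f"It's {t}"  # response information c11b as a natural sentence
    return "Sorry, I did not understand."

# A stub clocking application stands in for the real one.
print(respond_to_sound_input("What time is it in London?",
                             lambda city: "5:00 pm"))  # It's 5:00 pm
```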
`[ 0045 ]
`In addition , the information processing device 1
`may recognize an individual user on the basis of an analysis
`result of a sound input or acquired information ( for example ,
`captured image information ) other than the sound input by
`a so - called individual recognition technology and execute
`various processes in accordance with the recognition result .
[0046] For example, in the example illustrated in FIG. 1, the information processing device 1 receives the sound input c10a "Check my schedule for tomorrow at 1 pm" from the user Ua, confirms the schedule of the user Ua, and outputs response information c11a "You have a meeting with Mr. Yamada in room A" as a sound on the basis of a result of the confirmation.
[0047] In this case, for example, on the basis of an analysis result of the sound input c10a, the information processing device 1 recognizes instruction content indicated by the sound input c10a (that is, confirming tomorrow's schedule at 1 pm). In addition, for example, on the basis of the analysis result of the sound input c10a or an image of the user Ua captured by a different imaging unit (not illustrated), the information processing device 1 individually recognizes the user Ua. Then, on the basis of a recognition result of the instruction content indicated by the sound input c10a or a result of the individual recognition of the user Ua, the information processing device 1 confirms the schedule of the user Ua registered in, for example, an application for managing schedules by executing the application. Then, on the basis of a confirmation result of the schedule, the information processing device 1 generates the response information c11a for presenting the confirmation result as a natural sentence and outputs the response information c11a as a sound.
[0048] In this configuration, the user talks with the information processing device 1 by sound to cause the information processing device 1 to execute various functions.
[0049] On the other hand, in phrases used in dialogs between people (that is, users), there are phrases with the same meaning but expressed differently, such as "3 pm" and "15:00," and everyday phrases differ with each speaker (that is, speaking styles are different) in some cases. Therefore, for example, when the information processing device 1 outputs "15:00" as a response indicating a time to a user who normally says "3 pm," the user may feel uncomfortable with the response different from the speaking style of the user in some cases.
`[ 0050 ]
`A difference in the speaking style is not limited to
`the names of times as shown above , but names of peoples
`can be exemplified as other specific examples . For example ,
`a certain user calls a person with the name “ Taro Yamada ”
`by adding an honorific title to the surname such as “ Mr .
`Yamada " in some cases . In addition , another user calls the
`person with the name “ Taro Yamada ” using only the first
`name “ Taro ” in some cases . In this case , when the informa
`tion processing device 1 outputs a response calling the
`person with the name “ Taro Yamada ” by his full name “ Taro
`Yamada , " a user who calls him “ Mr . Yamada ” may feel
`uncomfortable with the response in some cases .
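Adapting a response to each user's speaking style can be sketched as a per-user substitution table in the spirit of the personal name data of FIG. 7. This is a minimal sketch under assumptions: the user identifiers, the table layout, and the helper `adapt_names` are illustrative, not the device's actual data structures.

```python
# Per-user personal name data: formal name -> the expression
# that user actually uses in everyday speech.
PERSONAL_NAME_DATA = {
    "user_a": {"Taro Yamada": "Mr. Yamada"},
    "user_b": {"Taro Yamada": "Taro"},
}

def adapt_names(response: str, user_id: str) -> str:
    """Rewrite formal names in a response into the user's own style."""
    for formal, everyday in PERSONAL_NAME_DATA.get(user_id, {}).items():
        response = response.replace(formal, everyday)
    return response

print(adapt_names("You have a meeting with Taro Yamada in room A", "user_a"))
# You have a meeting with Mr. Yamada in room A
```

Unknown users fall through unchanged, so the device would simply default to the formal expression in that case.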
`[ 0051 ]
`In addition , a situation ( for example , a user state or
`a surrounding environment ) in which the information pro
`cessing device