(19) World Intellectual Property Organization
International Bureau

(10) International Publication Number: WO 2018/118895 A2

(43) International Publication Date: 28 June 2018 (28.06.2018)

(51) International Patent Classification:
G06F 3/16 (2006.01)
H01A 11/00 (2006.01)
H04R 1/02 (2006.01)

(21) International Application Number: PCT/US2017/067274

(22) International Filing Date: 19 December 2017 (19.12.2017)

(25) Filing Language: English

(26) Publication Language: English

(30) Priority Data: 15/389,818    23 December 2016 (23.12.2016)    US

(71) Applicant: AMAZON TECHNOLOGIES, INC. [US/US]; P.O. Box 81226, Seattle, WA 98108-1226 (US).

(72) Inventors: CHUA, Albert John, Yu Sam; c/o Amazon Technologies, Inc., P.O. Box 81226, Seattle, WA 98108-1226 (US). GEJJI, Pushkaraksha; c/o Amazon Technologies, Inc., P.O. Box 81226, Seattle, WA 98108-1226 (US). CYBART, Adam, Kenneth; c/o Amazon Technologies, Inc., P.O. Box 81226, Seattle, WA 98108-1226 (US). PANCE, Aleksandar; c/o Amazon Technologies, Inc., P.O. Box 81226, Seattle, WA 98108-1226 (US). CANIZARES, Wilfrido, Loor; c/o Amazon Technologies, Inc., P.O. Box 81226, Seattle, WA 98108-1226 (US). WALLISER, Marc, Rene; c/o Amazon Technologies, Inc., P.O. Box 81226, Seattle, WA 98108-1226 (US).

(74) Agent: CUNNINGHAM, Aaron et al.; Lee & Hayes, PLLC, 601 W. Riverside Ave, Suite 1400, Spokane, WA 99201 (US).

(81) Designated States (unless otherwise indicated, for every kind of national protection available): AE, AG, AL, AM, AO, AT, AU, AZ, BA, BB, BG, BH, BN, BR, BW, BY, BZ, CA, CH, CL, CN, CO, CR, CU, CZ, DE, DJ, DK, DM, DO, DZ, EC, EE, EG, ES, FI, GB, GD, GE, GH, GM, GT, HN, HR, HU, ID, IL, IN, IR, IS, JO, JP, KE, KG, KH, KN, KP, KR, KW, KZ, LA, LC, LK, LR, LS, LU, LY, MA, MD, ME, MG, MK, MN, MW, MX, MY, MZ, NA, NG, NI, NO, NZ, OM, PA, PE, PG, PH, PL, PT, QA, RO, RS, RU, RW, SA, SC, SD, SE, SG, SK, SL, SM, ST, SV, SY, TH, TJ, TM, TN, TR, TT, TZ, UA, UG, US, UZ, VC, VN, ZA, ZM, ZW.

(84) Designated States (unless otherwise indicated, for every kind of regional protection available): ARIPO (BW, GH, GM, KE, LR, LS, MW, MZ, NA, RW, SD, SL, ST, SZ, TZ, UG, ZM, ZW), Eurasian (AM, AZ, BY, KG, KZ, RU, TJ, TM), European (AL, AT, BE, BG, CH, CY, CZ, DE, DK, EE, ES, FI, FR, GB, GR, HR, HU, IE, IS, IT, LT, LU, LV, MC, MK, MT, NL, NO, PL, PT, RO, RS, SE, SI, SK, SM, TR), OAPI (BF, BJ, CF, CG, CI, CM, GA, GN, GQ, GW, KM, ML, MR, NE, SN, TD, TG).
(54) Title: VOICE ACTIVATED MODULAR CONTROLLER

(57) Abstract: A modular controller may be mounted in an opening, such as a standard single wide or double wide electrical junction box, in a wall or other surface. The modular controller may include a power module and a front module. The power module may be mounted in the opening of the surface, and may be configured to provide electrical power to the front module. The front module may be detachably coupleable to the power module. The front module may be configured to receive audio commands, gesture commands, and/or presence input corresponding to a desired action, and may cause the action to be performed by a device of the front module and/or an external device. The front module may include various devices (e.g., components) capable of providing various functionalities, and may be selected for coupling to a power module in a particular location based at least in part on the functionalities.

[Front-page drawing: the FIG. 1 architecture, showing a modular controller 106 with input device(s) 114, processor(s) 138, computer-readable media 140, power converter 156, signal processing module 142, interpretation module 144, command module 146, data store 148, network interface(s) 158, and connector 160, together with a remote data store 134 holding a user profile 136.]

FIG. 1

Published:
— without international search report and to be republished upon receipt of that report (Rule 48.2(g))

VOICE ACTIVATED MODULAR CONTROLLER

RELATED APPLICATIONS

[0001] This application claims priority to U.S. Patent Application No. 15/389,818, filed on December 23, 2016, the entire contents of which are incorporated herein by reference.

BACKGROUND

[0002] Homes are becoming more wired and connected with the proliferation of computing devices such as desktops, tablets, entertainment systems, and portable communication devices. As these computing devices evolve, many different ways have been introduced to allow users to interact with computing devices, such as through mechanical devices (e.g., keyboards, mice, etc.), touch screens, motion, and gesture. Another way to interact with computing devices is through natural language input such as speech input and gestures.

BRIEF DESCRIPTION OF THE DRAWINGS

[0001] The detailed description is set forth below with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items. The systems depicted in the accompanying figures are not to scale and components within the figures may be depicted not to scale with each other.

[0002] FIG. 1 shows an example interactive device computing architecture set in a home environment. The architecture includes at least one modular controller physically situated in the home.

[0003] FIG. 2 illustrates a perspective view of an example front unit and an example power unit of a modular controller.

[0004] FIG. 3 illustrates an example mounting bracket for a two-gang modular controller.

[0005] FIG. 4 illustrates an example connector configured to transmit electrical power and/or data between a power unit and a front unit.

[0006] FIG. 5 illustrates another example connector configured to transmit electrical power and/or data between a power unit and a front unit.

[0007] FIG. 6 illustrates a front view of an example two-gang modular controller. In this example, the modular controller includes a first front unit and a second front unit, each with different functionality.

[0008] FIG. 7A illustrates a perspective view of an example modular controller having a front unit with a front panel configured to fit within an opening of a standard wallplate.

[0009] FIG. 7B illustrates the assembled modular controller of FIG. 7A with the standard wallplate mounted on the modular controller.

[0010] FIG. 8A illustrates a sequence of views showing an example microphone array expanding on a front panel of a front unit by sliding microphones of the microphone array into a second position.

[0011] FIG. 8B illustrates a sequence of views showing an example microphone array expanding on a front panel of a front unit by swinging microphones of the microphone array into a second position.

[0012] FIG. 9 illustrates an example process for remotely controlling a device of an environmental and/or entertainment control system using a modular controller.

[0013] FIG. 10 illustrates an example process for configuring a modular controller for use in an environmental and/or entertainment control system.

DETAILED DESCRIPTION

[0014] This disclosure describes, in part, techniques and devices for providing centralized environmental and/or entertainment control with an interactive modular controller. Environmental control may include control of lights (e.g., on, off, dim, etc.), temperature (e.g., air conditioning, heating, fan control, etc.), alarm systems, doors, windows, window shades, and/or various other environmental systems. Entertainment control may include control of visual displays, audio presentations, two-way communications, and the like. The modular controller includes a power unit and a front unit. The power unit may be configured to mount onto a surface in an environment. The power unit may be sized to fit in a standard electrical junction box. Multiple different front units may be configured to interchangeably couple to a standardized power unit. For example, the power unit may be mounted into a wall so that a front side of the power unit is substantially aligned with (i.e., flush against) the wall. The environment may include multiple power units. For example, each room in a home environment may include one or more power units. Each power unit may provide electrical power and/or data to a front unit.

[0015] The front unit may detachably couple to the power unit in the environment. The front unit may comprise a voice, touch, and/or gesture-controlled device. The front unit may include a computing system that is communicatively coupled (e.g., wired or wireless connection) to internal and/or external devices to effect the environmental and/or entertainment control. The internal and/or external devices to which the front unit computing system is communicatively coupled may include mobile and/or stationary computing devices (e.g., a tablet computing device, a mobile phone, a laptop computer, a desktop computer, a set-top box, a wearable device, etc.), appliances (e.g., a television, an audio system, a garage door opener, a washing machine, a dryer, a dishwasher, a coffee maker, a refrigerator, a door, motorized window shades, a telephone, a tablet, etc.), fixtures (e.g., a light, a lock, a sink, a toilet, a door bell, a smoke alarm, a fire alarm, a carbon monoxide detector, etc.), or other types of devices in the environment.

[0016] The front unit may include various components and settings based on a location of the front unit in the environment. For example, a front unit located in a foyer of a house may include a display for presenting a weather forecast for a user to view prior to departing the house, and speakers for greeting the user upon entry into the house. For another example, a front unit located in a living room may include a mechanical switch configured to control one or more lights and/or a display to view thermostat settings. A user may select a front unit for a particular location in the home environment based on the functionalities of the front unit.

[0017] The user in the environment may issue, to the front unit, a command (e.g., voice, touch, and/or gesture input) including a request for the front unit to cause a second device in the environment to perform an action (e.g., operation). The command may include a request for the front unit to cause a component (e.g., internal device) of the front unit to perform the action, such as "play music." Additionally or alternatively, the command may include a request for the front unit to cause an external device to perform the action, such as "turn on the lights."

[0018] The front unit may include a microphone array configured to receive a voice command, a touch command, and/or one or more sensors configured to receive a gesture command (e.g., a movement of a body part corresponding to a request for the front unit to cause an action to be performed) and/or a touch command (e.g., physical touch via an input device corresponding to a request for the front unit to cause an action to be performed). The respective component of the front unit may receive the command and send, over a network, a signal corresponding to the command to a computing system, which in some cases may include a main logic board housing a system on a chip. The computing system may be a local computing system (e.g., internal to the front unit and/or power unit of the modular controller) and/or a remote computing system (e.g., a computing device external to the modular controller). The computing system may perform speech and/or gesture recognition on the signal to identify the command. The computing system may then interpret (e.g., determine a meaning of) the command, and a device corresponding thereto. For instance, if the command is a request to "turn on the lights," the computing system may identify the location of the front unit and one or more lights corresponding to the location. The computing system may then generate a control signal including an instruction for a controller of the identified device to perform an action, the control signal including an identification and/or location of the identified device. The computing system may then send the control signal to the controller over the network. In examples in which the computing system includes a remote service, the computing system may send the control signal to the controller via the front unit.
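
Read as data flow, paragraph [0018] describes a four-stage pipeline: capture a signal, recognize it, interpret it against the front unit's location, and emit a control signal to the identified device's controller. The Python sketch below is purely illustrative of that flow; every name in it (recognize, interpret, ControlSignal, DEVICE_REGISTRY) is a hypothetical stand-in, and the recognition step is stubbed rather than performing real speech or gesture recognition.

```python
# Illustrative sketch of the command pipeline in [0018]; all names are
# hypothetical and the recognition step is a stub, not a real recognizer.
from dataclasses import dataclass


@dataclass
class ControlSignal:
    device_id: str   # identification of the identified device
    location: str    # location of the identified device
    action: str      # action for the device's controller to perform


# Hypothetical registry mapping (location, device type) -> device id.
DEVICE_REGISTRY = {("living room", "lights"): "light-116"}


def recognize(audio_signal: bytes) -> str:
    """Stand-in for speech/gesture recognition performed on the signal."""
    return "turn on the lights"  # pretend ASR output


def interpret(text: str, front_unit_location: str) -> ControlSignal:
    """Map recognized text, plus the front unit's location, to a device action."""
    if "turn on the lights" in text:
        device_id = DEVICE_REGISTRY[(front_unit_location, "lights")]
        return ControlSignal(device_id, front_unit_location, "turn_on")
    raise ValueError(f"unrecognized command: {text!r}")


def handle_command(audio_signal: bytes, front_unit_location: str) -> ControlSignal:
    text = recognize(audio_signal)
    control_signal = interpret(text, front_unit_location)
    # The computing system would now send this control signal to the
    # device's controller over the network (directly or via the front unit).
    return control_signal


print(handle_command(b"...", "living room"))
# ControlSignal(device_id='light-116', location='living room', action='turn_on')
```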

[0019] Furthermore, while the above examples describe a user requesting an action to be performed, in other instances a device may initiate a process for causing an internal and/or external device to perform an action. For example, a front unit may be programmed to cause a device to perform a certain action upon one or more conditions being met, such as a user being detected in an environment, a time of day occurring, or the like. For example, a motion sensor may detect the presence of a user and may initiate a process for causing a light to turn on. For another example, a front unit may be configured to track user activity and/or preferences. The front unit may generate an activity model to anticipate settings for various devices based on the user activity and/or preferences.
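
Such condition-triggered behavior amounts to pairing a predicate with an action. The sketch below is a minimal, hypothetical rule table rather than anything described in the specification; it shows a motion-plus-time-of-day condition gating a lighting action.

```python
# Hypothetical condition -> action rules, e.g. a motion sensor and a
# time-of-day condition together triggering a light.
import datetime


def motion_detected() -> bool:
    return True  # stand-in for a real presence-sensor reading


def after_sunset(now: datetime.time) -> bool:
    return now >= datetime.time(18, 0)  # crude "time of day occurring" test


RULES = [
    # (condition, action) pairs; this structure is an assumption for illustration.
    (lambda: motion_detected() and after_sunset(datetime.datetime.now().time()),
     "turn on foyer light"),
]


def evaluate_rules() -> list:
    """Return the actions whose conditions are currently met."""
    return [action for condition, action in RULES if condition()]


print(evaluate_rules())
```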

[0020] The present disclosure provides an overall understanding of the principles of the structure, function, manufacture, and use of the systems and methods disclosed herein. One or more examples of the present disclosure are illustrated in the accompanying drawings. While the techniques and devices presented herein are described with respect to the home environment, those of ordinary skill in the art will understand that the systems and methods specifically described herein and illustrated in the accompanying drawings are non-limiting embodiments, and that the techniques and devices described herein may also be used in other environments (e.g., business environments, commercial environments, etc.). The features illustrated or described in connection with one embodiment may be combined with the features of other embodiments, including as between systems and methods. Such modifications and variations are intended to be included within the scope of the appended claims.

[0021] FIG. 1 shows an illustrative interaction computing architecture 100 set in a home environment 102 that includes a user 104 and a modular controller 106. The human interaction may include a voice command, a touch command, a gesture command, and/or a physical presence in the home environment 102.

[0022] As illustrated, a user 104 in the environment 102 may issue a voice command 108 to the modular controller 106. In some examples, the command 108 may additionally or alternatively include a touch and/or gesture command. A front unit 110 of the modular controller 106 may receive the command 108 via one or more input device(s) 114. The input device(s) 114 may include one or more microphones to detect audio signals, one or more sensors to detect a touch, a gesture, and/or physical presence of the user, a keyboard, keypad, mouse, joystick, control buttons, touch screen display, etc. The input device(s) 114 may generate one or more signals for identifying the command 108 and causing the corresponding action to be performed, such as causing a light 116 to be turned on and/or music 118 to be played.

[0023] The input device(s) 114 may send the signals corresponding to the command 108 to a computing system for processing. In some examples, the input device(s) may send the signals to a remote computing system, such as a remote service 120, either directly or via a local computing system. In such examples, the input device(s) and/or the local computing system may cause the remote service 120 to perform one or more speech processing operations based on the signal.

[0024] The remote service 120 may include one or more remote devices (or "computing resources"). The remote devices may form a portion of a network-accessible computing platform implemented as a computing infrastructure of processors, storage, software, data access, etc. that is maintained and accessible via a network 122, such as the Internet. The remote devices do not require end-user knowledge of the physical location and configuration of the system that delivers the services. Common expressions associated with these remote devices include "on-demand computing," "software as a service (SaaS)," "platform computing," "network-accessible platform," "cloud services," "data centers," and so forth. Further, while FIG. 1 illustrates the remote service 120 as being accessible over a network, in other examples, the remote service 120 may comprise a local hub within the home environment 102.

[0025] As illustrated, the remote service 120 includes one or more processor(s) 124 and computer-readable media 126, which have access to a signal processing module 128, an interpretation module 130, a command module 132, and a data store 134. The processor(s) 124 may include a central processing unit (CPU), a graphics processing unit (GPU), both a CPU and a GPU, a microprocessor, a digital signal processor, or other processing units or components known in the art. Alternatively, or in addition, the functionality described herein may be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), system-on-a-chip systems (SOCs), complex programmable logic devices (CPLDs), etc. Additionally, each of the processor(s) 124 may possess its own local memory, which also may store program modules, program data, and/or one or more operating systems.

[0026] The computer-readable media 126 may include volatile and nonvolatile memory, removable and non-removable media implemented in any method or technology for storage of information, such as computer-readable instructions, data structures, program modules, or other data. Such memory includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, RAID storage systems, or any other medium which can be used to store the desired information and which can be accessed by a computing device. The computer-readable media 126 may be implemented as computer-readable storage media ("CRSM"), which may be any available physical media accessible by the processor(s) to execute instructions stored on the computer-readable media. In various examples, CRSM may include random access memory ("RAM") and Flash memory. In some examples, CRSM may include, but is not limited to, read-only memory ("ROM"), electrically erasable programmable read-only memory ("EEPROM"), or any other tangible medium which can be used to store the desired information and which can be accessed by the processor(s) 124.

[0027] Several functional modules, data stores, and so forth may be stored within the computer-readable media 126 and configured to execute on the processor(s) 124. A few example functional modules (e.g., signal processing module 128, interpretation module 130, and command module 132) are shown as applications stored in the memory and executed on the processor(s) 124, although the same functionality may alternatively be implemented in hardware, firmware, or as a system on a chip (SOC). Additionally, though illustrated as signal processing module 128, interpretation module 130, and command module 132, the functionality of the modules may be performed by a greater or lesser number of modules.

[0028] In various examples, the front unit 110 of the modular controller 106 may send signals corresponding to the command 108 (e.g., voice, touch, gesture, and/or presence command) to the remote service 120 for speech processing (e.g., automatic speech recognition (ASR), natural language understanding (NLU), query parsing, etc.) to identify the command. The remote service 120 may receive the signals at the signal processing module 128. In various examples, the signal processing module 128 may perform object and/or speech recognition processing techniques to generate a processed signal. For example, the signal processing module 128 may receive a signal related to a gesture command, and may perform scale-invariant feature transform, speeded-up robust features, greyscale matching, gradient matching, facial recognition, or other techniques to identify a gesture and/or a user associated with the signal. For another example, the signal processing module 128 may receive an audio signal and may perform beamforming, acoustic-echo cancelation, background noise reduction, or other techniques to generate one or more processed audio signals having a higher signal-to-noise ratio than the pre-processed audio signals.
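
To make the signal-to-noise claim concrete, the following numpy sketch applies one of the named techniques, background noise reduction, in its simplest form: a spectral gate that zeroes low-energy frequency bins. The synthetic tone, the gating threshold, and the function names are assumptions for illustration only, not the signal processing module's actual algorithm.

```python
# Minimal spectral-gating noise reduction: zero out FFT bins whose magnitude
# falls below a noise-floor estimate, then measure the SNR improvement.
import numpy as np

rng = np.random.default_rng(0)
fs = 16_000                                    # sample rate (Hz)
t = np.arange(fs) / fs
clean = np.sin(2 * np.pi * 440 * t)            # stand-in for speech
noisy = clean + 0.5 * rng.standard_normal(fs)  # the pre-processed audio signal


def spectral_gate(x: np.ndarray, floor_scale: float = 2.0) -> np.ndarray:
    spectrum = np.fft.rfft(x)
    magnitude = np.abs(spectrum)
    floor = floor_scale * np.median(magnitude)  # crude noise-floor estimate
    spectrum[magnitude < floor] = 0.0           # gate the low-energy bins
    return np.fft.irfft(spectrum, n=len(x))


def snr_db(reference: np.ndarray, estimate: np.ndarray) -> float:
    noise = estimate - reference
    return 10 * np.log10(np.sum(reference**2) / np.sum(noise**2))


processed = spectral_gate(noisy)
print(f"SNR before: {snr_db(clean, noisy):5.1f} dB")
print(f"SNR after:  {snr_db(clean, processed):5.1f} dB")
```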

[0029] The remote service 120 may route the signals (e.g., touch, gesture, presence, audio, etc.) to the interpretation module 130 configured to interpret the command associated therewith. In various examples, the interpretation module 130 may perform ASR on audio data to determine text. In such examples, the interpretation module 130 may process the text to determine multiple portions thereof. For example, a first portion of the text may correspond to a command and a second portion of the text may correspond to a conditional statement that may relate to when the command should be executed. In such examples, the interpretation module 130 or other module of the remote service 120 may be configured to determine if the condition is met. For another example, a first portion of the text may correspond to a first command and a second portion may correspond to a second command. Additionally or alternatively, the interpretation module 130 may perform NLU on input data to determine a meaning of the data (e.g., a meaning of the command). In some examples, NLU may be performed on the text determined as a result of the ASR processing. NLU processing interprets a text string to derive an intent or a desired action from the user as well as the pertinent pieces of information in the text that allow a device to complete the desired action. In various examples, the interpretation module 130 may compare words in the signals to words, phrases, touch selections, gestures, and/or presence settings in the data store 134. In some examples, one or more of the words, phrases, touch selections, gestures, and/or presence settings may be user-specific and stored in a user profile 136. In the illustrative example, the interpretation module 130 may receive a processed signal corresponding to the voice command 108, and may determine that the processed signal includes a first command to turn on the light 116 and a second command to play music 118.
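
A toy version of that text-splitting step might look like the following, where ASR output is divided into a command portion and a conditional portion and matched against known phrases. The phrase table, the "when"/"and" splitting, and all names are illustrative assumptions; real NLU is far more involved.

```python
# Hypothetical interpretation step: split ASR text into command and
# conditional portions, then match commands against stored phrases.
from __future__ import annotations

KNOWN_COMMANDS = {  # stand-in for phrases in the data store / user profile
    "turn on the lights": ("light-116", "turn_on"),
    "play music": ("speaker-110", "play_music"),
}


def interpret(text: str) -> list[tuple[str, str, str | None]]:
    """Return (device, action, condition) triples derived from the text."""
    # A first portion may be a command; a second, a conditional statement.
    command_part, _, condition = text.partition(" when ")
    results = []
    # The command portion may itself hold two commands joined by "and".
    for chunk in command_part.split(" and "):
        chunk = chunk.strip()
        if chunk in KNOWN_COMMANDS:
            device, action = KNOWN_COMMANDS[chunk]
            results.append((device, action, condition or None))
    return results


print(interpret("turn on the lights and play music"))
# [('light-116', 'turn_on', None), ('speaker-110', 'play_music', None)]
print(interpret("turn on the lights when I get home"))
# [('light-116', 'turn_on', 'I get home')]
```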

[0030] The output data from the interpretation module 130 may then be sent to the command module 132 for command processing. The command module 132 may determine a device associated with the commands, and may generate an instruction (e.g., a control signal) to perform a corresponding action (e.g., operation). The instruction may include information about the action and/or information about the device identified to perform the action. The remote service 120 may then send the instruction to the associated device directly and/or via the front unit 110. In the illustrative example, the remote service 120 sends instructions to the front unit 110 to play music 118 and turn on the light 116, such as by activating a light switch on the front unit 110. In some examples, the instructions may include a command to send the control signal to a device, such as the light 116, to perform the corresponding action ("turn on").
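
As a rough illustration, the instruction assembled by a command module could be as simple as a small structured message carrying the action plus the identified device's identity and location, routed directly or via the front unit. The JSON layout and names below are assumptions, not a format disclosed in the specification.

```python
# Hypothetical control-signal assembly and routing, loosely following [0030].
import json


def build_instruction(device_id: str, location: str, action: str) -> str:
    """Package the action with the identified device's identity and location."""
    return json.dumps({
        "device": device_id,   # identification of the identified device
        "location": location,  # location of the identified device
        "action": action,      # operation to perform, e.g. "turn_on"
    })


def route(device_reachable: bool) -> str:
    """Send directly to the device when possible, else relay via the front unit."""
    return "direct" if device_reachable else "via front unit"


instruction = build_instruction("light-116", "living room", "turn_on")
print(instruction, "->", route(device_reachable=False))
```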

[0031] The front unit 110 may communicate with the remote service 120 and/or one or more devices over one or more network(s) 122, which may comprise wired technologies (e.g., wires, USB, fiber optic cable, etc.), wireless technologies (e.g., WiFi, RF, cellular, satellite, etc.), or other connection technologies. The network(s) 122 may represent any type of communication network, including a data and/or voice network, and may be implemented using a wired infrastructure (e.g., cable, CAT5, fiber optic cable, etc.), a wireless infrastructure (e.g., RF, cellular, microwave, satellite, etc.), and/or other connection technologies. In some examples, the front unit 110 may also communicate with devices via short-range wireless communication protocols (e.g., Bluetooth®, Zigbee®, etc.). For example, the front unit 110 may send a control signal to the light 116 via a Bluetooth® connection.

[0032] In some examples, the input device(s) 114 may receive the command, and may send the signals to a local computing system for processing. The local computing system of the front unit 110 may include one or more processor(s) 138 and computer-readable media 140. The processor(s) 138 may include a central processing unit (CPU), a graphics processing unit (GPU), both a CPU and a GPU, a microprocessor, a digital signal processor, or other processing units or components known in the art. Alternatively, or in addition, the functionality described herein may be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), system-on-a-chip systems (SOCs), complex programmable logic devices (CPLDs), etc. Additionally, each of the processor(s) 138 may possess its own local memory, which also may store program modules, program data, and/or one or more operating systems.

[0033] The computer-readable media 140, similar to the computer-readable media 126 described above, may include any tangible medium used to store the desired information and which can be accessed by the processor(s) 138. Several modules such as instructions, data stores, and so forth may be stored within the computer-readable media 140 and configured to execute on the processor(s) 138. A few example functional modules are shown as applications stored in the computer-readable media 140 and executed on the processor(s) 138, although the same functionality may alternatively be implemented in hardware, firmware, or as a system on a chip (SOC). For example, the computer-readable media 140 may include a signal processing module 142. In some examples, the signal processing module 142 may be configured to employ any number of conventional speech processing techniques, such as automatic speech recognition, natural language understanding, and extensive lexicons, to interpret voice input, similar to the signal processing module 128. In other examples, the signal processing module 142 may simply be programmed to identify the user uttering a predefined word or phrase (e.g., a "wake word"), making a predefined touch input, or making a predefined gesture, after which the front unit 110 may begin uploading audio signals to the remote service 120 for more robust processing. In such examples, the signal processing module 142 may compare words detected by a microphone to the wake word and, responsive to a match, may determine that words detected thereafter comprise a command. Additionally, the front unit 110 may include an interpretation module 144, a command module 146, and/or a data store 148, to process a signal, interpret the contents of the signal, and generate a command associated with the signal, in lieu of, or in addition to, the remote service 120 described above.
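
The wake-word gating described in [0033] can be pictured as a small filter: compare each detected word to the predefined word and, on a match, treat what follows as a command to hand off for more robust processing. The sketch below is an assumed, word-level simplification; WAKE_WORD and the hand-off behavior are placeholders, not details from the specification.

```python
# Hypothetical wake-word gate: ignore detected words until the predefined
# word is heard, then treat the words after it as a command.
from __future__ import annotations

WAKE_WORD = "computer"  # placeholder; the specification names no wake word


def gate_words(words: list[str]) -> list[str] | None:
    """Return the words following the wake word, or None if it never occurs."""
    for i, word in enumerate(words):
        if word.lower() == WAKE_WORD:
            # On a match, words detected thereafter comprise a command; a
            # real front unit would now begin uploading audio signals to the
            # remote service for more robust processing.
            return words[i + 1:]
    return None


print(gate_words(["computer", "turn", "on", "the", "lights"]))
# ['turn', 'on', 'the', 'lights']
print(gate_words(["just", "chatting"]))  # None -> nothing to upload
```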

[0034] In various examples, the front unit 110 may include one or more output device(s) 150 configured to perform requested actions. The output device(s) 150 may include a speaker, a display, a wireless transceiver configured to transmit a signal to an external device (e.g., an external speaker, display, computing device, audio/visual system, a second front unit, etc.), a vibrator to create haptic sensations, and the like. The output device(s) 150 may include device controllers configured to receive control signals. Responsive to receiving a control signal from the local computing system and/or the remote service 120, the output device(s) 150 may perform the requested action. In the illustrative example, responsive to receiving a control signal associated with the command 108 to "play music," the speakers on the front unit 110 may play the requested music 118.

[0035] In some examples, the front unit 110 may include one or more network interface(s) 152 to facilitate network communications with external devices. In some examples, the network interface(s) 152 may be coupled to an antenna to facilitate wireless communication to the network(s) 122. The network interface(s) 152 may implement one or more various wireless technologies, such as Wi-Fi, Bluetooth®, RF, and so on. Additionally, the front unit 110 may include an external device port 154, such as a USB port, to connect external devices directly to the front unit 110. The external device port 154 may facilitate power and/or data transfer between the front unit 110 and the external device. The external device may couple to the external device port 154 and/or the front unit 110 via a USB connector, a pogo pin, a mechanical attachment (e.g., a snap-fit connector, a magnetic connector, etc.), or other type of coupling mechanism.

[0036] In various examples, the front unit 110 may receive electrical power from and/or transfer data to/from the power unit 112. The power unit 112 may be mounted in a wall, such as in the wall of the environment. In some examples, the power unit 112 may be inset into the wall such that a back side of a face plate is substantially flush against (e.g., aligned with) a surface of the wall. In various examples, the power unit 112 may be sized to fit into a standard electrical junction box. The power unit 112 may include a power converter 156 configured to provide an appropriate power format (e.g., voltage, current, etc.). Additionally, in some examples, the power unit 112 may be configured to provide a means for data transfer via one or more network interface(s) 158, such as, for example, via a power-line communication connection. The power unit 112 may provide electrical power and/or data transfer via a connector 160.

[0037] FIG. 2 illustrates a perspective view of an example power unit 200, such as power unit 112, and an example front unit 202, such as front unit 110, of a modular controller 204, such as modular controller 106. As discussed above, the power unit 200 may be configured to receive power, such as via one or more wires 206, convert the electrical power to a format compatible with the front unit 202 (e.g., AC to DC conversion, such as 110V to 9V conversion, 220V to 9V conversion, etc.), and transmit power to the front unit 202.

[0038] In various examples, the power unit 200 may comprise a form factor configured to fit in a standard electrical junction box. In such examples, the power unit 200 may comprise a height, width, and/or depth to house within a junction box. In various examples, the power unit 200 may be sized to fit within a single gang, double gang, or multiple gang junction box. In some examples, two or more power units 200 may fit in a double gang or multiple gang junction box.

[0039] The power unit 200 may be configured to couple to one or more wires 206 for receiving power (e.g., connect to a power source) via a terminal. In various examples, the wires 206 may include one or more electrical wires (e.g., hot, ground, load, neutral). In some examples, the wires 206 may also be used to transmit data (e.g.,
`