`(12) Patent Application Publication (10) Pub. No.: US 2014/0336935 A1
`Zhu et al.
(43) Pub. Date: Nov. 13, 2014
`
`(54) METHODS AND SYSTEMS FOR DETECTING
`WEATHER CONDITIONS USING VEHICLE
`ONBOARD SENSORS
`
`(52) U.S. Cl.
CPC ....................................... G01W 1/00 (2013.01)
`USPC .............................................................. 702/3
`
(71) Applicant: Google Inc. (US)
`(72) Inventors: Jiajun Zhu, Sunnyvale, CA (US);
`Dmitri Dolgov, Mountain View, CA
`(US); Dave Ferguson, San Francisco,
`CA (US)
`(73) Assignee: Google Inc., Mountain View, CA (US)
`(21) Appl. No.: 13/888,634
(22) Filed: May 7, 2013
`
`Publication Classification
`
`(51) Int. Cl.
G01W 1/00    (2006.01)
`
`(57)
`ABSTRACT
Example methods and systems for detecting weather conditions using vehicle onboard sensors are provided. An example method includes receiving laser data collected for an environment of a vehicle, and the laser data includes a plurality of laser data points. The method also includes associating, by a computing device, laser data points of the plurality of laser data points with one or more objects in the environment, and determining given laser data points of the plurality of laser data points that are unassociated with the one or more objects in the environment as being representative of an untracked object. The method also includes, based on one or more untracked objects being determined, identifying by the computing device an indication of a weather condition of the environment.
`
`
`
`
`
`
`
[FIG. 1 (Sheet 1 of 7): functional block diagram of vehicle 100. Blocks shown: propulsion system 102 (engine/motor 118, energy source 119, transmission 120); sensor system 104 (global positioning system 122, precipitation sensor 123, inertial measurement unit 124, radar unit 126, laser rangefinder/LIDAR unit 128, microphone 131); computer system 112 (processor 113, instructions 115, data storage 114); control system 106 (steering unit 132, throttle 134, brake unit 136, sensor fusion algorithm 138, computer vision system 140, navigation/pathing system 142, obstacle avoidance system 144); peripherals 108 (wireless communication system 146, touchscreen 148, microphone 150, speaker 152); power supply 110; user interface 116.]
`
`
`
`
`
`
`
[FIG. 2 (Sheet 2 of 7): example vehicle shown in right side view, back view, and top view.]
`
`
`
`
`
`300
`
`
`
`RECEIVENG LASER DATA COLLECTED FOR AN ENVIRONMENT OF A
`WEHICLE
`
`302
`
`ASSOCATING, BY A COMPUTING DEVICE, LASER DATA POINTS OF THE
`PLURALY OF ASER DATA PONTS WHONE OR NORE OBJECTS EN
`THE ENVIRONMENT
`
`304
`
`DETERMINING GIVEN LASER DATA POINTS OF THE PLURALITY OF
`LASER DAA PONTS THAT ARE UNASSOCATED WITH THE ONE OR
`WORE OBJECTS N. HE ENVIRONMEN AS BEENGREPRESENTAVE OF
`AN UNTRACKED OBJECT
`
`306
`
`BASED ON ONE OR MORE UNTRACKED OBJECTS BEING DETERMINED,
`EDENTEFYNG BY THE COMPUTENG OEVCE AN ENDCATION OF A
`WEATHER CONDON OF THE ENVIRONMENT
`
`308
`
`FIG. 3
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`402
`
`AGREEMENT
`WTH RADAR
`DATA2
`
`AGREEMENT
`WH CAMERA
`DATA2
`
`AGREEMENT
`WTH OTHER SENSOR
`DATAP
`
`AGREEMENT
`WTH WEATHER
`DATAP
`
`
`
`400
`
`404
`
`
`
`
`
`N O
`REPROCESSING LASER
`DATA
`
`412
`
`
`
`
`
`OENTEFYING AN
`NDCATION OF A
`WEATHER CONDITION
`
`FIG. 4
`
`
`
`
`
`
`
`
`
`
`
`
`
`
METHODS AND SYSTEMS FOR DETECTING WEATHER CONDITIONS USING VEHICLE ONBOARD SENSORS
`
`BACKGROUND
`0001 Unless otherwise indicated herein, the materials
`described in this section are not prior art to the claims in this
`application and are not admitted to be prior art by inclusion in
`this section.
0002 Autonomous vehicles use various computing systems to aid in the transport of passengers from one location to another. Some autonomous vehicles may require an initial input or continuous input from an operator, such as a pilot, driver, or passenger. Other autonomous systems, for example autopilot systems, may be used when the system has been engaged, which permits the operator to switch from a manual mode (where the operator exercises a high degree of control over the movement of the vehicle) to an autonomous mode (where the vehicle essentially drives itself) to modes that lie somewhere in between.
0003. Such vehicles are typically equipped with various types of sensors in order to detect objects in the surroundings. For example, an autonomous vehicle may include lasers, sonar, radar, cameras, and other devices which scan and record data from the surroundings of the vehicle. Sensor data from one or more of these devices may be used to detect objects and their respective characteristics (position, shape, heading, speed, etc.). This detection and identification is useful for the safe operation of the autonomous vehicle.
`
`SUMMARY
0004. Within examples, devices and methods for detecting weather conditions using vehicle onboard sensors are provided.
0005. In one example, a method is provided that comprises receiving laser data collected for an environment of a vehicle, and the laser data includes a plurality of laser data points. The method also comprises associating, by a computing device, laser data points of the plurality of laser data points with one or more objects in the environment. The method also comprises determining given laser data points of the plurality of laser data points that are unassociated with the one or more objects in the environment as being representative of an untracked object, and based on one or more untracked objects being determined, identifying by the computing device an indication of a weather condition of the environment.
0006. In another example, a non-transitory computer readable storage medium is provided having stored therein instructions that, when executed by a computing device, cause the computing device to perform functions. The functions comprise receiving laser data collected for an environment of a vehicle, and the laser data includes a plurality of laser data points. The functions also comprise associating laser data points of the plurality of laser data points with one or more objects in the environment, and determining given laser data points of the plurality of laser data points that are unassociated with the one or more objects in the environment as being representative of an untracked object. The functions also comprise, based on one or more untracked objects being determined, identifying an indication of a weather condition of the environment.
0007. In still another example, a system is provided that comprises at least one processor, and data storage comprising instructions executable by the at least one processor to cause the system to perform functions. The functions comprise receiving laser data collected for an environment of a vehicle, and the laser data includes a plurality of laser data points, and associating laser data points of the plurality of laser data points with one or more objects in the environment. The functions also comprise determining given laser data points of the plurality of laser data points that are unassociated with the one or more objects in the environment as being representative of an untracked object, and based on one or more untracked objects being determined, identifying an indication of a weather condition of the environment.
0008. In still another example, a device is provided comprising a means for receiving laser data collected for an environment of a vehicle, and the laser data includes a plurality of laser data points. The device also comprises means for associating laser data points of the plurality of laser data points with one or more objects in the environment. The device also comprises means for determining given laser data points of the plurality of laser data points that are unassociated with the one or more objects in the environment as being representative of an untracked object, and based on one or more untracked objects being determined, means for identifying by the computing device an indication of a weather condition of the environment.
0009. These as well as other aspects, advantages, and alternatives will become apparent to those of ordinary skill in the art by reading the following detailed description, with reference where appropriate to the accompanying figures.
`
`BRIEF DESCRIPTION OF THE FIGURES
0010 FIG. 1 is a functional block diagram depicting a vehicle according to an example embodiment.
0011 FIG. 2 depicts an example vehicle that can include all or some of the functions described in connection with the vehicle in reference to FIG. 1.
0012 FIG. 3 is a block diagram of an example method for detecting a weather condition using onboard vehicle sensors, in accordance with at least some embodiments described herein.
0013 FIG. 4 is a block diagram of example methods for determining further indications of weather conditions using onboard vehicle sensors, in accordance with at least some embodiments described herein.
0014 FIG. 5 is an example conceptual side view illustration of identifying a weather condition including an indication that a surface on which a vehicle travels is wet.
0015 FIG. 6 is another example conceptual illustration of identifying an indication that a surface on which a vehicle travels is wet.
0016 FIG. 7 is another example conceptual illustration of identifying a weather condition that includes fog.
0017 FIG. 8A is another example conceptual illustration of identifying an indication that a weather condition of the environment includes fog.
0018 FIG. 8B is an example conceptual illustration of an image captured by the vehicle in FIG. 8A.
0019 FIG. 9A is another example conceptual illustration of identifying a weather condition, which in this instance is a sunny condition based on camera images.
0020 FIG. 9B is an example conceptual illustration of an image captured by the vehicle in FIG. 9A.
0021 FIG. 10 includes another example conceptual side view illustration of identifying an indication that an environment of a vehicle is sunny.
`
`
`
`
`
`DETAILED DESCRIPTION
0022. The following detailed description describes various features and functions of the disclosed systems and methods with reference to the accompanying figures. In the figures, similar symbols identify similar components, unless context dictates otherwise, and the figures or components of the figures may not necessarily be drawn to scale for illustration purposes. The illustrative system and method embodiments described herein are not meant to be limiting. It may be readily understood that certain aspects of the disclosed systems and methods can be arranged and combined in a wide variety of different configurations, all of which are contemplated herein.
0023. Within examples, methods and systems are provided for detecting weather conditions using vehicle onboard sensors, and modifying behavior of the vehicle accordingly. In some examples, self-driving cars or autonomous vehicles may not drive, or may not drive as well, under certain weather conditions such as heavy rain, wet roads, fog, direct sunlight, etc., and thus, behavior of the autonomous vehicle may be based on the detected weather condition.
0024. In one example, a method is provided that comprises receiving laser data collected for an environment of a vehicle. A computing device may be configured to associate laser data points of the plurality of laser data points with one or more objects in the environment, and determine given laser data points of the plurality of laser data points that are unassociated with the one or more objects in the environment as being representative of an untracked object. Based on one or more untracked objects being determined, the computing device may identify an indication of a weather condition of the environment.
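As a rough illustration of these steps, the following is a minimal Python sketch, assuming laser returns are 2-D points matched against axis-aligned footprints of tracked objects; the `TrackedObject` type, the matching margin, and the point-count threshold are illustrative assumptions, not details from this disclosure.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) position of a laser return, in meters

@dataclass
class TrackedObject:
    """Axis-aligned footprint of an object already being tracked."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float

def split_laser_points(points: List[Point], tracked: List[TrackedObject],
                       margin: float = 0.5) -> Tuple[List[Point], List[Point]]:
    """Associate laser points with tracked objects; leftover points are
    treated as returns from untracked objects (e.g., spray, fog, rain)."""
    associated, unassociated = [], []
    for x, y in points:
        hit = any(o.x_min - margin <= x <= o.x_max + margin and
                  o.y_min - margin <= y <= o.y_max + margin
                  for o in tracked)
        (associated if hit else unassociated).append((x, y))
    return associated, unassociated

def indicates_weather_condition(unassociated: List[Point],
                                min_points: int = 50) -> bool:
    # Many stray returns with no corresponding tracked object suggest a
    # diffuse, non-solid reflector such as water spray or fog.
    return len(unassociated) >= min_points
```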
0025. In a specific example, a radar sensor may be unable to detect some weather conditions, such as rain or fog (e.g., specifically water kicked up off a surface in an arch by a vehicle traveling at a certain speed, like a rooster tail of water). However, a laser sensor may collect laser data relating to such water conditions. Thus, for any untracked received laser data (e.g., laser data that does not match to received radar data), an indication of a weather condition can be determined, and the specific weather condition can be ascertained by details of the untracked object.
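A sketch of that laser/radar cross-check, under the simplifying assumption that both sensors report object positions in a common 2-D frame; the distance gate and function names are hypothetical:

```python
from math import hypot
from typing import List, Tuple

Point = Tuple[float, float]

def corroborated_by_radar(laser_centroid: Point, radar_tracks: List[Point],
                          max_dist: float = 1.0) -> bool:
    """True if some radar track lies near the laser cluster's centroid."""
    cx, cy = laser_centroid
    return any(hypot(cx - rx, cy - ry) <= max_dist for rx, ry in radar_tracks)

def untracked_laser_clusters(laser_clusters: List[Point],
                             radar_tracks: List[Point]) -> List[Point]:
    # Radar tends to see through water spray that a laser reflects off, so a
    # laser-only detection is a candidate indication of a weather condition.
    return [c for c in laser_clusters
            if not corroborated_by_radar(c, radar_tracks)]
```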
0026. As another example, sunlight may contaminate laser data by causing additional laser data points or altering wavelengths of laser data points. Thus, some laser data collected may not be representative of objects in an environment of the vehicle. By comparing to data of objects in the environment (such as tracked objects using laser or radar data), those laser data points that are contaminated may be identified, and based on identifying such contaminated data, a determination of a sunny weather condition can be made. Additional details may be accessed to further confirm the sunny weather condition, such as a geographic location of the vehicle and a time of day, information about the weather from a server, or image data from a camera coupled to the vehicle.
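One possible way to fold those details into a decision, sketched with illustrative thresholds and a crude daytime test standing in for the geographic-location and time-of-day check:

```python
import datetime
from typing import Optional

def sunny_indication(num_contaminated: int, num_total: int,
                     now: Optional[datetime.datetime] = None,
                     ratio_threshold: float = 0.05) -> bool:
    """Flag a sunny condition when a meaningful share of laser returns is
    contaminated (unmatched extra points or altered wavelengths) and the
    time of day makes direct sunlight plausible."""
    if num_total == 0:
        return False
    now = now or datetime.datetime.now()
    is_daytime = 6 <= now.hour <= 18  # stand-in for location/time-of-day data
    return is_daytime and (num_contaminated / num_total) >= ratio_threshold
```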
0027. As yet another example, a laser sensor may be unable to detect objects through fog, and may, in fact, receive data points reflected off the fog. For any untracked received laser data (e.g., laser data that does not match to tracked or known objects in the environment), an indication can be determined that the vehicle is in an environment that has a foggy weather condition.
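One heuristic reading of that idea in Python: fog returns tend to form a diffuse shell of short, similar ranges around the sensor, unlike the compact footprint of a solid object. The range cutoff, point count, and spread test are all illustrative assumptions.

```python
from statistics import mean, pstdev
from typing import List

def fog_indication(untracked_ranges: List[float],
                   max_range: float = 15.0,
                   min_points: int = 100) -> bool:
    """Decide whether untracked laser returns look like fog reflections."""
    near = [r for r in untracked_ranges if r <= max_range]
    if len(near) < min_points:
        return False
    # A tight spread of short ranges suggests reflections off a diffuse
    # medium (fog) rather than off distinct physical objects.
    return pstdev(near) < 0.25 * mean(near)
```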
0028. Further information may be used to provide a higher confidence level or confirmation of the weather condition, such as a speed of the vehicle. For example, if the vehicle is not moving or moving slowly, then it is less likely that vehicles in front of the vehicle are moving fast enough to cause water to be kicked up off the road and detected by a laser sensor. Other information that may be used includes rain detector information, information about the weather from a server, or image data from a camera coupled to the vehicle.
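That speed check might look like the following sketch, where the cutoff speed and down-weighting factor are arbitrary illustrative values:

```python
def adjusted_confidence(base_confidence: float, vehicle_speed_mps: float,
                        min_spray_speed_mps: float = 5.0) -> float:
    """Scale down a wet-road indication when traffic is moving too slowly
    to kick water up off the road where the laser can detect it."""
    if vehicle_speed_mps < min_spray_speed_mps:
        return 0.25 * base_confidence  # illustrative down-weighting
    return base_confidence
```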
0029. The indication of the weather condition can be useful to determine safe driving actions of an autonomous vehicle. Example actions may include providing instructions to indicate a request to transition to a manual mode, or, if remaining in autonomous mode, switching to a mode specific to wet roadways (i.e., driving at slower speeds, allowing for larger distances to accomplish braking, etc.).
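As a sketch of how an identified condition might map to the actions above; the mode names and scaling factors are invented for illustration:

```python
def select_driving_action(weather_condition: str) -> dict:
    """Map an identified weather condition to an example behavior change."""
    if weather_condition == "wet_road":
        # Remain autonomous, but drive slower and leave more braking room.
        return {"mode": "autonomous_wet_roadway",
                "speed_factor": 0.7,
                "following_distance_factor": 2.0}
    if weather_condition in ("fog", "heavy_rain"):
        # Request that a human driver take over.
        return {"mode": "request_transition_to_manual"}
    return {"mode": "autonomous_normal"}
```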
0030 Example systems within the scope of the present disclosure will now be described in greater detail. Generally, an example system may be implemented in or may take the form of an automobile. However, an example system may also be implemented in or take the form of other vehicles, such as cars, trucks, motorcycles, buses, boats, airplanes, helicopters, lawn mowers, recreational vehicles, amusement park vehicles, farm equipment, construction equipment, trams, golf carts, trains, and trolleys. Other vehicles are possible as well.
`0031
`FIG. 1 is a functional block diagram depicting a
`vehicle 100 according to an example embodiment. The
`vehicle 100 is configured to operate fully or partially in an
`autonomous mode, and thus may be referred to as an "autono
`mous vehicle.” For example, a computer system 112 may
`control the vehicle 100 while in an autonomous mode via
`control instructions to a control system 106 for the vehicle
`100. The computer system 112 may receive information from
`a sensor System 104, and base one or more control processes
`(such as the setting a heading so as to avoid a detected
`obstacle) upon the received information in an automated fash
`1O.
0032. The vehicle 100 may be fully autonomous or partially autonomous. In a partially autonomous vehicle, some functions can optionally be manually controlled (e.g., by a driver) some or all of the time. Further, a partially autonomous vehicle may be configured to switch between a fully-manual operation mode and a partially-autonomous and/or a fully-autonomous operation mode.
0033. The vehicle 100 may include various subsystems such as a propulsion system 102, a sensor system 104, a control system 106, one or more peripherals 108, as well as a power supply 110, a computer system 112, and a user interface 116. The vehicle 100 may include more or fewer subsystems and each subsystem may include multiple elements. Further, each of the subsystems and elements of vehicle 100 may be interconnected. Thus, one or more of the described functions of the vehicle 100 may be divided up into additional functional or physical components, or combined into fewer functional or physical components. In some further examples, additional functional and/or physical components may be added to the examples illustrated by FIG. 1.
0034. The propulsion system 102 may include components operable to provide powered motion to the vehicle 100. Depending upon the embodiment, the propulsion system 102 may include an engine/motor 118, an energy source 119, a transmission 120, and wheels/tires 121. The engine/motor 118 could be any combination of an internal combustion engine, an electric motor, steam engine, Stirling engine, or other types of engines and/or motors. In some embodiments,
`
`
`
`
`
`the propulsion system 102 may include multiple types of
`engines and/or motors. For instance, a gas-electric hybrid
`vehicle may include a gasoline engine and an electric motor.
`Other examples are possible as well.
0035. The energy source 119 may represent a source of energy that may, in full or in part, power the engine/motor 118. That is, the engine/motor 118 may be configured to convert the energy source 119 into mechanical energy to operate the transmission 120. Examples of energy sources 119 may include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, capacitors, flywheels, regenerative braking systems, and/or other sources of electrical power, etc. The energy source 119 may also provide energy for other systems of the automobile 100.
0036. The transmission 120 may include elements that are operable to transmit mechanical power from the engine/motor 118 to the wheels/tires 121. Such elements may include a gearbox, a clutch, a differential, a drive shaft, and/or axle(s), etc. The transmission 120 may include other elements as well. The drive shafts may include one or more axles that may be coupled to the one or more wheels/tires 121.
0037. The wheels/tires 121 may be arranged to stably support the vehicle 100 while providing frictional traction with a surface, such as a road, upon which the vehicle 100 moves. Accordingly, the wheels/tires 121 of vehicle 100 may be configured in various formats, including a unicycle, bicycle/motorcycle, tricycle, or car/truck four-wheel format. Other wheel/tire geometries are possible, such as those including six or more wheels. Any combination of the wheels/tires 121 of vehicle 100 may be operable to rotate differentially with respect to other wheels/tires 121. The wheels/tires 121 may represent at least one wheel that is fixedly attached to the transmission 120 and at least one tire coupled to a rim of the wheel that could make contact with the driving surface. The wheels/tires 121 may include any combination of metal and rubber, or another combination of materials.
0038. The sensor system 104 generally includes one or more sensors configured to detect information about the environment surrounding the vehicle 100. For example, the sensor system 104 may include a Global Positioning System (GPS) 122, a precipitation sensor 123, an inertial measurement unit (IMU) 124, a RADAR unit 126 (radio detection and ranging), a laser rangefinder/LIDAR unit 128 (laser imaging detection and ranging), a camera 130, and/or a microphone 131. The sensor system 104 may also include sensors configured to monitor internal systems of the vehicle 100 (e.g., O2 monitor, fuel gauge, engine oil temperature, wheel speed sensors, etc.). One or more of the sensors included in the sensor system 104 may be configured to be actuated separately and/or collectively in order to modify a position and/or an orientation of the one or more sensors.
0039. Sensors in the sensor system 104 may be configured to provide data that is processed by the computer system 112 in real-time. For example, sensors may continuously update outputs to reflect an environment being sensed at or over a range of time, and continuously or as-demanded provide that updated output to the computer system 112 so that the computer system 112 can determine whether the vehicle's then-current direction or speed should be modified in response to the sensed environment.
0040. The GPS 122 may be any sensor configured to estimate a geographic location of the vehicle 100. To this end, GPS 122 may include a transceiver operable to provide information regarding the position of the vehicle 100 with respect to the Earth.
0041. The precipitation sensor 123 may be mounted under or incorporated into a windshield of the vehicle 100. Precipitation sensors may also be mounted at various other locations, such as at or near a location of headlamps, etc. In one example, the precipitation sensor 123 may include a set of one or more infrared light-emitting diodes (LEDs) and a photodetector such as a photodiode. Light emitted by the LEDs may be reflected by the windshield back to the photodiode. The less light the photodiode receives, the more precipitation there may be outside of the vehicle 100. An amount of reflected light or some other indicator of the detected amount of precipitation may be passed to computer system 112.
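The reflected-light relationship could be turned into a simple normalized reading as below; the linear loss model and the dry-baseline calibration are illustrative assumptions about how such a sensor might be read, not specifics of this disclosure.

```python
def precipitation_level(reflected_light: float, dry_baseline: float) -> float:
    """Convert a photodiode reading into a rough precipitation level in [0, 1].

    LED light reflects off the windshield back to the photodiode; water on
    the glass scatters light away, so less received light implies more
    precipitation outside the vehicle."""
    if dry_baseline <= 0:
        raise ValueError("dry_baseline must be positive")
    loss = (dry_baseline - reflected_light) / dry_baseline
    return min(1.0, max(0.0, loss))  # 0.0 = dry glass, 1.0 = heavy precipitation
```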
0042. The IMU 124 may include any combination of sensors (e.g., accelerometers and gyroscopes) configured to sense position and orientation changes of the vehicle 100 based on inertial acceleration.
0043. The RADAR unit 126 may represent a system that utilizes radio signals to sense objects within the local environment of the vehicle 100. In some embodiments, in addition to sensing the objects, the RADAR unit 126 may additionally be configured to sense the speed and/or heading of the objects.
0044 Similarly, the laser rangefinder or LIDAR unit 128 may be any sensor configured to sense objects in the environment in which the vehicle 100 is located using lasers. Depending upon the embodiment, the laser rangefinder/LIDAR unit 128 could include one or more laser sources, a laser scanner, and one or more detectors, among other system components. The laser rangefinder/LIDAR unit 128 could be configured to operate in a coherent (e.g., using heterodyne detection) or an incoherent detection mode.
0045. The camera 130 may include one or more devices configured to capture a plurality of images of the environment surrounding the vehicle 100. The camera 130 may be a still camera or a video camera. In some embodiments, the camera 130 may be mechanically movable, such as by rotating and/or tilting a platform to which the camera is mounted. As such, a control process of the vehicle 100 may be implemented to control the movement of the camera 130.
0046. The sensor system 104 may also include a microphone 131. The microphone 131 may be configured to capture sound from the environment surrounding the vehicle 100. In some cases, multiple microphones can be arranged as a microphone array, or possibly as multiple microphone arrays.
0047. The control system 106 may be configured to control operation(s) of the vehicle 100 and its components. Accordingly, the control system 106 may include various elements including a steering unit 132, throttle 134, brake unit 136, a sensor fusion algorithm 138, a computer vision system 140, a navigation/pathing system 142, and an obstacle avoidance system 144, etc.
0048. The steering unit 132 may represent any combination of mechanisms that may be operable to adjust the heading of vehicle 100. For example, the steering unit 132 can adjust the axis (or axes) of one or more of the wheels/tires 121 so as to effect turning of the vehicle 100. The throttle 134 may be configured to control, for instance, the operating speed of the engine/motor 118 and, in turn, control the speed of the vehicle 100. The brake unit 136 may include any combination of mechanisms configured to decelerate the vehicle 100. The brake unit 136 may, for example, use friction to slow the
`
`
`
`
`
`wheels/tires 121. In other embodiments, the brake unit 136
`inductively decelerates the wheels/tires 121 by a regenerative
`braking process to convert kinetic energy of the wheels/tires
`121 to electric current. The brake unit 136 may take other
`forms as well.
0049. The sensor fusion algorithm 138 may be an algorithm (or a computer program product storing an algorithm) configured to accept data from the sensor system 104 as an input. The data may include, for example, data representing information sensed at the sensors of the sensor system 104. The sensor fusion algorithm 138 may include or be configured to be executed using, for instance, a Kalman filter, Bayesian network, or other algorithm. The sensor fusion algorithm 138 may provide various assessments based on the data from sensor system 104. Depending upon the embodiment, the assessments may include evaluations of individual objects and/or features in the environment of vehicle 100, evaluations of particular situations, and/or evaluations of possible impacts based on the particular situation. Other assessments are possible.
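For concreteness, here is a minimal constant-velocity Kalman filter step of the kind such a fusion algorithm might run per tracked object; the state layout, time step, and noise values are illustrative, not taken from this disclosure.

```python
import numpy as np

def kalman_step(x, P, z, dt=0.1, q=0.01, r=0.5):
    """One predict/update cycle of a constant-velocity Kalman filter over a
    [position, velocity] state, the kind of per-object estimator a sensor
    fusion algorithm might run on laser/radar measurements."""
    F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity motion model
    H = np.array([[1.0, 0.0]])             # only position is measured
    Q = q * np.eye(2)                      # process noise (illustrative)
    R = np.array([[r]])                    # measurement noise (illustrative)
    # Predict the state forward one time step.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the new measurement z (e.g., a fused range estimate).
    y = np.array([z]) - H @ x              # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Example: track an object approaching at roughly -1 m/s.
x, P = np.array([10.0, 0.0]), np.eye(2)
for z in (9.9, 9.8, 9.7):
    x, P = kalman_step(x, P, z)
```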
0050. The computer vision system 140 may be any system operable to process and analyze images captured by camera 130 in order to identify objects and/or features in the environment of vehicle 100 that could include traffic signals, roadway boundaries, other vehicles, pedestrians, and/or obstacles, etc. The computer vision system 140 may use an object recognition algorithm, a Structure From Motion (SFM) algorithm, video tracking, and other computer vision techniques. In some embodiments, the computer vision system 140 could be additionally configured to map an environment, track objects, estimate the speed of objects, etc.
0051. The navigation and pathing system 142 may be any system configured to determine a driving path for the vehicle 100. For example, the navigation/pathing system 142 may determine a series of speeds and directional headings to effect movement of the vehicle 100 along a path that substantially avoids perceived obstacles while generally advancing the vehicle 100 along a roadway-based path leading to an ultimate destination, which may be set according to user inputs via the user interface 116, for example. The navigation and pathing system 142 may additionally be configured to update the driving path dynamically while the vehicle 100 is in operation. In some embodiments, the navigation and pathing system 142 could be configured to incorporate data from the sensor fusion algorithm 138, the GPS 122, and one or more predetermined maps so as to determine the driving path for vehicle 100.
0052. The obstacle avoidance system 144 may represent a control system configured to identify, evaluate, and avoid or otherwise negotiate potential obstacles in the environment of the vehicle 100. For example, the obstacle avoidance system 144 may effect changes in the navigation of the vehicle 100 by operating one or more subsystems in the control system 106 to undertake swerving maneuvers, turning maneuvers, braking maneuvers, etc. In some embodiments, the obstacle avoidance system 144 is configured to automatically determine feasible ("available") obstacle avoidance maneuvers on the basis of surrounding traffic patterns, road conditions, etc. For example, the obstacle avoidance system 144 may be configured such that a swerving maneuver is not undertaken when other sensor systems detect vehicles, construction barriers, other obstacles, etc. in the region adjacent the vehicle 100 that would be swerved into. In some embodiments, the obstacle avoidance system 144 may automatically select the maneuver that is both available and maximizes safety of occupants of the vehicle. For example, the obstacle avoidance system 144 may select an avoidance maneuver predicted to cause the least amount of acceleration in a passenger cabin of the vehicle 100.
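That selection rule reduces to a small search over feasible maneuvers, sketched here with a hypothetical maneuver schema (the `feasible` and `predicted_cabin_accel_mps2` fields are invented for illustration):

```python
from typing import List, Optional

def select_avoidance_maneuver(maneuvers: List[dict]) -> Optional[dict]:
    """Among feasible ('available') maneuvers, pick the one predicted to
    cause the least acceleration in the passenger cabin."""
    feasible = [m for m in maneuvers if m["feasible"]]
    if not feasible:
        return None  # no safe maneuver; caller falls back to, e.g., braking
    return min(feasible, key=lambda m: m["predicted_cabin_accel_mps2"])
```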
0053. The control system 106 may additionally or alternatively include components other than those shown and described.
0054. The vehicle 100 also includes peripherals 108 configured to allow interaction between the vehicle 100 and external sensors, other vehicles, other computer systems, and/or a user, such as an occupant of the vehicle 100. For example, the peripherals 108 for receiving information from occupants, external systems, etc. may include a wireless communication system 146, a touchscreen 148, a microphone 150, and/or a speaker 152.
0055. In some embodiments, the peripherals 108 function to receive inputs for a user of the vehicle 100 to interact with the user interface 116. To this end, the touchscreen 148 can both provide information to a user of the vehicle 100, and convey information from the user indicated via the touchscreen 148 to the user interface 116. The touchscreen 148 can be configured to sense both touch positions and touch gestures from the finger of a user (or stylus, etc.) via capacitive sensing, resistance sensing, optical sensing, a surface acoustic wave process, etc. The touchscreen 148 can be capable of sensing finger movement in a direction parallel or planar to the touchscreen surface, in a direction normal to the touchscreen surface, or both, and may also be capable of sensing a level of pressure applied to the touchscreen surface. An occupant of the vehicle 100 can also utilize a voice command interface. For example, the microphone 150 can be configured to receive audio (e.g., a voice command or other audio input) from an occupant of the vehicle 100. Similarly, the speaker 152 can be configured to output audio to the occupant of the vehicle 100.
0056. In some embodiments, the peripherals 108 function to allow communication between the vehicle 100 and external systems, such as devices, sensors, other vehicles, etc. within its surrounding environment, and/or controllers, servers, etc. physically located far from the vehicle 100 that provide useful information regarding the vehicle's surroundings, such as traffic information, weather information, etc. For example, the wireless communication system 146 can wirelessly communicate with one or more devices directly or via a communication network. The wireless communication system 146 can optionally use 3G cellular communication, such as CDMA, EVDO, GSM/GPRS, and/or 4G cellular communication, such as WiMAX or LTE. Additionally or alternatively, the wireless communication system 146 can communicate with a wireless local area network (WLAN), for example, using WiFi. In some embodiments, the wireless communication system 146 could communicate directly with a device, for example, using an infrared link, short-range wireless link, etc. The wireless communication system 146 can include one or more dedicated short range communication (DSRC) devices that can include public and/or private data communications between