`(12) Patent Application Publication (10) Pub. No.: US 2007/0219720 A1
`Trepagnier et al.
(43) Pub. Date: Sep. 20, 2007
`
`US 20070219720A1
`
`(54) NAVIGATION AND CONTROL SYSTEM FOR
`AUTONOMOUS VEHICLES
`
(75) Inventors: Paul Gerard Trepagnier, Metairie, LA (US); Jorge Emilio Nagel, Caracas (VE); Powell McVay Kinney, Slidell, LA (US); Matthew Taylor Dooner, New Orleans, LA (US); Bruce Mackie Wilson, Metairie, LA (US); Carl Reimers Schneider, Jr., Metairie, LA (US); Keith Brian Goeller, Covington, LA (US)
`Correspondence Address:
OBLON, SPIVAK, MCCLELLAND, MAIER & NEUSTADT, P.C.
1940 DUKE STREET
ALEXANDRIA, VA 22314 (US)
(73) Assignee: The Gray Insurance Company, Metairie, LA
(21) Appl. No.: 11/376,160
(22) Filed: Mar. 16, 2006
`
`Publication Classification
`
(51) Int. Cl.
G01C 21/00 (2006.01)
`(52) U.S. Cl. ........................... 701/300; 701/200; 701/207
`
`(57)
`
`ABSTRACT
`
A navigation and control system including a sensor configured to locate objects in a predetermined field of view from a vehicle. The sensor has an emitter configured to repeatedly scan a beam into a two-dimensional sector of a plane defined with respect to a first predetermined axis of the vehicle, and a detector configured to detect a reflection of the emitted beam from one of the objects. The sensor includes a panning mechanism configured to pan the plane in which the beam is scanned about a second predetermined axis to produce a three dimensional field of view. The navigation and control system includes a processor configured to determine the existence and location of the objects in the three dimensional field of view based on a position of the vehicle and a time between an emittance of the beam and a reception of the reflection of the emitted beam from one of the objects.
`
`
`
`16
`
`IPR2025-00943
`Tesla EX1013 Page 1
`
`
`
[Drawing Sheets 1 through 6 of 16]
[Drawing Sheet 7 of 16: Figure 6A. Flow chart: scanning repeatedly a beam into a two-dimensional sector of a plane defined with respect to a first predetermined axis of the vehicle (602); detecting a reflection of the emitted beam from objects removed from the vehicle (604); panning the plane in which the beam is scanned about a second predetermined axis to produce a three dimensional field of view (606); determining the existence and location of the objects in the three dimensional field of view based on a position of the vehicle and a time between an emittance of the beam and a reception of the reflection of the emitted beam (608).]
`
`
`
[Drawing Sheet 8 of 16: Figure 6B. Flow chart for obstacle identification and avoidance: determine the location of an object in geospatial coordinates (620); ascertain the altitude of the vehicle and generate a contour map of the object (622); determine from the contour map if the object is an obstacle (624); determine if the obstacle is to be avoided (626); provide steering and speed controls to the vehicle to avoid the obstacle (628); provide steering and speed controls to the vehicle to guide the vehicle to its destination (630).]
`
`
`
[Drawing Sheets 9 through 11 of 16]
`
`
`
`
[Drawing Sheet 12 of 16: blocks labeled "Sensor Processing Computer" and "Path Planning Computer"]
`
`
`
[Drawing Sheet 13 of 16: FIGURE 10. Vehicle control system blocks: GPS/INS, Obstacle Detection, Path Planning, Speed Planning, Accelerator/Brake Control, Steering Control, EMC Vehicle Control]
`
`
`
[Drawing Sheets 14 through 16 of 16]

`NAVIGATION AND CONTROL SYSTEM FOR
`AUTONOMOUS VEHICLES
`
`DISCUSSION OF THE BACKGROUND
[0001] 1. Field of the Invention
[0002] The invention relates to a three-dimensional (3D) imaging sensor and method for controlling an autonomous vehicle.
[0003] 2. Background of the Invention
[0004] In a modern vehicle, the driver remains a critical component of the vehicle's control system, as the driver makes numerous decisions directed to the safe operation of the vehicle, including speed, steering, obstacle and hazard recognition, and avoidance thereof. Yet the driver's ability to perform all of these functions can become compromised due to physical factors such as driver fatigue, driver impairment, driver inattention, or other factors such as visibility that reduce the reaction time needed by the driver to successfully avoid hazards.
[0005] Furthermore, in environmentally dangerous surroundings such as, for example, warfare settings or settings where toxic or nuclear radiation hazards are present, the driver is at risk. Indeed, roadside bombs in Iraq are just one contemporary example of the loss of human life which could in many situations be avoided if supply trucks bringing materials to the troops were unmanned.
`0006.
`In other more conventional environments, the
`driver may become disoriented or incapable of physically
`commanding the vehicle as would occur if the driver suf
`fered a medical emergency or if for example the driver
`became disoriented under the driving conditions. One
`example of Such a disorienting or incapacitating environ
`ment would be a car or ship being driven or steered under
`Snow, fog, rain, and/or nighttime blackout conditions where
`the diver (or captain of the ship) is handicapped in his or her
`ability to perceive and react to hazards approaching or to
`which the ship is approaching.
[0007] Thus, whether addressing human deficiencies in the control of a vehicle or environmentally hazardous conditions where human control is not preferred, there exists a need for a system and method for vehicular identification of objects in the path of the vehicle.
[0008] Numerous articles on the development of autonomously driven vehicles and laser detection and visualization systems have been reported, such as the following reference articles, all of which are incorporated herein by reference:
[0009] 1) H. Wang, J. Kearney, J. Cremer, and P. Willemsen, "Steering Autonomous Driving Agents Through Intersections in Virtual Urban Environments," 2004 International Conference on Modeling, Simulation, and Visualization Methods, (2004);
[0010] 2) R. Frezza, G. Picci, and S. Soatto, "A Lagrangian Formulation of Nonholonomic Path Following," The Confluence of Vision and Control, (A. S. Morse et al. (eds), Springer Verlag, 1998);
[0011] 3) J. Shirazi, Java Performance Tuning, (O'Reilly & Associates, 2000);
[0012] 4) J. Witt, C. Crane III, and D. Armstrong, "Autonomous Ground Vehicle Path Tracking," Journal of Robotic Systems, (21(8), 2004);
[0013] 5) C. Crane III, D. Armstrong Jr., M. Torrie, and S. Gray, "Autonomous Ground Vehicle Technologies Applied to the DARPA Grand Challenge," International Conference on Control, Automation, and Systems, (2004);
[0014] 6) T. Berglund, H. Jonsson, and I. Soderkvist, "An Obstacle-Avoiding Minimum Variation B-spline Problem," International Conference on Geometric Modeling and Graphics, (July, 2003);
[0015] 7) D. Coombs, B. Yoshimi, T. Tsai, and E. Kent, "Visualizing Terrain and Navigation Data," NISTIR 6720, (Mar. 1, 2001);
[0016] 8) U.S. Pat. No. 5,644,386 to Jenkins et al.;
[0017] 9) U.S. Pat. No. 5,870,181 to Andressen;
[0018] 10) U.S. Pat. No. 5,200,606 to Krasutsky et al.; and
[0019] 11) U.S. Pat. No. 6,844,924 to Ruff et al.
Despite this work, realization of suitable visualization, obstacle identification, and obstacle avoidance systems and methods has not been without problems limiting the operation of vehicles.
`
`SUMMARY OF THE INVENTION
[0020] Accordingly, one object of the present invention, accomplished in various of the embodiments, is to provide a system and method for vehicle object identification in which a location of the object relative to the vehicle is identified.
[0021] Another object of the present invention, accomplished in various of the embodiments, is to provide a system and method for vehicle object identification in which a path for the vehicle relative to the location of the object is identified.
[0022] Yet another object of the present invention, accomplished in various of the embodiments, is to provide a system and method for identifying objects in a field of view of the vehicle as obstacles.
[0023] Still another object of the present invention, accomplished in various of the embodiments, is to provide steering and vehicle control directions to avoid the obstacles.
[0024] Various of these and other objects are provided for in certain ones of the embodiments of the present invention.
[0025] In one embodiment of the present invention, a navigation and control system includes a sensor configured to locate objects in a predetermined field of view from a vehicle. The sensor has an emitter configured to repeatedly scan a beam into a two-dimensional sector of a plane defined with respect to a first predetermined axis of the vehicle, and a detector configured to detect a reflection of the emitted beam from one of the objects. The sensor includes a panning mechanism configured to pan the plane in which the beam is scanned about a second predetermined axis to produce a three dimensional field of view. The navigation and control system includes a processor configured to determine the existence and location of the objects in the three dimensional field of view based on a position of the vehicle and a time between an emittance of the beam and a reception of the reflection of the emitted beam from one of the objects.
`0026.
`In one embodiment of the present invention, a
`method for navigation and control of a vehicle includes
`scanning a beam into a sector of a plane defined with respect
`to a first predetermined axis of the vehicle, detecting a
`reflection of the emitted beam from an object removed from
`the vehicle, panning the plane in which the beam is scanned
`about a second predetermined axis to produce a three
`dimensional field of view, and determining the existence and
`location of the object in the three dimensional field of view
`based on a position of the vehicle and a time between an
`emittance of the beam and a reception of the reflection of the
`emitted beam.
[0027] It is to be understood that both the foregoing general description of the invention and the following detailed description are exemplary, but are not restrictive of the invention.
`BRIEF DESCRIPTION OF THE DRAWINGS
[0028] A more complete appreciation of the present invention and many attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
[0029] FIG. 1A is a schematic illustration of an autonomous vehicle according to one embodiment of the present invention in which a two-dimensional (2D) scan is made in a sector of a plane normal to a predetermined axis of a vehicle;
[0030] FIG. 1B is a schematic illustration of an autonomous vehicle according to one embodiment of the present invention in which a three-dimensional (3D) scan is made by displacing the scan out of the plane normal to the predetermined axis of a vehicle;
[0031] FIG. 2 is a schematic illustration of an emitter and detector system according to one embodiment of the present invention;
[0032] FIG. 3 is a detailed schematic illustration showing a perspective frontal view of a laser radar (LADAR) imaging sensor according to one embodiment of the present invention;
[0033] FIG. 4 is a detailed schematic illustration showing a perspective rear view of the laser radar (LADAR) imaging sensor depicted in FIG. 3;
[0034] FIG. 5 is a contour map of a 3D scan obtained by methods of the present invention;
[0035] FIG. 6A is a flow chart illustrating particular method steps of the present invention;
[0036] FIG. 6B is a flow chart illustrating particular method steps of the present invention for obstacle identification and avoidance;
[0037] FIG. 7A is a schematic illustration of a vehicle according to the present invention utilizing multiple LADAR imaging sensors to scan separate fields of view;
[0038] FIG. 7B is a schematic illustration of a vehicle according to the present invention utilizing multiple LADAR imaging sensors to scan the same or overlapping fields of view;
[0039] FIG. 8 is a schematic illustration of an exemplary computer system of the present invention;
[0040] FIG. 9 is a high level schematic of a computing network system of the present invention;
[0041] FIG. 10 is a high level schematic of the vehicle control system of the present invention; and
[0042] FIGS. 11-13 are schematics of path determinations made by the present invention.
`
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0043] Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views, and more particularly to FIG. 1A, which depicts an imaging sensor 8 mounted, in one embodiment, on top of a vehicle 10 in which a two-dimensional (2D) scan is made in a sector of a plane 11 normal to a predetermined axis of the vehicle 10, referred to here for illustration purposes as a "vertical scanning plane." The imaging sensor 8 includes in one embodiment an emitter 12 (as shown in FIG. 2) that transmits laser pulses (or light) 14 from the imaging sensor 8 into the environment about the vehicle 10. As shown in FIG. 1A, the laser (or light) pulses 14 are emitted into the vertical scanning plane 11. To produce a three-dimensional (3D) image, the imaging sensor 8 is panned (or oscillated) in and out of plane 11 to create a 3D scanning volume 16, as shown in FIG. 1B. The imaging sensor 8 detects objects 22 (as shown in FIG. 1B) in the environment nearby the vehicle 10 by detecting light reflected from the objects 22.
[0044] As shown in FIG. 2, the imaging sensor 8 includes a detector 18 for detecting return of an echoed signal 20. The imaging sensor 8 utilizes a processor 24 for controlling the timing and emission of the laser pulses 14 and for correlating emission of the laser pulses 14 with reception of the echoed signal 20. The processor 24 may be on-board the vehicle or a part of the imaging sensor 8. Details of exemplary processors and their functions are provided later.
[0045] In one illustrative example, laser pulses 14 from emitter 12 pass through a beam expander 13a and a collimator 13b. The laser pulses 14 are reflected at a stationary mirror 15a to a rotating mirror 26 and then forwarded through lens 27a and a telescope 27b to form a beam for the laser pulses 14 with a diameter of 1-10 mm, providing a corresponding resolution for the synthesized three-dimensional field of view. The telescope 27b serves to collect light reflected from objects 22.
`0046.
`In one embodiment of the present invention, the
`detector 18 is configured to detect light only of a wavelength
`of the emitted light in order to discriminate the laser light
`reflected from the object back to the detector from back
`ground light. Accordingly, the imaging sensor 8 operates, in
`one embodiment of the present invention, by sending out a
`laser pulse 14 that is reflected by an object 22 and measured
`by the detector 18 provided the object is within range of the
`sensitivity of the detector 18. The elapsed time between
`emission and reception of the laser pulse permits the pro
`cessor 24 is used to calculate the distance between the object
`22 and the detector 18. In one embodiment of the present
`invention, the optics (i.e., 13a, 13b. 15a, 26, 27a, and 27b)
`are configured to direct the beam instantaneously into the
`
`IPR2025-00943
`Tesla EX1013 Page 19
`
`
`
`US 2007/0219.720 A1
`
`Sep. 20, 2007
`
`sector shown in FIG. 1A, and the detector 18 is a field
`programmable gate array for reception of the received
`signals at predetermined angular positions corresponding to
`a respective angular directional shown in FIG. 1A.
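By way of a brief illustration of the elapsed-time relationship just described, the round-trip time of a pulse can be converted to range as sketched below in Java (the language noted later for the system's algorithms). This is a minimal sketch only, not the patent's implementation; the class and method names are hypothetical.

    // Illustrative sketch only: converts the elapsed time between emission and
    // reception of a laser pulse into a one-way range. Names are hypothetical.
    public final class TimeOfFlight {

        // Speed of light in meters per second.
        private static final double SPEED_OF_LIGHT_M_PER_S = 299_792_458.0;

        // Returns the one-way distance in meters to the reflecting object; the
        // factor of 2 accounts for the round trip of the pulse.
        public static double rangeMeters(double elapsedSeconds) {
            return SPEED_OF_LIGHT_M_PER_S * elapsedSeconds / 2.0;
        }

        public static void main(String[] args) {
            // A 200 ns round trip corresponds to an object roughly 30 m away.
            System.out.printf("range = %.2f m%n", rangeMeters(200e-9));
        }
    }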
[0047] Via the rotating mirror 26, laser pulses 14 are swept through a radial sector α within plane 11, as shown illustratively in FIG. 1A. In one embodiment of the present invention, in order to accomplish mapping of objects in the field of view in front of the imaging sensor 8, the rotating mirror 26 is rotated across an angular displacement ranging from 30 to 90 degrees, at angular speeds ranging from 100-10000 degrees per second. For example, a 90 degree scanning range can be scanned 75 times per second, or an 80 degree scanning range can be scanned between 5 and 100 times per second. Furthermore, the angular resolution can be dynamically adjusted (e.g., providing on command angular resolutions of 0.01, 0.5, 0.75, or 1 degrees for different commercially available sensors 8).
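As a back-of-the-envelope illustration of how the angular range and the commanded angular resolution determine the number of range samples per sweep, consider the sketch below. The example values are assumptions chosen from within the ranges quoted above, not figures taken from the patent.

    // Illustrative scan-geometry helper; the values in main() are example
    // assumptions within the angular ranges and resolutions described above.
    public final class ScanGeometry {

        // Number of range samples in one sweep of the given angular range.
        public static int samplesPerSweep(double angularRangeDeg, double resolutionDeg) {
            return (int) Math.floor(angularRangeDeg / resolutionDeg) + 1;
        }

        public static void main(String[] args) {
            // e.g., an 80 degree range at 0.5 degree resolution, swept 20 times
            // per second, yields 161 samples per sweep (about 3220 samples/s).
            int samples = samplesPerSweep(80.0, 0.5);
            double sweepsPerSecond = 20.0;
            System.out.println(samples + " samples/sweep, "
                    + (samples * sweepsPerSecond) + " samples/s");
        }
    }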
[0048] Commercially available components can be used for the emitter 12 and the detector 18 to provide ranging measurements. In one embodiment, the emitter 12, the detector 18, and the associated optics constitute a laser radar (LADAR) system, but other systems capable of making precise distance measurements can be used in the present invention, such as for example a light detection and ranging (LIDAR) sensor, a radar, or a camera. LIDAR (Light Detection and Ranging; or Laser Imaging Detection and Ranging) is a technology that determines distance to an object or surface using laser pulses. As with the similar radar technology, which uses radio waves instead of light, the range to an object is determined by measuring the time delay between transmission of a pulse and detection of the reflected signal. LADAR (Laser Detection and Ranging) refers to elastic backscatter LIDAR systems. The term laser radar is also in use, but with laser radar, laser light (and not radio waves) is used.
[0049] The primary difference between LIDAR and radar is that with LIDAR, much shorter wavelengths of the electromagnetic spectrum are used, typically in the ultraviolet, visible, or near infrared. In general it is possible to image a feature or object only about the same size as the wavelength, or larger. Thus, LIDAR provides more accurate mapping than radar systems. Moreover, an object needs to produce a dielectric discontinuity in order to reflect the transmitted wave. At radar (microwave or radio) frequencies, a metallic object produces a significant reflection. However, non-metallic objects, such as rain and rocks, produce weaker reflections, and some materials may produce no detectable reflection at all, meaning some objects or features are effectively invisible at radar frequencies. Lasers provide one solution to these problems. The beam densities and coherency are excellent. Moreover, the wavelengths are much smaller than can be achieved with radio systems, and range from about 10 micrometers to the UV (e.g., 250 nm). At these wavelengths, a LIDAR system can offer much higher resolution than radar.
[0050] To produce a three-dimensional (3D) image, in one embodiment of the present invention, the imaging sensor 8 is panned (or oscillated) in and out of the plane 11 to create a 3D scanning volume 16, as shown in FIG. 1B. For sake of illustration, FIG. 1B defines the scanning volume 16 by the angle α (in the vertical scanning direction) and the angle β (in the horizontal scanning direction). The angle α, as noted earlier, ranges from 30 to 90 degrees, at angular speeds ranging from 100-10000 degrees per second. The angle β (i.e., the panning angle) ranges from 1 to 270 degrees, at a panning rate ranging from 1-150 degrees per second. Combined, the imaging sensor 8 typically can completely scan the 3D scanning volume 16 more than two times a second.
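The following sketch shows one way a range sample taken at scan angle α within the vertical scanning plane, together with the pan angle β of that plane, can be combined into a sensor-frame three-dimensional point. The axis convention and all names are assumptions made here for illustration; the patent does not specify this particular formulation.

    // Illustrative conversion of (range, scan angle, pan angle) into Cartesian
    // sensor-frame coordinates. Axis convention is assumed: y forward, x to the
    // side, z vertical; the scan plane is taken to pan about the vertical axis.
    public final class ScanToPoint {

        public record Point3D(double x, double y, double z) {}

        public static Point3D toCartesian(double rangeMeters,
                                          double scanAngleRad,   // angle within the scan plane (alpha)
                                          double panAngleRad) {  // pan angle of the plane (beta)
            // Place the sample in the un-panned vertical scanning plane.
            double forwardInPlane = rangeMeters * Math.cos(scanAngleRad);
            double vertical       = rangeMeters * Math.sin(scanAngleRad);
            // Rotate that plane about the vertical axis by the pan angle.
            double x = forwardInPlane * Math.sin(panAngleRad);
            double y = forwardInPlane * Math.cos(panAngleRad);
            return new Point3D(x, y, vertical);
        }
    }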
[0051] In order to accurately determine the distance to objects in the 3D scanning volume 16, the direction that the imaging sensor 8 is pointed at the time of receiving light reflected from the objects 22 is needed (i.e., the angle of deflection from plane 28 is needed). Further, in one embodiment of the present invention, geospatial positional data of the instantaneous vehicle position is utilized by processor 24 to calculate, based on the distance of the object from the vehicle and its direction from the vehicle, the geospatial location of the objects in the field of view. In one configuration of the present invention, the processor 24 includes a personal computer running on a Linux operating system, and the algorithms are programmed in the Java programming language. Other computing systems and programming languages can be used in the present invention (as discussed in more detail below). As shown in FIG. 2, processor 24 is in communication with a real time positioning device 25, such as for example a global positioning system (GPS) and/or an inertial navigation system (INS), that transmits the location, heading, altitude, and speed of the vehicle multiple times per second to processor 24. The real time positioning device 25 is typically mounted to the vehicle 10 and transmits data (such as location, heading, altitude, and speed of the vehicle) to all imaging sensors 8 (and all processors 24) on the vehicle 10.
[0052] With commercially available GPS and INS units, processor 24 can determine a position of an object in the field of view to an accuracy of better than 10 cm. In one embodiment of the present invention, the processor 24 correlates GPS position, LADAR measurements, and angle of deflection data to produce a map of obstacles in a path of the vehicle. The accuracy of the map depends on the accuracy of the data from the positioning device 25. The following are illustrative examples of the accuracies of such data: position 10 cm, forward velocity 0.07 km/hr, acceleration 0.01%, roll/pitch 0.03 degrees, heading 0.1 degrees, lateral velocity 0.2%.
[0053] In one embodiment of the present invention, a Kalman filter (commercially integrated) sorts through all data inputs to processor 24. A Kalman filter is a known method of estimating the state of a system based upon recursive measurement of noisy data. In this instance, the Kalman filter is able to much more accurately estimate vehicle position by taking into account the type of noise inherent in each type of sensor and then constructing an optimal estimate of the actual position. Such filtering is described by A. Kelly, in "A 3d State Space Formulation of a Navigation Kalman Filter for Autonomous Vehicles," CMU Robotics Institute, Tech. Rep., 1994, the entire contents of which are incorporated herein by reference. The Kalman filter is a set of mathematical equations that provides an efficient computational (recursive) means to estimate the state of a process, in a way that minimizes the mean of the squared error. The filter is very powerful in several aspects: it supports estimations of past, present, and even future states, and it can do so even when the precise nature of the modeled system is unknown.
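As a rough, self-contained illustration of the recursive predict/update structure of such a filter, a one-dimensional toy version is sketched below. It is not the commercially integrated filter referred to above, which fuses the full GPS/INS state; the names and noise parameters are hypothetical.

    // Toy one-dimensional Kalman filter illustrating the predict/update recursion.
    public final class ScalarKalmanFilter {

        private double estimate;       // current state estimate (e.g., position along one axis)
        private double errorVariance;  // variance of that estimate

        private final double processVariance;      // how much the state may drift per step
        private final double measurementVariance;  // sensor noise variance

        public ScalarKalmanFilter(double initialEstimate, double initialVariance,
                                  double processVariance, double measurementVariance) {
            this.estimate = initialEstimate;
            this.errorVariance = initialVariance;
            this.processVariance = processVariance;
            this.measurementVariance = measurementVariance;
        }

        // Incorporates one noisy measurement and returns the updated estimate.
        public double update(double measurement) {
            // Predict: state assumed unchanged, uncertainty grows by the process noise.
            errorVariance += processVariance;
            // Update: blend prediction and measurement according to the Kalman gain.
            double gain = errorVariance / (errorVariance + measurementVariance);
            estimate += gain * (measurement - estimate);
            errorVariance *= (1.0 - gain);
            return estimate;
        }
    }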
[0054] The positioning device 25, by including GPS and INS data, provides complementary data to the processor 24. GPS and INS have reciprocal errors. That is, GPS is noisy with finite drift, while INS is not noisy but has infinite drift. Further, the processor 24 can be configured to accept additional inputs (discussed below) to reduce drift in its estimate of vehicle position when, for example, the GPS data may not be available.
`0.055
`FIG. 3 is a detailed schematic illustration of imag
`ing sensor 8 of the present invention. FIG. 3 presents a
`frontal view of imaging sensor 8. FIG. 4 presents a rear
`perspective view of imaging sensor 8. FIG. 3 shows a motor
`30 configured to oscillate the imaging sensor 8 in and out of
`a plane normal to a predetermined axis of the imaging
`sensor. In one embodiment of the present invention, a
`12-volt DC motor operating at a speed of 120 RPM is used
`to oscillate the imaging sensor 8 in and out the plane. Other
`motors with reciprocating speeds different than 120 RPM
`can be used.
[0056] As shown in FIG. 3, an absolute rotary encoder 32 is placed on a shaft 34 that is oscillating. The encoder 32 provides an accurate reading of the angle at which the shaft 34 is instantaneously located. By the encoder 32, an accurate measurement of the direction that the imaging sensor 8 is pointed, at the time of the scan, is known. In one embodiment of the present invention, the encoder 32 is an ethernet optical encoder (commercially available from Fraba Posital), placed on shaft 34 to provide both the angular position and angular velocity of the shaft.
[0057] To decrease the delay between reading a value from the sensor and reading a value from the encoder, a separate 100 MBit ethernet connection with its own dedicated ethernet card connected the processor 24 with the encoder. This created communications delays between the encoder and the I/O computer that were consistent at approximately 0.5 ms. Testing revealed that an actual LADAR scan was taken approximately 12.5 ms before the data was available at the I/O computer. When this time was added to the 0.5 ms of delay from the encoder communications, a 13 ms delay from the actual scan to the actual reading of the encoder position and velocity was present. To counteract the angular offset this delay created, in one embodiment of the present invention, the velocity of the encoder is multiplied by the communications delay of 0.013 seconds to calculate the angular offset due to the delay. This angular offset (which was either negative or positive depending on the direction of oscillation) was then added to the encoder's position, giving the actual angle at the time when the scan occurred. This processing permits the orientation of the LADAR platform to be accurate within 0.05 degrees.
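The latency compensation just described can be sketched as follows; the method and variable names are hypothetical, while the 0.013 second delay is the value measured above.

    // Sketch of the encoder latency compensation described above: the encoder's
    // angular velocity multiplied by the communications delay gives an angular
    // offset that is added back to the reported position (the sign of the offset
    // follows the direction of oscillation, as described in the text).
    public final class EncoderLatencyCompensation {

        // Total delay from the actual scan to reading the encoder, in seconds.
        private static final double COMMUNICATION_DELAY_S = 0.013;

        // Returns the estimated bracket angle at the instant the scan was taken.
        public static double compensatedAngleDeg(double encoderAngleDeg,
                                                 double encoderVelocityDegPerS) {
            double angularOffsetDeg = encoderVelocityDegPerS * COMMUNICATION_DELAY_S;
            return encoderAngleDeg + angularOffsetDeg;
        }
    }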
[0058] Further, as shown in FIGS. 3 and 4, the metal shaft 34 is attached to a detector bracket 36 which is supported by a metal casing 38 with bearing 40. Bearing 40 is attached to metal casing 38 with a fastening mechanism such as bolts 42 and 44. Detector bracket 36 is attached to metal shaft 34. Further, as shown in FIGS. 3 and 4, metal shaft 46 is attached to bearing 48. Bearing 48 is attached to metal casing 38 with a fastening mechanism such as bolts 50 and 52. Push rod 54 is attached to detector bracket 36 with ball joint 56 on slot 58. Push rod 54 is attached to pivot spacer 60 with ball joint 62. Pivot spacer 60 is attached to servo arm 64 on slot 66. Servo arm 64 is attached to metal shaft 68. Motor 30 is attached to servo arm 64 and is suspended from metal casing 38 by motor mounts 70, 72, and 74.
[0059] The imaging sensor 8 operates, in one embodiment, by oscillating a measurement sensor laterally about an axis of the vehicle 10, as shown in FIG. 1A. In this embodiment, the shaft 68 of motor 30 rotates at a constant speed, causing servo arm 64 to also spin at a constant speed. One end of push rod 54 moves with servo arm 64, causing detector bracket 36 to oscillate back and forth. The degree of rotation can be adjusted by moving the mount point of ball joint 56 along slot 58, and/or the mount point of ball joint 62 along slot 66. Moving the mount point closer to shaft 46 increases the angle of rotation, while moving the mount point away from shaft 46 decreases the angle of rotation.
[0060] While sensor 18 is oscillating, the sensor 18 is taking measurements of the surrounding environment along the vertical scanning plane, as shown in FIG. 1A. The absolute rotary encoder 32 operates as an angular position mechanism, and transmits the absolute angle of deflection of detector bracket 36 to processor 24. At the same time, a real time positioning device 76, such as a global positioning system (GPS) or an inertial navigation system (INS), transmits the location, heading, altitude, and speed of the vehicle multiple times per second to processor 24. Software running on the processor 24 integrates the data and, in one embodiment, uses matrix transformations to transform the YZ measurements from each 2D scan (as shown in FIG. 1A) into a 3D view of the surrounding environment. Due to the use of the real time positioning device, in the present invention, a terrain map can be calculated even while the vehicle is moving at speeds in excess of 45 miles per hour.
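A simplified sketch of such a transformation is given below. It assumes planar vehicle attitude (heading only, no roll or pitch), a locally level east/north/up frame, and hypothetical names; the matrix transformations referred to above would also account for roll and pitch.

    // Illustrative transform of one 2D scan measurement (Y forward in the scan
    // plane, Z vertical) into world coordinates using the pan angle from the
    // rotary encoder and the vehicle pose from the real time positioning device.
    public final class ScanToWorld {

        public record WorldPoint(double east, double north, double up) {}

        public static WorldPoint transform(double yMeters, double zMeters,
                                           double panRad,      // encoder angle of the scan plane
                                           double headingRad,  // vehicle heading (0 = north, assumed)
                                           double vehicleEast,
                                           double vehicleNorth,
                                           double vehicleAltitude) {
            // Total horizontal rotation of the measurement relative to north.
            double bearing = headingRad + panRad;
            double east  = vehicleEast  + yMeters * Math.sin(bearing);
            double north = vehicleNorth + yMeters * Math.cos(bearing);
            double up    = vehicleAltitude + zMeters;
            return new WorldPoint(east, north, up);
        }
    }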
`Obstacle Detection Algorithms
[0061] The imaging sensor 8 is configured, in one embodiment of the present invention, to receive data in the processor regarding echoed or returned signals, GPS position, LADAR measurements, and angle of deflection data to determine what sectors of the 3D scan contain obstacles, and therefore to determine which sectors are most likely impassable by the vehicle. In one embodiment of the present invention, data coming from detector 18 is formatted in a coordinate system relative to the detector 18, as detailed in the exemplary illustrations below.
[0062] Since the imaging sensor 8 is in a "vertical" configuration, the coordinate system in that configuration has the Y-axis representing a distance from the sensor roughly parallel with the forward direction of the vehicle, with the X-axis lined up along the vertical direction. Hence, in one embodiment of the present invention, a search for obstacles is manifest in the slope and extent in the X-axis. To facilitate this search, all measurement points in the detector's scan are sorted by their Y-values (i.e., by a distance from the detector 18).
[0063] FIG. 5 is a schematic representation of a contour map showing the reflected signatures from an object in a synthesized field of view. FIG. 5 depicts two common obstacles that the vehicle 10 needs to avoid. Obstacle 22a is a depression in the highway (e.g., a pothole). Obstacle 22b is an obstruction in the highway (e.g., a fallen boulder). As shown in FIG. 5, lines of constant elevation have been added to the drawing in FIG. 5 for clarity. The imaging sensor 8 of the present invention, as shown in FIG. 1A, produces a vertical scan, which in FIG. 5 is shown depicted on an X-Y axis. A vertical scan provides the advanta