US010243647B2

(12) United States Patent
     Ryan et al.

(10) Patent No.: US 10,243,647 B2
(45) Date of Patent: Mar. 26, 2019

(71) Applicant: Bell Helicopter Textron Inc., Fort Worth, TX (US)

(72) Inventors: Michael John Ryan, Colleyville, TX (US); Joseph Scott Drennan, Dallas, TX (US)

(73) Assignee: BELL HELICOPTER TEXTRON INC., Fort Worth, TX (US)

( * ) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 39 days.

(21) Appl. No.: 15/608,968

(22) Filed: May 30, 2017

(65) Prior Publication Data
     US 2018/0351634 A1     Dec. 6, 2018

(51) Int. Cl.
     H04B 7/185     (2006.01)
     H04N 21/218    (2011.01)
     G08B 5/00      (2006.01)

(52) U.S. Cl.
     CPC ...... H04B 7/18506 (2013.01); H04N 21/21805 (2013.01); G08B 5/00 (2013.01); H04N 2201/0087 (2013.01); H04N 2201/0089 (2013.01)

(58) Field of Classification Search
     USPC ...... 382/100, 103; 701/3
     See application file for complete search history.

(56) References Cited

     U.S. PATENT DOCUMENTS

     5,371,581 A       12/1994  Wangler et al.
     6,243,482 B1 *     6/2001  Eibert ................ G01S 7/4802
                                                              342/65
     7,061,401 B2 *     6/2006  Voos .................. G08G 5/0013
                                                              340/961
     8,473,189 B2 *     6/2013  Christoph ............. B64C 27/006
                                                              244/17.11
     9,177,482 B1 *    11/2015  Reinke ................... G06T 7/62
     9,196,168 B2      11/2015  McCollough et al.
     9,830,713 B1 *    11/2017  Walker ................ B64C 39/024
     9,911,344 B2 *     3/2018  Kabrt ................ G06K 9/00664
    10,053,226 B2 *     8/2018  Leland ................. B64D 43/02
 2010/0292868 A1 *    11/2010  Rotem ................. G05D 1/0038
                                                              701/2
 2010/0299067 A1 *    11/2010  McCollough ........... G01S 13/9303
                                                              701/301
 2012/0029738 A1 *     2/2012  Brunetti .............. G08G 5/0078
                                                              701/11
                       (Continued)

     FOREIGN PATENT DOCUMENTS

 WO    WO 2016/200270     12/2016

Primary Examiner — Ishrat I Sherali
(74) Attorney, Agent, or Firm — Patent Capital Group

(57) ABSTRACT
In one embodiment, an apparatus comprises a processing device configured to: obtain sensor data from one or more sensors associated with an aircraft, wherein the one or more sensors are configured to detect information associated with an operating environment of the aircraft; detect an object near the aircraft based on the sensor data; obtain a camera feed from a camera associated with the aircraft, wherein the camera feed comprises a camera view of at least a portion of the aircraft; generate a display output based on the camera feed and the sensor data, wherein the display output comprises a visual perspective of the object relative to the aircraft; and cause the display output to be displayed on a display device.

25 Claims, 6 Drawing Sheets
`
[Representative drawing: flowchart of FIG. 6 — obtain sensor data from aircraft sensors (602); detect object near aircraft based on sensor data (604); obtain camera feed from camera (606); generate display output based on camera feed and sensor data (608); display generated display output.]
`
`( 56 )
`
`References Cited
`U . S . PATENT DOCUMENTS
`2014 / 0327770 A1 * 11 / 2014 Wagreich . . . . . . . . . . . . . GO5D 1 / 0038
`348 / 148
`2016 / 0039529 A1
`2 / 2016 Buchmueller et al .
`9 / 2016 Leland . . . . . . . . . . . . . . . . . . B64D 43 / 02
`2016 / 0272340 A1 *
`2017 / 0025024 A1 *
`1 / 2017 Kabrt . . . . . . . . . . . . . . . . . GO6K 9 / 00664
`2017 / 0225680 A1 *
`8 / 2017 Huang . . . . . . . . . . . . . . . . . . B60W 30 / 09
`* cited by examiner
`
[FIG. 1 (Sheet 1 of 6): example rotorcraft 100, showing empennage 130 and tail rotor system 140.]
[FIG. 2A (Sheet 2 of 6): rotorcraft with visual sensor system 200, showing visual control system 202, sensor range 205, camera 206, camera view 207, and object 208.]
[FIG. 2B and FIG. 5 (Sheet 3 of 6): FIG. 2B shows a display view including camera view 207 and detected object 208. FIG. 5 is a block diagram of an aircraft visual control system, including visual control logic 510, I/O devices 520 (sensors 521, cameras, display, audio, lights 525), and a communication interface.]
[FIG. 3 (Sheet 4 of 6): visual sensor system for detecting hazards below a rotorcraft 100, with reference numerals 302, 304, 305, and 308.]
[FIG. 4 (Sheet 5 of 6): visual sensor system 400 for detecting hazards above a rotorcraft, with reference numerals 405 and 408.]
[FIG. 6 (Sheet 6 of 6): flowchart 600 — obtain sensor data from aircraft sensors (602); detect object near aircraft based on sensor data (604); obtain camera feed from camera (606); generate display output based on camera feed and sensor data (608); display generated display output (610).]
`
`
`
`AIRCRAFT VISUAL SENSOR SYSTEM
`TECHNICAL FIELD
`
This disclosure relates generally to aircraft, and more particularly, though not exclusively, to an aircraft visual sensor system.
`
BACKGROUND

There are many hazards that may arise during operation of rotorcraft and other aircraft, including collisions, contact with moving components (e.g., rotors, propellers, and jet engine intakes), landing on dangerous surfaces, and so forth. For example, the rotors of a rotorcraft (e.g., the main rotor and/or tail rotor) present a risk of contact with objects, such as people, animals, structures (e.g., buildings, powerlines), terrain (e.g., the ground and other landing surfaces), and so forth. Moreover, many hazards may be difficult for a pilot to identify, as they may be outside the pilot's field of view or otherwise difficult for the pilot to see.

SUMMARY

According to one aspect of the present disclosure, an apparatus comprises a processing device configured to: obtain sensor data from one or more sensors associated with an aircraft, wherein the one or more sensors are configured to detect information associated with an operating environment of the aircraft; detect an object near the aircraft based on the sensor data; obtain a camera feed from a camera associated with the aircraft, wherein the camera feed comprises a camera view of at least a portion of the aircraft; generate a display output based on the camera feed and the sensor data, wherein the display output comprises a visual perspective of the object relative to the aircraft; and cause the display output to be displayed on a display device.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 illustrates an example rotorcraft in accordance with certain embodiments.
FIGS. 2A and 2B illustrate an example embodiment of a visual sensor system for a rotorcraft tail rotor.
FIG. 3 illustrates an example embodiment of a visual sensor system for detecting hazards below a rotorcraft.
FIG. 4 illustrates an example embodiment of a visual sensor system for detecting hazards above a rotorcraft.
FIG. 5 illustrates a block diagram for an example embodiment of an aircraft visual control system.
FIG. 6 illustrates a flowchart for an example embodiment of an aircraft visual sensor system.

DETAILED DESCRIPTION

The following disclosure describes various illustrative embodiments and examples for implementing the features and functionality of the present disclosure. While particular components, arrangements, and/or features are described below in connection with various example embodiments, these are merely examples used to simplify the present disclosure and are not intended to be limiting. It will of course be appreciated that in the development of any actual embodiment, numerous implementation-specific decisions must be made to achieve the developer's specific goals, including compliance with system, business, and/or legal constraints, which may vary from one implementation to another. Moreover, it will be appreciated that, while such a development effort might be complex and time-consuming, it would nevertheless be a routine undertaking for those of ordinary skill in the art having the benefit of this disclosure.

In the specification, reference may be made to the spatial relationships between various components and to the spatial orientation of various aspects of components as depicted in the attached drawings. However, as will be recognized by those skilled in the art after a complete reading of the present disclosure, the devices, components, members, apparatuses, etc. described herein may be positioned in any desired orientation. Thus, the use of terms such as "above," "below," "upper," "lower," or other similar terms to describe a spatial relationship between various components or to describe the spatial orientation of aspects of such components should be understood to describe a relative relationship between the components or a spatial orientation of aspects of such components, respectively, as the components described herein may be oriented in any desired direction.

Further, the present disclosure may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.

Example embodiments that may be used to implement the features and functionality of this disclosure will now be described with more particular reference to the attached FIGURES.

FIG. 1 illustrates an example embodiment of a rotorcraft 100. Rotorcraft 100 includes a fuselage 110, a rotor system 120, and an empennage 130. The fuselage 110 is the main body of the rotorcraft, which may include a cabin for the crew, passengers, and/or cargo, and may also house certain mechanical and electrical components, such as the engine(s), transmission, and flight controls. The rotor system 120 is used to generate lift for the rotorcraft using a plurality of rotating rotor blades. For example, torque generated by the rotorcraft engine(s) causes the rotor blades to rotate, which in turn generates lift. Moreover, the pitch of each rotor blade can be adjusted in order to selectively control direction, thrust, and lift for the rotorcraft 100. The empennage 130 is the tail assembly of the rotorcraft. In some embodiments, the empennage 130 may include horizontal and/or vertical stabilizers, which may be respectively used for horizontal stability and vertical stability. The empennage 130 may also include a mechanism for providing anti-torque and/or directional control, such as a tail rotor system 140.

There are many hazards that may arise during operation of rotorcraft and other aircraft, including collisions, contact with moving components (e.g., rotors, propellers, and jet engine intakes), landing on uneven, obstructed, or otherwise dangerous surfaces, and so forth. For example, the rotors of a rotorcraft 100 (e.g., main rotor 120 and/or tail rotor 140) present a risk of contact with objects, such as people, animals, structures (e.g., buildings, powerlines), terrain (e.g., the ground and other landing surfaces), and so forth. Moreover, many hazards may be difficult for a pilot to identify, as they may be outside the pilot's field of view or otherwise difficult for the pilot to see. Accordingly, embodiments of a visual sensor system for detecting and responding to hazards during operation of an aircraft are described throughout this disclosure.
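The SUMMARY above (and the flowchart of FIG. 6) recites a five-step processing sequence: obtain sensor data, detect a nearby object, obtain a camera feed, generate a display output, and cause it to be displayed. The short Python sketch below is only an illustration of that sequence; the types, the platform-supplied callables, and the 10-meter detection threshold are assumptions for illustration and are not taken from the disclosure.

    from dataclasses import dataclass
    from typing import Any, Callable, List

    # Hypothetical data model; the disclosure does not prescribe one.
    @dataclass
    class SensorReading:
        range_m: float       # measured distance to a return
        bearing_deg: float   # direction of the return relative to the aircraft

    @dataclass
    class DisplayOutput:
        frame: Any                    # camera frame (e.g., an image buffer)
        markers: List[SensorReading]  # detected objects to overlay on the frame

    def run_cycle(read_sensors: Callable[[], List[SensorReading]],
                  capture_frame: Callable[[], Any],
                  show: Callable[[DisplayOutput], None],
                  near_threshold_m: float = 10.0) -> None:
        """One pass of the FIG. 6 flow, with platform-supplied callables."""
        sensor_data = read_sensors()                                        # 602: obtain sensor data
        objects = [r for r in sensor_data if r.range_m < near_threshold_m]  # 604: detect nearby objects
        frame = capture_frame()                                             # 606: obtain camera feed
        output = DisplayOutput(frame=frame, markers=objects)                # 608: generate display output
        show(output)                                                        # 610: display the output

In practice the detection step (604) could be arbitrarily sophisticated; the sketch uses a simple range threshold only to keep the sequence visible.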
`
The visual sensor system embodiments described throughout this disclosure may detect and respond to hazards using a collection of sensors and/or cameras selectively positioned throughout an aircraft. For example, sensors can be used to identify the operating environment and/or situational context of an aircraft, including objects, people, animals, structures, and/or terrain in the vicinity of the aircraft. Moreover, cameras can be selectively positioned on the aircraft to provide the pilot with meaningful perspectives of the aircraft and its surroundings, including any detected hazards and their locations relative to the aircraft. In some embodiments, for example, a camera may be selectively positioned to capture a portion of the aircraft and its surroundings, thus showing any surrounding hazards and their location relative to the aircraft. In some cases, the sensors used to detect hazards may be positioned separately from the cameras used to capture visual perspectives of the relative position of the hazards. For example, sensors that are used to detect hazards near a particular portion of an aircraft may be positioned at or near the relevant portion of the aircraft, while an associated camera may be positioned further away to provide a view that shows the relevant portion of the aircraft and the surrounding hazards. By contrast, placing the camera near the same portion of the aircraft that the sensors are being used to monitor may result in a camera perspective that only shows nearby hazards in isolation, without also showing the relevant portion of the aircraft, thus providing no visual context of the hazard's location relative to the aircraft.

In some embodiments, for example, the visual sensor system may include sensors positioned near the tail rotor of a rotorcraft to detect hazards near the tail rotor, and may also include an associated camera positioned away from the tail rotor (e.g., near the forward end of the tail boom). In this manner, the sensors are able to detect hazards near the tail rotor, and the camera provides a perspective that shows both the tail rotor and the surrounding hazards. As another example, the visual sensor system may include sensors positioned near the top of a rotorcraft to detect hazards above the rotorcraft and/or near the main rotor. In such embodiments, the visual sensor system may also include an associated camera positioned on the rotorcraft to capture a view of the upper portion of the rotorcraft and any hazards immediately above. For example, in some embodiments, a camera may be positioned on the tail of the rotorcraft and aimed towards the top of the rotorcraft. As another example, the visual sensor system may include sensors positioned on or near the bottom of a rotorcraft to detect hazards below the rotorcraft, such as dangerous landing conditions. Accordingly, in this manner, the visual sensor system can incorporate a collection of sensors and associated cameras to detect and monitor hazards near various portions of an aircraft.

Moreover, in some embodiments, the visual sensor system may integrate or combine data from cameras and sensors. In some embodiments, for example, the video display of a camera may be overlaid with information about a hazard that is determined using sensors. For example, if a hazard detected by the sensors is not within view of the camera, a graphical representation of the hazard may be superimposed on the video display using data from the sensors. The visual sensor system, for example, may use the sensor data to visually plot the hazard on the video display at the appropriate location relative to the camera perspective. In addition, or alternatively, the visual sensor system may display information about the hazard on the video display, such as the distance, velocity, traveling direction, and/or trajectory of the hazard, among other examples.

In this manner, the visual sensor system facilitates situational awareness by identifying the operating environment and situational context of an aircraft, including objects, people, animals, structures, or terrain that are within the vicinity of the aircraft. In some embodiments, for example, the visual sensor system may include a variety of sensors for detecting motion, distance, proximity, heat, visuals, and/or sound, among other examples. Moreover, in some embodiments, the visual sensor system may be implemented using lightweight and inexpensive "off-the-shelf" sensors with relatively short range (e.g., 20-30 feet). Any type and/or combination of sensors can be used, including optical sensors (e.g., light detection and ranging (LIDAR) or other laser rangefinders, infrared (IR) sensors, ultraviolet (UV) sensors, cameras), radio-based sensors (e.g., radar, ultrasonic sensors), sound or acoustic sensors (e.g., microphones), thermal sensors, electromagnetic sensors, and so forth.

Moreover, in some embodiments, a visual sensor system may leverage multiple sensor technologies to enhance the sophistication and performance of the situational awareness functionality. For example, multiple sensor technologies can be used to determine when an object comes within a certain vicinity of an aircraft, the distance and/or location of the object relative to the aircraft, the velocity, direction, angle, and/or trajectory of the object, the size and shape of the object, whether the object is living, the type of object, and so forth. For example, proximity, distance, and/or motion sensors can be used to detect a nearby object and determine its relative distance, location, and/or movement, visual sensors can be used to determine the size and shape of the object, thermal sensors can be used to determine whether the object is living (e.g., based on heat emitted by the object), and so forth. Moreover, various equipment and/or sensors can be used to identify the operational or flight status of the aircraft, such as whether the aircraft is grounded or in-flight, its altitude, speed, and direction, the flight mode (e.g., hover, forward flight, ascent, descent), and so forth. The type of object can then be identified based on the situational information collectively obtained from the various sources, such as the size and shape of the object, visuals or images of the object, whether it is living, the sounds that it emits, the circumstances in which it is encountered, and so forth. For example, an object may be determined to be a bird if the object is living, has a size and/or shape that resembles a bird, and is encountered in the air (e.g., during flight). As another example, an object may be determined to be a human if the object is living, has a size and/or shape that resembles a human, and is encountered on the ground. Other types of objects may be identified in a similar manner, including buildings and other structures, powerlines, terrain, and so forth. In this manner, the sensors may be used to accurately detect hazards, such as objects approaching an aircraft or dangerous landing conditions, among other examples.

In some cases, the output of a sensor or camera may be subjected to noise. Accordingly, in some embodiments, the visual sensor system may reduce certain types of noise. For example, a sensor or camera could be obstructed by a nearby rotor or propeller that rotates through the field of view of the sensor or camera. Accordingly, in some embodiments, the visual sensor system may filter out the rotor blade or propeller from the sensor or camera output. In some cases, for example, the rotor blade or propeller may be filtered from the sensor or camera output based on the rotation frequency or rotations-per-minute (RPMs) of the rotor blade or propeller. For example, the frame capture rate of a camera may be synchronized with the RPMs of the rotor blade or propeller to avoid capturing frames when the camera is obstructed by the rotor blade or propeller. The capture rate of a sensor may be configured in a similar manner to avoid capturing data when obstructed by the rotor blade or propeller. Alternatively, the RPMs of the rotor blade or propeller may be used to filter the output of a sensor or camera to remove data captured during obstructions from the rotor blade or propeller. In this manner, the sensor or camera can "see through" a rotor blade or propeller.
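As a concrete way to picture the RPM-based filtering just described, the sketch below discards any sample captured while a blade sweeps across the sensor's line of sight, given the rotor speed and blade count. The geometry parameters (angular width of the obstruction, sensor azimuth, rotor phase) and the function names are illustrative assumptions; the disclosure does not specify a particular filtering algorithm.

    import math

    def blade_obstructs(t: float, rpm: float, n_blades: int,
                        sensor_azimuth_deg: float,
                        blade_width_deg: float = 8.0,
                        phase_deg: float = 0.0) -> bool:
        """True if a blade covers the sensor's line of sight at time t (seconds),
        assuming a constant rotor speed and evenly spaced blades."""
        rotor_deg = (phase_deg + 360.0 * (rpm / 60.0) * t) % 360.0
        spacing = 360.0 / n_blades
        for k in range(n_blades):
            blade_deg = (rotor_deg + k * spacing) % 360.0
            # smallest angular difference between the blade and the line of sight
            diff = abs((blade_deg - sensor_azimuth_deg + 180.0) % 360.0 - 180.0)
            if diff <= blade_width_deg / 2.0:
                return True
        return False

    def filter_obstructed(samples, rpm, n_blades, sensor_azimuth_deg):
        """Drop (timestamp, value) samples captured while obstructed, so the
        remaining output effectively 'sees through' the rotor."""
        return [(t, v) for (t, v) in samples
                if not blade_obstructs(t, rpm, n_blades, sensor_azimuth_deg)]

The same test could instead be used the other way around, to schedule captures only at times when blade_obstructs() returns False, which corresponds to synchronizing the capture rate with the rotor RPM as described above.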
`
As another example, the vibrations of an aircraft may cause a sensor or camera to move or vibrate, which may introduce noise into the output of the sensor or camera. For example, a camera on the tail boom or tail rotor of a rotorcraft may shake during flight due to vibrations from the rotorcraft. Accordingly, noise from vibrations may be filtered from the sensor or camera output, thus stabilizing the sensor or camera. In some embodiments, for example, a gyroscope could be used to filter noise caused by vibrations. In this manner, for example, a camera view may appear stabilized or still even when the camera is subjected to vibrations.

Moreover, upon detecting a potential hazard, the visual sensor system may perform various remedial measures to minimize the risk presented by the hazard, such as providing a notification or warning to the appropriate personnel (e.g., to a pilot, ground operator, or nearby aircraft or vehicle), displaying a camera view of the hazard to the pilot and/or superimposing a graphical representation of the hazard onto a camera view of the aircraft, adjusting the flight path of the aircraft, deploying a physical safety barrier (e.g., an airbag) between the aircraft and the hazard, shutting off and/or stopping an engine or rotor, using the thrust of an engine or rotor to push an object away from the aircraft and/or generate a burst of air as a warning to the object (e.g., assuming the object is in the thrust path), and so forth. In some embodiments, for example, the visual sensor system may provide notifications or warnings about nearby hazards using audible or visual cues (e.g., beeps, growls, horns, flashing lights, strobe lights) and/or vibrations (e.g., vibrations in the pedals, cyclic stick, or seat pan), among other examples. The particular remedial measures performed by the visual sensor system may vary based on the circumstances, including the type of hazard identified, among other factors.

The embodiments of a visual sensor system described throughout this disclosure provide numerous technical advantages, including, for example, accurately detecting and/or responding to hazards using a variety of sensor technologies and/or cameras selectively positioned on an aircraft. The described embodiments can be implemented in a cost-efficient manner using lightweight and inexpensive sensors (e.g., "off-the-shelf" sensors and/or existing sensor technologies). The described embodiments can also display meaningful views of detected hazards, for example, using selectively positioned cameras that provide optimal views of the hazards (e.g., views showing the hazards relative to the aircraft), and/or incorporating visual representations of hazards onto the camera views using data from sensors. The described embodiments can also perform various other remedial measures to minimize and/or eliminate the risk presented by hazards, such as providing notifications and/or warnings, altering flight paths, shutting off or stopping engines or rotors, and so forth. The described embodiments may integrate a novel and unique combination of hardware and software that greatly improves situational awareness during operation of an aircraft, thus resulting in significantly increased safety. The safety benefits are particularly advantageous to rotorcraft and other aircraft with unprotected rotors and/or propellers that can potentially contact static and dynamic objects external to the aircraft, as the risk of dangerous contact can be minimized or avoided using the described embodiments. Moreover, reducing the risk of contact with a rotor or propeller is particularly beneficial to rotorcraft and other aircraft capable of vertical take-off and landing (VTOL).

Example embodiments of a visual sensor system are described below with more particular reference to the remaining FIGURES. Moreover, it should be appreciated that rotorcraft 100 of FIG. 1 is merely illustrative of a variety of aircraft and other vehicles that can be used with embodiments described throughout this disclosure. Other aircraft implementations can include, for example, fixed wing airplanes, hybrid aircraft, tiltrotor aircraft, unmanned aircraft, gyrocopters, a variety of helicopter configurations, drones, and other propeller and/or jet engine aircraft, among other examples. The embodiments described throughout this disclosure can similarly be implemented in any other type of vehicle, including land-based vehicles and water-based vehicles.

FIGS. 2A and 2B illustrate an example embodiment of a visual sensor system 200 for the tail rotor of a rotorcraft. As described further below, FIG. 2A illustrates an example embodiment of a rotorcraft with visual sensor system 200, and FIG. 2B illustrates an example embodiment of a display for providing a view of hazards detected by visual sensor system 200.

FIG. 2A illustrates an example embodiment of a rotorcraft 100 with a visual sensor system 200 for detecting and responding to hazards near the tail rotor 140. In some cases, for example, the tail rotor 140 may present a dangerous risk of contact with static and dynamic objects external to rotorcraft 100, particularly during vertical take-off and landing (VTOL). This risk presented by the tail rotor 140, however, can be minimized or avoided using visual sensor system 200, as described further below.

In the illustrated example, visual sensor system 200 includes a visual control system 202, one or more sensors 204, and a camera 206. Visual control system 202 may be used to control the operation of visual sensor system 200. In some embodiments, for example, visual control system 202 may be a collection of hardware and/or software configured to control visual sensor system 200. Sensors 204 may be used to detect hazards near the tail rotor 140 of rotorcraft 100, such as objects, people, animals, structures, surfaces, and/or terrain within the vicinity of the tail rotor 140. Sensors 204 may include a variety of types of sensors and/or sensor technologies, as described above in connection with FIG. 1. In the illustrated example, sensors 204 are selectively positioned near the tail rotor 140 for optimal detection of hazards near the tail rotor 140. For example, sensors 204 may be arranged to detect objects that come within a particular distance of the tail rotor 140 in any direction, thus resulting in a sensor range 205 that forms a protective sphere around the tail rotor 140. Camera 206 may be used to provide a camera view of hazards detected by visual sensor system 200. Camera 206 (or a collection of cameras 206) may be selectively positioned on rotorcraft 100 to provide an optimal view of hazards detected near the tail rotor 140, such as a view that provides spatial recognition by showing the hazards relative to the tail rotor 140. In the illustrated example, camera 206 is positioned away from the tail rotor 140, near the forward end of the tail boom of rotorcraft 100, in order to provide a camera view 207 that shows both the tail rotor 140 and any surrounding hazards. By contrast, placing the camera near the tail rotor may result in a perspective that shows hazards near the tail rotor in isolation (e.g., without showing the tail rotor itself), thus providing no visual context showing the hazard's location relative to the rotorcraft.
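The discussion that follows turns on a simple geometric question for object 208: does a sensor-detected object fall inside camera view 207, or must it instead be drawn synthetically on the display? The sketch below shows one minimal way to make that decision using a horizontal field-of-view test; the planar coordinate frame, the function name, and the example numbers are assumptions for illustration only and are not taken from the disclosure.

    import math

    def in_camera_view(obj_xy, camera_xy, camera_heading_deg, fov_deg):
        """Return True if an object at obj_xy (meters, in a planar aircraft
        frame) lies within a camera's horizontal field of view. The camera is
        located at camera_xy, points toward camera_heading_deg, and has a full
        horizontal field of view of fov_deg degrees."""
        dx = obj_xy[0] - camera_xy[0]
        dy = obj_xy[1] - camera_xy[1]
        bearing = math.degrees(math.atan2(dy, dx))
        offset = abs((bearing - camera_heading_deg + 180.0) % 360.0 - 180.0)
        return offset <= fov_deg / 2.0

    # Illustrative check for a tail-boom camera aimed along +x with a 90-degree FOV:
    camera = (0.0, 0.0)
    print(in_camera_view((8.0, 1.0), camera, 0.0, 90.0))   # True: near the tail rotor, in view
    print(in_camera_view((0.0, 5.0), camera, 0.0, 90.0))   # False: abeam the camera, out of view

An object that fails this test is the case handled next: still inside sensor range 205 but outside camera view 207, and therefore shown on the display as a superimposed graphical representation rather than as camera pixels.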
`
In the illustrated example, an object 208 near the tail rotor 140 is within the sensor range 205 but outside of the camera view 207. Accordingly, in some embodiments, if an object detected by the sensors 204 is not within the camera view 207, a graphical representation of the object may be superimposed on the camera feed using data from the sensors 204, as described below in connection with FIG. 2B.

In this manner, visual sensor system 200 may include an arrangement of sensors 204 and/or cameras 206 to provide depth perception for protective spherical zones around the tail rotor 140 (e.g., a sphere with a defined radius of 10-15 feet). The location of static and dynamic objects in the proximity of the tail rotor 140 can be detected and analyzed, and appropriate remedial measures may be performed, such as providing warning(s) to the pilot. For example, sensors 204 may be used to observe all objects within their respective fields of view, and based on the data from sensors 204, visual control system 202 may perform geolocation techniques such as triangulation to locate the objects in space relative to the tail rotor 140. Moreover, based on the location and movement of an object, the pilot may be given visual and/or audible cues as to the nature and risk of colliding with the object. In some embodiments, for example, the visual sensor system 200 may provide notifications or warnings about nearby hazards using audible or visual cues (e.g., beeps, growls, horns, flashing lights, strobe lights) and/or vibrations (e.g., vibrations in the pedals, cyclic stick, or seat pan), among other examples. Moreover, in some embodiments, increasing levels of warning may be provided to the pilot based on the proximity of an object to the tail rotor 140, such as an increasing number and/or increasing volume of audible cues as an object approaches the tail rotor 140. Other remedial measures may also be performed to minimize the

detected by the sensors is not within view of the camera, a graphical representation of the hazard may be superimposed on the camera feed using data from the sensors. For example, the sensor data may be used to visually plot a 2D or 3D representation of the hazard on display 210 at the appropriate location relative to the camera perspective (e.g., using software in visual control system 202 of FIG. 2A). In some embodiments, the hazard may be represented using a rough approximation or rendering, such as a shape or symbol (e.g., circle, box, star, or dot) that may be blinking or flashing. In other embodiments, the hazard may be represented using a more sophisticated pixelated rendering that provides a more realistic depiction of the hazard. The representation of the hazard may depend on the sophistication of the sensors available to the visual sensor system. Basic distance sensors, for example, may only be capable of determining the location of objects, and thus the objects may be represented using a predetermined shape or symbol. More advanced sensor technologies, however, may enable sophisticated identification of the size, shape, and/or type of objects, thus enabling the objects to be represented using more realistic representations.

In the illustrated example, display 210 is used to display a camera view 207 of the tail rotor using the feed from camera 206 of FIG. 2A. In addition, display 210 is also used to display a graphical representation of an object 208 detected near the tail rotor by sensor(s) 204 and that is not within the camera view 207. In this manner, the pilot can be provided with a view of hazards even when the hazards are outside the camera view.

FIG. 3 illustrates an example embodiment of a visual sensor system 300 for detecting hazards below a rotorcraft
`