J Intell Robot Syst (2010) 57:217–231
DOI 10.1007/s10846-009-9382-2

A Vision-Based Automatic Landing Method for Fixed-Wing UAVs

Sungsik Huh · David Hyunchul Shim

Received: 1 February 2009 / Accepted: 1 August 2009 / Published online: 23 October 2009
© Springer Science + Business Media B.V. 2009

Abstract In this paper, a vision-based landing system for small-size fixed-wing unmanned aerial vehicles (UAVs) is presented. Since a single GPS without a differential correction typically provides position accuracy of at most a few meters, an airplane equipped with only a single GPS is not guaranteed to land at a designated location with sufficient accuracy. Therefore, a visual servoing algorithm is proposed to improve the accuracy of landing. In this scheme, the airplane is controlled to fly into the visual marker by directly feeding back the pitch and yaw deviation angles sensed by the forward-looking camera during the terminal landing phase. The visual marker is a monotone hemispherical airbag, which serves as the arresting device while providing a strong and yet passive visual cue for the vision system. The airbag is detected by using color- and moment-based target detection methods. The proposed idea was tested in a series of experiments using a blended wing-body airplane and proven to be viable for landing of small fixed-wing UAVs.

Keywords Landing dome · Fixed-wing UAVs · Visual servoing · Automatic landing

S. Huh (B) · D. H. Shim
Department of Aerospace Engineering, KAIST, Daejeon, South Korea
e-mail: [email protected]
URL: http://unmanned.kaist.ac.kr

D. H. Shim
e-mail: [email protected]

1 Introduction

It is well known that landing is the most accident-prone stage for both manned and unmanned airplanes, since it is a delicate process of dissipating the large amount of kinetic and potential energy of the airplane in the presence of various dynamic and operational constraints. Therefore, commercial airliners rely heavily on the instrument landing system (ILS) wherever available. As for UAVs, a safe and sound retrieval of the airframe is still a significant concern. Typically, UAVs are landed manually by either internal or external human pilots on a conventional runway or into arresting mechanisms. For manual landing, the pilot obtains a visual cue by naked eye or through the live images taken by the onboard camera. Piloting from outside the vehicle requires a great deal of practice due to the limited situational awareness. As a consequence, a significant portion of mishaps happen during the landing phase. Nearly 50% of fixed-wing UAVs such as Hunter and Pioneer operated by the US military suffer accidents during landing. As for Pioneer UAVs, almost 70% of mishaps occur during landing due to human factors [1, 2]. Therefore, it has long been desired to automate the landing process of UAVs in order to reduce the number of accidents while alleviating the pilot's load.

There are a number of automatic landing systems currently in service. Global Hawk relies on a high-precision differential GPS during take-off and landing [3]. Sierra Nevada Corporation's UCARS and TALS¹ are externally located aiding systems consisting of a tracking radar and communication radios. They have been successfully used for landing many military fixed-wing and helicopter UAVs, such as Hunter and Fire Scout, on runways or even on a ship deck. The information, including the relative position and altitude of the inbound aircraft, is measured by the ground tracking radar and relayed back for automated landing. Some UAVs are retrieved in a confined space using a special net, into which the vehicle is flown manually by the external pilot. Scan Eagle is retrieved by a special arresting cable attached to a tall boom, to which the vehicle is precisely guided by a differential GPS.²

These external aids rely on special equipment and radio communication, which are not always available or applicable due to complexity, cost, or limits imposed by the operating environment. Therefore, automatic landing systems that are inexpensive, passive, and reliable are highly desirable.

Vision-based landing has been found attractive since it is passive and does not require any special equipment other than a camera and a vision processing unit onboard. A vision-enabled landing system detects the runway or other visual markers and guides the vehicle to the touchdown point. There is a range of previous work, theoretical and experimental, for fixed-wing and helicopter UAVs [4–6]. Notably, Barber et al. [7] proposed a vision-based landing method for small fixed-wing UAVs, where a visual marker is used to generate the roll and pitch commands for the flight controller.

In this paper, a BWB-shaped fixed-wing UAV is controlled to land automatically using vision algorithms and relatively simple landing aids. The proposed method is shown to be viable without resorting to expensive sensors or external devices. The overall approach, its component technologies, and the landing aids, including the landing dome and recovery net, are described in Section 2. The vision algorithms, including the detection and visual servoing algorithms that enable the airplane to land automatically, are described in Section 3. The experimental results of the vision-based landing tests are presented in Section 4 to validate the algorithm. Finally, conclusions and closing remarks are given in Section 5.

¹ http://www.sncorp.com
² http://www.insitu.com/scaneagle


    2 System Description

The proposed landing system consists of three major components: an inflated dome as a visual marker, a vision processing unit, and a flight controller using a visual servoing algorithm. In order to develop a low-cost landing system without expensive sensors or special supporting systems, the vision system is integrated with a MEMS-based inertial measurement unit (IMU) and a single GPS. Without a differential correction, the positioning accuracy of GPS is known to be a few meters horizontally and twice as bad vertically. Therefore, even if the position of the dome is accurately known, it is still difficult to hit the dome consistently with a navigation system whose position error is larger than the size of the landing dome.

For landing, starting from a roughly estimated location of the dome, the airplane flies along a glide slope leading to the estimated position of the dome. When the vision system locks on to the dome, the flight control switches from glide slope tracking to direct visual servoing, where the offset of the dome from the center of the image taken by the onboard front-looking camera is used as the error signal for the pitch and yaw control loops. In the following, a detailed description of the proposed approach is presented (Fig. 1).
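As a concrete illustration of this scheme, the sketch below (ours, not the authors' code) converts the dome's pixel offset from the image center into the pitch and yaw deviation angles fed back to the control loops; the pinhole camera model and field-of-view parameters are assumptions, not values from the paper.

// Sketch (assumptions noted): map the dome centroid's pixel offset from the
// image center to pitch/yaw deviation angles for the visual servoing loops.
#include <cmath>

struct DeviationAngles {
    double pitchDeg;  // positive when the dome appears above the image center
    double yawDeg;    // positive when the dome appears right of the image center
};

DeviationAngles pixelOffsetToAngles(double domeU, double domeV,
                                    int imgW, int imgH,
                                    double hFovDeg, double vFovDeg) {
    const double kPi = 3.14159265358979323846;
    // Focal lengths in pixels, derived from the assumed fields of view.
    const double fx = (imgW / 2.0) / std::tan(hFovDeg * kPi / 360.0);
    const double fy = (imgH / 2.0) / std::tan(vFovDeg * kPi / 360.0);
    // Offset of the dome centroid from the image center (image v grows downward).
    const double du = domeU - imgW / 2.0;
    const double dv = imgH / 2.0 - domeV;
    // Deviation angles used as error signals for the pitch and yaw loops.
    return { std::atan2(dv, fy) * 180.0 / kPi,
             std::atan2(du, fx) * 180.0 / kPi };
}

Once the dome is locked on, angles of this kind would replace the glide-slope tracking error as the input to the pitch and yaw loops.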

    2.1 Airframe and Avionics

The platform used in this research is a BWB-based UAV (Fig. 2). BWBs are known to carry a substantially larger payload per airframe weight due to the extra lift generated by the airfoil-shaped fuselage. Our BWB testbed is constructed of sturdy expanded polypropylene (EPP), reinforced with carbon fiber spars embedded at strategic locations. The vehicle is resilient to shock and crash, a highly welcome trait for the landing experiments in this paper.

This BWB has only two control surfaces at the trailing edge of the wing, known as elevons, which serve as both ailerons and elevators. At the wingtips, winglets are installed pointing down to improve directional stability. The vehicle has a DC brushless motor mounted at the trailing edge of the airframe, powered by lithium-polymer cells. The same battery powers the avionics to lower the overall weight.

Fig. 1 Proposed dome-assisted landing procedure of a UAV (stages labeled: initial estimation, beginning of glide slope, terminal visual servoing, air dome)


    Fig. 2 The KAIST BWB (blended wing-body) UAV testbed

The large payload compartment (30 × 15 × 7.5 cm) in the middle of the fuselage houses the flight computer, IMU, battery, and radio receiver. These parts are all off-the-shelf products.

The avionics of the KAIST BWB UAV consists of a flight control computer (FCC), an inertial measurement unit (IMU), a GPS receiver, and a servo controller module. The flight control computer is built from PC104-compatible boards for their industry-grade reliability, compactness, and expandability. The navigation system is built around a GPS-aided INS. A U-blox Supersense 5 GPS is used for its excellent tracking capability even when the signal quality is low. Inertial Science's MEMS-IMU showed robust performance during flight. The vehicle communicates with the ground station through a high-bandwidth telemetry link at 900 MHz to reduce the communication latency (Table 1, Fig. 3).

The onboard vision system consists of a small forward-looking camera mounted at the nose of the airframe, a video overlay board, and a 2.4 GHz analog video transmitter with a dipole antenna mounted at a wingtip. Due to the payload limit, the image processing is currently done at the ground station, which also serves as the monitoring and command station; the image processing and vision algorithm results are therefore transmitted back to the airplane over the telemetry link. The image processing and vision algorithms run on a laptop computer with a 2 GHz Pentium Core2 CPU and 2 GB of memory. The vision algorithm is written in MS Visual C++ using the OpenCV library.³ It takes 240 ms to process one image, after which the visual servoing command is transmitted back to the vehicle at 4 Hz, which is shown to be sufficient for the visual servoing problem considered in this research.

³ Intel Open Source Computer Vision Library
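The color- and moment-based detection itself is detailed in Section 3; purely as a rough illustration of that style of OpenCV pipeline, here is a hedged sketch (ours, not the authors' implementation; the HSV thresholds and minimum-area gate are assumed values).

// Illustrative color- and moment-based dome detector in the style the paper
// describes; the HSV color range and area gate below are assumptions.
#include <opencv2/opencv.hpp>

// Returns true and writes the dome centroid if a sufficiently large blob
// of the (assumed) marker color is found in the frame.
bool detectDome(const cv::Mat& frameBgr, cv::Point2d& centroid) {
    cv::Mat hsv, mask;
    cv::cvtColor(frameBgr, hsv, cv::COLOR_BGR2HSV);

    // Threshold on the monotone color of the airbag (range is a placeholder).
    cv::inRange(hsv, cv::Scalar(0, 120, 80), cv::Scalar(15, 255, 255), mask);

    // Remove speckle noise before computing moments.
    cv::morphologyEx(mask, mask, cv::MORPH_OPEN,
                     cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(5, 5)));

    // Image moments give blob area (m00) and centroid (m10/m00, m01/m00).
    const cv::Moments m = cv::moments(mask, /*binaryImage=*/true);
    if (m.m00 < 500.0)  // assumed minimum pixel area to reject clutter
        return false;

    centroid = cv::Point2d(m.m10 / m.m00, m.m01 / m.m00);
    return true;
}

The resulting centroid is what a visual servoing step like the earlier sketch would convert into pitch and yaw deviation angles.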

