Target Localization for Autonomous Landing Site Detection: A Review and Preliminary Result with Static Image Photogrammetry
Main Authors:
Format: Article
Published: Multidisciplinary Digital Publishing Institute (MDPI), 2023
Online Access: http://scholars.utp.edu.my/id/eprint/37427/
https://www.scopus.com/inward/record.uri?eid=2-s2.0-85169112309&doi=10.3390%2fdrones7080509&partnerID=40&md5=fd3951dd84fffd8f8ae958428e4b3f95
Summary: The advancement of autonomous technology in Unmanned Aerial Vehicles (UAVs) has piloted a new era in aviation. While UAVs were initially utilized only for military, rescue, and disaster-response operations, they are now being utilized for domestic and civilian purposes as well. To support these expanded applications and to increase autonomy, the ability of UAVs to perform autonomous landing will be a crucial component. Autonomous landing capability depends heavily on computer vision, which offers several advantages such as low cost, self-sufficiency, strong anti-interference capability, and accurate localization when combined with an Inertial Navigation System (INS). Another significant benefit of this technology is its compatibility with LiDAR technology and Digital Elevation Models (DEM), and the ability to seamlessly integrate these components. The landing area for UAVs can vary, ranging from static to dynamic or complex, depending on the environment. By examining these characteristics and the behavior of UAVs, this paper serves as a reference for computer-vision-guided autonomous landing and presents promising preliminary results with static image photogrammetry. © 2023 by the authors.