Multi-Temporal Analysis of Forestry and Coastal Environments Using UASs

abstract

  • Owing to the considerable advances achieved over the last decade, it is clear that applied research using remote sensing technology such as unmanned aerial vehicles (UAVs) can provide a flexible, efficient, non-destructive, and non-invasive means of acquiring geoscientific data, especially aerial imagery. Simultaneously, there has been an exponential increase in the development of sensors and instruments that can be installed on UAV platforms. As a result, unmanned aerial system (UAS) setups, composed of UAVs, sensors, and ground control stations, have been increasingly used for remote sensing applications, with growing capabilities. This paper's overall goal is to identify the advantages and challenges of using UAVs for aerial imagery acquisition in forestry and coastal environments in preservation/prevention contexts. Moreover, the importance of monitoring these environments over time is demonstrated. To achieve these goals, two case studies using UASs were conducted. The first focuses on phytosanitary problem detection and monitoring of chestnut tree health (Padrela region, Valpaços, Portugal). The acquired high-resolution imagery allowed the identification of tree canopy cover decline by means of multi-temporal analysis. The second case study enabled the rigorous, non-invasive recording of topographic changes that occurred in the sandspit of Cabedelo (Douro estuary, Porto, Portugal) across different time periods. The obtained results allow us to conclude that the UAS constitutes a low-cost, rigorous, and fairly autonomous form of remote sensing technology, capable of covering large geographical areas and acquiring high-precision data to aid decision support systems in forestry preservation and coastal monitoring applications. Its swift evolution positions it as a key player among remote sensing technologies today and in the near future.
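The abstract does not specify the vegetation index or thresholds used for the canopy-decline analysis; the following is a minimal, hypothetical sketch of one common approach to multi-temporal change detection, assuming per-pixel red and near-infrared reflectance rasters from two survey dates and an illustrative NDVI-drop threshold of 0.2 (all names and values are assumptions, not the authors' method).

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index per pixel (small epsilon avoids division by zero)."""
    return (nir - red) / (nir + red + 1e-9)

def canopy_decline(nir_t0, red_t0, nir_t1, red_t1, threshold=0.2):
    """Flag pixels whose NDVI dropped by more than `threshold` between the two dates."""
    delta = ndvi(nir_t1, red_t1) - ndvi(nir_t0, red_t0)
    return delta < -threshold

# Synthetic 2x2 example: one pixel loses most of its NIR reflectance between dates.
nir_t0 = np.array([[0.8, 0.8], [0.8, 0.8]])
red_t0 = np.array([[0.1, 0.1], [0.1, 0.1]])
nir_t1 = np.array([[0.8, 0.3], [0.8, 0.8]])
red_t1 = np.array([[0.1, 0.2], [0.1, 0.1]])
decline = canopy_decline(nir_t0, red_t0, nir_t1, red_t1)
```

In practice the two rasters would first be co-registered (e.g. via ground control points), which is part of what the ground control station in a UAS setup supports.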

publication date

  • December 1, 2017