Search Results

41 result(s) for "Sandino, Juan"
A Review of UAV Path-Planning Algorithms and Obstacle Avoidance Methods for Remote Sensing Applications
The rapid development of uncrewed aerial vehicles (UAVs) has significantly increased their usefulness in various fields, particularly in remote sensing. This paper provides a comprehensive review of UAV path planning, obstacle detection, and avoidance methods, with a focus on their utilisation in both single and multiple UAV platforms. The paper classifies the algorithms into two main categories: (1) global and local path-planning approaches in single UAVs; and (2) multi-UAV path-planning methods. It further analyses obstacle detection and avoidance methods, as well as their capacity to adapt, optimise, and compute efficiently in different operational environments. The outcomes highlight the advantages and limitations of each method, offering valuable information regarding their suitability for remote sensing applications, such as precision agriculture, urban mapping, and ecological surveillance. This review also identifies limitations in the existing research, specifically in multi-UAV frameworks, and provides recommendations for future developments to improve the adaptability and effectiveness of UAV operations in dynamic and complex situations.
Detection of White Leaf Disease in Sugarcane Crops Using UAV-Derived RGB Imagery with Existing Deep Learning Models
White leaf disease (WLD) is an economically significant disease in the sugarcane industry. This work applied remote sensing techniques based on unmanned aerial vehicles (UAVs) and deep learning (DL) to detect WLD in sugarcane fields at the Gal-Oya Plantation, Sri Lanka. The established methodology to detect WLD consists of UAV red, green, and blue (RGB) image acquisition, pre-processing of the dataset, labelling, DL model tuning, and prediction. This study evaluated the performance of existing DL models, such as YOLOv5, YOLOR, DETR, and Faster R-CNN, in recognising WLD in sugarcane crops. The experimental results indicate that the YOLOv5 network outperformed the other selected models, achieving precision, recall, mean average precision@0.50 (mAP@0.50), and mean average precision@0.95 (mAP@0.95) of 95%, 92%, 93%, and 79%, respectively. In contrast, DETR exhibited the weakest detection performance, achieving 77%, 69%, 77%, and 41% for precision, recall, mAP@0.50, and mAP@0.95, respectively. YOLOv5 is recommended as the architecture to detect WLD from UAV data not only for its performance but also for its size (14 MB), the smallest among the selected models. The proposed methodology provides technical guidelines for researchers and farmers to conduct accurate detection and treatment of WLD in sugarcane fields.
UAV Framework for Autonomous Onboard Navigation and People/Object Detection in Cluttered Indoor Environments
Response efforts in emergency applications such as border protection, humanitarian relief, and disaster monitoring have improved with the use of Unmanned Aerial Vehicles (UAVs), which provide a flexibly deployed eye in the sky. These efforts have been further improved with advances in autonomous behaviours such as obstacle avoidance, take-off, landing, hovering, and waypoint flight modes. However, most UAVs lack autonomous decision making for navigating in complex environments. This limitation creates a reliance of UAVs on ground control stations and, therefore, on their communication systems. The challenge is even more complex in indoor flight operations, where Global Navigation Satellite System (GNSS) signals are weak or absent and compromise aircraft behaviour. This paper proposes a UAV framework for autonomous navigation that addresses uncertainty and partial observability from imperfect sensor readings in cluttered indoor scenarios. The framework design allocates the computing processes onboard the flight controller and companion computer of the UAV, allowing it to explore dangerous indoor areas without the supervision and physical presence of a human operator. The system is illustrated under a Search and Rescue (SAR) scenario to detect and locate victims inside a simulated office building. The navigation problem is modelled as a Partially Observable Markov Decision Process (POMDP) and solved in real time through the Augmented Belief Trees (ABT) algorithm. Data are collected using Hardware-in-the-Loop (HIL) simulations and real flight tests. Experimental results show the robustness of the proposed framework in detecting victims at various levels of location uncertainty. The proposed system ensures personal safety by letting the UAV explore dangerous environments without the intervention of a human operator.
UAVs and Machine Learning Revolutionising Invasive Grass and Vegetation Surveys in Remote Arid Lands
The monitoring of invasive grasses and vegetation in remote areas is challenging, costly, and sometimes dangerous on the ground. Satellite and manned aircraft surveys can assist, but their use may be limited by ground sampling resolution or cloud cover. Straightforward and accurate surveillance methods are needed to quantify rates of grass invasion, offer appropriate vegetation tracking reports, and apply optimal control methods. This paper presents a pipeline to detect and generate a pixel-wise segmentation of invasive grasses, using buffel grass (Cenchrus ciliaris) and spinifex (Triodia sp.) as examples. The process integrates unmanned aerial vehicles (UAVs), also commonly known as drones, high-resolution red, green, and blue (RGB) cameras, and a data processing approach based on machine learning algorithms. The methods are illustrated with data acquired in Cape Range National Park, Western Australia (WA), Australia, orthorectified in Agisoft Photoscan Pro, and processed in the Python programming language with the scikit-learn and eXtreme Gradient Boosting (XGBoost) libraries. In total, 342,626 samples were extracted from the obtained dataset and labelled into six classes. Segmentation results provided an individual detection rate of 97% for buffel grass and 96% for spinifex, with a global multiclass pixel-wise detection rate of 97%. The obtained results were robust against illumination changes, object rotation, occlusion, background cluttering, and floral density variation.
Advancing Sparse Vegetation Monitoring in the Arctic and Antarctic: A Review of Satellite and UAV Remote Sensing, Machine Learning, and Sensor Fusion
Polar vegetation is a critical component of global biodiversity and ecosystem health but is vulnerable to climate change and environmental disturbances. Analysing the spatial distribution, regional variations, and temporal dynamics of this vegetation is essential for implementing conservation efforts in these unique environments. However, polar regions pose distinct challenges for remote sensing, including sparse vegetation, extreme weather, and frequent cloud cover. Advances in remote sensing technologies, including satellite platforms, uncrewed aerial vehicles (UAVs), and sensor fusion techniques, have improved vegetation monitoring capabilities. This review explores applications—including land cover mapping, vegetation health assessment, biomass estimation, and temporal monitoring—and the methods developed to address these needs. We also examine the role of spatial, spectral, and temporal resolution in improving monitoring accuracy and addressing polar-specific challenges. Sensors such as Red, Green, and Blue (RGB), multispectral, hyperspectral, Synthetic Aperture Radar (SAR), light detection and ranging (LiDAR), and thermal, as well as UAV and satellite platforms, are analysed for their roles in low-stature polar vegetation monitoring. We highlight the potential of sensor fusion and advanced machine learning techniques in overcoming traditional barriers, offering a path forward for enhanced monitoring. This paper highlights how advances in remote sensing enhance polar vegetation research and inform adaptive management strategies.
Drone-Based Autonomous Motion Planning System for Outdoor Environments under Object Detection Uncertainty
Recent advances in the autonomy of unmanned aerial vehicles (UAVs) have increased their use in remote sensing applications, such as precision agriculture, biosecurity, disaster monitoring, and surveillance. However, onboard UAV cognition capabilities for understanding and interacting in environments with imprecise or partial observations of objects of interest within complex scenes are limited and have not yet been fully investigated. This limitation of onboard decision-making under uncertainty has delegated the motion planning strategy in complex environments to human pilots, who rely on communication subsystems and real-time telemetry from ground control stations. This paper presents a UAV-based autonomous motion planning and object finding system under uncertainty and partial observability in outdoor environments. The proposed system architecture follows a modular design, which allocates most of the computationally intensive tasks to a companion computer onboard the UAV to achieve high-fidelity results in simulated environments. We demonstrate the system with a search and rescue (SAR) case study, where a lost person (victim) in bushland needs to be found using a sub-2 kg quadrotor UAV. The navigation problem is mathematically formulated as a partially observable Markov decision process (POMDP). A motion strategy (or policy) is obtained once the POMDP is solved mid-flight and in real time using augmented belief trees (ABT) and the TAPIR toolkit. The system's performance was assessed using three flight modes: (1) mission mode, which follows a survey plan and is used here as the baseline motion planner; (2) offboard mode, which runs the POMDP-based planner across the flying area; and (3) hybrid mode, which combines mission and offboard modes for improved coverage in outdoor scenarios. Results suggest that the increased cognitive power added by the proposed motion planner and flight modes allows UAVs to collect more accurate victim coordinates than the baseline planner. Adding the proposed system to UAVs improves robustness against potential false positive readings of detected objects caused by data noise and inaccurate detections, and against the elevated complexity of navigating in time-critical applications, such as SAR.
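The POMDP formulation above hinges on maintaining a belief over where the victim might be and updating it from noisy detections. As a minimal, self-contained illustration (not the paper's ABT/TAPIR solver), a Bayesian observation update over a hypothetical strip of search cells might look like:

```python
# Minimal illustration of a POMDP-style belief update, not the paper's
# planner: a discrete belief over which cell holds the victim, refined
# by a noisy onboard detector with assumed error rates.
import numpy as np

P_DETECT = 0.9  # assumed true-positive rate of the detector
P_FALSE = 0.1   # assumed false-positive rate

def update_belief(belief, cell_observed, detected):
    """Bayes rule: b'(s) is proportional to P(o | s, cell) * b(s)."""
    likelihood = np.where(
        np.arange(belief.size) == cell_observed,
        P_DETECT if detected else 1 - P_DETECT,
        P_FALSE if detected else 1 - P_FALSE,
    )
    posterior = likelihood * belief
    return posterior / posterior.sum()

belief = np.full(5, 0.2)                 # uniform prior over 5 cells
belief = update_belief(belief, 2, True)  # positive detection at cell 2
print(belief)  # probability mass concentrates on cell 2
```

A POMDP planner such as ABT builds on exactly this kind of update, choosing actions that are expected to sharpen the belief while progressing the mission.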
Unmanned Aerial Vehicles for Real-Time Vegetation Monitoring in Antarctica: A Review
The unique challenges of polar ecosystems, coupled with the necessity for high-precision data, make Unmanned Aerial Vehicles (UAVs) an ideal tool for vegetation monitoring and conservation studies in Antarctica. This review draws on existing studies on Antarctic UAV vegetation mapping, focusing on their methodologies, including surveyed locations, flight guidelines, UAV specifications, sensor technologies, data processing techniques, and the use of vegetation indices. Despite the potential of established Machine Learning (ML) classifiers, such as Random Forest, K-Nearest Neighbour, Support Vector Machine, and gradient boosting, in the semantic segmentation of UAV-captured images, there is a notable scarcity of research employing Deep Learning (DL) models in these extreme environments. While initial studies suggest that DL models could match or surpass the performance of established classifiers, even on small datasets, the integration of these advanced models into real-time navigation systems on UAVs remains underexplored. This paper evaluates the feasibility of deploying UAVs equipped with adaptive path-planning and real-time semantic segmentation capabilities, which could significantly enhance the efficiency and safety of mapping missions in Antarctica. This review discusses the technological and logistical constraints observed in previous studies and proposes directions for future research to optimise autonomous drone operations in harsh polar conditions.
Drone hyperspectral imaging and artificial intelligence for monitoring moss and lichen in Antarctica
Uncrewed aerial vehicles (UAVs) have become essential for remote sensing in extreme environments like Antarctica, but detecting moss and lichen using conventional red, green, blue (RGB) and multispectral sensors remains challenging. This study investigates the potential of hyperspectral imaging (HSI) for mapping cryptogamic vegetation and presents a workflow combining UAVs, ground observations, and machine learning (ML) classifiers. Data collected during a 2023 summer expedition to Antarctic Specially Protected Area 135, East Antarctica, were used to evaluate 12 configurations derived from five ML models, including gradient boosting (XGBoost, CatBoost) and convolutional neural networks (CNNs) (G2C-Conv2D, G2C-Conv3D, and UNet), tested with full and light input feature sets. The results show that common indices like normalised difference vegetation index (NDVI) are inadequate for moss and lichen detection, while novel spectral indices are more effective. Full models achieved high performance, with CatBoost and UNet reaching 98.3% and 99.7% weighted average accuracy, respectively. Light models using eight key wavelengths (i.e., 404, 480, 560, 655, 678, 740, 888, and 920 nm) performed well, with CatBoost at 95.5% and UNet at 99.8%, demonstrating suitability for preliminary monitoring of moss health and lichen. These findings underscore the importance of key spectral bands for large-scale HSI monitoring using UAVs and satellites in Antarctica, especially in geographic regions with limited spectral range.
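For context on why common indices can fall short for cryptogamic vegetation, NDVI is computed from just two bands per pixel. A minimal sketch with assumed, synthetic reflectance values (not the expedition data):

```python
# Toy sketch of the normalised difference vegetation index (NDVI),
# which the study found inadequate for moss and lichen detection.
# Band values below are synthetic reflectances for illustration only.
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

red_band = np.array([[0.10, 0.30], [0.05, 0.25]])  # synthetic 2x2 tiles
nir_band = np.array([[0.50, 0.35], [0.45, 0.30]])
print(ndvi(nir_band, red_band))
```

Because NDVI compresses the spectrum into two bands, spectrally subtle targets like moss and lichen motivate the novel indices and the eight key wavelengths identified in the study.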
Monitoring of Antarctica’s Fragile Vegetation Using Drone-Based Remote Sensing, Multispectral Imagery and AI
Vegetation in East Antarctica, such as moss and lichen, is vulnerable to the effects of climate change and ozone depletion and requires robust, non-invasive methods to monitor its health condition. Despite the increasing use of unmanned aerial vehicles (UAVs) to acquire high-resolution data for vegetation analysis in Antarctic regions through artificial intelligence (AI) techniques, the use of multispectral imagery and deep learning (DL) is quite limited. This study addresses this gap with two pivotal contributions: (1) it underscores the potential of DL in a field with notably limited implementations for these datasets; and (2) it introduces an innovative workflow that compares the performance of two supervised machine learning (ML) classifiers: Extreme Gradient Boosting (XGBoost) and U-Net. The proposed workflow is validated by detecting and mapping moss and lichen using data collected in the highly biodiverse Antarctic Specially Protected Area (ASPA) 135, situated near Casey Station, between January and February 2023. The implemented ML models were trained on five classes: Healthy Moss, Stressed Moss, Moribund Moss, Lichen, and Non-vegetated. In developing the U-Net model, two methods were applied: Method 1, which used the same original labelled data as XGBoost; and Method 2, which incorporated XGBoost predictions as additional input to U-Net. Results indicate that XGBoost demonstrated robust performance, exceeding 85% in key metrics such as precision, recall, and F1-score. The workflow suggested enhanced accuracy in the classification outputs for U-Net, as Method 2 demonstrated a substantial increase in precision, recall, and F1-score compared to Method 1, with notable improvements such as precision for Healthy Moss (Method 2: 94% vs. Method 1: 74%) and recall for Stressed Moss (Method 2: 86% vs. Method 1: 69%).
These findings contribute to advancing non-invasive monitoring techniques for the delicate Antarctic ecosystems, showcasing the potential of UAVs, high-resolution multispectral imagery, and ML models in remote sensing applications.
A Green Fingerprint of Antarctica: Drones, Hyperspectral Imaging, and Machine Learning for Moss and Lichen Classification
Mapping Antarctic Specially Protected Areas (ASPAs) remains a critical yet challenging task, especially in extreme environments like Antarctica. Traditional methods are often cumbersome, expensive, and risky, with limited satellite data further hindering accuracy. This study addresses these challenges by developing a workflow that enables precise mapping and monitoring of vegetation in ASPAs. The processing pipeline of this workflow integrates small unmanned aerial vehicles (UAVs)—or drones—to collect hyperspectral and multispectral imagery (HSI and MSI), global navigation satellite system (GNSS) enhanced with real-time kinematics (RTK) to collect ground control points (GCPs), and supervised machine learning classifiers. This workflow was validated in the field by acquiring ground and aerial data at ASPA 135, Windmill Islands, East Antarctica. The data preparation phase involves a data fusion technique to integrate HSI and MSI data, achieving the collection of georeferenced HSI scans with a resolution of up to 0.3 cm/pixel. From these high-resolution HSI scans, a series of novel spectral indices were proposed to enhance the classification accuracy of the model. Model training was achieved using extreme gradient boosting (XGBoost), with four different combinations tested to identify the best fit for the data. The research results indicate the successful detection and mapping of moss and lichens, with an average accuracy of 95%. Optimised XGBoost models, particularly Model 3 and Model 4, demonstrate the applicability of the custom spectral indices to achieve high accuracy with reduced computing power requirements. The integration of these technologies results in significantly more accurate mapping compared to conventional methods. This workflow serves as a foundational step towards more extensive remote sensing applications in Antarctic and ASPA vegetation mapping, as well as in monitoring the impact of climate change on the Antarctic ecosystem.