Maize Phenotyping Using UAV-borne Hyperspectral, LiDAR, and Thermal Data Fusion and Machine Learning.
Abstract
The development of unmanned aerial vehicles (UAVs) and miniaturized sensing devices has made them an emerging, nondestructive means of crop monitoring, especially for high-throughput plant phenotyping. Most remote sensing studies to date have focused on single-modality data for plant trait prediction. This study demonstrated the use of UAVs, multiple sensors, and computer vision schemes to collect and estimate maize (Zea mays) phenomics for purposes such as food security and bioenergy. A swarm of UAVs carrying a navigation system and a constellation of remote sensors (a Headwall Photonics hyperspectral imager, a Velodyne LiDAR scanner, and an ICI thermal scanner) was deployed over a maize experiment site in Urbana-Champaign, IL at the reproductive stage R5. At season end, a full suite of maize traits was manually measured as ground truth, including dry stalk biomass (kg/ha), cob biomass (kg/ha), dry grain yield (kg/ha), harvest index, grain nitrogen utilization efficiency, grain nitrogen content (kg/ha), plant nitrogen content (kg/ha), and grain density. Methodologically, maize trait prediction was carried out through an extensive experiment ranging from mono-sensory to multimodal models, and from a naïve fusion method (normalized difference spectral indices, NDSI) to traditional machine learning (support vector regression, SVR, and random forest regression, RFR) to deep learning (a convolutional neural network, CNN) for imagery learning. The findings showed that the UAV remotely sensed images can successfully predict all of the maize traits at an early growth stage. In detail, a CNN assembled from 3D and 2D convolutional operations and trained with data augmentation outperformed the conventional methods, achieving the highest coefficient of determination (R2) and the lowest model errors (MAE and RMSE) with a fusion of hyperspectral and LiDAR imagery, and produced digital color-coded maps of the maize traits.
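As a concrete illustration of the NDSI baseline and traditional machine learning pipeline described above, the sketch below pairs an exhaustive NDSI band-pair search with a random forest regressor on synthetic reflectance data. The band count, hyperparameters, and data values are placeholder assumptions, not the study's actual configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Synthetic stand-ins: 60 plots x 50 hyperspectral bands, one trait per plot
reflectance = rng.uniform(0.05, 0.6, size=(60, 50))   # canopy reflectance
trait = rng.uniform(5000, 12000, size=60)             # e.g. grain yield, kg/ha

def ndsi(R, i, j):
    """Normalized difference spectral index: (R_i - R_j) / (R_i + R_j)."""
    return (R[:, i] - R[:, j]) / (R[:, i] + R[:, j])

# Exhaustive search for the band pair whose NDSI best correlates with the trait
n_bands = reflectance.shape[1]
pairs = [(i, j) for i in range(n_bands) for j in range(i + 1, n_bands)]
best_pair = max(
    pairs,
    key=lambda p: abs(np.corrcoef(ndsi(reflectance, *p), trait)[0, 1]),
)

# Random forest regression on the selected index (SVR would slot in the same way)
X = ndsi(reflectance, *best_pair).reshape(-1, 1)
rfr = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(rfr, X, trait, cv=5, scoring="r2")  # per-fold R2
```

On real data, the band-pair selection itself would typically sit inside the cross-validation loop to avoid selection bias.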
The study results demonstrate the potential of low-cost multi-sensory aerial robots and leading-edge computer vision methods to accurately estimate crop traits and, more importantly, to aid decision makers (farmers, food processors, suppliers, etc.) in securing food chains and facilitating the bioenergy transition in the context of global climate change.
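The 3D/2D convolutional fusion the abstract describes could look roughly like the following PyTorch sketch: a 3D-convolutional branch ingests the hyperspectral cube, a 2D-convolutional branch ingests a LiDAR-derived canopy raster, and a shared head regresses one trait. All layer sizes, input dimensions, and branch choices are illustrative assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

class FusionCNN(nn.Module):
    """Hypothetical hyperspectral + LiDAR fusion network for trait regression."""

    def __init__(self, n_bands: int = 50):
        super().__init__()
        # 3D convolutions over the hyperspectral cube: (B, 1, bands, H, W)
        self.hsi = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=(7, 3, 3), padding=(3, 1, 1)),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d((4, 4, 4)),
        )
        # 2D convolutions over a LiDAR canopy-height raster: (B, 1, H, W)
        self.lidar = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        # Fused regression head: one scalar trait, e.g. dry grain yield
        self.head = nn.Sequential(
            nn.Linear(8 * 4 * 4 * 4 + 8 * 4 * 4, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, cube: torch.Tensor, chm: torch.Tensor) -> torch.Tensor:
        h = self.hsi(cube).flatten(1)      # (B, 512)
        l = self.lidar(chm).flatten(1)     # (B, 128)
        return self.head(torch.cat([h, l], dim=1))

model = FusionCNN()
# Two plots: a 50-band 16x16 hyperspectral patch and a matching LiDAR raster
y = model(torch.randn(2, 1, 50, 16, 16), torch.randn(2, 1, 16, 16))
```

Data augmentation, as mentioned in the abstract, would be applied to these image patches during training (e.g. flips and rotations of the spatial axes).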
- Publication: AGU Fall Meeting Abstracts
- Pub Date: December 2021
- Bibcode: 2021AGUFM.H51H..09N