Enhancing Accessibility, Reliability, and Validation of Actionable Information from Unmanned Aerial Vehicle Image Data

Project Impacts

The utility of UAV data is limited on many fronts, and we addressed several of these limitations in this project. The first was to reduce the cost barrier by taking an image from an inexpensive sensor and generating an index value that simplifies the information being communicated. This was achieved with a deep learning model based on the Pix2Pix architecture. Our testing indicated the results were comparable in practice to those from more expensive alternatives, although we noted room for improvement. Second, we created methods to pair UAV data with background information that provides context. A subject matter expert diagnosing the cause of a production stress draws on external factors to understand why the stress is present. We therefore added the ability to match weather data to the UAV imagery, since weather explains many crop stress issues and also bears on image quality. Third, we invented, designed, constructed, and tested an autonomous mobile ground control point (GCP). The GCP provides positional reference data for georectification of image data, as well as calibration references for crop height, temperature, and reflectance in the field.
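As a rough illustration of the image-to-index translation described above, the sketch below shows a minimal Pix2Pix-style generator in PyTorch that maps a 3-channel RGB tile to a single-channel index map. The layer widths, depth, and the tanh output range are illustrative assumptions, not the project's actual network.

```python
# Minimal sketch of a Pix2Pix-style generator that translates an RGB image
# into a single-channel index map (e.g., an NDVI-like value per pixel).
# Layer widths and depth are illustrative assumptions only.
import torch
import torch.nn as nn

def down(in_ch, out_ch):
    # Encoder block: strided convolution halves spatial resolution.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=4, stride=2, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.LeakyReLU(0.2, inplace=True),
    )

def up(in_ch, out_ch):
    # Decoder block: transposed convolution doubles spatial resolution.
    return nn.Sequential(
        nn.ConvTranspose2d(in_ch, out_ch, kernel_size=4, stride=2, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class IndexGenerator(nn.Module):
    """U-Net-style generator: RGB in, index map in [-1, 1] out."""
    def __init__(self):
        super().__init__()
        self.d1 = down(3, 64)
        self.d2 = down(64, 128)
        self.d3 = down(128, 256)
        self.u1 = up(256, 128)
        self.u2 = up(256, 64)     # input is u1 output concatenated with d2 skip
        self.u3 = nn.Sequential(  # final upsample back to input resolution
            nn.ConvTranspose2d(128, 1, kernel_size=4, stride=2, padding=1),
            nn.Tanh(),            # index values scaled to [-1, 1]
        )

    def forward(self, x):
        e1 = self.d1(x)                        # 1/2 resolution
        e2 = self.d2(e1)                       # 1/4 resolution
        e3 = self.d3(e2)                       # 1/8 resolution
        y = self.u1(e3)                        # back to 1/4
        y = self.u2(torch.cat([y, e2], 1))     # skip connection, back to 1/2
        return self.u3(torch.cat([y, e1], 1))  # full resolution, 1 channel

# Example: one 256x256 RGB tile in, one 256x256 index map out.
model = IndexGenerator()
rgb = torch.rand(1, 3, 256, 256)
index_map = model(rgb)  # shape: (1, 1, 256, 256)
```

In a full Pix2Pix setup this generator would be trained adversarially against a patch discriminator, with index maps from a more expensive reference sensor serving as the paired targets.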
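The weather-matching step can likewise be illustrated with a simple time-based join. The sketch below pairs each UAV image capture time with the nearest prior weather observation using pandas; the file names, column names, and the 30-minute tolerance are assumptions for illustration, not the project's actual data layout.

```python
# Sketch: attach the nearest-in-time weather observation to each UAV image.
# File names, column names, and the matching tolerance are assumptions.
import pandas as pd

# Image capture log: one row per image, with its capture timestamp.
images = pd.read_csv("image_log.csv", parse_dates=["captured_at"])

# Weather station records: temperature, wind, solar radiation, etc.
weather = pd.read_csv("weather_station.csv", parse_dates=["observed_at"])

# merge_asof requires both frames to be sorted on the join keys.
images = images.sort_values("captured_at")
weather = weather.sort_values("observed_at")

# For each image, take the most recent weather record within 30 minutes.
matched = pd.merge_asof(
    images,
    weather,
    left_on="captured_at",
    right_on="observed_at",
    direction="backward",
    tolerance=pd.Timedelta(minutes=30),
)

matched.to_csv("images_with_weather.csv", index=False)
```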

The GCP navigates farm fields in collaboration with a UAV, providing multiple reference instances in the field imagery during UAV operation. Field tests achieved accuracies of 10 cm for georeferencing, 4 cm for height, 1% for reflectance, and 2°C for temperature. Together, these efforts improved several aspects of data quality and usability.
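One way to quantify the kind of georeferencing accuracy reported above is to compare the GCP's surveyed positions with the corresponding positions recovered from the rectified imagery and report the root-mean-square error. The sketch below assumes both coordinate sets are in the same projected coordinate system (meters); the numeric values are placeholders, not field data.

```python
# Sketch: horizontal georeferencing error (RMSE) between GCP reference
# positions and the corresponding positions measured in the orthomosaic.
# Coordinates are assumed to share a common projected CRS, in meters.
import numpy as np

# Surveyed GCP positions (easting, northing) and image-derived positions.
gcp_ref = np.array([[400012.31, 4413250.08],
                    [400055.72, 4413301.64],
                    [400098.10, 4413355.27]])
image_est = np.array([[400012.40, 4413250.01],
                      [400055.66, 4413301.73],
                      [400098.02, 4413355.19]])

# Per-point horizontal error, then root-mean-square error across points.
errors = np.linalg.norm(image_est - gcp_ref, axis=1)
rmse = np.sqrt(np.mean(errors ** 2))
print(f"horizontal RMSE: {rmse:.3f} m")
```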