Georectifying drone image data over water surfaces without fixed ground control: Methodology, uncertainty assessment and application over an estuarine environment

Watts, J, Holding, T, Anderson, K, Bell, TG, Chapron, B, Donlon, C, Collard, F, Wood, N, Walker, D, DeBell, L, Duffy, JP and Shutler, J 2024 Georectifying drone image data over water surfaces without fixed ground control: Methodology, uncertainty assessment and application over an estuarine environment. Estuarine, Coastal and Shelf Science, 305. 108853. https://doi.org/10.1016/j.ecss.2024.108853

Text: 1-s2.0-S0272771424002415-main (1).pdf - Published Version (7MB)
Available under License Creative Commons Attribution.
Official URL: http://dx.doi.org/10.1016/j.ecss.2024.108853

Abstract/Summary

Light-weight consumer-grade drones have the potential to provide geospatial image data to study a broad range of oceanic processes. However, rigorously tested methodologies to effectively and accurately geolocate and rectify these image data over mobile and dynamic water surfaces, where temporally fixed points of reference are unlikely to exist, are limited. We present a simple-to-use automated workflow for georectifying individual aerial images using position and orientation data from the drone's on-board sensor (i.e. direct-georectification). The presented methodology includes correcting for camera lens distortion and viewing angle and exploits standard mathematics and camera data processing techniques. The method is used to georectify image datasets from test flights with different combinations of altitude and camera angle. Using a test site over land, directly-georectified images, as well as the same images georectified using standard photogrammetry software, are evaluated using a network of known ground control points. The novel methodology performs well with the camera at nadir (both 10 m and 25 m above ground level) and exhibits a mean spatial accuracy of ±1 m. The same accuracy is achieved when the camera angle is 30° at 10 m above ground level but decreases to ±2.9 m at 30° and 25 m. The accuracy changes because the uncertainties are a function of the altitude and angle of the camera relative to the ground. Drone in-flight positioning errors can reduce the accuracy further to ±5 m with the camera at 30° and 25 m. An ensemble approach is used to map the uncertainties within the camera field-of-view to show how they change with viewing distance and drone position and orientation. The complete approach is demonstrated over an estuarine environment that includes the shoreline and open water, producing results consistent with the land-based field tests of accuracy. Overall, the workflow presented here provides a low-cost and agile solution for direct-georectification of drone-captured image data over water surfaces. This approach could be used for collecting and processing image data from drones or ship-mounted cameras to provide observations of ocean colour, sea-ice, ocean glitter, sea surface roughness, white-cap coverage, coastal water quality, and river plumes. The Python scripts for the complete image georectification workflow, including uncertainty map generation, are available from https://github.com/JamieLab/SArONG.
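The core of direct-georectification as described above is projecting each image pixel onto the water surface using only the camera's intrinsic parameters and the drone's reported position and orientation. The Python sketch below illustrates that projection for a pinhole camera viewing a flat surface, plus a small ensemble perturbation of the pose to show how positioning and orientation errors spread into ground-coordinate uncertainty. It is a minimal, assumption-laden sketch (local flat-Earth coordinates, an arbitrary intrinsic matrix, lens distortion assumed already corrected, simple angle conventions); it is not the SArONG implementation, whose scripts are available from the repository linked in the abstract.

# Minimal sketch of direct-georectification for a single pixel, assuming a
# pinhole camera over a flat water surface at z = 0. Illustrative only: the
# intrinsic matrix, angle conventions and noise levels below are assumptions,
# not values from the paper or the SArONG scripts.
import numpy as np


def camera_to_world_rotation(heading_deg, tilt_deg):
    """Camera-to-world rotation for a downward-looking camera.

    heading_deg: rotation of the view about the vertical axis
                 (0 = image top pointing north; assumed convention).
    tilt_deg:    forward tilt of the optical axis away from nadir
                 (0 = straight down, 30 = 30 degrees off nadir).
    """
    h, t = np.radians(heading_deg), np.radians(tilt_deg)
    Rz = np.array([[np.cos(h), -np.sin(h), 0.0],   # heading about world z (up)
                   [np.sin(h),  np.cos(h), 0.0],
                   [0.0,        0.0,       1.0]])
    R_nadir = np.diag([1.0, -1.0, -1.0])           # camera z (optical axis) -> world -z
    Rx = np.array([[1.0, 0.0,        0.0      ],   # tilt about the camera x-axis
                   [0.0, np.cos(t), -np.sin(t)],
                   [0.0, np.sin(t),  np.cos(t)]])
    return Rz @ R_nadir @ Rx


def pixel_to_ground(u, v, K, R, camera_position):
    """Project pixel (u, v) onto the plane z = 0 (local flat-surface assumption).

    K: 3x3 intrinsic matrix (pixel coordinates assumed already corrected for
       lens distortion, e.g. via a standard camera-calibration step).
    R: 3x3 camera-to-world rotation; camera_position: (x, y, altitude) in metres.
    """
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])   # back-project pixel to a viewing ray
    ray_world = R @ ray_cam
    t = -camera_position[2] / ray_world[2]               # distance along the ray to z = 0
    return camera_position + t * ray_world


if __name__ == "__main__":
    # Hypothetical 4000 x 3000 pixel camera, 25 m altitude, 30 degrees off nadir.
    K = np.array([[2500.0, 0.0, 2000.0],
                  [0.0, 2500.0, 1500.0],
                  [0.0, 0.0, 1.0]])
    pose = dict(heading_deg=0.0, tilt_deg=30.0, position=np.array([0.0, 0.0, 25.0]))

    R = camera_to_world_rotation(pose["heading_deg"], pose["tilt_deg"])
    centre = pixel_to_ground(2000.0, 1500.0, K, R, pose["position"])
    print("Image-centre ground position (m):", centre[:2])

    # Toy ensemble: perturb position and orientation with assumed error levels to
    # map how pose uncertainty spreads the georectified position of one pixel.
    rng = np.random.default_rng(0)
    samples = []
    for _ in range(500):
        R_i = camera_to_world_rotation(pose["heading_deg"] + rng.normal(0, 2.0),
                                       pose["tilt_deg"] + rng.normal(0, 1.0))
        pos_i = pose["position"] + rng.normal(0, [1.0, 1.0, 0.5])
        samples.append(pixel_to_ground(2000.0, 1500.0, K, R_i, pos_i)[:2])
    spread = np.std(np.array(samples), axis=0)
    print("Ensemble standard deviation (east, north) in m:", spread)

Running the sketch prints the ground position of the image centre and a metre-scale ensemble spread, qualitatively in line with the accuracy behaviour described in the abstract; the exact numbers depend entirely on the assumed camera and error parameters.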

Item Type: Publication - Article
Divisions: Plymouth Marine Laboratory > Science Areas > Marine Biochemistry and Observations
Depositing User: S Hawkins
Date made live: 22 Jul 2024 16:13
Last Modified: 22 Jul 2024 16:13
URI: https://plymsea.ac.uk/id/eprint/10258
