Using Tensorflow for Infrared UAV-based Wildlife Detection
2019-06-25, 11:45–12:00, Room 1
We developed a prototype that detects and classifies animal signatures in real time using deep learning frameworks combined with UAV-based infrared remote sensing data. The approach is robust and performant, and shows great potential for ecology and forestry management applications such as population assessment, fawn recovery during mowing operations and wildlife damage prevention.
Classical approaches to monitoring wildlife populations, such as tracking, tracing and camera trapping, involve a large amount of manual labour and suffer from several disadvantages, notably a high risk of methodological bias and poor scalability. In this project we demonstrate a prototype for wildlife detection that combines deep learning object detection algorithms with UAV-based infrared remote sensing in order to alleviate these constraints.
During 27 flight campaigns in winter/spring 2018, seven local wildlife game reserves in northwestern Switzerland and the southern Black Forest were surveyed using multicopter and fixed-wing UAVs. Multiple sensors were compared: high-resolution RGB cameras, multispectral near-infrared (NIR) sensors and thermographic imaging sensors (black-body radiation). Across the analytical remote sensing approaches we evaluated, thermographic data acquired from altitudes below 100 m above ground level proved highly suitable for object detection algorithms.
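To illustrate why low-altitude thermographic data lends itself to detection, the following sketch extracts warm-body candidate regions from a single-band thermal raster by thresholding against the estimated background temperature. This is an illustrative preprocessing idea under assumed parameters (`delta`, `min_pixels`), not the project's actual pipeline:

```python
# Illustrative sketch (not the authors' pipeline): find regions that are
# distinctly warmer than the background in a thermal raster and return
# their bounding boxes as detection candidates.
import numpy as np
from scipy import ndimage

def hotspot_candidates(thermal, delta=4.0, min_pixels=6):
    """Return (row0, col0, row1, col1) boxes of warm connected regions.

    thermal    : 2-D array of temperature values (e.g. degrees Celsius)
    delta      : required excess over the median background temperature
    min_pixels : discard specks smaller than this many pixels
    """
    background = np.median(thermal)           # robust background estimate
    mask = thermal > background + delta       # pixels clearly warmer
    labels, _ = ndimage.label(mask)           # connected components
    boxes = []
    for sl in ndimage.find_objects(labels):
        if np.count_nonzero(labels[sl]) >= min_pixels:
            boxes.append((sl[0].start, sl[1].start, sl[0].stop, sl[1].stop))
    return boxes

# Synthetic 100x100 scene at ~5 C with one 8x8 warm "animal" at ~15 C
scene = np.full((100, 100), 5.0)
scene[40:48, 60:68] = 15.0
print(hotspot_candidates(scene))  # → [(40, 60, 48, 68)]
```

Candidate boxes like these could then be cropped and passed to a classifier, which keeps the expensive network evaluation focused on a handful of warm regions per frame.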
Experimental development proceeded by testing multiple state-of-the-art machine learning techniques. A deep learning environment was implemented using Python, Tensorflow and an “Inception”-class architecture. The neural network was then trained on 8000 manually labelled animal signatures to learn their characteristic features. Some species yielded extremely robust signature detection results (deer, caprines, bison and cattle), while for others the sample size or image resolution was insufficient (boars, humans and smaller mammals).
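A setup along these lines might look as follows: an Inception-class backbone with a small classification head for the well-represented species. This is a minimal sketch, not the project's actual training code; the class count, hyperparameters and use of random (rather than pretrained) weights are assumptions for illustration:

```python
# Illustrative sketch: transfer-learning head on an Inception-class
# backbone with TensorFlow/Keras. Class names follow the species listed
# in the abstract; all hyperparameters here are assumptions.
import tensorflow as tf

NUM_CLASSES = 4  # e.g. deer, caprines, bison, cattle

base = tf.keras.applications.InceptionV3(
    include_top=False,          # drop the 1000-class ImageNet head
    weights=None,               # in practice one would start from "imagenet"
    input_shape=(299, 299, 3),
    pooling="avg",              # global average pooling -> 2048-d features
)
base.trainable = False          # freeze the feature extractor

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
print(model.output_shape)  # → (None, 4)
```

Freezing the backbone and training only the head is a common way to make a few thousand labelled signatures go a long way before selectively unfreezing deeper layers.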
The final prototype applies an inference model to a live video stream and performs in real time under field conditions. Overall detection rates range from 88.6% in the species-wise classification scenario to 94.8% in the unspecific detection scenario. These results indicate that the approach holds substantial potential for the future of wildlife monitoring.