The Institute for Environmental Solutions (IES) introduces a new wild animal counting approach using innovative technological solutions – drones, motion-activated camera traps and passive acoustic sensor networks, as well as deer tracking with GPS transmitters. IES tests the technologies in cooperation with SIA “Forest Owners’ Consulting Centre” and the Latvian State Forest Research Institute "Silava".
Red deer, roe deer, elk and wild boar are the four dominant even-toed ungulate species in Latvia. These animals play an important role in our ecosystem, but they can also cause damage to the agricultural and forestry sectors. Furthermore, around 750,000 vehicle collisions involving ungulates are reported annually in Europe, indicating a road safety problem.
IES researchers, in cooperation with partners, are developing an innovative ICT-based (data-driven) wild animal (ungulate) counting methodology to support decision-making on sustainable wildlife management and conflict resolution among landowners, hunters and society.
IES researcher Alekss Vecvanags presented the latest conclusions and described the next steps of the research.
IES: In the previous research season (April – June 2020), you continued trials with passive acoustic sensor networks (microphones) using human-made noises. This approach allowed us to test the acoustic sensors' ability to detect noises and locate their sources. How has data collection using microphone networks progressed?
Alekss Vecvanags (A.V.): Yes, in the beginning we used easily identifiable human-made noises for microphone network testing. Ungulates are quiet animals and therefore harder to track acoustically. Red deer and elk are vocal only during the mating season in autumn (August until the beginning of October), when they make sounds to entice females and scare off competitors. The mating season is therefore the most suitable time to use microphone networks in the research.
Currently, we are running two types of microphone network tests:
1. Grid layout. We place microphones across the research area in a grid layout, which lets us cover a large territory with the potential to hear animal mating calls. If a call is heard by some microphones but not others, we can estimate the approximate location of the animal based on the sound intensity and the positions of the microphones.
2. Triangulation tests. In the research area, we place 3 microphones (approximately 300 m apart) to form a triangle. All 3 microphones are synchronized with a GPS watch (with a potential error of 1 millisecond). The microphones capture animal sounds, and from the time differences at which a sound is detected by each microphone, we can calculate the location of the animal.
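The triangulation approach described above is known as time-difference-of-arrival (TDOA) localisation. The following is a minimal sketch of the idea, not the project's actual implementation: the microphone coordinates, speed of sound and grid-search resolution are illustrative assumptions; the article only states that the microphones are roughly 300 m apart and synchronized to within 1 millisecond.

```python
import itertools
import math

# Assumed microphone coordinates in metres (roughly 300 m apart, forming
# a triangle) and a nominal speed of sound in air -- illustrative values.
MICS = [(0.0, 0.0), (300.0, 0.0), (150.0, 260.0)]
SPEED_OF_SOUND = 343.0  # m/s at ~20 degrees C

def arrival_times(source):
    """Time (s) for a sound at `source` to reach each microphone."""
    return [math.dist(source, m) / SPEED_OF_SOUND for m in MICS]

def locate(tdoas, step=5.0, extent=600.0):
    """Grid-search the point whose pairwise arrival-time differences
    best match the measured ones (tdoas[i] = t_i - t_0)."""
    best, best_err = None, float("inf")
    xs = [i * step - extent / 2 for i in range(int(extent / step) + 1)]
    for x, y in itertools.product(xs, xs):
        t = arrival_times((x, y))
        err = sum((t[i] - t[0] - tdoas[i]) ** 2 for i in range(1, len(MICS)))
        if err < best_err:
            best, best_err = (x, y), err
    return best

# Simulate a call at a known point, then recover its position from the
# time differences alone, as the microphone network would have to do.
true_source = (120.0, 80.0)
t = arrival_times(true_source)
tdoas = [ti - t[0] for ti in t]
print(locate(tdoas))  # a point close to (120, 80), within the grid step
```

A real deployment would replace the brute-force grid search with a hyperbolic least-squares solver and would first have to estimate the arrival times from the recorded waveforms (e.g. by cross-correlation), which is where the 1 ms synchronization error matters.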
Spectrogram of red deer mating call in passive acoustic sensor data. Data: Institute for Environmental Solutions.
IES: Please describe the progress in using camera trap technologies for this research.
A.V.: Automation of camera trap data processing is still in progress, but we have stepped closer to the intended outcome. We have automated the data-gathering process. Previously, our researchers had to do it manually – once a month they went to the forest, found the cameras, collected the data and brought it to the office, and only then could they start to analyze it. Now we have developed a system equipped with remote technologies that delivers the data directly from the camera traps to the researchers’ computers as soon as an image or video is captured.
Additionally, we are developing an automated data-sorting method. We have collected a large amount of data that has helped us to create a computer vision algorithm – a deep neural network. Now we are developing an automated algorithm capable of sorting the data collected by camera traps, as well as identifying the animals captured in images and videos.
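As a sketch of what such automated sorting could look like downstream of the neural network, the snippet below routes an image into a per-species folder based on the class probabilities a classifier (not shown here) would output. The species list, the confidence threshold and the "review" folder for uncertain detections are illustrative assumptions, not details of IES's actual pipeline.

```python
import shutil
import tempfile
from pathlib import Path

# Hypothetical label set and confidence threshold; the real network's
# classes and thresholds are not described in the article.
SPECIES = ["red_deer", "roe_deer", "elk", "wild_boar", "other"]
REVIEW_THRESHOLD = 0.8

def sort_detection(image_path, probabilities, out_dir):
    """Move an image into a per-species folder based on the classifier's
    probabilities; low-confidence images go to a manual-review folder."""
    best = max(range(len(SPECIES)), key=lambda i: probabilities[i])
    label = SPECIES[best] if probabilities[best] >= REVIEW_THRESHOLD else "review"
    target = Path(out_dir) / label
    target.mkdir(parents=True, exist_ok=True)
    shutil.move(str(image_path), str(target / Path(image_path).name))
    return label

# Tiny demonstration with a throwaway directory and a fake image file.
work = Path(tempfile.mkdtemp())
img = work / "cam01_0001.jpg"
img.write_bytes(b"fake image bytes")
print(sort_detection(img, [0.93, 0.03, 0.02, 0.01, 0.01], work))  # red_deer
```

Routing uncertain detections to a review folder rather than forcing a label is a common design choice in camera-trap pipelines: it keeps the automatically sorted folders clean while giving researchers a bounded set of images to label by hand, which in turn can be fed back as training data.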
Video captured with a camera trap, showing red deer in a swamp area, Rāmuļu territory. Video: Institute for Environmental Solutions.
IES: How has the training of the algorithm been done so far, and how much data do you need for this process?
A.V.: Algorithm training is a time-consuming process. We needed to input a large number of image samples of the animals of interest – red deer, roe deer, elk and wild boar. Initially, we trained the algorithm on online databases of animal images; the more data used in the learning process, the better the precision of the algorithm, so it is beneficial to use existing databases with millions of animal images. After that, we input image examples of the animals of interest captured in Latvia. This was an important step, because a species can look visually different depending on the territory it lives in. Therefore, we used data gathered by the Latvian State Forest Research Institute "Silava", as well as images captured by camera traps during this research.
IES: How did you choose the locations for placing the camera traps?
A.V.: At the beginning of the research, we placed cameras next to animal feeders that the animals visit regularly. We did this for testing purposes: it was crucial to understand how the camera traps work and how frequently motion and other parameters are detected. Additionally, we needed to collect a large amount of data to use for training the computer vision algorithm.
Now we have relocated the camera traps to animal pathways in different habitats. This will give us information about animal preferences for different habitats, allowing us to start working on animal habitat modelling.
IES: What conclusions have you already made from the trials of camera traps?
A.V.: From the trials of the first 5 cameras we placed, we observed that switching on sometimes frightens the animals. We have two theories about why this happens. The first is that when an animal's movement triggers the camera trap, the camera emits near-infrared light; the human eye cannot see it, but animals might react to it. The second is that the camera makes a sound when it switches on that might frighten the animal. On the one hand, these factors may cause animals to move to a different territory because they are scared to pass the camera traps. On the other hand, we have observed that if nothing bad happens after the scary moment, an animal gets used to it and no longer feels disturbed.
During the testing process, we powered the camera traps with external batteries. Now we have decided to try solar panels, which will allow us to change the batteries and accumulators much less frequently. This solution will also reduce human presence in the researched wildlife areas, thereby reducing the impact on the research results.
IES: It was planned to expand the camera trap network in autumn 2020. Has the expansion been done already, and why was it necessary?
A.V.: At the moment, we are expanding our network by purchasing 24 more camera traps. We plan to place them in areas of various forest types with trees of different ages. This approach will provide us with more complete data sets on animal behaviour, allowing us to research animal migration in different habitats. By understanding which habitats animals prefer, we will be able to model animal habitats.
IES: How frequently do camera traps capture images and videos of animals?
A.V.: In the camera trap tests that we carried out next to the feeders, 20 to 50 videos with ungulates were gathered daily. The amount of data gathered from camera traps placed on animal pathways is relatively smaller, but it reflects true insight into animal density in the area.
The camera traps have captured different animals – the ungulates we are interested in, as well as birds, foxes, rabbits and other animals. The most interesting case during this research has been a wild bear captured by the camera traps.
IES: How frequently do the camera traps capture images of the 4 red deer equipped with GPS transmitters?
A.V.: The red deer hind Elmīra has been seen most frequently, together with her fawn. The other three red deer, which were equipped with GPS transmitter collars in spring, have not been captured in the research area for a while. In the GPS data, we can see that they migrated to other areas during the summer. We hope that, along with the cooler weather, the three migrated deer will return.
IES: In this research drones are also used for animal tracking and counting. When do you plan to restart drone flights over the research area?
A.V.: Summer is a time when the role of drones in animal counting and tracking decreases, because the animals are hiding under the tree crowns. During this period, animals can be seen in drone data only when they cross open areas. We are planning to resume drone flights at the end of autumn, when the trees will be leafless and warm animals will stand out more against the cold background in thermal data. This technology will help us to obtain additional information on animal locations in the area, on both spatial and time scales, thus allowing us to compare the drone data with the data gathered by camera traps.
IES: Please share the future plans for this research.
A.V.: The main plan is to deploy the entire network of 30 camera traps, so that we can representatively research all forest types, and to develop an automated data flow. We will continue collecting data. When we have enough data, we will be able to model animal population density, as well as the impact of the environment and other factors.
The research is part of the project “ICT-based wild animal census approach for sustainable wildlife management” (No. 1.1.1.1/18/A/146), supported by the European Regional Development Fund specific objective 1.1.1 "Improve research and innovation capacity and the ability of Latvian research institutions to attract external funding, by investing in human capital and infrastructure", measure 1.1.1.1 “Support for applied research”.
Find more about this project here.