
Deep learning to detect animal behaviour

  • Posted by Natasha Watson
  • On June 9, 2021

By Ellen Ditria

Studying and quantifying animal behaviour is important for understanding how animals interact with their environments. However, manually extracting and analysing behavioural data from large volumes of camera footage is often time-consuming. Our new research shows how artificial intelligence can be a valuable tool for analysing underwater footage more effectively.

Deep learning techniques have emerged as useful tools for automating the analysis of certain behaviours under controlled or laboratory conditions. The complexities of raw field footage, however, have left this technology largely unexplored as a data-analysis alternative for animals in situ.

Counting the frequency of certain animal behaviours can serve as an important metric for understanding more complex behaviour. For example, kangaroos stand on their hind legs with head and ears raised, a behaviour biologists frequently use to score vigilance towards a perceived threat: the more often the animal exhibits the behaviour, the higher the vigilance score.

Using machine learning to automate these observational methods could reduce observer bias. It could also increase the sample size of studies when coupled with remote sensing methods such as camera traps or remote video footage. It may also limit confounding factors, such as the observer's presence, that can affect an animal's behaviour.

We found that by combining dense optical flow and deep learning algorithms, we can automatically identify the frequency of grazing behaviour exhibited by Luderick (Girella tricuspidata) in footage collected by underwater cameras.

Dense optical flow estimates the motion of pixels between two frames. The red-green-blue (RGB) coloured output denotes the area and direction of pixel movement, while black areas indicate no movement. This output is classified by the deep learning algorithm as "grazing" or "no grazing" behaviour. We also applied a spatio-temporal algorithm, which looks at frames together rather than in isolation, to detect patterns of pixel movement across frames.
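The study's exact encoding isn't detailed here, but the standard way to visualise a dense flow field matches the description above: hue encodes direction of movement, brightness encodes magnitude, and stationary pixels come out black. As a rough sketch in plain NumPy (the `flow_to_rgb` function and the toy flow field are illustrative, not the study's code):

```python
import numpy as np

def flow_to_rgb(flow):
    """Encode a dense optical-flow field (H, W, 2) as an RGB image:
    hue = direction of pixel movement, brightness = magnitude,
    so pixels with no movement render as black."""
    dx, dy = flow[..., 0], flow[..., 1]
    mag = np.sqrt(dx ** 2 + dy ** 2)
    ang = (np.arctan2(dy, dx) + np.pi) / (2 * np.pi)  # direction mapped to 0..1
    val = mag / mag.max() if mag.max() > 0 else mag    # brightness from magnitude
    # Minimal HSV -> RGB conversion at full saturation.
    h6 = ang * 6.0
    i = np.floor(h6).astype(int) % 6
    f = h6 - np.floor(h6)
    p = np.zeros_like(val)
    q = val * (1 - f)
    t = val * f
    rgb = np.zeros(flow.shape[:2] + (3,))
    for k, (r, g, b) in enumerate([(val, t, p), (q, val, p), (p, val, t),
                                   (p, q, val), (t, p, val), (val, p, q)]):
        mask = i == k
        rgb[mask] = np.stack([r, g, b], axis=-1)[mask]
    return rgb

# Toy 4x4 flow field: only the top-left pixel moves.
flow = np.zeros((4, 4, 2))
flow[0, 0] = (1.0, 0.0)
img = flow_to_rgb(flow)  # coloured at (0, 0), black everywhere else
```

An image like `img` (rather than the raw video frame) is what a classifier would then label as "grazing" or "no grazing".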

Using this method across 18 videos, we detected 34 out of 37 grazing events (a detection rate of 92%).
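The paper's event-counting procedure isn't given here, but one simple way to turn per-frame "grazing"/"no grazing" labels into a count of discrete grazing events is to group consecutive grazing frames into runs. The `count_events` function and its `min_len` smoothing threshold below are hypothetical illustrations, not the authors' method:

```python
def count_events(frame_labels, min_len=2):
    """Count discrete grazing events from a sequence of per-frame labels.
    Consecutive 'grazing' frames form one event; runs shorter than
    min_len (a hypothetical smoothing threshold) are treated as noise."""
    events, run = 0, 0
    for label in frame_labels:
        if label == "grazing":
            run += 1
        else:
            if run >= min_len:
                events += 1
            run = 0
    if run >= min_len:   # close out a run that ends the sequence
        events += 1
    return events

labels = ["no_grazing", "grazing", "grazing", "no_grazing",
          "grazing", "no_grazing", "grazing", "grazing", "grazing"]
n = count_events(labels)  # 2: the single-frame run is discarded as noise
```

Counting events this way (rather than counting individual frames) is what makes the per-video totals comparable to a human observer's tally, and 34 of 37 events rounds to the 92% rate reported above.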

Deep learning shows promise as a viable tool for determining animal behaviour from underwater videos. With further development, it offers an alternative to current time-consuming manual methods of data extraction.




©2018 Griffith University, CRICOS Provider - 00233E. Images: Tom Rayner, Anusha Rajkaran and via Creative Commons.