
Automated fish identification and abundance using artificial intelligence

  • Posted by Ellen Ditria
  • On June 26, 2020

By Ellen Ditria

Feeding video footage fresh from your underwater camera into a computer that automatically identifies species and determines abundance may seem like wishful thinking for ecologists, but successes in recent research show we're close to having an accessible tool for researchers.

The use of camera technology in aquatic sciences has increased rapidly over the last few decades with the advent of digital imagery and miniaturisation. From current uses of baited underwater cameras to marine drones and camera traps, more digital data are being collected as equipment becomes cheaper and more accessible.

While we can now capture and share more data over larger temporal and spatial scales, this has left us with a severe bottleneck in processing and analysing such large volumes of data. Deep learning, a branch of machine learning, has emerged as a highly versatile and promising technology for solving this bottleneck.

In our recent study, we found that deep learning algorithms were able to detect and count a target species of fish (luderick) from raw video footage in seagrass habitats faster, and with higher accuracy, than humans.
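As an illustration only, and not the model used in the study, per-frame detection and counting with an off-the-shelf object detector might look something like the sketch below; the detector interface and the count_per_frame helper are assumptions made for the example.

```python
# Hypothetical sketch of per-frame fish counting; the detector's output format
# is an assumption, not the study's actual pipeline.
import cv2  # OpenCV, used here only to read video frames

def count_per_frame(video_path, detector, target_class="luderick", min_conf=0.5):
    """Yield (frame_index, count) for each frame in the video."""
    cap = cv2.VideoCapture(video_path)
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:  # end of video
            break
        # Assumed detector output: a list of (label, confidence, bounding_box) tuples
        detections = detector(frame)
        count = sum(1 for label, conf, _box in detections
                    if label == target_class and conf >= min_conf)
        yield frame_idx, count
        frame_idx += 1
    cap.release()
```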

 

https://globalwetlandsproject.org/wp-content/uploads/2020/07/video_automated-analysis-web.mp4

When compared with a group of citizen scientists and a group of fish experts, the deep learning models were not only more accurate, but their results were also far more consistent between individual models than among individual humans, and were produced in a fraction of the time.

The models and humans were presented with a set of 50 images from seagrass environments, and were asked to identify and count only the luderick in each image. The models achieved a performance of 95%, while experts achieved 88%, and citizen scientists on average lagged at 82%.

We also tested the performance of each analysis method at estimating MaxN (the maximum number of target fish visible in any single frame) for 32 videos. Surprisingly, the models again achieved the highest performance score (87%), ahead of the experts (85%) and citizen scientists (79%), even though humans were expected to perform better: they have the advantage of drawing on context and watching the fish back and forth over the course of the video, which the computer cannot.
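In terms of the per-frame counts from the earlier sketch, MaxN reduces to the largest count observed across a video, for example:

```python
def max_n(per_frame_counts):
    """MaxN: the maximum number of target fish visible in any single frame."""
    return max((count for _frame, count in per_frame_counts), default=0)

# Hypothetical usage with the earlier sketch:
# max_n(count_per_frame("seagrass_clip.mp4", detector))
```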

Arguably more important than high accuracy is consistency: analysis methods that give the same answer every time are essential for examining changes over time. The ability and confidence of individuals in analysing video footage vary widely, and long-term science and monitoring often involve different people year after year while trying to collect data that can be used to detect trends and changes.

Our research has created a tool that lets scientists simply upload their raw video footage to the FishID platform (developed at Griffith University), which processes it and converts the image-based data into a usable CSV output for further analysis.
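The columns below are only a guess at what such a CSV might contain (frame number and count per frame); the real FishID export format may differ.

```python
import csv

def write_counts_csv(per_frame_counts, out_path="luderick_counts.csv"):
    """Write per-frame counts to a CSV for downstream analysis (e.g. in R or pandas)."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["frame", "count"])  # assumed column names
        writer.writerows(per_frame_counts)
```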

Reducing the time needed to trawl through video footage means that more footage gets analysed, which can increase sample sizes and potentially enable scientists to analyse more data over space and time.
