Automating the analysis of fish grazing behaviour from videos using image classification and optical flow
Author(s)
Ditria, EM
Jinks, EL
Connolly, RM
Griffith University
Year published
2021
Abstract
Studying and quantifying behaviour is important to understand how animals interact with their environments. However, manually extracting and analysing behavioural data from the large volume of camera footage collected is often time consuming. Deep learning techniques have emerged as useful tools in automating the analysis of certain behaviours under controlled or laboratory conditions, but the complexities of using raw footage from the field have resulted in this technology remaining largely unexplored as a possible data analysis alternative for animals in situ. Here, we use deep learning techniques to automate the analysis of fish grazing behaviour from real-world field imagery. We collected video footage in seagrass meadows in Queensland, Australia, and trained models on a training data set of over 3000 annotations. We used a combination of dense optical flow to assess pixel movement in underwater footage, spatiotemporal filtering to increase accuracy, and deep learning algorithms to classify grazing behaviour of luderick, Girella tricuspidata. When tested on novel videos the model had not seen in training, the model correctly identified nearly all individual grazing events. Deep learning shows promise as a viable tool for determining animal behaviour from underwater videos, and with further development offers an alternative to current time-consuming manual methods of data extraction.
Journal Title
Animal Behaviour
Volume
177
Subject
Biological sciences
Agricultural, veterinary and food sciences
Psychology