»Deep Learning for automating fish age from otolith images«
2020-09-21, 14:15–14:35, Room 2
In this work, we investigate the ability of modern computer vision methods to automatically extract fish age from otolith images. The dataset, drawn from the database of the Hellenic Centre for Marine Research (HCMR), comprises a large collection of 5027 otolith images and length measurements for the red mullet (Mullus barbatus).
Accurate and efficient estimation of fish age is crucial for assessing the status of a fish stock, modelling its trend and developing management plans to ensure its sustainability. Prevailing methods of fish ageing are based on expert readings of otolith images and are often time-consuming and vulnerable to errors; automating this task is therefore of great interest. Convolutional Neural Networks are emerging as a powerful tool for automating processes across data domains and can be applied equally well to fish imagery. In this study, we investigated the feasibility of using neural networks to provide the automatic reading of fish otoliths, necessary for ageing, from digital images.

The dataset included 5027 otolith images of the red mullet (Mullus barbatus) from Greek waters, covering age groups 0-5+. To achieve our goal, a pre-trained convolutional neural network designed for image classification was adopted and fine-tuned, treating fish age estimation as a multi-class classification task: given an otolith image as input, the classifier must select one of six classes (Age-0, Age-1, Age-2, Age-3, Age-4, Age-5+) representing the age of the fish. Additionally, we explored the potential benefit of multitask learning for improving the model's predictive performance by further including the variable of fish length in the network architecture.

Results showed that the ages of red mullet otoliths were estimated correctly in 64.4% of cases, with higher accuracy for the younger age-0 and age-1 classes. Multitask learning increased correct age prediction by 7.4 percentage points, reaching 69.2%, and proved better at identifying older ages, with gains ranging from +3.6% to +20.1%. Limitations of this work and suggestions for further improvement are discussed. Ultimately, the present study attempts to provide a viable and faster alternative to the manual analysis methods currently used to extract physical information from images of fish otoliths.
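The multitask setup described above can be illustrated by its training objective: a cross-entropy term over the six age classes combined with a weighted squared-error term on the predicted fish length. The sketch below is a minimal illustration of that idea in plain Python; the function names and the weighting factor `lam` are assumptions for exposition, not the authors' implementation.

```python
import math

# The six age classes used in the study.
AGE_CLASSES = ["Age-0", "Age-1", "Age-2", "Age-3", "Age-4", "Age-5+"]

def softmax(logits):
    """Numerically stable softmax over the class logits."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def multitask_loss(age_logits, true_age_idx, pred_length, true_length, lam=0.5):
    """Combined loss for a two-headed multitask network:
    cross-entropy for age classification plus a squared-error
    term for length regression, weighted by an assumed factor lam."""
    probs = softmax(age_logits)
    cross_entropy = -math.log(probs[true_age_idx])
    length_error = (pred_length - true_length) ** 2
    return cross_entropy + lam * length_error

# Example: uninformative logits give a cross-entropy of log(6);
# a perfect length prediction contributes nothing to the loss.
loss = multitask_loss([0.0] * 6, true_age_idx=1,
                      pred_length=12.0, true_length=12.0)
```

In a deep-learning framework the two heads would share the convolutional backbone, so gradients from the length term also shape the image features used for age classification, which is one plausible explanation for the accuracy gain reported above.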