Nordic Machine Intelligence, vol. 02, pp. 1–3, 2022
https://doi.org/10.5617/nmi.9657
FishAI: Sustainable Commercial Fishing

Tor-Arne Schmidt Nordmo¹, Ove Kvalsvik², Svein Ove Kvalsund², Birte Hansen³, Dag Johansen¹, Håvard Dagenborg Johansen¹, Michael A. Riegler¹,⁴

1. UiT The Arctic University of Norway, Norway
2. Vekstlandet, Norway
3. NORA—Norwegian Artificial Intelligence Research Consortium, Norway
4. SimulaMet, Norway
Abstract
FishAI: Sustainable Commercial Fishing is the second challenge at the Nordic AI Meet, following the successful MedAI, which focused on medical image segmentation and transparency in machine learning (ML)-based systems. FishAI focuses on a new domain, namely commercial fishing, and how to make it more sustainable with the help of machine learning. A range of publicly available datasets is used to tackle three specific tasks. The first is to predict fishing coordinates to optimize the catch of specific fish, the second is to create a report that can be used by experienced fishermen, and the third is to make a sustainable fishing plan that provides a route for a week. The second and third tasks require, to some extent, explainable and interpretable models that can provide explanations. A development dataset is provided, and all methods will be tested on a concealed test dataset and assessed by an expert jury.
Keywords: artificial intelligence; machine learning; transparency; fishing; automatic reporting
Introduction
With a fishing zone spanning 2.1 million square kilometers, Norway is considered Europe's largest fishing and aquaculture nation. Every year, commercial vessels catch fish with a total value of around 20 billion NOK from the Norwegian fishing zone.
The overall migration patterns of the major fish species are relatively predictable and common knowledge. A fisherman knows, for example, that the mackerel season starts in mid-September and plans accordingly. On a daily basis, however, fish populations can move over large distances, and with the main decision-making tool being the captain's experience and intuition, boats often search for days or even weeks before making a catch. The number of boats is not negligible; there are currently around 1,100 Norwegian vessels over 11 meters involved. It is estimated that each vessel burns around 2,000–2,500 liters of fuel per day, which translates to approximately 5 million kg CO2-equivalents per day.
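The fleet-wide figure can be sanity-checked with back-of-the-envelope arithmetic. The emission factor of 2.0 kg CO2-equivalents per liter of marine fuel used below is an assumption chosen for illustration, not a value stated in this paper:

```python
# Back-of-the-envelope check of the fleet-wide CO2 estimate quoted above.
# The emission factor (kg CO2e per liter of marine fuel) is an ASSUMED
# value for illustration; the paper does not state one.

vessels = 1_100                      # Norwegian vessels over 11 meters
fuel_low, fuel_high = 2_000, 2_500   # liters burned per vessel per day
co2e_per_liter = 2.0                 # kg CO2e per liter (assumed factor)

daily_low = vessels * fuel_low * co2e_per_liter / 1e6    # million kg/day
daily_high = vessels * fuel_high * co2e_per_liter / 1e6  # million kg/day

print(f"Fleet emissions: {daily_low:.1f}-{daily_high:.1f} million kg CO2e/day")
```

With these assumptions the range is 4.4–5.5 million kg CO2e per day, consistent with the approximate 5 million kg figure above.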
Although the fishing fleet has over time shown an impressive ability to renew itself, the core operation of searching for and catching fish clearly has room for improvement in a sustainability context. Specifically, a more energy-efficient commercial fishing practice and operation should be a goal. In other words, there are great environmental benefits and opportunities in optimizing commercial fishing activities by reducing unnecessary transport distances. With the recent release of catch data made available by the Norwegian Directorate of Fisheries, a significant potential for applying artificial intelligence has opened up, which we want to explore with this challenge.
Dataset Details
We provide the participants with a collection of four publicly available datasets: a catch note dataset, a temperature dataset, a salinity dataset, and a moon phase dataset. All datasets can be used in all tasks and can be downloaded via: https://tinyurl.com/54w5bvxa. For the GPS coordinate predictions, the catch notes dataset is the ground truth. Participants are also encouraged to use other data sources if they are publicly available. In the following, we provide a more detailed description of each dataset and what the participants can expect for the evaluation of their results.
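A typical first step is to join the environmental datasets onto the catch notes, which serve as ground truth for the coordinate prediction task. The tiny in-memory records and field names below ("date", "lat", "temp", and so on) are assumptions for illustration only; the actual schemas of the downloadable files may differ:

```python
# Illustrative sketch: joining environmental features onto catch notes by
# date. All records and field names here are HYPOTHETICAL examples; they
# do not reflect the real schemas of the challenge datasets.

# Catch notes: ground truth for the GPS coordinate prediction task.
catch_notes = [
    {"date": "2020-06-01", "species": "mackerel", "lat": 62.5, "lon": 4.1},
    {"date": "2020-06-02", "species": "cod", "lat": 68.1, "lon": 14.3},
]
temperature = {"2020-06-01": 9.4, "2020-06-02": 8.7}   # sea temperature, deg C
salinity = {"2020-06-01": 34.9, "2020-06-02": 35.1}    # practical salinity units
moon_phase = {"2020-06-01": 0.72, "2020-06-02": 0.78}  # fraction illuminated

# Left-join each environmental feature onto the catch notes by date,
# producing one training row per catch note.
training_rows = [
    {**note,
     "temp": temperature.get(note["date"]),
     "salinity": salinity.get(note["date"]),
     "moon": moon_phase.get(note["date"])}
    for note in catch_notes
]

print(training_rows[0])
```

The resulting rows pair each logged catch location with the environmental conditions on that day, which is the kind of feature table the three challenge tasks can be built on.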
Catch Notes Dataset
The catch notes dataset contains catch notes collected by the Norwegian Directorate of Fisheries from 2000 to today for vessels larger than 15 meters. The notes consist of information about the catch that is manually logged during landing, e.g., when it was caught, where it was
© 2022 Author(s). This is an open access article licensed under the Creative Commons Attribution License 4.0.
(http://creativecommons.org/licenses/by/4.0/).