I-RIM Conference 2019, October 18-20, Rome, Italy. ISBN: 9788894580501

Robotics for Precision Agriculture @DIAG

Ciro Potena, Mulham Fawakherji, Carlos Carbone, Marco Imperoli, Daniele Nardi, Alberto Pretto
Sapienza Università di Roma, {lastname}@diag.uniroma1.it

Abstract

Flourish (http://flourish-project.eu/) is a recent H2020 project whose aim was to develop a multi-platform robotic solution for precision agriculture, combining a micro UAV and a UGV. This document sketches the contributions of Sapienza Univ. of Rome in the context of the Flourish project, as well as the current follow-up activities in precision agriculture.

1 Introduction

The Flourish project started in 2014 with the aim of showing, through a prototype system, the potential of autonomous robots to support the practices of Precision Agriculture. Specifically, the project focused on targeted weed removal, in order to reduce the use of pesticides and human labor. The prototype system is composed of a micro UAV for aerial surveying of the crop field and a UGV (the Bosch Bonirob) for operation in the field. The overall Flourish concept is shown in Fig. 1. The Flourish consortium was led by ETH Zurich and includes seven partners, among them the RoCoCo lab from Sapienza Univ. of Rome. Below we describe the scientific contributions developed at Sapienza Univ. of Rome.

2 Research highlights

The main results achieved by our group in Flourish fall in the following research areas: (i) crop/weed classification; (ii) UGV localization and field mapping; (iii) UGV and UAV map realignment. The Flourish software and datasets are available at: www.dis.uniroma1.it/~labrococo/fsd

2.1 Crop/weed classification

The basic step for limiting weeds is their identification in the field. In particular, crop/weed classification aims at spotting cultivated plants and weeds in top views of the field.
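As an illustration of the task (this is a toy sketch, not the Flourish code), crop/weed classification can be cast as pixel-wise labeling: every pixel of a top-view image is assigned one of a few classes, and results are typically scored with per-class intersection-over-union. All class names and score values below are hypothetical.

```python
# Toy pixel-wise crop/weed labeling and evaluation. In practice the per-class
# score maps would come from a convolutional network; here they are made up.

SOIL, CROP, WEED = 0, 1, 2

def classify_pixels(score_maps):
    """Assign each pixel the class whose score map is highest there.

    score_maps: dict mapping class_id -> 2D list of per-pixel scores.
    Returns a 2D list of class ids (the predicted label map).
    """
    h = len(score_maps[SOIL])
    w = len(score_maps[SOIL][0])
    labels = [[SOIL] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            labels[r][c] = max(score_maps, key=lambda k: score_maps[k][r][c])
    return labels

def iou(pred, gt, cls):
    """Intersection-over-union for one class, a standard segmentation metric."""
    inter = sum(p == cls and g == cls
                for pr, gr in zip(pred, gt) for p, g in zip(pr, gr))
    union = sum(p == cls or g == cls
                for pr, gr in zip(pred, gt) for p, g in zip(pr, gr))
    return inter / union if union else 1.0
```

The cost of producing the ground-truth label map `gt` by hand, pixel by pixel, is exactly the annotation effort that synthetic training images avoid.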
This problem can be effectively addressed by convolutional neural networks; however, this approach suffers from the effort of input data labeling, which requires manual pixel-wise annotation. We proposed to use synthetic images, generated from a representation of the field built with Unreal Engine 4 (see an example in Fig. 2), to support the training of the network. In [Potena et al., 2017b; Di Cicco et al., 2017] we report that the use of simulated images (whose annotations come directly from the simulation environment) can substantially reduce the amount of labelled real images needed, while keeping the same classification accuracy (∼91%).

Figure 1: Flourish concept: UAV and UGV collaboratively map and act upon the weeds.

Figure 2: Annotated synthetic image of a sugar beet field.

2.2 UGV localization and field mapping

The goal of this research was to solve the Simultaneous Localization and Mapping problem for the UGV. GPS alone is not fully satisfactory, because it is not always sufficiently accurate, even with the most advanced technologies on the market, given the navigation requirements of a UGV moving between narrow crop rows. Moreover, the scene is rather homogeneous and does not provide good enough features for data association. Our work aims at improving the overall system performance by combining a variety of sensors (shown in Fig. 3) to leverage their capabilities and

DOI: 10.5281/zenodo.4782495
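The idea of combining sensors to compensate for unreliable GPS can be illustrated with a deliberately minimal example (this is not the Flourish pipeline, which solves a full SLAM problem over many sensors): a 1-D Kalman filter that predicts the UGV position from wheel odometry and corrects it with noisy GPS fixes. All variable names and noise values here are hypothetical.

```python
# Minimal 1-D Kalman filter: fuse wheel odometry (prediction) with a GPS fix
# (correction). The noise variances q and r encode how much each sensor is
# trusted; a large r makes the filter lean on odometry, as between narrow
# crop rows where GPS degrades.

def kalman_step(x, p, odom_delta, q, gps_z, r):
    """One predict/update cycle.

    x, p       : current position estimate and its variance
    odom_delta : displacement measured by wheel odometry since the last step
    q          : variance of the odometry noise (inflates the uncertainty)
    gps_z, r   : GPS position fix and its variance
    """
    # Predict: integrate the odometry, grow the uncertainty.
    x_pred = x + odom_delta
    p_pred = p + q
    # Update: blend in the GPS fix, weighted by the relative uncertainties.
    k = p_pred / (p_pred + r)          # Kalman gain in [0, 1]
    x_new = x_pred + k * (gps_z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new
```

Running the step repeatedly keeps the position variance bounded even though pure odometry integration would let it grow without limit, which is the basic payoff of multi-sensor fusion.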