An Overview of AUV Algorithms Research and Testbed at the University of Michigan

Ryan M. Eustice*, Hunter C. Brown*, Ayoung Kim*
*Department of Naval Architecture & Marine Engineering
Department of Mechanical Engineering
University of Michigan
Ann Arbor, Michigan 48109-2145
email: {eustice, hcbrown, ayoungk}@umich.edu

Abstract—This paper provides a general overview of the autonomous underwater vehicle (AUV) research projects being pursued within the Perceptual Robotics Laboratory (PeRL) at the University of Michigan. Founded in 2007, PeRL's research thrust is centered around improving AUV autonomy via algorithmic advancements in sensor-driven perceptual feedback for environmentally-based real-time mapping, navigation, and control. In this paper we discuss our three major research areas of: (1) real-time visual simultaneous localization and mapping (SLAM); (2) cooperative multi-vehicle navigation; and (3) perception-driven control. Pursuant to these research objectives, PeRL has acquired and significantly modified two commercial off-the-shelf (COTS) Ocean-Server Technology, Inc. Iver2 AUV platforms to serve as a real-world engineering testbed for algorithm development and validation. Details of the design modifications, and the related research enabled by this integration effort, are discussed herein.

I. INTRODUCTION

The Perceptual Robotics Laboratory (PeRL) at the University of Michigan (UMich) is actively involved in three major research efforts: real-time vision-based simultaneous localization and mapping (SLAM), heterogeneous multi-vehicle cooperative navigation, and perception-driven control. To support these research goals, the laboratory purchased two commercial off-the-shelf (COTS) Ocean-Server Technology autonomous underwater vehicles (AUVs) and upgraded them with additional perceptual and navigation sensors.

A. Real-Time Visual SLAM

The first of the three PeRL research domains, real-time vision-based SLAM algorithms [1]–[4], has direct application to ship-hull inspection [5] and deep-sea archaeological missions [6]. Present-day means for ship hull and port facility inspection require either putting divers in the water or piloting a remotely operated vehicle (ROV) over the area of interest, both of which are manpower intensive and generally cannot guarantee 100% survey coverage. Automating this task, however, is challenging: areas around ships in berth are severely confined, cluttered, and complex sensing environments (e.g., acoustically, optically, magnetically). Current tethered robotic inspection systems present issues of snagging, maneuver degradation, and tether management, all of which make maneuvering around a ship at pier difficult. Moreover, current robotic inspection methods require human-in-the-loop intervention for both sensory interpretation and control (piloting). Navigation feedback in these scenarios is typically provided by acoustic transponder-based time-of-flight ranging [7], [8]. This necessitates the setup and calibration of the associated acoustic-beacon navigation infrastructure, and therefore vitiates our ability to rapidly and repeatably inspect multiple underwater structures.

In light of this, there exists a need to automate this task through the use of untethered robotic vehicles. Doing so with AUVs requires overcoming several present-day science and technology challenges inherent to the inspection task. For example, the areas around ships in berth are cluttered with structure (e.g., rudders, screws), which necessitates advanced navigation and localization systems that can work in confined, magnetically noisy spaces.
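The transponder-based navigation described above converts acoustic two-way travel times into ranges to surveyed beacons and then fixes vehicle position from those ranges. A minimal sketch of that idea follows; the sound speed, beacon layout, and function names are illustrative assumptions for this sketch, not details of the systems cited in [7], [8].

```python
import numpy as np

SOUND_SPEED = 1500.0  # assumed nominal speed of sound in seawater (m/s)

def range_from_towtt(towtt_s, c=SOUND_SPEED):
    """One-way slant range (m) from a two-way acoustic travel time (s)."""
    return c * towtt_s / 2.0

def trilaterate(beacons, ranges):
    """Least-squares 2-D position fix from ranges to known beacons.

    Differencing each range equation against the first cancels the
    quadratic terms, leaving a linear system A p = b in the position p.
    """
    beacons = np.asarray(beacons, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    b0, r0 = beacons[0], ranges[0]
    A = 2.0 * (beacons[1:] - b0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(beacons[1:]**2, axis=1) - np.sum(b0**2))
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p
```

With three or more surveyed beacons this yields a planar fix, but note that every inspection site requires deploying and calibrating such a beacon network first; eliminating that infrastructure overhead is precisely the motivation for the feature-based navigation approach developed here.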
In addition, ensuring 100% survey coverage of ship hulls, pier structures, and pilings requires technological advances in our understanding of autonomous environmental perception and control. The underlying algorithm should facilitate in-situ sensor-reactive navigation and mapping in these environments while accommodating map-based learning through time via revisited exploration (a prerequisite for hull change detection). Moreover, the increased diversity of threat objects, and the associated potential for more false alarms in a cluttered environment, necessitates fusion across multiple sensor types for robustness and redundancy. Taken together, these individual challenges and requirements suggest that a feature-based navigation and mapping strategy would accommodate the needs of automated search and inspection by AUVs.

The technical objective of this work is to develop an optical/acoustic real-time feature-based navigation (FBN) capability for explosive ordnance disposal (EOD) autonomous ship-hull inspection. Fig. 1 depicts the core elements of the overall FBN methodology, called visually augmented navigation (VAN). The VAN framework uses visual perception to augment the onboard dead-reckoned navigation capabilities of the unmanned underwater vehicle (UUV). VAN uses a pose-graph SLAM framework [3], [4], [9] to incorporate pairwise constraints from overlapping sensor imagery. These constraints