MagIK: a Hand Tracking Magnetic Positioning System based on a Kinematic Model of the Hand

Francesco Santoni, Alessio De Angelis, Antonio Moschitta and Paolo Carbone
Department of Engineering, University of Perugia, Perugia 06125, Italy
Email: {francesco.santoni, alessio.deangelis, antonio.moschitta, paolo.carbone}@unipg.it

Abstract—In this paper we present a hand tracking system based on magnetic positioning. A single magnetic node is mounted on each fingertip, and two magnetic nodes on the back of the hand. A fixed array of receiving coils detects the magnetic field, from which the position and orientation of each magnetic node can be inferred. A kinematic model of the whole hand has been developed. Starting from the positioning data of each magnetic node, the kinematic model is used to calculate the position and flexion angle of each finger joint, as well as the position and orientation of the hand in space. Because it relies on magnetic fields, the hand tracking system also works in non-line-of-sight conditions. The gesture reconstruction is validated by comparison with a commercial hand tracking system based on a depth camera. The system requires only a small amount of electronics to be mounted on the hand. This would allow building a light and comfortable data glove that could be used for several purposes: human-machine interfaces, sign language recognition, diagnostics, and rehabilitation.

I. INTRODUCTION

Capturing whole-hand gestures has been the subject of intense research for decades [1]–[4]. By capturing hand gestures, we mean measuring the position and orientation in space of each hand component: fingertips, phalanges, joints, and wrist. A hand tracking system should not only recognize static hand poses but also track the motion of each part. There are numerous possible applications of hand gesture tracking.
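To illustrate how a kinematic model of the kind mentioned in the abstract maps joint flexion angles to joint positions, the following sketch treats a single finger as a planar kinematic chain. The function name, phalanx lengths, and the planar simplification are illustrative assumptions, not the paper's actual model, which also handles full 3D hand position and orientation.

```python
import math

def finger_joint_positions(base, lengths, angles):
    """Planar forward kinematics for one finger (illustrative sketch).

    base    -- (x, y) of the knuckle (MCP joint) in the hand frame
    lengths -- phalanx lengths, e.g. [proximal, middle, distal]
    angles  -- flexion angle of each joint in radians, measured
               relative to the previous phalanx
    Returns the (x, y) of each joint plus the fingertip.
    """
    x, y = base
    theta = 0.0              # cumulative flexion along the chain
    points = [(x, y)]
    for length, angle in zip(lengths, angles):
        theta += angle       # each joint rotates relative to the previous segment
        x += length * math.cos(theta)
        y += length * math.sin(theta)
        points.append((x, y))
    return points

# Fully extended finger (hypothetical lengths in mm): the tip lies
# at base_x + sum of the phalanx lengths, i.e. x = 85.0, y = 0.0.
pts = finger_joint_positions((0.0, 0.0), [40.0, 25.0, 20.0], [0.0, 0.0, 0.0])
```

In the actual system the inverse problem is solved: the measured fingertip node pose constrains the chain, from which the joint flexion angles are recovered.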
Perhaps the best known is the implementation of hand tracking as a human-computer interface, particularly in virtual/augmented reality applications [5]–[9]. Another well-known application, also of considerable commercial interest, is lifelike computer graphics and animation [10]–[12]. As a human-machine interface, hand tracking can also be used to control industrial equipment and robots in real time [13], [14], even remotely [15]. In combination with machine learning techniques, robots can be taught to perform specific tasks by repeated observation of human gestures [16]. In a clinical setting, the assessment of hand kinematics can be exploited for diagnostic and rehabilitation purposes [17]. Hand tracking data and machine learning can be used to recognize characteristic motor symptoms associated with a particular disease (e.g., Parkinson's disease) [18]–[20], or to compare performed gestures with standard references extracted from control sets in targeted rehabilitation exercises [21], [22]. Hand tracking and machine learning are also being studied for automatic sign language recognition [3], [23], [24].

Several techniques have been studied for hand tracking. Vision-based techniques use cameras to capture hand images, then employ machine learning and optimization algorithms to identify features and patterns from which the hand configuration can be inferred [25], [26]. Techniques to identify skin features such as colors, shades, and textures have been implemented, as well as contour tracing to discriminate the hand silhouette from the background. These techniques are based on 2D information and are not particularly efficient. One enhancement has been to apply visually identifiable markers to the various hand components [27]. To obtain depth information, stereo cameras, time-of-flight (ToF), or structured-light techniques have been used [3], [25].
The LeapMotion controller, which we will use as a comparison in the following, is a well-known commercial hand tracking system [20], [28], [29]. It employs three infrared (IR) LEDs to illuminate the scene, and two IR cameras. Identification and reconstruction of the hand pose are performed using stereo-vision techniques. The main drawback of vision-based techniques is that they require line-of-sight conditions, and cannot perform well if, because of the hand orientation or the presence of objects being manipulated, the whole hand is not visible to the cameras [4].

A different solution is to mount several sensors on the hand itself, so as not to require an external capture apparatus such as a camera [1]. Stretch or bend sensors have been used to measure joint flexion angles [30], [31], while inertial measurement units (IMUs), such as accelerometers or gyroscopes, have been used to obtain information on position and orientation [5], [17], [32]. Both types of sensors can be integrated in the same system [33]. Generally, the problem with this kind of solution is that, when applied to the whole hand, it may result in a bulky structure that makes movements and object manipulation uncomfortable.

Another technique that does not require line-of-sight is based on magnetic field sources and sensors [5], [17], [18]. Magnetic sources can be mounted on the hand while the sensors are external [34], or, conversely, the magnetic field source is external while magnetic sensors are mounted on the hand, as, for example, in the commercial magnetic positioning system Polhemus [35], [36]. In a different approach, both sources and sensors are mounted on the hand [37], [38], but in this case only relative distances and orientations can be measured; hence the hand configuration can be reconstructed, but not its absolute position and orientation in space.
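Magnetic positioning systems of this kind typically model each transmitting node as a point magnetic dipole and invert the field measured at the receiving coils to estimate the node's position and orientation. The following is a minimal sketch of the standard dipole field equation; the function name and numerical values are illustrative, and the paper's actual field and measurement model may differ.

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (T*m/A)

def dipole_field(m, r):
    """Magnetic flux density B of a point dipole (illustrative sketch).

    m -- dipole moment vector (A*m^2)
    r -- observation point relative to the dipole (m)
    Implements B = (mu0 / (4*pi*|r|^3)) * (3*(m . r_hat)*r_hat - m).
    """
    r_norm = math.sqrt(sum(c * c for c in r))
    r_hat = [c / r_norm for c in r]
    m_dot_r_hat = sum(mc * rc for mc, rc in zip(m, r_hat))
    k = MU0 / (4 * math.pi * r_norm ** 3)
    return [k * (3 * m_dot_r_hat * rh - mc) for rh, mc in zip(r_hat, m)]

# On the dipole axis the field reduces to B_z = mu0 * m / (2 * pi * z^3);
# for m = 1 A*m^2 at z = 0.1 m this gives 2e-4 T.
B = dipole_field([0.0, 0.0, 1.0], [0.0, 0.0, 0.1])
```

A positioning solver would evaluate this model at each receiving coil and fit the dipole's position and moment direction to the measured fields, e.g. by nonlinear least squares.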
A further distinction in hand tracking approaches is between discriminative and generative methods [2], [4]. In the first case, hand gesture reconstruction is mainly based on collected data.

This is the author's version of an article that has been published in this journal. Changes were made to this version by the publisher prior to publication. The final version of record is available at http://dx.doi.org/10.1109/TIM.2021.3065761

Copyright (c) 2021 IEEE. Personal use is permitted. For any other purposes, permission must be obtained from the IEEE by emailing pubs-permissions@ieee.org.