International Research Journal of Engineering and Technology (IRJET) e-ISSN: 2395-0056
Volume: 10 Issue: 02 | Feb 2023 www.irjet.net p-ISSN: 2395-0072
© 2023, IRJET | Impact Factor value: 8.226 | ISO 9001:2008 Certified Journal | Page 719
Virtual Mouse Control Using Hand Gesture Recognition
G N Srinivas¹, S Sanjay Pratap², V S Subrahmanyam³, K G Nagapriya⁴, A Venkata Srinivasa Rao⁵
⁵Department of ECE, Sasi Institute of Technology & Engineering, Tadepalligudem, W.G.Dist, India.
¹,²,³,⁴UG Students, Department of ECE, Sasi Institute of Technology & Engineering, Tadepalligudem, W.G.Dist, India
---------------------------------------------------------------------***---------------------------------------------------------------------
Abstract - The mouse is a key device for human-computer interaction. Three types of mouse are in common use: wired, wireless, and Bluetooth, and all of them need either a cable or a powered dongle connected to the PC. This work employs machine learning and computer vision algorithms to recognize hand gestures, so the cursor can be controlled without any additional hardware. It relies on the CNN-based hand-landmark models implemented in MediaPipe. In this paper, we build on the algorithm of S. Shriram et al. [11], which proposes controlling the cursor's location using only one's hands and no physical mouse. Actions such as clicking and dragging items are performed through distinct hand movements. The proposed system requires only a single computer with a camera as the input device, and is implemented in Python with OpenCV. The camera output is displayed on the system's screen so that the end user can fine-tune the gestures.
Key Words: Camera, Machine Learning, CNN Model, MediaPipe, Virtual Mouse
1. INTRODUCTION
Hand gestures are among the most expressive and effective forms of human communication. Hand signals such as thumbs up and thumbs down have always existed, and gestures are regarded as one of the most natural ways for people to communicate with one another; sign languages built from gestures enable the deaf and hard of hearing to communicate. So why not put gestures to use on our machines? In this work, we recognize actual hand gestures. The initial setup includes a low-cost USB web camera for system input.
This paper proposes a real-time hand gesture system. The experimental setup uses a low-cost web camera with high-definition recording capability, installed in a fixed position: either a camera mounted on the computer monitor or the built-in camera of a laptop. The project proposes an effective hand gesture segmentation technique based on pre-processing, background subtraction, and edge detection approaches.
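As a rough illustration of the background subtraction step, the sketch below performs simple frame differencing against a stored background in plain Python; the toy frame values and the threshold constant are illustrative assumptions, not parameters taken from this paper (a real implementation would operate on OpenCV image arrays).

```python
# Minimal frame-differencing sketch of background subtraction.
# Frames are toy 3x3 grayscale grids; THRESHOLD is an assumed value.

THRESHOLD = 25  # intensity difference treated as foreground motion

def subtract_background(background, frame, threshold=THRESHOLD):
    """Return a binary mask: 1 where the frame differs from the background."""
    return [
        [1 if abs(f - b) > threshold else 0 for f, b in zip(f_row, b_row)]
        for f_row, b_row in zip(frame, background)
    ]

# A bright blob (the hand) enters an otherwise static scene.
background = [[10, 10, 10],
              [10, 10, 10],
              [10, 10, 10]]
frame      = [[10, 200, 10],
              [10, 200, 200],
              [10, 10, 10]]

mask = subtract_background(background, frame)
# mask -> [[0, 1, 0], [0, 1, 1], [0, 0, 0]]
```

Edge detection would then run on this mask (or on the raw frame) to outline the hand region before gesture classification.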
The AI virtual mouse system was created with the Python programming language and OpenCV, a computer vision library. The MediaPipe package tracks the hand and fingers, while the pynput, autopy, and PyAutoGUI packages move the cursor across the computer's window screen and perform operations such as left click, right click, and scrolling.
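To make the cursor-control step concrete, the sketch below maps MediaPipe's normalized hand-landmark coordinates (each in [0, 1]) to screen pixels. The screen resolution and the frame margin here are assumed values for illustration; in the actual system, a package such as autopy or PyAutoGUI would then move the cursor to the resulting point.

```python
# Hedged sketch: MediaPipe hand landmarks arrive normalized to [0, 1];
# mapping them to pixels is a linear rescale plus clamping.
# SCREEN_W/SCREEN_H and MARGIN are illustrative assumptions.

SCREEN_W, SCREEN_H = 1920, 1080  # assumed display resolution
MARGIN = 0.1  # ignore the outer 10% of the frame so the cursor can reach edges

def landmark_to_screen(x_norm, y_norm,
                       w=SCREEN_W, h=SCREEN_H, margin=MARGIN):
    """Map a normalized landmark (x, y) to screen pixel coordinates."""
    # Rescale [margin, 1 - margin] onto [0, 1], clamping at the borders.
    span = 1.0 - 2.0 * margin
    x = min(max((x_norm - margin) / span, 0.0), 1.0)
    y = min(max((y_norm - margin) / span, 0.0), 1.0)
    return round(x * (w - 1)), round(y * (h - 1))

# The frame centre lands at the screen centre.
print(landmark_to_screen(0.5, 0.5))  # -> (960, 540)
```

Clipping a margin around the frame is a common trick so the fingertip can reach the screen edges without leaving the camera's field of view.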
2. LITERATURE SURVEY
Chen-Chiung Hsieh et al. [1] proposed "A Real Time Hand
Gesture Recognition System Using Motion History Image" to
control the mouse cursor. The proposed method employs an
adaptive skin colour detection model to reduce
misclassifications. They implemented the method in C++ with the OpenCV image processing library. Kamran Niyazi et al. [2] proposed "Mouse Simulation Using Two Colored Tapes," which uses background subtraction, skin detection, and the HSV colour model to control the cursor and perform clicking operations; the distance between the tape colours guides the clicking operations. This model was implemented in Java. Abhik Banerjee et al. [3]
proposed a "Mouse Control Using a Web Camera Based on
Color Detection" to control cursor movements and click
events by detecting camera colour. Each colour represents a
different cursor control, and clicking actions are performed
by detecting the colours simultaneously. This method was implemented in MATLAB with the MATLAB Image Processing Toolbox. "Vision-based Computer Mouse
Control Using Hand Gestures" [4] was proposed by Sandeep
Thakur et al. To improve the efficiency and reliability of the
interaction, this method employs a vision-based system to
control various mouse activities such as left and right
clicking with hand gestures. To improve the system's
efficiency and performance, different colour caps are used on
fingers to recognise hand gestures. The MATLAB
environment was used to implement this method. To control the mouse cursor, Horatiu-Stefan Grif et al. [5] proposed "Mouse Cursor Control Based on Hand Gesture". They used
an external camera attached to a hand pad and colour strips
attached to the fingers in the proposed method. To
implement this methodology, they used C programming
software along with an image processing library called
OpenCV. Pooja Kumari et al. [6] proposed "Cursor Control
Using Hand Gestures" for controlling a mouse with camera-
captured hand gestures. The camera acts as a sensor in this
method, capturing and recognising colour tips attached to
the hand. Because it requires the user to have colour tips on
his hand in order to control the mouse, this method is also
known as the marker-based approach method. To
implement this methodology, they used the MATLAB
environment, the MATLAB Image Processing Tool box, and