Journal articles on the topic 'Hand gesture detection'

Consult the top 50 journal articles for your research on the topic 'Hand gesture detection.'

1

Shaha, Arya Ashish, Sanika Ashok Salunkhe, Milind Mahadeo Gargade, and Rutuja Vijay Kumbhar. "Hand Gesture Detection with Indication." International Journal of Engineering Research for Sustainable Development 1, no. 1 (2025): 1–4. https://doi.org/10.5281/zenodo.15332400.

Abstract:
This paper presents a method for hand gesture detection and indication using a flex sensor and Arduino. Hand gesture detection using flexible sensors and Arduino platforms offers an innovative approach to human-computer interaction, enabling intuitive control systems in various applications such as robotics, gaming, and assistive technology. This paper presents a system that utilizes flex sensors integrated with Arduino microcontrollers to detect hand gestures and provide real-time indications based on the recognized gestures. Flex sensors, which measure the degree of bending in fingers or hands, are strategically placed on a glove or wearable device to capture the user's hand movements. The Arduino system processes the sensor data to interpret specific gestures, such as open, closed, or finger-specific movements. Upon detecting a gesture, the system triggers corresponding feedback mechanisms, such as visual indicators on a screen or haptic responses, providing clear indications to the user. The approach's simplicity, low cost, and high accuracy make it suitable for applications in gesture-based control systems, virtual reality interfaces, and sign language translation. Experimental results show that the system is capable of reliably detecting a range of hand gestures and delivering appropriate responses, demonstrating its effectiveness in creating intuitive interaction environments.
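The threshold logic the abstract describes can be illustrated with a minimal sketch. This is not the authors' code: the paper's system runs on an Arduino, while this Python simulation only shows the idea of mapping per-finger bend readings to a gesture label; the ADC threshold and labels are invented for illustration.

```python
BEND_THRESHOLD = 600  # ADC counts above this mean the finger is bent (assumed value)

def classify_gesture(adc_readings):
    """Map five flex-sensor ADC readings (thumb..pinky) to a coarse gesture label."""
    bent = [r > BEND_THRESHOLD for r in adc_readings]
    if all(bent):
        return "closed fist"
    if not any(bent):
        return "open hand"
    # Otherwise report the per-finger pattern (1 = bent, 0 = straight)
    return "partial: " + "".join("1" if b else "0" for b in bent)
```

On an actual Arduino the readings would come from `analogRead` on each sensor pin; the same comparison logic applies.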
2

Shaha, Arya Ashish, Sanika Ashok Salunkhe, Milind Mahadeo Gargade, and Rutuja Vijay Kumbhar. "Hand Gesture Detection with Indication." International Journal of Engineering Research for Sustainable Development 1, no. 1 (2025): 1–4. https://doi.org/10.5281/zenodo.15332725.

Abstract:
This paper presents a method for hand gesture detection and indication using a flex sensor and Arduino. Hand gesture detection using flexible sensors and Arduino platforms offers an innovative approach to human-computer interaction, enabling intuitive control systems in various applications such as robotics, gaming, and assistive technology. This paper presents a system that utilizes flex sensors integrated with Arduino microcontrollers to detect hand gestures and provide real-time indications based on the recognized gestures. Flex sensors, which measure the degree of bending in fingers or hands, are strategically placed on a glove or wearable device to capture the user's hand movements. The Arduino system processes the sensor data to interpret specific gestures, such as open, closed, or finger-specific movements. Upon detecting a gesture, the system triggers corresponding feedback mechanisms, such as visual indicators on a screen or haptic responses, providing clear indications to the user. The approach's simplicity, low cost, and high accuracy make it suitable for applications in gesture-based control systems, virtual reality interfaces, and sign language translation. Experimental results show that the system is capable of reliably detecting a range of hand gestures and delivering appropriate responses, demonstrating its effectiveness in creating intuitive interaction environments.
3

Prananta, Gidion Bagas, Hagi Azzam Azzikri, and Chaerur Rozikin. "REAL-TIME HAND GESTURE DETECTION AND RECOGNITION USING CONVOLUTIONAL ARTIFICIAL NEURAL NETWORKS." METHODIKA: Jurnal Teknik Informatika dan Sistem Informasi 9, no. 2 (2023): 30–34. http://dx.doi.org/10.46880/mtk.v9i2.1911.

Abstract:
Real-time hand gesture detection is an interesting topic in pattern recognition and computer vision. In this study, we propose the use of a Convolutional Neural Network (CNN) to detect and recognize hands in real time. Our goal is to develop a system that can accurately identify and interpret user gestures in real time. The proposed approach involves two main stages, namely hand detection and gesture recognition. For the detection stage, we use a CNN architecture to locate hands in the video, training the model on a dataset containing various hand gestures. Once a hand is detected, we extract the relevant hand region and proceed to the gesture recognition stage, which involves training and testing CNN models to recognize different hand signals, using a hand gesture dataset that contains a variety of common signals. The experimental results show that the proposed system can detect and recognize hand movements in real time with satisfactory accuracy. Although some challenges still need to be overcome, this research provides a solid foundation for further development in real-time hand gesture recognition.
4

Nyirarugira, Clementine, Hyo-rim Choi, and TaeYong Kim. "Hand Gesture Recognition Using Particle Swarm Movement." Mathematical Problems in Engineering 2016 (2016): 1–8. http://dx.doi.org/10.1155/2016/1919824.

Abstract:
We present a gesture recognition method derived from particle swarm movement for free-air hand gesture recognition. Online gesture recognition remains a difficult problem due to uncertainty in vision-based gesture boundary detection methods. We suggest an automated process of segmenting meaningful gesture trajectories based on particle swarm movement. A subgesture detection and reasoning method is incorporated in the proposed recognizer to avoid premature gesture spotting. Evaluation of the proposed method shows promising recognition results: 97.6% on pre-isolated gestures, 94.9% on stream gestures with assistive boundary indicators, and 94.2% for blind gesture spotting on a digit gesture vocabulary. The proposed recognizer requires fewer computational resources; thus it is a good candidate for real-time applications.
5

Yu, Myoungseok, Narae Kim, Yunho Jung, and Seongjoo Lee. "A Frame Detection Method for Real-Time Hand Gesture Recognition Systems Using CW-Radar." Sensors 20, no. 8 (2020): 2321. http://dx.doi.org/10.3390/s20082321.

Abstract:
This paper describes a method for detecting the frames that can serve as hand gesture data when building a real-time hand gesture recognition system using continuous-wave (CW) radar. Detecting valid frames raises gesture recognition accuracy, so it is essential in such a system. Previous research on hand gesture recognition systems has not addressed valid-frame detection, so we adopted R-wave detection on the electrocardiogram (ECG) as the conventional baseline. The detection probability of the conventional method was 85.04%, which is too low for use in a hand gesture recognition system. The proposed method consists of two stages designed to improve accuracy. We measured the performance of each hand gesture detection method in terms of detection probability and recognition probability and, by comparing them, identified an optimal detection method. The proposed method detects valid frames with an accuracy of 96.88%, 11.84% higher than the conventional method; its recognition probability was 94.21%, which was 3.71% lower than the ideal method.
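The idea of valid-frame detection can be illustrated with a simple energy-threshold sketch. This is an assumption-laden stand-in, not the paper's two-stage method: it merely shows how frames carrying significant Doppler energy might be separated from empty ones; the threshold and data are invented.

```python
def frame_energy(samples):
    """Mean squared amplitude of one radar frame."""
    return sum(s * s for s in samples) / len(samples)

def detect_valid_frames(frames, threshold):
    """Return indices of frames whose mean energy exceeds the threshold,
    i.e. frames likely to contain hand motion in front of the radar."""
    return [i for i, f in enumerate(frames) if frame_energy(f) > threshold]
```

A practical system would tune the threshold on recorded background noise rather than fix it by hand.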
6

Usha, M., Nithin Gowda H., Santhosh M., Nanda Kumar S., and Nuthan Pawar E. "REAL-TIME HAND SIGN TRAINING AND DETECTION." International Journal For Technological Research In Engineering 11, no. 5 (2024): 149–51. https://doi.org/10.5281/zenodo.10554229.

Abstract:
Real-time hand sign recognition and detection are central to applications in human-computer interaction, sign language interpretation, and gesture-based control systems. This project focuses on creating real-time hand gesture and finger gesture annotations using the MediaPipe framework in Python, recognizing hand and finger gestures from the keypoints and finger coordinates that MediaPipe provides. The system offers two machine learning models: one for recognizing hand signs and another for detecting finger gestures. It provides resources, including sample programs, model files, and training data, allowing users to utilize pre-trained models. Key dependencies: MediaPipe, OpenCV, TensorFlow (tf-nightly for TFLite models with LSTM), scikit-learn (for confusion matrix display), and matplotlib (for visualization). The system's structure consists of sample programs, model files, and training data, offering users flexibility in training and utilizing the models. A demo program is also provided for real-time use with a webcam, complete with options for customization. Overall, the project offers a robust solution for training and detection: by integrating the MediaPipe framework with the learning models, users can effectively recognize hand signs and finger gestures, enabling applications in human-computer interaction and beyond.
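A common way to turn MediaPipe-style hand keypoints into a gesture feature is to test whether each fingertip lies above its middle (PIP) joint. This sketch is a hypothetical illustration, not the project's model; the landmark indices follow MediaPipe's 21-point hand layout, and it assumes an upright hand in image coordinates (y grows downward).

```python
TIP_IDS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips (MediaPipe indices)
PIP_IDS = [6, 10, 14, 18]   # the corresponding PIP (middle) joints

def count_extended_fingers(landmarks):
    """landmarks: list of 21 (x, y) pairs in image coordinates.
    A finger counts as extended when its tip is above its PIP joint,
    i.e. has a smaller y value. The thumb is ignored for simplicity."""
    return sum(1 for tip, pip in zip(TIP_IDS, PIP_IDS)
               if landmarks[tip][1] < landmarks[pip][1])
```

The resulting count (0-4) can serve as a crude sign feature or as input to a learned classifier.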
7

Herbert, Oswaldo Mendoza, David Pérez-Granados, Mauricio Alberto Ortega Ruiz, Rodrigo Cadena Martínez, Carlos Alberto González Gutiérrez, and Marco Antonio Zamora Antuñano. "Static and Dynamic Hand Gestures: A Review of Techniques of Virtual Reality Manipulation." Sensors 24, no. 12 (2024): 3760. http://dx.doi.org/10.3390/s24123760.

Abstract:
This review explores the historical and current significance of gestures as a universal form of communication with a focus on hand gestures in virtual reality applications. It highlights the evolution of gesture detection systems from the 1990s, which used computer algorithms to find patterns in static images, to the present day where advances in sensor technology, artificial intelligence, and computing power have enabled real-time gesture recognition. The paper emphasizes the role of hand gestures in virtual reality (VR), a field that creates immersive digital experiences through the blending of 3D modeling, sound effects, and sensing technology. This review presents state-of-the-art hardware and software techniques used in hand gesture detection, primarily for VR applications. It discusses the challenges in hand gesture detection, classifies gestures as static and dynamic, and grades their detection difficulty. This paper also reviews the haptic devices used in VR and their advantages and challenges. It provides an overview of the process used in hand gesture acquisition, from inputs and pre-processing to pose detection, for both static and dynamic gestures.
8

Aryan, Gupta, Narula Rachna, Garg Parth, Joshi Diksha, and Upadhyay Navneet. "Touchless Operations Using Hand Gestures Detection." Recent Innovations in Wireless Network Security 5, no. 3 (2023): 31–40. https://doi.org/10.5281/zenodo.8285242.

Abstract:
Humans have only recently begun using hand gestures to interact with computers. The integration of the real and digital worlds is the aim of gesture recognition. It is considerably simpler to convey our intentions and ideas to the computer via hand gestures. A simple and efficient touchless method of interacting with computer systems is through hand gestures. However, the limited end-user adoption of hand gesture-based systems is mostly caused by the significant technical challenges involved in successfully identifying in-air movements. Image recognition is one of the many ways that a computer may identify a hand gesture. The ability to recognise human movements is made possible by the deployment of a convolutional neural network (CNN). In this research, we build a straightforward hand tracking technique to operate a Robot Operating System (ROS) based surveillance car with socket programming using Google MediaPipe, a Machine Learning (ML) pipeline that integrates Palm Detection and Hand Landmark Models. In the investigation, steering speed and direction of a ROS automobile are controlled. Vehicles for surveillance that can be operated using hand gestures may help to enhance security measures.
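The steering control described above could, for example, derive a turn command from the horizontal position of the detected palm in the camera frame. A hypothetical sketch, not the authors' ROS code; the dead-zone fraction is an invented parameter.

```python
def steering_command(palm_x, frame_width, dead_zone=0.1):
    """Map the palm's horizontal pixel position to a turn command.
    `dead_zone` is the fraction of half-width around the center
    within which the car keeps driving straight (assumed value)."""
    center = frame_width / 2
    offset = (palm_x - center) / center  # -1 (far left) .. +1 (far right)
    if offset < -dead_zone:
        return "left"
    if offset > dead_zone:
        return "right"
    return "straight"
```

In a ROS setup the returned string would be translated into a velocity message and sent over the socket connection.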
9

Meng, Yuting, Haibo Jiang, Nengquan Duan, and Haijun Wen. "Real-Time Hand Gesture Monitoring Model Based on MediaPipe’s Registerable System." Sensors 24, no. 19 (2024): 6262. http://dx.doi.org/10.3390/s24196262.

Abstract:
Hand gesture recognition plays a significant role in human-to-human and human-to-machine interactions. Currently, most hand gesture detection methods rely on fixed hand gesture recognition. However, with the diversity and variability of hand gestures in daily life, this paper proposes a registerable hand gesture recognition approach based on Triple Loss. By learning the differences between different hand gestures, it can cluster them and identify newly added gestures. This paper constructs a registerable gesture dataset (RGDS) for training registerable hand gesture recognition models. Additionally, it proposes a normalization method for transforming hand gesture data and a FingerComb block for combining and extracting hand gesture data to enhance features and accelerate model convergence. It also improves ResNet and introduces FingerNet for registerable single-hand gesture recognition. The proposed model performs well on the RGDS dataset. The system is registerable, allowing users to flexibly register their own hand gestures for personalized gesture recognition.
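The Triple Loss mentioned above is commonly formulated as the standard triplet loss: embeddings of the same gesture are pulled together while embeddings of different gestures are pushed at least a margin apart. A minimal scalar sketch under that standard formulation (the paper's model learns the embeddings with FingerNet; the vectors and margin here are illustrative):

```python
import math

def euclidean(a, b):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Zero when the positive is already at least `margin` closer
    to the anchor than the negative; positive otherwise."""
    return max(0.0, euclidean(anchor, positive) - euclidean(anchor, negative) + margin)
```

Registering a new gesture then amounts to storing its embedding and classifying by nearest cluster, with no retraining of the network.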
10

Naveen, Y., and Ch Navya Sree. "GestureFlow: Advanced Hand Gesture Control System." INTERANTIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 09, no. 04 (2025): 1–9. https://doi.org/10.55041/ijsrem44540.

Abstract:
Our project "GestureFlow: Advanced Hand Gesture Control System" leverages real-time computer vision and deep learning techniques to create a robust, touchless control interface using hand gestures. The system utilizes MediaPipe Hands for efficient hand landmark detection and processes dynamic hand movements and finger configurations to identify a wide range of intuitive gestures such as swipes, pinches, and specific finger patterns. These gestures are mapped to actions like mouse control, clicks, volume adjustment, media playback, screenshot capture, window management, and many more. The system includes smoothed cursor tracking, velocity-based gesture recognition, and responsive command execution to ensure real-time performance. It also offers dynamic visual feedback and adaptive handling of gesture timing to improve precision and usability. Overall, the project presents an accessible, multi-functional human-computer interaction framework aimed at enhancing hands-free control and reducing reliance on traditional input devices in everyday computing environments.
Keywords: Hand Gesture Recognition, Human-Computer Interaction, Computer Vision, Deep Learning, MediaPipe, Real-Time Control, Touchless Interface, Gesture Classification, Cursor Navigation, Accessibility, Adaptive Gestures, Visual Feedback.
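Velocity-based gesture recognition of the kind described can be sketched by estimating the average speed over a short window of fingertip positions and thresholding it. A hypothetical illustration, not the project's implementation; the units and speed threshold are invented.

```python
def detect_swipe(xs, dt, min_speed=1.5):
    """Classify a horizontal swipe from a window of x positions
    sampled every `dt` seconds. Speeds are in position units per
    second; `min_speed` is an assumed sensitivity threshold."""
    if len(xs) < 2:
        return None
    speed = (xs[-1] - xs[0]) / ((len(xs) - 1) * dt)
    if speed > min_speed:
        return "swipe right"
    if speed < -min_speed:
        return "swipe left"
    return None  # motion too slow to count as a swipe
```

Using average speed over a window, rather than instantaneous frame-to-frame deltas, makes the detector robust to single-frame tracking jitter.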
11

Alvin, Arsheldy, Nabila Husna Shabrina, Aurelius Ryo, and Edgar Christian. "Hand Gesture Detection for Sign Language using Neural Network with Mediapipe." Ultima Computing : Jurnal Sistem Komputer 13, no. 2 (2021): 57–62. http://dx.doi.org/10.31937/sk.v13i2.2109.

Abstract:
The most popular way of interfacing with most computer systems is a mouse and keyboard. Hand gestures are an intuitive and effective touchless way to interact with computer systems. However, hand gesture-based systems have seen low adoption among end-users, primarily due to numerous technical hurdles in detecting in-air gestures accurately. This paper presents Hand Gesture Detection for American Sign Language using K-Nearest Neighbor with Mediapipe, a framework developed to bridge this gap. The framework learns to detect gestures from demonstrations; it is customizable by end-users and enables users to interact in real time, using gestures, with computers that have only RGB cameras.
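The K-Nearest Neighbor classification over MediaPipe landmarks that the abstract mentions reduces to majority voting among the closest stored demonstrations. A minimal generic sketch; the feature vectors here are tiny 2-D stand-ins for flattened landmark coordinates.

```python
import math
from collections import Counter

def knn_classify(query, examples, k=3):
    """examples: list of (feature_vector, label) pairs recorded from
    user demonstrations. Returns the majority label among the k
    nearest neighbours by Euclidean distance."""
    dist = lambda a, b: math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    nearest = sorted(examples, key=lambda e: dist(query, e[0]))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]
```

Because KNN stores examples directly, adding a new sign is just appending demonstrations to the list, which matches the end-user customizability described above.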
12

Alnaim, N., and M. Abbod. "Hand Gesture Detection Using Neural Networks Algorithms." International Journal of Machine Learning and Computing 9, no. 6 (2019): 782–87. http://dx.doi.org/10.18178/ijmlc.2019.9.6.873.

13

Lu, Cong, Haoyang Zhang, Yu Pei, et al. "Online Hand Gesture Detection and Recognition for UAV Motion Planning." Machines 11, no. 2 (2023): 210. http://dx.doi.org/10.3390/machines11020210.

Abstract:
Recent advances in hand gesture recognition have produced more natural and intuitive methods of controlling unmanned aerial vehicles (UAVs). However, in unknown and cluttered environments, UAV motion planning requires the assistance of hand gesture interaction in complex flight tasks, which remains a significant challenge. In this paper, a novel framework based on hand gesture interaction is proposed, to support efficient and robust UAV flight. A cascading structure, which includes Gaussian Native Bayes (GNB) and Random Forest (RF), was designed, to classify hand gestures based on the Six Degrees of Freedom (6DoF) inertial measurement units (IMUs) of the data glove. The hand gestures were mapped onto UAV's flight commands, which corresponded to the direction of the UAV flight. The experimental results, which tested the 10 evaluated hand gestures, revealed the high accuracy of online hand gesture recognition under asynchronous detection (92%), and relatively low latency for interaction (average recognition time of 7.5 ms; average total time of 3 s). The average time of the UAV's complex flight task was about 8 s shorter than that of the synchronous hand gesture detection and recognition. The proposed framework was validated as efficient and robust, with extensive benchmark comparisons in various complex real-world environments.
14

Sunyoto, Andi. "A Robust Hand Silhouette Orientation Detection Method for Hand Gesture Recognition." Journal of Southwest Jiaotong University 56, no. 3 (2021): 43–52. http://dx.doi.org/10.35741/issn.0258-2724.56.3.4.

Abstract:
The computer vision approach is the most widely used in research on hand gesture recognition, and detecting the orientation of the hand image has proved to be one of the keys to its success. The degrees of freedom of the hand determine the shape and orientation of a gesture, which complicates recognition methods. This article evaluates orientation detection for static hand gesture silhouettes with different poses and orientations, without considering the forearm. Two popular methods were compared: the longest chord and the ellipse fit. The angles formed from two wrist points, measured from the horizontal axis, were selected as ground truth. Performance was analyzed using the error between the ground truth angles and each method's results; methods with errors closer to zero were rated better. The methods were evaluated on 1187 images, divided into four groups based on forearm presence, and the results showed the forearm's effect on orientation detection. The ellipse method proved better than the longest chord. These results can be used to select a hand gesture orientation detection method that increases accuracy in the hand gesture recognition process.
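The ellipse method evaluated above is typically computed from second-order central moments of the silhouette pixels: the orientation of the best-fit ellipse follows from the moments via an arctangent. A sketch under that standard formulation, not necessarily the paper's exact implementation:

```python
import math

def silhouette_orientation(points):
    """Orientation, in degrees from the horizontal axis, of the ellipse
    fitted to a set of silhouette pixel coordinates via central moments."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    mu20 = sum((x - cx) ** 2 for x, _ in points)   # spread along x
    mu02 = sum((y - cy) ** 2 for _, y in points)   # spread along y
    mu11 = sum((x - cx) * (y - cy) for x, y in points)  # x-y covariance
    # Standard moment-based orientation of the equivalent ellipse
    return math.degrees(0.5 * math.atan2(2 * mu11, mu20 - mu02))
```

Normalizing a silhouette by this angle before matching removes rotation as a source of recognition error.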
15

Padmaja, M. "Wave Your Way: Navigation Through Hand Gestures." INTERANTIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 09, no. 03 (2025): 1–9. https://doi.org/10.55041/ijsrem42033.

Abstract:
Several technologies are continuously developing in today's technological world, where human-computer interaction is very important, and hand gesture recognition is essential to it. We can control our system by showing our hands in front of a webcam, and hand gesture recognition can be useful for all kinds of people. A specific interactive module, such as a virtual mouse that makes use of object tracking and gestures, serves as an alternative to the traditional touchscreen and physical mouse. The system allows people to control a computer cursor using hand movements recorded by a camera, utilizing computer vision techniques and machine learning algorithms. The suggested system has three key components: hand detection, gesture recognition, and cursor control. Hand detection is the process of locating and tracking the user's hand within the camera's field of view, and may employ methods such as deep learning-based object detection, background subtraction, and skin colour segmentation. The system can also capture gestures clearly even when the camera quality is poor, providing smooth navigation.
Keywords: Hand Gesture Recognition, Human-Computer Interaction, Computer Vision, Video Capturing, Python Libraries, Accessibility, Real-Time Systems, Speech Recognition.
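Virtual-mouse systems like the one described usually smooth raw fingertip positions before moving the pointer, to suppress camera jitter. A minimal exponential-smoothing sketch; the smoothing factor is an invented parameter, not taken from the paper.

```python
class CursorSmoother:
    """Exponential smoothing of raw fingertip coordinates.
    `alpha` closer to 1 tracks the hand faster but passes more jitter;
    closer to 0 is smoother but laggier (0.3 is an assumed default)."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.pos = None

    def update(self, x, y):
        if self.pos is None:
            self.pos = (x, y)  # first sample: no history to blend with
        else:
            px, py = self.pos
            self.pos = (px + self.alpha * (x - px), py + self.alpha * (y - py))
        return self.pos
```

The smoothed coordinates would then be handed to an OS cursor API (e.g. via a library such as pyautogui) each frame.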
16

Feng, Guo-Hua, and Gui-Rong Lai. "Hand Gesture Detection and Recognition Using Spectrogram and Image Processing Technique with a Single Pair of Ultrasonic Transducers." Applied Sciences 11, no. 12 (2021): 5407. http://dx.doi.org/10.3390/app11125407.

Abstract:
This paper presents an effective signal processing scheme for hand gesture recognition with a superior accuracy rate in judging identical and dissimilar hand gestures. The scheme is implemented with an air sonar comprising a pair of cost-effective ultrasonic emitter and receiver along with signal processing circuitry. Through the circuitry, the Doppler signals of hand gestures are obtained and processed with the developed recognition algorithm. Four different hand gestures are investigated: push motion, wrist motion from flexion to extension, pinch out, and hand rotation. To judge the starting time of hand gesture occurrence, a technique based on continuous short-period analysis is proposed. It can identify the starting time of small-scale hand gestures and avoid faulty judgments while no hand is in front of the sonar. Fusing the short-time Fourier transform spectrogram of a hand gesture with the image processing techniques of corner feature detection, feature descriptors, and Hamming-distance matching is, to our knowledge, employed for the first time to recognize hand gestures. The results show that the number of matching points is an effective parameter for classifying hand gestures. Based on the experimental data, the proposed scheme achieves an accuracy rate of 99.8% for hand gesture recognition.
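Hamming-distance matching of binary feature descriptors, the final stage described above, can be sketched as follows. The descriptors here are toy bit lists, not real corner descriptors extracted from spectrograms.

```python
def hamming(a, b):
    """Hamming distance between two equal-length binary descriptors."""
    return sum(x != y for x, y in zip(a, b))

def match_descriptors(query, gallery, max_dist=2):
    """Count gallery descriptors within `max_dist` bits of the query.
    The paper reports that the number of such matching points is an
    effective parameter for separating gesture classes; `max_dist`
    here is an assumed matching threshold."""
    return sum(1 for d in gallery if hamming(query, d) <= max_dist)
```

A gesture would then be assigned to whichever reference class yields the highest match count.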
17

Sravani, Jiripurapu, Soma Yagna Priya, Gowra Pavan Kumar, Chereddy Mohith Sankar, and KRMC Sekhar. "Hand Gesture Detection using Deep Learning with YOLOv5." International Journal of Multidisciplinary Research and Growth Evaluation. 6, no. 2 (2025): 742–50. https://doi.org/10.54660/.ijmrge.2025.6.2.742-750.

Abstract:
Hand gesture recognition has become a significant technological advancement in assistive communication, offering a reliable means of interaction for individuals with hearing and speech impairments. This research introduces an intelligent gesture detection system powered by YOLOv5, a leading object detection model, to enable accurate and real-time recognition of Indian Sign Language (ISL) gestures. The system effectively handles diverse environmental conditions and user-specific variations using an extensive and well-annotated dataset. The methodology encompasses essential stages such as image preprocessing, data augmentation, and feature extraction to optimize model performance. Furthermore, a user-friendly web interface allows users to upload images for gesture detection, with corresponding text and audio outputs generated using a text-to-speech module. Designed for seamless scalability, the system can accommodate additional gestures and languages, making it a versatile solution for educational institutions, healthcare facilities, and public service sectors. By fostering greater inclusivity and accessibility, this approach represents a step forward in empowering the hearing-impaired community through innovative deep-learning applications.
18

Nikita Kashyap, Pragati Patharia, and Arun Kumar Kashyap. "Gesture-Based Control of Multimedia Player Using Python and OpenCV." International Journal of Advanced Technology and Social Sciences 1, no. 4 (2023): 267–78. http://dx.doi.org/10.59890/ijatss.v1i4.983.

Abstract:
The increasing significance of computers in our daily lives, coupled with the rise of ubiquitous computing, has necessitated effective human-computer interaction. Hand gesture recognition systems have emerged as a real-time video-based solution for detecting and interpreting hand gestures, offering intelligent and natural human-computer interaction (HCI) methods. This project focuses on leveraging human hands as input devices for computer operation. Developed using Python and the OpenCV library, the program utilizes a computer webcam to capture and analyze hand shapes and patterns. The program provides real-time feedback by displaying recognized hand gestures on the live video stream. The ultimate outcome of this project is an application that enhances user experiences in contactless systems. The project recognizes and detects human hand motions in Python through a process flow that includes background subtraction, hand ROI segmentation, contour detection, and finger recognition. Image processing techniques are used, including hand gesture detection, pattern recognition, thresholding, and contour detection. The processing of incoming images and the generation of the corresponding keystrokes are made possible by OpenCV, a rich set of image processing tools.
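The background-subtraction step in the pipeline above amounts to thresholding the per-pixel difference between the current frame and a background model. A pure-Python toy version; a real implementation would use OpenCV arrays (e.g. `cv2.absdiff`), and the values here are illustrative grayscale intensities.

```python
def subtract_background(frame, background, threshold=25):
    """Per-pixel absolute difference against a static background model.
    Returns a binary foreground mask (1 = moving hand candidate).
    `threshold` is an assumed sensitivity value in intensity units."""
    return [[1 if abs(p - b) > threshold else 0
             for p, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]
```

The mask's largest connected region would then be taken as the hand ROI for contour detection and finger counting.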
19

Kashyap, Nikita, Pragati Patharia, and Arun Kumar Kashyap. "Gesture-Based Control of Multimedia Player Using Python and OpenCV." International Journal of Advanced Technology and Social Sciences 1, no. 4 (2023): 267–78. https://doi.org/10.59890/ijatss.v1i4.983.

Abstract:
The increasing significance of computers in our daily lives, coupled with the rise of ubiquitous computing, has necessitated effective human-computer interaction. Hand gesture recognition systems have emerged as a real-time video-based solution for detecting and interpreting hand gestures, offering intelligent and natural human-computer interaction (HCI) methods. This project focuses on leveraging human hands as input devices for computer operation. Developed using Python and the OpenCV library, the program utilizes a computer webcam to capture and analyze hand shapes and patterns. The program provides real-time feedback by displaying recognized hand gestures on the live video stream. The ultimate outcome of this project is an application that enhances user experiences in contactless systems. The project recognizes and detects human hand motions in Python through a process flow that includes background subtraction, hand ROI segmentation, contour detection, and finger recognition. Image processing techniques are used, including hand gesture detection, pattern recognition, thresholding, and contour detection. The processing of incoming images and the generation of the corresponding keystrokes are made possible by OpenCV, a rich set of image processing tools.
20

Shaik, Abdul Nabi, E. Sai Priya, G. Nikitha, K. Pracheen Kumar, and N. Shravya Shree. "CONTROLLING VIDEOLAN CLIENT MEDIA USING LUCAS KANADE OPTIMAL FLOW ALGORITHM AND OPENCV." YMER Digital 21, no. 05 (2022): 246–55. http://dx.doi.org/10.37896/ymer21.05/29.

Abstract:
In this project we discuss a system that uses dynamic hand gesture recognition to control media players such as the VLC media player. The system consists of modules that segment the foreground part of each frame using skin colour detection and the approximate median technique. Hand gestures are recognized by building a decision tree that uses features extracted from the segmented region. This hand gesture recognition technique introduces a new, natural way to interact with computers and is useful in many everyday situations. In hand gesture recognition, gestures are associated with the human hands. The project controls certain operations (Pause, Play, Volume Up, Volume Down, Mute) on the video player by hand gestures alone, without pressing buttons or tapping the screen, which applies directly to daily life, such as during presentations. Hand gestures and their directional motion define a gesture for the application, and images are retrieved using a webcam. The most frequently used VLC media player functions are assigned predefined gestures according to the corresponding VLC player control operations. We created a VLC Media Player Controller using a hand gesture recognition system to make 'HUMAN LIFE EASY AND BETTER'. The project is implemented in two steps: (1) creation of the hand gesture recognition system, done with image processing using the OpenCV library; and (2) controlling the VLC media player using hand gestures, where the player is controlled through shell commands invoked from Python via the OS library.
21

Chen, Yin-Lin, Wen-Jyi Hwang, Tsung-Ming Tai, and Po-Sheng Cheng. "Sensor-Based Hand Gesture Detection and Recognition by Key Intervals." Applied Sciences 12, no. 15 (2022): 7410. http://dx.doi.org/10.3390/app12157410.

Abstract:
This study aims to present a novel neural network architecture for sensor-based gesture detection and recognition. The algorithm is able to detect and classify accurately a sequence of hand gestures from the sensory data produced by accelerometers and gyroscopes. Each hand gesture in the sequence is regarded as an object with a pair of key intervals. The detection and classification of each gesture are equivalent to the identification and matching of the corresponding key intervals. A simple automatic labelling is proposed for the identification of key intervals without manual inspection of sensory data. This could facilitate the collection and annotation of training data. To attain superior generalization and regularization, a multitask learning algorithm for the simultaneous training for gesture detection and classification is proposed. A prototype system based on smart phones for remote control of home appliances was implemented for the performance evaluation. Experimental results reveal that the proposed algorithm provides an effective alternative for applications where accurate detection and classification of hand gestures by simple networks are desired.
APA, Harvard, Vancouver, ISO, and other styles
22

Himanshu, Gupta, Ramjiwal Aniruddh, and T. Jose Jasmin. "Vision Based Approach to Sign Language Recognition." International Journal of Advances in Applied Sciences (IJAAS) 7, no. 2 (2018): 156–61. https://doi.org/10.11591/ijaas.v7.i2.pp156-161.

Full text
Abstract:
We propose an algorithm for automatically recognizing a certain set of gestures from hand movements to help deaf and hard-of-hearing people. Hand gesture recognition is quite a challenging problem in its own right. We consider a fixed set of manual commands and a specific environment, and develop an effective procedure for gesture recognition. Our approach contains steps for segmenting the hand region, locating the fingers, and finally classifying the gesture, which in general terms means detecting, tracking and recognising. The algorithm is invariant to rotations, translations and scale of the hand. We demonstrate the effectiveness of the technique on real imagery.
APA, Harvard, Vancouver, ISO, and other styles
23

Ameliasari, Maya, Aji Gautama Putrada, and Rizka Reza Pahlevi. "An Evaluation of SVM in Hand Gesture Detection Using IMU-Based Smartwatches for Smart Lighting Control." JURNAL INFOTEL 13, no. 2 (2021): 47–53. http://dx.doi.org/10.20895/infotel.v13i2.656.

Full text
Abstract:
Hand gesture detection with a smartwatch can be used for smart lighting control in an Internet of Things (IoT) environment using machine learning techniques such as the support vector machine (SVM). However, several parameters affect the SVM model's performance and need to be evaluated. This study evaluates the parameters in building an SVM model for hand gesture detection in intelligent lighting control. In this study, eight gestures were defined to turn on and off four different lights, and then the data were collected through a smartwatch with an Inertial Measurement Unit (IMU) sensor. Feature selection using Pearson correlation is then carried out on 36 features extracted from each gesture's data. Finally, two sets of gestures were compared to evaluate the effect of gesture selection on model performance. For the first set of gestures, the accuracy with 10 features compared to the accuracy with 36 features is 94% versus 71%, respectively. Furthermore, the second set of gestures has an accuracy lower than the first set of gestures, at 64%. Results show that the lower the number of features, the better the accuracy. Gesture sets that are less distinctive also show lower accuracy than highly distinctive ones. The conclusion is that, in implementing gesture detection with SVM, low data dimensionality needs to be maintained through feature selection methods, and a distinctive set of gestures is required for a model with good performance.
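The Pearson-correlation feature-selection step this abstract describes can be sketched with NumPy alone. The data here are synthetic; the study's 36 IMU-derived features are not reproduced:

```python
import numpy as np

def select_features_by_pearson(X, y, k=10):
    """Rank features by |Pearson correlation| with the label and
    return the indices of the top k.  X: (n_samples, n_features)."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    denom = np.sqrt((Xc ** 2).sum(axis=0) * (yc ** 2).sum())
    denom = np.where(denom == 0, np.finfo(float).eps, denom)  # guard constants
    r = (Xc * yc[:, None]).sum(axis=0) / denom
    order = np.argsort(-np.abs(r))
    return order[:k]
```

The retained feature indices would then be used to slice the training matrix before fitting the SVM, which is how the study reduces 36 features to 10.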
APA, Harvard, Vancouver, ISO, and other styles
24

Tran, Dinh-Son, Ngoc-Huynh Ho, Hyung-Jeong Yang, Eu-Tteum Baek, Soo-Hyung Kim, and Gueesang Lee. "Real-Time Hand Gesture Spotting and Recognition Using RGB-D Camera and 3D Convolutional Neural Network." Applied Sciences 10, no. 2 (2020): 722. http://dx.doi.org/10.3390/app10020722.

Full text
Abstract:
Using hand gestures is a natural method of interaction between humans and computers. We use gestures to express meaning and thoughts in our everyday conversations. Gesture-based interfaces are used in many applications in a variety of fields, such as smartphones, televisions (TVs), video gaming, and so on. With advancements in technology, hand gesture recognition is becoming an increasingly promising and attractive technique in human–computer interaction. In this paper, we propose a novel method for fingertip detection and hand gesture recognition in real-time using an RGB-D camera and a 3D convolutional neural network (3DCNN). This system can accurately and robustly extract fingertip locations and recognize gestures in real-time. We demonstrate the accuracy and robustness of the interface by evaluating hand gesture recognition across a variety of gestures. In addition, we develop a tool to manipulate computer programs to show the possibility of using hand gesture recognition. The experimental results showed that our system has a high level of accuracy of hand gesture recognition. This is thus considered to be a good approach to a gesture-based interface for human–computer interaction by hand in the future.
APA, Harvard, Vancouver, ISO, and other styles
25

Elmezain, Mahmoud, Majed M. Alwateer, Rasha El-Agamy, Elsayed Atlam, and Hani M. Ibrahim. "Forward Hand Gesture Spotting and Prediction Using HMM-DNN Model." Informatics 10, no. 1 (2022): 1. http://dx.doi.org/10.3390/informatics10010001.

Full text
Abstract:
Automatic key gesture detection and recognition are difficult tasks in Human–Computer Interaction due to the need to spot the start and the end points of the gesture of interest. By integrating Hidden Markov Models (HMMs) and Deep Neural Networks (DNNs), the present research provides an autonomous technique that carries out hand gesture spotting and prediction simultaneously with no time delay. An HMM can be used to extract features and spot meaningful gestures using a forward spotting mechanism with varying sliding window sizes, after which Deep Neural Networks perform the recognition process. Therefore, a stochastic strategy for creating a non-gesture model using HMMs with no training data is suggested to accurately spot meaningful number gestures (0–9). The non-gesture model provides a confidence measure, which is utilized as an adaptive threshold to determine where meaningful gestures begin and end in the input video stream. Furthermore, DNNs are extremely efficient and perform exceptionally well when it comes to real-time object detection. According to experimental results, the proposed method can successfully spot and predict meaningful gestures with a reliability of 94.70%.
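The forward-spotting idea in this abstract, comparing a gesture model's score against a non-gesture model's score frame by frame to locate start and end points, can be illustrated schematically. The scores below are toy values, not output of the authors' HMM-DNN:

```python
def spot_gestures(gesture_scores, nongesture_scores):
    """Return (start, end) index pairs of intervals where the gesture
    model outscores the non-gesture model, which acts as an adaptive
    threshold on the input stream."""
    intervals, start = [], None
    for t, (g, n) in enumerate(zip(gesture_scores, nongesture_scores)):
        if g > n and start is None:
            start = t                       # gesture begins
        elif g <= n and start is not None:
            intervals.append((start, t - 1))  # gesture ends
            start = None
    if start is not None:                   # gesture still open at stream end
        intervals.append((start, len(gesture_scores) - 1))
    return intervals
```

In the paper the non-gesture score comes from an HMM built without training data; here it simply plays the role of a per-frame threshold that rises and falls with the input.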
APA, Harvard, Vancouver, ISO, and other styles
26

R. Sudhakar, V. Gayathri, P. Gomathi, S. Renuka, and N. Hemalatha. "Sign Language Detection." South Asian Journal of Engineering and Technology 13, no. 1 (2023): 49–56. http://dx.doi.org/10.26524/sajet.2023.13.5.

Full text
Abstract:
This project focuses on sign detection using hand gesture techniques and reviews their merits and drawbacks in various circumstances. Hand segmentation theory and a hand detection system are used to construct hand gesture recognition in Python with OpenCV. The hand gesture is a natural interface, which motivates research in gesture taxonomies, representations, recognition methods/algorithms and software platforms/frameworks, all of which are covered in detail in this project. The ever-increasing public acceptance of, and funding for, multinational projects emphasizes the need for sign language work. A computer-based solution for deaf people is important in the modern age of technology. Researchers have been studying the problem for quite some time, and the results show promise. This project presents a comprehensive review of vision-oriented sign recognition methodologies, emphasizing the importance of considering factors beyond an algorithm's recognition accuracy when predicting success in real-world scenarios. The project matches a given image against dataset images spanning numerous categories of signs (gestures). A convolutional neural network (CNN) has been implemented to increase the accuracy level. The project applies grayscale conversion, then binary image conversion, and finally histogram construction and matching of the given test image with the dataset images. The coding language used is Python 3.8.
APA, Harvard, Vancouver, ISO, and other styles
27

Pooja, J. Pawar, and S. Banait Archana. "SMART VIRTUAL MOUSE SYSTEM USING HAND GESTURE AND FINGERTIP DETECTION." Journal of the Maharaja Sayajirao University of Baroda 59, no. 1 (I) (2025): 483–91. https://doi.org/10.5281/zenodo.15318689.

Full text
Abstract:
The rapid advancement of Human-Computer Interaction (HCI) has led to innovative input systems that replace traditional devices like the mouse and keyboard. This research investigates how hand gestures can be used to replace physical mouse inputs in a computer system, utilizing computer vision and deep learning techniques. We developed a virtual mouse system that captures hand gestures using a standard webcam; the gestures are then processed by computer vision algorithms to control mouse functions such as cursor movement, clicking, scrolling, and zooming. The system was tested under various conditions, including variations in hand size, gesture speed, and environmental lighting. Key findings indicate that the system achieves 99% accuracy under optimal conditions but drops to 92% in low or overexposed lighting and at distances greater than 30 cm. Furthermore, recognition of complex gestures like right-click and scrolling remains a challenge. To address these issues, future iterations will incorporate adaptive lighting correction algorithms, additional training data for gesture refinement, and alternative detection methods. Despite these challenges, the system offers a cost-effective, innovative solution for individuals with physical disabilities and presents promising potential for immersive, hands-free interaction with digital systems in the future. Keywords: Human-Computer Interaction, Virtual Mouse, Hand Gesture Recognition, Computer Vision, Deep Learning, Python.
APA, Harvard, Vancouver, ISO, and other styles
28

Xu, Jiawei, Lu Leng, and Byung-Gyu Kim. "Gesture Recognition and Hand Tracking for Anti-Counterfeit Palmvein Recognition." Applied Sciences 13, no. 21 (2023): 11795. http://dx.doi.org/10.3390/app132111795.

Full text
Abstract:
At present, COVID-19 is posing a serious threat to global human health. The features of hand veins in infrared environments have many advantages, including non-contact acquisition, security, privacy, etc., which can remarkably reduce the risks of COVID-19. Therefore, this paper builds an interactive system, which can recognize hand gestures and track hands for palmvein recognition in infrared environments. The gesture contours are extracted and input into an improved convolutional neural network for gesture recognition. The hand is tracked based on key point detection. Because the hand gesture commands are randomly generated and the hand vein features are extracted from the infrared environment, the anti-counterfeiting performance is obviously improved. In addition, hand tracking is conducted after gesture recognition, which prevents the escape of the hand from the camera view range, so it ensures that the hand used for palmvein recognition is identical to the hand used during gesture recognition. The experimental results show that the proposed gesture recognition method performs satisfactorily on our dataset, and the hand tracking method has good robustness.
APA, Harvard, Vancouver, ISO, and other styles
29

Dunai, Larisa, Isabel Seguí Verdú, Dinu Turcanu, and Viorel Bostan. "Prosthetic Hand Based on Human Hand Anatomy Controlled by Surface Electromyography and Artificial Neural Network." Technologies 13, no. 1 (2025): 21. https://doi.org/10.3390/technologies13010021.

Full text
Abstract:
Humans have a complex way of expressing their intuitive intentions in real gestures. That is why many gesture detection and recognition techniques have been studied and developed. There are many methods of reading human hand signals, such as those using electroencephalography, electrocorticography, and electromyography, as well as methods for gesture recognition. In this paper, we present a method for real-time surface electromyography (sEMG)-based hand gesture recognition using a multilayer neural network. For this purpose, the sEMG signals have been amplified, filtered and sampled; then, the data have been segmented, feature extracted and classified for each gesture. To validate the method, 100 signals for three gestures, with 64 samples per signal, were recorded from 2 users with OYMotion sensors, and 100 signals for three gestures from 4 users with the MyWare sensors. These signals were used for feature extraction and classification using an artificial neural network. The model converges after 10 sessions, achieving 98% accuracy. As a result, an algorithm was developed that aimed to recognize two specific gestures (handling a bottle and pointing with the index finger) in real time with 95% accuracy.
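The segmentation and feature-extraction stage described above can be sketched as windowed time-domain features over a raw sEMG trace. The window length and the two features chosen (RMS and mean absolute value, both classic sEMG descriptors) are illustrative, not the paper's exact pipeline:

```python
import numpy as np

def emg_features(signal, window=64):
    """Split a 1-D sEMG trace into non-overlapping windows and compute
    two time-domain features per window: RMS and mean absolute value."""
    signal = np.asarray(signal, dtype=float)
    n = len(signal) // window
    segs = signal[: n * window].reshape(n, window)
    rms = np.sqrt((segs ** 2).mean(axis=1))
    mav = np.abs(segs).mean(axis=1)
    return np.column_stack([rms, mav])  # shape: (n_windows, 2)
```

Each row of the returned matrix would become one input vector to the neural network classifier, one per gesture segment.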
APA, Harvard, Vancouver, ISO, and other styles
30

Bhagwat, Prasad, Prashant Raut, Shubham Darade, Soham Gangurde, and Neha R. Hiray. "Hand Gesture Enabled Operation for Computer Using Deep Learning." INTERANTIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 08, no. 11 (2024): 1–7. http://dx.doi.org/10.55041/ijsrem38487.

Full text
Abstract:
This project aims to develop a system that detects hand gestures and prints corresponding instructions for computer operators. The system utilizes deep learning algorithms (a CNN) together with MediaPipe to recognize hand gestures, then prints instructions for the operator to perform specific computer operations, such as Notepad Open, Shutdown, Volume Up, and Volume Down. The system consists of a webcam, a deep learning model, and a printer. The webcam captures images of hand gestures, which are processed using the deep learning model to identify the specific gesture. The system then prints the corresponding instruction on the printer. The system is designed to be user-friendly and adaptable to different environments and hardware configurations, and it aims to improve efficiency, reduce errors, and enhance accessibility for computer operators. The proposed system consists of a hand gesture detection module, an instruction printing module, and a computer operation module. The system is trained using a dataset of hand gestures and corresponding computer operations. Experimental results show that the system achieves high accuracy in gesture recognition and instruction printing. The system has potential applications in various fields, including gaming, education, and healthcare. Key Words: Hand gesture detection, Deep learning, instruction printing, computer operation, accessibility, MediaPipe, CNN.
APA, Harvard, Vancouver, ISO, and other styles
31

Kumar, Sunil, Tanu Srivastava, and Raj Shree Singh. "Hand Gesture Recognition Using Principal Component Analysis." Asian Journal of Engineering and Applied Technology 6, no. 2 (2017): 32–35. http://dx.doi.org/10.51983/ajeat-2017.6.2.820.

Full text
Abstract:
Nowadays actions are increasingly being handled in electronic ways, instead of physical interaction. From earlier times biometrics has been used in the authentication of a person. It recognizes a person by using a human trait associated with it, like the eyes (by calculating the distance between the eyes), hand gestures, fingerprint detection, face detection, etc. The advantage of using these traits for identification is that they uniquely identify a person and cannot be forgotten or lost. These are unique features of a human being which are being used widely to make human life simpler. A hand gesture recognition system is a powerful tool that supports efficient interaction between the user and the computer. The main motive of hand gesture recognition research is to create a system which can recognise specific hand gestures and use them to convey useful information for device control. This paper presents an experimental study of the feasibility of principal component analysis in a hand gesture recognition system. PCA is a powerful tool for analyzing data, and its primary goal is dimensionality reduction. Frames are extracted from the Sheffield Kinect Gesture (SKIG) dataset. The implementation is done by creating a training set and then training the recognizer. It uses the eigenspace obtained by processing the eigenvalues and eigenvectors of the images in the training set. Euclidean distance with a threshold value is used as the similarity metric to recognize the gestures. The experimental results show that PCA is feasible for use in a hand gesture recognition system.
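The eigenspace approach above — project training frames onto principal components, then recognize by thresholded Euclidean distance — can be sketched with NumPy alone. The data below are toy vectors standing in for flattened SKIG frames:

```python
import numpy as np

def fit_eigenspace(train, k=2):
    """train: (n_images, n_pixels).  Return the mean image and the top-k
    principal axes, via SVD of the centred data (equivalent to the
    eigenvectors of the covariance matrix)."""
    mean = train.mean(axis=0)
    _, _, vt = np.linalg.svd(train - mean, full_matrices=False)
    return mean, vt[:k]

def recognize(sample, mean, axes, train_proj, labels, threshold):
    """Project a sample into the eigenspace and return the label of the
    nearest training projection, or None if it exceeds the threshold."""
    p = (sample - mean) @ axes.T
    d = np.linalg.norm(train_proj - p, axis=1)
    i = int(np.argmin(d))
    return labels[i] if d[i] <= threshold else None
```

Returning `None` when the nearest-neighbour distance exceeds the threshold mirrors the paper's use of a threshold on the Euclidean distance to reject unknown inputs.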
APA, Harvard, Vancouver, ISO, and other styles
32

Nadar, Daswini, Saista Anjum, and K. C. Sriharipriya. "Hand Gesture Recognition System based on 60 GHz FMCW Radar and Deep Neural Network." International Journal of Electrical and Electronics Research 11, no. 3 (2023): 760–65. http://dx.doi.org/10.37391/ijeer.110319.

Full text
Abstract:
The proposed study provides a novel technique for recognizing hand gestures that uses a combination of Deep Convolutional Neural Networks (DCNN) and 60 GHz Frequency Modulated Continuous Wave (FMCW) radar. The motion of a human hand is detected using the FMCW radar, and the various gestures are classified using the DCNN. Motion detection and frequency analysis are two techniques that the suggested system combines. The motion-detection capability of FMCW radars is based on recognizing the Doppler shift in the received signal brought on by the target's motion. To properly identify the hand motions, the presented technique combines these two techniques. The system is analyzed using a collection of hand gesture images, and the outcomes are compared with those of other existing hand gesture recognition systems. A dataset of five different hand gestures is used to examine the proposed system. According to the experimental data, the suggested system can recognize gestures with an accuracy of 96.5%, showing its potential as a productive gesture recognition system. Additionally, the suggested system has a processing time of 100 ms and can run in real time. The outcomes also demonstrate the proposed system's resistance to noise and its ability to recognize gestures in a variety of configurations. For gesture detection applications in virtual reality and augmented reality systems, this research offers a promising approach.
APA, Harvard, Vancouver, ISO, and other styles
33

Priyanka R., Prahanya Sriram, Jayasree L. N., and Angelin Gladston. "Shape-Based Features for Optimized Hand Gesture Recognition." International Journal of Artificial Intelligence and Machine Learning 11, no. 1 (2021): 23–38. http://dx.doi.org/10.4018/ijaiml.2021010103.

Full text
Abstract:
Gesture recognition is the most intuitive form of human-computer interface. Hand gestures provide a natural way for humans to interact with computers to perform a variety of different applications. However, factors such as the complexity of hand gesture structures, differences in hand size, hand posture, and environmental illumination can influence the performance of hand gesture recognition algorithms. Considering the above factors, this paper aims to present a real-time system for hand gesture recognition on the basis of detecting meaningful shape-based features such as orientation, center of mass, status of fingers and thumb (in terms of raised or folded fingers of the hand) and their respective locations in the image. The internet is growing at a very fast pace, and the use of web browsers is growing with it; everyone has two or three most frequently visited websites. Thus, in this paper, the effectiveness of the gesture recognition and its ability to control the browser via the recognized hand gestures are experimented with, and the results are analyzed.
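A minimal version of the "status of fingers" feature mentioned above — deciding raised versus folded from hand keypoints — can be sketched as follows. The 21-point index layout follows the common MediaPipe convention, and the landmarks in the test are synthetic; the paper itself computes this feature from segmented images rather than keypoints:

```python
# (fingertip, PIP-joint) index pairs in the common 21-landmark hand
# layout: index, middle, ring, pinky.  The thumb needs a different
# rule (its tip folds sideways) and is omitted in this sketch.
FINGERS = [(8, 6), (12, 10), (16, 14), (20, 18)]

def raised_fingers(landmarks):
    """landmarks: sequence of 21 (x, y) points in image coordinates,
    where y grows downwards.  A finger counts as raised when its tip
    lies above its PIP joint."""
    return [tip for tip, pip in FINGERS
            if landmarks[tip][1] < landmarks[pip][1]]
```

The length of the returned list (0 to 4 raised fingers) is already a usable discrete feature; combining it with orientation and center of mass gives the kind of shape-based descriptor the paper uses.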
APA, Harvard, Vancouver, ISO, and other styles
34

R N, Pushpa. "Sign Language Detection Using Deep Learning." International Journal for Research in Applied Science and Engineering Technology 12, no. 12 (2024): 565–71. https://doi.org/10.22214/ijraset.2024.65817.

Full text
Abstract:
Sign language recognition is an essential tool for bridging communication gaps between individuals with hearing or speech impairments and the broader community. This study introduces an advanced sign language recognition system leveraging computer vision and machine learning techniques. The system utilizes real-time hand tracking and gesture recognition to identify and classify hand gestures associated with common phrases such as "Hello," "I love you," and "Thank you." A two-step approach is implemented: first, a data collection module captures hand images using a robust preprocessing pipeline, ensuring uniformity in image size and quality; second, a classification module uses a trained deep learning model to accurately predict gestures in real-time. The framework integrates OpenCV for image processing, CVZone modules for hand detection, and TensorFlow for gesture classification. Extensive testing demonstrates the system's capability to process live video input, classify gestures accurately, and display corresponding labels seamlessly. This solution addresses challenges in gesture recognition, such as variable hand shapes and dynamic backgrounds, through efficient preprocessing and model training. By offering a scalable and efficient design, this work has the potential to contribute significantly to assistive technologies and accessible communication systems, paving the way for further advancements in human-computer interaction and inclusive technology.
APA, Harvard, Vancouver, ISO, and other styles
35

Journal, IJSREM. "Hand Gesture Recognition and Motion Detection System for Interactive Applications." INTERANTIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 07, no. 10 (2023): 1–11. http://dx.doi.org/10.55041/ijsrem26131.

Full text
Abstract:
Security systems and interactive multimedia applications are two examples of the many fields in which Gesture Sense (GS) technology, a significant development in human-computer interaction, has found use. This paper introduces the "Arduino Hand Gesture Identification and Motion Detection System," a novel combination of Python, Arduino Uno R3, and ultrasonic sensors intended to improve interactive apps through accurate gesture identification. Real-time tracking of hand movements is made possible by ultrasonic sensors, which are renowned for their fine-grained motion detection capabilities. Python allows the Arduino Uno R3 to run complex gesture detection algorithms. User engagement is improved by an LCD that presents recognized gestures and status updates, providing immediate visual feedback. The system's accuracy, responsiveness, and usability are demonstrated by rigorous testing, which shows that it typically recognizes gestures with an accuracy of 93%. In the context of interactive applications, this research demonstrates how GS technology has the potential to transform user experiences by ensuring seamless and immersive interactions. Key Words: Interactive application, Human-computer interaction, Hand gesture recognition, Motion detection.
APA, Harvard, Vancouver, ISO, and other styles
36

Anthoniraj, Dr S. "GestureSpeak & Real-Time Virtual Mouse Using Hand Gestures." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 09, no. 04 (2025): 1–9. https://doi.org/10.55041/ijsrem46121.

Full text
Abstract:
GestureSpeak presents a new method of virtual mouse control that enables real-time hand gestures to be used to interface with computers. GestureSpeak, which was created to increase accessibility and inclusion, uses machine learning and computer vision algorithms to identify and decipher hand gestures and convert them into virtual mouse operations. To improve communication for those who use sign language, the system also has a sign language interpreter that translates American Sign Language (ASL) movements into spoken words. GestureSpeak overcomes the drawbacks of physical input devices and conventional mouse systems by building upon standard gesture recognition techniques, offering users with physical disabilities a flexible and hands-free option. GestureSpeak aims to provide a smooth user experience in a variety of settings by optimizing gesture detection and performance. Keywords — Computer Vision, Accessibility, Sign Language Translation, Virtual Mouse, Real-time Gesture Detection and Human-Computer Interaction
APA, Harvard, Vancouver, ISO, and other styles
37

Bhandari R, Aishwarya. "Hand Gesture Recognition System using Thermal Images." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 09, no. 05 (2025): 1–9. https://doi.org/10.55041/ijsrem47014.

Full text
Abstract:
Hand gesture detection is a pivotal technology in advancing human-computer interaction, offering intuitive and touch-free control across various applications. This paper presents the development of a robust hand gesture detection system utilizing Convolutional Neural Networks (CNNs), leveraging their ability to automatically extract and learn spatial hierarchies of features from input images. The proposed system processes hand images and accurately classifies various gestures, addressing challenges associated with traditional recognition methods that require extensive feature engineering and complex pre-processing. The primary objective of this research is to enable seamless interaction between users and devices through hand gestures, enhancing accessibility in domains such as gaming, assistive technology, and virtual reality. A labeled dataset of hand gesture images is used to train and optimize the CNN model to achieve high accuracy and low latency in real-time predictions. The performance of the system is further enhanced by implementing efficient CNN architectures and optimizing the model for low-power devices, thereby expanding its practical applications. The proposed system demonstrates the potential to provide a more natural, flexible, and immersive interaction experience across diverse digital environments. Keywords— Machine learning, Deep learning, Hand Gesture, image processing, Convolutional neural networks, Thermal images
APA, Harvard, Vancouver, ISO, and other styles
38

V V, Nivedha, and Suganya K S. "Ml Based Hand Gesture Recognition." Indian Journal of Computer Science and Technology 2, no. 3 (2023): 01–07. http://dx.doi.org/10.59256/indjcst.20230203001.

Full text
Abstract:
The project focuses on gesture recognition using the MediaPipe Hands Lite framework, where a custom dataset of 10 gesture images is trained with the built-in MediaPipe Hands model, which contains two CNN models: a palm detection model running on a single-shot detection architecture and a hand landmark generator running on a regression model architecture. The dataset is successfully trained and tested with the proposed method, and an accuracy of 98 percent is obtained. Keywords: Gesture Recognition, MediaPipe, Hand Detection, CNN, Regression Model, Hand Landmark Detection, Machine Learning, Image Processing.
APA, Harvard, Vancouver, ISO, and other styles
39

Mayank, Anmol, and Asmika Jain. "SignNet- Hand Sign Detection." INTERANTIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 09, no. 04 (2025): 1–9. https://doi.org/10.55041/ijsrem44390.

Full text
Abstract:
Sign language is a critical communication tool for individuals with hearing and speech impairments, yet its limited understanding among the general population creates significant social and professional barriers. This research proposes a real-time hand sign detection system that leverages computer vision and deep learning to bridge this gap, enabling seamless interaction between sign language users and non-users. The system employs Convolutional Neural Networks (CNNs) for spatial feature extraction and Recurrent Neural Networks (RNNs) for temporal sequence modeling, achieving accurate recognition of hand gestures and their conversion into text or speech. By processing video input at 20–30 frames per second, the system ensures efficient real-time performance suitable for everyday use. Preliminary evaluations suggest an accuracy of 85–95% on a vocabulary of 100–200 signs, with potential scalability to larger datasets. This solution has wide-ranging applications, including education, healthcare, and customer service, fostering inclusivity and accessibility. By addressing the challenges of gesture variability and processing latency, this work advances the development of automated sign language interpretation, paving the way for more equitable communication in diverse settings. Keywords— Sign Language Recognition, Convolutional Neural Networks (CNNs), Hand Gesture Detection, Recurrent Neural Networks (RNNs), Deep Learning, Real-Time Processing.
APA, Harvard, Vancouver, ISO, and other styles
40

Singh, Koustub. "Presentation and Multimedia Control Using Hand Gestures." International Journal for Research in Applied Science and Engineering Technology 13, no. 4 (2025): 1055–62. https://doi.org/10.22214/ijraset.2025.68449.

Full text
Abstract:
This paper introduces a gesture-based control system designed to empower users to seamlessly manage presentations and multimedia applications through intuitive hand gestures. By integrating robust computer vision techniques capable of real-time gesture detection, this approach enables users to execute essential actions, such as slide transitions and media playback, using simple hand movements captured by a standard webcam. The system employs hand tracking and gesture recognition modules that identify specific gestures mapped to commands in both Presentation and Multimedia modes, providing flexibility and ease of control without the need for physical interaction. This solution is compatible with existing presentation and multimedia tools, offering users a cohesive, hands-free experience. To maximize usability and performance, the system follows established software engineering practices, ensuring a streamlined interface and an efficient, maintainable code structure. This paper provides a comprehensive overview of the design, implementation, and potential applications of this gesture-controlled system, underscoring its effectiveness in enhancing accessibility and user experience across a range of digital interaction contexts.
APA, Harvard, Vancouver, ISO, and other styles
41

Harale, A. D., K. J. Karande, Sagar S. Bhumkar, Sanjay T. Gaikwad, and Snehal S. Kumbhar. "Wireless Hand Geture Control Robot with Object Detection." June-July 2023, no. 34 (July 13, 2023): 1–10. http://dx.doi.org/10.55529/jipirs.34.1.10.

Full text
Abstract:
This study uses image processing and Internet of Things (IoT) technology to demonstrate a wireless hand gesture control robot with object identification. Users can remotely operate a robot using hand gestures that are photographed by a camera and analyzed using machine learning algorithms and image processing methods for gesture identification. The system has object detection capabilities as well, allowing the robot to find and recognize items in its environment. In order to allow users to construct their own gestures and instructions, the suggested system is made to be adaptive and versatile. Additionally, it incorporates Internet of Things (IoT) technology, enabling remote control and supervision of the robot via a web-based interface or a mobile application. The system offers a distinct and user-friendly interface and has potential uses in home automation, surveillance, and remote exploration.
APA, Harvard, Vancouver, ISO, and other styles
42

Raheja, J. L., Ankit Chaudhary, and Shobhit Maheshwari. "Hand gesture pointing location detection." Optik 125, no. 3 (2014): 993–96. http://dx.doi.org/10.1016/j.ijleo.2013.07.167.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

MOHAMMED, Adam Ahmed Qaid, Jiancheng Lv, and MD Sajjatul Islam. "A Deep Learning-Based End-to-End Composite System for Hand Detection and Gesture Recognition." Sensors 19, no. 23 (2019): 5282. http://dx.doi.org/10.3390/s19235282.

Full text
Abstract:
Recent research on hand detection and gesture recognition has attracted increasing interest due to its broad range of potential applications, such as human-computer interaction, sign language recognition, hand action analysis, driver hand behavior monitoring, and virtual reality. In recent years, several approaches have been proposed with the aim of developing a robust algorithm which functions in complex and cluttered environments. Although several researchers have addressed this challenging problem, a robust system is still elusive. Therefore, we propose a deep learning-based architecture to jointly detect and classify hand gestures. In the proposed architecture, the whole image is passed through a one-stage dense object detector to extract hand regions, which, in turn, pass through a lightweight convolutional neural network (CNN) for hand gesture recognition. To evaluate our approach, we conducted extensive experiments on four publicly available datasets for hand detection, including the Oxford, 5-signers, EgoHands, and Indian classical dance (ICD) datasets, along with two hand gesture datasets with different gesture vocabularies for hand gesture recognition, namely, the LaRED and TinyHands datasets. Here, experimental results demonstrate that the proposed architecture is efficient and robust. In addition, it outperforms other approaches in both the hand detection and gesture classification tasks.
APA, Harvard, Vancouver, ISO, and other styles
44

Wei, Haoyu, Pengfei Li, Kai Tang, Wei Wang, and Xi Chen. "Alternating Electric Field-Based Static Gesture-Recognition Technology." Sensors 19, no. 10 (2019): 2375. http://dx.doi.org/10.3390/s19102375.

Full text
Abstract:
Currently, gesture recognition based on electric-field detection technology has received extensive attention, which is mostly used to recognize the position and the movement of the hand, and rarely used for identification of specific gestures. A non-contact gesture-recognition technology based on the alternating electric-field detection scheme is proposed, which can recognize static gestures in different states and dynamic gestures. The influence of the hand on the detection system is analyzed from the principle of electric-field detection. A simulation model of the system is established to investigate the charge density on the hand surface and the potential change of the sensing electrodes. According to the simulation results, the system structure is improved, and the signal-processing circuit is designed to collect the signal of sensing electrodes. By collecting a large amount of data from different operators, the tree-model recognition algorithm is designed and a gesture-recognition experiment is implemented. The results show that the gesture-recognition correct rate is over 90%. With advantages of high response speed, low cost, small volume, and immunity to the surrounding environment, the system could be assembled on a robot that communicates with operators.
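The tree-model recognition step in this abstract can be pictured as a small decision tree over the sensing-electrode potentials. The thresholds, electrode layout, and gesture names below are invented for illustration; the paper designs its tree from data collected across many operators.

```python
def classify_static_gesture(e_left: float, e_right: float, e_center: float) -> str:
    """Toy decision tree over three normalized sensing-electrode potentials
    (0..1). Thresholds and gesture names are illustrative assumptions."""
    if e_center < 0.2:              # weak coupling: no hand over the array
        return "no_hand"
    if e_left > e_right + 0.3:      # field disturbance skewed toward the left electrode
        return "tilt_left"
    if e_right > e_left + 0.3:
        return "tilt_right"
    return "flat_palm"

print(classify_static_gesture(0.8, 0.3, 0.6))  # → tilt_left
print(classify_static_gesture(0.1, 0.1, 0.1))  # → no_hand
```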
APA, Harvard, Vancouver, ISO, and other styles
45

Hendra, Yomei, Putri Sakinah, Fajar Maulana, and Kiki Hariani Manurung. "Integrasi Model Pembelajaran Mesin dalam Game Menggunakan Gerakan Tangan" [Integration of a Machine Learning Model in a Game Using Hand Gestures]. INFORMATIKA 12, no. 3 (2024): 617–25. https://doi.org/10.36987/informatika.v12i3.6826.

Full text
Abstract:
This study develops a Tetris game controlled through hand gestures using a machine learning model. The primary objective of this research is to create an interactive and responsive gaming experience by utilizing hand gesture detection as the main control mechanism. A hand gesture dataset was collected from videos segmented into individual frames, which were then analyzed using MediaPipe to detect and label gestures. The machine learning model employs a Convolutional Neural Network (CNN) trained to recognize hand gesture patterns and translate them into commands within the game. After implementation, an evaluation was conducted by distributing questionnaires to 18 Informatics students at Adzkia University to assess the system's comfort and responsiveness. The questionnaire results showed a high satisfaction level, with an average score of 84.56, covering evaluations of control ease, gesture detection accuracy, and system responsiveness. The average score for ease of use reached 85, indicating that the majority of users found the gesture-based controls comfortable. This study demonstrates that applying machine learning models in gesture-based control games can provide a more interactive and responsive experience, with potential applications in other interactive technologies.
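Turning per-frame gesture labels into game commands, as this Tetris system must, typically needs a debounce step so single-frame detector jitter does not trigger spurious moves. The gesture labels, command names, and repeat threshold below are hypothetical illustrations, not taken from the paper.

```python
GESTURE_TO_COMMAND = {  # hypothetical gesture label -> Tetris command mapping
    "swipe_left": "move_left",
    "swipe_right": "move_right",
    "fist": "rotate",
    "open_palm": "drop",
}

def frames_to_commands(frame_labels, min_repeat=2):
    """Collapse per-frame gesture labels into game commands, requiring a
    label to persist for min_repeat consecutive frames before it counts."""
    commands, prev, run = [], None, 0
    for label in frame_labels:
        run = run + 1 if label == prev else 1
        prev = label
        if run == min_repeat and label in GESTURE_TO_COMMAND:
            commands.append(GESTURE_TO_COMMAND[label])
    return commands

print(frames_to_commands(["fist", "fist", "none", "swipe_left", "swipe_left", "swipe_left"]))
# → ['rotate', 'move_left']
```

Each command fires exactly once per sustained gesture, which keeps the game responsive without double-triggering.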
APA, Harvard, Vancouver, ISO, and other styles
46

Wong, Sai-Keung, Kai-Min Chen, and Ting-Yu Chen. "Interactive Sand Art Drawing Using RGB-D Sensor." International Journal of Software Engineering and Knowledge Engineering 28, no. 05 (2018): 643–61. http://dx.doi.org/10.1142/s0218194018500183.

Full text
Abstract:
We present an interactive system using one RGB-D sensor, which allows a user to use bare hands to perform sand drawing. Our system supports the common sand drawing functions, such as sand erosion, sand spilling, and sand leaking. To use hands to manipulate the virtual sand, we design four key hand gestures. The idea is that the gesture of one hand controls the drawing actions. The motion and gesture of the other hand control the drawing positions. There are three major steps. First, our system adopts a vision-based bare-hand detection method which computes the hand position and recognizes the hand gestures. Second, the drawing positions and the drawing actions are sent to a sand drawing subsystem. Finally, the subsystem performs the sand drawing actions. Experimental results show that our system enables users to draw a rich variety of sand pictures.
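The two-hand split this abstract describes, one hand's gesture selecting the drawing action while the other hand's position drives the stroke, can be sketched as a single combining step. The gesture and action names below are hypothetical, not the four gestures the paper designs.

```python
# Hypothetical action-hand gesture vocabulary for the sand drawing functions
ACTIONS = {"pinch": "erode", "spread": "spill", "point": "leak", "fist": "none"}

def drawing_step(action_gesture: str, position):
    """Combine the action-hand gesture with the position-hand location
    into one drawing command, or None when no action is selected."""
    action = ACTIONS.get(action_gesture, "none")
    return None if action == "none" else (action, position)

print(drawing_step("pinch", (120, 80)))  # → ('erode', (120, 80))
print(drawing_step("fist", (120, 80)))   # → None
```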
APA, Harvard, Vancouver, ISO, and other styles
47

Deo, Aditi, Aishwarya Wankhede, Rutuja Asawa, and Supriya Lohar. "MEMS Accelerometer Based Hand Gesture-Controlled Robot." International Journal for Research in Applied Science and Engineering Technology 10, no. 8 (2022): 265–67. http://dx.doi.org/10.22214/ijraset.2022.46158.

Full text
Abstract:
Abstract: Gesture detection has received considerable attention from many research communities, including human-computer interaction and image processing. User interface technology has become increasingly significant as the number of human-machine interactions in our daily lives has increased. Gestures, as intuitive expressions, substantially simplify the interaction process and allow humans to command computers and machines more naturally. Robots can currently be controlled by a remote control, a mobile phone, or a direct wired connection; considering cost and required hardware, all of these add complexity, particularly for low-level applications. The MEMS accelerometer-based gesture-controlled robot aims to create a robot that can be commanded wirelessly: the user controls the motions of the robot by wearing the controller glove and performing predefined gestures.
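The core of such a glove controller is mapping accelerometer tilt readings to motion commands. A minimal sketch follows; the axis orientation, dead zone, and command names are assumptions for illustration, not taken from the paper.

```python
def tilt_to_command(ax: float, ay: float, dead_zone: float = 0.3) -> str:
    """Map MEMS accelerometer tilt readings (in g, on the glove's x/y axes)
    to a robot motion command. The dead zone keeps the robot stopped for
    small, unintentional hand tilts."""
    if abs(ax) < dead_zone and abs(ay) < dead_zone:
        return "stop"
    if abs(ay) >= abs(ax):          # the dominant tilt axis wins
        return "forward" if ay > 0 else "backward"
    return "right" if ax > 0 else "left"

print(tilt_to_command(0.1, 0.8))   # → forward
print(tilt_to_command(-0.7, 0.2))  # → left
print(tilt_to_command(0.1, -0.1))  # → stop
```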
APA, Harvard, Vancouver, ISO, and other styles
48

Patil, Chetana D., Amrita Sonare, Aliasgar Husain, Aniket Jha, and Ajay Phirke. "Controlled Hand Gestures using Python and OpenCV." International Journal for Research in Applied Science and Engineering Technology 11, no. 5 (2023): 2973–77. http://dx.doi.org/10.22214/ijraset.2023.52285.

Full text
Abstract:
Abstract: Due to its communicative nature, gesture recognition has been used in recent trends to develop machines. Gestures are a form of nonverbal communication that allows humans and computers to communicate with one another. Artificial intelligence makes heavy use of hand gesture detection to improve functionality and human-computer interaction. Here, to carry out particular tasks, we used mapped action pairs and Python libraries (OpenCV, cvzone) that aid in image capture, pre-processing, and detection.
APA, Harvard, Vancouver, ISO, and other styles
49

Mambou, Krejcar, Maresova, Selamat, and Kuca. "Novel Hand Gesture Alert System." Applied Sciences 9, no. 16 (2019): 3419. http://dx.doi.org/10.3390/app9163419.

Full text
Abstract:
Sexual assault can cause great societal damage, with negative socio-economic, mental, sexual, physical, and reproductive consequences. According to Eurostat, the number of such crimes in the European Union increased between 2008 and 2016. However, despite the increase in security tools such as cameras, it is usually difficult to tell from posture alone whether an individual is being assaulted. Hand gestures are seen by many as a natural means of nonverbal communication when interacting with a computer, and a considerable amount of research has been performed in this area. In addition, the identifiable hand placement characteristics provided by modern inexpensive commercial depth cameras can be used in a variety of gesture-recognition-based systems, particularly for human-machine interactions. This paper introduces a novel gesture alert system that uses a combination of Convolutional Neural Networks (CNNs). The overall system can be subdivided into three main parts: firstly, human detection in the image using a pretrained "You Only Look Once" (YOLO) model, which extracts the bounding boxes containing the person's hands; secondly, the gesture detection/classification stage, which processes the bounding-box images; and thirdly, a module called "counterGesture", which triggers the alert.
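The abstract's "counterGesture" module decides when classified gestures should actually raise an alert. One plausible shape for that decision, sketched here with invented gesture names and thresholds (the paper does not specify its internals), is to require the alert gesture in several of the most recent frames so a single misclassification cannot trigger a false alarm:

```python
from collections import deque

def alert_frame(gesture_stream, alert_gesture="help_sign", window=5, needed=3):
    """Toy 'counterGesture' logic: fire once the alert gesture has been
    seen in at least `needed` of the last `window` frames. Returns the
    frame index at which the alert fires, or -1 if it never does."""
    recent = deque(maxlen=window)
    for i, gesture in enumerate(gesture_stream):
        recent.append(gesture == alert_gesture)
        if sum(recent) >= needed:
            return i
    return -1

print(alert_frame(["none", "help_sign", "help_sign", "none", "help_sign"]))  # → 4
print(alert_frame(["none"] * 10))                                            # → -1
```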
APA, Harvard, Vancouver, ISO, and other styles
50

Liu, Chang, and Tamás Szirányi. "Real-Time Human Detection and Gesture Recognition for On-Board UAV Rescue." Sensors 21, no. 6 (2021): 2180. http://dx.doi.org/10.3390/s21062180.

Full text
Abstract:
Unmanned aerial vehicles (UAVs) play an important role in numerous technical and scientific fields, especially in wilderness rescue. This paper carries out work on real-time UAV human detection and recognition of body and hand rescue gestures. We use body-featuring solutions to establish biometric communication, such as yolo3-tiny for human detection. When the presence of a person is detected, the system enters the gesture recognition phase, where the user and the drone can communicate briefly and effectively, avoiding the drawbacks of speech communication. A dataset of ten body rescue gestures (Kick, Punch, Squat, Stand, Attention, Cancel, Walk, Sit, Direction, and PhoneCall) was created with the UAV's on-board camera. The two most important gestures are the novel dynamic Attention and Cancel, which represent the set and reset functions, respectively. When the body rescue gesture is recognized as Attention, the drone gradually approaches the user at a higher resolution for hand gesture recognition. Using deep learning, the system achieves 99.80% accuracy on the test data of the body gesture dataset and 94.71% accuracy on the test data of the hand gesture dataset. Experiments conducted on real-time UAV cameras confirm that the solution achieves the expected UAV rescue purpose.
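The two-phase interaction this abstract describes behaves like a small state machine: wide-area body gesture recognition until Attention is seen, then close-range hand gesture recognition until Cancel resets it. The gesture names follow the abstract, but the class and action names below are invented for illustration.

```python
class RescuePhases:
    """Sketch of the set/reset interaction: 'Attention' switches the drone
    to the close-range hand gesture phase, 'Cancel' switches it back."""
    def __init__(self):
        self.phase = "body"

    def observe(self, gesture: str) -> str:
        if self.phase == "body" and gesture == "Attention":
            self.phase = "hand"
            return "approach_user"   # move closer for higher-resolution hand images
        if self.phase == "hand" and gesture == "Cancel":
            self.phase = "body"
            return "reset"           # back to wide-area body detection
        return "hold"

drone = RescuePhases()
print([drone.observe(g) for g in ["Walk", "Attention", "Direction", "Cancel"]])
# → ['hold', 'approach_user', 'hold', 'reset']
```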
APA, Harvard, Vancouver, ISO, and other styles