To see the other types of publications on this topic, follow the link: Gestures recognition system human motions.

Journal articles on the topic 'Gestures recognition system human motions'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 journal articles for your research on the topic 'Gestures recognition system human motions.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Liang, Xiubo, Zhen Wang, Weidong Geng, and Franck Multon. "A Motion-based User Interface for the Control of Virtual Humans Performing Sports." International Journal of Virtual Reality 10, no. 3 (2011): 1–8. http://dx.doi.org/10.20870/ijvr.2011.10.3.2815.

Full text
Abstract:
Traditional human-computer interfaces are not intuitive and natural for the choreography of human motions in the field of VR and video games. In this paper we present a novel approach to control virtual humans performing sports with a motion-based user interface. The process begins by asking the user to draw some gestures in the air with a Wii Remote. The system then recognizes the gestures with pre-trained hidden Markov models. Finally, the recognized gestures are employed to choreograph the simulated sport motions of a virtual human. The average recognition rate of the recognition algorithm
APA, Harvard, Vancouver, ISO, and other styles
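The HMM classification step described in the abstract above can be illustrated with a toy sketch (not the paper's implementation): each gesture gets a small discrete hidden Markov model, and a candidate observation sequence is assigned to whichever model yields the highest forward-algorithm likelihood. The gesture names and all model parameters below are invented for illustration; a real system would train them (e.g. with Baum-Welch) on recorded Wii Remote motions.

```python
import math

def forward_log_prob(obs, start, trans, emit):
    """Log-likelihood of a discrete observation sequence under an HMM
    (forward algorithm)."""
    n = len(start)
    # initialise with the first observation
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[p] * trans[p][s] for p in range(n)) * emit[s][o]
                 for s in range(n)]
    return math.log(sum(alpha))

def classify(obs, models):
    """Return the gesture whose HMM explains the sequence best."""
    return max(models, key=lambda g: forward_log_prob(obs, *models[g]))

# Toy two-state, two-symbol models for two invented gestures.
MODELS = {
    "circle": ([0.9, 0.1], [[0.7, 0.3], [0.3, 0.7]], [[0.9, 0.1], [0.1, 0.9]]),
    "swipe":  ([0.9, 0.1], [[0.7, 0.3], [0.3, 0.7]], [[0.1, 0.9], [0.9, 0.1]]),
}
```

A sequence dominated by symbol 0 is scored higher by the "circle" model, and one dominated by symbol 1 by the "swipe" model.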
2

R. Vasavi, Rahul Nenavath, Snigdha A., Jeffery Moses K., and Simha S. Vishal. "Painting with Hand Gestures using MediaPipe." International Journal of Innovative Science and Research Technology 7, no. 12 (2023): 1285–91. https://doi.org/10.5281/zenodo.7514430.

Full text
Abstract:
The main objective of this project is that hand gesture recognition can also be utilised in applications including industrial automation control, sign language interpretation, and rehabilitation equipment for individuals with physical disabilities of the upper extremities. It also finds applications in varied domains such as virtual environments, medical systems, and smart surveillance. Hand gesture recognition is highly significant for human-computer interaction. Gesture recognition is a technique that uses mathematical algorithms to recognise human gestures. Gesture recognition identifi
APA, Harvard, Vancouver, ISO, and other styles
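As a hedged illustration of the kind of rule such MediaPipe-based painting systems commonly use (not taken from this paper), a "pen down" state can be derived from the normalised distance between the thumb-tip and index-tip landmarks; the threshold value below is invented for illustration.

```python
def is_drawing(thumb_tip, index_tip, pinch_threshold=0.05):
    """Pen-down when the thumb and index fingertips pinch together.
    Landmarks are (x, y) pairs in normalised image coordinates."""
    d = sum((a - b) ** 2 for a, b in zip(thumb_tip, index_tip)) ** 0.5
    return d < pinch_threshold
```

In a full pipeline, the fingertip coordinates would come from the hand-landmark tracker each frame, and the index-tip position would drive the brush while `is_drawing` is true.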
3

Li, Jiabiao, Ruiheng Liu, Tianyu Zhang, and Jianbin Liu. "A Symmetrical Leech-Inspired Soft Crawling Robot Based on Gesture Control." Biomimetics 10, no. 1 (2025): 35. https://doi.org/10.3390/biomimetics10010035.

Full text
Abstract:
This paper presents a novel soft crawling robot controlled by gesture recognition, aimed at enhancing the operability and adaptability of soft robots through natural human–computer interactions. The Leap Motion sensor is employed to capture hand gesture data, and Unreal Engine is used for gesture recognition. Using the UE4Duino, gesture semantics are transmitted to an Arduino control system, enabling direct control over the robot’s movements. For accurate and real-time gesture recognition, we propose a threshold-based method for static gestures and a backpropagation (BP) neural network model f
APA, Harvard, Vancouver, ISO, and other styles
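The threshold-based method for static gestures mentioned in the abstract above can be sketched as a lookup over which fingers are extended. The finger-extension feature (normalised fingertip-to-palm distance), the threshold, and the gesture names are all invented for illustration; they are not the paper's actual values.

```python
# Distance above which a finger counts as extended (illustrative value).
EXTENDED = 0.8

# Map from (thumb, index, middle, ring, pinky) extension pattern to gesture.
GESTURES = {
    (True, True, True, True, True): "open palm",
    (False, False, False, False, False): "fist",
    (False, True, True, False, False): "victory",
}

def classify_static(finger_distances, threshold=EXTENDED):
    """Map five fingertip-to-palm distances to a named static gesture."""
    key = tuple(d >= threshold for d in finger_distances)
    return GESTURES.get(key, "unknown")
```

Dynamic gestures, by contrast, need temporal models (the paper uses a BP neural network), since a single frame's finger pattern no longer identifies them.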
4

Valarezo Añazco, Edwin, Seung Ju Han, Kangil Kim, Patricio Rivera Lopez, Tae-Seong Kim, and Sangmin Lee. "Hand Gesture Recognition Using Single Patchable Six-Axis Inertial Measurement Unit via Recurrent Neural Networks." Sensors 21, no. 4 (2021): 1404. http://dx.doi.org/10.3390/s21041404.

Full text
Abstract:
Recording human gestures from a wearable sensor produces valuable information to implement control gestures or in healthcare services. The wearable sensor is required to be small and easily worn. Advances in miniaturized sensor and materials research produces patchable inertial measurement units (IMUs). In this paper, a hand gesture recognition system using a single patchable six-axis IMU attached at the wrist via recurrent neural networks (RNN) is presented. The IMU comprises IC-based electronic components on a stretchable, adhesive substrate with serpentine-structured interconnections. The p
APA, Harvard, Vancouver, ISO, and other styles
5

Tabassum, Shaheen, and Raghavendra R. "Sign Language Recognition and Converting into Text." International Journal for Research in Applied Science and Engineering Technology 10, no. 4 (2022): 1385–90. http://dx.doi.org/10.22214/ijraset.2022.41266.

Full text
Abstract:
Sign language is a mode of communication that uses a variety of hand movements and actions to convey a message. Deciphering these motions can be framed as a pattern recognition challenge. People use a range of gestures and behaviours to communicate with one another. This study presents a system for a human-computer interface that can identify American Sign Language gestures and produce textual output that reflects the meaning of each gesture. To identify and learn gestures, the proposed system would employ convolutional neural networks and long short-term memory networks. This will help to break dow
APA, Harvard, Vancouver, ISO, and other styles
6

Kodulkar, Prof Rajeshwari J. "Deep Neural Networks for Human Action Recognition." International Journal for Research in Applied Science and Engineering Technology 9, no. 9 (2021): 2141–44. http://dx.doi.org/10.22214/ijraset.2021.38206.

Full text
Abstract:
In deep neural networks, human action detection is one of the most demanding and complex tasks. Human gesture recognition is the same as human action recognition. A gesture is defined as a series of bodily motions that communicate a message. Gestures are a more natural and preferable way for humans to engage with computers, thereby bridging the gap between humans and robots. Human action recognition is the finest communication platform for the deaf and hard of hearing. We propose in this work to create a system for hand gesture identification that recognizes hand movements, hand characteristics s
APA, Harvard, Vancouver, ISO, and other styles
7

Mira, Anwar, and Olaf Hellwich. "Building an Integrative System: Enriching Gesture Language Video Recognition through the Multi-Stream of Hybrid and Improved Deep Learning Models with an Adaptive Decision System." Inteligencia Artificial 27, no. 74 (2024): 181–213. http://dx.doi.org/10.4114/intartif.vol27iss74pp181-213.

Full text
Abstract:
The recognition of hand gestures is of growing importance in developing human-machine interfaces that rely on hand motions for communication. However, recognizing hand gesture motions poses challenges due to overlapping gestures from different categories that share similar hand poses. Temporal information has proven to be more effective in distinguishing sequences of hand gestures. To address these challenges, this research presents an innovative adaptive decision-making system that aims to enhance gesture recognition within the same category. The system capitalizes on
APA, Harvard, Vancouver, ISO, and other styles
8

Reddy, Cherukupally Karunakar, Suraj Janjirala, and Kevulothu Bhanu Prakash. "Gesture Controlled Virtual Mouse with the Support of Voice Assistant." International Journal for Research in Applied Science and Engineering Technology 10, no. 6 (2022): 2314–20. http://dx.doi.org/10.22214/ijraset.2022.44323.

Full text
Abstract:
This work offers a cursor control system that utilises a web cam to capture human movements and a voice assistant to quickly traverse system controls. Using MediaPipe, the system lets the user navigate the computer cursor with hand motions. It uses various hand motions to conduct activities such as left-clicking and dragging. It also allows the user to select multiple items and to adjust the volume and brightness. Advanced Python libraries such as MediaPipe and OpenCV are used to build the system. A hand gesture and a voice assistant are used to physically control all i/o oper
APA, Harvard, Vancouver, ISO, and other styles
9

Kulkarni, Padmaja Vivek, Boris Illing, Bastian Gaspers, Bernd Brüggemann, and Dirk Schulz. "Mobile manipulator control through gesture recognition using IMUs and Online Lazy Neighborhood Graph search." ACTA IMEKO 8, no. 4 (2019): 3. http://dx.doi.org/10.21014/acta_imeko.v8i4.677.

Full text
Abstract:
Gesture-based control potentially eliminates the need for wearisome physical controls and facilitates easy interaction between a human and a robot. At the same time, it is intuitive and enables a natural means of control. In this paper, we present and evaluate a framework for gesture recognition using four wearable Inertial Measurement Units (IMUs) to indirectly control a mobile robot. Six gestures involving different hand and arm motions are defined. A novel algorithm based on an Online Lazy Neighborhood Graph (OLNG) search is used to recognise and classify the gestures online. A software fra
APA, Harvard, Vancouver, ISO, and other styles
10

Aarthy, S. L., V. Malathi, Monia Hamdi, Inès Hilali-Jaghdam, Sayed Abdel-Khalek, and Romany F. Mansour. "Recognition of Hand Gesture Using Electromyography Signal: Human-Robot Interaction." Journal of Sensors 2022 (July 11, 2022): 1–9. http://dx.doi.org/10.1155/2022/4718684.

Full text
Abstract:
Recognition of hand gestures has been developed in various research domains and has proven to have significant benefits in human-robot interaction (HRI). The introduction of intelligent statistical knowledge methodologies, such as big data and machine learning, has ushered in a new era of data science and made it easier to classify hand motions accurately using electromyography (EMG) signals. However, collecting and labelling a vast dataset imposes a significant workload, so implementations take a long time. As a result, a unique strategy for combinin
APA, Harvard, Vancouver, ISO, and other styles
11

Journal, IJSREM. "A IMPLEMENTATION ON IMPLEMTATION OF VISION GLIDE TECHNIQUE FOR SMOOTH NAVIGATION WITH CAMERA BASED VIRTUAL MOUSE." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 07, no. 11 (2023): 1–11. http://dx.doi.org/10.55041/ijsrem26650.

Full text
Abstract:
As artificial intelligence technology has advanced, it has become commonplace to employ hand gesture detection to control virtual objects. The suggested system in this research is a hand gesture-controlled virtual mouse that uses AI algorithms to recognize hand gestures and transform them into mouse movements. People who have trouble using a conventional mouse or keyboard can use the system to provide an alternate interface. The suggested method takes pictures of the user's hand with a camera, which an AI program then utilizes to identify the motions the user is making. Since the development o
APA, Harvard, Vancouver, ISO, and other styles
12

Choudhari, Prof Y. D. "A REVIEW ON IMPLEMTATION OF SIGN LANGUAGE TRANSLATOR." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 07, no. 11 (2023): 1–11. http://dx.doi.org/10.55041/ijsrem27384.

Full text
Abstract:
As artificial intelligence technology has advanced, it has become commonplace to employ gesture detection to control virtual objects. The suggested system in this research is a hand gesture-controlled virtual mouse that uses AI algorithms to recognize hand gestures and transform them into mouse movements. People who have trouble using a conventional mouse or keyboard can use the system to provide an alternate interface. The suggested method takes pictures of the user's hand with a camera, which an AI program then utilizes to identify the motions the user is making. Since the development of
APA, Harvard, Vancouver, ISO, and other styles
13

Newby, Gregory B. "Gesture Recognition Based upon Statistical Similarity." Presence: Teleoperators and Virtual Environments 3, no. 3 (1994): 236–43. http://dx.doi.org/10.1162/pres.1994.3.3.236.

Full text
Abstract:
One of the improvements virtual reality offers traditional human-computer interfaces is that it enables the user to interact with virtual objects using gestures. The use of natural hand gestures for computer input provides opportunities for direct manipulation in computing environments, but not without some challenges. The mapping of a human gesture onto a particular system function is not nearly so easy as mapping with a keyboard or mouse. Reasons for this difficulty include individual variations in the exact gesture movement, the problem of knowing when a gesture starts and ends, and variati
APA, Harvard, Vancouver, ISO, and other styles
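Recognition by statistical similarity, in the spirit of the abstract above, can be sketched as a nearest-template classifier over gesture feature vectors: the candidate gesture is assigned to the stored template it is closest to. This is a simplified stand-in, not Newby's actual method, and the templates below are invented.

```python
import math

def nearest_template(sample, templates):
    """Return the name of the template feature vector closest (Euclidean
    distance) to the sample gesture's features."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda name: dist(sample, templates[name]))

# Hypothetical mean feature vectors for two stored gestures.
TEMPLATES = {"point": [1.0, 0.0], "wave": [0.0, 1.0]}
```

Individual variation in how a gesture is performed shows up here as distance from the template mean, which is exactly the difficulty the abstract highlights.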
14

Xiao, Yang, Zhijun Zhang, Aryel Beck, Junsong Yuan, and Daniel Thalmann. "Human–Robot Interaction by Understanding Upper Body Gestures." Presence: Teleoperators and Virtual Environments 23, no. 2 (2014): 133–54. http://dx.doi.org/10.1162/pres_a_00176.

Full text
Abstract:
In this paper, a human–robot interaction system based on a novel combination of sensors is proposed. It allows one person to interact with a humanoid social robot using natural body language. The robot understands the meaning of human upper body gestures and expresses itself by using a combination of body movements, facial expressions, and verbal language. A set of 12 upper body gestures is involved for communication. This set also includes gestures with human–object interactions. The gestures are characterized by head, arm, and hand posture information. The wearable Immersion CyberGlove II is
APA, Harvard, Vancouver, ISO, and other styles
15

SHAIK, Dr ABDUL NABI, E. SAI PRIYA, G. NIKITHA, K. PRACHEEN KUMAR, and N. SHRAVYA SHREE. "CONTROLLING VIDEOLAN CLIENT MEDIA USING LUCAS KANADE OPTIMAL FLOW ALGORITHM AND OPENCV." YMER Digital 21, no. 05 (2022): 246–55. http://dx.doi.org/10.37896/ymer21.05/29.

Full text
Abstract:
In this project we discuss a system which uses dynamic hand gesture recognition techniques to control media players such as the VLC media player. The project consists of modules which segment the foreground part of the frame using skin-colour detection and the approximate median technique. The recognition of hand gestures is done by building a decision tree that uses certain features extracted from the segmented part. This hand gesture recognition technique introduces a new, natural way to interact with computers and is beneficial to many of us in our day-to-day lif
APA, Harvard, Vancouver, ISO, and other styles
16

Zheng, Zepei. "Human Gesture Recognition in Computer Vision Research." SHS Web of Conferences 144 (2022): 03011. http://dx.doi.org/10.1051/shsconf/202214403011.

Full text
Abstract:
Human gesture recognition is a popular issue in computer vision studies, since it provides the technological expertise required to advance the interaction between people and computers, virtual environments, smart surveillance, motion tracking, and other domains. Extraction of the human skeleton is a rather typical gesture recognition approach using existing technologies based on two-dimensional human gesture detection. Likewise, it cannot be overlooked that objects in the surrounding environment give some information about human gestures. To semantically recognize the posture of the
APA, Harvard, Vancouver, ISO, and other styles
17

Lei, Wentai, Xinyue Jiang, Long Xu, Jiabin Luo, Mengdi Xu, and Feifei Hou. "Continuous Gesture Recognition Based on Time Sequence Fusion Using MIMO Radar Sensor and Deep Learning." Electronics 9, no. 5 (2020): 869. http://dx.doi.org/10.3390/electronics9050869.

Full text
Abstract:
Gesture recognition based on high-resolution radar has progressively developed in the human-computer interaction field. In a radar recognition-based system, it is challenging to recognize various gesture types because of the lack of transversal gesture features. In this paper, we propose an integrated gesture recognition system based on frequency-modulated continuous-wave MIMO radar combined with a deep learning network for gesture recognition. First, a pre-processing algorithm, which consists of the windowed fast Fourier transform and the intermediate-frequency signal band-pass-fi
APA, Harvard, Vancouver, ISO, and other styles
18

Chen, Shuai, Yoichiro Maeda, and Yasutake Takahashi. "Chaotic Music Generation System Using Music Conductor Gesture." Journal of Advanced Computational Intelligence and Intelligent Informatics 17, no. 2 (2013): 194–200. http://dx.doi.org/10.20965/jaciii.2013.p0194.

Full text
Abstract:
In research on interactive music generation, we propose a music generation method in which the computer generates music by recognizing a human music conductor's gestures. In this research, the generated music is tuned by the parameters of a network of chaotic elements, which are determined by the recognized gesture in real time. The music conductor's hand motions are detected by a Microsoft Kinect in this system. Music theories are embedded in the algorithm and, as a result, the generated music is richer. Furthermore, we constructed the music generation system and performed experiments for generatin
APA, Harvard, Vancouver, ISO, and other styles
19

HWANG, BON-WOO, SUNGMIN KIM, and SEONG-WHAN LEE. "A FULL-BODY GESTURE DATABASE FOR HUMAN GESTURE ANALYSIS." International Journal of Pattern Recognition and Artificial Intelligence 21, no. 06 (2007): 1069–84. http://dx.doi.org/10.1142/s0218001407005806.

Full text
Abstract:
This paper presents a full-body gesture database which contains 2D video data and 3D motion data of 14 normal gestures, 10 abnormal gestures and 30 command gestures for 20 subjects. We call this database the Korea University Gesture (KUG) database. Using 3D motion cameras and 3 sets of stereo cameras, we captured 3D motion data and 3 pairs of stereo-video data in 3 different directions for normal and abnormal gestures. In case of command gestures, 2 pairs of stereo-video data were obtained by 2 sets of stereo cameras with different focal lengths in order to capture views of whole body and uppe
APA, Harvard, Vancouver, ISO, and other styles
20

Varma, Anshal, Sanyukta Pawaskar, Sumedh More, and Ashwini Raorane. "Computer Control Using Vision-Based Hand Motion Recognition System." ITM Web of Conferences 44 (2022): 03069. http://dx.doi.org/10.1051/itmconf/20224403069.

Full text
Abstract:
In our day-to-day communication and expression, gestures play a crucial role. As a result, using them to interact with technical equipment requires small cognitive data processing on our part. Because it creates a large barrier between the user and the machine, using a physical device for human-computer interaction, such as a mouse or keyboard, obstructs the natural interface. In this study, we created a sophisticated marker-free hand gesture detection structure that can monitor both dynamic and static hand gestures. Our system turns motion detection into actions such as opening web pages and
APA, Harvard, Vancouver, ISO, and other styles
21

Vasuki, M. "AI Powered Real-Time Sign Language Detection and Translation System for Inclusive Communication Between Deaf and Hearing Communities Worldwide." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 09, no. 06 (2025): 1–9. https://doi.org/10.55041/ijsrem50025.

Full text
Abstract:
Sign language is a vital communication tool for individuals who are deaf or hard of hearing, yet it remains largely inaccessible to the wider population. This project aims to address this barrier by developing a sign language recognition system that converts hand gestures into text, followed by text-to-speech (TTS) conversion. The system utilizes Convolutional Neural Networks (CNNs) to recognize static hand gestures and translate them into corresponding textual representations. The text is then processed by a TTS engine, which generates spoken language, making it comprehensible to i
APA, Harvard, Vancouver, ISO, and other styles
22

Byun and Lee. "Implementation of Hand Gesture Recognition Device Applicable to Smart Watch Based on Flexible Epidermal Tactile Sensor Array." Micromachines 10, no. 10 (2019): 692. http://dx.doi.org/10.3390/mi10100692.

Full text
Abstract:
Ever since the development of digital devices, the recognition of human gestures has played an important role in many Human-Computer interface applications. Various wearable devices have been developed, and inertial sensors, magnetic sensors, gyro sensors, electromyography, force-sensitive resistors, and other types of sensors have been used to identify gestures. However, there are different drawbacks for each sensor, which affect the detection of gestures. In this paper, we present a new gesture recognition method using a Flexible Epidermal Tactile Sensor based on strain gauges to sense defor
APA, Harvard, Vancouver, ISO, and other styles
23

Alnaim, Norah, Maysam Abbod, and Rafiq Swash. "Recognition of Holoscopic 3D Video Hand Gesture Using Convolutional Neural Networks." Technologies 8, no. 2 (2020): 19. http://dx.doi.org/10.3390/technologies8020019.

Full text
Abstract:
The convolutional neural network (CNN) algorithm is one of the efficient techniques to recognize hand gestures. In human–computer interaction, a human gesture is a non-verbal communication mode, as users communicate with a computer via input devices. In this article, 3D micro hand gesture recognition disparity experiments are proposed using CNN. This study includes twelve 3D micro hand motions recorded for three different subjects. The system is validated by an experiment that is implemented on twenty different subjects of different ages. The results are analysed and evaluated based on executi
APA, Harvard, Vancouver, ISO, and other styles
24

Yang, Linchu, Ji’an Chen, and Weihang Zhu. "Dynamic Hand Gesture Recognition Based on a Leap Motion Controller and Two-Layer Bidirectional Recurrent Neural Network." Sensors 20, no. 7 (2020): 2106. http://dx.doi.org/10.3390/s20072106.

Full text
Abstract:
Dynamic hand gesture recognition is one of the most significant tools for human–computer interaction. In order to improve the accuracy of the dynamic hand gesture recognition, in this paper, a two-layer Bidirectional Recurrent Neural Network for the recognition of dynamic hand gestures from a Leap Motion Controller (LMC) is proposed. In addition, based on LMC, an efficient way to capture the dynamic hand gestures is identified. Dynamic hand gestures are represented by sets of feature vectors from the LMC. The proposed system has been tested on the American Sign Language (ASL) datasets with 360
APA, Harvard, Vancouver, ISO, and other styles
25

Yoo, Minjeong, Yuseung Na, Hamin Song, et al. "Motion Estimation and Hand Gesture Recognition-Based Human–UAV Interaction Approach in Real Time." Sensors 22, no. 7 (2022): 2513. http://dx.doi.org/10.3390/s22072513.

Full text
Abstract:
As an alternative to the traditional remote controller, research on vision-based hand gesture recognition is being actively conducted in the field of interaction between humans and unmanned aerial vehicles (UAVs). However, vision-based gesture systems face a challenging problem in recognizing dynamic gestures because it is difficult to estimate the pose of multi-dimensional hand gestures in 2D images. This leads to complex algorithms, including tracking in addition to detection, to recognize dynamic gestures, but they are not suitable for human–UAV interaction (HUI) systems that require sa
APA, Harvard, Vancouver, ISO, and other styles
26

Singh, Ram Krishna. "Hand Gesture Controlled Presentation Using Computer Vision." International Journal for Research in Applied Science and Engineering Technology 12, no. 5 (2024): 5120–26. http://dx.doi.org/10.22214/ijraset.2024.62780.

Full text
Abstract:
Making presentations is essential in many facets of life. At some point, whether you are a student, worker, business owner, or employee of an organization, you have probably given presentations. Presentations can seem dull at times since you have to use a keyboard or other specialized device to manipulate and alter the slides. Our goal is to enable hand gesture control of the slide display. Human-computer interaction has seen a sharp increase in the use of gestures in recent years. The system has attempted to use hand movements to control several PowerPoint ca
APA, Harvard, Vancouver, ISO, and other styles
27

Li, Yunhe, Yi Xie, and Qinyu Zhang. "3D Gesture Recognition Based on Handheld Smart Terminals." International Journal of Ambient Computing and Intelligence 9, no. 4 (2018): 96–111. http://dx.doi.org/10.4018/ijaci.2018100106.

Full text
Abstract:
With the popularity of smart devices, traditional human-computer interaction techniques can no longer accommodate people's needs. This article proposes an iOS-based three-dimensional (3D) gesture recognition system, gathering users' specific gestures from their handheld smart terminals to judge the implications of these gestures, so as to control other smart terminals with more natural human-computer interactions. In this article, gestures were recognized by reading corresponding 3D gesture data from the motion sensors of smart terminals using optimized dynamic time warping (
APA, Harvard, Vancouver, ISO, and other styles
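The optimized dynamic time warping mentioned in the abstract above builds on plain DTW, which aligns two motion-sensor sequences of different lengths and speeds. The sketch below is the textbook dynamic-programming algorithm over multi-axis sensor frames, not the paper's optimized variant.

```python
def dtw(a, b):
    """DTW distance between two sequences of equal-dimension frames
    (e.g. per-sample accelerometer readings)."""
    inf = float("inf")
    n, m = len(a), len(b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Euclidean distance between the two frames
            d = sum((x - y) ** 2 for x, y in zip(a[i - 1], b[j - 1])) ** 0.5
            # extend the cheapest of the three admissible warping steps
            cost[i][j] = d + min(cost[i - 1][j],
                                 cost[i][j - 1],
                                 cost[i - 1][j - 1])
    return cost[n][m]
```

Because the warping path may repeat frames, a gesture performed more slowly than its template still matches with low cost; a recognizer would compare a recorded gesture against stored templates and pick the smallest DTW distance.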
28

Wang, Tao, Xiaolong Cai, Liping Wang, and Haoye Tian. "Interactive Design of 3D Dynamic Gesture Based on SVM-LSTM Model." International Journal of Mobile Human Computer Interaction 10, no. 3 (2018): 49–63. http://dx.doi.org/10.4018/ijmhci.2018070104.

Full text
Abstract:
Visual hand gesture interaction is one of the main ways of human-computer interaction, and provides users more interactive degrees of freedom and a more realistic interactive experience. The authors present a hybrid model based on SVM-LSTM and design a three-dimensional dynamic gesture interaction system. The system uses Leap Motion to capture gesture information, combining the SVM's powerful static gesture classification ability with the LSTM's powerful variable-length time-series gesture processing ability, enabling real-time recognition of user gestures. The gesture interaction method can automatically d
APA, Harvard, Vancouver, ISO, and other styles
29

Dang, Xiaochao, Yang Liu, Zhanjun Hao, Xuhao Tang, and Chenguang Shao. "Air Gesture Recognition Using WLAN Physical Layer Information." Wireless Communications and Mobile Computing 2020 (August 13, 2020): 1–14. http://dx.doi.org/10.1155/2020/8546237.

Full text
Abstract:
In recent years, researchers have witnessed the important role of air gesture recognition in human-computer interaction (HCI), smart homes, and virtual reality (VR). The traditional air gesture recognition method mainly depends on external equipment (such as special sensors and cameras), whose costs are high and whose application scenes are limited. In this paper, we utilize channel state information (CSI) derived from the WLAN physical layer to build a Wi-Fi-based air gesture recognition system, namely WiNum, which solves the problems of users’ privacy and energy consumption compared with
APA, Harvard, Vancouver, ISO, and other styles
30

Journal, IJSREM. "Hand Gesture Recognition and Motion Detection System for Interactive Applications." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 07, no. 10 (2023): 1–11. http://dx.doi.org/10.55041/ijsrem26131.

Full text
Abstract:
Security systems and interactive multimedia applications are two examples of the many fields in which Gesture Sense (GS) technology, a significant development in human-computer interaction, has found use. This paper introduces the "Arduino Hand Gesture Identification and Motion Detection System," a revolutionary combination of Python, Arduino Uno R3, and ultrasonic sensors intended to improve interactive apps through accurate gesture identification. Real-time tracking of hand movements is made possible by ultrasonic sensors, which are renowned for their fine-grained motion detection capabiliti
APA, Harvard, Vancouver, ISO, and other styles
31

Rai, Radha, Prakhar Kumar Chandraker, Shaik Nowsheen, Shantanu Kumar Singh, and Veena G. "AI Cricket Score Board." International Journal of Innovative Research in Information Security 9, no. 02 (2023): 28–38. http://dx.doi.org/10.26562/ijiris.2023.v0902.05.

Full text
Abstract:
Gesture recognition pertains to recognizing meaningful expressions of motion by a human, involving the hands, arms, face, head, and/or body. It is of utmost importance in designing an intelligent and efficient human–computer interface. The applications of gesture recognition are manifold, ranging from sign language through medical rehabilitation, monitoring patients or elderly people, surveillance systems, sports gesture analysis, and human behavior analysis to virtual reality. In recent years, there has been increased interest in video summarization and automatic sports highlights generation i
APA, Harvard, Vancouver, ISO, and other styles
32

Lee, Chanhwi, Jaehan Kim, Seoungbae Cho, Jinwoong Kim, Jisang Yoo, and Soonchul Kwon. "Development of Real-Time Hand Gesture Recognition for Tabletop Holographic Display Interaction Using Azure Kinect." Sensors 20, no. 16 (2020): 4566. http://dx.doi.org/10.3390/s20164566.

Full text
Abstract:
The use of human gesturing to interact with devices such as computers or smartphones has presented several problems. This form of interaction relies on gesture interaction technology such as Leap Motion from Leap Motion, Inc, which enables humans to use hand gestures to interact with a computer. The technology has excellent hand detection performance, and even allows simple games to be played using gestures. Another example is the contactless use of a smartphone to take a photograph by simply folding and opening the palm. Research on interaction with other devices via hand gestures is in progr
APA, Harvard, Vancouver, ISO, and other styles
33

Heickal, Hasnain, Tao Zhang, and Md Hasanuzzaman. "Computer Vision-Based Real-Time 3D Gesture Recognition Using Depth Image." International Journal of Image and Graphics 15, no. 01 (2015): 1550004. http://dx.doi.org/10.1142/s0219467815500047.

Full text
Abstract:
Gesture is one of the fundamental ways of human machine natural interaction. To understand gesture, the system should be able to interpret 3D movements of human. This paper presents a computer vision-based real-time 3D gesture recognition system using depth image which tracks 3D joint position of head, neck, shoulder, arms, hands and legs. This tracking is done by Kinect motion sensor with OpenNI API and 3D motion gesture is recognized using the movement trajectory of those joints. User to Kinect sensor distance is adapted using proposed center of gravity (COG) correction method and 3D joint p
APA, Harvard, Vancouver, ISO, and other styles
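The centre-of-gravity (COG) correction idea in the abstract above can be approximated by translating the tracked joint positions so that their centre of gravity sits at the origin, making the representation independent of where the user stands relative to the Kinect. This is a simplified sketch of the general idea, not the paper's exact method.

```python
def cog_normalise(joints):
    """Translate 3D joint positions (x, y, z) so their centre of
    gravity becomes the origin."""
    n = len(joints)
    cog = [sum(p[k] for p in joints) / n for k in range(3)]
    return [[p[k] - cog[k] for k in range(3)] for p in joints]
```

After this step, the movement trajectories of head, arm, and hand joints can be compared across users at different distances from the sensor.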
34

Okada, Shogo, and Osamu Hasegawa. "Incremental Learning, Recognition, and Generation of Time-Series Patterns Based on Self-Organizing Segmentation." Journal of Advanced Computational Intelligence and Intelligent Informatics 10, no. 3 (2006): 395–408. http://dx.doi.org/10.20965/jaciii.2006.p0395.

Full text
Abstract:
We segment and symbolize image information on a series of human behaviors as aggregate units of motion in a self-organizing manner and propose a system that recognizes the entire behavior as a symbol string. The system symbolizes each motion unit incrementally and also generates motion from a symbol. To implement the system, we used a mixture of experts, with a non-monotonous recurrent neural network used as the expert, and our own DP matching method. In addition, our proposal makes not only teacher-labeled patterns but also teacher-unlabeled patterns available for learning. By using this
APA, Harvard, Vancouver, ISO, and other styles
35

Dr.Sk., Mahboob Basha, H.C.Srivalli, B.Jahnavi, and C.V.Basanth. "Hand Gestures Classification and Image Processing using Convolution Neural Network Algorithm." International Journal of Innovative Science and Research Technology 8, no. 4 (2023): 800–805. https://doi.org/10.5281/zenodo.7912710.

Full text
Abstract:
The deaf community communicates primarily through the use of sign language. In general, sign language is much more figuratively formable for communication, which helps to advance and broaden the conversation. ASL is regarded as the universal sign language, although there are numerous variations and other sign systems used in various parts of the world. Sign language has fewer principal ideas and assigned appearances. The main goal of this effort is to create a system of sign language that will benefit the deaf community and speed
APA, Harvard, Vancouver, ISO, and other styles
36

Dr.Sk., Mahboob Basha, H.C.Srivalli, B.Jahnavi, and C.V.Basanth. "Hand Gestures Classification and Image Processing using Convolution Neural Network Algorithm." International Journal of Innovative Science and Research Technology 8, no. 4 (2023): 800–805. https://doi.org/10.5281/zenodo.7922779.

Full text
Abstract:
The deaf community communicates primarily through the use of sign language. In general, sign language is much more figuratively formable for communication, which helps to advance and broaden the conversation. ASL is regarded as the universal sign language, although there are numerous variations and other sign systems used in various parts of the world. Sign language has fewer principal ideas and assigned appearances. The main goal of this effort is to create a system of sign language that will benefit the deaf community and speed
APA, Harvard, Vancouver, ISO, and other styles
37

SOREL, ANTHONY, RICHARD KULPA, EMMANUEL BADIER, and FRANCK MULTON. "DEALING WITH VARIABILITY WHEN RECOGNIZING USER'S PERFORMANCE IN NATURAL 3D GESTURE INTERFACES." International Journal of Pattern Recognition and Artificial Intelligence 27, no. 08 (2013): 1350023. http://dx.doi.org/10.1142/s0218001413500237.

Full text
Abstract:
Recognition of natural gestures is a key issue in many applications, including videogames and other immersive applications. Whatever the motion capture device, the key problem is to recognize, at an interactive frame rate, a motion that could be performed by a range of different users. Hidden Markov Models (HMMs), which are commonly used to recognize the performance of a user, however rely on a motion representation that strongly affects the overall recognition rate of the system. In this paper, we propose to use a compact motion representation based on Morphology-Independent features and we eval
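HMM-based gesture recognition of the kind this abstract relies on typically trains one model per gesture and classifies a new sequence by maximum likelihood. A minimal from-scratch sketch with discrete observation symbols (the toy gesture names and probabilities below are invented, not from the paper):

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM
    (pi: initial state probs, A: transition matrix, B: emission matrix),
    computed with the scaled forward algorithm."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()
        loglik += np.log(s)
        alpha /= s
    return loglik

def classify(obs, models):
    """Pick the gesture whose HMM assigns the sequence the highest likelihood."""
    return max(models, key=lambda name: forward_loglik(obs, *models[name]))

# Two toy 2-state HMMs over 3 quantised motion symbols (hypothetical gestures)
models = {
    "wave":   (np.array([0.9, 0.1]),
               np.array([[0.7, 0.3], [0.3, 0.7]]),
               np.array([[0.8, 0.1, 0.1], [0.1, 0.8, 0.1]])),
    "circle": (np.array([0.1, 0.9]),
               np.array([[0.5, 0.5], [0.5, 0.5]]),
               np.array([[0.1, 0.1, 0.8], [0.1, 0.8, 0.1]])),
}
print(classify([0, 1, 0, 1, 0], models))   # alternating symbols favour "wave"
```

In practice the parameters would be estimated with Baum-Welch from training performances rather than written by hand.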
APA, Harvard, Vancouver, ISO, and other styles
38

Santosh Kumar J, Vamsi, Vinod, Madhusudhan and Tejas. "Design and Development of IoT Device that Recognizes Hand Gestures using Sensors." International Journal for Modern Trends in Science and Technology 7, no. 09 (2021): 29–34. http://dx.doi.org/10.46501/ijmtst0709006.

Full text
Abstract:
A hand gesture is a non-verbal means of communication involving the motion of the fingers to convey information. Hand gestures are used in sign language as a way of communication for deaf and mute people, and are also used to control devices. The purpose of gesture recognition in devices has always been to bridge the gap between the physical world and the digital world. The way humans interact among themselves could be carried over to the digital world via gestures using algorithms. Gestures can be tracked using gyroscopes, accelerometers, and other sensors. So, in this project we aim to
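One common way such sensor-based devices map hand orientation to commands is to derive roll and pitch from the accelerometer's gravity vector and threshold them. A sketch of that idea, with command names and the 30° threshold chosen here for illustration (not taken from the paper):

```python
import math

def tilt_command(ax, ay, az, threshold_deg=30.0):
    """Map a static accelerometer reading (in g units) to a coarse command.

    Roll and pitch are derived from the direction of gravity; the
    threshold is an assumed tuning value."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    if pitch > threshold_deg:
        return "forward"
    if pitch < -threshold_deg:
        return "backward"
    if roll > threshold_deg:
        return "right"
    if roll < -threshold_deg:
        return "left"
    return "stop"

print(tilt_command(0.0, 0.0, 1.0))   # flat hand -> "stop"
print(tilt_command(-0.7, 0.0, 0.7))  # tilted forward -> "forward"
```

A gyroscope would additionally be needed to capture dynamic gestures, since a static accelerometer reading only encodes orientation relative to gravity.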
APA, Harvard, Vancouver, ISO, and other styles
39

Wijaya, Ryan Satria, Senanjung Prayoga, Rifqi Amalya Fatekha, and Muhammad Thoriq Mubarak. "A Real-Time Hand Gesture Control of a Quadcopter Swarm Implemented in the Gazebo Simulation Environment." Journal of Applied Informatics and Computing 9, no. 3 (2025): 979–88. https://doi.org/10.30871/jaic.v9i3.9578.

Full text
Abstract:
With the advancement of technology, human-robot interaction (HRI) is becoming more intuitive, including through hand gesture-based control. This study aims to develop a real-time hand gesture recognition system to control a quadcopter swarm within a simulated environment using ROS and Gazebo. The system utilizes Google's MediaPipe framework for detecting 21 hand landmarks, which are then processed through a custom-trained neural network to classify 13 predefined gestures. Each gesture corresponds to a specific command such as basic motion, rotation, or swarm formation, and is published to the
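MediaPipe's hand tracker emits 21 (x, y) landmarks per detected hand; before classification these are usually made translation- and scale-invariant. A sketch of that preprocessing, with a nearest-centroid classifier standing in for the paper's custom-trained network (the gesture names and synthetic hands below are hypothetical):

```python
import numpy as np

def normalize_landmarks(lm):
    """lm: (21, 2) array of MediaPipe hand landmarks (x, y).
    Translate so the wrist (landmark 0) sits at the origin and divide by
    the largest coordinate magnitude, removing translation and scale."""
    lm = np.asarray(lm, dtype=float)
    centered = lm - lm[0]
    return (centered / np.abs(centered).max()).ravel()

def classify(lm, centroids):
    """Nearest-centroid stand-in for a trained gesture classifier."""
    feat = normalize_landmarks(lm)
    return min(centroids, key=lambda g: np.linalg.norm(feat - centroids[g]))

# Two synthetic 21-point "hands" standing in for recorded gesture templates
rng = np.random.default_rng(0)
hand_open, hand_fist = rng.random((2, 21, 2))
centroids = {"open": normalize_landmarks(hand_open),
             "fist": normalize_landmarks(hand_fist)}

# A shifted, rescaled capture of the open hand still matches "open"
print(classify(hand_open * 2.0 + 5.0, centroids))
```

In the paper's pipeline the normalized feature vector would instead be fed to the trained neural network, and the predicted label published as a ROS command.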
APA, Harvard, Vancouver, ISO, and other styles
40

Xie, Jianan, Zhen Xu, Jiayu Zeng, Yuyang Gao, and Kenji Hashimoto. "Human–Robot Interaction Using Dynamic Hand Gesture for Teleoperation of Quadruped Robots with a Robotic Arm." Electronics 14, no. 5 (2025): 860. https://doi.org/10.3390/electronics14050860.

Full text
Abstract:
Human–Robot Interaction (HRI) using hand gesture recognition offers an effective and non-contact approach to enhancing operational intuitiveness and user convenience. However, most existing studies primarily focus on either static sign language recognition or the tracking of hand position and orientation in space. These approaches often prove inadequate for controlling complex robotic systems. This paper proposes an advanced HRI system leveraging dynamic hand gestures for controlling quadruped robots equipped with a robotic arm. The proposed system integrates both semantic and pose information
APA, Harvard, Vancouver, ISO, and other styles
41

KUMAR, SANJAY, DINESH K. KUMAR, ARUN SHARMA, and NEIL McLACHLAN. "VISUAL HAND GESTURES CLASSIFICATION USING WAVELET TRANSFORMS." International Journal of Wavelets, Multiresolution and Information Processing 01, no. 04 (2003): 373–92. http://dx.doi.org/10.1142/s0219691303000232.

Full text
Abstract:
This paper presents a novel technique for classifying human hand gestures based on the stationary wavelet transform (SWT) and compares the results with classification based on Hu moments. The technique uses a view-based approach for the representation of hand actions, and artificial neural networks (ANNs) for classification. This approach uses a cumulative image-difference technique in which the time between the sequences of images is implicitly captured in the representation of the action. This results in the construction of Motion History Images (MHIs). These MHIs are decomposed into four sub-images using SWT
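A Motion History Image accumulates frame differences so that recently moving pixels are bright and older motion fades out. A minimal per-frame update sketch (the timestamp, decay and threshold values are illustrative, not from the paper):

```python
import numpy as np

def update_mhi(mhi, frame_diff, tau=15, delta=1, thresh=30):
    """One Motion History Image update step.

    Pixels where the inter-frame difference exceeds `thresh` are set to
    the maximum timestamp `tau`; everywhere else the history decays by
    `delta` toward zero, so recent motion appears brighter."""
    moving = np.abs(frame_diff) > thresh
    return np.where(moving, tau, np.maximum(mhi - delta, 0))

# Toy 1-D "image": motion at pixel 1 now, pixel 3 moved a few frames ago
mhi = np.array([0, 0, 0, 5], dtype=np.int32)
diff = np.array([0, 80, 10, 0], dtype=np.int32)
print(update_mhi(mhi, diff))   # -> [ 0 15  0  4]
```

In the paper, each completed MHI is then decomposed with the stationary wavelet transform and the sub-images are fed to the ANN classifier.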
APA, Harvard, Vancouver, ISO, and other styles
42

Glory Veronica, Parsipogu, Ravi Kumar Mokkapati, Lakshmi Prasanna Jagupilla, and Chella Santhosh. "Static Hand Gesture Recognition Using Novel Convolutional Neural Network and Support Vector Machine." International Journal of Online and Biomedical Engineering (iJOE) 19, no. 09 (2023): 131–41. http://dx.doi.org/10.3991/ijoe.v19i09.39927.

Full text
Abstract:
Hand tracking and identification through visual means pose a challenging problem. To simplify the identification of hand gestures, some systems have incorporated position markers or colored bands, which are not ideal for controlling robots due to their inconvenience. The motion recognition problem can be solved by combining object identification, recognition, and tracking using image processing techniques. A wide variety of target detection and recognition image processing methods are available. This paper proposes novel CNN-based methods to create a user-free hand gesture detection system. Th
APA, Harvard, Vancouver, ISO, and other styles
43

Tang, Gilbert, Seemal Asif, and Phil Webb. "The integration of contactless static pose recognition and dynamic hand motion tracking control system for industrial human and robot collaboration." Industrial Robot: An International Journal 42, no. 5 (2015): 416–28. http://dx.doi.org/10.1108/ir-03-2015-0059.

Full text
Abstract:
Purpose – The purpose of this paper is to describe the integration of a gesture control system for industrial collaborative robot. Human and robot collaborative systems can be a viable manufacturing solution, but efficient control and communication are required for operations to be carried out effectively and safely. Design/methodology/approach – The integrated system consists of facial recognition, static pose recognition and dynamic hand motion tracking. Each sub-system has been tested in isolation before integration and demonstration of a sample task. Findings – It is demonstrated that the
APA, Harvard, Vancouver, ISO, and other styles
44

Wu, Bi-Xiao, Chen-Guang Yang, and Jun-Pei Zhong. "Research on Transfer Learning of Vision-based Gesture Recognition." International Journal of Automation and Computing 18, no. 3 (2021): 422–31. http://dx.doi.org/10.1007/s11633-020-1273-9.

Full text
Abstract:
Gesture recognition has been widely used for human-robot interaction. At present, a problem in gesture recognition is that researchers have not used the knowledge learned in existing domains to discover and recognize gestures in new domains. For each new domain, a large amount of data must be collected and annotated, and the training of the algorithm does not benefit from prior knowledge, leading to redundant calculation workload and excessive time investment. To address this problem, the paper proposes a method that can transfer gesture data between different domains. We use a r
APA, Harvard, Vancouver, ISO, and other styles
45

Abdulhamid, Mohanad, and Ndiwa Chesebe. "On Sign Language Toolbox Aid." Land Forces Academy Review 25, no. 1 (2020): 47–60. http://dx.doi.org/10.2478/raft-2020-0007.

Full text
Abstract:
The world has lately been witnessing a landmark revolution in technology whereby more research and development is going into producing devices that are able to respond to human emotions, motions and behavior. These days, mobile phones are able to capture photos when the user is smiling or gesturing towards them. By making certain gestures, a user is able to control smart televisions and computers in the comfort of their home, without the need for extra interface devices. The interaction between man and machines is being improved and made as natural as possible. More so, the application
APA, Harvard, Vancouver, ISO, and other styles
46

Rahul Swarup. "Hand Gesture Controlled Smart Car Using Image Recognition." Journal of Electrical Systems 20, no. 7s (2024): 2349–55. http://dx.doi.org/10.52783/jes.3971.

Full text
Abstract:
A hand gesture-controlled smart automobile is the newest initiative in the field of human-computer interaction, and it represents the evolution toward more natural and intuitive user interfaces. This paper describes how OpenCV and Google’s MediaPipe are intricately coordinated to produce a control strategy that is both agile and responsive. The deployed technology translates complex human hand gestures into dynamic vehicle motion commands using advanced image recognition algorithms. This is the pinnacle of interactive technology meeting real-world locomotion needs: state-of-the-art computer
APA, Harvard, Vancouver, ISO, and other styles
47

Shrawankar, Urmila, and Sayli Dixit. "Conversion of Tactile Sign Language into English for Deaf/Dumb Interaction." International Journal of Natural Computing Research 6, no. 1 (2017): 53–67. http://dx.doi.org/10.4018/ijncr.2017010104.

Full text
Abstract:
Natural language is the way of communication for normal human beings, and includes spoken language, written language and body gestures, i.e. head gestures, hand gestures, facial expressions, lip motion, etc. On the other hand, speech- and hearing-impaired people use sign language for communication, which is not understandable for normal people, so they face communication problems in society. Interpreters are required for this, but human interpreters are costly and not an efficient solution. Thus, there is a need for a system which will translate the sign language into normal language whi
APA, Harvard, Vancouver, ISO, and other styles
48

Gutta, Srinivas, Ibrahim F. Imam, and Harry Wechsler. "Hand Gesture Recognition using Ensembles of Radial Basis Function (RBF) Networks and Decision Trees." International Journal of Pattern Recognition and Artificial Intelligence 11, no. 06 (1997): 845–72. http://dx.doi.org/10.1142/s021800149700038x.

Full text
Abstract:
Hand gestures are a natural form of communication among people, yet human-computer interaction is still limited to mouse movements. The use of hand gestures in the field of human-computer interaction has attracted renewed interest in the past several years. Special glove-based devices have been developed to analyze finger and hand motion and use them to manipulate and explore virtual worlds. To further enrich the naturalness of the interaction, different computer vision-based techniques have been developed. At the same time, the need for more efficient systems has resulted in new gesture recog
APA, Harvard, Vancouver, ISO, and other styles
49

Nandyal, Suvarna, and Suvarna Laxmikant Kattimani. "Umpire Gesture Detection and Recognition using HOG and Non-Linear Support Vector Machine (NL-SVM) Classification of Deep Features in Cricket Videos." Journal of Physics: Conference Series 2070, no. 1 (2021): 012148. http://dx.doi.org/10.1088/1742-6596/2070/1/012148.

Full text
Abstract:
Gesture recognition pertains to recognizing meaningful expressions of motion by a human, involving the hands, arms, face, head, and/or body. It is of utmost importance in designing an intelligent and efficient human–computer interface. The applications of gesture recognition are manifold, ranging from sign language through medical rehabilitation, monitoring patients or elderly people, surveillance systems, sports gesture analysis and human behaviour analysis to virtual reality. In recent years, there has been increased interest in video summarization and automatic sports highlights g
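The HOG features used in this entry summarize a frame as per-cell histograms of gradient orientations, which are then fed to the non-linear SVM. A simplified single-cell sketch (no block normalization, unsigned 0-180° orientations; a stand-in for a full HOG implementation, not the paper's exact pipeline):

```python
import numpy as np

def hog_cell_histogram(patch, bins=9):
    """Orientation histogram of gradients for one cell -- the core of the
    HOG descriptor (simplified: no block normalization)."""
    patch = patch.astype(float)
    gy, gx = np.gradient(patch)              # image gradients (rows=y, cols=x)
    mag = np.hypot(gx, gy)                   # gradient magnitude per pixel
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0   # unsigned orientation
    idx = np.minimum((ang / (180.0 / bins)).astype(int), bins - 1)
    hist = np.zeros(bins)
    np.add.at(hist, idx.ravel(), mag.ravel())      # magnitude-weighted votes
    return hist

# A patch whose intensity ramps left-to-right: gradients point horizontally,
# so all the histogram mass lands in the 0-degree bin
patch = np.tile(np.arange(8, dtype=float) * 10, (8, 1))
h = hog_cell_histogram(patch)
print(np.argmax(h))
```

In the paper these per-cell histograms would be concatenated over the whole umpire silhouette and classified with the NL-SVM.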
APA, Harvard, Vancouver, ISO, and other styles
50

Verma, Prashant, and Khushboo Badli. "Real-Time Sign Language Detection using TensorFlow, OpenCV and Python." International Journal for Research in Applied Science and Engineering Technology 10, no. 5 (2022): 4483–88. http://dx.doi.org/10.22214/ijraset.2022.43439.

Full text
Abstract:
Deaf and hard-of-hearing persons, as well as others who are unable to communicate verbally, use sign language to communicate within their communities and with others. Sign languages are a set of preset languages that communicate information using a visual-manual modality. The problem of real-time finger-spelling recognition in sign language is discussed. We gathered a dataset for identifying 36 distinct gestures (alphabets and numerals) and a dataset of typical hand gestures in ISL, created from scratch using webcam images. The system accepts a hand gesture as input and displays
APA, Harvard, Vancouver, ISO, and other styles