
Journal articles on the topic 'GESTURE CONTROLLING'


Consult the top 50 journal articles for your research on the topic 'GESTURE CONTROLLING.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Sayre, Umang. "Controlling Mouse Cursor Using Hand Gestures." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 09, no. 03 (2025): 1–9. https://doi.org/10.55041/ijsrem43347.

Full text
Abstract:
In the field of human-computer interaction, input devices such as the keyboard and mouse have been the standard means of communication. However, hand gestures offer a smoother and more natural way of communicating. This study addresses hands-free cursor control, aiming to develop a system that allows users to control the computer cursor using hand gestures. Python and OpenCV are powerful tools for recognizing and interpreting hand gestures: by implementing image processing techniques, developers can create software that accurately identifies different gestures. The PyAutoGUI module provides built-in functions for programmatically controlling the mouse and keyboard in response to recognized gestures, making the process seamless and efficient. Keywords — Gesture control, Real-time mouse system, HCI, Python, MediaPipe, OpenCV, PyAutoGUI
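As a rough illustration of the pipeline this abstract describes (OpenCV capture, MediaPipe hand landmarks, PyAutoGUI cursor control), a minimal sketch follows; the landmark index and the flip/scaling choices are common conventions, not details taken from the paper.

```python
import cv2
import mediapipe as mp
import pyautogui

screen_w, screen_h = pyautogui.size()
hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)  # mirror the image so movement feels natural
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        # Landmark 8 is the index fingertip in MediaPipe's hand model.
        tip = result.multi_hand_landmarks[0].landmark[8]
        pyautogui.moveTo(int(tip.x * screen_w), int(tip.y * screen_h))
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break
cap.release()
```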
2

Jahnavi, Mudili. "Controlling Computer Using Hand Gestures." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 09, no. 05 (2025): 1–9. https://doi.org/10.55041/ijsrem49260.

Full text
Abstract:
In the realm of Human-Computer Interaction (HCI), the integration of webcams and various sensors has made gesture recognition increasingly accessible and impactful. Hand gestures provide a natural and intuitive mode of communication, enabling seamless interaction between humans and computers. This paper highlights the potential of hand gestures as an effective medium for non-verbal communication and control, with applications spanning multiple domains. The proposed system leverages image processing techniques, sensor technologies, and computer vision to enable gesture-based computer control. Emphasis is placed on the interdisciplinary nature of the research, including its applications in fields such as machine learning, healthcare, and mobile technology. Keywords: Hand Gesture Recognition, Human-Computer Interaction, Sensor Technology, Image Processing, Machine Learning, Android Application, Diabetes Monitoring, Computer Vision
3

Vijaya, V. Krishna, Puvvala Harsha, Sricharan Murukutla, Kurra Eswar, and Nannapaneni Sravan Kumar. "Hand Gesture Controlling System." International Journal for Research in Applied Science and Engineering Technology 11, no. 1 (2023): 669–74. http://dx.doi.org/10.22214/ijraset.2023.48653.

Full text
Abstract:
As a result of the Industry 4.0 revolution, hand gestures are becoming more and more significant in the disciplines of robotics and IoT. In the IoT field, hand gestures are often used in applications for smart homes, wearable technology, vehicles, virtual reality, and more. This work adds some originality of its own by combining Python and Arduino for laptop/computer gesture control. Two ultrasonic sensors mounted on top of the monitor determine the position of the hand relative to the screen, and an Arduino gauges the distance to the hand; particular actions on a media player (VLC) are taken in response to these measurements. Computer operations are performed using the Python PyAutoGUI module, with the computer receiving commands from the Arduino via the serial port. Currently, researchers are working to develop a hand-gesture computer that runs entirely on hand gestures and sensors instead of conventional hardware, and a few have shown that a video player, web browser, and text document can be controlled with hand movements using an Arduino and ultrasonic sensors.
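The host-side logic described here could look roughly like the sketch below, assuming the Arduino prints "left_cm,right_cm" distance readings over serial; the protocol, port, thresholds, and key bindings are all illustrative assumptions, not the paper's.

```python
import serial
import pyautogui

port = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)  # assumed port and baud rate

while True:
    line = port.readline().decode('ascii', errors='ignore').strip()
    if not line:
        continue
    try:
        left_cm, right_cm = (float(v) for v in line.split(','))
    except ValueError:
        continue  # skip malformed serial lines
    # Illustrative mappings: hand near the left sensor toggles play/pause,
    # hand near the right sensor adjusts VLC volume.
    if left_cm < 15:
        pyautogui.press('space')          # play/pause in VLC
    elif right_cm < 10:
        pyautogui.hotkey('ctrl', 'down')  # volume down
    elif right_cm < 20:
        pyautogui.hotkey('ctrl', 'up')    # volume up
```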
4

Varshika, DSS. "Media-Player Controlling by Hand Gestures." International Journal for Research in Applied Science and Engineering Technology 9, no. VI (2021): 2022–30. http://dx.doi.org/10.22214/ijraset.2021.348515421.

Full text
Abstract:
In this project we try to control a media player using hand gestures with the help of OpenCV and Python. Computer applications require interaction between human and computer. This interaction needs to be unrestricted, which has made it challenging for traditional input devices such as the keyboard, mouse, and pen. Hand gesture is an important component of body language in linguistics, and human-computer interaction becomes easy with the use of the hand as a device. Using hand gestures to operate machines makes interaction interesting, and gesture recognition has gained a lot of importance. Hand gestures are used to control various applications such as Windows Media Player, robot control, gaming, etc. Gestures make interaction easy and convenient and do not require any extra device. Vision and audio recognition can be used together, but audio commands may not work in noisy environments.
5

Alyamani, Hasan J. "Gesture Vocabularies for Hand Gestures for Controlling Air Conditioners in Home and Vehicle Environments." Electronics 12, no. 7 (2023): 1513. http://dx.doi.org/10.3390/electronics12071513.

Full text
Abstract:
With the growing prevalence of modern technologies as part of everyday life, mid-air gestures have become a promising input method in the field of human–computer interaction. This paper analyses the gestures of actual users to define a preliminary gesture vocabulary for home air conditioning (AC) systems and suggests a gesture vocabulary for controlling the AC that applies to both home and vehicle environments. In this study, a user elicitation experiment was conducted. A total of 36 participants were filmed while employing their preferred hand gestures to manipulate a home air conditioning system. Comparisons were drawn between our proposed gesture vocabulary (HomeG) and a previously proposed gesture vocabulary which was designed to identify the preferred hand gestures for in-vehicle air conditioners. The findings indicate that HomeG successfully identifies and describes the employed gestures in detail. To gain a gesture taxonomy that is suitable for manipulating the AC at home and in a vehicle, some modifications were applied to HomeG based on suggestions from other studies. The modified gesture vocabulary (CrossG) can identify the gestures of our study, although CrossG has a less detailed gesture pattern. Our results will help designers to understand user preferences and behaviour prior to designing and implementing a gesture-based user interface.
6

SHAIK, ABDUL NABI, E. SAI PRIYA, G. NIKITHA, K. PRACHEEN KUMAR, and N. SHRAVYA SHREE. "CONTROLLING VIDEOLAN CLIENT MEDIA USING LUCAS KANADE OPTIMAL FLOW ALGORITHM AND OPENCV." YMER Digital 21, no. 05 (2022): 246–55. http://dx.doi.org/10.37896/ymer21.05/29.

Full text
Abstract:
In this project we discuss a system that uses a dynamic hand gesture recognition technique to control media players such as the VLC media player. The project consists of modules that segment the foreground part of the frame using skin-colour detection and the approximate median technique. Hand gestures are recognized by building a decision tree that uses features extracted from the segmented part. This hand gesture recognition technique introduces a new, natural way to interact with computers and is useful to many people in day-to-day life. It is used for controlling certain operations on the video player (pause, play, volume up, volume down, mute) by mere hand gestures, without the rigmarole of pressing buttons or tapping the screen, and relates directly to everyday situations such as presentations. We considered hand gestures whose directional motion defines a gesture for the application; images are retrieved using a webcam. The most frequently used VLC functions are assigned predefined gestures, matched to the corresponding VLC player control operations. We created a VLC media player controller using a hand gesture recognition system to make 'HUMAN LIFE EASY AND BETTER'. The project is implemented in two steps: (1) creation of the hand gesture recognition system, done with image processing using the OpenCV library; and (2) controlling the VLC media player using hand gestures, where the player is driven by shell commands invoked from Python through the OS library.
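A minimal sketch of the skin-colour segmentation step this abstract mentions, assuming an HSV threshold followed by morphological clean-up; the colour bounds are common illustrative values, not the paper's.

```python
import cv2
import numpy as np

def skin_mask(frame_bgr):
    """Return a binary mask of likely skin pixels in a BGR frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 40, 60], dtype=np.uint8)    # assumed skin-tone bounds
    upper = np.array([25, 255, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    # Morphological opening removes small speckles before contour analysis.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
```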
7

Chakradhar, K. S., Prasanna Rejinthala Lakshmi, salla chowdary Sree Rama Brunda, Bharathi Pola, and Bhargava Petlu. "Controlling Media Player Using Hand Gestures." Electrical and Automation Engineering 2, no. 1 (2023): 45–54. http://dx.doi.org/10.46632/eae/2/1/7.

Full text
Abstract:
Computer usage is increasing rapidly day by day, but input devices are limited, and to access them we need to be near the screen. To overcome this problem and control the screen, we can use hand gestures, with a different hand gesture for every operation. We propose a Python program to control the media player through hand gestures. In this method we used libraries such as OpenCV, MediaPipe, and PyAutoGUI to capture the video, provide ready-to-use ML solutions, and programmatically control the keyboard and mouse. Hand gestures are used as the input, providing natural interaction while reducing external hardware interaction. The whole process is divided into two steps. First, gestures are recognized through the camera by OpenCV, MediaPipe helps to identify each gesture by its position, and the respective command is executed. Second, PyAutoGUI automates the keyboard and controls the media player.
8

Chakradhar, K. S., Lakshmi Prasanna Rejinthala, chowdary Sree Rama Brunda salla, Bharathi Pola, and Bhargava Petlu. "Controlling Media Player Using Hand Gestures." Electrical and Automation Engineering 2, no. 1 (2023): 45–54. http://dx.doi.org/10.46632/ese/2/1/7.

Full text
Abstract:
Computer usage is increasing rapidly day by day, but input devices are limited, and to access them we need to be near the screen. To overcome this problem and control the screen, we can use hand gestures, with a different hand gesture for every operation. We propose a Python program to control the media player through hand gestures. In this method we used libraries such as OpenCV, MediaPipe, and PyAutoGUI to capture the video, provide ready-to-use ML solutions, and programmatically control the keyboard and mouse. Hand gestures are used as the input, providing natural interaction while reducing external hardware interaction. The whole process is divided into two steps. First, gestures are recognized through the camera by OpenCV, MediaPipe helps to identify each gesture by its position, and the respective command is executed. Second, PyAutoGUI automates the keyboard and controls the media player.
9

Joshi, S. M. "Controlling 3D Models in VR using Hand Gestures." International Journal for Research in Applied Science and Engineering Technology 12, no. 4 (2024): 1605–9. http://dx.doi.org/10.22214/ijraset.2024.60166.

Full text
Abstract:
The fusion of hand gesture detection technology with virtual reality (VR) environments enables users to interact organically with three-dimensional models. Utilizing VR headsets equipped with advanced hand tracking algorithms, individuals can engage with virtual objects through intuitive gestures, enriching their immersive experiences. Software development entails implementing sophisticated gesture recognition algorithms and assigning specific actions to these gestures, such as grasping, rotating, and resizing 3D models. These interactions mirror real-world manipulations, empowering users to seamlessly explore and modify digital content. Furthermore, gestural menu systems streamline user interactions by offering convenient access to additional features. Rigorous testing and optimization processes ensure precision, responsiveness, and optimal performance, enhancing overall user satisfaction.
10

Labhane, Nishant M., Prashant Harsh, and Meghan Kulkarni. "Multipoint Hand Gesture Recognition For Controlling Bot." International Journal of Scientific Research 1, no. 1 (2012): 46–48. http://dx.doi.org/10.15373/22778179/jun2012/16.

Full text
11

Monisha Sampath, Priyadarshini Velraj, Vaishnavii Raghavendran, and M. Sumithra. "Controlling media player using hand gestures with VLC media player." World Journal of Advanced Research and Reviews 14, no. 3 (2022): 466–72. http://dx.doi.org/10.30574/wjarr.2022.14.3.0565.

Full text
Abstract:
In today's world, everyone opts for instant interaction with complex systems that ensure a quick response. Thus, with increasing improvements in technology, response time and ease of operation are the issues, and this is where human-computer interaction comes into play. This interaction is unrestricted and challenges the usual input devices such as the keyboard and mouse. Gesture recognition has been gaining much attention. Gestures are instinctive and are often utilized in everyday interactions; therefore, communicating with computer systems using gestures creates a whole new trend of interaction. In this project, with the help of computer vision and deep learning techniques, user hand movements (gestures) are used in real time to control the media player. Seven gestures are defined to control media players using hand gestures. The proposed web application permits users to use their local device camera to identify their gesture and execute control over the media player and similar applications (with no extra hardware). It increases performance and makes interaction convenient by letting users control their PC/laptop from a distance.
12

Monisha, Sampath, Priyadarshini Velraj, Raghavendran Vaishnavii, and Sumithra M. "Controlling media player using hand gestures with VLC media player." World Journal of Advanced Research and Reviews 14, no. 3 (2022): 466–72. https://doi.org/10.5281/zenodo.7731775.

Full text
Abstract:
In today's world, everyone opts for instant interaction with complex systems that ensure a quick response. Thus, with increasing improvements in technology, response time and ease of operation are the issues, and this is where human-computer interaction comes into play. This interaction is unrestricted and challenges the usual input devices such as the keyboard and mouse. Gesture recognition has been gaining much attention. Gestures are instinctive and are often utilized in everyday interactions; therefore, communicating with computer systems using gestures creates a whole new trend of interaction. In this project, with the help of computer vision and deep learning techniques, user hand movements (gestures) are used in real time to control the media player. Seven gestures are defined to control media players using hand gestures. The proposed web application permits users to use their local device camera to identify their gesture and execute control over the media player and similar applications (with no extra hardware). It increases performance and makes interaction convenient by letting users control their PC/laptop from a distance.
13

Patil, Omkar, Shreyash Patil, Nikhil Shingade, and Ashish Katkar. "Virtual Mouse System using Hand Gesture recognition with Open CV." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 08, no. 01 (2024): 1–6. http://dx.doi.org/10.55041/ijsrem27906.

Full text
Abstract:
The use of hand gesture recognition for controlling virtual devices has gained popularity with the advancement of artificial intelligence technology. In this paper, we propose a hand gesture-controlled virtual mouse system that employs AI algorithms to identify and interpret hand gestures, translating them into mouse movements. The primary objective of this system is to provide an alternative interface, particularly beneficial for individuals facing challenges with traditional mouse or keyboard usage. Our proposed system utilizes a camera to capture images of the user's hand. To enable gesture recognition, the system undergoes training using a dataset comprising various hand gestures. Once a gesture is identified, it is translated into a corresponding mouse movement, subsequently executed on the virtual screen. An essential feature of our system is its scalability and adaptability, allowing seamless integration into diverse environments and devices. To enhance user interaction, our system incorporates dynamic/static hand gestures along with a voice assistant for virtually controlling all input operations. Notably, our approach employs Machine Learning (ML) and Computer Vision algorithms to recognize hand gestures and voice commands, eliminating the need for additional hardware requirements. The model is implemented using the MediaPipe framework, demonstrating the feasibility of our proposed system. In conclusion, our hand gesture-controlled virtual mouse system presents a promising avenue for improving user experience and accessibility through human-computer interaction. By leveraging AI, ML, and Computer Vision, we aim to create an intuitive and versatile interface that can be easily adapted to different environments, ultimately enhancing accessibility for a wider range of users. Keywords: Computer vision, hand gesture recognition, Media-pipe, virtual mouse.
14

Holder, Sherrie, and Leia Stirling. "Effect of Gesture Interface Mapping on Controlling a Multi-degree-of-freedom Robotic Arm in a Complex Environment." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 64, no. 1 (2020): 183–87. http://dx.doi.org/10.1177/1071181320641045.

Full text
Abstract:
There are many robotic scenarios that require real-time function in large or unconstrained environments, for example, the robotic arm on the International Space Station (ISS). Use of fully wearable gesture control systems is well suited to human-robot interaction scenarios where users are mobile and must have their hands free. A human study examined operation of a simulated ISS robotic arm using three different gesture input mappings compared to the traditional joystick interface. Two gesture mappings permitted multiple simultaneous inputs (multi-input), while the third was a single-input method. Experimental results support performance advantages of multi-input gesture methods over single input. Differences between the two multi-input methods in task completion and workload display an effect of user-directed attention on interface success. Mappings based on natural human arm movement are promising for gesture interfaces in mobile robotic applications. This study also highlights challenges in gesture mapping, including how users align gestures with their body and environment.
15

Congdon, Eliza L., and Susan C. Levine. "Unlocking the Power of Gesture: Using Movement-Based Instruction to Improve First Grade Children’s Spatial Unit Misconceptions." Journal of Intelligence 11, no. 10 (2023): 200. http://dx.doi.org/10.3390/jintelligence11100200.

Full text
Abstract:
Gestures are hand movements that are produced simultaneously with spoken language and can supplement it by representing semantic information, emphasizing important points, or showing spatial locations and relations. Gestures’ specific features make them a promising tool to improve spatial thinking. Yet, there is recent work showing that not all learners benefit equally from gesture instruction and that this may be driven, in part, by children’s difficulty understanding what an instructor’s gesture is intended to represent. The current study directly compares instruction with gestures to instruction with plastic unit chips (Action) in a linear measurement learning paradigm aimed at teaching children the concept of spatial units. Some children performed only one type of movement, and some children performed both: Action-then-Gesture [AG] or Gesture-then-Action [GA]. Children learned most from the Gesture-then-Action [GA] and Action only [A] training conditions. After controlling for initial differences in learning, the gesture-then-action condition outperformed all three other training conditions on a transfer task. While gesture is cognitively challenging for some learners, that challenge may be desirable—immediately following gesture with a concrete representation to clarify that gesture’s meaning is an especially effective way to unlock the power of this spatial tool and lead to deep, generalizable learning.
16

Lorang, Emily, Audra Sterling, and Bianca Schroeder. "Maternal Responsiveness to Gestures in Children With Down Syndrome." American Journal of Speech-Language Pathology 27, no. 3 (2018): 1018–29. http://dx.doi.org/10.1044/2018_ajslp-17-0138.

Full text
Abstract:
Purpose This study compared gesture use in young children with Down syndrome (DS) and typical development (TD) as well as how mothers respond to child gestures based on child age and diagnosis. Method Twenty-two mother–child dyads with DS and 22 mother–child dyads with TD participated. The child participants were between 22 and 63 months and were matched on chronological age. We coded child gesture use and whether mothers recoded child gestures (i.e., provided a verbal translation) during naturalistic interactions. Results The children with DS used more gestures than peers with TD. After controlling for expressive language ability, the two groups were not significantly different on child gesture use. Regardless of child diagnosis, mothers recoded approximately the same percentage of child gestures. There was a significant interaction between child diagnosis and child age when predicting the percentage of maternal gesture recodes; mothers of children with DS did not demonstrate differences in the percentage of maternal gesture recodes based on child age, but there was a negative relationship between the percentage of maternal gesture recodes and child age for the children with TD. Conclusions Young children with DS gesture more than chronological age–matched children with TD, therefore providing numerous opportunities for caregivers to recode child gestures and support language development. Early intervention should focus on increasing parent responsiveness to child gestures earlier in life in order to provide additional word-learning opportunities for children with DS.
17

Yu, Hengcheng, and Zhengyu Chen. "Research on contactless control of elevator based on machine vision." Highlights in Science, Engineering and Technology 7 (August 3, 2022): 89–94. http://dx.doi.org/10.54097/hset.v7i.1022.

Full text
Abstract:
Aiming at the problem of cross-infection caused by public elevator buttons during the COVID-19 epidemic, a non-contact elevator button control gesture recognition system based on machine vision is designed. To improve the detection speed of gesture recognition, Spatial Pyramid Pooling (SPP) was incorporated and the YOLOv5 backbone was replaced with the lightweight ShuffleNetV2 model, yielding an improved YOLOv5_shff algorithm. In testing on the gesture recognition task, the detection speed of the YOLOv5_shff algorithm is 14% higher than the original model, and the detection accuracy is 0.1% higher. With the improved YOLOv5_shff algorithm as its core, a gesture recognition system applicable to elevator button control is designed. The experimental data show that the gesture recognition accuracy for controlling elevator buttons reaches 99.3%, which meets the requirements of non-contact control of public elevators.
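A hedged sketch of the detection loop such a system implies; the paper's YOLOv5_shff weights (ShuffleNetV2 backbone plus SPP) are not available here, so a stock YOLOv5 checkpoint from torch.hub stands in.

```python
import cv2
import torch

# Placeholder weights: a standard YOLOv5s model via torch.hub, not YOLOv5_shff.
model = torch.hub.load('ultralytics/yolov5', 'yolov5s')
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    results = model(rgb)
    detections = results.xyxy[0]  # rows of (x1, y1, x2, y2, conf, class)
    for *box, conf, cls in detections.tolist():
        # A real system would map the detected gesture class to a floor button.
        print(int(cls), round(conf, 2), [round(v) for v in box])
cap.release()
```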
18

Sara, Meghana. "Gesture Controlled Robot for the Disabled using Arduino." International Journal for Research in Applied Science and Engineering Technology 12, no. 4 (2024): 5446–49. http://dx.doi.org/10.22214/ijraset.2024.61159.

Full text
Abstract:
The gesture-controlled robot project presents a novel approach to human-robot interaction, utilizing hand gestures as an intuitive and natural means of control. This project aims to develop a system where users can seamlessly communicate with a robot using simple hand movements. The project consists of two main components: a handheld device for gesture input and a robot equipped with sensors and actuators for movement. Through wireless communication, the handheld device transmits signals encoding the user's hand gestures to the robot's receiver module, which decodes the signals, processes them, and translates them into commands for controlling the robot's movements. Advantages of this system include intuitive control, wireless operation, real-time interaction, enhanced safety, adaptability, educational value, and versatility. However, challenges such as gesture recognition complexity and a limited gesture vocabulary must be addressed to ensure the system's effectiveness. The project has numerous applications, including entertainment, assistive technology, industrial automation, education, home automation, IoT, security, and interactive exhibits. By leveraging gesture-controlled robotics, this project contributes to advancing human-robot interaction technology and fostering innovation in various domains.
19

Valle, Chelsea La, Karen Chenausky, and Helen Tager-Flusberg. "How do minimally verbal children and adolescents with autism spectrum disorder use communicative gestures to complement their spoken language abilities?" Autism & Developmental Language Impairments 6 (January 2021): 239694152110350. http://dx.doi.org/10.1177/23969415211035065.

Full text
Abstract:
Background and aims Prior work has examined how children and adolescents with autism spectrum disorder who are minimally verbal use their spoken language abilities during interactions with others. However, social communication includes other aspects beyond speech. To our knowledge, no studies have examined how minimally verbal children and adolescents with autism spectrum disorder are using their gestural communication during social interactions. Such work can provide important insights into how gestures may complement their spoken language abilities. Methods Fifty minimally verbal children and adolescents with autism spectrum disorder participated (Mage = 12.41 years; 38 males). Gestural communication was coded from the Autism Diagnostic Observation Schedule. Children (n = 25) and adolescents (n = 25) were compared on their production of gestures, gesture–speech combinations, and communicative functions. Communicative functions were also assessed by the type of communication modality — gesture, speech, and gesture–speech — to examine the range of communicative functions across different modalities of communication. To explore the role gestures may play, the relation between speech utterances and gestural production was investigated. Results Analyses revealed that (1) minimally verbal children and adolescents with autism spectrum disorder did not differ in their total number of gestures. The most frequently produced gesture across children and adolescents was a reach gesture, followed by a point gesture (deictic gesture), and then conventional gestures. However, adolescents produced more gesture–speech combinations (reinforcing gesture–speech combinations) and displayed a wider range of communicative functions. (2) Overlap was found in the types of communicative functions expressed across different communication modalities. However, requests were conveyed via gesture more frequently compared to speech or gesture–speech. In contrast, dis/agreeing/acknowledging and responding to a question posed by the conversational partner were expressed more frequently via speech compared to gesture or gesture–speech. (3) The total number of gestures was negatively associated with total speech utterances after controlling for chronological age, receptive communication ability, and nonverbal IQ. Conclusions Adolescents may be employing different communication strategies to maintain the conversational exchange and to further clarify the message they want to convey to the conversational partner. Although overlap occurred in communicative functions across gesture, speech, and gesture–speech, nuanced differences emerged in how often they were expressed across different modalities of communication. Given their speech production abilities, gestures may play a compensatory role for some individuals with autism spectrum disorder who are minimally verbal. Implications Findings underscore the importance of assessing multiple modalities of communication to provide a fuller picture of their social communication abilities. Our results identified specific communicative strengths and areas for growth that can be targeted and expanded upon within gesture and speech to optimize social communication development.
20

Nafde, Yogita. "HAND GESTURE CONTROLLED WHEEL CHAIR." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 08, no. 04 (2024): 1–5. http://dx.doi.org/10.55041/ijsrem30894.

Full text
Abstract:
The project for controlling wheelchairs through gestures offers a complete solution to enhance the mobility and independence of individuals with limited motor abilities using an intuitive gesture control system. This system incorporates essential elements, such as an Arduino Nano microcontroller, MPU (Motion Processing Unit), NRF wireless communication module, 775 DC motors, and a motor driver, to allow for smooth and responsive wheelchair operation.
21

KALLIO, SANNA, JUHA KELA, PANU KORPIPÄÄ, and JANI MÄNTYJÄRVI. "USER INDEPENDENT GESTURE INTERACTION FOR SMALL HANDHELD DEVICES." International Journal of Pattern Recognition and Artificial Intelligence 20, no. 04 (2006): 505–24. http://dx.doi.org/10.1142/s0218001406004776.

Full text
Abstract:
Accelerometer-based gesture recognition facilitates a complementary interaction modality for controlling mobile devices and home appliances. Using gestures for the task of home appliance control requires use of the same device and gestures by different persons, i.e. user independent gesture recognition. The practical application in small embedded low-resource devices also requires high computational performance. The user independent gesture recognition accuracy was evaluated with a set of eight gestures and seven users, with a total of 1120 gestures in the dataset. Twenty-state continuous HMM yielded an average of 96.9% user independent recognition accuracy, which was cross-validated by leaving one user in turn out of the training set. Continuous and discrete five-state HMM computational performances were compared with a reference test in a PC environment, indicating that discrete HMM is 20% faster. Computational performance of discrete five-state HMM was evaluated in an embedded hardware environment with a 104 MHz ARM-9 processor and Symbian OS. The average recognition time per gesture calculated from 1120 gesture repetitions was 8.3 ms. With this result, the computational performance difference between the compared methods is considered insignificant in terms of practical application. Continuous HMM is hence recommended as a preferred method due to its better suitability for a continuous-valued signal, and better recognition accuracy. The results suggest that, according to both evaluation criteria, HMM is feasible for practical user independent gesture control applications in mobile low-resource embedded environments.
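The classification scheme this abstract describes — one continuous-density HMM per gesture class, with a new sequence assigned to the best-scoring model — might be sketched as follows, assuming the hmmlearn library and (T, 3) accelerometer sequences; the 20-state choice follows the abstract, everything else is illustrative.

```python
import numpy as np
from hmmlearn import hmm

def train_models(sequences_by_class, n_states=20):
    """Fit one Gaussian HMM per gesture class."""
    models = {}
    for label, sequences in sequences_by_class.items():
        X = np.concatenate(sequences)          # stacked (T_i, 3) accel frames
        lengths = [len(s) for s in sequences]  # per-sequence lengths for fit()
        m = hmm.GaussianHMM(n_components=n_states, covariance_type='diag')
        m.fit(X, lengths)
        models[label] = m
    return models

def classify(models, sequence):
    """Assign a (T, 3) sequence to the class whose model scores it highest."""
    # score() returns the log-likelihood of the observation sequence.
    return max(models, key=lambda label: models[label].score(sequence))
```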
22

Mahna, Shivanku, Ketan Sethi, and Sravan Ch. "Controlling Mouse using Hand Gesture Recognition." International Journal of Computer Applications 113, no. 5 (2015): 1–4. http://dx.doi.org/10.5120/19819-1652.

Full text
23

Purushothaman, Amritha, and Suja Palaniswamy. "Development of Smart Home Using Gesture Recognition for Elderly and Disabled." Journal of Computational and Theoretical Nanoscience 17, no. 1 (2020): 177–81. http://dx.doi.org/10.1166/jctn.2020.8647.

Full text
Abstract:
The smart home has gained popularity not only as a luxury but also due to its numerous advantages. It is especially useful for senior citizens and children with disabilities. In this work, home automation is achieved using gestures for controlling appliances. Gesture recognition is an area in which a lot of research and innovation is blooming. This paper discusses the development of a wearable device that captures hand gestures. The wearable device uses an accelerometer and gyroscopes to sense and capture tilting, rotation, and acceleration of the hand movement. Four different hand gestures are captured using this wearable device, and a machine learning algorithm, namely the Support Vector Machine, has been used to classify gestures to switch appliances ON/OFF.
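A minimal sketch of the described approach, assuming each recorded gesture window is reduced to simple per-axis statistics before an SVM classifies it; the feature choice and labels are illustrative, not taken from the paper.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def features(window):
    """window: (T, 6) array of ax, ay, az, gx, gy, gz samples -> fixed vector."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def train(windows, labels):
    """windows: list of (T, 6) arrays; labels: e.g. 'light_on', 'fan_off'."""
    X = np.stack([features(w) for w in windows])
    clf = make_pipeline(StandardScaler(), SVC(kernel='rbf'))
    clf.fit(X, labels)
    return clf

# predicted = train(train_windows, train_labels).predict([features(new_window)])
```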
24

Anklesh G, Akash V, Prithivi Sakthi B, and Kanthimathi M. "Hand Gesture Recognition for Video Player." International Research Journal on Advanced Engineering Hub (IRJAEH) 2, no. 04 (2024): 801–5. http://dx.doi.org/10.47392/irjaeh.2024.0112.

Full text
Abstract:
Hand Gesture Recognition (HGR) is an emerging technology that has numerous applications in various fields, including human-computer interaction and robotics. In this project, we developed a Hand Gesture Recognition system for video players that recognizes hand gestures and performs actions such as play, pause, rewind, and fast-forward. The system was developed using the Python programming language and the OpenCV, MediaPipe, and PyAutoGUI libraries. A dataset of hand gestures was created by recording videos using a webcam and annotating the frames with corresponding labels. The Hand Gesture Recognition system achieved an accuracy of 92% on the test set and was able to accurately recognize hand gestures and perform the corresponding actions on a video player. The system has the potential to be used as a novel way to control video players, especially in situations where the user cannot use a mouse or keyboard. In conclusion, the Hand Gesture Recognition system developed in this project provides a promising solution for controlling video players using hand gestures. The system achieved a high level of accuracy in recognizing gestures and performing actions, and can potentially be used in a variety of applications. Further improvements and refinements can be made to the system in the future to make it even more effective and user-friendly.
25

Huu, Phat Nguyen, and Tan Phung Ngoc. "Hand Gesture Recognition Algorithm Using SVM and HOG Model for Control of Robotic System." Journal of Robotics 2021 (June 16, 2021): 1–13. http://dx.doi.org/10.1155/2021/3986497.

Full text
Abstract:
In this study, we propose a gesture recognition algorithm using support vector machines (SVM) and the histogram of oriented gradients (HOG). We also use a CNN model to classify gestures, and we select techniques suited to the problem of controlling a robotic system. The goal of the algorithm is to detect gestures with real-time processing speed, minimize interference, and reduce the ability to capture unintentional gestures. Static gesture controls used in this study include on, off, increase, and decrease; in addition, motion gestures are used, including turning on the status switch and increasing and decreasing the volume. Results show that the algorithm is up to 99% accurate with a 70-millisecond execution time per frame, which is suitable for industrial applications.
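The HOG-plus-SVM stage could be sketched as below, using scikit-image's hog and scikit-learn's SVC; the HOG parameters shown are common defaults, not necessarily the paper's, and all images are assumed to share the same size.

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

def hog_features(gray_images):
    """Compute a HOG descriptor per same-sized grayscale image."""
    return np.stack([
        hog(img, orientations=9, pixels_per_cell=(8, 8),
            cells_per_block=(2, 2))
        for img in gray_images
    ])

clf = SVC(kernel='linear')
# clf.fit(hog_features(train_images), train_labels)
# prediction = clf.predict(hog_features([test_image]))
```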
26

Lu, Cong, Haoyang Zhang, Yu Pei, et al. "Online Hand Gesture Detection and Recognition for UAV Motion Planning." Machines 11, no. 2 (2023): 210. http://dx.doi.org/10.3390/machines11020210.

Full text
Abstract:
Recent advances in hand gesture recognition have produced more natural and intuitive methods of controlling unmanned aerial vehicles (UAVs). However, in unknown and cluttered environments, UAV motion planning requires the assistance of hand gesture interaction in complex flight tasks, which remains a significant challenge. In this paper, a novel framework based on hand gesture interaction is proposed to support efficient and robust UAV flight. A cascading structure, which includes Gaussian Naive Bayes (GNB) and Random Forest (RF), was designed to classify hand gestures based on the Six Degrees of Freedom (6DoF) inertial measurement units (IMUs) of the data glove. The hand gestures were mapped onto the UAV's flight commands, which corresponded to the direction of UAV flight. The experimental results, which tested the 10 evaluated hand gestures, revealed the high accuracy of online hand gesture recognition under asynchronous detection (92%), and relatively low latency for interaction (average recognition time of 7.5 ms; average total time of 3 s). The average time of the UAV's complex flight task was about 8 s shorter than that of synchronous hand gesture detection and recognition. The proposed framework was validated as efficient and robust, with extensive benchmark comparisons in various complex real-world environments.
27

Sharma, Naina, Vaishali Nirgude, Tanya Shah, Chirag Bhagat, Amithesh Gupta, and Yash Gupta. "GESTURE RECOGNITION FOR TOUCH-FREE PC CONTROL USING A NEURAL NETWORK APPROACH." ICTACT Journal on Data Science and Machine Learning 5, no. 4 (2024): 690–97. https://doi.org/10.21917/ijdsml.2024.0142.

Full text
Abstract:
In the pursuit of advancing the field of touch-free human-computer interaction, this paper focuses on developing a gesture-enabled PC control system that aims to enhance user engagement and provide intuitive and flexible control methods across various applications, particularly those benefiting individuals with mobility impairments. The system has expanding potential use in virtual and augmented reality environments. This study describes a unique method for temporal gesture identification that employs gesture kinematics for feature extraction and classification. Real-time hand tracking and key point identification were performed using MediaPipe. The Euclidean distances between the key points were normalised and input into a multilayer perceptron model, which classified the gestures and mapped them to specific commands for controlling PC functions. This approach performed well over a large dataset, improving accuracy and usability. The gesture recognition system achieved an average accuracy of 97%, with precision, recall, and F1 score of 0.924, 0.924, and 0.926, respectively, across the five gestures. The system offers users customization, allowing them to create and map their own gestures to specific commands in addition to using predefined ones. This level of personalization and flexibility is a significant advancement over existing systems, which typically offer fixed gesture-command mappings.
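A hedged reconstruction of the feature pipeline this abstract outlines: pairwise Euclidean distances between MediaPipe hand key points, normalised and fed to a multilayer perceptron. The layer sizes and the normalisation rule are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def distance_features(landmarks):
    """landmarks: (21, 2) array of MediaPipe hand key points -> feature vector."""
    pts = np.asarray(landmarks)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    iu = np.triu_indices(len(pts), k=1)  # keep each unordered pair once
    feats = d[iu]
    return feats / feats.max()           # crude scale-invariant normalisation

clf = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=500)
# clf.fit(np.stack([distance_features(l) for l in train_landmarks]), train_labels)
# command = clf.predict([distance_features(live_landmarks)])
```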
28

RAUTARAY, SIDDHARTH S., and ANUPAM AGRAWAL. "VISION-BASED APPLICATION-ADAPTIVE HAND GESTURE RECOGNITION SYSTEM." International Journal of Information Acquisition 09, no. 01 (2013): 1350007. http://dx.doi.org/10.1142/s0219878913500071.

Full text
Abstract:
With the increasing role of computing devices, facilitating natural human-computer interaction (HCI) will have a positive impact on their usage and acceptance as a whole. For a long time, research on HCI was restricted to techniques based on the use of the keyboard, mouse, etc. Recently, this paradigm has changed: techniques such as vision, sound, and speech recognition allow for a much richer form of interaction between the user and machine, with the emphasis on providing a natural form of interface. Gestures are one of the natural forms of interaction between humans. As gesture commands are found to be natural for humans, the development of gesture control systems for controlling devices has become a popular research topic in recent years. Researchers have proposed different gesture recognition systems which act as an interface for controlling applications. One drawback of present gesture recognition systems is application dependence, which makes it difficult to transfer one gesture control interface to different applications. This paper focuses on designing a vision-based hand gesture recognition system which is adaptive to different applications, thus making the gesture recognition system application-adaptive. The designed system comprises different processing steps such as detection, segmentation, tracking, and recognition. To make the system application-adaptive, different quantitative and qualitative parameters have been taken into consideration. The quantitative parameters include the gesture recognition rate, the features extracted, and the root mean square error of the system, while the qualitative parameters include intuitiveness, accuracy, stress/comfort, computational efficiency, user's tolerance, and real-time performance of the proposed system. These parameters have a vital impact on the performance of the proposed application-adaptive hand gesture recognition system.
29

Thorat, Sakshi. "HCI Based Virtual Controlling System." International Journal for Research in Applied Science and Engineering Technology 10, no. 6 (2022): 630–35. http://dx.doi.org/10.22214/ijraset.2022.43645.

Full text
Abstract:
Researchers around the globe are working on making our devices more interactive and able to function with minimal physical contact. The system proposed in this research project is an interactive computer system that can operate without a physical keyboard or mouse. It will benefit everyone, particularly immobilized people with special needs who have difficulty operating a physical keyboard and mouse. The developed interface uses visual hand-gesture analysis; these gestures assist those who have trouble controlling or operating computers or gadgets. The model is being developed in such a way that it can assist us in recognizing and implementing such gestures. For hand gestures, the interface uses OpenCV, Python, and computer vision algorithms that can detect different finger orientations, distinguish the user's hand from the background, and distinguish significant hand movements from unwanted hand movements. Keywords: Machine Learning, Computer Vision, Image Processing, OpenCV, Human-Computer Interaction (HCI)
30

Chamunorwa, Michael, Mikolaj P. Wozniak, Susanna Krämer, Heiko Müller, and Susanne Boll. "An Empirical Comparison of Moderated and Unmoderated Gesture Elicitation Studies on Soft Surfaces and Objects for Smart Home Control." Proceedings of the ACM on Human-Computer Interaction 7, MHCI (2023): 1–24. http://dx.doi.org/10.1145/3604245.

Full text
Abstract:
Conducting gesture elicitation studies (GES) in personal spaces such as smart homes is crucial to achieving high ecological validity of elicited gestures. However, supervising such studies is considered intrusive and negatively affects the results' quality. The alternative is to conduct unsupervised GES under similar conditions, but more side-by-side comparisons documenting the similarities and differences between both approaches are necessary. Consequently, we need more data describing the preferred approach and whether the differences or similarities in the results are so significant to cause concern. This research distributed a DIY observation kit, which 30 participants assembled and used to propose gestures for controlling elements in a smart living room using a pillow's surface, with and without supervision. Our results show that gestures from supervised and unsupervised studies differ in quantity and max-consensus but not in gesture Agreement Scores. Our results also show that participants preferred conducting unsupervised studies but proposed fewer gesture sets in this condition.
31

Narayanpethkar, Sangamesh. "Computer Vision based Media Control using Hand Gestures." International Journal for Research in Applied Science and Engineering Technology 11, no. 5 (2023): 6642–46. http://dx.doi.org/10.22214/ijraset.2023.52881.

Full text
Abstract:
Hand gestures are a form of nonverbal communication that can be used in several fields, such as communication between deaf-mute people, robot control, human-computer interaction (HCI), home automation, and medical applications. In this day and age, working with a computer in some capacity is a common task, and in most situations the keyboard and mouse are the primary input devices. However, there are several problems associated with excessive usage of the same interaction medium, such as health problems brought on by continuous use of input devices. Humans basically communicate using gestures, and it is indeed one of the best ways to communicate. Gesture-based real-time recognition systems have received great attention in recent years because of their ability to interact with systems efficiently through human-computer interaction. This project implements computer vision and gesture recognition techniques and develops vision-based, low-cost input software for controlling the media player through gestures.
32

Veena, B. "Convolutional Neural Network for Hand Gesture Recognition using 8 different Gestures." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 09, no. 03 (2025): 1–9. https://doi.org/10.55041/ijsrem42394.

Full text
Abstract:
This paper describes an implementation of hand gesture recognition that uses a deep learning algorithm, a convolutional neural network (CNN). The proposed technique is hassle-free, as control is not based on joysticks or switches. Eight hand gesture conditions are considered: up, stop, right, no, left, down, clockwise, and anticlockwise. Many researchers have worked in this area using different sensors, machine learning algorithms, and deep learning algorithms; the limitations of state-of-the-art techniques are studied in this paper, and a new modified convolutional neural network (CNN) is designed for gesture recognition. A dataset was created containing 1000 grayscale images for each type of gesture. Training the modified CNN model gives a prediction accuracy of 98.4024 percent, while a random forest machine learning classifier gives a prediction accuracy of 69 percent. It is observed that the proposed model gives better accuracy than state-of-the-art techniques for hand gesture recognition. The obtained hand gesture class can be sent to a controller for further control tasks in the future, such as home automation or robotic car movement. Keywords: Robot Car Movement, Gesture Recognition, Random Forest, Deep Learning, CNN, layer modification.
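A stand-in Keras CNN consistent with the setup described (grayscale gesture images, eight classes); the paper's own "modified CNN" layers are not specified here, so the architecture and image size below are purely illustrative.

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(64, 64, 1)),        # assumed grayscale image size
    layers.Conv2D(32, 3, activation='relu'),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation='relu'),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation='relu'),
    layers.Dropout(0.3),
    # Eight classes: up, stop, right, no, left, down, clockwise, anticlockwise.
    layers.Dense(8, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
# model.fit(train_images, train_labels, epochs=20, validation_split=0.1)
```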
33

Mwangi, Gerald Gitau, Job Mochengo Kerosi, and Ben Asiago Moywaywa. "Gesture-Driven Home Automation System." IOSR Journal of Electrical and Electronics Engineering 19, no. 6 (2024): 10–15. http://dx.doi.org/10.9790/0853-1906011015.

Full text
Abstract:
This project addresses the critical challenge of making home automation accessible and inclusive for individuals with physical disabilities and visual impairments. Existing home automation methods, such as voice control, mobile apps, and timers, exhibit shortcomings that hinder their effectiveness for this demographic. To overcome these limitations, the project presents the development of a gesture-based home automation system. This system integrates gesture recognition and security functionality. It operates in two stages: user authentication through face recognition and gesture-based appliance control. A Convolutional Neural Network (CNN) processes webcam-captured gestures, with signals relayed to an Arduino via pyFirmata for appliance activation. A model was trained to detect the human hand(s) and, moreover, show the 20 landmarks that MediaPipe offers. The system worked well in controlling various home appliances using gestures. Through this innovative approach, the project contributes to the advancement of accessible smart home technology, fostering independence and inclusivity for users with diverse abilities.
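The actuation step, relaying a recognised gesture to an Arduino through pyFirmata, might look like this sketch; the serial port, pin numbers, and gesture names are assumptions for illustration.

```python
from pyfirmata import Arduino

board = Arduino('/dev/ttyACM0')        # assumed serial port
APPLIANCE_PIN = {'lamp': 7, 'fan': 8}  # hypothetical gesture-to-pin mapping

def actuate(gesture_label, turn_on):
    """Switch the appliance pin mapped to the recognised gesture."""
    pin = APPLIANCE_PIN.get(gesture_label)
    if pin is not None:
        board.digital[pin].write(1 if turn_on else 0)

# actuate('lamp', True)  # e.g. after the CNN recognises the "lamp on" gesture
```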
34

Chye, Kah Kien, Zi Yuan Ong, Jun Kit Chaw, Mei Choo Ang, Marizuana Mat Daud, and Tin Tin Ting. "GESTURE RECOGNITION FOR INTERACTIVE DASHBOARD: A 3D CONVOLUTIONAL NEURAL NETWORKS WITH LONG-SHORT-TERM-MEMORY APPROACH." Journal of Information System and Technology Management 8, no. 32 (2023): 36–52. http://dx.doi.org/10.35631/jistm.832003.

Full text
Abstract:
Gesture recognition technology has become increasingly popular in various fields due to its potential for controlling consumer devices, robotics, and even translating sign language. However, there is still a lack of practical applications in daily life, especially for consumer products. Previous studies primarily focused on gesture recognition and did not delve into exploring the potential applications of gesture recognition in human interaction. This work aims to address this gap by deploying a gesture recognition model into practical applications and exploring its potential in human interaction. The proposed model combines the use of 3D convolutional neural networks (3DCNN) and long-short-term-memory networks (LSTM) for spatiotemporal feature learning. Six common interactive dashboard gestures were extracted from the 20BN-Jester dataset to train the model. The validation accuracy reached 80.47% after ablation studies. Then, two test cases were conducted to evaluate the effectiveness of the model in controlling Spotify and the Artificial Intelligence of Things (AIoT) smart classroom. The response time for each recognition was approximately 100 to 150 milliseconds. The proposed research aims to develop a simple yet efficient model for hand gesture recognition, which has the potential to be applied in various fields beyond just music and smart classrooms. With the growing popularity of gesture recognition technology, this study contributes to the advancement of practical applications and product innovations that can be integrated into daily life, thereby making it more accessible to the general public.
35

Shahil, Mohammed. "GestoMouse: A Real-Time Gesture-Controlled Virtual Mouse System." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 09, no. 06 (2025): 1–9. https://doi.org/10.55041/ijsrem49415.

Full text
Abstract:
The use of hand gesture recognition in controlling virtual devices has become popular due to the advancement of artificial intelligence technology. A hand gesture-controlled virtual mouse system that utilizes AI algorithms to recognize hand gestures and translate them into mouse movements is proposed in this paper. The system is designed to provide an alternative interface for people who have difficulty using a traditional mouse or keyboard. The proposed system uses a camera to capture images of the user's hand, which are processed by an AI algorithm to recognize the gestures being made. The system is trained using a dataset of hand gestures to recognize different gestures. Once a gesture is recognized, it is translated into a corresponding mouse movement, which is then executed on the virtual screen. The system is designed to be scalable and adaptable to different types of environments and devices. All the input operations can be virtually controlled by using dynamic/static hand gestures along with a voice assistant. In our work we make use of ML and Computer Vision algorithms to recognize hand gestures and voice commands, which works without any additional hardware requirements. Key Words: optics, photonics, light, lasers, templates, journals
36

Tawfeeq, Mohammed, and Ayam Abbass. "Control of Robot Directions Based on Online Hand Gestures." Iraqi Journal for Electrical and Electronic Engineering 14, no. 1 (2018): 41–50. http://dx.doi.org/10.37917/ijeee.14.1.5.

Full text
Abstract:
The evolution of wireless communication technology increases human-machine interaction capabilities, especially in controlling robotic systems. This paper introduces an effective wireless system for controlling the directions of a wheeled robot based on online hand gestures. The hand gesture images are captured and processed to be recognized and classified using a neural network (NN). The NN is trained using extracted features to distinguish five different gestures, and accordingly it produces five different signals. These signals are transmitted to control the directions of the cited robot. The main contribution of this paper is that the technique used to recognize hand gestures requires only two features, and these features can be extracted in a very short time using a quite simple methodology, which makes the proposed technique well suited for online interaction. In this methodology, the preprocessed image is partitioned column-wise into two half segments; from each half one feature is extracted. This feature represents the ratio of white to black pixels of the segment histogram. The NN showed very high accuracy in recognizing all of the proposed gesture classes. The NN output signals are transmitted to the robot microcontroller wirelessly using Bluetooth, and the microcontroller accordingly guides the robot in the desired direction. The overall system showed high performance in controlling the robot's movement directions.
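The two-feature extraction this abstract describes — the white-to-black pixel ratio of each half of the binarised gesture image — is simple enough to sketch directly; only the 0/255 binarisation convention is assumed.

```python
import numpy as np

def half_ratios(binary_image):
    """binary_image: 2D array of 0 (black) / 255 (white) -> [left_ratio, right_ratio]."""
    h, w = binary_image.shape
    feats = []
    for half in (binary_image[:, :w // 2], binary_image[:, w // 2:]):
        white = np.count_nonzero(half)
        black = half.size - white
        feats.append(white / max(black, 1))  # guard against division by zero
    return feats  # two features, one per column-wise half, fed to the NN
```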
37

Yaseen, Oh-Jin Kwon, Jaeho Kim, Jinhee Lee, and Faiz Ullah. "Vision-Based Gesture-Driven Drone Control in a Metaverse-Inspired 3D Simulation Environment." Drones 9, no. 2 (2025): 92. https://doi.org/10.3390/drones9020092.

Abstract:
Unlike traditional remote control systems for unmanned aerial vehicles (UAVs) and drones, active research is being carried out in the domain of vision-based hand gesture recognition systems for drone control. However, in contrast to static and sensor-based hand gesture recognition, recognizing dynamic hand gestures is challenging due to the complex nature of the multi-dimensional hand gesture data present in 2D images. In a real-time application scenario, performance and safety are crucial. We therefore propose a hybrid lightweight dynamic hand gesture recognition system and a 3D-simulator-based drone control environment for live simulation. We used transfer-learning-based computer vision techniques to detect dynamic hand gestures in real time. Based on the recognized gestures, predetermined commands are selected and sent to a drone simulation environment that runs on a different computer via a socket connection. Without conventional input devices, hand gesture detection integrated with the virtual environment offers a user-friendly and immersive way to control drone motions, improving user interaction. The efficacy of this technique is illustrated through a variety of test situations, highlighting its potential uses in remote-control systems, gaming, and training. The system is tested and evaluated in real time, outperforming state-of-the-art methods. The code utilized in this study is publicly accessible; further details can be found in the "Data Availability Statement".
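The recognizer and the simulator communicate over a socket; a minimal sketch of that hand-off might look as follows (host, port, and the command vocabulary are illustrative assumptions, not values from the paper):

```python
import socket

COMMANDS = {"up": "ASCEND", "down": "DESCEND", "left": "YAW_LEFT",
            "right": "YAW_RIGHT", "stop": "HOVER"}

def send_command(gesture, host="192.168.0.10", port=5005):
    """Map a recognized gesture label to a drone command and ship it to the
    simulator machine over TCP."""
    command = COMMANDS.get(gesture, "HOVER")  # default to a safe action
    with socket.create_connection((host, port), timeout=1.0) as conn:
        conn.sendall(command.encode("utf-8"))

send_command("up")
```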
38

Vignesh, Selvaraj Nadar, Shubhra Sinha Vaishnavi, and Umesh Ratre Sushila. "Smart Home Automation using Hand Gesture Recognition System." International Journal of Engineering and Advanced Technology (IJEAT) 9, no. 2 (2019): 18–21. https://doi.org/10.35940/ijeat.B3055.129219.

Abstract:
Visual interpretation of hand gestures is a natural method of achieving Human-Computer Interaction (HCI). In this paper, we present an approach to setting up a smart home in which the appliances can be controlled by an implementation of a hand gesture recognition system. More specifically, this recognition system uses transfer learning, a machine learning technique, to successfully distinguish between pre-trained gestures and identify them correctly to control the appliances. The gestures are sequentially identified as commands which are used to actuate the appliances. The proof of concept is demonstrated by controlling a set of LEDs that represent the appliances; these are connected to an Arduino Uno microcontroller, which in turn is connected to the personal computer where the actual gesture recognition is implemented.
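On the PC side, forwarding a classified gesture to the Arduino usually amounts to a one-byte serial write; a sketch assuming pyserial (the port name and command codes are assumptions):

```python
import serial  # pyserial

# Each recognized gesture maps to a code the Arduino sketch would interpret
# to switch the matching LED ("appliance") on or off.
GESTURE_TO_CODE = {"light_on": b"1", "light_off": b"0",
                   "fan_on": b"3", "fan_off": b"2"}

with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as link:
    link.write(GESTURE_TO_CODE["light_on"])
```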
39

Yang, Yuchen. "Dynamic gesture recognition based on compressed video for UAV control." Journal of Physics: Conference Series 2580, no. 1 (2023): 012021. http://dx.doi.org/10.1088/1742-6596/2580/1/012021.

Abstract:
As a multifunctional aircraft with small size, low cost and easy control, UAVs can be used in many fields such as gardening, plant protection, mapping, logistics, and the military. For a more convenient human-machine interaction mode, the user interacts with the UAV in the form of dynamic gestures. The traditional approach is to train convolutional neural networks with images as direct inputs to the system for the purpose of controlling UAVs. However, this approach discards the temporal information between images, resulting in poor training results or over-reliance on computing resources. Therefore, this paper proposes an efficient processing strategy: for simple gesture tasks, an optimized frame-extraction algorithm is adopted for picture-based gesture recognition; for complex gesture tasks, compressed video is used as the system input for video-based gesture recognition. Based on the training results on the action recognition datasets UCF-101 and HMDB-51, the efficiency of compressed video in gesture recognition tasks has been verified, and the approach can be applied to UAV dynamic gesture control.
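For the simple-gesture branch, a uniform frame-extraction policy is the most common baseline; the sketch below assumes that policy (the paper's optimized algorithm may select frames differently):

```python
import cv2
import numpy as np

def sample_frames(video_path, num_frames=16):
    """Decode only num_frames evenly spaced frames instead of the whole clip."""
    cap = cv2.VideoCapture(video_path)
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    indices = np.linspace(0, max(total - 1, 0), num_frames, dtype=int)
    frames = []
    for i in indices:
        cap.set(cv2.CAP_PROP_POS_FRAMES, int(i))
        ok, frame = cap.read()
        if ok:
            frames.append(frame)
    cap.release()
    return frames  # input batch for picture-based gesture recognition
```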
40

Moldovan, Constantin Catalin, and Ionel Staretu. "Real-Time Gesture Recognition for Controlling a Virtual Hand." Advanced Materials Research 463-464 (February 2012): 1147–50. http://dx.doi.org/10.4028/www.scientific.net/amr.463-464.1147.

Abstract:
Object tracking in three-dimensional environments is an area of research that has attracted a lot of attention lately for its potential regarding the interaction between man and machine. Hand gesture detection and recognition, in real time, from a video stream plays a significant role in human-computer interaction and, in current digital image processing applications, represents a difficult task. This paper presents a new method for human hand control in virtual environments that eliminates the need for the external devices currently used for hand motion capture and digitization. A first step in this direction is the detection of the human hand, followed by the detection of gestures and their use to control a virtual hand in a virtual environment.
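A common first step for this kind of markerless hand detection is skin-colour segmentation followed by contour extraction; the sketch below assumes that approach with OpenCV (the paper does not specify its detector, and the HSV thresholds are illustrative):

```python
import cv2
import numpy as np

def detect_hand(frame_bgr):
    """Return the largest skin-coloured contour as the hand candidate."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 30, 60], np.uint8),
                       np.array([20, 150, 255], np.uint8))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else None
```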
41

Selvakumar, S. "A Gesture Based Interface for Robot Interaction and Control." Journal of Signal Processing 4, no. 3 (2018): 33–38. https://doi.org/10.5281/zenodo.2222243.

Abstract:
Gesture recognition is a technology aimed at interpreting human gestures with the help of mathematical algorithms. In general, consumer electronic equipment uses remote controls for user interfaces. Replacing the remote control system with hand gestures is an innovative user interface that resolves the complications in the use of remote controls for domestic appliances. The proposed model deals with using hand gestures to perform the essential controls in robot motion control. This kind of gesture-based user interface has the advantages of ease of access and natural human-machine interaction. Gestures are a natural form of communication and are easy to learn. However, using gestures to control robot motion requires a gesture recognition algorithm and adequate hardware to support it.
42

Gentilucci, Maurizio, Elisa De Stefani, and Alessandro Innocenti. "From Gesture to Speech." Biolinguistics 6, no. 3-4 (2012): 338–53. http://dx.doi.org/10.5964/bioling.8925.

Abstract:
One of the major problems concerning the evolution of human language is to understand how sounds became associated with meaningful gestures. It has been proposed that the circuit controlling gestures and speech evolved from a circuit involved in the control of arm and mouth movements related to ingestion. This circuit contributed to the evolution of spoken language, moving from a system of communication based on arm gestures. The discovery of mirror neurons has provided strong support for the gestural theory of speech origin because they offer a natural substrate for the embodiment of language and create a direct link between the sender and receiver of a message. Behavioural studies indicate that manual gestures are linked to mouth movements used for syllable emission. Grasping with the hand selectively affected movement of the inner or outer parts of the mouth according to syllable pronunciation, and hand postures, in addition to hand actions, influenced the control of mouth grasp and vocalization. Gestures and words are also related to each other. It was found that when producing communicative gestures (emblems), the intention to interact directly with a conspecific was transferred from gestures to words, inducing modifications in voice parameters. Transfer effects of the meaning of representational gestures were found on both vocalizations and meaningful words. It is concluded that the results of these studies suggest the existence of a system relating gesture to vocalization which was a precursor of a more general system reciprocally relating gesture to word.
43

Barrett, Anna M., Anne L. Foundas, and Kenneth M. Heilman. "Speech and gesture are mediated by independent systems." Behavioral and Brain Sciences 28, no. 2 (2005): 125–26. http://dx.doi.org/10.1017/s0140525x05220034.

Abstract:
Arbib suggests that language emerged in direct relation to manual gestural communication, that Broca's area participates in producing and imitating gestures, and that emotional facial expressions contributed to gesture-language coevolution. We discuss functional and structural evidence supporting localization of the neuronal modules controlling limb praxis, speech and language, and emotional communication. Current evidence supports completely independent limb praxis and speech/language systems.
44

Fuad, Muhammad, Faikul Umam, Sri Wahyuni, et al. "Towards Controlling Mobile Robot Using Upper Human Body Gesture Based on Convolutional Neural Network." Journal of Robotics and Control (JRC) 4, no. 6 (2023): 856–67. http://dx.doi.org/10.18196/jrc.v4i6.20399.

Abstract:
Human-Robot Interaction (HRI) faces challenges in the investigation of nonverbal and natural interaction. This study contributes a gesture recognition system capable of recognizing the entire human upper body for HRI, which has not been done in previous research. Preprocessing is applied to improve image quality, reduce noise and highlight important features of each image, comprising color segmentation, thresholding and resizing. The hue, saturation, value (HSV) color segmentation is executed by utilizing a blue backdrop and additional lighting to deal with illumination issues. Thresholding is then performed to obtain a black-and-white image that distinguishes background from foreground, and resizing adjusts the image to match the size expected by the model. The preprocessed image is used as input for gesture recognition based on a Convolutional Neural Network (CNN). This study recorded five gestures from five research subjects of different genders and body postures, with a total of 450 images divided into 380 for training and 70 for testing. Experiments performed in an indoor environment showed that the CNN achieved 92% accuracy in gesture recognition, a lower level of accuracy than the AlexNet model but with a faster training time of 9 seconds. This result was obtained by testing the system over various distances; the optimal distance from the camera for a user to interact with the mobile robot by gesture was 2.5 m. In future research, the proposed method will be improved and implemented for mobile robot motion control.
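The described preprocessing chain (HSV segmentation against a blue backdrop, thresholding, resizing) translates almost directly into OpenCV calls; in this sketch the HSV range for the blue backdrop and the target size are assumptions:

```python
import cv2
import numpy as np

def preprocess(frame_bgr, size=(64, 64)):
    """Segment out the blue backdrop, binarize, and resize for the CNN."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    backdrop = cv2.inRange(hsv, np.array([100, 80, 40], np.uint8),
                           np.array([130, 255, 255], np.uint8))
    foreground = cv2.bitwise_not(backdrop)  # everything that is not backdrop
    _, binary = cv2.threshold(foreground, 127, 255, cv2.THRESH_BINARY)
    return cv2.resize(binary, size)  # black-and-white CNN input
```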
45

Rana, Harsh Ganpatbhai, and Aayushi Sanjay Soni. "An IoT based Game Controlling Device (JoyGlove)." International Journal on Recent and Innovation Trends in Computing and Communication 8, no. 4 (2020): 09–11. http://dx.doi.org/10.17762/ijritcc.v8i4.5366.

Abstract:
Nowadays, there is a huge inclination among youngsters and adults towards video games, and of course everyone has experienced them in childhood. However, we usually control games using typical input devices such as a mouse, keyboard, or joystick; what if we could control the game using our hand gestures? These days there are many game controllers that enhance the gaming experience, but they are quite expensive. Through this project we have designed our own game-controlling glove using Arduino. In addition, we have developed a car game using Unity 3D. The project explores the areas of IoT and hand gesture recognition.
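On the PC side, such a glove typically streams accelerometer readings over serial, which are then turned into key presses for the game; a sketch assuming pyserial and PyAutoGUI (port, message format, and thresholds are all assumptions):

```python
import serial   # pyserial
import pyautogui

with serial.Serial("COM3", 9600, timeout=1) as glove:
    while True:
        line = glove.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        x, y = (int(v) for v in line.split(","))  # e.g. "512,300" from Arduino
        if x < 400:
            pyautogui.press("left")    # tilt left steers left
        elif x > 600:
            pyautogui.press("right")   # tilt right steers right
        if y > 600:
            pyautogui.press("up")      # tilt forward accelerates
```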
46

Yang, Geng, Honghao Lv, Feiyu Chen, et al. "A Novel Gesture Recognition System for Intelligent Interaction with a Nursing-Care Assistant Robot." Applied Sciences 8, no. 12 (2018): 2349. http://dx.doi.org/10.3390/app8122349.

Abstract:
The expansion of nursing-care assistant robots in smart infrastructure has provided more applications for homecare services, which has raised new demands for smart and natural interaction between humans and robots. This article proposes an innovative hand motion trajectory (HMT) gesture recognition system based on background velocity features. A new wearable wrist-worn camera prototype for gesture video collection was designed, and a new method for the segmentation of continuous gestures is presented. Meanwhile, a nursing-care assistant robot prototype was designed for assisting the elderly; it is capable of carrying the elderly with omnidirectional motion and grabbing a specified object at home. In order to evaluate the performance of the gesture recognition system, 10 special gestures were defined as move commands for interaction with the robot, and 1000 HMT gesture samples were obtained from five subjects for leave-one-subject-out (LOSO) cross-validation classification, with an average recognition accuracy of up to 97.34%. Moreover, the performance and practicability of the proposed system were further demonstrated by controlling the omnidirectional movement of the nursing-care assistant robot using the predefined gesture commands.
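Leave-one-subject-out evaluation is easy to misread, so a small sketch may help: each of the five subjects is held out once as the test set. The feature matrix, labels, and classifier below are placeholders, not the paper's pipeline:

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.svm import SVC

X = np.random.rand(1000, 32)           # stand-in for 1000 HMT feature vectors
y = np.random.randint(0, 10, 1000)     # ten gesture classes
groups = np.repeat(np.arange(5), 200)  # subject id per sample, 200 each

scores = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    clf = SVC().fit(X[train_idx], y[train_idx])
    scores.append(clf.score(X[test_idx], y[test_idx]))
print(f"mean LOSO accuracy: {np.mean(scores):.2%}")
```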
47

Oppenheim, Matthew. "HeadBanger: controlling switchable software with head gesture." Journal of Assistive Technologies 10, no. 1 (2016): 2–10. http://dx.doi.org/10.1108/jat-04-2015-0015.

Abstract:
Purpose – The purpose of this paper is to present a novel non-contact method of using head movement to control software without the need for wearable devices. Design/methodology/approach – A webcam and software are used to track head position. When the head is moved through a virtual target, a keystroke is simulated. The system was assessed by participants with impaired mobility using Sensory Software’s Grid 2 software as a test platform. Findings – The target user group could effectively use this system to interact with switchable software. Practical implications – Physical head switches could be replaced with virtual devices, reducing fatigue and dissatisfaction. Originality/value – Using a webcam to control software using head gestures where the participant does not have to wear any specialised technology or a marker. This system is shown to be of benefit to motor impaired participants for operating switchable software.
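The core mechanism (track the head, simulate a keystroke when it crosses a virtual target) can be sketched in a few lines. This version assumes an OpenCV Haar-cascade face tracker and PyAutoGUI for the keystroke; the paper's own tracker, target geometry, and key binding are not specified here:

```python
import cv2
import pyautogui

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)
TARGET_X, was_left = 400, True  # virtual target: a vertical line at x=400

while True:
    ok, frame = cap.read()
    if not ok:
        break
    faces = cascade.detectMultiScale(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    for (x, y, w, h) in faces[:1]:
        centre = x + w // 2
        if was_left and centre > TARGET_X:  # head moved through the target
            pyautogui.press("space")        # simulated switch press
        was_left = centre <= TARGET_X
    if cv2.waitKey(1) & 0xFF == 27:
        break
cap.release()
```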
48

Gorokhov, D.B., and V.S. Kuznetsova. "Gesture Approach to Controlling Augmented Reality Objects." Issues of Social-Economic Development of Siberia 58, no. 4 (2024): 39–45. https://doi.org/10.18324/2224-1833-2024-3-39-45.

49

Shukla, Anita, Ankit Jain, Prajjwal Mishra, and Rahul Kushwaha. "Human Gesture Controlled Car Robot." SAMRIDDHI : A Journal of Physical Sciences, Engineering and Technology 11, no. 02 (2019): 115–22. http://dx.doi.org/10.18090/samriddhi.v11i02.5.

Abstract:
In the present era of development and growth, technology has made it possible for people to operate electronic devices more conveniently. We are now able to operate machines without touching them, with the help of a technology called hand gesture recognition. Here we have devised a gesture-controlled car robot which uses a PIC16F738 microcontroller and an accelerometer to achieve human-computer interaction. This paper deals with the development and implementation of a wireless, accelerometer-based, hand-gesture-controlled car robot using an RF transmitter and receiver. The transmitter detects the movement of the hand and sends a command to the receiver by RF; the receiver receives the command and moves the robot accordingly. Apart from the conventional approach of controlling car robots via buttons, we have developed a definite and effective algorithm for identification of the gestures.
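The transmitter-side gesture identification reduces to mapping accelerometer axes onto five movement commands; a sketch of that mapping with assumed thresholds (the paper's calibration and command set are not reproduced here):

```python
def gesture_to_command(ax, ay, dead_zone=0.3):
    """Map normalised hand-tilt readings to a car-robot command."""
    if ay > dead_zone:
        return "FORWARD"
    if ay < -dead_zone:
        return "BACKWARD"
    if ax > dead_zone:
        return "RIGHT"
    if ax < -dead_zone:
        return "LEFT"
    return "STOP"  # hand held level inside the dead zone

assert gesture_to_command(0.0, 0.8) == "FORWARD"
```

The dead zone keeps small unintentional tilts from moving the robot.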
50

Zammit, Maria, and Graham Schafer. "Maternal label and gesture use affects acquisition of specific object names." Journal of Child Language 38, no. 1 (2010): 201–21. http://dx.doi.org/10.1017/s0305000909990328.

Abstract:
Ten mothers were observed prospectively, interacting with their infants aged 0;10 in two contexts (picture description and noun description). Maternal communicative behaviours were coded for volubility, gestural production and labelling style. Verbal labelling events were categorized into three exclusive categories: label only; label plus deictic gesture; label plus iconic gesture. We evaluated the predictive relations between maternal communicative style and children's subsequent acquisition of ten target nouns. Strong relations were observed between maternal communicative style and children's acquisition of the target nouns. Further, even controlling for maternal volubility and maternal labelling, maternal use of iconic gestures predicted the timing of acquisition of nouns in comprehension. These results support the proposition that maternal gestural input facilitates linguistic development, and suggest that such facilitation may be a function of gesture type.