Academic literature on the topic 'Gestures recognition system human motions'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Gestures recognition system human motions.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Gestures recognition system human motions"

1

Liang, Xiubo, Zhen Wang, Weidong Geng, and Franck Multon. "A Motion-based User Interface for the Control of Virtual Humans Performing Sports." International Journal of Virtual Reality 10, no. 3 (2011): 1–8. http://dx.doi.org/10.20870/ijvr.2011.10.3.2815.

Full text
Abstract:
Traditional human-computer interfaces are not intuitive and natural for the choreography of human motions in the field of VR and video games. In this paper we present a novel approach to control virtual humans performing sports with a motion-based user interface. The process begins by asking the user to draw some gestures in the air with a Wii Remote. The system then recognizes the gestures with pre-trained hidden Markov models. Finally, the recognized gestures are employed to choreograph the simulated sport motions of a virtual human. The average recognition rate of the recognition algorithm
APA, Harvard, Vancouver, ISO, and other styles
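The classification step in the abstract above — scoring a quantized gesture sequence against per-gesture hidden Markov models and picking the most likely one — can be sketched with a minimal discrete-HMM forward pass. The toy models and symbol sequences below are invented for illustration, not the paper's trained parameters:

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of an observation-symbol sequence under a discrete HMM.
    pi: (S,) initial state probs; A: (S, S) transitions; B: (S, K) emissions."""
    alpha = pi * B[:, obs[0]]          # forward variable at t = 0
    log_p = np.log(alpha.sum())
    alpha /= alpha.sum()               # scale to avoid numerical underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate through A, then emit symbol o
        c = alpha.sum()
        log_p += np.log(c)
        alpha /= c
    return log_p

def classify(obs, models):
    """Pick the gesture whose HMM assigns the sequence the highest likelihood."""
    return max(models, key=lambda g: forward_log_likelihood(obs, *models[g]))

# Two toy 2-state HMMs over 3 quantized stroke symbols (0, 1, 2).
models = {
    "circle": (np.array([0.9, 0.1]),
               np.array([[0.8, 0.2], [0.2, 0.8]]),
               np.array([[0.7, 0.2, 0.1], [0.1, 0.2, 0.7]])),
    "slash":  (np.array([0.5, 0.5]),
               np.array([[0.5, 0.5], [0.5, 0.5]]),
               np.array([[0.1, 0.8, 0.1], [0.1, 0.8, 0.1]])),
}
print(classify([0, 0, 2, 2], models))   # symbol run favoured by the "circle" model
print(classify([1, 1, 1, 1], models))   # symbol run favoured by the "slash" model
```

In a full system each per-gesture HMM would first be trained (e.g. with Baum-Welch) on quantized Wii Remote accelerometer strokes; only the max-likelihood decision rule is shown here.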
2

R. Vasavi, Rahul Nenavath, Snigdha A., Jeffery Moses K., and Simha S. Vishal. "Painting with Hand Gestures using MediaPipe." International Journal of Innovative Science and Research Technology 7, no. 12 (2023): 1285–91. https://doi.org/10.5281/zenodo.7514430.

Full text
Abstract:
The main objective of this project is that hand gesture recognition can also be utilised in applications including industrial automation control, sign language interpretation, and rehabilitation equipment for individuals with physical disabilities of the upper extremities. It also finds applications in varied domains like virtual environments, medical systems, smart surveillance, etc. Hand gesture recognition is most significant for human-computer interaction. Gesture recognition is a technique that uses mathematical algorithms to recognise human gestures. Gesture recognition identifi
APA, Harvard, Vancouver, ISO, and other styles
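MediaPipe's hand tracker reports 21 landmarks per hand, with x and y normalized to [0, 1] (landmark 8 is the index fingertip). A minimal sketch of the painting idea, with a hand-written landmark list standing in for a real MediaPipe detection:

```python
import numpy as np

INDEX_TIP = 8  # MediaPipe hand-landmark index for the index fingertip

def paint(canvas, landmarks, radius=2, value=255):
    """Stamp a dot on `canvas` at the index fingertip position.
    `landmarks` is a list of (x, y) pairs normalized to [0, 1],
    mirroring MediaPipe's landmark convention."""
    h, w = canvas.shape
    x, y = landmarks[INDEX_TIP]
    cx, cy = int(x * (w - 1)), int(y * (h - 1))  # map to pixel coordinates
    ys, xs = np.ogrid[:h, :w]
    canvas[(xs - cx) ** 2 + (ys - cy) ** 2 <= radius ** 2] = value
    return canvas

canvas = np.zeros((64, 64), dtype=np.uint8)
fake_landmarks = [(0.0, 0.0)] * 21          # fabricated stand-in detection
fake_landmarks[INDEX_TIP] = (0.5, 0.5)      # fingertip at frame centre
paint(canvas, fake_landmarks)
print(int(canvas[31, 31]))                  # pixel under the fingertip is painted
```

In the real application the landmark list would come from `mediapipe`'s Hands solution frame by frame, so successive fingertip dots trace a stroke on the canvas.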
3

Li, Jiabiao, Ruiheng Liu, Tianyu Zhang, and Jianbin Liu. "A Symmetrical Leech-Inspired Soft Crawling Robot Based on Gesture Control." Biomimetics 10, no. 1 (2025): 35. https://doi.org/10.3390/biomimetics10010035.

Full text
Abstract:
This paper presents a novel soft crawling robot controlled by gesture recognition, aimed at enhancing the operability and adaptability of soft robots through natural human–computer interactions. The Leap Motion sensor is employed to capture hand gesture data, and Unreal Engine is used for gesture recognition. Using the UE4Duino, gesture semantics are transmitted to an Arduino control system, enabling direct control over the robot’s movements. For accurate and real-time gesture recognition, we propose a threshold-based method for static gestures and a backpropagation (BP) neural network model f
APA, Harvard, Vancouver, ISO, and other styles
4

Valarezo Añazco, Edwin, Seung Ju Han, Kangil Kim, Patricio Rivera Lopez, Tae-Seong Kim, and Sangmin Lee. "Hand Gesture Recognition Using Single Patchable Six-Axis Inertial Measurement Unit via Recurrent Neural Networks." Sensors 21, no. 4 (2021): 1404. http://dx.doi.org/10.3390/s21041404.

Full text
Abstract:
Recording human gestures from a wearable sensor produces valuable information to implement control gestures or in healthcare services. The wearable sensor is required to be small and easily worn. Advances in miniaturized sensor and materials research produce patchable inertial measurement units (IMUs). In this paper, a hand gesture recognition system using a single patchable six-axis IMU attached at the wrist via recurrent neural networks (RNN) is presented. The IMU comprises IC-based electronic components on a stretchable, adhesive substrate with serpentine-structured interconnections. The p
APA, Harvard, Vancouver, ISO, and other styles
5

Tabassum, Shaheen, and Raghavendra R. "Sign Language Recognition and Converting into Text." International Journal for Research in Applied Science and Engineering Technology 10, no. 4 (2022): 1385–90. http://dx.doi.org/10.22214/ijraset.2022.41266.

Full text
Abstract:
Sign language is a mode of communication that uses a variety of hand movements and actions to convey a message. Deciphering these motions might be a pattern recognition challenge. People use a range of gestures and behaviours to communicate with one another. This study is a system for a human-computer interface that can identify American Sign Language gestures and produce textual output that reflects the meaning of the gesture. To identify and learn gestures, the proposed system would employ convolutional neural networks and long short-term memory networks. This will help to break dow
APA, Harvard, Vancouver, ISO, and other styles
6

Kodulkar, Prof Rajeshwari J. "Deep Neural Networks for Human Action Recognition." International Journal for Research in Applied Science and Engineering Technology 9, no. 9 (2021): 2141–44. http://dx.doi.org/10.22214/ijraset.2021.38206.

Full text
Abstract:
In deep neural networks, human action detection is one of the most demanding and complex tasks. Human gesture recognition is the same as human action recognition. A gesture is defined as a series of bodily motions that communicate a message. Gestures are a more natural and preferable way for humans to engage with computers, thereby bridging the gap between humans and robots. The finest communication platform for the deaf and dumb is human action recognition. We propose in this work to create a system for hand gesture identification that recognizes hand movements, hand characteristics s
APA, Harvard, Vancouver, ISO, and other styles
7

Mira, Anwar, and Olaf Hellwich. "Building an Integrative System: Enriching Gesture Language Video Recognition through the Multi-Stream of Hybrid and Improved Deep Learning Models with an Adaptive Decision System." Inteligencia Artificial 27, no. 74 (2024): 181–213. http://dx.doi.org/10.4114/intartif.vol27iss74pp181-213.

Full text
Abstract:
The recognition of hand gestures is of growing importance in developing human-machine interfaces that rely on hand motions for communication. However, recognizing hand gesture motions poses challenges due to overlapping gestures from different categories that share similar hand poses. Temporal information has proven to be more effective in distinguishing sequences of hand gestures. To address these challenges, this research presents an innovative adaptive decision-making system that aims to enhance gesture recognition within the identical category. The system capitalizes on
APA, Harvard, Vancouver, ISO, and other styles
8

Reddy, Cherukupally Karunakar, Suraj Janjirala, and Kevulothu Bhanu Prakash. "Gesture Controlled Virtual Mouse with the Support of Voice Assistant." International Journal for Research in Applied Science and Engineering Technology 10, no. 6 (2022): 2314–20. http://dx.doi.org/10.22214/ijraset.2022.44323.

Full text
Abstract:
This work offers a cursor control system that utilises a web cam to capture human movements and a voice assistant to quickly traverse system controls. Using MediaPipe, the system lets the user navigate the computer cursor with their hand motions. It uses various hand motions to conduct activities such as left click and dragging. It also allows you to choose numerous items, adjust the volume, and adjust the brightness. Advanced Python libraries such as MediaPipe and OpenCV are used to build the system. A hand gesture and a voice assistant are used to physically control all i/o oper
APA, Harvard, Vancouver, ISO, and other styles
9

Kulkarni, Padmaja Vivek, Boris Illing, Bastian Gaspers, Bernd Brüggemann, and Dirk Schulz. "Mobile manipulator control through gesture recognition using IMUs and Online Lazy Neighborhood Graph search." ACTA IMEKO 8, no. 4 (2019): 3. http://dx.doi.org/10.21014/acta_imeko.v8i4.677.

Full text
Abstract:
Gesture-based control potentially eliminates the need for wearisome physical controls and facilitates easy interaction between a human and a robot. At the same time, it is intuitive and enables a natural means of control. In this paper, we present and evaluate a framework for gesture recognition using four wearable Inertial Measurement Units (IMUs) to indirectly control a mobile robot. Six gestures involving different hand and arm motions are defined. A novel algorithm based on an Online Lazy Neighborhood Graph (OLNG) search is used to recognise and classify the gestures online. A software fra
APA, Harvard, Vancouver, ISO, and other styles
10

Aarthy, S. L., V. Malathi, Monia Hamdi, Inès Hilali-Jaghdam, Sayed Abdel-Khalek, and Romany F. Mansour. "Recognition of Hand Gesture Using Electromyography Signal: Human-Robot Interaction." Journal of Sensors 2022 (July 11, 2022): 1–9. http://dx.doi.org/10.1155/2022/4718684.

Full text
Abstract:
Recognition of hand gestures has been developed in various research domains and proven to have significant benefits in improving the necessity of human-robot interaction (HRI). The introduction of intelligent statistics knowledge methodologies, such as big data and machine learning, has ushered in a new era of data science and made it easier to classify hand motions accurately using electromyography (EMG) signals. However, collecting and labelling such a vast dataset imposes a significant workload, so implementations take a long time. As a result, a unique strategy for combinin
APA, Harvard, Vancouver, ISO, and other styles
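EMG gesture classifiers of this kind are typically fed windowed time-domain features rather than raw signals. A sketch of three classic features (mean absolute value, root mean square, zero crossings) — illustrative, not necessarily the feature set used in this paper:

```python
import numpy as np

def emg_features(window):
    """Classic time-domain features for one EMG analysis window."""
    w = np.asarray(window, dtype=float)
    mav = float(np.mean(np.abs(w)))          # mean absolute value
    rms = float(np.sqrt(np.mean(w ** 2)))    # root mean square
    # zero crossings: count sign changes between consecutive samples
    zc = int(np.sum(np.signbit(w[:-1]) != np.signbit(w[1:])))
    return mav, rms, zc

# A toy alternating window: every consecutive pair changes sign.
mav, rms, zc = emg_features([1.0, -1.0, 1.0, -1.0])
print(mav, rms, zc)   # 1.0 1.0 3
```

In practice these features are computed over sliding windows (e.g. 200 ms with 50% overlap) per EMG channel and concatenated into the classifier's input vector.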
More sources

Dissertations / Theses on the topic "Gestures recognition system human motions"

1

Kolesnik, Paul. "Conducting gesture recognition, analysis and performance system." Thesis, McGill University, 2004. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=81499.

Full text
Abstract:
A number of conducting gesture analysis and performance systems have been developed over the years. However, most of the previous projects either primarily concentrated on tracking tempo and amplitude indicating gestures, or implemented individual mapping techniques for expressive gestures that varied from research to research. There is a clear need for a uniform process that could be applied toward analysis of both indicative and expressive gestures. The proposed system provides a set of tools that contain extensive functionality for identification, classification and performance with
APA, Harvard, Vancouver, ISO, and other styles
2

Bernard, Arnaud Jean Marc. "Human computer interface based on hand gesture recognition." Thesis, Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/42748.

Full text
Abstract:
With the improvement of multimedia technologies such as broadband-enabled HDTV, video on demand and internet TV, the computer and the TV are merging to become a single device. Moreover the previously cited technologies as well as DVD or Blu-ray can provide menu navigation and interactive content. The growing interest in video conferencing led to the integration of the webcam in different devices such as laptops, cell phones, and even TV sets. Our approach is to directly use an embedded webcam to remotely control a TV set using hand gestures. Using specific gestures, a user is able to control
APA, Harvard, Vancouver, ISO, and other styles
3

Rådell, Dennis. "Combining Eye Tracking and Gestures to Interact with a Computer System." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-205308.

Full text
Abstract:
Eye tracking and gestures are relatively new input methods, changing the way humans interact with computers. Gestures can be used for games or controlling a computer through an interface. Eye tracking is another way of interacting with computers, often combined with other inputs such as a mouse or touch pad. Gestures and eye tracking have been used in commercially available products, but seldom combined to create a multimodal interaction. This thesis presents a prototype which combines eye tracking with gestures to interact with a computer. To accomplish this, the report investigates diffe
APA, Harvard, Vancouver, ISO, and other styles
4

Lam, Benny, and Jakob Nilsson. "Creating Good User Experience in a Hand-Gesture-Based Augmented Reality Game." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-156878.

Full text
Abstract:
The dissemination of new innovative technology requires feasibility and simplicity. The problem with marker-based augmented reality is similar to glove-based hand gesture recognition: they both require an additional component to function. This thesis investigates the possibility of combining markerless augmented reality together with appearance-based hand gesture recognition by implementing a game with good user experience. The methods employed in this research consist of a game implementation and a pre-study meant for measuring interactive accuracy and precision, and for deciding upon which g
APA, Harvard, Vancouver, ISO, and other styles
5

Renteria Bustamante, Leonardo Fabian, and Pietro Pantano. "A machine learning system for developing a Human-Robot interface for automatic facial emotions and hand gestures recognition." Thesis, 2017. http://hdl.handle.net/10955/1313.

Full text
Abstract:
'Archimede' Doctorate in Sciences, Communication and Technologies, Cycle XXVIII, academic year 2015-2016. Emotions are an essential part of people's lives because they help us to make decisions, communicate and somehow understand each other. Additionally, facial emotion perception plays an important role in various fields of psychology, neuroscience, computational intelligence and robotics. In recent years, robots have stopped being used as simple machines dedicated to carrying out repetitive, difficult and dangerous jobs. Thanks to rapid technological advances and the emergence of more powerful computer
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Gestures recognition system human motions"

1

Świtoński, Adam, Bartosz Piórkowski, Henryk Josiński, Konrad Wojciechowski, and Aldona Drabik. "Recognition of Human Gestures Represented by Depth Camera Motion Sequences." In New Trends in Intelligent Information and Database Systems. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-16211-9_4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Ferreira, António, Guilherme Silva, André Dias, Alfredo Martins, and Aurélio Campilho. "Motion Descriptor for Human Gesture Recognition in Low Resolution Images." In Advances in Intelligent Systems and Computing. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-27146-0_23.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Wilkowski, A. "HMM-Based System for Recognizing Gestures in Image Sequences and Its Application in Continuous Gesture Recognition." In Human-Computer Systems Interaction. Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-03202-8_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Arbib, Michael. "Chapter 1. Pantomime within and beyond the evolution of language." In Perspectives on Pantomime. John Benjamins Publishing Company, 2024. http://dx.doi.org/10.1075/ais.12.01arb.

Full text
Abstract:
The core of the paper is a critique of the role of pantomime in the author’s theory (the Mirror System Hypothesis, MSH, itself evolving) of the biocultural evolution that led to human brains that were “language ready” long before humans developed languages. We argue that the notion of “ad hoc” pantomime posited there should be modified to a notion of “ur-pantomime” in which pantomimes are somewhat ritualized by individual users but not yet conventionalized by the group. We extend this to offer a taxonomy of pantomime, with the above forms distinguished from both pantomime exhibited by apes and
APA, Harvard, Vancouver, ISO, and other styles
5

Singh, Aradhana Kumari. "Hidden Markov Model for Gesture Recognition." In Challenges and Applications for Hand Gesture Recognition. IGI Global, 2022. http://dx.doi.org/10.4018/978-1-7998-9434-6.ch006.

Full text
Abstract:
In this chapter, the hidden Markov model (HMM) is applied to gesture recognition. The study comprises the design, implementation, and experimentation of a system for making gestures, training HMMs, and identifying gestures with HMMs, in order to better understand the behaviour of hidden Markov models. One person extends his flattened, vertical palm toward the other, as though to reassure the other that his hands are safe. The other individual smiles and responds in kind. This wave gesture has been associated with friendship from childhood. Human motions can be thought of as a pattern
APA, Harvard, Vancouver, ISO, and other styles
6

Babu, C. V. Suresh, Rithika Purushothaman, K. Anusha, and Shri Sakthi. "Smart Gesture Controlled Systems Using IoT." In Advances in Medical Technologies and Clinical Practice. IGI Global, 2023. http://dx.doi.org/10.4018/978-1-6684-8938-3.ch006.

Full text
Abstract:
A new technology called a smart gesture-controlled system enables users to interact with devices and systems by moving their hands or body. It interprets specific user motions into commands that control the system or device utilizing sensors, machine learning algorithms, and software interfaces. This technology has the potential to alter how we interact with technology by making it more approachable to a wider range of users. Gesture recognition systems have been used in human-computer interaction for many years. On the other hand, recent developments in sensors and machine learning algorithms
APA, Harvard, Vancouver, ISO, and other styles
7

Jaén-Vargas, M., K. Reyes Leiva, F. Fernandes, et al. "A Deep Learning Approach to Recognize Human Activity Using Inertial Sensors and Motion Capture Systems." In Frontiers in Artificial Intelligence and Applications. IOS Press, 2021. http://dx.doi.org/10.3233/faia210196.

Full text
Abstract:
Human Activity Recognition (HAR) plays an important role in behavior analysis, video surveillance, gestures recognition, gait analysis, and posture recognition. Given the recent progress of Artificial Intelligence (AI) applied to HAR, the inputs that are the data from wearable sensors can be treated as time-series from which movement events can be classified with high accuracy. In this study, a dataset of raw sensor data served as input to four different deep learning networks (DNN, CNN, LSTM, and CNN-LSTM). Differences in accuracy and learning time were then compared and evaluated for each mo
APA, Harvard, Vancouver, ISO, and other styles
8

Giakoumis, Dimitris, Anastasios Drosou, Pietro Cipresso, et al. "Real-time Monitoring of Behavioural Parameters Related to Psychological Stress." In Studies in Health Technology and Informatics. IOS Press, 2012. https://doi.org/10.3233/978-1-61499-121-2-287.

Full text
Abstract:
We have developed a system, allowing real-time monitoring of human gestures, which can be used for the automatic recognition of behavioural correlates of psychological stress. The system is based on a low-cost camera (Microsoft Kinect), which provides video recordings capturing the subject's upper body activity. Motion History Images (MHIs) are calculated in real-time from these recordings. Appropriate algorithms are thereafter applied over the MHIs, enabling the real-time calculation of activity-related behavioural parameters. The system's efficiency in real-time calculation of behavioural pa
APA, Harvard, Vancouver, ISO, and other styles
9

Kale, Geetanjali Vinayak, and Varsha Hemant Patil. "A Study of Vision Based Human Motion Recognition and Analysis." In Computer Vision. IGI Global, 2018. http://dx.doi.org/10.4018/978-1-5225-5204-8.ch099.

Full text
Abstract:
Vision-based human motion recognition has fascinated many researchers due to its critical challenges and a variety of applications. The applications range from simple gesture recognition to complicated behaviour understanding in surveillance systems. This leads to major development in the techniques related to human motion representation and recognition. This paper discusses applications, the general framework of human motion recognition, and the details of each of its components. The paper emphasizes human motion representation and the recognition methods along with their advantages and disadva
APA, Harvard, Vancouver, ISO, and other styles
10

Gavrilova, Marina L., Ferdous Ahmed, A. S. M. Hossain Bari, et al. "Multi-Modal Motion-Capture-Based Biometric Systems for Emergency Response and Patient Rehabilitation." In Advances in Medical Technologies and Clinical Practice. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-7525-2.ch007.

Full text
Abstract:
This chapter outlines the current state of the art of Kinect sensor gait and activity authentication. It also focuses on emotional cues that could be observed from human body and posture. It presents a prototype of a system that combines recently developed behavioral gait and posture recognition methods for human emotion identification. A backbone of the system is Kinect sensor gait recognition, which explores the relationship between joint-relative angles and joint-relative distances through machine learning. The chapter then introduces a real-time gesture recognition system developed using K
APA, Harvard, Vancouver, ISO, and other styles
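The joint-relative angles and distances mentioned in the abstract above reduce to elementary vector geometry on the sensor's 3D joint coordinates. A minimal sketch (the coordinates are invented for illustration, not Kinect output):

```python
import numpy as np

def joint_relative_angle(a, b, c):
    """Angle (radians) at joint b, formed by segments b->a and b->c."""
    u = np.asarray(a, float) - np.asarray(b, float)
    v = np.asarray(c, float) - np.asarray(b, float)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.arccos(np.clip(cosang, -1.0, 1.0)))  # clip guards rounding

def joint_relative_distance(a, b):
    """Euclidean distance between two 3D joint positions."""
    return float(np.linalg.norm(np.asarray(a, float) - np.asarray(b, float)))

# Toy elbow bent at a right angle: shoulder, elbow, wrist positions.
angle = joint_relative_angle([0, 1, 0], [0, 0, 0], [1, 0, 0])
print(round(np.degrees(angle)))   # 90
```

Sequences of such angles and distances over time form the feature vectors that the machine-learning stage of a gait or gesture recognizer consumes.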

Conference papers on the topic "Gestures recognition system human motions"

1

Kim, Joonhyun, Jungsoo Lee, and Wansoo Kim. "Transferable Convolutional Neural Networks for IMU-based Motion Gesture Recognition in Human-Machine Interaction." In 2024 24th International Conference on Control, Automation and Systems (ICCAS). IEEE, 2024. https://doi.org/10.23919/iccas63016.2024.10773204.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Zhang, Hong, and Jeong-Hoi Koo. "Development of a Wearable Gesture Recognition System." In ASME 2005 International Mechanical Engineering Congress and Exposition. ASMEDC, 2005. http://dx.doi.org/10.1115/imece2005-80061.

Full text
Abstract:
This paper presents an effort to develop a wearable gesture recognition system. The objective of this work is to design and build a mechatronic device that can recognize human gestures. This device can be used to help the communication between humans or humans and machines (such as unmanned vehicles). The device is composed of two main components, a data acquisition system and a gesture recognition system. The data acquisition system obtains sensory information from human motions and encodes the information for transmission to the gesture recognition system. Upon receiving the signals, the
APA, Harvard, Vancouver, ISO, and other styles
3

Chen, Haodong, Wenjin Tao, Ming C. Leu, and Zhaozheng Yin. "Dynamic Gesture Design and Recognition for Human-Robot Collaboration With Convolutional Neural Networks." In 2020 International Symposium on Flexible Automation. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/isfa2020-9609.

Full text
Abstract:
Human-robot collaboration (HRC) is a challenging task in modern industry, and gesture communication in HRC has attracted much interest. This paper proposes and demonstrates a dynamic gesture recognition system based on Motion History Image (MHI) and Convolutional Neural Networks (CNN). Firstly, ten dynamic gestures are designed for a human worker to communicate with an industrial robot. Secondly, the MHI method is adopted to extract the gesture features from video clips and generate static images of dynamic gestures as inputs to CNN. Finally, a CNN model is constructed for gesture reco
APA, Harvard, Vancouver, ISO, and other styles
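The Motion History Image feature named in the abstract above follows a simple per-pixel update rule: pixels with motion in the current frame are set to a maximum timestamp τ, and all other pixels decay by one step toward zero, so older motion appears dimmer. A toy numpy sketch (τ and the frame size are illustrative):

```python
import numpy as np

def update_mhi(mhi, motion_mask, tau=10):
    """One MHI step: moving pixels get the full timestamp tau,
    all others decay by one step (floored at zero)."""
    return np.where(motion_mask, float(tau), np.maximum(mhi - 1.0, 0.0))

mhi = np.zeros((4, 4))
frame1 = np.zeros((4, 4), bool); frame1[0, 0] = True   # motion at (0, 0)
frame2 = np.zeros((4, 4), bool); frame2[1, 1] = True   # then at (1, 1)
mhi = update_mhi(mhi, frame1)
mhi = update_mhi(mhi, frame2)
print(mhi[0, 0], mhi[1, 1])   # 9.0 10.0 — older motion is dimmer than newer
```

The resulting single-channel image encodes the direction and recency of a whole gesture clip, which is what makes it a convenient static input for a CNN.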
4

Chen, Haodong, Ming C. Leu, Wenjin Tao, and Zhaozheng Yin. "Design of a Real-Time Human-Robot Collaboration System Using Dynamic Gestures." In ASME 2020 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/imece2020-23650.

Full text
Abstract:
With the development of industrial automation and artificial intelligence, robotic systems are developing into an essential part of factory production, and human-robot collaboration (HRC) becomes a new trend in the industrial field. In our previous work, ten dynamic gestures have been designed for communication between a human worker and a robot in manufacturing scenarios, and a dynamic gesture recognition model based on Convolutional Neural Networks (CNN) has been developed. Based on the model, this study aims to design and develop a new real-time HRC system based on multi-thread
APA, Harvard, Vancouver, ISO, and other styles
5

Chen, Jinfa, and Won-jong Kim. "Development of a Mobile Robot Providing a Natural Way to Interact With Electronic Devices." In ASME 2016 Dynamic Systems and Control Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/dscc2016-9751.

Full text
Abstract:
This paper provides a natural yet low-cost way for humans to interact with electronic devices indoors: the development of a human-following mobile robot capable of controlling other electrical devices for the user based on the user’s gesture commands. The overall experimental setup consists of a skid-steered mobile robot, Kinect sensor, laptop, wide-angle camera, and two lamps. The OpenNI middleware is used to process data from the Kinect sensor, and OpenCV is used to process data from the wide-angle camera. A new human-following algorithm is proposed based on human motion estimation. The hu
APA, Harvard, Vancouver, ISO, and other styles
6

Reid, Chris, and Biswanath Samanta. "Gesture Recognition for Control in Human-Robot Interactions." In ASME 2014 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/imece2014-38504.

Full text
Abstract:
In co-robotics applications, the robots must be capable of taking inputs from human partners in different forms, including both static and sequential hand gestures, in dynamic interactions for enhanced effectiveness as socially assistive agents. This paper presents the development of a gesture recognition algorithm for control of robots. The algorithm focuses on the detection of skin colors using monocular vision of a moving robot base where the inherent instability negates the effectiveness of methods like background subtraction. The algorithm is implemented in the open-source, open-access ro
APA, Harvard, Vancouver, ISO, and other styles
7

Hsiao, Jen-Hsuan, Yu-Heng Deng, Tsung-Ying Pao, Hsin-Rung Chou, and Jen-Yuan (James) Chang. "Design of a Wireless 3D Hand Motion Tracking and Gesture Recognition Glove for Virtual Reality Applications." In ASME 2017 Conference on Information Storage and Processing Systems collocated with the ASME 2017 International Technical Conference and Exhibition on Packaging and Integration of Electronic and Photonic Microsystems. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/isps2017-5450.

Full text
Abstract:
Hand motion tracking and gesture recognition are of crucial interest to the development of virtual reality systems and controllers. In this paper, a wireless data glove that can accurately sense hands’ dynamic movements and gestures of different modes was proposed. This data glove was custom-built, consisting of flex and inertial sensors, and a microcontroller with multi-channel ADC (analog to digital converter). For the classification algorithm, a hierarchical gesture system using Naïve Bayes Classifier was built. This low training time recognition algorithm allows categorization of all input
APA, Harvard, Vancouver, ISO, and other styles
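The Naïve Bayes classification underlying the data glove above rests on scoring each gesture class by Gaussian class-conditional likelihoods of the sensor features. A self-contained sketch with fabricated flex-sensor readings (this shows the generic method, not the authors' hierarchical system):

```python
import numpy as np

class GaussianNB:
    """Minimal Gaussian Naive Bayes for small sensor-feature vectors."""
    def fit(self, X, y):
        X, y = np.asarray(X, float), np.asarray(y)
        self.classes = np.unique(y)
        # Per-class feature means and variances (variance floored for stability).
        self.mu = np.array([X[y == c].mean(axis=0) for c in self.classes])
        self.var = np.array([X[y == c].var(axis=0) + 1e-6 for c in self.classes])
        self.logprior = np.log([np.mean(y == c) for c in self.classes])
        return self

    def predict(self, x):
        x = np.asarray(x, float)
        # Sum of per-feature Gaussian log-likelihoods, one row per class.
        ll = -0.5 * np.sum(np.log(2 * np.pi * self.var)
                           + (x - self.mu) ** 2 / self.var, axis=1)
        return self.classes[np.argmax(ll + self.logprior)]

# Fabricated flex-sensor readings (one value per finger): fist vs. open hand.
X = [[900, 880, 870, 860, 850], [120, 100, 110, 130, 90],
     [910, 870, 860, 855, 845], [110, 105, 115, 125, 95]]
y = ["fist", "open", "fist", "open"]
clf = GaussianNB().fit(X, y)
print(clf.predict([905, 875, 865, 858, 848]))   # fist
```

A hierarchical variant, as in the paper, would first decide a coarse gesture mode and then run a second classifier within that mode; the scoring rule at each level stays the same.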
8

Godage, Ishika, Ruvan Weerasignhe, and Damitha Sandaruwan. "Sign Language Recognition for Sentence Level Continuous Signings." In 10th International Conference on Natural Language Processing (NLP 2021). Academy and Industry Research Collaboration Center (AIRCC), 2021. http://dx.doi.org/10.5121/csit.2021.112305.

Full text
Abstract:
It is no doubt that communication plays a vital role in human life. There is, however, a significant population of hearing-impaired people who use non-verbal techniques for communication, which a majority of people cannot understand. The predominant of these techniques is sign language, the main communication protocol among hearing-impaired people. In this research, we propose a method to bridge the communication gap between hearing-impaired people and others, which translates signed gestures into text. Most existing solutions, based on technologies such as Kinect, Leap Motion, Co
APA, Harvard, Vancouver, ISO, and other styles
9

Purwar, Anurag, and Rumit Desai. "Using Kinect to Capture Human Motion for Mechanism Synthesis, Motion Generation and Visualization." In ASME 2016 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/detc2016-60499.

Full text
Abstract:
In this paper, we present a framework for capturing human motions using the Microsoft Kinect sensor for the purpose of 1) generating task positions for mechanism and robot synthesis, and 2) generating and visualizing B-spline interpolated and approximated motion from the captured task positions. The theoretical foundation of this work lies in Kinematic Mapping, Dual and Bi-quaternions, and NURBS (Non-Uniform Rational B-spline) geometry. Lately, Kinect has opened doors for the creation of natural and intuitive human-machine interactive (HMI) systems in medicine, robotic manipulation, CAD,
APA, Harvard, Vancouver, ISO, and other styles
10

Talbot, Thomas Brett, and Chinmay Chinara. "Open Medical Gesture: An Open-Source Experiment in Naturalistic Physical Interactions for Mixed and Virtual Reality Simulations." In 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022). AHFE International, 2022. http://dx.doi.org/10.54941/ahfe1002054.

Full text
Abstract:
Mixed (MR) and Virtual Reality (VR) simulations are hampered by requirements for hand controllers or attempts to perseverate in use of two-dimensional computer interface paradigms from the 1980s. From our efforts to produce more naturalistic interactions for combat medic training for the military, we have developed an open-source toolkit that enables direct hand controlled responsive interactions that is sensor independent and can function with depth sensing cameras, webcams or sensory gloves. From this research and review of current literature, we have discerned several best approaches for ha
APA, Harvard, Vancouver, ISO, and other styles