To see the other types of publications on this topic, follow the link: Virtual Hand.

Journal articles on the topic 'Virtual Hand'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 journal articles for your research on the topic 'Virtual Hand.'

Next to every source in the list of references is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

Parusharamudu, Mr M., N. Vivek, and Ch Sanjay Vardhan. "Virtual Mouse Using Hand Gestures." International Journal of Research Publication and Reviews 6, no. 4 (April 2025): 14589–93. https://doi.org/10.55248/gengpi.6.0425.1662.

2

WILLIAMS, N. W. "The Virtual Hand." Journal of Hand Surgery 22, no. 5 (October 1997): 560–67. http://dx.doi.org/10.1016/s0266-7681(97)80345-2.

Abstract:
Virtual reality technologies are now at a stage in which the various disciplines can be brought together to construct a virtual human hand. Devices can be constructed to record multiple joint positions accurately in clinical environments. Joint prostheses may be tested virtually before undergoing clinical trials, albeit in a simple way at present, but may eventually be incorporated into a virtual model of the hand and driven by goniometric gloves. This will allow more detailed analyses of implant in situ behaviour. These exciting developments will provide a huge advance in our understanding of the functions of the real hand and also a potential way of assessing outcomes in a simple and repeatable fashion. We are on the edge of a new era in hand surgery when the computer scientist, biomechanic, control engineer, hand therapist and surgeon will be able to alternate between the virtual and the real world in producing better outcomes for patients.
3

Thakur, Shanu. "Virtual Mouse." INTERANTIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 08, no. 05 (May 11, 2024): 1–5. http://dx.doi.org/10.55041/ijsrem33792.

Abstract:
Nowadays, hand gesture recognition has become popular owing to advances in artificial intelligence (AI). In recent years, there has been growing interest in developing alternative methods for human-computer interaction (HCI) that go beyond traditional input devices such as keyboards and mice. Hand gesture recognition has emerged as a promising approach, offering intuitive and natural ways for users to interact with digital interfaces. The proposed system utilizes computer vision techniques to detect and track hand gestures in real time. By mapping specific gestures to mouse control commands, users can navigate and interact with graphical user interfaces (GUIs) without the need for physical input devices. In this paper, a hand gesture-controlled virtual mouse system uses AI and machine learning algorithms to recognize hand gestures and translate them into mouse movements. The system is designed to be appropriate for people who have difficulty using a traditional mouse or keyboard. A camera captures images of the user's hand, which are processed by an AI/ML algorithm to recognize the gestures being made. The system was trained on a dataset of hand gestures. Keywords — Hand gesture recognition, Machine Learning, Artificial Intelligence, Virtual Mouse, Python, MediaPipe
4

Cai, Xian Juan, Cheng Cheng, and Umwali Marine. "Autonomous Virtual Hand Behavior Construction in Virtual Manufacturing Environment." Applied Mechanics and Materials 743 (March 2015): 734–37. http://dx.doi.org/10.4028/www.scientific.net/amm.743.734.

Abstract:
This paper describes a method for autonomous virtual hand behavior construction in a virtual manufacturing environment. The new mechanism aims to reduce the user's cognitive and manipulation loads. Based on a realistic and accurate virtual hand model, together with a strategy and an algorithm for behavior simulation and continuous operation simulation, an intelligent and efficient virtual hand is generated. The experimental results demonstrate the ability of the method to create an autonomous and reliable virtual hand in a virtual manufacturing environment.
5

Krishna, Golla Sai, G. S. S. M. Dileep, and Dr R. Shalini. "Virtual Actions Using Hand Gestures." IOSR Journal of Computer Engineering 26, no. 6 (December 2024): 43–48. https://doi.org/10.9790/0661-2606034348.

Abstract:
This paper introduces a groundbreaking method for enhancing human-computer interaction through virtual actions using hand gestures. The suggested system employs sophisticated computer vision and machine learning strategies, integrating MediaPipe Hands to achieve accurate detection and tracking of hand landmarks. By converting designated hand gestures into virtual mouse actions (such as moving the cursor, left-clicking, right-clicking, double-clicking, and taking screenshots), the system offers an intuitive, touch-free way to control computer systems. The system records live video with a webcam, analyzes hand landmarks, and computes distances and angles to precisely recognize movements. OpenCV provides effective visualization, while the PyAutoGUI and Pynput libraries facilitate smooth mouse and keyboard interactions. This method addresses accessibility challenges and has significant implications in diverse areas, including gaming, healthcare, remote presentations, and contexts that necessitate contactless or hygienic interaction. The results of the study demonstrate the effectiveness of gesture-based virtual actions as an intuitive, flexible, and creative engagement approach. By improving natural user interfaces, the project opens up new possibilities for human-computer interaction in both personal and professional contexts.
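The landmark-to-action mapping that systems like this describe (fingertip positions from MediaPipe Hands, pinch distances deciding clicks, and scaling to screen coordinates for PyAutoGUI) can be sketched in plain Python. The landmark indices follow MediaPipe's convention, but the thresholds and the gesture-to-action mapping are illustrative assumptions, not taken from the cited paper:

```python
from math import dist

# MediaPipe Hands landmark indices (assumed): 4 = thumb tip,
# 8 = index fingertip, 12 = middle fingertip.
THUMB_TIP, INDEX_TIP, MIDDLE_TIP = 4, 8, 12

def classify_gesture(landmarks, pinch_threshold=0.05):
    """Map normalized (x, y) hand landmarks to a virtual-mouse action.

    The threshold and the gesture-to-action rules are invented for
    illustration: thumb-index pinch = left click, thumb-middle pinch
    = right click, anything else just moves the cursor.
    """
    thumb_index = dist(landmarks[THUMB_TIP], landmarks[INDEX_TIP])
    thumb_middle = dist(landmarks[THUMB_TIP], landmarks[MIDDLE_TIP])
    if thumb_index < pinch_threshold:
        return "left_click"
    if thumb_middle < pinch_threshold:
        return "right_click"
    return "move"  # no pinch: just track the cursor

def cursor_position(landmarks, screen_w, screen_h):
    """Scale the index fingertip from normalized camera space to pixels."""
    x, y = landmarks[INDEX_TIP]
    return int(x * screen_w), int(y * screen_h)
```

In a full system, `landmarks` would come from `mediapipe.solutions.hands` on each webcam frame, and the returned action would be forwarded to PyAutoGUI or Pynput.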
6

Li, JingRong, YuHua Xu, JianLong Ni, and QingHui Wang. "Glove-based virtual hand grasping for virtual mechanical assembly." Assembly Automation 36, no. 4 (September 5, 2016): 349–61. http://dx.doi.org/10.1108/aa-01-2016-002.

Abstract:
Purpose – Hand gesture-based interaction can provide far more intuitive, natural and immersive feelings for users to manipulate 3D objects for virtual assembly (VA). A mechanical assembly consists of mostly general-purpose machine elements or mechanical parts that can be defined into four types based on their geometric features and functionalities. For different types of machine elements, engineers formulate corresponding grasping gestures based on their domain knowledge or customs for ease of assembly. Therefore, this paper aims to support a virtual hand to assemble mechanical parts. Design/methodology/approach – It proposes a novel glove-based virtual hand grasping approach for virtual mechanical assembly. The kinematic model of virtual hand is set up first by analyzing the hand structure and possible movements, and then four types of grasping gestures are defined with joint angles of fingers for connectors and three types of parts, respectively. The recognition of virtual hand grasping is developed based on collision detection and gesture matching. Moreover, stable grasping conditions are discussed. Findings – A prototype system is designed and developed to implement the proposed approach. The case study on VA of a two-stage gear reducer demonstrates the functionality of the system. From the users’ feedback, it is found that more natural and stable hand grasping interaction for VA of mechanical parts can be achieved. Originality/value – It proposes a novel glove-based virtual hand grasping approach for virtual mechanical assembly.
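The gesture-matching step this abstract describes (recognizing a grasp by comparing glove joint angles against predefined grasping gestures) can be sketched as a template test with a tolerance band. The template angles, joint count, and tolerance below are invented for illustration; the paper defines its four grasp types with its own joint-angle data:

```python
# Hypothetical grasp templates: each entry is a vector of finger-joint
# angles in degrees (8 joints here, purely for illustration).
GRASP_TEMPLATES = {
    "cylindrical": [45, 60, 50, 60, 50, 60, 50, 60],
    "pinch":       [30, 20, 5, 5, 5, 5, 5, 5],
    "open_hand":   [5, 5, 5, 5, 5, 5, 5, 5],
}

def match_grasp(joint_angles, tolerance=15.0):
    """Return the first template whose joints all lie within tolerance
    of the measured glove reading, or None if nothing matches."""
    for name, template in GRASP_TEMPLATES.items():
        if all(abs(a - b) <= tolerance
               for a, b in zip(joint_angles, template)):
            return name
    return None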
7

Reddy, Mr K. Vikram. "Hand Gesture based Virtual Mouse." International Journal for Research in Applied Science and Engineering Technology 9, no. 5 (May 31, 2021): 1646–49. http://dx.doi.org/10.22214/ijraset.2021.34497.

8

Mochimaru, Masaaki, Natsuki Miyata, Makiko Kouchi, Mitsunori Tada, Toru Nakata, and Tsuneya Kurihara. "Digital Hand for Virtual Prototyping." Reference Collection of Annual Meeting 2004.8 (2004): 189–90. http://dx.doi.org/10.1299/jsmemecjsm.2004.8.0_189.

9

Indraneel, K. J. S. S., P. Narendra Reddy, G. Leela Srinivas, J. Alekh Vara Prasad, and Ananthoju Vijay Kumar. "Hand Gesture Based Virtual Mouse." International Journal for Research in Applied Science and Engineering Technology 11, no. 5 (May 31, 2023): 2458–61. http://dx.doi.org/10.22214/ijraset.2023.51731.

Abstract:
Since the invention of the PC, methods for creating a connection between humans and computers have continued to develop. The mouse was a truly innovative piece of HCI (human-computer interaction). Such devices are still in use even though wireless and Bluetooth mouse technology continues to develop. A Bluetooth mouse requires a dongle for connectivity and a battery, and the additional devices make it more difficult to use. The suggested mouse framework goes beyond this point. This study suggests an HCI-based virtual mouse framework that makes use of computer vision and hand signals. Signals are generated with a location method and shading division, and captured using a built-in camera or webcam. The customer will be able to exercise partial control over
10

Kalipu, Ravi Kumar, Harish Kurmana, Divakar Allaboina, Sanjay Kumar Chilla, Bhavish Lakkavarapu, and Ravi Kumar Nubothu. "Virtual Mouse Using Hand Gestures." INTERANTIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 09, no. 03 (March 31, 2025): 1–9. https://doi.org/10.55041/ijsrem43518.

Abstract:
The Virtual Mouse using Hand Gesture Recognition is an innovative system that allows users to control a computer cursor through hand gestures instead of a traditional mouse. This project utilizes Python, machine learning, and OpenCV to detect and interpret hand movements in real time. A camera captures gestures, which are processed using computer vision techniques to perform mouse actions such as clicking, scrolling, and cursor movement. Additionally, a zoom-in and zoom-out feature enhances user interaction through specific gestures. This touch-free interface provides an intuitive and hygienic alternative to traditional input devices, making it particularly beneficial for accessibility, gaming, and contactless computing applications. Index Terms— Hand Gesture Recognition, Virtual Mouse, Computer Vision, OpenCV, Machine Learning, Human-Computer Interaction (HCI), Touch-Free Interface, Accessibility, Gesture Control, Real-Time Processing, Contactless Computing.
11

Madan, Atul K., Constantine T. Frantzides, Nina Shervin, and Christopher L. Tebbit. "Assessment of Individual Hand Performance in Box Trainers Compared to Virtual Reality Trainers." American Surgeon 69, no. 12 (December 2003): 1112–14. http://dx.doi.org/10.1177/000313480306901219.

Abstract:
Training residents in laparoscopic skills is ideally initiated in an inanimate laboratory with both box trainers and virtual reality trainers. Virtual reality trainers have the ability to score individual hand performance, although they are expensive. Here we compared the ability to assess dominant and nondominant hand performance in box trainers with virtual reality trainers. Medical students without laparoscopic experience were utilized in this study (n = 16). Each student performed tasks on the LTS 2000, an inanimate box trainer (placing pegs with both hands and transferring pegs from one hand to another), as well as a task on the MIST-VR, a virtual reality trainer (grasping a virtual object and placing it in a virtual receptacle with alternating hands). A surgeon scored students for the inanimate box trainer exercises (time and errors) while the MIST-VR scored students (time, economy of movements, and errors for each hand). Statistical analysis included Pearson correlations. Errors and time for the one-handed tasks on the box trainer did not correlate with errors, time, or economy measured for each hand by the MIST-VR (r = 0.01 to 0.30; P = NS). Total errors on the virtual reality trainer did correlate with errors on transferring pegs (r = 0.61; P < 0.05). Economy and time of both dominant and nondominant hand from the MIST-VR correlated with time of transferring pegs in the box trainer (r = 0.53 to 0.77; P < 0.05). While individual hand assessment by the box trainer during two-handed tasks was related to assessment by the virtual reality trainer, individual hand assessment during one-handed tasks did not correlate with the virtual reality trainer. Virtual reality trainers, such as the MIST-VR, allow assessment of individual hand skills, which may lead to improved laparoscopic skill acquisition. It is difficult to assess individual hand performance with box trainers alone.
12

Mesuda, Yuko, Shigeru Inui, and Yosuke Horiba. "Virtual manipulations for draping." International Journal of Clothing Science and Technology 27, no. 3 (June 1, 2015): 417–33. http://dx.doi.org/10.1108/ijcst-10-2013-0119.

Abstract:
Purpose – Draping is one method used in clothing design. It is important to virtualize draping in real time, and virtual cloth handling is a key technology for this purpose. A mouse is often used for real-time cloth handling in many studies. However, gesture manipulation is more realistic than movements using the mouse. The purpose of this paper is to demonstrate virtual cloth manipulation using hand gestures in the real world. Design/methodology/approach – In this study, the authors demonstrate three types of manipulation: moving, cutting, and attaching. The user’s hand coordinates are obtained with a Kinect, and the cloth model is manipulated by them. The cloth model is moved based on the position of the hand coordinates. The cloth model is cut along a cut line calculated from the hand coordinates. In attaching the cloth model, it is mapped to a dummy model and then part of the cloth model is fixed and another part is released. Findings – This method can move the cloth model according to the motion of the hands. The authors have succeeded in cutting the cloth model based on the hand trajectory. The cloth model can be attached to the dummy model and its form is changed along the dummy model shape. Originality/value – Cloth handling in many studies is based on indirect manipulation using a mouse. In this study, the cloth model is manipulated according to hand motion in the real world in real time.
13

Groen, Joris, and Peter J. Werkhoven. "Visuomotor Adaptation to Virtual Hand Position in Interactive Virtual Environments." Presence: Teleoperators and Virtual Environments 7, no. 5 (October 1998): 429–46. http://dx.doi.org/10.1162/105474698565839.

Abstract:
In virtual environments the virtual hand may not always be exactly aligned with the real hand. Such misalignment may cause an adaptation of the user's eye-hand coordination. Further, misalignment may cause a decrease in manipulation performance compared to aligned conditions. This experimental study uses a prism-adaptation paradigm to explore visuomotor adaptation to misaligned virtual hand position. Participants were immersed in an interactive virtual environment with a deliberately misaligned virtual hand position (a lateral shift of 10 cm). We carried out pointing tests with a nonvisible hand in the real world before (pretest) and after (posttest) immersion in the virtual world. A comparison of pre- and post-tests revealed aftereffects of the adaptation of eye-hand coordination in the opposite direction of the lateral shift (negative aftereffects). The magnitude of the aftereffect was 20% under stereoscopic viewing conditions. However, decreased manipulation performance in the VE (speed/accuracy) during immersion with misaligned hand conditions was not found.
14

Liu, Xinhua, Xiaolong Cui, Guomin Song, and Bihong Xu. "Development of a virtual maintenance system with virtual hand." International Journal of Advanced Manufacturing Technology 70, no. 9-12 (November 17, 2013): 2241–47. http://dx.doi.org/10.1007/s00170-013-5473-0.

15

Cui, Dixuan, and Christos Mousas. "Evaluating Virtual Hand Illusion through Realistic Appearance and Tactile Feedback." Multimodal Technologies and Interaction 6, no. 9 (September 6, 2022): 76. http://dx.doi.org/10.3390/mti6090076.

Abstract:
We conducted a virtual reality study to explore virtual hand illusion through three levels of appearance (Appearance dimension: realistic vs. pixelated vs. toon hand appearances) and two levels of tactile feedback (Tactile dimension: no tactile vs. tactile feedback). We instructed our participants to complete a virtual assembly task in this study. Immediately afterward, we asked them to provide self-reported ratings on a survey that captured presence and five embodiment dimensions (hand ownership, touch sensation, agency and motor control, external appearance, and response to external stimuli). The results of our study indicate that (1) tactile feedback generated a stronger sense of presence, touch sensation, and response to external stimuli; (2) the pixelated hand appearance provided the least hand ownership and external appearance; and (3) in the presence of the pixelated hand, prior virtual reality experience of participants impacted their agency and motor control and their response to external stimuli ratings. This paper discusses our findings and provides design considerations for virtual reality applications with respect to the realistic appearance of virtual hands and tactile feedback.
16

Vorobeva, V. P., O. S. Perepelkina, and G. A. Arina. "Equivalence of the Classical Rubber Hand Illusion and the Virtual Hand Illusion." Experimental Psychology (Russia) 13, no. 3 (2020): 31–45. http://dx.doi.org/10.17759/exppsy.2020130303.

Abstract:
Computer technologies are increasingly used in body illusion research because they allow researchers to controllably model complex processes that cannot be realised in ordinary life. It was previously demonstrated that the rubber hand illusion may be reconstructed in a virtual setting and cause similar changes in somatoperception, in which the virtual hand begins to feel like one's own. This result suggests that the phenomenological experience obtained in the classical illusion and in its virtual reality version has much in common. However, a direct experimental comparison of the two illusion variants had not been made; therefore, in this research we studied the equivalence of the rubber and virtual hand illusions (RHI and VHI). The sample consisted of 16 subjects (18–25 years). As registration methods we used the subjective sense of ownership of an artificial limb and the proprioceptive drift of the real hand towards the illusory hand. The analysis confirmed the equivalence of the illusions.
17

Ranjan, Shudhanshu, Sumit Anand, Vinayak Gontia, Aman Chandra, Vineet Sharan, and Spandana SG. "Virtual Mouse Control using Hand Gesture." International Journal of Engineering Research in Computer Science and Engineering 9, no. 11 (November 1, 2022): 20–23. http://dx.doi.org/10.36647/ijercse/09.11.art006.

Abstract:
This research proposes a way to control the cursor's position using only one's hands and no electronic equipment, with actions such as clicking and dragging performed through various hand gestures. The suggested system requires only a webcam as an input device. OpenCV and Python, along with additional tools, are required for the suggested system.
18

Ting, Zhang, Cheng Cheng, Ge Wei, and Zhang Jing. "The Construction of Autonomous Virtual Hand in Virtual Assembly Environments." International Journal of Computer Theory and Engineering 9, no. 3 (2017): 197–201. http://dx.doi.org/10.7763/ijcte.2017.v9.1137.

19

Abbad, Khalid, Majid Ben Yakhlef, Abderrahim Saaidi, and Abderrazzak Ait Mouhou. "Virtual Hand Skinning Using Volumetric Shape." International Journal of Computer Aided Engineering and Technology 1, no. 1 (2023): 1. http://dx.doi.org/10.1504/ijcaet.2023.10040912.

20

Peña-Pitarch, Esteban, Neus Ticó Falguera, and Jingzhou (James) Yang. "Virtual human hand: model and kinematics." Computer Methods in Biomechanics and Biomedical Engineering 17, no. 5 (August 24, 2012): 568–79. http://dx.doi.org/10.1080/10255842.2012.702864.

21

M P, SINDHU, AKHILESH A, AFSAN M, and AKSHAY K. SUNIL. "Virtual Drawing Board Using Hand Tracking." INTERANTIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 08, no. 12 (December 23, 2024): 1–8. https://doi.org/10.55041/ijsrem39912.

Abstract:
This project presents the development of a virtual drawing board that leverages hand tracking technology for an intuitive and immersive drawing experience. The system uses computer vision algorithms to detect and track hand movements in real time, allowing users to draw in a virtual environment without the need for physical input devices like a mouse or stylus. The key objectives of the project include achieving high accuracy in hand gesture recognition, minimizing latency, and creating a user-friendly interface. The proposed solution is implemented using [Technology/Tools], and experimental results show that the system is effective in accurately capturing hand movements for drawing. This project contributes to the field of human-computer interaction by providing a novel method for digital content creation, with potential applications in art, design, and education.
22

Stirling, Paul HC, and Jane E. McEachan*. "Establishing a virtual hand surgery clinic." Journal of Hand Surgery (European Volume) 45, no. 9 (June 15, 2020): 1002–4. http://dx.doi.org/10.1177/1753193420932736.

23

Mouhou, Abderrazzak Ait, Abderrahim Saaidi, Majid Ben Yakhlef, and Khalid Abbad. "Virtual hand skinning using volumetric shape." International Journal of Computer Aided Engineering and Technology 18, no. 1/2/3 (2023): 77. http://dx.doi.org/10.1504/ijcaet.2023.127788.

24

Rathnam, Martin Joel, Pradyuman Sharma, Anand Shaw, Anoop Krishnan, and Vivek Vishal. "Virtual Mouse – Hand Gesture Recognition System." International Journal of Innovative Research in Advanced Engineering 10, no. 06 (June 23, 2023): 313–19. http://dx.doi.org/10.26562/ijirae.2023.v1006.13.

Abstract:
The purpose of this research is to enhance the identification of hand gestures in a Human-Computer Interaction scenario, while also minimizing the computational time required and enhancing user comfort. To achieve this goal, the authors created a computer mouse control application that employs an algorithm and selected hand features, resulting in efficient performance. The integration of voice assistant with the recommended hand postures further improves user experience and facilitates system operation.
25

Patil, Mr Soham. "Hand Tracking-Based Virtual Mouse System." International Journal for Research in Applied Science and Engineering Technology 13, no. 5 (May 31, 2025): 2622–23. https://doi.org/10.22214/ijraset.2025.70798.

Abstract:
This study presents a real-time hand gesture-based virtual mouse system leveraging computer vision technologies. The system utilizes a webcam, OpenCV, MediaPipe, PyAutoGUI, and Pynput to allow touchless cursor movement, left/right clicks, double click, and screenshot functionality. The goal is to improve human-computer interaction, especially in hygienic or accessibility-focused environments. This method replaces traditional physical mouse devices with intuitive gestures, enhancing ease of use and inclusivity.
26

Pandi, Mr R. Maruthu, and Dr B. Ramya. "Temporary Anchorage Devices – A Review Virtual Mouse Using Hand Gesture." International Journal of Research Publication and Reviews 6, no. 4 (April 2025): 1504–9. https://doi.org/10.55248/gengpi.6.0425.1365.

27

Choi, Daewoong, Hyeonjoong Cho, Kyeongeun Seo, Sangyub Lee, Jaekyu Lee, and Jaejin Ko. "Designing Hand Pose Aware Virtual Keyboard With Hand Drift Tolerance." IEEE Access 7 (2019): 96035–47. http://dx.doi.org/10.1109/access.2019.2929310.

28

S, Dr Nandagopal, Soundarya R, Vaishnavi S, Vanisri S, and Hiranya S. "AI Virtual Painter using OpenCV and Mediapipe." International Journal of Engineering Research in Computer Science and Engineering 9, no. 11 (November 1, 2022): 13–16. http://dx.doi.org/10.36647/ijercse/09.11.art004.

Abstract:
Recognition of hand gestures is of great importance for human-computer interaction (HCI). The human hand is small, with complex joints, compared with the entire human body, so recognizing it is not an easy task. Using hand gesture recognition, the coordinates of the hand's landmarks can be detected, enabling many otherwise impossible interactions. Our work presents one such application: a virtual painter. The main objective of the project is to display on the monitor screen the words we write in the air in front of the webcam. This is done by recognizing the human hand through a computer's ordinary webcam; the hand points are detected using the MediaPipe Python library. From the detected hand points, the count of open fingers is determined. When the index and middle fingers are open, the system is in selection mode; when only the index finger is open, it is in drawing mode. In selection mode, the color to draw with can be selected from the list of colors displayed on the screen. In drawing mode, the content written in the air in front of the camera is drawn on the monitor screen. This implementation can be used in many settings where an immediate demonstration or explanation is needed.
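The finger-counting mode logic described here (index and middle fingers open selects a color; index alone draws) reduces to comparing fingertip and middle-joint y-coordinates in MediaPipe's normalized image space, where y grows downward. A minimal sketch, with the landmark indices assumed from MediaPipe Hands and the thumb ignored for simplicity:

```python
# Assumed MediaPipe Hands indexing: (fingertip, PIP-joint) pairs for
# four fingers. Normalized image coordinates put y = 0 at the top, so
# an extended finger has its tip *above* (smaller y than) its middle
# joint.
FINGERS = {"index": (8, 6), "middle": (12, 10),
           "ring": (16, 14), "pinky": (20, 18)}

def open_fingers(landmarks):
    """Return the set of extended fingers from (x, y) landmarks."""
    return {name for name, (tip, pip) in FINGERS.items()
            if landmarks[tip][1] < landmarks[pip][1]}

def painter_mode(landmarks):
    """Illustrative mode rule from the abstract: index + middle open
    means selection mode; index alone means drawing mode."""
    fingers = open_fingers(landmarks)
    if fingers == {"index", "middle"}:
        return "selection"
    if fingers == {"index"}:
        return "drawing"
    return "idle"
```

In the full system, a "drawing" result would append the fingertip position to the current stroke, while "selection" would hit-test the fingertip against the on-screen color palette.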
29

Lohmann, Johannes, Anna Belardinelli, and Martin V. Butz. "Hands Ahead in Mind and Motion: Active Inference in Peripersonal Hand Space." Vision 3, no. 2 (April 18, 2019): 15. http://dx.doi.org/10.3390/vision3020015.

Abstract:
According to theories of anticipatory behavior control, actions are initiated by predicting their sensory outcomes. From the perspective of event-predictive cognition and active inference, predictive processes activate currently desired events and event boundaries, as well as the expected sensorimotor mappings necessary to realize them, dependent on the involved predicted uncertainties before actual motor control unfolds. Accordingly, we asked whether peripersonal hand space is remapped in an uncertainty anticipating manner while grasping and placing bottles in a virtual reality (VR) setup. To investigate, we combined the crossmodal congruency paradigm with virtual object interactions in two experiments. As expected, an anticipatory crossmodal congruency effect (aCCE) at the future finger position on the bottle was detected. Moreover, a manipulation of the visuo-motor mapping of the participants’ virtual hand while approaching the bottle selectively reduced the aCCE at movement onset. Our results support theories of event-predictive, anticipatory behavior control and active inference, showing that expected uncertainties in movement control indeed influence anticipatory stimulus processing.
30

AKASHKUMAR, M. "Gesture Based System Typing Virtual Keyboard." INTERANTIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 09, no. 01 (January 20, 2025): 1–9. https://doi.org/10.55041/ijsrem40915.

Abstract:
Gesture-based virtual keyboards have become a cutting-edge substitute for conventional physical and touchscreen keyboards as touchless interfaces become more and more popular. To eliminate the necessity for physical interaction, this study introduces a gesture-based virtual keyboard system that allows users to input text using hand gestures. The technology maps hand gestures to the corresponding keyboard inputs by detecting and interpreting them using computer vision and machine learning algorithms. The suggested method guarantees precise and effective text entering by utilizing technologies such as real-time gesture detection and deep learning-based hand tracking. This method may find use in assistive technology, hands-free computing settings, virtual reality (VR), and augmented reality (AR). Keywords — VS Code software, CV, Python
31

Zhou, Xiaozhou, Yu Jin, Lesong Jia, and Chengqi Xue. "Study on Hand–Eye Cordination Area with Bare-Hand Click Interaction in Virtual Reality." Applied Sciences 11, no. 13 (July 1, 2021): 6146. http://dx.doi.org/10.3390/app11136146.

Abstract:
In virtual reality, users’ input and output interactions are carried out in a three-dimensional space, and bare-hand click interaction is one of the most common interaction methods. Apart from the limitations of the device, the movements of bare-hand click interaction in virtual reality involve head, eye, and hand movements. Consequently, clicking performance varies among locations in the binocular field of view. In this study, we explored the optimal interaction area of hand–eye coordination within the binocular field of view in a 3D virtual environment (VE), and implemented a bare-hand click experiment in a VE combining click performance data, namely, click accuracy and click duration, following a gradient descent method. The experimental results show that click performance is significantly influenced by the area where the target is located. The performance data and subjective preferences for clicks show a high degree of consistency. Combining reaction time and click accuracy, the optimal operating area for bare-hand clicking in virtual reality is from 20° to the left to 30° to the right horizontally and from 15° in the upward direction to 20° in the downward direction vertically. The results of this study have implications for guidelines and applications for bare-hand click interaction interface designs in the proximal space of virtual reality.
32

YANG, XIAOLI, and YOUN K. KIM. "HAND MANIPULATION TRAINING IN HAPTIC VIRTUAL ENVIRONMENTS." International Journal of Information Acquisition 05, no. 03 (September 2008): 269–81. http://dx.doi.org/10.1142/s021987890800165x.

Full text
Abstract:
Continuing advances in virtual reality (VR) technology, particularly the addition of force and touch feedback, have enhanced VR realism and led to the development of many useful and accessible VR systems. One of the emerging research fields is rehabilitation training. This paper introduces a virtual reality-based hand manipulation training system with three applications: virtual writing, virtual painting, and virtual dialing. The system is mainly for training hand movement precision, speed, force, and direction control. A haptic device — the PHANTOM Premium 1.0 — is used to give the user immediate force feedback so they feel immersed in the virtual environment during the training session. A new collision detection method is developed for accurate and rapid calculation of the interaction between the haptic and virtual environments. The implementation performance measures are calculated and given to the user in real time. The practice results are also saved for evaluation and supervision by a specialist.
APA, Harvard, Vancouver, ISO, and other styles
33

Tang, Xiao, Ruihui Li, and Chi-Wing Fu. "CAFI-AR." Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6, no. 4 (December 21, 2022): 1–23. http://dx.doi.org/10.1145/3569499.

Full text
Abstract:
Freehand interaction enhances user experience, allowing one to use bare hands to manipulate virtual objects in AR. Yet, it remains challenging to accurately and efficiently detect contacts between a real hand and a virtual object, due to the imprecise captured/estimated hand geometry. This paper presents CAFI-AR, a new approach for Contact-Aware Freehand Interaction with virtual AR objects, enabling us to automatically detect hand-object contacts in real-time with low latency. Specifically, we formulate a compact deep architecture to efficiently learn to predict hand action and contact moment from sequences of captured RGB images relative to the 3D virtual object. To train the architecture for detecting contacts on AR objects, we build a new dataset with 4,008 frame sequences, each with annotated hand-object interaction information. Further, we integrate CAFI-AR into our prototyping AR system and develop various interactive scenarios, demonstrating fine-grained contact-aware interactions on a rich variety of virtual AR objects, which cannot be achieved by existing AR interaction approaches. Lastly, we also evaluate CAFI-AR, quantitatively and qualitatively, through two user studies to demonstrate its effectiveness in terms of accurately detecting the hand-object contacts and promoting fluid freehand interactions.
APA, Harvard, Vancouver, ISO, and other styles
34

Panditi Uma Vignesh, Mohammed Thaqib-ul-Rahman, Sripada Dheeraj, Tileti Sai Charan Reddy, and Mohammed Ayaz Uddin. "AI Virtual Mouse Using Hand Gesture Recognition." International Research Journal on Advanced Engineering Hub (IRJAEH) 3, no. 05 (May 13, 2025): 2259–63. https://doi.org/10.47392/irjaeh.2025.0332.

Full text
Abstract:
The AI Virtual Mouse leverages advanced computer vision and artificial intelligence techniques to replace traditional physical mice with an innovative, touch-free solution for human-computer interaction. This system uses a camera to capture real-time hand movements and gestures, which are processed through machine learning algorithms for accurate gesture recognition. The recognized gestures are mapped to conventional mouse functionalities such as cursor movement, clicking, dragging, and scrolling. Designed with user accessibility and ergonomic considerations in mind, the AI Virtual Mouse provides an intuitive and efficient alternative for individuals with physical limitations or those seeking hands-free control. The implementation involves a combination of technologies, including OpenCV for image processing, MediaPipe for hand tracking, and deep learning models for gesture classification.
APA, Harvard, Vancouver, ISO, and other styles
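The OpenCV/MediaPipe pipeline summarized in the abstract above hinges on two small, separable steps: scaling a normalized landmark coordinate to screen pixels, and smoothing successive positions to suppress jitter. A hedged sketch of both (screen size and smoothing factor are illustrative assumptions):

```python
def map_to_screen(x: float, y: float,
                  width: int = 1920, height: int = 1080) -> tuple:
    """Scale a normalized landmark coordinate (0..1) to screen pixels,
    clamping out-of-range values to the screen edges."""
    px = min(max(x, 0.0), 1.0) * (width - 1)
    py = min(max(y, 0.0), 1.0) * (height - 1)
    return round(px), round(py)

def smooth(prev: tuple, new: tuple, alpha: float = 0.3) -> tuple:
    """Exponential smoothing: alpha weights the newest sample, so a
    smaller alpha yields a steadier (but laggier) cursor."""
    return (prev[0] + alpha * (new[0] - prev[0]),
            prev[1] + alpha * (new[1] - prev[1]))
```

In a full system, `map_to_screen` would be fed the index-fingertip landmark from the hand tracker each frame, and the smoothed position handed to an OS-level cursor API.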
35

Han, Yoon Chung, and Byeong-jun Han. "Virtual pottery: a virtual 3D audiovisual interface using natural hand motions." Multimedia Tools and Applications 73, no. 2 (February 27, 2013): 917–33. http://dx.doi.org/10.1007/s11042-013-1382-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Gupta, Mr Deven, Ms Anamika Zagade, Ms Mansi Gawand, Ms Saloni Mhatre, and Prof J. W. Bakal. "Gesture Controlled Virtual Artboard." International Journal for Research in Applied Science and Engineering Technology 12, no. 2 (February 29, 2024): 1557–65. http://dx.doi.org/10.22214/ijraset.2024.58680.

Full text
Abstract:
Abstract: Hand gesture recognition has brought a brand-new era to artificial intelligence. Gesture recognition is technology that interprets hand movements as commands. The Gesture Controlled Virtual Artboard project is an innovative and dynamic digital artboard. This artboard can break down various educational barriers by providing students with a fun and creative way of learning, and hence it can revolutionize traditional teaching. Physically challenged or aged people find it difficult to identify and press the exact key on a keyboard, and existing systems allow free-hand drawing with only three colors. Overcoming these difficulties is the ultimate goal of the proposed system. The system works on motion-sensing technology and makes use of the OpenCV module and the MediaPipe library to interpret results based on hand gestures. Real-time data is collected through a webcam, and this Python application uses the MediaPipe library to track hand gestures. Hence, the system allows users to move their fingers in mid-air and, according to the finger gestures, draw different shapes or free-hand drawings in various colors, as well as erase. The project also enables users to conduct PowerPoint presentations using gestures.
APA, Harvard, Vancouver, ISO, and other styles
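Artboard systems like the one described above typically switch tools based on which fingers are raised, a signal hand trackers expose directly. The mapping below is purely illustrative (the paper does not specify its gesture vocabulary):

```python
# Hypothetical finger-count -> tool mapping for a gesture artboard.
TOOLS = {1: "draw", 2: "select_color", 5: "erase"}

def select_tool(raised_fingers: int) -> str:
    """Pick a tool from the number of raised fingers; unrecognized
    counts leave the artboard idle so stray gestures do nothing."""
    return TOOLS.get(raised_fingers, "idle")
```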
37

Kang, Taeseok, Minsu Chae, Eunbin Seo, Mingyu Kim, and Jinmo Kim. "DeepHandsVR: Hand Interface Using Deep Learning in Immersive Virtual Reality." Electronics 9, no. 11 (November 6, 2020): 1863. http://dx.doi.org/10.3390/electronics9111863.

Full text
Abstract:
This paper proposes a hand interface through a novel deep learning that provides easy and realistic interactions with hands in immersive virtual reality. The proposed interface is designed to provide a real-to-virtual direct hand interface using a controller to map a real hand gesture to a virtual hand in an easy and simple structure. In addition, a gesture-to-action interface that expresses the process of gesture to action in real-time without the necessity of a graphical user interface (GUI) used in existing interactive applications is proposed. This interface uses the method of applying image classification training process of capturing a 3D virtual hand gesture model as a 2D image using a deep learning model, convolutional neural network (CNN). The key objective of this process is to provide users with intuitive and realistic interactions that feature convenient operation in immersive virtual reality. To achieve this, an application that can compare and analyze the proposed interface and the existing GUI was developed. Next, a survey experiment was conducted to statistically analyze and evaluate the positive effects on the sense of presence through user satisfaction with the interface experience.
APA, Harvard, Vancouver, ISO, and other styles
38

Liu, Siyuan, and Chao Sun. "Master–Slave Control System for Virtual–Physical Interactions Using Hands." Sensors 23, no. 16 (August 11, 2023): 7107. http://dx.doi.org/10.3390/s23167107.

Full text
Abstract:
Among the existing technologies for hand protection, master–slave control technology has been extensively researched and applied within the field of safety engineering to mitigate the occurrence of safety incidents. However, it has been identified through research that traditional master–slave control technologies no longer meet current production and lifestyle needs, and they have even begun to pose new safety risks. To resolve the safety risks exposed by traditional master–slave control, this research fuses master–slave control technology for hands with virtual reality technology, and the design of a master–slave control system for hands based on virtual reality technology is investigated. This study aims to realize the design of a master–slave control system for virtual–physical interactions using hands that captures the position, orientation, and finger joint angles of the user’s hand in real time and synchronizes the motion of the slave interactive device with that of a virtual hand. With amplitude limiting, jitter elimination, and a complementary filtering algorithm, the original motion data collected by the designed glove are turned into a Kalman-filtering-algorithm-based driving database, which drives the synchronous interaction of the virtual hand and a mechanical hand. As for the experimental results, the output data for the roll, pitch, and yaw were in the stable ranges of −0.1° to 0.1°, −0.15° to 0.15°, and −0.15° to 0.15°, respectively, which met the accuracy requirements for the system’s operation under different conditions. More importantly, these data prove that, in terms of accuracy and denoising, the data-processing algorithm was relatively compatible with the hardware platform of the system. 
Based on the algorithm for the virtual–physical interaction model, the authors introduced the concept of an auxiliary hand into the research, put forward an algorithmic process and a judgement condition for the stable grasp of the virtual hand, and solved a model-penetrating problem while enhancing the immersive experience during virtual–physical interactions. In an interactive experiment, a dynamic accuracy test was run on the system. As shown by the experimental data and the interactive effect, the system was satisfactorily stable and interactive.
APA, Harvard, Vancouver, ISO, and other styles
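The abstract above mentions a complementary filtering algorithm for turning raw glove motion data into stable roll/pitch/yaw estimates. A minimal sketch of one complementary-filter update, assuming a gyro rate and an accelerometer-derived angle as inputs (the gain `k` is an illustrative choice, not the paper's):

```python
def complementary_filter(angle: float, gyro_rate: float,
                         accel_angle: float, dt: float,
                         k: float = 0.98) -> float:
    """One update step: trust the integrated gyro short-term (weight k,
    low drift over one step) and the accelerometer-derived angle
    long-term (weight 1 - k, noisy but drift-free)."""
    return k * (angle + gyro_rate * dt) + (1.0 - k) * accel_angle
```

Run once per sample, the estimate tracks fast gyro motion while slowly converging to the accelerometer angle, which is what keeps the virtual hand's orientation from drifting.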
39

Ahmed Mohamed, Esraa, Mohamed M.A khallaf, Samia M. A. Saied, and Hussein G.H..Mogahed. "EFFECT OF VIRTUAL REALITY ON HAND DYSFUNCTION POST BURN." Cuestiones de Fisioterapia 54, no. 4 (February 20, 2025): 5140–49. https://doi.org/10.48047/d5s4sx47.

Full text
Abstract:
Background: Severe functional impairments and negative effects on both mental and physical well-being can result from burns to the hands. Purpose: to evaluate the impact of virtual reality on improving range of motion (ROM), hand function, and strength in the rehabilitation of the burned hand. Subjects and methods: Forty-two patients with second-degree hand burns participated in this study; their ages ranged from 20 to 45 years. Group A (21 patients) received virtual reality (VR)-based rehabilitation plus conventional hand rehabilitation, and Group B (21 patients) received conventional hand rehabilitation only. Patients in Group A were treated with a thirty-minute standard conventional program and thirty minutes of VR-based rehabilitation (Leap Motion controller), 3 sessions per week for 8 weeks. Patients in Group B received conventional hand rehabilitation (scar massage, stretching exercises, ROM exercises, strengthening exercises, splinting), 3 sessions per week for 8 weeks. Patients in both groups were recruited from the Merit University outpatient clinic. Initial measurements were taken before the first session and again after 2 months of treatment. Results: all hand ROM measures improved markedly post-intervention in both groups, and most of these improvements were statistically significant (p<0.05); hand grip and hand function showed significant improvement in both groups (p<0.001). Conclusion: VR-based rehabilitation had a non-significant effect on improving hand range of motion, hand strength, and hand function in patients with partial-thickness (second-degree) burns in comparison with conventional hand rehabilitation, although each group improved significantly on its own.
APA, Harvard, Vancouver, ISO, and other styles
40

Rathi, Aman. "AI Virtual Mouse." International Journal for Research in Applied Science and Engineering Technology 13, no. 5 (May 31, 2025): 852–58. https://doi.org/10.22214/ijraset.2025.70308.

Full text
Abstract:
The methods of human-computer interaction have changed with the advancement of computer technology, and the mouse remains a central tool for it. This study offers a way to move the pointer without using any physical device: the cursor is controlled by hand movements, while other hand gestures perform operations such as clicking and dragging objects. The suggested system requires only a camera as an input device, together with OpenCV and Python. The output from the camera is shown on a connected display so that the user may adjust it further. The Python tools used to build this system are NumPy and MediaPipe, a highly effective framework that offers fast solutions for AI tasks. It allows users to move the computer cursor with hand movements alone, without holding devices such as colored markers. Certain finger movements can be used to carry out actions like dragging and left-clicking. This research presents a hand gesture detection system for controlling a virtual mouse, enabling more natural human-computer interaction.
APA, Harvard, Vancouver, ISO, and other styles
41

R., SATHEESHKUMAR. "IOT Integrated Virtual Hand for Robotic Arm Using Leap Motion Controller." Journal of Research on the Lepidoptera 51, no. 2 (April 20, 2020): 49–60. http://dx.doi.org/10.36872/lepi/v51i2/301077.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Adkins, Alex, Lorraine Lin, Aline Normoyle, Ryan Canales, Yuting Ye, and Sophie Jörg. "Evaluating Grasping Visualizations and Control Modes in a VR Game." ACM Transactions on Applied Perception 18, no. 4 (October 31, 2021): 1–14. http://dx.doi.org/10.1145/3486582.

Full text
Abstract:
A primary goal of the Virtual Reality (VR) community is to build fully immersive and presence-inducing environments with seamless and natural interactions. To reach this goal, researchers are investigating how to best directly use our hands to interact with a virtual environment using hand tracking. Most studies in this field require participants to perform repetitive tasks. In this article, we investigate if results of such studies translate into a real application and game-like experience. We designed a virtual escape room in which participants interact with various objects to gather clues and complete puzzles. In a between-subjects study, we examine the effects of two input modalities (controllers vs. hand tracking) and two grasping visualizations (continuously tracked hands vs. virtual hands that disappear when grasping) on ownership, realism, efficiency, enjoyment, and presence. Our results show that ownership, realism, enjoyment, and presence increased when using hand tracking compared to controllers. Visualizing the tracked hands during grasps leads to higher ratings in one of our ownership questions and one of our enjoyment questions compared to having the virtual hands disappear during grasps as is common in many applications. We also confirm some of the main results of two studies that have a repetitive design in a more realistic gaming scenario that might be closer to a typical user experience.
APA, Harvard, Vancouver, ISO, and other styles
43

Guha, Joy, Shreya Kumari, and Prof Shiv Kumar Verma. "AI Virtual Mouse Using Hand Gesture Recognition." International Journal for Research in Applied Science and Engineering Technology 10, no. 4 (April 30, 2022): 3070–76. http://dx.doi.org/10.22214/ijraset.2022.41981.

Full text
Abstract:
Abstract: The computer mouse is one of the notable inventions in the field of Human-Computer Interaction (HCI). Even a wireless or contactless mouse is not entirely device-free, since it draws power from the computer or from external sources such as batteries, occupies space, and consumes electricity; moreover, during the COVID pandemic it was advised to maintain social distancing and avoid touching objects handled by other people. In the proposed AI virtual mouse using hand gestures, this limitation is resolved by using a digital camera or built-in webcam to recognize hand gestures and detect fingers through computer vision. The algorithm used in the system relies on artificial intelligence and machine learning. Based on the hand gestures, the device can be controlled remotely: it can perform left click, right click, and scrolling functions, and move the computer pointer, all without the use of a physical mouse. Index Terms: deep learning-based computer vision, real-time mouse system, MediaPipe, HCI
APA, Harvard, Vancouver, ISO, and other styles
44

Westley, S., R. Mistry, and B. Dheansa. "Accuracy of virtual assessment in hand trauma." JPRAS Open 31 (March 2022): 92–98. http://dx.doi.org/10.1016/j.jpra.2021.10.008.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Sebelius, F., M. Axelsson, N. Danielsen, J. Schouenborg, and T. Laurell. "Real-time control of a virtual hand." Technology and Disability 17, no. 3 (August 23, 2005): 131–41. http://dx.doi.org/10.3233/tad-2005-17301.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Xiong, Wei, Qing Hui Wang, and Cheng Zhong Zhang. "A Hybrid Interaction Method with Virtual Hand." Advanced Materials Research 311-313 (August 2011): 1539–45. http://dx.doi.org/10.4028/www.scientific.net/amr.311-313.1539.

Full text
Abstract:
In this paper, a novel solution combining heuristic constraints with a physics-based method for virtual interaction is proposed. With it, a grasping simulation is divided into three stages (pre-grasping, preliminary grasping, and stable grasping), and the two techniques are used in different phases. A dexterous virtual hand, consisting of a geometric model, kinematics model, collision model, and contact force model, is also constructed for use in this solution. By analyzing the physical structure and perceptual characteristics of the human finger, a haptic rendering algorithm is presented to obtain natural and stable virtual interaction.
APA, Harvard, Vancouver, ISO, and other styles
47

Chan, A., R. W. H. Lau, and L. Li. "Hand Motion Prediction for Distributed Virtual Environments." IEEE Transactions on Visualization and Computer Graphics 14, no. 1 (January 2008): 146–59. http://dx.doi.org/10.1109/tvcg.2007.1056.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Sanchez-Vives, Maria V., Bernhard Spanlang, Antonio Frisoli, Massimo Bergamasco, and Mel Slater. "Virtual Hand Illusion Induced by Visuomotor Correlations." PLoS ONE 5, no. 4 (April 29, 2010): e10381. http://dx.doi.org/10.1371/journal.pone.0010381.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

El Magrouni, Issam, Abdelaziz Ettaoufik, Siham Aouad, and Abderrahim Maizate. "Hand Gesture Recognition for Virtual Mouse Control." International Journal of Interactive Mobile Technologies (iJIM) 19, no. 02 (January 27, 2025): 53–64. https://doi.org/10.3991/ijim.v19i02.51879.

Full text
Abstract:
Our work delved into the complexities of real-time hand motion interpretation and fingertip recognition to simulate the functionality of a traditional mouse. We developed a Python-based technique that seamlessly translates hand movements into mouse commands by analyzing the angles between fingers and calculating the ratio of the hand's silhouette to its convex hull. Our methodology was refined to ensure an intuitive and accurate user experience. However, challenges remained in achieving robustness and accuracy in gesture recognition across various scenarios, including variations in lighting, hand orientation, and individual human characteristics; these factors had a significant impact on system performance and reliability. To address these challenges, our approach incorporated algorithms and machine learning models designed to adapt to different conditions. Despite these advances, further research and development remain essential to improve the reliability and comprehensiveness of gesture recognition technologies.
APA, Harvard, Vancouver, ISO, and other styles
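The inter-finger angle feature the abstract above relies on is a small piece of vector geometry: given the wrist as a common origin, the angle between two fingertip vectors comes from the dot product. A self-contained sketch (point format is an assumption; real systems would take tracker landmarks):

```python
import math

def finger_angle(wrist, tip_a, tip_b) -> float:
    """Angle in degrees between two finger vectors that share the wrist
    as origin — the kind of feature a gesture classifier thresholds on."""
    ax, ay = tip_a[0] - wrist[0], tip_a[1] - wrist[1]
    bx, by = tip_b[0] - wrist[0], tip_b[1] - wrist[1]
    dot = ax * bx + ay * by
    na, nb = math.hypot(ax, ay), math.hypot(bx, by)
    # Clamp to guard against floating-point drift outside [-1, 1].
    cos_t = max(-1.0, min(1.0, dot / (na * nb)))
    return math.degrees(math.acos(cos_t))
```

The silhouette-to-convex-hull area ratio mentioned alongside it plays a complementary role: an open hand fills its hull far less than a fist, so the two features together separate spread-finger from closed-hand gestures.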
50

Sun, Hanqiu, and Kwok-hang Tsang. "Fuzzy Posture Input for Virtual-Hand Models." Presence: Teleoperators and Virtual Environments 9, no. 5 (October 2000): 473–85. http://dx.doi.org/10.1162/105474600566961.

Full text
Abstract:
Virtual-hand input realizes natural and dexterous functionality in direct human-computer interaction. The recognition of virtual-hand models is difficult for two reasons: the complexity of hand structure and the lack of accurate measurement, which may be caused by either mechanical noise or human factors. This paper presents a novel fuzzy-logic recognition system that can effectively deal with imprecise hand data. The system consists of three components: the classifier, the identifier, and the posture database. Fuzzy-logic processing is applied both in the classifier (to build a class-indexing structure in the posture database) and in the identifier (to find the most likely match from the database in real-time VR applications). The posture database provides a GUI for the user to browse posture images and interactively update the database by adding and deleting sample postures, as well as adjusting the certainty threshold for recognition. Our experiments show that the fuzzy-logic method can maintain nearly constant recognition performance, on the order of tens of microseconds, even as the size of the posture database increases. The recognition rate declines only slightly when the standard deviation of the noise distribution (Gaussian) in the input parameters is below 15 deg; the bound on a uniform noise distribution for accurate recognition is below 20 deg. The results show that fuzzy-logic processing can improve the tolerance of noisy or imprecise data in an efficient way. A free-hand modeler based on fuzzy-logic processing has been developed for creating virtual objects and Web-based 3D VRML worlds.
APA, Harvard, Vancouver, ISO, and other styles