Academic literature on the topic 'Gesture-based interface'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Gesture-based interface.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Gesture-based interface"

1. Gupta, Namita, Pooja Mittal, Sumantra Dutta Roy, Santanu Chaudhury, and Subhashis Banerjee. "Developing a Gesture-based Interface." IETE Journal of Research 48, no. 3-4 (2002): 237–44. http://dx.doi.org/10.1080/03772063.2002.11416282.

2. Kirmizibayrak, Can, Nadezhda Radeva, Mike Wakid, John Philbeck, John Sibert, and James Hahn. "Evaluation of Gesture Based Interfaces for Medical Volume Visualization Tasks." International Journal of Virtual Reality 11, no. 2 (2012): 1–13. http://dx.doi.org/10.20870/ijvr.2012.11.2.2839.

Abstract:
Interactive systems are increasingly used in medical applications with the widespread availability of various imaging modalities. Gesture-based interfaces can be beneficial for interacting with such systems in a variety of settings, as they can be easier to learn and can eliminate several shortcomings of traditional tactile systems, especially in surgical applications. We conducted two user studies that explore different gesture-based interfaces for interaction with volume visualizations. The first experiment focused on rotation tasks, where the performance of the gesture-based interface (using Microsoft Kinect) was compared to using the mouse. The second experiment studied localization of internal structures, comparing slice-based visualizations via gestures and the mouse, in addition to a 3D Magic Lens visualization. The results of the user studies showed that the gesture-based interface outperforms the traditional mouse in both time and accuracy on the orientation-matching task. The traditional mouse was the superior interface for the second experiment in terms of accuracy. However, the gesture-based Magic Lens interface was found to have the fastest target localization time. We discuss these findings and their implications for the use of gesture-based interfaces in medical volume visualization, as well as the possible underlying psychological mechanisms explaining why these methods can outperform traditional interaction methods.

3. Elmagrouni, Issam, Abdelaziz Ettaoufik, Siham Aouad, and Abderrahim Maizate. "Approach for Improving User Interface Based on Gesture Recognition." E3S Web of Conferences 297 (2021): 01030. http://dx.doi.org/10.1051/e3sconf/202129701030.

Abstract:
Gesture recognition technology based on visual detection acquires gesture information in a non-contact manner. There are two types of gesture recognition: independent and continuous gesture recognition. The former aims to classify videos or other types of gesture sequences (e.g., RGB-D or skeleton data) that contain only one isolated gesture instance per sequence. In this study, we review existing research methods for visual gesture recognition, grouped into the following families: static; dynamic; based on the supporting device (Kinect, Leap, etc.); works that focus on applying gesture recognition to robots; and works dealing with gesture recognition at the browser level. Following that, we take a look at the most common JavaScript-based deep learning frameworks. Then we present the idea of defining a process for improving user interface control based on gesture recognition to streamline the implementation of this mechanism.

4. De Marsico, M., S. Levialdi, M. Nappi, and S. Ricciardi. "FIGI: floating interface for gesture-based interaction." Journal of Ambient Intelligence and Humanized Computing 5, no. 4 (2012): 511–24. http://dx.doi.org/10.1007/s12652-012-0160-9.

5. Takahashi, Tomoichi, and Fumio Kishino. "Hand gesture coding based on experiments using a hand gesture interface device." ACM SIGCHI Bulletin 23, no. 2 (1991): 67–74. http://dx.doi.org/10.1145/122488.122499.

6. Shin, Seunghyeok, and Whoi-Yul Kim. "Skeleton-Based Dynamic Hand Gesture Recognition Using a Part-Based GRU-RNN for Gesture-Based Interface." IEEE Access 8 (2020): 50236–43. http://dx.doi.org/10.1109/access.2020.2980128.

7. Lee, Seongjo, Sohyun Sim, Kyhyun Um, Young-Sik Jeong, Seung-won Jung, and Kyungeun Cho. "Development of a Hand Gestures SDK for NUI-Based Applications." Mathematical Problems in Engineering 2015 (2015): 1–10. http://dx.doi.org/10.1155/2015/212639.

Abstract:
Concomitant with the advent of the ubiquitous era, research into better human-computer interaction (HCI) for human-focused interfaces has intensified. Natural user interface (NUI), in particular, is being actively investigated with the objective of simpler and more intuitive interaction between humans and computers. However, developing NUI-based applications without special NUI-related knowledge is difficult. This paper proposes an NUI-specific SDK, called "Gesture SDK," for the development of NUI-based applications. Gesture SDK provides a gesture generator with which developers can directly define gestures. Further, a "Gesture Recognition Component" is provided that enables defined gestures to be recognized by applications. We generated gestures using the proposed SDK and developed "Smart Interior," an NUI-based application, using the Gesture Recognition Component. The results of the experiments conducted indicate that the recognition rate of the generated gestures was 96% on average.

8. Ryumin, Dmitry, Ildar Kagirov, Alexandr Axyonov, et al. "A Multimodal User Interface for an Assistive Robotic Shopping Cart." Electronics 9, no. 12 (2020): 2093. http://dx.doi.org/10.3390/electronics9122093.

Abstract:
This paper presents the research and development of a prototype of the assistive mobile information robot (AMIR). The main features of the presented prototype are voice and gesture-based interfaces, with Russian speech and sign language recognition and synthesis techniques, and a high degree of robot autonomy. The AMIR prototype is intended to be used as a robotic cart for shopping in grocery stores and/or supermarkets. Among the main topics covered in this paper are the presentation of the interface (three modalities), the single-handed gesture recognition system (based on a collected database of Russian sign language elements), and the technical description of the robotic platform (architecture, navigation algorithm). The use of multimodal interfaces, namely the speech and gesture modalities, makes human-robot interaction natural and intuitive, while sign language recognition allows hearing-impaired people to use this robotic cart. The AMIR prototype has promising prospects for real use in supermarkets, both due to its assistive capabilities and its multimodal user interface.

9. Lim, C. J., Nam-Hee Lee, Yun-Guen Jeong, and Seung-Il Heo. "Gesture based Natural User Interface for e-Training." Journal of the Ergonomics Society of Korea 31, no. 4 (2012): 577–83. http://dx.doi.org/10.5143/jesk.2012.31.4.577.

10. Jophin, Shany. "Gesture Based Interface Using Motion and Image Comparison." International Journal of Advanced Information Technology 2, no. 3 (2012): 37–46. http://dx.doi.org/10.5121/ijait.2012.2303.
