Academic literature on the topic 'Gestural interfaces'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference papers, reports, and other scholarly sources on the topic 'Gestural interfaces.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Gestural interfaces"

1

Sluÿters, Arthur, Mehdi Ousmer, Paolo Roselli, and Jean Vanderdonckt. "QuantumLeap, a Framework for Engineering Gestural User Interfaces based on the Leap Motion Controller." Proceedings of the ACM on Human-Computer Interaction 6, EICS (2022): 1–47. http://dx.doi.org/10.1145/3532211.

Abstract:
Despite the tremendous progress made in recognizing gestures acquired by various devices, such as the Leap Motion Controller, developing a gestural user interface based on such devices still requires a significant programming and software engineering effort before a running interactive application is obtained. To facilitate this development, we present QuantumLeap, a framework for engineering gestural user interfaces based on the Leap Motion Controller. Its pipeline software architecture can be parameterized to define a workflow among modules for acquiring gestures from the Leap Motion Controller, segmenting them, recognizing them, and managing their mapping to functions of the application. To demonstrate its practical usage, we implement two gesture-based applications: an image viewer that allows healthcare workers to browse DICOM medical images of their patients without the hygiene issues commonly associated with touch user interfaces, and a large-scale application for managing multimedia content on wall screens. To evaluate the usability of QuantumLeap, seven participants took part in an experiment in which they used QuantumLeap to add a gestural interface to an existing application.
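The pipeline the abstract describes (acquire frames, segment them into gestures, recognize each segment, and map the result to an application function) can be pictured with a short sketch. The Python below is a hypothetical illustration of such a parameterizable workflow; the class and parameter names are invented for this example and do not reproduce the QuantumLeap API.

```python
# Hypothetical acquire -> segment -> recognize -> map pipeline, loosely
# following the workflow described in the abstract. Not the QuantumLeap API.
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple


@dataclass
class Frame:
    timestamp: float
    joints: List[Tuple[float, float, float]]  # 3D joint positions from the sensor


class GesturePipeline:
    def __init__(
        self,
        segmenter: Callable[[List[Frame]], Tuple[List[List[Frame]], List[Frame]]],
        recognizer: Callable[[List[Frame]], str],
        mapping: Dict[str, Callable[[], None]],
    ) -> None:
        self.buffer: List[Frame] = []
        self.segmenter = segmenter    # returns (completed gesture segments, leftover frames)
        self.recognizer = recognizer  # labels a segment, e.g. "swipe_left"
        self.mapping = mapping        # gesture label -> application function

    def on_frame(self, frame: Frame) -> None:
        """Called for every frame delivered by the acquisition device."""
        self.buffer.append(frame)
        segments, self.buffer = self.segmenter(self.buffer)
        for segment in segments:
            action = self.mapping.get(self.recognizer(segment))
            if action is not None:
                action()

# Example wiring for a hypothetical DICOM viewer:
# pipeline = GesturePipeline(my_segmenter, my_recognizer, {"swipe_left": viewer.next_image})
```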
2

Wolf, Catherine G. "A Comparative Study of Gestural and Keyboard Interfaces." Proceedings of the Human Factors Society Annual Meeting 32, no. 5 (1988): 273–77. http://dx.doi.org/10.1177/154193128803200506.

Abstract:
This paper presents results from two experiments that compared gestural and keyboard interfaces to a spreadsheet program. This is the first quantitative comparison of these two types of interfaces known to the author. The gestural interface employed gestures (hand-drawn marks such as carets or brackets) for commands and handwriting as an input technique. In one configuration, the input/output hardware consisted of a transparent digitizing tablet mounted on top of an LCD, which allowed the user to interact with the program by writing on the tablet with a stylus. The experiments found that participants were faster with the gestural interface. Specifically, subjects performed the operations in about 72% of the time taken with the keyboard. In addition, there was a preference for the gestural interface over the keyboard interface. These findings are explained in terms of the smaller number of movements required to carry out an operation with the gestural interface, the greater ease of remembering gestural commands, and the benefits of performing operations directly on the objects of interest.
3

Caon, Maurizio, Rico Süsse, Benoit Grelier, Omar Abou Khaled, and Elena Mugellini. "Design of an ergonomic gestural interface for professional road cycling." Work 66, no. 4 (2020): 933–44. http://dx.doi.org/10.3233/wor-203238.

Abstract:
BACKGROUND: Connected bike computers can support professional cyclists in achieving better performance, but interacting with them requires taking the hands off the handlebar, compromising focus and safety.
OBJECTIVE: This research aims to explore the design of an ergonomic interface based on micro-gestures that allows cyclists to interact with a device while holding the handlebar.
METHODS: Three studies were conducted with seven professional cyclists using the gesture-elicitation technique. One study aimed at eliciting free micro-gestures; a second evaluated gestures recognizable with a smart glove; the last focused on gestures recognized through an interactive armband.
RESULTS: The analysis of the micro-gestures elicited during these studies produced a first set of guidelines for designing gestural interfaces for drop bars (a specific type of handlebar for road bikes). These guidelines suggest which fingers to use and how to design their movements in order to provide an ergonomic interface. They also introduce a principle of symmetry for the attribution of symbols to symmetric referents, and provide suggestions on the design of the interactive drop bar.
CONCLUSIONS: The guidelines provided in this paper can support the design of gestural interfaces for professional cyclists that enhance performance and increase safety.
4

Wodehouse, Andrew, and Jonathon Marks. "Gestural Product Interaction." International Journal of Art, Culture and Design Technologies 3, no. 2 (2013): 1–13. http://dx.doi.org/10.4018/ijacdt.2013070101.

Abstract:
This research explores emotional response to gesture in order to inform future product interaction design. After describing the emergence and likely role of full-body interfaces with devices and systems, the importance of emotional reaction to the necessary movements and gestures is outlined. A gestural vocabulary for the control of a web page is then presented, along with a semantic differential questionnaire for its evaluation. An experiment is described where users undertook a series of web navigation tasks using the gestural vocabulary, then recorded their reaction to the experience. A number of insights were drawn on the context, precision, distinction, repetition and scale of gestures when used to control or activate a product. These insights will be of help in interaction design, and provide a basis for further development of the gestural vocabulary.
5

LaViola, Joseph J. "3D Gestural Interaction: The State of the Field." ISRN Artificial Intelligence 2013 (December 18, 2013): 1–18. http://dx.doi.org/10.1155/2013/514641.

Abstract:
3D gestural interaction provides a powerful and natural way to interact with computers using the hands and body for a variety of different applications including video games, training and simulation, and medicine. However, accurately recognizing 3D gestures so that they can be reliably used in these applications poses many different research challenges. In this paper, we examine the state of the field of 3D gestural interfaces by presenting the latest strategies on how to collect the raw 3D gesture data from the user and how to accurately analyze this raw data to correctly recognize 3D gestures users perform. In addition, we examine the latest in 3D gesture recognition performance in terms of accuracy and gesture set size and discuss how different applications are making use of 3D gestural interaction. Finally, we present ideas for future research in this thriving and active research area.
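As a concrete illustration of the recognition step surveyed here, the sketch below shows a common baseline from this literature: nearest-neighbour matching of resampled 3D trajectories against stored templates. It is a generic toy example written for this summary, not a method taken from the article; `resample`, the point count of 32, and the distance score are all assumptions.

```python
# Toy 3D gesture recognizer: resample each trajectory to a fixed number of
# points and pick the stored template with the smallest average point-to-point
# distance. A deliberately simple baseline, not the survey's own method.
import math
from typing import Dict, List, Tuple

Point = Tuple[float, float, float]


def resample(path: List[Point], n: int = 32) -> List[Point]:
    """Linearly resample a 3D trajectory to n points spaced evenly along its length."""
    if not path:
        return [(0.0, 0.0, 0.0)] * n
    if len(path) == 1:
        return path * n
    dists = [0.0]
    for a, b in zip(path, path[1:]):
        dists.append(dists[-1] + math.dist(a, b))
    total = dists[-1] or 1e-9
    out: List[Point] = []
    for i in range(n):
        target = total * i / (n - 1)
        j = max(k for k, d in enumerate(dists) if d <= target)  # segment containing target
        j = min(j, len(path) - 2)
        seg = (dists[j + 1] - dists[j]) or 1e-9
        t = (target - dists[j]) / seg
        a, b = path[j], path[j + 1]
        out.append(tuple(a[k] + t * (b[k] - a[k]) for k in range(3)))
    return out


def recognize(path: List[Point], templates: Dict[str, List[Point]]) -> str:
    """Return the label of the template closest to the observed trajectory."""
    probe = resample(path)

    def score(template: List[Point]) -> float:
        return sum(math.dist(p, q) for p, q in zip(probe, resample(template))) / len(probe)

    return min(templates, key=lambda label: score(templates[label]))
```

In practice a recognizer of this kind would also translate and scale-normalize the trajectories before matching; that step is omitted here for brevity.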
6

Norman, Donald A., and Jakob Nielsen. "Gestural interfaces." Interactions 17, no. 5 (2010): 46–49. http://dx.doi.org/10.1145/1836216.1836228.

7

Knight, Julia, and Alexis Weedon. "Gestural interfaces." Convergence: The International Journal of Research into New Media Technologies 17, no. 3 (2011): 235–36. http://dx.doi.org/10.1177/1354856511412455.

8

Vasiljevic, Gabriel Alves Mendes, Leonardo Cunha de Miranda, and Erica Esteves Cunha de Miranda. "A Case Study of MasterMind Chess: Comparing Mouse/Keyboard Interaction with Kinect-Based Gestural Interface." Advances in Human-Computer Interaction 2016 (2016): 1–10. http://dx.doi.org/10.1155/2016/4602471.

Abstract:
As gestural interfaces emerged as a new type of user interface, their use has been widely explored by the entertainment industry to better immerse the player in games. Despite being used mainly in dance and sports games, little use has been made of gestural interaction in slower-paced genres, such as board games. In this work, we present a Kinect-based gestural interface for an online, multiplayer chess game and describe a case study with users of different playing skill levels. Comparing mouse/keyboard interaction with gesture-based interaction, the results of the activity were synthesized into lessons learned regarding general usability and the design of game control mechanisms, which can be applied to slow-paced board games like chess. Our findings indicate that gestural interfaces may not be suitable for competitive chess matches, yet they can be fun to use in casual matches.
9

Johnson, Bridget. "Emerging Technologies for Real-Time Diffusion Performance." Leonardo Music Journal 24 (December 2014): 13–15. http://dx.doi.org/10.1162/lmj_a_00188.

Abstract:
With the ascendance of the field of new interfaces for musical expression, a new phase of sound diffusion has emerged. Rapid development is taking place across the field, with a focus on gestural interaction and the development of custom performance interfaces. This article discusses how composers and performers embracing technology have broadened the boundaries of spatial performance. A particular focus is placed on performance interfaces built by the author that afford the artist more control over performative gestures. These new works serve as examples of the burgeoning field of diffusion performance interface design.
10

Mead, Patrick, David Keller, and Megan Kozub. "Point with your eyes not with your hands." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 60, no. 1 (2016): 835–39. http://dx.doi.org/10.1177/1541931213601190.

Abstract:
The emergence of gesture-based controls like the Microsoft Kinect provides new opportunities for creative and innovative methods of human-computer interaction. However, such devices are not without their limitations. The gross motor movements of gestural interaction present physical limitations that may negatively affect interaction speed, accuracy, and workload, and subsequently affect the design of system interfaces and inputs. Conversely, interaction methods such as eye tracking require little physical effort, leveraging the unconscious and natural behaviors of human eye movements as inputs. Unfortunately, eye tracking is, in most cases, limited to a simple pointing device. However, this research shows that by combining these interactions into gaze-based gestural controls, it is possible to overcome the limitations of each method, improving interaction performance by associating gestural commands with interface elements within a user's field of view.
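The combination described here, in which gaze selects the target and a coarse gesture issues the command, can be reduced to a simple event-fusion rule. The sketch below is an illustration under assumed event names, a made-up timing window, and an arbitrary command set; it is not the authors' implementation.

```python
# Illustrative gaze + gesture fusion: the most recently fixated UI element
# receives whatever gestural command arrives within a short time window.
# Event names, window length, and commands are assumptions for this sketch.
import time
from typing import Callable, Dict, Optional

FUSION_WINDOW_S = 0.8  # how long a fixation remains a valid command target


class GazeGestureController:
    def __init__(self, commands: Dict[str, Callable[[str], None]]) -> None:
        self.commands = commands                  # gesture label -> handler(element_id)
        self.last_fixation: Optional[str] = None  # id of the UI element under the gaze
        self.last_fixation_time = 0.0

    def on_fixation(self, element_id: str) -> None:
        """Eye tracker reports that the user is fixating a UI element."""
        self.last_fixation = element_id
        self.last_fixation_time = time.monotonic()

    def on_gesture(self, label: str) -> None:
        """Gesture recognizer reports a command gesture, e.g. 'grab' or 'swipe'."""
        fresh = time.monotonic() - self.last_fixation_time <= FUSION_WINDOW_S
        if self.last_fixation is not None and fresh and label in self.commands:
            self.commands[label](self.last_fixation)  # apply the command to the gazed target
```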

Dissertations / Theses on the topic "Gestural interfaces"

1

Lindberg, Martin. "Introducing Gestures: Exploring Feedforward in Touch-Gesture Interfaces." Thesis, Malmö universitet, Fakulteten för kultur och samhälle (KS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-23555.

Abstract:
This interaction design thesis aimed to explore how users could be introduced to the different functionalities of a gesture-based touch-screen interface. This was done through a user-centred design research process in which the designer was taught different artefacts by experienced users. Insights from this process laid the foundation for an interactive, digital gesture-introduction prototype. Testing this prototype with users yielded the study's results. While it contained several areas for improvement regarding implementation and behaviour, the prototype's basic methods and qualities were well received; further development would be needed to fully assess its viability. The user-centred research methods used in this project proved valuable for the later ideation and prototyping stages. Activities and results from this project indicate a potential for designers to further explore ways of ensuring the discoverability of touch-gesture interactions. For future projects the author suggests more extensive research and testing with a larger sample size and a wider demographic.
2

Li, Sirui (李思锐). "Attentive gestural user interface for touch screens." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2013. http://hub.hku.hk/bib/B50900080.

Abstract:
Gestural interfaces are user interfaces controlled by users' gestures, such as taps, flicks and swipes, without the aid of a conventional pointing device such as a mouse or a touchpad. The development of touch screen technology has resulted in an increasing number of inventive gestural interfaces. However, recent studies have shown that well-established interaction design principles are generally not followed, or are even violated, by gestural interfaces. As a result, severe usability issues start to surface: the absence of signifiers for operative gestures, the weakening of visual feedback, the inability to discover every possible action in the interface, as well as the lack of consistency, all of which undermine the user experience of these interfaces and thus need to be addressed. Further analysis of existing gestural interfaces suggests that the sole dependence on gestural input makes interface design unnecessarily complicated, which in turn makes it challenging to establish a standard. Therefore, an approach to supplement gestures with user attention is proposed. By incorporating eye gaze as a new input modality into gestural interactions, this novel type of interface can interact with users in a more intelligent and natural way by collecting input that reflects the users' interest and intention, which makes the interfaces attentive. To demonstrate the viability of this approach, a system was built to utilise eye-tracking techniques to detect the visual attention of users and deliver input data to the applications on a mobile device. A paradigm for attentive gestural interfaces was introduced to provide insights into how such interfaces can be designed, and a software prototype with attentive gestural interfaces was created according to the paradigm. An experiment found that the new type of interface helped users learn a new application faster and modestly increased their accuracy when completing tasks, providing evidence that attentive gestural interfaces can improve usability in terms of learnability and effectiveness. This study is focused on interfaces of mobile devices whose major input mechanism is a touch screen, which are commonly seen and widely adopted. Despite the fact that eye-tracking capability is not generally available on these devices, this study demonstrates that it has great potential to facilitate interfaces that are both gestural and attentive, and that it can enable new possibilities for future user interfaces.
3

Samothrakis, Eric. "Designing interactive musical interfaces : musical and collaborative media projects using tangible and gestural interfaces." Thesis, University of Bristol, 2015. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.687608.

Abstract:
The focus of this thesis is to explore interactivity within musical composition. I start by examining the evolution of the relationship between artist, medium, and audience during the 20th century, and how it moved away from a primarily hierarchical construct towards a more interdependent and flexible structure. The once-established model, in which the composer's creative voice was the only one echoed in a work, has since been challenged. Gradually, new compositional techniques appeared that allowed performers to influence the musical outcome of a composer's work (e.g. indeterminism, or alternative notation systems such as graphical scores). Towards the end of the 20th century, technological advances in the field of human-computer interaction (HCI) accelerated this trend by providing artists with the ability to construct customised interactive musical interfaces. Such interfaces have the potential to facilitate the involvement of audience members and to extend the role of performers so they can leave a temporal mark on the final artistic work. In the course of my research I experiment with the design of such interactive musical interfaces. I focus on the creation of solo projects, aimed at assisting studio-based composition and enhancing musical performance, as well as collaborative cross-disciplinary projects (involving choreography, theatre, etc.) in which the performers' and the audience's participation influences the musical outcome. In my commentary, I describe from a practical perspective the reasoning behind their physical as well as conceptual attributes, on a case-by-case basis.
4

Chueke, J. "Perceptible affordances and feedforward for gestural interfaces : assessing effectiveness of gesture acquisition with unfamiliar interactions." Thesis, City, University of London, 2016. http://openaccess.city.ac.uk/15762/.

Abstract:
The move towards touch-based interfaces disrupts the established ways in which users manipulate and control graphical user interfaces. The predominant mode of interaction established by the desktop interface is to ‘double-click’ an icon in order to open an application, file or folder. Icons show users where to click and their shape, colour and graphic style suggests how they respond to user action. In sharp contrast, in a touch-based interface, an action may require a user to form a gesture with a certain number of fingers, a particular movement, and in a specific place. Often, none of this is suggested in the interface. This thesis adopts the approach of research through design to address the problem of how to inform the user about which gestures are available in a given touch-based interface, how to perform each gesture, and, finally, the effect of each gesture on the underlying system. Its hypothesis is that presenting automatic and animated visual prompts that depict touch and preview gesture execution will mitigate the problems users encounter when they execute commands within unfamiliar gestural interfaces. Moreover, the thesis claims the need for a new framework to assess the efficiency of gestural UI designs. A significant aspect of this new framework is a rating system that was used to assess distinct phases within the users’ evaluation and execution of a gesture. In order to support the thesis hypothesis, two empirical studies were conducted. The first introduces the visual prompts in support of training participants in unfamiliar gestures and gauges participants’ interpretation of their meaning. The second study consolidates the design features that yielded fewer error rates in the first study and assesses different interaction techniques, such as the moment to display the visual prompt. Both studies demonstrate the benefits in providing visual prompts to improve user awareness of available gestures. In addition, both studies confirm the efficiency of the rating system in identifying the most common problems users have with gestures and identifying possible design features to mitigate such problems. The thesis contributes: 1) a gesture-and-effect model and a corresponding rating system that can be used to assess gestural user interfaces, 2) the identification of common problems users have with unfamiliar gestural interfaces and design recommendations to mitigate these problems, and 3) a novel design technique that will improve user awareness of unfamiliar gestures within novel gestural interfaces.
5

Cardoso, Vanessa da Silva. "Proposta de requisitos para elaboração de aplicativo de modelagem, a partir de dobraduras, com interface gestual." Biblioteca Digital de Teses e Dissertações da UFRGS, 2014. http://hdl.handle.net/10183/98133.

Abstract:
This work started from a reflection on sophisticated, not very intuitive graphical interfaces and their impact on three-dimensional modelling. From this reflection, the research was directed towards the study of gestural interfaces. The technique of generating three-dimensional models known as origami was adopted as the focus of the study because of its exploratory, systematic character and its closeness to mathematics, engineering, architecture and design. The research problem was to identify hand-gesture patterns and guidelines for a gestural interface aimed at building three-dimensional models using origami techniques. The research sought its foundation in the literature and in the systematic analysis of physical modelling videos, with respect to hand movements and the behaviour of the models. Schemas and charts were generated from the collected data for analysis. These data helped in the selection of a gesture vocabulary and of guidelines for the interface and for the behaviour of the models in the digital environment, which were then submitted to applicability tests in the physical environment. The results are intended to guide software developers on the main characteristics of origami modelling in a digital medium using a gestural interface.
6

Stößel, Christian. "Gestural interfaces for elderly users - help or hindrance?" Berlin: Universitätsbibliothek der Technischen Universität Berlin, 2012. http://d-nb.info/1026483735/34.

7

Rateau, Hanaë. "Exploring interactive sub-spaces for gestural midair interaction." Thesis, Lille 1, 2017. http://www.theses.fr/2017LIL10054/document.

Abstract:
This dissertation focuses on how to exploit gestural midair interaction to extend the possibilities of existing devices by using interactive spaces. The starting point is the nonverbal communication theory of proxemics, introduced by Edward T. Hall, who stated that our perception of space is dynamic. From this, I argue that we can apply this dynamic understanding of space to interactive spaces, and I propose a novel concept of interaction and an associated design framework for interactive spaces: Mimetic Interaction Space (MIS). To show the prospects MIS offers for midair interaction, I propose three instantiations of the concept that use it in different ways. The first is the use of a MIS as a standalone interface for the control of a remote display. The second is the use of one or several MISs tied to a tablet, either by subdividing the MIS into multiple spaces around the tablet or by considering a MIS as a continuation of the tablet screen around it. The third instantiation is in the context of interaction on wall displays, where a MIS is placed right in front of the screen and acts as a transition space from touch to midair interaction, allowing a continuous transition between the physical and direct nature of touch interaction and the more abstract nature of midair interaction. I finally conclude by discussing the future of interfaces with regard to midair gestures, and I discuss a facet of MIS that opens a novel way to think about MIS-based interaction.
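One way to picture an interactive sub-space is as a bounded region of the sensing volume with its own behaviour, so that the same midair hand position means different things in different regions. The Python below is a hypothetical sketch of that routing; the names, box-shaped regions, and coordinates are assumptions and do not reproduce the MIS framework.

```python
# Hypothetical routing of midair hand positions through "interactive sub-spaces":
# each space is an axis-aligned box in sensor coordinates with its own handler.
# Names, shapes, and coordinates are assumptions, not the thesis framework.
from dataclasses import dataclass
from typing import Callable, List, Tuple

Point = Tuple[float, float, float]


@dataclass
class InteractionSpace:
    name: str
    lo: Point  # minimum corner of the box (sensor coordinates, metres)
    hi: Point  # maximum corner of the box
    handler: Callable[[Point], None]

    def contains(self, p: Point) -> bool:
        return all(self.lo[i] <= p[i] <= self.hi[i] for i in range(3))


def dispatch(hand: Point, spaces: List[InteractionSpace]) -> None:
    """Send the current hand position to every sub-space it falls into."""
    for space in spaces:
        if space.contains(hand):
            space.handler(hand)


# Example: a region to the right of a tablet acts as an extension of its screen.
right_of_tablet = InteractionSpace(
    name="tablet-extension",
    lo=(0.15, -0.10, 0.00),
    hi=(0.40, 0.10, 0.25),
    handler=lambda p: print(f"extended-screen cursor at {p}"),
)
```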
8

Cuccurullo, Stefania. "Virtual worlds for education: methodology, interaction and evaluation." Doctoral thesis, Università degli Studi di Salerno, 2014. http://hdl.handle.net/10556/1433.

Abstract:
When students arrive in the classroom they expect to be involved in immersive, fun and challenging learning experiences, and there is a high risk that they quickly become bored by traditional instructional methods. Technological evolution offers a great variety of sophisticated interactive devices and applications that can be combined with innovative learning approaches to enhance study efficiency during the learning process. 3D immersive multi-user Virtual Worlds (VWs) are becoming increasingly popular and accessible to the wide public due to advances in computational power, graphics and network bandwidth, together with reduced costs. As a consequence, it is possible to offer more engaging user experiences. This is particularly true in the learning sector, where interest is rising worldwide in three-dimensional (3D) VWs and in the new interaction modalities to which young digital natives are accustomed. Research on the educational value of VWs has revealed their potential as learning platforms. However, further studies are needed to assess their effectiveness, satisfaction and social engagement, not only in the general didactic use of the environment but also for specific learning subjects, activities and modalities. The main challenge is to exploit VW features well and to determine learning approaches and interaction modalities in which the didactic actions present added value with respect to traditional education. Indeed, educational VW activities are evolving from early ones based only on information display towards simulated laboratories and new interaction modalities. The main objective of this thesis is to propose new learning methodologies in Virtual Worlds, also experimenting with new interaction modalities and evaluating the effectiveness of the support provided. To this aim, we first investigate how effectively a 3D city-building game supports the learning of waste disposal practice and promotes behaviour change. The game is one of the results of a research project funded by Regione Campania and is addressed to primary school children. A deep analysis of the didactic methodologies adopted worldwide has been performed in order to propose a reputation-based learning approach built on collaborative, competitive and individual activities, and the effectiveness of the proposed approach has been evaluated. The didactic opportunities offered by VWs when considering new interaction approaches are also investigated. Indeed, while for the last four decades the keyboard and mouse have been the primary means of interacting with computers, the availability of greater processing power, larger memories, cameras, and sensors now makes it possible to introduce new interaction modalities into commonly used software. Gestural interfaces offer new interaction modalities that primary school children know well and that may also be accepted by older students. To assess the potential of this new interaction approach during learning activities, we selected Geography as the subject, since students show a decreasing interest in this topic. To this aim, GeoFly, a system supporting Geography learning based on a Virtual Globe and on the interaction modalities offered by Microsoft Kinect, has been developed. GeoFly is designed for elementary-school Geography students. It enables the exploration of the world by flying, adopting the bird (or aeroplane) metaphor. It also enables the teacher to create learning trips by associating images, text and videos with specific places, in order to develop learning activities concerning geographically situated scenarios. The proposed approach has been evaluated through a controlled experiment aimed at assessing the effect of the adoption of GeoFly on both the students' attitude towards learning Geography and their knowledge.
9

Marangoni, Matthew J. "Low Cost Open Source Modal Virtual Environment Interfaces Using Full Body Motion Tracking and Hand Gesture Recognition." Wright State University / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=wright1369256636.

10

Perrotin, Olivier. "Chanter avec les mains : interfaces chironomiques pour les instruments de musique numériques." Thesis, Paris 11, 2015. http://www.theses.fr/2015PA112207/document.

Abstract:
This thesis deals with the real-time control of singing voice synthesis with a graphic tablet, based on the digital musical instrument Cantor Digitalis. The relevance of the graphic tablet for intonation control is considered first, showing that the tablet provides more precise pitch control than the real voice under experimental conditions. To extend this accuracy to any situation, a dynamic pitch-warping method for intonation correction is developed. It makes it possible to play below the pitch perception threshold while preserving the musician's expressivity. Objective and perceptive evaluations validate the method's efficiency. The use of new interfaces for musical expression raises the question of the modalities involved in playing the instrument. A third study reveals a preponderance of the visual modality over auditory perception for intonation control, due to the introduction of visual cues on the tablet surface. Nevertheless, this is compensated by the expressivity allowed by the interface: the writing or drawing ability acquired since early childhood enables quick acquisition of expert control of the instrument. A set of gestures dedicated to the control of different vocal effects encountered in vocal music is proposed. Finally, intensive practice of the instrument takes place within the Chorus Digitalis ensemble, for testing and dissemination. Artistic research has been conducted both on staging and on the choice of the musical repertoire associated with the instrument. Moreover, a visual feedback display dedicated to the audience has been developed to extend the perception of the players' pitch and articulation.

Books on the topic "Gestural interfaces"

1

Bulmer, Russell K. Gestural interfaces to virtual reality musical instruments. University of Manchester, Department of Computer Science, 1996.

2

Do Gestural Interfaces Promote Thinking? Embodied Interaction: Congruent Gestures and Direct-Touch Promote Performance in Math. [publisher not identified], 2011.

3

Jones, M. Using gestures to interface with a virtual reality. Trinity College, Dublin, 1992.

4

Wixon, Dennis, ed. Brave NUI world: Designing natural user interfaces for touch and gesture. Morgan Kaufmann, 2011.

5

Fels, S. Sidney. Glove-Talk II: Mapping hand gestures to speech using neural networks: an approach to building adaptive interfaces. University of Toronto, Dept. of Computer Science, 1994.

6

Efthimiou, Eleni, Georgios Kouroupetroglou, and Stavroula-Evita Fotinea, eds. Gesture and sign language in human-computer interaction and embodied communication: 9th International Gesture Workshop, GW 2011, Athens, Greece, May 25-27, 2011: revised selected papers. Springer, 2012.

7

Designing Gestural Interfaces. O'Reilly Media, 2008.

8

Saffer, Dan. Designing Gestural Interfaces: Touchscreens and Interactive Devices. O'Reilly Media, Incorporated, 2008.

9

Wigdor, Daniel, and Dennis Wixon. Brave NUI World: Designing Natural User Interfaces for Touch and Gesture. Elsevier Science & Technology Books, 2011.


Book chapters on the topic "Gestural interfaces"

1

Lange, Christian, Thomas Hermann, and Helge Ritter. "Holistic Body Tracking for Gestural Interfaces." In Gesture-Based Communication in Human-Computer Interaction. Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-24598-8_13.

2

Zaiti, Ionut-Alexandru, and Stefan-Gheorghe Pentiuc. "Gestural Interfaces for Mobile and Ubiquitous Applications." In Lecture Notes in Electrical Engineering. Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-05440-7_18.

3

Reily, Todd, and Martina Balestra. "Applying Gestural Interfaces to Command-and-Control." In Lecture Notes in Computer Science. Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-21708-1_22.

4

Stößel, Christian, Hartmut Wandke, and Lucienne Blessing. "Gestural Interfaces for Elderly Users: Help or Hindrance?" In Gesture in Embodied Communication and Human-Computer Interaction. Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-12553-9_24.

5

Zhang, Yougen, Hanchen Song, Wei Deng, and Lingda Wu. "Separating and Recognizing Gestural Strokes for Sketch-Based Interfaces." In Foundations and Applications of Intelligent Systems. Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-37829-4_16.

6

Vallat, Matthieu, Alessandro Silacci, Omar Abou Khaled, Elena Mugellini, Giuseppe Fedele, and Maurizio Caon. "Comparing the Ergonomics of Gestural Interfaces While Running on a Treadmill." In Advances in Intelligent Systems and Computing. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-51549-2_3.

7

Barros, Gil. "Extending ActionSketch for New Interaction Styles: Gestural Interfaces and Interactive Environments." In Design, User Experience, and Usability. User Experience Design for Diverse Interaction Platforms and Environments. Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-07626-3_48.

8

Agner, Luiz, Adriano Bernardo Renzi, Natanne Viegas, Priscila Buares, and Vitor Zanfagnini. "Evaluating Interaction Design in Brazilian Tablet Journalism: Gestural Interfaces and Affordance Communicability." In Design, User Experience, and Usability: Users and Interactions. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-20898-5_38.

9

Anuchitkittikul, Burin, Masashi Okamoto, Sadao Kurohashi, Toyoaki Nishida, and Yoichi Sato. "Video Content Manipulation by Means of Content Annotation and Nonsymbolic Gestural Interfaces." In Lecture Notes in Computer Science. Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-30132-5_56.

10

Monarca, Ivonne, Yingying Yuki Chen, Audrey Bichelmeir, Kayla Anderson, Monica Tentori, and Franceli L. Cibrian. "Designing a Game for Haptic Interfaces to Uncover Gestural Pattern in Children." In Proceedings of the International Conference on Ubiquitous Computing & Ambient Intelligence (UCAmI 2022). Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-21333-5_93.


Conference papers on the topic "Gestural interfaces"

1

Jego, Jean-Francois, Alexis Paljic, and Philippe Fuchs. "User-defined gestural interaction: A study on gesture memorization." In 2013 IEEE Symposium on 3D User Interfaces (3DUI). IEEE, 2013. http://dx.doi.org/10.1109/3dui.2013.6550189.

2

Margolis, Todd. "Gestural interfaces for immersive environments." In IS&T/SPIE Electronic Imaging, edited by Margaret Dolinsky and Ian E. McDowall. SPIE, 2014. http://dx.doi.org/10.1117/12.2042363.

3

Bellani, Pierstefano, Marina Carulli, and Giandomenico Caruso. "Gestural Interfaces to Support the Sketching Activities of Designers." In ASME 2021 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2021. http://dx.doi.org/10.1115/detc2021-71233.

Abstract:
The many loops that characterize the design process used to slow down the development of new projects. Since the 1970s, the design process has changed thanks to new technologies and tools related to Computer-Aided Design software and Virtual Reality applications, which make almost the whole process digital. However, the concept phase of the design process is still based on traditional approaches, while digital tools are poorly exploited. In this phase, designers need tools that allow them to rapidly save and freeze their ideas, such as sketching on paper, which is not integrated into the digital-based process. The paper presents a new gestural interface to give designers more support by introducing an effective device for 3D modelling that improves and speeds up the conceptual design process. We designed a set of gestures to allow people from different backgrounds to 3D-model their ideas in a natural way. A testing session with 17 participants allowed us to verify whether the proposed interaction was intuitive. At the end of the tests, all participants succeeded in 3D modelling a simple shape (a column) using only air gestures, in a relatively short amount of time and exactly as they expected it to be built, confirming the proposed interaction.
4

"Session details: Gestural input." In IUI06: 11th International Conference on Intelligent User Interfaces. ACM, 2006. http://dx.doi.org/10.1145/3259618.

5

LaViola, Joseph J. "An introduction to 3D gestural interfaces." In ACM SIGGRAPH 2014 Courses. ACM Press, 2014. http://dx.doi.org/10.1145/2614028.2615424.

6

Cani, Marie-Paule. "Session details: Haptic, gestural, hybrid interfaces." In SA09: SIGGRAPH ASIA 2009. ACM, 2009. http://dx.doi.org/10.1145/3251420.

7

Sorce, Salvatore, Alessio Malizia, Vito Gentile, and Antonio Gentile. "Touchless gestural interfaces for networked public displays." In the 2015 ACM International Joint Conference. ACM Press, 2015. http://dx.doi.org/10.1145/2800835.2807958.

8

Vatavu, Radu-Daniel. "Designing gestural interfaces for the interactive TV." In the 11th European conference. ACM Press, 2013. http://dx.doi.org/10.1145/2465958.2465981.

9

Cramariuc, Gabriel, and Stefan-Gheorghe Pentiuc. "Observing Motor Development of Preschool Children Using Depth Cameras." In eLSE 2015. Carol I National Defence University Publishing House, 2015. http://dx.doi.org/10.12753/2066-026x-15-076.

Abstract:
In this article we analyze the 3D interaction of children (using gestures made with the whole body) with depth cameras (i.e., the Microsoft Kinect). It should be noted that there are few research studies in the academic literature that investigate the interactions children make with the whole body through gesture-acquisition systems using the Kinect sensor, which translates into a lack of design recommendations for prototypes involving this type of interaction for children. Natural interfaces, such as those favoured by the Kinect sensor, tend to be intuitive and to require little knowledge from users. Gestures commonly preferred by users are likely to be easier to perform than gestures proposed by designers, which often require sustained practice from the user. There are few recommendations specific to children's gesture interaction with a depth camera such as the Microsoft Kinect. As the market for Kinect games for children continues to grow, there is still little information about the ways in which young users interact with these interfaces, and children's performance in executing gestural commands may differ from that of adults; children are a unique group of users that require special consideration. We conducted a study with children aged 3 to 6 years to identify how children perform gestures when working with interfaces that require whole-body interaction, when faced with various tasks common in preschool activities. Gestures and body movements were videotaped, arranged by category and coded for the purpose of comparison by age, sex and degree of experience with Kinect interfaces. In the same study, parents answered a questionnaire to provide data on the degree of motor development of their children. We correlated the data obtained and analyzed opportunities to use Kinect-type depth cameras for the motor development of preschool children.
10

Katsuragawa, Keiko, James R. Wallace, and Edward Lank. "Gestural Text Input Using a Smartwatch." In AVI '16: International Working Conference on Advanced Visual Interfaces. ACM, 2016. http://dx.doi.org/10.1145/2909132.2909273.


Reports on the topic "Gestural interfaces"

1

Perzanowski, Dennis, Alan C. Schultz, William Adams, and Elaine Marsh. Using a Natural Language and Gesture Interface for Unmanned Vehicles. Defense Technical Information Center, 2000. http://dx.doi.org/10.21236/ada435161.
