Academic literature on the topic 'Auditory and visual input'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Auditory and visual input.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Auditory and visual input"

1

Mao, Yu-Ting, Tian-Miao Hua, and Sarah L. Pallas. "Competition and convergence between auditory and cross-modal visual inputs to primary auditory cortical areas." Journal of Neurophysiology 105, no. 4 (2011): 1558–73. http://dx.doi.org/10.1152/jn.00407.2010.

Full text
Abstract:
Sensory neocortex is capable of considerable plasticity after sensory deprivation or damage to input pathways, especially early in development. Although plasticity can often be restorative, sometimes novel, ectopic inputs invade the affected cortical area. Invading inputs from other sensory modalities may compromise the original function or even take over, imposing a new function and preventing recovery. Using ferrets whose retinal axons were rerouted into auditory thalamus at birth, we were able to examine the effect of varying the degree of ectopic, cross-modal input on reorganization of dev
2

Dzulkifli, Mariam Adawiah, and Ain Zurzillah Abdul Halim. "Input Modality and its Effect on Memory Recall." Journal of Cognitive Sciences and Human Development 9, no. 2 (2023): 89–100. http://dx.doi.org/10.33736/jcshd.5699.2023.

Full text
Abstract:
One’s learning performance may be influenced by many internal and external factors. In addition to one’s cognitive ability, matters related to the academic context such as learning materials, contents and instruction can regulate and influence learning performance. The study aimed to examine the effects of different input modalities on learning performance by measuring memory recall success. A total of 96 participants took part in an experimental study employing a between-subject design. They were randomly assigned to one of the three groups that were presented with either visual or auditory o
3

Moradi, Vahid, Kiana Kheirkhah, Saeid Farahani, and Iman Kavianpour. "Investigating the Effects of Hearing Loss and Hearing Aid Digital Delay on Sound-Induced Flash Illusion." Journal of Audiology and Otology 24, no. 4 (2020): 174–79. http://dx.doi.org/10.7874/jao.2019.00507.

Full text
Abstract:
Background and Objectives: The integration of auditory-visual speech information improves speech perception; however, if the auditory system input is disrupted due to hearing loss, auditory and visual inputs cannot be fully integrated. Additionally, temporal coincidence of auditory and visual input is a significantly important factor in integrating the input of these two senses. The acoustic pathway is time-delayed when the signal passes through digital signal processing. Therefore, this study aimed to investigate the effects of hearing loss and hearing aid digital delay circuit on sound-induce
4

Ginsburg, Harvey J., Cathy Jenkins, Rachel Walsh, and Brad Peck. "Visual Superiority Effect in Televised Prevention of Victimization Programs for Preschool Children." Perceptual and Motor Skills 68, no. 3_suppl (1989): 1179–82. http://dx.doi.org/10.2466/pms.1989.68.3c.1179.

Full text
Abstract:
Preschool children have been reported to remember more visual than auditory content from television programs. 80 preschool children were randomly assigned to conditions where visual or auditory components of a televised program on personal safety were manipulated. Visually modeled actions were slightly more salient for preschool-age children than actions represented auditorily. The combination of visual and auditory input provided the superior educational method.
5

VanRullen, Rufin, Benedikt Zoefel, and Barkin Ilhan. "On the cyclic nature of perception in vision versus audition." Philosophical Transactions of the Royal Society B: Biological Sciences 369, no. 1641 (2014): 20130214. http://dx.doi.org/10.1098/rstb.2013.0214.

Full text
Abstract:
Does our perceptual awareness consist of a continuous stream, or a discrete sequence of perceptual cycles, possibly associated with the rhythmic structure of brain activity? This has been a long-standing question in neuroscience. We review recent psychophysical and electrophysiological studies indicating that part of our visual awareness proceeds in approximately 7–13 Hz cycles rather than continuously. On the other hand, experimental attempts at applying similar tools to demonstrate the discreteness of auditory awareness have been largely unsuccessful. We argue and demonstrate experimentally
6

Robinson, Christopher W., and Vladimir M. Sloutsky. "When Audition Dominates Vision." Experimental Psychology 60, no. 2 (2013): 113–21. http://dx.doi.org/10.1027/1618-3169/a000177.

Full text
Abstract:
Presenting information to multiple sensory modalities sometimes facilitates and sometimes interferes with processing of this information. Research examining interference effects shows that auditory input often interferes with processing of visual input in young children (i.e., auditory dominance effect), whereas visual input often interferes with auditory processing in adults (i.e., visual dominance effect). The current study used a cross-modal statistical learning task to examine modality dominance in adults. Participants ably learned auditory and visual statistics when auditory and visual se
7

Garner, Aleena R., and Georg B. Keller. "A cortical circuit for audio-visual predictions." Nature Neuroscience 25, no. 1 (2021): 98–105. http://dx.doi.org/10.1038/s41593-021-00974-7.

Full text
Abstract:
Learned associations between stimuli in different sensory modalities can shape the way we perceive these stimuli. However, it is not well understood how these interactions are mediated or at what level of the processing hierarchy they occur. Here we describe a neural mechanism by which an auditory input can shape visual representations of behaviorally relevant stimuli through direct interactions between auditory and visual cortices in mice. We show that the association of an auditory stimulus with a visual stimulus in a behaviorally relevant context leads to experience-dependent suppre
8

Takeshima, Yasuhiro, and Jiro Gyoba. "Changing Pitch of Sounds Alters Perceived Visual Motion Trajectory." Multisensory Research 26, no. 4 (2013): 317–32. http://dx.doi.org/10.1163/22134808-00002422.

Full text
Abstract:
Several studies have examined the effects of auditory stimuli on visual perception. In studies of cross-modal correspondences, auditory pitch has been shown to modulate visual motion perception. In particular, low-reliability visual motion stimuli tend to be affected by metaphorically or physically congruent or incongruent sounds. In the present study, we examined the modulatory effects of auditory pitch on visual perception of motion trajectory for visual inputs of varying reliability. Our results indicated that an auditory pitch implying the illusory motion toward the outside of the visual f
9

Robinson, Christopher W., and Vladimir M. Sloutsky. "Visual processing speed: effects of auditory input on visual processing." Developmental Science 10, no. 6 (2007): 734–40. http://dx.doi.org/10.1111/j.1467-7687.2007.00627.x.

Full text
10

McDaniel, Jena, and Stephen Camarata. "Does Access to Visual Input Inhibit Auditory Development for Children With Cochlear Implants? A Review of the Evidence." Perspectives of the ASHA Special Interest Groups 2, no. 9 (2017): 10–24. http://dx.doi.org/10.1044/persp2.sig9.10.

Full text
Abstract:
Purpose We review the evidence for attenuating visual input during intervention to enhance auditory development and ultimately improve spoken language outcomes in children with cochlear implants. Background Isolating the auditory sense is a long-standing tradition in many approaches for teaching children with hearing loss. However, the evidence base for this practice is surprisingly limited and not straightforward. We review four bodies of evidence that inform whether or not visual input inhibits auditory development in children with cochlear implants: (a) audiovisual benefits for speech perce

Dissertations / Theses on the topic "Auditory and visual input"

1

Roe, Anna Wang. "Functional transformations of visual input by auditory thalamus and cortex : an experimentally induced visual pathway in ferrets." Thesis, Massachusetts Institute of Technology, 1991. http://hdl.handle.net/1721.1/13942.

Full text
2

Mao, Yuting. "The Reorganization of Primary Auditory Cortex by Invasion of Ectopic Visual Inputs." Digital Archive @ GSU, 2012. http://digitalarchive.gsu.edu/biology_diss/112.

Full text
Abstract:
Brain injury is a serious clinical problem. The success of recovery from brain injury involves functional compensation in the affected brain area. We are interested in general mechanisms that underlie compensatory plasticity after brain damage, particularly when multiple brain areas or multiple modalities are included. In this thesis, I studied the function of auditory cortex after recovery from neonatal midbrain damage as a model system that resembles patients with brain damage or sensory dysfunction. I addressed maladaptive changes of auditory cortex after invasion by ectopic visual inputs.
3

Castle, Heidi. "The workload implications of haptic displays in multi-display environments such as the cockpit : dual-task interference of within-sense haptic inputs (tactile/proprioceptive) and between-sense inputs (tactile/proprioceptive/auditory/visual)." Thesis, Cranfield University, 2007. http://dspace.lib.cranfield.ac.uk/handle/1826/3859.

Full text
Abstract:
Visual workload demand within the cockpit is reaching saturation, whereas the haptic sense (proprioceptive and tactile sensation) is relatively untapped, despite studies suggesting the benefits of haptic displays. MRT suggests that inputs from haptic displays will not interfere with inputs from visual or auditory displays. MRT is based on the premise that multisensory integration occurs only after unisensory processing. However, recent neuroscientific findings suggest that the distinction between unisensory versus multisensory processing is much more blurred than previously thought. This progr
4

Castle, H. "The workload implications of haptic displays in multi-display environments such as the cockpit: Dual-task interference of within-sense haptic inputs (tactile/proprioceptive) and between-sense inputs (tactile/proprioceptive/auditory/visual)." Thesis, Cranfield University, 2007. http://hdl.handle.net/1826/3859.

Full text
Abstract:
Visual workload demand within the cockpit is reaching saturation, whereas the haptic sense (proprioceptive and tactile sensation) is relatively untapped, despite studies suggesting the benefits of haptic displays. MRT suggests that inputs from haptic displays will not interfere with inputs from visual or auditory displays. MRT is based on the premise that multisensory integration occurs only after unisensory processing. However, recent neuroscientific findings suggest that the distinction between unisensory versus multisensory processing is much more blurred than previously thought. This progr
5

Wilkie, Sonia. "Auditory manipulation of visual perception." Thesis, University of Western Sydney, 2008. http://handle.uws.edu.au:8081/1959.7/39802.

Full text
Abstract:
Psychological research on cross-modal auditory-visual perception has focused predominantly on the manipulation of sensory information by visual information. There are relatively few studies of the way auditory stimuli may affect other sensory information. The Sound-induced Illusory Flash is one illusory paradigm that involves the auditory system biasing visual information. However, little is known about this cross-modal illusion. More research is needed into the structure of the illusion that investigates the different conditions under which the Sound induced Illusory Flash manifests and is en
6

Wilkie, Sonia. "Auditory manipulation of visual perception." Thesis, University of Western Sydney, 2008. http://handle.uws.edu.au:8081/1959.7/39802.

Full text
Abstract:
Thesis (M.A. (Hons.))--University of Western Sydney, 2008. Thesis accompanied by CD-ROM with demonstration of possible creative applications. A thesis presented to the University of Western Sydney, College of Arts, MARCS Auditory Laboratories, in fulfilment of the requirements for the degree of Master of Arts (Honours). Includes bibliographies. Thesis minus demonstration CD-ROM also available online at: http://handle.uws.edu.au:8081/1959.7/39849.
7

Zhao, Hang. "Visual and auditory scene parsing." Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/122101.

Full text
Abstract:
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Thesis: Ph.D. in Mechanical Engineering and Computation, Massachusetts Institute of Technology, Department of Mechanical Engineering, 2019. Cataloged from student-submitted PDF version of thesis. Includes bibliographical references (pages 121-132). Scene parsing is a fundamental topic in computer vision and computational audition, where people develop computational approaches to achieve human perceptual system's ability in understanding s
8

Lee, Chung-sze Eunice. "Auditory, visual and auditory-visual contributions to the Cantonese-speaking hearing-impaired adolescents' recognition of consonants." Thesis, University of Hong Kong, 1999. http://sunzi.lib.hku.hk/hkuto/record/B3621002X.

Full text
Abstract:
Thesis (B.Sc.)--University of Hong Kong, 1999. "A dissertation submitted in partial fulfilment of the requirements for the Bachelor of Science (Speech and Hearing Sciences), The University of Hong Kong, May 14, 1999." Also available in print.
9

Storms, Russell L. "Auditory-visual cross-modal perception phenomena." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1998. http://handle.dtic.mil/100.2/ADA355474.

Full text
Abstract:
Dissertation (Ph.D. in Computer Science)--Naval Postgraduate School, September 1998. Dissertation supervisor(s): Michael J. Zyda. "September 1998." Includes bibliographical references (p. 207-222). Also available online.
10

Saliba, Anthony John. "Auditory-visual integration in sound localisation." Thesis, University of Essex, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.249979.

Full text

Books on the topic "Auditory and visual input"

1

Ando, Yoichi. Auditory and Visual Sensations. Springer New York, 2010. http://dx.doi.org/10.1007/b13253.

Full text
2

SpringerLink (Online service), ed. Auditory and Visual Sensations. Springer-Verlag New York, 2009.

Find full text
3

Kramer, Gregory, Santa Fe Institute (Santa Fe, N.M.), and International Conference on Auditory Display (1st: 1992: Santa Fe, N.M.), eds. Auditory display: Sonification, audification, and auditory interfaces. Addison-Wesley, 1994.

Find full text
4

Press, Leonard J. Parallels between auditory & visual processing. Optometric Extension Program Foundation, 2012.

Find full text
5

Storms, Russell L. Auditory-visual cross-modal perception phenomena. Naval Postgraduate School, 1998.

Find full text
6

Barsch, Ray H. Fine tuning: An auditory-visual training program. Academic Therapy Publications, 1995.

Find full text
7

Evamy, Barbara. Auditory & visual discrimination exercises: A teacher's aid. B. Evamy, 2003.

Find full text
8

Zheng, Nanning, ed. Cognitive Computing of Visual and Auditory Information. Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-3228-3.

Full text
9

Slater, Alan, ed. Perceptual development: Visual, auditory, and speech perception in infancy. Psychology Press, 1999.

Find full text
10

Slater, Alan, ed. Perceptual development: Visual, auditory, and speech perception in infancy. Psychology Press, 1998.

Find full text

Book chapters on the topic "Auditory and visual input"

1

Nilsson, Lars-Göran, Kjell Ohlsson, and Jerker Rönnberg. "Capacity Differences in Processing and Storage of Auditory and Visual Input." In Attention and Performance VI. Routledge, 2022. http://dx.doi.org/10.4324/9781003309734-34.

Full text
2

Kobayashi, Akemi, Ryosuke Aoki, Norimichi Kitagawa, Toshitaka Kimura, Youichi Takashima, and Tomohiro Yamada. "Towards Enhancing Force-Input Interaction by Visual-Auditory Feedback as an Introduction of First Use." In Lecture Notes in Computer Science. Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-39516-6_17.

Full text
3

Sur, Mriganka. "Visual Plasticity in the Auditory Pathway: Visual Inputs Induced into Auditory Thalamus and Cortex Illustrate Principles of Adaptive Organization in Sensory Systems." In Dynamic Interactions in Neural Networks: Models and Data. Springer New York, 1989. http://dx.doi.org/10.1007/978-1-4612-4536-0_3.

Full text
4

Honda, Tatsuya, Tetsuaki Baba, and Makoto Okamoto. "Ontenna: Design and Social Implementation of Auditory Information Transmission Devices Using Tactile and Visual Senses." In Lecture Notes in Computer Science. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-08645-8_16.

Full text
Abstract:
Abstract Ontenna is a device that can be worn on the hair, earlobe, collar, or sleeve, and it transmits sound characteristics to the human body using vibrations and light. It can serve as an auxiliary acoustic sensory device for the Deaf and Hard of Hearing (DHH), whereas for others, it can serve as a novel acoustic perception device. A condenser microphone mounted on the main body of Ontenna acquires sound pressure data and drives the vibration motor and light-emitting diode in real-time according to the input signals. This allows the user to perceive various sonic features such as the rhythm
5

Irvine, Dexter R. F. "Auditory Nerve Input to the Central Processor." In The Auditory Brainstem. Springer Berlin Heidelberg, 1986. http://dx.doi.org/10.1007/978-3-642-71057-5_3.

Full text
6

Vatikiotis-Bateson, Eric, and Kevin G. Munhall. "Auditory-Visual Speech Processing." In The Handbook of Speech Production. John Wiley & Sons, Inc, 2015. http://dx.doi.org/10.1002/9781118584156.ch9.

Full text
7

Ando, Yoichi. "Introduction to Visual Sensations." In Auditory and Visual Sensations. Springer New York, 2009. http://dx.doi.org/10.1007/b13253_12.

Full text
8

Diamond, Irving T., David Fitzpatrick, and James M. Sprague. "The Extrastriate Visual Cortex." In Association and Auditory Cortices. Springer US, 1985. http://dx.doi.org/10.1007/978-1-4757-9619-3_2.

Full text
9

Ando, Yoichi. "Introduction." In Auditory and Visual Sensations. Springer New York, 2009. http://dx.doi.org/10.1007/b13253_1.

Full text
10

Ando, Yoichi. "Applications (III) – Noise Measurement." In Auditory and Visual Sensations. Springer New York, 2009. http://dx.doi.org/10.1007/b13253_10.

Full text

Conference papers on the topic "Auditory and visual input"

1

Rosales López, Pedro Pablo, Javier Hugo Moran Ruiz, Vincent Stefano Ferida Del Aguila, Steissy Kimberly Meza Ramos, Jiréh Alonso Bautista Inga, and José Carlos Segundo Saavedra Mejía. "Visual and External Auditory in Information Retention." In 2024 International Symposium on Accreditation of Engineering and Computing Education (ICACIT). IEEE, 2024. https://doi.org/10.1109/icacit62963.2024.10788622.

Full text
2

Masai, Katsutoshi, and Hideo Saito. "Augmented Auditory Feedback Towards Ball Sports Visual Skill Training." In 2025 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). IEEE, 2025. https://doi.org/10.1109/vrw66409.2025.00184.

Full text
3

Eng, Katelyn, Beverly Hannah, Keith Leung, and Yue Wang. "Effects of auditory, visual and gestural input on the perceptual learning of tones." In 7th International Conference on Speech Prosody 2014. ISCA, 2014. http://dx.doi.org/10.21437/speechprosody.2014-169.

Full text
4

Zhou, Zitong, and Wei Gong. "Multi-Sensory Integration and Emotional Responses: The Impact of Materials on Perception and Emotion." In 16th International Conference on Applied Human Factors and Ergonomics (AHFE 2025). AHFE International, 2025. https://doi.org/10.54941/ahfe1006024.

Full text
Abstract:
With the rapid advancement of virtual reality (VR) technology, multi-sensory integration has become a significant area of focus in the emotional design of virtual environments. Materials play an important role in shaping both perception and emotion, while tactile stimuli are particularly influential in cognitive and emotional responses. Additionally, research indicates that there is an integration between vision, hearing, and touch. Studies have demonstrated that combining various sensory inputs, particularly tactile, visual, and auditory stimuli, can enhance cross-sensory integration. While r
5

Medini, Chaitanya, Arathi G. Rajendran, Aiswarya Jijibai, Bipin Nair, and Shyam Diwakar. "Computational characterization of cerebellum granule neuron responses to auditory and visual inputs." In 2016 International Conference on Advances in Computing, Communications and Informatics (ICACCI). IEEE, 2016. http://dx.doi.org/10.1109/icacci.2016.7732020.

Full text
6

Gehlen, Joshua, Alina Schmitz-hübsch, Sebastian Handke, and Wolfgang Koch. "AI-Powered Auditory Control and Augmented Reality Interfaces for UAVs - A Contactless Control and Situation Awareness Concept." In 13th International Conference on Human Interaction & Emerging Technologies: Artificial Intelligence & Future Applications. AHFE International, 2025. https://doi.org/10.54941/ahfe1005917.

Full text
Abstract:
Unmanned Aerial Vehicles (UAVs) are increasingly utilized in military and civilian tasks like search and rescue, however, traditional operation methods can be risky in hazardous situations. This article presents a novel UAV control concept leveraging artificial intelligence (AI) and Augmented Reality (AR) technology, allowing operators to manage drones without handheld devices through audio-based input and output. The suggested system employs headsets and AR glasses to provide real-time visual feedback, enhancing situational awareness and decision-making by displaying critical data such as UAV
7

Corpataux, Sam, Marine Capallera, Omar Abou Khaled, and Elena Mugellini. "Enhancing User Immersion in Virtual Reality by Integrating Collective Emotions through Audio-Visual Analysis." In 15th International Conference on Applied Human Factors and Ergonomics (AHFE 2024). AHFE International, 2024. http://dx.doi.org/10.54941/ahfe1004687.

Full text
Abstract:
In the rapidly evolving field of virtual reality (VR), deep user immersion remains a major challenge for researchers and developers alike. Effectively integrating emotional cues into VR environments to enhance the user experience could be a key issue. This study introduces a positive advancement by presenting an innovative solution that combines audio and video analysis to detect and integrate collective emotion while watching 360° events. Our approach sets itself apart from previous work, which focused either on the visual or auditory aspect, by embracing a holistic perspective that more accu
8

Iltanen, Mika, Asko Ellman, and Joonas Laitinen. "Wearable Haptic Device for an IPT System Based on Pneumatic Muscles." In ASME 2007 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2007. http://dx.doi.org/10.1115/detc2007-34750.

Full text
Abstract:
The human haptic system has an important role to play in human interaction with Virtual Environments (VEs). Unlike the visual and auditory systems, the haptic sense is capable of both sensing and acting on the environment and is an indispensable part of many human activities. In order to provide the realism needed for effective and compelling applications, VEs need to provide inputs to, and mirror the outputs of, the haptic system. These characteristics are the most important issues in the design of confined spaces and mechanical constructions using 6 DOF input devices in Immersive Projection
9

Farooq, Ahmed, Jari Kangas, and Roope Raisamo. "TAUCHI-GPT: Leveraging GPT-4 to create a Multimodal Open-Source Research AI tool." In AHFE 2023 Hawaii Edition. AHFE International, 2023. http://dx.doi.org/10.54941/ahfe1004176.

Full text
Abstract:
In the last few years, advances in deep learning and artificial intelligence have made it possible to generate high-quality text, audio, and visual content automatically for a wide range of application areas, including research and education. However, designing and customizing an effective R&D tool capable of providing necessary tool-specific output and breaking down complex research tasks requires a great deal of expertise and effort, and is often a time-consuming and expensive process. Using existing Generative Pre-trained Transformers (GPT) and foundational models, it is now possible to l
10

G, Ramya. "Enhanced Multimodal Emotion Recognition using Deep Representation Learning and Cross-Modal Feature Fusion." In International Conference on Modern Trends in Engineering and Management (ICMTEM-25). International Journal of Advanced Trends in Engineering and Management, 2025. https://doi.org/10.59544/wntq3923/icmtem25p37.

Full text
Abstract:
Emotion recognition is a crucial aspect of humancomputer interaction, enabling machines to understand and respond to human emotions more effectively. In this paper, we explore state of the art models for multimodal emotion recognition, leveraging textual, auditory, and visual inputs. By integrating these diverse data sources, we develop an ensemble model designed to enhance the accuracy and robustness of emotion detection. Our approach aims to capture intricate emotional cues from speech patterns, facial expressions, and textual content, ensuring a more comprehensive understanding of human emo

Reports on the topic "Auditory and visual input"

1

Visram, Anisa, Iain Jackson, Ibrahim Almufarrij, Michael Stone, and Kevin Munro. Comparing visual reinforcement audiometry outcomes using different auditory stimuli and visual rewards. INPLASY - International Platform of Registered Systematic Review and Meta-analysis Protocols, 2021. http://dx.doi.org/10.37766/inplasy2021.1.0080.

Full text
2

Richardson, James. Auditory and Visual Sensory Stores: a Recognition Task. Portland State University Library, 2000. http://dx.doi.org/10.15760/etd.1557.

Full text
3

Driesen, Jacob. Differential Effects of Visual and Auditory Presentation on Logical Reasoning. Portland State University Library, 2000. http://dx.doi.org/10.15760/etd.2546.

Full text
4

Metaxas, D. Human Identification and Recognition of Emotional State from Visual Input. Defense Technical Information Center, 2005. http://dx.doi.org/10.21236/ada448621.

Full text
5

Brady-Herbst, Brenene. An Analysis of Spondee Recognition Thresholds in Auditory-only and Audio-visual Conditions. Portland State University Library, 2000. http://dx.doi.org/10.15760/etd.7094.

Full text
6

Harsh, John R. Auditory and Visual Evoked Potentials as a Function of Sleep Deprivation and Irregular Sleep. Defense Technical Information Center, 1989. http://dx.doi.org/10.21236/ada228488.

Full text
7

Yu, Wanchi. Implicit Learning of Children with and without Developmental Language Disorder across Auditory and Visual Categories. Portland State University Library, 2000. http://dx.doi.org/10.15760/etd.7460.

Full text
8

Jokeit, H., R. Goertzl, E. Kuchleri, and S. Makeig. Event-Related Changes in the 40 Hz Electroencephalogram in Auditory and Visual Reaction Time Tasks. Defense Technical Information Center, 1994. http://dx.doi.org/10.21236/ada379543.

Full text
9

Davis, Bradley M. Effects of Visual, Auditory, and Tactile Navigation Cues on Navigation Performance, Situation Awareness, and Mental Workload. Defense Technical Information Center, 2007. http://dx.doi.org/10.21236/ada463244.

Full text
10

Beiker, Sven, ed. Unsettled Issues Regarding Visual Communication Between Automated Vehicles and Other Road Users. SAE International, 2021. http://dx.doi.org/10.4271/epr2021016.

Full text
Abstract:
As automated road vehicles begin their deployment into public traffic, they will need to interact with human-driven vehicles, pedestrians, bicyclists, etc. This requires some form of communication between those automated vehicles (AVs) and other road users. Some of these communication modes (e.g., auditory, motion) were discussed in "Unsettled Issues Regarding Communication of Automated Vehicles with Other Road Users." Unsettled Issues Regarding Visual Communication Between Automated Vehicles and Other Road Users focuses on visual communication and its balance of reach, clarity, and intuit