Journal articles on the topic 'Auditory and visual input'

Consult the top 50 journal articles for your research on the topic 'Auditory and visual input.'


1. Mao, Yu-Ting, Tian-Miao Hua, and Sarah L. Pallas. "Competition and convergence between auditory and cross-modal visual inputs to primary auditory cortical areas." Journal of Neurophysiology 105, no. 4 (2011): 1558–73. http://dx.doi.org/10.1152/jn.00407.2010.

Abstract: Sensory neocortex is capable of considerable plasticity after sensory deprivation or damage to input pathways, especially early in development. Although plasticity can often be restorative, sometimes novel, ectopic inputs invade the affected cortical area. Invading inputs from other sensory modalities may compromise the original function or even take over, imposing a new function and preventing recovery. Using ferrets whose retinal axons were rerouted into auditory thalamus at birth, we were able to examine the effect of varying the degree of ectopic, cross-modal input on reorganization of dev…

2. Dzulkifli, Mariam Adawiah, and Ain Zurzillah Abdul Halim. "Input Modality and its Effect on Memory Recall." Journal of Cognitive Sciences and Human Development 9, no. 2 (2023): 89–100. http://dx.doi.org/10.33736/jcshd.5699.2023.

Abstract: One's learning performance may be influenced by many internal and external factors. In addition to one's cognitive ability, matters related to the academic context such as learning materials, contents and instruction can regulate and influence learning performance. The study aimed to examine the effects of different input modalities on learning performance by measuring memory recall success. A total of 96 participants took part in an experimental study employing a between-subject design. They were randomly assigned to one of the three groups that were presented with either visual or auditory o…

3. Moradi, Vahid, Kiana Kheirkhah, Saeid Farahani, and Iman Kavianpour. "Investigating the Effects of Hearing Loss and Hearing Aid Digital Delay on Sound-Induced Flash Illusion." Journal of Audiology and Otology 24, no. 4 (2020): 174–79. http://dx.doi.org/10.7874/jao.2019.00507.

Abstract: Background and Objectives: The integration of auditory-visual speech information improves speech perception; however, if the auditory system input is disrupted due to hearing loss, auditory and visual inputs cannot be fully integrated. Additionally, the temporal coincidence of auditory and visual input is a particularly important factor in integrating the input of these two senses. The acoustic pathway is time-delayed because the signal passes through digital signal processing. Therefore, this study aimed to investigate the effects of hearing loss and a hearing aid digital delay circuit on sound-induce…

4. Ginsburg, Harvey J., Cathy Jenkins, Rachel Walsh, and Brad Peck. "Visual Superiority Effect in Televised Prevention of Victimization Programs for Preschool Children." Perceptual and Motor Skills 68, no. 3_suppl (1989): 1179–82. http://dx.doi.org/10.2466/pms.1989.68.3c.1179.

Abstract: Preschool children have been reported to remember more visual than auditory content from television programs. 80 preschool children were randomly assigned to conditions where visual or auditory components of a televised program on personal safety were manipulated. Visually modeled actions were slightly more salient for preschool-age children than actions represented auditorily. The combination of visual and auditory input provided the superior educational method.

5. VanRullen, Rufin, Benedikt Zoefel, and Barkin Ilhan. "On the cyclic nature of perception in vision versus audition." Philosophical Transactions of the Royal Society B: Biological Sciences 369, no. 1641 (2014): 20130214. http://dx.doi.org/10.1098/rstb.2013.0214.

Abstract: Does our perceptual awareness consist of a continuous stream, or a discrete sequence of perceptual cycles, possibly associated with the rhythmic structure of brain activity? This has been a long-standing question in neuroscience. We review recent psychophysical and electrophysiological studies indicating that part of our visual awareness proceeds in approximately 7–13 Hz cycles rather than continuously. On the other hand, experimental attempts at applying similar tools to demonstrate the discreteness of auditory awareness have been largely unsuccessful. We argue and demonstrate experimentally…

6. Robinson, Christopher W., and Vladimir M. Sloutsky. "When Audition Dominates Vision." Experimental Psychology 60, no. 2 (2013): 113–21. http://dx.doi.org/10.1027/1618-3169/a000177.

Abstract: Presenting information to multiple sensory modalities sometimes facilitates and sometimes interferes with processing of this information. Research examining interference effects shows that auditory input often interferes with processing of visual input in young children (i.e., auditory dominance effect), whereas visual input often interferes with auditory processing in adults (i.e., visual dominance effect). The current study used a cross-modal statistical learning task to examine modality dominance in adults. Participants ably learned auditory and visual statistics when auditory and visual se…

7. Garner, Aleena R., and Georg B. Keller. "A cortical circuit for audio-visual predictions." Nature Neuroscience 25, no. 1 (2021): 98–105. http://dx.doi.org/10.1038/s41593-021-00974-7.

Abstract: Learned associations between stimuli in different sensory modalities can shape the way we perceive these stimuli. However, it is not well understood how these interactions are mediated or at what level of the processing hierarchy they occur. Here we describe a neural mechanism by which an auditory input can shape visual representations of behaviorally relevant stimuli through direct interactions between auditory and visual cortices in mice. We show that the association of an auditory stimulus with a visual stimulus in a behaviorally relevant context leads to experience-dependent suppre…

8. Takeshima, Yasuhiro, and Jiro Gyoba. "Changing Pitch of Sounds Alters Perceived Visual Motion Trajectory." Multisensory Research 26, no. 4 (2013): 317–32. http://dx.doi.org/10.1163/22134808-00002422.

Abstract: Several studies have examined the effects of auditory stimuli on visual perception. In studies of cross-modal correspondences, auditory pitch has been shown to modulate visual motion perception. In particular, low-reliability visual motion stimuli tend to be affected by metaphorically or physically congruent or incongruent sounds. In the present study, we examined the modulatory effects of auditory pitch on visual perception of motion trajectory for visual inputs of varying reliability. Our results indicated that an auditory pitch implying the illusory motion toward the outside of the visual f…

9. Robinson, Christopher W., and Vladimir M. Sloutsky. "Visual processing speed: effects of auditory input on visual processing." Developmental Science 10, no. 6 (2007): 734–40. http://dx.doi.org/10.1111/j.1467-7687.2007.00627.x.

10. McDaniel, Jena, and Stephen Camarata. "Does Access to Visual Input Inhibit Auditory Development for Children With Cochlear Implants? A Review of the Evidence." Perspectives of the ASHA Special Interest Groups 2, no. 9 (2017): 10–24. http://dx.doi.org/10.1044/persp2.sig9.10.

Abstract: Purpose: We review the evidence for attenuating visual input during intervention to enhance auditory development and ultimately improve spoken language outcomes in children with cochlear implants. Background: Isolating the auditory sense is a long-standing tradition in many approaches for teaching children with hearing loss. However, the evidence base for this practice is surprisingly limited and not straightforward. We review four bodies of evidence that inform whether or not visual input inhibits auditory development in children with cochlear implants: (a) audiovisual benefits for speech perce…

11. Ghafouri, Ali, and Mohsen Masoomi. "The Effect of Visual and Auditory Input Enhancement on Vocabulary Acquisition of Iranian EFL University Students." International Journal of English Linguistics 6, no. 7 (2016): 81. http://dx.doi.org/10.5539/ijel.v6n7p81.

Abstract: This study was an attempt to investigate the effect of input enhancement instruction on vocabulary acquisition among Iranian university students. Moreover, the possible effect of two kinds of input enhancement (i.e., auditory and visual) was examined. To this end, 75 Iranian university students, majoring in English Language Teaching at Applied Science and Technology of Kurdistan University, Iran, were randomly selected. The method used in this study was quantitative research with a true experimental design. One experimental group received vocabulary instruction via visual input enhancem…

12. Grant, Ken W., and Louis D. Braida. "Evaluating the articulation index for auditory–visual input." Journal of the Acoustical Society of America 89, no. 6 (1991): 2952–60. http://dx.doi.org/10.1121/1.400733.

13. Grant, Ken W. "Evaluating the Articulation Index for auditory‐visual input." Journal of the Acoustical Society of America 82, S1 (1987): S4. http://dx.doi.org/10.1121/1.2024845.

14. Gutfreund, Y. "Gated Visual Input to the Central Auditory System." Science 297, no. 5586 (2002): 1556–59. http://dx.doi.org/10.1126/science.1073712.

15. Hausfeld, Lars, Alexander Gutschalk, Elia Formisano, and Lars Riecke. "Effects of Cross-modal Asynchrony on Informational Masking in Human Cortex." Journal of Cognitive Neuroscience 29, no. 6 (2017): 980–90. http://dx.doi.org/10.1162/jocn_a_01097.

Abstract: In many everyday listening situations, an otherwise audible sound may go unnoticed amid multiple other sounds. This auditory phenomenon, called informational masking (IM), is sensitive to visual input and involves early (50–250 msec) activity in the auditory cortex (the so-called awareness-related negativity). It is still unclear whether and how the timing of visual input influences the neural correlates of IM in auditory cortex. To address this question, we obtained simultaneous behavioral and neural measures of IM from human listeners in the presence of a visual input stream and varied the a…

16. Willette, Thomas L., and George H. Early. "Abilities of Normal and Reading-Disabled Children to Combine the Visual and Auditory Modalities with Dimensions of Space and Time." Perceptual and Motor Skills 61, no. 3_suppl (1985): 1295–98. http://dx.doi.org/10.2466/pms.1985.61.3f.1295.

Abstract: The effects of various combinations of modalities with the dimensions of space and time on reproduction of stimulus patterns by 162 normal and 83 reading-disabled children aged 6 to 12 yr. were studied. Scores on three subtests with three different combinations of modalities with space and time dimensions were analyzed. They were (a) visual temporal input, vocal temporal output; (b) auditory temporal input, vocal temporal output; and (c) visual spatial-temporal input, vocal temporal output. Visual spatial-temporal input was superior to the other subtests. Auditory temporal input was superior to…

17. Hardison, Debra M. "Visual and auditory input in second-language speech processing." Language Teaching 43, no. 1 (2009): 84–95. http://dx.doi.org/10.1017/s0261444809990176.

Abstract: The majority of studies in second-language (L2) speech processing have involved unimodal (i.e., auditory) input; however, in many instances, speech communication involves both visual and auditory sources of information. Some researchers have argued that multimodal speech is the primary mode of speech perception (e.g., Rosenblum 2005). Research on auditory-visual (AV) input has been conducted more extensively in the fields of infant speech development (e.g., Meltzoff & Kuhl 1994), adult monolingual processing (e.g., McGurk & MacDonald 1976; see reference in this timeline), and the treat…

18. Bola, Łukasz, Maria Zimmermann, Piotr Mostowski, et al. "Task-specific reorganization of the auditory cortex in deaf humans." Proceedings of the National Academy of Sciences 114, no. 4 (2017): E600–E609. http://dx.doi.org/10.1073/pnas.1609000114.

Abstract: The principles that guide large-scale cortical reorganization remain unclear. In the blind, several visual regions preserve their task specificity; ventral visual areas, for example, become engaged in auditory and tactile object-recognition tasks. It remains open whether task-specific reorganization is unique to the visual cortex or, alternatively, whether this kind of plasticity is a general principle applying to other cortical areas. Auditory areas can become recruited for visual and tactile input in the deaf. Although nonhuman data suggest that this reorganization might be task specific, hu…

19. Petro, L. S., A. T. Paton, and L. Muckli. "Contextual modulation of primary visual cortex by auditory signals." Philosophical Transactions of the Royal Society B: Biological Sciences 372, no. 1714 (2017): 20160104. http://dx.doi.org/10.1098/rstb.2016.0104.

Abstract: Early visual cortex receives non-feedforward input from lateral and top-down connections (Muckli & Petro 2013 Curr. Opin. Neurobiol. 23, 195–201; doi:10.1016/j.conb.2013.01.020), including long-range projections from auditory areas. Early visual cortex can code for high-level auditory information, with neural patterns representing natural sound stimulation (Vetter et al. 2014 Curr. Biol. 24, 1256–1262; doi:10.1016/j.cub.2014.04.020). We discuss a number of questions arising from these findings. What is the adaptive function of bimodal representations in visual cortex? What type o…

20

JERGER, SUSAN, MARKUS F. DAMIAN, NANCY TYE-MURRAY, and HERVÉ ABDI. "Children perceive speech onsets by ear and eye." Journal of Child Language 44, no. 1 (2016): 185–215. http://dx.doi.org/10.1017/s030500091500077x.

Full text
Abstract:
AbstractAdults use vision to perceive low-fidelity speech; yet how children acquire this ability is not well understood. The literature indicates that children show reduced sensitivity to visual speech from kindergarten to adolescence. We hypothesized that this pattern reflects the effects of complex tasks and a growth period with harder-to-utilize cognitive resources, not lack of sensitivity. We investigated sensitivity to visual speech in children via the phonological priming produced by low-fidelity (non-intact onset) auditory speech presented audiovisually (see dynamic face articulate cons
APA, Harvard, Vancouver, ISO, and other styles
21. Rammsayer, Thomas, and Stefan Pichelmann. "Visual-auditory differences in duration discrimination depend on modality-specific, sensory-automatic temporal processing: Converging evidence for the validity of the Sensory-Automatic Timing Hypothesis." Quarterly Journal of Experimental Psychology 71, no. 11 (2018): 2364–77. http://dx.doi.org/10.1177/1747021817741611.

Abstract: The Sensory-Automatic Timing Hypothesis assumes visual-auditory differences in duration discrimination to originate from sensory-automatic temporal processing. Although temporal discrimination of extremely brief intervals in the range of tens-of-milliseconds is predicted to depend mainly on modality-specific, sensory-automatic temporal processing, duration discrimination of longer intervals is predicted to require more and more amodal, higher order cognitive resources and decreasing input from the sensory-automatic timing system with increasing interval duration. In two duration discrimination…

22. Jerger, Susan, Markus F. Damian, Cassandra Karl, and Hervé Abdi. "Developmental Shifts in Detection and Attention for Auditory, Visual, and Audiovisual Speech." Journal of Speech, Language, and Hearing Research 61, no. 12 (2018): 3095–112. http://dx.doi.org/10.1044/2018_jslhr-h-17-0343.

Abstract: Purpose: Successful speech processing depends on our ability to detect and integrate multisensory cues, yet there is minimal research on multisensory speech detection and integration by children. To address this need, we studied the development of speech detection for auditory (A), visual (V), and audiovisual (AV) input. Method: Participants were 115 typically developing children clustered into age groups between 4 and 14 years. Speech detection (quantified by response times [RTs]) was determined for 1 stimulus, /buh/, presented in A, V, and AV modes (articulating vs. static facial conditions).

23. Wong, Wai Leung, and Urs Maurer. "The effects of input and output modalities on language switching between Chinese and English." Bilingualism: Language and Cognition 24, no. 4 (2021): 719–29. http://dx.doi.org/10.1017/s136672892100002x.

Abstract: Language control is important for bilinguals to produce words in the right language. While most previous studies investigated language control using visual stimuli with vocal responses, language control regarding auditory stimuli and manual responses was rarely examined. In the present study, an alternating language switching paradigm was used to investigate language control mechanism under two input modalities (visual and auditory) and two output modalities (manual and vocal) by measuring switch costs in both error percentage and reaction time (RT) in forty-eight Cantonese–English ear…

24. Plyler, Patrick N., Rowan Lang, Amy L. Monroe, and Paul Gaudiano. "The Effects of Audiovisual Stimulation on the Acceptance of Background Noise." Journal of the American Academy of Audiology 26, no. 05 (2015): 451–60. http://dx.doi.org/10.3766/jaaa.14084.

Abstract: Background: Previous examinations of noise acceptance have been conducted using an auditory stimulus only; however, the effect of visual speech supplementation of the auditory stimulus on acceptance of noise remains limited. Purpose: The purpose of the present study was to determine the effect of audiovisual stimulation on the acceptance of noise in listeners with normal and impaired hearing. Research Design: A repeated measures design was utilized. Study Sample: A total of 92 adult participants were recruited for this experiment. Of these participants, 54 were listeners with normal hearing an…

25. Kubicek, Claudia, Anne Hillairet de Boisferon, Eve Dupierrix, Hélène Lœvenbruck, Judit Gervain, and Gudrun Schwarzer. "Face-scanning behavior to silently-talking faces in 12-month-old infants: The impact of pre-exposed auditory speech." International Journal of Behavioral Development 37, no. 2 (2013): 106–10. http://dx.doi.org/10.1177/0165025412473016.

Abstract: The present eye-tracking study aimed to investigate the impact of auditory speech information on 12-month-olds' gaze behavior to silently-talking faces. We examined German infants' face-scanning behavior to side-by-side presentation of a bilingual speaker's face silently speaking German utterances on one side and French on the other side, before and after auditory familiarization with one of the two languages. The results showed that 12-month-old infants showed no general visual preference for either of the visual speeches, neither before nor after auditory input. But, infants who heard native…

26. Atcherson, Samuel R., Lisa Lucks Mendel, Wesley J. Baltimore, et al. "The Effect of Conventional and Transparent Surgical Masks on Speech Understanding in Individuals with and without Hearing Loss." Journal of the American Academy of Audiology 28, no. 01 (2017): 058–67. http://dx.doi.org/10.3766/jaaa.15151.

Abstract: It is generally well known that speech perception is often improved with integrated audiovisual input whether in quiet or in noise. In many health-care environments, however, conventional surgical masks block visual access to the mouth and obscure other potential facial cues. In addition, these environments can be noisy. Although these masks may not alter the acoustic properties, the presence of noise in addition to the lack of visual input can have a deleterious effect on speech understanding. A transparent ("see-through") surgical mask may help to overcome this issue. To compare the e…

27. Silverman, Michael J., and Edward T. Schwartzberg. "Effects of Visual and Auditory Presentation Styles and Musical Elements on Working Memory as Measured by Monosyllabic Sequential Digit Recall." Psychological Reports 122, no. 4 (2018): 1297–312. http://dx.doi.org/10.1177/0033294118781937.

Abstract: Information is often paired with music to facilitate memory and learning. However, there is a lack of basic research investigating how visual and auditory presentation styles and musical elements might facilitate recall. The purpose of this study is to isolate and determine the effects of visual and auditory presentation styles and musical elements on working memory as measured by sequential monosyllabic digit recall performance. Recall was tested on 60 undergraduate university students during six different conditions: (a) Visual + Auditory Chant, (b) Visual + Auditory Melody, (c) Visual + Aud…

28. van den Hurk, Job, Marc Van Baelen, and Hans P. Op de Beeck. "Development of visual category selectivity in ventral visual cortex does not require visual experience." Proceedings of the National Academy of Sciences 114, no. 22 (2017): E4501–E4510. http://dx.doi.org/10.1073/pnas.1612862114.

Abstract: To what extent does functional brain organization rely on sensory input? Here, we show that for the penultimate visual-processing region, ventral-temporal cortex (VTC), visual experience is not the origin of its fundamental organizational property, category selectivity. In the fMRI study reported here, we presented 14 congenitally blind participants with face-, body-, scene-, and object-related natural sounds and presented 20 healthy controls with both auditory and visual stimuli from these categories. Using macroanatomical alignment, response mapping, and surface-based multivoxel pattern anal…

29. Tsang, Pamela S., and Michael A. Vidulich. "Time-Sharing Visual and Auditory Tracking Tasks." Proceedings of the Human Factors Society Annual Meeting 31, no. 2 (1987): 253–57. http://dx.doi.org/10.1177/154193128703100226.

Abstract: Multiple resource theory suggests that distributing demands over separate resources will reduce resource competition and improve time-sharing efficiency. A recent hypothesis however suggests that the benefits of utilizing separate resources for the time-shared tasks may be mitigated if the two tasks are integrated. The present experiment examined the benefits of distributing the input demands of two tracking tasks as a function of task integrality. Visual and auditory compensatory tracking tasks were used. Time-sharing two tracking tasks with the same order of control is said to be more integra…

30. Schall, Sonja, Stefan J. Kiebel, Burkhard Maess, and Katharina von Kriegstein. "Early auditory sensory processing is facilitated by visual mechanisms." Seeing and Perceiving 25 (2012): 184–85. http://dx.doi.org/10.1163/187847612x648143.

Abstract: There is compelling evidence that low-level sensory areas are sensitive to more than one modality. For example, auditory cortices respond to visual-only stimuli (Calvert et al., 1997; Meyer et al., 2010; Pekkola et al., 2005) and conversely, visual sensory areas respond to sound sources even in auditory-only conditions (Poirier et al., 2005; von Kriegstein et al., 2008; von Kriegstein and Giraud, 2006). Currently, it is unknown what makes the brain activate modality-specific, sensory areas solely in response to input of a different modality. One reason may be that such activations are instrume…

31. Shinn‐Cunningham, Barbara G., and John Park. "Changes in auditory localization responses are mediated by visual input." Journal of the Acoustical Society of America 101, no. 5 (1997): 3193. http://dx.doi.org/10.1121/1.419247.

32. Fine, Ione, Eva M. Finney, Geoffrey M. Boynton, and Karen R. Dobkins. "Comparing the Effects of Auditory Deprivation and Sign Language within the Auditory and Visual Cortex." Journal of Cognitive Neuroscience 17, no. 10 (2005): 1621–37. http://dx.doi.org/10.1162/089892905774597173.

Abstract: To investigate neural plasticity resulting from early auditory deprivation and use of American Sign Language, we measured responses to visual stimuli in deaf signers, hearing signers, and hearing nonsigners using functional magnetic resonance imaging. We examined "compensatory hypertrophy" (changes in the responsivity/size of visual cortical areas) and "cross-modal plasticity" (changes in auditory cortex responses to visual stimuli). We measured the volume of early visual areas (V1, V2, V3, V4, and MT+). We also measured the amplitude of responses within these areas, and within the auditory co…

33. Lüttke, Claudia S., Matthias Ekman, Marcel A. J. van Gerven, and Floris P. de Lange. "Preference for Audiovisual Speech Congruency in Superior Temporal Cortex." Journal of Cognitive Neuroscience 28, no. 1 (2016): 1–7. http://dx.doi.org/10.1162/jocn_a_00874.

Abstract: Auditory speech perception can be altered by concurrent visual information. The superior temporal cortex is an important combining site for this integration process. This area was previously found to be sensitive to audiovisual congruency. However, the direction of this congruency effect (i.e., stronger or weaker activity for congruent compared to incongruent stimulation) has been more equivocal. Here, we used fMRI to look at the neural responses of human participants during the McGurk illusion—in which auditory /aba/ and visual /aga/ inputs are fused to perceived /ada/—in a large homogenous s…

34. Lüttke, Claudia S., Alexis Pérez-Bellido, and Floris P. de Lange. "Rapid recalibration of speech perception after experiencing the McGurk illusion." Royal Society Open Science 5, no. 3 (2018): 170909. http://dx.doi.org/10.1098/rsos.170909.

Abstract: The human brain can quickly adapt to changes in the environment. One example is phonetic recalibration: a speech sound is interpreted differently depending on the visual speech and this interpretation persists in the absence of visual information. Here, we examined the mechanisms of phonetic recalibration. Participants categorized the auditory syllables /aba/ and /ada/, which were sometimes preceded by the so-called McGurk stimuli (in which an /aba/ sound, due to visual /aga/ input, is often perceived as 'ada'). We found that only one trial of exposure to the McGurk illusion was sufficient to…

35. Yau, Jeffrey M., Pablo Celnik, Steven S. Hsiao, and John E. Desmond. "Dissociable crossmodal recruitment of visual and auditory cortex for tactile perception." Seeing and Perceiving 25 (2012): 7. http://dx.doi.org/10.1163/187847612x646307.

Abstract: Primary sensory areas previously thought to be devoted to a single modality can exhibit multisensory responses. Some have interpreted these responses as evidence for crossmodal recruitment (i.e., primary sensory processing for inputs in a non-primary modality); however, the direct contribution of this activity to perception is unclear. We tested the specific contributions of visual and auditory cortex to tactile perception in healthy adult volunteers using anodal transcranial direct current stimulation (tDCS). This form of non-invasive neuromodulation can enhance neural excitability and facili…

36. Nardo, Davide, Valerio Santangelo, and Emiliano Macaluso. "Audiovisual stimulus-driven contributions to spatial orienting in ecologically valid situations: An fMRI study." Seeing and Perceiving 25 (2012): 16. http://dx.doi.org/10.1163/187847612x646389.

Abstract: Mechanisms of audiovisual attention have been extensively investigated, yet little is known about their functioning in ecologically-valid situations. Here, we investigated brain activity associated with audiovisual stimulus-driven attention using naturalistic stimuli. We created 120 short videos (2.5 s) showing scenes of everyday life. Each video included a visual event comprising a lateralized (left/right) increase in visual saliency (e.g., an actor moving an object), plus a co-occurring sound either on the same or the opposite side of space. Subjects viewed the videos with/without the associ…

37. Yang, Shanshan, Defeng Li, and Victoria Lai Cheng Lei. "Multimodal processing in simultaneous interpreting with text: Evidence from ear-eye-voice span and performance." PLOS ONE 20, no. 7 (2025): e0326527. https://doi.org/10.1371/journal.pone.0326527.

Abstract: Simultaneous interpreting (SI) with text, a hybrid modality combining auditory and visual inputs, presents greater cognitive complexity than traditional SI. This study investigates multimodal processing in Chinese-English SI with text by examining how source speech rate and professional experience modulate interpreters' Ear-Eye-Voice Span (EIVS)—a temporal measure reflecting the cognitive coordination among auditory input, visual processing, and verbal output—and interpreting performance. Using eye-tracking technology, we analyzed EIVS patterns in 15 professional interpreters and 30 interpreti…

38

Recanzone, Gregg H. "Auditory Influences on Visual Temporal Rate Perception." Journal of Neurophysiology 89, no. 2 (2003): 1078–93. http://dx.doi.org/10.1152/jn.00706.2002.

Full text
Abstract:
Visual stimuli are known to influence the perception of auditory stimuli in spatial tasks, giving rise to the ventriloquism effect. These influences can persist in the absence of visual input following a period of exposure to spatially disparate auditory and visual stimuli, a phenomenon termed the ventriloquism aftereffect. It has been speculated that the visual dominance over audition in spatial tasks is due to the superior spatial acuity of vision compared with audition. If that is the case, then the auditory system should dominate visual perception in a manner analogous to the ventriloquism
39

Penney, Catherine G. "Modality Effects in Delayed Free Recall and Recognition: Visual is Better than Auditory." Quarterly Journal of Experimental Psychology Section A 41, no. 3 (1989): 455–70. http://dx.doi.org/10.1080/14640748908402376.

Full text
Abstract:
During presentation of auditory and visual lists of words, different groups of subjects generated words that either rhymed with the presented words or that were associates. Immediately after list presentation, subjects recalled either the presented or the generated words. After presentation and test of all lists, a final free recall test and a recognition test were given. Visual presentation generally produced higher recall and recognition than did auditory presentation for both encoding conditions. The results are not consistent with explanations of modality effects in terms of echoic memory
40

Ursino, Mauro, Cristiano Cuppini, and Elisa Magosso. "Multisensory Bayesian Inference Depends on Synapse Maturation during Training: Theoretical Analysis and Neural Modeling Implementation." Neural Computation 29, no. 3 (2017): 735–82. http://dx.doi.org/10.1162/neco_a_00935.

Full text
Abstract:
Recent theoretical and experimental studies suggest that in multisensory conditions, the brain performs a near-optimal Bayesian estimate of external events, giving more weight to the more reliable stimuli. However, the neural mechanisms responsible for this behavior, and its progressive maturation in a multisensory environment, are still insufficiently understood. The aim of this letter is to analyze this problem with a neural network model of audiovisual integration, based on probabilistic population coding—the idea that a population of neurons can encode probability functions to perform Baye
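The "near-optimal Bayesian estimate" this abstract refers to is the standard reliability-weighted (inverse-variance) cue-combination rule from the multisensory integration literature: each modality's estimate is weighted by its precision. A minimal sketch of that rule, not of the paper's neural network model; function and variable names are illustrative:

```python
def fuse_cues(s_a, var_a, s_v, var_v):
    """Inverse-variance fusion of an auditory estimate s_a and a
    visual estimate s_v of the same event (e.g., a location).
    Returns the combined estimate and its reduced variance."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_v)
    w_v = 1.0 - w_a
    s_hat = w_a * s_a + w_v * s_v
    var_hat = 1.0 / (1.0 / var_a + 1.0 / var_v)
    return s_hat, var_hat

# The more reliable (lower-variance) visual cue dominates the estimate:
est, var = fuse_cues(s_a=10.0, var_a=4.0, s_v=2.0, var_v=1.0)
# est = 0.2 * 10 + 0.8 * 2 = 3.6, var = 0.8
```

The fused variance is always smaller than either input variance, which is why spatially precise vision tends to "capture" audition in localization tasks.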
41

Kiela, Douwe, and Stephen Clark. "Learning Neural Audio Embeddings for Grounding Semantics in Auditory Perception." Journal of Artificial Intelligence Research 60 (December 26, 2017): 1003–30. http://dx.doi.org/10.1613/jair.5665.

Full text
Abstract:
Multi-modal semantics, which aims to ground semantic representations in perception, has relied on feature norms or raw image data for perceptual input. In this paper we examine grounding semantic representations in raw auditory data, using standard evaluations for multi-modal semantics. After having shown the quality of such auditorily grounded representations, we show how they can be applied to tasks where auditory perception is relevant, including two unsupervised categorization experiments, and provide further analysis. We find that features transferred from deep neural networks outperform b
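Evaluations of grounded representations like these typically pool per-frame network activations into one fixed-size embedding and compare concepts by cosine similarity. A generic sketch of that pattern under those assumptions, not the paper's actual pipeline:

```python
import math

def mean_pool(frames):
    """Collapse per-frame activations (a list of equal-length vectors)
    into one fixed-size embedding by averaging over time."""
    n = len(frames)
    return [sum(f[i] for f in frames) / n for i in range(len(frames[0]))]

def cosine(u, v):
    """Cosine similarity between two embedding vectors; grounded
    semantic similarity of two concepts is often scored this way."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)
```

With embeddings in hand, a category for an unheard concept can be assigned by nearest neighbor under `cosine`, which is how unsupervised categorization experiments of this kind are usually run.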
42

Mao, Yu-Ting, and Sarah L. Pallas. "Cross-Modal Plasticity Results in Increased Inhibition in Primary Auditory Cortical Areas." Neural Plasticity 2013 (2013): 1–18. http://dx.doi.org/10.1155/2013/530651.

Full text
Abstract:
Loss of sensory input from peripheral organ damage, sensory deprivation, or brain damage can result in adaptive or maladaptive changes in sensory cortex. In previous research, we found that auditory cortical tuning and tonotopy were impaired by cross-modal invasion of visual inputs. Sensory deprivation is typically associated with a loss of inhibition. To determine whether inhibitory plasticity is responsible for this process, we measured pre- and postsynaptic changes in inhibitory connectivity in ferret auditory cortex (AC) after cross-modal plasticity. We found that blocking GABAA receptors i
43

Wickens, Christopher D., Stephen R. Dixon, and Bobbie Seppelt. "Auditory Preemption versus Multiple Resources: Who Wins in Interruption Management?" Proceedings of the Human Factors and Ergonomics Society Annual Meeting 49, no. 3 (2005): 463–66. http://dx.doi.org/10.1177/154193120504900353.

Full text
Abstract:
We examined the effects of modality (auditory versus visual) and spatial separation when a simulated vehicle control (tracking) task (the ongoing task: OT) was time-shared with a digit entry task (the interrupting task: IT), contrasting the predictions of auditory preemption theory with that of multiple resource theory. Participants performed the tracking task with auditory display of the phone numbers, or with visual display at eccentricities ranging from 0 deg (overlay) to 45 deg. Auditory input improved IT performance relative to visual, but disrupted OT performance, thereby supporting the
44

Anton, Kristina, Arne Ernst, and Dietmar Basta. "A static sound source can improve postural stability during walking." Journal of Vestibular Research 31, no. 3 (2021): 143–49. http://dx.doi.org/10.3233/ves-200015.

Full text
Abstract:
BACKGROUND: During walking, postural stability is controlled by visual, vestibular and proprioceptive input. The auditory system uses acoustic input to localize sound sources. For some static balance conditions, the auditory influence on posture was already proven. Little is known about the impact of auditory inputs on balance in dynamic conditions. OBJECTIVE: This study is aimed at investigating postural stability of walking tasks in silence and sound-on conditions to better understand the impact of auditory input on balance in movement. METHODS: Thirty participants performed: walking (eyes op
45

Guttman, Sharon E., Lee A. Gilroy, and Randolph Blake. "Hearing What the Eyes See." Psychological Science 16, no. 3 (2005): 228–35. http://dx.doi.org/10.1111/j.0956-7976.2005.00808.x.

Full text
Abstract:
When the senses deliver conflicting information, vision dominates spatial processing, and audition dominates temporal processing. We asked whether this sensory specialization results in cross-modal encoding of unisensory input into the task-appropriate modality. Specifically, we investigated whether visually portrayed temporal structure receives automatic, obligatory encoding in the auditory domain. In three experiments, observers judged whether the changes in two successive visual sequences followed the same or different rhythms. We assessed temporal representations by measuring the extent to
46

Randazzo, Melissa, Ryan Priefer, Paul J. Smith, Amanda Nagler, Trey Avery, and Karen Froud. "Neural Correlates of Modality-Sensitive Deviance Detection in the Audiovisual Oddball Paradigm." Brain Sciences 10, no. 6 (2020): 328. http://dx.doi.org/10.3390/brainsci10060328.

Full text
Abstract:
The McGurk effect, an incongruent pairing of visual /ga/–acoustic /ba/, creates a fusion illusion /da/ and is the cornerstone of research in audiovisual speech perception. Combination illusions occur given reversal of the input modalities—auditory /ga/-visual /ba/, and percept /bga/. A robust literature shows that fusion illusions in an oddball paradigm evoke a mismatch negativity (MMN) in the auditory cortex, in absence of changes to acoustic stimuli. We compared fusion and combination illusions in a passive oddball paradigm to further examine the influence of visual and auditory aspects of i
47

Alencar, Caroline D. C., Blake E. Butler, and Stephen G. Lomber. "What and How the Deaf Brain Sees." Journal of Cognitive Neuroscience 31, no. 8 (2019): 1091–109. http://dx.doi.org/10.1162/jocn_a_01425.

Full text
Abstract:
Over the past decade, there has been an unprecedented level of interest and progress into understanding visual processing in the brain of the deaf. Specifically, when the brain is deprived of input from one sensory modality (such as hearing), it often compensates with supranormal performance in one or more of the intact sensory systems (such as vision). Recent psychophysical, functional imaging, and reversible deactivation studies have converged to define the specific visual abilities that are enhanced in the deaf, as well as the cortical loci that undergo crossmodal plasticity in the deaf and
48

Han, Yueqiao, Martijn Goudbeek, Maria Mos, and Marc Swerts. "Relative Contribution of Auditory and Visual Information to Mandarin Chinese Tone Identification by Native and Tone-naïve Listeners." Language and Speech 63, no. 4 (2019): 856–76. http://dx.doi.org/10.1177/0023830919889995.

Full text
Abstract:
Speech perception is a multisensory process: what we hear can be affected by what we see. For instance, the McGurk effect occurs when auditory speech is presented in synchrony with discrepant visual information. A large number of studies have targeted the McGurk effect at the segmental level of speech (mainly consonant perception), which tends to be visually salient (lip-reading based), while the present study aims to extend the existing body of literature to the suprasegmental level, that is, investigating a McGurk effect for the identification of tones in Mandarin Chinese. Previous studies h
49

DODD, BARBARA, and BETH McINTOSH. "Two-year-old phonology: impact of input, motor and cognitive abilities on development." Journal of Child Language 37, no. 5 (2009): 1027–46. http://dx.doi.org/10.1017/s0305000909990171.

Full text
Abstract:
Previous research has rarely compared the contributions of different underlying abilities to phonological acquisition. In this study, the auditory-visual speech perception, oro-motor and rule abstraction skills of 62 typically developing two-year olds were assessed and contrasted with the accuracy of their spoken phonology. Measures included auditory-visual speech perception, production of isolated and sequenced oro-motor movements, and verbal and non-verbal rule abstraction. Abilities in all three domains contributed to phonological acquisition. However, the use of atypical phonologic
50

Lomber, Stephen, and M. A. Meredith. "Short periods of perinatal acoustic experience alter the developmental trajectory of auditory cortex." Journal of the Acoustical Society of America 153, no. 3_supplement (2023): A204. http://dx.doi.org/10.1121/10.0018665.

Full text
Abstract:
Compared to hearing subjects, psychophysical studies have revealed specific superior visual abilities following hearing loss early in development. The neural substrate for these superior abilities resides in auditory cortex that has been reorganized through crossmodal plasticity. Furthermore, the cartography of auditory cortex is altered following the loss of auditory input early in life. This study examined how perinatal exposure to brief periods of acoustic stimulation alters the developmental trajectory of auditory cortex. Movement detection, localization in the visual periphery, and face