Academic literature on the topic 'Auditory cortex activation'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Auditory cortex activation.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Auditory cortex activation"

1

Chen, Ling-Chia, Pascale Sandmann, Jeremy D. Thorne, Martin G. Bleichner, and Stefan Debener. "Cross-Modal Functional Reorganization of Visual and Auditory Cortex in Adult Cochlear Implant Users Identified with fNIRS." Neural Plasticity 2016 (2016): 1–13. http://dx.doi.org/10.1155/2016/4382656.

Full text
Abstract:
Cochlear implant (CI) users show higher auditory-evoked activations in visual cortex and higher visual-evoked activation in auditory cortex compared to normal hearing (NH) controls, reflecting functional reorganization of both visual and auditory modalities. Visual-evoked activation in auditory cortex is a maladaptive functional reorganization whereas auditory-evoked activation in visual cortex is beneficial for speech recognition in CI users. We investigated their joint influence on CI users’ speech recognition, by testing 20 postlingually deafened CI users and 20 NH controls with functional near-infrared spectroscopy (fNIRS). Optodes were placed over occipital and temporal areas to measure visual and auditory responses when presenting visual checkerboard and auditory word stimuli. Higher cross-modal activations were confirmed in both auditory and visual cortex for CI users compared to NH controls, demonstrating that functional reorganization of both auditory and visual cortex can be identified with fNIRS. Additionally, the combined reorganization of auditory and visual cortex was found to be associated with speech recognition performance. Speech performance was good as long as the beneficial auditory-evoked activation in visual cortex was higher than the visual-evoked activation in the auditory cortex. These results indicate the importance of considering cross-modal activations in both visual and auditory cortex for potential clinical outcome estimation.
APA, Harvard, Vancouver, ISO, and other styles
2

Bergerbest, Dafna, Dara G. Ghahremani, and John D. E. Gabrieli. "Neural Correlates of Auditory Repetition Priming: Reduced fMRI Activation in the Auditory Cortex." Journal of Cognitive Neuroscience 16, no. 6 (2004): 966–77. http://dx.doi.org/10.1162/0898929041502760.

Full text
Abstract:
Repetition priming refers to enhanced or biased performance with repeatedly presented stimuli. Modality-specific perceptual repetition priming has been demonstrated behaviorally for both visually and auditorily presented stimuli. In functional neuroimaging studies, repetition of visual stimuli has resulted in reduced activation in the visual cortex, as well as in multimodal frontal and temporal regions. The reductions in sensory cortices are thought to reflect plasticity in modality-specific neocortex. Unexpectedly, repetition of auditory stimuli has resulted in reduced activation in multimodal and visual regions, but not in the auditory temporal lobe cortex. This finding puts the coupling of perceptual priming and modality-specific cortical plasticity into question. Here, functional magnetic resonance imaging was used with environmental sounds to reexamine whether auditory priming is associated with reduced activation in the auditory cortex. Participants heard environmental sounds (e.g., animals, machines, musical instruments, etc.) in blocks, alternating between initial and repeated presentations, and decided whether or not each sound was produced by an animal. Repeated versus initial presentations of sounds resulted in repetition priming (faster responses) and reduced activation in the right superior temporal gyrus, bilateral superior temporal sulci, and right inferior prefrontal cortex. The magnitude of behavioral priming correlated positively with reduced activation in these regions. This indicates that priming for environmental sounds is associated with modification of neural activation in modality-specific auditory cortex, as well as in multimodal areas.
APA, Harvard, Vancouver, ISO, and other styles
3

Soler-Vidal, Joan, Paola Fuentes-Claramonte, Pilar Salgado-Pineda, et al. "Brain correlates of speech perception in schizophrenia patients with and without auditory hallucinations." PLOS ONE 17, no. 12 (2022): e0276975. http://dx.doi.org/10.1371/journal.pone.0276975.

Full text
Abstract:
The experience of auditory verbal hallucinations (AVH, “hearing voices”) in schizophrenia has been found to be associated with reduced auditory cortex activation during perception of real auditory stimuli like tones and speech. We re-examined this finding using 46 patients with schizophrenia (23 with frequent AVH and 23 hallucination-free), who underwent fMRI scanning while they heard words, sentences and reversed speech. Twenty-five matched healthy controls were also examined. Perception of words, sentences and reversed speech all elicited activation of the bilateral superior temporal cortex, the inferior and lateral prefrontal cortex, the inferior parietal cortex and the supplementary motor area in the patients and the healthy controls. During the sentence and reversed speech conditions, the schizophrenia patients as a group showed reduced activation in the left primary auditory cortex (Heschl’s gyrus) relative to the healthy controls. No differences were found between the patients with and without hallucinations in any condition. This study therefore fails to support previous findings that experience of AVH attenuates speech-perception-related brain activations in the auditory cortex. At the same time, it suggests that schizophrenia patients, regardless of presence of AVH, show reduced activation in the primary auditory cortex during speech perception, a finding which could reflect an early information processing deficit in the disorder.
APA, Harvard, Vancouver, ISO, and other styles
4

Hubl, Daniela, Thomas Koenig, Werner K. Strik, Lester Melie Garcia, and Thomas Dierks. "Competition for neuronal resources: how hallucinations make themselves heard." British Journal of Psychiatry 190, no. 1 (2007): 57–62. http://dx.doi.org/10.1192/bjp.bp.106.022954.

Full text
Abstract:
Background: Hallucinations are perceptions in the absence of a corresponding external sensory stimulus. However, during auditory verbal hallucinations, activation of the primary auditory cortex has been described. Aims: The objective of this study was to investigate whether this activation of the auditory cortex contributes essentially to the character of hallucinations and attributes them to alien sources, or whether the auditory activation is a sign of increased general auditory attention to external sounds. Method: The responsiveness of the auditory cortex was investigated by auditory evoked potentials (N100) during the simultaneous occurrence of hallucinations and external stimuli. Evoked potentials were computed separately for periods with and without hallucinations; N100 power, topography and brain electrical sources were analysed. Results: Hallucinations lowered the N100 amplitudes and changed the topography, presumably due to a reduced left temporal responsivity. Conclusions: This finding indicates competition between auditory stimuli and hallucinations for physiological resources in the primary auditory cortex. The abnormal activation of the primary auditory cortex may thus be a constituent of auditory hallucinations.
APA, Harvard, Vancouver, ISO, and other styles
5

Pollmann, Stefan, and Marianne Maertens. "Perception modulates auditory cortex activation." NeuroReport 17, no. 17 (2006): 1779–82. http://dx.doi.org/10.1097/wnr.0b013e3280107a98.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Hsieh, P. J., J. T. Colas, and N. Kanwisher. "Spatial pattern of BOLD fMRI activation reveals cross-modal information in auditory cortex." Journal of Neurophysiology 107, no. 12 (2012): 3428–32. http://dx.doi.org/10.1152/jn.01094.2010.

Full text
Abstract:
Recent findings suggest that neural representations in early auditory cortex reflect not only the physical properties of a stimulus, but also high-level, top-down, and even cross-modal information. However, the nature of cross-modal information in auditory cortex remains poorly understood. Here, we used pattern analyses of fMRI data to ask whether early auditory cortex contains information about the visual environment. Our data show that 1) early auditory cortex contained information about a visual stimulus when there was no bottom-up auditory signal, and that 2) no influence of visual stimulation was observed in auditory cortex when visual stimuli did not provide a context relevant to audition. Our findings attest to the capacity of auditory cortex to reflect high-level, top-down, and cross-modal information and indicate that the spatial patterns of activation in auditory cortex reflect contextual/implied auditory information but not visual information per se.
APA, Harvard, Vancouver, ISO, and other styles
7

Ulualp, Seckin O., Bharat B. Biswal, F. Zerrin Yetkin, and Thomas M. Kidder. "Assessment of auditory cortex activation with functional magnetic resonance imaging." Otolaryngology–Head and Neck Surgery 122, no. 2 (2000): 241–45. http://dx.doi.org/10.1016/s0194-5998(00)70247-6.

Full text
Abstract:
OBJECTIVE The goal was to assess auditory cortex activation evoked by pure-tone stimulus with functional MRI. METHODS Five healthy children, aged 7 to 10 years, were studied. Hearing evaluation was performed by pure-tone audiometry in a sound-treated room and in the MRI scanner with the scanner noise in the background. Subjects were asked to listen to pure tones (500, 1000, 2000, and 4000 Hz) at thresholds determined in the MRI scanner. Functional image processing was performed with a cross-correlation technique with a correlation coefficient of 0.5 ( P < 0.0001). Auditory cortex activation was assessed by observing activated pixels in functional images. RESULTS Functional images of auditory cortex activation were obtained in 3 children. All children showed activation in Heschl's gyrus, middle temporal gyrus, superior temporal gyrus, and planum temporale. The number of activated pixels in auditory cortexes ranged from 4 to 33. CONCLUSIONS Functional images of auditory cortex activation evoked by pure-tone stimuli are obtained in healthy children with the functional MRI technique.
APA, Harvard, Vancouver, ISO, and other styles
8

van der Heijden, Kiki, Elia Formisano, Giancarlo Valente, Minye Zhan, Ron Kupers, and Beatrice de Gelder. "Reorganization of Sound Location Processing in the Auditory Cortex of Blind Humans." Cerebral Cortex 30, no. 3 (2019): 1103–16. http://dx.doi.org/10.1093/cercor/bhz151.

Full text
Abstract:
Auditory spatial tasks induce functional activation in the occipital—visual—cortex of early blind humans. Less is known about the effects of blindness on auditory spatial processing in the temporal—auditory—cortex. Here, we investigated spatial (azimuth) processing in congenitally and early blind humans with a phase-encoding functional magnetic resonance imaging (fMRI) paradigm. Our results show that functional activation in response to sounds in general—independent of sound location—was stronger in the occipital cortex but reduced in the medial temporal cortex of blind participants in comparison with sighted participants. Additionally, activation patterns for binaural spatial processing were different for sighted and blind participants in planum temporale. Finally, fMRI responses in the auditory cortex of blind individuals carried less information on sound azimuth position than those in sighted individuals, as assessed with a 2-channel, opponent coding model for the cortical representation of sound azimuth. These results indicate that early visual deprivation results in reorganization of binaural spatial processing in the auditory cortex and that blind individuals may rely on alternative mechanisms for processing azimuth position.
APA, Harvard, Vancouver, ISO, and other styles
9

David, Anthony S., Peter W. R. Woodruff, Robert Howard, et al. "Auditory hallucinations inhibit exogenous activation of auditory association cortex." NeuroReport 7, no. 4 (1996): 932–36. http://dx.doi.org/10.1097/00001756-199603220-00021.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Li, Qiang, Suang Xia, and Fei Zhao. "A Study of Language in Functional Cortex of Audiphone Deaf Signers Used FMRI." Advanced Materials Research 204-210 (February 2011): 5–10. http://dx.doi.org/10.4028/www.scientific.net/amr.204-210.5.

Full text
Abstract:
Functional magnetic resonance imaging (fMRI) was used to observe changes in the cerebral functional cortex of prelingual deaf signers (PDS) using Chinese Sign Language (CSL). Results: While observing and imitating CSL, the activated areas in all groups included the bilateral middle frontal gyrus, middle temporal gyrus, superior parietal lobule, cuneus, fusiform gyrus, and lingual gyrus. Activation of the bilateral inferior frontal gyrus was found in groups I, III, and IV, but not in group II. Activation of the bilateral superior temporal gyrus and inferior parietal lobule was found in groups I and III only. The activated volumes of the bilateral inferior frontal gyrus in group I were greater than those in groups III and IV, and the volumes of the bilateral superior temporal gyrus in group I were greater than those in group III. Conclusion: Cortical reorganization occurred in PDS after they lost their hearing and learned CSL. Activation of linguistic cortex was found while PDS observed and imitated CSL. In the absence of auditory input, the secondary auditory cortex and association areas take part in processing visual language, whereas the primary auditory cortex does not participate in this reorganization. Additionally, the visual cortex of PDS is more sensitive than that of normal-hearing individuals.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Auditory cortex activation"

1

Chavez, Candice Monique. "Top-down modulation by medial prefrontal cortex of basal forebrain activation of auditory cortex during learning." CSUSB ScholarWorks, 2006. https://scholarworks.lib.csusb.edu/etd-project/3053.

Full text
Abstract:
The experiment tested the hypothesis that the acetylcholine (ACh) release in the rat auditory cortex is greater in rats undergoing auditory classical conditioning compared to rats in a truly random control paradigm where no associative learning takes place and that this is mediated by prefrontal afferent projections influencing the nucleus basalis magnocellularis (NBM), which in turn modulates ACh release in neocortex. Rats with bilateral ibotenic acid lesions of medial prefrontal and agranular insular cortices were tested in an auditory classical conditioning task while ACh was collected from the primary auditory cortex. It was hypothesized that lesions of these prefrontal areas would prevent learning-related increases of ACh release in the primary auditory cortex. The hypothesized results were supported. Results from this experiment provide unique evidence that medial prefrontal cortex projections to the NBM are important for mediating cortical ACh release during associative learning.
APA, Harvard, Vancouver, ISO, and other styles
2

Morita, Takeshi. "Enhanced activation of the auditory cortex in patients with inner-ear hearing impairment : An MEG study." Kyoto University, 2003. http://hdl.handle.net/2433/148740.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Mohammad, Penaz Parveen Sultana. "Quantitative Measurement of Cerebral Hemodynamics During Activation of Auditory Cortex With Single- and Multi-Distance Near Infrared Spectroscopy." Scholar Commons, 2018. https://scholarcommons.usf.edu/etd/7698.

Full text
Abstract:
Functional Near Infrared Spectroscopy (fNIRS) is a safe, low-cost, non-invasive optical technique to monitor focal changes in brain activity using neurovascular coupling and measurements of local tissue oxygenation, i.e., changes in concentrations of oxygenated hemoglobin (HbO) and deoxygenated hemoglobin (HbR)[42]. This thesis utilizes two fNIRS approaches to measure hemodynamic changes associated with functional stimulation of the human auditory cortex. The first approach, single-distance continuous wave NIRS (CW-NIRS), utilizes relatively simple instrumentation and the Modified Beer-Lambert (MBL) law to estimate activation-induced changes in tissue oxygenation (∆CHbO and ∆CHbR)[17]. The second, more complex approach, frequency domain NIRS (FD-NIRS), employs a photon diffusion model of light propagation through tissue to measure both baseline (CHbO and CHbR) and stimulus-induced changes in oxygenated and deoxygenated hemoglobin[10]. FD-NIRS is more quantitative, but requires measurements at multiple light source-detector separations, and thus its use in measuring focal changes in cerebral hemodynamics has been limited. A commercial FD-NIRS instrument was used to measure the cerebral hemodynamics from the right auditory cortex of 9 adults (21–35 years) with normal hearing, while presented with two types of auditory stimuli: a 1000 Hz Pure tone, and Broad band noise. Measured optical intensities were analyzed using both MBL and photon diffusion approaches. Oxygenated hemoglobin was found to increase by 0.351 ± 0.116 µM and 0.060 ± 0.084 µM for Pure tone and Broad band noise stimuli, when analyzed by the MBL method at the ‘best’ source-detector separation. On average (across all sources), MBL analysis estimated an increase in CHbO of 0.100 ± 0.075 µM and 0.099 ± 0.084 µM respectively for Pure tone and Broad band noise stimulation.
In contrast, the frequency domain analysis method estimated changes in CHbO of −0.401 ± 0.384 µM and −0.031 ± 0.358 µM for Pure tone and Broad band noise stimulation respectively. These results suggest that although more quantitative, multi-distance FD-NIRS may underestimate focal changes in cerebral hemodynamics that occur due to functional activation. Potential reasons for this discrepancy, including the partial volume effect, are discussed.
APA, Harvard, Vancouver, ISO, and other styles
4

Perrot, Xavier. "Modulation centrale du fonctionnement cochléaire chez l’humain : activation et plasticité." Thesis, Lyon 2, 2009. http://www.theses.fr/2009LYO29998.

Full text
Abstract:
The auditory system has two special features. At the peripheral level, active cochlear micromechanisms (ACM), underlain by the motility of outer hair cells (OHC), are involved in auditory sensitivity and frequency selectivity. At the central level, the medial olivocochlear efferent system (MOCES), which directly projects onto OHC to modulate ACM, improves auditory perception in noise. From an exploratory point of view, both processes can be assessed through transient evoked otoacoustic emissions (TEOAE) and the procedure of contralateral suppression. In addition, experimental data in animals have disclosed a top-down control exerted by the corticofugal descending auditory system (CDAS) on the cochlea, via the MOCES. The present work comprises three studies carried out in humans, aiming to investigate interactions between CDAS, MOCES and ACM. The first and second studies, based on an innovative experimental procedure in epileptic patients undergoing presurgical stereoelectroencephalography, revealed a differential attenuation effect of intracerebral electrical stimulation on TEOAE amplitude depending on stimulation modalities, as well as a variability of this effect depending on the clinical history of epilepsy. The third study showed a bilateral enhancement of MOCES activity in professional musicians. Taken together, these results provide direct and indirect evidence for the existence of a functional CDAS in humans. Moreover, possible long-term plasticity phenomena, either pathological (as in epileptic patients) or supernormal (as in professional musicians), may change cortico-olivocochlear activity.
APA, Harvard, Vancouver, ISO, and other styles
5

Tien, Hui-chi, and 田輝勣. "Changes in Activation Pattern of the Auditory Cortex FollowingUnilateral Amplification: an fMRI study." Thesis, 2005. http://ndltd.ncl.edu.tw/handle/01995053306418895200.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Wang, Sheng Ru, and 王聖儒. "The activation of auditory cortex with the sound stimuli by fMRI: Unilateral sensorineural hearing-impaired human." Thesis, 2010. http://ndltd.ncl.edu.tw/handle/55843789725143112145.

Full text
Abstract:
Master's thesis, Chang Gung University, Graduate Institute of Medical Mechatronics Engineering, academic year 98 (2010). Currently, the same management strategy—with respect to both clinical diagnosis and choice of hearing aids—is offered to all patients with unilateral sensorineural hearing loss (SNHL) in different ears. In some cases, neither treatment nor a hearing aid is considered for a patient with hearing loss simply because the patient still has a good ear. As a result, patients with unilateral SNHL may experience further deterioration in the auditory pathway. The aim of this study was to elucidate the relationship between the lesion side and the activation of the cerebral auditory cortex in such patients upon hearing their own name in an unknown voice. All experiments were performed at the Chang Gung Memorial Hospital, using a 3-Tesla magnetic resonance scanner made by Siemens. A total of ten patients with unilateral SNHL were recruited (three males and seven females, with an average age of 30 ± 14.9 years, ranging from 14 to 60 years, and an average history of hearing loss of 14 ± 6.29 years). Of these, five suffered from left-sided hearing impairment while the other five had right-sided lesions. During the experiment, an acoustic stimulus—the subject's own name—was played to the subject and the resulting cortex activation measured with functional magnetic resonance imaging (fMRI). The results showed a more extensive and stronger reaction in the cerebral cortex of the lesion side, regardless of whether the hearing impairment was left-sided or right-sided. In addition, a higher activation was found in patients with hearing loss in the left ear than in those with a lesion in the right ear. This study illustrated the differences in cerebral cortex activation due to the side of the lesion, which might affect the patient’s speech perception ability. Thus it provides more direct and objective evidence for the clinical evaluation of unilateral SNHL, in order to determine the patients’ need for cochlear implants or hearing aids.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Auditory cortex activation"

1

Scheich, Henning, and Michael Brosch. "Task-Related Activation of Auditory Cortex." In Neural Correlates of Auditory Cognition. Springer New York, 2012. http://dx.doi.org/10.1007/978-1-4614-2350-8_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Ashe, John H., and Norman M. Weinberger. "Acetylcholine Modulation of Cellular Excitability Via Muscarinic Receptors: Functional Plasticity in Auditory Cortex." In Activation to Acquisition. Birkhäuser Boston, 1991. http://dx.doi.org/10.1007/978-1-4684-0556-9_8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Hari, Riitta. "Activation of the Human Auditory Cortex by Various Sound Sequences : Neuromagnetic Studies." In Advances in Biomagnetism. Springer US, 1989. http://dx.doi.org/10.1007/978-1-4613-0581-1_9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Zatorre, Robert. "Early Sound Processing." In From Perception to Pleasure. Oxford University PressNew York, 2024. http://dx.doi.org/10.1093/oso/9780197558287.003.0002.

Full text
Abstract:
The focus here is on the basic anatomy and physiology of the auditory cortex, and its inputs and outputs. Predictive coding mechanisms emerge already at early levels of processing in the interactions between auditory subcortical nuclei and cortex. The brainstem nuclei and auditory cortical fields are organized hierarchically such that each region sends and receives inputs to the next level. Mismatch responses emerge at early levels and reflect predictive mechanisms. The functional properties of auditory cortex especially important for musical processing include periodicity coding, which is related to the representation of pitch. Auditory cortical responses can be characterized in terms of sensitivity to spectrotemporal modulation, allowing efficient encoding of complex sound patterns. Auditory cortical systems are also important for segregating multiple overlapping sounds, another function essential for music. Musical imagery is linked to the activation of auditory cortical circuits. Dysfunction of this circuitry can lead to auditory hallucinations.
APA, Harvard, Vancouver, ISO, and other styles
5

Gonzalo, Désirée, and Christian Büchel. "Audio-Visual Associative Learning Enhances Responses to Auditory Stimuli in Visual Cortex." In Functional Neuroimaging of Visual Cognition. Oxford University PressOxford, 2004. http://dx.doi.org/10.1093/oso/9780198528456.003.0011.

Full text
Abstract:
This study aimed to test whether audio-visual learning is mediated by modulation in sensory cortices (i.e. visual or auditory cortex). During fMRI scanning, subjects were exposed to repeated presentations of three tones and three visual stimuli. In 50% of presentations, one tone was paired with a face, and another one with a moving noise pattern. The third sound was not paired with any visual stimulus and was the target in an incidental detection task. Over time, activation in extrastriate visual cortex (V4) and the fusiform face area (FFA) increased in response to the sound that predicted the face, but not in response to the other two sounds.
APA, Harvard, Vancouver, ISO, and other styles
6

Ahmed, Raheel, and Rumana Ahmed. "Evidence of a Neuroinflammatory Model of Tinnitus." In Advances in the Auditory and Vestibular Systems [Working Title]. IntechOpen, 2022. http://dx.doi.org/10.5772/intechopen.106082.

Full text
Abstract:
Emerging literature has highlighted the relationship between inflammatory and neuroinflammatory biomarkers and tinnitus. Neuroinflammation may help to explain the mechanisms underpinning hyperactivity in the cochlea, cochlear nucleus, inferior colliculus, medial geniculate body, and the auditory cortex in those with tinnitus. Glial activation and pro-inflammatory cytokines may cause excitatory-inhibitory synaptic imbalance. Advancing our understanding of these mechanisms may help elucidate the pathogenesis of tinnitus and lead to improvement in subtyping subjective tinnitus. The chapter explores our current understanding of the neuroinflammatory model within the context of the classical auditory pathway and what we can infer about the underlying mechanisms based on these studies.
APA, Harvard, Vancouver, ISO, and other styles
7

"Do we just not hear the voices? – Spontaneous activation of auditory cortex during silence in healthy adults." In Neuro-Visionen 4. Verlag Ferdinand Schöningh, 2007. http://dx.doi.org/10.30965/9783657764082_076.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Auditory cortex activation"

1

Sylcottt, Brian, Rui Wu, Shanyue Guan, and Chia-Cheng Lin. "Comparing Brain Activity Between Sitting and Standing Positions during Optic Flow with Coinciding Auditory Cognitive Tasks." In AHFE 2023 Hawaii Edition. AHFE International, 2023. http://dx.doi.org/10.54941/ahfe1004368.

Full text
Abstract:
Physical therapy intervention for people with vestibular disorders often includes optic flow stimulation. Such interventions can be performed with patients in either sitting or standing positions. Yet, little is known about how these positions affect brain activation during treatment. In this study, functional near-infrared spectroscopy (fNIRS) was used to investigate the reaction time and activation patterns of the prefrontal cortex (PFC) and temporoparietal junction VEST between sitting and standing conditions in the presence of both visual (optic flow) and cognitive (reaction time tasks) stimulation. 33 healthy adults participated in this two-visit study. In the first visit, participants were instructed to perform a series of reaction time tasks while sitting and experiencing optic flow at varying speeds through the HTC ViveTM virtual reality headset. In the second visit, participants performed the same tasks while standing. When compared with sitting, increased activation was observed in the left and right VEST for some of the standing trials. However, no statistical difference was found in the right or left PFC activation between sitting and standing positions when performing the reaction time tasks. These results suggest that, when compared to a sitting position, tasks performed in a standing position with optic flow stimulation will elicit greater VEST cortex activation, allow for multisensory integration training, and enhance positive outcomes after vestibular rehabilitation.
APA, Harvard, Vancouver, ISO, and other styles
2

Caravaca-Rodriguez, Daniel, Gregg J. Suaning, and Alejandro Barriga-Rivera. "Cross-modal activation of the primary visual cortex by auditory stimulation in RCS rats: considerations in visual prosthesis." In 2022 44th Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC). IEEE, 2022. http://dx.doi.org/10.1109/embc48229.2022.9871504.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Heiskala, Juha, Kalle Kotilahti, Lauri Lipiäinen, Petri Hiltunen, P. Ellen Grant, and Ilkka Nissilä. "Optical tomographic imaging of activation of the infant auditory cortex using perturbation Monte Carlo with anatomical a priori information." In European Conference on Biomedical Optics. OSA, 2007. http://dx.doi.org/10.1364/ecbo.2007.6629_29.

4

Heiskala, Juha, Kalle Kotilahti, Lauri Lipiäinen, Petri Hiltunen, P. Ellen Grant, and Ilkka Nissilä. "Optical tomographic imaging of activation of the infant auditory cortex using perturbation Monte Carlo with anatomical a priori information." In European Conference on Biomedical Optics, edited by Brian W. Pogue and Rinaldo Cubeddu. SPIE, 2007. http://dx.doi.org/10.1117/12.728276.

5

Zakeri, Zohreh, Azfar Khalid, Ahmet Omurtag, Greg Hilliard, and Philip Breedon. "Building Trust and safety Correlates for Autonomous Systems using Physiological, Behavioral, and Subjective Measures." In 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022). AHFE International, 2022. http://dx.doi.org/10.54941/ahfe1001595.

Abstract:
The use of collaborative robots (cobots) in industrial settings has grown and continues to expand globally, especially in the context of the smart factory. Mistrust and stress result because cobots do not provide the facial, auditory, and visual cues that workers normally use to predict behavior. To quantify mental stress, physiological, behavioral, and subjective measures were integrated, processed, and analyzed in a smart factory lab setting. The impact on human workers, in the form of mental stress and fatigue, was correlated with task complexity, speed of work, length of the collaborative task, cobot payload, and related factors. Multimodal functional neuroimaging was used to record participants' neural and cardiac activity, in addition to standard subjective and behavioral measures, as they collaborated with robots in multitasking contexts. Preliminary results show that task complexity is positively correlated with beta and gamma band power, left prefrontal cortex activation, and heart rate, and negatively correlated with alpha band power during task performance.
6

Kusumoto, Junnosuke, Kyoko Shibata, and Hironobu Satoh. "Timbre estimation of compound tones from an auditory cortex by deep learning using fMRI: Sound pressure levels detection of specific frequency." In 2024 AHFE International Conference on Human Factors in Design, Engineering, and Computing (AHFE 2024 Hawaii Edition). AHFE International, 2024. http://dx.doi.org/10.54941/ahfe1005705.

Abstract:
Brain decoding has been widely studied in neuroscience. However, compared with research on the visual cortex, progress in the auditory cortex has lagged. The purpose of this study is therefore to establish a technique for estimating the sounds a person hears, using deep learning applied to brain images captured by fMRI. The sounds we hear in everyday life have a unique timbre. Timbre is determined by the combination of sound pressure levels at the overtones, which are natural-number multiples of the fundamental frequency, in a compound tone. Previously, this research group decoded the pitch of pure tones, which are waves of a single frequency; as a result, the discrimination of two tones separated by increasing scale degrees and the detection of a specific pitch within a triad were achieved. The next phase of this research is to decode the sound pressure level at a specific frequency. By combining these methods, we believe it is possible to decode timbre by detecting the sound pressure level of a specific overtone. In a previous report, we examined whether brain activity while listening to pure tones at two different sound pressure levels at a specific frequency could be discriminated by deep learning binary classification. The result was a discrimination rate of 70.84% for relative levels of 0 [dB] and -20 [dB] when 90 [dB] was used as the reference. This result indicates that differences in brain activation intensity due to sound pressure level can be handled as a classification problem by deep learning. The purpose of this paper is therefore to detect the sound pressure level of pure tones using deep learning, for application to timbre decoding. Specifically, we attempt to detect a specific sound pressure level among three tones at 0 [dB], -10 [dB], and -20 [dB] relative to an absolute reference level of 90 [dB].
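The binary classification step described in this abstract can be illustrated with a minimal sketch. This is not the authors' code: the network, the synthetic voxel data, the signal strength, and all parameter choices are assumptions made for illustration. It trains a small neural network to discriminate two simulated "sound pressure level" conditions from noisy voxel-activation vectors standing in for fMRI data.

```python
# Illustrative sketch only (not the paper's method): binary classification of
# synthetic voxel-activation vectors, mirroring the two-level (0 dB vs -20 dB)
# discrimination task described in the abstract.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 500

# Class 1 ("0 dB") trials get a slightly stronger response than class 0
# ("-20 dB") trials in a subset of simulated auditory-cortex voxels.
X = rng.normal(size=(n_trials, n_voxels))
y = rng.integers(0, 2, size=n_trials)
X[y == 1, :50] += 0.5  # simulated level-dependent activation

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0)
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
print(f"held-out discrimination rate: {accuracy:.2f}")
```

In the actual study, the inputs would be preprocessed fMRI volumes rather than synthetic vectors, and the reported 70.84% discrimination rate refers to their data, not this sketch.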