Journal articles on the topic 'Multisensory neurons'

Consult the top 50 journal articles for your research on the topic 'Multisensory neurons.'


1

Allman, Brian L., and M. Alex Meredith. "Multisensory Processing in “Unimodal” Neurons: Cross-Modal Subthreshold Auditory Effects in Cat Extrastriate Visual Cortex." Journal of Neurophysiology 98, no. 1 (July 2007): 545–49. http://dx.doi.org/10.1152/jn.00173.2007.

Abstract:
Historically, the study of multisensory processing has examined the function of the definitive neuron type, the bimodal neuron. These neurons are excited by inputs from more than one sensory modality, and when multisensory stimuli are present, they can integrate their responses in a predictable manner. However, recent studies have revealed that multisensory processing in the cortex is not restricted to bimodal neurons. The present investigation sought to examine the potential for multisensory processing in nonbimodal (unimodal) neurons in the retinotopically organized posterolateral lateral suprasylvian (PLLS) area of the cat. Standard extracellular recordings were used to measure responses of all neurons encountered to both separate- and combined-modality stimulation. Whereas bimodal neurons behaved as predicted, the surprising result was that 16% of unimodal visual neurons encountered were significantly facilitated by auditory stimuli. Because these unimodal visual neurons did not respond to an auditory stimulus presented alone but had their visual responses modulated by concurrent auditory stimulation, they represent a new form of multisensory neuron: the subthreshold multisensory neuron. These data also demonstrate that bimodal neurons can no longer be regarded as the exclusive basis for multisensory processing.
2

Perrault, Thomas J., J. William Vaughan, Barry E. Stein, and Mark T. Wallace. "Superior Colliculus Neurons Use Distinct Operational Modes in the Integration of Multisensory Stimuli." Journal of Neurophysiology 93, no. 5 (May 2005): 2575–86. http://dx.doi.org/10.1152/jn.00926.2004.

Abstract:
Many neurons in the superior colliculus (SC) integrate sensory information from multiple modalities, giving rise to significant response enhancements. Although enhanced multisensory responses have been shown to depend on the spatial and temporal relationships of the stimuli as well as on their relative effectiveness, these factors alone do not appear sufficient to account for the substantial heterogeneity in the magnitude of the multisensory products that have been observed. Toward this end, the present experiments have revealed that there are substantial differences in the operations used by different multisensory SC neurons to integrate their cross-modal inputs, suggesting that intrinsic differences in these neurons may also play an important deterministic role in multisensory integration. In addition, the integrative operation employed by a given neuron was found to be well correlated with the neuron's dynamic range. In total, four categories of SC neurons were identified based on how their multisensory responses changed relative to the predicted addition of the two unisensory inputs as stimulus effectiveness was altered. Despite the presence of these categories, a general rule was that the most robust multisensory enhancements were seen with combinations of the least effective unisensory stimuli. Together, these results provide a better quantitative picture of the integrative operations performed by multisensory SC neurons and suggest mechanistic differences in the way in which these neurons synthesize cross-modal information.
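The comparisons running through this abstract (and several of the others below) rest on two conventional quantities from this literature: the multisensory enhancement index, which expresses the combined-modality response relative to the best unisensory response, and the comparison of the combined response against the predicted arithmetic sum of the two unisensory responses (superadditive, additive, subadditive). A minimal sketch, assuming spike counts as the response measure:

```python
def enhancement_index(cm, sm_max):
    """Multisensory enhancement (%) relative to the most effective
    unisensory response: positive = enhancement, negative = depression."""
    return 100.0 * (cm - sm_max) / sm_max

def additivity(cm, sm_a, sm_b):
    """Classify the multisensory response against the predicted
    arithmetic sum of the two unisensory responses."""
    predicted = sm_a + sm_b
    if cm > predicted:
        return "superadditive"
    if cm < predicted:
        return "subadditive"
    return "additive"

# A weakly effective pairing yielding a robust enhancement, in the
# spirit of inverse effectiveness: 2 and 3 spikes alone, 9 combined.
print(enhancement_index(9, 3))   # 200.0
print(additivity(9, 2, 3))       # superadditive
```

On these definitions, the "most robust multisensory enhancements ... with combinations of the least effective unisensory stimuli" corresponds to large index values arising precisely when `sm_max` is small.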
3

Jiang, Wan, Mark T. Wallace, Huai Jiang, J. William Vaughan, and Barry E. Stein. "Two Cortical Areas Mediate Multisensory Integration in Superior Colliculus Neurons." Journal of Neurophysiology 85, no. 2 (February 1, 2001): 506–22. http://dx.doi.org/10.1152/jn.2001.85.2.506.

Abstract:
The majority of multisensory neurons in the cat superior colliculus (SC) are able to synthesize cross-modal cues (e.g., visual and auditory) and thereby produce responses greater than those elicited by the most effective single modality stimulus and, sometimes, greater than those predicted by the arithmetic sum of their modality-specific responses. The present study examined the role of corticotectal inputs from two cortical areas, the anterior ectosylvian sulcus (AES) and the rostral aspect of the lateral suprasylvian sulcus (rLS), in producing these response enhancements. This was accomplished by evaluating the multisensory properties of individual SC neurons during reversible deactivation of these cortices individually and in combination using cryogenic deactivation techniques. Cortical deactivation eliminated the characteristic multisensory response enhancement of nearly all SC neurons but generally had little or no effect on a neuron's modality-specific responses. Thus, the responses of SC neurons to combinations of cross-modal stimuli were now no different from those evoked by one or the other of these stimuli individually. Of the two cortical areas, AES had a much greater impact on SC multisensory integrative processes, with nearly half the SC neurons sampled dependent on it alone. In contrast, only a small number of SC neurons depended solely on rLS. However, most SC neurons exhibited dual dependencies, and their multisensory enhancement was mediated by either synergistic or redundant influences from AES and rLS. Corticotectal synergy was evident when deactivating either cortical area compromised the multisensory enhancement of an SC neuron, whereas corticotectal redundancy was evident when deactivation of both cortical areas was required to produce this effect. 
The results suggest that, although multisensory SC neurons can be created as a consequence of a variety of converging tectopetal afferents that are derived from a host of subcortical and cortical structures, the ability to synthesize cross-modal inputs, and thereby produce an enhanced multisensory response, requires functional inputs from the AES, the rLS, or both.
4

Wallace, M. T. "The Development of Multisensory Integration in the Brain." Perception 26, no. 1_suppl (August 1997): 35. http://dx.doi.org/10.1068/v970014.

Abstract:
Multisensory integration in the superior colliculus (SC) of the cat requires a protracted postnatal developmental time course. Kittens 3 – 135 days postnatal (dpn) were examined and the first neuron capable of responding to two different sensory inputs (auditory and somatosensory) was not seen until 12 dpn. Visually responsive multisensory neurons were not encountered until 20 dpn. These early multisensory neurons responded weakly to sensory stimuli, had long response latencies, large receptive fields, and poorly developed response selectivities. Most striking, however, was their inability to integrate cross-modality cues in order to produce the significant response enhancement or depression characteristic of these neurons in adults. The incidence of multisensory neurons increased gradually over the next 10 – 12 weeks. During this period, sensory responses became more robust, latencies shortened, receptive fields decreased in size, and unimodal selectivities matured. The first neurons capable of cross-modality integration were seen at 28 dpn. For the following two months, the incidence of such integrative neurons rose gradually until adult-like values were achieved. Surprisingly, however, as soon as a multisensory neuron exhibited this capacity, most of its integrative features were indistinguishable from those in adults. Given what is known about the requirements for multisensory integration in adult animals, this observation suggests that the appearance of multisensory integration reflects the onset of functional corticotectal inputs.
5

Meredith, M. A., and B. E. Stein. "Spatial determinants of multisensory integration in cat superior colliculus neurons." Journal of Neurophysiology 75, no. 5 (May 1, 1996): 1843–57. http://dx.doi.org/10.1152/jn.1996.75.5.1843.

Abstract:
1. Although a representation of multisensory space is contained in the superior colliculus, little is known about the spatial requirements of multisensory stimuli that influence the activity of neurons here. Critical to this problem is an assessment of the registry of the different receptive fields within individual multisensory neurons. The present study was initiated to determine how closely the receptive fields of individual multisensory neurons are aligned, the physiological role of that alignment, and the possible functional consequences of inducing receptive-field misalignment. 2. Individual multisensory neurons in the superior colliculus of anesthetized, paralyzed cats were studied with the use of standard extracellular recording techniques. The receptive fields of multisensory neurons were large, as reported previously, but exhibited a surprisingly high degree of spatial coincidence. The average proportion of receptive-field overlap was 86% for the population of visual-auditory neurons sampled. 3. Because of this high degree of intersensory receptive-field correspondence, combined-modality stimuli that were coincident in space tended to fall within the excitatory regions of the receptive fields involved. The result was a significantly enhanced neuronal response in 88% of the multisensory neurons studied. If stimuli were spatially disparate, so that one fell outside its receptive field, either a decreased response occurred (56%), or no intersensory effect was apparent (44%). 4. The normal alignment of the different receptive fields of a multisensory neuron could be disrupted by passively displacing the eyes, pinnae, or limbs/body. In no case was a shift in location or size observed in a neuron's other receptive field(s) to compensate for this displacement. The physiological result of receptive-field misalignment was predictable and based on the location of the stimuli relative to the new positions of their respective receptive fields. 
Now, for example, one component of a spatially coincident pair of stimuli might fall outside its receptive field and inhibit the other's effects. 5. These data underscore the dependence of multisensory integrative responses on the relationship of the different stimuli to their corresponding receptive fields rather than to the spatial relationship of the stimuli to one another. Apparently, the alignment of different receptive fields for individual multisensory neurons ensures that responses to combinations of stimuli derived from the same event are integrated to increase the salience of that event. Therefore the maintenance of receptive-field alignment is critical for the appropriate integration of converging sensory signals and, ultimately, elicitation of adaptive behaviors.
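The spatial rule summarized in points 3 and 4 can be caricatured in a few lines: the sign of the interaction depends on whether each stimulus falls inside its own (excitatory) receptive field, not on the distance between the stimuli themselves. A toy sketch, with illustrative (not recorded) receptive-field bounds in degrees:

```python
def interaction(vis_pos, aud_pos, vis_rf=(0, 40), aud_rf=(5, 45)):
    """Predict the multisensory interaction from whether each stimulus
    lies within its own excitatory receptive field (RF).
    Positions and RF bounds are in degrees and purely illustrative."""
    in_vis = vis_rf[0] <= vis_pos <= vis_rf[1]
    in_aud = aud_rf[0] <= aud_pos <= aud_rf[1]
    if in_vis and in_aud:
        return "enhancement"          # both stimuli within their RFs
    if in_vis or in_aud:
        return "depression or none"   # one stimulus outside its RF
    return "no drive"

# Spatially coincident cues inside both RFs -> enhanced response;
# a disparate auditory cue outside its RF can suppress the visual one.
print(interaction(20, 20))   # enhancement
print(interaction(20, 80))   # depression or none
```

This also captures the displacement result: passively shifting one RF changes `vis_rf` or `aud_rf` without any compensatory shift in the other, so even spatially coincident stimuli can land in the "one outside its RF" branch.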
6

Carriere, Brian N., David W. Royal, and Mark T. Wallace. "Spatial Heterogeneity of Cortical Receptive Fields and Its Impact on Multisensory Interactions." Journal of Neurophysiology 99, no. 5 (May 2008): 2357–68. http://dx.doi.org/10.1152/jn.01386.2007.

Abstract:
Investigations of multisensory processing at the level of the single neuron have illustrated the importance of the spatial and temporal relationship of the paired stimuli and their relative effectiveness in determining the product of the resultant interaction. Although these principles provide a good first-order description of the interactive process, they were derived by treating space, time, and effectiveness as independent factors. In the anterior ectosylvian sulcus (AES) of the cat, previous work hinted that the spatial receptive field (SRF) architecture of multisensory neurons might play an important role in multisensory processing due to differences in the vigor of responses to identical stimuli placed at different locations within the SRF. In this study the impact of SRF architecture on cortical multisensory processing was investigated using semichronic single-unit electrophysiological experiments targeting a multisensory domain of the cat AES. The visual and auditory SRFs of AES multisensory neurons exhibited striking response heterogeneity, with SRF architecture appearing to play a major role in the multisensory interactions. The deterministic role of SRF architecture was tightly coupled to the manner in which stimulus location modulated the responsiveness of the neuron. Thus multisensory stimulus combinations at weakly effective locations within the SRF resulted in large (often superadditive) response enhancements, whereas combinations at more effective spatial locations resulted in smaller (additive/subadditive) interactions. These results provide important insights into the spatial organization and processing capabilities of cortical multisensory neurons, features that may provide important clues as to the functional roles played by this area in spatially directed perceptual processes.
7

Jiang, Wan, and Barry E. Stein. "Cortex Controls Multisensory Depression in Superior Colliculus." Journal of Neurophysiology 90, no. 4 (October 2003): 2123–35. http://dx.doi.org/10.1152/jn.00369.2003.

Abstract:
Multisensory depression is a fundamental index of multisensory integration in superior colliculus (SC) neurons. It is initiated when one sensory stimulus (auditory) located outside its modality-specific receptive field degrades or eliminates the neuron's responses to another sensory stimulus (visual) presented within its modality-specific receptive field. The present experiments demonstrate that the capacity of SC neurons to engage in multisensory depression is strongly dependent on influences from two cortical areas (the anterior ectosylvian and rostral lateral suprasylvian sulci). When these cortices are deactivated, the ability of SC neurons to synthesize visual-auditory inputs in this way is compromised; multisensory responses are disinhibited, becoming more vigorous and in some cases indistinguishable from responses to the visual stimulus alone. Although obtaining a more robust multisensory SC response when cortex is nonfunctional than when it is functional may seem paradoxical, these data may help explain previous observations that the loss of these cortical influences permits visual orientation behavior in the presence of a normally disruptive auditory stimulus.
8

Perrault, Thomas J., J. William Vaughan, Barry E. Stein, and Mark T. Wallace. "Neuron-Specific Response Characteristics Predict the Magnitude of Multisensory Integration." Journal of Neurophysiology 90, no. 6 (December 2003): 4022–26. http://dx.doi.org/10.1152/jn.00494.2003.

Abstract:
Multisensory neurons in the superior colliculus (SC) typically respond to combinations of stimuli from multiple modalities with enhancements and/or depressions in their activity. Although such changes in response have been shown to follow a predictive set of integrative principles, these principles fail to completely account for the full range of interactions seen throughout the SC population. In an effort to better define this variability, we sought to determine if there were additional features of the neuronal response profile that were predictive of the magnitude of the multisensory interaction. To do this, we recorded from 109 visual-auditory SC neurons while systematically manipulating stimulus intensity. Along with the previously described roles of space, time, and stimulus effectiveness, two features of a neuron's response profile were found to offer predictive value as to the magnitude of the multisensory interaction: spontaneous activity and the level of sensory responsiveness. Multisensory neurons with little or no spontaneous activity and weak sensory responses had the capacity to exhibit large response enhancements. Conversely, neurons with modest spontaneous activity and robust sensory responses exhibited relatively small response enhancements. Together, these results provide a better view into multisensory integration, and suggest substantial heterogeneity in the integrative characteristics of the multisensory SC population.
9

Wallace, M. T., M. A. Meredith, and B. E. Stein. "Converging influences from visual, auditory, and somatosensory cortices onto output neurons of the superior colliculus." Journal of Neurophysiology 69, no. 6 (June 1, 1993): 1797–809. http://dx.doi.org/10.1152/jn.1993.69.6.1797.

Abstract:
1. Physiological methods were used to examine the pattern of inputs from different sensory cortices onto individual superior colliculus neurons. 2. Visual, auditory, and somatosensory influences from anterior ectosylvian sulcus (AES) and visual influences from lateral suprasylvian (LS) cortex were found to converge onto individual multisensory neurons in the cat superior colliculus. An excellent topographic relationship was found between the different sensory cortices and their target neurons in the superior colliculus. 3. Corticotectal inputs were derived solely from unimodal neurons. Multisensory neurons in AES and LS were not antidromically activated from the superior colliculus. 4. Orthodromic and antidromic latencies were consistent with monosynaptic corticotectal inputs arising from LS and the three subdivisions of AES (SIV, Field AES, and AEV). 5. Superior colliculus neurons that received convergent cortical inputs formed a principal component of the tecto-reticulospinal tract. Thus there appears to be extensive cortical control over the output neurons through which the superior colliculus mediates attentive and orientation behaviors. 6. Two other multisensory circuits were identified. A population of multisensory superior colliculus neurons was found, which neither received convergent cortical input nor projected into the tecto-reticulo-spinal tract. In addition, multisensory neurons in AES and LS proved to be independent of the superior colliculus (i.e., they were not corticotectal). While it is likely that these three distinct multisensory neural circuits have different functional roles, their constituent neurons appear to integrate their various sensory inputs in much the same way.
10

Wallace, M. T., L. K. Wilkinson, and B. E. Stein. "Representation and integration of multiple sensory inputs in primate superior colliculus." Journal of Neurophysiology 76, no. 2 (August 1, 1996): 1246–66. http://dx.doi.org/10.1152/jn.1996.76.2.1246.

Abstract:
1. The properties of visual-, auditory-, and somatosensory-responsive neurons, as well as of neurons responsive to multiple sensory cues (i.e., multisensory), were examined in the superior colliculus of the rhesus monkey. Although superficial layer neurons responded exclusively to visual stimuli and visual inputs predominated in deeper layers, there was also a rich nonvisual and multisensory representation in the superior colliculus. More than a quarter (27.8%) of the deep layer population responded to stimuli from more than a single sensory modality. In contrast, 37% responded only to visual cues, 17.6% to auditory cues, and 17.6% to somatosensory cues. Unimodal- and multisensory-responsive neurons were clustered by modality. Each of these modalities was represented in map-like fashion, and the different representations were in alignment with one another. 2. Most deep layer visually responsive neurons were binocular and exhibited poor selectivity for such stimulus characteristics as orientation, velocity, and direction of movement. Similarly, most auditory-responsive neurons had contralateral receptive fields and were binaural, but had little frequency selectivity and preferred complex, broad-band sounds. Somatosensory-responsive neurons were overwhelmingly contralateral, high velocity, and rapidly adapting. Only rarely did somatosensory-responsive neurons require distortion of subcutaneous tissue for activation. 3. The spatial congruence among the different receptive fields of multisensory neurons was a critical feature underlying their ability to synthesize cross-modal information. 4. Combinations of stimuli could have very different consequences in the same neuron, depending on their temporal and spatial relationships. Generally, multisensory interactions were evident when pairs of stimuli were separated from one another by < 500 ms, and the products of these interactions far exceeded the sum of their unimodal components. 
Whether the combination of stimuli produced response enhancement, response depression, or no interaction depended on the location of the stimuli relative to one another and to their respective receptive fields. Maximal response enhancements were observed when stimuli originated from similar locations in space (as when derived from the same event) because they fell within the excitatory receptive fields of the same multisensory neurons. If, however, the stimuli were spatially disparate such that one fell beyond the excitatory borders of its receptive field, either no interaction was produced or this stimulus depressed the effectiveness of the other. Furthermore, maximal response interactions were seen with the pairing of weakly effective unimodal stimuli. As the individual unimodal stimuli became increasingly effective, the levels of response enhancement to stimulus combinations declined, a principle referred to as inverse effectiveness. Many of the integrative principles seen here in the primate superior colliculus are strikingly similar to those observed in the cat. These observations indicate that a set of common principles of multisensory integration is adaptable in widely divergent species living in very different ecological situations. 5. Surprisingly, a few multisensory neurons had individual receptive fields that were not in register with one another. This has not been noted in multisensory neurons of other species, and these "anomalous" receptive fields could present a daunting problem: stimuli originating from the same general location in space cannot simultaneously fall within their respective receptive fields, a stimulus pairing that may result in response depression. Conversely, stimuli that originate from separate events and disparate locations (and fall within their receptive fields) may result in response enhancement. However, the spatial principle of multisensory integration did not apply in these cases. (ABSTRACT TRUNCATED)
11

Boulant, J. A., and N. L. Silva. "Multisensory Hypothalamic Neurons May Explain Interactions Among Regulatory Systems." Physiology 4, no. 6 (December 1, 1989): 245–48. http://dx.doi.org/10.1152/physiologyonline.1989.4.6.245.

Abstract:
Homeostatic systems are controlled by hypothalamic neurons that sense endogenous factors, including temperature, osmolality, glucose, and reproductive hormones. The concept of "functional specificity" implies that each neuron senses only one factor, but in vitro studies show that a neuron can respond to multiple factors. This may explain many interactions observed between regulatory systems.
12

Ghose, Dipanwita, Zachary P. Barnett, and Mark T. Wallace. "Impact of response duration on multisensory integration." Journal of Neurophysiology 108, no. 9 (November 1, 2012): 2534–44. http://dx.doi.org/10.1152/jn.00286.2012.

Abstract:
Multisensory neurons in the superior colliculus (SC) have been shown to have large receptive fields that are heterogeneous in nature. These neurons have the capacity to integrate their different sensory inputs, a process that has been shown to depend on the physical characteristics of the stimuli that are combined (i.e., spatial and temporal relationship and relative effectiveness). Recent work has highlighted the interdependence of these factors in driving multisensory integration, adding a layer of complexity to our understanding of multisensory processes. In the present study our goal was to add to this understanding by characterizing how stimulus location impacts the temporal dynamics of multisensory responses in cat SC neurons. The results illustrate that locations within the spatial receptive fields (SRFs) of these neurons can be divided into those showing short-duration responses and long-duration response profiles. Most importantly, discharge duration appears to be a good determinant of multisensory integration, such that short-duration responses are typically associated with a high magnitude of multisensory integration (i.e., superadditive responses) while long-duration responses are typically associated with low integrative capacity. These results further reinforce the complexity of the integrative features of SC neurons and show that the large SRFs of these neurons are characterized by vastly differing temporal dynamics, dynamics that strongly shape the integrative capacity of these neurons.
13

Kurela, LeAnne R., and Mark T. Wallace. "Serotonergic Modulation of Sensory and Multisensory Processing in Superior Colliculus." Multisensory Research 30, no. 2 (2017): 121–58. http://dx.doi.org/10.1163/22134808-00002552.

Abstract:
The ability to integrate information across the senses is vital for coherent perception of and interaction with the world. While much is known regarding the organization and function of multisensory neurons within the mammalian superior colliculus (SC), very little is understood at a mechanistic level. One open question in this regard is the role of neuromodulatory networks in shaping multisensory responses. While the SC receives substantial serotonergic projections from the raphe nuclei, and serotonergic receptors are distributed throughout the SC, the potential role of serotonin (5-HT) signaling in multisensory function is poorly understood. To begin to fill this knowledge void, the current study provides physiological evidence for the influences of 5-HT signaling on auditory, visual and audiovisual responses of individual neurons in the intermediate and deep layers of the SC, with a focus on the 5HT2a receptor. Using single-unit extracellular recordings in combination with pharmacological methods, we demonstrate that alterations in 5HT2a receptor signaling change receptive field (RF) architecture as well as responsivity and integrative abilities of SC neurons when assessed at the level of the single neuron. In contrast, little change was seen in the local field potential (LFP). These results are the first to implicate the serotonergic system in multisensory processing, and are an important step to understanding how modulatory networks mediate multisensory integration in the SC.
14

Jiang, Wan, Huai Jiang, and Barry E. Stein. "Neonatal Cortical Ablation Disrupts Multisensory Development in Superior Colliculus." Journal of Neurophysiology 95, no. 3 (March 2006): 1380–96. http://dx.doi.org/10.1152/jn.00880.2005.

Abstract:
The ability of cat superior colliculus (SC) neurons to synthesize information from different senses depends on influences from two areas of the cortex: the anterior ectosylvian sulcus (AES) and the rostral lateral suprasylvian sulcus (rLS). Reversibly deactivating the inputs to the SC from either of these areas in normal adults severely compromises this ability and the SC-mediated behaviors that depend on it. In this study, we found that removal of these areas in neonatal animals precluded the normal development of multisensory SC processes. At maturity there was a substantial decrease in the incidence of multisensory neurons, and those multisensory neurons that did develop were highly abnormal. Their cross-modal receptive field register was severely compromised, as was their ability to integrate cross-modal stimuli. Apparently, despite the impressive plasticity of the neonatal brain, it cannot compensate for the early loss of these cortices. Surprisingly, however, neonatal removal of either AES or rLS had comparatively minor consequences on these properties. At maturity multisensory SC neurons were quite common: they developed the characteristic spatial register among their unisensory receptive fields and exhibited normal adult-like multisensory integration. These observations suggest that during early ontogeny, when the multisensory properties of SC neurons are being crafted, AES and rLS may have the ability to compensate for the loss of one another's cortico-collicular influences so that normal multisensory processes can develop in the SC.
15

Stein, B. E. "Integration of Sensory Information in the Brain." Perception 26, no. 1_suppl (August 1997): 369. http://dx.doi.org/10.1068/v970009.

Abstract:
That sensory cues in one modality affect perception in another has been known for some time, and there are many examples of ‘intersensory’ influences within the broad phenomenon of cross-modal integration. The ability of the CNS to integrate cues from different sensory channels is particularly evident in the facilitated detection and reaction to combinations of concordant cues from different modalities, and in the dramatic perceptual anomalies that can occur when these cues are discordant. A substrate for multisensory integration is provided by the many CNS neurons (eg, in the superior colliculus) which receive convergent input from multiple sensory modalities. Similarities in the principles by which these neurons integrate multisensory information in different species point to a remarkable conservation in the integrative features of the CNS during vertebrate evolution. In general, profound enhancement or depression in neural activity can be induced in the same neuron, depending on the spatial and temporal relationships among the stimuli presented to it. The specific response product obtained in any given multisensory neuron is predictable on the basis of the features of its various receptive fields. Perhaps most striking, however, is the parallel which has been demonstrated between the properties of multisensory integration at the level of the single neuron in the superior colliculus and at the level of overt attentive and orientation behaviour.
16

Xu, J., X. Sun, X. Zhou, J. Zhang, and L. Yu. "The cortical distribution of multisensory neurons was modulated by multisensory experience." Neuroscience 272 (July 2014): 1–9. http://dx.doi.org/10.1016/j.neuroscience.2014.04.068.

17

Wallace, Mark T., and Barry E. Stein. "Development of Multisensory Neurons and Multisensory Integration in Cat Superior Colliculus." Journal of Neuroscience 17, no. 7 (April 1, 1997): 2429–44. http://dx.doi.org/10.1523/jneurosci.17-07-02429.1997.

18

Morrow, Jeremiah, Clayton Mosher, and Katalin Gothard. "Multisensory Neurons in the Primate Amygdala." Journal of Neuroscience 39, no. 19 (March 11, 2019): 3663–75. http://dx.doi.org/10.1523/jneurosci.2903-18.2019.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Rideaux, Reuben, Katherine R. Storrs, Guido Maiello, and Andrew E. Welchman. "How multisensory neurons solve causal inference." Proceedings of the National Academy of Sciences 118, no. 32 (August 4, 2021): e2106235118. http://dx.doi.org/10.1073/pnas.2106235118.

Full text
Abstract:
Sitting in a static railway carriage can produce illusory self-motion if the train on an adjoining track moves off. While our visual system registers motion, vestibular signals indicate that we are stationary. The brain is faced with a difficult challenge: is there a single cause of sensations (I am moving) or two causes (I am static, another train is moving)? If a single cause, integrating signals produces a more precise estimate of self-motion, but if not, one cue should be ignored. In many cases, this process of causal inference works without error, but how does the brain achieve it? Electrophysiological recordings show that the macaque medial superior temporal area contains many neurons that encode combinations of vestibular and visual motion cues. Some respond best to vestibular and visual motion in the same direction (“congruent” neurons), while others prefer opposing directions (“opposite” neurons). Congruent neurons could underlie cue integration, but the function of opposite neurons remains a puzzle. Here, we seek to explain this computational arrangement by training a neural network model to solve causal inference for motion estimation. Like biological systems, the model develops congruent and opposite units and recapitulates known behavioral and neurophysiological observations. We show that all units (both congruent and opposite) contribute to motion estimation. Importantly, however, it is the balance between their activity that distinguishes whether visual and vestibular cues should be integrated or separated. This explains the computational purpose of puzzling neural representations and shows how a relatively simple feedforward network can solve causal inference.
APA, Harvard, Vancouver, ISO, and other styles
20

Wallace, M. T., and B. E. Stein. "Onset of Cross-Modal Synthesis in the Neonatal Superior Colliculus is Gated by the Development of Cortical Influences." Journal of Neurophysiology 83, no. 6 (June 1, 2000): 3578–82. http://dx.doi.org/10.1152/jn.2000.83.6.3578.

Full text
Abstract:
Many neurons in the superior colliculus (SC) are able to integrate combinations of visual, auditory, and somatosensory stimuli, thereby markedly affecting the vigor of their responses to external stimuli. However, this capacity for multisensory integration is not inborn. Rather, it appears comparatively late in postnatal development and is not expressed until the SC passes through several distinct developmental stages. As shown here, the final stage in this sequence is one in which a region of association cortex establishes functional control over the SC, thus enabling the multisensory integrative capabilities of its target SC neurons. The first example of this corticotectal input was seen at postnatal day 28. For any individual SC neuron, the onset of corticotectal influences appeared to be abrupt. Because this event occurred at very different times for different SC neurons, a period of 3–4 postnatal months was required before the adult-like condition was achieved. The protracted postnatal period required for the maturation of these corticotectal influences corresponded closely with estimates of the peak period of cortical plasticity, raising the possibility that the genesis of these corticotectal influences, and hence the onset of SC multisensory integration, occurs only after the cortex is capable of exerting experience-dependent control over SC neurons.
APA, Harvard, Vancouver, ISO, and other styles
21

Anastasio, Thomas J., Paul E. Patton, and Kamel Belkacem-Boussaid. "Using Bayes' Rule to Model Multisensory Enhancement in the Superior Colliculus." Neural Computation 12, no. 5 (May 1, 2000): 1165–87. http://dx.doi.org/10.1162/089976600300015547.

Full text
Abstract:
The deep layers of the superior colliculus (SC) integrate multisensory inputs and initiate an orienting response toward the source of stimulation (target). Multisensory enhancement, which occurs in the deep SC, is the augmentation of a neural response to sensory input of one modality by input of another modality. Multisensory enhancement appears to underlie the behavioral observation that an animal is more likely to orient toward weak stimuli if a stimulus of one modality is paired with a stimulus of another modality. Yet not all deep SC neurons are multisensory. Those that are exhibit the property of inverse effectiveness: combinations of weaker unimodal responses produce larger amounts of enhancement. We show that these neurophysiological findings support the hypothesis that deep SC neurons use their sensory inputs to compute the probability that a target is present. We model multimodal sensory inputs to the deep SC as random variables and cast the computation in terms of Bayes' rule. Our analysis suggests that multisensory deep SC neurons are those that combine unimodal inputs that would be more uncertain by themselves. It also suggests that inverse effectiveness results because the increase in target probability due to the integration of multisensory inputs is larger when the unimodal responses are weaker.
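The Bayesian framing in the abstract above can be sketched concretely. The following is a minimal illustration (not the authors' model; all function names and likelihood values are invented) of casting target detection as Bayes' rule over two conditionally independent unimodal inputs, showing how inverse effectiveness falls out: weak unimodal cues gain proportionally more from combination than strong ones.

```python
# Hedged sketch: P(target | sensory inputs) via Bayes' rule, assuming the
# visual and auditory inputs are conditionally independent given target state.
# Likelihood values are illustrative, not taken from the paper.

def unimodal_posterior(p_given_t, p_given_not, prior=0.5):
    """P(target | one modality's input)."""
    num = p_given_t * prior
    return num / (num + p_given_not * (1 - prior))

def bimodal_posterior(pv_t, pa_t, pv_not, pa_not, prior=0.5):
    """P(target | visual and auditory inputs), conditional independence assumed."""
    num = pv_t * pa_t * prior
    return num / (num + pv_not * pa_not * (1 - prior))

# Inverse effectiveness: combining WEAK (uncertain) cues boosts the posterior
# proportionally more than combining STRONG (already decisive) cues.
weak_gain = bimodal_posterior(0.3, 0.3, 0.2, 0.2) / unimodal_posterior(0.3, 0.2)
strong_gain = bimodal_posterior(0.9, 0.9, 0.1, 0.1) / unimodal_posterior(0.9, 0.1)
print(weak_gain > strong_gain)  # → True
```

With these illustrative numbers the weak-cue pairing raises the posterior from 0.60 to about 0.69 (a 15% relative gain), while the strong-cue pairing moves it only from 0.90 to about 0.99 (a 10% relative gain), mirroring the enhancement pattern the abstract describes.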
APA, Harvard, Vancouver, ISO, and other styles
22

Carriere, Brian N., David W. Royal, Thomas J. Perrault, Stephen P. Morrison, J. William Vaughan, Barry E. Stein, and Mark T. Wallace. "Visual Deprivation Alters the Development of Cortical Multisensory Integration." Journal of Neurophysiology 98, no. 5 (November 2007): 2858–67. http://dx.doi.org/10.1152/jn.00587.2007.

Full text
Abstract:
It has recently been demonstrated that the maturation of normal multisensory circuits in the cortex of the cat takes place over an extended period of postnatal life. Such a finding suggests that the sensory experiences received during this time may play an important role in this developmental process. To test the necessity of sensory experience for normal cortical multisensory development, cats were raised in the absence of visual experience from birth until adulthood, effectively precluding all visual and visual–nonvisual multisensory experiences. As adults, semichronic single-unit recording experiments targeting the anterior ectosylvian sulcus (AES), a well-defined multisensory cortical area in the cat, were initiated and continued at weekly intervals in anesthetized animals. Despite having very little impact on the overall sensory representations in AES, dark-rearing had a substantial impact on the integrative capabilities of multisensory AES neurons. A significant increase was seen in the proportion of multisensory neurons that were modulated by, rather than driven by, a second sensory modality. More important, perhaps, there was a dramatic shift in the percentage of these modulated neurons in which the pairing of weakly effective and spatially and temporally coincident stimuli resulted in response depressions. In normally reared animals such combinations typically give rise to robust response enhancements. These results illustrate the important role sensory experience plays in shaping the development of mature multisensory cortical circuits and suggest that dark-rearing shifts the relative balance of excitation and inhibition in these circuits.
APA, Harvard, Vancouver, ISO, and other styles
23

Zahar, Yael, Amit Reches, and Yoram Gutfreund. "Multisensory Enhancement in the Optic Tectum of the Barn Owl: Spike Count and Spike Timing." Journal of Neurophysiology 101, no. 5 (May 2009): 2380–94. http://dx.doi.org/10.1152/jn.91193.2008.

Full text
Abstract:
Temporal and spatial correlations between auditory and visual stimuli facilitate the perception of unitary events and improve behavioral responses. However, it is not clear how combined visual and auditory information is processed in single neurons. Here we studied responses of multisensory neurons in the barn owl's optic tectum (the avian homologue of the superior colliculus) to visual, auditory, and bimodal stimuli. We specifically focused on responses to sequences of repeated stimuli. We first report that bimodal stimulation tends to elicit more spikes than its unimodal components do (a phenomenon known as multisensory enhancement). However, this tendency was found to be history-dependent; multisensory enhancement was mostly apparent in the first stimulus of the sequence and to a much lesser extent in the subsequent stimuli. Next, a vector-strength analysis was applied to quantify the phase locking of the responses to the stimuli. We report that in a substantial number of multisensory neurons, responses to sequences of bimodal stimuli elicited spike trains that were better phase locked to the stimulus than spike trains elicited by stimulating with the unimodal counterparts (visual or auditory). We conclude that multisensory enhancement can be manifested in better phase locking to the stimulus as well as in more spikes.
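The vector-strength analysis mentioned in the abstract above is a standard measure of phase locking (after Goldberg and Brown). The sketch below is a generic implementation, not the authors' code, with invented spike times for illustration.

```python
import math

def vector_strength(spike_times, period):
    """Phase locking of spikes to a periodic stimulus:
    1.0 = every spike at the same phase, ~0 = phases spread uniformly."""
    phases = [2 * math.pi * (t % period) / period for t in spike_times]
    x = sum(math.cos(p) for p in phases)
    y = sum(math.sin(p) for p in phases)
    return math.hypot(x, y) / len(spike_times)

locked = vector_strength([0.02, 1.02, 2.02, 3.02], period=1.0)  # same phase each cycle
spread = vector_strength([0.00, 0.25, 0.50, 0.75], period=1.0)  # uniform phases
print(locked > spread)  # → True: tighter locking gives higher vector strength
```

On this measure, the paper's finding is that bimodal spike trains yielded higher vector strength (tighter locking to stimulus timing) than their unimodal counterparts, even when spike counts were similar.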
APA, Harvard, Vancouver, ISO, and other styles
24

Meng, Hui, and Dora E. Angelaki. "Responses of Ventral Posterior Thalamus Neurons to Three-Dimensional Vestibular and Optic Flow Stimulation." Journal of Neurophysiology 103, no. 2 (February 2010): 817–26. http://dx.doi.org/10.1152/jn.00729.2009.

Full text
Abstract:
Multisensory neurons tuned to both vestibular and visual motion (optic flow) signals are found in several cortical areas in the dorsal visual stream. Here we examine whether such convergence occurs subcortically in the macaque thalamus. We searched the ventral posterior nuclei, including the anterior pulvinar, as well as the ventro-lateral and ventral posterior lateral nuclei, areas that receive vestibular signals from brain stem and deep cerebellar nuclei. Approximately a quarter of cells responded to three-dimensional (3D) translational and/or rotational motion. More than half of the responsive cells were convergent and thus responded during both rotation and translation. The preferred axes of translation/rotation were distributed throughout 3D space. The majority of the neurons were excited, but some were inhibited, during rotation/translation in darkness. Only a couple of neurons were multisensory, being tuned to both vestibular and optic flow stimuli. We conclude that multisensory vestibular/optic flow neurons, which are commonly found in cortical visual and visuomotor areas, are rare in the ventral posterior thalamus.
APA, Harvard, Vancouver, ISO, and other styles
25

Pratt, Kara G., and Carlos D. Aizenman. "Multisensory Integration in Mesencephalic Trigeminal Neurons in Xenopus Tadpoles." Journal of Neurophysiology 102, no. 1 (July 2009): 399–412. http://dx.doi.org/10.1152/jn.91317.2008.

Full text
Abstract:
Mesencephalic trigeminal (M-V) neurons are primary somatosensory neurons with somata located within the CNS, instead of in peripheral sensory ganglia. In amphibians, these unipolar cells are found within the optic tectum and have a single axon that runs along the mandibular branch of the trigeminal nerve. The axon has collaterals in the brain stem and is believed to make synaptic contact with neurons in the trigeminal motor nucleus, forming part of a sensorimotor loop. The number of M-V neurons is known to increase until metamorphosis and then decrease, suggesting that at least some M-V neurons may play a transient role during tadpole development. It is not known whether their location in the optic tectum allows them to process both visual and somatosensory information. Here we compare the anatomical and electrophysiological properties of M-V neurons in the Xenopus tadpole to principal tectal neurons. We find that, unlike principal tectal cells, M-V neurons can sustain repetitive spiking when depolarized and express a significant H-type current. M-V neurons could also be driven synaptically by visual input both in vitro and in vivo, but visual responses were smaller and longer-lasting than those seen in principal tectal neurons. We also found that the axon of M-V neurons appears to directly innervate a tentacle found in the corner of the mouth of premetamorphic tadpoles. Electrical stimulation of this transient sensory organ results in antidromic spiking in M-V neurons in the tectum. Thus M-V neurons may play an integrative multisensory role during tadpole development.
APA, Harvard, Vancouver, ISO, and other styles
26

Burnett, Luke R., Barry E. Stein, Thomas J. Perrault, and Mark T. Wallace. "Excitotoxic lesions of the superior colliculus preferentially impact multisensory neurons and multisensory integration." Experimental Brain Research 179, no. 2 (December 5, 2006): 325–38. http://dx.doi.org/10.1007/s00221-006-0789-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Krueger, Juliane, David W. Royal, Matthew C. Fister, and Mark T. Wallace. "Spatial receptive field organization of multisensory neurons and its impact on multisensory interactions." Hearing Research 258, no. 1-2 (December 2009): 47–54. http://dx.doi.org/10.1016/j.heares.2009.08.003.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Mao, Yu-Ting, Tian-Miao Hua, and Sarah L. Pallas. "Competition and convergence between auditory and cross-modal visual inputs to primary auditory cortical areas." Journal of Neurophysiology 105, no. 4 (April 2011): 1558–73. http://dx.doi.org/10.1152/jn.00407.2010.

Full text
Abstract:
Sensory neocortex is capable of considerable plasticity after sensory deprivation or damage to input pathways, especially early in development. Although plasticity can often be restorative, sometimes novel, ectopic inputs invade the affected cortical area. Invading inputs from other sensory modalities may compromise the original function or even take over, imposing a new function and preventing recovery. Using ferrets whose retinal axons were rerouted into auditory thalamus at birth, we were able to examine the effect of varying the degree of ectopic, cross-modal input on reorganization of developing auditory cortex. In particular, we assayed whether the invading visual inputs and the existing auditory inputs competed for or shared postsynaptic targets and whether the convergence of input modalities would induce multisensory processing. We demonstrate that although the cross-modal inputs create new visual neurons in auditory cortex, some auditory processing remains. The degree of damage to auditory input to the medial geniculate nucleus was directly related to the proportion of visual neurons in auditory cortex, suggesting that the visual and residual auditory inputs compete for cortical territory. Visual neurons were not segregated from auditory neurons but shared target space even on individual target cells, substantially increasing the proportion of multisensory neurons. Thus spatial convergence of visual and auditory input modalities may be sufficient to expand multisensory representations. Together these findings argue that early, patterned visual activity does not drive segregation of visual and auditory afferents and suggest that auditory function might be compromised by converging visual inputs. These results indicate possible ways in which multisensory cortical areas may form during development and evolution. They also suggest that rehabilitative strategies designed to promote recovery of function after sensory deprivation or damage need to take into account that sensory cortex may become substantially more multisensory after alteration of its input during development.
APA, Harvard, Vancouver, ISO, and other styles
29

Wallace, Mark T., M. Alex Meredith, and Barry E. Stein. "Multisensory Integration in the Superior Colliculus of the Alert Cat." Journal of Neurophysiology 80, no. 2 (August 1, 1998): 1006–10. http://dx.doi.org/10.1152/jn.1998.80.2.1006.

Full text
Abstract:
The modality convergence patterns, sensory response properties, and principles governing multisensory integration in the superior colliculus (SC) of the alert cat were found to have fundamental similarities to those in anesthetized animals. Of particular interest was the observation that, in a manner indistinguishable from the anesthetized animal, combinations of two different sensory stimuli significantly enhanced the responses of SC neurons above those evoked by either unimodal stimulus. These observations are consistent with the speculation that there is a functional link between multisensory integration in individual SC neurons and cross-modality attentive and orientation behaviors.
APA, Harvard, Vancouver, ISO, and other styles
30

Sarko, Diana K., and Dipanwita Ghose. "Developmental plasticity of multisensory circuitry: how early experience dictates cross-modal interactions." Journal of Neurophysiology 108, no. 11 (December 1, 2012): 2863–66. http://dx.doi.org/10.1152/jn.00383.2012.

Full text
Abstract:
Normal sensory experience is necessary for the development of multisensory processing, such that disruption through environmental manipulations eliminates or alters multisensory integration. In this Neuro Forum, we examine the recent paper by Xu et al. (J Neurosci 32: 2287–2298, 2012), which proposes that the statistics of cross-modal stimuli encountered early in life might be a driving factor for the development of normal multisensory integrative abilities in superior colliculus neurons. We present additional interpretations of their analyses as well as future directions and translational implications of this study for understanding the neural substrates and plasticity inherent to multisensory processing.
APA, Harvard, Vancouver, ISO, and other styles
31

Wallace, M. T., and B. E. Stein. "Cross-modal synthesis in the midbrain depends on input from cortex." Journal of Neurophysiology 71, no. 1 (January 1, 1994): 429–32. http://dx.doi.org/10.1152/jn.1994.71.1.429.

Full text
Abstract:
1. The synthesis of information from different sensory modalities in the superior colliculus is an important precursor of attentive and orientation behavior. 2. This integration of multisensory information is critically dependent on inputs from a small area of association cortex, the anterior ectosylvian sulcus. Removal of these corticotectal influences can have a remarkably specific effect: it can eliminate multisensory integration in superior colliculus neurons while leaving their responses to unimodal cues intact. 3. Apparently, some of the associative functions of cortex are accomplished via its target neurons in the midbrain.
APA, Harvard, Vancouver, ISO, and other styles
32

Rowland, Benjamin A., Terrence R. Stanford, and Barry E. Stein. "A Model of the Neural Mechanisms Underlying Multisensory Integration in the Superior Colliculus." Perception 36, no. 10 (October 2007): 1431–43. http://dx.doi.org/10.1068/p5842.

Full text
Abstract:
Much of the information about multisensory integration is derived from studies of the cat superior colliculus (SC), a midbrain structure involved in orientation behaviors. This integration is apparent in the enhanced responses of SC neurons to cross-modal stimuli, responses that exceed those to any of the modality-specific component stimuli. The simplest model of multisensory integration is one in which the SC neuron simply sums its various sensory inputs. However, a number of empirical findings reveal the inadequacy of such a model; for example, the finding that deactivation of cortico-collicular inputs eliminates the enhanced response to a cross-modal stimulus without eliminating responses to the modality-specific component stimuli. These and other empirical findings inform a computational model that accounts for all of the most fundamental aspects of SC multisensory integration. The model is presented in two forms: an algebraic form that conveys the essential insights, and a compartmental form that represents the neuronal computations in a more biologically realistic way.
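As a deliberately simplified caricature of the algebraic form described in the abstract above (emphatically not the paper's compartmental model; the gain value and function names are invented), one can gate a superadditive cross-modal term on cortico-collicular input. This reproduces the key empirical dissociation the model must capture: cortical deactivation abolishes multisensory enhancement while unisensory responses survive.

```python
def sc_response(visual, auditory, cortex_active=True):
    """Toy SC neuron: ascending sensory inputs sum, and an illustrative
    multiplicative cortical gain applies only to cross-modal combinations."""
    base = visual + auditory
    cross_modal = visual > 0 and auditory > 0
    gain = 1.5 if (cortex_active and cross_modal) else 1.0  # 1.5 is arbitrary
    return base * gain

print(sc_response(5, 0))                        # unisensory response: 5
print(sc_response(5, 4))                        # enhanced multisensory: 13.5 > 5 + 4
print(sc_response(5, 4, cortex_active=False))   # cortex deactivated: additive 9
```

The point of the toy is only the dissociation: with the cortical gain removed, the cross-modal response collapses to the sum of its components, while each modality-specific response is unchanged, matching the deactivation finding cited in the abstract.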
APA, Harvard, Vancouver, ISO, and other styles
33

Busch, Silas E., and Arseny S. Khakhalin. "Intrinsic temporal tuning of neurons in the optic tectum is shaped by multisensory experience." Journal of Neurophysiology 122, no. 3 (September 1, 2019): 1084–96. http://dx.doi.org/10.1152/jn.00099.2019.

Full text
Abstract:
For a biological neural network to be functional, its neurons need to be connected with synapses of appropriate strength, and each neuron needs to appropriately respond to its synaptic inputs. This second aspect of network tuning is maintained by intrinsic plasticity; yet it is often considered secondary to changes in connectivity and mostly limited to adjustments of overall excitability of each neuron. Here we argue that even nonoscillatory neurons can be tuned to inputs of different temporal dynamics and that they can routinely adjust this tuning to match the statistics of their synaptic activation. Using the dynamic clamp technique, we show that, in the tectum of Xenopus tadpole, neurons become selective for faster inputs when animals are exposed to fast visual stimuli but remain responsive to longer inputs in animals exposed to slower, looming, or multisensory stimulation. We also report a homeostatic cotuning between synaptic and intrinsic temporal properties of individual tectal cells. These results expand our understanding of intrinsic plasticity in the brain and suggest that there may exist an additional dimension of network tuning that has been so far overlooked. NEW & NOTEWORTHY We use dynamic clamp to show that individual neurons in the tectum of Xenopus tadpoles are selectively tuned to either shorter (more synchronous) or longer (less synchronous) synaptic inputs. We also demonstrate that this intrinsic temporal tuning is strongly shaped by sensory experiences. This new phenomenon, which is likely to be mediated by changes in sodium channel inactivation, is bound to have important consequences for signal processing and the development of local recurrent connections.
APA, Harvard, Vancouver, ISO, and other styles
34

Wallace, Mark T., and Barry E. Stein. "Early Experience Determines How the Senses Will Interact." Journal of Neurophysiology 97, no. 1 (January 2007): 921–26. http://dx.doi.org/10.1152/jn.00497.2006.

Full text
Abstract:
Multisensory integration refers to the process by which the brain synthesizes information from different senses to enhance sensitivity to external events. In the present experiments, animals were reared in an altered sensory environment in which visual and auditory stimuli were temporally coupled but originated from different locations. Neurons in the superior colliculus developed a seemingly anomalous form of multisensory integration in which spatially disparate visual-auditory stimuli were integrated in the same way that neurons in normally reared animals integrated visual-auditory stimuli from the same location. The data suggest that the principles governing multisensory integration are highly plastic and that there is no a priori spatial relationship between stimuli from different senses that is required for their integration. Rather, these principles appear to be established early in life based on the specific features of an animal's environment to best adapt it to deal with that environment later in life.
APA, Harvard, Vancouver, ISO, and other styles
35

Xu, Jinghong, Liping Yu, Terrence R. Stanford, Benjamin A. Rowland, and Barry E. Stein. "What does a neuron learn from multisensory experience?" Journal of Neurophysiology 113, no. 3 (February 1, 2015): 883–89. http://dx.doi.org/10.1152/jn.00284.2014.

Full text
Abstract:
The brain's ability to integrate information from different senses is acquired only after extensive sensory experience. However, whether early life experience instantiates a general integrative capacity in multisensory neurons or one limited to the particular cross-modal stimulus combinations to which one has been exposed is not known. By selectively restricting either visual-nonvisual or auditory-nonauditory experience during the first few months of life, the present study found that trisensory neurons in cat superior colliculus (as well as their bisensory counterparts) became adapted to the cross-modal stimulus combinations specific to each rearing environment. Thus, even at maturity, trisensory neurons did not integrate all cross-modal stimulus combinations to which they were capable of responding, but only those that had been linked via experience to constitute a coherent spatiotemporal event. This selective maturational process determines which environmental events will become the most effective targets for superior colliculus-mediated shifts of attention and orientation.
APA, Harvard, Vancouver, ISO, and other styles
36

Pluta, Scott R., Benjamin A. Rowland, Terrence R. Stanford, and Barry E. Stein. "Alterations to multisensory and unisensory integration by stimulus competition." Journal of Neurophysiology 106, no. 6 (December 2011): 3091–101. http://dx.doi.org/10.1152/jn.00509.2011.

Full text
Abstract:
In environments containing sensory events at competing locations, selecting a target for orienting requires prioritization of stimulus values. Although the superior colliculus (SC) is causally linked to the stimulus selection process, the manner in which SC multisensory integration operates in a competitive stimulus environment is unknown. Here we examined how the activity of visual-auditory SC neurons is affected by placement of a competing target in the opposite hemifield, a stimulus configuration that would, in principle, promote interhemispheric competition for access to downstream motor circuitry. Competitive interactions between the targets were evident in how they altered unisensory and multisensory responses of individual neurons. Responses elicited by a cross-modal stimulus (multisensory responses) proved to be substantially more resistant to competitor-induced depression than were unisensory responses (evoked by the component modality-specific stimuli). Similarly, when a cross-modal stimulus served as the competitor, it exerted considerably more depression than did its individual component stimuli, in some cases producing more depression than predicted by their linear sum. These findings suggest that multisensory integration can help resolve competition among multiple targets by enhancing orientation to the location of cross-modal events while simultaneously suppressing orientation to events at alternate locations.
APA, Harvard, Vancouver, ISO, and other styles
37

Jiang, Wan, Huai Jiang, Benjamin A. Rowland, and Barry E. Stein. "Multisensory Orientation Behavior Is Disrupted by Neonatal Cortical Ablation." Journal of Neurophysiology 97, no. 1 (January 2007): 557–62. http://dx.doi.org/10.1152/jn.00591.2006.

Full text
Abstract:
The integration of visual and auditory information can significantly amplify the sensory responses of superior colliculus (SC) neurons and the behaviors that depend on them. This response amplification depends on the development of SC inputs that are derived from two regions of cortex: the anterior ectosylvian sulcus (AES) and the rostral lateral suprasylvian sulcus (rLS). Neonatal ablation of these cortico-collicular areas has been shown to disrupt the development of the multisensory enhancement capabilities of SC neurons and the present results demonstrate that it also precludes the development of the normal multisensory enhancements in orientation behavior. Animals with neonatal ablation of AES and rLS were tested at maturity and found unable to benefit from the combination of visual and auditory cues in their efforts to localize targets in contralesional space. In contrast, their ipsilesional multisensory orientation capabilities were indistinguishable from those of normal animals. However, when only one of these cortical areas was removed during early life, later behavioral consequences were negligible. Whether similar compensatory processes would occur in adult animals remains to be determined. These observations, coupled with those from previous studies, also suggest that a surprisingly high proportion of SC neurons capable of multisensory integration must be present for orientation behavior benefits to be realized. Compensatory mechanisms can achieve this if early lesions spare either AES or rLS, but even the impressive plasticity of the neonatal brain cannot compensate for the early loss of both of them.
APA, Harvard, Vancouver, ISO, and other styles
38

Oh, Soo Min, Kyunghwa Jeong, Jeong Taeg Seo, and Seok Jun Moon. "Multisensory interactions regulate feeding behavior in Drosophila." Proceedings of the National Academy of Sciences 118, no. 7 (February 8, 2021): e2004523118. http://dx.doi.org/10.1073/pnas.2004523118.

Full text
Abstract:
The integration of two or more distinct sensory cues can help animals make more informed decisions about potential food sources, but little is known about how feeding-related multimodal sensory integration happens at the cellular and molecular levels. Here, we show that multimodal sensory integration contributes to a stereotyped feeding behavior in the model organism Drosophila melanogaster. Simultaneous olfactory and mechanosensory inputs significantly influence a taste-evoked feeding behavior called the proboscis extension reflex (PER). Olfactory and mechanical information are mediated by antennal Or35a neurons and leg hair plate mechanosensory neurons, respectively. We show that the controlled delivery of three different sensory cues can produce a supra-additive PER via the concurrent stimulation of olfactory, taste, and mechanosensory inputs. We suggest that the fruit fly is a versatile model system to study multisensory integration related to feeding, which also likely exists in vertebrates.
APA, Harvard, Vancouver, ISO, and other styles
39

Elliott, Terry, Xutao Kuang, Nigel R. Shadbolt, and Klaus-Peter Zauner. "Adaptation in multisensory neurons: Impact on cross-modal enhancement." Network: Computation in Neural Systems 20, no. 1 (January 2009): 1–31. http://dx.doi.org/10.1080/09548980902751752.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Noel, Jean-Paul, Olaf Blanke, Elisa Magosso, and Andrea Serino. "Neural adaptation accounts for the dynamic resizing of peripersonal space: evidence from a psychophysical-computational approach." Journal of Neurophysiology 119, no. 6 (June 1, 2018): 2307–33. http://dx.doi.org/10.1152/jn.00652.2017.

Full text
Abstract:
Interactions between the body and the environment occur within the peripersonal space (PPS), the space immediately surrounding the body. The PPS is encoded by multisensory (audio-tactile, visual-tactile) neurons that possess receptive fields (RFs) anchored on the body and restricted in depth. The extension in depth of PPS neurons’ RFs has been documented to change dynamically as a function of the velocity of incoming stimuli, but the underlying neural mechanisms are still unknown. Here, by integrating a psychophysical approach with neural network modeling, we propose a mechanistic explanation behind this inherent dynamic property of PPS. We psychophysically mapped the size of participant’s peri-face and peri-trunk space as a function of the velocity of task-irrelevant approaching auditory stimuli. Findings indicated that the peri-trunk space was larger than the peri-face space, and, importantly, as for the neurophysiological delineation of RFs, both of these representations enlarged as the velocity of incoming sound increased. We propose a neural network model to mechanistically interpret these findings: the network includes reciprocal connections between unisensory areas and higher order multisensory neurons, and it implements neural adaptation to persistent stimulation as a mechanism sensitive to stimulus velocity. The network was capable of replicating the behavioral observations of PPS size remapping and relates behavioral proxies of PPS size to neurophysiological measures of multisensory neurons’ RF size. We propose that a biologically plausible neural adaptation mechanism embedded within the network encoding for PPS can be responsible for the dynamic alterations in PPS size as a function of the velocity of incoming stimuli. NEW & NOTEWORTHY Interactions between body and environment occur within the peripersonal space (PPS). PPS neurons are highly dynamic, adapting online as a function of body-object interactions. 
The mechanisms underpinning PPS dynamic properties are unexplained. We demonstrate with a psychophysical approach that PPS enlarges as incoming stimulus velocity increases, efficiently preventing contact with faster-approaching objects. We present a neurocomputational model of multisensory PPS implementing neural adaptation to persistent stimulation to propose a neurophysiological mechanism underlying this effect.
APA, Harvard, Vancouver, ISO, and other styles
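The velocity-dependent resizing described in this abstract can be illustrated with a minimal toy simulation. All parameters, the drive function, and the adaptation rule below are hypothetical choices for illustration, not the authors' fitted network: an adaptation variable slowly tracks the input evoked by a looming stimulus, so a faster approach leaves less time for adaptation to build up and yields a stronger response at the same distance, i.e., an effectively larger PPS.

```python
def response_at(distance, velocity, start=2.0, tau_adapt=1.5, dt=0.001):
    """Response of a toy PPS unit when a looming stimulus reaches `distance`.

    The drive grows with proximity; an adaptation variable slowly tracks
    the drive and subtracts from it (illustrative parameters, not data).
    """
    d, adapt = start, 0.0
    while d > distance:
        drive = 1.0 / (1.0 + d)                     # proximity-dependent input
        adapt += dt * (drive - adapt) / tau_adapt   # slow adaptation dynamics
        d -= velocity * dt
    return max(1.0 / (1.0 + distance) - adapt, 0.0)

# A faster approach leaves less time for adaptation, so the unit responds
# more strongly at the same distance: an effectively enlarged PPS.
slow = response_at(0.5, velocity=0.25)
fast = response_at(0.5, velocity=1.0)
assert fast > slow
```

Under this toy rule the response at any fixed distance grows monotonically with approach velocity, qualitatively reproducing the reported enlargement of PPS for faster stimuli.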
41

Tele-Heri, Brigitta, Karoly Dobos, Szilvia Harsanyi, Judit Palinkas, Fanni Fenyosi, Rudolf Gesztelyi, Csaba E. More, and Judit Zsuga. "Vestibular Stimulation May Drive Multisensory Processing: Principles for Targeted Sensorimotor Therapy (TSMT)." Brain Sciences 11, no. 8 (August 23, 2021): 1111. http://dx.doi.org/10.3390/brainsci11081111.

Full text
Abstract:
At birth, the vestibular system is fully mature, whilst higher order sensory processing is yet to develop in the full-term neonate. The current paper lays out a theoretical framework to account for the role vestibular stimulation may have in driving multisensory and sensorimotor integration. Accordingly, vestibular stimulation, by activating the parieto-insular vestibular cortex and/or the posterior parietal cortex, may provide the cortical input for multisensory neurons in the superior colliculus that is needed for multisensory processing. Furthermore, we propose that motor development, by inducing a change of reference frames, may shape the receptive fields of multisensory neurons. This, by leading to a lack of spatial contingency between formerly contingent stimuli, may cause degradation of prior motor responses. Additionally, we offer a testable hypothesis explaining the beneficial effect of sensory integration therapies on attentional processes. Key concepts of a sensorimotor integration therapy (e.g., targeted sensorimotor therapy (TSMT)) are also put into a neurological context. TSMT utilizes specific tools and instruments and is administered in successive 8-week-long treatment regimens, each gradually increasing vestibular and postural stimulation, so that sensorimotor integration is facilitated and muscle strength is increased. Empirically, TSMT is indicated for various diseases. The theoretical foundations of this sensorimotor therapy are discussed.
APA, Harvard, Vancouver, ISO, and other styles
42

Dong, W. K., E. H. Chudler, K. Sugiyama, V. J. Roberts, and T. Hayashi. "Somatosensory, multisensory, and task-related neurons in cortical area 7b (PF) of unanesthetized monkeys." Journal of Neurophysiology 72, no. 2 (August 1, 1994): 542–64. http://dx.doi.org/10.1152/jn.1994.72.2.542.

Full text
Abstract:
1. The goal of this study was to quantitatively characterize the response properties of somatosensory and multisensory neurons in cortical area 7b (or PF) of monkeys that were behaviorally trained to perform an appetitive tolerance-escape task. Particular emphasis was given to characterizing nociceptive thermal responses and correlating such responses to thermal pain tolerance as measured by escape frequency. 2. A total of 244 neurons that responded to somatosensory stimulation alone or to both somatosensory and visual stimulation (multisensory) were isolated and studied in the trigeminal region of cortical area 7b. Thirty neurons responded only to visual stimulation. Thermoreceptive neurons formed approximately 13% (31 of 244) of the neurons that had somatosensory response properties. Thermal nociceptive neurons made up approximately 9% (21 of 244) of the neurons that had somatosensory response properties or approximately 68% (21 of 31) of the neurons that had thermoreceptive response properties. Thermal nociceptive neurons responded either exclusively to noxious thermal stimuli (high-threshold thermoreceptive, HTT) or differentially to nonnoxious and noxious thermal stimuli (wide-range thermoreceptive, WRT). Multimodal HTT neurons had nonnociceptive (low-threshold mechanoreceptive, LTM) and/or nociceptive (nociceptive-specific, wide-dynamic-range) mechanical receptive fields, whereas multimodal WRT neurons had only nonnociceptive (LTM) mechanical receptive fields. Thermal nonnociceptive neurons (low-threshold thermoreceptive, LTT) made up approximately 3% (8 of 244) of the neurons that had somatosensory properties or approximately 26% (8 of 31) of the neurons that were thermoreceptive. The background discharge of two thermoreceptive neurons (6%, 2 of 31) was inhibited by innocuous thermal stimulation. 3. 
Thermal nociceptive neurons (HTT and WRT) were functionally differentiated by statistical analyses into subpopulations that did encode (HTT-EN, WRT-EN) and did not encode (HTT-NE, WRT-NE) the magnitude of noxious thermal stimulus intensities. The mean slopes and median regression coefficients for the stimulus-response (S-R) functions of HTT-EN and WRT-EN neurons, respectively, were significantly greater than those for the S-R functions of HTT-NE and WRT-NE neurons. In contrast to HTT-NE and WRT-NE neurons, HTT-EN and WRT-EN neurons reliably encoded the magnitude of noxious thermal intensity by grading their mean discharge frequency. 4. The S-R functions of HTT-EN and WRT-EN neurons, unlike those of HTT-NE and WRT-NE neurons, closely approximated stimulus intensity-escape frequency functions. (ABSTRACT TRUNCATED AT 400 WORDS)
APA, Harvard, Vancouver, ISO, and other styles
43

Medrea, Ioana, and Kathleen E. Cullen. "Multisensory integration in early vestibular processing in mice: the encoding of passive vs. active motion." Journal of Neurophysiology 110, no. 12 (December 15, 2013): 2704–17. http://dx.doi.org/10.1152/jn.01037.2012.

Full text
Abstract:
The mouse has become an important model system for studying the cellular basis of learning and coding of heading by the vestibular system. Here we recorded from single neurons in the vestibular nuclei to understand how vestibular pathways encode self-motion under natural conditions, during which proprioceptive and motor-related signals as well as vestibular inputs provide feedback about an animal's movement through the world. We recorded neuronal responses in alert behaving mice focusing on a group of neurons, termed vestibular-only cells, that are known to control posture and project to higher-order centers. We found that the majority (70%, n = 21/30) of neurons were bimodal, in that they responded robustly to passive stimulation of proprioceptors as well as passive stimulation of the vestibular system. Additionally, the linear summation of a given neuron's vestibular and neck sensitivities predicted well its responses when both stimuli were applied simultaneously. In contrast, neuronal responses were suppressed when the same motion was actively generated, with the one striking exception that the activity of bimodal neurons similarly and robustly encoded head on body position in all conditions. Our results show that proprioceptive and motor-related signals are combined with vestibular information at the first central stage of vestibular processing in mice. We suggest that these results have important implications for understanding the multisensory integration underlying accurate postural control and the neural representation of directional heading in the head direction cell network of mice.
APA, Harvard, Vancouver, ISO, and other styles
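The linear-summation result in this abstract (a bimodal neuron's response to combined stimulation predicted by summing its unimodal vestibular and neck sensitivities) can be written as a small numerical check. The gains, baseline rate, and velocity profiles below are arbitrary illustrations, not the recorded values:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 200)                 # 1 s of simulated rotation

# Hypothetical unimodal sensitivities (spikes/s per deg/s) and resting rate
g_vest, g_neck = 0.8, -0.5
baseline = 40.0

head_vel = 20.0 * np.sin(2 * np.pi * t)        # passive whole-body rotation (deg/s)
neck_vel = 15.0 * np.sin(2 * np.pi * t + 0.5)  # passive body-under-head rotation (deg/s)

r_vest = baseline + g_vest * head_vel          # vestibular-only condition
r_neck = baseline + g_neck * neck_vel          # proprioceptive-only condition
r_both = baseline + g_vest * head_vel + g_neck * neck_vel  # combined stimulation

# Linear-summation prediction: sum the unimodal responses, counting the
# shared baseline only once.
prediction = r_vest + r_neck - baseline
assert np.allclose(prediction, r_both)
```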
44

McClure, John P., and Pierre-Olivier Polack. "Pure tones modulate the representation of orientation and direction in the primary visual cortex." Journal of Neurophysiology 121, no. 6 (June 1, 2019): 2202–14. http://dx.doi.org/10.1152/jn.00069.2019.

Full text
Abstract:
Multimodal sensory integration facilitates the generation of a unified and coherent perception of the environment. It is now well established that unimodal sensory perceptions, such as vision, are improved in multisensory contexts. Whereas multimodal integration is primarily performed by dedicated multisensory brain regions such as the association cortices or the superior colliculus, recent studies have shown that multisensory interactions also occur in primary sensory cortices. In particular, sounds were shown to modulate the responses of neurons located in layers 2/3 (L2/3) of the mouse primary visual cortex (V1). Yet, the net effect of sound modulation at the V1 population level remained unclear. In the present study, we performed two-photon calcium imaging in awake mice to compare the representation of the orientation and the direction of drifting gratings by V1 L2/3 neurons in unimodal (visual only) or multimodal (audiovisual) conditions. We found that sound modulation depended on the tuning properties (orientation and direction selectivity) and response amplitudes of V1 L2/3 neurons. Sounds potentiated the responses of neurons that were highly tuned to the cue’s orientation and direction but weakly active in the unimodal context, following the principle of inverse effectiveness of multimodal integration. Moreover, sound suppressed the responses of neurons untuned for the orientation and/or the direction of the visual cue. Altogether, sound modulation improved the representation of the orientation and direction of the visual stimulus in V1 L2/3. Namely, visual stimuli presented with auditory stimuli recruited a neuronal population better tuned to the visual stimulus orientation and direction than when presented alone. NEW & NOTEWORTHY The primary visual cortex (V1) receives direct inputs from the primary auditory cortex. Yet, the impact of sounds on visual processing in V1 remains controverted. 
We show that the modulation by pure tones of V1 visual responses depends on the orientation selectivity, direction selectivity, and response amplitudes of V1 neurons. Hence, audiovisual stimuli recruit a population of V1 neurons better tuned to the orientation and direction of the visual stimulus than unimodal visual stimuli.
APA, Harvard, Vancouver, ISO, and other styles
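The principle of inverse effectiveness invoked in this abstract (weakly driven but well-tuned neurons gain proportionally the most from a concurrent sound) can be sketched with a toy modulation rule. The gain function and firing rates are hypothetical choices for illustration, not the fitted model from the study:

```python
def av_gain(visual_resp, v_max=30.0, k=0.8):
    """Fractional audiovisual boost under inverse effectiveness (toy rule):
    the weaker the unimodal visual response, the larger the relative gain."""
    return k * (1.0 - min(visual_resp, v_max) / v_max)

def av_response(visual_resp, **kwargs):
    """Visual response modulated by a concurrent sound."""
    return visual_resp * (1.0 + av_gain(visual_resp, **kwargs))

weak, strong = 5.0, 25.0   # spikes/s, hypothetical unimodal responses
assert av_gain(weak) > av_gain(strong)                           # larger relative boost when weak
assert av_response(weak) / weak > av_response(strong) / strong   # same, as a response ratio
```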
45

Caron-Guyon, Jeanne, Julien Corbo, Yoh’i Zennou-Azogui, Christian Xerri, Anne Kavounoudias, and Nicolas Catz. "Neuronal Encoding of Multisensory Motion Features in the Rat Associative Parietal Cortex." Cerebral Cortex 30, no. 10 (June 4, 2020): 5372–86. http://dx.doi.org/10.1093/cercor/bhaa118.

Full text
Abstract:
Motion perception is facilitated by the interplay of various sensory channels. In rodents, the cortical areas involved in multisensory motion coding remain to be identified. Using voltage-sensitive-dye imaging, we revealed a visuo–tactile convergent region that anatomically corresponds to the associative parietal cortex (APC). Single unit responses to moving visual gratings or whisker deflections revealed a specific coding of motion characteristics strikingly found in both sensory modalities. The heteromodality of this region was further supported by a large proportion of bimodal neurons and by a classification procedure revealing that APC carries information about motion features, sensory origin, and multisensory direction-congruency. Altogether, the results point to a central role of APC in multisensory integration for motion perception.
APA, Harvard, Vancouver, ISO, and other styles
46

Diehl, M. M., and L. M. Romanski. "Responses of Prefrontal Multisensory Neurons to Mismatching Faces and Vocalizations." Journal of Neuroscience 34, no. 34 (August 20, 2014): 11233–43. http://dx.doi.org/10.1523/jneurosci.5168-13.2014.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Stein, Barry E., W. Scott Huneycutt, and M. Alex Meredith. "Neurons and behavior: the same rules of multisensory integration apply." Brain Research 448, no. 2 (May 1988): 355–58. http://dx.doi.org/10.1016/0006-8993(88)91276-0.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Ma, Wei Ji, and Alexandre Pouget. "Linking neurons to behavior in multisensory perception: A computational review." Brain Research 1242 (November 2008): 4–12. http://dx.doi.org/10.1016/j.brainres.2008.04.082.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Noel, Jean-Paul, Andrea Serino, and Mark T. Wallace. "Increased Neural Strength and Reliability to Audiovisual Stimuli at the Boundary of Peripersonal Space." Journal of Cognitive Neuroscience 31, no. 8 (August 2019): 1155–72. http://dx.doi.org/10.1162/jocn_a_01334.

Full text
Abstract:
The actionable space surrounding the body, referred to as peripersonal space (PPS), has been the subject of significant interest of late within the broader framework of embodied cognition. Neurophysiological and neuroimaging studies have shown the representation of PPS to be built from visuotactile and audiotactile neurons within a frontoparietal network and whose activity is modulated by the presence of stimuli in proximity to the body. In contrast to single-unit and fMRI studies, an area of inquiry that has received little attention is the EEG characterization associated with PPS processing. Furthermore, although PPS is encoded by multisensory neurons, to date there has been no EEG study systematically examining neural responses to unisensory and multisensory stimuli, as these are presented outside, near, and within the boundary of PPS. Similarly, it remains poorly understood whether multisensory integration is generally more likely at certain spatial locations (e.g., near the body) or whether the cross-modal tactile facilitation that occurs within PPS is simply due to a reduction in the distance between sensory stimuli when close to the body and in line with the spatial principle of multisensory integration. In the current study, to examine the neural dynamics of multisensory processing within and beyond the PPS boundary, we present auditory, visual, and audiovisual stimuli at various distances relative to participants' reaching limit—an approximation of PPS—while recording continuous high-density EEG. We question whether multisensory (vs. unisensory) processing varies as a function of stimulus–observer distance. Results demonstrate a significant increase of global field power (i.e., overall strength of response across the entire electrode montage) for stimuli presented at the PPS boundary—an increase that is largest under multisensory (i.e., audiovisual) conditions. 
Source localization of the major contributors to this global field power difference suggests neural generators in the intraparietal sulcus and insular cortex, hubs for visuotactile and audiotactile PPS processing. Furthermore, when neural dynamics are examined in more detail, changes in the reliability of evoked potentials in centroparietal electrodes are predictive, on a subject-by-subject basis, of the later changes in estimated current strength at the intraparietal sulcus linked to stimulus proximity to the PPS boundary. Together, these results provide a previously unrealized view into the neural dynamics and temporal code associated with the encoding of nontactile multisensory stimuli around the PPS boundary.
APA, Harvard, Vancouver, ISO, and other styles
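Global field power, the measure behind the main result above, is the spatial standard deviation of the potential across the electrode montage at each time point. A minimal computation on simulated EEG (the montage size and evoked topography are arbitrary illustrations):

```python
import numpy as np

def global_field_power(eeg):
    """GFP(t): spatial standard deviation across electrodes.

    eeg: array of shape (n_electrodes, n_samples), e.g. in microvolts.
    """
    return eeg.std(axis=0)   # std over electrodes, one value per time sample

rng = np.random.default_rng(1)
n_elec, n_samp = 64, 500
eeg = rng.normal(0.0, 1.0, (n_elec, n_samp))          # ongoing background activity
eeg[:, 200:250] += np.linspace(0.0, 5.0, n_elec)[:, None]  # evoked topography

gfp = global_field_power(eeg)
assert gfp.shape == (n_samp,)
assert gfp[200:250].mean() > gfp[:200].mean()  # stronger evoked field -> higher GFP
```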
50

Yu, Liping, Benjamin A. Rowland, Jinghong Xu, and Barry E. Stein. "Multisensory plasticity in adulthood: cross-modal experience enhances neuronal excitability and exposes silent inputs." Journal of Neurophysiology 109, no. 2 (January 15, 2013): 464–74. http://dx.doi.org/10.1152/jn.00739.2012.

Full text
Abstract:
Multisensory superior colliculus neurons in cats were found to retain substantial plasticity to short-term, site-specific experience with cross-modal stimuli well into adulthood. Following cross-modal exposure trials, these neurons substantially increased their sensitivity to the cross-modal stimulus configuration as well as to its individual component stimuli. In many cases, the exposure experience also revealed a previously ineffective or “silent” input channel, rendering it overtly responsive. These experience-induced changes required relatively few exposure trials and could be retained for more than 1 h. However, their induction was generally restricted to experience with cross-modal stimuli. Only rarely were they induced by exposure to a modality-specific stimulus and were never induced by stimulating a previously ineffective input channel. This short-term plasticity likely provides substantial benefits to the organism in dealing with ongoing and sequential events that take place at a given location in space and may reflect the ability of multisensory superior colliculus neurons to rapidly alter their response properties to accommodate to changes in environmental challenges and event probabilities.
APA, Harvard, Vancouver, ISO, and other styles