Academic literature on the topic 'Multisensory neurons'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Multisensory neurons.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Multisensory neurons"

1

Allman, Brian L., and M. Alex Meredith. "Multisensory Processing in “Unimodal” Neurons: Cross-Modal Subthreshold Auditory Effects in Cat Extrastriate Visual Cortex." Journal of Neurophysiology 98, no. 1 (July 2007): 545–49. http://dx.doi.org/10.1152/jn.00173.2007.

Full text
Abstract:
Historically, the study of multisensory processing has examined the function of the definitive neuron type, the bimodal neuron. These neurons are excited by inputs from more than one sensory modality, and when multisensory stimuli are present, they can integrate their responses in a predictable manner. However, recent studies have revealed that multisensory processing in the cortex is not restricted to bimodal neurons. The present investigation sought to examine the potential for multisensory processing in nonbimodal (unimodal) neurons in the retinotopically organized posterolateral lateral suprasylvian (PLLS) area of the cat. Standard extracellular recordings were used to measure responses of all neurons encountered to both separate- and combined-modality stimulation. Whereas bimodal neurons behaved as predicted, the surprising result was that 16% of unimodal visual neurons encountered were significantly facilitated by auditory stimuli. Because these unimodal visual neurons did not respond to an auditory stimulus presented alone but had their visual responses modulated by concurrent auditory stimulation, they represent a new form of multisensory neuron: the subthreshold multisensory neuron. These data also demonstrate that bimodal neurons can no longer be regarded as the exclusive basis for multisensory processing.
APA, Harvard, Vancouver, ISO, and other styles
2

Perrault, Thomas J., J. William Vaughan, Barry E. Stein, and Mark T. Wallace. "Superior Colliculus Neurons Use Distinct Operational Modes in the Integration of Multisensory Stimuli." Journal of Neurophysiology 93, no. 5 (May 2005): 2575–86. http://dx.doi.org/10.1152/jn.00926.2004.

Full text
Abstract:
Many neurons in the superior colliculus (SC) integrate sensory information from multiple modalities, giving rise to significant response enhancements. Although enhanced multisensory responses have been shown to depend on the spatial and temporal relationships of the stimuli as well as on their relative effectiveness, these factors alone do not appear sufficient to account for the substantial heterogeneity in the magnitude of the multisensory products that have been observed. Toward this end, the present experiments have revealed that there are substantial differences in the operations used by different multisensory SC neurons to integrate their cross-modal inputs, suggesting that intrinsic differences in these neurons may also play an important deterministic role in multisensory integration. In addition, the integrative operation employed by a given neuron was found to be well correlated with the neuron's dynamic range. In total, four categories of SC neurons were identified based on how their multisensory responses changed relative to the predicted addition of the two unisensory inputs as stimulus effectiveness was altered. Despite the presence of these categories, a general rule was that the most robust multisensory enhancements were seen with combinations of the least effective unisensory stimuli. Together, these results provide a better quantitative picture of the integrative operations performed by multisensory SC neurons and suggest mechanistic differences in the way in which these neurons synthesize cross-modal information.
APA, Harvard, Vancouver, ISO, and other styles
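The operational modes described in the abstract above are conventionally quantified by comparing the multisensory response both with the best unisensory response (the classic enhancement index) and with the predicted sum of the two unisensory responses. The Python sketch below illustrates that bookkeeping; the firing rates are invented for illustration, and the fixed tolerance band stands in for the trial-by-trial statistical test used in the actual experiments.

```python
def multisensory_enhancement(visual, auditory, combined):
    """Percent enhancement of the combined response over the best
    unisensory response (the classic index used in this literature)."""
    best_unisensory = max(visual, auditory)
    return 100.0 * (combined - best_unisensory) / best_unisensory


def operational_mode(visual, auditory, combined, tolerance=0.05):
    """Classify the interaction relative to the predicted additive sum.
    The tolerance band is a stand-in for a proper statistical test."""
    predicted_sum = visual + auditory
    if combined > predicted_sum * (1.0 + tolerance):
        return "superadditive"
    if combined < predicted_sum * (1.0 - tolerance):
        return "subadditive"
    return "additive"


# Hypothetical mean responses (spikes per trial) for one neuron.
visual, auditory, combined = 3.0, 2.0, 9.0
print(multisensory_enhancement(visual, auditory, combined))  # 200.0 (% enhancement)
print(operational_mode(visual, auditory, combined))          # superadditive
```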
3

Jiang, Wan, Mark T. Wallace, Huai Jiang, J. William Vaughan, and Barry E. Stein. "Two Cortical Areas Mediate Multisensory Integration in Superior Colliculus Neurons." Journal of Neurophysiology 85, no. 2 (February 1, 2001): 506–22. http://dx.doi.org/10.1152/jn.2001.85.2.506.

Full text
Abstract:
The majority of multisensory neurons in the cat superior colliculus (SC) are able to synthesize cross-modal cues (e.g., visual and auditory) and thereby produce responses greater than those elicited by the most effective single modality stimulus and, sometimes, greater than those predicted by the arithmetic sum of their modality-specific responses. The present study examined the role of corticotectal inputs from two cortical areas, the anterior ectosylvian sulcus (AES) and the rostral aspect of the lateral suprasylvian sulcus (rLS), in producing these response enhancements. This was accomplished by evaluating the multisensory properties of individual SC neurons during reversible deactivation of these cortices individually and in combination using cryogenic deactivation techniques. Cortical deactivation eliminated the characteristic multisensory response enhancement of nearly all SC neurons but generally had little or no effect on a neuron's modality-specific responses. Thus, the responses of SC neurons to combinations of cross-modal stimuli were now no different from those evoked by one or the other of these stimuli individually. Of the two cortical areas, AES had a much greater impact on SC multisensory integrative processes, with nearly half the SC neurons sampled dependent on it alone. In contrast, only a small number of SC neurons depended solely on rLS. However, most SC neurons exhibited dual dependencies, and their multisensory enhancement was mediated by either synergistic or redundant influences from AES and rLS. Corticotectal synergy was evident when deactivating either cortical area compromised the multisensory enhancement of an SC neuron, whereas corticotectal redundancy was evident when deactivation of both cortical areas was required to produce this effect. The results suggest that, although multisensory SC neurons can be created as a consequence of a variety of converging tectopetal afferents that are derived from a host of subcortical and cortical structures, the ability to synthesize cross-modal inputs, and thereby produce an enhanced multisensory response, requires functional inputs from the AES, the rLS, or both.
APA, Harvard, Vancouver, ISO, and other styles
4

Wallace, M. T. "The Development of Multisensory Integration in the Brain." Perception 26, no. 1_suppl (August 1997): 35. http://dx.doi.org/10.1068/v970014.

Full text
Abstract:
Multisensory integration in the superior colliculus (SC) of the cat requires a protracted postnatal developmental time course. Kittens 3 – 135 days postnatal (dpn) were examined and the first neuron capable of responding to two different sensory inputs (auditory and somatosensory) was not seen until 12 dpn. Visually responsive multisensory neurons were not encountered until 20 dpn. These early multisensory neurons responded weakly to sensory stimuli, had long response latencies, large receptive fields, and poorly developed response selectivities. Most striking, however, was their inability to integrate cross-modality cues in order to produce the significant response enhancement or depression characteristic of these neurons in adults. The incidence of multisensory neurons increased gradually over the next 10 – 12 weeks. During this period, sensory responses became more robust, latencies shortened, receptive fields decreased in size, and unimodal selectivities matured. The first neurons capable of cross-modality integration were seen at 28 dpn. For the following two months, the incidence of such integrative neurons rose gradually until adult-like values were achieved. Surprisingly, however, as soon as a multisensory neuron exhibited this capacity, most of its integrative features were indistinguishable from those in adults. Given what is known about the requirements for multisensory integration in adult animals, this observation suggests that the appearance of multisensory integration reflects the onset of functional corticotectal inputs.
APA, Harvard, Vancouver, ISO, and other styles
5

Meredith, M. A., and B. E. Stein. "Spatial determinants of multisensory integration in cat superior colliculus neurons." Journal of Neurophysiology 75, no. 5 (May 1, 1996): 1843–57. http://dx.doi.org/10.1152/jn.1996.75.5.1843.

Full text
Abstract:
1. Although a representation of multisensory space is contained in the superior colliculus, little is known about the spatial requirements of multisensory stimuli that influence the activity of neurons here. Critical to this problem is an assessment of the registry of the different receptive fields within individual multisensory neurons. The present study was initiated to determine how closely the receptive fields of individual multisensory neurons are aligned, the physiological role of that alignment, and the possible functional consequences of inducing receptive-field misalignment. 2. Individual multisensory neurons in the superior colliculus of anesthetized, paralyzed cats were studied with the use of standard extracellular recording techniques. The receptive fields of multisensory neurons were large, as reported previously, but exhibited a surprisingly high degree of spatial coincidence. The average proportion of receptive-field overlap was 86% for the population of visual-auditory neurons sampled. 3. Because of this high degree of intersensory receptive-field correspondence, combined-modality stimuli that were coincident in space tended to fall within the excitatory regions of the receptive fields involved. The result was a significantly enhanced neuronal response in 88% of the multisensory neurons studied. If stimuli were spatially disparate, so that one fell outside its receptive field, either a decreased response occurred (56%), or no intersensory effect was apparent (44%). 4. The normal alignment of the different receptive fields of a multisensory neuron could be disrupted by passively displacing the eyes, pinnae, or limbs/body. In no case was a shift in location or size observed in a neuron's other receptive field(s) to compensate for this displacement. The physiological result of receptive-field misalignment was predictable and based on the location of the stimuli relative to the new positions of their respective receptive fields. Now, for example, one component of a spatially coincident pair of stimuli might fall outside its receptive field and inhibit the other's effects. 5. These data underscore the dependence of multisensory integrative responses on the relationship of the different stimuli to their corresponding receptive fields rather than to the spatial relationship of the stimuli to one another. Apparently, the alignment of different receptive fields for individual multisensory neurons ensures that responses to combinations of stimuli derived from the same event are integrated to increase the salience of that event. Therefore the maintenance of receptive-field alignment is critical for the appropriate integration of converging sensory signals and, ultimately, elicitation of adaptive behaviors.
APA, Harvard, Vancouver, ISO, and other styles
6

Carriere, Brian N., David W. Royal, and Mark T. Wallace. "Spatial Heterogeneity of Cortical Receptive Fields and Its Impact on Multisensory Interactions." Journal of Neurophysiology 99, no. 5 (May 2008): 2357–68. http://dx.doi.org/10.1152/jn.01386.2007.

Full text
Abstract:
Investigations of multisensory processing at the level of the single neuron have illustrated the importance of the spatial and temporal relationship of the paired stimuli and their relative effectiveness in determining the product of the resultant interaction. Although these principles provide a good first-order description of the interactive process, they were derived by treating space, time, and effectiveness as independent factors. In the anterior ectosylvian sulcus (AES) of the cat, previous work hinted that the spatial receptive field (SRF) architecture of multisensory neurons might play an important role in multisensory processing due to differences in the vigor of responses to identical stimuli placed at different locations within the SRF. In this study the impact of SRF architecture on cortical multisensory processing was investigated using semichronic single-unit electrophysiological experiments targeting a multisensory domain of the cat AES. The visual and auditory SRFs of AES multisensory neurons exhibited striking response heterogeneity, with SRF architecture appearing to play a major role in the multisensory interactions. The deterministic role of SRF architecture was tightly coupled to the manner in which stimulus location modulated the responsiveness of the neuron. Thus multisensory stimulus combinations at weakly effective locations within the SRF resulted in large (often superadditive) response enhancements, whereas combinations at more effective spatial locations resulted in smaller (additive/subadditive) interactions. These results provide important insights into the spatial organization and processing capabilities of cortical multisensory neurons, features that may provide important clues as to the functional roles played by this area in spatially directed perceptual processes.
APA, Harvard, Vancouver, ISO, and other styles
7

Jiang, Wan, and Barry E. Stein. "Cortex Controls Multisensory Depression in Superior Colliculus." Journal of Neurophysiology 90, no. 4 (October 2003): 2123–35. http://dx.doi.org/10.1152/jn.00369.2003.

Full text
Abstract:
Multisensory depression is a fundamental index of multisensory integration in superior colliculus (SC) neurons. It is initiated when one sensory stimulus (auditory) located outside its modality-specific receptive field degrades or eliminates the neuron's responses to another sensory stimulus (visual) presented within its modality-specific receptive field. The present experiments demonstrate that the capacity of SC neurons to engage in multisensory depression is strongly dependent on influences from two cortical areas (the anterior ectosylvian and rostral lateral suprasylvian sulci). When these cortices are deactivated, the ability of SC neurons to synthesize visual-auditory inputs in this way is compromised; multisensory responses are disinhibited, becoming more vigorous and in some cases indistinguishable from responses to the visual stimulus alone. Although obtaining a more robust multisensory SC response when cortex is nonfunctional than when it is functional may seem paradoxical, these data may help explain previous observations that the loss of these cortical influences permits visual orientation behavior in the presence of a normally disruptive auditory stimulus.
APA, Harvard, Vancouver, ISO, and other styles
8

Perrault, Thomas J., J. William Vaughan, Barry E. Stein, and Mark T. Wallace. "Neuron-Specific Response Characteristics Predict the Magnitude of Multisensory Integration." Journal of Neurophysiology 90, no. 6 (December 2003): 4022–26. http://dx.doi.org/10.1152/jn.00494.2003.

Full text
Abstract:
Multisensory neurons in the superior colliculus (SC) typically respond to combinations of stimuli from multiple modalities with enhancements and/or depressions in their activity. Although such changes in response have been shown to follow a predictive set of integrative principles, these principles fail to completely account for the full range of interactions seen throughout the SC population. In an effort to better define this variability, we sought to determine if there were additional features of the neuronal response profile that were predictive of the magnitude of the multisensory interaction. To do this, we recorded from 109 visual-auditory SC neurons while systematically manipulating stimulus intensity. Along with the previously described roles of space, time, and stimulus effectiveness, two features of a neuron's response profile were found to offer predictive value as to the magnitude of the multisensory interaction: spontaneous activity and the level of sensory responsiveness. Multisensory neurons with little or no spontaneous activity and weak sensory responses had the capacity to exhibit large response enhancements. Conversely, neurons with modest spontaneous activity and robust sensory responses exhibited relatively small response enhancements. Together, these results provide a better view into multisensory integration, and suggest substantial heterogeneity in the integrative characteristics of the multisensory SC population.
APA, Harvard, Vancouver, ISO, and other styles
9

Wallace, M. T., M. A. Meredith, and B. E. Stein. "Converging influences from visual, auditory, and somatosensory cortices onto output neurons of the superior colliculus." Journal of Neurophysiology 69, no. 6 (June 1, 1993): 1797–809. http://dx.doi.org/10.1152/jn.1993.69.6.1797.

Full text
Abstract:
1. Physiological methods were used to examine the pattern of inputs from different sensory cortices onto individual superior colliculus neurons. 2. Visual, auditory, and somatosensory influences from anterior ectosylvian sulcus (AES) and visual influences from lateral suprasylvian (LS) cortex were found to converge onto individual multisensory neurons in the cat superior colliculus. An excellent topographic relationship was found between the different sensory cortices and their target neurons in the superior colliculus. 3. Corticotectal inputs were derived solely from unimodal neurons. Multisensory neurons in AES and LS were not antidromically activated from the superior colliculus. 4. Orthodromic and antidromic latencies were consistent with monosynaptic corticotectal inputs arising from LS and the three subdivisions of AES (SIV, Field AES, and AEV). 5. Superior colliculus neurons that received convergent cortical inputs formed a principal component of the tecto-reticulospinal tract. Thus there appears to be extensive cortical control over the output neurons through which the superior colliculus mediates attentive and orientation behaviors. 6. Two other multisensory circuits were identified. A population of multisensory superior colliculus neurons was found, which neither received convergent cortical input nor projected into the tecto-reticulo-spinal tract. In addition, multisensory neurons in AES and LS proved to be independent of the superior colliculus (i.e., they were not corticotectal). While it is likely that these three distinct multisensory neural circuits have different functional roles, their constituent neurons appear to integrate their various sensory inputs in much the same way.
APA, Harvard, Vancouver, ISO, and other styles
10

Wallace, M. T., L. K. Wilkinson, and B. E. Stein. "Representation and integration of multiple sensory inputs in primate superior colliculus." Journal of Neurophysiology 76, no. 2 (August 1, 1996): 1246–66. http://dx.doi.org/10.1152/jn.1996.76.2.1246.

Full text
Abstract:
1. The properties of visual-, auditory-, and somatosensory-responsive neurons, as well as of neurons responsive to multiple sensory cues (i.e., multisensory), were examined in the superior colliculus of the rhesus monkey. Although superficial layer neurons responded exclusively to visual stimuli and visual inputs predominated in deeper layers, there was also a rich nonvisual and multisensory representation in the superior colliculus. More than a quarter (27.8%) of the deep layer population responded to stimuli from more than a single sensory modality. In contrast, 37% responded only to visual cues, 17.6% to auditory cues, and 17.6% to somatosensory cues. Unimodal- and multisensory-responsive neurons were clustered by modality. Each of these modalities was represented in map-like fashion, and the different representations were in alignment with one another. 2. Most deep layer visually responsive neurons were binocular and exhibited poor selectivity for such stimulus characteristics as orientation, velocity, and direction of movement. Similarly, most auditory-responsive neurons had contralateral receptive fields and were binaural, but had little frequency selectivity and preferred complex, broad-band sounds. Somatosensory-responsive neurons were overwhelmingly contralateral, high velocity, and rapidly adapting. Only rarely did somatosensory-responsive neurons require distortion of subcutaneous tissue for activation. 3. The spatial congruence among the different receptive fields of multisensory neurons was a critical feature underlying their ability to synthesize cross-modal information. 4. Combinations of stimuli could have very different consequences in the same neuron, depending on their temporal and spatial relationships. Generally, multisensory interactions were evident when pairs of stimuli were separated from one another by < 500 ms, and the products of these interactions far exceeded the sum of their unimodal components. Whether the combination of stimuli produced response enhancement, response depression, or no interaction depended on the location of the stimuli relative to one another and to their respective receptive fields. Maximal response enhancements were observed when stimuli originated from similar locations in space (as when derived from the same event) because they fell within the excitatory receptive fields of the same multisensory neurons. If, however, the stimuli were spatially disparate such that one fell beyond the excitatory borders of its receptive field, either no interaction was produced or this stimulus depressed the effectiveness of the other. Furthermore, maximal response interactions were seen with the pairing of weakly effective unimodal stimuli. As the individual unimodal stimuli became increasingly effective, the levels of response enhancement to stimulus combinations declined, a principle referred to as inverse effectiveness. Many of the integrative principles seen here in the primate superior colliculus are strikingly similar to those observed in the cat. These observations indicate that a set of common principles of multisensory integration is adaptable in widely divergent species living in very different ecological situations. 5. Surprisingly, a few multisensory neurons had individual receptive fields that were not in register with one another. 
This has not been noted in multisensory neurons of other species, and these "anomalous" receptive fields could present a daunting problem: stimuli originating from the same general location in space cannot simultaneously fall within their respective receptive fields, a stimulus pairing that may result in response depression. Conversely, stimuli that originate from separate events and disparate locations (and fall within their receptive fields) may result in response enhancement. However, the spatial principle of multisensory integration did not apply in these cases. (ABSTRACT TRUNCATED)
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Multisensory neurons"

1

Kuang, Xutao. "Adaptation in multisensory neurons." Thesis, University of Southampton, 2008. https://eprints.soton.ac.uk/65419/.

Full text
Abstract:
The most studied region in the mammalian brain for multisensory integration is the deep superior colliculus (DSC). Neurophysiological experiments have revealed many response properties of DSC neurons, such as cross-modal enhancement (CME) and sub-additive/additive/super-additive operational modes. CME occurs when the response of a multisensory neuron to stimulation in one sensory modality is enhanced, often non-linearly, by temporally and spatially coincident stimulation of a second sensory modality. Response enhancement is frequently larger for weaker input stimuli than for stronger stimuli, a phenomenon known as inverse effectiveness. It is believed that a non-linear, saturating response function may underlie CME associated with inverse effectiveness. We explore this idea in more detail, showing that apart from CME, many other response properties of DSC neurons, including the different dynamic ranges of responses to unimodal and multimodal stimuli and the diverse operational modes, also emerge as a direct consequence of a saturating response function such as a sigmoidal function. We then consider the question of how the exact form of a candidate, saturating sigmoidal function could be determined in a DSC neuron. In particular, we suggest that adaptation may determine its exact form. Adaptation to input statistics is a ubiquitous property of sensory neurons. Defining the operating point as the output probability density function, we argue that a neuron maintains an invariant operating point by adapting to the lowest-order moments of the input probability distribution. Based on this notion, we propose a novel adaptation rule that permits unisensory neurons to adapt to the lowest-order statistics of their inputs, and then extend this rule to allow adaptation in multisensory neurons, of which DSC neurons are an example. Adaptation in DSC neurons is expected to change the responses of a neuron to a fixed, probe or test stimulus. Such a neuron would therefore exhibit different CME when presented with the same stimulus drawn from different statistical ensembles. We demonstrate that, for suitable selections of test stimuli, adaptation to an increase in the mean, the variance or the correlation coefficient induces consistent changes in CME. By virtue of the robustness of the results, the underlying adaptation notion can be tested in neurophysiological experiments. Finally, it is known that descending cortical projections from the anterior ectosylvian sulcus and the rostral aspect of the lateral suprasylvian sulcus are indispensable for DSC neurons to exhibit CME. The structure of our proposed adaptation rule for multisensory neurons therefore permits us to speculate that the descending cortical inputs to multisensory DSC neurons facilitate the computation of the correlation coefficient between different sensory channels' activities.
APA, Harvard, Vancouver, ISO, and other styles
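A minimal numerical sketch of the thesis's central point, that a saturating (sigmoidal) response function by itself produces inverse effectiveness and a range of operational modes, is given below. The logistic parameters and input drives are arbitrary values chosen only to show the qualitative effect, not the model actually fitted in the thesis.

```python
import numpy as np

def response(drive, r_max=100.0, x0=5.0, k=1.0):
    """Saturating sigmoidal function mapping net synaptic drive to firing rate."""
    return r_max / (1.0 + np.exp(-(drive - x0) / k))

def cross_modal_enhancement(v_drive, a_drive):
    """CME: percent gain of the combined response over the best unisensory response."""
    r_v, r_a = response(v_drive), response(a_drive)
    r_va = response(v_drive + a_drive)          # drives summed before the nonlinearity
    cme = 100.0 * (r_va - max(r_v, r_a)) / max(r_v, r_a)
    return cme, r_va, r_v + r_a

for drive in (2.0, 4.0, 6.0):                   # weak, intermediate, strong inputs
    cme, r_va, additive_prediction = cross_modal_enhancement(drive, drive)
    mode = "superadditive" if r_va > additive_prediction else "subadditive"
    print(f"drive={drive}: CME = {cme:6.1f}%  ({mode})")

# Weak drives yield large, superadditive enhancement; strong drives yield small,
# subadditive enhancement -- the signature of inverse effectiveness.
```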
2

Brozzoli, Claudio. "Peripersonal space: a multisensory interface for body-objects interactions." PhD thesis, Université Claude Bernard - Lyon I, 2009. http://tel.archives-ouvertes.fr/tel-00675247.

Full text
Abstract:
Our ability to interact with the environment requires the integration of multisensory information for the construction of spatial representations. The peripersonal space (i.e., the sector of space closely surrounding one's body) and the integrative processes between visual and tactile inputs originating from this sector of space have been at the center of recent years' investigations. Neurophysiological studies provided evidence for the presence in the monkey brain of bimodal neurons, which are activated by tactile as well as visual information delivered near to a specific body part (e.g., the hand). Neuropsychological studies on right brain-damaged patients who present extinction and functional neuroimaging findings suggest the presence of similar bimodal systems in the human brain. Studies on the effects of tool-use on visual-tactile interaction revealed similar dynamic properties of the peripersonal space in monkeys and humans. The functional role of the multisensory coding of peripersonal space is, in our hypothesis, that of providing the brain with a sensori-motor interface for body-objects interactions. Thus, not only could it be involved in driving involuntary defensive movements in response to objects approaching the body, but it could also be dynamically maintained and updated as a function of manual voluntary actions performed towards objects in the reaching space. We tested the hypothesis of an involvement of peripersonal space in executing both voluntary and defensive actions. To these aims, we joined a well-known cross-modal congruency effect between visual and tactile information to a kinematic approach to demonstrate that voluntary grasping actions induce an on-line re-weighting of multisensory interactions in the peripersonal space. We additionally show that this modulation is hand-centred. We also used a motor evoked potentials approach to investigate which coordinate system is used to code the peripersonal space during motor preparation if real objects rapidly approach the body. Our findings provide direct evidence for automatic hand-centred coding of visual space and suggest that peripersonal space may also serve to represent rapidly approaching and potentially noxious objects, thus enabling the rapid selection of appropriate motor responses. These results clearly show that peripersonal space is a multisensori-motor interface that might have been selected through evolution for optimising the interactions between the body and the objects in the external world.
APA, Harvard, Vancouver, ISO, and other styles
3

Vissani, Matteo. "Multisensory features of peripersonal space representation: an analysis via neural network modelling." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2017.

Find full text
Abstract:
The peripersonal space (PPS) is the space immediately surrounding the body. It is coded in the brain in a multisensory, body part-centered (e.g. hand-centered, trunk-centered), modular fashion. This is supported by the existence of multisensory neurons (in fronto-parietal areas) with a tactile receptive field on a specific body part (hand, arm, trunk, etc.) and a visual/auditory receptive field surrounding the same body part. Recent behavioural results (Serino et al. Sci Rep 2015), obtained by using an audio-tactile paradigm, have further supported the existence of distinct PPS representations, each specific to a single body part (hand, trunk, face) and characterized by specific properties. That study has also evidenced that the PPS representations – although distinct – are not independent. In particular, the hand-PPS loses its properties and assumes those of the trunk-PPS when the hand is close to the trunk, as if the hand-PPS were encapsulated within the trunk-PPS. Similarly, the face-PPS appears to be enclosed within the trunk-PPS. It remains unclear how this interaction, which manifests behaviourally, can be implemented at a neural level by the modular organization of PPS representations. The aim of this Thesis is to propose a neural network model to aid the comprehension of the underlying neurocomputational mechanisms. The model includes three subnetworks devoted to the single PPS representations around the hand, the face and the trunk. Furthermore, interaction mechanisms – controlled by proprioceptive neurons – have been postulated among the subnetworks. The network is able to reproduce the behavioural data, explaining them in terms of neural properties and responses. Moreover, the network provides some novel predictions that can be tested in vivo. One of these predictions has been tested in this work by performing an ad-hoc behavioural experiment at the Laboratory of Cognitive Neuroscience (Campus Biotech, Geneva) under the supervision of the neuropsychologist Dr Serino.
APA, Harvard, Vancouver, ISO, and other styles
4

Lim, Hun Ki. "COMPUTATIONAL MODELING OF MULTISENSORY PROCESSING USING NETWORK OF SPIKING NEURONS." VCU Scholars Compass, 2011. http://scholarscompass.vcu.edu/etd/2523.

Full text
Abstract:
Multisensory processing in the brain underlies a wide variety of perceptual phenomena, but little is known about the underlying mechanisms of how multisensory neurons are generated and how the neurons integrate sensory information from environmental events. This lack of knowledge is due to the difficulty of biological experiments to manipulate and test the characteristics of multisensory processing. By using a computational model of multisensory processing this research seeks to provide insight into the mechanisms of multisensory processing. From a computational perspective, modeling of brain functions involves not only the computational model itself but also the conceptual definition of the brain functions, the analysis of correspondence between the model and the brain, and the generation of new biologically plausible insights and hypotheses. In this research, the multisensory processing is conceptually defined as the effect of multisensory convergence on the generation of multisensory neurons and their integrated response products, i.e., multisensory integration. Thus, the computational model is the implementation of the multisensory convergence and the simulation of the neural processing acting upon the convergence. Next, the most important step in the modeling is analysis of how well the model represents the target, i.e., brain function. It is also related to validation of the model. One of the intuitive and powerful ways of validating the model is to apply methods standard to neuroscience for analyzing the results obtained from the model. In addition, methods such as statistical and graph-theoretical analyses are used to confirm the similarity between the model and the brain. This research takes both approaches to provide analyses from many different perspectives. Finally, the model and its simulations provide insight into multisensory processing, generating plausible hypotheses, which will need to be confirmed by real experimentation.
APA, Harvard, Vancouver, ISO, and other styles
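As a generic illustration of the kind of spiking building block such a model rests on (and not the simulator developed in the dissertation), the sketch below drives a single leaky integrate-and-fire neuron with two independent Poisson spike trains standing in for converging visual and auditory afferents; every parameter value is an arbitrary assumption.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Leaky integrate-and-fire and simulation parameters (illustrative values only).
dt, t_max = 0.1e-3, 1.0                              # time step and duration (s)
tau_m = 20e-3                                        # membrane time constant (s)
v_rest, v_thresh, v_reset = -70e-3, -54e-3, -70e-3   # potentials (V)
w_syn = 1.5e-3                                       # depolarization per input spike (V)

def poisson_train(rate_hz, n_steps):
    """Boolean spike train: one Bernoulli draw per time step."""
    return rng.random(n_steps) < rate_hz * dt

def output_spike_count(visual_rate, auditory_rate):
    """Spikes fired by one LIF neuron receiving both converging input streams."""
    n_steps = int(t_max / dt)
    visual = poisson_train(visual_rate, n_steps)
    auditory = poisson_train(auditory_rate, n_steps)
    v, spikes = v_rest, 0
    for t in range(n_steps):
        v += (dt / tau_m) * (v_rest - v)         # leak toward rest
        v += w_syn * (visual[t] + auditory[t])   # convergent excitation
        if v >= v_thresh:                        # threshold crossing
            spikes += 1
            v = v_reset
    return spikes

v_only = output_spike_count(200.0, 0.0)
a_only = output_spike_count(0.0, 200.0)
combined = output_spike_count(200.0, 200.0)
print(v_only, a_only, combined)   # the combined condition typically fires most
```

With these toy parameters either unisensory stream alone rarely drives the membrane to threshold, while the combined drive does, which is the convergence effect the abstract refers to.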
5

Filimon, Flavia. "Multisensory and sensorimotor representations for action in human posterior parietal cortex investigated with functional MRI." Diss., [La Jolla] : University of California, San Diego, 2008. http://wwwlib.umi.com/cr/ucsd/fullcit?p3320178.

Full text
Abstract:
Thesis (Ph. D.)--University of California, San Diego, 2008.
Title from first page of PDF file (viewed September 24, 2008). Available via ProQuest Digital Dissertations. Vita. Includes bibliographical references (p. 133-135).
APA, Harvard, Vancouver, ISO, and other styles
6

Boumenir, Yasmine. "Spatial navigation in real and virtual urban environments: performance and multisensory processing of spatial information in sighted, visually impaired, late and congenitally blind individuals." PhD thesis, Université Montpellier II - Sciences et Techniques du Languedoc, 2011. http://tel.archives-ouvertes.fr/tel-00632703.

Full text
Abstract:
Previous studies investigating how humans build reliable spatial knowledge representations allowing them to find their way from one point to another in complex environments have been focused on comparing the relative importance of the two-dimensional visual geometry of routes and intersections, multi-dimensional data from direct exposure with the real world, or verbal symbols and/or instructions. This thesis sheds further light on the multi-dimensional and multi-sensorial aspects by investigating how the cognitive processing of spatial information derived from different sources of sensory and higher order input influences the performance of human observers who have to find their way from memory through complex and non-familiar real-world environments. Three experiments in large-scale urban environments of the real world, and in computer generated representations of these latter (Google Street View), were run to investigate the influence of prior exposure to 2D visual or tactile maps of an itinerary, compared with a single direct experience or verbal instructions, on navigation performances in sighted and/or visually deficient individuals, and in individuals temporarily deprived of vision. Performances were analyzed in terms of time from departure to destination, number of stops, number of wrong turns, and success rates. Potential strategies employed by individuals during navigation and mental mapping abilities were screened on the basis of questionnaires and drawing tests. Subjective levels of psychological stress (experiment 2) were measured to bring to the fore possible differences between men and women in this respect. The results of these experiments show that 2D visual maps, briefly explored prior to navigation, generate better navigation performances compared with poorly scaled virtual representations of a complex real-world environment (experiment 1), the best performances being produced by a single prior exposure to the real-world itinerary. However, brief familiarization with a reliably scaled virtual representation of a non-familiar real-world environment (Google Street View) not only generates optimal navigation in computer generated testing (virtual reality), but also produces better navigation performances when tested in the real-world environment and compared with prior exposure to 2D visual maps (experiment 2). Congenitally blind observers (experiment 3) who have to find their way from memory through a complex non-familiar urban environment perform swiftly and with considerable accuracy after exposure to a 2D tactile map of their itinerary. They are also able to draw a visual image of their itinerary on the basis of the 2D tactile map exposure. Other visually deficient or sighted but blindfolded individuals seem to have greater difficulty in finding their way again than congenitally blind people, regardless of the type of prior exposure to their test itinerary. The findings of this work here are discussed in the light of current hypotheses regarding the presumed intrinsic nature of human spatial representations, replaced herein within a context of working memory models. It is suggested that multi-dimensional temporary storage systems, capable of processing a multitude of sensory input in parallel and with a much larger general capacity than previously considered in terms of working memory limits, need to be taken into account for future research.
APA, Harvard, Vancouver, ISO, and other styles
7

Foxworthy, W. Alex. "Unique Features and Neuronal Properties in a Multisensory Cortex." VCU Scholars Compass, 2012. http://scholarscompass.vcu.edu/etd/388.

Full text
Abstract:
UNIQUE FEATURES OF ORGANIZATION AND NEURONAL PROPERTIES IN A MULTISENSORY CORTEX Multisensory processing is a ubiquitous sensory effect that underlies a wide variety of behaviors, such as detection and orientation, as well as perceptual phenomena from speech comprehension to binding. Such multisensory perceptual effects are presumed to be based in cortex, especially within areas known to contain multisensory neurons. However, unlike their lower-level/primary sensory cortical counterparts, little is known about the connectional, functional and laminar organization of higher-level multisensory cortex. Therefore, to examine the fundamental features of neuronal processing and organization in the multisensory cortical area of the posterior parietal cortex (PPr) of ferrets, the present experiments utilized a combination of immunohistological, neuroanatomical and multiple single-channel electrophysiological recording techniques. These experiments produced four main results. First, convergence of extrinsic inputs from unisensory cortical areas predominantly in layers 2-3 in PPr corresponded with the high proportion of multisensory neurons in those layers. This is consistent with multisensory responses in this higher-level multisensory region being driven by cortico-cortical, rather than thalamo-cortical connections. Second, the laminar organization of the PPr differed substantially from the pattern commonly observed in primary sensory cortices. The PPr has a reduced layer 4 compared to primary sensory cortices, which does not receive input from principal thalamic nuclei. Third, the distribution of unisensory and multisensory neurons and properties differs significantly by layer. Given the laminar-dependent input-output relationships, this suggests that unisensory and multisensory signals are processed in parallel as they pass through the circuitry of the PPr. Finally, specific functional properties of bimodal neurons differed significantly from those of their unisensory counterparts. Thus, despite their coextensive distribution within cortex, these results differentiate bimodal from unisensory neurons in ways that have never been examined before. Together these experiments represent the first combined anatomical-electrophysiological examination of the laminar organization of a multisensory cortex and the first systematic comparison of the functional properties of bimodal and unisensory neurons. These results are essential for understanding the neural bases of multisensory processing and carry significant implications for the accurate interpretation of macroscopic studies of multisensory brain regions (i.e. fMRI, EEG), because bimodal and unisensory neurons within a given neural region can no longer be assumed to respond similarly to a given external stimulus.
APA, Harvard, Vancouver, ISO, and other styles
8

Cojanu, Alexandru Ioan. "Intrinsic Features of the Multisensory Cortical Area LRSS in the Ferret." VCU Scholars Compass, 2010. http://scholarscompass.vcu.edu/etd/159.

Full text
Abstract:
Environmental events simultaneously transduced by more than one sensory modality underlie multisensory processing in the CNS. While most studies of multisensory processing examine functional effects, none have evaluated the influence of local or columnar circuitry. The goal of the present study is to examine local features of the ferret lateral rostral suprasylvian sulcus (LRSS), a multisensory cortex. Immunostaining revealed the cytoarchitectonic features of the LRSS: thick supragranular layers, a narrow layer IV, and moderately stained but differentiated infragranular layers. Golgi-Cox techniques were used with light microscopy and digital reconstruction to document neuronal morphology. Among the 90 reconstructed neurons, 4 distinct forms of pyramidal and 2 types of non-pyramidal neurons were found. Measurement of maximal dendritic spread indicates that a cortical column in the LRSS was 250.9 µm in diameter. These results describe local features of the LRSS upon which future experiments of intrinsic circuitry will be based.
APA, Harvard, Vancouver, ISO, and other styles
9

Ohla, Kathrin. "Neuronal mechanisms of food perception." Doctoral thesis, Humboldt-Universität zu Berlin, Lebenswissenschaftliche Fakultät, 2016. http://dx.doi.org/10.18452/13985.

Full text
Abstract:
What characterizes food and makes it so tempting? Sensory and hedonic information about food is conveyed by all senses, activated more or less simultaneously, having led to the notion that food perception is a multisensory experience. The smell, sight, touch or sound of a food can be experienced before ingestion and elicit expectations about the "taste" of that food based on previous encounters. It is, therefore, not surprising that these so-called pre-ingestive sensory experiences play a role in the formation of cravings and the elicitation of appetitive responses. Only during consumption are the chemical senses, smell, taste and oral touch and irritation, experienced in the oral cavity as food is masticated and swallowed, and they give rise to the overall experience commonly referred to as taste. While the term taste is, strictly speaking, incorrect as it does not refer to the gustatory perception, many languages including German lack an appropriate term for the holistic flavor experience arising from the food-induced stimulation of the chemical senses, gustation, olfaction and oral somatosensation in a minimalist interpretation, or, in a broader sense, of all our senses, including hearing and vision. Undoubtedly, feeding behavior is characterized by a complex interplay of perceptual, cognitive and metabolic processes, and research on the mechanisms by which these processes regulate food intake behavior is only in its infancy. In this thesis, I present a series of studies aiming to elucidate the cortical representations of the visual, gustatory and flavor components of food objects, along with evidence for the vulnerability of these representations to contextual information. Together, I reckon that an understanding of the psychophysiological mechanisms of the sensory and affective processing of food objects mediated by our senses, seeing, smelling, tasting, feeling and hearing, represents the perceptual basis of food-related decision making.
APA, Harvard, Vancouver, ISO, and other styles
10

Hagura, Nobuhiro. "Sensing hand movement from visual and kinesthetic information : neuronal correlates of multisensory processing in estimating the location of hand." Kyoto University, 2008. http://hdl.handle.net/2433/136457.

Full text
Abstract:
Kyoto University (京都大学). Doctoral dissertation (new-system course doctorate); degree: Doctor of Human and Environmental Studies, Graduate School of Human and Environmental Studies (Department of Human Coexistence), Kyoto University. Degree and record numbers: 甲第13921号, 人博第394号, UT51-2008-C837. Examining committee: Prof. Michikazu Matsumura (chief examiner), Prof. Jun Saiki, Prof. Shintaro Funahashi, and Eiichi Naito. Conferred under Article 4, Paragraph 1 of the Degree Regulations.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Multisensory neurons"

1

Enhancing performance for action and perception: Multisensory integration, neuroplasticity and neuroprosthetics. Amsterdam: Elsevier Science, 2011.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Maravita, Angelo, ed. Updates on multisensory perception: from neurons to cognition. Frontiers Media SA, 2012. http://dx.doi.org/10.3389/978-2-88919-058-4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Brogaard, Berit. Synesthetic Binding and the Reactivation Model of Memory. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780199688289.003.0007.

Full text
Abstract:
Despite the recent surge in research on, and interest in, synesthesia, the mechanism underlying this condition is still unknown. Feedforward mechanisms involving overlapping receptive fields of sensory neurons as well as feedback mechanisms involving a lack of signal disinhibition have been proposed. Here I show that a broad range of studies of developmental synesthesia indicate that the mechanism underlying the phenomenon may in some cases involve the reinstatement of brain activity in sensory or cognitive streams in a way that is similar to what happens during memory retrieval of semantically associated items. In the chapter’s final sections I look at the relevance of synesthesia research, given the memory model, to our understanding of multisensory perception and common mapping patterns.
APA, Harvard, Vancouver, ISO, and other styles
4

Leech-Wilkinson, Daniel. Musical shape and feeling. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780199351411.003.0028.

Full text
Abstract:
The concept of shape is widely used by musicians in talking and thinking about performance, yet the mechanisms that afford links between music and shape are little understood. Work on the psychodynamics of everyday life by Daniel Stern and on embodiment by Mark Johnson suggests relationships between the multiple dynamics of musical sound and the dynamics of feeling and motion. Recent work on multisensory and precognitive sensory perception and on the role of bimodal neurons in the sensorimotor system helps to explain how shape, as a percept representing changing quantity in any sensory mode, may be invoked by dynamic processes at many stages of perception and cognition. These processes enable ‘shape’ to do flexible and useful work for musicians needing to describe the quality of musical phenomena that are fundamental to everyday musical practice and yet too complex to calculate during performance.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Multisensory neurons"

1

Serafin, M., M. Mühlethaler, and P. P. Vidal. "Type I Medial Vestibular Neurons During Alertness, Following Adaptation, and During REM Sleep Episodes in the Head-Fixed Guinea-Pig." In Multisensory Control of Posture, 21–32. Boston, MA: Springer US, 1995. http://dx.doi.org/10.1007/978-1-4615-1931-7_4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Manikandan, N., S. Muruganand, Karuppasamy, and Senthil Subramanian. "Implantable Multisensory Microelectrode Biosensor for Revealing Neuron and Brain Functions." In Springer Proceedings in Physics, 763–69. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-319-97604-4_115.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Blini, Elvio, Alessandro Farnè, Claudio Brozzoli, and Fadila Hadj-Bouziane. "Close is better." In The World at Our Fingertips, 47–60. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780198851738.003.0003.

Full text
Abstract:
The neuroscientific approach to peripersonal space (PPS) stems directly from electrophysiological studies assessing the response properties of multisensory neurons in behaving non-human primates. This multisensory context fostered frameworks which i) stress the PPS role in actions (including defensive reactions) and affordances, which are optimally performed through multiple sensory convergence; and ii) largely make use of tasks that are multisensory in nature. Concurrently, however, studies on spatial attention reported proximity-related advantages in purely unisensory tasks. These advantages appear to share some key PPS features. Activations in brain areas reported to be multisensory, indeed, can also be found using unimodal (visual) paradigms. Overall, these findings point to the possibility that closer objects may benefit from being processed as events occurring in PPS. The dominant multisensory view of PPS should therefore be expanded accordingly, as perceptual advantages in PPS may be broader than previously thought.
APA, Harvard, Vancouver, ISO, and other styles
4

Henn, Volker. "Neuronal control of eye movements." In Multisensory Control of Movement, 6–26. Oxford University Press, 1993. http://dx.doi.org/10.1093/acprof:oso/9780198547853.003.0010.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Grantyn, Alexej, Alain Berthoz, Etienne Olivier, and Mireille Chat. "Control of gaze by tectal and reticular projection neurones." In Multisensory Control of Movement, 185–200. Oxford University Press, 1993. http://dx.doi.org/10.1093/acprof:oso/9780198547853.003.0109.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Stein, Barry E., M. Alex Meredith, and Mark T. Wallace. "Chapter 8 The visually responsive neuron and beyond: multisensory integration in cat and monkey." In Progress in Brain Research, 79–90. Elsevier, 1993. http://dx.doi.org/10.1016/s0079-6123(08)60359-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Multisensory neurons"

1

Lim, H. K., L. P. Keniston, J. H. Shin, C. Nguyen, M. A. Meredith, and K. J. Cios. "A Neuronal Multisensory Processing Simulator." In 2010 International Joint Conference on Neural Networks (IJCNN). IEEE, 2010. http://dx.doi.org/10.1109/ijcnn.2010.5596777.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Shen, Zheng, and Shuzhi Lin. "Fiber optic based multisensor to brain neurons in awake animals." In International Symposium on Biomedical Optics Europe '94, edited by Anna M. Verga Scheggi, Francesco Baldini, Pierre R. Coulet, and Otto S. Wolfbeis. SPIE, 1995. http://dx.doi.org/10.1117/12.201241.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Lim, Gi Hyun, Eurico Pedrosa, Filipe Amaral, Nuno Lau, Artur Pereira, Jose Luis Azevedo, and Bernardo Cunha. "Neural regularization jointly involving neurons and connections for robust image classification." In 2017 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI). IEEE, 2017. http://dx.doi.org/10.1109/mfi.2017.8170451.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Daneshzand, Mohammad, Devon Thomas, Khaled Elleithy, and Miad Faezipour. "An energy efficient wireless sensor network based on human neurons communication." In 2015 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI). IEEE, 2015. http://dx.doi.org/10.1109/mfi.2015.7295831.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

She, Xiwei, Yuxi Liao, Hongbao Li, Qiaosheng Zhang, Yiwen Wang, and Xiaoxiang Zheng. "Clustering and observation on neuron tuning property for brain machine interfaces." In 2014 International Conference on Multisensor Fusion and Information Integration for Intelligent Systems (MFI). IEEE, 2014. http://dx.doi.org/10.1109/mfi.2014.6997680.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Chen, Judy, Andrew A. Kostrzewski, Dai Hyun Kim, Yih-Shi Kuo, Gajendra D. Savant, and Barney B. Roberts. "Intelligent security system based on neuro-fuzzy multisensor data fusion." In SPIE's International Symposium on Optical Science, Engineering, and Instrumentation, edited by Bruno Bosacchi, David B. Fogel, and James C. Bezdek. SPIE, 1998. http://dx.doi.org/10.1117/12.326708.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Dichgans, Jakob, Jan Kallwies, and Hans-Joachim Wuensche. "Robust Vehicle Tracking with Monocular Vision using Convolutional Neuronal Networks." In 2020 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI). IEEE, 2020. http://dx.doi.org/10.1109/mfi49285.2020.9235213.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Doyle, Rory S., and Chris J. Harris. "Multisensor data fusion for obstacle tracking using neuro-fuzzy estimation algorithms." In SPIE's International Symposium on Optical Engineering and Photonics in Aerospace Sensing, edited by Nagaraj Nandhakumar. SPIE, 1994. http://dx.doi.org/10.1117/12.179031.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Jun, Su, Nataliia Roshchupkina, Oleksiy Roshchupkin, and Volodymyr Kochan. "Improving the adaptive neuro-fuzzy method to intellectualize multisensor signals processing." In 2018 International Conference on Development and Application Systems (DAS). IEEE, 2018. http://dx.doi.org/10.1109/daas.2018.8396097.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Nakajima, Kanako, Soichiro Morishita, Tomoki Kazawa, Ryohei Kanzaki, Hajime Asama, and Taketoshi Mishima. "Interpolation of the cross-sectional area of a premotor neuron in a silkworm moth brain using the ellipse model." In 2008 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI 2008). IEEE, 2008. http://dx.doi.org/10.1109/mfi.2008.4648112.

Full text
APA, Harvard, Vancouver, ISO, and other styles