Journal articles on the topic 'Integration of ERPs'

Consult the top 50 journal articles for your research on the topic 'Integration of ERPs.'


1

Talsma, Durk, and Marty G. Woldorff. "Selective Attention and Multisensory Integration: Multiple Phases of Effects on the Evoked Brain Activity." Journal of Cognitive Neuroscience 17, no. 7 (July 2005): 1098–114. http://dx.doi.org/10.1162/0898929054475172.

Abstract:
We used event-related potentials (ERPs) to evaluate the role of attention in the integration of visual and auditory features of multisensory objects. This was done by contrasting the ERPs to multisensory stimuli (AV) to the sum of the ERPs to the corresponding auditory-only (A) and visual-only (V) stimuli [i.e., AV vs. (A + V)]. V, A, and VA stimuli were presented in random order to the left and right hemispaces. Subjects attended to a designated side to detect infrequent target stimuli in either modality there. The focus of this report is on the ERPs to the standard (i.e., nontarget) stimuli. We used rapid variable stimulus onset asynchronies (350-650 msec) to mitigate anticipatory activity and included “no-stim” trials to estimate and remove ERP overlap from residual anticipatory processes and from adjacent stimuli in the sequence. Spatial attention effects on the processing of the unisensory stimuli consisted of a modulation of visual P1 and N1 components (at 90-130 msec and 160-200 msec, respectively) and of the auditory N1 and processing negativity (100-200 msec). Attended versus unattended multisensory ERPs elicited a combination of these effects. Multisensory integration effects consisted of an initial frontal positivity around 100 msec that was larger for attended stimuli. This was followed by three phases of centro-medially distributed effects of integration and/or attention beginning at around 160 msec, and peaking at 190 (scalp positivity), 250 (negativity), and 300-500 msec (positivity) after stimulus onset. These integration effects were larger in amplitude for attended than for unattended stimuli, providing neural evidence that attention can modulate multisensory-integration processes at multiple stages.
2

Fiorini, Linda, Marika Berchicci, Elena Mussini, Valentina Bianco, Stefania Lucia, and Francesco Di Russo. "Neural Basis of Anticipatory Multisensory Integration." Brain Sciences 11, no. 7 (June 25, 2021): 843. http://dx.doi.org/10.3390/brainsci11070843.

Abstract:
The brain is able to gather different sensory information to enhance salient event perception, thus yielding a unified perceptual experience of multisensory events. Multisensory integration has been widely studied, and the literature supports the hypothesis that it can occur across various stages of stimulus processing, including both bottom-up and top-down control. However, evidence on anticipatory multisensory integration occurring in the fore period preceding the presentation of the expected stimulus in passive tasks is missing. By means of event-related potentials (ERPs), it has been recently proposed that visual and auditory unimodal stimulations are preceded by sensory-specific readiness activities. Accordingly, in the present study, we tested the occurrence of multisensory integration in the endogenous anticipatory phase of sensory processing, combining visual and auditory stimuli during unimodal and multimodal passive ERP paradigms. Results showed that the modality-specific pre-stimulus ERP components (i.e., the auditory positivity -aP- and the visual negativity -vN-) started earlier and were larger in the multimodal stimulation compared with the sum of the ERPs elicited by the unimodal stimulations. The same amplitude effect was also present for the early auditory N1 and visual P1 components. This anticipatory multisensory effect seems to influence stimulus processing, boosting the magnitude of early stimulus processing. This paves the way for new perspectives on the neural basis of multisensory integration.
3

Samara, Tarek. "Impact of the Modularity of ERPs on the Information Systems Disintegration." International Journal of Strategic Information Technology and Applications 7, no. 1 (January 2016): 45–61. http://dx.doi.org/10.4018/ijsita.2016010104.

Abstract:
This paper aims to verify a possible impact of ERP modularity on information system disintegration. As an ERP package could be viewed as an integration indicator, it could thus measure the information system integration rate (ISIR). When an ERP is modular, clients who desire to be independent from an ERP vendor could easily buy other subsystems from a third party; and thus the information system integration rate would be low. On the contrary, when an ERP is not modular, clients cannot easily buy, due to some issues, other subsystems from a third party; and thus the ISIR would be high. Consequently, when the level of modularity, proposed by ERP vendors, is modified from low to high, or when clients change their information technology strategy from dependence on an ERP vendor to independence from this vendor, the information system integration rate could be affected. This paper tries to understand how these changes could impact the information system integration rate, and especially how they could provoke the disintegration of the information system.
4

Yang, Chin Lung, Charles A. Perfetti, and Franz Schmalhofer. "Less skilled comprehenders’ ERPs show sluggish word-to-text integration processes." Written Language and Literacy 8, no. 2 (December 31, 2005): 157–81. http://dx.doi.org/10.1075/wll.8.2.10yan.

Abstract:
We examined the word-to-text integration processes of less skilled comprehenders using ERPs recorded during text reading. The first sentence of each text controlled the accessibility of an antecedent referent for a critical word, which was the first content word of the second sentence. In the explicit condition, the critical word had occurred in the first sentence; in the paraphrase condition, a word or phrase similar in meaning had occurred in the first sentence; in the inference condition, a referent could have been established during the first sentence only if the reader made a forward inference; a baseline condition provided no obvious antecedent for the critical word. PCA, topographic results, and mean amplitude analyses converged on a picture of integration difficulty. Integration effects emerged in the expected mid-latency ranges for the explicit and inference conditions. The pattern of effects differed from that of skilled comprehenders, who, in another study, showed earlier integration effects for explicit and paraphrase conditions, but not reliably for the inference condition. Paraphrase effects were especially weak and late occurring for less skilled comprehenders. Compared with skilled comprehenders, less skilled comprehenders show slow word-to-text integration processes.
5

Samara, Tarek. "Impact of the Interoperability of ERPs on Information Systems Disintegration." International Journal of Strategic Information Technology and Applications 7, no. 4 (October 2016): 94–109. http://dx.doi.org/10.4018/ijsita.2016100103.

Abstract:
As an ERP package could be viewed as an integration indicator, it could thus measure the information system integration rate (ISIR). Therefore, the integration rate (IR) of an information system (IS), which is composed of only one ERP package, should be higher than the IR of another IS, which is composed of an ERP that is well (or not well) integrated with other subsystems. Some information technology strategies adopted by ERP vendors and/or by firms could affect, and/or they could be influenced by, the level of ERP interoperability. Whether or not a reliable interoperability can be developed and/or proposed by ERP vendors, firms could interoperate, within the framework of their IS, an ERP package with other subsystems, or they could implement this package as the only component of their IS. When strategies change, the level of ERP interoperability could be modified from reliable to unreliable and the ISIR could be affected. This paper tries to understand how these changes could impact the level of ERP interoperability in a manner that could provoke the IS disintegration.
6

King, Jonathan W., and Marta Kutas. "Who Did What and When? Using Word- and Clause-Level ERPs to Monitor Working Memory Usage in Reading." Journal of Cognitive Neuroscience 7, no. 3 (July 1995): 376–95. http://dx.doi.org/10.1162/jocn.1995.7.3.376.

Abstract:
ERPs were recorded from 24 undergraduates as they read sentences known to differ in syntactic complexity and working memory requirements, namely Object and Subject Relative sentences. Both the single-word and multiword analyses revealed significant differences due to sentence type, while multiword ERPs also showed that sentence type effects differed for Good and Poor comprehenders. At the single-word level, ERPs to both verbs in Object Relative sentences showed a left anterior negativity between 300 and 500 msec post-word-onset relative to those to Subject Relative verbs. At the multiword level, a slow frontal positivity characterized Subject Relative sentences, but was absent for Object Relatives. This slow positivity appears to index ease of processing or integration, and was more robust in Good than in Poor comprehenders.
7

Kadouri, Abdillah, and Anouar Ammi. "Proposition D’une Demarche De Gestion Pour Reduire Les Risques De Conduite Des Projets D’implementation Des ERP Logistique." European Scientific Journal, ESJ 12, no. 16 (June 28, 2016): 474. http://dx.doi.org/10.19044/esj.2016.v12n16p474.

Abstract:
The exercise of logistics activity is in fact subject to several constraints related to physical, financial and information flows; it also requires operational expense optimization, compliance with performance indicators and process control. Being structured in different integrated modules, ERPs provide operational, tactical and strategic planning features that allow the company to balance its supply and demand plans. Although project management practices are considered effective nowadays, many companies are still struggling to implement these ERPs and automate the various logistics processes: indeed, the situation of such projects becomes uncontrollable or they are abandoned. It is therefore appropriate to consider, in this article, the main risks that may arise during logistics integration in ERPs, and then to explain the most adequate governance to implement in order to reduce their impact.
8

Faccia, Alessio, and Pythagoras Petratos. "Blockchain, Enterprise Resource Planning (ERP) and Accounting Information Systems (AIS): Research on e-Procurement and System Integration." Applied Sciences 11, no. 15 (July 23, 2021): 6792. http://dx.doi.org/10.3390/app11156792.

Abstract:
Accounting information systems (AISs), the core module of any enterprise resource planning (ERP) system, are usually designed as centralised systems. Nowadays, the continuous development and applications of blockchain, or, more broadly, distributed ledger technology (DLT), can change the architecture, overcome and improve some limitations of centralised systems, most notably security and privacy. An increasing number of authors are suggesting the application of blockchain technologies in management, accounting and ERPs. This paper aims to examine the emerging literature on this field, and an immediate result is that blockchain applications can have significant benefits. The paper’s innovative contribution and considerable objective are to examine if blockchain can be successfully integrated with AIS and ERPs. We find that blockchain can facilitate integration at multiple levels and better serve various purposes such as auditing compliance. To demonstrate that, we analyse e-procurement systems and operations using case study research methodology. The findings suggest that DLT, decentralised finance (DeFI), and financial technology (FinTech) applications can facilitate integrating AISs and ERP systems and yield significant benefits for efficiency, productivity and security.
9

Carvalho, João. "Mainstream Party Strategies Towards Extreme Right Parties: The French 2007 and 2012 Presidential Elections." Government and Opposition 54, no. 2 (October 30, 2017): 365–86. http://dx.doi.org/10.1017/gov.2017.25.

Abstract:
The electoral success of extreme right parties (ERPs) has attracted a disproportionate number of studies. By contrast, research into the mainstream parties’ reactions to ERPs has engendered little interest. With few exceptions, the effects of the centre-right parties’ strategic options in electoral competitions with ERPs remain unexplored. To overcome this shortcoming, this investigation examines the strategies employed by the French centre-right party – Union pour un Mouvement Populaire (UMP) – against the Front National in the 2007 and the 2012 presidential elections by focusing on the topics of immigration and integration. This study suggests that the adoption of accommodating approaches in both elections was followed by distinct levels of success in 2007 and 2012. Drawing on a qualitative comparative analysis, this article explores three hypotheses in order to enhance understanding of the divergent effectiveness of the UMP’s accommodative approaches in the elections studied.
10

Stafura, Joseph Z., and Charles A. Perfetti. "Word-to-text integration: Message level and lexical level influences in ERPs." Neuropsychologia 64 (November 2014): 41–53. http://dx.doi.org/10.1016/j.neuropsychologia.2014.09.012.

11

Perri, Rinaldo Livio, Marika Berchicci, Valentina Bianco, Federico Quinzi, Donatella Spinelli, and Francesco Di Russo. "Awareness of perception and sensory–motor integration: ERPs from the anterior insula." Brain Structure and Function 223, no. 8 (July 5, 2018): 3577–92. http://dx.doi.org/10.1007/s00429-018-1709-y.

12

Coopmans, Cas W., and Mante S. Nieuwland. "Dissociating activation and integration of discourse referents: Evidence from ERPs and oscillations." Cortex 126 (May 2020): 83–106. http://dx.doi.org/10.1016/j.cortex.2019.12.028.

13

Föcker, Julia, and Brigitte Röder. "Event-Related Potentials Reveal Evidence for Late Integration of Emotional Prosody and Facial Expression in Dynamic Stimuli: An ERP Study." Multisensory Research 32, no. 6 (2019): 473–97. http://dx.doi.org/10.1163/22134808-20191332.

Abstract:
The aim of the present study was to test whether multisensory interactions of emotional signals are modulated by intermodal attention and emotional valence. Faces, voices and bimodal emotionally congruent or incongruent face–voice pairs were randomly presented. The EEG was recorded while participants were instructed to detect sad emotional expressions in either faces or voices while ignoring all stimuli with another emotional expression and sad stimuli of the task-irrelevant modality. Participants processed congruent sad face–voice pairs more efficiently than sad stimuli paired with an incongruent emotion, and performance was higher in congruent bimodal compared to unimodal trials, irrespective of which modality was task-relevant. Event-related potentials (ERPs) to congruent emotional face–voice pairs started to differ from ERPs to incongruent emotional face–voice pairs at 180 ms after stimulus onset: Irrespective of which modality was task-relevant, ERPs revealed a more pronounced positivity (180 ms post-stimulus) to emotionally congruent trials compared to emotionally incongruent trials if the angry emotion was presented in the attended modality. A larger negativity to incongruent compared to congruent trials was observed in the time range of 400–550 ms (N400) for all emotions (happy, neutral, angry), irrespective of whether faces or voices were task-relevant. These results suggest an automatic interaction of emotion-related information.
14

Teder-Sälejärvi, W. A., F. Di Russo, J. J. McDonald, and S. A. Hillyard. "Effects of Spatial Congruity on Audio-Visual Multimodal Integration." Journal of Cognitive Neuroscience 17, no. 9 (September 2005): 1396–409. http://dx.doi.org/10.1162/0898929054985383.

Abstract:
Spatial constraints on multisensory integration of auditory (A) and visual (V) stimuli were investigated in humans using behavioral and electrophysiological measures. The aim was to find out whether cross-modal interactions between A and V stimuli depend on their spatial congruity, as has been found for multisensory neurons in animal studies (Stein & Meredith, 1993). Randomized sequences of unimodal (A or V) and simultaneous bimodal (AV) stimuli were presented to right- or left-field locations while subjects made speeded responses to infrequent targets of greater intensity that occurred in either or both modalities. Behavioral responses to the bimodal stimuli were faster and more accurate than to the unimodal stimuli for both same-location and different-location AV pairings. The neural basis of this cross-modal facilitation was studied by comparing event-related potentials (ERPs) to the bimodal AV stimuli with the summed ERPs to the unimodal A and V stimuli. These comparisons revealed neural interactions localized to the ventral occipito-temporal cortex (at 190 msec) and to the superior temporal cortical areas (at 260 msec) for both same- and different-location AV pairings. In contrast, ERP interactions that differed according to spatial congruity included a phase and amplitude modulation of visual-evoked activity localized to the ventral occipito-temporal cortex at 100-400 msec and an amplitude modulation of activity localized to the superior temporal region at 260-280 msec. These results demonstrate overlapping but distinctive patterns of multisensory integration for spatially congruent and incongruent AV stimuli.
15

Singh, Moon Inder, and Mandeep Singh. "Development of emotion classifier based on absolute and differential attributes of averaged signals of visually stimulated event related potentials." Transactions of the Institute of Measurement and Control 42, no. 11 (March 3, 2020): 2057–67. http://dx.doi.org/10.1177/0142331220904889.

Abstract:
Analysis and study of abstract human relations have always posed a daunting challenge for technocrats engaged in the field of psychometric analysis. The study of emotion recognition is all the more demanding as it involves integration of the abstract phenomena of emotion causation and emotion appraisal through physiological and brain signals. This paper describes the classification of human emotions into four classes, namely: low valence high arousal (LVHA), high valence high arousal (HVHA), high valence low arousal (HVLA) and low valence low arousal (LVLA), using electroencephalogram (EEG) signals. The EEG signals have been collected on three EEG electrodes along the central line, viz. Fz, Cz and Pz. The analysis has been done on average event-related potentials (ERPs) and the difference of average ERPs using a Support Vector Machine (SVM) polynomial classifier. A four-class classification accuracy of 75% using average ERP attributes and an accuracy of 76.8% using difference of ERPs as attributes has been obtained. The accuracy obtained using differential average ERP attributes is better compared with already existing studies.
16

Grainger, Jonathan, Katherine J. Midgley, and Phillip J. Holcomb. "Trans-saccadic repetition priming: ERPs reveal on-line integration of information across words." Neuropsychologia 80 (January 2016): 201–11. http://dx.doi.org/10.1016/j.neuropsychologia.2015.11.025.

17

Forget, Joachim, Marco Buiatti, and Stanislas Dehaene. "Temporal Integration in Visual Word Recognition." Journal of Cognitive Neuroscience 22, no. 5 (May 2010): 1054–68. http://dx.doi.org/10.1162/jocn.2009.21300.

Abstract:
When two displays are presented in close temporal succession at the same location, how does the brain assign them to one versus two conscious percepts? We investigate this issue using a novel reading paradigm in which the odd and even letters of a string are presented alternatively at a variable rate. The results reveal a window of temporal integration during reading, with a nonlinear boundary around ∼80 msec of presentation duration. Below this limit, the oscillating stimulus is easily fused into a single percept, with all characteristics of normal reading. Above this limit, reading times are severely slowed and suffer from a word-length effect. ERPs indicate that, even at the fastest frequency, the oscillating stimulus elicits synchronous oscillations in posterior visual cortices, while late ERP components sensitive to lexical status vanish beyond the fusion threshold. Thus, the fusion/segregation dilemma is not resolved by retinal or subcortical filtering, but at cortical level by at most 300 msec. The results argue against theories of visual word recognition and letter binding that rely on temporal synchrony or other fine temporal codes.
18

Varga, Nicole L., and Patricia J. Bauer. "Using Event-related Potentials to Inform the Neurocognitive Processes Underlying Knowledge Extension through Memory Integration." Journal of Cognitive Neuroscience 29, no. 11 (November 2017): 1932–49. http://dx.doi.org/10.1162/jocn_a_01168.

Abstract:
To build a general knowledge base, it is imperative that individuals acquire, integrate, and further extend knowledge across experiences. For instance, in one episode an individual may learn that George Washington was the first president. In a separate episode they may then learn that Washington was the commander of the Continental Army. Integration of the information in memory may then support self-derivation of the new knowledge that the leader of the Continental Army was also the first president. Despite a considerable amount of fMRI research aimed at further elucidating the neuroanatomical regions supporting this ability, a consensus has yet to be reached with regard to the precise neurocognitive processes involved. In the present research, we capitalized on the high temporal resolution of event-related potentials (ERPs) to inform the time course of processes elicited during successful integration and further extension of new factual knowledge. Adults read novel, related stem facts and were tested for self-derivation of novel integration facts while ERPs were recorded. Consistent with current theoretical models, memory integration was first triggered by novelty detection within 400 msec of experience of a second, related stem fact. Two additional temporally staged encoding processes were then observed, interpreted to reflect (1) explicit meaning comprehension and (2) representation of the integrated relation in memory. During the test for self-derivation, a single ERP was elicited, which presumably reflected retrieval and/or recombination of previously integrated knowledge. Together, the present research provides important insight into the time course of neurocognitive processing associated with the formation of a knowledge base.
19

LV, Yuan, Jiuqing LIANG, and Chunyan GUO. "The Influence of Semantic Integration between Items on Associative Recognition: Evidence from ERPs Study." Acta Psychologica Sinica 47, no. 4 (2015): 427. http://dx.doi.org/10.3724/sp.j.1041.2015.00427.

20

Ullsperger, Markus, and D. Yves von Cramon. "The Role of Intact Frontostriatal Circuits in Error Processing." Journal of Cognitive Neuroscience 18, no. 4 (April 1, 2006): 651–64. http://dx.doi.org/10.1162/jocn.2006.18.4.651.

Abstract:
The basal ganglia have been suggested to play a key role in performance monitoring and resulting behavioral adjustments. It is assumed that the integration of prefrontal and motor cortico-striato-thalamo-cortical circuits provides contextual information to the motor anterior cingulate cortex regions to enable their function in performance monitoring. So far, direct evidence is missing, however. We addressed the involvement of frontostriatal circuits in performance monitoring by collecting event-related brain potentials (ERPs) and behavioral data in nine patients with focal basal ganglia lesions and seven patients with lateral prefrontal cortex lesions while they performed a flanker task. In both patient groups, the amplitude of the error-related negativity was reduced, diminishing the difference to the ERPs on correct responses. Despite these electrophysiological abnormalities, most of the patients were able to correct errors. Only in lateral prefrontal cortex patients whose lesions extended into the frontal white matter, disrupting the connections to the motor anterior cingulate cortex and the striatum, were error corrections severely impaired. In sum, the fronto-striato-thalamo-cortical circuits seem necessary for the generation of the error-related negativity, even when brain plasticity has resulted in behavioral compensation of the damage. Thus, error-related ERPs in patients provide a sensitive measure of the integrity of the performance-monitoring network.
21

Shen, Stanley, Jess R. Kerlin, Heather Bortfeld, and Antoine J. Shahin. "The Cross-Modal Suppressive Role of Visual Context on Speech Intelligibility: An ERP Study." Brain Sciences 10, no. 11 (November 2, 2020): 810. http://dx.doi.org/10.3390/brainsci10110810.

Abstract:
The efficacy of audiovisual (AV) integration is reflected in the degree of cross-modal suppression of the auditory event-related potentials (ERPs, P1-N1-P2), while stronger semantic encoding is reflected in enhanced late ERP negativities (e.g., N450). We hypothesized that increasing visual stimulus reliability should lead to more robust AV integration and enhanced semantic prediction, reflected in suppression of auditory ERPs and an enhanced N450, respectively. EEG was acquired while individuals watched and listened to clear and blurred videos of a speaker uttering intact or highly intelligible degraded (vocoded) words and made binary judgments about word meaning (animate or inanimate). We found that intact speech evoked larger negativity between 280–527 ms than vocoded speech, suggestive of more robust semantic prediction for the intact signal. For visual reliability, we found that greater cross-modal ERP suppression occurred for clear than blurred videos prior to sound onset and for the P2 ERP. Additionally, the later semantic-related negativity tended to be larger for clear than blurred videos. These results suggest that the cross-modal effect is largely confined to suppression of early auditory networks with weak effect on networks associated with semantic prediction. However, the semantic-related visual effect on the late negativity may have been tempered by the vocoded signal’s high reliability.
22

Obermeier, Christian, and Thomas C. Gunter. "Multisensory Integration: The Case of a Time Window of Gesture–Speech Integration." Journal of Cognitive Neuroscience 27, no. 2 (February 2015): 292–307. http://dx.doi.org/10.1162/jocn_a_00688.

Abstract:
This experiment investigates the integration of gesture and speech from a multisensory perspective. In a disambiguation paradigm, participants were presented with short videos of an actress uttering sentences like “She was impressed by the BALL, because the GAME/DANCE….” The ambiguous noun (BALL) was accompanied by an iconic gesture fragment containing information to disambiguate the noun toward its dominant or subordinate meaning. We used four different temporal alignments between noun and gesture fragment: the identification point (IP) of the noun was either prior to (+120 msec), synchronous with (0 msec), or lagging behind the end of the gesture fragment (−200 and −600 msec). ERPs triggered to the IP of the noun showed significant differences for the integration of dominant and subordinate gesture fragments in the −200, 0, and +120 msec conditions. The outcome of this integration was revealed at the target words. These data suggest a time window for direct semantic gesture–speech integration ranging from at least −200 up to +120 msec. Although the −600 msec condition did not show any signs of direct integration at the homonym, significant disambiguation was found at the target word. An explorative analysis suggested that gesture information was directly integrated at the verb, indicating that there are multiple positions in a sentence where direct gesture–speech integration takes place. Ultimately, this would implicate that in natural communication, where a gesture lasts for some time, several aspects of that gesture will have their specific and possibly distinct impact on different positions in an utterance.
23

KAAN, EDITH, JOSEPH KIRKHAM, and FRANK WIJNEN. "Prediction and integration in native and second-language processing of elliptical structures." Bilingualism: Language and Cognition 19, no. 1 (December 29, 2014): 1–18. http://dx.doi.org/10.1017/s1366728914000844.

Abstract:
According to recent views of L2-sentence processing, L2-speakers do not predict upcoming information to the same extent as do native speakers. To investigate L2-speakers’ predictive use and integration of syntactic information across clauses, we recorded event-related potentials (ERPs) from advanced L2-learners and native speakers while they read sentences in which the syntactic context did or did not allow noun-ellipsis (Lau, E., Stroud, C., Plesch, S., & Phillips, C. (2006). The role of structural prediction in rapid syntactic analysis. Brain and Language, 98, 74–88.) Both native and L2-speakers were sensitive to the context when integrating words after the potential ellipsis-site. However, native, but not L2-speakers, anticipated the ellipsis, as suggested by an ERP difference between elliptical and non-elliptical contexts preceding the potential ellipsis-site. In addition, L2-learners displayed a late frontal negativity for ungrammaticalities, suggesting differences in repair strategies or resources compared with native speakers.
24

Giard, M. H., and F. Peronnet. "Auditory-Visual Integration during Multimodal Object Recognition in Humans: A Behavioral and Electrophysiological Study." Journal of Cognitive Neuroscience 11, no. 5 (September 1999): 473–90. http://dx.doi.org/10.1162/089892999563544.

Full text
Abstract:
The aim of this study was (1) to provide behavioral evidence for multimodal feature integration in an object recognition task in humans and (2) to characterize the processing stages and the neural structures where multisensory interactions take place. Event-related potentials (ERPs) were recorded from 30 scalp electrodes while subjects performed a forced-choice reaction-time categorization task: At each trial, the subjects had to indicate which of two objects was presented by pressing one of two keys. The two objects were defined by auditory features alone, visual features alone, or the combination of auditory and visual features. Subjects were more accurate and rapid at identifying multimodal than unimodal objects. Spatiotemporal analysis of ERPs and scalp current densities revealed several auditory-visual interaction components temporally, spatially, and functionally distinct before 200 msec poststimulus. The effects observed were (1) in visual areas, new neural activities (as early as 40 msec poststimulus) and modulation (amplitude decrease) of the N185 wave to unimodal visual stimulus, (2) in the auditory cortex, modulation (amplitude increase) of subcomponents of the unimodal auditory N1 wave around 90 to 110 msec, and (3) new neural activity over the right fronto-temporal area (140 to 165 msec). Furthermore, when the subjects were separated into two groups according to their dominant modality to perform the task in unimodal conditions (shortest reaction time criteria), the integration effects were found to be similar for the two groups over the nonspecific fronto-temporal areas, but they clearly differed in the sensory-specific cortices, affecting predominantly the sensory areas of the nondominant modality. 
Taken together, the results indicate that multisensory integration is mediated by flexible, highly adaptive physiological processes that can take place very early in the sensory processing chain and operate in both sensory-specific and nonspecific cortical structures in different ways.
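The interaction analysis described above rests on an additive-model contrast: the multimodal response is compared against the sum of the two unimodal responses, i.e., AV − (A + V). A minimal sketch of that contrast is shown below; the array shapes and simulated data are hypothetical placeholders, not the study's actual recordings.

```python
import numpy as np

# Hypothetical trial-averaged ERPs: (n_channels, n_timepoints), in microvolts.
# 30 scalp electrodes as in the study; 300 samples is an assumed epoch length.
rng = np.random.default_rng(0)
n_channels, n_times = 30, 300
erp_av = rng.normal(size=(n_channels, n_times))  # audiovisual condition
erp_a = rng.normal(size=(n_channels, n_times))   # auditory-only condition
erp_v = rng.normal(size=(n_channels, n_times))   # visual-only condition

# Additive-model interaction term: AV - (A + V). Deviations from zero
# indicate cross-modal interactions beyond the summed unisensory responses.
interaction = erp_av - (erp_a + erp_v)

# Restrict to the pre-200-msec window examined in the study
# (assuming a 1000-Hz sampling rate with stimulus onset at sample 0).
early_window = interaction[:, :200]
print(early_window.shape)  # (30, 200)
```

In practice the contrast is computed per subject on trial-averaged epochs and then tested statistically across subjects; the sketch shows only the arithmetic of the contrast itself.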
APA, Harvard, Vancouver, ISO, and other styles
25

Longo, Matthew R., Jason Jiri Musil, and Patrick Haggard. "Visuo-tactile Integration in Personal Space." Journal of Cognitive Neuroscience 24, no. 3 (March 2012): 543–52. http://dx.doi.org/10.1162/jocn_a_00158.

Full text
Abstract:
Integration of information across sensory modalities is enhanced when stimuli in both modalities are in the same location. This “spatial rule” of multisensory integration has been primarily studied in humans by comparing stimuli located either in the same versus opposite side of the body midline or in peripersonal versus extrapersonal space, both of which involve large, categorical differences in spatial location. Here we used psychophysics and ERPs to investigate visuo-tactile integration in personal space (i.e., on the skin surface). We used the mirror box technique to manipulate the congruence of visual and tactile information about which finger on either the right or left hand had been touched. We observed clear compatibility effects for both visual and tactile judgments of which finger on the left hand had been touched. No such effects, however, were found for judgments about the right hand. ERP data showed a similar pattern. Amplitude of the vertex P200 potential was enhanced and that of the N2 was reduced for congruent visuo-tactile events on the left, but not the right, hand. Similarly, a later positivity over posterior parietal cortices (P300) showed contralateral enhancement for congruent visuo-tactile events on both the left and right hands. These results provide clear evidence for spatial constraints on visuo-tactile integration defined in personal space and also reveal clear lateralization of these effects. Furthermore, these results link these “ultraprecise” spatial constraints to processing in the right posterior parietal cortex.
APA, Harvard, Vancouver, ISO, and other styles
26

Regel, Stefanie, Thomas C. Gunter, and Angela D. Friederici. "Isn't It Ironic? An Electrophysiological Exploration of Figurative Language Processing." Journal of Cognitive Neuroscience 23, no. 2 (February 2011): 277–93. http://dx.doi.org/10.1162/jocn.2010.21411.

Full text
Abstract:
Although the neurocognitive processes underlying the comprehension of figurative language, especially metaphors and idioms, have been studied extensively, less is known about the processing of irony. In two experiments using event-related brain potentials (ERPs), we examined the types of cognitive processes involved in the comprehension of ironic and literal sentences and their relative time course. The experiments varied in modality (auditory, visual), task demands (comprehension task vs. passive reading), and probability of stimulus occurrence. ERPs consistently revealed a large late positivity (i.e., P600 component) in the absence of an N400 component for irony compared to equivalent literal sentences independent of modality. This P600 was shown to be unaffected by the factors task demands and probability of occurrence. Taken together, the findings suggest that the observed P600 is related to irony processing, and might be a reflection of pragmatic interpretation processes. During the comprehension of irony, no semantic integration difficulty arises (absence of N400), but late inferential processes appear to be necessary for understanding ironic meanings (presence of P600). This finding calls for a revision of current models of figurative language processing.
APA, Harvard, Vancouver, ISO, and other styles
27

Jessen, Anna, and Claudia Felser. "Reanalysing object gaps during non-native sentence processing: Evidence from ERPs." Second Language Research 35, no. 2 (January 20, 2018): 285–300. http://dx.doi.org/10.1177/0267658317753030.

Full text
Abstract:
The present study used event-related potentials (ERPs) to investigate how native (L1) German-speaking second-language (L2) learners of English process sentences containing filler-gap dependencies such as Bill liked the house (women) that Bob built some ornaments for __ at his workplace. Using an experimental design which allowed us to dissociate filler integration from reanalysis effects, we found that fillers which were implausible as direct objects of the embedded verb (e.g. built the women) elicited similar brain responses (an N400) in L1 and L2 speakers when the verb was encountered. This confirms findings from behavioral and eye-movement studies indicating that both L1 and L2 speakers immediately try to integrate a filler with a potential lexical licensor. L1/L2 differences were observed when subsequent sentence material signaled that the direct-object analysis was in fact incorrect, however. We found reanalysis effects, in the shape of a P600 for sentences containing fillers that were plausible direct objects, only for L2 speakers, but not for the L1 group. This supports previous findings suggesting that L2 comprehenders recover from an initially plausible first analysis less easily than L1 speakers.
APA, Harvard, Vancouver, ISO, and other styles
28

Ullah, Abrar, Rohaizat Bin Baharun, Muhammad Yasir, and Khalil MD Nor. "Enterprise Resource Planning Systems and User Performance in Higher Education Institutions of Pakistan." Journl of Applied Economics and Business Studies 4, no. 2 (June 30, 2020): 119–40. http://dx.doi.org/10.34260/jaebs.426.

Full text
Abstract:
The study is designed to assess the impact of Enterprise Resource Planning (ERP) systems on the performance perceived by users in Higher Education Institutions (HEIs) of Pakistan. This study sought to evaluate the effect of ERP quality factors including Information Quality (IQ), System Quality (SQ) and Service Quality (SRQ) on User Performance (UP) towards system usage, and the mediating role of Perceived Usefulness (PU), Perceived Ease of Use (PEOU) and User Satisfaction (US) between ERP quality factors and UP. Consequently, a framework is proposed by integrating the DeLone & McLean (D&M) Information Systems (IS) success model and the Technology Acceptance Model (TAM) to address the research questions. The study used a quantitative research methodology, and data were collected from 317 employees at eight universities in Pakistan. Structural Equation Modelling (SEM) was performed using SmartPLS, and the results indicated that SQ, IQ and SRQ have a direct and positive effect on UP. Additionally, PU and PEOU were found to have a mediating role between ERP quality factors and UP. Moreover, US mediated the relationship of SQ and IQ with UP, but failed to mediate the effect of SRQ on UP. Theoretically, the study contributes by integrating factors from the D&M IS success model and TAM to investigate the effect of ERP systems on UP through PU, PEOU and US. Practically, the study implies that practitioners need to put effort into providing a system which users perceive as useful and free of effort.
APA, Harvard, Vancouver, ISO, and other styles
29

Koelsch, Stefan, Tomas Gunter, Angela D. Friederici, and Erich Schröger. "Brain Indices of Music Processing: “Nonmusicians” are Musical." Journal of Cognitive Neuroscience 12, no. 3 (May 2000): 520–41. http://dx.doi.org/10.1162/089892900562183.

Full text
Abstract:
Little systematic research has examined event-related brain potentials (ERPs) elicited by the cognitive processing of music. The present study investigated how music processing is influenced by a preceding musical context, affected by the task relevance of unexpected chords, and influenced by the degree and the probability of violation. Four experiments were conducted in which “nonmusicians” listened to chord sequences, which infrequently contained a chord violating the sound expectancy of listeners. Integration of in-key chords into the musical context was reflected as a late negative-frontal deflection in the ERPs. This negative deflection declined towards the end of a chord sequence, reflecting normal buildup of musical context. Brain waves elicited by chords with unexpected notes revealed two ERP effects: an early right-hemispheric preponderant-anterior negativity, which was taken to reflect the violation of sound expectancy; and a late bilateral-frontal negativity. The late negativity was larger compared to in-key chords and taken to reflect the higher degree of integration needed for unexpected chords. The early right-anterior negativity (ERAN) was unaffected by the task relevance of unexpected chords. The amplitudes of both early and late negativities were found to be sensitive to the degree of musical expectancy induced by the preceding harmonic context, and to the probability for deviant acoustic events. The employed experimental design opens a new field for the investigation of music processing. Results strengthen the hypothesis of an implicit musical ability of the human brain.
APA, Harvard, Vancouver, ISO, and other styles
30

Gheza, Davide, Rudi De Raedt, Chris Baeken, and Gilles Pourtois. "Integration of reward with cost anticipation during performance monitoring revealed by ERPs and EEG spectral perturbations." NeuroImage 173 (June 2018): 153–64. http://dx.doi.org/10.1016/j.neuroimage.2018.02.049.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Zarzycka, Ewelina. "Management Accountant’s Role and Functions in the Enterprise Resource Planning Environment – Author’s Own Research into Enterprises in Poland." Comparative Economic Research. Central and Eastern Europe 15, no. 2 (September 17, 2012): 47–64. http://dx.doi.org/10.2478/v10103-012-0009-7.

Full text
Abstract:
ERP systems have revolutionized practically all aspects of business processes in enterprises. They help improve the processes by ensuring their integration. By ensuring integration between financial and non-financial data, an ERP package gives new quality to the management of enterprise value. All these features make ERPs particularly important for specialists responsible for providing management information and measuring performance of the company. This article seeks to answer whether the implementation of an ERP system has an effect on the management accountant’s tasks and functions, especially in the field of performance measurement and internal reporting. The ERP impacts on the controller’s role in the organization will be evaluated using field studies of six enterprises owned by multinational corporations. A further question addressed here is whether the controller’s functions and tasks remain unaffected.
APA, Harvard, Vancouver, ISO, and other styles
32

Pitts, Michael A., Antígona Martínez, and Steven A. Hillyard. "Visual Processing of Contour Patterns under Conditions of Inattentional Blindness." Journal of Cognitive Neuroscience 24, no. 2 (February 2012): 287–303. http://dx.doi.org/10.1162/jocn_a_00111.

Full text
Abstract:
An inattentional blindness paradigm was adapted to measure ERPs elicited by visual contour patterns that were or were not consciously perceived. In the first phase of the experiment, subjects performed an attentionally demanding task while task-irrelevant line segments formed square-shaped patterns or random configurations. After the square patterns had been presented 240 times, subjects' awareness of these patterns was assessed. More than half of all subjects, when queried, failed to notice the square patterns and were thus considered inattentionally blind during this first phase. In the second phase of the experiment, the task and stimuli were the same, but following this phase, all of the subjects reported having seen the patterns. ERPs recorded over the occipital pole differed in amplitude from 220 to 260 msec for the pattern stimuli compared with the random arrays regardless of whether subjects were aware of the patterns. At subsequent latencies (300–340 msec) however, ERPs over bilateral occipital-parietal areas differed between patterns and random arrays only when subjects were aware of the patterns. Finally, in a third phase of the experiment, subjects viewed the same stimuli, but the task was altered so that the patterns became task relevant. Here, the same two difference components were evident but were followed by a series of additional components that were absent in the first two phases of the experiment. We hypothesize that the ERP difference at 220–260 msec reflects neural activity associated with automatic contour integration whereas the difference at 300–340 msec reflects visual awareness, both of which are dissociable from task-related postperceptual processing.
APA, Harvard, Vancouver, ISO, and other styles
33

Biau, Emmanuel, and Salvador Soto-Faraco. "Beat gestures modulate auditory integration in speech perception." Seeing and Perceiving 25 (2012): 86. http://dx.doi.org/10.1163/187847612x647054.

Full text
Abstract:
In everyday life, people interact with each other through verbal communication but also through spontaneous beat gestures, which are a very important part of the paralinguistic context during face-to-face conversations. Nonetheless, their role and neural correlates have seldom been addressed. Here we investigate the time course of beat–speech integration under natural speech perception conditions. We measured event-related potentials to words pronounced with or without an accompanying beat gesture while participants attended to a political speech. When the speaker was in sight, words pronounced with a beat gesture elicited a positive shift in ERPs at early sensory stages (before 100 ms) and at a later time window coinciding with the auditory component P2. This result remained partially true even when the auditory signal was removed from the audiovisual signal. Interestingly, there was no difference from words pronounced without a gesture when participants listened to the same speech passage without viewing the speaker. We conclude that in a naturalistic speech context, beat gestures are integrated with speech early on in time and modulate the sensory/phonological levels of processing. We propose that these results suggest a possible role of beats as a highlighter, helping direct the listener's focus of attention to important information, rather than adding information per se. Beat gestures would thus modulate how verbal information is treated.
APA, Harvard, Vancouver, ISO, and other styles
34

Chaushi, Blerta Abazi, Zamir Dika, and Agron Chaushi. "IMPROVING INSTITUTIONAL SERVICES THROUGH UNIVERSITY ERP: A STUDY OF THE ACADEMIC PLANNING MODULE DEVELOPMENT AT SEEU." SEEU Review 12, no. 2 (December 20, 2017): 62–81. http://dx.doi.org/10.1515/seeur-2017-0018.

Full text
Abstract:
Enterprise Resource Planning (ERP) systems are used by universities to handle the academic services and business processes while providing an enhanced experience and services to students. This study begins with a background review of ERPs in higher education institutions, their impact on business processes through optimization, and the importance of critical success factors for easier implementation. Secondly, Academic Planning, a core part of the student module of ERPs for higher education, is analyzed in this paper from the prism of data integration, business process workflow, and process optimization. The issues that arise with development of a module are addressed through a case study at SEE-University. The data and business process workflows are based on an actual study of a real implementation at this institution. The findings from this study will serve other universities that are in the process of implementing an ERP, easing their development process and improving the efficiency of the services provided. The main contribution of this study is that it reduces the gap in literature and practice concerning issues and solutions that arise with the development of a new system, especially in higher education institutions, where such studies are scarce.
APA, Harvard, Vancouver, ISO, and other styles
35

Simon, David M., and Mark T. Wallace. "Integration and Temporal Processing of Asynchronous Audiovisual Speech." Journal of Cognitive Neuroscience 30, no. 3 (March 2018): 319–37. http://dx.doi.org/10.1162/jocn_a_01205.

Full text
Abstract:
Multisensory integration of visual mouth movements with auditory speech is known to offer substantial perceptual benefits, particularly under challenging (i.e., noisy) acoustic conditions. Previous work characterizing this process has found that ERPs to auditory speech are of shorter latency and smaller magnitude in the presence of visual speech. We sought to determine the dependency of these effects on the temporal relationship between the auditory and visual speech streams using EEG. We found that reductions in ERP latency and suppression of ERP amplitude are maximal when the visual signal precedes the auditory signal by a small interval and that increasing amounts of asynchrony reduce these effects in a continuous manner. Time–frequency analysis revealed that these effects are found primarily in the theta (4–8 Hz) and alpha (8–12 Hz) bands, with a central topography consistent with auditory generators. Theta effects also persisted in the lower portion of the band (3.5–5 Hz), and this late activity was more frontally distributed. Importantly, the magnitude of these late theta oscillations not only differed with the temporal characteristics of the stimuli but also served to predict participants' task performance. Our analysis thus reveals that suppression of single-trial brain responses by visual speech depends strongly on the temporal concordance of the auditory and visual inputs. It further illustrates that processes in the lower theta band, which we suggest as an index of incongruity processing, might serve to reflect the neural correlates of individual differences in multisensory temporal perception.
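The band-limited analysis described above (theta, 4–8 Hz; alpha, 8–12 Hz) can be illustrated with a minimal single-channel sketch; the sampling rate, simulated signal, and `band_power` helper are hypothetical illustrations, not the study's pipeline (which used full time–frequency decompositions of EEG epochs).

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500.0                        # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)   # 2 s of simulated single-trial EEG
rng = np.random.default_rng(1)
eeg = np.sin(2 * np.pi * 6 * t) + 0.5 * rng.normal(size=t.size)  # 6-Hz theta rhythm + noise

def band_power(signal, low, high, fs, order=4):
    """Mean power of `signal` in the [low, high] Hz band, estimated with a
    zero-phase Butterworth band-pass filter and the Hilbert envelope."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    envelope = np.abs(hilbert(filtfilt(b, a, signal)))
    return np.mean(envelope ** 2)

theta = band_power(eeg, 4.0, 8.0, fs)    # 4-8 Hz band examined in the study
alpha = band_power(eeg, 8.0, 12.0, fs)   # 8-12 Hz band
print(theta > alpha)  # the simulated 6-Hz rhythm dominates the theta band
```

In a real analysis this computation would be applied per trial and per channel, with the late lower-theta activity (3.5–5 Hz) extracted by narrowing the band edges accordingly.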
APA, Harvard, Vancouver, ISO, and other styles
36

Tse, Chun-Yu, Gabriele Gratton, Susan M. Garnsey, Michael A. Novak, and Monica Fabiani. "Read My Lips: Brain Dynamics Associated with Audiovisual Integration and Deviance Detection." Journal of Cognitive Neuroscience 27, no. 9 (September 2015): 1723–37. http://dx.doi.org/10.1162/jocn_a_00812.

Full text
Abstract:
Information from different modalities is initially processed in different brain areas, yet real-world perception often requires the integration of multisensory signals into a single percept. An example is the McGurk effect, in which people viewing a speaker whose lip movements do not match the utterance perceive the spoken sounds incorrectly, hearing them as more similar to those signaled by the visual rather than the auditory input. This indicates that audiovisual integration is important for generating the phoneme percept. Here we asked when and where the audiovisual integration process occurs, providing spatial and temporal boundaries for the processes generating phoneme perception. Specifically, we wanted to separate audiovisual integration from other processes, such as simple deviance detection. Building on previous work employing ERPs, we used an oddball paradigm in which task-irrelevant audiovisually deviant stimuli were embedded in strings of non-deviant stimuli. We also recorded the event-related optical signal, an imaging method combining spatial and temporal resolution, to investigate the time course and neuroanatomical substrate of audiovisual integration. We found that audiovisual deviants elicit a short duration response in the middle/superior temporal gyrus, whereas audiovisual integration elicits a more extended response involving also inferior frontal and occipital regions. Interactions between audiovisual integration and deviance detection processes were observed in the posterior/superior temporal gyrus. These data suggest that dynamic interactions between inferior frontal cortex and sensory regions play a significant role in multimodal integration.
APA, Harvard, Vancouver, ISO, and other styles
37

Akyürek, Elkan G., Nils Kappelmann, Marc Volkert, and Hedderik van Rijn. "What You See Is What You Remember: Visual Chunking by Temporal Integration Enhances Working Memory." Journal of Cognitive Neuroscience 29, no. 12 (December 2017): 2025–36. http://dx.doi.org/10.1162/jocn_a_01175.

Full text
Abstract:
Human memory benefits from information clustering, which can be accomplished by chunking. Chunking typically relies on expertise and strategy, and it is unknown whether perceptual clustering over time, through temporal integration, can also enhance working memory. The current study examined the attentional and working memory costs of temporal integration of successive target stimulus pairs embedded in rapid serial visual presentation. ERPs were measured as a function of behavioral reports: One target, two separate targets, or two targets reported as a single integrated target. N2pc amplitude, reflecting attentional processing, depended on the actual number of successive targets. The memory-related CDA and P3 components instead depended on the perceived number of targets irrespective of their actual succession. The report of two separate targets was associated with elevated amplitude, whereas integrated as well as actual single targets exhibited lower amplitude. Temporal integration thus provided an efficient means of processing sensory input, offloading working memory so that the features of two targets were consolidated and maintained at a cost similar to that of a single target.
APA, Harvard, Vancouver, ISO, and other styles
38

Revonsuo, Antti, Raija Portin, Kirsi Juottonen, and Juha O. Rinne. "Semantic Processing of Spoken Words in Alzheimer's Disease: An Electrophysiological Study." Journal of Cognitive Neuroscience 10, no. 3 (May 1998): 408–20. http://dx.doi.org/10.1162/089892998562726.

Full text
Abstract:
Patients suffering from Alzheimer's disease (AD) have severe difficulties in tasks requiring the use of semantic knowledge. The semantic deficits associated with AD have been extensively studied by using behavioral methods. Many of these studies indicate that AD patients have a general deficit in voluntary access to semantic representations but that the structure of the representations themselves might be preserved. However, several studies also provide evidence that to some extent semantic representations in AD may in fact be degraded. Recently, a few studies have utilized event-related brain potentials (ERPs) that are sensitive to semantic factors in order to investigate the electrophysiological correlates of the semantic impairment in AD. Interest has focused on the N400 component, which is known to reflect the on-line semantic processing of linguistic and pictorial stimuli. The results from studies of N400 changes in AD remain somewhat controversial: Some studies report normal or enlarged N400 components in AD, whereas others report diminished ones. One issue not reported in previous studies is whether word-elicited ERPs other than N400 remain normal in AD. In the present study our aim was to find out whether the ERP waveforms N1, P2, N400, and Late Positive Component (LPC) to semantically congruous and incongruous spoken words are abnormal in AD and whether such abnormalities specifically reflect deficiencies in semantic activation in AD. Auditory ERPs from 20 scalp sites to semantically congruous and incongruous final words in spoken sentences were recorded from 17 healthy elderly adults and 9 AD patients. The early ERP waveforms N1 and P2 were relatively normal for the AD patients, but the N400 and LPC effects (amplitude difference between congruous and incongruous conditions) were significantly reduced. We interpret the present results as showing that semantic-conceptual activation and other high-level integration processes are defective in AD.
However, a word congruity effect earlier than N400 (phonological mismatch negativity), reflecting lexical selection processes, is at least to some extent preserved in AD.
APA, Harvard, Vancouver, ISO, and other styles
39

Mahmood, Faisal, Abdul Zahid Khan, and Rahat Hussain Bokhari. "ERP issues and challenges: a research synthesis." Kybernetes 49, no. 3 (November 13, 2019): 629–59. http://dx.doi.org/10.1108/k-12-2018-0699.

Full text
Abstract:
Purpose Despite more than two decades of experience regarding the adoption and implementation of enterprise resource planning (ERP) systems in organizations, ERP success is questionable. Though ERP success stories are published in past research studies, the failure rate of ERP systems is relatively high. The purpose of this study was to find issues and challenges and assess the degree of criticality of these issues/challenges faced by organizations during ERP implementation. Design/methodology/approach For this research synthesis, a systematic literature review (SLR) was carried out considering research studies published within the time period 1999-2018. Three major steps, planning, conducting and reporting, were followed to proceed in this study. This study attempted to accomplish a critical review of 53 studies out of 103 identified, which were published in reputable journals, to synthesize the existing literature in the ERP domain. The studies selected have addressed different challenges/issues faced by small and large organizations during ERP implementation. Findings The research synthesis/SLR led to the identification of 31 issues/challenges, which may be termed as most critical based on their occurrence/frequency in the included studies. The topmost issues/challenges amongst the 31 identified include top management approach, change management, training and development, effective communication, system integration, business process reengineering, consultants/vendors selection, project management, project team formation, team empowerment/skilled people and data conversion/migration. However, other issues/challenges identified, such as security risks/data security, cloud awareness, functionality limitations, service level agreements and subscription expenses, are more related to cloud ERPs.
Originality/value The current study is unique in its kind, focusing on the issues and challenges faced by organizations during the implementation of ERP projects. Moreover, this study contributes to understanding and further analyzing the management capabilities needed to develop remedial measures when planning the implementation of an enterprise system, prior to the occurrence of the issues and challenges ahead. The study also led to understanding and explaining socio-technical issues and their severity.
APA, Harvard, Vancouver, ISO, and other styles
40

Coco, Moreno I., Antje Nuthmann, and Olaf Dimigen. "Fixation-related Brain Potentials during Semantic Integration of Object–Scene Information." Journal of Cognitive Neuroscience 32, no. 4 (April 2020): 571–89. http://dx.doi.org/10.1162/jocn_a_01504.

Full text
Abstract:
In vision science, a particularly controversial topic is whether and how quickly the semantic information about objects is available outside foveal vision. Here, we aimed at contributing to this debate by coregistering eye movements and EEG while participants viewed photographs of indoor scenes that contained a semantically consistent or inconsistent target object. Linear deconvolution modeling was used to analyze the ERPs evoked by scene onset as well as the fixation-related potentials (FRPs) elicited by the fixation on the target object ( t) and by the preceding fixation ( t − 1). Object–scene consistency did not influence the probability of immediate target fixation or the ERP evoked by scene onset, which suggests that object–scene semantics was not accessed immediately. However, during the subsequent scene exploration, inconsistent objects were prioritized over consistent objects in extrafoveal vision (i.e., looked at earlier) and were more effortful to process in foveal vision (i.e., looked at longer). In FRPs, we demonstrate a fixation-related N300/N400 effect, whereby inconsistent objects elicit a larger frontocentral negativity than consistent objects. In line with the behavioral findings, this effect was already seen in FRPs aligned to the pretarget fixation t − 1 and persisted throughout fixation t, indicating that the extraction of object semantics can already begin in extrafoveal vision. Taken together, the results emphasize the usefulness of combined EEG/eye movement recordings for understanding the mechanisms of object–scene integration during natural viewing.
APA, Harvard, Vancouver, ISO, and other styles
41

Press, Clare, Cecilia Heyes, Patrick Haggard, and Martin Eimer. "Visuotactile Learning and Body Representation: An ERP Study with Rubber Hands and Rubber Objects." Journal of Cognitive Neuroscience 20, no. 2 (February 2008): 312–23. http://dx.doi.org/10.1162/jocn.2008.20022.

Full text
Abstract:
We studied how the integration of seen and felt tactile stimulation modulates somatosensory processing, and investigated whether visuotactile integration depends on temporal contiguity of stimulation, and its coherence with a preexisting body representation. During training, participants viewed a rubber hand or a rubber object that was tapped either synchronously with stimulation of their own hand, or in an uncorrelated fashion. In a subsequent test phase, somatosensory event-related potentials (ERPs) were recorded to tactile stimulation of the left or right hand, to assess how tactile processing was affected by previous visuotactile experience during training. An enhanced somatosensory N140 component was elicited after synchronous, compared with uncorrelated, visuotactile training, irrespective of whether participants viewed a rubber hand or rubber object. This early effect of visuotactile integration on somatosensory processing is interpreted as a candidate electrophysiological correlate of the rubber hand illusion that is determined by temporal contiguity, but not by preexisting body representations. ERP modulations were observed beyond 200 msec poststimulus, suggesting an attentional bias induced by visuotactile training. These late modulations were absent when the stimulation of a rubber hand and the participant's own hand was uncorrelated during training, suggesting that preexisting body representations may affect later stages of tactile processing.
APA, Harvard, Vancouver, ISO, and other styles
42

Sirisapsombat, Vachrintr, Phakkharawat Sittiprapaporn, Chaiyavat Chaiyasut, Sasithorn Sirilun, Roungsan Chaisricharoen, and Thamthiwat Nararatwanchai. "Source localization of tone perception in alcoholic brain indexed by standardized low-resolution electromagnetic tomography." IAES International Journal of Artificial Intelligence (IJ-AI) 9, no. 3 (September 1, 2020): 561. http://dx.doi.org/10.11591/ijai.v9.i3.pp561-568.

Full text
Abstract:
Alcohol consumption is known to be associated with several diseases, injuries, and social problems. Long-term, excessive alcohol exposure can lead to liver cirrhosis and pancreatitis. After repeated alcohol exposure, alcohol dependence can develop as a cluster of behavioral, cognitive, and physiological phenomena in the individual. Previous studies indicated that although the left hemisphere is selectively employed for processing linguistic information irrespective of acoustic cues or the subtype of phonological unit, the right hemisphere is employed for prosody-specific cues. These previous studies provided the impetus for investigating differences in tone perception and temporal integration in tonal-language speakers with long-term, excessive alcohol exposure, such as the Thai speakers in the present study. The present study used both the auditory mismatch negativity (MMN) component of event-related potentials (ERPs) and standardized low-resolution electromagnetic tomography (sLORETA) to measure the degree of cortical activation and to localize the brain areas contributing to the scalp-recorded auditory MMN during a passive oddball paradigm. Ten healthy right-handed adults participated in this study. The findings showed that both [kha:] (mid tone) and [khá:] (high tone) perception elicited a strong MMN between 215 and 284 ms with reference to the standard-stimulus ERPs. Source localization was obtained in the middle temporal gyrus of the right hemisphere for both mid-tone and high-tone perception. Automatic detection of tone perception in the alcoholic tonal-language brain is a useful index of language-universal auditory memory traces.
APA, Harvard, Vancouver, ISO, and other styles
43

Hagiwara, Hiroko, Takahiro Soshi, Masami Ishihara, and Kuniyasu Imanaka. "A Topographical Study on the Event-related Potential Correlates of Scrambled Word Order in Japanese Complex Sentences." Journal of Cognitive Neuroscience 19, no. 2 (February 2007): 175–93. http://dx.doi.org/10.1162/jocn.2007.19.2.175.

Full text
Abstract:
One of the most fundamental and universal properties of human language is a phenomenon called displacement. In the present study, we used multichannel event-related potentials (ERPs) to identify the nature of this phenomenon in Japanese, a subject-object-verb (SOV) language of relatively free word order. The ERPs of sentences of canonical word order (CC) were compared with those of non-canonical word order in two types of Japanese complex sentences; namely, in those which can be described as being in a middle-scrambled condition (MSC) and in those in a long-scrambled condition (LSC). The sustained anterior negativity (SAN) and the P600 in the pregap position were observed in the LSC, compared to the CC, and they are consistent with previous findings. The SAN, exhibiting a tripartite nature in morphology and scalp distribution, mainly reflected a storage cost of scrambled elements in sentence comprehension. The subsequent P600 had a left fronto-temporal maximum, distinguished from the posterior P600 taken as reflecting thematic role assignment in previous related studies. It is argued that the P600 in the present study reflects a cost of structural integration that depends heavily on case-marker information. A compositional interpretation of sentence meanings was also observed, reflected in an anterior negativity at the postgap verbal position, which cannot be differentiated at the pregap verbal position in languages of subject-verb-object (SVO) word order.
APA, Harvard, Vancouver, ISO, and other styles
44

Brink, Daniëlle van den, and Peter Hagoort. "The Influence of Semantic and Syntactic Context Constraints on Lexical Selection and Integration in Spoken-Word Comprehension as Revealed by ERPs." Journal of Cognitive Neuroscience 16, no. 6 (July 2004): 1068–84. http://dx.doi.org/10.1162/0898929041502670.

Full text
Abstract:
An event-related brain potential experiment was carried out to investigate the influence of semantic and syntactic context constraints on lexical selection and integration in spoken-word comprehension. Subjects were presented with constraining spoken sentences that contained a critical word that was either (a) congruent, (b) semantically and syntactically incongruent, but beginning with the same initial phonemes as the congruent critical word, or (c) semantically and syntactically incongruent, beginning with phonemes that differed from the congruent critical word. Relative to the congruent condition, an N200 effect reflecting difficulty in the lexical selection process was obtained in the semantically and syntactically incongruent condition where word onset differed from that of the congruent critical word. Both incongruent conditions elicited a large N400 followed by a left anterior negativity (LAN) time-locked to the moment of word category violation and a P600 effect. These results would best fit within a cascaded model of spoken-word processing, proclaiming an optimal use of contextual information during spoken-word identification by allowing for semantic and syntactic processing to take place in parallel after bottom-up activation of a set of candidates, and lexical integration to proceed with a limited number of candidates that still match the acoustic input.
APA, Harvard, Vancouver, ISO, and other styles
45

Ho, Hao Tam, Erich Schröger, and Sonja A. Kotz. "Selective Attention Modulates Early Human Evoked Potentials during Emotional Face–Voice Processing." Journal of Cognitive Neuroscience 27, no. 4 (April 2015): 798–818. http://dx.doi.org/10.1162/jocn_a_00734.

Full text
Abstract:
Recent findings on multisensory integration suggest that selective attention influences cross-sensory interactions from an early processing stage. Yet, in the field of emotional face–voice integration, the hypothesis prevails that facial and vocal emotional information interacts preattentively. Using ERPs, we investigated the influence of selective attention on the perception of congruent versus incongruent combinations of neutral and angry facial and vocal expressions. Attention was manipulated via four tasks that directed participants to (i) the facial expression, (ii) the vocal expression, (iii) the emotional congruence between the face and the voice, and (iv) the synchrony between lip movement and speech onset. Our results revealed early interactions between facial and vocal emotional expressions, manifested as modulations of the auditory N1 and P2 amplitude by incongruent emotional face–voice combinations. Although audiovisual emotional interactions within the N1 time window were affected by the attentional manipulations, interactions within the P2 modulation showed no such attentional influence. Thus, we propose that the N1 and P2 are functionally dissociated in terms of emotional face–voice processing and discuss evidence in support of the notion that the N1 is associated with cross-sensory prediction, whereas the P2 relates to the derivation of an emotional percept. Essentially, our findings put the integration of facial and vocal emotional expressions into a new perspective—one that regards the integration process as a composite of multiple, possibly independent subprocesses, some of which are susceptible to attentional modulation, whereas others may be influenced by additional factors.
APA, Harvard, Vancouver, ISO, and other styles
46

Willems, Roel M., Aslı Özyürek, and Peter Hagoort. "Seeing and Hearing Meaning: ERP and fMRI Evidence of Word versus Picture Integration into a Sentence Context." Journal of Cognitive Neuroscience 20, no. 7 (July 2008): 1235–49. http://dx.doi.org/10.1162/jocn.2008.20085.

Full text
Abstract:
Understanding language always occurs within a situational context and, therefore, often implies combining streams of information from different domains and modalities. One such combination is that of spoken language and visual information, which are perceived together in a variety of ways during everyday communication. Here we investigate whether and how words and pictures differ in terms of their neural correlates when they are integrated into a previously built-up sentence context. This is assessed in two experiments looking at the time course (measuring event-related potentials, ERPs) and the locus (using functional magnetic resonance imaging, fMRI) of this integration process. We manipulated the ease of semantic integration of word and/or picture to a previous sentence context to increase the semantic load of processing. In the ERP study, an increased semantic load led to an N400 effect which was similar for pictures and words in terms of latency and amplitude. In the fMRI study, we found overlapping activations to both picture and word integration in the left inferior frontal cortex. Specific activations for the integration of a word were observed in the left superior temporal cortex. We conclude that despite obvious differences in representational format, semantic information coming from pictures and words is integrated into a sentence context in similar ways in the brain. This study adds to the growing insight that the language system incorporates (semantic) information coming from linguistic and extralinguistic domains with the same neural time course and by recruitment of overlapping brain areas.
APA, Harvard, Vancouver, ISO, and other styles
47

Habets, Boukje, Sotaro Kita, Zeshu Shao, Asli Özyurek, and Peter Hagoort. "The Role of Synchrony and Ambiguity in Speech–Gesture Integration during Comprehension." Journal of Cognitive Neuroscience 23, no. 8 (August 2011): 1845–54. http://dx.doi.org/10.1162/jocn.2010.21462.

Full text
Abstract:
During face-to-face communication, one does not only hear speech but also see a speaker's communicative hand movements. It has been shown that such hand gestures play an important role in communication where the two modalities influence each other's interpretation. A gesture typically temporally overlaps with coexpressive speech, but the gesture is often initiated before (but not after) the coexpressive speech. The present ERP study investigated what degree of asynchrony between speech and gesture onsets is optimal for semantic integration of the concurrent gesture and speech. Videos of a person gesturing were combined with speech segments that were either semantically congruent or incongruent with the gesture. Although gesture and speech always overlapped in time, gesture and speech were presented with three different degrees of asynchrony. In the SOA 0 condition, the gesture onset and the speech onset were simultaneous. In the SOA 160 and 360 conditions, speech was delayed by 160 and 360 msec, respectively. ERPs time locked to speech onset showed a significant difference between semantically congruent versus incongruent gesture–speech combinations on the N400 for the SOA 0 and 160 conditions. No significant difference was found for the SOA 360 condition. These results imply that speech and gesture are integrated most efficiently when the differences in onsets do not exceed a certain time span, because iconic gestures need speech to be disambiguated in a way relevant to the speech context.
APA, Harvard, Vancouver, ISO, and other styles
48

Nickels, Stefanie, and Karsten Steinhauer. "Prosody–syntax integration in a second language: Contrasting event-related potentials from German and Chinese learners of English using linear mixed effect models." Second Language Research 34, no. 1 (July 8, 2016): 9–37. http://dx.doi.org/10.1177/0267658316649998.

Full text
Abstract:
The role of prosodic information in sentence processing is not usually addressed in second language (L2) instruction, and neurocognitive studies on prosody–syntax interactions are rare. Here we compare event-related potentials (ERP) of Chinese and German learners of English L2 to those of native English speakers and show how first language (L1) background and L2 proficiency influence the online processing of prosody-induced garden-path effects. Unlike most previous ERP studies, we use linear mixed effect models to analyse L2 proficiency as a continuous (rather than categorical) variable. Our results show that both L1 background and language proficiency shape the integration of prosodic and syntactic cues, and that, importantly, even English native speakers’ ERPs were influenced by their English proficiency level. Lastly, this article also addresses why coverage of prosody in L2 classroom instruction may be beneficial.
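The analysis strategy described in this abstract (treating L2 proficiency as a continuous predictor in a linear mixed effect model rather than splitting learners into groups) can be sketched as follows. The data, effect sizes, and column names are simulated and purely illustrative; this is not the authors' analysis pipeline.

```python
# A hedged sketch: single-trial ERP amplitude modeled with a continuous
# proficiency covariate, a condition factor, their interaction, and
# by-subject random intercepts, via statsmodels MixedLM.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subjects, n_trials = 30, 40

rows = []
for subj in range(n_subjects):
    proficiency = rng.uniform(0.0, 1.0)        # continuous, not categorical
    subj_offset = rng.normal(0.0, 1.0)         # subject random intercept
    for _ in range(n_trials):
        condition = int(rng.integers(0, 2))    # 0 = control, 1 = garden path
        # Assumed pattern: the garden-path effect shrinks as proficiency rises
        amp = (2.0 * condition
               - 1.5 * condition * proficiency
               + subj_offset
               + rng.normal(0.0, 1.0))
        rows.append(dict(subject=subj, proficiency=proficiency,
                         condition=condition, amplitude=amp))

df = pd.DataFrame(rows)

# Proficiency enters as a continuous covariate interacting with condition
model = smf.mixedlm("amplitude ~ condition * proficiency",
                    df, groups=df["subject"])
fit = model.fit()
print(fit.params["condition:proficiency"])  # expected negative under this simulation
```

The key design choice mirrored here is that the condition-by-proficiency interaction is estimated on the full continuum, avoiding the information loss of a median split into "high" and "low" proficiency groups.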
APA, Harvard, Vancouver, ISO, and other styles
49

Cowell, Jason M., and Jean Decety. "Precursors to morality in development as a complex interplay between neural, socioenvironmental, and behavioral facets." Proceedings of the National Academy of Sciences 112, no. 41 (August 31, 2015): 12657–62. http://dx.doi.org/10.1073/pnas.1508832112.

Full text
Abstract:
The nature and underpinnings of infants’ seemingly complex, third-party, social evaluations remain highly contentious. Theoretical perspectives oscillate between rich and lean interpretations of the same expressed preferences. Although some argue that infants and toddlers possess a “moral sense” based on core knowledge of the social world, others suggest that social evaluations are hierarchical in nature and the product of an integration of rudimentary general processes such as attention allocation and approach and avoidance. Moreover, these biologically prepared minds interact in social environments that include significant variation, which are likely to impact early social evaluations and behavior. The present study examined the neural underpinnings of and precursors to moral sensitivity in infants and toddlers (n = 73, ages 12–24 mo) through a series of interwoven measures, combining multiple levels of analysis including electrophysiological, eye-tracking, behavioral, and socioenvironmental. Continuous EEG and time-locked event-related potentials (ERPs) and gaze fixation were recorded while children watched characters engaging in prosocial and antisocial actions in two different tasks. All children demonstrated a neural differentiation in both spectral EEG power density modulations and time-locked ERPs when perceiving prosocial or antisocial agents. Time-locked neural differences predicted children’s preference for prosocial characters and were influenced by parental values regarding justice and fairness. Overall, this investigation casts light on the fundamental nature of moral cognition, including its underpinnings in general processes such as attention and approach–withdrawal, providing plausible mechanisms of early change and a foundation for forward movement in the field of developmental social neuroscience.
APA, Harvard, Vancouver, ISO, and other styles
50

Stuckenberg, Maria V., Erich Schröger, and Andreas Widmann. "Modulation of early auditory processing by visual information: Prediction or bimodal integration?" Attention, Perception, & Psychophysics 83, no. 4 (January 27, 2021): 1538–51. http://dx.doi.org/10.3758/s13414-021-02240-1.

Full text
Abstract:
What happens if a visual cue misleads auditory expectations? Previous studies revealed an early visuo–auditory incongruency effect, the so-called incongruency response (IR) of the auditory event-related brain potential (ERP), occurring 100 ms after onset of a sound that is incongruent with the preceding visual cue. So far, this effect has been ascribed to reflect the mismatch between auditory sensory expectation activated by visual predictive information and the actual sensory input. Thus, an IR should be confined to an asynchronous presentation of visual cue and sound. Alternatively, one could argue that frequently presented congruent visual-cue–sound combinations are integrated into a bimodal representation whereby violation of the visual–auditory relationship results in a bimodal feature mismatch (the IR should be obtained with asynchronous and with synchronous presentation). In an asynchronous condition, either a high-pitched or a low-pitched sound was preceded by a visual note symbol presented above or below a fixation cross (90% congruent; 10% incongruent), while in a synchronous condition, both were presented simultaneously. High-pitched and low-pitched sounds were presented with different probabilities (83% vs. 17%) to form a strong association between bimodal stimuli. In both conditions, tones with pitch incongruent with the location of the note symbols elicited incongruency effects in the N2 and P3 ERPs; however, the IR was only elicited in the asynchronous condition. This finding supports the sensorial prediction error hypothesis stating that the amplitude of the auditory ERP 100 ms after sound onset is enhanced in response to unexpected compared with expected but otherwise identical sounds.
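The basic contrast underlying an effect like the IR, an incongruent-minus-congruent difference wave inspected in an early time window, can be sketched with simulated epochs. The timing, amplitudes, and window below are assumptions for illustration, not values from the study.

```python
# Illustrative difference-wave computation: mean ERP for incongruent minus
# congruent trials, averaged in a window around 100 ms post-onset.
import numpy as np

rng = np.random.default_rng(2)
sfreq = 500                                  # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.4, 1 / sfreq)      # epoch from -100 to +400 ms
n_trials = 100

def make_epochs(gain):
    """Simulate trials with an N1-like negative deflection near 100 ms."""
    base = -np.exp(-((times - 0.1) ** 2) / (2 * 0.02 ** 2))
    return gain * base + 0.3 * rng.standard_normal((n_trials, times.size))

congruent = make_epochs(gain=1.0)
incongruent = make_epochs(gain=1.5)          # assumed enhanced early response

# Difference wave and its mean amplitude in an 80-120 ms window
diff_wave = incongruent.mean(axis=0) - congruent.mean(axis=0)
window = (times >= 0.08) & (times <= 0.12)
ir_amplitude = diff_wave[window].mean()
print(ir_amplitude)  # negative under this simulation: enhanced negativity
```

In a real analysis the same subtraction would be computed per participant and condition (synchronous vs. asynchronous cue–sound presentation) before statistical testing.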
APA, Harvard, Vancouver, ISO, and other styles
