
Journal articles on the topic 'Facial expression processing'



Consult the top 50 journal articles for your research on the topic 'Facial expression processing.'


Where available in the metadata, you can also download the full text of a publication as a PDF and read its abstract online.


1

Young, A. W. "Facial expression processing after amygdalotomy." Neurocase 3, no. 4 (1997): 267j–274. http://dx.doi.org/10.1093/neucas/3.4.267-j.

2

Young, Andrew W., Deborah J. Hellawell, Claudia Van de Wal, and Michael Johnson. "Facial expression processing after amygdalotomy." Neuropsychologia 34, no. 1 (1996): 31–39. http://dx.doi.org/10.1016/0028-3932(95)00062-3.

3

Bondarenko, Yakov A., and Galina Ya Menshikova. "Exploring Analytical and Holistic Processing in Facial Expression Recognition." Moscow University Psychology Bulletin, no. 2 (2020): 103–40. http://dx.doi.org/10.11621/vsp.2020.02.06.

Abstract:
Background. The study explores two main processes of perception of facial expression: analytical (perception based on individual facial features) and holistic (holistic and non-additive perception of all features). The relative contribution of each process to facial expression recognition is still an open question. Objective. To identify the role of holistic and analytical mechanisms in the process of facial expression recognition. Methods. A method was developed and tested for studying analytical and holistic processes in the task of evaluating subjective differences of expressions, using com
4

Kuehne, Maria, Isabelle Siwy, Tino Zaehle, Hans-Jochen Heinze, and Janek S. Lobmaier. "Out of Focus: Facial Feedback Manipulation Modulates Automatic Processing of Unattended Emotional Faces." Journal of Cognitive Neuroscience 31, no. 11 (2019): 1631–40. http://dx.doi.org/10.1162/jocn_a_01445.

Abstract:
Facial expressions provide information about an individual's intentions and emotions and are thus an important medium for nonverbal communication. Theories of embodied cognition assume that facial mimicry and resulting facial feedback plays an important role in the perception of facial emotional expressions. Although behavioral and electrophysiological studies have confirmed the influence of facial feedback on the perception of facial emotional expressions, the influence of facial feedback on the automatic processing of such stimuli is largely unexplored. The automatic processing of unattended
5

Garrido, L., and B. Duchaine. "Do facial identity and facial expression processing dissociate in prosopagnosia?" Journal of Vision 7, no. 9 (2010): 939. http://dx.doi.org/10.1167/7.9.939.

6

Bell, Lauren, and Tirta Susilo. "Facial identity and facial expression processing dissociate in developmental prosopagnosia." Journal of Vision 18, no. 10 (2018): 918. http://dx.doi.org/10.1167/18.10.918.

7

Xia, Mu, Xueliu Li, Chun Ye, and Hong Li. "The ERPs for the Facial Expression Processing." Advances in Psychological Science 22, no. 10 (2014): 1556. http://dx.doi.org/10.3724/sp.j.1042.2014.01556.

8

Vigil, Jacob M., and Patrick Coulombe. "Embodied simulation and the search for meaning are not necessary for facial expression processing." Behavioral and Brain Sciences 33, no. 6 (2010): 461–63. http://dx.doi.org/10.1017/s0140525x10001603.

Abstract:
Embodied simulation and the epistemic motivation to search for the “meaning” of other people's behaviors are not necessary for specific and functional responding to, and hence processing of, human facial expressions. Rather, facial expression processing can be achieved through lower-cognitive, heuristical perceptual processing and expression of prototypical morphological musculature movement patterns that communicate discrete trustworthiness and capacity cues to conspecifics.
9

Matsumiya, Kazumichi. "Retinotopy of Facial Expression Adaptation." Multisensory Research 27, no. 2 (2014): 127–37. http://dx.doi.org/10.1163/22134808-00002446.

Abstract:
The face aftereffect (FAE; the illusion of faces after adaptation to a face) has been reported to occur without retinal overlap between adaptor and test, but recent studies revealed that the FAE is not constant across all test locations, which suggests that the FAE is also retinotopic. However, it remains unclear whether the characteristic of the retinotopy of the FAE for one facial aspect is the same as that of the FAE for another facial aspect. In the research reported here, an examination of the retinotopy of the FAE for facial expression indicated that the facial expression aftereffect occ
10

Padmapriya K.C., Leelavathy V., and Angelin Gladston. "Automatic Multiface Expression Recognition Using Convolutional Neural Network." International Journal of Artificial Intelligence and Machine Learning 11, no. 2 (2021): 1–13. http://dx.doi.org/10.4018/ijaiml.20210701.oa8.

Abstract:
The human facial expressions convey a lot of information visually. Facial expression recognition plays a crucial role in the area of human-machine interaction. Automatic facial expression recognition system has many applications in human behavior understanding, detection of mental disorders and synthetic human expressions. Recognition of facial expression by computer with high recognition rate is still a challenging task. Most of the methods utilized in the literature for the automatic facial expression recognition systems are based on geometry and appearance. Facial expression recognition is
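The convolutional pipeline summarised in the abstract above can be illustrated with a minimal sketch. This is an editorial illustration, not the authors' implementation: the 48×48 grayscale input, the layer sizes, and the seven expression classes are assumptions chosen for brevity.

```python
# Minimal CNN sketch for facial expression classification (illustrative only;
# input size, layer widths, and the seven-class output are assumptions).
import tensorflow as tf

def build_fer_cnn(num_classes: int = 7) -> tf.keras.Model:
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(48, 48, 1)),            # grayscale face crop
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dropout(0.5),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Usage (hypothetical arrays): build_fer_cnn().fit(face_crops, labels, epochs=10)
```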
11

Chen, Wei, and Olivia S. Cheung. "Flexible face processing: Holistic processing of facial identity is modulated by task-irrelevant facial expression." Vision Research 178 (January 2021): 18–27. http://dx.doi.org/10.1016/j.visres.2020.09.008.

12

Redfern, Annabelle S., and Christopher P. Benton. "Expression Dependence in the Perception of Facial Identity." i-Perception 8, no. 3 (2017): 204166951771066. http://dx.doi.org/10.1177/2041669517710663.

Abstract:
We recognise familiar faces irrespective of their expression. This ability, crucial for social interactions, is a fundamental feature of face perception. We ask whether this constancy of facial identity may be compromised by changes in expression. This, in turn, addresses the issue of whether facial identity and expression are processed separately or interact. Using an identification task, participants learned the identities of two actors from naturalistic (so-called ambient) face images taken from movies. Training was either with neutral images or their expressive counterparts, perceived expr
13

Song, K. T., M. J. Han, F. Y. Chang, and S. H. Chang. "A Robotic Facial Expression Recognition System Using Real-Time Vision System." Key Engineering Materials 381-382 (June 2008): 375–78. http://dx.doi.org/10.4028/www.scientific.net/kem.381-382.375.

Abstract:
The capability of recognizing human facial expression plays an important role in advanced human-robot interaction development. Through recognizing facial expressions, a robot can interact with a user in a more natural and friendly manner. In this paper, we proposed a facial expression recognition system based on an embedded image processing platform to classify different facial expressions on-line in real time. A low-cost embedded vision system has been designed and realized for robotic applications using a CMOS image sensor and digital signal processor (DSP). The current design acquires thirt
14

Kettle, Jonathan W. L., and Nicholas B. Allen. "Facial Reactivity and Attentional Processing of Facial Expressions and Gaze Direction." Journal of Psychophysiology 33, no. 4 (2019): 254–66. http://dx.doi.org/10.1027/0269-8803/a000227.

Abstract:
Patterns of facial reactivity and attentional allocation to emotional facial expressions, and how these are moderated by gaze direction, are not clearly established. Among a sample of undergraduate university students, aged between 17 and 22 years (76% female), corrugator and zygomatic reactivity, as measured by facial electromyography, and attention allocation, as measured by the startle reflex and startle-elicited N100, was examined while viewing happy, neutral, angry and fearful facial expressions, which were presented at either 0- or 30-degree gaze. Results indicated typically ob
15

Liu, Hongyan, and Zhiguo Hu. "The Facial Expression Processing in Social Anxiety Disorder." Advances in Psychological Science 21, no. 11 (2013): 1927–38. http://dx.doi.org/10.3724/sp.j.1042.2013.01927.

16

Magnussen, Svein, Bettina Sunde, and Stein Dyrnes. "Patterns of Perceptual Asymmetry in Processing Facial Expression." Cortex 30, no. 2 (1994): 215–29. http://dx.doi.org/10.1016/s0010-9452(13)80194-3.

17

Martin, Flavie, Jean-Yves Baudouin, Guy Tiberghien, and Nicolas Franck. "Processing emotional expression and facial identity in schizophrenia." Psychiatry Research 134, no. 1 (2005): 43–53. http://dx.doi.org/10.1016/j.psychres.2003.12.031.

18

Doop, Mikisha L., and Sohee Park. "Facial expression and face orientation processing in schizophrenia." Psychiatry Research 170, no. 2-3 (2009): 103–7. http://dx.doi.org/10.1016/j.psychres.2009.06.009.

19

Deliens, Gaétane, Kyriakos Antoniou, Elise Clin, Ekaterina Ostashchenko, and Mikhail Kissine. "Context, facial expression and prosody in irony processing." Journal of Memory and Language 99 (April 2018): 35–48. http://dx.doi.org/10.1016/j.jml.2017.10.001.

20

Meena, H. K., K. K. Sharma, and S. D. Joshi. "Improved facial expression recognition using graph signal processing." Electronics Letters 53, no. 11 (2017): 718–20. http://dx.doi.org/10.1049/el.2017.0420.

21

Buciu, Ioan, and Ioan Nafornita. "Feature Extraction through Cross-Phase Congruency for Facial Expression Analysis." International Journal of Pattern Recognition and Artificial Intelligence 23, no. 3 (2009): 617–35. http://dx.doi.org/10.1142/s021800140900717x.

Abstract:
Human face analysis has attracted a large number of researchers from various fields, such as computer vision, image processing, neurophysiology or psychology. One of the particular aspects of human face analysis is encompassed by facial expression recognition task. A novel method based on phase congruency for extracting the facial features used in the facial expression classification procedure is developed. Considering a set of image samples comprising humans expressing various expressions, this new approach computes the phase congruency map between the samples. The analysis is performed in th
22

Li, Dong. "Facial Expression Recognition Based on RS-SVM." Applied Mechanics and Materials 543-547 (March 2014): 2329–32. http://dx.doi.org/10.4028/www.scientific.net/amm.543-547.2329.

Abstract:
In recent years, with the rapid development of facial expression recognition technology, processing and classification of facial expression recognition has become a hotspot in application studies of remote sensing. Rough set theory (RS) and SVM have unique advantages in information processing and classification. This paper applies RS-SVM to facial expression recognition, briefly introduce the concepts of RS and principle of SVM, attributes reduction in RS theory as preposing system to get rid of redundancy attributes. Meanwhile, the SVM classifier works as postposing system helps training and
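The two-stage RS-SVM design described above (rough-set attribute reduction feeding an SVM classifier) can be sketched roughly as follows. Rough-set reduction has no standard scikit-learn implementation, so a univariate feature-selection step stands in for it here; that substitution, and the assumption of precomputed expression feature vectors, are editorial.

```python
# Rough reduce-then-classify sketch. SelectKBest stands in for rough-set
# attribute reduction (an editorial substitution), and the inputs are assumed
# to be precomputed facial-expression feature vectors.
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def build_reduce_then_svm(k_features: int = 50):
    return make_pipeline(
        StandardScaler(),                      # normalise feature scales
        SelectKBest(f_classif, k=k_features),  # drop weak/redundant attributes
        SVC(kernel="rbf", C=1.0),              # post-posed SVM classifier
    )

# Usage: clf = build_reduce_then_svm(); clf.fit(X_train, y_train); clf.predict(X_test)
```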
23

Wronka, Eligiusz, and Wioleta Walentowska. "Attentional Modulation of the Emotional Expression Processing Studied with ERPs and sLORETA." Journal of Psychophysiology 28, no. 1 (2014): 32–46. http://dx.doi.org/10.1027/0269-8803/a000109.

Abstract:
Recent ERP studies demonstrate that the processing of facial emotional expression can be modulated by attention. The aim of the present study was to investigate the neural correlates of attentional influence on the emotional expression processing at early stages. We recorded ERP responses to facial stimuli containing neutral versus emotional expression in two different conditions. The first task was to discriminate facial expressions, while the second task was to categorize face gender. Enhanced positivity at occipital and occipito-temporal locations between 110 and 170 ms poststimulus was eli
24

Huang, Yunxin, Fei Chen, Shaohe Lv, and Xiaodong Wang. "Facial Expression Recognition: A Survey." Symmetry 11, no. 10 (2019): 1189. http://dx.doi.org/10.3390/sym11101189.

Abstract:
Facial Expression Recognition (FER), as the primary processing method for non-verbal intentions, is an important and promising field of computer vision and artificial intelligence, and one of the subject areas of symmetry. This survey is a comprehensive and structured overview of recent advances in FER. We first categorise the existing FER methods into two main groups, i.e., conventional approaches and deep learning-based approaches. Methodologically, to highlight the differences and similarities, we propose a general framework of a conventional FER approach and review the possible technologie
25

Bediou, Benoit, Pierre Krolak-Salmon, Mohamed Saoud, et al. "Facial Expression and Sex Recognition in Schizophrenia and Depression." Canadian Journal of Psychiatry 50, no. 9 (2005): 525–33. http://dx.doi.org/10.1177/070674370505000905.

Abstract:
Background: Impaired facial expression recognition in schizophrenia patients contributes to abnormal social functioning and may predict functional outcome in these patients. Facial expression processing involves individual neural networks that have been shown to malfunction in schizophrenia. Whether these patients have a selective deficit in facial expression recognition or a more global impairment in face processing remains controversial. Objective: To investigate whether patients with schizophrenia exhibit a selective impairment in facial emotional expression recognition, compared with patie
26

Benoit, Kristy E., Richard J. McNally, Ronald M. Rapee, Amanda L. Gamble, and Amy L. Wiseman. "Processing of Emotional Faces in Children and Adolescents With Anxiety Disorders." Behaviour Change 24, no. 4 (2007): 183–94. http://dx.doi.org/10.1375/bech.24.4.183.

Abstract:
The purpose of this study was to test whether children and adolescents with anxiety disorders exhibit selective processing of threatening facial expressions in a pictorial version of the emotional Stroop paradigm. Participants named the colours of filters covering images of adults and children displaying either a neutral facial expression or one displaying the emotions of anger, disgust, or happiness. A delay in naming the colour of a filter implies attentional capture by the facial expression. Anxious participants, relative to control participants, exhibited slower colour naming overa
27

Dobson, Seth D., and Chet C. Sherwood. "Correlated evolution of brain regions involved in producing and processing facial expressions in anthropoid primates." Biology Letters 7, no. 1 (2010): 86–88. http://dx.doi.org/10.1098/rsbl.2010.0427.

Abstract:
Anthropoid primates are distinguished from other mammals by having relatively large primary visual cortices (V1) and complex facial expressions. We present a comparative test of the hypothesis that facial expression processing coevolved with the expansion of V1 in anthropoids. Previously published data were analysed using phylogenetic comparative methods. The results of our study suggest a pattern of correlated evolution linking social group size, facial motor control and cortical visual processing in catarrhines, but not platyrrhines. Catarrhines that live in relatively large social groups te
28

Taylor, Alisdair James Gordon, and Maria Jose. "Physical Aggression and Facial Expression Identification." Europe’s Journal of Psychology 10, no. 4 (2014): 650–59. http://dx.doi.org/10.5964/ejop.v10i4.816.

Abstract:
Social information processing theories suggest that aggressive individuals may exhibit hostile perceptual biases when interpreting other’s behaviour. This hypothesis was tested in the present study which investigated the effects of physical aggression on facial expression identification in a sample of healthy participants. Participants were asked to judge the expressions of faces presented to them and to complete a self-report measure of aggression. Relative to low physically aggressive participants, high physically aggressive participants were more likely to mistake non-angry facial expressio
29

Denzel, D., L. R. Demenescu, L. Colic, F. von Düring, H. Nießen, and M. Walter. "The role of neurometabolites in emotional processing." European Psychiatry 33, S1 (2016): S87. http://dx.doi.org/10.1016/j.eurpsy.2016.01.047.

Abstract:
Objective: To investigate how brain metabolites, especially glutamate and glutamate to glutamine ratio of pgACC modulate the neural response within these areas and how this affects their function during emotion facial expression matching task. Methods: Seventy healthy volunteers underwent magnetic resonance spectroscopy (MRS) and task functional magnetic resonance imaging (fMRI) in 7 Tesla scanner. PgACC MRS data were obtained using STEAM sequence and analyzed using LCModel. Angry, fearful, and happy facial expressions were presented in an affect-matching block where one of the two facial expression
30

Kinchella, Jade, and Kun Guo. "Facial Expression Ambiguity and Face Image Quality Affect Differently on Expression Interpretation Bias." Perception 50, no. 4 (2021): 328–42. http://dx.doi.org/10.1177/03010066211000270.

Abstract:
We often show an invariant or comparable recognition performance for perceiving prototypical facial expressions, such as happiness and anger, under different viewing settings. However, it is unclear to what extent the categorisation of ambiguous expressions and associated interpretation bias are invariant in degraded viewing conditions. In this exploratory eye-tracking study, we systematically manipulated both facial expression ambiguity (via morphing happy and angry expressions in different proportions) and face image clarity/quality (via manipulating image resolution) to measure participants
31

Cole, Jonathan. "Living with difficulties of facial processing." Facial Information Processing 8, no. 1 (2000): 237–60. http://dx.doi.org/10.1075/pc.8.1.10col.

Abstract:
The present paper considers the processing of facial information from a personal and narrative aspect, attempting to address the effects that deficits in such processing have on people’s perceptions of themselves and of others. The approach adopted has been a narrative and mainly subjective one, entering the experience of several subjects with facial problems to tease out the interactions between their facial problems and their relations with others. The subjects are those with blindness, either congenital or acquired, autism, Moebius syndrome (the congenital absence of facial expression), Bel
32

Sheu, Jia Shing, Tsu Shien Hsieh, and Ho Nien Shou. "Facial Expression Generated Automatically Using Triangular Segmentation." Applied Mechanics and Materials 479-480 (December 2013): 834–38. http://dx.doi.org/10.4028/www.scientific.net/amm.479-480.834.

Abstract:
The advancement in computer technology provided instant messaging software that makes human interactions possible and dynamic. However, such software cannot convey actual emotions and lack a realistic depiction of feelings. Instant messaging will be more interesting if users’ facial images are integrated into a virtual portrait that can automatically create images with different expressions. This study uses triangular segmentation to generate facial expressions. The application of an image editing technique is introduced to automatically create images with expressions from an expressionless fa
33

Balconi, Michela, and Claudio Lucchiari. "Consciousness and Emotional Facial Expression Recognition." Journal of Psychophysiology 21, no. 2 (2007): 100–108. http://dx.doi.org/10.1027/0269-8803.21.2.100.

Abstract:
In this study we analyze whether facial expression recognition is marked by specific event-related potential (ERP) correlates and whether conscious and unconscious elaboration of emotional facial stimuli are qualitatively different processes. ERPs elicited by supraliminal and subliminal (10 ms) stimuli were recorded when subjects were viewing emotional facial expressions of four emotions or neutral stimuli. Two ERP effects (N2 and P3) were analyzed in terms of their peak amplitude and latency variations. An emotional specificity was observed for the negative deflection N2, whereas P3
34

Aguado, Luis, Karisa Parkington, Teresa Dieguez-Risco, José Hinojosa, and Roxane Itier. "Joint Modulation of Facial Expression Processing by Contextual Congruency and Task Demands." Brain Sciences 9, no. 5 (2019): 116. http://dx.doi.org/10.3390/brainsci9050116.

Abstract:
Faces showing expressions of happiness or anger were presented together with sentences that described happiness-inducing or anger-inducing situations. Two main variables were manipulated: (i) congruency between contexts and expressions (congruent/incongruent) and (ii) the task assigned to the participant, discriminating the emotion shown by the target face (emotion task) or judging whether the expression shown by the face was congruent or not with the context (congruency task). Behavioral and electrophysiological results (event-related potentials (ERP)) showed that processing facial expression
35

Sonone, Praphull S., and Manjusha M. Patil. "Results: Recognition of Facial Expression by Digital Image Processing." International Journal of Advanced Engineering Research and Science 3, no. 10 (2016): 238–41. http://dx.doi.org/10.22161/ijaers/3.10.38.

36

Vitale, Jonathan, Mary-Anne Williams, Benjamin Johnston, and Giuseppe Boccignone. "Affective facial expression processing via simulation: A probabilistic model." Biologically Inspired Cognitive Architectures 10 (October 2014): 30–41. http://dx.doi.org/10.1016/j.bica.2014.11.005.

37

P., Pavithra, and Balaji Ganesh A. "DETECTION OF HUMAN FACIAL BEHAVIORAL EXPRESSION USING IMAGE PROCESSING." ICTACT Journal on Image and Video Processing 01, no. 03 (2011): 162–65. http://dx.doi.org/10.21917/ijivp.2011.0023.

38

Kim, D. J. "Facial expression recognition using ASM-based post-processing technique." Pattern Recognition and Image Analysis 26, no. 3 (2016): 576–81. http://dx.doi.org/10.1134/s105466181603010x.

39

Wronka, Eligiusz, and Wioleta Walentowska. "Attention effectively modulates processing of subliminally presented facial expression." International Journal of Psychophysiology 77, no. 3 (2010): 279–80. http://dx.doi.org/10.1016/j.ijpsycho.2010.06.137.

40

Gorno-Tempini, Maria Luisa, Samanta Pradelli, Marco Serafini, et al. "Explicit and Incidental Facial Expression Processing: An fMRI Study." NeuroImage 14, no. 2 (2001): 465–73. http://dx.doi.org/10.1006/nimg.2001.0811.

41

Bimler, David L., and Galina V. Paramei. "Facial-Expression Affective Attributes and their Configural Correlates: Components and Categories." Spanish Journal of Psychology 9, no. 1 (2006): 19–31. http://dx.doi.org/10.1017/s113874160000593x.

Abstract:
The present study investigates the perception of facial expressions of emotion, and explores the relation between the configural properties of expressions and their subjective attribution. Stimuli were a male and a female series of morphed facial expressions, interpolated between prototypes of seven emotions (happiness, sadness, fear, anger, surprise and disgust, and neutral) from Ekman and Friesen (1976). Topographical properties of the stimuli were quantified using the Facial Expression Measurement (FACEM) scheme. Perceived dissimilarities between the emotional expressions were elicited usin
42

Ho, Hao Tam, Erich Schröger, and Sonja A. Kotz. "Selective Attention Modulates Early Human Evoked Potentials during Emotional Face–Voice Processing." Journal of Cognitive Neuroscience 27, no. 4 (2015): 798–818. http://dx.doi.org/10.1162/jocn_a_00734.

Abstract:
Recent findings on multisensory integration suggest that selective attention influences cross-sensory interactions from an early processing stage. Yet, in the field of emotional face–voice integration, the hypothesis prevails that facial and vocal emotional information interacts preattentively. Using ERPs, we investigated the influence of selective attention on the perception of congruent versus incongruent combinations of neutral and angry facial and vocal expressions. Attention was manipulated via four tasks that directed participants to (i) the facial expression, (ii) the vocal expression,
43

Guo, Junqi, Ke Shan, Hao Wu, Rongfang Bie, Wenwan You, and Di Lu. "Research on Facial Expression Recognition Technology Based on Convolutional-Neural-Network Structure." International Journal of Software Innovation 6, no. 4 (2018): 103–16. http://dx.doi.org/10.4018/ijsi.2018100108.

Abstract:
Human facial expressions change so subtly that recognition accuracy of most traditional approaches largely depend on feature extraction. In this article, the authors employ a deep convolutional neural network (CNN) to devise a facial expression recognition system to discover deeper feature representation of facial expression. The proposed system is composed of the input module, the pre-processing module, the recognition module and the output module. The authors introduce jaffe and ck+ to simulate and evaluate the performance under the influence of different factors (e.g. network structure, lea
44

Li, Yuanning, R. Mark Richardson, and Avniel Singh Ghuman. "Posterior Fusiform and Midfusiform Contribute to Distinct Stages of Facial Expression Processing." Cerebral Cortex 29, no. 7 (2018): 3209–19. http://dx.doi.org/10.1093/cercor/bhy186.

Abstract:
Though the fusiform is well-established as a key node in the face perception network, its role in facial expression processing remains unclear, due to competing models and discrepant findings. To help resolve this debate, we recorded from 17 subjects with intracranial electrodes implanted in face sensitive patches of the fusiform. Multivariate classification analysis showed that facial expression information is represented in fusiform activity and in the same regions that represent identity, though with a smaller effect size. Examination of the spatiotemporal dynamics revealed a funct
45

Pant, Dibakar Raj, and Rolisha Sthapit. "Analysis of Micro Facial Expression by Machine and Deep Learning Methods: Haar, CNN, and RNN." Journal of the Institute of Engineering 16, no. 1 (2021): 95–101. http://dx.doi.org/10.3126/jie.v16i1.36562.

Abstract:
Facial expressions are due to the actions of the facial muscles located at different facial regions. These expressions are two types: Macro and Micro expressions. The second one is more important in computer vision. Analysis of micro expressions categorized by disgust, happiness, anger, sadness, surprise, contempt, and fear are challenging because of very fast and subtle facial movements. This article presents one machine learning method: Haar and two deep learning methods: Convolution Neural Network (CNN) and Recurrent Neural Network (RNN) to perform recognition of micro-facial expression ana
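Of the three methods compared above, the Haar stage is the most standardised, so the sketch below shows only that face-detection step using OpenCV's bundled frontal-face cascade. The cascade file and detection parameters are common OpenCV defaults rather than the authors' settings.

```python
# Haar-cascade face detection sketch (detection stage only). The cascade file
# and parameters are common OpenCV defaults, not the authors' settings.
import cv2

def detect_faces(image_path: str):
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)
    # Returns (x, y, w, h) boxes; each crop could then feed a CNN/RNN classifier.
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Usage: boxes = detect_faces("frame.png")
```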
46

Wu, Lingdan, Jie Pu, John J. B. Allen, and Paul Pauli. "Recognition of Facial Expressions in Individuals with Elevated Levels of Depressive Symptoms: An Eye-Movement Study." Depression Research and Treatment 2012 (2012): 1–7. http://dx.doi.org/10.1155/2012/249030.

Abstract:
Previous studies consistently reported abnormal recognition of facial expressions in depression. However, it is still not clear whether this abnormality is due to an enhanced or impaired ability to recognize facial expressions, and what underlying cognitive systems are involved. The present study aimed to examine how individuals with elevated levels of depressive symptoms differ from controls on facial expression recognition and to assess attention and information processing using eye tracking. Forty participants (18 with elevated depressive symptoms) were instructed to label facial expression
47

Pomarol-Clotet, E., F. Hynes, C. Ashwin, E. T. Bullmore, P. J. McKenna, and K. R. Laws. "Facial emotion processing in schizophrenia: a non-specific neuropsychological deficit?" Psychological Medicine 40, no. 6 (2009): 911–19. http://dx.doi.org/10.1017/s0033291709991309.

Abstract:
Background: Identification of facial emotions has been found to be impaired in schizophrenia but there are uncertainties about the neuropsychological specificity of the finding. Method: Twenty-two patients with schizophrenia and 20 healthy controls were given tests requiring identification of facial emotion, judgement of the intensity of emotional expressions without identification, familiar face recognition and the Benton Facial Recognition Test (BFRT). The schizophrenia patients were selected to be relatively intellectually preserved. Results: The patients with schizophrenia showed no deficit in ide
48

Krebs, Julia F., Ajanta Biswas, Olivier Pascalis, Inge Kamp-Becker, Helmuth Remschmidt, and Gudrun Schwarzer. "Face Processing in Children with Autism Spectrum Disorder: Independent or Interactive Processing of Facial Identity and Facial Expression?" Journal of Autism and Developmental Disorders 41, no. 6 (2010): 796–804. http://dx.doi.org/10.1007/s10803-010-1098-4.

49

Alais, David, Yiben Xu, Susan G. Wardle, and Jessica Taubert. "A shared mechanism for facial expression in human faces and face pareidolia." Proceedings of the Royal Society B: Biological Sciences 288, no. 1954 (2021): 20210966. http://dx.doi.org/10.1098/rspb.2021.0966.

Abstract:
Facial expressions are vital for social communication, yet the underlying mechanisms are still being discovered. Illusory faces perceived in objects (face pareidolia) are errors of face detection that share some neural mechanisms with human face processing. However, it is unknown whether expression in illusory faces engages the same mechanisms as human faces. Here, using a serial dependence paradigm, we investigated whether illusory and human faces share a common expression mechanism. First, we found that images of face pareidolia are reliably rated for expression, within and between observers
50

Drapeau, Joanie, Nathalie Gosselin, Isabelle Peretz, and Michelle McKerral. "Electrophysiological Responses to Emotional Facial Expressions Following a Mild Traumatic Brain Injury." Brain Sciences 9, no. 6 (2019): 142. http://dx.doi.org/10.3390/brainsci9060142.

Abstract:
The present study aimed to measure neural information processing underlying emotional recognition from facial expressions in adults having sustained a mild traumatic brain injury (mTBI) as compared to healthy individuals. We thus measured early (N1, N170) and later (N2) event-related potential (ERP) components during presentation of fearful, neutral, and happy facial expressions in 10 adults with mTBI and 11 control participants. Findings indicated significant differences between groups, irrespective of emotional expression, in the early attentional stage (N1), which was altered in mTBI. The t