Academic literature on the topic 'Perception faciale'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Perception faciale.'
Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Perception faciale"
Hernández-Alfaro, Federico. "Syndrome d’hyperdivergence faciale." L'Orthodontie Française 87, no. 4 (December 2016): 479–89. http://dx.doi.org/10.1051/orthodfr/2016037.
Gatignol, Peggy, I. Bernat, F. Tankere, C. Sain Oulhen, T. Truong Tan, and G. Lamas. "Perception et production des émotions au cours d’une paralysie faciale." Kinésithérapie, la Revue 11, no. 109-110 (January 2011): 87–88. http://dx.doi.org/10.1016/s1779-0123(11)75041-0.
Laverdière, Roxanne, Lye-Ann Robichaud, Annie Toulouse-Fournier, Audrey Marcoux, and Philip Jackson. "Influence du genre sur la perception de l'empathie à l'aide d'avatars." Psycause : revue scientifique étudiante de l'École de psychologie de l'Université Laval 10, no. 2 (November 20, 2020): 13–15. http://dx.doi.org/10.51656/psycause.v10i2.40770.
Faure, Jacques, and Yves Bolender. "L’appréciation de la beauté : revue de littérature." L'Orthodontie Française 85, no. 1 (March 2014): 3–29. http://dx.doi.org/10.1051/orthodfr/2013073.
Ferney, Pauline, François Clauss, Damien Offner, and Delphine Wagner. "Intérêt prophylactique et thérapeutique des chewing-gums sans sucre en orthodontie. Une étude menée auprès de professionnels de santé et de patients." L'Orthodontie Française 88, no. 3 (September 2017): 275–81. http://dx.doi.org/10.1051/orthodfr/2017020.
Lefevre, Carmen E., and Gary J. Lewis. "Perceiving Aggression from Facial Structure: Further Evidence for A Positive Association with Facial Width–To–Height Ratio and Masculinity, but Not for Moderation by Self–Reported Dominance." European Journal of Personality 28, no. 6 (November 2014): 530–37. http://dx.doi.org/10.1002/per.1942.
Tobiasen, Joyce M., and John M. Hiebert. "Combined Effects of Severity of Cleft Impairment and Facial Attractiveness on Social Perception: An Experimental Study." Cleft Palate-Craniofacial Journal 30, no. 1 (January 1993): 82–86. http://dx.doi.org/10.1597/1545-1569_1993_030_0082_ceosoc_2.3.co_2.
Akram, Umair, Rachel Sharman, and Amy Newman. "Altered Perception of Facially Expressed Tiredness in Insomnia." Perception 47, no. 1 (August 11, 2017): 105–11. http://dx.doi.org/10.1177/0301006617725241.
Lu, Yan, Jie Yang, Kaida Xiao, Michael Pointer, Changjun Li, and Sophie Wuerger. "Investigation of effect of skin tone to facial attractiveness." Color and Imaging Conference 2020, no. 28 (November 4, 2020): 25–29. http://dx.doi.org/10.2352/issn.2169-2629.2020.28.5.
Reed, J. Ann, and Elizabeth M. Blunk. "THE INFLUENCE OF FACIAL HAIR ON IMPRESSION FORMATION." Social Behavior and Personality: an international journal 18, no. 1 (January 1, 1990): 169–75. http://dx.doi.org/10.2224/sbp.1990.18.1.169.
Dissertations / Theses on the topic "Perception faciale"
Bayet, Laurie. "Le développement de la perception des expressions faciales." Thesis, Université Grenoble Alpes (ComUE), 2015. http://www.theses.fr/2015GREAS049/document.
This thesis addressed how the perception of emotional facial expressions develops, reframing the question within the theoretical framework of face perception: the separation of variant (expression, gaze) and invariant (gender, race) streams, the role of experience, and social attention. More specifically, we investigated how, in infants and children, the perception of angry, smiling, or fearful facial expressions interacts with gender perception (Studies 1-2), gaze perception (Study 3), and face detection (Study 4). In a first study, we found that adults and 5- to 12-year-old children tend to categorize angry faces as male (Study 1). Comparing human performance with that of several automatic classifiers suggested that this reflects a strategy of using specific features and second-order relationships in the face to categorize gender. The bias was constant over all ages studied and extended to other-race faces, further suggesting that it does not require extensive experience. A second set of studies examined whether, in infants, the perception of smiling depends on experience-sensitive, invariant dimensions of the face such as gender and race (Study 2). Indeed, infants are typically most familiar with own-race female faces. The visual preference of 3.5-month-old infants for open-mouth, own-race smiling (versus neutral) faces was restricted to female faces and reversed in male faces. The effect did not replicate with own- or other-race closed-mouth smiles. We attempted to extend these results to an object-referencing task in 3.5-, 9-, and 12-month-olds (Study 3). Objects previously referenced by smiling faces attracted similar attention as objects previously cued by neutral faces, regardless of age group and face gender, and despite differences in gaze following.
Finally, we used univariate (face side preference) and multivariate (face versus noise side decoding evidence) trial-level measures of face detection, coupled with non-linear mixed modeling of psychometric curves, to reveal a detection advantage for fearful faces (compared to smiling faces) embedded in phase-scrambled noise in 3.5-, 6-, and 12-month-old infants (Study 4). The advantage was as or more evident in the youngest group than in the two older age groups. Taken together, these results provide insights into the early ontogeny and underlying causes of gender-emotion relationships in face perception and of the sensitivity to fear.
Dupouy, Stéphanie. "Le visage au scalpel : l'expression faciale dans l'oeil des savants (1750-1880)." Paris 1, 2007. http://www.theses.fr/2007PA010677.
Vannier, Loïc. "Perception des visages et des expressions faciales émotionnelles chez l'adulte et l'enfant : aspects neurophysiologiques." Tours, 2004. http://www.theses.fr/2004TOUR3311.
The aim of this study was to elaborate a protocol using quantified EEG (qEEG) to investigate explicit perception of faces and facial emotions. A preliminary study tested the stability of qEEG and the reproducibility of the activations found. Cortical reactivity of 31 adults and 15 children was recorded during the perception of facial expressions. A dominance of the right hemisphere was observed in adults during face and emotion processing. Social stimuli induced frontal and temporal activations, and central parietal areas were implicated in relational emotions. A global cortical activation was found in children, showing that face and emotion processing is not yet mature at that age. This simple protocol shows that qEEG is a valid technique for investigating face perception and is applicable in autistic children.
Prigent, Elise. "Modulation émotionnelle de la perception de l’action motrice d’autrui." Thesis, Paris 11, 2012. http://www.theses.fr/2012PA113006.
Understanding others' motor behaviour is part and parcel of humans' social experience. According to the scientific literature, we rely on specific mechanisms for perceiving human bodies (whether static or moving) on the one hand, and for processing emotional facial expressions on the other. This thesis aims to understand to what extent the emotion conveyed by a person's face can modulate one's perception of his or her motor action. The results of Study 1 showed that our estimation of an individual's static equilibrium is modulated by the observed individual's emotional facial expression (smiling or tense). Study 2 focused on the perceptual estimation of the physical effort developed by a person on the basis of his facial expression of pain alone. The results revealed that participants adopt two automatic perceptual mechanisms. The first, highlighted via functional measurement, facilitates estimating the intensity of the effort-related pain felt by others. The second, evidenced by measuring memory bias, leads to an automatic anticipation of subsequent changes in the intensity of pain-related facial expressions. Study 3 showed that the estimation of the physical effort developed by a paraplegic individual performing a transfer movement is modulated by two pain behaviours (guarding and facial expression of pain). Interestingly, this modulation varies with participants' familiarity with both the medical domain and paraplegia. This research suggests that the modulation of emotional perception related to others' motor action is primarily subtended by an automatic (bottom-up) process and an implicit emotional contagion. However, the latter can be inhibited by an explicit (top-down) process, which may depend on (1) the type of inference made about others (estimating their postural balance or the physical effort they develop), and (2) the observer's familiarity with the motor action and the facial expressions.
Jacques, Corentin. "Décours temporel de la perception visuelle des visages : de la catégorisation faciale à l'encodage d'une représentation individuelle." Université catholique de Louvain, 2007. http://edoc.bib.ucl.ac.be:81/ETD-db/collection/available/BelnUcetd-11302007-101545/.
Lescanne, Emmanuel. "L'arachnoïde des citernes ponto-cérébelleuse et acoustico-faciale : Micro-anatomie appliquée à la chirurgie des Schwannomes Vestibulaires." Tours, 2006. http://www.theses.fr/2007TOUR3303.
The goals of this PhD thesis were to complete, in the anatomy and histology laboratories, the description of the internal acoustic meatus (IAM) and its contents, and to verify the epi-arachnoidal origin of vestibular schwannomas (VS) using temporal bones (TB) that were either normal or invaded by a VS. We demonstrated the existence of an acousticofacial cistern containing every nerve of the vestibulocochleofacial complex, including the vestibular ganglion from which VS develop. We saw no layer between the tumor and the intrameatal VS, as an epi-arachnoidal tumor origin would suggest. These findings contradict previous descriptions of an epi-arachnoidal origin of VS and the theory of the duplication of arachnoidal layers during the medial growth of vestibular neuromas, and may explain some of the intraoperative difficulties encountered in the atraumatic dissection of these tumors.
Tessier, Marie-Hélène. "Décours temporel de l'expression faciale dynamique de la douleur à l'aide d'avatars virtuels." Master's thesis, Université Laval, 2018. http://hdl.handle.net/20.500.11794/32634.
Affective computing aims to develop computer-based systems that can recognize and recreate some human internal states through virtual reality. Pain is one such internal state, described as both a sensory and an affective subjective experience. Facial expressions are an adaptive means of communicating our pain to others. Although the static facial expression of pain is well defined, the sequential order of the movements that compose it has rarely been addressed in research. The objective of this thesis is to compare the levels of realism, pain intensity, and unpleasantness attributed to different sequential orders of avatars' facial expressions of pain. An empirical study was conducted with 45 adults (22 women), who rated seven orders of appearance of facial pain movements (six sequences and one synchronized appearance) displayed by four avatars (two women). The results showed that pain expressions were perceived: 1) as more realistic when the eye-related movements appeared before the nose- and mouth-related movements; 2) as more intensely painful when the brows-related movements were the last to appear; and 3) as depicting a higher level of unpleasantness when the nose- and mouth-related movements appeared before the brows-related movements. In fact, the sequence "Eyes, Nose/Mouth, Brows" was the only one perceived as highest on both realism and pain level. These results highlight the importance of the order of appearance of facial movements in the perception of pain expressions. Thus, this thesis contributes to the field of affective computing through the advancement of knowledge on decoding and producing pain expressions in virtual reality.
Boucenna, Sofiane. "De la reconnaissance des expressions faciales à une perception visuelle partagée : une architecture sensori-motrice pour amorcer un référencement social d'objets, de lieux ou de comportements." Phd thesis, Université de Cergy Pontoise, 2011. http://tel.archives-ouvertes.fr/tel-00660120.
Bayle, Dimitri. "Traitement cérébral de l’expression faciale de peur : vision périphérique et effet de l’attention." Thesis, Lyon 1, 2009. http://www.theses.fr/2009LYO10227/document.
The facial expression of fear is an important vector of social and environmental information. In natural conditions, frightened faces appear mainly in our peripheral visual field. However, the brain mechanisms underlying the perception of fear in the periphery remain largely unknown. Through behavioral, magnetoencephalographic, and intracranial studies, we demonstrated that the perception of the fear facial expression is efficient across a large peripheral visual field. Fear perception in the periphery produces an early response in the amygdala and the frontal cortex, and a later response in the occipital and infero-temporal visual areas. Attentional control can inhibit the early response to the fear expression and increase the later temporo-occipital activities linked to face perception. Our results show that the networks involved in fear perception are adapted to peripheral vision. Moreover, they validate a new approach to investigating facial expression processing, which may lead to a better understanding of how we process social messages in more ecological situations.
Linares, Claire. "Three Essays on Consumer Social Cognition in a Technology-Rich World." Thesis, Jouy-en Josas, HEC, 2022. http://www.theses.fr/2022EHEC0001.
The three essays of this dissertation examine consumer social cognition processes that take on a special resonance in today's technological world. Essay 1 investigates the effect of the mere presence of a technological device, a smartphone, on social interactions and creativity. The initial objective of this essay was to build on the work of Przybylski and Weinstein (2013), which showed a negative effect of the mere presence of a phone on relationship formation, and to extend the investigation to creativity. After two failed replications of Przybylski and Weinstein's (2013) results and an absence of robust results on creativity, the conclusion of this work is that the effect of the mere presence of a smartphone is at least harder to find than it may have been before. The two other essays examine questions at the intersection of management and face perception, at a time when faces are taking a new place in social interactions with the development of social media and videoconferencing platforms, and when facial data are multiplying with social media and facial detection technologies. Essay 2 investigates brand–user facial stereotypes, the mental representations that people have of the faces of the typical users of a brand (e.g., the face of a BMW driver). The first part reveals that such shared stereotypes exist, using a method borrowed from face-perception research that is new to consumer behavior research to compose "mugshots" of different car brand users for German consumers. The second part uncovers a face–brand matching effect, whereby observers can accurately match a target's true perfume brand to their face, above chance level and beyond sociodemographic cues. Together, the results of Essay 2 suggest that faces and brands can be connected both in consumers' mental representations and in their actual faces.
Although this work opens managerial opportunities, consumers may not be aware of the information that their faces reveal, which raises ethical questions to address. Finally, Essay 3 explores facial name stereotypes, that is, the mental representations that people have of the face of someone bearing a given name (e.g., the stereotypical face of a man named James). The first part of a study produced mugshots associated with a series of French given names (e.g., the faces associated with the names Julien and Nicolas). The second part is currently in progress. Before submission of the present document, the data collected up to March 29, 2022 were analyzed (143 valid participants out of 250 preregistered participants) to get a sense of the pattern. The analysis already reveals that the mugshots are recognized on average by an independent sample of participants, significantly above chance level. If these preliminary results are confirmed once the preregistered sample size is attained, this research would offer direct evidence supporting the existence of facial name stereotypes while validating the use of the reverse correlation technique from Essay 2 to capture such stereotypes. The objective is to take this work forward in the management domain in one of several possible directions fleshed out in the General Discussion of this essay. Overall, this dissertation sheds light on marketing and management questions that have theoretical relevance as well as managerial and ethical implications in our real and virtual worlds.
Books on the topic "Perception faciale"
Ikeda, Susumu. Hito no kao matawa hyōjō no shikibetsu ni tsuite: Shoki no jikkenteki kenkyū o chūshin to shita shiteki tenbō. Suita-shi: Kansai Daigaku Shuppanbu, 1987.
Carter, Sandra E., and Vaughan T. Bailey. Facial expressions: Dynamic patterns, impairments and social perceptions. Hauppauge, N.Y: Nova Science Publishers, 2012.
Barabanshchikov, V. A. Vosprii︠a︡tie vyrazheniĭ lit︠s︡a. Moskva: Institut psikhologii RAN, 2009.
Gojmerac, Christina Barbara. Perception of emotional facial expressions elicit approach and withdrawal behaviour. Ottawa: National Library of Canada, 2002.
Vorder Bruegge, Richard W., ed. Computer-aided forensic facial comparison. Boca Raton, FL: Taylor & Francis Group, 2010.
Valentine, Tim, and Josh P. Davis. Forensic facial identification: Theory and practice of identification from eyewitnesses, composites and CCTV. Chichester, West Sussex: John Wiley & Sons Inc., 2015.
Book chapters on the topic "Perception faciale"
Hoffmann, Holger, Harald C. Traue, Franziska Bachmayr, and Henrik Kessler. "Perception of Dynamic Facial Expressions of Emotion." In Perception and Interactive Technologies, 175–78. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11768029_17.
Knowles, Kristen. "The Evolutionary Psychology of Leadership Trait Perception." In The Facial Displays of Leaders, 97–121. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-94535-4_5.
Nkengne, Alex, Georgios Stamatas, and Christiane Bertin. "Facial Skin Attributes and Age Perception." In Textbook of Aging Skin, 973–80. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-540-89656-2_91.
Nkengne, Alex, Georgios N. Stamatas, and Christiane Bertin. "Facial Skin Attributes and Age Perception." In Textbook of Aging Skin, 1689–700. Berlin, Heidelberg: Springer Berlin Heidelberg, 2016. http://dx.doi.org/10.1007/978-3-662-47398-6_91.
Nkengne, Alex, Georgios N. Stamatas, and Christiane Bertin. "Facial Skin Attributes and Age Perception." In Textbook of Aging Skin, 1–12. Berlin, Heidelberg: Springer Berlin Heidelberg, 2015. http://dx.doi.org/10.1007/978-3-642-27814-3_91-2.
Saha, Chandrani, Washef Ahmed, and Soma Mitra. "A New Motion Based Fully Automatic Facial Expression Recognition System." In Perception and Machine Intelligence, 145–54. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-27387-2_19.
Zhang, David, Fangmei Chen, and Yong Xu. "A New Hypothesis on Facial Beauty Perception." In Computer Models for Facial Beauty Analysis, 143–63. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-32598-9_9.
Konar, Amit, Aruna Chakraborty, Anisha Halder, Rajshree Mandal, and Ramadoss Janarthanan. "Interval Type-2 Fuzzy Model for Emotion Recognition from Facial Expression." In Perception and Machine Intelligence, 114–21. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-27387-2_15.
Gan, Peter Zhuowei, Arcot Sowmya, and Gelareh Mohammadi. "Zero-shot Personality Perception From Facial Images." In AI 2022: Advances in Artificial Intelligence, 43–56. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-22695-3_4.
Keating, Caroline F. "About Face! Facial Status Cues and Perceptions of Charismatic Leadership." In The Facial Displays of Leaders, 145–70. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-94535-4_7.
Conference papers on the topic "Perception faciale"
Tamanaka, Fernanda G., Lucas P. Carlini, Tatiany M. Heideirich, Rita C. X. Balda, Marina C. M. Barros, Ruth Guinsburg, and Carlos E. Thomaz. "Neonatal pain scales study: A Kendall analysis between eye-tracking and literature facial features." In Simpósio Brasileiro de Computação Aplicada à Saúde. Sociedade Brasileira de Computação - SBC, 2021. http://dx.doi.org/10.5753/sbcas.2021.16068.
Wang, Xiaoyan, Tianxu Xu, Yiwen Zhang, Dongye Xu, Dong An, Qiang Wang, Zhongqi Pan, and Yang Yue. "3D Time-of-Flight Camera Based Face Mask Recognition Using Facial Contour and Artificial Neural Network." In 3D Image Acquisition and Display: Technology, Perception and Applications. Washington, D.C.: Optica Publishing Group, 2022. http://dx.doi.org/10.1364/3d.2022.jw2a.23.
Schiano, Diane J., Sheryl M. Ehrlich, and Kyle Sheridan. "Categorical perception of facial affect." In CHI '01 extended abstracts. New York, New York, USA: ACM Press, 2001. http://dx.doi.org/10.1145/634067.634244.
Schiano, Diane J., Sheryl M. Ehrlich, and Kyle Sheridan. "Categorical perception of facial affect." In CHI '01 extended abstracts. New York, New York, USA: ACM Press, 2001. http://dx.doi.org/10.1145/634239.634244.
Yamada, Takashi, and Tomio Watanabe. "An Average Facial Color Image Avatar System for the Analysis by Synthesis of Affect Display by Dynamic Facial Color and Expression." In ASME 2012 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/detc2012-70263.
Kleiner, Mario, Adrian Schwaninger, Douglas Cunningham, and Barbara Knappmeyer. "Using facial texture manipulation to study facial motion perception." In the 1st Symposium. New York, New York, USA: ACM Press, 2004. http://dx.doi.org/10.1145/1012551.1012602.
Carlini, Lucas, Leonardo Ferreira, Gabriel Coutrin, Victor Varoto, Tatiany Marcondes, Rita Balda, Marina Barros, Ruth Guinsburg, and Carlos Thomaz. "Mobile Convolutional Neural Network for Neonatal Pain Assessment." In LatinX in AI at Computer Vision and Pattern Recognition Conference 2021. Journal of LatinX in AI Research, 2021. http://dx.doi.org/10.52591/lxai202106258.
Sato, Yugo, Takuya Kato, Naoki Nozawa, and Shigeo Morishima. "Perception of drowsiness based on correlation with facial image features." In SAP '16: ACM Symposium on Applied Perception 2016. New York, NY, USA: ACM, 2016. http://dx.doi.org/10.1145/2931002.2947705.
Rehman, Shafiq ur, Li Liu, and Haibo Li. "Manifold of Facial Expressions for Tactile Perception." In 2007 IEEE 9th Workshop on Multimedia Signal Processing. IEEE, 2007. http://dx.doi.org/10.1109/mmsp.2007.4412862.
Sorci, M., J. Ph Thiran, J. Cruz, T. Robin, and M. Bierlaire. "Modelling human perception of static facial expressions." In Gesture Recognition (FG). IEEE, 2008. http://dx.doi.org/10.1109/afgr.2008.4813428.