
Journal articles on the topic 'Facial Action Coding System (FACS)'


Consult the top 50 journal articles for your research on the topic 'Facial Action Coding System (FACS).'


1

An, Kyoung-Hee. "An Application of the Awareness Programme for Actors’ Face Training - Base on Facial Action Coding System (FACS) -." Journal of acting studies 20 (November 30, 2020): 119–38. http://dx.doi.org/10.26764/jaa.2020.20.7.

2

An, Kyoung Hee. "An Application of the Emotional Programme for Improvement of Performers’ Face Expression : Based on Facial Action Coding System(FACS)." Journal of Dance Society for Documentation & History 53 (June 30, 2019): 157–80. http://dx.doi.org/10.26861/sddh.2019.53.157.

3

Xue, Yao Feng, Hua Li Sun, and Ye Duan. "Research on Facial Expression Computing in Experiment Process." Applied Mechanics and Materials 738-739 (March 2015): 666–69. http://dx.doi.org/10.4028/www.scientific.net/amm.738-739.666.

Abstract:
This paper introduces the Candide face model and the Facial Action Coding System (FACS). The relations between the positions of the feature points of the Candide-3 model and the action units of FACS are studied. An application system for computing the facial expressions of students during experimental teaching is developed, and its feasibility is demonstrated.
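The relation studied above, feature-point positions of a Candide-style model driving FACS action units, can be illustrated with a toy linkage. Everything here (point indices, the displacement-to-AU table, the threshold) is invented for the sketch and is not taken from the paper:

```python
# Toy linkage between Candide-style feature points and FACS Action Units.
# The point indices, the AU table, and the threshold are all hypothetical.

# Map: AU number -> (feature-point index, axis) whose displacement drives it
AU_DRIVERS = {
    12: (64, 1),  # AU12, lip corner puller: vertical motion of a mouth-corner point
    4: (21, 1),   # AU4, brow lowerer: vertical motion of an inner-brow point
}

def au_activations(neutral, deformed, threshold=0.02):
    """Report AUs whose driving feature point moved more than `threshold`.

    `neutral` and `deformed` are sequences of (x, y) feature-point positions.
    """
    active = {}
    for au, (idx, axis) in AU_DRIVERS.items():
        delta = deformed[idx][axis] - neutral[idx][axis]
        if abs(delta) > threshold:
            active[au] = delta
    return active
```

For example, a mouth-corner point rising by 0.1 in normalized coordinates would register AU12 under this sketch.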
4

Wibowo, Hardianto, Fachrunnisa Firdausi, Wildan Suharso, Wahyu Andhyka Kusuma, and Dani Harmanto. "Facial expression recognition of 3D image using facial action coding system (FACS)." TELKOMNIKA (Telecommunication Computing Electronics and Control) 17, no. 2 (2018): 628. http://dx.doi.org/10.12928/telkomnika.v17i2.9304.

5

A. Nasser, Sana, Ivan A. Hashim, and Wissam H. Ali. "Visual Depression Diagnosis From Face Based on Various Classification Algorithms." Engineering and Technology Journal 38, no. 11A (2020): 1717–29. http://dx.doi.org/10.30684/etj.v38i11a.1714.

Abstract:
Most psychologists believe that facial behavior during depression differs from facial behavior in its absence, so facial behavior can be used as a dependable indicator for detecting depression. A visual depression diagnosis (VDD) system based on facial expressions is cost-effective and portable. In this work, the VDD system is designed according to the Facial Action Coding System (FACS) to extract facial features. The key concept of FACS is to describe whole-face behavior using Action Units (AUs); every AU…
6

Nendya, Matahari Bhakti, Lailatul Husniah, Hardianto Wibowo, and Eko Mulyanto Yuniarno. "Sintesa Ekspresi Wajah Karakter Virtual 3D menggunakan Action Unit berbasis Facial Action Coding System (FACS)." Journal of Animation and Games Studies 7, no. 1 (2021): 13–24. http://dx.doi.org/10.24821/jags.v7i1.4239.

Abstract:
Facial expressions of 3D virtual characters play an important role in the production of an animated film. Obtaining the desired facial expression can be difficult and time-consuming for an animator. This research was conducted to obtain facial expressions by combining several Action Units from FACS and implementing them on the face of a 3D virtual character. The FACS Action Units were chosen because they are based on the structure of the human facial muscles. The experiments produced Action Unit combinations that can form expressions such as the joy expression…
7

POLIKOVSKY, Senya, Yoshinari KAMEDA, and Yuichi OHTA. "Facial Micro-Expression Detection in Hi-Speed Video Based on Facial Action Coding System (FACS)." IEICE Transactions on Information and Systems E96.D, no. 1 (2013): 81–92. http://dx.doi.org/10.1587/transinf.e96.d.81.

8

Adis, Fransisca, and Yohanes Merci Widiastomo. "Designing Emotion Of Characters By Referencing From Facs In Short Animated Film “RANA”." ULTIMART Jurnal Komunikasi Visual 9, no. 2 (2018): 31–38. http://dx.doi.org/10.31937/ultimart.v9i2.747.

Abstract:
Facial expression is one of the aspects that can deliver story and character emotion in 3D animation. To achieve that, the character's face needs to be planned from the very beginning of production. At an early stage, the character designer needs to think about the expressions once the character design is done. The rigger needs to create a flexible rig to achieve the design, and the animator can then get a clear picture of how to animate the face. The Facial Action Coding System (FACS), originally developed by Carl-Herman Hjortsjö and adopted by Paul Ekman and Wallace V. Friesen, can be used to identify emotion in a…
9

Castillo, Louise IR, M. Erin Browne, Thomas Hadjistavropoulos, Kenneth M. Prkachin, and Rafik Goubran. "Automated vs. manual pain coding and heart rate estimations based on videos of older adults with and without dementia." Journal of Rehabilitation and Assistive Technologies Engineering 7 (January 2020): 205566832095019. http://dx.doi.org/10.1177/2055668320950196.

Abstract:
Introduction: Technological advances have allowed for the estimation of physiological indicators from video data. FaceReader™ is an automated facial analysis software that has been used widely in studies of facial expressions of emotion and was recently updated to allow for the estimation of heart rate (HR) using remote photoplethysmography (rPPG). We investigated FaceReader™-based heart rate and pain expression estimations in older adults in relation to manual coding by experts. Methods: Using a video dataset of older adult patients with and without dementia, we assessed the relationship between…
10

Ujir, Hamimah, Irwandi Hipiny, and D. N.F. Awang Iskandar. "Facial Action Units Analysis using Rule-Based Algorithm." International Journal of Engineering & Technology 7, no. 3.20 (2018): 284. http://dx.doi.org/10.14419/ijet.v7i3.20.19167.

Abstract:
Most works quantifying facial deformation are based on the action units (AUs) provided by the Facial Action Coding System (FACS), which describes facial expressions in terms of forty-six component movements. An AU corresponds to the movements of individual facial muscles. This paper presents a rule-based approach to classifying AUs that depends on certain facial features. This work covers only the deformation of facial features for the posed Happy and Sad expressions obtained from the BU-4DFE database. Different studies refer to different combinations of AUs that form the Happy and Sad expressions…
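A rule-based AU classification like the one described above can be sketched in a few lines. The AU combinations below are the commonly cited FACS prototypes for happiness (AU6 + AU12) and sadness (AU1 + AU4 + AU15), not necessarily the exact rules the paper uses:

```python
# Illustrative rule-based AU-to-expression mapping (not the paper's exact rules).
# AU combinations follow widely cited FACS prototypes:
#   happiness = AU6 (cheek raiser) + AU12 (lip corner puller)
#   sadness   = AU1 (inner brow raiser) + AU4 (brow lowerer) + AU15 (lip corner depressor)
RULES = {
    "happy": {6, 12},
    "sad": {1, 4, 15},
}

def classify_expression(active_aus):
    """Return the expression whose required AU set is contained in the detected AUs."""
    active = set(active_aus)
    for label, required in RULES.items():
        if required <= active:  # subset test: all required AUs are present
            return label
    return "neutral"
```

For example, a frame with AUs {6, 12, 25} detected would be classified as "happy" because the happiness rule set is a subset of the active AUs.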
11

Yasser, Fatima I., Bassam H. Abd, and Saad M. Abbas. "Detection of confusion behavior using a facial expression based on different classification algorithms." Engineering and Technology Journal 39, no. 2A (2021): 316–25. http://dx.doi.org/10.30684/etj.v39i2a.1750.

Abstract:
Confusion detection systems (CDSs) that require noninvasive, mobile, and cost-effective methods use facial expressions as a technique to detect confusion. In previous works, the technology a system used represented a major gap between the proposed CDS and other systems. This CDS depends on the Facial Action Coding System (FACS) to extract facial features. FACS describes the motion of the facial muscles in terms of Action Units (AUs); each movement involves one facial muscle or more. Seven AUs are used as possible markers for detecting confusion…
12

Gavrilescu, Mihai, and Nicolae Vizireanu. "Predicting Depression, Anxiety, and Stress Levels from Videos Using the Facial Action Coding System." Sensors 19, no. 17 (2019): 3693. http://dx.doi.org/10.3390/s19173693.

Abstract:
We present the first study in the literature aimed at determining Depression Anxiety Stress Scale (DASS) levels by analyzing facial expressions using the Facial Action Coding System (FACS) by means of a unique noninvasive three-layer architecture designed to offer high accuracy and fast convergence: in the first layer, Active Appearance Models (AAM) and a set of multiclass Support Vector Machines (SVM) are used for Action Unit (AU) classification; in the second layer, a matrix is built containing the AUs' intensity levels; and in the third layer, an optimal feedforward neural network…
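The three-layer architecture summarized above can be sketched as a data-flow skeleton. Layer 1 (AAM + multiclass-SVM AU classification) is stubbed out, and the AU list, weights, and scoring function are placeholders, not the paper's trained model:

```python
import math

# Hypothetical skeleton of the three-layer data flow described in the abstract:
# layer 1 classifies AUs per frame, layer 2 builds an AU-intensity matrix,
# layer 3 maps the matrix to a score. All parameters here are invented.

def layer1_classify_aus(frame):
    """Stub for AAM + multiclass-SVM AU classification (assumed interface)."""
    return frame  # pretend the frame is already a {au_number: intensity} dict

def layer2_intensity_matrix(frames, aus=(1, 4, 12, 15)):
    """Build a frames x AUs matrix of intensity levels (FACS scores intensity 0-5)."""
    return [[layer1_classify_aus(f).get(au, 0) for au in aus] for f in frames]

def layer3_score(matrix, weights):
    """Toy feedforward pass: weighted sum of mean AU intensities, squashed to (0, 1)."""
    n = len(matrix)
    means = [sum(row[j] for row in matrix) / n for j in range(len(matrix[0]))]
    z = sum(w * m for w, m in zip(weights, means))
    return 1.0 / (1.0 + math.exp(-z))  # logistic squashing
```

The point of the sketch is the staged interface: each layer consumes only the previous layer's output, which is what lets the paper swap in AAM/SVM models at layer 1 without touching the scorer.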
13

Owusu, Ebenezer, Ebenezer Komla Gavua, and Zhan Yong-Zhao. "Facial Expression Recognition – A Comprehensive Review." International Journal of Technology and Management Research 1, no. 4 (2020): 29–46. http://dx.doi.org/10.47127/ijtmr.v1i4.36.

Abstract:
In this paper, we provide a comprehensive review of modern facial expression recognition systems. The history of the technology as well as its current status in terms of accomplishments and challenges is emphasized. First, we highlight some modern applications of the technology. The best methods of face detection, an essential component of automatic facial expression systems, are also discussed, as is the Facial Action Coding System, the cumulative body of research and development on micro-expressions within the behavioral sciences. Then various facial expression…
14

Correia-Caeiro, Catia, Kathryn Holmes, and Takako Miyabe-Nishiwaki. "Extending the MaqFACS to measure facial movement in Japanese macaques (Macaca fuscata) reveals a wide repertoire potential." PLOS ONE 16, no. 1 (2021): e0245117. http://dx.doi.org/10.1371/journal.pone.0245117.

Abstract:
Facial expressions are complex and subtle signals, central for communication and emotion in social mammals. Traditionally, facial expressions have been classified as a whole, disregarding small but relevant differences in displays. Even with the same morphological configuration, different information can be conveyed depending on the species. Due to hardwired processing of faces in the human brain, humans are quick to attribute emotion but have difficulty registering facial movement units. The well-known human FACS (Facial Action Coding System) is the gold standard for objectively measuring…
15

Buhari, Adamu Muhammad, Chee-Pun Ooi, Vishnu Monn Baskaran, Raphaël C. W. Phan, KokSheik Wong, and Wooi-Haw Tan. "FACS-Based Graph Features for Real-Time Micro-Expression Recognition." Journal of Imaging 6, no. 12 (2020): 130. http://dx.doi.org/10.3390/jimaging6120130.

Abstract:
Several studies on micro-expression recognition have contributed mainly to accuracy improvement. However, computational complexity receives comparatively less attention and therefore increases the cost of micro-expression recognition in real-time applications. In addition, the majority of existing approaches require at least two frames (i.e., the onset and apex frames) to compute features for every sample. This paper puts forward new facial graph features based on 68-point landmarks using the Facial Action Coding System (FACS). The proposed feature extraction technique (FACS-based graph features)…
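Landmark-based graph features of the kind described above reduce, at their simplest, to distances between selected pairs of the 68 facial landmarks. The edge set below is invented for illustration; the paper's actual graph is not reproduced here:

```python
import math

# Sketch of landmark-pair distance features in the spirit of FACS-based graph
# features. Landmarks are (x, y) points indexed like the common 68-point
# annotation scheme; the edge set below is hypothetical.
EDGES = [
    (21, 22),  # gap between inner brow points
    (36, 45),  # outer eye corner to outer eye corner
    (48, 54),  # mouth corner to mouth corner
    (51, 57),  # upper lip to lower lip
]

def graph_features(landmarks):
    """Return Euclidean lengths of the chosen landmark pairs as a feature vector."""
    return [math.dist(landmarks[i], landmarks[j]) for i, j in EDGES]
```

Because every feature comes from a single frame's landmarks, a vector like this supports the abstract's claim that features can be computed without an onset/apex frame pair.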
16

Verma, Dhruv, Sejal Bhalla, Dhruv Sahnan, Jainendra Shukla, and Aman Parnami. "ExpressEar." Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5, no. 3 (2021): 1–28. http://dx.doi.org/10.1145/3478085.

Abstract:
Continuous and unobtrusive monitoring of facial expressions holds tremendous potential to enable compelling applications in a multitude of domains ranging from healthcare and education to interactive systems. Traditional, vision-based facial expression recognition (FER) methods, however, are vulnerable to external factors like occlusion and lighting, while also raising privacy concerns coupled with the impractical requirement of positioning the camera in front of the user at all times. To bridge this gap, we propose ExpressEar, a novel FER system that repurposes commercial earables augmented with…
17

Vick, Sarah-Jane, Bridget M. Waller, Lisa A. Parr, Marcia C. Smith Pasqualini, and Kim A. Bard. "A Cross-species Comparison of Facial Morphology and Movement in Humans and Chimpanzees Using the Facial Action Coding System (FACS)." Journal of Nonverbal Behavior 31, no. 1 (2006): 1–20. http://dx.doi.org/10.1007/s10919-006-0017-z.

18

Davison, Adrian, Walied Merghani, and Moi Yap. "Objective Classes for Micro-Facial Expression Recognition." Journal of Imaging 4, no. 10 (2018): 119. http://dx.doi.org/10.3390/jimaging4100119.

Abstract:
Micro-expressions are brief spontaneous facial expressions that appear on a face when a person conceals an emotion, making them different from normal facial expressions in subtlety and duration. Currently, emotion classes within the CASME II dataset (Chinese Academy of Sciences Micro-expression II) are based on Action Units and self-reports, creating conflicts during machine learning training. We show that classifying expressions using Action Units, instead of predicted emotion, removes the potential bias of human reporting. The proposed classes are tested using LBP-TOP (Local Binary Patterns on Three Orthogonal Planes)…
19

Clarici, Andrea, Francesca Melon, Susanne Braun, and Antonio Bava. "Asymmetries of Facial Motility during the Dissimulation of Emotion." Perceptual and Motor Skills 83, no. 1 (1996): 263–74. http://dx.doi.org/10.2466/pms.1996.83.1.263.

Abstract:
The asymmetries of facial expression were estimated in a sample of 14 experimental subjects with the Facial Action Coding System during voluntary control of facial mimicry while viewing videotapes. The subjects were instructed to express facially the emotion experienced or to dissimulate their true emotion with a facial expression opposite (incongruous) to what they actually felt. Only during dissimulation did facial mimicry show an asymmetric distribution toward the lower left side of the face.
20

Kulkarni, Praveen, and Rajesh T. M. "Analysis on techniques used to recognize and identifying the Human emotions." International Journal of Electrical and Computer Engineering (IJECE) 10, no. 3 (2020): 3307. http://dx.doi.org/10.11591/ijece.v10i3.pp3307-3314.

Abstract:
Facial expression is a major channel of non-verbal communication in day-to-day life. Statistical analyses suggest that only 7 percent of a message is conveyed through verbal communication, while 55 percent is transmitted by facial expression. Emotional expression has been a research subject of physiology since Darwin's work on emotional expression in the 19th century. According to psychological theory, human emotion is classified into six major emotions: happiness, fear, anger, surprise, disgust, and sadness. Facial expressions which involve the emotions…
21

Aswari, Puji, and Nova Eka Diana. "IDENTIFIKASI EMOSI BERDASARKAN ACTION UNIT MENGGUNAKAN METODE BÉZIER CURVE." SINERGI 20, no. 1 (2016): 74. http://dx.doi.org/10.22441/sinergi.2016.1.010.

Abstract:
Facial expression is a universal language, and changes in facial expression can even support decision-making. In 1972, Paul Ekman classified basic human emotions into six types: happiness, sadness, surprise, anger, fear, and disgust. Ekman and Wallace Friesen then developed a tool to measure facial movements, called the Facial Action Coding System (FACS). FACS determines facial expressions from the movements of the facial muscles, termed Action Units (AUs). This research aims to identify the emotion of interest experienced by a person based on the AUs that…
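The Bézier-curve method summarized above rests on evaluating a cubic Bézier fitted to facial landmarks, for example along the mouth contour. A minimal sketch follows, with invented control points and no claim to match the paper's fitting procedure:

```python
# Minimal cubic Bezier evaluation, the geometric primitive behind Bezier-curve
# AU analysis. Control points and the four-landmark "fit" are illustrative only.

def bezier_point(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier with control points p0..p3 at parameter t in [0, 1]."""
    u = 1.0 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return (x, y)

def mouth_curve(landmarks, samples=5):
    """Sample a cubic Bezier through four mouth landmarks (illustrative fit)."""
    p0, p1, p2, p3 = landmarks
    return [bezier_point(p0, p1, p2, p3, i / (samples - 1)) for i in range(samples)]
```

Comparing the sampled curve for a neutral face against the current frame (e.g., how far the midpoints rise or fall) is the kind of shape cue such a method can feed into AU detection.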
22

Yan, Jizheng, Zhiliang Wang, and Yan Yan. "Humanoid Robot Head Design Based on Uncanny Valley and FACS." Journal of Robotics 2014 (2014): 1–5. http://dx.doi.org/10.1155/2014/208924.

Abstract:
Emotional robots are a constant focus of artificial intelligence (AI), and intelligent control of robot facial expression is a hot research topic. This paper focuses on the design of a humanoid robot head, achieved in three steps. The first step is to address the uncanny valley for humanoid robots, finding the relationship between human and robot so it can be avoided; the second step is to establish the association between the human face and the robot head; comparing humans and robots, we analyze the similarities and differences and explore the shared basis and mechanisms between robot…
23

Defrin, Ruth, Tali Benromano, and Chaim G. Pick. "Specific Behavioral Responses Rather Than Autonomic Responses Can Indicate and Quantify Acute Pain among Individuals with Intellectual and Developmental Disabilities." Brain Sciences 11, no. 2 (2021): 253. http://dx.doi.org/10.3390/brainsci11020253.

Abstract:
Individuals with intellectual and developmental disabilities (IDD) are at a high risk of experiencing pain. Pain management requires assessment, a challenging mission considering the impaired communication skills in IDD. We analyzed subjective and objective responses following calibrated experimental stimuli to determine whether they can differentiate between painful and non-painful states, and adequately quantify pain among individuals with IDD. Eighteen adults with IDD and 21 healthy controls (HC) received experimental pressure stimuli (innocuous, mildly noxious, and moderately noxious)…
24

Hong, Yu-Jin, Sung Eun Choi, Gi Pyo Nam, Heeseung Choi, Junghyun Cho, and Ig-Jae Kim. "Adaptive 3D Model-Based Facial Expression Synthesis and Pose Frontalization." Sensors 20, no. 9 (2020): 2578. http://dx.doi.org/10.3390/s20092578.

Abstract:
Facial expressions are one of the important non-verbal ways used to understand human emotions during communication. Thus, acquiring and reproducing facial expressions is helpful in analyzing human emotional states. However, owing to complex and subtle facial muscle movements, facial expression modeling from images with face poses is difficult to achieve. To handle this issue, we present a method for acquiring facial expressions from a non-frontal single photograph using a 3D-aided approach. In addition, we propose a contour-fitting method that improves the modeling accuracy by automatically…
25

Okuda, Itsuko, Yumika Yamakawa, Nobu Mitani, Naoko Ota, Marie Kawabata, and Naoki Yoshioka. "Objective evaluation of the relationship between facial expression analysis by the facial action coding system (FACS) and CT/MRI analyses of the facial expression muscles." Skin Research and Technology 26, no. 5 (2020): 727–33. http://dx.doi.org/10.1111/srt.12864.

26

Fiorentini, Chiara, Susanna Schmidt, and Paolo Viviani. "The Identification of Unfolding Facial Expressions." Perception 41, no. 5 (2012): 532–55. http://dx.doi.org/10.1068/p7052.

Abstract:
We asked whether the identification of emotional facial expressions (FEs) involves the simultaneous perception of the facial configuration or the detection of emotion-specific diagnostic cues. We recorded at high speed (500 frames/s) the unfolding of the FE in five actors, each expressing six emotions (anger, surprise, happiness, disgust, fear, sadness). Recordings were coded every 10 frames (20 ms of real time) with the Facial Action Coding System (FACS, Ekman et al 2002, Salt Lake City, UT: Research Nexus eBook) to identify the facial actions contributing to each expression…
27

Ihme, Klas, Christina Dömeland, Maria Freese, and Meike Jipp. "Frustration in the face of the driver." Interaction Studies 19, no. 3 (2018): 487–98. http://dx.doi.org/10.1075/is.17005.ihm.

Abstract:
Frustration in traffic is one of the causes of aggressive driving. Knowledge of whether a driver is frustrated may be utilized by future advanced driver assistance systems to counteract this source of crashes. One possibility for achieving this is to automatically recognize the facial expressions of drivers. However, little is known about the facial expressions of frustrated drivers. Here, we report the results of a driving simulator study investigating the facial muscle activity that accompanies frustration. Twenty-eight participants were video-taped during frustrated and non-frustrated…
28

Reilly, Judy Snitzer, and Ursula Bellugi. "Competition on the face: affect and language in ASL motherese." Journal of Child Language 23, no. 1 (1996): 219–39. http://dx.doi.org/10.1017/s0305000900010163.

Abstract:
Research on early mother-child interaction has documented the crucial role affect plays in the content and modulation of early interactions. For hearing mothers, voice quality is considered to be the single most informative channel for affective expression. For deaf caregivers who use American Sign Language (ASL), the vocal channel is unavailable, and facial expression is critically important. Not only do facial behaviours signal affective and communicative information, but specific facial behaviours also function as obligatory grammatical markers. This multifunctionality of facial expression…
29

Dias, Carlos Magno Machado, Carlos Alberto Gonçalves, and Ângela Maria Ribeiro. "Valências na resposta emocional dos eleitores: design experimental com neurociência." Revista de Administração da UFSM 13, no. 3 (2020): 483–500. http://dx.doi.org/10.5902/1983465943111.

Abstract:
This work aims to quantitatively and qualitatively evaluate the valence of voters' emotional responses to changes in the scenarios of political propaganda videos. The experiment was conducted in a laboratory with a fictitious candidate and content. We used four different scenarios: one with a completely white background, one simulating a library, one with a modest house, and one with luxury houses. We used the Facial Action Coding System (FACS) as an instrument to measure emotions. We found statistical differences between the intensity of the valences throughout the video (n=108)…
30

Smith, Marcia C., Melissa K. Smith, and Heiner Ellgring. "Spontaneous and posed facial expression in Parkinson's Disease." Journal of the International Neuropsychological Society 2, no. 5 (1996): 383–91. http://dx.doi.org/10.1017/s1355617700001454.

Abstract:
Spontaneous and posed emotional facial expressions in individuals with Parkinson's disease (PD, n = 12) were compared with those of healthy age-matched controls (n = 12). The intensity and amount of facial expression in PD patients were expected to be reduced for spontaneous but not posed expressions. Emotional stimuli were video clips selected from films, 2-5 min in duration, designed to elicit feelings of happiness, sadness, fear, disgust, or anger. Facial movements were coded using Ekman and Friesen's (1978) Facial Action Coding System (FACS). In addition, participants rated their…
31

Stewart, Patrick A., Erik P. Bucy, and Marc Mehu. "Strengthening bonds and connecting with followers." Politics and the Life Sciences 34, no. 1 (2015): 73–92. http://dx.doi.org/10.1017/pls.2015.5.

Abstract:
The smiles and affiliative expressions of presidential candidates are important for political success, allowing contenders to nonverbally connect with potential supporters and bond with followers. Smiles, however, are not unitary displays; they are multifaceted in composition and signaling intent due to variations in performance. With this in mind, we examine the composition and perception of smiling behavior by Republican presidential candidates during the 2012 preprimary period. In this paper we review literature concerning different smile types and the muscular movements that compose them…
32

Bänninger-Huber, Eva. "Prozesse der Emotionsregulierung in psychoanalytischen Langzeitpsychotherapien." Psychotherapie Forum 25, no. 1-2 (2021): 80–87. http://dx.doi.org/10.1007/s00729-021-00172-7.

Abstract:
The project aims to describe microanalytically, on the basis of video recordings, the processes of affect regulation in psychotherapeutic interactions and to relate them to a productive therapeutic process. Facial behaviors are analyzed and objectively coded with the Facial Action Coding System (FACS). The paper focuses on the so-called Prototypical Affective Microsequences (PAMs). PAMs are characterized by smiling and laughter and serve to regulate disturbances in affect regulation with the help of the interaction partner…
33

Perez-Gomez, Vianney, Homero V. Rios-Figueroa, Ericka Janet Rechy-Ramirez, Efrén Mezura-Montes, and Antonio Marin-Hernandez. "Feature Selection on 2D and 3D Geometric Features to Improve Facial Expression Recognition." Sensors 20, no. 17 (2020): 4847. http://dx.doi.org/10.3390/s20174847.

Abstract:
An essential aspect of the interaction between people and computers is the recognition of facial expressions. A key issue in this process is selecting relevant features to classify facial expressions accurately. This study examines the selection of optimal geometric features to classify six basic facial expressions: happiness, sadness, surprise, fear, anger, and disgust. Inspired by the Facial Action Coding System (FACS) and the Moving Picture Experts Group 4 (MPEG-4) standard, an initial set of 89 features was proposed. These features are normalized distances and angles in 2D and 3D computed…
34

Hao, Min, Guangyuan Liu, Anu Gokhale, Ya Xu, and Rui Chen. "Detecting Happiness Using Hyperspectral Imaging Technology." Computational Intelligence and Neuroscience 2019 (January 15, 2019): 1–16. http://dx.doi.org/10.1155/2019/1965789.

Abstract:
Hyperspectral imaging (HSI) technology can be used to detect human emotions based on its power to discriminate materials in faces. In this paper, HSI is used to remotely sense and distinguish blood chromophores in facial tissues and acquire an evaluation indicator (tissue oxygen saturation, StO2) using an optical absorption model. This study explored facial analysis while people were showing spontaneous expressions of happiness during social interaction. Happiness, as a psychological emotion, has been shown to be strongly linked to other activities such as physiological reaction and…
35

Hadjistvropoulos, Thomas, Diane LaChapelle, Farley MacLeod, Carla Hale, Norm O’Rourke, and Kenneth D. Craig. "Cognitive Functioning and Pain Reactions in Hospitalized Elders." Pain Research and Management 3, no. 3 (1998): 145–51. http://dx.doi.org/10.1155/1998/621580.

Abstract:
BACKGROUND: Objectively coded facial activity provides a useful index of pain among elders who have difficulty in reporting pain because of cognitive impairments. However, limitations of previous research include no direct assessment of participants' level of cognitive impairment; no comparison of the reactions of elders with cognitive impairments with those of nonimpaired elders; and the possibility that observers' expectations about pain levels influenced judgements about the severity of pain experienced when global rather than objectively coded measures were used, because the painful medical procedure was…
36

Andersen, Pia Haubro, Sofia Broomé, Maheen Rashid, et al. "Towards Machine Recognition of Facial Expressions of Pain in Horses." Animals 11, no. 6 (2021): 1643. http://dx.doi.org/10.3390/ani11061643.

Abstract:
Automated recognition of human facial expressions of pain and emotions is to a certain degree a solved problem, using approaches based on computer vision and machine learning. However, the application of such methods to horses has proven difficult. Major barriers are the lack of sufficiently large, annotated databases for horses and difficulties in obtaining correct classifications of pain because horses are non-verbal. This review describes our work to overcome these barriers, using two different approaches. One involves the use of a manual, but relatively objective, classification system for…
37

Elliott, Eeva A., and Arthur M. Jacobs. "Phonological and morphological faces." Sign Language and Linguistics 17, no. 2 (2014): 123–80. http://dx.doi.org/10.1075/sll.17.2.01ell.

Abstract:
In this study, we use a corpus to verify the observation that signs for emotion-related concepts in German Sign Language are articulated with congruent facial movements. We propose an account of the function of these facial movements in the language that also explains the function of mouthings and other facial movements at the lexical level. Our data, taken from 20 signers in three different conditions, show that for the disgust-related signs, a disgust-related facial movement with temporal scope only over the individual sign occurred in most cases. These movements often occurred in addition…
38

Dumer, Aleksey I., Harriet Oster, David McCabe, et al. "Effects of the Lee Silverman Voice Treatment (LSVT® LOUD) on Hypomimia in Parkinson's Disease." Journal of the International Neuropsychological Society 20, no. 3 (2014): 302–12. http://dx.doi.org/10.1017/s1355617714000046.

Abstract:
Given associations between facial movement and voice, the potential of the Lee Silverman Voice Treatment (LSVT) to alleviate decreased facial expressivity, termed hypomimia, in Parkinson's disease (PD) was examined. Fifty-six participants—16 PD participants who underwent LSVT, 12 PD participants who underwent articulation treatment (ARTIC), 17 untreated PD participants, and 11 controls without PD—produced monologues about happy emotional experiences at pre- and post-treatment timepoints (“T1” and “T2,” respectively), 1 month apart. The groups of LSVT, ARTIC, and untreated PD participants…
39

Zhao, Yue, and Jiancheng Xu. "Necessary Morphological Patches Extraction for Automatic Micro-Expression Recognition." Applied Sciences 8, no. 10 (2018): 1811. http://dx.doi.org/10.3390/app8101811.

Full text
Abstract:
Micro expressions are usually subtle and brief facial expressions that humans use to hide their true emotional states. In recent years, micro-expression recognition has attracted wide attention in the fields of psychology, mass media, and computer vision. The shortest micro expression lasts only 1/25 s. Furthermore, different from macro-expressions, micro-expressions have considerable low intensity and inadequate contraction of the facial muscles. Based on these characteristics, automatic micro-expression detection and recognition are great challenges in the field of computer vision. In this p
APA, Harvard, Vancouver, ISO, and other styles
40

Kasatkina, D. A., A. M. Kravchenko, R. B. Kupriyanov, and E. V. Nekhorosheva. "Automatic engagement detection in the education: critical review." Современная зарубежная психология 9, no. 3 (2020): 59–68. http://dx.doi.org/10.17759/jmfp.2020090305.

Full text
Abstract:
This paper reviews the key research on automatic engagement detection in education. Automatic engagement detection is necessary for enhancing the educational process, yet there is a lack of out-of-the-box technical solutions. Engagement can be detected by tracing learning-centered affects: interest, confusion, frustration, delight, anger, boredom, and their facial and bodily expressions. Most researchers reveal these emotions on video using the Facial Action Coding System (FACS). But there doesn’t exist a set of ready-made criteria to detect engagement, and many scientists use additional techn
APA, Harvard, Vancouver, ISO, and other styles
41

Da Silva, Emely Pujolli, Kate Mamhy Oliveira Kumada, and Paula Dornhofer Paro Costa. "Analysis of Facial Expressions in Brazilian Sign Language (Libras)." European Scientific Journal, ESJ 17, no. 22 (2021): 1. http://dx.doi.org/10.19044/esj.2021.v17n22p1.

Full text
Abstract:
Brazilian Sign Language (in Portuguese, Libras) is a visuospatial linguistic system adopted by the Brazilian deaf communities as the primary form of communication. Libras is a language of minority groups, thus its research and production of teaching materials do not receive the same incentive to progress or improve as oral languages. This complex language employs signs composed of hand shapes and movements combined with facial expressions and postures of the body. Facial expressions rarely appear in sign language literature, despite their being essential to this form of communication. There
APA, Harvard, Vancouver, ISO, and other styles
42

Wang, Huanmin. "Enhanced Forest Microexpression Recognition Based on Optical Flow Direction Histogram and Deep Multiview Network." Mathematical Problems in Engineering 2020 (August 31, 2020): 1–11. http://dx.doi.org/10.1155/2020/5675914.

Full text
Abstract:
In order to recognize the instantaneous changes of facial microexpressions in natural environment, a method based on optical flow direction histogram and depth multiview network to enhance forest microexpression recognition was proposed. In the preprocessing stage, the histogram equalization of the acquired face image is performed, and then the dense key points of the face are detected. According to the coordinates of the key points and the Facial Action Coding System (FACS), the face region is divided into 15 regions of interest (ROI). In the feature extraction stage, the optical flow direction
APA, Harvard, Vancouver, ISO, and other styles
43

Kanovský, Martin, Martina Baránková, Júlia Halamová, Bronislava Strnádelová, and Jana Koróniová. "Analysis of Facial Expressions Made While Watching a Video Eliciting Compassion." Perceptual and Motor Skills 127, no. 2 (2020): 317–46. http://dx.doi.org/10.1177/0031512519897512.

Full text
Abstract:
The aim of the study was to describe the spontaneous facial expressions elicited by viewers of a compassionate video in terms of the respondents’ muscular activity of single facial action units (AUs). We recruited a convenience sample of 111 undergraduate psychology students, aged 18-25 years (M = 20.53; SD = 1.62) to watch (at home alone) a short video stimulus eliciting compassion, and we recorded the respondents’ faces using webcams. We used both a manual analysis, based on the Facial Action Coding System, and an automatic analysis of the holistic recognition of facial expressions as obtai
APA, Harvard, Vancouver, ISO, and other styles
44

Wolf, Karsten, Thomas Raedler, Kai Henke, et al. "The Face of Pain - A Pilot Study to Validate the Measurement of Facial Pain Expression with an Improved EMG Method." Pain Research and Management 10, no. 1 (2005): 15–19. http://dx.doi.org/10.1155/2005/643075.

Full text
Abstract:
OBJECTIVE: The purpose of this pilot study was to establish the validity of an improved facial electromyogram (EMG) method for the measurement of facial pain expression.BACKGROUND: Darwin defined pain in connection with fear as a simultaneous occurrence of eye staring, brow contraction and teeth chattering. Prkachin was the first to use the video-based Facial Action Coding System to measure facial expressions while using four different types of pain triggers, identifying a group of facial muscles around the eyes.METHOD: The activity of nine facial muscles in 10 healthy male subjects was analyz
APA, Harvard, Vancouver, ISO, and other styles
45

Altimir Colao, Carolina, and Nelson Valdés-Sánchez. "Facial-affective communication and verbal relational offers during ruptures and resolution strategies: A single case study." CES Psicología 13, no. 3 (2020): 180–200. http://dx.doi.org/10.21615/cesp.13.3.11.

Full text
Abstract:
Research on the therapeutic relationship has underscored its central role for the therapeutic change process, indicating the relevance of determining the specific elements and mechanisms involved in its configuration (Knobloch-Fedders, Elkin, & Kiesler, 2014). Research on ruptures of the therapeutic relationship has yielded particular contributions to better understanding the interpersonal negotiation process involved in the patient-therapist interaction. Although previous studies have contributed to the objective characterization and the exhaustive description of ruptures, more research i
APA, Harvard, Vancouver, ISO, and other styles
46

Vicianova, Martina. "Historical Techniques of Lie Detection." Europe’s Journal of Psychology 11, no. 3 (2015): 522–34. http://dx.doi.org/10.5964/ejop.v11i3.919.

Full text
Abstract:
Since time immemorial, lying has been a part of everyday life. For this reason, it has become a subject of interest in several disciplines, including psychology. The purpose of this article is to provide a general overview of the literature and thinking to date about the evolution of lie detection techniques. The first part explores ancient methods recorded circa 1000 B.C. (e.g., God’s judgment in Europe). The second part describes technical methods based on sciences such as phrenology, polygraph and graphology. This is followed by an outline of more modern-day approaches such as FACS (Facial
APA, Harvard, Vancouver, ISO, and other styles
47

Hale, Carla J., and Thomas Hadjistavropoulos. "Emotional Components of Pain." Pain Research and Management 2, no. 4 (1997): 217–25. http://dx.doi.org/10.1155/1997/283582.

Full text
Abstract:
BACKGROUND: Current definitions of pain suggest that emotion is an essential component of pain, however, the presumed relationship between emotion and pain, and the specific emotions that are involved in pain experiences have yet to be clarified.OBJECTIVE: To address these issues in order to assist in making current conceptualizations of pain more explicit.DESIGN: Thirty adult patients undergoing routine blood tests were videotaped. Spontaneous facial reactions were examined for distinct expressions of emotion and pain occurring at baseline, swabbing and venepuncture intervals. Expressions wer
APA, Harvard, Vancouver, ISO, and other styles
48

Lundblad, Johan, Maheen Rashid, Marie Rhodin, and Pia Haubro Andersen. "Effect of transportation and social isolation on facial expressions of healthy horses." PLOS ONE 16, no. 6 (2021): e0241532. http://dx.doi.org/10.1371/journal.pone.0241532.

Full text
Abstract:
Horses have the ability to generate a remarkable repertoire of facial expressions, some of which have been linked to the affective component of pain. This study describes the facial expressions in healthy horses free of pain before and during transportation and social isolation, which are putatively stressful but ordinary management procedures. Transportation was performed in 28 horses by subjecting them to short-term road transport in a horse trailer. A subgroup (n = 10) of these horses was also subjected to short-term social isolation. During all procedures, a body-mounted, remote-controlled
APA, Harvard, Vancouver, ISO, and other styles
49

Almeida, Cláudio Santos de. "Análise emocional de produtos de design baseada em expressão facial." InfoDesign - Revista Brasileira de Design da Informação 7, no. 3 (2011): 19–27. http://dx.doi.org/10.51358/id.v7i3.96.

Full text
Abstract:
This article describes a method for analyzing products based on collecting information about the emotional state of subjects through emotion recognition based on changes in the face. Grounded in Donald Norman's classification of the levels of interaction between design objects and human beings, the article presents an analysis model that can be used to obtain information about the reactive level of design. It proposes the use of the Facial Action Coding System through automated facial recognition techniques in order to reduce subjectivity and ensure practicality of use in applications
APA, Harvard, Vancouver, ISO, and other styles
50

Schiller, Devon. "For Now We See through an AI Darkly; but Then Face-to-Face: A Brief Survey of Emotion Recognition in Biometric Art." Przegląd Kulturoznawczy, no. 3 (45) (2020): 230–60. http://dx.doi.org/10.4467/20843860pk.20.025.12585.

Full text
Abstract:
Our knowledge about the facial expression of emotion may well be entering an age of scientific revolution. Conceptual models for facial behavior and emotion phenomena appear to be undergoing a paradigm shift brought on at least in part by advances made in facial recognition technology and automated facial expression analysis. And the use of technological labor by corporate, government, and institutional agents for extracting data capital from both the static morphology of the face and dynamic movement of the emotions is accelerating. Through a brief survey, the author seeks to introduce what h
APA, Harvard, Vancouver, ISO, and other styles