Journal articles on the topic 'Emotion Models'

Consult the top 50 journal articles for your research on the topic 'Emotion Models.'

1

Thornton, Mark A., and Diana I. Tamir. "Mental models accurately predict emotion transitions." Proceedings of the National Academy of Sciences 114, no. 23 (2017): 5982–87. http://dx.doi.org/10.1073/pnas.1616056114.

Abstract:
Successful social interactions depend on people’s ability to predict others’ future actions and emotions. People possess many mechanisms for perceiving others’ current emotional states, but how might they use this information to predict others’ future states? We hypothesized that people might capitalize on an overlooked aspect of affective experience: current emotions predict future emotions. By attending to regularities in emotion transitions, perceivers might develop accurate mental models of others’ emotional dynamics. People could then use these mental models of emotion transitions to pred
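
The core idea of learning regularities in emotion transitions can be illustrated with a toy sketch (not the authors' analysis): count how often one emotional state follows another in an observed sequence and use the resulting transition probabilities as a predictive mental model. The emotion labels and the sequence below are invented for illustration.

    # Toy illustration only: estimating emotion transition probabilities from an
    # invented sequence and using them to say what tends to follow a given state.
    from collections import Counter, defaultdict

    sequence = ["calm", "happy", "happy", "calm", "sad", "calm", "happy"]
    counts = defaultdict(Counter)
    for current, following in zip(sequence, sequence[1:]):
        counts[current][following] += 1

    transitions = {
        state: {nxt: n / sum(nexts.values()) for nxt, n in nexts.items()}
        for state, nexts in counts.items()
    }
    print(transitions["calm"])  # probabilities of what tends to follow "calm"
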
2

Fathalla, Rana. "Emotional Models." International Journal of Synthetic Emotions 11, no. 2 (2020): 1–18. http://dx.doi.org/10.4018/ijse.2020070101.

Abstract:
Emotion modeling has gained attention for almost two decades now due to the rapid growth of affective computing (AC). AC aims to detect and respond to the end-user's emotions by devices and computers. Despite the hard efforts being directed to emotion modeling with numerous tries to build different models of emotions, emotion modeling remains an art with a lack of consistency and clarity regarding the exact meaning of emotion modeling. This review deconstructs the vagueness of the term ‘emotion modeling' by discussing the various types and categories of emotion modeling, including computationa
3

Mehta, Ansh. "Emotion Detection using Social Media Data." International Journal for Research in Applied Science and Engineering Technology 9, no. 11 (2021): 1456–59. http://dx.doi.org/10.22214/ijraset.2021.39027.

Abstract:
Previous research on emotion recognition of Twitter users centered on the use of lexicons and basic classifiers on bag-of-words models, despite the recent accomplishments of deep learning in many disciplines of natural language processing. The study's main question is whether deep learning can help improve this performance. Because of the scant contextual information that most posts offer, emotion analysis is still difficult. The suggested method can capture more emotion semantics than existing models by projecting emoticons and words into emoticon space, which improves the performance…
4

Vuoskoski, Jonna K., and Tuomas Eerola. "Measuring music-induced emotion." Musicae Scientiae 15, no. 2 (2011): 159–73. http://dx.doi.org/10.1177/1029864911403367.

Abstract:
Most previous studies investigating music-induced emotions have applied emotion models developed in other fields to the domain of music. The aim of this study was to compare the applicability of music-specific and general emotion models – namely the Geneva Emotional Music Scale (GEMS), and the discrete and dimensional emotion models – in the assessment of music-induced emotions. A related aim was to explore the role of individual difference variables (such as personality and mood) in music-induced emotions, and to discover whether some emotion models reflect these individual differences more s
5

Vuoskoski, Jonna K., and Tuomas Eerola. "Measuring Music-Induced Emotion: A Comparison of Emotion Models, Personality Biases, and Intensity of Experiences." Musicae Scientiae 15, no. 2 (2011): 159–73. http://dx.doi.org/10.1177/102986491101500203.

Abstract:
Most previous studies investigating music-induced emotions have applied emotion models developed in other fields to the domain of music. The aim of this study was to compare the applicability of music-specific and general emotion models – namely the Geneva Emotional Music Scale (GEMS), and the discrete and dimensional emotion models – in the assessment of music-induced emotions. A related aim was to explore the role of individual difference variables (such as personality and mood) in music-induced emotions, and to discover whether some emotion models reflect these individual differences more s
6

He, Jing-Xian, Li Zhou, Zhen-Tao Liu, and Xin-Yue Hu. "Digital Empirical Research of Influencing Factors of Musical Emotion Classification Based on Pleasure-Arousal Musical Emotion Fuzzy Model." Journal of Advanced Computational Intelligence and Intelligent Informatics 24, no. 7 (2020): 872–81. http://dx.doi.org/10.20965/jaciii.2020.p0872.

Abstract:
In recent years, with the further breakthrough of artificial intelligence theory and technology, as well as the further expansion of the Internet scale, the recognition of human emotions and the necessity for satisfying human psychological needs in future artificial intelligence technology development tendencies have been highlighted, in addition to physical task accomplishment. Musical emotion classification is an important research topic in artificial intelligence. The key premise of realizing music emotion classification is to construct a musical emotion model that conforms to the character
7

Cahyani, Denis Eka, and Irene Patasik. "Performance comparison of TF-IDF and Word2Vec models for emotion text classification." Bulletin of Electrical Engineering and Informatics 10, no. 5 (2021): 2780–88. http://dx.doi.org/10.11591/eei.v10i5.3157.

Abstract:
Emotion is the human feeling when communicating with other humans or reacting to everyday events. Emotion classification is needed to recognize human emotions from text. This study compares the performance of the TF-IDF and Word2Vec models for representing features in emotional text classification. We use the support vector machine (SVM) and Multinomial Naïve Bayes (MNB) methods for classification of emotional text in commuter line and Transjakarta tweet data. The emotion classification in this study has two steps. The first step classifies data as containing emotion or no emotion. The second step…
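
As a rough illustration of this kind of pipeline (not the authors' exact implementation), the sketch below builds TF-IDF features and trains both an SVM and a Multinomial Naive Bayes classifier with scikit-learn; the two example tweets and their emotion labels are placeholders.

    # Hedged sketch: TF-IDF features with SVM and Multinomial Naive Bayes for
    # emotion text classification. Texts and labels are placeholder data.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    texts = ["stuck on the commuter line again", "so happy the train was on time"]
    labels = ["anger", "joy"]  # placeholder emotion labels

    svm_model = make_pipeline(TfidfVectorizer(), LinearSVC())
    nb_model = make_pipeline(TfidfVectorizer(), MultinomialNB())
    svm_model.fit(texts, labels)
    nb_model.fit(texts, labels)
    print(svm_model.predict(["the bus broke down and I am late"]))
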
8

Hudlicka, Eva. "Guidelines for Designing Computational Models of Emotions." International Journal of Synthetic Emotions 2, no. 1 (2011): 26–79. http://dx.doi.org/10.4018/jse.2011010103.

Abstract:
Rapid growth in computational modeling of emotion and cognitive-affective architectures occurred over the past 15 years. Emotion models and architectures are built to elucidate the mechanisms of emotions and enhance believability and effectiveness of synthetic agents and robots. Despite the many emotion models developed to date, a lack of consistency and clarity regarding what exactly it means to ‘model emotions’ persists. There are no systematic guidelines for development of computational models of emotions. This paper deconstructs the often vague term ‘emotion modeling’ by suggesting the vie
9

Bruna, O., H. Avetisyan, and J. Holub. "Emotion models for textual emotion classification." Journal of Physics: Conference Series 772 (November 2016): 012063. http://dx.doi.org/10.1088/1742-6596/772/1/012063.

10

Zhang, Yong De, Shu Tong Li, Jin Gang Jiang, and Tian Hua He. "Research on Emotion Body Language Model of the Humanoid Robot." Applied Mechanics and Materials 494-495 (February 2014): 1278–81. http://dx.doi.org/10.4028/www.scientific.net/amm.494-495.1278.

Abstract:
In order to improve the personification of human-robot emotion interaction, this paper, based on the Euclidean space emotion model and Ekman's theory of emotion, derives a general formula for the emotion characteristic value and establishes an improved robot model that can express emotion through body language. The general formula for the emotional feature value makes the model suitable for expressing various emotions, and expressing them through body language improves the generality and adaptability of the emotion expression model.
11

Fellous, Jean-Marc. "Models of emotion." Scholarpedia 2, no. 11 (2007): 1453. http://dx.doi.org/10.4249/scholarpedia.1453.

12

Liebold, Benny, René Richter, Michael Teichmann, Fred H. Hamker, and Peter Ohler. "Human Capacities for Emotion Recognition and their Implications for Computer Vision." i-com 14, no. 2 (2015): 126–37. http://dx.doi.org/10.1515/icom-2015-0032.

Abstract:
Current models for automated emotion recognition are developed under the assumption that emotion expressions are distinct expression patterns for basic emotions. Thereby, these approaches fail to account for the emotional processes underlying emotion expressions. We review the literature on human emotion processing and suggest an alternative approach to affective computing. We postulate that the generalizability and robustness of these models can be greatly increased by three major steps: (1) modeling emotional processes as a necessary foundation of emotion recognition; (2) basing models…
13

Broekens, Joost. "Modeling the Experience of Emotion." International Journal of Synthetic Emotions 1, no. 1 (2010): 1–17. http://dx.doi.org/10.4018/jse.2010101601.

Abstract:
Affective computing has proven to be a viable field of research comprised of a large number of multidisciplinary researchers, resulting in work that is widely published. The majority of this work consists of emotion recognition technology, computational modeling of causal factors of emotion and emotion expression in virtual characters and robots. A smaller part is concerned with modeling the effects of emotion on cognition and behavior, formal modeling of cognitive appraisal theory and models of emergent emotions. Part of the motivation for affective computing as a field is to better understan
14

Harata, Miho, and Masataka Tokumaru. "Emotion Generation Model with Growth Functions for Robots." Journal of Advanced Computational Intelligence and Intelligent Informatics 17, no. 2 (2013): 335–42. http://dx.doi.org/10.20965/jaciii.2013.p0335.

Abstract:
In this paper, we propose an emotion model with growth functions for robots. Many emotion models for robots have been developed using Neural Networks (NN), which focus on the functions of emotion recognition, control, and expression. One problem that affects these emotion models for robots is the development of a “simplified” emotion generation algorithm. Users readily lose interest in “simple” systems. Most models have attempted to generate complex emotional expressions, whereas no previous studies have considered the “growth of a robot.” Therefore, we propose a growth model for emotions base
15

Eerola, Tuomas, and Jonna K. Vuoskoski. "A Review of Music and Emotion Studies: Approaches, Emotion Models, and Stimuli." Music Perception 30, no. 3 (2012): 307–40. http://dx.doi.org/10.1525/mp.2012.30.3.307.

Abstract:
The field of music and emotion research has grown rapidly and diversified during the last decade. This has led to a certain degree of confusion and inconsistency between competing notions of emotions, data, and results. The present review of 251 studies describes the focus of prevalent research approaches, methods, and models of emotion, and documents the types of musical stimuli used over the past twenty years. Although self-report approaches to emotions are the most common way of dealing with music and emotions, using multiple approaches is becoming increasingly popular. A large majority (70
16

Wang, Shu, Chonghuan Xu, Austin Shijun Ding, and Zhongyun Tang. "A Novel Emotion-Aware Hybrid Music Recommendation Method Using Deep Neural Network." Electronics 10, no. 15 (2021): 1769. http://dx.doi.org/10.3390/electronics10151769.

Abstract:
Emotion-aware music recommendation has gained increasing attention in recent years, as music comes with the ability to regulate human emotions. Exploiting emotional information has the potential to improve recommendation performance. However, conventional studies identified emotion as discrete representations and could not predict users’ emotional states at time points when no user activity data exists, let alone account for the influence of social events. In this study, we proposed an emotion-aware music recommendation method using deep neural networks (emoMR). We modeled a representation…
17

Zhou, Tie Hua, Wenlong Liang, Hangyu Liu, Ling Wang, Keun Ho Ryu, and Kwang Woo Nam. "EEG Emotion Recognition Applied to the Effect Analysis of Music on Emotion Changes in Psychological Healthcare." International Journal of Environmental Research and Public Health 20, no. 1 (2022): 378. http://dx.doi.org/10.3390/ijerph20010378.

Abstract:
Music therapy is increasingly being used to promote physical health. Emotion semantic recognition is more objective and provides direct awareness of the real emotional state based on electroencephalogram (EEG) signals. Therefore, we proposed a music therapy method to carry out emotion semantic matching between the EEG signal and music audio signal, which can improve the reliability of emotional judgments, and, furthermore, deeply mine the potential influence correlations between music and emotions. Our proposed EER model (EEG-based Emotion Recognition Model) could identify 20 types of emotions
18

Schmitz-Hübsch, Alina, and Ron Becker. "A unified valence scale based on diagnosis of facial expressions." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 66, no. 1 (2022): 1056–59. http://dx.doi.org/10.1177/1071181322661500.

Abstract:
Affect-adaptive systems detect the emotional user state, assess it against the current situation, and adjust interaction accordingly. Tools for real-time emotional state detection, like the Emotient FACET engine (Littlewort et al., 2011), are based on the analysis of facial expressions. When developing affect-adaptive systems, output from the diagnostic engine must be mapped onto theoretical models of emotion. The Circumplex Model of Affect (Russell, 1980) describes emotion on two dimensions: valence and arousal. However, FACET offers three classifiers for valence: positive, neutral, and negat
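
One plausible way to collapse three valence classifier outputs into a single signed valence score is sketched below; this mapping is an assumption for illustration, not the one used in the cited paper.

    # Illustrative only: combine positive / neutral / negative classifier
    # evidence into one signed valence score in [-1, 1].
    def unified_valence(p_positive: float, p_neutral: float, p_negative: float) -> float:
        total = p_positive + p_neutral + p_negative
        if total == 0:
            return 0.0
        # Neutral evidence pulls the score toward 0; positive and negative push it apart.
        return (p_positive - p_negative) / total

    print(unified_valence(0.7, 0.2, 0.1))  # ~0.6, mildly positive
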
19

Zhou, Chaofeng. "The application of emotion regulation strategy in classroom teaching." Highlights in Business, Economics and Management 4 (December 12, 2022): 299–310. http://dx.doi.org/10.54097/hbem.v4i.3507.

Abstract:
China's double reduction policy has placed greater demands on the all-round development of students. While Chinese education has been excellent in influencing students at the cognitive level and has been prominent in various international competitions, can classroom teaching, which is one of the most important aspects that directly affect students, continue to meet the needs for students' emotional development? This paper reviews three mainstream models of emotion regulation teaching in China with Gross' Process Model of Emotion Regulation and related emotion regulation theories, which are cur
20

Yi, Jingjing, Jiayu Gina Qu, and Wanjiang Jacob Zhang. "Depicting the Emotion Flow: Super-Spreaders of Emotional Messages on Weibo During the COVID-19 Pandemic." Social Media + Society 8, no. 1 (2022): 205630512210849. http://dx.doi.org/10.1177/20563051221084950.

Abstract:
This study collected 2 million posts and reposts regarding the early stage of COVID-19 in China on Weibo from 26 December 2019 to 29 February 2020. Emotion analysis and social network analysis were used to examine the flow of emotional messages (emotion flow) by comparing them with the flow of general messages (information flow). Results indicated that both emotional messages and general messages present a multilayer diffusion pattern and follow network step flow models. In our dataset, emotion network has a higher transmission efficiency than information network; officially verified accounts
21

Dores, Artemisa R., Fernando Barbosa, Cristina Queirós, Irene P. Carvalho, and Mark D. Griffiths. "Recognizing Emotions through Facial Expressions: A Largescale Experimental Study." International Journal of Environmental Research and Public Health 17, no. 20 (2020): 7420. http://dx.doi.org/10.3390/ijerph17207420.

Abstract:
Experimental research examining emotional processes is typically based on the observation of images with affective content, including facial expressions. Future studies will benefit from databases with emotion-inducing stimuli in which characteristics of the stimuli potentially influencing results can be controlled. This study presents Portuguese normative data for the identification of seven facial expressions of emotions (plus a neutral face), on the Radboud Faces Database (RaFD). The effect of participants’ gender and models’ sex on emotion recognition was also examined. Participants (N = 1
22

Hazra, Sumon Kumar, Romana Rahman Ema, Syed Md Galib, Shalauddin Kabir, and Nasim Adnan. "Emotion recognition of human speech using deep learning method and MFCC features." Radioelectronic and Computer Systems, no. 4 (November 29, 2022): 161–72. http://dx.doi.org/10.32620/reks.2022.4.13.

Abstract:
Subject matter: Speech emotion recognition (SER) is an ongoing, interesting research topic. Its purpose is to establish interactions between humans and computers through speech and emotion. To recognize speech emotions, five deep learning models are used in this paper: Convolutional Neural Network, Long Short-Term Memory, Artificial Neural Network, Multi-Layer Perceptron, and a merged CNN and LSTM network (CNN-LSTM). The Toronto Emotional Speech Set (TESS), Surrey Audio-Visual Expressed Emotion (SAVEE), and Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS) datasets were used for this…
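
A minimal sketch of the general recipe (MFCC features feeding a small CNN), assuming synthetic audio and placeholder emotion classes rather than the paper's datasets or architectures:

    # Hedged sketch: MFCC features from a (synthetic) signal fed to a tiny CNN.
    import numpy as np
    import librosa
    import tensorflow as tf

    sr = 22050
    signal = np.random.randn(sr * 3).astype(np.float32)      # 3 s of placeholder audio
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=40)  # shape (40, frames)
    x = mfcc[np.newaxis, ..., np.newaxis]                    # shape (1, 40, frames, 1)

    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(16, (3, 3), activation="relu", input_shape=x.shape[1:]),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(7, activation="softmax"),      # e.g., 7 emotion classes
    ])
    print(model.predict(x).shape)  # (1, 7) class probabilities (untrained)
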
23

Meyer, Sara, H. Abigail Raikes, Elita A. Virmani, Sara Waters, and Ross A. Thompson. "Parent emotion representations and the socialization of emotion regulation in the family." International Journal of Behavioral Development 38, no. 2 (2014): 164–73. http://dx.doi.org/10.1177/0165025413519014.

Abstract:
There is considerable knowledge of parental socialization processes that directly and indirectly influence the development of children’s emotion self-regulation, but little understanding of the specific beliefs and values that underlie parents’ socialization approaches. This study examined multiple aspects of parents’ self-reported emotion representations and their associations with parents’ strategies for managing children’s negative emotions and children’s emotion self-regulatory behaviors. The sample consisted of 73 mothers of 4–5-year-old children; the sample was ethnically diverse. Two as
24

Zeng, Xueqiang, Qifan Chen, Sufen Chen, and Jiali Zuo. "Emotion Label Enhancement via Emotion Wheel and Lexicon." Mathematical Problems in Engineering 2021 (April 30, 2021): 1–11. http://dx.doi.org/10.1155/2021/6695913.

Abstract:
Emotion Distribution Learning (EDL) is a recently proposed multiemotion analysis paradigm, which identifies basic emotions with different degrees of expression in a sentence. Different from traditional methods, EDL quantitatively models the expression degree of the corresponding emotion on the given instance in an emotion distribution. However, emotion labels are crisp in most existing emotion datasets. To utilize traditional emotion datasets in EDL, label enhancement aims to convert logical emotion labels into emotion distributions. This paper proposed a novel label enhancement method, called
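
The label enhancement idea can be sketched as follows, with emotion-wheel angles and a softness parameter that are assumptions for illustration rather than values from the paper: a crisp one-hot label is spread into a distribution according to angular closeness on the wheel.

    # Illustrative sketch: spread a crisp emotion label into a distribution
    # using angular similarity on an assumed emotion wheel.
    import math

    WHEEL = {"joy": 0, "surprise": 60, "fear": 120, "sadness": 180, "disgust": 240, "anger": 300}

    def enhance(label: str, softness: float = 0.5) -> dict:
        weights = {}
        for emotion, angle in WHEEL.items():
            diff = abs(angle - WHEEL[label]) % 360
            diff = min(diff, 360 - diff)                   # angular distance, 0..180
            weights[emotion] = math.exp(-softness * diff / 60)
        total = sum(weights.values())
        return {e: w / total for e, w in weights.items()}  # normalized emotion distribution

    print(enhance("joy"))
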
25

Bassett, Hideko Hamada, Susanne A. Denham, Nicole B. Fettig, Timothy W. Curby, Mandana Mohtasham, and Nila Austin. "Temperament in the classroom." International Journal of Behavioral Development 41, no. 1 (2016): 4–14. http://dx.doi.org/10.1177/0165025416644077.

Abstract:
Based on the emotion socialization and bioecological models, the present study examined the contributions of teacher emotion socialization (i.e., teacher reactions to child emotions) on children’s social–emotional behaviors, and the moderating effect of child temperamental surgency on these relations in the preschool context. A total of 337 children and 80 teachers from private and public preschools/childcares participated in the study. To account for the nested nature of our data, hierarchical linear modeling (HLM) was utilized. The results indicated that several types of teacher reactions to
26

Smith, Geneva, and Jacques Carette. "Design Foundations for Emotional Game Characters." Eludamos: Journal for Computer Game Culture 10, no. 1 (2020): 109–40. http://dx.doi.org/10.7557/23.6175.

Abstract:
Recent Computer Role Playing Games, such as Bethesda’s The Elder Scrolls V: Skyrim and Nintendo’s The Legend of Zelda: Breath of the Wild, have entranced us with their expansive, complex worlds. However, the Non-Player Characters (NPCs) in these games remain stale and lackluster outside of scripted events. This is, in part, because game engines generally do not simulate emotions in their NPCs while they wander in the world. Wouldn't these games be much more interesting, potentially even more replayable, if NPCs reacted more appropriately to the situations they find themselves in? To be able to…
27

Liu, Xiao-Yu, Nai-Wen Chi, and Dwayne D. Gremler. "Emotion Cycles in Services: Emotional Contagion and Emotional Labor Effects." Journal of Service Research 22, no. 3 (2019): 285–300. http://dx.doi.org/10.1177/1094670519835309.

Abstract:
Service organizations encourage employees to express positive emotions in service encounters, in the hope that customers “catch” these emotions and react positively. Yet customer and employee emotions could be mutually influential. To understand emotional exchanges in service encounters and their influences on customer outcomes, the current study models the interplay of emotional contagion and emotional labor, as well as their influence on customer satisfaction. Employees might catch customers’ emotions and transmit those emotions back to customers through emotional contagion, and employee emo
28

Nair, Shivashankar B., W. Wilfred Godfrey, and Dong Hwa Kim. "On Realizing a Multi-Agent Emotion Engine." International Journal of Synthetic Emotions 2, no. 2 (2011): 1–27. http://dx.doi.org/10.4018/jse.2011070101.

Abstract:
Emotions have always been a complex phenomenon, and research on their causes and effects has been fraught with debate. Though a reasonable and unified theory seems lacking, there have been many attempts at building models that emote. This paper describes a multi-agent approach that aids robot emotion. Emotions are grounded on percepts from sensors and generated by dedicated emotion agents that work concurrently with others – the positive suppressing the negative and vice versa while stimulating their own kinds. Each agent forms a metaphor of an emotion-generating entity that has a replenishing…
29

Barlow, Meaghan, Emily Willroth, Carsten Wrosch, Oliver John, and Iris Mauss. "AGE DIFFERENCES EMOTION GLOBALIZING: AN EXAMINATION OF BOUNDARY CONDITIONS." Innovation in Aging 6, Supplement_1 (2022): 146. http://dx.doi.org/10.1093/geroni/igac059.581.

Abstract:
Emotion globalizing is an individual difference variable referring to the extent to which daily variations in an individual’s current emotions spill over into their evaluations of life satisfaction. The present research sought to: 1) extend the conception of emotion globalizing to stressor-related emotions, 2) examine age differences in these processes, and 3) differentiate associations with evaluations of life satisfaction and day satisfaction. To do so, we used daily diary data from two adult lifespan community samples [Study 1: N = 133 females, age range = 23-78; Study 2: N = 137, a…
30

Sun, Ron, Joseph Allen, and Eric Werbin. "Modeling Emotion Contagion within a Computational Cognitive Architecture." Journal of Cognition and Culture 22, no. 1-2 (2022): 60–89. http://dx.doi.org/10.1163/15685373-12340125.

Abstract:
The issue of emotion contagion has been gaining attention. Humans can share emotions, for example, through gestures, through speech, or even through online text via social media. There have been computational models trying to capture emotion contagion. However, these models are limited, as they tend to represent agents in a very simplified way. There also exist more complex models of agents and their emotions, but they do not yet address emotion contagion. We use a more psychologically realistic and better validated model – the Clarion cognitive architecture – as the basis to model…
31

Horvat, Marko, Alan Jović, and Kristijan Burnik. "Investigation of Relationships between Discrete and Dimensional Emotion Models in Affective Picture Databases Using Unsupervised Machine Learning." Applied Sciences 12, no. 15 (2022): 7864. http://dx.doi.org/10.3390/app12157864.

Abstract:
Digital documents created to evoke emotional responses are intentionally stored in special affective multimedia databases, along with metadata describing their semantics and emotional content. These databases are routinely used in multidisciplinary research on emotion, attention, and related phenomena. Affective dimensions and emotion norms are the most common emotion data models in the field of affective computing, but they are considered separable and not interchangeable. The goal of this study was to determine whether it is possible to statistically infer values of emotionally annotated pic
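
A simplified sketch of this kind of analysis, using invented valence/arousal norms and discrete tags rather than the study's databases: cluster pictures by their dimensional ratings and inspect how the discrete emotion labels fall across the clusters.

    # Illustrative sketch, not the paper's procedure; ratings and tags are invented.
    import numpy as np
    from sklearn.cluster import KMeans

    valence_arousal = np.array([[7.8, 5.2], [7.1, 6.0], [2.1, 6.5], [1.8, 7.2], [5.0, 2.0]])
    tags = ["happiness", "happiness", "fear", "anger", "neutral"]

    kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(valence_arousal)
    for tag, cluster in zip(tags, kmeans.labels_):
        print(tag, "-> cluster", cluster)
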
32

Szabóová, Martina, Martin Sarnovský, Viera Maslej Krešňáková, and Kristína Machová. "Emotion Analysis in Human–Robot Interaction." Electronics 9, no. 11 (2020): 1761. http://dx.doi.org/10.3390/electronics9111761.

Abstract:
This paper connects two large research areas, namely sentiment analysis and human–robot interaction. Emotion analysis, as a subfield of sentiment analysis, explores text data and, based on the characteristics of the text and generally known emotional models, evaluates what emotion is presented in it. The analysis of emotions in the human–robot interaction aims to evaluate the emotional state of the human being and on this basis to decide how the robot should adapt its behavior to the human being. There are several approaches and algorithms to detect emotions in the text data. We decided to app
33

Lucin, D. V., Y. A. Kozhukhova, and E. A. Suchkova. "Emotion congruence in the perception of ambiguous facial expressions." Experimental Psychology (Russia) 12, no. 1 (2019): 27–39. http://dx.doi.org/10.17759/exppsy.2019120103.

Abstract:
Emotion congruence in emotion perception is manifested in increasing sensitivity to the emotions corresponding to the perceiver’s emotional state. In this study, an experimental procedure that robustly generates emotion congruence during the perception of ambiguous facial expressions has been developed. It was hypothesized that emotion congruence will be stronger in the early stages of perception. In two experiments, happiness and sadness were elicited in 69 (mean age 20.2, 57 females) and 58 (mean age 18.2, 50 females) participants. Then they determined what emotions were present in the ambig
34

Othman, Marini, Abdul Wahab, Izzah Karim, Mariam Adawiah Dzulkifli, and Imad Fakhri Taha Alshaikli. "EEG Emotion Recognition Based on the Dimensional Models of Emotions." Procedia - Social and Behavioral Sciences 97 (November 2013): 30–37. http://dx.doi.org/10.1016/j.sbspro.2013.10.201.

35

Starkey, Charles. "Perceptual Emotions and Emotional Virtue." Journal of Philosophy of Emotion 3, no. 1 (2021): 10–15. http://dx.doi.org/10.33497/2021.summer.3.

Abstract:
In this essay I focus on two areas discussed in Michael Brady’s Emotion: The Basics, namely perceptual models of emotion and the relation between emotion and virtue. Brady raises two concerns about perceptual theories: that they arguably collapse into feeling or cognitive theories of emotion; and that the analogy between emotion and perception is questionable at best, and is thus not an adequate way of characterizing emotion. I argue that a close look at perception and emotional experience reveals a structure of emotion that avoids these problems. I then explore other ways in which emotions ca
36

Puri, Tanvi, Mukesh Soni, Gaurav Dhiman, Osamah Ibrahim Khalaf, Malik alazzam, and Ihtiram Raza Khan. "Detection of Emotion of Speech for RAVDESS Audio Using Hybrid Convolution Neural Network." Journal of Healthcare Engineering 2022 (February 27, 2022): 1–9. http://dx.doi.org/10.1155/2022/8472947.

Abstract:
Every human being has emotions about the things related to them. A customer's emotion can help the customer representative understand their requirements. So, speech emotion recognition plays an important role in the interaction between humans. Now, intelligent systems can help to improve this performance, for which we design a convolutional neural network (CNN) based model that can classify emotions into categories such as positive, negative, or more specific ones. In this paper, we use the Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS) audio recordings…
37

Scherer, Klaus R. "Emotions are emergent processes: they require a dynamic computational architecture." Philosophical Transactions of the Royal Society B: Biological Sciences 364, no. 1535 (2009): 3459–74. http://dx.doi.org/10.1098/rstb.2009.0141.

Abstract:
Emotion is a cultural and psychobiological adaptation mechanism which allows each individual to react flexibly and dynamically to environmental contingencies. From this claim flows a description of the elements theoretically needed to construct a virtual agent with the ability to display human-like emotions and to respond appropriately to human emotional expression. This article offers a brief survey of the desirable features of emotion theories that make them ideal blueprints for agent models. In particular, the component process model of emotion is described, a theory which postulates emotio
38

Rodríguez, Luis-Felipe, Félix Ramos, and Yingxu Wang. "Cognitive Computational Models of Emotions and Affective Behaviors." International Journal of Software Science and Computational Intelligence 4, no. 2 (2012): 41–63. http://dx.doi.org/10.4018/jssci.2012040103.

Abstract:
Emotions are one of the important subconscious mechanisms that influence human behaviors, attentions, and decision making. The emotion process helps to determine how humans perceive their internal status and needs in order to form consciousness of an individual. Emotions have been studied from multidisciplinary perspectives and covered a wide range of empirical and psychological topics, such as understanding the emotional processes, creating cognitive and computational models of emotions, and applications in computational intelligence. This paper presents a comprehensive survey of cognitive an
39

He, Zhongyang, Ning Zhuang, Guangcheng Bao, Ying Zeng, and Bin Yan. "Cross-Day EEG-Based Emotion Recognition Using Transfer Component Analysis." Electronics 11, no. 4 (2022): 651. http://dx.doi.org/10.3390/electronics11040651.

Abstract:
EEG-based emotion recognition can help achieve more natural human-computer interaction, but the temporal non-stationarity of EEG signals affects the robustness of EEG-based emotion recognition models. Most existing studies use emotional EEG data collected in the same trial to train and test models; once such a model is applied to data collected at different times from the same subject, its recognition accuracy decreases significantly. To address the problem of EEG-based cross-day emotion recognition, this paper has constructed a database of emotional EEG signals collected over…
40

Luna-Jiménez, Cristina, Ricardo Kleinlein, David Griol, Zoraida Callejas, Juan M. Montero, and Fernando Fernández-Martínez. "A Proposal for Multimodal Emotion Recognition Using Aural Transformers and Action Units on RAVDESS Dataset." Applied Sciences 12, no. 1 (2021): 327. http://dx.doi.org/10.3390/app12010327.

Abstract:
Emotion recognition is attracting the attention of the research community due to its multiple applications in different fields, such as medicine or autonomous driving. In this paper, we proposed an automatic emotion recognizer system that consisted of a speech emotion recognizer (SER) and a facial emotion recognizer (FER). For the SER, we evaluated a pre-trained xlsr-Wav2Vec2.0 transformer using two transfer-learning techniques: embedding extraction and fine-tuning. The best accuracy results were achieved when we fine-tuned the whole model by appending a multilayer perceptron on top of it, con
41

Bardak, F. Kebire, M. Nuri Seyman, and Feyzullah Temurtaş. "EEG Based Emotion Prediction with Neural Network Models." Tehnički glasnik 16, no. 4 (2022): 497–502. http://dx.doi.org/10.31803/tg-20220330064309.

Abstract:
The term "emotion" refers to an individual's response to an event, person, or condition. In recent years, there has been an increase in the number of papers that have studied emotion estimation. In this study, a dataset based on three different emotions, utilized to classify feelings using EEG brainwaves, has been analysed. In the dataset, six film clips have been used to elicit positive and negative emotions from a male and a female. However, there has not been a trigger to elicit a neutral mood. Various classification approaches have been used to classify the dataset, including MLP, SVM, PNN
42

Liu, Shiguang, Huixin Wang, and Min Pei. "Facial-expression-aware Emotional Color Transfer Based on Convolutional Neural Network." ACM Transactions on Multimedia Computing, Communications, and Applications 18, no. 1 (2022): 1–19. http://dx.doi.org/10.1145/3464382.

Abstract:
Emotional color transfer aims to change the evoked emotion of a source image to that of a target image by adjusting its color distribution. Most existing emotional color transfer methods only consider the low-level visual features of an image and ignore facial expression features when the image contains a human face, which can cause incorrect emotion evaluation for the given image. In addition, previous emotional color transfer methods may easily result in ambiguity between the emotion of the resulting image and that of the target image. For example, if the background of the target image is dark while the…
43

Pohl, Anna, Sebastian Dummel, Mascha Bothur, and Alexander L. Gerlach. "Interoceptive accuracy does not predict emotion perception in daily life." Open Psychology 4, no. 1 (2022): 175–86. http://dx.doi.org/10.1515/psych-2022-0009.

Abstract:
Peripheral emotion theories suggest a crucial role of interoception for emotion perception, which in turn facilitates emotion regulation. Laboratory studies found positive relations between interoceptive accuracy and perceived emotion intensity and arousal. Studies in natural settings are largely missing, but seem important given the diversity of emotional experience and regulation. One hundred and seven participants underwent a cardiovascular interoceptive accuracy task. Afterwards, participants provided detailed information on perceived emotions and emotion regulation strategies in an ecological…
44

Liang, Yunlong, Fandong Meng, Ying Zhang, Yufeng Chen, Jinan Xu, and Jie Zhou. "Infusing Multi-Source Knowledge with Heterogeneous Graph Neural Network for Emotional Conversation Generation." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 15 (2021): 13343–52. http://dx.doi.org/10.1609/aaai.v35i15.17575.

Abstract:
The success of emotional conversation systems depends on sufficient perception and appropriate expression of emotions. In a real-world conversation, we firstly instinctively perceive emotions from multi-source information, including the emotion flow of dialogue history, facial expressions, and personalities of speakers, and then express suitable emotions according to our personalities, but these multiple types of information are insufficiently exploited in emotional conversation fields. To address this issue, we propose a heterogeneous graph-based model for emotional conversation generation. S
45

Fakhar, Shariqa, Junaid Baber, Sibghat Ullah Bazai, et al. "Smart Classroom Monitoring Using Novel Real-Time Facial Expression Recognition System." Applied Sciences 12, no. 23 (2022): 12134. http://dx.doi.org/10.3390/app122312134.

Abstract:
Emotions play a vital role in education. Technological advancement in computer vision using deep learning models has improved automatic emotion recognition. In this study, a real-time automatic emotion recognition system is developed incorporating novel salient facial features for classroom assessment using a deep learning model. The proposed novel facial features for each emotion are initially detected using HOG for face recognition, and automatic emotion recognition is then performed by training a convolutional neural network (CNN) that takes real-time input from a camera deployed in the cla
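
A small sketch of the feature-extraction step only, assuming a placeholder grayscale face crop: HOG descriptors computed with scikit-image, which would then feed the downstream CNN emotion classifier.

    # Hedged sketch of the HOG step; the face crop is a random placeholder.
    import numpy as np
    from skimage.feature import hog

    face_crop = np.random.rand(64, 64)  # stand-in for a detected, resized face region
    features = hog(face_crop, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
    print(features.shape)               # fixed-length descriptor for the classifier
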
46

Song, Yading, Simon Dixon, Marcus T. Pearce, and Andrea R. Halpern. "Perceived and Induced Emotion Responses to Popular Music." Music Perception 33, no. 4 (2016): 472–92. http://dx.doi.org/10.1525/mp.2016.33.4.472.

Abstract:
Music both conveys and evokes emotions, and although both phenomena are widely studied, the difference between them is often neglected. The purpose of this study is to examine the difference between perceived and induced emotion for Western popular music using both categorical and dimensional models of emotion, and to examine the influence of individual listener differences on their emotion judgment. A total of 80 musical excerpts were randomly selected from an established dataset of 2,904 popular songs tagged with one of the four words “happy,” “sad,” “angry,” or “relaxed” on the Last.FM web
47

Caballero-Morales, Santiago-Omar. "Recognition of Emotions in Mexican Spanish Speech: An Approach Based on Acoustic Modelling of Emotion-Specific Vowels." Scientific World Journal 2013 (2013): 1–13. http://dx.doi.org/10.1155/2013/162093.

Abstract:
An approach for the recognition of emotions in speech is presented. The target language is Mexican Spanish, and for this purpose a speech database was created. The approach consists in the phoneme acoustic modelling of emotion-specific vowels. For this, a standard phoneme-based Automatic Speech Recognition (ASR) system was built with Hidden Markov Models (HMMs), where different phoneme HMMs were built for the consonants and emotion-specific vowels associated with four emotional states (anger, happiness, neutral, sadness). Then, estimation of the emotional state from a spoken sentence is perfor
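
The modelling idea can be sketched roughly as follows, assuming placeholder MFCC-like frames rather than the Mexican Spanish corpus: train one Gaussian HMM per emotion-specific vowel class and score new frames against each model, picking the class with the highest likelihood.

    # Rough sketch only; features are random placeholders, not real speech.
    import numpy as np
    from hmmlearn import hmm

    rng = np.random.default_rng(1)
    classes = ["a_neutral", "a_anger", "a_happiness"]
    models = {}
    for offset, name in enumerate(classes):
        frames = rng.normal(loc=offset, size=(200, 13))   # placeholder MFCC-like frames
        models[name] = hmm.GaussianHMM(n_components=3, n_iter=20).fit(frames)

    test = rng.normal(loc=2, size=(50, 13))
    print(max(classes, key=lambda name: models[name].score(test)))
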
48

Khan, Shahidul Islam, Faisal Bin Aziz, and Md Misbah Uddin. "Emotion Detection from Multilingual Text and Multi-Emotional Sentence using Difference NLP Feature Extraction Technique and ML Classifier." International Journal of Advanced Networking and Applications 14, no. 03 (2022): 5429–35. http://dx.doi.org/10.35444/ijana.2022.14303.

Abstract:
Machines can read, comprehend, and extrapolate meaning from human languages, thanks to natural language processing. In this paper, we have detected emotion from multilingual text and multi-emotional sentences. For our research, we have collected a dataset containing around 7000 tweets on 4 emotions (Anger, Fear, Joy, and Sadness). After pre-processing our data, we used 2 NLP feature extraction models and trained those with the help of 4 different Machine Learning classifiers. We have also developed an algorithm for detecting exact emotions from multi-emotional sentences. Also, we compared our results…
49

Hübner, Amelie M., Ima Trempler, Corinna Gietmann, and Ricarda I. Schubotz. "Interoceptive sensibility predicts the ability to infer others’ emotional states." PLOS ONE 16, no. 10 (2021): e0258089. http://dx.doi.org/10.1371/journal.pone.0258089.

Abstract:
Emotional sensations and inferring another’s emotional states have been suggested to depend on predictive models of the causes of bodily sensations, so-called interoceptive inferences. In this framework, higher sensibility for interoceptive changes (IS) reflects higher precision of interoceptive signals. The present study examined the link between IS and emotion recognition, testing whether individuals with higher IS recognize others’ emotions more easily and are more sensitive to learn from biased probabilities of emotional expressions. We recorded skin conductance responses (SCRs) from forty
50

Kragel, Philip A., Marianne C. Reddan, Kevin S. LaBar, and Tor D. Wager. "Emotion schemas are embedded in the human visual system." Science Advances 5, no. 7 (2019): eaaw4358. http://dx.doi.org/10.1126/sciadv.aaw4358.

Abstract:
Theorists have suggested that emotions are canonical responses to situations ancestrally linked to survival. If so, then emotions may be afforded by features of the sensory environment. However, few computational models describe how combinations of stimulus features evoke different emotions. Here, we develop a convolutional neural network that accurately decodes images into 11 distinct emotion categories. We validate the model using more than 25,000 images and movies and show that image content is sufficient to predict the category and valence of human emotion ratings. In two functional magnet