Academic literature on the topic 'Multimodal perception of emotion'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Multimodal perception of emotion.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Multimodal perception of emotion"

1

Barabanschikov, Vladimir A., and Ekaterina V. Suvorova. "Expression and perception of multimodal emotional states." National Psychological Journal 51, no. 3 (2023): 106–27. http://dx.doi.org/10.11621/npj.2023.0311.

Full text
Abstract:
Background. A deeper understanding of the nature of emotion expression and perception in ecologically and socially valid conditions is of great importance. The aim is to develop experimental procedures that record not only the demonstration of emotions but also actual human emotional experience. Objective. The objective is to reveal the patterns of expression and identification of multimodal dynamic emotional states, based on stimuli created by professional actors. Sample. The experiments involved 96 specialists (48 women and 48 men) in various fields of practice as well as underg…
2

Chen, Zhongbo. "Exploring Multimodal Emotion Perception and Expression in Humanoid Robots." Applied and Computational Engineering 174, no. 1 (2025): 86–91. https://doi.org/10.54254/2755-2721/2025.po24880.

Full text
Abstract:
Traditional single-modal emotion recognition is limited by environmental sensitivity: under different environmental conditions, recognition accuracy may decrease. Multimodal emotion recognition improves accuracy through the complementarity of features between different modalities. In view of the key role of emotion in human cognition and behavior, this paper explains the adaptation laws of emotional robots in medical care, education and growth scenarios by systematically analyzing unimo…
3

de Boer, Minke J., Deniz Başkent, and Frans W. Cornelissen. "Eyes on Emotion: Dynamic Gaze Allocation During Emotion Perception From Speech-Like Stimuli." Multisensory Research 34, no. 1 (2020): 17–47. http://dx.doi.org/10.1163/22134808-bja10029.

Full text
Abstract:
The majority of emotional expressions used in daily communication are multimodal and dynamic in nature. Consequently, one would expect that human observers utilize specific perceptual strategies to process emotions and to handle the multimodal and dynamic nature of emotions. However, our present knowledge on these strategies is scarce, primarily because most studies on emotion perception have not fully covered this variation, and instead used static and/or unimodal stimuli with few emotion categories. To resolve this knowledge gap, the present study examined how dynamic emotional audi…
4

Gao, Xiyuan, Shekhar Nayak, and Matt Coler. "Enhancing sarcasm detection through multimodal data integration: A proposal for augmenting audio with text and emoticon." Journal of the Acoustical Society of America 155, no. 3_Supplement (2024): A264. http://dx.doi.org/10.1121/10.0027441.

Full text
Abstract:
Sarcasm detection presents unique challenges in speech technology, particularly for individuals with disorders that affect pitch perception or those lacking contextual auditory cues. While previous research [1, 2] has established the significance of pitch variation in sarcasm detection, these studies have primarily focused on singular modalities, often overlooking the potential synergies of integrating multimodal data. We propose an approach that synergizes auditory, textual, and emoticon data to enhance sarcasm detection. This involves augmenting sarcastic audio data with corresponding text u…
5

Vallverdú, Jordi, Gabriele Trovato, and Lorenzo Jamone. "Allocentric Emotional Affordances in HRI: The Multimodal Binding." Multimodal Technologies and Interaction 2, no. 4 (2018): 78. http://dx.doi.org/10.3390/mti2040078.

Full text
Abstract:
The concept of affordance perception is one of the distinctive traits of human cognition, and its application to robots can dramatically improve the quality of human-robot interaction (HRI). In this paper we explore and discuss the idea of “emotional affordances” by proposing a viable model for implementation into HRI, which considers allocentric and multimodal perception. We consider “2-ways” affordances: a perceived object triggering an emotion, and a perceived human emotion expression triggering an action. In order to make the implementation generic, the proposed model includes a library that c…
6

Shackman, Jessica E., and Seth D. Pollak. "Experiential Influences on Multimodal Perception of Emotion." Child Development 76, no. 5 (2005): 1116–26. http://dx.doi.org/10.1111/j.1467-8624.2005.00901.x.

Full text
7

Portnova, Galina V., and Daria A. Stebakova. "The multimodal emotion perception in codependent individuals." Neuroscience Research Notes 6, no. 1 (2023): 210. http://dx.doi.org/10.31117/neuroscirn.v6i1.210.

Full text
Abstract:
The emotional disturbances of individuals with codependency are often overlooked. This study aimed to investigate the emotional perception of codependent individuals in four modalities: visual, auditory, tactile and olfactory. An EEG study was performed in which pleasant and unpleasant stimuli, selected by a panel of experts for each modality, were presented. Participants (fifteen codependent individuals and fifteen healthy volunteers) were instructed to assess the emotional impact and pleasantness of the stimuli. The method of EEG spaces was used to visualize how close perceived stimuli were according to EEG d…
8

Barabanschikov, V. A., and E. V. Suvorova. "Gender Differences in the Recognition of Emotional States." Психологическая наука и образование 26, no. 6 (2021): 107–16. http://dx.doi.org/10.17759/pse.2021260608.

Full text
Abstract:
As a rule, gender differences in the perception of human emotional states are studied on the basis of static pictures of faces, gestures or poses. The dynamics and multiplicity of emotion expression remain in the “blind zone”. This work is aimed at finding relationships in the perception of the procedural characteristics of emotion expression. The influence of gender and age on the identification of human emotional states is experimentally investigated in ecologically and socially valid situations. The experiments were based on the Russian-language version of the Geneva Emotion Recognit…
9

Marino, David, Max Henry, Pascal E. Fortin, Rachit Bhayana, and Jeremy Cooperstock. "I See What You're Hearing: Facilitating The Effect of Environment on Perceived Emotion While Teleconferencing." Proceedings of the ACM on Human-Computer Interaction 7, CSCW1 (2023): 1–15. http://dx.doi.org/10.1145/3579495.

Full text
Abstract:
Our perception of emotion is highly contextual. Changes in the environment can affect our narrative framing, and thus augment our emotional perception of interlocutors. User environments are typically heavily suppressed due to the technical limitations of commercial videoconferencing platforms. As a result, there is often a lack of contextual awareness while participating in a video call, and this affects how we perceive the emotions of conversants. We present a videoconferencing module that visualizes the user's aural environment to enhance awareness between interlocutors. The system visualiz…
10

Bi, Xin, and Tian Zhang. "Analysis of the fusion of multimodal sentiment perception and physiological signals in Chinese-English cross-cultural communication: Transformer approach incorporating self-attention enhancement." PeerJ Computer Science 11 (May 23, 2025): e2890. https://doi.org/10.7717/peerj-cs.2890.

Full text
Abstract:
With the acceleration of globalization, cross-cultural communication has become a crucial issue in various fields. Emotion, as an essential component of communication, plays a key role in improving understanding and interaction efficiency across different cultures. However, accurately recognizing emotions across cultural backgrounds remains a major challenge in affective computing, particularly due to limitations in multimodal feature fusion and temporal dependency modeling in traditional approaches. To address this, we propose the TAF-ATRM framework, which integrates Transformer and multi-hea…

Dissertations / Theses on the topic "Multimodal perception of emotion"

1

Cox, A. G. "Multimodal emotion perception from facial and vocal signals." Thesis, University of Cambridge, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.598105.

Full text
Abstract:
The perception of emotion in other people is a fundamental part of social communication. Emotional expressions are often multimodal in nature and, as with human speech, both auditory and visual components are used for comprehension. To date, however, the majority of emotion research has focused on the perception of emotion from facial or vocal expressions in isolation. This thesis investigated the behavioural and neural consequences of perceiving emotion from facial and vocal emotional signals simultaneously. Initial experiments demonstrated that a congruent, but unattended, vocal expression…
2

Realdon, Olivia. "Differenze culturali nella percezione multimodale delle emozioni" [Cultural differences in the multimodal perception of emotions]. Doctoral thesis, Università degli Studi di Milano-Bicocca, 2012. http://hdl.handle.net/10281/37944.

Full text
Abstract:
The research question in the present study concerns how culture shapes the way in which simultaneous facial and vocalization cues are combined in emotion perception. The matter is not whether culture influences such processes: cultures supply systems of meaning that make salient different core emotional themes, different sets of emotions, their ostensible expression, and action tendencies. Therefore, the question is not whether, but how and at what level of analysis, culture shapes these processes (Matsumoto, 2001). Cultural variability was tested within the methodological framework of cultu…
3

ur Réhman, Shafiq. "Expressing emotions through vibration for perception and control." Doctoral thesis, Umeå universitet, Institutionen för tillämpad fysik och elektronik, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-32990.

Full text
Abstract:
This thesis addresses a challenging problem: “how to let the visually impaired ‘see’ others’ emotions”. We, human beings, are heavily dependent on facial expressions to express ourselves. A smile shows that the person you are talking to is pleased, amused, relieved, etc. People use emotional information from facial expressions to switch between conversation topics and to determine the attitudes of individuals. Missing emotional information from facial expressions and head gestures makes it extremely difficult for the visually impaired to interact with others in social events. To enhance the visually impair…
4

Fernández, Carbonell Marcos. "Automated Multimodal Emotion Recognition." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-282534.

Full text
Abstract:
Being able to read and interpret affective states plays a significant role in human society. However, this is difficult in some situations, especially when information is limited to either vocal or visual cues. Many researchers have investigated the so-called basic emotions in a supervised way. This thesis presents the results of a multimodal supervised and unsupervised study of a more realistic number of emotions. To that end, audio and video features are extracted from the GEMEP dataset employing openSMILE and OpenFace, respectively. The supervised approach includes the comparison of multiple s…
5

Nguyen, Tien Dung. "Multimodal emotion recognition using deep learning techniques." Thesis, Queensland University of Technology, 2020. https://eprints.qut.edu.au/180753/1/Tien%20Dung_Nguyen_Thesis.pdf.

Full text
Abstract:
This thesis investigates the use of deep learning techniques to address the problem of machine understanding of human affective behaviour and to improve the accuracy of both unimodal and multimodal human emotion recognition. The objective was to explore how best to configure deep learning networks to capture, individually and jointly, the key features contributing to human emotions from three modalities (speech, face, and bodily movements) to accurately classify the expressed human emotion. The outcome of the research should be useful for several applications including the design of social robots.
6

Gay, R. "Morality : Emotion, perception and belief." Thesis, University of Oxford, 1985. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.371649.

Full text
7

Lawrie, Louisa. "Adult ageing and emotion perception." Thesis, University of Aberdeen, 2018. http://digitool.abdn.ac.uk:80/webclient/DeliveryManager?pid=239235.

Full text
Abstract:
Older adults are worse than young adults at perceiving emotions in others. However, it is unclear why these age-related differences in emotion perception exist. The studies presented in this thesis investigated the cognitive, emotional and motivational factors influencing age differences in emotion perception. Study 1 revealed no age differences in mood congruence effects: sad faces were rated as more sad when participants experienced negative mood. In contrast, Study 2 demonstrated that sad mood impaired recognition accuracy for sad faces. Together, findings suggested that different methods o…
8

Abrilian, Sarkis. "Représentation de comportements emotionnels multimodaux spontanés : perception, annotation et synthèse." Phd thesis, Université Paris Sud - Paris XI, 2007. http://tel.archives-ouvertes.fr/tel-00620827.

Full text
Abstract:
The objective of this thesis is to represent spontaneous emotions and their associated multimodal signs in order to contribute to the design of future interactive affective systems. Current prototypes are generally limited to the detection and generation of a few simple emotions, and rely on audio or video data performed by actors and collected in the laboratory. In order to model the complex relations between spontaneous emotions and their expression in different modalities, an exploratory approach is necessary. The exploratory approach we have …
9

Lim, Angelica. "MEI: Multimodal Emotional Intelligence." 京都大学 (Kyoto University), 2014. http://hdl.handle.net/2433/188869.

Full text
10

Kosti, Ronak. "Visual scene context in emotion perception." Doctoral thesis, Universitat Oberta de Catalunya, 2019. http://hdl.handle.net/10803/667808.

Full text
Abstract:
Psychological studies show that scene context, in addition to facial expression and body pose, contributes important information to our perception of people's emotions. Nevertheless, context processing for automatic emotion recognition has not been explored in depth, partly owing to the lack of suitable data. In this thesis we present EMOTIC, a dataset of images of people in natural, varied situations annotated with their apparent emotion. The EMOTIC database combines two different types of emotion representation: (1) a set of 26 …

Books on the topic "Multimodal perception of emotion"

1

André, Elisabeth, Laila Dybkjær, Wolfgang Minker, Heiko Neumann, Roberto Pieraccini, and Michael Weber, eds. Perception in Multimodal Dialogue Systems. Springer Berlin Heidelberg, 2008. http://dx.doi.org/10.1007/978-3-540-69369-7.

Full text
2

Stiefelhagen, Rainer, Rachel Bowers, and Jonathan Fiscus, eds. Multimodal Technologies for Perception of Humans. Springer Berlin Heidelberg, 2008. http://dx.doi.org/10.1007/978-3-540-68585-2.

Full text
3

Stiefelhagen, Rainer, and John Garofolo, eds. Multimodal Technologies for Perception of Humans. Springer Berlin Heidelberg, 2007. http://dx.doi.org/10.1007/978-3-540-69568-4.

Full text
4

Riccio, Gary E. Multimodal perception and multicriterion control of nested systems. National Aeronautics and Space Administration, Lyndon B. Johnson Space Center, 1999.

Find full text
5

Riccio, Gary E. Multimodal perception and multicriterion control of nested systems. National Aeronautics and Space Administration, 1998.

Find full text
6

McDonald, P. Vernon, Jacob Bloomberg, and Lyndon B. Johnson Space Center, eds. Multimodal perception and multicriterion control of nested systems. National Aeronautics and Space Administration, Lyndon B. Johnson Space Center, 1999.

Find full text
7

Seymour, Julie, Abigail Hackett, and Lisa Procter. Children's spatialities: Embodiment, emotion and agency. Palgrave Macmillan, 2015.

Find full text
8

Badenhop, Dennis. Praktische Anschauung: Sinneswahrnehmung, Emotion und moralisches Begründen. Verlag Karl Alber, 2015.

Find full text
9

Ellis, Ralph D., and Natika Newton, eds. Consciousness & emotion: Agency, conscious choice, and selective perception. John Benjamins Pub., 2004.

Find full text
10

Ham, Se-jŏng. In'gan ŭi chŏngsŏ: Human sensibility and emotion. Hakchisa, 2019.

Find full text

Book chapters on the topic "Multimodal perception of emotion"

1

Esposito, Anna, Domenico Carbone, and Maria Teresa Riviello. "Visual Context Effects on the Perception of Musical Emotional Expressions." In Biometric ID Management and Multimodal Communication. Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04391-8_10.

Full text
2

Li, Aijun. "Perception of Multimodal Emotional Expressions By Japanese and Chinese." In Encoding and Decoding of Emotional Speech. Springer Berlin Heidelberg, 2015. http://dx.doi.org/10.1007/978-3-662-47691-8_2.

Full text
3

Hunter, Patrick G., and E. Glenn Schellenberg. "Music and Emotion." In Music Perception. Springer New York, 2010. http://dx.doi.org/10.1007/978-1-4419-6114-3_5.

Full text
4

Bratton, John, Peter Sawchuk, Carolyn Forshaw, Militza Callinan, and Martin Corbett. "Perception and emotion." In Work and Organizational Behaviour. Macmillan Education UK, 2010. http://dx.doi.org/10.1007/978-0-230-36602-2_5.

Full text
5

Cabada, Ramón Zatarain, Héctor Manuel Cárdenas López, and Hugo Jair Escalante. "Building Resources for Emotion Detection." In Multimodal Affective Computing. Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-32542-7_8.

Full text
6

Kotsia, Irene, Stefanos Zafeiriou, George Goudelis, Ioannis Patras, and Kostas Karpouzis. "Multimodal Sensing in Affective Gaming." In Emotion in Games. Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-41316-7_4.

Full text
7

Vasilescu, Ioana. "Emotion Perception and Recognition." In Emotion-Oriented Systems. John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118601938.ch7.

Full text
8

Cabada, Ramón Zatarain, Héctor Manuel Cárdenas López, and Hugo Jair Escalante. "Multimodal Emotion Recognition in Learning Environments." In Multimodal Affective Computing. Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-32542-7_11.

Full text
9

Wagner, Johannes, Florian Lingenfelser, and Elisabeth André. "Building a Robust System for Multimodal Emotion Recognition." In Emotion Recognition. John Wiley & Sons, Inc., 2015. http://dx.doi.org/10.1002/9781118910566.ch15.

Full text
10

Kenemans, Leon, and Nick Ramsey. "Perception, Attention and Emotion." In Psychology in the Brain. Macmillan Education UK, 2013. http://dx.doi.org/10.1007/978-1-137-29614-6_8.

Full text

Conference papers on the topic "Multimodal perception of emotion"

1

Pa Aung, Khin Pa, Hao-Long Yin, Tian-Fang Ma, Wei-Long Zheng, and Bao-Liang Lu. "A Multimodal Myanmar Emotion Dataset for Emotion Recognition." In 2024 46th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). IEEE, 2024. https://doi.org/10.1109/embc53108.2024.10782660.

Full text
2

Rajaonarimanana, Laingo Nantenaina, and Hery Zo Andriamanohisoa. "Emotion Analysis Using Multimodal Data." In 2024 International Conference on Engineering and Emerging Technologies (ICEET). IEEE, 2024. https://doi.org/10.1109/iceet65156.2024.10913793.

Full text
3

Hu, Guimin, Zhihong Zhu, Daniel Hershcovich, Lijie Hu, Hasti Seifi, and Jiayuan Xie. "UniMEEC: Towards Unified Multimodal Emotion Recognition and Emotion Cause." In Findings of the Association for Computational Linguistics: EMNLP 2024. Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-emnlp.302.

Full text
4

Dai, Wenliang, Zihan Liu, Tiezheng Yu, and Pascale Fung. "Modality-Transferable Emotion Embeddings for Low-Resource Multimodal Emotion Recognition." In Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing. Association for Computational Linguistics, 2020. http://dx.doi.org/10.18653/v1/2020.aacl-main.30.

Full text
5

Ortiz-Perez, David, Manuel Benavent-Lledo, David Mulero-Pérez, David Tomás, and Jose Garcia-Rodriguez. "Multimodal Fusion Strategies for Emotion Recognition." In 2024 International Joint Conference on Neural Networks (IJCNN). IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10650105.

Full text
6

Wang, Jiaqi, Naihao Li, Long Zhang, and Linlin Shan. "Emotion Recognition for Multimodal Information Interaction." In 2025 International Conference on Intelligent Systems and Computational Networks (ICISCN). IEEE, 2025. https://doi.org/10.1109/iciscn64258.2025.10934693.

Full text
7

Robert V, Noel Jeygar, Ashwin Ponnur, Adella Aseesh Sai, and Manchuru Geethika. "Multimodal Analysis for Speech Emotion Recognition." In 2024 International Conference on Emerging Research in Computational Science (ICERCS). IEEE, 2024. https://doi.org/10.1109/icercs63125.2024.10895442.

Full text
8

Mohitkar, Rasika. "Multimodal Emotion Detection Using Deep Learning." In 2025 International Conference on Data Science, Agents & Artificial Intelligence (ICDSAAI). IEEE, 2025. https://doi.org/10.1109/icdsaai65575.2025.11011790.

Full text
9

Kulkarni, Rohit G., Samahitha Rajeevalochana, Aditya Vishwanatha, Sharanya S, and Surabhi Narayan. "Game Analysis Using Multimodal Emotion Recognition." In 2025 10th International Conference on Computer and Communication System (ICCCS). IEEE, 2025. https://doi.org/10.1109/icccs65393.2025.11069702.

Full text
10

Nguyen, Rami Huu, Kenichi Maeda, Mahsa Geshvadi, and Daniel Haehn. "Evaluating ‘Graphical Perception’ with Multimodal LLMs." In 2025 IEEE 18th Pacific Visualization Conference (PacificVis). IEEE, 2025. https://doi.org/10.1109/pacificvis64226.2025.00035.

Full text

Reports on the topic "Multimodal perception of emotion"

1

Maia, Maercio, Abrahão Baptista, Patricia Vanzella, Pedro Montoya, and Henrique Lima. Neural correlates of the perception of emotions elicited by dance movements. A scope review. INPLASY - International Platform of Registered Systematic Review and Meta-analysis Protocols, 2023. http://dx.doi.org/10.37766/inplasy2023.2.0086.

Full text
Abstract:
Review question / Objective: The main question of the study is "how do dance neuroscience studies define and assess emotions?" The main objective is to establish, through the available literature, a scientific overview of studies in dance neuroscience that address the perception of emotions in the context of neuroaesthetics. Specifically, it is expected to verify whether there is methodological homogeneity in studies involving the evaluation of emotions within the context of dance neuroscience; whether the definition of emotion is shared in these studies; and, furthermore, whether in multimodal stud…
2

Rúas-Araújo, J., M. I. Punín Larrea, H. Gómez Alvarado, P. Cuesta-Morales, and S. Ratté. Neuroscience applied to perception analysis: Heart and emotion when listening to Ecuador’s national anthem. Revista Latina de Comunicación Social, 2015. http://dx.doi.org/10.4185/rlcs-2015-1052en.

Full text
3

Balali, Vahid, Arash Tavakoli, and Arsalan Heydarian. A Multimodal Approach for Monitoring Driving Behavior and Emotions. Mineta Transportation Institute, 2020. http://dx.doi.org/10.31979/mti.2020.1928.

Full text
Abstract:
Studies have indicated that emotions can be significantly influenced by environmental factors; these factors can also significantly influence drivers’ emotional state and, accordingly, their driving behavior. Furthermore, as the demand for autonomous vehicles is expected to increase significantly within the next decade, a proper understanding of drivers’/passengers’ emotions, behavior, and preferences will be needed in order to create an acceptable level of trust with humans. This paper proposes a novel semi-automated approach for understanding the effect of environmental factors on drivers’ e…