Journal articles on the topic 'Multimodal perception of emotion'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 journal articles for your research on the topic 'Multimodal perception of emotion.'
Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of each academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.
Barabanschikov, Vladimir A., and Ekaterina V. Suvorova. "Expression and perception of multimodal emotional states." National Psychological Journal 51, no. 3 (2023): 106–27. http://dx.doi.org/10.11621/npj.2023.0311.
Chen, Zhongbo. "Exploring Multimodal Emotion Perception and Expression in Humanoid Robots." Applied and Computational Engineering 174, no. 1 (2025): 86–91. https://doi.org/10.54254/2755-2721/2025.po24880.
de Boer, Minke J., Deniz Başkent, and Frans W. Cornelissen. "Eyes on Emotion: Dynamic Gaze Allocation During Emotion Perception From Speech-Like Stimuli." Multisensory Research 34, no. 1 (2020): 17–47. http://dx.doi.org/10.1163/22134808-bja10029.
Gao, Xiyuan, Shekhar Nayak, and Matt Coler. "Enhancing sarcasm detection through multimodal data integration: A proposal for augmenting audio with text and emoticon." Journal of the Acoustical Society of America 155, no. 3_Supplement (2024): A264. http://dx.doi.org/10.1121/10.0027441.
Vallverdú, Jordi, Gabriele Trovato, and Lorenzo Jamone. "Allocentric Emotional Affordances in HRI: The Multimodal Binding." Multimodal Technologies and Interaction 2, no. 4 (2018): 78. http://dx.doi.org/10.3390/mti2040078.
Shackman, Jessica E., and Seth D. Pollak. "Experiential Influences on Multimodal Perception of Emotion." Child Development 76, no. 5 (2005): 1116–26. http://dx.doi.org/10.1111/j.1467-8624.2005.00901.x.
Portnova, Galina V., and Daria A. Stebakova. "The multimodal emotion perception in codependent individuals." Neuroscience Research Notes 6, no. 1 (2023): 210. http://dx.doi.org/10.31117/neuroscirn.v6i1.210.
Barabanschikov, V. A., and E. V. Suvorova. "Gender Differences in the Recognition of Emotional States." Psychological Science and Education 26, no. 6 (2021): 107–16. http://dx.doi.org/10.17759/pse.2021260608.
Marino, David, Max Henry, Pascal E. Fortin, Rachit Bhayana, and Jeremy Cooperstock. "I See What You're Hearing: Facilitating The Effect of Environment on Perceived Emotion While Teleconferencing." Proceedings of the ACM on Human-Computer Interaction 7, CSCW1 (2023): 1–15. http://dx.doi.org/10.1145/3579495.
Bi, Xin, and Tian Zhang. "Analysis of the fusion of multimodal sentiment perception and physiological signals in Chinese-English cross-cultural communication: Transformer approach incorporating self-attention enhancement." PeerJ Computer Science 11 (May 23, 2025): e2890. https://doi.org/10.7717/peerj-cs.2890.
Yamauchi, Takashi, Jinsil Seo, and Annie Sungkajun. "Interactive Plants: Multisensory Visual-Tactile Interaction Enhances Emotional Experience." Mathematics 6, no. 11 (2018): 225. http://dx.doi.org/10.3390/math6110225.
Lavan, Nadine, and Carolyn McGettigan. "Increased Discriminability of Authenticity from Multimodal Laughter is Driven by Auditory Information." Quarterly Journal of Experimental Psychology 70, no. 10 (2017): 2159–68. http://dx.doi.org/10.1080/17470218.2016.1226370.
Portnova, Galina, Aleksandra Maslennikova, Natalya Zakharova, and Olga Martynova. "The Deficit of Multimodal Perception of Congruent and Non-Congruent Fearful Expressions in Patients with Schizophrenia: The ERP Study." Brain Sciences 11, no. 1 (2021): 96. http://dx.doi.org/10.3390/brainsci11010096.
Vani Vivekanand, Chettiyar. "Performance Analysis of Emotion Classification Using Multimodal Fusion Technique." Journal of Computational Science and Intelligent Technologies 2, no. 1 (2021): 14–20. http://dx.doi.org/10.53409/mnaa/jcsit/2103.
Jolibois, Simon Christophe, Akinori Ito, and Takashi Nose. "The Development of an Emotional Embodied Conversational Agent and the Evaluation of the Effect of Response Delay on User Impression." Applied Sciences 15, no. 8 (2025): 4256. https://doi.org/10.3390/app15084256.
Mittal, Trisha, Aniket Bera, and Dinesh Manocha. "Multimodal and Context-Aware Emotion Perception Model With Multiplicative Fusion." IEEE MultiMedia 28, no. 2 (2021): 67–75. http://dx.doi.org/10.1109/mmul.2021.3068387.
Barabanschikov, V. A., and E. V. Suvorova. "Vivid Face Perception as a Constructive Component of Multimodal Affective States." Experimental Psychology (Russia) 17, no. 4 (2024): 4–27. https://doi.org/10.17759/exppsy.2024170401.
Wang, Yonggu, Kailin Pan, Yifan Shao, Jiarong Ma, and Xiaojuan Li. "Applying a Convolutional Vision Transformer for Emotion Recognition in Children with Autism: Fusion of Facial Expressions and Speech Features." Applied Sciences 15, no. 6 (2025): 3083. https://doi.org/10.3390/app15063083.
Kdouri, Lahoucine, Youssef Hmamouche, Amal El Fallah Seghrouchni, and Thierry Chaminade. "Predicting Activity in Brain Areas Associated with Emotion Processing Using Multimodal Behavioral Signals." Multimodal Technologies and Interaction 9, no. 4 (2025): 31. https://doi.org/10.3390/mti9040031.
Barabanschikov, V. A., E. V. Suvorova, and A. V. Malionok. "Perception of the Prosodic Formative of Multimodal Affective States." Experimental Psychology (Russia) 17, no. 3 (2024): 30–51. http://dx.doi.org/10.17759/exppsy.2024170303.
Montembeault, Maxime, Estefania Brando, Kim Charest, et al. "Multimodal emotion perception in young and elderly patients with multiple sclerosis." Multiple Sclerosis and Related Disorders 58 (February 2022): 103478. http://dx.doi.org/10.1016/j.msard.2021.103478.
Bai, Jie. "Optimized Piano Music Education Model Based on Multimodal Information Fusion for Emotion Recognition in Multimedia Video Networks." Mobile Information Systems 2022 (August 24, 2022): 1–12. http://dx.doi.org/10.1155/2022/1882739.
Chen, Junwei. "Discussing Solutions to the Data Imbalance Problem in Emotion Recognition." Applied and Computational Engineering 174, no. 1 (2025): 23–31. https://doi.org/10.54254/2755-2721/2025.po24697.
Zhang, Jingjing, and Wei Chen. "A Decade of Music Emotion Computing: A Bibliometric Analysis of Trends, Interdisciplinary Collaboration, and Applications." Education for Information 41, no. 3 (2025): 227–55. https://doi.org/10.1177/01678329251323441.
Gauraangi Praakash. "Multimodal Emotion Recognition: A Tri-modal Approach Using Speech, Text, and Visual Cues for Enhanced Interaction Analysis." Journal of Information Systems Engineering and Management 10, no. 39s (2025): 654–63. https://doi.org/10.52783/jisem.v10i39s.7269.
Li, Jialu. "Integrating Multimodal Data for Deep Learning-Based Facial Emotion Recognition." Highlights in Science, Engineering and Technology 124 (February 18, 2025): 362–67. https://doi.org/10.54097/gpy08650.
Hancock, Megan R., and Tessa Bent. "Multimodal emotion perception: Influences of autism spectrum disorder and autism-like traits." Journal of the Acoustical Society of America 148, no. 4 (2020): 2765. http://dx.doi.org/10.1121/1.5147698.
Bänziger, Tanja, Marcello Mortillaro, and Klaus R. Scherer. "Introducing the Geneva Multimodal expression corpus for experimental research on emotion perception." Emotion 12, no. 5 (2012): 1161–79. http://dx.doi.org/10.1037/a0025827.
Udeh, Chinonso Paschal, Luefeng Chen, Sheng Du, Min Li, and Min Wu. "Multimodal Facial Emotion Recognition Using Improved Convolution Neural Networks Model." Journal of Advanced Computational Intelligence and Intelligent Informatics 27, no. 4 (2023): 710–19. http://dx.doi.org/10.20965/jaciii.2023.p0710.
Sable, R. Y., Aqsa Sayyed, Baliraje Kalyane, Kosheen Sadhu, and Prathamesh Ghatole. "Enhancing Music Mood Recognition with LLMs and Audio Signal Processing: A Multimodal Approach." International Journal for Research in Applied Science and Engineering Technology 12, no. 7 (2024): 628–42. http://dx.doi.org/10.22214/ijraset.2024.63590.
Horii, Takato, Yukie Nagai, and Minoru Asada. "Modeling Development of Multimodal Emotion Perception Guided by Tactile Dominance and Perceptual Improvement." IEEE Transactions on Cognitive and Developmental Systems 10, no. 3 (2018): 762–75. http://dx.doi.org/10.1109/tcds.2018.2809434.
Yamamoto, Hisako W., and Akihiro Tanaka. "The Development of multimodal emotion perception from the combination of bodies and voices." Proceedings of the Annual Convention of the Japanese Psychological Association 86 (2022): 1EV-062-PO. http://dx.doi.org/10.4992/pacjpa.86.0_1ev-062-po.
de Gelder, Beatrice, and Jean Vroomen. "Rejoinder - Bimodal emotion perception: integration across separate modalities, cross-modal perceptual grouping or perception of multimodal events?" Cognition & Emotion 14, no. 3 (2000): 321–24. http://dx.doi.org/10.1080/026999300378842.
Luna-Jiménez, Cristina, Ricardo Kleinlein, David Griol, Zoraida Callejas, Juan M. Montero, and Fernando Fernández-Martínez. "A Proposal for Multimodal Emotion Recognition Using Aural Transformers and Action Units on RAVDESS Dataset." Applied Sciences 12, no. 1 (2021): 327. http://dx.doi.org/10.3390/app12010327.
Barabanschikov, V. A., and O. A. Korolkova. "Perception of “Live” Facial Expressions." Experimental Psychology (Russia) 13, no. 3 (2020): 55–73. http://dx.doi.org/10.17759/exppsy.2020130305.
Kim, Bora, and Bhuja Chung. "Affective Perception Characteristics in School-aged Children with High-functioning Autism Spectrum Disorder according to Prosody and Semantic Consistency." Communication Sciences & Disorders 29, no. 4 (2024): 776–86. https://doi.org/10.12963/csd.240075.
Michael, Stefanus, and Amalia Zahra. "Multimodal speech emotion recognition optimization using genetic algorithm." Bulletin of Electrical Engineering and Informatics 13, no. 5 (2024): 3309–16. http://dx.doi.org/10.11591/eei.v13i5.7409.
Miao, Haotian, Yifei Zhang, Daling Wang, and Shi Feng. "Multi-Output Learning Based on Multimodal GCN and Co-Attention for Image Aesthetics and Emotion Analysis." Mathematics 9, no. 12 (2021): 1437. http://dx.doi.org/10.3390/math9121437.
Ryumin, Dmitry, Elena Ryumina, and Denis Ivanko. "EMOLIPS: Towards Reliable Emotional Speech Lip-Reading." Mathematics 11, no. 23 (2023): 4787. http://dx.doi.org/10.3390/math11234787.
Babaoğlu, Gizem, Başak Yazgan, Pınar Erturk, et al. "Vocal emotion recognition by native Turkish children with normal hearing and with hearing aids." Journal of the Acoustical Society of America 151, no. 4 (2022): A278. http://dx.doi.org/10.1121/10.0011335.
Berkane, Mohamed, Kenza Belhouchette, and Hacene Belhadef. "Emotion Recognition Approach Using Multilayer Perceptron Network and Motion Estimation." International Journal of Synthetic Emotions 10, no. 1 (2019): 38–53. http://dx.doi.org/10.4018/ijse.2019010102.
Dessai, Amita, and Hassanali Virani. "Multimodal and Multidomain Feature Fusion for Emotion Classification Based on Electrocardiogram and Galvanic Skin Response Signals." Sci 6, no. 1 (2024): 10. http://dx.doi.org/10.3390/sci6010010.
Resende Faria, Diego, Abraham Itzhak Weinberg, and Pedro Paulo Ayrosa. "Multimodal Affective Communication Analysis: Fusing Speech Emotion and Text Sentiment Using Machine Learning." Applied Sciences 14, no. 15 (2024): 6631. http://dx.doi.org/10.3390/app14156631.
Zhang, Lihong, Chaolong Liu, and Nan Jia. "Uni2Mul: A Conformer-Based Multimodal Emotion Classification Model by Considering Unimodal Expression Differences with Multi-Task Learning." Applied Sciences 13, no. 17 (2023): 9910. http://dx.doi.org/10.3390/app13179910.
Kholkhal, Mourad, Mohammed Sofiane Bendelhoum, Nabil Dib, and Mohammed Ridha Youbi. "Decoding the brain: EEG applications in sensory processing, emotion recognition, and pain assessment." Studies in Engineering and Exact Sciences 5, no. 2 (2024): e9688. http://dx.doi.org/10.54021/seesv5n2-388.
Kalam, Md Abdul. "A Study on a Journey through Stories in Sequential Frames." International Scientific Journal of Engineering and Management 04, no. 05 (2025): 1–7. https://doi.org/10.55041/isjem03444.
Yang, Yi, Hao Feng, Yiming Cheng, and Zhu Han. "Emotion-Aware Scene Adaptation: A Bandwidth-Efficient Approach for Generating Animated Shorts." Sensors 24, no. 5 (2024): 1660. http://dx.doi.org/10.3390/s24051660.
Fiorini, Laura, Grazia D'Onofrio, Alessandra Sorrentino, et al. "The Role of Coherent Robot Behavior and Embodiment in Emotion Perception and Recognition During Human-Robot Interaction: Experimental Study." JMIR Human Factors 11 (January 26, 2024): e45494. http://dx.doi.org/10.2196/45494.
Prajapati, Vrinda, Rajlakshmi Guha, and Aurobinda Routray. "Multimodal prediction of trait emotional intelligence–Through affective changes measured using non-contact based physiological measures." PLOS ONE 16, no. 7 (2021): e0254335. http://dx.doi.org/10.1371/journal.pone.0254335.
Piwek, Lukasz, Karin Petrini, and Frank E. Pollick. "Auditory signal dominates visual in the perception of emotional social interactions." Seeing and Perceiving 25 (2012): 112. http://dx.doi.org/10.1163/187847612x647450.