Journal articles on the topic 'Attention based models'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 journal articles for your research on the topic 'Attention based models.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.
Zang, Yubin, Zhenming Yu, Kun Xu, Minghua Chen, Sigang Yang, and Hongwei Chen. "Fiber communication receiver models based on the multi-head attention mechanism." Chinese Optics Letters 21, no. 3 (2023): 030602. http://dx.doi.org/10.3788/col202321.030602.
Qin, Chu-Xiong, and Dan Qu. "Towards Understanding Attention-Based Speech Recognition Models." IEEE Access 8 (2020): 24358–69. http://dx.doi.org/10.1109/access.2020.2970758.
Cha, Peter, Paul Ginsparg, Felix Wu, Juan Carrasquilla, Peter L. McMahon, and Eun-Ah Kim. "Attention-based quantum tomography." Machine Learning: Science and Technology 3, no. 1 (2021): 01LT01. http://dx.doi.org/10.1088/2632-2153/ac362b.
Fallahnejad, Zohreh, and Hamid Beigy. "Attention-based skill translation models for expert finding." Expert Systems with Applications 193 (May 2022): 116433. http://dx.doi.org/10.1016/j.eswa.2021.116433.
Steelman, Kelly S., Jason S. McCarley, and Christopher D. Wickens. "Theory-based Models of Attention in Visual Workspaces." International Journal of Human–Computer Interaction 33, no. 1 (2016): 35–43. http://dx.doi.org/10.1080/10447318.2016.1232228.
Thapa, Krishu K., Bhupinderjeet Singh, Supriya Savalkar, Alan Fern, Kirti Rajagopalan, and Ananth Kalyanaraman. "Attention-Based Models for Snow-Water Equivalent Prediction." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 21 (2024): 22969–75. http://dx.doi.org/10.1609/aaai.v38i21.30337.
王, 紫阳. "Wind Speed Prediction Based on Attention-Combined Models." Artificial Intelligence and Robotics Research 14, no. 02 (2025): 389–96. https://doi.org/10.12677/airr.2025.142038.
AlOmar, Ban, Zouheir Trabelsi, and Firas Saidi. "Attention-Based Deep Learning Modelling for Intrusion Detection." European Conference on Cyber Warfare and Security 22, no. 1 (2023): 22–32. http://dx.doi.org/10.34190/eccws.22.1.1172.
Tangsali, Rahul, Swapnil Chhatre, Soham Naik, Pranav Bhagwat, and Geetanjali Kale. "Evaluating Performances of Attention-Based Merge Architecture Models for Image Captioning in Indian Languages." Journal of Image and Graphics 11, no. 3 (2023): 294–301. http://dx.doi.org/10.18178/joig.11.3.294-301.
Kong, Phutphalla, Matei Mancas, Bernard Gosselin, and Kimtho Po. "DeepRare: Generic Unsupervised Visual Attention Models." Electronics 11, no. 11 (2022): 1696. http://dx.doi.org/10.3390/electronics11111696.
Wang, Fei, and Haijun Zhang. "Multiscale Convolutional Attention-based Residual Network Expression Recognition." Journal of Internet Technology 24, no. 5 (2023): 1169–75. http://dx.doi.org/10.53106/160792642023092405015.
Hashemi, Seyyed Mohammad Reza. "A Survey of Visual Attention Models." Ciência e Natura 37 (December 19, 2015): 297. http://dx.doi.org/10.5902/2179460x20786.
Lee, Soohyun, and Jongyoul Park. "Attention Map-Based Automatic Masking for Object Swapping in Diffusion Models." Journal of KIISE 52, no. 4 (2025): 284–92. https://doi.org/10.5626/jok.2025.52.4.284.
Yue, Wang, and Li Lei. "Sentiment Analysis using a CNN-BiLSTM Deep Model Based on Attention Classification." Information 26, no. 3 (2023): 117–62. http://dx.doi.org/10.47880/inf2603-02.
Sharada, Gupta, and N. Eshwarappa Murundi. "Breast cancer detection through attention based feature integration model." IAES International Journal of Artificial Intelligence (IJ-AI) 13, no. 2 (2024): 2254–64. https://doi.org/10.11591/ijai.v13.i2.pp2254-2264.
Singh, Sushant, and Ausif Mahmood. "CacheFormer: High-Attention-Based Segment Caching." AI 6, no. 4 (2025): 85. https://doi.org/10.3390/ai6040085.
Xiao, Wenjuan, and Xiaoming Wang. "Attention Mechanism Based Spatial-Temporal Graph Convolution Network for Traffic Prediction." Journal of Computers 35, no. 4 (2024): 93–108. http://dx.doi.org/10.53106/199115992024083504007.
Nazari, Sana, and Rafael Garcia. "Going Smaller: Attention-based models for automated melanoma diagnosis." Computers in Biology and Medicine 185 (February 2025): 109492. https://doi.org/10.1016/j.compbiomed.2024.109492.
Zhou, Qifeng, Xiang Liu, and Qing Wang. "Interpretable duplicate question detection models based on attention mechanism." Information Sciences 543 (January 2021): 259–72. http://dx.doi.org/10.1016/j.ins.2020.07.048.
Israr, Huma, Safdar Abbas Khan, Muhammad Ali Tahir, Muhammad Khuram Shahzad, Muneer Ahmad, and Jasni Mohamad Zain. "Neural Machine Translation Models with Attention-Based Dropout Layer." Computers, Materials & Continua 75, no. 2 (2023): 2981–3009. http://dx.doi.org/10.32604/cmc.2023.035814.
Yang, Zhifei, Wenmin Li, Fei Gao, and Qiaoyan Wen. "FAPA: Transferable Adversarial Attacks Based on Foreground Attention." Security and Communication Networks 2022 (October 29, 2022): 1–8. http://dx.doi.org/10.1155/2022/4447307.
He, Ruifeng, Mingtian Xie, and Aixing He. "Video anomaly detection based on hybrid attention mechanism." Applied and Computational Engineering 57, no. 1 (2024): 212–17. http://dx.doi.org/10.54254/2755-2721/57/20241336.
Rosenberg, Monica D., Wei-Ting Hsu, Dustin Scheinost, R. Todd Constable, and Marvin M. Chun. "Connectome-based Models Predict Separable Components of Attention in Novel Individuals." Journal of Cognitive Neuroscience 30, no. 2 (2018): 160–73. http://dx.doi.org/10.1162/jocn_a_01197.
Guo, Yuxi. "Interpretability analysis in transformers based on attention visualization." Applied and Computational Engineering 76, no. 1 (2024): 92–102. http://dx.doi.org/10.54254/2755-2721/76/20240571.
Wang, Lei, Ed X. Wu, and Fei Chen. "EEG-based auditory attention decoding using speech-level-based segmented computational models." Journal of Neural Engineering 18, no. 4 (2021): 046066. http://dx.doi.org/10.1088/1741-2552/abfeba.
Kramer, Arthur F., and Andrew Jacobson. "A comparison of Space-Based and Object-Based Models of Visual Attention." Proceedings of the Human Factors Society Annual Meeting 34, no. 19 (1990): 1489–93. http://dx.doi.org/10.1177/154193129003401915.
Rayeesa, Mehmood, Bashir Rumaan, and J. Giri Kaiser. "Deep Generative Models: A Review." Indian Journal of Science and Technology 16, no. 7 (2023): 460–67. https://doi.org/10.17485/IJST/v16i7.2296.
Yeom, Hong-gi, and Kyung-min An. "A Simplified Query-Only Attention for Encoder-Based Transformer Models." Applied Sciences 14, no. 19 (2024): 8646. http://dx.doi.org/10.3390/app14198646.
Hanafi, Hanafi, Andri Pranolo, Yingchi Mao, Taqwa Hariguna, Leonel Hernandez, and Nanang Fitriana Kurniawan. "IDSX-Attention: Intrusion detection system (IDS) based hybrid MADE-SDAE and LSTM-Attention mechanism." International Journal of Advances in Intelligent Informatics 9, no. 1 (2023): 121. http://dx.doi.org/10.26555/ijain.v9i1.942.
Sun, Wenhao, Xue-Mei Dong, Benlei Cui, and Jingqun Tang. "Attentive Eraser: Unleashing Diffusion Model's Object Removal Potential via Self-Attention Redirection Guidance." Proceedings of the AAAI Conference on Artificial Intelligence 39, no. 19 (2025): 20734–42. https://doi.org/10.1609/aaai.v39i19.34285.
Kristensen, Terje. "Towards Spike based Models of Visual Attention in the Brain." International Journal of Adaptive, Resilient and Autonomic Systems 6, no. 2 (2015): 117–38. http://dx.doi.org/10.4018/ijaras.2015070106.
Chandrasekaran, Ganesh, Mandalapu Kalpana Chowdary, Jyothi Chinna Babu, Ajmeera Kiran, Kotthuru Anil Kumar, and Seifedine Kadry. "Deep learning-based attention models for sarcasm detection in text." International Journal of Electrical and Computer Engineering (IJECE) 14, no. 6 (2024): 6786. http://dx.doi.org/10.11591/ijece.v14i6.pp6786-6796.
Sun, Xinhao. "Application of Attention-Based LSTM Hybrid Models for Stock Price Prediction." Advances in Economics, Management and Political Sciences 104, no. 1 (2024): 46–60. http://dx.doi.org/10.54254/2754-1169/104/2024ed0152.
Kamalov, Firuz, Inga Zicmane, Murodbek Safaraliev, Linda Smail, Mihail Senyuk, and Pavel Matrenin. "Attention-Based Load Forecasting with Bidirectional Finetuning." Energies 17, no. 18 (2024): 4699. http://dx.doi.org/10.3390/en17184699.
Zheng, Guoqiang, Tianle Zhao, and Yaohui Liu. "Cloud Removal in the Tibetan Plateau Region Based on Self-Attention and Local-Attention Models." Sensors 24, no. 23 (2024): 7848. https://doi.org/10.3390/s24237848.
Alsayadi, Hamzah A., Abdelaziz A. Abdelhamid, Islam Hegazy, and Zaki T. Fayed. "Non-diacritized Arabic speech recognition based on CNN-LSTM and attention-based models." Journal of Intelligent & Fuzzy Systems 41, no. 6 (2021): 6207–19. http://dx.doi.org/10.3233/jifs-202841.
Guang, Jiahe, Xingrui He, Zeng Li, and Shiyu He. "Road Pothole Detection Model Based on Local Attention Resnet18-CNN-LSTM." Theoretical and Natural Science 42, no. 1 (2024): 131–38. http://dx.doi.org/10.54254/2753-8818/42/20240669.
Zhou, Jiawei. "Predicting Stock Price by Using Attention-Based Hybrid LSTM Model." Asian Journal of Basic Science & Research 06, no. 02 (2024): 145–58. http://dx.doi.org/10.38177/ajbsr.2024.6211.
Shin, Yehjin, Jeongwhan Choi, Hyowon Wi, and Noseong Park. "An Attentive Inductive Bias for Sequential Recommendation beyond the Self-Attention." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 8 (2024): 8984–92. http://dx.doi.org/10.1609/aaai.v38i8.28747.
Wang, Yuming, Yu Li, and Hua Zou. "Masked Face Recognition System Based on Attention Mechanism." Information 14, no. 2 (2023): 87. http://dx.doi.org/10.3390/info14020087.
Kamal, Saurabh, Sahil Sharma, Vijay Kumar, Hammam Alshazly, Hany S. Hussein, and Thomas Martinetz. "Trading Stocks Based on Financial News Using Attention Mechanism." Mathematics 10, no. 12 (2022): 2001. http://dx.doi.org/10.3390/math10122001.
Si, Nianwen, Wenlin Zhang, Dan Qu, Xiangyang Luo, Heyu Chang, and Tong Niu. "Spatial-Channel Attention-Based Class Activation Mapping for Interpreting CNN-Based Image Classification Models." Security and Communication Networks 2021 (May 31, 2021): 1–13. http://dx.doi.org/10.1155/2021/6682293.
Zhang, Mengya, Yuan Zhang, and Qinghui Zhang. "Attention-Mechanism-Based Models for Unconstrained Face Recognition with Mask Occlusion." Electronics 12, no. 18 (2023): 3916. http://dx.doi.org/10.3390/electronics12183916.
Xue, Mengfan, Minghao Chen, Dongliang Peng, Yunfei Guo, and Huajie Chen. "One Spatio-Temporal Sharpening Attention Mechanism for Light-Weight YOLO Models Based on Sharpening Spatial Attention." Sensors 21, no. 23 (2021): 7949. http://dx.doi.org/10.3390/s21237949.
Amin, Rashid. "Urdu Sentiment Analysis Using Deep Attention-based Technique." Foundation University Journal of Engineering and Applied Sciences 3, no. 1 (2022): 7. http://dx.doi.org/10.33897/fujeas.v3i1.564.
Zhou, Lixin, Zhenyu Zhang, Laijun Zhao, and Pingle Yang. "Attention-based BiLSTM models for personality recognition from user-generated content." Information Sciences 596 (June 2022): 460–71. http://dx.doi.org/10.1016/j.ins.2022.03.038.
Zhang, Lin, Huapeng Qin, Junqi Mao, Xiaoyan Cao, and Guangtao Fu. "High temporal resolution urban flood prediction using attention-based LSTM models." Journal of Hydrology 620 (May 2023): 129499. http://dx.doi.org/10.1016/j.jhydrol.2023.129499.
Wang, Zhenyi, Pengfei Yang, Linwei Hu, et al. "SLAPP: Subgraph-level attention-based performance prediction for deep learning models." Neural Networks 170 (February 2024): 285–97. http://dx.doi.org/10.1016/j.neunet.2023.11.043.
Hao, Cuiping, and Ting Yang. "Deep Collaborative Online Learning Resource Recommendation Based on Attention Mechanism." Scientific Programming 2022 (March 24, 2022): 1–10. http://dx.doi.org/10.1155/2022/3199134.
Ashtari, Amirsaman, Chang Wook Seo, Cholmin Kang, Sihun Cha, and Junyong Noh. "Reference Based Sketch Extraction via Attention Mechanism." ACM Transactions on Graphics 41, no. 6 (2022): 1–16. http://dx.doi.org/10.1145/3550454.3555504.