Journal articles on the topic 'Attention LSTM'
Consult the top 50 journal articles for your research on the topic 'Attention LSTM.'
Li, Youru, Zhenfeng Zhu, Deqiang Kong, Hua Han, and Yao Zhao. "EA-LSTM: Evolutionary attention-based LSTM for time series prediction." Knowledge-Based Systems 181 (October 2019): 104785. http://dx.doi.org/10.1016/j.knosys.2019.05.028.
Huang, Zhongzhan, Senwei Liang, Mingfu Liang, and Haizhao Yang. "DIANet: Dense-and-Implicit Attention Network." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 4206–14. http://dx.doi.org/10.1609/aaai.v34i04.5842.
Wang, Hao, Xiaofang Zhang, Bin Liang, Qian Zhou, and Baowen Xu. "Gated Hierarchical LSTMs for Target-Based Sentiment Analysis." International Journal of Software Engineering and Knowledge Engineering 28, no. 11n12 (November 2018): 1719–37. http://dx.doi.org/10.1142/s0218194018400259.
Feng, Kaicheng, and Xiaobing Liu. "Adaptive Attention with Consumer Sentinel for Movie Box Office Prediction." Complexity 2020 (December 7, 2020): 1–9. http://dx.doi.org/10.1155/2020/6689304.
Jo, Youngki, and Hyunsoo Lee. "Electricity Demand Forecasting Framework using Modified Attention-based LSTM." Journal of Korean Institute of Intelligent Systems 30, no. 3 (June 30, 2020): 242–50. http://dx.doi.org/10.5391/jkiis.2020.30.3.242.
Gallardo-Antolín, Ascensión, and Juan M. Montero. "Detecting Deception from Gaze and Speech Using a Multimodal Attention LSTM-Based Framework." Applied Sciences 11, no. 14 (July 11, 2021): 6393. http://dx.doi.org/10.3390/app11146393.
Zhang, Xuan, Xun Liang, Aakas Zhiyuli, Shusen Zhang, Rui Xu, and Bo Wu. "AT-LSTM: An Attention-based LSTM Model for Financial Time Series Prediction." IOP Conference Series: Materials Science and Engineering 569 (August 9, 2019): 052037. http://dx.doi.org/10.1088/1757-899x/569/5/052037.
Yin, Helin, Dong Jin, Yeong Hyeon Gu, Chang Jin Park, Sang Keun Han, and Seong Joon Yoo. "STL-ATTLSTM: Vegetable Price Forecasting Using STL and Attention Mechanism-Based LSTM." Agriculture 10, no. 12 (December 8, 2020): 612. http://dx.doi.org/10.3390/agriculture10120612.
Yang, Zhan, Chengliang Li, Zhongying Zhao, and Chao Li. "Sentiment classification based on dependency-relationship embedding and attention mechanism." Journal of Intelligent & Fuzzy Systems 41, no. 1 (August 11, 2021): 867–77. http://dx.doi.org/10.3233/jifs-202747.
Kim, Hong-In, and Rae-Hong Park. "Residual LSTM Attention Network for Object Tracking." IEEE Signal Processing Letters 25, no. 7 (July 2018): 1029–33. http://dx.doi.org/10.1109/lsp.2018.2835768.
Xie, Yue, Ruiyu Liang, Zhenlin Liang, Chengwei Huang, Cairong Zou, and Bjorn Schuller. "Speech Emotion Classification Using Attention-Based LSTM." IEEE/ACM Transactions on Audio, Speech, and Language Processing 27, no. 11 (November 2019): 1675–85. http://dx.doi.org/10.1109/taslp.2019.2925934.
Liu, Zhongyu, Tian Chen, Enjie Ding, Yafeng Liu, and Wanli Yu. "Attention-Based Convolutional LSTM for Describing Video." IEEE Access 8 (2020): 133713–24. http://dx.doi.org/10.1109/access.2020.3010872.
Bin, Yi, Yang Yang, Fumin Shen, Ning Xie, Heng Tao Shen, and Xuelong Li. "Describing Video With Attention-Based Bidirectional LSTM." IEEE Transactions on Cybernetics 49, no. 7 (July 2019): 2631–41. http://dx.doi.org/10.1109/tcyb.2018.2831447.
Li, Xiangpeng, Zhilong Zhou, Lijiang Chen, and Lianli Gao. "Residual attention-based LSTM for video captioning." World Wide Web 22, no. 2 (February 26, 2018): 621–36. http://dx.doi.org/10.1007/s11280-018-0531-z.
Talafha, Bashar, Analle Abuammar, and Mahmoud Al-Ayyoub. "Atar: Attention-based LSTM for Arabizi transliteration." International Journal of Electrical and Computer Engineering (IJECE) 11, no. 3 (June 1, 2021): 2327. http://dx.doi.org/10.11591/ijece.v11i3.pp2327-2334.
Wang, Bin. "Attention-Bi-LSTM Based Analysis of Weibo Comments." Computer Science and Application 10, no. 12 (2020): 2380–87. http://dx.doi.org/10.12677/csa.2020.1012252.
Fang, Kuncheng, Lian Zhou, Cheng Jin, Yuejie Zhang, Kangnian Weng, Tao Zhang, and Weiguo Fan. "Fully Convolutional Video Captioning with Coarse-to-Fine and Inherited Attention." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 8271–78. http://dx.doi.org/10.1609/aaai.v33i01.33018271.
Li, Songzhou, Gang Xie, Jinchang Ren, Lei Guo, Yunyun Yang, and Xinying Xu. "Urban PM2.5 Concentration Prediction via Attention-Based CNN–LSTM." Applied Sciences 10, no. 6 (March 12, 2020): 1953. http://dx.doi.org/10.3390/app10061953.
Lee, Yong-Hyeok, Dong-Won Jang, Jae-Bin Kim, Rae-Hong Park, and Hyung-Min Park. "Audio–Visual Speech Recognition Based on Dual Cross-Modality Attentions with the Transformer Model." Applied Sciences 10, no. 20 (October 17, 2020): 7263. http://dx.doi.org/10.3390/app10207263.
Xie, Yue, Ruiyu Liang, Zhenlin Liang, and Li Zhao. "Attention-Based Dense LSTM for Speech Emotion Recognition." IEICE Transactions on Information and Systems E102.D, no. 7 (July 1, 2019): 1426–29. http://dx.doi.org/10.1587/transinf.2019edl8019.
Nizamidin, Tashpolat, Li Zhao, Ruiyu Liang, Yue Xie, and Askar Hamdulla. "Siamese Attention-Based LSTM for Speech Emotion Recognition." IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences E103.A, no. 7 (July 1, 2020): 937–41. http://dx.doi.org/10.1587/transfun.2019eal2156.
Shen, Yatian, Yan Li, Jun Sun, Wenke Ding, Xianjin Shi, Lei Zhang, Xiajiong Shen, and Jing He. "Hashtag Recommendation Using LSTM Networks with Self-Attention." Computers, Materials & Continua 61, no. 3 (2019): 1261–69. http://dx.doi.org/10.32604/cmc.2019.06104.
Deng, Dong, Liping Jing, Jian Yu, and Shaolong Sun. "Sparse Self-Attention LSTM for Sentiment Lexicon Construction." IEEE/ACM Transactions on Audio, Speech, and Language Processing 27, no. 11 (November 2019): 1777–90. http://dx.doi.org/10.1109/taslp.2019.2933326.
Wang, Ye, Xinxiang Zhang, Mi Lu, Han Wang, and Yoonsuck Choe. "Attention augmentation with multi-residual in bidirectional LSTM." Neurocomputing 385 (April 2020): 340–47. http://dx.doi.org/10.1016/j.neucom.2019.10.068.
Yang, Liang, Haifeng Hu, Songlong Xing, and Xinlong Lu. "Constrained LSTM and Residual Attention for Image Captioning." ACM Transactions on Multimedia Computing, Communications, and Applications 16, no. 3 (September 4, 2020): 1–18. http://dx.doi.org/10.1145/3386725.
Zhang, Tao, Xiao-Qing Zheng, and Ming-Xin Liu. "Multiscale attention-based LSTM for ship motion prediction." Ocean Engineering 230 (June 2021): 109066. http://dx.doi.org/10.1016/j.oceaneng.2021.109066.
Yan, Le, Changwei Chen, Tingting Hang, and Youchuan Hu. "A stream prediction model based on attention-LSTM." Earth Science Informatics 14, no. 2 (February 16, 2021): 723–33. http://dx.doi.org/10.1007/s12145-021-00571-z.
Zhang, Yuming. "Action Recognition Based on Attention and Bi-LSTM." Computer Science and Application 11, no. 06 (2021): 1607–16. http://dx.doi.org/10.12677/csa.2021.116166.
Munir, Hafiz Shahbaz, Shengbing Ren, Mubashar Mustafa, Chaudry Naeem Siddique, and Shazib Qayyum. "Attention based GRU-LSTM for software defect prediction." PLOS ONE 16, no. 3 (March 4, 2021): e0247444. http://dx.doi.org/10.1371/journal.pone.0247444.
Zhu, Mingkang, and Xianling Lu. "Human Action Recognition Algorithm Based on Bi-LSTM-Attention Model." Laser & Optoelectronics Progress 56, no. 15 (2019): 151503. http://dx.doi.org/10.3788/lop56.151503.
Kim, Mintae, Yeongtaek Oh, and Wooju Kim. "Sentence Similarity Prediction based on Siamese CNN-Bidirectional LSTM with Self-attention." Journal of KIISE 46, no. 3 (March 31, 2019): 241–45. http://dx.doi.org/10.5626/jok.2019.46.3.241.
Cheng, Lin, Yuliang Shi, Kun Zhang, Xinjun Wang, and Zhiyong Chen. "GGATB-LSTM: Grouping and Global Attention-based Time-aware Bidirectional LSTM Medical Treatment Behavior Prediction." ACM Transactions on Knowledge Discovery from Data 15, no. 3 (May 2021): 1–16. http://dx.doi.org/10.1145/3441454.
Jang, Beakcheol, Myeonghwi Kim, Gaspard Harerimana, Sang-ug Kang, and Jong Wook Kim. "Bi-LSTM Model to Increase Accuracy in Text Classification: Combining Word2vec CNN and Attention Mechanism." Applied Sciences 10, no. 17 (August 24, 2020): 5841. http://dx.doi.org/10.3390/app10175841.
Zhu, Xinxin, Lixiang Li, Jing Liu, Ziyi Li, Haipeng Peng, and Xinxin Niu. "Image captioning with triple-attention and stack parallel LSTM." Neurocomputing 319 (November 2018): 55–65. http://dx.doi.org/10.1016/j.neucom.2018.08.069.
Wang, Bo, and Binwen Fan. "Attention-based Hierarchical LSTM Model for Document Sentiment Classification." IOP Conference Series: Materials Science and Engineering 435 (November 5, 2018): 012051. http://dx.doi.org/10.1088/1757-899x/435/1/012051.
Fu, Xianghua, Jingying Yang, Jianqiang Li, Min Fang, and Huihui Wang. "Lexicon-Enhanced LSTM With Attention for General Sentiment Analysis." IEEE Access 6 (2018): 71884–91. http://dx.doi.org/10.1109/access.2018.2878425.
Liang, Ruiyu, Fanliu Kong, Yue Xie, Guichen Tang, and Jiaming Cheng. "Real-Time Speech Enhancement Algorithm Based on Attention LSTM." IEEE Access 8 (2020): 48464–76. http://dx.doi.org/10.1109/access.2020.2979554.
Ding, Yukai, Yuelong Zhu, Jun Feng, Pengcheng Zhang, and Zirun Cheng. "Interpretable spatio-temporal attention LSTM model for flood forecasting." Neurocomputing 403 (August 2020): 348–59. http://dx.doi.org/10.1016/j.neucom.2020.04.110.
Kumar, Avinash, Vishnu Teja Narapareddy, Veerubhotla Aditya Srikanth, Aruna Malapati, and Lalita Bhanu Murthy Neti. "Sarcasm Detection Using Multi-Head Attention Based Bidirectional LSTM." IEEE Access 8 (2020): 6388–97. http://dx.doi.org/10.1109/access.2019.2963630.
Lin, Zhifeng, Lianglun Cheng, and Guoheng Huang. "Electricity consumption prediction based on LSTM with attention mechanism." IEEJ Transactions on Electrical and Electronic Engineering 15, no. 4 (January 6, 2020): 556–62. http://dx.doi.org/10.1002/tee.23088.
Zhu, Guangming, Liang Zhang, Lu Yang, Lin Mei, Syed Afaq Ali Shah, Mohammed Bennamoun, and Peiyi Shen. "Redundancy and Attention in Convolutional LSTM for Gesture Recognition." IEEE Transactions on Neural Networks and Learning Systems 31, no. 4 (April 2020): 1323–35. http://dx.doi.org/10.1109/tnnls.2019.2919764.
Gao, Lianli, Zhao Guo, Hanwang Zhang, Xing Xu, and Heng Tao Shen. "Video Captioning With Attention-Based LSTM and Semantic Consistency." IEEE Transactions on Multimedia 19, no. 9 (September 2017): 2045–55. http://dx.doi.org/10.1109/tmm.2017.2729019.
Jing, Ran. "A Self-attention Based LSTM Network for Text Classification." Journal of Physics: Conference Series 1207 (April 2019): 012008. http://dx.doi.org/10.1088/1742-6596/1207/1/012008.
Zhong, Rui, Rui Wang, Yang Zou, Zhiqiang Hong, and Min Hu. "Graph Attention Networks Adjusted Bi-LSTM for Video Summarization." IEEE Signal Processing Letters 28 (2021): 663–67. http://dx.doi.org/10.1109/lsp.2021.3066349.
Ullah, Mohib, Muhammad Mudassar Yamin, Ahmed Mohammed, Sultan Daud Khan, Habib Ullah, and Faouzi Alaya Cheikh. "Attention-Based LSTM Network for Action Recognition in Sports." Electronic Imaging 2021, no. 6 (January 18, 2021): 302–1. http://dx.doi.org/10.2352/issn.2470-1173.2021.6.iriacv-302.
Yu, Yeonguk, and Yoon-Joong Kim. "Attention-LSTM-Attention Model for Speech Emotion Recognition and Analysis of IEMOCAP Database." Electronics 9, no. 5 (April 26, 2020): 713. http://dx.doi.org/10.3390/electronics9050713.
Li, Jiakang, Xiongwei Zhang, Meng Sun, Xia Zou, and Changyan Zheng. "Attention-Based LSTM Algorithm for Audio Replay Detection in Noisy Environments." Applied Sciences 9, no. 8 (April 13, 2019): 1539. http://dx.doi.org/10.3390/app9081539.
Cheng, Yepeng, Zuren Liu, and Yasuhiko Morimoto. "Attention-Based SeriesNet: An Attention-Based Hybrid Neural Network Model for Conditional Time Series Forecasting." Information 11, no. 6 (June 5, 2020): 305. http://dx.doi.org/10.3390/info11060305.
Wu, Yirui, Yukai Ding, Yuelong Zhu, Jun Feng, and Sifeng Wang. "Complexity to Forecast Flood: Problem Definition and Spatiotemporal Attention LSTM Solution." Complexity 2020 (March 26, 2020): 1–13. http://dx.doi.org/10.1155/2020/7670382.
Zou, Xiangyu, Jinjin Zhao, Duan Zhao, Bin Sun, Yongxin He, and Stelios Fuentes. "Air Quality Prediction Based on a Spatiotemporal Attention Mechanism." Mobile Information Systems 2021 (February 19, 2021): 1–12. http://dx.doi.org/10.1155/2021/6630944.