Academic literature on the topic 'Attention LSTM'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Attention LSTM.'
Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Attention LSTM"
Li, Youru, Zhenfeng Zhu, Deqiang Kong, Hua Han, and Yao Zhao. "EA-LSTM: Evolutionary attention-based LSTM for time series prediction." Knowledge-Based Systems 181 (October 2019): 104785. http://dx.doi.org/10.1016/j.knosys.2019.05.028.
Huang, Zhongzhan, Senwei Liang, Mingfu Liang, and Haizhao Yang. "DIANet: Dense-and-Implicit Attention Network." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 4206–14. http://dx.doi.org/10.1609/aaai.v34i04.5842.
Wang, Hao, Xiaofang Zhang, Bin Liang, Qian Zhou, and Baowen Xu. "Gated Hierarchical LSTMs for Target-Based Sentiment Analysis." International Journal of Software Engineering and Knowledge Engineering 28, no. 11n12 (November 2018): 1719–37. http://dx.doi.org/10.1142/s0218194018400259.
Feng, Kaicheng, and Xiaobing Liu. "Adaptive Attention with Consumer Sentinel for Movie Box Office Prediction." Complexity 2020 (December 7, 2020): 1–9. http://dx.doi.org/10.1155/2020/6689304.
Jo, Youngki, and Hyunsoo Lee. "Electricity Demand Forecasting Framework using Modified Attention-based LSTM." Journal of Korean Institute of Intelligent Systems 30, no. 3 (June 30, 2020): 242–50. http://dx.doi.org/10.5391/jkiis.2020.30.3.242.
Gallardo-Antolín, Ascensión, and Juan M. Montero. "Detecting Deception from Gaze and Speech Using a Multimodal Attention LSTM-Based Framework." Applied Sciences 11, no. 14 (July 11, 2021): 6393. http://dx.doi.org/10.3390/app11146393.
Zhang, Xuan, Xun Liang, Aakas Zhiyuli, Shusen Zhang, Rui Xu, and Bo Wu. "AT-LSTM: An Attention-based LSTM Model for Financial Time Series Prediction." IOP Conference Series: Materials Science and Engineering 569 (August 9, 2019): 052037. http://dx.doi.org/10.1088/1757-899x/569/5/052037.
Yin, Helin, Dong Jin, Yeong Hyeon Gu, Chang Jin Park, Sang Keun Han, and Seong Joon Yoo. "STL-ATTLSTM: Vegetable Price Forecasting Using STL and Attention Mechanism-Based LSTM." Agriculture 10, no. 12 (December 8, 2020): 612. http://dx.doi.org/10.3390/agriculture10120612.
Yang, Zhan, Chengliang Li, Zhongying Zhao, and Chao Li. "Sentiment classification based on dependency-relationship embedding and attention mechanism." Journal of Intelligent & Fuzzy Systems 41, no. 1 (August 11, 2021): 867–77. http://dx.doi.org/10.3233/jifs-202747.
Kim, Hong-In, and Rae-Hong Park. "Residual LSTM Attention Network for Object Tracking." IEEE Signal Processing Letters 25, no. 7 (July 2018): 1029–33. http://dx.doi.org/10.1109/lsp.2018.2835768.
Dissertations / Theses on the topic "Attention LSTM"
Singh, J. P., A. Kumar, Nripendra P. Rana, and Y. K. Dwivedi. "Attention-based LSTM network for rumor veracity estimation of tweets." Springer, 2020. http://hdl.handle.net/10454/17942.
Twitter has become a fertile place for rumors, as information can spread to a large number of people immediately. Rumors can mislead public opinion, weaken social order, decrease the legitimacy of government, and pose a significant threat to social stability. Therefore, timely detection and debunking of rumors is urgently needed. In this work, we propose an attention-based Long Short-Term Memory (LSTM) network that uses tweet text together with thirteen different linguistic and user features to distinguish rumor from non-rumor tweets. The performance of the proposed attention-based LSTM model is compared with several conventional machine learning and deep learning models. The proposed model achieved an F1-score of 0.88 in classifying rumor and non-rumor tweets, which is better than the state-of-the-art results. The proposed system can reduce the impact of rumors on society, limit losses of life and money, and build firm user trust in social media platforms.
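The attention-over-LSTM pattern that recurs throughout these works can be sketched in a few lines: each hidden state the LSTM emits is scored against a learned vector, the scores are softmax-normalized into attention weights, and the weighted sum forms a context vector for the classifier. The sketch below is a minimal illustration of that mechanism (the hidden states and scoring vector are toy stand-ins, not any author's code):

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden_states, w):
    """Score each LSTM hidden state with a learned vector w, normalize
    the scores with softmax, and return the weighted-sum context vector."""
    scores = [sum(h_i * w_i for h_i, w_i in zip(h, w)) for h in hidden_states]
    weights = softmax(scores)
    dim = len(hidden_states[0])
    context = [sum(a * h[d] for a, h in zip(weights, hidden_states))
               for d in range(dim)]
    return context, weights

# Toy stand-ins for the hidden states an LSTM would emit over 3 time steps.
H = [[0.1, 0.3], [0.5, -0.2], [0.9, 0.4]]
w = [1.0, 0.5]
context, weights = attention_pool(H, w)
```

The weights sum to one and reveal which time steps the model attends to, which is why several of the theses below also use them as an interpretability signal.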
Kindbom, Hannes. "Investigating the Attribution Quality of LSTM with Attention and SHAP : Going Beyond Predictive Performance." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-302412.
By estimating the impact each marketing channel has on conversions, advertisers can develop strategies and spend their marketing budgets optimally. This is often called attribution modeling, and it is receiving growing attention in both industry and academia as the availability of online tracking data increases. With its focus on higher predictive performance, the Long Short-Term Memory (LSTM) network is currently a popular data-driven solution in attribution modeling. However, such deep neural networks have been criticized as hard to interpret. Interpretability matters, since channel attributions are generally obtained by studying how a model makes a binary conversion prediction given a sequence of clicks or impressions of ads in different channels. This thesis therefore studies and compares the quality of an LSTM's attributions, computed with SHapley Additive exPlanations (SHAP), attention, and fractional scores, against three baseline models. Fractional scores are computed as the mean difference in a model's predicted conversion probability with and without a given channel. In addition, a synthetic data generator based on a Poisson process is developed and validated against real data. The generator makes it possible to measure attribution quality as the Mean Absolute Error (MAE) between computed attributions and the true causal relationships between channel clicks and conversions. The experimental results show that attribution quality is not unambiguously reflected by an LSTM's predictive performance. In general, high attribution quality cannot be assumed from high predictive performance alone. For example, all models achieve roughly 82% predictive accuracy on real data, while LSTM Fractional and SHAP yield the lowest attribution quality, at 0.0566 and 0.0311 MAE respectively. This compares to an improved MAE of 0.0058 obtained with a last-touch model.
The quality of attributions also varies significantly depending on which attribution method is used with the LSTM. This suggests that the ongoing pursuit of higher predictive accuracy can be questioned, and that using an LSTM is not always justified when high-quality attributions are sought.
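The fractional-score idea described in the abstract, the change in predicted conversion probability when a channel is removed from the sequence, can be illustrated with a toy model (the `predict` function and channel names here are invented for the example, not the thesis's model):

```python
def fractional_score(predict, sequence, channel):
    """Change in predicted conversion probability when one channel's
    clicks are removed from the sequence (illustrative helper)."""
    with_channel = predict(sequence)
    without_channel = predict([c for c in sequence if c != channel])
    return with_channel - without_channel

# Toy model: conversion probability grows with click count, capped at 1.
predict = lambda seq: min(1.0, 0.2 * len(seq))

seq = ["search", "display", "search", "email"]
score = fractional_score(predict, seq, "search")
# Dropping the two "search" clicks lowers the prediction from 0.8 to 0.4.
```

In the thesis this difference is averaged over many sequences; the sketch shows the per-sequence computation only.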
Forch, Valentin, Julien Vitay, and Fred H. Hamker. "Recurrent Spatial Attention for Facial Emotion Recognition." Technische Universität Chemnitz, 2020. https://monarch.qucosa.de/id/qucosa%3A72453.
Bopaiah, Jeevith. "A recurrent neural network architecture for biomedical event trigger classification." UKnowledge, 2018. https://uknowledge.uky.edu/cs_etds/73.
Soncini, Filippo. "Classificazione di documenti tramite reti neurali" [Document classification with neural networks]. Master's thesis, Alma Mater Studiorum - Università di Bologna, 2020. http://amslaurea.unibo.it/20509/.
Näslund, Per. "Artificial Neural Networks in Swedish Speech Synthesis." Thesis, KTH, Tal-kommunikation, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-239350.
Speech synthesis, also called TTS (text-to-speech), is widely used in smart assistants and many other applications. Contemporary research applies machine learning and artificial neural networks (ANNs) to perform speech synthesis, and studies have shown that these systems outperform the older concatenative and parametric methods. This report explores ANN-based TTS methods and implements one of them for the Swedish language. The method used is called "Tacotron" and is a first step toward end-to-end TTS based on neural networks; it ties together several different ANN techniques. The resulting system is compared with a parametric TTS through a graded preference test involving 20 Swedish-speaking subjects. A statistically significant preference for the ANN-based TTS system is established. The subjects indicate that the ANN-based system performs better than the parametric one in sound quality and naturalness, but shows shortcomings in intelligibility.
Carman, Benjamin Andrew. "Translating LaTeX to Coq: A Recurrent Neural Network Approach to Formalizing Natural Language Proofs." Ohio University Honors Tutorial College / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=ouhonors161919616626269.
Ujihara, Rintaro. "Multi-objective optimization for model selection in music classification." Thesis, KTH, Optimeringslära och systemteori, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-298370.
With the breakthrough of machine learning techniques, research on emotion classification in music has made significant progress by combining various music analysis tools with new machine learning models. Even so, how the audio data should be preprocessed and which classification algorithm to apply depend on the type of data at hand and the goal of the project. This thesis's partner company, Ichigoichie AB, is currently developing a system to categorize music data into positive and negative emotions. To improve the system's accuracy, the goal of this thesis is to experimentally find the best model based on six audio features (Mel spectrogram, MFCC, HPSS, Onset, CENS, and Tonnetz) and a number of machine learning models, including deep learning models. Each model is hyperparameter-optimized and evaluated for Pareto optimality with respect to accuracy and computation time. The results show that the most promising model achieved 95% classification accuracy with a computation time of less than 15 seconds.
GAO, SHAO-EN, and 高紹恩. "Share Price Trend Prediction Using Attention with LSTM Structure." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/n57t99.
National Chin-Yi University of Technology
Department of Computer Science and Information Engineering
Academic year 107 (ROC calendar)
The stock market has a considerable impact on the financial market as a whole, and among prediction research, forecasting stock price movements is a hot topic. In this paper, stock price movements are predicted from various stock information using deep learning techniques. Since future stock prices are usually related to past prices, a Long Short-Term Memory (LSTM) based architecture is proposed. LSTM mitigates the long-term dependency problems of traditional RNNs, effectively improving the accuracy and stability of prediction, and adding an attention mechanism improves the network's accuracy and stability further; experiments confirm that the proposed attention-based LSTM architecture effectively improves prediction accuracy.
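The "past prices predict the next movement" framing this abstract describes is usually set up by slicing the price series into fixed-length windows, each labeled by whether the following price rises. A minimal sketch of that preprocessing step (toy prices, not the thesis's data pipeline):

```python
def make_windows(prices, lookback):
    """Each sample: `lookback` past prices -> label 1 if the next price
    rises above the window's last price, else 0."""
    X, y = [], []
    for i in range(len(prices) - lookback):
        window = prices[i:i + lookback]
        X.append(window)
        y.append(1 if prices[i + lookback] > window[-1] else 0)
    return X, y

prices = [10.0, 10.5, 10.2, 10.8, 10.6, 11.0]
X, y = make_windows(prices, lookback=3)
# X[0] = [10.0, 10.5, 10.2] is labeled 1 because the next price is 10.8.
```

Each window in `X` would then be fed to the LSTM as one input sequence, with `y` as the movement label.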
Tseng, Po-Yen, and 曾博彥. "Android Malware Analysis Based on System Call sequences and Attention-LSTM." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/gdrth9.
National Central University
Department of Information Management
Academic year 107 (ROC calendar)
With the popularity of Android mobile devices, detecting and defending against malicious software has become an important issue. Studies have shown that dynamic analysis can overcome evasion techniques such as code obfuscation. However, how to learn more of the correlations within the sequence-type features extracted by dynamic analysis, and thereby improve the accuracy of the classification model, remains the focus of much research effort. This study extracts system call sequences as features and captures the correlations among system calls with a Long Short-Term Memory (LSTM) deep learning model. In addition, to keep growing system call sequence lengths from degrading classification accuracy, an attention mechanism is added to the classification model. The experimental results show that, with a two-layer Bi-LSTM architecture and an attention mechanism, the model distinguishes benign from malicious programs with 93.5% accuracy, and classifies benign programs against two further malicious types with 93.1% accuracy, demonstrating excellent classification ability.
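Before a system call trace can reach an LSTM such as the one in this thesis, the calls must be mapped to integer indices and padded to a common length. The following sketch shows one conventional way to do that (the vocabulary scheme, call names, and padding length are illustrative assumptions, not the thesis's code):

```python
def build_vocab(traces):
    """Map each distinct system call name to an integer id,
    reserving 0 for padding and 1 for unseen calls."""
    vocab = {"<pad>": 0, "<unk>": 1}
    for trace in traces:
        for call in trace:
            vocab.setdefault(call, len(vocab))
    return vocab

def encode(trace, vocab, max_len):
    """Truncate or right-pad a trace to max_len integer ids."""
    ids = [vocab.get(call, vocab["<unk>"]) for call in trace[:max_len]]
    return ids + [vocab["<pad>"]] * (max_len - len(ids))

traces = [["open", "read", "write", "close"],
          ["open", "ioctl", "close"]]
vocab = build_vocab(traces)
batch = [encode(t, vocab, max_len=5) for t in traces]
```

The resulting integer batch is what an embedding layer followed by the Bi-LSTM would consume; the attention layer then weights the per-call hidden states before classification.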
Book chapters on the topic "Attention LSTM"
Grósz, Tamás, and Mikko Kurimo. "LSTM-XL: Attention Enhanced Long-Term Memory for LSTM Cells." In Text, Speech, and Dialogue, 382–93. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-83527-9_32.
Yuan, Fangfang, Yanmin Shang, Yanbing Liu, Yanan Cao, and Jianlong Tan. "Attention-Based LSTM for Insider Threat Detection." In Applications and Techniques in Information Security, 192–201. Singapore: Springer Singapore, 2019. http://dx.doi.org/10.1007/978-981-15-0871-4_15.
Lebron Casas, Luis, and Eugenia Koblents. "Video Summarization with LSTM and Deep Attention Models." In MultiMedia Modeling, 67–79. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-05716-9_6.
Zheng, Zengwei, Lifei Shi, Chi Wang, Lin Sun, and Gang Pan. "LSTM with Uniqueness Attention for Human Activity Recognition." In Lecture Notes in Computer Science, 498–509. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-30508-6_40.
Liu, Suyuan, Wenming Zheng, Tengfei Song, and Yuan Zong. "Sparse Graphic Attention LSTM for EEG Emotion Recognition." In Communications in Computer and Information Science, 690–97. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-36808-1_75.
Weytjens, Hans, and Jochen De Weerdt. "Process Outcome Prediction: CNN vs. LSTM (with Attention)." In Business Process Management Workshops, 321–33. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-66498-5_24.
Li, Changliang, Changsong Li, and Pengyuan Liu. "Sentiment Analysis Based on LSTM Architecture with Emoticon Attention." In Lecture Notes in Computer Science, 232–42. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-26142-9_21.
Cai, Guoyong, and Hongyu Li. "Joint Attention LSTM Network for Aspect-Level Sentiment Analysis." In Lecture Notes in Computer Science, 147–57. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-01012-6_12.
Wang, Xiuling, Hao Chen, Zhoujun Li, and Zhonghua Zhao. "Unrest News Amount Prediction with Context-Aware Attention LSTM." In Lecture Notes in Computer Science, 369–77. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-97310-4_42.
Zhang, Kai, Weiping Ren, and Yangsen Zhang. "Attention-Based Bi-LSTM for Chinese Named Entity Recognition." In Lecture Notes in Computer Science, 643–52. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-04015-4_56.
Conference papers on the topic "Attention LSTM"
Chen, Zhenzhong, and Wanjie Sun. "Scanpath Prediction for Visual Attention using IOR-ROI LSTM." In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/89.
Xu, Cheng, Junzhong Ji, Menglong Zhang, and Xiaodan Zhang. "Attention-gated LSTM for Image Captioning." In 2019 International Conference on Unmanned Systems and Artificial Intelligence (ICUSAI). IEEE, 2019. http://dx.doi.org/10.1109/icusai47366.2019.9124779.
Ahmed, Mahtab, Muhammad Rifayat Samee, and Robert E. Mercer. "Improving Tree-LSTM with Tree Attention." In 2019 IEEE 13th International Conference on Semantic Computing (ICSC). IEEE, 2019. http://dx.doi.org/10.1109/icosc.2019.8665673.
Cheng, Weiguo, and Zhenyi Xu. "ECS Request Prediction with Attention-LSTM." In 2020 Chinese Automation Congress (CAC). IEEE, 2020. http://dx.doi.org/10.1109/cac51589.2020.9327311.
Zhang, Zhichao, Junyu Dong, Qilu Zhao, Lin Qi, and Shu Zhang. "Attention LSTM for Scene Graph Generation." In 2021 6th International Conference on Image, Vision and Computing (ICIVC). IEEE, 2021. http://dx.doi.org/10.1109/icivc52351.2021.9526967.
Xing, Bowen, Lejian Liao, Dandan Song, Jingang Wang, Fuzheng Zhang, Zhongyuan Wang, and Heyan Huang. "Earlier Attention? Aspect-Aware LSTM for Aspect-Based Sentiment Analysis." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/738.
Guo, Jingjie, Kelang Tian, Kejiang Ye, and Cheng-Zhong Xu. "MA-LSTM: A Multi-Attention Based LSTM for Complex Pattern Extraction." In 2020 25th International Conference on Pattern Recognition (ICPR). IEEE, 2021. http://dx.doi.org/10.1109/icpr48806.2021.9412402.
Song, Jingkuan, Lianli Gao, Zhao Guo, Wu Liu, Dongxiang Zhang, and Heng Tao Shen. "Hierarchical LSTM with Adjusted Temporal Attention for Video Captioning." In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/381.
Xie, Qi, Yongjun Wang, and Zhiquan Qin. "Malware Family Classification using LSTM with Attention." In 2020 13th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI). IEEE, 2020. http://dx.doi.org/10.1109/cisp-bmei51763.2020.9263499.
Lei, Li, Ming Chen, Chengwan He, and Duojiao Li. "XSS Detection Technology Based on LSTM-Attention." In 2020 5th International Conference on Control, Robotics and Cybernetics (CRC). IEEE, 2020. http://dx.doi.org/10.1109/crc51253.2020.9253484.