Journal articles on the topic "Recurrent neural networks BLSTM"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Browse the top 50 journal articles for your research on the topic "Recurrent neural networks BLSTM".
Next to each work in the list of references you will find an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, whenever these are available in the metadata.
Browse journal articles across many disciplines and compile your bibliography correctly.
Guo, Yanbu, Bingyi Wang, Weihua Li, and Bei Yang. "Protein secondary structure prediction improved by recurrent neural networks integrated with two-dimensional convolutional neural networks." Journal of Bioinformatics and Computational Biology 16, no. 05 (2018): 1850021. http://dx.doi.org/10.1142/s021972001850021x.
Zhong, Cheng, Zhonglian Jiang, Xiumin Chu, and Lei Liu. "Inland Ship Trajectory Restoration by Recurrent Neural Network." Journal of Navigation 72, no. 06 (2019): 1359–77. http://dx.doi.org/10.1017/s0373463319000316.
Kadari, Rekia, Yu Zhang, Weinan Zhang, and Ting Liu. "CCG supertagging with bidirectional long short-term memory networks." Natural Language Engineering 24, no. 1 (2017): 77–90. http://dx.doi.org/10.1017/s1351324917000250.
Shchetinin, E. Yu. "Emotions Recognition in Human Speech Using Deep Neural Networks." Vestnik komp'iuternykh i informatsionnykh tekhnologii, no. 199 (January 2021): 44–51. http://dx.doi.org/10.14489/vkit.2021.01.pp.044-051.
Dutta, Aparajita, Kusum Kumari Singh, and Ashish Anand. "SpliceViNCI: Visualizing the splicing of non-canonical introns through recurrent neural networks." Journal of Bioinformatics and Computational Biology 19, no. 04 (2021): 2150014. http://dx.doi.org/10.1142/s0219720021500141.
Zhang, Ansi, Honglei Wang, Shaobo Li, et al. "Transfer Learning with Deep Recurrent Neural Networks for Remaining Useful Life Estimation." Applied Sciences 8, no. 12 (2018): 2416. http://dx.doi.org/10.3390/app8122416.
Li, Yue, Xutao Wang, and Pengjian Xu. "Chinese Text Classification Model Based on Deep Learning." Future Internet 10, no. 11 (2018): 113. http://dx.doi.org/10.3390/fi10110113.
Xuan, Wenjing, Ning Liu, Neng Huang, Yaohang Li, and Jianxin Wang. "CLPred: a sequence-based protein crystallization predictor using BLSTM neural network." Bioinformatics 36, Supplement_2 (2020): i709–i717. http://dx.doi.org/10.1093/bioinformatics/btaa791.
Brocki, Łukasz, and Krzysztof Marasek. "Deep Belief Neural Networks and Bidirectional Long-Short Term Memory Hybrid for Speech Recognition." Archives of Acoustics 40, no. 2 (2015): 191–95. http://dx.doi.org/10.1515/aoa-2015-0021.
Varshney, Abhishek, Samit Kumar Ghosh, Sibasankar Padhy, Rajesh Kumar Tripathy, and U. Rajendra Acharya. "Automated Classification of Mental Arithmetic Tasks Using Recurrent Neural Network and Entropy Features Obtained from Multi-Channel EEG Signals." Electronics 10, no. 9 (2021): 1079. http://dx.doi.org/10.3390/electronics10091079.
Mahmoud, Adnen, and Mounir Zrigui. "BLSTM-API: Bi-LSTM Recurrent Neural Network-Based Approach for Arabic Paraphrase Identification." Arabian Journal for Science and Engineering 46, no. 4 (2021): 4163–74. http://dx.doi.org/10.1007/s13369-020-05320-w.
Long, Haixia, Zhao Sun, Manzhi Li, Hai Yan Fu, and Ming Cai Lin. "Predicting Protein Phosphorylation Sites Based on Deep Learning." Current Bioinformatics 15, no. 4 (2020): 300–308. http://dx.doi.org/10.2174/1574893614666190902154332.
Gao, Shenghan, Changyan Zheng, Yicong Zhao, Ziyue Wu, Jiao Li, and Xian Huang. "Comparison of enhancement techniques based on neural networks for attenuated voice signal captured by flexible vibration sensors on throats." Nanotechnology and Precision Engineering 5, no. 1 (2022): 013001. http://dx.doi.org/10.1063/10.0009187.
Zulqarnain, Muhammad, Rozaida Ghazali, Yana Mazwin Mohmad Hassim, and Muhammad Rehan. "Text classification based on gated recurrent unit combines with support vector machine." International Journal of Electrical and Computer Engineering (IJECE) 10, no. 4 (2020): 3734. http://dx.doi.org/10.11591/ijece.v10i4.pp3734-3742.
Ziafat, Nishmia, Hafiz Farooq Ahmad, Iram Fatima, Muhammad Zia, Abdulaziz Alhumam, and Kashif Rajpoot. "Correct Pronunciation Detection of the Arabic Alphabet Using Deep Learning." Applied Sciences 11, no. 6 (2021): 2508. http://dx.doi.org/10.3390/app11062508.
Terra Vieira, Samuel, Renata Lopes Rosa, Demóstenes Zegarra Rodríguez, Miguel Arjona Ramírez, Muhammad Saadi, and Lunchakorn Wuttisittikulkij. "Q-Meter: Quality Monitoring System for Telecommunication Services Based on Sentiment Analysis Using Deep Learning." Sensors 21, no. 5 (2021): 1880. http://dx.doi.org/10.3390/s21051880.
Chhetri, Manoj, Sudhanshu Kumar, Partha Pratim Roy, and Byung-Gyu Kim. "Deep BLSTM-GRU Model for Monthly Rainfall Prediction: A Case Study of Simtokha, Bhutan." Remote Sensing 12, no. 19 (2020): 3174. http://dx.doi.org/10.3390/rs12193174.
Kumar, S., M. Anand Kumar, and K. P. Soman. "Deep Learning Based Part-of-Speech Tagging for Malayalam Twitter Data (Special Issue: Deep Learning Techniques for Natural Language Processing)." Journal of Intelligent Systems 28, no. 3 (2019): 423–35. http://dx.doi.org/10.1515/jisys-2017-0520.
Hou, Hongwei, Kunzhi Tang, Xiaoqian Liu, and Yue Zhou. "Application of Artificial Intelligence Technology Optimized by Deep Learning to Rural Financial Development and Rural Governance." Journal of Global Information Management 30, no. 7 (2022): 1–23. http://dx.doi.org/10.4018/jgim.289220.
Javeed, Danish, Tianhan Gao, Muhammad Taimoor Khan, and Ijaz Ahmad. "A Hybrid Deep Learning-Driven SDN Enabled Mechanism for Secure Communication in Internet of Things (IoT)." Sensors 21, no. 14 (2021): 4884. http://dx.doi.org/10.3390/s21144884.
Otte, S., L. Wittig, G. Hüttmann, et al. "Investigating Recurrent Neural Networks for OCT A-scan Based Tissue Analysis." Methods of Information in Medicine 53, no. 04 (2014): 245–49. http://dx.doi.org/10.3414/me13-01-0135.
Feigl, Tobias, Sebastian Kram, Philipp Woller, Ramiz H. Siddiqui, Michael Philippsen, and Christopher Mutschler. "RNN-Aided Human Velocity Estimation from a Single IMU." Sensors 20, no. 13 (2020): 3656. http://dx.doi.org/10.3390/s20133656.
Lynn, Htet Myet, Pankoo Kim, and Sung Bum Pan. "Data Independent Acquisition Based Bi-Directional Deep Networks for Biometric ECG Authentication." Applied Sciences 11, no. 3 (2021): 1125. http://dx.doi.org/10.3390/app11031125.
Zheng, Chunjun, Chunli Wang, and Ning Jia. "An Ensemble Model for Multi-Level Speech Emotion Recognition." Applied Sciences 10, no. 1 (2019): 205. http://dx.doi.org/10.3390/app10010205.
Qin, Tianyun, Rangding Wang, Diqun Yan, and Lang Lin. "Source Cell-Phone Identification in the Presence of Additive Noise from CQT Domain." Information 9, no. 8 (2018): 205. http://dx.doi.org/10.3390/info9080205.
Grossberg, Stephen. "Recurrent neural networks." Scholarpedia 8, no. 2 (2013): 1888. http://dx.doi.org/10.4249/scholarpedia.1888.
Bitzer, Sebastian, and Stefan J. Kiebel. "Recognizing recurrent neural networks (rRNN): Bayesian inference for recurrent neural networks." Biological Cybernetics 106, no. 4-5 (2012): 201–17. http://dx.doi.org/10.1007/s00422-012-0490-x.
Schuster, M., and K. K. Paliwal. "Bidirectional recurrent neural networks." IEEE Transactions on Signal Processing 45, no. 11 (1997): 2673–81. http://dx.doi.org/10.1109/78.650093.
Passricha, Vishal, and Rajesh Kumar Aggarwal. "A Hybrid of Deep CNN and Bidirectional LSTM for Automatic Speech Recognition." Journal of Intelligent Systems 29, no. 1 (2019): 1261–74. http://dx.doi.org/10.1515/jisys-2018-0372.
Ma, Xiao, Peter Karkus, David Hsu, and Wee Sun Lee. "Particle Filter Recurrent Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (2020): 5101–8. http://dx.doi.org/10.1609/aaai.v34i04.5952.
Kawamura, Yoshiaki. "Learning for Recurrent Neural Networks." Journal of Japan Society for Fuzzy Theory and Systems 7, no. 1 (1995): 52–56. http://dx.doi.org/10.3156/jfuzzy.7.1_52.
Sutskever, Ilya, and Geoffrey Hinton. "Temporal-Kernel Recurrent Neural Networks." Neural Networks 23, no. 2 (2010): 239–43. http://dx.doi.org/10.1016/j.neunet.2009.10.009.
Gavaldà, Ricard, and Hava T. Siegelmann. "Discontinuities in Recurrent Neural Networks." Neural Computation 11, no. 3 (1999): 715–45. http://dx.doi.org/10.1162/089976699300016638.
Chen, Jinmiao, and N. S. Chaudhari. "Segmented-Memory Recurrent Neural Networks." IEEE Transactions on Neural Networks 20, no. 8 (2009): 1267–80. http://dx.doi.org/10.1109/tnn.2009.2022980.
Samuelides, M., and B. Cessac. "Random recurrent neural networks dynamics." European Physical Journal Special Topics 142, no. 1 (2007): 89–122. http://dx.doi.org/10.1140/epjst/e2007-00059-1.
Cheng, Chang-Yuan, Kuang-Hui Lin, and Chih-Wen Shih. "Multistability in Recurrent Neural Networks." SIAM Journal on Applied Mathematics 66, no. 4 (2006): 1301–20. http://dx.doi.org/10.1137/050632440.
Ruiz, Luana, Fernando Gama, and Alejandro Ribeiro. "Gated Graph Recurrent Neural Networks." IEEE Transactions on Signal Processing 68 (2020): 6303–18. http://dx.doi.org/10.1109/tsp.2020.3033962.
Santini, Simone, Alberto Del Bimbo, and Ramesh Jain. "Block-structured recurrent neural networks." Neural Networks 8, no. 1 (1995): 135–47. http://dx.doi.org/10.1016/0893-6080(94)00060-y.
Hunt, Andrew. "Recurrent neural networks for syllabification." Speech Communication 13, no. 3-4 (1993): 323–32. http://dx.doi.org/10.1016/0167-6393(93)90031-f.
White, Halbert. "Learning in recurrent neural networks." Mathematical Social Sciences 22, no. 1 (1991): 102–3. http://dx.doi.org/10.1016/0165-4896(91)90073-z.
Imam, Nabil. "Wiring up recurrent neural networks." Nature Machine Intelligence 3, no. 9 (2021): 740–41. http://dx.doi.org/10.1038/s42256-021-00391-2.
Pu, Yi-Fei, Zhang Yi, and Ji-Liu Zhou. "Fractional Hopfield Neural Networks: Fractional Dynamic Associative Recurrent Neural Networks." IEEE Transactions on Neural Networks and Learning Systems 28, no. 10 (2017): 2319–33. http://dx.doi.org/10.1109/tnnls.2016.2582512.
Shchetinin, Eugene Yu, and Leonid Sevastianov. "Improving the Learning Power of Artificial Intelligence Using Multimodal Deep Learning." EPJ Web of Conferences 248 (2021): 01017. http://dx.doi.org/10.1051/epjconf/202124801017.
Tamura, Akihiro, Taro Watanabe, and Eiichiro Sumita. "Recurrent Neural Networks for Word Alignment." Journal of Natural Language Processing 22, no. 4 (2015): 289–312. http://dx.doi.org/10.5715/jnlp.22.289.
Park, Sungrae, Kyungwoo Song, Mingi Ji, Wonsung Lee, and Il-Chul Moon. "Adversarial Dropout for Recurrent Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 4699–706. http://dx.doi.org/10.1609/aaai.v33i01.33014699.
Mu, Yangzi, Mengxing Huang, Chunyang Ye, and Qingzhou Wu. "Diagnosis Prediction via Recurrent Neural Networks." International Journal of Machine Learning and Computing 8, no. 2 (2018): 117–20. http://dx.doi.org/10.18178/ijmlc.2018.8.2.673.
S B, Chandini. "Intrusion Detection using Recurrent Neural Networks." International Journal for Research in Applied Science and Engineering Technology 8, no. 6 (2020): 2050–52. http://dx.doi.org/10.22214/ijraset.2020.6335.
Freitag, Steffen, Wolfgang Graf, and Michael Kaliske. "Recurrent neural networks for fuzzy data." Integrated Computer-Aided Engineering 18, no. 3 (2011): 265–80. http://dx.doi.org/10.3233/ica-2011-0373.
Garzon, Max, and Fernanda Botelho. "Dynamical approximation by recurrent neural networks." Neurocomputing 29, no. 1-3 (1999): 25–46. http://dx.doi.org/10.1016/s0925-2312(99)00114-9.
Dobnikar, Andrej, and Branko Šter. "Structural Properties of Recurrent Neural Networks." Neural Processing Letters 29, no. 2 (2009): 75–88. http://dx.doi.org/10.1007/s11063-009-9096-2.