To see the other types of publications on this topic, follow the link: LSTM Neural networks.

Journal articles on the topic 'LSTM Neural networks'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 journal articles for your research on the topic 'LSTM Neural networks.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Yu, Yong, Xiaosheng Si, Changhua Hu, and Jianxun Zhang. "A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures." Neural Computation 31, no. 7 (2019): 1235–70. http://dx.doi.org/10.1162/neco_a_01199.

Full text
Abstract:
Recurrent neural networks (RNNs) have been widely adopted in research areas concerned with sequential data, such as text, audio, and video. However, RNNs consisting of sigma cells or tanh cells are unable to learn the relevant information of input data when the input gap is large. By introducing gate functions into the cell structure, the long short-term memory (LSTM) could handle the problem of long-term dependencies well. Since its introduction, almost all the exciting results based on RNNs have been achieved by the LSTM. The LSTM has become the focus of deep learning. We review the LSTM cel
APA, Harvard, Vancouver, ISO, and other styles
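As background for the gate functions this review discusses, the standard LSTM cell can be written as follows (a generic restatement of the widely used formulation, not an excerpt from the article; W, U, and b denote learned weights and biases, σ the logistic sigmoid, and ⊙ the element-wise product):

```latex
\begin{aligned}
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) && \text{forget gate}\\
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) && \text{input gate}\\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) && \text{output gate}\\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) && \text{candidate state}\\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t && \text{cell state update}\\
h_t &= o_t \odot \tanh(c_t) && \text{hidden state}
\end{aligned}
```

Because the cell state c_t is carried forward additively rather than through repeated squashing, gradients can survive across long input gaps, which is the property the review credits for the LSTM's handling of long-term dependencies.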
2

Chen, Huimin, Liyong Wang, Yangyang Xu, et al. "State of Charge Estimation for Lithium-ion Battery Using Long Short-Term Memory Networks." Journal of Physics: Conference Series 2890, no. 1 (2024): 012024. http://dx.doi.org/10.1088/1742-6596/2890/1/012024.

Full text
Abstract:
Accurate estimation of the State of Charge (SOC) in lithium-ion batteries is crucial for enhancing performance and extending battery life, especially in applications like electric vehicles and energy storage systems. This study introduces a novel method for SOC estimation that utilizes Long Short-Term Memory (LSTM) neural networks. To evaluate the LSTM model’s effectiveness, we compared its performance with that of Backpropagation (BP) neural networks and Recurrent Neural Networks (RNN) using the Root Mean Square Error (RMSE) as the evaluation metric. The findings reveal that the LSTM
APA, Harvard, Vancouver, ISO, and other styles
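As a minimal sketch of the kind of sequence-to-value regressor entry 2 compares against BP and RNN baselines (this is not the authors' implementation; the window length, layer sizes, and the synthetic data below are assumptions made for illustration):

```python
import numpy as np
from tensorflow import keras

# Hypothetical data: 1000 windows of 60 time steps x 3 features
# (voltage, current, temperature), each labelled with an SOC value in [0, 1].
X = np.random.rand(1000, 60, 3).astype("float32")
y = np.random.rand(1000, 1).astype("float32")

model = keras.Sequential([
    keras.layers.Input(shape=(60, 3)),
    keras.layers.LSTM(64),                        # summarises the measurement window
    keras.layers.Dense(1, activation="sigmoid"),  # SOC expressed as a fraction
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)

pred = model.predict(X[:100], verbose=0)
rmse = float(np.sqrt(np.mean((pred - y[:100]) ** 2)))  # RMSE, the metric used in the paper
print(f"RMSE: {rmse:.4f}")
```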
3

Bakir, Houda, Ghassen Chniti, and Hédi Zaher. "E-Commerce Price Forecasting Using LSTM Neural Networks." International Journal of Machine Learning and Computing 8, no. 2 (2018): 169–74. http://dx.doi.org/10.18178/ijmlc.2018.8.2.682.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Burges, Entesar T., Zakariya A. Oraibi, and Ali Wali. "Gait Recognition Using Hybrid LSTM-CNN Deep Neural Networks." Journal of Image and Graphics 12, no. 2 (2024): 168–75. http://dx.doi.org/10.18178/joig.12.2.168-175.

Full text
Abstract:
Identifying individuals based on their gait is a crucial aspect of biometric authentication. It is complicated by several factors, such as altering one’s walking posture, donning a coat, and wearing high heels. With the advent of artificial intelligence, deep learning, in particular, has made significant strides in this area. The conditional Generative Adversarial Network (cGAN), together with hybrid Long Short-Term Memory (LSTM) and Convolutional Neural Networks (CNNs), are used in this research to create images using a novel technique. The framework comprises three parts. The first involves
APA, Harvard, Vancouver, ISO, and other styles
5

Liu, David, and An Wei. "Regulated LSTM Artificial Neural Networks for Option Risks." FinTech 1, no. 2 (2022): 180–90. http://dx.doi.org/10.3390/fintech1020014.

Full text
Abstract:
This research aims to study the pricing risks of options by using improved LSTM artificial neural network models and make direct comparisons with the Black–Scholes option pricing model based upon the option prices of 50 ETFs of the Shanghai Securities Exchange from 1 January 2018 to 31 December 2019. We study an LSTM model, a mathematical option pricing model (BS model), and an improved artificial neural network model—the regulated LSTM model. The method we adopted is first to price the options using the mathematical model—i.e., the BS model—and then to construct the LSTM neural network for tr
APA, Harvard, Vancouver, ISO, and other styles
6

Wan, Yingliang, Hong Tao, and Li Ma. "Forecasting Zhejiang Province's GDP Using a CNN-LSTM Model." Frontiers in Business, Economics and Management 13, no. 3 (2024): 233–35. http://dx.doi.org/10.54097/bmq2dy63.

Full text
Abstract:
Zhejiang province has experienced notable economic growth in recent years. Despite this, achieving sustainable high-quality economic development presents complex challenges and uncertainties. This study employs advanced neural network methodologies, including Convolutional Neural Networks (CNN), Long Short-Term Memory networks (LSTM), and an integrated CNN-LSTM model, to predict Zhejiang's economic trajectory. Our empirical analysis demonstrates the proficiency of neural networks in delivering reasonably precise economic forecasts, despite inherent prediction residuals. A comparative assessmen
APA, Harvard, Vancouver, ISO, and other styles
7

Kalinin, Maxim, Vasiliy Krundyshev, and Evgeny Zubkov. "Estimation of applicability of modern neural network methods for preventing cyberthreats to self-organizing network infrastructures of digital economy platforms." SHS Web of Conferences 44 (2018): 00044. http://dx.doi.org/10.1051/shsconf/20184400044.

Full text
Abstract:
The problems of applying neural network methods for solving problems of preventing cyberthreats to flexible self-organizing network infrastructures of digital economy platforms: vehicle adhoc networks, wireless sensor networks, industrial IoT, “smart buildings” and “smart cities” are considered. The applicability of the classic perceptron neural network, recurrent, deep, LSTM neural networks and neural networks ensembles in the restricting conditions of fast training and big data processing are estimated. The use of neural networks with a complex architecture– recurrent and LSTM neural network
APA, Harvard, Vancouver, ISO, and other styles
8

Wan, Huaiyu, Shengnan Guo, Kang Yin, Xiaohui Liang, and Youfang Lin. "CTS-LSTM: LSTM-based neural networks for correlated time series prediction." Knowledge-Based Systems 191 (March 2020): 105239. http://dx.doi.org/10.1016/j.knosys.2019.105239.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Kande, Jayanth. "Twitter Sentiment Analysis with LSTM Neural Networks." REST Journal on Data Analytics and Artificial Intelligence 3, no. 3 (2024): 92–98. http://dx.doi.org/10.46632/jdaai/3/3/11.

Full text
Abstract:
This project delves into sentiment analysis on Twitter using Long Short-Term Memory (LSTM) Neural Networks in conjunction with Global Vectors for Word Representation (GloVe). The study explores the properties of tweets, preprocessing steps, and applying GloVe embeddings to map words to vectors. The classifier’s design and training parameters are detailed, and the results are compared with baselines, revealing the LSTM’s superiority in handling sequential language data. Furthermore, trials explore how changing the quantity of fully connected layers and LSTM time steps affects accuracy. The fin
APA, Harvard, Vancouver, ISO, and other styles
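The pipeline in entry 9 — pre-trained GloVe vectors feeding an LSTM classifier — commonly takes the following shape (a generic illustration, not the project's code; the vocabulary size, dimensions, and the random stand-in for the GloVe matrix are assumptions):

```python
import numpy as np
from tensorflow import keras

VOCAB_SIZE, EMBED_DIM, MAX_LEN = 20000, 100, 50  # placeholder sizes

# In practice this matrix holds pre-trained GloVe vectors, one row per word in the
# tweet tokenizer's vocabulary; random values stand in for them here.
glove_matrix = np.random.normal(size=(VOCAB_SIZE, EMBED_DIM)).astype("float32")

embedding = keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM, trainable=False)  # frozen word vectors
model = keras.Sequential([
    keras.layers.Input(shape=(MAX_LEN,)),
    embedding,
    keras.layers.LSTM(128),                       # sequential modelling of the tweet
    keras.layers.Dense(1, activation="sigmoid"),  # positive / negative sentiment
])
embedding.set_weights([glove_matrix])             # load the (stand-in) GloVe weights
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```

Freezing the embedding keeps the word vectors fixed, so the comparatively small tweet corpus only has to fit the LSTM and classification weights.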
10

Shewalkar, Apeksha, Deepika Nyavanandi, and Simone A. Ludwig. "Performance Evaluation of Deep Neural Networks Applied to Speech Recognition: RNN, LSTM and GRU." Journal of Artificial Intelligence and Soft Computing Research 9, no. 4 (2019): 235–45. http://dx.doi.org/10.2478/jaiscr-2019-0006.

Full text
Abstract:
Deep Neural Networks (DNN) are nothing but neural networks with many hidden layers. DNNs are becoming popular in automatic speech recognition tasks which combines a good acoustic with a language model. Standard feedforward neural networks cannot handle speech data well since they do not have a way to feed information from a later layer back to an earlier layer. Thus, Recurrent Neural Networks (RNNs) have been introduced to take temporal dependencies into account. However, the shortcoming of RNNs is that long-term dependencies due to the vanishing/exploding gradient problem cannot be h
APA, Harvard, Vancouver, ISO, and other styles
11

Du, Shaohui, Zhenghan Chen, Haoyan Wu, Yihong Tang, and YuanQing Li. "Image Recommendation Algorithm Combined with Deep Neural Network Designed for Social Networks." Complexity 2021 (July 2, 2021): 1–9. http://dx.doi.org/10.1155/2021/5196190.

Full text
Abstract:
In recent years, deep neural networks have achieved great success in many fields, such as computer vision and natural language processing. Traditional image recommendation algorithms use text-based recommendation methods. The process of displaying images requires a lot of time and labor, and the time-consuming labor is inefficient. Therefore, this article mainly studies image recommendation algorithms based on deep neural networks in social networks. First, according to the time stamp information of the dataset, the interaction records of each user are sorted by the closest time. Then, some fe
APA, Harvard, Vancouver, ISO, and other styles
12

Zhang, Chuanwei, Xusheng Xu, Yikun Li, Jing Huang, Chenxi Li, and Weixin Sun. "Research on SOC Estimation Method for Lithium-Ion Batteries Based on Neural Network." World Electric Vehicle Journal 14, no. 10 (2023): 275. http://dx.doi.org/10.3390/wevj14100275.

Full text
Abstract:
With the increasingly serious problem of environmental pollution, new energy vehicles have become a hot spot in today’s research. The lithium-ion battery has become the mainstream power battery of new energy vehicles as it has the advantages of long service life, high-rated voltage, low self-discharge rate, etc. The battery management system is the key part that ensures the efficient and safe operation of the vehicle as well as the long life of the power battery. The accurate estimation of the power battery state directly affects the whole vehicle’s performance. As a result, this paper establi
APA, Harvard, Vancouver, ISO, and other styles
13

Vlachas, Pantelis R., Wonmin Byeon, Zhong Y. Wan, Themistoklis P. Sapsis, and Petros Koumoutsakos. "Data-driven forecasting of high-dimensional chaotic systems with long short-term memory networks." Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 474, no. 2213 (2018): 20170844. http://dx.doi.org/10.1098/rspa.2017.0844.

Full text
Abstract:
We introduce a data-driven forecasting method for high-dimensional chaotic systems using long short-term memory (LSTM) recurrent neural networks. The proposed LSTM neural networks perform inference of high-dimensional dynamical systems in their reduced order space and are shown to be an effective set of nonlinear approximators of their attractor. We demonstrate the forecasting performance of the LSTM and compare it with Gaussian processes (GPs) in time series obtained from the Lorenz 96 system, the Kuramoto–Sivashinsky equation and a prototype climate model. The LSTM networks outperform the GP
APA, Harvard, Vancouver, ISO, and other styles
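To make the forecasting setup of entry 13 concrete, the sketch below trains an LSTM for next-step prediction on a Lorenz-96 trajectory; it is illustrative only and does not reproduce the paper's reduced-order-space inference or its Gaussian-process comparison (the integration scheme, window length, and network size are assumptions):

```python
import numpy as np
from tensorflow import keras

# Lorenz-96 dynamics (a standard chaotic benchmark): dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F
def lorenz96_step(x, F=8.0, dt=0.01):
    dxdt = (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F
    return x + dt * dxdt  # simple Euler step, adequate for an illustration

states, x = [], np.random.rand(40)
for _ in range(5000):
    x = lorenz96_step(x)
    states.append(x)
traj = np.array(states, dtype="float32")

# Frame forecasting as sequence-to-vector regression: 20 past states -> next state.
window = 20
X = np.stack([traj[i:i + window] for i in range(len(traj) - window)])
y = traj[window:]

model = keras.Sequential([
    keras.layers.Input(shape=(window, 40)),
    keras.layers.LSTM(64),
    keras.layers.Dense(40),  # one output per Lorenz-96 variable
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=64, verbose=0)
```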
14

Pal, Subarno, Soumadip Ghosh, and Amitava Nag. "Sentiment Analysis in the Light of LSTM Recurrent Neural Networks." International Journal of Synthetic Emotions 9, no. 1 (2018): 33–39. http://dx.doi.org/10.4018/ijse.2018010103.

Full text
Abstract:
Long short-term memory (LSTM) is a special type of recurrent neural network (RNN) architecture that was designed over simple RNNs for modeling temporal sequences and their long-range dependencies more accurately. In this article, the authors work with different types of LSTM architectures for sentiment analysis of movie reviews. It has been shown that LSTM RNNs are more effective than deep neural networks and conventional RNNs for sentiment analysis. Here, the authors explore different architectures associated with LSTM models to study their relative performance on sentiment analysis. A simpl
APA, Harvard, Vancouver, ISO, and other styles
15

Jiang, Yun. "Deep learning-based automatic modulation recognition: Combination of CNN and LSTM neural network." Advances in Engineering Innovation 16, no. 4 (2025). https://doi.org/10.54254/2977-3903/2025.22437.

Full text
Abstract:
With the deepening development of communication technology, the technology of automatic modulation and recognition of communication signals has been more and more widely used in military and civilian fields. This paper mainly studies the implementation of automatic modulation recognition using deep learning as a computing tool, focusing on CNN neural network and LSTM neural network, and conducting simulation experiments on public data sets. Based on the original CNN neural network, this paper introduces the structure of LSTM neural network and combines the advantages of the two types of neural
APA, Harvard, Vancouver, ISO, and other styles
16

Sridhar, C., and Aniruddha Kanhe. "Performance Comparison of Various Neural Networks for Speech Recognition." Journal of Physics: Conference Series 2466, no. 1 (2023): 012008. http://dx.doi.org/10.1088/1742-6596/2466/1/012008.

Full text
Abstract:
Speech recognition is a method where an audio signal is translated into text, words, or commands and also tells how the speech is recognized. Recently, many deep learning models have been adopted for automatic speech recognition and proved more effective than traditional machine learning methods like Artificial Neural Networks (ANN). This work examines the efficient learning architectures of features by different deep neural networks. In this paper, five neural network models, namely, CNN, LSTM, Bi-LSTM, GRU, and CONV-LSTM, are considered for the comparative study. We trained the networks using Audio
APA, Harvard, Vancouver, ISO, and other styles
17

Assaad, Rayan H., and Sara Fayek. "Predicting the Price of Crude Oil and its Fluctuations Using Computational Econometrics: Deep Learning, LSTM, and Convolutional Neural Networks." Econometric Research in Finance 6, no. 2 (2021): 119–37. http://dx.doi.org/10.2478/erfin-2021-0006.

Full text
Abstract:
There has been a renewed interest in accurately forecasting the price of crude oil and its fluctuations. That said, this paper aims to study whether the price of crude oil in the United States (US) could be predicted using the stock prices of the top information technology companies. To this end, time-series data was collected and pre-processed as needed, and three architectures of computational neural networks were tested: deep neural networks, long-short term memory (LSTM) neural networks, and a combination of convolutional and LSTM neural networks. The findings suggest that LSTM ne
APA, Harvard, Vancouver, ISO, and other styles
19

Lee, Jaekyung, Hyunwoo Kim, and Hyungkyoo Kim. "Commercial Vacancy Prediction Using LSTM Neural Networks." Sustainability 13, no. 10 (2021): 5400. http://dx.doi.org/10.3390/su13105400.

Full text
Abstract:
Previous studies on commercial vacancy have mostly focused on the survival rate of commercial buildings over a certain time frame and the cause of their closure, due to a lack of appropriate data. Based on a time-series of 2,940,000 individual commercial facility data, the main purpose of this research is two-fold: (1) to examine long short-term memory (LSTM) as a feasible option for predicting trends in commercial districts and (2) to identify the influence of each variable on prediction results for establishing evidence-based decision-making on the primary influences of commercial vacancy. T
APA, Harvard, Vancouver, ISO, and other styles
20

Khalil, Kasem, Omar Eldash, Ashok Kumar, and Magdy Bayoumi. "Economic LSTM Approach for Recurrent Neural Networks." IEEE Transactions on Circuits and Systems II: Express Briefs 66, no. 11 (2019): 1885–89. http://dx.doi.org/10.1109/tcsii.2019.2924663.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Ergen, Tolga, and Suleyman Serdar Kozat. "Unsupervised Anomaly Detection With LSTM Neural Networks." IEEE Transactions on Neural Networks and Learning Systems 31, no. 8 (2020): 3127–41. http://dx.doi.org/10.1109/tnnls.2019.2935975.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Keer, Rabpreet Singh. "Handwriting generation using recurrent neural networks (LSTM)." International Journal of Scientific Development and Research 8, no. 9 (2023): 1085–109. https://doi.org/10.5281/zenodo.10446335.

Full text
Abstract:
Handwriting is a skill developed by humans from a very early stage in order to represent their thoughts visually using letters and making meaningful words and sentences. Every person improves this skill by practicing and developing their own style of writing. Because of the distinctiveness of handwriting style, it is frequently used as a measure to identify a forgery. Even though the applications of synthesizing handwriting are limited, this problem can be generalized and can be functionally applied to other more practical problems. Mimicking or imitating a specific handwriting style can
APA, Harvard, Vancouver, ISO, and other styles
23

Zhang, Chun-Xiang, Shu-Yang Pang, Xue-Yao Gao, Jia-Qi Lu, and Bo Yu. "Attention Neural Network for Biomedical Word Sense Disambiguation." Discrete Dynamics in Nature and Society 2022 (January 10, 2022): 1–14. http://dx.doi.org/10.1155/2022/6182058.

Full text
Abstract:
In order to improve the disambiguation accuracy of biomedical words, this paper proposes a disambiguation method based on the attention neural network. The biomedical word is viewed as the center. Morphology, part of speech, and semantic information from 4 adjacent lexical units are extracted as disambiguation features. The attention layer is used to generate a feature matrix. Average asymmetric convolutional neural networks (Av-ACNN) and bidirectional long short-term memory (Bi-LSTM) networks are utilized to extract features. The softmax function is applied to determine the semantic category
APA, Harvard, Vancouver, ISO, and other styles
24

Yu, Dian, and Shouqian Sun. "A Systematic Exploration of Deep Neural Networks for EDA-Based Emotion Recognition." Information 11, no. 4 (2020): 212. http://dx.doi.org/10.3390/info11040212.

Full text
Abstract:
Subject-independent emotion recognition based on physiological signals has become a research hotspot. Previous research has proved that electrodermal activity (EDA) signals are an effective data resource for emotion recognition. Benefiting from their great representation ability, an increasing number of deep neural networks have been applied for emotion recognition, and they can be classified as a Convolutional Neural Network (CNN), a Recurrent Neural Network (RNN), or a combination of these (CNN+RNN). However, there has been no systematic research on the predictive power and configurations of
APA, Harvard, Vancouver, ISO, and other styles
25

Zhao, Yuxiao, Leyu Lin, and Alois K. Schlarb. "Long Short-Term Memory Networks for the Automated Identification of the Stationary Phase in Tribological Experiments." Lubricants 12, no. 12 (2024): 423. https://doi.org/10.3390/lubricants12120423.

Full text
Abstract:
This study outlines the development and optimization of a Long Short-Term Memory (LSTM) network designed to analyze and classify time-series data from tribological experiments, with a particular emphasis on identifying stationary phases. The process of fine-tuning key hyperparameters was systematically optimized through Bayesian optimization, coupled with K-fold cross-validation to minimize the inherent randomness associated with training neural networks. The refined LSTM network achieved a weighted average accuracy of 84%, demonstrating a high level of agreement between the network’s identifi
APA, Harvard, Vancouver, ISO, and other styles
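Entry 25 couples Bayesian hyperparameter search with K-fold cross-validation; the cross-validation half alone can be sketched as follows (the Bayesian optimiser and the real tribological features are omitted; the shapes, labels, and the fixed `units` value are illustrative assumptions):

```python
import numpy as np
from sklearn.model_selection import KFold
from tensorflow import keras

# Placeholder windows (200 samples, 100 steps, 4 sensor channels) with binary
# stationary / non-stationary labels standing in for the tribological data.
X = np.random.rand(200, 100, 4).astype("float32")
y = np.random.randint(0, 2, size=(200,))

def build_model(units):  # `units` would be proposed by the Bayesian optimiser
    model = keras.Sequential([
        keras.layers.Input(shape=(100, 4)),
        keras.layers.LSTM(units),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

scores = []
for train_idx, val_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    model = build_model(units=32)
    model.fit(X[train_idx], y[train_idx], epochs=3, verbose=0)
    scores.append(model.evaluate(X[val_idx], y[val_idx], verbose=0)[1])  # fold accuracy
print(f"mean accuracy over folds: {np.mean(scores):.3f}")  # averaging dampens training randomness
```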
26

Pan, Yu, Jing Xu, Maolin Wang, et al. "Compressing Recurrent Neural Networks with Tensor Ring for Action Recognition." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 4683–90. http://dx.doi.org/10.1609/aaai.v33i01.33014683.

Full text
Abstract:
Recurrent Neural Networks (RNNs) and their variants, such as Long-Short Term Memory (LSTM) networks, and Gated Recurrent Unit (GRU) networks, have achieved promising performance in sequential data modeling. The hidden layers in RNNs can be regarded as the memory units, which are helpful in storing information in sequential contexts. However, when dealing with high dimensional input data, such as video and text, the input-to-hidden linear transformation in RNNs brings high memory usage and huge computational cost. This makes the training of RNNs very difficult. To address this challenge, we pro
APA, Harvard, Vancouver, ISO, and other styles
27

Zhou, Lixia, Xia Chen, Runsha Dong, and Shan Yang. "Hotspots Prediction Based on LSTM Neural Network for Cellular Networks." Journal of Physics: Conference Series 1624 (October 2020): 052016. http://dx.doi.org/10.1088/1742-6596/1624/5/052016.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Varma, Danthuluru Sri Datta Manikanta. "ActiWise: Insight on Human Activity Recognition Using Deep Learning Approaches." International Journal of Scientific Research in Engineering and Management 08, no. 05 (2024): 1–5. http://dx.doi.org/10.55041/ijsrem32830.

Full text
Abstract:
In this study, we investigate the fusion of Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks for human activity recognition (HAR). By integrating hierarchical spatial features extracted by CNNs with LSTM networks' temporal modelling capabilities, our approach excels in discerning nuanced patterns from raw sensor data collected via wearable devices. Through rigorous experimentation and validation, our CNN+LSTM model demonstrates robust performance in accurately classifying a spectrum of human activities. This research advances HAR methodologies, shedding light on
APA, Harvard, Vancouver, ISO, and other styles
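A common way to realise the CNN + LSTM fusion described in entry 28 is to let one-dimensional convolutions extract local features from each sensor window before an LSTM models the longer-range temporal structure (a generic sketch; the channel count, window length, and six activity classes are assumptions, not details from the paper):

```python
from tensorflow import keras

# Hypothetical input: windows of 128 accelerometer/gyroscope samples x 9 channels,
# classified into 6 activities (walking, sitting, standing, ...).
model = keras.Sequential([
    keras.layers.Input(shape=(128, 9)),
    keras.layers.Conv1D(64, kernel_size=5, activation="relu"),  # local motion patterns
    keras.layers.MaxPooling1D(2),
    keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
    keras.layers.MaxPooling1D(2),
    keras.layers.LSTM(64),                        # longer-range temporal dependencies
    keras.layers.Dense(6, activation="softmax"),  # activity classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
```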
29

Song, Dazhi. "Stock Price Prediction based on Time Series Model and Long Short-term Memory Method." Highlights in Business, Economics and Management 24 (January 22, 2024): 1203–10. http://dx.doi.org/10.54097/e75xgk49.

Full text
Abstract:
This study conducts a comparative analysis of two prominent methodologies, Time Series Analysis and Long Short-Term Memory Neural Networks (LSTM), for the prediction of stock prices, utilizing historical data from Netflix. The primary purpose of conducting this research is to evaluate their efficacy in terms of predictive accuracy. Time Series Analysis encompasses stationarity tests, rolling statistics, and the application of the Autoregressive Integrated Moving Average model. In contrast, LSTM Neural Networks involve data normalization, reshaping, and the development of LSTM-based models. Per
APA, Harvard, Vancouver, ISO, and other styles
30

Kłosowski, Grzegorz, and Tomasz Rymarczyk. "Application of Convolutional Neural Networks in Wall Moisture Identification by EIT Method." Informatyka, Automatyka, Pomiary w Gospodarce i Ochronie Środowiska 12, no. 1 (2022): 20–23. http://dx.doi.org/10.35784/iapgos.2883.

Full text
Abstract:
The article presents the results of research in the area of using deep neural networks to identify moisture inside the walls of buildings using electrical impedance tomography. Two deep neural networks were used to transform the input measurements into images of damp places - convolutional neural networks (CNN) and recurrent long short-term memory networks LSTM. After training both models, a comparative assessment of the results obtained thanks to them was made. The conclusions show that both models are highly utilitarian in the analyzed problem. However, slightly better results were obtained
APA, Harvard, Vancouver, ISO, and other styles
31

Gers, Felix A., Jürgen Schmidhuber, and Fred Cummins. "Learning to Forget: Continual Prediction with LSTM." Neural Computation 12, no. 10 (2000): 2451–71. http://dx.doi.org/10.1162/089976600300015015.

Full text
Abstract:
Long short-term memory (LSTM; Hochreiter & Schmidhuber, 1997) can solve numerous tasks not solvable by previous learning algorithms for recurrent neural networks (RNNs). We identify a weakness of LSTM networks processing continual input streams that are not a priori segmented into subsequences with explicitly marked ends at which the network's internal state could be reset. Without resets, the state may grow indefinitely and eventually cause the network to break down. Our remedy is a novel, adaptive “forget gate” that enables an LSTM cell to learn to reset itself at appropriate times, thus
APA, Harvard, Vancouver, ISO, and other styles
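Schematically, the forget gate introduced in this paper changes the cell-state update to (our restatement of the standard formulation, not a quotation from the article):

```latex
c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t,
\qquad f_t = \sigma(W_f x_t + U_f h_{t-1} + b_f)
```

When f_t is driven towards 0 the accumulated state is effectively erased, which addresses the unbounded state growth on continual input streams described in the abstract; when f_t stays near 1 the original LSTM behaviour is recovered.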
32

Victor, Nancy, and Daphne Lopez. "sl-LSTM." International Journal of Grid and High Performance Computing 12, no. 3 (2020): 1–16. http://dx.doi.org/10.4018/ijghpc.2020070101.

Full text
Abstract:
The volume of data in diverse data formats from various data sources has led the way for a new drift in the digital world, Big Data. This article proposes sl-LSTM (sequence labelling LSTM), a neural network architecture that combines the effectiveness of typical LSTM models to perform sequence labeling tasks. This is a bi-directional LSTM which uses stochastic gradient descent optimization and combines two features of the existing LSTM variants: coupled input-forget gates for reducing the computational complexity and peephole connections that allow all gates to inspect the current cell state.
APA, Harvard, Vancouver, ISO, and other styles
33

You, Yue, Woo-Hyoung Kim, and Yong-Seok Cho. "Stock Market Prediction Based on LSTM Neural Networks." Korea International Trade Research Institute 19, no. 2 (2023): 391–407. http://dx.doi.org/10.16980/jitc.19.2.202304.391.

Full text
Abstract:
Purpose – This study aims to more accurately and effectively predict trends in portfolio prices by building a model using LSTM neural networks, and investigating the risk and profit prediction of investment portfolios. Design/Methodology/Approach – To obtain a return on stocks, this study used 60 monthly transaction data from major countries, including the United States and Korea, for five ETFs, BNDX, BND, VXUS, VTI, and 122630.KS, for five years from January 2016 to December of 2021. In addition, a related portfolio was constructed using modern portfolio theory. Through Min-Max normalization,
APA, Harvard, Vancouver, ISO, and other styles
34

Chuang, Chia-Chun, Chien-Ching Lee, Chia-Hong Yeng, Edmund-Cheung So, and Yeou-Jiunn Chen. "Attention Mechanism-Based Convolutional Long Short-Term Memory Neural Networks to Electrocardiogram-Based Blood Pressure Estimation." Applied Sciences 11, no. 24 (2021): 12019. http://dx.doi.org/10.3390/app112412019.

Full text
Abstract:
Monitoring people’s blood pressure can effectively prevent blood pressure-related diseases. Therefore, providing a convenient and comfortable approach can effectively help patients in monitoring blood pressure. In this study, an attention mechanism-based convolutional long short-term memory (LSTM) neural network is proposed to easily estimate blood pressure. To easily and comfortably estimate blood pressure, electrocardiogram (ECG) and photoplethysmography (PPG) signals are acquired. To precisely represent the characteristics of ECG and PPG signals, the signals in the time and frequency domain
APA, Harvard, Vancouver, ISO, and other styles
35

Zhang, Kexian, and Min Hong. "Forecasting crude oil price using LSTM neural networks." Data Science in Finance and Economics 2, no. 3 (2022): 163–80. http://dx.doi.org/10.3934/dsfe.2022008.

Full text
Abstract:
As a key input factor in industrial production, the price volatility of crude oil often brings about economic volatility, so forecasting crude oil price has always been a pivotal issue in economics. In our study, we constructed an LSTM (short for Long Short-Term Memory neural network) model to conduct this forecasting based on data from February 1986 to May 2021. An ANN (short for Artificial Neural Network) model and a typical ARIMA (short for Autoregressive Integrated Moving Average) model are taken as the comparable models. The results show that, first, the LSTM mod
APA, Harvard, Vancouver, ISO, and other styles
36

Mienye, Ibomoiye Domor, Theo G. Swart, and George Obaido. "Recurrent Neural Networks: A Comprehensive Review of Architectures, Variants, and Applications." Information 15, no. 9 (2024): 517. http://dx.doi.org/10.3390/info15090517.

Full text
Abstract:
Recurrent neural networks (RNNs) have significantly advanced the field of machine learning (ML) by enabling the effective processing of sequential data. This paper provides a comprehensive review of RNNs and their applications, highlighting advancements in architectures, such as long short-term memory (LSTM) networks, gated recurrent units (GRUs), bidirectional LSTM (BiLSTM), echo state networks (ESNs), peephole LSTM, and stacked LSTM. The study examines the application of RNNs to different domains, including natural language processing (NLP), speech recognition, time series forecasting, auton
APA, Harvard, Vancouver, ISO, and other styles
37

Han, Shipeng, Zhen Meng, Xingcheng Zhang, and Yuepeng Yan. "Hybrid Deep Recurrent Neural Networks for Noise Reduction of MEMS-IMU with Static and Dynamic Conditions." Micromachines 12, no. 2 (2021): 214. http://dx.doi.org/10.3390/mi12020214.

Full text
Abstract:
Micro-electro-mechanical system inertial measurement unit (MEMS-IMU), a core component in many navigation systems, directly determines the accuracy of inertial navigation system; however, MEMS-IMU system is often affected by various factors such as environmental noise, electronic noise, mechanical noise and manufacturing error. These can seriously affect the application of MEMS-IMU used in different fields. Focus has been on MEMS gyro since it is an essential and, yet, complex sensor in MEMS-IMU which is very sensitive to noises and errors from the random sources. In this study, recurrent neur
APA, Harvard, Vancouver, ISO, and other styles
38

Li, Le meng, Peng Wang, Jie Li, and Gu Chao. "The power system fault detection and classification based on LSTM." Journal of Physics: Conference Series 2935, no. 1 (2025): 012033. https://doi.org/10.1088/1742-6596/2935/1/012033.

Full text
Abstract:
The stability of the power system and fault detection classification are essential components in achieving the intelligence of power systems. LSTM (Long Short-Term Memory) recurrent neural networks can effectively handle the temporal and nonlinear nature of load data, making them suitable for short-term power system fault detection and classification. A method for power system fault detection and classification based on LSTM recurrent neural networks is proposed. The results demonstrate the accuracy and convenience of the LSTM model.
APA, Harvard, Vancouver, ISO, and other styles
39

Hidri, Adel, Suleiman Ali Alsaif, Muteeb Alahmari, Eman AlShehri, and Minyar Sassi Hidri. "Opinion Mining and Analysis Using Hybrid Deep Neural Networks." Technologies 13, no. 5 (2025): 175. https://doi.org/10.3390/technologies13050175.

Full text
Abstract:
Understanding customer attitudes has become a critical component of decision-making due to the growing influence of social media and e-commerce. Text-based opinions are the most structured, hence playing an important role in sentiment analysis. Most of the existing methods, which include lexicon-based approaches and traditional machine learning techniques, are insufficient for handling contextual nuances and scalability. While the latter has limitations in model performance and generalization, deep learning (DL) has achieved improvement, especially on semantic relationship capturing with recur
APA, Harvard, Vancouver, ISO, and other styles
40

Blinov, I., V. Miroshnyk, and V. Sychova. "Short-term forecasting of electricity imbalances using artificial neural networks." IOP Conference Series: Earth and Environmental Science 1254, no. 1 (2023): 012029. http://dx.doi.org/10.1088/1755-1315/1254/1/012029.

Full text
Abstract:
Currently, improving the results of short-term forecasting of electricity imbalances in the modern electricity market of Ukraine is a pressing problem. In order to solve this problem, two types of neural networks with recurrent layers, LSTM and LSTNet, were analyzed in this work. A comparison of the results of short-term forecasting of daily schedules of electricity imbalances using LSTM and LSTNet neural networks with a vector autoregression model (VARMA) was carried out. Actual data of the balancing market were used for the research. Analysis of the results shows that the smal
APA, Harvard, Vancouver, ISO, and other styles
41

Opałka, Sławomir, Dominik Szajerman, and Adam Wojciechowski. "LSTM multichannel neural networks in mental task classification." COMPEL - The international journal for computation and mathematics in electrical and electronic engineering 38, no. 4 (2019): 1204–13. http://dx.doi.org/10.1108/compel-10-2018-0429.

Full text
Abstract:
Purpose The purpose of this paper is to apply recurrent neural networks (RNNs) and more specifically long-short term memory (LSTM)-based ones for mental task classification in terms of BCI systems. The authors have introduced novel LSTM-based multichannel architecture model which proved to be highly promising in other fields, yet was not used for mental tasks classification. Design/methodology/approach Validity of the multichannel LSTM-based solution was confronted with the results achieved by a non-multichannel state-of-the-art solutions on a well-recognized data set. Findings The results dem
APA, Harvard, Vancouver, ISO, and other styles
42

Nehal, Mohamed Ali, Mostafa Abd El Hamid Marwa, and Youssif Aliaa. "Sentiment Analysis for Movies Reviews Dataset Using Deep Learning Models." International Journal of Data Mining & Knowledge Management Process (IJDKP) 9, no. 2/3 (2019): 19–27. https://doi.org/10.5281/zenodo.3340668.

Full text
Abstract:
Due to the enormous amount of data and opinions being produced, shared and transferred everyday across the internet and other media, Sentiment analysis has become vital for developing opinion mining systems. This paper introduces a developed classification sentiment analysis using deep learning networks and introduces comparative results of different deep learning networks. Multilayer Perceptron (MLP) was developed as a baseline for other networks results. Long short-term memory (LSTM) recurrent neural network, Convolutional Neural Network (CNN) in addition to a hybrid model of LSTM and CNN we
APA, Harvard, Vancouver, ISO, and other styles
43

Wang, Zian. "Stock price prediction using LSTM neural networks: Techniques and applications." Applied and Computational Engineering 86, no. 1 (2024): 294–300. http://dx.doi.org/10.54254/2755-2721/86/20241605.

Full text
Abstract:
The prediction of stock prices has garnered significant attention due to the potential financial gains and complex questions involved. This paper elaborates a comparison between the Long Short-Term Memory (LSTM) model, optimised using the early-stopping method, and the conventional mathematical Autoregressive Integrated Moving Average (ARIMA) model, conducted using the S&P 500 from 1 May 2022 to 1 May 2024. The results indicate that the LSTM surpasses ARIMA. To be more specific, LSTM achieves a 92% reduction in error rates compared to ARIMA. In addition, when the optimise
APA, Harvard, Vancouver, ISO, and other styles
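The early-stopping optimisation mentioned in entry 43 usually amounts to a callback that halts training once the validation loss stops improving; a minimal sketch with placeholder data (not the paper's configuration, window length, or network size) might look like:

```python
import numpy as np
from tensorflow import keras

# Placeholder data: 500 windows of 30 daily closing prices predicting the next one.
X = np.random.rand(500, 30, 1).astype("float32")
y = np.random.rand(500, 1).astype("float32")

model = keras.Sequential([
    keras.layers.Input(shape=(30, 1)),
    keras.layers.LSTM(50),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Stop when validation loss has not improved for 10 epochs and keep the best weights.
early_stop = keras.callbacks.EarlyStopping(monitor="val_loss", patience=10,
                                           restore_best_weights=True)
model.fit(X, y, epochs=200, validation_split=0.2, callbacks=[early_stop], verbose=0)
```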
44

Becerra Muriel, Cristian. "Forecasting the Future Value of a Colombian Investment Fund with LSTM Recurrent Neural Networks (LSTM)." System Analysis & Mathematical Modeling 6, no. 1 (2024): 78–88. http://dx.doi.org/10.17150/2713-1734.2024.6(1).78-88.

Full text
Abstract:
Recurrent neural networks are a tool that is currently used in time series, a widespread use of these networks is the forecasting of future prices in financial time series. One widely used recurrent neural network model is the LSTM (Long Short-Term Memory) model, proposed by Sepp Hochreiter and Jürgen Schmidhuber in their paper called LONG SHORT-TERM MEMORY published in 1997. This model solves the long term memory problem of recurrent neural networks by adding a selective memory cell which acts as a "filter" to choose what kind of information is important to keep and what kind of information i
APA, Harvard, Vancouver, ISO, and other styles
45

Wan, Renzhuo, Shuping Mei, Jun Wang, Min Liu, and Fan Yang. "Multivariate Temporal Convolutional Network: A Deep Neural Networks Approach for Multivariate Time Series Forecasting." Electronics 8, no. 8 (2019): 876. http://dx.doi.org/10.3390/electronics8080876.

Full text
Abstract:
Multivariable time series prediction has been widely studied in power energy, aerology, meteorology, finance, transportation, etc. Traditional modeling methods have complex patterns and are inefficient to capture long-term multivariate dependencies of data for desired forecasting accuracy. To address such concerns, various deep learning models based on Recurrent Neural Network (RNN) and Convolutional Neural Network (CNN) methods are proposed. To improve the prediction accuracy and minimize the multivariate time series data dependence for aperiodic data, in this article, Beijing PM2.5 and ISO-N
APA, Harvard, Vancouver, ISO, and other styles
46

Alaameri, Zahra Hasan Oleiwi, and Mustafa Abdulsahib Faihan. "Forecasting the Accounting Profits of the Banks Listed in Iraq Stock Exchange Using Artificial Neural Networks." Webology 19, no. 1 (2022): 2669–82. http://dx.doi.org/10.14704/web/v19i1/web19177.

Full text
Abstract:
This paper demonstrates the feasibility of using deep learning approaches in time series forecasting of bank profits. Two types of neural networks were used, LSTM (Long-Short Term Memory) and NAR (Nonlinear Autoregressive) networks, for comparison. The data from 12 Iraqi banks, which are registered in the Iraq stock exchange, were involved in this study for sixteen years (2004-2019). RMSE and MAPE were used for comparing the performance of the two models (LSTM and NAR). Our results showed that the NAR is more accurate than LSTM for the prediction of profits. And that the use of the NAR network
APA, Harvard, Vancouver, ISO, and other styles
47

Bucci, Andrea. "Realized Volatility Forecasting with Neural Networks." Journal of Financial Econometrics 18, no. 3 (2020): 502–31. http://dx.doi.org/10.1093/jjfinec/nbaa008.

Full text
Abstract:
In the last few decades, a broad strand of literature in finance has implemented artificial neural networks as a forecasting method. The major advantage of this approach is the possibility to approximate any linear and nonlinear behaviors without knowing the structure of the data generating process. This makes it suitable for forecasting time series which exhibit long-memory and nonlinear dependencies, like conditional volatility. In this article, the predictive performance of feed-forward and recurrent neural networks (RNNs) was compared, particularly focusing on the recently develop
APA, Harvard, Vancouver, ISO, and other styles
48

Pavlatos, Christos, Evangelos Makris, Georgios Fotis, Vasiliki Vita, and Valeri Mladenov. "Enhancing Electrical Load Prediction Using a Bidirectional LSTM Neural Network." Electronics 12, no. 22 (2023): 4652. http://dx.doi.org/10.3390/electronics12224652.

Full text
Abstract:
Precise anticipation of electrical demand holds crucial importance for the optimal operation of power systems and the effective management of energy markets within the domain of energy planning. This study builds on previous research focused on the application of artificial neural networks to achieve accurate electrical load forecasting. In this paper, an improved methodology is introduced, centering around bidirectional Long Short-Term Memory (LSTM) neural networks (NN). The primary aim of the proposed bidirectional LSTM network is to enhance predictive performance by capturing intricate temp
APA, Harvard, Vancouver, ISO, and other styles
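A bidirectional LSTM of the kind described in entry 48 processes each input window forwards and backwards and concatenates the two hidden states; a minimal load-forecasting-shaped sketch (the weekly window, hourly resolution, and layer sizes are assumptions, not the paper's settings):

```python
from tensorflow import keras

# Hypothetical input: 168 hourly load values (one week) predicting the next 24 hours.
model = keras.Sequential([
    keras.layers.Input(shape=(168, 1)),
    keras.layers.Bidirectional(keras.layers.LSTM(64)),  # forward + backward passes over the window
    keras.layers.Dense(24),                             # next-day hourly load
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```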
49

Alam, Muhammad S., AKM B. Hossain, and Farhan B. Mohamed. "Performance Evaluation of Recurrent Neural Networks Applied to Indoor Camera Localization." International Journal of Emerging Technology and Advanced Engineering 12, no. 8 (2022): 116–24. http://dx.doi.org/10.46338/ijetae0822_15.

Full text
Abstract:
Researchers in robotics and computer vision are experimenting with the image-based localization of indoor cameras. Implementation of indoor camera localization problems using a Convolutional neural network (CNN) or Recurrent neural network (RNN) is more challenging from a large image dataset because of the internal structure of CNN or RNN. We can choose a preferable CNN or RNN variant based on the problem type and size of the dataset. CNN is the most flexible method for implementing indoor localization problems. Despite CNN's suitability for hyper-parameter selection, it requires a lot of trai
APA, Harvard, Vancouver, ISO, and other styles
50

Kabildjanov, A. S., Ch Z. Okhunboboeva, and S. Yo Ismailov. "Intelligent forecasting of growth and development of fruit trees by deep learning recurrent neural networks." IOP Conference Series: Earth and Environmental Science 1206, no. 1 (2023): 012015. http://dx.doi.org/10.1088/1755-1315/1206/1/012015.

Full text
Abstract:
The questions of intelligent forecasting of dynamic processes of growth and development of fruit trees are considered. The average growth rate of shoots of apple trees of the «Renet Simirenko» variety was predicted. Forecasting was carried out using a deep learning recurrent neural network LSTM in relation to a one-dimensional time series, with which the specified parameter was described. The implementation of the recurrent neural network LSTM was carried out in the MATLAB 2021 environment. When defining the architecture and training of the LSTM recurrent neural network, the Deep Net
APA, Harvard, Vancouver, ISO, and other styles