Academic literature on the topic 'LSTM (long short term memory networks)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'LSTM (long short term memory networks).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "LSTM (long short term memory networks)"

1

Singh, Arjun, Shashi Kant Dargar, Amit Gupta, et al. "Evolving Long Short-Term Memory Network-Based Text Classification." Computational Intelligence and Neuroscience 2022 (February 21, 2022): 1–11. http://dx.doi.org/10.1155/2022/4725639.

Abstract:
Recently, long short-term memory (LSTM) networks have been extensively utilized for text classification. Compared to feed-forward neural networks, they have feedback connections and thus the ability to learn long-term dependencies. However, LSTM networks suffer from the parameter-tuning problem. Generally, the initial and control parameters of an LSTM are selected on a trial-and-error basis. Therefore, in this paper, an evolving LSTM (ELSTM) network is proposed. A multiobjective genetic algorithm (MOGA) is used to optimize the architecture and weights of the LSTM. The proposed model is tested on a we
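To give readers a feel for the idea summarized above, here is a minimal, hypothetical sketch of evolving LSTM hyperparameters with a genetic loop. The search space, the elitist single-objective selection, and the placeholder fitness function are illustrative assumptions only; the paper itself uses a multiobjective genetic algorithm over the architecture and weights.

import random

# Hypothetical search space for LSTM hyperparameters (illustrative only).
SPACE = {"hidden": [32, 64, 128, 256], "lr": [1e-2, 1e-3, 1e-4], "layers": [1, 2, 3]}

def random_individual():
    return {k: random.choice(v) for k, v in SPACE.items()}

def fitness(ind):
    # Placeholder: in practice this would train an LSTM text classifier for the
    # candidate configuration and return validation accuracy (plus a second
    # objective such as model size for the multiobjective variant).
    return -abs(ind["hidden"] - 128) / 128 - ind["layers"] * 0.01

def crossover(a, b):
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(ind, rate=0.2):
    return {k: (random.choice(SPACE[k]) if random.random() < rate else v)
            for k, v in ind.items()}

population = [random_individual() for _ in range(10)]
for generation in range(20):
    population.sort(key=fitness, reverse=True)
    parents = population[:4]                      # simple elitist selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(len(population) - len(parents))]
    population = parents + children

print("best configuration:", max(population, key=fitness))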
2

Hochreiter, Sepp, and Jürgen Schmidhuber. "Long Short-Term Memory." Neural Computation 9, no. 8 (1997): 1735–80. http://dx.doi.org/10.1162/neco.1997.9.8.1735.

Abstract:
Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient based method called long short-term memory (LSTM). Truncating the gradient where this does not do harm, LSTM can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units. Multiplicative gate units learn to
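For readers new to the architecture, the gating mechanism described in this abstract can be written out directly. The NumPy sketch below is a minimal single-step LSTM cell in the now-standard formulation (including a forget gate, which was added after the 1997 paper); it is not the authors' original code.

import numpy as np

def lstm_step(x, h_prev, c_prev, W, U, b):
    # W, U, b stack the input, forget, output and candidate transformations
    # along the first axis, so one matrix product covers all four.
    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))
    z = W @ x + U @ h_prev + b                      # shape (4 * hidden,)
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)    # input, forget, output gates
    g = np.tanh(g)                                  # candidate cell update
    c = f * c_prev + i * g                          # constant error carousel
    h = o * np.tanh(c)                              # exposed hidden state
    return h, c

hidden, n_in = 8, 3
rng = np.random.default_rng(0)
W = rng.normal(size=(4 * hidden, n_in))
U = rng.normal(size=(4 * hidden, hidden))
b = np.zeros(4 * hidden)
h = c = np.zeros(hidden)
for x in rng.normal(size=(5, n_in)):                # run a short random sequence
    h, c = lstm_step(x, h, c, W, U, b)
print(h.round(3))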
3

Xu, Wei, Yanan Jiang, Xiaoli Zhang, Yi Li, Run Zhang, and Guangtao Fu. "Using long short-term memory networks for river flow prediction." Hydrology Research 51, no. 6 (2020): 1358–76. http://dx.doi.org/10.2166/nh.2020.026.

Abstract:
Deep learning has made significant advances in methodologies and practical applications in recent years. However, there is a lack of understanding on how the long short-term memory (LSTM) networks perform in river flow prediction. This paper assesses the performance of LSTM networks to understand the impact of network structures and parameters on river flow predictions. Two river basins with different characteristics, i.e., Hun river and Upper Yangtze river basins, are used as case studies for the 10-day average flow predictions and the daily flow predictions, respectively. The use of
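As a generic illustration of this kind of study (not the paper's actual configuration or data), a sliding-window LSTM regressor for a univariate flow series can be sketched in a few lines of PyTorch; the window length, network size, and synthetic series below are assumptions.

import torch
import torch.nn as nn

class FlowLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                  # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # predict the next value

torch.manual_seed(0)
series = torch.sin(torch.linspace(0, 20, 500)) + 0.1 * torch.randn(500)  # fake flow
window = 30
X = torch.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:].unsqueeze(1)
X = X.unsqueeze(-1)                        # (samples, window, 1)

model = FlowLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(5):                     # short demo training loop
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
    print(f"epoch {epoch}: mse={loss.item():.4f}")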
4

Song, Tianyu, Wei Ding, Jian Wu, Haixing Liu, Huicheng Zhou, and Jinggang Chu. "Flash Flood Forecasting Based on Long Short-Term Memory Networks." Water 12, no. 1 (2019): 109. http://dx.doi.org/10.3390/w12010109.

Abstract:
Flash floods occur frequently and distribute widely in mountainous areas because of complex geographic and geomorphic conditions and various climate types. Effective flash flood forecasting with useful lead times remains a challenge due to its high burstiness and short response time. Recently, machine learning has led to substantial changes across many areas of study. In hydrology, the advent of novel machine learning methods has started to encourage novel applications or substantially improve old ones. This study aims to establish a discharge forecasting model based on Long Short-Term Memory
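Forecasting "with useful lead times" usually means predicting several steps ahead at once. The sketch below shows one common way to do that with an LSTM, a direct multi-output head; the feature count, window, and lead time are placeholders, not the study's setup.

import torch
import torch.nn as nn

class MultiStepLSTM(nn.Module):
    def __init__(self, n_features=4, hidden=32, lead=6):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, lead)        # direct multi-step output

    def forward(self, x):                          # x: (batch, window, features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])            # (batch, lead) discharge forecasts

model = MultiStepLSTM()
rainfall_runoff_window = torch.randn(8, 48, 4)     # 8 samples, 48 past hours, 4 inputs
print(model(rainfall_runoff_window).shape)         # torch.Size([8, 6])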
5

Shankar, Sonali, P. Vigneswara Ilavarasan, Sushil Punia, and Surya Prakash Singh. "Forecasting container throughput with long short-term memory networks." Industrial Management & Data Systems 120, no. 3 (2019): 425–41. http://dx.doi.org/10.1108/imds-07-2019-0370.

Abstract:
Purpose: Better forecasting always leads to better management and planning of the operations. The container throughput data are complex and often have multiple seasonality. This makes it difficult to forecast accurately. The purpose of this paper is to forecast container throughput using deep learning methods and benchmark its performance over other traditional time-series methods. Design/methodology/approach: In this study, long short-term memory (LSTM) networks are implemented to forecast container throughput. The container throughput data of the Port of Singapore are used for empirical analys
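One standard way to expose the "multiple seasonality" mentioned here to an LSTM (or any learner) is to append sine/cosine calendar features to each time step. The snippet below is an illustrative sketch with synthetic throughput values, not the paper's data or pipeline.

import numpy as np
import pandas as pd

# Encode weekly and yearly cycles as smooth sine/cosine channels.
idx = pd.date_range("2015-01-01", periods=4 * 365, freq="D")
throughput = np.random.default_rng(1).gamma(5.0, 100.0, size=len(idx))  # fake daily TEU counts

day_of_week = idx.dayofweek.to_numpy()
day_of_year = idx.dayofyear.to_numpy()
features = np.column_stack([
    throughput,
    np.sin(2 * np.pi * day_of_week / 7), np.cos(2 * np.pi * day_of_week / 7),
    np.sin(2 * np.pi * day_of_year / 365.25), np.cos(2 * np.pi * day_of_year / 365.25),
])
print(features.shape)   # (1460, 5): the value plus four seasonal channels per time step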
6

Nguyen, Sang Thi Thanh, and Bao Duy Tran. "Long Short-Term Memory Based Movie Recommendation." Science & Technology Development Journal - Engineering and Technology 3, SI1 (2020): SI1–SI9. http://dx.doi.org/10.32508/stdjet.v3isi1.540.

Abstract:
Recommender systems (RS) have become a fundamental tool for helping users make decisions around millions of different choices nowadays – the era of Big Data. It brings a huge benefit for many business models around the world due to their effectiveness on the target customers. A lot of recommendation models and techniques have been proposed and many accomplished incredible outcomes. Collaborative filtering and content-based filtering methods are common, but these both have some disadvantages. A critical one is that they only focus on a user's long-term static preference while ignoring his or he
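A minimal version of sequence-aware recommendation, which this paper builds on, is next-item prediction: embed the user's recent movie IDs, encode them with an LSTM, and score the whole catalogue. The sizes, vocabulary, and data below are made-up placeholders, not the authors' model.

import torch
import torch.nn as nn

class NextMovieLSTM(nn.Module):
    def __init__(self, n_movies=1000, emb=32, hidden=64):
        super().__init__()
        self.emb = nn.Embedding(n_movies, emb)
        self.lstm = nn.LSTM(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_movies)

    def forward(self, ids):               # ids: (batch, seq_len) of movie indices
        h, _ = self.lstm(self.emb(ids))
        return self.out(h[:, -1, :])      # logits over the whole catalogue

model = NextMovieLSTM()
watched = torch.randint(0, 1000, (4, 10))        # 4 users, 10 recently watched movies each
scores = model(watched)
print(scores.topk(5, dim=-1).indices)            # top-5 candidate recommendations per user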
7

Tra, Nguyen Ngoc, Ho Phuoc Tien, Nguyen Thanh Dat, and Nguyen Ngoc Vu. "VN-Index Trend Prediction Using Long-Short Term Memory Neural Networks." Journal of Science and Technology: Issue on Information and Communications Technology 17, no. 12.2 (2019): 61. http://dx.doi.org/10.31130/ict-ud.2019.94.

Abstract:
The paper attempts to forecast the future trend of the Vietnam index (VN-Index) by using long short-term memory (LSTM) networks. In particular, an LSTM-based neural network is employed to study the temporal dependence in time-series data of past and present VN-Index values. Empirical forecasting results show that LSTM-based stock trend prediction offers an accuracy of about 60%, which outperforms moving-average-based prediction.
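For context on the moving-average comparison mentioned in the abstract, here is a sketch of such a trend baseline on a synthetic price series; the 20-day window and the fake index levels are assumptions, and the LSTM side of the comparison is not reproduced here.

import numpy as np

rng = np.random.default_rng(42)
close = 900 + np.cumsum(rng.normal(0, 5, size=1000))     # fake index levels
window = 20
ma = np.convolve(close, np.ones(window) / window, mode="valid")
aligned_close = close[window - 1:]

pred_up = aligned_close > ma                    # baseline signal: price above its moving average
actual_up = np.diff(aligned_close) > 0          # realized next-day direction
accuracy = (pred_up[:-1] == actual_up).mean()
print(f"moving-average baseline accuracy: {accuracy:.2%}")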
8

Wang, Jianyong, Lei Zhang, Yuanyuan Chen, and Zhang Yi. "A New Delay Connection for Long Short-Term Memory Networks." International Journal of Neural Systems 28, no. 06 (2018): 1750061. http://dx.doi.org/10.1142/s0129065717500617.

Abstract:
Connections play a crucial role in neural network (NN) learning because they determine how information flows in NNs. Suitable connection mechanisms may extensively enlarge the learning capability and reduce the negative effect of gradient problems. In this paper, a new delay connection is proposed for Long Short-Term Memory (LSTM) unit to develop a more sophisticated recurrent unit, called Delay Connected LSTM (DCLSTM). The proposed delay connection brings two main merits to DCLSTM with introducing no extra parameters. First, it allows the output of the DCLSTM unit to maintain LSTM, which is a
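The abstract is cut off before the mechanism is fully described, so the following is only a loose illustration of the general idea of a delayed recurrent connection, not the DCLSTM equations: a cell that also sees the hidden state from a few steps back. Note that the paper stresses its connection introduces no extra parameters, whereas this sketch adds an extra matrix purely to make the idea concrete.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def delayed_lstm_step(x, h_hist, c_prev, W, U, U_d, b, delay=3):
    # Gate pre-activations see both h_{t-1} and h_{t-delay}; this is an
    # illustrative variant only, not the formulation from the paper.
    h_prev = h_hist[-1]
    h_delayed = h_hist[-delay] if len(h_hist) >= delay else np.zeros_like(h_prev)
    z = W @ x + U @ h_prev + U_d @ h_delayed + b
    i, f, o, g = np.split(z, 4)
    c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)
    h = sigmoid(o) * np.tanh(c)
    return h, c

hidden, n_in = 4, 2
rng = np.random.default_rng(0)
W, U, U_d = (rng.normal(size=s) for s in [(4 * hidden, n_in), (4 * hidden, hidden), (4 * hidden, hidden)])
b = np.zeros(4 * hidden)
h_hist, c = [np.zeros(hidden)], np.zeros(hidden)
for x in rng.normal(size=(6, n_in)):
    h, c = delayed_lstm_step(x, h_hist, c, W, U, U_d, b)
    h_hist.append(h)
print(h.round(3))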
9

Lees, Thomas, Steven Reece, Frederik Kratzert, et al. "Hydrological concept formation inside long short-term memory (LSTM) networks." Hydrology and Earth System Sciences 26, no. 12 (2022): 3079–101. http://dx.doi.org/10.5194/hess-26-3079-2022.

Abstract:
Neural networks have been shown to be extremely effective rainfall-runoff models, where the river discharge is predicted from meteorological inputs. However, the question remains: what have these models learned? Is it possible to extract information about the learned relationships that map inputs to outputs, and do these mappings represent known hydrological concepts? Small-scale experiments have demonstrated that the internal states of long short-term memory networks (LSTMs), a particular neural network architecture predisposed to hydrological modelling, can be interpreted. By extra
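The probing workflow hinted at here, extracting cell states over time and correlating individual units with a physical store, can be sketched as follows. The untrained LSTMCell, the synthetic forcing, and the made-up "storage" reference signal are assumptions; only the mechanics of reading out and correlating internal states are shown.

import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(0)
cell = nn.LSTMCell(input_size=3, hidden_size=16)
T = 200
forcing = torch.randn(T, 3)                                       # fake meteorological inputs
reference = np.cumsum(np.random.default_rng(0).normal(size=T))    # fake storage proxy

h = torch.zeros(1, 16)
c = torch.zeros(1, 16)
cell_states = []
with torch.no_grad():
    for t in range(T):
        h, c = cell(forcing[t].unsqueeze(0), (h, c))
        cell_states.append(c.squeeze(0).numpy())
cell_states = np.stack(cell_states)                               # (T, hidden)

corr = [abs(np.corrcoef(cell_states[:, k], reference)[0, 1]) for k in range(16)]
print("most correlated unit:", int(np.argmax(corr)), "|r| =", round(max(corr), 3))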
10

Min, Huasong, Ziming Chen, Bin Fang, et al. "Cross-Individual Gesture Recognition Based on Long Short-Term Memory Networks." Scientific Programming 2021 (July 6, 2021): 1–11. http://dx.doi.org/10.1155/2021/6680417.

Abstract:
Gesture recognition based on surface electromyography (sEMG) has been widely used for human-computer interaction. However, there are few research studies on overcoming the influence of physiological factors among different individuals. In this paper, a cross-individual gesture recognition method based on long short-term memory (LSTM) networks is proposed, named cross-individual LSTM (CI-LSTM). CI-LSTM has a dual-network structure, including a gesture recognition module and an individual recognition module. By designing the loss function, the individual information recognition module assists t
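A rough sketch of the dual-network idea (not the paper's loss design, which the truncated abstract does not spell out) is a shared LSTM encoder with two heads, one for gestures and one for subject identity, trained on a weighted sum of the two cross-entropy losses; every size and weight below is a placeholder.

import torch
import torch.nn as nn

class DualHeadLSTM(nn.Module):
    def __init__(self, n_channels=8, hidden=64, n_gestures=10, n_subjects=12):
        super().__init__()
        self.encoder = nn.LSTM(n_channels, hidden, batch_first=True)
        self.gesture_head = nn.Linear(hidden, n_gestures)
        self.subject_head = nn.Linear(hidden, n_subjects)

    def forward(self, x):                        # x: (batch, time, channels)
        out, _ = self.encoder(x)
        feat = out[:, -1, :]                     # shared sequence representation
        return self.gesture_head(feat), self.subject_head(feat)

model = DualHeadLSTM()
emg = torch.randn(16, 100, 8)                    # 16 windows of fake sEMG
gesture_y = torch.randint(0, 10, (16,))
subject_y = torch.randint(0, 12, (16,))

g_logits, s_logits = model(emg)
loss = nn.functional.cross_entropy(g_logits, gesture_y) \
     + 0.5 * nn.functional.cross_entropy(s_logits, subject_y)   # placeholder weighting
loss.backward()
print(float(loss))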