Academic literature on the topic 'Learning long short-term memory'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Learning long short-term memory.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Learning long short-term memory"

1

Hochreiter, Sepp, and Jürgen Schmidhuber. "Long Short-Term Memory." Neural Computation 9, no. 8 (1997): 1735–80. http://dx.doi.org/10.1162/neco.1997.9.8.1735.

Abstract:
Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient based method called long short-term memory (LSTM). Truncating the gradient where this does not do harm, LSTM can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units. Multiplicative gate units learn to
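The gating mechanism this abstract describes (multiplicative gates around an additively updated cell state, the "constant error carousel") can be sketched as a single forward step. The NumPy code below is an illustrative reconstruction of the now-standard LSTM cell, with my own names and shapes; note it includes the forget gate, which was added to LSTM after the 1997 paper, and it is not the authors' original implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. W (4n, d), U (4n, n), b (4n,) stack the
    parameters for the input (i), forget (f), output (o) gates and
    the candidate cell update (g)."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b          # all four pre-activations at once
    i = sigmoid(z[0 * n:1 * n])         # input gate: admit new information
    f = sigmoid(z[1 * n:2 * n])         # forget gate: retain the old cell state
    o = sigmoid(z[2 * n:3 * n])         # output gate: expose the cell state
    g = np.tanh(z[3 * n:4 * n])         # candidate update
    c = f * c_prev + i * g              # additive update: the "constant error carousel"
    h = o * np.tanh(c)                  # hidden state, read out through the gate
    return h, c
```

Because the cell state is updated additively and only gated multiplicatively, the error signal flowing through `c` need not pass repeatedly through squashing nonlinearities, which is what lets the network bridge long time lags.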
2

Wagle, Aumkar. "Deep Learning for Financial Time Series using Long Short-Term Memory Model." International Journal of Science and Research (IJSR) 13, no. 4 (2024): 1944–72. http://dx.doi.org/10.21275/sr24418141736.

3

Ferdoush, Zannatul, Amitabha Chakrabarty, and Jia Uddin. "A short-term hybrid forecasting model for time series electrical-load data using random forest and bidirectional long short-term memory." International Journal of Electrical and Computer Engineering (IJECE) 11, no. 1 (2021): 763–71. https://doi.org/10.11591/ijece.v11i1.pp763-771.

Abstract:
In the presence of the deregulated electric industry, load forecasting is more demanded than ever to ensure the execution of applications such as energy generation, pricing decisions, resource procurement, and infrastructure development. This paper presents a hybrid machine learning model for short-term load forecasting (STLF) by applying random forest and bidirectional long short-term memory to acquire the benefits of both methods. In the experimental evaluation, we used a Bangladeshi electricity consumption dataset of 36 months. The paper provides a comparative study between the proposed hyb
4

Sai Swaroop Reddy, Venkata. "Predicting Soccer Match Outcomes Using Deep Learning: A Long Short-Term Memory (LSTM) Approach." International Journal of Science and Research (IJSR) 11, no. 10 (2022): 1454–58. https://doi.org/10.21275/sr22108120231.

5

Simanihuruk, Laurensia, and Hari Suparwito. "Long Short-Term Memory and Bidirectional Long Short-Term Memory Algorithms for Sentiment Analysis of Skintific Product Reviews." ITM Web of Conferences 71 (2025): 01016. https://doi.org/10.1051/itmconf/20257101016.

Abstract:
In the era of ever-evolving digital technology, conducting customer sentiment analysis through product reviews has become crucial for businesses to improve their offerings and increase customer satisfaction. This research aims to analyze the sentiment of SKINTIFIC skincare products on the Shopee online store platform using advanced deep learning models: Long Short-Term Memory (LSTM) and Bidirectional Long Short-Term Memory (Bi-LSTM). These models were evaluated using learning rate, number of units, and dropout rate. The dataset consists of 9,184 product reviews extracted through the Shopee API
6

Pandikumar, S., S. Bharani Sethupandian, M. Sakthi Saravanan, S. Navin Prasad, and M. Arun. "Deep Learning based Long Short-Term Memory Recurrent Neural Network for Stock Price Movement Prediction." Indian Journal of Science and Technology 15, no. 11 (2022): 474–80. http://dx.doi.org/10.17485/ijst/v15i11.27.

7

Jones, Gary, and Bill Macken. "Long-term associative learning predicts verbal short-term memory performance." Memory & Cognition 46, no. 2 (2017): 216–29. http://dx.doi.org/10.3758/s13421-017-0759-3.

8

Shimi, Andria, and Robert H. Logie. "Feature binding in short-term memory and long-term learning." Quarterly Journal of Experimental Psychology 72, no. 6 (2018): 1387–400. http://dx.doi.org/10.1177/1747021818807718.

Abstract:
In everyday experience, we encounter visual feature combinations. Some combinations are learned to support object recognition, and some are arbitrary and rapidly changing, so are retained briefly to complete ongoing tasks before being updated or forgotten. However, the boundary conditions between temporary retention of fleeting feature combinations and learning of feature bindings are unclear. Logie, Brockmole, and Vandenbroucke demonstrated that 60 repetitions of the same feature bindings for change detection resulted in no learning, but clear learning occurred with cued recall of the feature
9

Wang, Chen, Bingchun Liu, Jiali Chen, and Xiaogang Yu. "Air Quality Index Prediction Based on a Long Short-Term Memory Artificial Neural Network Model." 電腦學刊 (Journal of Computers) 34, no. 2 (2023): 69–79. http://dx.doi.org/10.53106/199115992023043402006.

Abstract:
Air pollution has become one of the important challenges restricting the sustainable development of cities. Therefore, it is of great significance to achieve accurate prediction of Air Quality Index (AQI). Long Short Term Memory (LSTM) is a deep learning method suitable for learning time series data. Considering its superiority in processing time series data, this study established an LSTM forecasting model suitable for air quality index forecasting. First, we focus on optimizing the feature metrics of the model input through Information Gain (IG). Second, the prediction results of th
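The Information Gain step mentioned here, used to select input features, reduces to an entropy difference: IG(target; feature) = H(target) − H(target | feature). A minimal NumPy sketch for discrete-valued data follows; these are my own illustrative helpers, not the study's code:

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (in bits) of a discrete label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(target, feature):
    """IG(target; feature) = H(target) - H(target | feature),
    for discrete-valued arrays of equal length."""
    h_cond = 0.0
    values, counts = np.unique(feature, return_counts=True)
    for v, c in zip(values, counts):
        h_cond += (c / len(feature)) * entropy(target[feature == v])
    return entropy(target) - h_cond

y = np.array([0, 0, 1, 1])
information_gain(y, y)            # 1.0: the feature removes all uncertainty
information_gain(y, np.zeros(4))  # 0.0: a constant feature tells us nothing
```

Features are ranked by this gain, and only the most informative ones are fed to the forecasting model.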
10

Mirza, Arsalan R., and Abdulbasit K. Al-Talabani. "Time Series-Based Spoof Speech Detection Using Long Short-Term Memory and Bidirectional Long Short-Term Memory." ARO-THE SCIENTIFIC JOURNAL OF KOYA UNIVERSITY 12, no. 2 (2024): 119–29. http://dx.doi.org/10.14500/aro.11636.

Abstract:
Detecting fake speech in voice-based authentication systems is crucial for reliability. Traditional methods often struggle because they can't handle the complex patterns over time. Our study introduces an advanced approach using deep learning, specifically Long Short-Term Memory (LSTM) and Bidirectional LSTM (BiLSTM) models, tailored for identifying fake speech based on its temporal characteristics. We use speech signals with cepstral features like Mel-frequency cepstral coefficients (MFCC), Constant Q cepstral coefficients (CQCC), and open-source Speech and Music Interpretation by Large-space

Dissertations / Theses on the topic "Learning long short-term memory"

1

Cumming, N. "The Hebb effect : investigating long-term learning from short-term memory." Thesis, University of Cambridge, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.598214.

Abstract:
How do we learn a sequence of items so we can remember it not only over the short-term, as in hearing a phone-number and repeating it back, but over the long term? Ten experiments are presented that investigate this problem using the Hebb repetition effect (Hebb, 1961). In a canonical Hebb effect experiment, lists of familiar items are presented in an immediate serial recall task and one list is repeatedly presented at regular intervals. This leads to an improvement in recall for the repeating list over baseline performance. Existing models of serial order learning are tested; Chapter 2 provid
2

Bailey, Tony J. "Neuromorphic Architecture with Heterogeneously Integrated Short-Term and Long-Term Learning Paradigms." University of Cincinnati / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1554217105047975.

3

Shojaee, Ali B. S. "Bacteria Growth Modeling using Long-Short-Term-Memory Networks." University of Cincinnati / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1617105038908441.

4

Gustafsson, Anton, and Julian Sjödal. "Energy Predictions of Multiple Buildings using Bi-directional Long short-term Memory." Thesis, Högskolan i Halmstad, Akademin för informationsteknologi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-43552.

Abstract:
The process of energy consumption monitoring of a building is time-consuming. Therefore, a feasible approach using transfer learning is presented to decrease the time necessary to extract the required large dataset. The technique applies a bidirectional long short-term memory recurrent neural network using sequence-to-sequence prediction. The idea involves a training phase that extracts information and patterns of a building that is presented with a reasonably sized dataset. The validation phase uses a dataset that is not sufficient in size. This dataset was acquired through a related paper
5

Wen, Yangyang. "Sensor numerical prediction based on long-term and short-term memory neural network." Thesis, Mittuniversitetet, Institutionen för informationssystem och –teknologi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-39165.

Abstract:
Many sensor nodes are scattered in the sensor network,which are used in all aspects of life due to their small size, low power consumption, and multiple functions. With the advent of the Internet of Things, more small sensor devices will appear in our lives. The research of deep learning neural networks is generally based on large and medium-sized devices such as servers and computers, and it is rarely heard about the research of neural networks based on small Internet of Things devices. In this study, the Internet of Things devices are divided into three types: large, medium, and small in ter
6

Singh, Akash. "Anomaly Detection for Temporal Data using Long Short-Term Memory (LSTM)." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-215723.

Abstract:
We explore the use of Long short-term memory (LSTM) for anomaly detection in temporal data. Due to the challenges in obtaining labeled anomaly datasets, an unsupervised approach is employed. We train recurrent neural networks (RNNs) with LSTM units to learn the normal time series patterns and predict future values. The resulting prediction errors are modeled to give anomaly scores. We investigate different ways of maintaining LSTM state, and the effect of using a fixed number of time steps on LSTM prediction and detection performance. LSTMs are also compared to feed-forward neural networks wit
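The error-modeling step this abstract mentions, turning prediction errors into anomaly scores, can be illustrated with a small sketch: characterize the errors on a window presumed normal, then score each point by its squared z-score. This is a hypothetical NumPy helper showing the general idea, not the thesis's exact procedure:

```python
import numpy as np

def anomaly_scores(actual, predicted, fit_slice):
    """Score points by how unlikely their prediction error is.

    Errors on a presumed-normal window (fit_slice) are summarized by a
    mean and standard deviation; each point's score is its squared
    z-score, so larger = more anomalous."""
    errors = np.asarray(actual) - np.asarray(predicted)
    mu = errors[fit_slice].mean()
    sigma = errors[fit_slice].std() + 1e-8  # avoid division by zero
    return ((errors - mu) / sigma) ** 2

# Usage: a spike in the actual series stands out against accurate predictions.
actual = np.array([1.0, 1.1, 0.9, 5.0, 1.0])
predicted = np.array([1.0, 1.0, 1.0, 1.0, 1.0])
scores = anomaly_scores(actual, predicted, slice(0, 3))  # largest score at index 3
```

In practice `predicted` would come from an LSTM trained on normal data, and a threshold on the score flags anomalies.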
7

van der Westhuizen, Jos. "Biological applications, visualizations, and extensions of the long short-term memory network." Thesis, University of Cambridge, 2018. https://www.repository.cam.ac.uk/handle/1810/287476.

Abstract:
Sequences are ubiquitous in the domain of biology. One of the current best machine learning techniques for analysing sequences is the long short-term memory (LSTM) network. Owing to significant barriers to adoption in biology, focussed efforts are required to realize the use of LSTMs in practice. Thus, the aim of this work is to improve the state of LSTMs for biology, and we focus on biological tasks pertaining to physiological signals, peripheral neural signals, and molecules. This goal drives the three subplots in this thesis: biological applications, visualizations, and extensions. We start
8

Hernandez, Villapol Jorge Luis. "Spectrum Analysis and Prediction Using Long Short Term Memory Neural Networks and Cognitive Radios." Thesis, University of North Texas, 2017. https://digital.library.unt.edu/ark:/67531/metadc1062877/.

Abstract:
One statement that we can make with absolute certainty in our current time is that wireless communication is now the standard and the de-facto type of communication. Cognitive radios are able to interpret the frequency spectrum and adapt. The aim of this work is to be able to predict whether a frequency channel is going to be busy or free in a specific time located in the future. To do this, the problem is modeled as a time series problem where each usage of a channel is treated as a sequence of busy and free slots in a fixed time frame. For this time series problem, the method being implement
9

Corni, Gabriele. "A study on the applicability of Long Short-Term Memory networks to industrial OCR." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018.

Abstract:
This thesis summarises the research-oriented study of applicability of Long Short-Term Memory Recurrent Neural Networks (LSTMs) to industrial Optical Character Recognition (OCR) problems. Traditionally solved through Convolutional Neural Network-based approaches (CNNs), the reported work aims to detect the OCR aspects that could be improved by exploiting recurrent patterns among pixel intensities, and speed up the overall character detection process. Accuracy, speed and complexity act as the main key performance indicators. After studying the core Deep Learning foundations, the best train
10

von Hacht, Johan. "Anomaly Detection for Root Cause Analysis in System Logs using Long Short-Term Memory." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-301656.

Abstract:
Many software systems are under test to ensure that they function as expected. Sometimes, a test can fail, and in that case, it is essential to understand the cause of the failure. However, as systems grow larger and become more complex, this task can become non-trivial and potentially take much time. Therefore, even partially, automating the process of root cause analysis can save time for the developers involved. This thesis investigates the use of a Long Short-Term Memory (LSTM) anomaly detector in system logs for root cause analysis. The implementation is evaluated in a quantitative and a

Books on the topic "Learning long short-term memory"

1

LaCroix, Connie Lynn. Short-term and long-term memory for intentional versus incidental learning: Does rhyme or reason make a difference? Laurentian University, Department of Psychology, 2001.

2

United States Office of the Under Secretary of Defense for Research and Engineering, and Institute for Defense Analyses, eds. The long-term retention of knowledge and skills: A cognitive and instructional perspective. Springer-Verlag, 1987.

3

Thorn, Annabel, and Mike Page, eds. Interactions between short-term and long-term memory in the verbal domain. Psychology Press, 2008.

5

Ruf, Alessia. Short- and Long-Term Modality Effect in Multimedia Learning. Springer Fachmedien Wiesbaden, 2016. http://dx.doi.org/10.1007/978-3-658-12430-4.

6

Dehn, Milton J. Long-term memory problems in children and adolescents: Assessment, intervention, and effective instruction. Wiley, 2010.

7

Grabowski, Peter. The effects of three dimensional text imagery on short and long term memory. Laurentian University, Department of Psychology, 2000.

8

Eitelman, Paul. A non-random walk revisited: Short- and long-term memory in asset prices. Federal Reserve Board, 2008.

9

Dehn, Milton J. Working memory and academic learning: Assessment and intervention. John Wiley & Sons, Inc., 2008.

10

Oakes, Lisa M., and Patricia J. Bauer, eds. Short- and long-term memory in infancy and early childhood: Taking the first steps toward remembering. Oxford University Press, 2007.


Book chapters on the topic "Learning long short-term memory"

1

Alla, Sridhar, and Suman Kalyan Adari. "Long Short-Term Memory Models." In Beginning Anomaly Detection Using Python-Based Deep Learning. Apress, 2019. http://dx.doi.org/10.1007/978-1-4842-5177-5_6.

2

Adari, Suman Kalyan, and Sridhar Alla. "Long Short-Term Memory Models." In Beginning Anomaly Detection Using Python-Based Deep Learning. Apress, 2024. http://dx.doi.org/10.1007/979-8-8688-0008-5_8.

3

Baddeley, A. D., C. Papagno, and G. Vallar. "When long-term learning depends on short-term storage." In Exploring Working Memory. Routledge, 2017. http://dx.doi.org/10.4324/9781315111261-13.

4

Hvitfeldt, Emil, and Julia Silge. "Long short-term memory (LSTM) networks." In Supervised Machine Learning for Text Analysis in R. Chapman and Hall/CRC, 2021. http://dx.doi.org/10.1201/9781003093459-14.

5

Bhasin, Harsh. "Gated Recurrent Unit and Long Short-Term Memory." In Hands-on Deep Learning. Apress, 2024. https://doi.org/10.1007/979-8-8688-1035-0_10.

6

Teyler, Timothy J. "Long-Term Potentiation and Memory." In Learning and Memory. Birkhäuser Boston, 1989. http://dx.doi.org/10.1007/978-1-4899-6778-7_6.

7

Swain, Debabrata, Vijeta, Soham Manjare, Sachin Kulawade, and Tanuj Sharma. "Stock Market Prediction Using Long Short-Term Memory Model." In Machine Learning and Information Processing. Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-33-4859-2_8.

8

Prakash, N., and G. Sumaiya Farzana. "Short Term Price Forecasting of Horticultural Crops Using Long Short Term Memory Neural Network." In Learning and Analytics in Intelligent Systems. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-46943-6_12.

9

Xie, Zongxia, and Hao Wen. "Composite Quantile Regression Long Short-Term Memory Network." In Artificial Neural Networks and Machine Learning – ICANN 2019: Text and Time Series. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-30490-4_41.

10

Swarnkar, Suman Kumar, and Yogesh Kumar Rathore. "Music Genre Classification Using Long Short-Term Memory (LSTM) Networks." In Machine Learning in Multimedia. CRC Press, 2024. http://dx.doi.org/10.1201/9781003477280-6.


Conference papers on the topic "Learning long short-term memory"

1

Xu, Li. "Personalized English Learning Resource Recommendation using Attention Long Short-Term Memory." In 2025 4th International Conference on Distributed Computing and Electrical Circuits and Electronics (ICDCECE). IEEE, 2025. https://doi.org/10.1109/icdcece65353.2025.11036013.

2

Liu, Jiale, and Duo Mei. "Prediction of bus arrival time based on improved long short-term memory network model." In International Conference on Cloud Computing, Performance Computing, and Deep Learning, edited by Wanyang Dai and Xiangjie Kong. SPIE, 2024. http://dx.doi.org/10.1117/12.3050640.

3

Liu, Xiaoyue, Danya Xu, Jingfei Wang, and Tao Yang. "Soft Sensor Modeling Based on Gated Stacking of Extended Long Short-Term Memory Networks." In 2025 IEEE 14th Data Driven Control and Learning Systems (DDCLS). IEEE, 2025. https://doi.org/10.1109/ddcls66240.2025.11065482.

4

Mpela, Motebang Daniel, and Tranos Zuva. "Deep Learning-Based Long and Short-Term Memory Integration for Enhanced Library Recommendation Systems." In 2024 4th International Multidisciplinary Information Technology and Engineering Conference (IMITEC). IEEE, 2024. https://doi.org/10.1109/imitec60221.2024.10851043.

5

Kaur, Arshleen, Vinay Kukreja, Deepak Upadhyay, Manisha Aeri, and Rishabh Sharma. "A Deep Learning-based Long Short-Term Memory Technique for Google Stock Price Prediction." In 2024 Asia Pacific Conference on Innovation in Technology (APCIT). IEEE, 2024. http://dx.doi.org/10.1109/apcit62007.2024.10673419.

6

Ezhilmathi, K., M. Durairaj, Mubashira Vazhangal, Riyaz Mohammad, Punit Pathak, and M. Karthik. "Tailored English Language Learning Support: Leveraging Long Short-Term Memory Networks for Personalized Assistance." In 2024 Third International Conference on Electrical, Electronics, Information and Communication Technologies (ICEEICT). IEEE, 2024. http://dx.doi.org/10.1109/iceeict61591.2024.10718506.

7

Fernandes, Mateus A., Eduardo Gildin, and Marcio A. Sampaio. "Data-Driven Estimation of Flowing Bottom-Hole Pressure in Petroleum Wells Using Long Short-Term Memory." In 2024 International Conference on Machine Learning and Applications (ICMLA). IEEE, 2024. https://doi.org/10.1109/icmla61862.2024.00236.

8

Rahmanto, Nugroho, and Satria Mandala. "Myocardial Infarction Identification in Electrocardiogram Signals Using Long Short Term Memory Transfer Learning (LSTM-TL)." In 2025 International Conference on Advancement in Data Science, E-learning and Information System (ICADEIS). IEEE, 2025. https://doi.org/10.1109/icadeis65852.2025.10933419.

9

Guo, Xin, Yu Tian, Qinghan Xue, et al. "Continual Learning Long Short Term Memory." In Findings of the Association for Computational Linguistics: EMNLP 2020. Association for Computational Linguistics, 2020. http://dx.doi.org/10.18653/v1/2020.findings-emnlp.164.

10

Chandwani, Vipasha, Sandeep Kumar, and Parikshit Kishor Singh. "Long Short-Term Memory based Conversation Modelling." In 2020 3rd International Conference on Emerging Technologies in Computer Engineering: Machine Learning and Internet of Things (ICETCE). IEEE, 2020. http://dx.doi.org/10.1109/icetce48199.2020.9091753.


Reports on the topic "Learning long short-term memory"

1

Cárdenas-Cárdenas, Julián Alonso, Deicy J. Cristiano-Botia, and Nicolás Martínez-Cortés. Colombian inflation forecast using Long Short-Term Memory approach. Banco de la República, 2023. http://dx.doi.org/10.32468/be.1241.

Abstract:
We use Long Short Term Memory (LSTM) neural networks, a deep learning technique, to forecast Colombian headline inflation one year ahead through two approaches. The first one uses only information from the target variable, while the second one incorporates additional information from some relevant variables. We employ sample rolling to the traditional neuronal network construction process, selecting the hyperparameters with criteria for minimizing the forecast error. Our results show a better forecasting capacity of the network with information from additional variables, surpassing both the ot
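The "sample rolling" mentioned in this abstract amounts to turning a single series into overlapping supervised samples. The sketch below shows the general mechanics for one-step-ahead targets; it is a minimal illustration under my own naming, not the working paper's exact procedure (which forecasts one year ahead):

```python
import numpy as np

def rolling_windows(series, lookback):
    """Turn a 1-D series into (samples, lookback) inputs X and
    one-step-ahead targets y, the supervised form an LSTM trains on."""
    X = np.array([series[i:i + lookback] for i in range(len(series) - lookback)])
    y = series[lookback:]
    return X, y

series = np.arange(6.0)                     # [0, 1, 2, 3, 4, 5]
X, y = rolling_windows(series, lookback=3)
# X rows: [0,1,2], [1,2,3], [2,3,4]; targets y: [3, 4, 5]
```

Each window slides forward one step, so every observation after the first `lookback` points serves once as a target.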
2

Ly, Racine, Fousseini Traore, and Khadim Dia. Forecasting commodity prices using long-short-term memory neural networks. International Food Policy Research Institute, 2021. http://dx.doi.org/10.2499/p15738coll2.134265.

3

Han, Shangxuan. Stock Prediction with Random Forests and Long Short-term Memory. Iowa State University, 2019. http://dx.doi.org/10.31274/cc-20240624-1334.

4

Carew, Thomas J. A Parallel Processing Hypothesis for Short-Term and Long-Term Memory in Aplysia. Defense Technical Information Center, 1994. http://dx.doi.org/10.21236/ada284101.

5

Kumar, Kaushal, and Yupeng Wei. Attention-Based Data Analytic Models for Traffic Flow Predictions. Mineta Transportation Institute, 2023. http://dx.doi.org/10.31979/mti.2023.2211.

Abstract:
Traffic congestion causes Americans to lose millions of hours and dollars each year. In fact, 1.9 billion gallons of fuel are wasted each year due to traffic congestion, and each hour stuck in traffic costs about $21 in wasted time and fuel. The traffic congestion can be caused by various factors, such as bottlenecks, traffic incidents, bad weather, work zones, poor traffic signal timing, and special events. One key step to addressing traffic congestion and identifying its root cause is an accurate prediction of traffic flow. Accurate traffic flow prediction is also important for the successfu
6

Vold, Andrew. Improving Physics Based Electron Neutrino Appearance Identication with a Long Short-Term Memory Network. Office of Scientific and Technical Information (OSTI), 2018. http://dx.doi.org/10.2172/1529330.

7

Ankel, Victoria, Stella Pantopoulou, Matthew Weathered, Darius Lisowski, Anthonie Cilliers, and Alexander Heifetz. One-Step Ahead Prediction of Thermal Mixing Tee Sensors with Long Short Term Memory (LSTM) Neural Networks. Office of Scientific and Technical Information (OSTI), 2020. http://dx.doi.org/10.2172/1760289.

8

McCausland, Kathleen. A comparative study of the short-term auditory memory span and sequence of language/learning disabled children and normal children. Portland State University Library, 2000. http://dx.doi.org/10.15760/etd.2849.

9

Kaffenberger, Michelle. Modeling the Long-Run Learning Impact of the COVID-19 Learning Shock: Actions to (More Than) Mitigate Loss. Research on Improving Systems of Education (RISE), 2020. http://dx.doi.org/10.35489/bsgrise-ri_2020/017.

Abstract:
The COVID-19 pandemic has forced 1.7 billion children out of school temporarily. While many education systems are attempting varying degrees of remote learning, it is widely accepted that the closures will produce substantial losses in learning (World Bank, 2020; Kuhfeld et al., 2020). However, the real concern is not just that a few months of learning will be lost in the short run, but that these losses will accumulate into large and permanent learning losses as many children fall behind during school closures and never catch up. This note uses a calibrated model with a “pedagogical productio
10

Groeneveld, Caspar, Elia Kibga, and Tom Kaye. Deploying an e-Learning Environment in Zanzibar: Feasibility Assessment. EdTech Hub, 2020. http://dx.doi.org/10.53832/edtechhub.0028.

Abstract:
The Zanzibar Ministry of Education and Vocational Training (MoEVT) and the World Bank (the Bank) approached the EdTech Hub (the Hub) in April 2020 to explore the feasibility of implementing a Virtual Learning Environment (VLE). The Hub was requested to focus primarily on the deployment of a VLE in lower secondary education, and this report consequently focuses primarily on this group. The report is structured in four sections: An introduction to provide the background and guiding principles for the engagement with a short overview of the methodology applied. An analysis of the Zanzibar educati