To view other types of publications on this topic, follow the link: Recurrent neural networks BLSTM.

Dissertations on the topic "Recurrent neural networks BLSTM"

Format your source in APA, MLA, Chicago, Harvard, and other styles

Explore the top 50 dissertations for your research on the topic "Recurrent neural networks BLSTM".

Next to each work in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the publication as a .pdf file and read its abstract online, whenever these are available in the metadata.

Browse dissertations from a wide range of disciplines and assemble your bibliography correctly.

1

Etienne, Caroline. "Apprentissage profond appliqué à la reconnaissance des émotions dans la voix." Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLS517.

Full text of the source
Abstract:
My thesis work concerns the use of new artificial-intelligence technologies applied to the problem of automatically classifying audio sequences according to the customer's emotional state during a conversation with a call-center agent. In 2016, the idea was to depart from the data preprocessing and machine-learning models existing within the laboratory, and to propose a model that performs as well as possible on the IEMOCAP audio database. We build on existing work on neural network models…
APA, Harvard, Vancouver, ISO, and other styles
2

Morillot, Olivier. "Reconnaissance de textes manuscrits par modèles de Markov cachés et réseaux de neurones récurrents : application à l'écriture latine et arabe." Electronic Thesis or Diss., Paris, ENST, 2014. http://www.theses.fr/2014ENST0002.

Full text of the source
Abstract:
Handwriting recognition is an essential component of document analysis. A current trend in the field is to move from the recognition of isolated words to that of sequences of words. Our work therefore proposes a text-line recognition system without explicit segmentation of the line into words. To build a high-performing model, we intervene at several levels of the recognition system. First of all, we introduce two original preprocessing methods: a cleaning of text-line images and a local correction…
APA, Harvard, Vancouver, ISO, and other styles
3

Żbikowski, Rafal Waclaw. "Recurrent neural networks some control aspects /." Connect to electronic version, 1994. http://hdl.handle.net/1905/180.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
4

Ahamed, Woakil Uddin. "Quantum recurrent neural networks for filtering." Thesis, University of Hull, 2009. http://hydra.hull.ac.uk/resources/hull:2411.

Full text of the source
Abstract:
The essence of stochastic filtering is to compute the time-varying probability density function (pdf) for the measurements of the observed system. In this thesis, a filter is designed based on the principles of quantum mechanics, where the Schrödinger wave equation (SWE) plays the key part. This equation is transformed to fit into the neural network architecture. Each neuron in the network mediates a spatio-temporal field with a unified quantum activation function that aggregates the pdf information of the observed signals. The activation function is the result of the solution of the SWE. The incorpor…
APA, Harvard, Vancouver, ISO, and other styles
5

Zbikowski, Rafal Waclaw. "Recurrent neural networks : some control aspects." Thesis, University of Glasgow, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.390233.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
6

Jacobsson, Henrik. "Rule extraction from recurrent neural networks." Thesis, University of Sheffield, 2006. http://etheses.whiterose.ac.uk/6081/.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
7

Bonato, Tommaso. "Time Series Predictions With Recurrent Neural Networks." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2018.

Find the full text of the source
Abstract:
The main goal of this thesis is to study how machine-learning algorithms, and in particular LSTM (Long Short-Term Memory) neural networks, can be used to predict the future values of a regular time series such as, for example, the sine and cosine functions. A time series is defined as a sequence of observations s_t ordered in time. We also try to apply the same principles to predict the values of a time series built from the sales data of a cosmetic product over a three-year period.
APA, Harvard, Vancouver, ISO, and other styles
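The forecasting setup described in this abstract can be illustrated with the standard windowing step that turns a regular series such as sin(t) into supervised (window, next-value) training pairs, which is the input format an LSTM forecaster trains on. This is a generic sketch under our own assumptions, not code from the thesis:

```python
import numpy as np

# Turn a 1-D series into overlapping windows of length `window`;
# the target for each window is the value that immediately follows it.
def make_windows(series, window):
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X, y

t = np.linspace(0, 8 * np.pi, 400)
X, y = make_windows(np.sin(t), window=20)
# X has shape (380, 20); y[k] is the sample following window X[k].
```

An LSTM would then be fit on (X, y) and rolled forward to predict values beyond the observed range.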
8

Silfa, Franyell. "Energy-efficient architectures for recurrent neural networks." Doctoral thesis, Universitat Politècnica de Catalunya, 2021. http://hdl.handle.net/10803/671448.

Full text of the source
Abstract:
Deep Learning algorithms have been remarkably successful in applications such as Automatic Speech Recognition and Machine Translation. Thus, these kinds of applications are ubiquitous in our lives and are found in a plethora of devices. These algorithms are composed of Deep Neural Networks (DNNs), such as Convolutional Neural Networks and Recurrent Neural Networks (RNNs), which have a large number of parameters and require a large amount of computations. Hence, the evaluation of DNNs is challenging due to their large memory and power requirements. RNNs are employed to solve sequence to sequ
APA, Harvard, Vancouver, ISO, and other styles
9

Brax, Christoffer. "Recurrent neural networks for time-series prediction." Thesis, University of Skövde, Department of Computer Science, 2000. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-480.

Full text of the source
Abstract:
Recurrent neural networks have been used for time-series prediction with good results. In this dissertation recurrent neural networks are compared with time-delayed feed-forward networks, feed-forward networks and linear regression models on a prediction task. The data used in all experiments is real-world sales data containing two kinds of segments: campaign segments and non-campaign segments. The task is to make predictions of sales under campaigns. It is evaluated whether more accurate predictions can be made when only using the campaign segments of the data. Throughout the entire proje…
APA, Harvard, Vancouver, ISO, and other styles
10

Ljungehed, Jesper. "Predicting Customer Churn Using Recurrent Neural Networks." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-210670.

Full text of the source
Abstract:
Churn prediction is used to identify customers that are becoming less loyal and is an important tool for companies that want to stay competitive in a rapidly growing market. In retail, a dynamic definition of churn is needed to identify churners correctly. Customer Lifetime Value (CLV) is the monetary value of a customer relationship. No change in CLV for a given customer indicates a decrease in loyalty. This thesis proposes a novel approach to churn prediction. The proposed model uses a Recurrent Neural Network to identify churners based on Customer Lifetime Value time series regression. The
APA, Harvard, Vancouver, ISO, and other styles
11

Rabi, Gihad. "Visual speech recognition by recurrent neural networks." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/tape16/PQDD_0010/MQ36169.pdf.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
12

Miller, Paul Ian. "Recurrent neural networks and adaptive motor control." Thesis, University of Stirling, 1997. http://hdl.handle.net/1893/21520.

Full text of the source
Abstract:
This thesis is concerned with the use of neural networks for motor control tasks. The main goal of the thesis is to investigate ways in which the biological notions of motor programs and Central Pattern Generators (CPGs) may be implemented in a neural network framework. Biological CPGs can be seen as components within a larger control scheme, which is basically modular in design. In this thesis, these ideas are investigated through the use of modular recurrent networks, which are used in a variety of control tasks. The first experimental chapter deals with learning in recurrent networks, and i
APA, Harvard, Vancouver, ISO, and other styles
13

Xie, Xiaohui 1972. "Dynamics and learning in recurrent neural networks." Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/8393.

Full text of the source
Abstract:
Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Brain and Cognitive Sciences, 2002. Includes bibliographical references (p. 141-151). This thesis is a study of dynamics and learning in recurrent neural networks. Many computations of neural systems are carried out through a network of a large number of neurons. With massive feedback connections among these neurons, a study of its dynamics is necessary in order to understand the network's function. In this thesis, I aim at studying several recurrent network models and relating the dynamics with the networks' computation. Fo…
APA, Harvard, Vancouver, ISO, and other styles
14

Potter, Chris, Kurt Kosbar, and Adam Panagos. "MIMO Channel Prediction Using Recurrent Neural Networks." International Foundation for Telemetering, 2008. http://hdl.handle.net/10150/606193.

Full text of the source
Abstract:
ITC/USA 2008 Conference Proceedings / The Forty-Fourth Annual International Telemetering Conference and Technical Exhibition / October 27-30, 2008 / Town and Country Resort & Convention Center, San Diego, California. Adaptive modulation is a communication technique capable of maximizing throughput while guaranteeing a fixed symbol error rate (SER). However, this technique requires instantaneous channel state information at the transmitter. This can be obtained by predicting channel states at the receiver and feeding them back to the transmitter. Existing algorithms used to predict single-inp…
APA, Harvard, Vancouver, ISO, and other styles
15

Alam, Samiul. "Recurrent neural networks in electricity load forecasting." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-233254.

Full text of the source
Abstract:
In this thesis two main studies are conducted to compare the predictive capabilities of feed-forward neural networks (FFNN) and long short-term memory networks (LSTM) in electricity load forecasting. The first study compares univariate networks using past electricity load, as well as multivariate networks using past electricity load and air temperature, in day-ahead load forecasting using varying lookback periods and sparsity of past observations. The second study compares FFNNs and LSTMs of different complexities (i.e. network sizes) when restrictions imposed by limitations of the real world
APA, Harvard, Vancouver, ISO, and other styles
16

Besharat, Pour Shiva. "Hierarchical sales forecasting using Recurrent Neural Networks." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-290892.

Full text of the source
Abstract:
Sales forecasting equips businesses with the essential basis for planning future investments, controlling costs, and production. This research is in cooperation with a property development company for the purpose of improving the accuracy of manual sales forecasting. The objective is to investigate the effects of using the underlying factors that affect the individual sales of the company in forecasting the company’s income. One approach uses an aggregation of the estimates of the individual sales to approximate the company’s income. This approach uses the underlying hierarchical factors of th
APA, Harvard, Vancouver, ISO, and other styles
17

Tegnér, Gustaf. "Recurrent neural networks for financial asset forecasting." Thesis, KTH, Matematisk statistik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-229924.

Full text of the source
Abstract:
The application of neural networks in finance has found renewed interest in the past few years. Neural networks have a proven capability of modeling non-linear relationships and have been proven widely successful in domains such as image and speech recognition. These favorable properties of the Neural Network make them an alluring choice of model when studying the financial markets. This thesis is concerned with investigating the use of recurrent neural networks for predicting future financial asset price movements on a set of futures contracts. To aid our research, we compare them to a set of
APA, Harvard, Vancouver, ISO, and other styles
18

Perumal, Subramoniam. "Stability and Switchability in Recurrent Neural Networks." University of Cincinnati / OhioLINK, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1227194814.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
19

Graves, Alex. "Supervised sequence labelling with recurrent neural networks." kostenfrei, 2008. http://mediatum2.ub.tum.de/doc/673554/673554.pdf.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
20

Berlati, Alessandro. "Ambiguity in Recurrent Models: Predicting Multiple Hypotheses with Recurrent Neural Networks." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/16611/.

Full text of the source
Abstract:
Multiple Hypothesis Prediction (MHP) models have been introduced to deal with uncertainty in feedforward neural networks; in particular, it has been shown how to easily convert a standard single-prediction neural network into one able to show many feasible outcomes. Ambiguity, however, is also present in problems where feedback models are needed, such as sequence generation and time series classification. In our work, we propose an extension of MHP to Recurrent Neural Networks (RNNs), especially those consisting of Long Short-Term Memory units. We test the resulting models on both regression and…
APA, Harvard, Vancouver, ISO, and other styles
21

Le, Ngan Thi Hoang. "Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling." Research Showcase @ CMU, 2018. http://repository.cmu.edu/dissertations/1166.

Full text of the source
Abstract:
Semantic labeling is becoming more and more popular among researchers in computer vision and machine learning. Many applications, such as autonomous driving, tracking, indoor navigation, augmented reality systems, semantic searching, medical imaging are on the rise, requiring more accurate and efficient segmentation mechanisms. In recent years, deep learning approaches based on Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs) have dramatically emerged as the dominant paradigm for solving many problems in computer vision and machine learning. The main focus of this thes
APA, Harvard, Vancouver, ISO, and other styles
22

Sarti, Paolo. "Embeddings for text classification with recurrent neural networks." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018.

Find the full text of the source
Abstract:
The importance of automatic methods for classifying and extracting information from texts has grown significantly in recent years, owing to the ever-increasing production of this kind of data, especially through web platforms. This has led to the development of new algorithms for analyzing unstructured text. Embedding techniques, which map words or variable-length pieces of text to fixed-size vectors while preserving semantic-similarity relationships, have been a major advance for the field of Natural Language Processing. Moreover, advances…
APA, Harvard, Vancouver, ISO, and other styles
23

Gers, Félix. "Long short-term memory in recurrent neural networks /." [S.l.] : [s.n.], 2001. http://library.epfl.ch/theses/?nr=2366.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
24

Tino, Peter, and Georg Dorffner. "Recurrent neural networks with iterated function systems dynamics." SFB Adaptive Information Systems and Modelling in Economics and Management Science, WU Vienna University of Economics and Business, 1998. http://epub.wu.ac.at/948/1/document.pdf.

Full text of the source
Abstract:
We suggest a recurrent neural network (RNN) model with a recurrent part corresponding to iterated function systems (IFS), introduced by Barnsley [1] as a fractal image compression mechanism. The key idea is that 1) in our model we avoid learning the RNN state part by having non-trainable connections between the context and recurrent layers (this makes the training process less problematic and faster), and 2) the RNN state part codes the information-processing states in the symbolic input stream in a well-organized and intuitively appealing way. We show that there is a direct correspondence between…
APA, Harvard, Vancouver, ISO, and other styles
25

Steinberger, Thomas, and Lucas Zinner. "Complete controllability of discrete-time recurrent neural networks." SFB Adaptive Information Systems and Modelling in Economics and Management Science, WU Vienna University of Economics and Business, 1999. http://epub.wu.ac.at/440/1/document.pdf.

Full text of the source
Abstract:
This paper presents a characterization of complete controllability for the class of discrete-time recurrent neural networks. We prove that complete controllability holds if and only if the rank of the control matrix equals the state space dimension. (author's abstract)
Series: Report Series SFB "Adaptive Information Systems and Modelling in Economics and Management Science"
APA, Harvard, Vancouver, ISO, and other styles
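The rank criterion quoted in this abstract can be checked numerically. The exact form of the control matrix is defined in the paper; the sketch below simply assumes an n x m input matrix B and tests whether its rank equals the state-space dimension n:

```python
import numpy as np

# Assumed reading of the criterion: the system is completely controllable
# iff rank(B) == n, where B is the n x m control (input) matrix.
def completely_controllable(B):
    n = B.shape[0]  # state-space dimension
    return bool(np.linalg.matrix_rank(B) == n)

B_full = np.array([[1.0, 0.0, 1.0],
                   [0.0, 1.0, 1.0]])   # 2 x 3, rank 2 == n
B_deficient = np.array([[1.0, 2.0],
                        [2.0, 4.0]])   # 2 x 2, rank 1 < n
```

Here `completely_controllable(B_full)` holds, while the rank-deficient matrix fails the criterion.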
26

Mastrogiuseppe, Francesca. "From dynamics to computations in recurrent neural networks." Thesis, Paris Sciences et Lettres (ComUE), 2017. http://www.theses.fr/2017PSLEE048/document.

Full text of the source
Abstract:
The cerebral cortex of mammals consists of large and complex networks of neurons. The task of these cell assemblies is to encode and process, as precisely as possible, the sensory information coming from our external environment. Surprisingly, electrophysiological recordings from behaving animals have shown that cortical activity is highly irregular. The temporal patterns of activity as well as the mean firing rates of the cells vary considerably from one trial to the next, despite experimental conditions…
APA, Harvard, Vancouver, ISO, and other styles
27

Shao, Yuanlong. "Learning Sparse Recurrent Neural Networks in Language Modeling." The Ohio State University, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=osu1398942373.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
28

Kolen, John F. "Exploring the computational capabilities of recurrent neural networks /." The Ohio State University, 1994. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487853913100192.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
29

Chen, Jacob. "Embodied perception during walking using Deep Recurrent Neural Networks." Thesis, University of British Columbia, 2017. http://hdl.handle.net/2429/62171.

Full text of the source
Abstract:
Movements such as walking require knowledge of the environment in order to be robust. This knowledge can be gleaned via embodied perception. While information about the upcoming terrain such as compliance, friction, or slope may be difficult to directly estimate, using the walking motion itself allows for these properties to be implicitly observed over time from the stream of movement data. However, the relationship between a parameter such as ground compliance and the movement data may be complex and difficult to discover. In this thesis, we demonstrate the use of a Deep LSTM Network to estim
APA, Harvard, Vancouver, ISO, and other styles
30

Jansson, Anton. "Predicting trajectories of golf balls using recurrent neural networks." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-210552.

Full text of the source
Abstract:
This thesis is concerned with the problem of predicting the remaining part of the trajectory of a golf ball as it travels through the air where only the three-dimensional position of the ball is captured. The approach taken to solve this problem relied on recurrent neural networks in the form of the long short-term memory networks (LSTM). The motivation behind this choice was that this type of networks had led to state-of-the-art performance for similar problems such as predicting the trajectory of pedestrians. The results show that using LSTMs led to an average reduction of 36.6 % of the erro
APA, Harvard, Vancouver, ISO, and other styles
31

Salihoglu, Utku. "Toward a brain-like memory with recurrent neural networks." Doctoral thesis, Universite Libre de Bruxelles, 2009. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/210221.

Full text of the source
Abstract:
For the last twenty years, several assumptions have been expressed in the fields of information processing, neurophysiology and cognitive sciences. First, neural networks and their dynamical behavior in terms of attractors are the natural way adopted by the brain to encode information. Any information item to be stored in the neural network should be coded in some way or another in one of the dynamical attractors of the brain, and retrieved by stimulating the network to trap its dynamics in the desired item's basin of attraction. The second view shared by neural network researchers is to base…
APA, Harvard, Vancouver, ISO, and other styles
32

Molter, Colin. "Storing information through complex dynamics in recurrent neural networks." Doctoral thesis, Universite Libre de Bruxelles, 2005. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/211039.

Full text of the source
Abstract:
The neural net computer simulations which will be presented here are based on the acceptance of a set of assumptions that for the last twenty years have been expressed in the fields of information processing, neurophysiology and cognitive sciences. First of all, neural networks and their dynamical behavior in terms of attractors are the natural way adopted by the brain to encode information. Any information item to be stored in the neural net should be coded in some way or another in one of the dynamical attractors of the brain, and retrieved by stimulating the net so as to trap its dynamics in…
APA, Harvard, Vancouver, ISO, and other styles
33

Mehta, Manish P. "Prediction of manufacturing operations sequence using recurrent neural networks." Ohio : Ohio University, 1997. http://www.ohiolink.edu/etd/view.cgi?ohiou1177089656.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
34

Vartak, Aniket Arun. "GAUSS-NEWTON BASED LEARNING FOR FULLY RECURRENT NEURAL NETWORKS." Master's thesis, University of Central Florida, 2004. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/4429.

Full text of the source
Abstract:
The thesis discusses a novel off-line and on-line learning approach for Fully Recurrent Neural Networks (FRNNs). The most popular algorithm for training FRNNs, the Real Time Recurrent Learning (RTRL) algorithm, employs the gradient descent technique for finding the optimum weight vectors in the recurrent neural network. Within the framework of the research presented, a new off-line and on-line variation of RTRL is presented, that is based on the Gauss-Newton method. The method itself is an approximate Newton's method tailored to the specific optimization problem, (non-linear least squares), wh
APA, Harvard, Vancouver, ISO, and other styles
35

Senior, Andrew William. "Off-line cursive handwriting recognition using recurrent neural networks." Thesis, University of Cambridge, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.338024.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
36

Alvarez, Mouravskaia Kevin. "Metaphor identification for Spanish sentences using recurrent neural networks." Master's thesis, Pontificia Universidad Católica del Perú, 2019. http://hdl.handle.net/20.500.12404/16531.

Full text of the source
Abstract:
Metaphors are an important literary figure found in books and in daily use. Metaphor identification is now an essential task for Natural Language Processing (NLP), but the dependence on context and the lack of corpora in other languages make it a bottleneck for tasks such as translation or interpretation of texts. We present a classification model using recurrent neural networks for metaphor identification in Spanish sentences. We tested our model and its variants on a new corpus in Spanish and compared it with the current baseline using an English corpus. Our best model reports an F-sco…
APA, Harvard, Vancouver, ISO, and other styles
37

Fors, Johansson Christoffer. "Arrival Time Predictions for Buses using Recurrent Neural Networks." Thesis, Linköpings universitet, Artificiell intelligens och integrerade datorsystem, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-165133.

Full text of the source
Abstract:
In this thesis, two different types of bus passengers are identified. These two types, namely current passengers and passengers-to-be have different needs in terms of arrival time predictions. A set of machine learning models based on recurrent neural networks and long short-term memory units were developed to meet these needs. Furthermore, bus data from the public transport in Östergötland county, Sweden, were collected and used for training new machine learning models. These new models are compared with the current prediction system that is used today to provide passengers with arrival time
APA, Harvard, Vancouver, ISO, and other styles
38

Pokhrel, Abhishek <1996>. "Stock Returns Prediction using Recurrent Neural Networks with LSTM." Master's Degree Thesis, Università Ca' Foscari Venezia, 2022. http://hdl.handle.net/10579/22038.

Full text of the source
Abstract:
Research in asset pricing has, until recently, side-stepped the high dimensionality problem by focusing on low-dimensional models. Work on cross-sectional stock return prediction, for example, has focused on regressions with a small number of characteristics. Given the background of an enormously large number of variables that could potentially be relevant for predicting returns, focusing on such a small number of factors effectively means that the researchers are imposing a very high degree of sparsity on these models. This research studies the use of the recurrent neural network (RNN) method
APA, Harvard, Vancouver, ISO, and other styles
39

Rodriguez, Paul Fabian. "Mathematical foundations of simple recurrent networks /." Diss., Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC campuses, 1999. http://wwwlib.umi.com/cr/ucsd/fullcit?p9935464.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
40

Martins, Helder. "Predicting user churn on streaming services using recurrent neural networks." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-217109.

Full text of the source
Abstract:
Providers of online services have witnessed a rapid growth of their user base in the last few years. The phenomenon has attracted an increasing number of competitors determined on obtaining their own share of the market. In this context, the cost of attracting new customers has increased significantly, raising the importance of retaining existing clients. Therefore, it has become progressively more important for the companies to improve user experience and ensure they keep a larger share of their users active in consuming their product. Companies are thus compelled to build tools that can iden
APA, Harvard, Vancouver, ISO, and other styles
41

Beneš, Karel. "Recurrent Neural Networks with Elastic Time Context in Language Modeling." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2016. http://www.nusl.cz/ntk/nusl-255481.

Full text of the source
Abstract:
This report describes experimental work on statistical language modeling with recurrent neural networks (RNNs). A thorough survey of the work published to date is given, followed by a description of the algorithms for training the corresponding models. Most of the described techniques were implemented in a custom tool based on the Theano library. An extensive set of experiments was carried out with the Simple Recurrent Network (SRN) model, which revealed some of its previously unpublished properties. In static evaluation of the model, the results achieved were relatively about 2.7 % worse than the best published…
APA, Harvard, Vancouver, ISO, and other styles
42

Bolcato, Pietro. "Concurrent generation of melody and lyrics by recurrent neural networks." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-284499.

Full text of the source
Abstract:
This work proposes a conditioned recurrent neural network architecture for concurrent melody and lyrics generation. This is in contrast to methods that first generate music and then lyrics, or vice versa. The system is trained to first sample a pitch from a distribution, then sample a duration conditioned on the sampled pitch, and finally sample a syllable conditioned on the sampled pitch and duration. The evaluation metrics show the trained system generates music and text sequences that exhibit some sensible musical and linguistic properties, and as further evaluation, it was applied in a human-AI c…
APA, Harvard, Vancouver, ISO, and other styles
43

Daliparthi, Venkata Satya Sai Ajay. "Semantic Segmentation of Urban Scene Images Using Recurrent Neural Networks." Thesis, Blekinge Tekniska Högskola, Institutionen för datavetenskap, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-20651.

Full text of the source
Abstract:
Background: In Autonomous Driving Vehicles, the vehicle receives pixel-wise sensor data from RGB cameras, point-wise depth information from the cameras, and sensors data as input. The computer present inside the Autonomous Driving vehicle processes the input data and provides the desired output, such as steering angle, torque, and brake. To make an accurate decision by the vehicle, the computer inside the vehicle should be completely aware of its surroundings and understand each pixel in the driving scene. Semantic Segmentation is the task of assigning a class label (Such as Car, Road, Pedestr
APA, Harvard, Vancouver, ISO, and other styles
44

Nguyen, Thaovy Tuong. "Utilizing Recurrent Neural Networks for Temporal Data Generation and Prediction." Thesis, Virginia Tech, 2021. http://hdl.handle.net/10919/103874.

Full text of the source
Abstract:
The Falling Creek Reservoir (FCR) in Roanoke is monitored for water quality and other key measurements to distribute clean and safe water to the community. Forecasting these measurements is critical for management of the FCR. However, current techniques are limited by inherent Gaussian linearity assumptions. Since the dynamics of the ecosystem may be non-linear, we propose neural network-based schemes for forecasting. We create the LatentGAN architecture by extending the recurrent neural network-based ProbCast and autoencoder forecasting architectures to produce multiple forecasts for a single
APA, Harvard, Vancouver, ISO, and other styles
45

Ärlemalm, Filip. "Harbour Porpoise Click Train Classification with LSTM Recurrent Neural Networks." Thesis, KTH, Teknisk informationsvetenskap, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-215088.

Повний текст джерела
Анотація:
The harbour porpoise is a toothed whale whose presence is threatened in Scandinavia. Onestep towards preserving the species in critical areas is to study and observe the harbourporpoise population growth or decline in these areas. Today this is done by using underwateraudio recorders, so called hydrophones, and manual analyzing tools. This report describes amethod that modernizes the process of harbour porpoise detection with machine learning. Thedetection method is based on data collected by the hydrophone AQUAclick 100. The data isprocessed and classified automatically with a stacked long sh
APA, Harvard, Vancouver, ISO, and other citation styles
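A stacked LSTM classifier of the kind the abstract mentions can be sketched as a plain numpy forward pass: two LSTM layers run over the sequence, and a logistic read-out on the final hidden state scores the sequence. The weights are random and the dimensions are assumed for illustration; this is not the thesis's AQUAclick pipeline.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_layer(xs, Wx, Wh, b):
    """Run one LSTM layer over a sequence xs of shape (T, input_dim);
    returns the hidden states, shape (T, hidden_dim)."""
    hidden = Wh.shape[1]
    h = np.zeros(hidden)
    c = np.zeros(hidden)
    outputs = []
    for x in xs:
        # Input, forget, output gates and candidate cell, from one affine map.
        i, f, o, g = np.split(Wx @ x + Wh @ h + b, 4)
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
        outputs.append(h)
    return np.array(outputs)

rng = np.random.default_rng(0)
T, d, h1, h2 = 20, 8, 16, 16   # sequence length and layer sizes (assumed)
xs = rng.normal(size=(T, d))

# Stack two LSTM layers: the second consumes the first layer's hidden states.
layer1 = lstm_layer(xs, rng.normal(size=(4 * h1, d)),
                    rng.normal(size=(4 * h1, h1)), np.zeros(4 * h1))
layer2 = lstm_layer(layer1, rng.normal(size=(4 * h2, h1)),
                    rng.normal(size=(4 * h2, h2)), np.zeros(4 * h2))

# Binary "click train vs. background" read-out from the final hidden state.
w_out = rng.normal(size=h2)
p_click = sigmoid(w_out @ layer2[-1])
```

In practice the weights would be trained on labelled hydrophone recordings; the stacked forward pass is the structural point here.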
46

Shertil, M. S. "On the induction of temporal structure by recurrent neural networks." Thesis, Nottingham Trent University, 2014. http://irep.ntu.ac.uk/id/eprint/27915/.

Full text of the source
Abstract:
Language acquisition is one of the core problems in artificial intelligence (AI) and it is generally accepted that any successful AI account of the mind will stand or fall depending on its ability to model human language. Simple Recurrent Networks (SRNs) are a class of so-called artificial neural networks that have a long history in language modelling via learning to predict the next word in a sentence. However, SRNs have also been shown to suffer from catastrophic forgetting, lack of syntactic systematicity and an inability to represent more than three levels of centre-embedding, due to the s
APA, Harvard, Vancouver, ISO, and other citation styles
47

Otte, Sebastian [Verfasser]. "Recurrent Neural Networks for Sequential Pattern Recognition Applications / Sebastian Otte." München : Verlag Dr. Hut, 2017. http://d-nb.info/1149579382/34.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
48

Ahrneteg, Jakob, and Dean Kulenovic. "Semantic Segmentation of Historical Document Images Using Recurrent Neural Networks." Thesis, Blekinge Tekniska Högskola, Institutionen för programvaruteknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-18219.

Full text of the source
Abstract:
Background. This thesis focuses on the task of historical document semantic segmentation with recurrent neural networks. Document semantic segmentation involves the segmentation of a page into different meaningful regions and is an important prerequisite step of automated document analysis and digitisation with optical character recognition. At the time of writing, convolutional neural network-based solutions are the state of the art for analyzing document images, while the use of recurrent neural networks in document semantic segmentation has not yet been studied. Considering the nature of a r
APA, Harvard, Vancouver, ISO, and other citation styles
49

Haddad, Josef, and Carl Piehl. "Unsupervised anomaly detection in time series with recurrent neural networks." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-259655.

Full text of the source
Abstract:
Artificial neural networks (ANN) have been successfully applied to a wide range of problems. However, most ANN-based models do not attempt to model the brain in detail, though some models do. An example of a biologically constrained ANN is Hierarchical Temporal Memory (HTM). This study applies HTM and Long Short-Term Memory (LSTM) to anomaly detection problems in time series in order to compare their performance on this task. The anomalies are restricted to point anomalies and the time series are univariate. Pre-existing implementations that utilise these
APA, Harvard, Vancouver, ISO, and other citation styles
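A common way to turn a sequence model into a point-anomaly detector, as in the comparison above, is to threshold one-step prediction residuals. In this sketch a naive last-value predictor stands in for the trained LSTM or HTM; the residual-thresholding logic is the part being illustrated, and the threshold rule (k times the residual spread) is an assumption.

```python
import numpy as np

def point_anomalies(series, k=3.0):
    """Flag point anomalies in a univariate series by thresholding
    one-step prediction residuals. Here x[t] is "predicted" as x[t-1];
    a learned model would replace that predictor, but the detection
    rule is the same: |observed - predicted| large relative to the
    typical residual marks an anomaly."""
    preds = series[:-1]                          # predict x[t] ≈ x[t-1]
    residuals = np.abs(series[1:] - preds)
    threshold = k * residuals.std() + 1e-12
    return np.where(residuals > threshold)[0] + 1  # indices into `series`

series = np.ones(50)
series[25] = 10.0                                # inject a point anomaly
anomalies = point_anomalies(series)
# A differencing detector flags both the jump into and out of the spike,
# so indices 25 and 26 are reported.
```
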
50

Sutskever, Ilya. "Training Recurrent Neural Networks." Thesis, 2013. http://hdl.handle.net/1807/36012.

Full text of the source
Abstract:
Recurrent Neural Networks (RNNs) are powerful sequence models that were believed to be difficult to train, and as a result they were rarely used in machine learning applications. This thesis presents methods that overcome the difficulty of training RNNs, and applications of RNNs to challenging problems. We first describe a new probabilistic sequence model that combines Restricted Boltzmann Machines and RNNs. The new model is more powerful than similar models while being less difficult to train. Next, we present a new variant of the Hessian-free (HF) optimizer and show that it can train RNNs
APA, Harvard, Vancouver, ISO, and other citation styles
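The training difficulty the thesis tackles stems partly from exploding gradients in backpropagation through time. A standard mitigation (distinct from the Hessian-free approach the thesis develops) is global gradient-norm clipping, which rescales the gradient when its joint L2 norm exceeds a cap:

```python
import numpy as np

def clip_by_global_norm(grads, max_norm):
    """Rescale a list of gradient arrays so their joint L2 norm is at
    most max_norm, preserving direction (standard exploding-gradient fix)."""
    total = np.sqrt(sum(np.sum(g * g) for g in grads))
    if total > max_norm:
        scale = max_norm / total
        grads = [g * scale for g in grads]
    return grads, total

grads, norm = clip_by_global_norm([np.array([3.0, 4.0])], max_norm=1.0)
# original norm is 5.0; the clipped gradient has norm 1.0
```

Clipping bounds the size of each update without changing its direction, which keeps RNN training stable when gradients occasionally blow up.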