Academic literature on the topic 'Recurrent neural networks'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Recurrent neural networks.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Recurrent neural networks"

1

Mu, Yangzi, Mengxing Huang, Chunyang Ye, and Qingzhou Wu. "Diagnosis Prediction via Recurrent Neural Networks." International Journal of Machine Learning and Computing 8, no. 2 (2018): 117–20. http://dx.doi.org/10.18178/ijmlc.2018.8.2.673.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Lee, Changki. "Image Caption Generation using Recurrent Neural Network." Journal of KIISE 43, no. 8 (2016): 878–82. http://dx.doi.org/10.5626/jok.2016.43.8.878.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Sreekanth, D. "Metro Water Fraudulent Prediction in Houses Using Convolutional Neural Network and Recurrent Neural Network." Revista Gestão Inovação e Tecnologias 11, no. 4 (2021): 1177–87. http://dx.doi.org/10.47059/revistageintec.v11i4.2177.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Wutsqa, Dhoriva Urwatul, and Anisa Nurjanah. "Breast Cancer Classification Using Fuzzy Elman Recurrent Neural Network." Journal of Advanced Research in Dynamical and Control Systems 11, no. 11-SPECIAL ISSUE (2019): 946–53. http://dx.doi.org/10.5373/jardcs/v11sp11/20193119.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Kumar, G. Prem, and P. Venkataram. "Network restoration using recurrent neural networks." International Journal of Network Management 8, no. 5 (1998): 264–73. http://dx.doi.org/10.1002/(sici)1099-1190(199809/10)8:5<264::aid-nem298>3.0.co;2-o.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Ziemke, Tom. "Radar Image Segmentation Using Self-Adapting Recurrent Networks." International Journal of Neural Systems 08, no. 01 (1997): 47–54. http://dx.doi.org/10.1142/s0129065797000070.

Full text
Abstract:
This paper presents a novel approach to the segmentation and integration of (radar) images using a second-order recurrent artificial neural network architecture consisting of two sub-networks: a function network that classifies radar measurements into four different categories of objects in sea environments (water, oil spills, land and boats), and a context network that dynamically computes the function network's input weights. It is shown that in experiments (using simulated radar images) this mechanism outperforms conventional artificial neural networks since it allows the network to learn t
APA, Harvard, Vancouver, ISO, and other styles
7

Fomenko, Volodymyr, Heorhii Loutskii, Pavlo Rehida, and Artem Volokyta. "THEMATIC TEXTS GENERATION ISSUES BASED ON RECURRENT NEURAL NETWORKS AND WORD2VEC." TECHNICAL SCIENCES AND TECHNOLOGIES, no. 4(10) (2017): 110–15. http://dx.doi.org/10.25140/2411-5363-2017-4(10)-110-115.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Deshmukh, Rahul P., and A. A. Ghatol. "Short Term Flood Forecasting Using Recurrent Neural Networks: A Comparative Study." International Journal of Engineering and Technology 2, no. 5 (2010): 430–34. http://dx.doi.org/10.7763/ijet.2010.v2.160.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Kalinin, Maxim, Vasiliy Krundyshev, and Evgeny Zubkov. "Estimation of applicability of modern neural network methods for preventing cyberthreats to self-organizing network infrastructures of digital economy platforms." SHS Web of Conferences 44 (2018): 00044. http://dx.doi.org/10.1051/shsconf/20184400044.

Full text
Abstract:
The problems of applying neural network methods to prevent cyberthreats to flexible self-organizing network infrastructures of digital economy platforms (vehicle ad hoc networks, wireless sensor networks, industrial IoT, "smart buildings" and "smart cities") are considered. The applicability of the classic perceptron neural network, recurrent, deep, and LSTM neural networks, and neural network ensembles under the restricting conditions of fast training and big data processing is estimated. The use of neural networks with a complex architecture, recurrent and LSTM neural network…
APA, Harvard, Vancouver, ISO, and other styles
10

Back, Andrew D., and Ah Chung Tsoi. "A Low-Sensitivity Recurrent Neural Network." Neural Computation 10, no. 1 (1998): 165–88. http://dx.doi.org/10.1162/089976698300017935.

Full text
Abstract:
The problem of high sensitivity in modeling is well known. Small perturbations in the model parameters may result in large, undesired changes in the model behavior. A number of authors have considered the issue of sensitivity in feedforward neural networks from a probabilistic perspective. Less attention has been given to such issues in recurrent neural networks. In this article, we present a new recurrent neural network architecture, that is capable of significantly improved parameter sensitivity properties compared to existing recurrent neural networks. The new recurrent neural network gener
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Recurrent neural networks"

1

Černík, Tomáš. "Neuronové sítě s proměnnou topologií." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2016. http://www.nusl.cz/ntk/nusl-255440.

Full text
Abstract:
This master's thesis deals with constructive neural networks. The first part describes neural networks and the corresponding mathematical models. It then presents basic algorithms for training neural networks and describes basic constructive algorithms and their modifications. The second part deals with the implementation details of selected algorithms and provides their comparison, including a comparison with the backpropagation algorithm.
APA, Harvard, Vancouver, ISO, and other styles
2

Nodžák, Petr. "Automatické rozpoznání akordů pomocí hlubokých neuronových sítí." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2020. http://www.nusl.cz/ntk/nusl-433596.

Full text
Abstract:
This work deals with automatic chord recognition using neural networks. The problem was separated into two subproblems: the first aims at experimentally finding the most suitable solution for an acoustic model, and the second at experimentally finding the most suitable solution for a language model. The problem was solved iteratively: first a suboptimal solution of the first subproblem was found, and then of the second. A total of 19 acoustic and 12 language models were built. Ten training datasets were created for the acoustic models and three for the language models. In total, over…
APA, Harvard, Vancouver, ISO, and other styles
3

Żbikowski, Rafal Waclaw. "Recurrent neural networks: some control aspects." Connect to electronic version, 1994. http://hdl.handle.net/1905/180.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Ahamed, Woakil Uddin. "Quantum recurrent neural networks for filtering." Thesis, University of Hull, 2009. http://hydra.hull.ac.uk/resources/hull:2411.

Full text
Abstract:
The essence of stochastic filtering is to compute the time-varying probability density function (pdf) for the measurements of the observed system. In this thesis, a filter is designed based on the principles of quantum mechanics, where the Schrödinger wave equation (SWE) plays the key part. This equation is transformed to fit into the neural network architecture. Each neuron in the network mediates a spatio-temporal field with a unified quantum activation function that aggregates the pdf information of the observed signals. The activation function is the result of the solution of the SWE. The incorpor…
APA, Harvard, Vancouver, ISO, and other styles
5

Zbikowski, Rafal Waclaw. "Recurrent neural networks : some control aspects." Thesis, University of Glasgow, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.390233.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Jacobsson, Henrik. "Rule extraction from recurrent neural networks." Thesis, University of Sheffield, 2006. http://etheses.whiterose.ac.uk/6081/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Ayoub, Issa. "Multimodal Affective Computing Using Temporal Convolutional Neural Network and Deep Convolutional Neural Networks." Thesis, Université d'Ottawa / University of Ottawa, 2019. http://hdl.handle.net/10393/39337.

Full text
Abstract:
Affective computing has gained significant attention from researchers in the last decade due to the wide variety of applications that can benefit from this technology. Often, researchers describe affect using emotional dimensions such as arousal and valence. Valence refers to the spectrum of negative to positive emotions while arousal determines the level of excitement. Describing emotions through continuous dimensions (e.g. valence and arousal) allows us to encode subtle and complex affects as opposed to discrete emotions, such as the basic six emotions: happy, anger, fear, disgust, sad and n
APA, Harvard, Vancouver, ISO, and other styles
8

Chan, Heather Y. "Gene Network Inference and Expression Prediction Using Recurrent Neural Networks and Evolutionary Algorithms." BYU ScholarsArchive, 2010. https://scholarsarchive.byu.edu/etd/2648.

Full text
Abstract:
We demonstrate the success of recurrent neural networks in gene network inference and expression prediction using a hybrid of particle swarm optimization and differential evolution to overcome the classic obstacle of local minima in training recurrent neural networks. We also provide an improved validation framework for the evaluation of genetic network modeling systems that will result in better generalization and long-term prediction capability. Success in the modeling of gene regulation and prediction of gene expression will lead to more rapid discovery and development of therapeutic medici
APA, Harvard, Vancouver, ISO, and other styles
9

Bonato, Tommaso. "Time Series Predictions With Recurrent Neural Networks." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2018.

Find full text
Abstract:
The main objective of this thesis is to study how machine learning algorithms, and in particular LSTM (Long Short-Term Memory) neural networks, can be used to predict the future values of a regular time series such as the sine and cosine functions. A time series is defined as a sequence of observations s_t ordered in time. We also attempt to apply the same principles to predict the values of a time series built from three years of sales data for a cosmetic product.
APA, Harvard, Vancouver, ISO, and other styles
10

Silfa, Franyell. "Energy-efficient architectures for recurrent neural networks." Doctoral thesis, Universitat Politècnica de Catalunya, 2021. http://hdl.handle.net/10803/671448.

Full text
Abstract:
Deep Learning algorithms have been remarkably successful in applications such as Automatic Speech Recognition and Machine Translation. Thus, these kinds of applications are ubiquitous in our lives and are found in a plethora of devices. These algorithms are composed of Deep Neural Networks (DNNs), such as Convolutional Neural Networks and Recurrent Neural Networks (RNNs), which have a large number of parameters and require a large amount of computations. Hence, the evaluation of DNNs is challenging due to their large memory and power requirements. RNNs are employed to solve sequence to sequ
APA, Harvard, Vancouver, ISO, and other styles
More sources

Books on the topic "Recurrent neural networks"

1

Hu, Xiaolin, and P. Balasubramaniam. Recurrent neural networks. InTech, 2008.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Hammer, Barbara. Learning with recurrent neural networks. Springer London, 2000. http://dx.doi.org/10.1007/bfb0110016.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

ElHevnawi, Mahmoud, and Mohamed Mysara. Recurrent neural networks and soft computing. InTech, 2012.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Tan, K. K., ed. Convergence analysis of recurrent neural networks. Kluwer Academic Publishers, 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Yi, Zhang, and K. K. Tan. Convergence Analysis of Recurrent Neural Networks. Springer US, 2004. http://dx.doi.org/10.1007/978-1-4757-3819-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Graves, Alex. Supervised Sequence Labelling with Recurrent Neural Networks. Springer Berlin Heidelberg, 2012.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Graves, Alex. Supervised Sequence Labelling with Recurrent Neural Networks. Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-24797-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Liu, Derong, ed. Qualitative analysis and synthesis of recurrent neural networks. Marcel Dekker, Inc., 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Chen, Wen. Recurrent neural networks applied to robotic motion control. National Library of Canada, 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Rovithakis, George A., and Manolis A. Christodoulou. Adaptive Control with Recurrent High-order Neural Networks. Springer London, 2000. http://dx.doi.org/10.1007/978-1-4471-0785-9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Recurrent neural networks"

1

da Silva, Ivan Nunes, Danilo Hernane Spatti, Rogerio Andrade Flauzino, Luisa Helena Bartocci Liboni, and Silas Franco dos Reis Alves. "Recurrent Hopfield Networks." In Artificial Neural Networks. Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-43162-8_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Du, Ke-Lin, and M. N. S. Swamy. "Recurrent Neural Networks." In Neural Networks and Statistical Learning. Springer London, 2019. http://dx.doi.org/10.1007/978-1-4471-7452-3_12.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Yalçın, Orhan Gazi. "Recurrent Neural Networks." In Applied Neural Networks with TensorFlow 2. Apress, 2020. http://dx.doi.org/10.1007/978-1-4842-6513-0_8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Calin, Ovidiu. "Recurrent Neural Networks." In Deep Learning Architectures. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-36721-3_17.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Caterini, Anthony L., and Dong Eui Chang. "Recurrent Neural Networks." In Deep Neural Networks in a Mathematical Framework. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-75304-1_5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Kamath, Uday, John Liu, and James Whitaker. "Recurrent Neural Networks." In Deep Learning for NLP and Speech Recognition. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-14596-5_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Marhon, Sajid A., Christopher J. F. Cameron, and Stefan C. Kremer. "Recurrent Neural Networks." In Intelligent Systems Reference Library. Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-36657-4_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Aggarwal, Charu C. "Recurrent Neural Networks." In Neural Networks and Deep Learning. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-94463-0_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Skansi, Sandro. "Recurrent Neural Networks." In Undergraduate Topics in Computer Science. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-73004-2_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Ketkar, Nikhil. "Recurrent Neural Networks." In Deep Learning with Python. Apress, 2017. http://dx.doi.org/10.1007/978-1-4842-2766-4_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Recurrent neural networks"

1

Mat Darus, I. Z., M. O. Tokhi, and S. Z. Mohd. Hashim. "Non-Linear System Identification of Flexible Plate Structures Using Neural Networks." In ASME 7th Biennial Conference on Engineering Systems Design and Analysis. ASMEDC, 2004. http://dx.doi.org/10.1115/esda2004-58200.

Full text
Abstract:
This paper investigates the utilisation of feedforward and recurrent neural networks for dynamic modelling of a flexible plate structure. Neuro-modelling techniques are used for non-parametric identification of the flexible plate structure based on one-step-ahead prediction. A multi layer perceptron (MLP) and Elman neural networks are designed to characterise the dynamic behaviour of the flexible plate. Results of the modelling techniques are validated through a range of tests including input/output mapping, training and test validation, mean-squared error and correlation tests. Results are pr
APA, Harvard, Vancouver, ISO, and other styles
2

Gao, Yang, Hong Yang, Peng Zhang, Chuan Zhou, and Yue Hu. "Graph Neural Architecture Search." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/195.

Full text
Abstract:
Graph neural networks (GNNs) emerged recently as a powerful tool for analyzing non-Euclidean data such as social network data. Despite their success, the design of graph neural networks requires heavy manual work and domain knowledge. In this paper, we present a graph neural architecture search method (GraphNAS) that enables automatic design of the best graph neural architecture based on reinforcement learning. Specifically, GraphNAS uses a recurrent network to generate variable-length strings that describe the architectures of graph neural networks, and trains the recurrent network with polic
APA, Harvard, Vancouver, ISO, and other styles
3

Hupkes, Dieuwke, and Willem Zuidema. "Visualisation and 'Diagnostic Classifiers' Reveal how Recurrent and Recursive Neural Networks Process Hierarchical Structure (Extended Abstract)." In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/796.

Full text
Abstract:
In this paper, we investigate how recurrent neural networks can learn and process languages with hierarchical, compositional semantics. To this end, we define the artificial task of processing nested arithmetic expressions, and study whether different types of neural networks can learn to compute their meaning. We find that simple recurrent networks cannot find a generalising solution to this task, but gated recurrent neural networks perform surprisingly well: networks learn to predict the outcome of the arithmetic expressions with high accuracy, although performance deteriorates somewhat with
APA, Harvard, Vancouver, ISO, and other styles
4

Mohammadi, Rasul, Esmaeil Naderi, Khashayar Khorasani, and Shahin Hashtrudi-Zad. "Fault Diagnosis of Gas Turbine Engines by Using Dynamic Neural Networks." In ASME Turbo Expo 2010: Power for Land, Sea, and Air. ASMEDC, 2010. http://dx.doi.org/10.1115/gt2010-23586.

Full text
Abstract:
This paper presents a novel methodology for fault detection in gas turbine engines based on the concept of dynamic neural networks. The neural network structure belongs to the class of locally recurrent globally feed-forward networks. The architecture of the network is similar to the feed-forward multi-layer perceptron with the difference that the processing units include dynamic characteristics. The dynamics present in these networks make them a powerful tool useful for identification of nonlinear systems. The dynamic neural network architecture that is described in this paper is used for fau
APA, Harvard, Vancouver, ISO, and other styles
5

Freitag, Steffen, and Wolfgang Graf. "FE Analysis of Structures With Uncertain Model-Free Material Descriptions." In ASME 2011 International Mechanical Engineering Congress and Exposition. ASMEDC, 2011. http://dx.doi.org/10.1115/imece2011-63357.

Full text
Abstract:
An artificial neural network concept is presented, which can be used to identify uncertain time-dependent material behavior. Dependencies between strain and stress processes obtained from uncertain results of experimental investigations are described by recurrent neural networks for fuzzy data. Direct and indirect approaches are presented for the computation of fuzzy stress and strain processes from experimental results. The identification of uncertain stress-strain-time dependencies with recurrent neural networks for fuzzy data is realized by a network training utilizing α-cuts and an α-level
APA, Harvard, Vancouver, ISO, and other styles
6

Zomorodian, Roozbeh, Hiwa Khaledi, and Mohammad Bagher Ghofrani. "Static and Dynamic Neural Networks for Simulation and Optimization of Cogeneration Systems." In ASME Turbo Expo 2006: Power for Land, Sea, and Air. ASMEDC, 2006. http://dx.doi.org/10.1115/gt2006-90236.

Full text
Abstract:
In this paper, the application of neural networks for the simulation and optimization of cogeneration systems is presented. The CGAM problem, a benchmark in cogeneration systems, is chosen as a case study. The thermodynamic model includes precise modeling of the whole plant. For simulation of the steady-state behavior, a static neural network is applied. Then, using a dynamic neural network, the plant is optimized thermodynamically. A multi-layer feed-forward neural network is chosen as the static net and a recurrent neural network as the dynamic net. The steady-state behavior of the CGAM problem is simulated by MF…
APA, Harvard, Vancouver, ISO, and other styles
7

Bukka, Sandeep R., Allan Ross Magee, and Rajeev K. Jaiman. "Deep Convolutional Recurrent Autoencoders for Flow Field Prediction." In ASME 2020 39th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/omae2020-18556.

Full text
Abstract:
Abstract In this paper, an end-to-end nonlinear model reduction methodology is presented based on the convolutional recurrent autoencoder networks. The methodology is developed in the context of overall data-driven reduced order model framework proposed in the paper. The basic idea behind the methodology is to obtain the low dimensional representations via convolutional neural networks and evolve these low dimensional features via recurrent neural networks in time domain. The high dimensional representations are constructed from the evolved low dimensional features via transpose convolutional
APA, Harvard, Vancouver, ISO, and other styles
8

Liu, Ziqian, and Nirwan Ansari. "Control of Recurrent Neural Networks Using Differential Minimax Game: The Stochastic Case." In ASME 2010 Dynamic Systems and Control Conference. ASMEDC, 2010. http://dx.doi.org/10.1115/dscc2010-4006.

Full text
Abstract:
As a continuation of our study, this paper extends our research results of optimality-oriented stabilization from deterministic recurrent neural networks to stochastic recurrent neural networks, and presents a new approach to achieve optimally stochastic input-to-state stabilization in probability for stochastic recurrent neural networks driven by noise of unknown covariance. This approach is developed by using stochastic differential minimax game, Hamilton-Jacobi-Isaacs (HJI) equation, inverse optimality, and Lyapunov technique. A numerical example is given to demonstrate the effectiveness of
APA, Harvard, Vancouver, ISO, and other styles
9

Heng, Hongjun, and Renjie Li. "Double Multi-Head Attention-Based Capsule Network for Relation Classification." In 11th International Conference on Computer Science and Information Technology (CCSIT 2021). AIRCC Publishing Corporation, 2021. http://dx.doi.org/10.5121/csit.2021.110711.

Full text
Abstract:
Semantic relation classification is an important task in the field of nature language processing. The existing neural network relation classification models introduce attention mechanism to increase the importance of significant features, but part of these attention models only have one head which is not enough to capture more distinctive fine-grained features. Models based on RNN (Recurrent Neural Network) usually use single-layer structure and have limited feature extraction capability. Current RNN-based capsule networks have problem of improper handling of noise which increase complexity of
APA, Harvard, Vancouver, ISO, and other styles
10

Kieu, Tung, Bin Yang, Chenjuan Guo, and Christian S. Jensen. "Outlier Detection for Time Series with Recurrent Autoencoder Ensembles." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/378.

Full text
Abstract:
We propose two solutions to outlier detection in time series based on recurrent autoencoder ensembles. The solutions exploit autoencoders built using sparsely-connected recurrent neural networks (S-RNNs). Such networks make it possible to generate multiple autoencoders with different neural network connection structures. The two solutions are ensemble frameworks, specifically an independent framework and a shared framework, both of which combine multiple S-RNN based autoencoders to enable outlier detection. This ensemble-based approach aims to reduce the effects of some autoencoders being over
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Recurrent neural networks"

1

Brabel, Michael J. Basin Sculpting a Hybrid Recurrent Feedforward Neural Network. Defense Technical Information Center, 1998. http://dx.doi.org/10.21236/ada336386.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Pearlmutter, Barak A. Learning State Space Trajectories in Recurrent Neural Networks: A preliminary Report. Defense Technical Information Center, 1988. http://dx.doi.org/10.21236/ada219114.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Talathi, S. S. Deep Recurrent Neural Networks for seizure detection and early seizure detection systems. Office of Scientific and Technical Information (OSTI), 2017. http://dx.doi.org/10.2172/1366924.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Mathia, Karl. Solutions of linear equations and a class of nonlinear equations using recurrent neural networks. Portland State University Library, 2000. http://dx.doi.org/10.15760/etd.1354.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Bodruzzaman, M., and M. A. Essawy. Iterative prediction of chaotic time series using a recurrent neural network. Quarterly progress report, January 1, 1995--March 31, 1995. Office of Scientific and Technical Information (OSTI), 1996. http://dx.doi.org/10.2172/283610.

Full text
APA, Harvard, Vancouver, ISO, and other styles