Academic literature on the topic "Continuous time recurrent neural network"

Create an accurate reference in APA, MLA, Chicago, Harvard, and many other styles

Choose a source:

Browse thematic lists of journal articles, books, theses, conference reports, and other academic sources on the topic "Continuous time recurrent neural network".

Next to every source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic citation for the selected source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever this information is included in the metadata.

Journal articles on the topic "Continuous time recurrent neural network"

1

Osipov, Vasiliy, and Dmitriy Miloserdov. "Neural network event forecasting for robots with continuous training." Information and Control Systems, no. 5 (October 20, 2020): 33–42. http://dx.doi.org/10.31799/1684-8853-2020-5-33-42.

Abstract:
Introduction: High hopes for a significant expansion of human capabilities in various fields of activity are pinned on the creation and use of highly intelligent robots. To achieve this level of robot intelligence, it is necessary to successfully solve the problems of predicting the external environment and the state of the robots themselves. Solutions based on recurrent neural networks with controlled elements are promising neural network forecasting systems. Purpose: Search for appropriate neural network structures for predicting events. Development of approaches to controlling the associati
2

Gavaldà, Ricard, and Hava T. Siegelmann. "Discontinuities in Recurrent Neural Networks." Neural Computation 11, no. 3 (1999): 715–45. http://dx.doi.org/10.1162/089976699300016638.

Abstract:
This article studies the computational power of various discontinuous real computational models that are based on the classical analog recurrent neural network (ARNN). This ARNN consists of finite number of neurons; each neuron computes a polynomial net function and a sigmoid-like continuous activation function. We introduce arithmetic networks as ARNN augmented with a few simple discontinuous (e.g., threshold or zero test) neurons. We argue that even with weights restricted to polynomial time computable reals, arithmetic networks are able to compute arbitrarily complex recursive functions. We
3

Cauwenberghs, G. "An analog VLSI recurrent neural network learning a continuous-time trajectory." IEEE Transactions on Neural Networks 7, no. 2 (1996): 346–61. http://dx.doi.org/10.1109/72.485671.

4

Sontag, Eduardo, and Héctor Sussmann. "Complete controllability of continuous-time recurrent neural networks." Systems & Control Letters 30, no. 4 (1997): 177–83. http://dx.doi.org/10.1016/s0167-6911(97)00002-9.

5

Das, S., and O. Olurotimi. "Noisy recurrent neural networks: the continuous-time case." IEEE Transactions on Neural Networks 9, no. 5 (1998): 913–36. http://dx.doi.org/10.1109/72.712164.

6

Yu, Jiali, Huajin Tang, and Haizhou Li. "Continuous attractors of discrete-time recurrent neural networks." Neural Computing and Applications 23, no. 1 (2012): 89–96. http://dx.doi.org/10.1007/s00521-012-0975-5.

7

Wang, Xin, Arun Jagota, Fernanda Botelho, and Max Garzon. "Absence of Cycles in Symmetric Neural Networks." Neural Computation 10, no. 5 (1998): 1235–49. http://dx.doi.org/10.1162/089976698300017430.

Abstract:
For a given recurrent neural network, a discrete-time model may have asymptotic dynamics different from the one of a related continuous-time model. In this article, we consider a discrete-time model that discretizes the continuous-time leaky integrator model and study its parallel, sequential, block-sequential, and distributed dynamics for symmetric networks. We provide sufficient (and in many cases necessary) conditions for the discretized model to have the same cycle-free dynamics of the corresponding continuous-time model in symmetric networks.
8

Sato, Shozo, and Kazutoshi Gohara. "Fractal Transition in Continuous Recurrent Neural Networks." International Journal of Bifurcation and Chaos 11, no. 02 (2001): 421–34. http://dx.doi.org/10.1142/s0218127401002158.

Abstract:
A theory for continuous dynamical systems stochastically excited by temporal external inputs has been presented. The theory suggests that the dynamics of continuous-time recurrent neural networks (RNNs) is generally characterized by a set of continuous trajectories with a fractal-like structure in hyper-cylindrical phase space. We refer to this dynamics as the fractal transition. In this paper, three types of numerical experiments are discussed in order to investigate the learning process and noise effects in terms of the fractal transition. First, to analyze how an RNN learns desired input–ou
9

Tsung, Fu-Sheng, and Garrison W. Cottrell. "Learning in Recurrent Finite Difference Networks." International Journal of Neural Systems 06, no. 03 (1995): 249–56. http://dx.doi.org/10.1142/s0129065795000184.

Abstract:
A recurrent learning algorithm based on a finite difference discretization of continuous equations for neural networks is derived. This algorithm has the simplicity of discrete algorithms while retaining some essential characteristics of the continuous equations. In discrete networks learning smooth oscillations is difficult if the period of oscillation is too large. The network either grossly distorts the waveforms or is unable to learn at all. We show how the finite difference formulation can explain and overcome this problem. Formulas for learning time constants and time delays in this fram
10

Wang, Jun, and Guang Wu. "A multilayer recurrent neural network for solving continuous-time algebraic Riccati equations." Neural Networks 11, no. 5 (1998): 939–50. http://dx.doi.org/10.1016/s0893-6080(98)00034-3.

More sources

Theses on the topic "Continuous time recurrent neural network"

1

Vigraham, Saranyan A. "An Analog Evolvable Hardware Device for Active Control." Wright State University / OhioLINK, 2007. http://rave.ohiolink.edu/etdc/view?acc_num=wright1195506953.

2

Kramer, Gregory Robert. "An analysis of neutral drift's effect on the evolution of a CTRNN locomotion controller with noisy fitness evaluation." Wright State University / OhioLINK, 2007. http://rave.ohiolink.edu/etdc/view?acc_num=wright1182196651.

3

Van, Lierde Boris. "Developing Box-Pushing Behaviours Using Evolutionary Robotics." Thesis, Högskolan Dalarna, Datateknik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:du-6250.

Abstract:
The context of this report and the IRIDIA laboratory are described in the preface. Evolutionary Robotics and the box-pushing task are presented in the introduction. The building of a test system supporting Evolutionary Robotics experiments is then detailed. This system is made of a robot simulator and a Genetic Algorithm. It is used to explore the possibility of evolving box-pushing behaviours. The bootstrapping problem is explained, and a novel approach for dealing with it is proposed, with results presented. Finally, ideas for extending this approach are presented in the conclusion.
4

Al, Seyab Rihab Khalid Shakir. "Nonlinear model predictive control using automatic differentiation." Thesis, Cranfield University, 2006. http://hdl.handle.net/1826/1491.

Abstract:
Although nonlinear model predictive control (NMPC) might be the best choice for a nonlinear plant, it is still not widely used. This is mainly due to the computational burden associated with solving online a set of nonlinear differential equations and a nonlinear dynamic optimization problem in real time. This thesis is concerned with strategies aimed at reducing the computational burden involved in different stages of the NMPC such as optimization problem, state estimation, and nonlinear model identification. A major part of the computational burden comes from function and derivative evaluati
5

Moradi, Mahdi. "TIME SERIES FORECASTING USING DUAL-STAGE ATTENTION-BASED RECURRENT NEURAL NETWORK." OpenSIUC, 2020. https://opensiuc.lib.siu.edu/theses/2701.

Abstract:
AN ABSTRACT OF THE RESEARCH PAPER OF Mahdi Moradi, for the Master of Science degree in Computer Science, presented on April 1, 2020, at Southern Illinois University Carbondale. TITLE: TIME SERIES FORECASTING USING DUAL-STAGE ATTENTION-BASED RECURRENT NEURAL NETWORK. MAJOR PROFESSOR: Dr. Banafsheh Rekabdar
6

Beneš, Karel. "Recurrent Neural Networks with Elastic Time Context in Language Modeling." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2016. http://www.nusl.cz/ntk/nusl-255481.

Abstract:
This report describes experimental work on statistical language modeling with recurrent neural networks (RNNs). A thorough survey of previously published work is given, followed by a description of the algorithms for training the corresponding models. Most of the described techniques were implemented in a custom tool based on the Theano library. An extensive set of experiments was carried out with the Simple Recurrent Network (SRN) model, revealing some of its previously unpublished properties. In static evaluation, the achieved results were roughly 2.7% worse than the best published
7

Battista, Aldo. "Low-dimensional continuous attractors in recurrent neural networks : from statistical physics to computational neuroscience." Thesis, Université Paris sciences et lettres, 2020. http://www.theses.fr/2020UPSLE012.

Abstract:
How sensory information is encoded and processed by neural circuits is a central question in computational neuroscience. In many brain regions, the activity of neurons is found to depend strongly on certain continuous sensory correlates; examples include simple cells in area V1 of the visual cortex, which code for the orientation of a bar presented to the retina, and head-direction cells in the subiculum or place cells in the hippocampus, whose activities depend, respectively, on the orientation of
8

Chan, Heather Y. "Gene Network Inference and Expression Prediction Using Recurrent Neural Networks and Evolutionary Algorithms." BYU ScholarsArchive, 2010. https://scholarsarchive.byu.edu/etd/2648.

Abstract:
We demonstrate the success of recurrent neural networks in gene network inference and expression prediction using a hybrid of particle swarm optimization and differential evolution to overcome the classic obstacle of local minima in training recurrent neural networks. We also provide an improved validation framework for the evaluation of genetic network modeling systems that will result in better generalization and long-term prediction capability. Success in the modeling of gene regulation and prediction of gene expression will lead to more rapid discovery and development of therapeutic medici
9

Pan, YaDung. "Fuzzy adaptive recurrent counterpropagation neural networks: A neural network architecture for qualitative modeling and real-time simulation of dynamic processes." Diss., The University of Arizona, 1995. http://hdl.handle.net/10150/187101.

Abstract:
In this dissertation, a new artificial neural network (ANN) architecture called fuzzy adaptive recurrent counterpropagation neural network (FARCNN) is presented. FARCNNs can be directly synthesized from a set of training data, making system behavioral learning extremely fast. FARCNNs can be applied directly and effectively to model both static and dynamic system behavior based on observed input/output behavioral patterns alone without need of knowing anything about the internal structure of the system under study. The FARCNN architecture is derived from the methodology of fuzzy inductive reaso
10

Yang, Jianxiang. "Time-delay neural network systems for stop and unstop phoneme discrimination in continuous speech signal." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1996. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp04/MQ31661.pdf.

More sources

Books on the topic "Continuous time recurrent neural network"

1

Unger, Herwig, and Wolfgang A. Halang, eds. Autonomous Systems 2016. VDI Verlag, 2016. http://dx.doi.org/10.51202/9783186848109.

Abstract:
To meet the expectations raised by the terms Industrie 4.0, Industrial Internet and Internet of Things, real innovations are necessary, which can be brought about by information processing systems working autonomously. Owing to their growing complexity and their embedding in complex environments, their design becomes increasingly critical. Thus, the topics addressed in this book span from verification and validation of safety-related control software and suitable hardware designed for verifiability to be deployed in embedded systems over approaches to suppress electromagnetic interferences to

Book chapters on the topic "Continuous time recurrent neural network"

1

Yi, Zhang, and K. K. Tan. "Other Models of Continuous Time Recurrent Neural Networks." In Network Theory and Applications. Springer US, 2004. http://dx.doi.org/10.1007/978-1-4757-3819-3_7.

2

Chen, Joseph, and Stefan Wermter. "Continuous Time Recurrent Neural Networks for Grammatical Induction." In ICANN 98. Springer London, 1998. http://dx.doi.org/10.1007/978-1-4471-1599-1_56.

3

Zhao, Weirui, and Huanshui Zhang. "Global Convergence of Continuous-Time Recurrent Neural Networks with Delays." In Advances in Neural Networks - ISNN 2006. Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11759966_16.

4

Shiva, Ashraya Samba, and Amir Hussain. "Continuous Time Recurrent Neural Network Model of Recurrent Collaterals in the Hippocampus CA3 Region." In Advances in Brain Inspired Cognitive Systems. Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-49685-6_31.

5

Fu, Chaojin, and Zhongsheng Wang. "Stability Analysis of a General Class of Continuous-Time Recurrent Neural Networks." In Advances in Neural Networks – ISNN 2009. Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-01507-6_40.

6

Liao, Xiaoxin, and Zhigang Zeng. "Global Exponential Stability in Lagrange Sense of Continuous-Time Recurrent Neural Networks." In Advances in Neural Networks - ISNN 2006. Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11759966_17.

7

Nakamura, Yuichi, and Masahiro Nakagawa. "Approximation Capability of Continuous Time Recurrent Neural Networks for Non-autonomous Dynamical Systems." In Artificial Neural Networks – ICANN 2009. Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04277-5_60.

8

Yu, Zhibin, and Minho Lee. "Continuous Motion Recognition Using Multiple Time Constant Recurrent Neural Network with a Deep Network Model." In Intelligent Data Engineering and Automated Learning – IDEAL 2013. Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-41278-3_15.

9

Bown, Oliver, and Sebastian Lexer. "Continuous-Time Recurrent Neural Networks for Generative and Interactive Musical Performance." In Lecture Notes in Computer Science. Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11732242_62.

10

Marín, F. J., and F. Sandoval. "Genetic synthesis of discrete-time recurrent neural network." In New Trends in Neural Computation. Springer Berlin Heidelberg, 1993. http://dx.doi.org/10.1007/3-540-56798-4_144.


Conference proceedings on the topic "Continuous time recurrent neural network"

1

Costea, Ruxandra L., and Corneliu A. Marinov. "Continuous time recurrent neural network designed for KWTA operation." In 2011 International Joint Conference on Neural Networks (IJCNN 2011 - San Jose). IEEE, 2011. http://dx.doi.org/10.1109/ijcnn.2011.6033204.

2

Heinrich, Stefan, Tayfun Alpay, and Stefan Wermter. "Adaptive and Variational Continuous Time Recurrent Neural Networks." In 2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob). IEEE, 2018. http://dx.doi.org/10.1109/devlrn.2018.8761019.

3

Liu, Qingshan, and Yan Zhao. "A continuous-time recurrent neural network for real-time support vector regression." In 2013 IEEE Symposium on Computational Intelligence in Control and Automation (CICA). IEEE, 2013. http://dx.doi.org/10.1109/cica.2013.6611683.

4

Ventresca, M., and B. Ombuki. "Search Space Analysis of Recurrent Spiking and Continuous-time Neural Networks." In The 2006 IEEE International Joint Conference on Neural Network Proceedings. IEEE, 2006. http://dx.doi.org/10.1109/ijcnn.2006.247076.

5

Bailador, Gonzalo, Daniel Roggen, Gerhard Tröster, and Gracián Triviño. "Real time gesture recognition using continuous time recurrent neural networks." In 2nd International ICST Conference on Body Area Networks. ICST, 2007. http://dx.doi.org/10.4108/bodynets.2007.149.

6

Mostafa, Mohamad, Werner G. Teich, and Jurgen Lindner. "Vector equalization based on continuous-time recurrent neural networks." In 2012 6th International Conference on Signal Processing and Communication Systems (ICSPCS 2012). IEEE, 2012. http://dx.doi.org/10.1109/icspcs.2012.6508013.

7

Suykens, J. A. K., J. Vandewalle, and B. De Moor. "Nonlinear H∞ control for continuous-time recurrent neural networks." In 1997 European Control Conference (ECC). IEEE, 1997. http://dx.doi.org/10.23919/ecc.1997.7082137.

8

Miguel, Cesar Gomes, Carolina Feher da Silva, and Marcio Lobo Netto. "Structural and Parametric Evolution of Continuous-Time Recurrent Neural Networks." In 2008 10th Brazilian Symposium on Neural Networks (SBRN). IEEE, 2008. http://dx.doi.org/10.1109/sbrn.2008.12.

9

Tang, Huajin, Boon Hwa Tan, and Rui Yan. "Robot-to-human handover with obstacle avoidance via continuous time Recurrent Neural Network." In 2016 IEEE Congress on Evolutionary Computation (CEC). IEEE, 2016. http://dx.doi.org/10.1109/cec.2016.7743924.

10

Yan, Zheng, Xinyi Le, Shiping Wen, and Jie Lu. "A Continuous-Time Recurrent Neural Network for Sparse Signal Reconstruction Via ℓ1 Minimization." In 2018 Eighth International Conference on Information Science and Technology (ICIST). IEEE, 2018. http://dx.doi.org/10.1109/icist.2018.8426132.


Organization reports on the topic "Continuous time recurrent neural network"

1

Bodruzzaman, M., and M. A. Essawy. Iterative prediction of chaotic time series using a recurrent neural network. Quarterly progress report, January 1, 1995--March 31, 1995. Office of Scientific and Technical Information (OSTI), 1996. http://dx.doi.org/10.2172/283610.
