Selected scientific literature on the topic "Stochastic neural networks"

Cite a source in APA, MLA, Chicago, Harvard and many other styles

Browse the list of current articles, books, theses, conference proceedings and other scholarly sources on the topic "Stochastic neural networks".

Next to every source in the reference list there is an "Add to bibliography" button. Click it and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, when one is available in the metadata.

Journal articles on the topic "Stochastic neural networks"

1

Wong, Eugene. "Stochastic neural networks." Algorithmica 6, no. 1-6 (June 1991): 466–78. http://dx.doi.org/10.1007/bf01759054.

2

Zhou, Wuneng, Xueqing Yang, Jun Yang, and Jun Zhou. "Stochastic Synchronization of Neutral-Type Neural Networks with Multidelays Based on M-Matrix." Discrete Dynamics in Nature and Society 2015 (2015): 1–8. http://dx.doi.org/10.1155/2015/826810.

Abstract:
The problem of stochastic synchronization of neutral-type neural networks with multiple delays based on the M-matrix is investigated. First, we design a control law for stochastic synchronization of the neutral-type, multiple time-delay neural network. Second, by making use of a Lyapunov functional and the M-matrix method, we obtain a criterion under which the drive and response neutral-type multiple time-delay neural networks with stochastic disturbance and Markovian switching achieve stochastic synchronization. The synchronization condition is expressed as a linear matrix inequality which can be easily solved by MATLAB. Finally, we introduce a numerical example to illustrate the effectiveness of the method and of the result obtained in this paper.
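The drive-response synchronization idea in this abstract can be illustrated with a minimal simulation. The sketch below is not the paper's neutral-type model with Markovian switching: it uses a single scalar delayed neural state, an assumed linear feedback gain `k`, and Euler-Maruyama integration, purely to show how the synchronization error contracts under feedback.

```python
import math, random

random.seed(0)

def simulate(k, dt=0.001, steps=4000, tau=0.05, a=1.5, sigma=0.05):
    """Euler-Maruyama for a scalar delayed drive-response pair.

    Drive:    dx = (-x + a*tanh(x(t - tau))) dt
    Response: dy = (-y + a*tanh(y(t - tau)) + u) dt + sigma dW,
    with the synchronizing feedback control u = -k * (y - x).
    Returns the final synchronization error |y - x|.
    """
    d = int(tau / dt)                  # delay expressed in steps
    xs = [1.0] * (d + 1)               # constant initial history for the drive
    ys = [-1.0] * (d + 1)              # a different history for the response
    for _ in range(steps):
        x, y = xs[-1], ys[-1]
        xd, yd = xs[-1 - d], ys[-1 - d]
        u = -k * (y - x)
        dW = random.gauss(0.0, math.sqrt(dt))
        xs.append(x + (-x + a * math.tanh(xd)) * dt)
        ys.append(y + (-y + a * math.tanh(yd) + u) * dt + sigma * dW)
    return abs(ys[-1] - xs[-1])

# With feedback the error contracts despite the noise; without it, the two
# trajectories settle into different attractors and stay far apart.
err_fb = simulate(k=10.0)
err_open = simulate(k=0.0)
```

All constants here (gain, delay, noise intensity) are invented for the illustration; the paper's criterion instead gives verifiable LMI conditions on the true system matrices.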
3

Reddy, BhanuTeja, and Usha J.C. "Prediction of Stock Market using Stochastic Neural Networks." International Journal of Innovative Research in Computer Science & Technology 7, no. 5 (September 2019): 128–38. http://dx.doi.org/10.21276/ijircst.2019.7.5.1.

4

Sakthivel, Rathinasamy, R. Samidurai, and S. Marshal Anthoni. "Exponential Stability for Stochastic Neural Networks of Neutral Type with Impulsive Effects." Modern Physics Letters B 24, no. 11 (May 10, 2010): 1099–110. http://dx.doi.org/10.1142/s0217984910023141.

Abstract:
This paper is concerned with the exponential stability of stochastic neural networks of neutral type with impulsive effects. By employing the Lyapunov functional and stochastic analysis, a new stability criterion for the stochastic neural network is derived in terms of linear matrix inequality. A numerical example is provided to show the effectiveness and applicability of the obtained result.
5

Wu, Chunmei, Junhao Hu, and Yan Li. "Robustness Analysis of Hybrid Stochastic Neural Networks with Neutral Terms and Time-Varying Delays." Discrete Dynamics in Nature and Society 2015 (2015): 1–12. http://dx.doi.org/10.1155/2015/278571.

Abstract:
We analyze the robustness of global exponential stability of hybrid stochastic neural networks subject to neutral terms and time-varying delays simultaneously. Given globally exponentially stable hybrid stochastic neural networks, we characterize the upper bounds of the contraction coefficients of neutral terms and time-varying delays by using a transcendental equation. Moreover, we prove theoretically that, for any globally exponentially stable hybrid stochastic neural network, if the additive neutral terms and time-varying delays are smaller than the upper bounds arrived at, then the perturbed neural networks are guaranteed to also be globally exponentially stable. Finally, a numerical simulation example is given to illustrate the presented criteria.
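The "transcendental equation" that characterizes the admissible bounds is specific to this paper, but the numerical pattern is generic: the bound is the root of a scalar equation mixing polynomial and exponential terms, which bisection finds easily. The equation below and its constants are invented purely for illustration.

```python
import math

def bisect(f, lo, hi, tol=1e-12):
    """Find a root of f in [lo, hi] by bisection; f(lo) and f(hi) must bracket it."""
    flo = f(lo)
    assert flo * f(hi) < 0, "endpoints must bracket a root"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if flo * f(mid) <= 0:
            hi = mid
        else:
            lo, flo = mid, f(mid)
    return 0.5 * (lo + hi)

# Hypothetical transcendental condition for an admissible perturbation bound:
# find tau* > 0 with  L * tau * exp(lmbda * tau) = c  (all constants made up).
L, lmbda, c = 2.0, 1.0, 0.5
tau_star = bisect(lambda t: L * t * math.exp(lmbda * t) - c, 0.0, 1.0)
```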
6

Gao, Zhan, Elvin Isufi, and Alejandro Ribeiro. "Stochastic Graph Neural Networks." IEEE Transactions on Signal Processing 69 (2021): 4428–43. http://dx.doi.org/10.1109/tsp.2021.3092336.

7

Hu, Shigeng, Xiaoxin Liao, and Xuerong Mao. "Stochastic Hopfield neural networks." Journal of Physics A: Mathematical and General 36, no. 9 (February 19, 2003): 2235–49. http://dx.doi.org/10.1088/0305-4470/36/9/303.

8

Yu, Tianyuan, Yongxin Yang, Da Li, Timothy Hospedales, and Tao Xiang. "Simple and Effective Stochastic Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 4 (May 18, 2021): 3252–60. http://dx.doi.org/10.1609/aaai.v35i4.16436.

Abstract:
Stochastic neural networks (SNNs) are currently topical, with several paradigms being actively investigated, including dropout, Bayesian neural networks, the variational information bottleneck (VIB) and noise-regularized learning. These neural network variants impact several major considerations, including generalization, network compression, robustness against adversarial attack and label noise, and model calibration. However, many existing networks are complicated and expensive to train, and/or only address one or two of these practical considerations. In this paper we propose a simple and effective stochastic neural network (SE-SNN) architecture for discriminative learning by directly modeling activation uncertainty and encouraging high activation variability. Compared to existing SNNs, our SE-SNN is simpler to implement and faster to train, and produces state-of-the-art results on network compression by pruning, adversarial defense, learning with label noise, and model calibration.
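As a hedged illustration of the general idea of "directly modeling activation uncertainty" (not the paper's SE-SNN architecture), the toy layer below outputs a Gaussian mean and log-variance per unit and samples the activation with the reparameterization trick; the weights and dimensions are made up.

```python
import math, random

random.seed(1)

def stochastic_layer(x, w_mu, w_logvar):
    """Toy stochastic layer: each unit's activation is drawn from a Gaussian
    whose mean and log-variance are linear functions of the input.  The
    reparameterization a = mu + sigma * eps keeps sampling differentiable."""
    out = []
    for mu_row, lv_row in zip(w_mu, w_logvar):
        mu = sum(w * xi for w, xi in zip(mu_row, x))
        logvar = sum(w * xi for w, xi in zip(lv_row, x))
        eps = random.gauss(0.0, 1.0)
        out.append(mu + math.exp(0.5 * logvar) * eps)
    return out

# Made-up weights: unit 0 tracks x[0], unit 1 tracks x[1]; both get a
# modest input-dependent variance.
x = [0.5, -1.0]
w_mu = [[1.0, 0.0], [0.0, 1.0]]
w_logvar = [[-4.0, 0.0], [-4.0, 0.0]]
a1 = stochastic_layer(x, w_mu, w_logvar)
a2 = stochastic_layer(x, w_mu, w_logvar)   # a fresh draw: the forward pass is stochastic
```

Two forward passes on the same input give different activations, which is the property the paradigms listed above (dropout, VIB, Bayesian layers) all share in some form.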
9

Peretto, Pierre, and Jean-jacques Niez. "Stochastic Dynamics of Neural Networks." IEEE Transactions on Systems, Man, and Cybernetics 16, no. 1 (January 1986): 73–83. http://dx.doi.org/10.1109/tsmc.1986.289283.

10

Kanarachos, Andreas E., and Kleanthis T. Geramanis. "Semi-Stochastic Complex Neural Networks." IFAC Proceedings Volumes 31, no. 12 (June 1998): 47–52. http://dx.doi.org/10.1016/s1474-6670(17)36040-8.


Theses on the topic "Stochastic neural networks"

1

Pensuwon, Wanida. "Stochastic dynamic hierarchical neural networks." Thesis, University of Hertfordshire, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.366030.

2

Sutherland, Connie. "Spatio-temporal feedback in stochastic neural networks." Thesis, University of Ottawa (Canada), 2007. http://hdl.handle.net/10393/27559.

Abstract:
The mechanisms by which groups of neurons interact are an important facet of understanding how the brain functions. Here we study stochastic neural networks with delayed feedback. The first part of our study looks at how feedback and noise affect the mean firing rate of the network. Secondly we look at how the spatial profile of the feedback affects the behavior of the network. Our numerical and theoretical results show that negative (inhibitory) feedback linearizes the frequency vs. input current (f-I) curve via the divisive gain effect it has on the network. The interaction of the inhibitory feedback and the input bias is what produces the divisive decrease in the slope (known as the gain) of the f-I curve. Our work predicts that an increase in noise is required along with an increase in inhibitory feedback to attain the divisive and subtractive shift of the gain seen in experiments [1]. Our results also show that, although the spatial profile of the feedback does not affect the mean activity of the network, it does influence the overall dynamics of the network. Local feedback generates a network oscillation which is more robust against disruption by noise, uncorrelated input or network heterogeneity than that for the global feedback (all-to-all coupling) case. For example, uncorrelated input completely disrupts the network oscillation generated by global feedback, but only diminishes the network oscillation due to local feedback. This is characterized by 1st and 2nd order spike train statistics. Further, our theory agrees well with numerical simulations of network dynamics.
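A minimal numerical cartoon of the divisive-gain effect described above (not the thesis's network model, whose dynamics and parameters are much richer): a noisy leaky integrate-and-fire unit whose input is reduced in proportion to a low-pass-filtered estimate of its own firing rate, standing in for delayed inhibitory feedback. All constants are assumptions.

```python
import math, random

random.seed(2)

def firing_rate(I, g_fb, sigma=0.2, dt=0.0005, T=2.0, tau_m=0.02, tau_d=0.05):
    """Firing rate (spikes / T) of a noisy leaky integrate-and-fire unit whose
    effective input is I minus g_fb times a filtered estimate of its own rate."""
    v, rate_est, spikes = 0.0, 0.0, 0
    for _ in range(int(T / dt)):
        I_eff = I - g_fb * rate_est
        v += (-v + I_eff) * (dt / tau_m) \
             + sigma * math.sqrt(dt / tau_m) * random.gauss(0.0, 1.0)
        spike = 0
        if v >= 1.0:            # threshold crossing: spike and reset
            spikes += 1
            spike = 1
            v = 0.0
        # leaky running average of the spike train plays the role of the
        # (delayed, filtered) inhibitory feedback signal
        rate_est += spike / tau_d - rate_est * (dt / tau_d)
    return spikes / T

open_loop = firing_rate(2.0, g_fb=0.0)    # no feedback
inhibited = firing_rate(2.0, g_fb=0.01)   # inhibitory feedback lowers the rate
```

Sweeping `I` and comparing the two curves would show the feedback flattening the slope of the f-I relation, which is the divisive-gain effect the abstract refers to.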
3

Campos, Luciana Conceicao Dias. "Periodic Stochastic Model Based on Neural Networks." Pontifícia Universidade Católica do Rio de Janeiro, 2010. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=17076@1.

Abstract:
Stochastic processes are a branch of probability theory that defines a set of models for studying problems with random components. Many real problems exhibit complex characteristics, such as nonlinearity and chaotic behavior, that require models capable of capturing the true characteristics of the problem in order to treat it appropriately. Existing models, however, are either linear, and thus often inadequate for such problems; or they require a complex formulation whose applicability is limited and problem-specific; or they depend on a priori assumptions about the behavior of the problem. This motivated the development of a new, generic, intrinsically nonlinear stochastic process model that can be applied to a range of nonlinear phenomena with highly stochastic and even periodic behavior. Since artificial neural networks are nonlinear parametric models that are simple to understand and implement and capable of capturing the behavior of many types of problems, they were chosen as the basis of the new model proposed in this thesis, called the Neural Stochastic Process. The nonlinearity obtained through the neural networks of this stochastic process makes it possible to adequately capture the behavior of the historical series of nonlinear phenomena with highly stochastic and even periodic characteristics. The goal is to use this model to generate synthetic time series, equally likely as the historical series, in the solution of such problems, for example those involving climatological or economic phenomena.
As a case study, the proposed model was applied to monthly inflows in the context of operation planning for the Brazilian hydrothermal system. The results show that the Neural Stochastic Process can generate synthetic series with characteristics similar to the historical monthly inflow series.
4

Ling, Hong. "Implementation of Stochastic Neural Networks for Approximating Random Processes." Master's thesis, Lincoln University. Environment, Society and Design Division, 2007. http://theses.lincoln.ac.nz/public/adt-NZLIU20080108.124352/.

Abstract:
Artificial Neural Networks (ANNs) can be viewed as a mathematical model to simulate natural and biological systems on the basis of mimicking the information processing methods in the human brain. Current ANNs focus only on approximating arbitrary deterministic input-output mappings. However, these ANNs do not adequately represent the variability which is observed in the systems' natural settings, nor do they capture the complexity of the whole system behaviour. This thesis addresses the development of a new class of neural networks called Stochastic Neural Networks (SNNs) in order to simulate internal stochastic properties of systems. Developing a suitable mathematical model for SNNs is based on the canonical representation of stochastic processes or systems by means of the Karhunen-Loève theorem. Some successful real examples, such as analysis of the full displacement field of wood in compression, confirm the validity of the proposed neural networks. Furthermore, analysis of the internal workings of SNNs provides an in-depth view of the operation of SNNs that helps to gain a better understanding of the simulation of stochastic processes by SNNs.
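The Karhunen-Loève representation underpinning these SNNs has a classic closed form for standard Brownian motion on [0, 1], which makes for a compact, self-contained sketch (this is the textbook expansion, not the thesis's network model).

```python
import math, random

random.seed(3)

def brownian_kl(t, z):
    """Karhunen-Loeve expansion of standard Brownian motion on [0, 1]:
    W(t) = sum_k Z_k * sqrt(2) * sin((k + 1/2) * pi * t) / ((k + 1/2) * pi),
    truncated to len(z) terms, with i.i.d. standard normal coefficients Z_k."""
    return sum(
        zk * math.sqrt(2.0) * math.sin((k + 0.5) * math.pi * t)
        / ((k + 0.5) * math.pi)
        for k, zk in enumerate(z)
    )

# Sanity check on the representation: Var W(t) should be close to t.
t, n_terms, n_paths = 0.7, 200, 3000
samples = [brownian_kl(t, [random.gauss(0, 1) for _ in range(n_terms)])
           for _ in range(n_paths)]
var = sum(s * s for s in samples) / n_paths
```

The thesis generalizes this pattern: learn the expansion for an observed process and let random coefficients drive the network's output variability.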
5

Zhao, Jieyu. "Stochastic bit stream neural networks : theory, simulations and applications." Thesis, Royal Holloway, University of London, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.338916.

6

Hyland, P. "On the implementation of neural networks using stochastic arithmetic." Thesis, Bangor University, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.306224.

7

Todeschi, Tiziano. "Calibration of local-stochastic volatility models with neural networks." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amslaurea.unibo.it/23052/.

Abstract:
During the last twenty years several models have been proposed to improve the classic Black-Scholes framework for equity derivatives pricing. Recently a new model has been proposed: the Local-Stochastic Volatility (LSV) model. This model considers volatility as the product of a deterministic and a stochastic term. So far, model choice was driven not only by the capacity to capture empirically observed market features well, but also by the computational tractability of the calibration process. This is now undergoing a big change, since machine learning technologies offer new perspectives on model calibration. In this thesis we consider the calibration problem to be the search for a model which generates given market prices, where additionally technology from generative adversarial networks can be used. This means parametrizing the model pool in a way which is accessible for machine learning techniques and interpreting the inverse problem as the training task of a generative network, whose quality is assessed by an adversary. The calibration algorithm proposed for LSV models uses as generative models so-called neural stochastic differential equations (SDEs), which simply means parameterizing the drift and volatility of an Itô SDE by neural networks.
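The last sentence can be made concrete: a neural SDE replaces the drift and volatility functions of an Itô SDE with networks and simulates by Euler-Maruyama. The miniature, untrained pure-Python version below is only a sketch; in a real calibration the parameters would be trained so that prices computed from the simulated paths match market quotes, with the adversary assessing the fit.

```python
import math, random

random.seed(4)

HIDDEN = 4

def mlp(params, x):
    """One-hidden-layer tanh network from R to R; params = (W1, b1, W2, b2)."""
    W1, b1, W2, b2 = params
    h = [math.tanh(w * x + b) for w, b in zip(W1, b1)]
    return sum(w * hi for w, hi in zip(W2, h)) + b2

def init_params():
    return ([random.uniform(-1.0, 1.0) for _ in range(HIDDEN)],
            [random.uniform(-1.0, 1.0) for _ in range(HIDDEN)],
            [random.uniform(-0.5, 0.5) for _ in range(HIDDEN)],
            0.0)

def neural_sde_path(drift_params, vol_params, x0=1.0, dt=0.01, steps=100):
    """Euler-Maruyama for dX = mu_theta(X) dt + |sigma_phi(X)| dW, with the
    drift and volatility given by the small networks above (untrained here)."""
    x = x0
    for _ in range(steps):
        dW = random.gauss(0.0, math.sqrt(dt))
        x += mlp(drift_params, x) * dt + abs(mlp(vol_params, x)) * dW
    return x

drift_params, vol_params = init_params(), init_params()
terminal = [neural_sde_path(drift_params, vol_params) for _ in range(500)]
```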
8

陳穎志 and Wing-chi Chan. "Modelling of nonlinear stochastic systems using neural and neurofuzzy networks." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2001. http://hub.hku.hk/bib/B31241475.

9

Chan, Wing-chi. "Modelling of nonlinear stochastic systems using neural and neurofuzzy networks /." Hong Kong : University of Hong Kong, 2001. http://sunzi.lib.hku.hk/hkuto/record.jsp?B22925843.

10

Rising, Barry John Paul. "Hardware architectures for stochastic bit-stream neural networks : design and implementation." Thesis, Royal Holloway, University of London, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.326219.


Books on the topic "Stochastic neural networks"

1

Zhou, Wuneng, Jun Yang, Liuwei Zhou, and Dongbing Tong. Stability and Synchronization Control of Stochastic Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2016. http://dx.doi.org/10.1007/978-3-662-47833-2.

2

Zhu, Q. M. Fast orthogonal identification of nonlinear stochastic models and radial basis function neural networks. Sheffield: University of Sheffield, Dept. of Automatic Control and Systems Engineering, 1994.

3

Thathachar, Mandayam A. L. Networks of learning automata: Techniques for online stochastic optimization. Boston, MA: Kluwer Academic, 2003.

4

Chen, Su-shing, and Society of Photo-Optical Instrumentation Engineers, eds. Neural and stochastic methods in image and signal processing III: 28-29 July 1994, San Diego, California. Bellingham, Wash.: SPIE, 1994.

5

Focus Symposium on Learning and Adaptation in Stochastic and Statistical Systems (2001, Baden-Baden, Germany). Proceedings of the Focus Symposium on Learning and Adaptation in Stochastic and Statistical Systems. Windsor, Ont.: International Institute for Advanced Studies in Systems Research and Cybernetics, 2002.

6

Chen, Su-shing, Society of Photo-Optical Instrumentation Engineers, and Society for Industrial and Applied Mathematics, eds. Neural and stochastic methods in image and signal processing: 20-23 July 1992, San Diego, California. Bellingham, Wash.: SPIE, 1992.

7

Chen, Su-shing, and Society of Photo-Optical Instrumentation Engineers, eds. Neural and stochastic methods in image and signal processing II: 12-13 July 1993, San Diego, California. Bellingham, Wash.: SPIE, 1993.

8

Dougherty, Edward R., and Society of Photo-Optical Instrumentation Engineers, eds. Neural, morphological, and stochastic methods in image and signal processing: 10-11 July 1995, San Diego, California. Bellingham, Wash., USA: SPIE, 1995.

9

Chen, Su-shing, and Society of Photo-Optical Instrumentation Engineers, eds. Stochastic and neural methods in signal processing, image processing, and computer vision: 24-26 July 1991, San Diego, California. Bellingham, Wash.: SPIE, 1991.

10

International Conference on Applied Stochastic Models and Data Analysis (12th: 2007: Chania, Greece), ed. Advances in data analysis: Theory and applications to reliability and inference, data mining, bioinformatics, lifetime data, and neural networks. Boston: Birkhäuser, 2010.


Book chapters on the topic "Stochastic neural networks"

1

Rojas, Raúl. "Stochastic Networks." In Neural Networks, 371–87. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-642-61068-4_14.

2

Müller, Berndt, Joachim Reinhardt, and Michael T. Strickland. "Stochastic Neurons." In Neural Networks, 38–45. Berlin, Heidelberg: Springer Berlin Heidelberg, 1995. http://dx.doi.org/10.1007/978-3-642-57760-4_4.

3

Müller, Berndt, and Joachim Reinhardt. "Stochastic Neurons." In Neural Networks, 37–44. Berlin, Heidelberg: Springer Berlin Heidelberg, 1990. http://dx.doi.org/10.1007/978-3-642-97239-3_4.

4

Hänggi, Martin, and George S. Moschytz. "Stochastic Optimization." In Cellular Neural Networks, 101–25. Boston, MA: Springer US, 2000. http://dx.doi.org/10.1007/978-1-4757-3220-7_6.

5

Rennolls, Keith, Alan Soper, Phil Robbins, and Ray Guthrie. "Stochastic Neural Networks." In ICANN ’93, 481. London: Springer London, 1993. http://dx.doi.org/10.1007/978-1-4471-2063-6_122.

6

Zhang, Yumin, Lei Guo, Lingyao Wu, and Chunbo Feng. "On Stochastic Neutral Neural Networks." In Advances in Neural Networks — ISNN 2005, 69–74. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11427391_10.

7

Siegelmann, Hava T. "Stochastic Dynamics." In Neural Networks and Analog Computation, 121–39. Boston, MA: Birkhäuser Boston, 1999. http://dx.doi.org/10.1007/978-1-4612-0707-8_9.

8

Golea, Mostefa, Masahiro Matsuoka, and Yasubumi Sakakibara. "Stochastic simple recurrent neural networks." In Grammatical Interference: Learning Syntax from Sentences, 262–73. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/bfb0033360.

9

Hidalgo, Jorge, Luís F. Seoane, Jesús M. Cortés, and Miguel A. Muñoz. "Stochastic Amplification in Neural Networks." In Trends in Mathematics, 45–49. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-08138-0_9.

10

Herve, Thierry, Olivier Francois, and Jacques Demongeot. "Markovian spatial properties of a random field describing a stochastic neural network: Sequential or parallel implementation?" In Neural Networks, 81–89. Berlin, Heidelberg: Springer Berlin Heidelberg, 1990. http://dx.doi.org/10.1007/3-540-52255-7_29.


Conference papers on the topic "Stochastic neural networks"

1

Alibrandi, Umberto, Claudio Perez, and Khalid M. Mosalam. "Quantum Physics Stochastic Neural Networks (QPNN)." In 2024 8th International Conference on System Reliability and Safety (ICSRS), 765–69. IEEE, 2024. https://doi.org/10.1109/icsrs63046.2024.10927527.

2

Sen, Mrinmay, A. K. Qin, Gayathri C, Raghu Kishore N, Yen-Wei Chen, and Balasubramanian Raman. "SOFIM: Stochastic Optimization Using Regularized Fisher Information Matrix." In 2024 International Joint Conference on Neural Networks (IJCNN), 1–7. IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10650665.

3

Zhao, J. "Stochastic connection neural networks." In 4th International Conference on Artificial Neural Networks. IEE, 1995. http://dx.doi.org/10.1049/cp:19950525.

4

Gao, Zhan, Elvin Isufi, and Alejandro Ribeiro. "Stochastic Graph Neural Networks." In ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2020. http://dx.doi.org/10.1109/icassp40776.2020.9054424.

5

Chien, Jen-Tzung, and Yu-Min Huang. "Stochastic Convolutional Recurrent Networks." In 2020 International Joint Conference on Neural Networks (IJCNN). IEEE, 2020. http://dx.doi.org/10.1109/ijcnn48605.2020.9206970.

6

Galan-Prado, Fabio, Alejandro Moran, Joan Font, Miquel Roca, and Josep L. Rossello. "Stochastic Radial Basis Neural Networks." In 2019 29th International Symposium on Power and Timing Modeling, Optimization and Simulation (PATMOS). IEEE, 2019. http://dx.doi.org/10.1109/patmos.2019.8862129.

7

Ramakrishnan, Swathika, and Dhireesha Kudithipudi. "On accelerating stochastic neural networks." In NANOCOM '17: ACM The Fourth Annual International Conference on Nanoscale Computing and Communication. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3109453.3123959.

8

Weller, Dennis D., Nathaniel Bleier, Michael Hefenbrock, Jasmin Aghassi-Hagmann, Michael Beigl, Rakesh Kumar, and Mehdi B. Tahoori. "Printed Stochastic Computing Neural Networks." In 2021 Design, Automation & Test in Europe Conference & Exhibition (DATE). IEEE, 2021. http://dx.doi.org/10.23919/date51398.2021.9474254.

9

Gulshad, Sadaf, Dick Sigmund, and Jong-Hwan Kim. "Learning to reproduce stochastic time series using stochastic LSTM." In 2017 International Joint Conference on Neural Networks (IJCNN). IEEE, 2017. http://dx.doi.org/10.1109/ijcnn.2017.7965942.

10

Nikolic, Konstantin P., and Ivan B. Scepanovic. "Stochastic search-based neural networks learning algorithms." In 2008 9th Symposium on Neural Network Applications in Electrical Engineering. IEEE, 2008. http://dx.doi.org/10.1109/neurel.2008.4685579.


Reports of organizations on the topic "Stochastic neural networks"

1

Fernández-Villaverde, Jesús, Galo Nuño, and Jesse Perla. Taming the curse of dimensionality: quantitative economics with deep learning. Madrid: Banco de España, November 2024. http://dx.doi.org/10.53479/38233.

Abstract:
We argue that deep learning provides a promising approach to addressing the curse of dimensionality in quantitative economics. We begin by exploring the unique challenges involved in solving dynamic equilibrium models, particularly the feedback loop between individual agents’ decisions and the aggregate consistency conditions required to achieve equilibrium. We then introduce deep neural networks and demonstrate their application by solving the stochastic neoclassical growth model. Next, we compare deep neural networks with traditional solution methods in quantitative economics. We conclude with a review of the applications of neural networks in quantitative economics and provide arguments for cautious optimism.
2

Burton, Robert M., Jr. Topics in Stochastics, Symbolic Dynamics and Neural Networks. Fort Belvoir, VA: Defense Technical Information Center, December 1996. http://dx.doi.org/10.21236/ada336426.

3

Bailey Bond, Robert, Pu Ren, James Fong, Hao Sun, and Jerome F. Hajjar. Physics-informed Machine Learning Framework for Seismic Fragility Analysis of Steel Structures. Northeastern University, August 2024. http://dx.doi.org/10.17760/d20680141.

Abstract:
The seismic assessment of structures is a critical step to increase community resilience under earthquake hazards. This research aims to develop a Physics-reinforced Machine Learning (PrML) paradigm for metamodeling of nonlinear structures under seismic hazards using artificial intelligence. Structural metamodeling, a reduced-fidelity surrogate model to a more complex structural model, enables more efficient performance-based design and analysis, optimizing structural designs and easing the computational effort of reliability and fragility analysis, leading to globally efficient designs while maintaining required levels of accuracy. The growing availability of high-performance computing has improved this analysis by providing the ability to evaluate higher order numerical models. However, more complex models of the seismic response of various civil structures demand increasing amounts of computing power. In addition, computational cost greatly increases with numerous iterations to account for optimization and stochastic loading (e.g., Monte Carlo simulations or Incremental Dynamic Analysis). To address the large computational burden, simpler models are desired for seismic assessment with fragility analysis. Physics-reinforced Machine Learning integrates physics knowledge (e.g., scientific principles, laws of physics) into traditional machine learning architectures, offering physically bounded, interpretable models that require less data than traditional methods. This research introduces a PrML framework to develop fragility curves using a combination of neural networks and domain knowledge. The first aim involves clustering and selecting ground motions for nonlinear response analysis of archetype buildings, ensuring that the selected set includes as few ground motions as possible while still expressing all the key representative events the structure will probabilistically experience in its lifetime.
The second aim constructs structural PrML metamodels to capture the nonlinear behavior of these buildings utilizing the nonlinear Equation of Motion (EOM). Embedding physical principles, like the general form of the EOM, into the learning process will inform the system to stay within known physical bounds, resulting in interpretable results, robust inferencing, and the capability of dealing with incomplete and scarce data. The third and final aim applies the metamodels to probabilistic seismic response prediction, fragility analysis, and seismic performance factor development. The efficiency and accuracy of this approach are evaluated against existing physics-based fragility analysis methods.
4

Fernández-Villaverde, Jesús, Joël Marbet, Galo Nuño, and Omar Rachedi. Inequality and the zero lower bound. Madrid: Banco de España, February 2024. http://dx.doi.org/10.53479/36133.

Abstract:
This paper studies how household inequality shapes the effects of the zero lower bound (ZLB) on nominal interest rates on aggregate dynamics. To do so, we consider a heterogeneous agent New Keynesian (HANK) model with an occasionally binding ZLB and solve for its fully non-linear stochastic equilibrium using a novel neural network algorithm. In this setting, changes in the monetary policy stance influence households' precautionary savings by altering the frequency of ZLB events. As a result, the model features monetary policy non-neutrality in the long run. The degree of long-run non-neutrality, i.e., by how much monetary policy shifts real rates in the ergodic distribution of the model, can be substantial when we combine low inflation targets and high levels of wealth inequality.
