To see other types of publications on this topic, follow the link: Simple recurrent neural network.

Books on the topic 'Simple recurrent neural network'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 16 books for your research on the topic 'Simple recurrent neural network.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse books in a wide variety of disciplines and organise your bibliography correctly.

1

Jones, Steven P. Neural network models of simple mechanical systems illustrating the feasibility of accelerated life testing. National Aeronautics and Space Administration, 1996.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Salem, Fathi M. Recurrent Neural Networks: From Simple to Gated Architectures. Springer International Publishing AG, 2023.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Salem, Fathi M. Recurrent Neural Networks: From Simple to Gated Architectures. Springer International Publishing AG, 2021.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Magic, John, and Mark Magic. Action Recognition Using Python and Recurrent Neural Network. Independently Published, 2019.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Yi, Zhang, and K. K. Tan. Convergence Analysis of Recurrent Neural Networks (Network Theory and Applications). Springer, 2003.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Bosco, Joish, and Fateh Khan. Stock Market Prediction and Efficiency Analysis Using Recurrent Neural Network. GRIN Verlag GmbH, 2018.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

V, David. Neural Network Programming with Java: Simple Guide on Neural Networks. CreateSpace Independent Publishing Platform, 2017.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Magic, John, and Mark Magic. Action Recognition: Step-By-step Recognizing Actions with Python and Recurrent Neural Network. Independently Published, 2019.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Shan, Yunting, John Magic, and Mark Magic. Action Recognition: Step-By-step Recognizing Actions with Python and Recurrent Neural Network. Independently Published, 2019.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

CNND simulator, cellular neural network embedded in a simple dual computing structure: User's guide version 1.1. Computer and Automation Institute, Hungarian Academy of Sciences, 1989.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
11

Sangeetha, V., and S. Kevin Andrews. Introduction to Artificial Intelligence and Neural Networks. Magestic Technology Solutions (P) Ltd, Chennai, Tamil Nadu, India, 2023. http://dx.doi.org/10.47716/mts/978-93-92090-24-0.

Full text
Abstract:
Artificial Intelligence (AI) has emerged as a defining force in the current era, shaping the contours of technology and deeply permeating our everyday lives. From autonomous vehicles to predictive analytics and personalized recommendations, AI continues to revolutionize various facets of human existence, progressively becoming the invisible hand guiding our decisions. Simultaneously, its growing influence necessitates a nuanced understanding of AI, thereby providing the impetus for this book, “Introduction to Artificial Intelligence and Neural Networks.” This book aims to equip its readers with a comprehensive understanding of AI and its subsets, machine learning and deep learning, with a particular emphasis on neural networks. It is designed for novices venturing into the field, as well as experienced learners who desire to solidify their knowledge base or delve deeper into advanced topics. In Chapter 1, we provide a thorough introduction to the world of AI, exploring its definition, historical trajectory, and categories. We delve into the applications of AI and underscore the ethical implications associated with its proliferation. Chapter 2 introduces machine learning, elucidating its types and basic algorithms. We examine the practical applications of machine learning and delve into challenges such as overfitting, underfitting, and model validation. Deep learning and neural networks, an integral part of AI, form the crux of Chapter 3. We provide a lucid introduction to deep learning, describe the structure of neural networks, and explore forward and backward propagation. This chapter also delves into the specifics of Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs). In Chapter 4, we outline the steps to train neural networks, including data preprocessing, cost functions, gradient descent, and various optimizers. We also delve into regularization techniques and methods for evaluating a neural network model. Chapter 5 focuses on specialized topics in neural networks such as autoencoders, Generative Adversarial Networks (GANs), Long Short-Term Memory Networks (LSTMs), and Neural Architecture Search (NAS). In Chapter 6, we illustrate the practical applications of neural networks, examining their role in computer vision, natural language processing, predictive analytics, autonomous vehicles, and the healthcare industry. Chapter 7 gazes into the future of AI and neural networks. It discusses the current challenges in these fields, emerging trends, and future ethical considerations. It also examines the potential impacts of AI and neural networks on society. Finally, Chapter 8 concludes the book with a recap of key learnings, implications for readers, and resources for further study. This book aims not only to provide a robust theoretical foundation but also to kindle a sense of curiosity and excitement about the endless possibilities AI and neural networks offer.
APA, Harvard, Vancouver, ISO, and other styles
12

Raff, Lionel, Ranga Komanduri, Martin Hagan, and Satish Bukkapatnam. Neural Networks in Chemical Reaction Dynamics. Oxford University Press, 2012. http://dx.doi.org/10.1093/oso/9780199765652.001.0001.

Full text
Abstract:
This monograph presents recent advances in neural network (NN) approaches and applications to chemical reaction dynamics. Topics covered include: (i) the development of ab initio potential-energy surfaces (PES) for complex multichannel systems using modified novelty sampling and feedforward NNs; (ii) methods for sampling the configuration space of critical importance, such as trajectory and novelty sampling methods and gradient fitting methods; (iii) parametrization of interatomic potential functions using a genetic algorithm accelerated with a NN; (iv) parametrization of analytic interatomic potential functions using NNs; (v) self-starting methods for obtaining analytic PES from ab initio electronic structure calculations using direct dynamics; (vi) development of a novel method, namely combined function derivative approximation (CFDA), for simultaneous fitting of a PES and its corresponding force fields using feedforward neural networks; (vii) development of generalized PES using many-body expansions, NNs, and moiety energy approximations; (viii) NN methods for data analysis, reaction probabilities, and statistical error reduction in chemical reaction dynamics; (ix) accurate prediction of higher-level electronic structure energies (e.g., MP4 or higher) for large databases using NNs, lower-level (Hartree-Fock) energies, and small subsets of the higher-energy database; and finally (x) illustrative examples of NN applications to chemical reaction dynamics of increasing complexity, starting from simple near-equilibrium structures (vibrational state studies) to more complex non-adiabatic reactions. The monograph is prepared by an interdisciplinary group of researchers working as a team for nearly two decades at Oklahoma State University, Stillwater, OK, with expertise in gas-phase reaction dynamics; neural networks; various aspects of MD and Monte Carlo (MC) simulations of nanometric cutting, tribology, and material properties at the nanoscale; scaling laws from atomistic to continuum; and neural network applications to chemical reaction dynamics. It is anticipated that this emerging field of NN in chemical reaction dynamics will play an increasingly important role in MD, MC, and quantum mechanical studies in the years to come.
APA, Harvard, Vancouver, ISO, and other styles
13

Piccinini, Gualtiero. Computationalism. Edited by Eric Margolis, Richard Samuels, and Stephen P. Stich. Oxford University Press, 2012. http://dx.doi.org/10.1093/oxfordhb/9780195309799.013.0010.

Full text
Abstract:
The introduction of the concept of computation in cognitive science is discussed in this article. Computationalism is usually introduced as an empirical hypothesis that can be disconfirmed. Processing information is surely an important aspect of cognition, so if computation is information processing, then cognition involves computation. Computationalism becomes more significant when it has explanatory power. The most relevant and explanatory notion of computation is that associated with digital computers. Turing analyzed computation in terms of what are now called Turing machines, simple processors operating on an unbounded tape. Turing stated that any function that can be computed by an algorithm can be computed by a Turing machine. McCulloch and Pitts's account of cognition contains three important aspects: an analogy between neural processes and digital computations, the use of mathematically defined neural networks as models, and an appeal to neurophysiological evidence to support their neural network models. Computationalism involves three accounts of computation: causal, semantic, and mechanistic. Under the causal account, there are mappings between any physical system and at least some computational descriptions. The semantic account may be formulated as a restricted causal account.
APA, Harvard, Vancouver, ISO, and other styles
14

Fox, Raymond. The Use of Self. Oxford University Press, 2011. http://dx.doi.org/10.1093/oso/9780190616144.001.0001.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Koch, Christof. Biophysics of Computation. Oxford University Press, 1998. http://dx.doi.org/10.1093/oso/9780195104912.001.0001.

Full text
Abstract:
Neural network research often builds on the fiction that neurons are simple linear threshold units, completely neglecting the highly dynamic and complex nature of synapses, dendrites, and voltage-dependent ionic currents. Biophysics of Computation: Information Processing in Single Neurons challenges this notion, using richly detailed experimental and theoretical findings from cellular biophysics to explain the repertoire of computational functions available to single neurons. The author shows how individual nerve cells can multiply, integrate, or delay synaptic inputs and how information can be encoded in the voltage across the membrane, in the intracellular calcium concentration, or in the timing of individual spikes. Key topics covered include the linear cable equation; cable theory as applied to passive dendritic trees and dendritic spines; chemical and electrical synapses and how to treat them from a computational point of view; nonlinear interactions of synaptic input in passive and active dendritic trees; the Hodgkin-Huxley model of action potential generation and propagation; phase space analysis; linking stochastic ionic channels to membrane-dependent currents; calcium and potassium currents and their role in information processing; the role of diffusion, buffering and binding of calcium, and other messenger systems in information processing and storage; short- and long-term models of synaptic plasticity; simplified models of single cells; stochastic aspects of neuronal firing; the nature of the neuronal code; and unconventional models of sub-cellular computation. Biophysics of Computation: Information Processing in Single Neurons serves as an ideal text for advanced undergraduate and graduate courses in cellular biophysics, computational neuroscience, and neural networks, and will appeal to students and professionals in neuroscience, electrical and computer engineering, and physics.
APA, Harvard, Vancouver, ISO, and other styles
16

Khan, Aman, and Kenneth A. Scorgie. Forecasting Government Budgets. The Rowman & Littlefield Publishing Group, 2022. https://doi.org/10.5040/9781666990355.

Full text
Abstract:
Forecasting is integral to all governmental activities, especially budgetary activities. Without good and accurate forecasts, a government will not only find it difficult to carry out its everyday operations but will also find it difficult to cope with the increasingly complex environment in which it has to operate. This book presents, in a simple and easy-to-understand manner, some of the commonly used methods in budget forecasting, simple as well as advanced. The book is divided into three parts: it begins with an overview of forecasting background, the forecasting process, and forecasting methods, followed by a detailed discussion of the actual methods in Parts I, II, and III. Part I discusses a combination of basic time series models such as the percentage average, simple moving average, double moving average, exponential moving average (double as well as triple), simple trend line, time series with cyclical variation, and time-series regression with single and multiple independent variables. Part II discusses some of the more advanced, but frequently used, time series models, such as ARIMA (regular as well as seasonal), Vector Autoregression (VAR), and Vector Error Correction (VEC). Part III provides an overview of three of the more recent advances in time series models, namely ensemble forecasting, state-space forecasting, and neural networks. The book concludes with a brief discussion of some practical issues in budget forecasting.
APA, Harvard, Vancouver, ISO, and other styles