Books on the topic 'Recurrent neural networks BLSTM'

To see the other types of publications on this topic, follow the link: Recurrent neural networks BLSTM.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 40 books for your research on the topic 'Recurrent neural networks BLSTM.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse books on a wide variety of disciplines and organise your bibliography correctly.

1

Hu, Xiaolin, and P. Balasubramaniam. Recurrent neural networks. Rijeka, Croatia: InTech, 2008.

2

Salem, Fathi M. Recurrent Neural Networks. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-89929-5.

3

Hammer, Barbara. Learning with recurrent neural networks. London: Springer London, 2000. http://dx.doi.org/10.1007/bfb0110016.

4

ElHefnawi, Mahmoud, and Mohamed Mysara. Recurrent neural networks and soft computing. Rijeka: InTech, 2012.

5

Yi, Zhang. Convergence analysis of recurrent neural networks. Boston: Kluwer Academic Publishers, 2004.

6

Yi, Zhang, and K. K. Tan. Convergence Analysis of Recurrent Neural Networks. Boston, MA: Springer US, 2004. http://dx.doi.org/10.1007/978-1-4757-3819-3.

7

Graves, Alex. Supervised Sequence Labelling with Recurrent Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012.

8

Graves, Alex. Supervised Sequence Labelling with Recurrent Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-24797-2.

9

Michel, Anthony N. Qualitative analysis and synthesis of recurrent neural networks. New York: Marcel Dekker, Inc., 2002.

10

Chen, Wen. Recurrent neural networks applied to robotic motion control. Ottawa: National Library of Canada, 2002.

11

Rovithakis, George A., and Manolis A. Christodoulou. Adaptive Control with Recurrent High-order Neural Networks. London: Springer London, 2000. http://dx.doi.org/10.1007/978-1-4471-0785-9.

12

Bianchi, Filippo Maria, Enrico Maiorino, Michael C. Kampffmeyer, Antonello Rizzi, and Robert Jenssen. Recurrent Neural Networks for Short-Term Load Forecasting. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-70338-1.

13

Kuan, Chung-Ming. Forecasting exchange rates using feedforward and recurrent neural networks. Champaign: University of Illinois at Urbana-Champaign, 1993.

14

Kuan, Chung-Ming. Forecasting exchange rates using feedforward and recurrent neural networks. [Urbana, Ill.]: College of Commerce and Business Administration, University of Illinois at Urbana-Champaign, 1992.

15

Rovithakis, George A. Adaptive control with recurrent high-order neural networks: Theory and industrial applications. London: Springer, 2000.

16

Rovithakis, George A. Adaptive Control with Recurrent High-order Neural Networks: Theory and Industrial Applications. London: Springer London, 2000.

17

Medsker, Larry, and Lakhmi C. Jain, eds. Recurrent Neural Networks. CRC Press, 1999. http://dx.doi.org/10.1201/9781420049176.

18

Hu, Xiaolin, and P. Balasubramaniam, eds. Recurrent Neural Networks. InTech, 2008. http://dx.doi.org/10.5772/68.

19

Hammer, Barbara. Learning with Recurrent Neural Networks. Springer, 2000.

20

Yi, Zhang. Convergence Analysis of Recurrent Neural Networks. Springer, 2013.

21

Medsker, L. R., and L. C. Jain, eds. Recurrent neural networks: Design and applications. Boca Raton, Fla.: CRC Press, 2000.

22

ElHefnawi, Mahmoud, ed. Recurrent Neural Networks and Soft Computing. InTech, 2012. http://dx.doi.org/10.5772/2296.

23

Graves, Alex. Supervised Sequence Labelling with Recurrent Neural Networks. Springer, 2012.

24

Supervised Sequence Labelling With Recurrent Neural Networks. Springer, 2012.

25

Recurrent Neural Networks for Temporal Data Processing. InTech, 2011.

26

Brunner, Daniel, Miguel C. Soriano, and Guy Van der Sande. Photonic Reservoir Computing: Optical Recurrent Neural Networks. Walter de Gruyter, 2019.

27

Cardot, Hubert, ed. Recurrent Neural Networks for Temporal Data Processing. InTech, 2011. http://dx.doi.org/10.5772/631.

28

Kolen, John F., and Stefan C. Kremer, eds. A field guide to dynamical recurrent networks. New York: IEEE Press, 2001.

29

Kolen, John F., and Stefan C. Kremer, eds. A Field Guide to Dynamical Recurrent Networks. Wiley-IEEE Press, 2001.

30

Omlin, Christian W. Knowledge Acquisition and Representation in Recurrent Neural Networks. World Scientific Publishing Company, 2005.

31

Yi, Zhang, and K. K. Tan. Convergence Analysis of Recurrent Neural Networks (Network Theory and Applications). Springer, 2003.

32

Mandic, Danilo, and Jonathon Chambers. Recurrent Neural Networks for Prediction: Learning Algorithms, Architectures and Stability. Wiley, 2001.

33

Mandic, Danilo P., and Jonathon A. Chambers. Recurrent Neural Networks for Prediction: Learning Algorithms, Architectures and Stability. John Wiley & Sons, 2003.

34

Medsker, Larry, and Lakhmi C. Jain, eds. Recurrent Neural Networks: Design and Applications (The CRC Press International Series on Computational Intelligence). CRC Press, 1999.

35

Michel, Anthony, and Derong Liu. Qualitative Analysis and Synthesis of Recurrent Neural Networks (Pure and Applied Mathematics). CRC Press, 2002.

36

Bianchi, Filippo Maria, Enrico Maiorino, Michael C. Kampffmeyer, Antonello Rizzi, and Robert Jenssen. Recurrent Neural Networks for Short-Term Load Forecasting: An Overview and Comparative Analysis. Springer, 2017.

37

Kostadinov, Simeon. Recurrent Neural Networks with Python Quick Start Guide: Sequential learning and language modeling with TensorFlow. Packt Publishing, 2018.

38

Gandhi, Vaibhav. Brain-Computer Interfacing for Assistive Robotics: Electroencephalograms, Recurrent Quantum Neural Networks, and User-Centric Graphical Interfaces. Academic Press, 2014.

39

Lewis, Marc D. The Development of Emotion Regulation. Edited by Philip David Zelazo. Oxford University Press, 2013. http://dx.doi.org/10.1093/oxfordhb/9780199958474.013.0004.

Abstract:
This chapter examines the relation between normative advances and emerging individual differences in emotion regulation (ER), using principles from developmental cognitive neuroscience to integrate these seemingly disparate processes. Like several other theorists, I view corticolimbic development as a self-organizing stream of synaptic alterations, driven by experience rather than biologically prespecified. This conceptualization helps resolve ambiguities that appear when we try, but consistently fail, to neatly parse individual differences and developmental differences. At the neural level, increasingly specific patterns of synaptic activation converge in response to (or in anticipation of) recurrent emotions, creating synaptic networks that link multiple regions. These networks regulate emotions (in real time). But they also stabilize and consolidate with repetition, thus giving rise to habits that are the hallmark of individual development. These configurations are progressively sculpted through individual learning experiences, but they also become increasingly effective with use, thereby expressing both individual trajectories and normative advances as they develop. In sum, experience-driven synaptic changes create a repertoire of individual solutions to universal challenges, shared among members of a culture or society. This description casts individual differences and age-related advances as dual facets of a unitary developmental process.
40

Trappenberg, Thomas P. Fundamentals of Machine Learning. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198828044.001.0001.

Abstract:
Machine learning is exploding, both in research and for industrial applications. This book aims to be a brief introduction to this area given the importance of this topic in many disciplines, from sciences to engineering, and even for its broader impact on our society. This book tries to contribute with a style that keeps a balance between brevity of explanations, the rigor of mathematical arguments, and outlining principle ideas. At the same time, this book tries to give some comprehensive overview of a variety of methods to see their relation on specialization within this area. This includes some introduction to Bayesian approaches to modeling as well as deep learning. Writing small programs to apply machine learning techniques is made easy today by the availability of high-level programming systems. This book offers examples in Python with the machine learning libraries sklearn and Keras. The first four chapters concentrate largely on the practical side of applying machine learning techniques. The book then discusses more fundamental concepts and includes their formulation in a probabilistic context. This is followed by chapters on advanced models, that of recurrent neural networks and that of reinforcement learning. The book closes with a brief discussion on the impact of machine learning and AI on our society.

To the bibliography