
Journal articles on the topic 'RNN (recurrent neural networks)'


Consult the top 50 journal articles for your research on the topic 'RNN (recurrent neural networks).'


1

Ma, Xiao, Peter Karkus, David Hsu, and Wee Sun Lee. "Particle Filter Recurrent Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (2020): 5101–8. http://dx.doi.org/10.1609/aaai.v34i04.5952.

Abstract:
Recurrent neural networks (RNNs) have been extraordinarily successful for prediction with sequential data. To tackle highly variable and multi-modal real-world data, we introduce Particle Filter Recurrent Neural Networks (PF-RNNs), a new RNN family that explicitly models uncertainty in its internal structure: while an RNN relies on a long, deterministic latent state vector, a PF-RNN maintains a latent state distribution, approximated as a set of particles. For effective learning, we provide a fully differentiable particle filter algorithm that updates the PF-RNN latent state distribution accor
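
To make the idea concrete, here is a minimal NumPy sketch of a particle-filter-style RNN step as the abstract describes it: a set of weighted latent particles is propagated by a shared transition and reweighted on each input, and the belief used for prediction is the weight-averaged particle. All weights, the noise model, and the reweighting score are illustrative placeholders, not the PF-RNN's actual gated cell or soft-resampling scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_rnn_step(particles, log_w, x, Wh, Wx, b, Wo):
    """One step of a toy particle-filter RNN cell.

    particles: (K, H) latent particles, log_w: (K,) log weights, x: (D,) input.
    Returns updated particles, log weights, and a weighted-mean belief vector.
    """
    # Stochastic transition: shared RNN update plus per-particle noise.
    noise = 0.1 * rng.standard_normal(particles.shape)
    particles = np.tanh(particles @ Wh + x @ Wx + b) + noise
    # Reweight each particle by an (illustrative) observation score.
    log_w = log_w + particles @ Wo
    log_w -= np.logaddexp.reduce(log_w)          # normalize in log space
    # Belief summary used for prediction: weight-averaged particle.
    belief = np.exp(log_w) @ particles
    return particles, log_w, belief

K, H, D = 8, 16, 4
Wh = 0.1 * rng.standard_normal((H, H))
Wx = 0.1 * rng.standard_normal((D, H))
Wo = 0.1 * rng.standard_normal(H)
b = np.zeros(H)

particles, log_w = np.zeros((K, H)), np.full(K, -np.log(K))
for t in range(5):                               # short random input sequence
    particles, log_w, belief = pf_rnn_step(
        particles, log_w, rng.standard_normal(D), Wh, Wx, b, Wo)
print(belief.shape)                              # (16,)
```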
2

Schäfer, Anton Maximilian, and Hans-Georg Zimmermann. "Recurrent Neural Networks Are Universal Approximators." International Journal of Neural Systems 17, no. 04 (2007): 253–63. http://dx.doi.org/10.1142/s0129065707001111.

Abstract:
Recurrent Neural Networks (RNN) have been developed for a better understanding and analysis of open dynamical systems. Still the question often arises if RNN are able to map every open dynamical system, which would be desirable for a broad spectrum of applications. In this article we give a proof for the universal approximation ability of RNN in state space model form and even extend it to Error Correction and Normalized Recurrent Neural Networks.
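
For context, the "state space model form" referred to in this abstract is conventionally written as the pair of equations (standard notation assumed here, not quoted from the article)

$$ s_{t+1} = f(A s_t + B x_t - \theta), \qquad y_t = C s_t, $$

where $x_t$ is the external input, $s_t$ the hidden state, $y_t$ the output, $f$ a sigmoidal activation applied componentwise, and $A$, $B$, $C$ weight matrices; the universal approximation result states that any open dynamical system can be approximated to arbitrary accuracy, in the sense made precise in the article, by a network of this form.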
3

Ma, Qianli, Zhenxi Lin, Enhuan Chen, and Garrison Cottrell. "Temporal Pyramid Recurrent Neural Network." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (2020): 5061–68. http://dx.doi.org/10.1609/aaai.v34i04.5947.

Abstract:
Learning long-term and multi-scale dependencies in sequential data is a challenging task for recurrent neural networks (RNNs). In this paper, a novel RNN structure called temporal pyramid RNN (TP-RNN) is proposed to achieve these two goals. TP-RNN is a pyramid-like structure and generally has multiple layers. In each layer of the network, there are several sub-pyramids connected by a shortcut path to the output, which can efficiently aggregate historical information from hidden states and provide many gradient feedback short-paths. This avoids back-propagating through many hidden states as in
4

Jung, Suk-Hwan, and Yong-Joo Chung. "Sound event detection using deep neural networks." TELKOMNIKA Telecommunication, Computing, Electronics and Control 18, no. 5 (2020): 2587–96. https://doi.org/10.12928/TELKOMNIKA.v18i5.14246.

Abstract:
We applied various architectures of deep neural networks for sound event detection and compared their performance using two different datasets. Feed forward neural network (FNN), convolutional neural network (CNN), recurrent neural network (RNN) and convolutional recurrent neural network (CRNN) were implemented using hyper-parameters optimized for each architecture and dataset. The results show that the performance of deep neural networks varied significantly depending on the learning rate, which can be optimized by conducting a series of experiments on the validation data over predetermined r
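
Since the abstract highlights that performance hinged on the learning rate chosen through validation experiments, here is a hedged Keras sketch of that kind of sweep for a small GRU-based frame-level event detector; the shapes, rate grid, and random data are placeholders, not the paper's datasets or architectures.

```python
import numpy as np
import tensorflow as tf

# Placeholder data: 200 clips, 50 frames of 40 log-mel features,
# 5 event classes per frame (multi-label). Not the paper's datasets.
X = np.random.rand(200, 50, 40).astype("float32")
Y = (np.random.rand(200, 50, 5) > 0.8).astype("float32")

def build_rnn(lr):
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(50, 40)),
        tf.keras.layers.GRU(64, return_sequences=True),
        tf.keras.layers.Dense(5, activation="sigmoid"),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr),
                  loss="binary_crossentropy")
    return model

# Sweep a grid of learning rates and keep the one with the lowest
# validation loss, as the abstract suggests doing per architecture/dataset.
results = {}
for lr in (1e-2, 1e-3, 1e-4):
    hist = build_rnn(lr).fit(X[:160], Y[:160],
                             validation_data=(X[160:], Y[160:]),
                             epochs=3, batch_size=16, verbose=0)
    results[lr] = hist.history["val_loss"][-1]
print("best learning rate:", min(results, key=results.get))
```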
5

Lyu, Shengfei, and Jiaqi Liu. "Convolutional Recurrent Neural Networks for Text Classification." Journal of Database Management 32, no. 4 (2021): 65–82. http://dx.doi.org/10.4018/jdm.2021100105.

Abstract:
Recurrent neural network (RNN) and convolutional neural network (CNN) are two prevailing architectures used in text classification. Traditional approaches combine the strengths of these two networks by straightly streamlining them or linking features extracted from them. In this article, a novel approach is proposed to maintain the strengths of RNN and CNN to a great extent. In the proposed approach, a bi-directional RNN encodes each word into forward and backward hidden states. Then, a neural tensor layer is used to fuse bi-directional hidden states to get word representations. Meanwhile, a c
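
A rough Keras sketch of the general convolutional-recurrent shape the abstract describes: a bi-directional RNN produces per-word forward and backward states, and a convolutional stage pools them into a document representation. The neural tensor fusion layer of the proposed approach is not reproduced here (a plain Conv1D stands in), and all sizes and data are illustrative.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

VOCAB, MAXLEN, CLASSES = 20_000, 100, 4   # placeholder sizes

model = tf.keras.Sequential([
    layers.Input(shape=(MAXLEN,), dtype="int32"),
    layers.Embedding(VOCAB, 128),
    # Bi-directional RNN: forward and backward hidden states per word.
    layers.Bidirectional(layers.GRU(64, return_sequences=True)),
    # Convolution over the word representations (a plain Conv1D stands in
    # for the neural-tensor fusion described in the abstract).
    layers.Conv1D(128, 3, activation="relu"),
    layers.GlobalMaxPooling1D(),
    layers.Dense(CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Toy batch of tokenized documents and class labels.
X = np.random.randint(1, VOCAB, size=(64, MAXLEN))
y = np.random.randint(0, CLASSES, size=(64,))
model.fit(X, y, epochs=1, verbose=0)
```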
6

Hindarto, Djarot. "Comparison of RNN Architectures and Non-RNN Architectures in Sentiment Analysis." sinkron 8, no. 4 (2023): 2537–46. http://dx.doi.org/10.33395/sinkron.v8i4.13048.

Abstract:
This study compares the sentiment analysis performance of multiple Recurrent Neural Network architectures and One-Dimensional Convolutional Neural Networks. The methods evaluated are a simple Recurrent Neural Network, Long Short-Term Memory, Gated Recurrent Unit, Bidirectional Recurrent Neural Network, and 1D ConvNets. A dataset comprising text reviews with positive or negative sentiment labels was evaluated. All evaluated models demonstrated extremely high accuracy, ranging from 99.81% to 99.99%. Apart from that, the loss generated by these models is also low, ranging from 0.0043 to 0.0021.
7

Tridarma, Panggih, and Sukmawati Nur Endah. "Pengenalan Ucapan Bahasa Indonesia Menggunakan MFCC dan Recurrent Neural Network." JURNAL MASYARAKAT INFORMATIKA 11, no. 2 (2020): 36–44. http://dx.doi.org/10.14710/jmasif.11.2.34874.

Abstract:
Speech recognition is a technological development in the field of voice processing. It allows software to recognize words spoken by humans and present them in written form. However, problems remain in recognizing spoken words, such as differing voice characteristics, age, health, and gender. This study addresses Indonesian speech recognition using Mel-Frequency Cepstral Coefficients (MFCC) as the feature-extraction method and a Recurrent Neural Network (RNN) as the recognition method, with
8

Kao, Jonathan C. "Considerations in using recurrent neural networks to probe neural dynamics." Journal of Neurophysiology 122, no. 6 (2019): 2504–21. http://dx.doi.org/10.1152/jn.00467.2018.

Abstract:
Recurrent neural networks (RNNs) are increasingly being used to model complex cognitive and motor tasks performed by behaving animals. RNNs are trained to reproduce animal behavior while also capturing key statistics of empirically recorded neural activity. In this manner, the RNN can be viewed as an in silico circuit whose computational elements share similar motifs with the cortical area it is modeling. Furthermore, because the RNN’s governing equations and parameters are fully known, they can be analyzed to propose hypotheses for how neural populations compute. In this context, we present i
9

Rath, Ankit, and Subhrajyoti Ranjan Sahu. "Recurrent Neural Networks for Recommender Systems." Computational Intelligence and Machine Learning 1, no. 1 (2020): 31–36. http://dx.doi.org/10.36647/ciml/01.01.a004.

Abstract:
The Internet is becoming one of the biggest sources of information in recent years, keeping people updated about everyday events. The information available on it is also growing with the increase in the use of the Internet. Due to this, it takes a great deal of time and effort to locate relevant knowledge that the user wants. Recommender systems are software mechanisms that automatically suggest relevant, user-needed information. Recurrent Neural Networks have lately gained importance in the field of recommender systems, since they give improved results in building deep learning models with seq
10

Venkateswarlu, B., and C. Gulzar. "Spam Classification using Recurrent Neural Networks." International Journal of Engineering Technology and Management Sciences 9, no. 2 (2025): 684–89. https://doi.org/10.46647/ijetms.2025.v09i02.087.

Abstract:
Spam classification is a critical task in email filtering systems to distinguish between legitimate and spam emails. Traditional machine learning methods have been used for this purpose, but they often struggle to capture the complex patterns and variations in spam emails. In this paper, we propose a novel approach using Recurrent Neural Networks (RNNs) for spam classification. RNNs are well suited for sequence modeling tasks like this, as they can capture dependencies between words in an email. We use a Long Short-Term Memory (LSTM) RNN architecture, known for its ability to retain information over
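
A minimal Keras sketch of the kind of LSTM spam classifier the abstract proposes, assuming emails have already been tokenized to integer word ids; the vocabulary size, sequence length, and random toy data are placeholders rather than the authors' setup.

```python
import numpy as np
import tensorflow as tf

VOCAB, MAXLEN = 10_000, 200   # placeholder vocabulary and email length

# Toy tokenized emails: integer word ids, label 1 = spam, 0 = legitimate.
X = np.random.randint(0, VOCAB, size=(500, MAXLEN))
y = np.random.randint(0, 2, size=(500,))

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB, 64),
    tf.keras.layers.LSTM(64),                        # retains context across the email
    tf.keras.layers.Dense(1, activation="sigmoid"),  # spam probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=2, batch_size=32, validation_split=0.2, verbose=0)
```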
11

Alam, Muhammad S., AKM B. Hossain, and Farhan B. Mohamed. "Performance Evaluation of Recurrent Neural Networks Applied to Indoor Camera Localization." International Journal of Emerging Technology and Advanced Engineering 12, no. 8 (2022): 116–24. http://dx.doi.org/10.46338/ijetae0822_15.

Abstract:
Researchers in robotics and computer vision are experimenting with the image-based localization of indoor cameras. Implementation of indoor camera localization problems using a Convolutional neural network (CNN) or Recurrent neural network (RNN) is more challenging from a large image dataset because of the internal structure of CNN or RNN. We can choose a preferable CNN or RNN variant based on the problem type and size of the dataset. CNN is the most flexible method for implementing indoor localization problems. Despite CNN's suitability for hyper-parameter selection, it requires a lot of trai
12

Datta, Debajit, Preetha Evangeline David, Dhruv Mittal, and Anukriti Jain. "Neural Machine Translation using Recurrent Neural Network." International Journal of Engineering and Advanced Technology (IJEAT) 9, no. 4 (2020): 1395–400. https://doi.org/10.35940/ijeat.D7637.049420.

Abstract:
In this era of globalization, it is quite likely to come across people or community who do not share the same language for communication as us. To acknowledge the problems caused by this, we have machine translation systems being developed. Developers of several reputed organizations like Google LLC, have been working to bring algorithms to support machine translations using machine learning algorithms like Artificial Neural Network (ANN) in order to facilitate machine translation. Several Neural Machine Translations have been developed in this regard, but Recurrent Neural Network (RNN), on th
13

Aribowo, Widi. "Elman-Recurrent Neural Network for Load Shedding Optimization." SINERGI 24, no. 1 (2020): 29. http://dx.doi.org/10.22441/sinergi.2020.1.005.

Abstract:
Load shedding plays a key part in the avoidance of the power system outage. The frequency and voltage fluidity leads to the spread of a power system into sub-systems and leads to the outage as well as the severe breakdown of the system utility. In recent years, Neural networks have been very victorious in several signal processing and control applications. Recurrent Neural networks are capable of handling complex and non-linear problems. This paper provides an algorithm for load shedding using ELMAN Recurrent Neural Networks (RNN). Elman has proposed a partially RNN, where the feedforward conn
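
For reference, the Elman update sketched in this abstract, a feedforward pass augmented with context units that feed the previous hidden state back in, can be written as (standard Elman notation assumed, not taken from the article)

$$ h_t = \sigma(W_x x_t + W_c h_{t-1} + b_h), \qquad y_t = \sigma(W_y h_t + b_y), $$

where the context units hold a fixed copy of $h_{t-1}$, so that in the partially recurrent setting only the forward weights are trained.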
14

Tin, Ting Tin, Cheok Jia Wei, Ong Tzi Min, Boo Zheng Feng, and Too Chin Xian. "Real estate price forecasting utilizing recurrent neural networks incorporating genetic algorithms." International Journal of Innovative Research and Scientific Studies 7, no. 3 (2024): 1216–26. http://dx.doi.org/10.53894/ijirss.v7i3.3220.

Abstract:
This study aims to develop and examine the effectiveness of the Recurrent Neural Network (RNN) model incorporating Genetic Algorithm (GA) in forecasting real estate prices. Real estate prices have a significant impact on a country’s financial system. Therefore, the ability to accurately forecast its price is valuable. A set of data containing 5.4 million unique records of real estate with their prices is used in the study. The data set, which spans from 2018 to 2021, contains twelve independent variables and one dependent variable. We preprocessed the data set to reduce noise and outliers that
15

Warcita, Kurniabudi, and Eko Arip Winanto. "Detection of UDP Flooding DDoS Attacks on IoT Networks Using Recurrent Neural Network." Jurnal Nasional Pendidikan Teknik Informatika (JANAPATI) 13, no. 3 (2024): 471–81. https://doi.org/10.23887/janapati.v13i3.79601.

Abstract:
The Internet of Things (IoT) is a concept where an object can transfer data through a network without requiring human interaction. The complexity of IoT networks makes them vulnerable to cyber attacks such as UDP Flood DDoS attacks, which can disrupt IoT devices. Therefore, this study proposes an attack detection method using a deep learning approach with the Recurrent Neural Network (RNN) method. This study uses Principal Component Analysis (PCA) to reduce the feature dimension before learning with the RNN. The purpose of this study is to test the combined performance of the PCA and RNN methods to de
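
A hedged sketch of the PCA-then-RNN pipeline the abstract outlines, using scikit-learn and Keras; the feature count, component count, window length, and random flow records are placeholders, and windowing consecutive records into short sequences is one simple way (assumed here) to give the RNN a temporal axis.

```python
import numpy as np
import tensorflow as tf
from sklearn.decomposition import PCA

# Placeholder flow records: 1000 samples x 40 features, label 1 = UDP flood.
X = np.random.rand(1000, 40).astype("float32")
y = np.random.randint(0, 2, size=(1000,))

# Step 1: PCA reduces the feature dimension (10 components chosen arbitrarily).
Xr = PCA(n_components=10).fit_transform(X)

# Step 2: window consecutive records into short sequences for the RNN,
# labelling each window by its last record.
T = 5
seqs = np.stack([Xr[i:i + T] for i in range(len(Xr) - T + 1)])
labels = y[T - 1:]

model = tf.keras.Sequential([
    tf.keras.Input(shape=(T, 10)),
    tf.keras.layers.SimpleRNN(32),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(seqs, labels, epochs=2, batch_size=32, verbose=0)
```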
16

Wibowo, Angga, Kurnianingsih, and Eri Sato-Shimokawara. "Prediction Analysis of Greeting Gestures Based on Recurrent Neural Networks." JOIV : International Journal on Informatics Visualization 9, no. 3 (2025): 1066. https://doi.org/10.62527/joiv.9.3.2917.

Abstract:
Human activity recognition, such as rehabilitation, sports, human behavior, etc., is developing rapidly. A Recurrent Neural Network (RNN) is a practical approach to human activity recognition research and sequential data. However, studies on recognizing human activities rarely study culture, including greeting gestures. And studies seldom use small datasets when employing the RNN approach, as they typically utilize large amounts of data to conduct such studies. This study aims to predict greeting gestures from Japan and Indonesia with limited data. This study proposes and compares six RNN arch
17

Suresh, Naveen, Neelesh Chinnakonda Ashok Kumar, Srikumar Subramanian, and Gowri Srinivasa. "Memory augmented recurrent neural networks for de-novo drug design." PLOS ONE 17, no. 6 (2022): e0269461. http://dx.doi.org/10.1371/journal.pone.0269461.

Abstract:
A recurrent neural network (RNN) is a machine learning model that learns the relationship between elements of an input series, in addition to inferring a relationship between the data input to the model and target output. Memory augmentation allows the RNN to learn the interrelationships between elements of the input over a protracted length of the input series. Inspired by the success of stack augmented RNN (StackRNN) to generate strings for various applications, we present two memory augmented RNN-based architectures: the Neural Turing Machine (NTM) and the Differentiable Neural Computer (DN
18

Paramasivan, Senthil Kumar. "Deep Learning Based Recurrent Neural Networks to Enhance the Performance of Wind Energy Forecasting: A Review." Revue d'Intelligence Artificielle 35, no. 1 (2021): 1–10. http://dx.doi.org/10.18280/ria.350101.

Abstract:
In the modern era, deep learning is a powerful technique in the field of wind energy forecasting. The deep neural network effectively handles the seasonal variation and uncertainty characteristics of wind speed by proper structural design, objective function optimization, and feature learning. The present paper focuses on the critical analysis of wind energy forecasting using deep learning based Recurrent neural networks (RNN) models. It explores RNN and its variants, such as simple RNN, Long Short Term Memory (LSTM), Gated Recurrent Unit (GRU), and Bidirectional RNN models. The recurrent neur
19

Hu, Hao, Liqiang Wang, and Guo-Jun Qi. "Learning to Adaptively Scale Recurrent Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 3822–29. http://dx.doi.org/10.1609/aaai.v33i01.33013822.

Abstract:
Recent advancements in recurrent neural network (RNN) research have demonstrated the superiority of utilizing multiscale structures in learning temporal representations of time series. Currently, most of multiscale RNNs use fixed scales, which do not comply with the nature of dynamical temporal patterns among sequences. In this paper, we propose Adaptively Scaled Recurrent Neural Networks (ASRNN), a simple but efficient way to handle this problem. Instead of using predefined scales, ASRNNs are able to learn and adjust scales based on different temporal contexts, making them more flexible in mo
20

Helfrich, Kyle, and Qiang Ye. "Eigenvalue Normalized Recurrent Neural Networks for Short Term Memory." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (2020): 4115–22. http://dx.doi.org/10.1609/aaai.v34i04.5831.

Abstract:
Several variants of recurrent neural networks (RNNs) with orthogonal or unitary recurrent matrices have recently been developed to mitigate the vanishing/exploding gradient problem and to model long-term dependencies of sequences. However, with the eigenvalues of the recurrent matrix on the unit circle, the recurrent state retains all input information which may unnecessarily consume model capacity. In this paper, we address this issue by proposing an architecture that expands upon an orthogonal/unitary RNN with a state that is generated by a recurrent matrix with eigenvalues in the unit disc.
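
As a simple numerical illustration of the distinction the abstract draws (eigenvalues inside the unit disc rather than on the unit circle), the sketch below rescales a recurrent matrix by its spectral radius; this is a generic normalization for demonstration only, not the paper's actual parameterization.

```python
import numpy as np

rng = np.random.default_rng(2)
H = 64
W = rng.standard_normal((H, H)) / np.sqrt(H)

# Spectral radius of the recurrent matrix.
rho = np.max(np.abs(np.linalg.eigvals(W)))

# Rescale so every eigenvalue lies strictly inside the unit disc (radius 0.95).
W_disc = W * (0.95 / rho) if rho > 0.95 else W
print(np.max(np.abs(np.linalg.eigvals(W_disc))))   # <= 0.95
```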
21

Das, Hitoishi. "Image Captioning using Convolutional Neural Networks and Long Short Term Memory Cells." International Journal of Recent Technology and Engineering (IJRTE) 11, no. 1 (2022): 91–95. https://doi.org/10.35940/ijrte.E6741.0511122.

Abstract:
This paper discusses an efficient approach to captioning a given image using a combination of Convolutional Neural Network (CNN) and Recurrent Neural Networks (RNN) with Long Short Term Memory Cells (LSTM). Image captioning is a realm of deep learning and computer vision which deals with generating relevant captions for a given input image. The research in this area includes the hyperparameter tuning of Convolutional Neural Networks and Recurrent Neural Networks to generate captions which are as accurate as possible. The basic outline of the process includes giving a
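
A compact Keras sketch of the common CNN-plus-LSTM captioning shape this abstract refers to: precomputed image features and a caption prefix are merged to predict the next word. The feature size, vocabulary, and toy batch are placeholders; a real system would plug in a pretrained CNN backbone and a decoding loop such as beam search.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

VOCAB, MAXLEN, FEAT = 5000, 20, 2048   # placeholder sizes

# Image branch: precomputed CNN features (e.g. from a pretrained backbone).
img_in = layers.Input(shape=(FEAT,))
img_emb = layers.Dense(256, activation="relu")(img_in)

# Text branch: the caption prefix runs through an embedding and an LSTM.
txt_in = layers.Input(shape=(MAXLEN,), dtype="int32")
txt_state = layers.LSTM(256)(layers.Embedding(VOCAB, 256, mask_zero=True)(txt_in))

# Merge both branches and predict the next caption word.
out = layers.Dense(VOCAB, activation="softmax")(layers.add([img_emb, txt_state]))

model = tf.keras.Model([img_in, txt_in], out)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Toy batch: random image features, caption prefixes, and next-word targets.
feats = np.random.rand(8, FEAT).astype("float32")
prefix = np.random.randint(1, VOCAB, size=(8, MAXLEN))
next_word = np.random.randint(1, VOCAB, size=(8,))
model.fit([feats, prefix], next_word, epochs=1, verbose=0)
```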
22

Zafri Wan Yahaya, Wan Muhammad, Fadhlan Hafizhelmi Kamaru Zaman, and Mohd Fuad Abdul Latip. "Prediction of energy consumption using recurrent neural networks (RNN) and nonlinear autoregressive neural network with external input (NARX)." Indonesian Journal of Electrical Engineering and Computer Science 17, no. 3 (2020): 1215. http://dx.doi.org/10.11591/ijeecs.v17.i3.pp1215-1223.

Abstract:
Recurrent Neural Networks (RNN) and Nonlinear Autoregressive Neural Network with External Input (NARX) are recently applied in predicting energy consumption. Energy consumption prediction for depth analysis of how electrical energy consumption is managed on Tower 2 Engineering Building is critical in order to reduce the energy usage and the operational cost. Prediction of energy consumption in this building will bring great benefits to the Faculty of Electrical Engineering UiTM Shah Alam. In this work, we present the comparative study on the performance of prediction of energy consumption in T
23

Wan Yahaya, Wan Muhammad Zafri, Fadhlan Hafizhelmi Kamaru Zaman, and Mohd Fuad Abdul Latip. "Prediction of energy consumption using recurrent neural networks (RNN) and nonlinear autoregressive neural network with external input (NARX)." Indonesian Journal of Electrical Engineering and Computer Science (IJEECS) 17, no. 3 (2020): 1215–23. https://doi.org/10.11591/ijeecs.v17.i3.pp1215-1223.

Abstract:
Recurrent Neural Networks (RNN) and Nonlinear Autoregressive Neural Network with External Input (NARX) are recently applied in predicting energy consumption. Energy consumption prediction for depth analysis of how electrical energy consumption is managed on Tower 2 Engineering Building is critical in order to reduce the energy usage and the operational cost. Prediction of energy consumption in this building will bring great benefits to the Faculty of Electrical Engineering UiTM Shah Alam. In this work, we present the comparative study on the performance of prediction of energy consumption in T
24

Liu, Xuanxin, Fu Xu, Yu Sun, Haiyan Zhang, and Zhibo Chen. "Convolutional Recurrent Neural Networks for Observation-Centered Plant Identification." Journal of Electrical and Computer Engineering 2018 (2018): 1–7. http://dx.doi.org/10.1155/2018/9373210.

Abstract:
Traditional image-centered methods of plant identification could be confused due to various views, uneven illuminations, and growth cycles. To tolerate the significant intraclass variances, the convolutional recurrent neural networks (C-RNNs) are proposed for observation-centered plant identification to mimic human behaviors. The C-RNN model is composed of two components: the convolutional neural network (CNN) backbone is used as a feature extractor for images, and the recurrent neural network (RNN) units are built to synthesize multiview features from each image for final prediction. Extensiv
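
A minimal Keras sketch of the observation-centred C-RNN layout this abstract describes: a shared CNN backbone extracts a feature vector per view, and a recurrent unit synthesizes the multi-view sequence for the final prediction. The backbone depth, image size, view count, and class count are placeholder choices, not the paper's configuration.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

V, H, W = 4, 64, 64   # placeholder: 4 views per observation, 64x64 RGB images

# Small CNN backbone applied to every view in the observation.
backbone = tf.keras.Sequential([
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),
])

obs_in = layers.Input(shape=(V, H, W, 3))
per_view = layers.TimeDistributed(backbone)(obs_in)    # (batch, V, 32)
fused = layers.GRU(64)(per_view)                       # synthesize multi-view features
out = layers.Dense(10, activation="softmax")(fused)    # 10 placeholder classes

model = tf.keras.Model(obs_in, out)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(np.random.rand(8, V, H, W, 3).astype("float32"),
          np.random.randint(0, 10, size=(8,)), epochs=1, verbose=0)
```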
25

Tiňo, Peter, and Barbara Hammer. "Architectural Bias in Recurrent Neural Networks: Fractal Analysis." Neural Computation 15, no. 8 (2003): 1931–57. http://dx.doi.org/10.1162/08997660360675099.

Abstract:
We have recently shown that when initialized with “small” weights, recurrent neural networks (RNNs) with standard sigmoid-type activation functions are inherently biased toward Markov models; even prior to any training, RNN dynamics can be readily used to extract finite memory machines (Hammer & Tiňo, 2002; Tiňo, Čerňanský, & Beňušková, 2002a, 2002b). Following Christiansen and Chater (1999), we refer to this phenomenon as the architectural bias of RNNs. In this article, we extend our work on the architectural bias in RNNs by performing a rigorous fractal analysis of recurrent activatio
26

Rajakumar, Alfred, John Rinzel, and Zhe S. Chen. "Stimulus-Driven and Spontaneous Dynamics in Excitatory-Inhibitory Recurrent Neural Networks for Sequence Representation." Neural Computation 33, no. 10 (2021): 2603–45. http://dx.doi.org/10.1162/neco_a_01418.

Abstract:
Recurrent neural networks (RNNs) have been widely used to model sequential neural dynamics (“neural sequences”) of cortical circuits in cognitive and motor tasks. Efforts to incorporate biological constraints and Dale's principle will help elucidate the neural representations and mechanisms of underlying circuits. We trained an excitatory-inhibitory RNN to learn neural sequences in a supervised manner and studied the representations and dynamic attractors of the trained network. The trained RNN was robust to trigger the sequence in response to various input signals and interpolated a
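
One standard way (assumed here, not necessarily the authors' exact implementation) to impose the excitatory-inhibitory constraint mentioned in this abstract is to keep an unconstrained weight matrix and multiply its absolute value by a fixed sign per presynaptic unit, as in this NumPy sketch.

```python
import numpy as np

rng = np.random.default_rng(1)
N, frac_exc = 100, 0.8          # assumed 80/20 excitatory/inhibitory split

# Unconstrained trainable weights plus a fixed sign per presynaptic unit.
W_free = rng.standard_normal((N, N))
signs = np.where(np.arange(N) < int(frac_exc * N), 1.0, -1.0)

# Dale's principle: column j (outgoing weights of unit j) has a single sign.
W_dale = np.abs(W_free) * signs[np.newaxis, :]

assert all((W_dale[:, j] >= 0).all() or (W_dale[:, j] <= 0).all()
           for j in range(N))
```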
27

Bhatia, Manpreet Kaur, and Vinayak Bhatt. "Forecasting Time Series Data using Recurrent Neural Networks: A Systematic Review." Journal for Research in Applied Sciences and Biotechnology 3, no. 6 (2024): 184–89. https://doi.org/10.55544/jrasb.3.6.22.

Abstract:
Time series forecasting is crucial in multiple application areas, including finance, healthcare, energy management, and climate modeling. RNNs serve as a powerful deep learning tool because they possess the ability to detect sequential data patterns and extract temporal dependencies from time series data, whereas traditional statistical methods were previously the dominant approach. This paper conducts an organized review of modern techniques for predicting time series data using RNNs. The discussion covers three major RNN architectures together wi
28

Mohd Ruslan, Muhammad Faridzul Faizal, and Mohd Firdaus Hassan. "Unbalance Failure Recognition Using Recurrent Neural Network." International Journal of Automotive and Mechanical Engineering 19, no. 2 (2022): 9668–80. http://dx.doi.org/10.15282/ijame.19.2.2022.04.0746.

Abstract:
Many machine learning models have been created in recent years, which focus on recognising bearings and gearboxes with less attention on detecting unbalance issues. Unbalance is a fundamental issue that frequently occurs in deteriorating machinery, which requires checking prior to significant faults such as bearing and gearbox failures. Unbalance will propagate unless correction happens, causing damage to neighbouring components, such as bearings and mechanical seals. Because recurrent neural networks are well-known for their performance with sequential data, in this study, RNN is proposed to
29

Kim, Robert, Yinghao Li, and Terrence J. Sejnowski. "Simple framework for constructing functional spiking recurrent neural networks." Proceedings of the National Academy of Sciences 116, no. 45 (2019): 22811–20. http://dx.doi.org/10.1073/pnas.1905926116.

Abstract:
Cortical microcircuits exhibit complex recurrent architectures that possess dynamically rich properties. The neurons that make up these microcircuits communicate mainly via discrete spikes, and it is not clear how spikes give rise to dynamics that can be used to perform computationally challenging tasks. In contrast, continuous models of rate-coding neurons can be trained to perform complex tasks. Here, we present a simple framework to construct biologically realistic spiking recurrent neural networks (RNNs) capable of learning a wide range of tasks. Our framework involves training a continuou
30

Mayr, Franz, Sergio Yovine, and Ramiro Visca. "Property Checking with Interpretable Error Characterization for Recurrent Neural Networks." Machine Learning and Knowledge Extraction 3, no. 1 (2021): 205–27. http://dx.doi.org/10.3390/make3010010.

Abstract:
This paper presents a novel on-the-fly, black-box, property-checking through learning approach as a means for verifying requirements of recurrent neural networks (RNN) in the context of sequence classification. Our technique steps on a tool for learning probably approximately correct (PAC) deterministic finite automata (DFA). The sequence classifier inside the black-box consists of a Boolean combination of several components, including the RNN under analysis together with requirements to be checked, possibly modeled as RNN themselves. On one hand, if the output of the algorithm is an empty DFA
31

Zhang, Xiyue, Xiaoning Du, Xiaofei Xie, Lei Ma, Yang Liu, and Meng Sun. "Decision-Guided Weighted Automata Extraction from Recurrent Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 13 (2021): 11699–707. http://dx.doi.org/10.1609/aaai.v35i13.17391.

Abstract:
Recurrent Neural Networks (RNNs) have demonstrated their effectiveness in learning and processing sequential data (e.g., speech and natural language). However, due to the black-box nature of neural networks, understanding the decision logic of RNNs is quite challenging. Some recent progress has been made to approximate the behavior of an RNN by weighted automata. They provide better interpretability, but still suffer from poor scalability. In this paper, we propose a novel approach to extracting weighted automata with the guidance of a target RNN's decision and context information. In particul
32

Rajpal, S. U., and Yash Katariya. "Breast Cancer Prediction from Gene Expression Data Using Recurrent Neural Networks." ICTACT Journal on Data Science and Machine Learning 6, no. 1 (2024): 735–38. https://doi.org/10.21917/ijdsml.2024.0150.

Abstract:
Gene expression data holds significant potential for identifying biomarkers and predicting the progression of breast cancer. Despite advancements in machine learning, accurately predicting breast cancer from gene expression data remains a challenge due to high-dimensionality, noise, and feature correlation in datasets. This study proposes a hybrid Recurrent Neural Network (RNN) to enhance prediction accuracy. The RNN combines convolutional layers for feature extraction with recurrent layers to capture sequential dependencies inherent in gene expression data. The method begins by preprocessing
33

Chen, Chi-Kan. "Inference of gene networks from gene expression time series using recurrent neural networks and sparse MAP estimation." Journal of Bioinformatics and Computational Biology 16, no. 04 (2018): 1850009. http://dx.doi.org/10.1142/s0219720018500099.

Abstract:
Background: The inference of genetic regulatory networks (GRNs) provides insight into the cellular responses to signals. A class of recurrent neural networks (RNNs) capturing the dynamics of GRN has been used as a basis for inferring small-scale GRNs from gene expression time series. The Bayesian framework facilitates incorporating the hypothesis of GRN into the model estimation to improve the accuracy of GRN inference. Results: We present new methods for inferring small-scale GRNs based on RNNs. The weights of wires of RNN represent the strengths of gene-to-gene regulatory interactions. We us
34

Ji, Junjie, Yongzhang Zhou, Qiuming Cheng, Shoujun Jiang, and Shiting Liu. "Landslide Susceptibility Mapping Based on Deep Learning Algorithms Using Information Value Analysis Optimization." Land 12, no. 6 (2023): 1125. http://dx.doi.org/10.3390/land12061125.

Abstract:
Selecting samples with non-landslide attributes significantly impacts the deep-learning modeling of landslide susceptibility mapping. This study presents a method of information value analysis in order to optimize the selection of negative samples used for machine learning. Recurrent neural network (RNN) has a memory function, so when using an RNN for landslide susceptibility mapping purposes, the input order of the landslide-influencing factors affects the resulting quality of the model. The information value analysis calculates the landslide-influencing factors, determines the input order of
35

Xu, Chang, Weiran Huang, Hongwei Wang, Gang Wang, and Tie-Yan Liu. "Modeling Local Dependence in Natural Language with Multi-Channel Recurrent Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 5525–32. http://dx.doi.org/10.1609/aaai.v33i01.33015525.

Abstract:
Recurrent Neural Networks (RNNs) have been widely used in processing natural language tasks and achieve huge success. Traditional RNNs usually treat each token in a sentence uniformly and equally. However, this may miss the rich semantic structure information of a sentence, which is useful for understanding natural languages. Since semantic structures such as word dependence patterns are not parameterized, it is a challenge to capture and leverage structure information. In this paper, we propose an improved variant of RNN, Multi-Channel RNN (MC-RNN), to dynamically capture and leverage local s
36

Jose, Abin, Rijo Roy, Daniel Moreno-Andrés, and Johannes Stegmaier. "Automatic detection of cell-cycle stages using recurrent neural networks." PLOS ONE 19, no. 3 (2024): e0297356. http://dx.doi.org/10.1371/journal.pone.0297356.

Abstract:
Mitosis is the process by which eukaryotic cells divide to produce two similar daughter cells with identical genetic material. Research into the process of mitosis is therefore of critical importance both for the basic understanding of cell biology and for the clinical approach to manifold pathologies resulting from its malfunctioning, including cancer. In this paper, we propose an approach to study mitotic progression automatically using deep learning. We used neural networks to predict different mitosis stages. We extracted video sequences of cells undergoing division and trained a Recurrent
37

Паршин, А. И., М. Н. Аралов, В. Ф. Барабанов, and Н. И. Гребенникова. "RANDOM MULTI-MODAL DEEP LEARNING IN THE PROBLEM OF IMAGE RECOGNITION." ВЕСТНИК ВОРОНЕЖСКОГО ГОСУДАРСТВЕННОГО ТЕХНИЧЕСКОГО УНИВЕРСИТЕТА, no. 4 (October 20, 2021): 21–26. http://dx.doi.org/10.36622/vstu.2021.17.4.003.

Abstract:
Image recognition is one of the most difficult problems in machine learning, demanding from the researcher both deep knowledge and large amounts of time and computational resources. When nonlinear and complex data are involved, various deep neural network architectures are applied, yet the choice of neural network remains a difficult question. The main architectures in widespread use are convolutional neural networks (CNN), recurrent neural networks (RNN), and deep neural networks (DNN). On the basis of recurrent neural networks (RNN), networks with long short-
38

Lamar, Annie K. "Generating Metrically Accurate Homeric Poetry with Recurrent Neural Networks." International Journal of Transdisciplinary Artificial Intelligence 2, no. 1 (2020): 1–25. http://dx.doi.org/10.35708/tai1869-126247.

Abstract:
We investigate the generation of metrically accurate Homeric poetry using recurrent neural networks (RNN). We assess two models: a basic encoder-decoder RNN and the hierarchical recurrent encoder-decoder model (HRED). We assess the quality of the generated lines of poetry using quantitative metrical analysis and expert evaluation. This evaluation reveals that while the basic encoder-decoder is able to capture complex poetic meter, it underperforms in terms of semantic coherence. The HRED model, however, produces more semantically coherent lines of poetry but is unable to capture the meter. Our
39

Tito Ayyalasomayajula, Madan Mohan, and Sailaja Ayyalasomayajula. "Improving Machine Reliability with Recurrent Neural Networks." International Journal for Research Publication and Seminar 11, no. 4 (2020): 253–79. http://dx.doi.org/10.36676/jrps.v11.i4.1500.

Abstract:
This study explores the application of recurrent neural networks (RNNs) to enhance machine reliability in industrial settings, specifically in predictive maintenance systems. Predictive maintenance uses previous sensor data to identify abnormalities and forecast machine breakdowns before they occur, lowering downtime and maintenance costs. RNNs are ideal with their unique capacity to handle sequential input while capturing temporal relationships. RNN-based models may reliably foresee machine breakdowns and detect early malfunction indicators, allowing for appropriate interventions. The paper i
40

Luna-Perejón, Francisco, Manuel Jesús Domínguez-Morales, and Antón Civit-Balcells. "Wearable Fall Detector Using Recurrent Neural Networks." Sensors 19, no. 22 (2019): 4885. http://dx.doi.org/10.3390/s19224885.

Abstract:
Falls have become a relevant public health issue due to their high prevalence and negative effects in elderly people. Wearable fall detector devices allow the implementation of continuous and ubiquitous monitoring systems. The effectiveness for analyzing temporal signals with low energy consumption is one of the most relevant characteristics of these devices. Recurrent neural networks (RNNs) have demonstrated a great accuracy in some problems that require analyzing sequential inputs. However, getting appropriate response times in low power microcontrollers remains a difficult task due to their
41

Muhammad, Ukasha, and Gbolagade Morufat Damola. "Cryptocurrencies Price Prediction Using Deep Learning Models (Gated Recurrent Unit and Recurrent Neural Network)." Kasu Journal of Computer Science 1, no. 3 (2024): 544–52. http://dx.doi.org/10.47514/kjcs/2024.1.3.0011.

Abstract:
Background: With their volatile prices, cryptocurrencies have become valuable assets in the financial market. Predicting cryptocurrency prices accurately is essential for making well-informed investment decisions. Time series prediction models, like Gated Recurrent Unit (GRU) and Recurrent Neural Networks (RNN), are popular tools for financial data forecasting because they can capture sequential dependencies in data. Aim: This study aims to predict the average monthly closing prices of five major cryptocurrencies—Bitcoin (BTC), Ethereum (ETH), Binance Coin (BNB), Litecoin (LTC), and Ripple (XR
42

Omankwu, Obinnaya Chinecherem, and Ubah Valetine Ifeanyi. "Hybrid Deep Learning Model for Heart Disease Prediction Using Recurrent Neural Network (RNN)." NIPES Journal of Science and Technology Research 5, no. 2 (2023): 184–94. https://doi.org/10.5281/zenodo.8014330.

Abstract:
In this paper, we use a recurrent neural network (RNN) that combines multiple gated recurrent units (GRUs), long short-term memory (LSTM), and the Adam optimizer to develop a new hybrid deep learning model for heart disease prediction. This proposed model yielded an excellent accuracy of 98.6876%. This proposed model is a hybrid of GRUs and RNNs. The model was developed in Python 3.7 by integrating multiple GRUs and RNNs working with Keras and Tensorflow as backends for the deep learning process and is supported by various Python libraries. A recent existing model using RNN
43

Liang, Kaiwei, Na Qin, Deqing Huang, and Yuanzhe Fu. "Convolutional Recurrent Neural Network for Fault Diagnosis of High-Speed Train Bogie." Complexity 2018 (October 23, 2018): 1–13. http://dx.doi.org/10.1155/2018/4501952.

Abstract:
Timely detection and efficient recognition of fault are challenging for the bogie of high-speed train (HST), owing to the fact that different types of fault signals have similar characteristics in the same frequency range. Notice that convolutional neural networks (CNNs) are powerful in extracting high-level local features and that recurrent neural networks (RNNs) are capable of learning long-term context dependencies in vibration signals. In this paper, by combining CNN and RNN, a so-called convolutional recurrent neural network (CRNN) is proposed to diagnose various faults of the HST bogie,
44

Mahto, Kunal, Subhash Chandra Dutta, and Laljee Manjhi. "Evaluating Neural Network Techniques in Intrusion Detection Systems: A Focus on CNN and Hybrid Strategies." International Journal of Students' Research in Technology & Management 12, no. 1 (2024): 11–14. http://dx.doi.org/10.18510/ijsrtm.2024.1212.

Abstract:
Purpose of Study: The increasing focus on Artificial Intelligence (AI) worldwide has brought about potential benefits, but it also poses significant risks, especially in network security. Methodology: This study adopts a detailed comparative approach to evaluate the effectiveness of various Neural Network (NN) techniques in the context of Intrusion Detection Systems (IDS). The research focuses on four specific NN architectures: Artificial Neural Networks (ANN), Deep Neural Networks (DNN), Convolutional Neural Networks (CNN), and Recurrent Neural Networks (RNN). Each of these techniques is appl
45

Surenthiran, Krishnan, Magalingam Pritheega, and Ibrahim Roslina. "Hybrid deep learning model using recurrent neural network and gated recurrent unit for heart disease prediction." International Journal of Electrical and Computer Engineering (IJECE) 11, no. 6 (2021): 5467–76. https://doi.org/10.11591/ijece.v11i6.pp5467-5476.

Abstract:
This paper proposes a new hybrid deep learning model for heart disease prediction using recurrent neural network (RNN) with the combination of multiple gated recurrent units (GRU), long short-term memory (LSTM) and Adam optimizer. This proposed model resulted in an outstanding accuracy of 98.6876% which is the highest in the existing model of RNN. The model was developed in Python 3.7 by integrating RNN in multiple GRU that operates in Keras and Tensorflow as the backend for deep learning process, supported by various Python libraries. The recent existing models using RNN have reached an accur
46

Shukla, Mukesh Kumar, Harshit Srivastava, Naincy Gupta, and Reema Yadav. "A Detailed Review on Artificial Intelligence in Pharmacy." American Journal of PharmTech Research 13, no. 03 (2023): 26–38. https://doi.org/10.5281/zenodo.8046921.

Abstract:
Artificial intelligence (AI) focuses on creating intelligent models that help us envision knowledge, solve problems, and make decisions. AI is active these days; it plays an important role in various areas of pharmaceutical science, such as drug development and the formulation of drug administration. For drug discovery and development research, such as poly-pharmacology, hospital pharmacy, and delivery formulations, various artificial neural networks (ANNs) such as deep neural networks (DNN) or recurrent neural networks (RNN) are used. Artificia
47

Aslam, Naeem, Ahsan Nadeem, Muhammad Kamran Abid, and Muhammad Fuzail. "Text-Based Sentiment Analysis Using CNN-GRU Deep Learning Model." Journal of Information Communication Technologies and Robotic Applications 14, no. 1 (2023): 16–28. http://dx.doi.org/10.51239/jictra.v14i1.318.

Abstract:
Sentiment analysis identifies both positive and negative viewpoints from sources like social media, surveys, and reviews by automating text analysis with artificial intelligence (AI). Using data to inform decisions is made easier by this. Deep Learning (DL) has gained a lot of interest in recent years from academia and industry because of its outstanding performance. Convolutional neural networks (CNN) and recurrent neural networks (RNN) are the two deep learning designs that are most frequently utilized. Because they can examine enormous amounts of data, neural networks have the potential to
48

Neerugatti, Varipally Vishwanath, K. Manjunathachari, and Prasad K. Satya. "Multi-lingual character recognition and extraction using recurrent neural networks." i-manager’s Journal on Image Processing 10, no. 4 (2023): 1. http://dx.doi.org/10.26634/jip.10.4.20293.

Abstract:
In recent years, segmentation and recognition of multilingual languages have attracted the attention of many researchers. Multilingual Optical Character Recognition (OCR) technology uses tools like PyTesseract, OpenCV and Recurrent Neural Networks (RNN) to transform text in English, Telugu, Hindi, Tamil and Kannada. Converting text to digital format transforms communication and supports cultural understanding. The system supports multiple languages and can handle different languages. PyTesseract and OpenCV are used for accurate behavior recognition, while RNN improves language understanding. T
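
A small sketch of the PyTesseract-plus-OpenCV step this abstract mentions, assuming Tesseract is installed with the eng, tel, hin, tam, and kan language packs (Tesseract's own LSTM line recognizer supplies the recurrent component); the file name and preprocessing choices are placeholders.

```python
import cv2                      # OpenCV for preprocessing
import pytesseract              # wrapper around the Tesseract OCR engine

# Assumes the listed language packs are installed and 'document.png' exists.
img = cv2.imread("document.png")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# One call recognizes text across all five scripts at once.
text = pytesseract.image_to_string(binary, lang="eng+tel+hin+tam+kan")
print(text)
```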
49

Rahani, Faisal Fajri, and Miftahurrahma Rosyida. "Quadrotor height control system using LQR and recurrent artificial neural networks." Journal of Soft Computing Exploration 5, no. 2 (2024): 183–91. http://dx.doi.org/10.52465/joscex.v5i2.379.

Abstract:
The quadrotor is a type of unmanned flying vehicle known as an Unmanned Aerial Vehicle (UAV). In recent years, quadrotors have attracted much attention from researchers around the world due to their excellent maneuverability. A good control system is needed for ease of use of this quadrotor. One control system that is often used is the Linear Quadratic Regulator (LQR). This control system faces challenges from dynamic system disturbances in quadrotor control. Researchers proposed a recurrent artificial neural network (RNN) system to address these challenges. R
50

Aarthy, G., Shenoy K. Vibha, H. Thejashree, and S. Nithin. "Image based Spam Detection using Recurrent Neural Networks (RNN)." Recent Innovations in Wireless Network Security 5, no. 2 (2023): 20–31. https://doi.org/10.5281/zenodo.8141409.

Abstract:
Image-spam initially arose as a way of bypassing text-based spam filters. It is widely used to advertise products, mislead individuals into providing personal information, or transmit hazardous viruses. Image spam is harder to detect than text-based spam. Image-based encryption methods can be used to create image spam that is even more difficult to detect than what is often seen in reality. Image spam has evolved over time and may now overcome various kinds of classic anti-spam methods. Spammers can utilise pictures that just include text, sliced images, and randomly created images. Text-o