Journal articles on the topic 'Backpropagation Neural Networks'

Consult the top 50 journal articles for your research on the topic 'Backpropagation Neural Networks.'

1

Wythoff, Barry J. "Backpropagation neural networks." Chemometrics and Intelligent Laboratory Systems 18, no. 2 (1993): 115–55. http://dx.doi.org/10.1016/0169-7439(93)80052-j.

2

Harrington, Peter de B. "Temperature-Constrained Backpropagation Neural Networks." Analytical Chemistry 66, no. 6 (1994): 802–7. http://dx.doi.org/10.1021/ac00078a007.

3

Rosenbaum, Robert. "On the relationship between predictive coding and backpropagation." PLOS ONE 17, no. 3 (2022): e0266102. http://dx.doi.org/10.1371/journal.pone.0266102.

Abstract:
Artificial neural networks are often interpreted as abstract models of biological neuronal networks, but they are typically trained using the biologically unrealistic backpropagation algorithm and its variants. Predictive coding has been proposed as a potentially more biologically realistic alternative to backpropagation for training neural networks. This manuscript reviews and extends recent work on the mathematical relationship between predictive coding and backpropagation for training feedforward artificial neural networks on supervised learning tasks. Implications of these results for the…
4

Keller, James M., and Hossein Tahani. "Backpropagation neural networks for fuzzy logic." Information Sciences 62, no. 3 (1992): 205–21. http://dx.doi.org/10.1016/0020-0255(92)90016-2.

5

Cho, Jong Man. "Chromosome classification using backpropagation neural networks." IEEE Engineering in Medicine and Biology Magazine 19, no. 1 (2000): 28–33. http://dx.doi.org/10.1109/51.816241.

6

Li, Mingfeng. "Comprehensive Review of Backpropagation Neural Networks." Academic Journal of Science and Technology 9, no. 1 (2024): 150–54. http://dx.doi.org/10.54097/51y16r47.

Abstract:
The Backpropagation Neural Network (BPNN) is a deep learning model inspired by the biological neural network. Introduced in the 1980s, the BPNN quickly became a focal point in neural network research due to its outstanding learning capability and adaptability. The network structure consists of input, hidden, and output layers, and it optimizes weights through the backpropagation algorithm, widely applied in image recognition, speech processing, natural language processing, and more. The mathematical model of neurons describes the relationship between input and output, and the training process…
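
As a concrete illustration of the input-hidden-output structure and backpropagation weight updates this abstract describes, here is a minimal BPNN sketch in Python with NumPy (the XOR task, layer sizes, and learning rate are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR, a classic test for hidden-layer networks (illustrative choice).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Input -> hidden -> output, the classic BPNN layout.
W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)
lr = 0.5

for epoch in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)              # hidden activations
    out = sigmoid(h @ W2 + b2)            # network output
    # Backward pass: chain rule from output error back to each weight.
    d_out = (out - y) * out * (1 - out)   # delta at the output layer (MSE loss)
    d_h = (d_out @ W2.T) * h * (1 - h)    # delta propagated to the hidden layer
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(out.round(3).ravel())  # should approach [0, 1, 1, 0]
```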
7

Saha, Sanjit Kumar. "Performance Evaluation of Neural Networks in Road Sign Recognition." International Journal of Advanced Engineering Research and Science 11, no. 2 (2024): 007–12. http://dx.doi.org/10.22161/ijaers.11.2.

Abstract:
This paper presents an in-depth study of road sign recognition techniques leveraging neural networks. Road sign recognition stands as a critical component of intelligent transportation systems, contributing to enhanced road safety and efficient traffic management. The paper focuses on exploring various neural network architectures, for example the backpropagation neural network and a hybrid neural network that combines two networks (a backpropagation neural network and bidirectional associative memory), training methodologies, dataset considerations, and performance evaluations for a…
8

Kusumoputro, Benyamin, and Li Na. "Discrimination of Two-Mixture Fragrances Odor Using Artificial Odor Recognition System with Ensemble Backpropagation Neural Networks." Applied Mechanics and Materials 303-306 (February 2013): 1514–18. http://dx.doi.org/10.4028/www.scientific.net/amm.303-306.1514.

Abstract:
The human sensory test is often used to obtain the sensory quantities of odors; however, fluctuations in results due to the experts' condition can cause discrepancies among panelists. We have developed an artificial odor recognition system using a quartz resonator sensor and backpropagation neural networks as the pattern recognition system in order to eliminate the disadvantages of the human panelist system. The backpropagation neural networks show a high recognition rate for single-component odors; however, the rate becomes very low when they are used to discriminate mixture fragrance odors. In this paper we h…
9

Sawitri, Made Nita Dwi, I Wayan Sumarjaya, and Ni Ketut Tari Tastrawati. "Peramalan Menggunakan Metode Backpropagation Neural Network" [Forecasting Using the Backpropagation Neural Network Method]. E-Jurnal Matematika 7, no. 3 (2018): 264. http://dx.doi.org/10.24843/mtk.2018.v07.i03.p213.

Abstract:
The purpose of the study is to forecast the price of rice in the city of Denpasar in 2017 using the backpropagation neural network method. A backpropagation neural network is an artificial neural network model that finds optimal weight values. Artificial neural networks are information-processing systems whose performance characteristics resemble those of human neural networks. This analysis uses time-series data of rice prices in the city of Denpasar from January 2001 until December 2016. The results of this research conclude that the lowest rice price is predicted in July 2017…
10

Sanubary, Iklas. "Brain Tumor Detection Using Backpropagation Neural Networks." Indonesian Journal of Physics and Nuclear Applications 3, no. 3 (2018): 83–88. http://dx.doi.org/10.24246/ijpna.v3i3.83-88.

Abstract:
A study of brain tumor detection has been carried out using backpropagation neural networks with Gray Level Co-Occurrence Matrix (GLCM) feature extraction. CT-scan images of the brain, consisting of 12 normal and 13 abnormal (tumor) images, are analyzed. The preprocessing stage begins with cropping the image to 256 x 256 pixels, then converting the colored images into grayscale and equalizing the histogram to improve image quality. GLCM is used to calculate statistical features determined by 5 parameters, i.e., contrast, correlation, energy and homogeneity, for…
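
As a rough sketch of the GLCM feature-extraction stage this abstract describes, using scikit-image (the random image stand-in, distances, and angles are illustrative assumptions; the paper's exact settings are not given here):

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Illustrative stand-in for a preprocessed 256 x 256 grayscale brain image.
rng = np.random.default_rng(1)
image = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)

# Co-occurrence matrix for one pixel offset and four directions (assumed settings).
glcm = graycomatrix(image, distances=[1],
                    angles=[0, np.pi/4, np.pi/2, 3*np.pi/4],
                    levels=256, symmetric=True, normed=True)

# The statistical texture features named in the abstract, averaged over directions.
features = {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "correlation", "energy", "homogeneity")}
print(features)  # feature vector that would feed the backpropagation classifier
```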
11

Hazrati, Ayoub, Shannon Kariuki, and Ricardo Silva. "Comparative Analysis of Backpropagation and Genetic Algorithms in Neural Network Training." International Journal of Health Technology and Innovation 3, no. 03 (2024): 18–25. https://doi.org/10.60142/ijhti.v3i03.04.

Abstract:
The exploration of artificial neural networks (ANNs) has seen significant advancements, yet the optimal approach for training these networks remains a topic of debate. This study investigates the efficacy and computational efficiency of two prominent optimization techniques, backpropagation and genetic algorithms (GAs), in training traditional neural networks. The research conducted three experiments: the first involved training a single-layer neural network using a simple mathematical function; the second utilized the diabetes dataset for regression analysis; and the third applied the iris da…
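
To make the backpropagation-versus-GA contrast concrete, a minimal sketch of training a network's weights with a genetic-algorithm-style loop, i.e., selection and mutation instead of gradients (population size, mutation scale, and the toy task are illustrative assumptions; crossover is omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: learn y = 2*x1 - x2 with a single linear neuron.
X = rng.normal(size=(64, 2))
y = 2 * X[:, 0] - X[:, 1]

def fitness(w):
    # Negative mean squared error: higher is better.
    return -np.mean((X @ w - y) ** 2)

pop = rng.normal(size=(50, 2))                     # population of weight vectors
for generation in range(200):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-10:]]        # selection: keep the best 10
    children = parents[rng.integers(0, 10, size=40)] + \
               rng.normal(scale=0.1, size=(40, 2)) # mutation of copied parents
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(w) for w in pop])]
print(best)  # should approach [2, -1], found without any gradient information
```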
12

Wright, Logan G., Tatsuhiro Onodera, Martin M. Stein, et al. "Deep physical neural networks trained with backpropagation." Nature 601, no. 7894 (2022): 549–55. http://dx.doi.org/10.1038/s41586-021-04223-6.

Abstract:
Deep-learning models have become pervasive tools in science and engineering. However, their energy requirements now increasingly limit their scalability. Deep-learning accelerators aim to perform deep learning energy-efficiently, usually targeting the inference phase and often by exploiting physical substrates beyond conventional electronics. Approaches so far have been unable to apply the backpropagation algorithm to train unconventional novel hardware in situ. The advantages of backpropagation have made it the de facto training method for large-scale neural networks, so this…
13

Marchesi, M. L., F. Piazza, and A. Uncini. "Backpropagation without multiplier for multilayer neural networks." IEE Proceedings - Circuits, Devices and Systems 143, no. 4 (1996): 229. http://dx.doi.org/10.1049/ip-cds:19960336.

14

Nasr, G. E., E. A. Badr, and C. Joun. "Backpropagation neural networks for modeling gasoline consumption." Energy Conversion and Management 44, no. 6 (2003): 893–905. http://dx.doi.org/10.1016/s0196-8904(02)00087-0.

15

Harrington, Peter de B. "Sigmoid transfer functions in backpropagation neural networks." Analytical Chemistry 65, no. 15 (1993): 2167–68. http://dx.doi.org/10.1021/ac00063a042.

16

Wong, F. S. "Time series forecasting using backpropagation neural networks." Neurocomputing 2, no. 4 (1991): 147–59. http://dx.doi.org/10.1016/0925-2312(91)90045-d.

17

Nyongesa, H. O. "Fuzzy assisted learning in backpropagation neural networks." Neural Computing & Applications 6, no. 4 (1997): 238–44. http://dx.doi.org/10.1007/bf01501510.

18

Walczak, B. "Neural networks with robust backpropagation learning algorithm." Analytica Chimica Acta 322, no. 1-2 (1996): 21–29. http://dx.doi.org/10.1016/0003-2670(95)00552-8.

19

Mills, H., D. R. Burton, and M. J. Lalor. "Applying backpropagation neural networks to fringe analysis." Optics and Lasers in Engineering 23, no. 5 (1995): 331–41. http://dx.doi.org/10.1016/0143-8166(95)00038-p.

20

Bida, Falah A., and Hayder A. Naser. "Diagnostic of Osteoporosis Using Backpropagation Neural Networks." Journal of Techniques 7, no. 2 (2025): 10–20. https://doi.org/10.51173/jt.v7i2.2597.

Abstract:
In this study, an artificial neural network (ANN) using backpropagation was utilized to categorize bone images into either healthy or osteoporotic categories based on various statistical operations. An input matrix was constructed containing the six statistical features of 125 samples, representing X-ray images of knee joints, with 25 healthy and 100 osteoporotic samples. Of these, 70% were used for training, 15% for validation, and 15% for network testing. The classification efficiency of the neural network for the 125 samples was 97%. The research included analysis of arithmetic mean, standa…
21

Ozbay, Serkan. "Modified Backpropagation Algorithm with Multiplicative Calculus in Neural Networks." Elektronika ir Elektrotechnika 29, no. 3 (2023): 55–61. http://dx.doi.org/10.5755/j02.eie.34105.

Abstract:
Backpropagation is one of the most widely used algorithms for training feedforward deep neural networks. The algorithm requires a differentiable activation function, and it computes the gradient proceeding backwards through the feedforward deep neural network from the last layer through to the first. To calculate the gradient at a specific layer, the gradients of all layers are combined via the chain rule of calculus. One of the biggest disadvantages of backpropagation is that it requires a large amount of training time. To overcome this issue, this paper prop…
22

Budiman, I., A. Mubarak, S. Kapita, S. Do Abdullah, and M. Salmin. "Implementation of Backpropagation Artificial Network Methods for Early Children’s Intelligence Prediction." E3S Web of Conferences 328 (2021): 04033. http://dx.doi.org/10.1051/e3sconf/202132804033.

Abstract:
Intelligence is the ability to process certain types of information derived from human biological and psychological factors. This study aims to implement a backpropagation artificial neural network for the prediction of early-childhood intelligence and to calculate the system's accuracy on children's intelligence using the backpropagation artificial neural network method. The backpropagation neural network method is one of the best methods for dealing with the problem of recognizing complex patterns. Backpropagation neural networks have an advantage because learning is done repeatedly, so that it ca…
23

Aspman, Johannes, Georgios Korpas, and Jakub Marecek. "Taming Binarized Neural Networks and Mixed-Integer Programs." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 10 (2024): 10935–43. http://dx.doi.org/10.1609/aaai.v38i10.28968.

Abstract:
There has been a great deal of recent interest in binarized neural networks, especially because of their explainability. At the same time, automatic differentiation algorithms such as backpropagation fail for binarized neural networks, which limits their applicability. We show that binarized neural networks admit a tame representation by reformulating the problem of training binarized neural networks as a subadditive dual of a mixed-integer program, which we show to have nice properties. This makes it possible to use the framework of Bolte et al. for implicit differentiation, which offers the…
24

Cao, WanLing. "Evaluating the Vocal Music Teaching Using Backpropagation Neural Network." Mobile Information Systems 2022 (June 24, 2022): 1–7. http://dx.doi.org/10.1155/2022/3843726.

Abstract:
Vocal music teaching evaluation of performers is affected by multiple factors, and evaluators are greatly influenced by subjective factors when scoring. The backpropagation (BP) neural network provides a novel technology that can theoretically simulate any nonlinear continuous function within a certain accuracy range. The backpropagation neural network is an adaptive feedforward learning network widely used in artificial intelligence (AI), and it can simulate the nonlinear mapping composed of various factors. The novelty of the n…
25

Astion, M. L., M. H. Wener, R. G. Thomas, G. G. Hunder, and D. A. Bloch. "Overtraining in neural networks that interpret clinical data." Clinical Chemistry 39, no. 9 (1993): 1998–2004. http://dx.doi.org/10.1093/clinchem/39.9.1998.

Abstract:
Backpropagation neural networks are a computer-based pattern-recognition method that has been applied to the interpretation of clinical data. Unlike rule-based pattern recognition, backpropagation networks learn by being repetitively trained with examples of the patterns to be differentiated. We describe and analyze the phenomenon of overtraining in backpropagation networks. Overtraining refers to the reduction in generalization ability that can occur as networks are trained. The clinical application we used was the differentiation of giant cell arteritis (GCA) from other forms of vas…
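
The overtraining the authors describe is what modern practice counters with early stopping: track error on a held-out validation set and stop when it stops improving. A minimal sketch, assuming a model object that exposes weight get/set methods (the interface and patience value are illustrative):

```python
import numpy as np

def train_with_early_stopping(model, train_step, val_error,
                              max_epochs=1000, patience=10):
    """Stop training when validation error has not improved for `patience` epochs.

    `train_step()` performs one epoch of backpropagation updates;
    `val_error()` returns the current error on a held-out validation set.
    Both are callables supplied by the caller (illustrative interface).
    """
    best_err = np.inf
    best_weights = model.get_weights()   # assumes get_weights/set_weights exist
    epochs_since_best = 0
    for epoch in range(max_epochs):
        train_step()
        err = val_error()
        if err < best_err:
            best_err, best_weights = err, model.get_weights()
            epochs_since_best = 0
        else:
            epochs_since_best += 1
            if epochs_since_best >= patience:
                break                    # generalization stopped improving
    model.set_weights(best_weights)      # roll back to the best-generalizing network
    return best_err
```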
26

Mangal, Manish, and Manu Pratap Singh. "Analysis of Multidimensional XOR Classification Problem with Evolutionary Feedforward Neural Networks." International Journal on Artificial Intelligence Tools 16, no. 01 (2007): 111–20. http://dx.doi.org/10.1142/s0218213007003229.

Abstract:
This paper describes the application of two evolutionary algorithms to feedforward neural networks used in classification problems. Besides a simple feedforward backpropagation algorithm, the paper considers a genetic algorithm and a random search algorithm. The objective is to analyze the performance of GAs over simple feedforward backpropagation in terms of accuracy or speed on this problem. The experiments considered a feedforward neural network trained with the genetic algorithm/random search algorithm and 39 types of network structures and artificial data sets. In most cases, the e…
27

Ruslau, Maria Fransina Veronica, Rian Ade Pratama, Martha Betaubun, and Dessy Rizki Suryani. "Multiclass classification using backpropagation neural network." IOP Conference Series: Earth and Environmental Science 1454, no. 1 (2025): 012037. https://doi.org/10.1088/1755-1315/1454/1/012037.

Abstract:
Regression models are commonly employed when examining response characteristics in social studies. If the response variables or targets are both categorical and ordinal, the ordinal logistic regression model should be employed, but it may not always produce adequate results. Neural networks are an alternative classification method that can deal with a large number of qualitative characteristics associated with behavioural responses. Backpropagation was used in this work to predict the poverty status of impoverished households in Surabaya, rather than ordinal logistic regression. In this…
28

Korkobi, Talel, Mohamed Djemel, and Mohamed Chtourou. "Stability Analysis of Neural Networks-Based System Identification." Modelling and Simulation in Engineering 2008 (2008): 1–8. http://dx.doi.org/10.1155/2008/343940.

Abstract:
This paper treats some problems related to nonlinear system identification. A stability analysis of a neural network model for identifying nonlinear dynamic systems is presented. A constrained adaptive stable backpropagation updating law is presented and used in the proposed identification approach. The proposed backpropagation training algorithm is modified to obtain an adaptive learning rate guaranteeing convergence stability. The proposed learning rule is the backpropagation algorithm under the condition that the learning rate belongs to a specified range defining the stability domain. Satisfyin…
29

Oliver Muncharaz, J. "Hybrid fuzzy neural network versus backpropagation neural network: An application to predict the Ibex-35 index stock." Finance, Markets and Valuation 6, no. 1 (2020): 85–98. http://dx.doi.org/10.46503/alep9985.

Abstract:
The use of neural networks has spread to all areas of knowledge due to the good results obtained in solving the various problems posed. The prediction of prices in general, and stock market prices in particular, is one of the main objectives of the use of neural networks in finance. This paper analyzes the efficiency of a hybrid fuzzy neural network against a backpropagation-type neural network in predicting the Spanish stock exchange index (IBEX-35). The paper is divided into two parts. In the first part, the main characteristics…
30

Pineda, Fernando J. "Recurrent Backpropagation and the Dynamical Approach to Adaptive Neural Computation." Neural Computation 1, no. 2 (1989): 161–72. http://dx.doi.org/10.1162/neco.1989.1.2.161.

Abstract:
Error backpropagation in feedforward neural network models is a popular learning algorithm that has its roots in nonlinear estimation and optimization. It is being used routinely to calculate error gradients in nonlinear systems with hundreds of thousands of parameters. However, the classical architecture for backpropagation has severe restrictions. The extension of backpropagation to networks with recurrent connections will be reviewed. It is now possible to efficiently compute the error gradients for networks that have temporal dynamics, which opens applications to a host of problems in syst…
31

Pai, Sunil, Zhanghao Sun, Tyler W. Hughes, et al. "Experimentally realized in situ backpropagation for deep learning in photonic neural networks." Science 380, no. 6643 (2023): 398–404. http://dx.doi.org/10.1126/science.ade8450.

Abstract:
Integrated photonic neural networks provide a promising platform for energy-efficient, high-throughput machine learning with extensive scientific and commercial applications. Photonic neural networks efficiently transform optically encoded inputs using Mach-Zehnder interferometer mesh networks interleaved with nonlinearities. We experimentally trained a three-layer, four-port silicon photonic neural network with programmable phase shifters and optical power monitoring to solve classification tasks using “in situ backpropagation,” a photonic analog of the most popular method to train convention…
32

Tang, Zheng, Xu Gang Wang, Hiroki Tamura, and Masahiro Ishii. "An Algorithm of Supervised Learning for Multilayer Neural Networks." Neural Computation 15, no. 5 (2003): 1125–42. http://dx.doi.org/10.1162/089976603765202686.

Abstract:
A method of supervised learning for multilayer artificial neural networks to escape local minima is proposed. The learning model has two phases: a backpropagation phase and a gradient ascent phase. The backpropagation phase performs steepest descent on a surface in weight space whose height at any point in weight space is equal to an error measure, and it finds a set of weights minimizing this error measure. When the backpropagation gets stuck in local minima, the gradient ascent phase attempts to fill up the valley by modifying gain parameters in a gradient ascent direction of the error measu…
33

De Wolf, E. D., and L. J. Francl. "Neural Network Classification of Tan Spot and Stagonospora Blotch Infection Periods in a Wheat Field Environment." Phytopathology® 90, no. 2 (2000): 108–13. http://dx.doi.org/10.1094/phyto.2000.90.2.108.

Abstract:
Tan spot and Stagonospora blotch of hard red spring wheat served as a model system for evaluating disease forecasts by artificial neural networks. Pathogen infection periods on susceptible wheat plants were measured in the field from 1993 to 1998, and incidence data were merged with 24-h summaries of accumulated growing degree days, temperature, relative humidity, precipitation, and leaf wetness duration. The resulting data set of 202 discrete periods was randomly assigned to 10 model-development or -validation (n = 50) data sets. Backpropagation neural networks, general regression neural netwo…
34

Johansson, E. M., F. U. Dowla, and D. M. Goodman. "Backpropagation Learning for Multilayer Feed-Forward Neural Networks Using the Conjugate Gradient Method." International Journal of Neural Systems 02, no. 04 (1991): 291–301. http://dx.doi.org/10.1142/s0129065791000261.

Abstract:
In many applications, the number of interconnects or weights in a neural network is so large that the learning time for the conventional backpropagation algorithm can become excessively long. Numerical optimization theory offers a rich and robust set of techniques which can be applied to neural networks to improve learning rates. In particular, the conjugate gradient method is easily adapted to the backpropagation learning problem. This paper describes the conjugate gradient method, its application to the backpropagation learning problem and presents results of numerical tests which compare co…
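
The core idea is to flatten the weights and hand the backpropagated gradient to a conjugate gradient optimizer instead of taking plain gradient-descent steps. A minimal sketch using SciPy's CG routine (network size and data are illustrative assumptions, not the paper's setup):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                          # toy inputs
y = (X.sum(axis=1, keepdims=True) > 0).astype(float)   # toy targets

def unpack(w):
    # Flattened weight vector -> 3-5-1 network matrices.
    return w[:15].reshape(3, 5), w[15:].reshape(5, 1)

def loss_and_grad(w):
    W1, W2 = unpack(w)
    h = np.tanh(X @ W1)
    err = h @ W2 - y
    loss = 0.5 * np.mean(err ** 2)
    # Backpropagation supplies the gradient that CG consumes.
    d_out = err / len(X)
    gW2 = h.T @ d_out
    d_h = (d_out @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ d_h
    return loss, np.concatenate([gW1.ravel(), gW2.ravel()])

w0 = rng.normal(scale=0.1, size=20)
res = minimize(loss_and_grad, w0, jac=True, method="CG")
print(res.fun, res.nit)  # final loss and number of CG iterations
```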
35

De Groff, Dolores, and Perambur Neelakanta. "Faster Convergent Artificial Neural Networks." INTERNATIONAL JOURNAL OF COMPUTERS & TECHNOLOGY 17, no. 1 (2018): 7126–32. http://dx.doi.org/10.24297/ijct.v17i1.7106.

Abstract:
Proposed in this paper is a novel fast-convergence algorithm applied to artificial neural networks (ANNs) with a learning rate based on the eigenvalues of the associated Hessian matrix of the input data. That is, the learning rate applied to the backpropagation algorithm changes dynamically with the input data used for training. The best choice of learning rate to converge to an accurate value quickly is derived. This newly proposed fast-convergence algorithm is applied to a traditional multilayer ANN architecture with feed-forward and backpropagation techniques. The proposed strategy is applied to vario…
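
For intuition about eigenvalue-based learning rates: for a linear least-squares loss the Hessian is constant, and gradient descent converges only when the rate stays below 2 divided by the largest Hessian eigenvalue. A small check of that classical bound (the data and the 1.9 safety factor are illustrative; the paper's actual rule is more elaborate):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ np.array([1.0, -2.0, 0.5, 3.0, -1.0])

# For loss L(w) = ||Xw - y||^2 / (2n), the Hessian is X.T @ X / n (constant).
H = X.T @ X / len(X)
lam_max = np.linalg.eigvalsh(H).max()
lr = 1.9 / lam_max          # just under the classical stability bound 2/lam_max

w = np.zeros(5)
for step in range(500):
    grad = X.T @ (X @ w - y) / len(X)
    w -= lr * grad
print(w.round(3))           # converges; a rate above 2/lam_max would diverge
```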
36

Bao, Chunhui, Yifei Pu, and Yi Zhang. "Fractional-Order Deep Backpropagation Neural Network." Computational Intelligence and Neuroscience 2018 (July 3, 2018): 1–10. http://dx.doi.org/10.1155/2018/7361628.

Abstract:
In recent years, the research of artificial neural networks based on fractional calculus has attracted much attention. In this paper, we proposed a fractional-order deep backpropagation (BP) neural network model with L2 regularization. The proposed network was optimized by the fractional gradient descent method with Caputo derivative. We also illustrated the necessary conditions for the convergence of the proposed network. The influence of L2 regularization on the convergence was analyzed with the fractional-order variational method. The experiments have been performed on the MNIST dataset to…
37

Khalid Awang, Mohd, Mohammad Ridwan Ismail, Mokhairi Makhtar, M. Nordin A Rahman, and Abd Rasid Mamat. "Performance Comparison of Neural Network Training Algorithms for Modeling Customer Churn Prediction." International Journal of Engineering & Technology 7, no. 2.15 (2018): 35. http://dx.doi.org/10.14419/ijet.v7i2.15.11196.

Abstract:
Predicting customer churn has become the priority of every telecommunication service provider as the market becomes more saturated and competitive. This paper presents a comparison of neural network learning algorithms for customer churn prediction. The data set used to train and test the neural network algorithms was provided by one of the leading telecommunication companies in Malaysia. The Multilayer Perceptron (MLP) networks are trained using nine (9) types of learning algorithms: Levenberg-Marquardt backpropagation (trainlm), BFGS quasi-Newton backpropagation (trainbfg), Conju…
38

Wan, Eric A., and Françoise Beaufays. "Diagrammatic Derivation of Gradient Algorithms for Neural Networks." Neural Computation 8, no. 1 (1996): 182–201. http://dx.doi.org/10.1162/neco.1996.8.1.182.

Abstract:
Deriving gradient algorithms for time-dependent neural network structures typically requires numerous chain rule expansions, diligent bookkeeping, and careful manipulation of terms. In this paper, we show how to derive such algorithms via a set of simple block diagram manipulation rules. The approach provides a common framework to derive popular algorithms including backpropagation and backpropagation-through-time without a single chain rule expansion. Additional examples are provided for a variety of complicated architectures to illustrate both the generality and the simplicity of the approac…
39

Visutsak, Porawat. "Activity Classification Using Backpropagation Neural Networks for the Daily Lives of the Elderly." International Journal of Machine Learning and Computing 11, no. 3 (2021): 188–93. http://dx.doi.org/10.18178/ijmlc.2021.11.3.1034.

Abstract:
Activity analysis or activity recognition systems for the elderly have recently become part of smart home system design. Such assistive systems help senior people live alone in a house safely and improve their quality of life. Therefore, learning to recognize which activities are safe is necessary for classifying the activities of the elderly. This information gives researchers in assistive technology insights into the basic daily lives of the elderly. Moreover, it also helps caregivers monitor the activities of senior people while they live…
40

Thimm, Georg, Perry Moerland, and Emile Fiesler. "The Interchangeability of Learning Rate and Gain in Backpropagation Neural Networks." Neural Computation 8, no. 2 (1996): 451–60. http://dx.doi.org/10.1162/neco.1996.8.2.451.

Abstract:
The backpropagation algorithm is widely used for training multilayer neural networks. In this publication the gain of its activation function(s) is investigated. Specifically, it is proven that changing the gain of the activation function is equivalent to changing the learning rate and the weights. This simplifies the backpropagation learning rule by eliminating one of its parameters. The theorem can be extended to hold for some well-known variations on the backpropagation algorithm, such as using a momentum term, flat spot elimination, or adaptive gain. Furthermore, it is successfully applied…
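
The stated equivalence can be checked numerically for a single sigmoid neuron: training with gain c, weights w, and learning rate eta tracks training with gain 1, weights c*w, and rate c^2*eta exactly. A small verification sketch (the neuron and numbers are illustrative; the paper proves the general multilayer case):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x, target = np.array([0.8, -0.3]), 1.0
c, eta = 2.0, 0.1                      # gain and learning rate
w_a = np.array([0.5, -0.4])            # network A: gain c, weights w, rate eta
w_b = c * w_a                          # network B: gain 1, weights c*w, rate c^2*eta

for step in range(5):
    # Network A: y = sigmoid(c * w.x), squared-error loss.
    y_a = sigmoid(c * (w_a @ x))
    w_a -= eta * (y_a - target) * y_a * (1 - y_a) * c * x
    # Network B: y = sigmoid(w'.x) with w' = c*w and rate c^2*eta.
    y_b = sigmoid(w_b @ x)
    w_b -= (c**2 * eta) * (y_b - target) * y_b * (1 - y_b) * x
    # Outputs match at every step, and w_b stays equal to c * w_a.
    print(step, y_a, y_b, np.allclose(c * w_a, w_b))
```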
41

Zorins, A. "Financial Forecasting Using Neural Networks." Environment. Technology. Resources. Proceedings of the International Scientific and Practical Conference 1 (June 26, 2006): 392. http://dx.doi.org/10.17770/etr2003vol1.2027.

Abstract:
This paper presents an application of neural networks to financial time-series forecasting. No additional indicators, but only the information contained in the sales time series, was used to model and forecast the stock exchange index. The forecasting is carried out by two different neural network learning algorithms: error backpropagation and Kohonen self-organising maps. The results are presented and their comparative analysis is performed in this article.
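
The standard way to feed a time series to a backpropagation network, as in studies like this one, is to slide a window over the series so that each window of past values predicts the next value. A minimal sketch of that supervised reformulation (window length and the synthetic series are illustrative assumptions):

```python
import numpy as np

def make_windows(series, window=5):
    """Turn a 1-D series into (inputs, targets) pairs for supervised training:
    X[i] = series[i : i+window], y[i] = series[i+window]."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X, y

# Illustrative index-like series: trend plus seasonality plus noise.
t = np.arange(300)
series = (100 + 0.1 * t + 5 * np.sin(t / 10)
          + np.random.default_rng(0).normal(scale=0.5, size=300))

X, y = make_windows(series, window=5)
print(X.shape, y.shape)   # (295, 5) and (295,): ready for a backpropagation network
```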
42

Beigy, Hamid, and Mohammad R. Meybodi. "Backpropagation Algorithm Adaptation Parameters Using Learning Automata." International Journal of Neural Systems 11, no. 03 (2001): 219–28. http://dx.doi.org/10.1142/s0129065701000655.

Abstract:
Despite the many successful applications of backpropagation for training multilayer neural networks, it has many drawbacks. For complex problems it may require a long time to train the networks, and it may not train at all. Long training time can be the result of non-optimal parameters, and it is not easy to choose appropriate values of the parameters for a particular problem. In this paper, by interconnecting fixed-structure learning automata (FSLA) with the feedforward neural networks, we apply a learning automata (LA) scheme for adjusting these parameters based on the observation of rando…
43

Bohari, Abdul Rahman. "Meningkatkan Kinerja Backpropagation Neural Network Menggunakan Algoritma Adaptif" [Improving Backpropagation Neural Network Performance Using an Adaptive Algorithm]. IMTechno: Journal of Industrial Management and Technology 3, no. 1 (2022): 58–63. http://dx.doi.org/10.31294/imtechno.v3i1.1043.

Abstract:
The application of artificial neural networks in various fields of human life is getting wider, especially in the industrial sector. One of the artificial neural network structures that is quite often used is the feedforward neural network with its well-known learning algorithm, backpropagation. However, as reported by several researchers, backpropagation has several weaknesses: it takes a long time to converge in the training process, it is quite sensitive to initial weight conditions, and it is relatively often trapped in local minima, which can thwart the training process. In th…
44

Wanto, Anjar, Agus Perdana Windarto, Dedy Hartama, and Iin Parlina. "Use of Binary Sigmoid Function And Linear Identity In Artificial Neural Networks For Forecasting Population Density." IJISTECH (International Journal Of Information System & Technology) 1, no. 1 (2017): 43. http://dx.doi.org/10.30645/ijistech.v1i1.6.

Abstract:
Artificial neural networks (ANNs) are often used to solve forecasting cases, as in this study. The artificial neural network used here employs the backpropagation algorithm. The study focused on forecasting population density by district in Simalungun Regency, Indonesia, for 2010-2015. The data source is the Central Bureau of Statistics of Simalungun Regency. Future population density is forecast using the backpropagation algorithm with a binary sigmoid function (logsig) and a linear identity function (purelin), with 5 network architecture models: 3-5-1, 3-…
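
A sketch of the 3-5-1 architecture with a logsig (binary sigmoid) hidden layer and a purelin (linear identity) output that the abstract names (the data and hyperparameters are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def logsig(z):                      # binary sigmoid activation
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: 3 inputs (e.g., densities from 3 previous years), 1 output.
X = rng.random((30, 3))
y = X.mean(axis=1, keepdims=True)   # stand-in target

# 3-5-1 network: logsig hidden layer, purelin (identity) output.
W1 = rng.normal(scale=0.5, size=(3, 5)); b1 = np.zeros(5)
W2 = rng.normal(scale=0.5, size=(5, 1)); b2 = np.zeros(1)
lr = 0.1

for epoch in range(2000):
    h = logsig(X @ W1 + b1)
    out = h @ W2 + b2               # purelin: no output nonlinearity
    d_out = (out - y) / len(X)      # linear output, so the delta is just the error
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(np.mean((out - y) ** 2))      # training MSE shrinks over the epochs
```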
45

Sekhar, Ch, and P. Sai Meghana. "A Study on Backpropagation in Artificial Neural Networks." Asia-Pacific Journal of Neural Networks and Its Applications 4, no. 1 (2020): 21–28. http://dx.doi.org/10.21742/ajnnia.2020.4.1.03.

46

Skorin-Kapov, Jadranka, and K. Wendy Tang. "Training Artificial Neural Networks: Backpropagation via Nonlinear Optimization." Journal of Computing and Information Technology 9, no. 1 (2001): 1. http://dx.doi.org/10.2498/cit.2001.01.01.

47

Jin, Liang, and M. M. Gupta. "Stable dynamic backpropagation learning in recurrent neural networks." IEEE Transactions on Neural Networks 10, no. 6 (1999): 1321–34. http://dx.doi.org/10.1109/72.809078.

48

Mandal, S., Subba Rao, and D. H. Raju. "Ocean wave parameters estimation using backpropagation neural networks." Marine Structures 18, no. 3 (2005): 301–18. http://dx.doi.org/10.1016/j.marstruc.2005.09.002.

49

Jenq, John Jingfu, and Wingning Li. "Feedforward backpropagation artificial neural networks on reconfigurable meshes." Future Generation Computer Systems 14, no. 5-6 (1998): 313–19. http://dx.doi.org/10.1016/s0167-739x(98)00036-3.

50

Tzafestas, S. G., P. J. Dalianis, and G. Anthopoulos. "On the overtraining phenomenon of backpropagation neural networks." Mathematics and Computers in Simulation 40, no. 5-6 (1996): 507–21. http://dx.doi.org/10.1016/0378-4754(95)00003-8.
