
Journal articles on the topic 'Back propagation function network'


Consult the top 50 journal articles for your research on the topic 'Back propagation function network.'


1

Asaad, Renas Rajab, and Rasan I. Ali. "Back Propagation Neural Network(BPNN) and Sigmoid Activation Function in Multi-Layer Networks." Academic Journal of Nawroz University 8, no. 4 (2019): 216. http://dx.doi.org/10.25007/ajnu.v8n4a464.

Abstract:
Back propagation neural networks are known for computing problems that cannot easily be computed (huge dataset analysis or training) in artificial neural networks. The main idea of this paper is to implement the XOR logic gate with ANNs, using a back propagation neural network for the back propagation of errors and a sigmoid activation function. This neural network maps a non-linear threshold gate. The non-linearity is used to classify binary inputs (x1, x2), passing them through the hidden layer to compute coefficient errors and gradient errors (Cerrors, Gerrors); after computing errors by (ei = Output_desir
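For readers who want to see the technique described in this entry in concrete form, the following is a minimal sketch (not the authors' code) of back-propagation with sigmoid activations on the XOR problem; the hidden-layer size, learning rate, and epoch count are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): a 2-4-1 network trained on XOR with
# sigmoid activations and plain back-propagation of the squared error.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # binary inputs (x1, x2)
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))           # hidden layer (size assumed)
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))           # output layer
lr = 0.5

for _ in range(20000):
    h = sigmoid(X @ W1 + b1)                     # forward pass
    out = sigmoid(h @ W2 + b2)
    err = y - out                                # error = desired output - actual output
    d_out = err * out * (1 - out)                # output error through sigmoid derivative
    d_h = (d_out @ W2.T) * h * (1 - h)           # back-propagated hidden-layer error
    W2 += lr * h.T @ d_out                       # gradient-descent weight updates
    b2 += lr * d_out.sum(axis=0, keepdims=True)
    W1 += lr * X.T @ d_h
    b1 += lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out, 3))  # should approach [[0], [1], [1], [0]]
```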
2

Garkani-Nejad, Zahra, and Behzad Ahmadi-Roudi. "Investigating the role of weight update functions in developing artificial neural network modeling of retention times of furan and phenol derivatives." Canadian Journal of Chemistry 91, no. 4 (2013): 255–62. http://dx.doi.org/10.1139/cjc-2012-0372.

Abstract:
A quantitative structure−retention relationship study has been carried out on the retention times of 63 furan and phenol derivatives using artificial neural networks (ANNs). First, a large number of descriptors were calculated using the HyperChem, Mopac, and Dragon software packages. Then, a suitable number of these descriptors were selected using a multiple linear regression technique. This paper focuses on investigating the role of weight update functions in developing ANNs. Therefore, selected descriptors were used as inputs for ANNs with six different weight update functions including the Levenberg−Ma
3

Mahmoud, Waleed Ameen, Ali Ibrahim Abbas, and Nuha Abdul Sahib Alwan. "FACE IDENTIFICATION USING BACK-PROPAGATION ADAPTIVE MULTIWAVENET." Journal of Engineering 18, no. 03 (2023): 392–402. http://dx.doi.org/10.31026/j.eng.2012.03.12.

Abstract:
Face identification is an important research topic in the field of computer vision and pattern recognition and has become a very active research area in recent decades. Recently, multiwavelet-based neural networks (multiwavenets) have been used for function approximation and recognition, but to the best of our knowledge they have not been used for face identification. This paper presents a novel approach for the identification of human faces using a Back-Propagation Adaptive Multiwavenet. The proposed multiwavenet has a structure similar to a multilayer perceptron (MLP) neural network with three layers, bu
4

Pan, Hao. "An Improved Back-Propagation Neural Network Algorithm." Applied Mechanics and Materials 556-562 (May 2014): 4586–90. http://dx.doi.org/10.4028/www.scientific.net/amm.556-562.4586.

Abstract:
Based on the idea of standard back-propagation (BP) learning algorithm, an improved BP learning algorithm is presented. Three parameters are incorporated into each processing unit to enhance the output function. The improved BP learning algorithm is developed for updating the three parameters as well as the connection weights. It not only improves the learning speed, but also reduces the occurrence of local minima. Finally, the algorithm is tested on the XOR problem to verify the validity of the improved BP.
5

Tang, Chuan Yin, Guang Yao Zhao, Yi Min Zhang, and Xiao Yu E. "Research on Active Suspension System Based on BP and RBF Network Algorithm." Advanced Materials Research 230-232 (May 2011): 149–53. http://dx.doi.org/10.4028/www.scientific.net/amr.230-232.149.

Abstract:
A six-degrees-of-freedom half-body vehicle suspension system is presented in the paper. The Back Propagation neural network algorithm and the Radial Basis Function network algorithm are adopted to control the suspension system. With the aid of Matlab/Simulink, the simulation model is obtained. A great deal of simulation work is done. Simulation results demonstrate that both the designed radial basis function neural network and the back propagation neural network work well for the proposed vehicle suspension model in the paper.
6

Singarimbun, Roy Nuary. "Adaptive Moment Estimation To Minimize Square Error In Backpropagation Algorithm." Data Science: Journal of Computing and Applied Informatics 4, no. 1 (2020): 27–46. http://dx.doi.org/10.32734/jocai.v4.i1-1160.

Abstract:
The back-propagation neural network has weaknesses such as slow gradient-descent training of the error function, excessively long training time, and a tendency to fall into local optima. The back-propagation algorithm is one of the artificial neural network training algorithms whose weaknesses include slow convergence, over-fitting, and getting stuck in local optima. Back-propagation is used to minimize errors in each iteration. This paper investigates and evaluates the performance of Adaptive Moment Estimation (ADAM) to minimize the squared error in back-propagation gradient desce
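This entry evaluates Adam for minimizing the squared error in back-propagation training. Below is a hedged sketch of the standard Adam update rule (with the commonly used default hyperparameters, which are assumptions here rather than values taken from the paper), applied to a one-parameter squared-error problem.

```python
# Sketch of the standard Adam update rule (default hyperparameters assumed, not
# taken from the cited paper), applied to a one-parameter squared-error problem.
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: moving moment estimates with bias correction."""
    m = beta1 * m + (1 - beta1) * grad          # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2     # second moment (uncentred variance)
    m_hat = m / (1 - beta1 ** t)                # bias-corrected estimates
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Minimize the squared error E = 0.5 * (w*x - target)^2 for a single weight w.
w, m, v = np.array([0.0]), np.zeros(1), np.zeros(1)
x, target = 2.0, 6.0
for t in range(1, 3001):
    grad = (w * x - target) * x                 # dE/dw
    w, m, v = adam_step(w, grad, m, v, t)
print(np.round(w, 3))                           # approaches 3.0 (since 3.0 * 2 = 6)
```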
7

Ding, Shuo, Xiao Heng Chang, and Qing Hui Wu. "A Study on Approximation Performances of General Regression Neural Network." Applied Mechanics and Materials 441 (December 2013): 713–16. http://dx.doi.org/10.4028/www.scientific.net/amm.441.713.

Abstract:
In order to study the approximation performance of general regression neural networks, the structure and algorithm of general regression neural networks are first introduced. Then general regression neural networks and back propagation neural networks improved by the Levenberg-Marquardt algorithm are established through programming in the MATLAB language. A certain nonlinear function is taken as an example to be approximated by the two kinds of neural networks. The simulation results indicate that compared with back propagation neural networks, general regression neural networks have better approxim
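As background to the comparison in this entry, here is a minimal sketch of the general regression neural network estimator in its standard Specht form: the prediction is a Gaussian-kernel-weighted average of the training targets. The target function and the spread value are illustrative assumptions, not the paper's settings.

```python
# Sketch of a general regression neural network (GRNN, standard Specht form);
# the target function and spread value are illustrative assumptions.
import numpy as np

def grnn_predict(x_train, y_train, x_query, sigma=0.05):
    """GRNN output: Gaussian-kernel-weighted average of the training targets."""
    d2 = (x_query[:, None] - x_train[None, :]) ** 2          # squared distances
    w = np.exp(-d2 / (2.0 * sigma ** 2))                      # pattern-layer activations
    return (w @ y_train) / w.sum(axis=1)                      # summation / division layers

# Approximate an assumed nonlinear function, y = sin(2*pi*x), from 40 samples.
x_train = np.linspace(0.0, 1.0, 40)
y_train = np.sin(2 * np.pi * x_train)
x_test = np.linspace(0.0, 1.0, 200)
y_hat = grnn_predict(x_train, y_train, x_test)
print(np.max(np.abs(y_hat - np.sin(2 * np.pi * x_test))))     # small approximation error
```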
8

Morris, A. S., and M. A. Mansor. "Manipulator Inverse Kinematics using an Adaptive Back-propagation Algorithm and Radial Basis Function with a Lookup Table." Robotica 16, no. 4 (1998): 433–44. http://dx.doi.org/10.1017/s0263574798001064.

Abstract:
This is an extension of previous work which used an artificial neural network with a back-propagation algorithm and a lookup table to find the inverse kinematics for a manipulator arm moving along pre-defined trajectories. The work now described shows that the performance of this technique can be improved if the back-propagation is made to be adaptive. Also, further improvement is obtained by using the whole workspace to train the neural network rather than just a pre-defined path. For the inverse kinematics of the whole workspace, a comparison has also been done between the adaptive back-prop
9

Heidari, Mohammad, and Hadi Homaei. "Estimation of Acceleration Amplitude of Vehicle by Back Propagation Neural Networks." Advances in Acoustics and Vibration 2013 (June 4, 2013): 1–7. http://dx.doi.org/10.1155/2013/614025.

Abstract:
This paper investigates the variation of vertical vibrations of vehicles using a neural network (NN). The NN is a back propagation NN, which is employed to predict the amplitude of acceleration for different road conditions such as concrete, waved stone block paved, and country roads. In this paper, four supervised functions, namely, newff, newcf, newelm, and newfftd, have been used for modeling the vehicle vibrations. The networks have four inputs, velocity, damping ratio, natural frequency of the vehicle shock absorber, and road condition (R.C), as the independent variables and one out
10

Samantaray, Sandeep, and Abinash Sahoo. "Prediction of runoff using BPNN, FFBPNN, CFBPNN algorithm in arid watershed: A case study." International Journal of Knowledge-based and Intelligent Engineering Systems 24, no. 3 (2020): 243–51. http://dx.doi.org/10.3233/kes-200046.

Abstract:
Here, an endeavor has been made to predict the correspondence between rainfall and runoff, and modeling is demonstrated using Feed Forward Back Propagation Neural Network (FFBPNN), Back Propagation Neural Network (BPNN), and Cascade Forward Back Propagation Neural Network (CFBPNN) for predicting runoff. Various indicators like mean square error (MSE), Root Mean Square Error (RMSE), and coefficient of determination (R2) for the training and testing phases are used to appraise the performance of the models. BPNN performs best among the three networks, having model architecture 4-5-1 utilizing a Log-sig transfe
11

Jie, Zhou, and Ma Qiurui. "Establishing a Genetic Algorithm-Back Propagation model to predict the pressure of girdles and to determine the model function." Textile Research Journal 90, no. 21-22 (2020): 2564–78. http://dx.doi.org/10.1177/0040517520922947.

Abstract:
A Genetic Algorithm-Back Propagation (GA-BP) neural network method has been proposed to predict the clothing pressure of girdles in different postures. Firstly, a Back Propagation (BP) neural network model was used to predict the clothing pressure based on seven parameters, and three optimal functions of the model were derived. However, the prediction error 0.85411 of the network was more than the forecast requirement of 0.5 and the optimal initial weights and thresholds for the network could not be calculated. Therefore, a GA model and the BP neural network model were combined into a new GA-B
12

Sivak, Maria, and Vladimir Timofeev. "Building robust neural networks using different loss functions." Analysis and data processing systems, no. 2 (June 18, 2021): 67–82. http://dx.doi.org/10.17212/2782-2001-2021-2-67-82.

Abstract:
The paper considers the problem of building robust neural networks using different robust loss functions. Applying such neural networks is reasonable when working with noisy data, and it can serve as an alternative to data preprocessing and to making the neural network architecture more complex. In order to work adequately, the error back-propagation algorithm requires a loss function that is continuously or twice differentiable. According to this requirement, five robust loss functions were chosen (Andrews, Welsch, Huber, Ramsey and Fair). Using the above-mentioned functions in the error ba
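To make the role of a robust loss in back-propagation concrete, the sketch below shows one of the losses named in the abstract (Huber) together with its derivative, which is the quantity the error back-propagation algorithm actually propagates; the threshold delta is an illustrative choice, not a value from the paper.

```python
# Sketch of the Huber loss and its derivative; the derivative replaces the plain
# residual used by the mean-squared-error loss during back-propagation.
# The threshold delta is an illustrative choice.
import numpy as np

def huber(residual, delta=1.0):
    """Quadratic near zero, linear in the tails: less sensitive to outliers than MSE."""
    a = np.abs(residual)
    return np.where(a <= delta, 0.5 * residual ** 2, delta * (a - 0.5 * delta))

def huber_grad(residual, delta=1.0):
    """Derivative w.r.t. the residual; back-propagation multiplies this onward."""
    return np.where(np.abs(residual) <= delta, residual, delta * np.sign(residual))

r = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])
print(huber(r))       # [2.5, 0.125, 0., 0.125, 2.5]
print(huber_grad(r))  # [-1., -0.5, 0., 0.5, 1.]  (clipped for large residuals)
```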
13

Redwan, Renas M. "Neural networks and Sigmoid Activation Function in Multi-Layer Networks." Qubahan Academic Journal 1, no. 2 (2020): 29–43. http://dx.doi.org/10.48161/qaj.v1n2a11.

Abstract:
Back propagation neural networks are known for computing problems that cannot easily be computed (huge dataset analysis or training) in artificial neural networks. The main idea of this paper is to implement the XOR logic gate with ANNs, using back propagation neural networks for the back propagation of errors and a sigmoid activation function. This neural network maps a non-linear threshold gate. The non-linearity is used to classify binary inputs, passing them through the hidden layer to compute the coefficient and gradient errors; after computing the errors, the weights and thetas are changed accordingly. Sigmo
14

Zhang, Dai Yuan, and Jian Hui Zhan. "Short-Term Traffic Flow Forecasting of Road Based on Spline Weight Function Neural Networks." Applied Mechanics and Materials 513-517 (February 2014): 695–98. http://dx.doi.org/10.4028/www.scientific.net/amm.513-517.695.

Abstract:
Traditional short-term road traffic flow forecasting is usually based on back propagation neural networks, which have low prediction accuracy and convergence speed. This paper introduces a spline weight function neural network whose weight function can well reflect sample information after training, proposes a short-term traffic flow forecasting method based on the spline weight function neural network, specifies the network learning algorithm, and makes comparative tests based on actual data. The result proves that in short-term traffic flow forecasting, the spli
15

Vairaprakash, Gurusamy, and K. Nandhini. "PERFORMANCE EVALUATION OF VARIANCES IN BACKPROPAGATION NEURAL NETWORK USED FOR HANDWRITTEN CHARACTER RECOGNITION." INTERNATIONAL JOURNAL OF ENGINEERING SCIENCES & RESEARCH TECHNOLOGY 6, no. 11 (2017): 372–78. https://doi.org/10.5281/zenodo.1054593.

Abstract:
A Neural Network is a powerful data modeling tool that is able to capture and represent complex input/output relationships. The motivation for the development of neural network technology stemmed from the desire to develop an artificial system that could perform "intelligent" tasks similar to those performed by the human brain. Back propagation was created by generalizing the Widrow-Hoff learning rule to multiple-layer networks and nonlinear differentiable transfer functions. The term back propagation refers to the manner in which the gradient is computed for nonlinear multilayer networks. Ther
16

ABDI, H. "A NEURAL NETWORK PRIMER." Journal of Biological Systems 02, no. 03 (1994): 247–81. http://dx.doi.org/10.1142/s0218339094000179.

Abstract:
Neural networks are composed of basic units somewhat analogous to neurons. These units are linked to each other by connections whose strength is modifiable as a result of a learning process or algorithm. Each of these units integrates independently (in parallel) the information provided by its synapses in order to evaluate its state of activation. The unit response is then a linear or nonlinear function of its activation. Linear algebra concepts are used, in general, to analyze linear units, with eigenvectors and eigenvalues being the core concepts involved. This analysis makes clear the stro
17

Liu, Qingliang, and Jinmei Lai. "Stochastic Loss Function." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (2020): 4884–91. http://dx.doi.org/10.1609/aaai.v34i04.5925.

Abstract:
Training deep neural networks is inherently subject to the predefined and fixed loss functions during optimizing. To improve learning efficiency, we develop Stochastic Loss Function (SLF) to dynamically and automatically generate appropriate gradients to train deep networks in the same round of back-propagation, while maintaining the completeness and differentiability of the training pipeline. In SLF, a generic loss function is formulated as a joint optimization problem of network weights and loss parameters. In order to guarantee the requisite efficiency, gradients with respect to the g
18

Khehra, Baljit Singh, and Amanpreet Kaur. "Quality Assessment of modelled protein structure using Back-propagation and Radial Basis Function algorithm." International Journal of Scientific Research and Management (IJSRM) 5, no. 7 (2017): 6019–33. http://dx.doi.org/10.18535/ijsrm/v5i7.27.

Abstract:
Protein structure prediction (PSP) is the most important and challenging problem in bioinformatics today. This is due to the fact that the biological function of the protein is determined by its structure. While there is a gap between the number of known protein structures and the number of known protein sequences, protein structure prediction aims at reducing this structure –sequence gap. Protein structure can be experimentally determined using either X-ray crystallography or Nuclear Magnetic Resonance (NMR). However, these empirical techniques are very time consuming. So, various machine lea
19

Chithambaram, T., and K. Perumal. "Comparative Study: Artificial Neural Networks Training Functions for Brain Tumor Segmentation for MRI Images." Journal of Computational and Theoretical Nanoscience 17, no. 4 (2020): 1831–38. http://dx.doi.org/10.1166/jctn.2020.8448.

Abstract:
Brain tumor detection from medical images is essential to diagnose earlier and to take decision in treatment planning. Magnetic Resonance Images (MRI) is frequently preferred for detecting brain tumors by the physicians. This paper analyses various Artificial Neural Networks (ANN) training functions for brain tumor segmentation such as Levenberg-Marquardt (LM), Quasi Newton back propagation (QN), Bayesian regularization (BR), Resilient back propagation algorithm (RP) and Scaled conjugate gradient back propagation (SCG). The training algorithms were employed in different sized network for segme
20

WIRANATA, I. KETUT RESTU, G. K. GANDHIADI, and LUH PUTU IDA HARINI. "PERAMALAN KUNJUNGAN WISATAWAN MANCANEGARA KE PROVINSI BALI MENGGUNAKAN METODE ARTIFICIAL NEURAL NETWORK." E-Jurnal Matematika 9, no. 4 (2020): 213. http://dx.doi.org/10.24843/mtk.2020.v09.i04.p301.

Abstract:
Bali has an increasing tourism potential. This is evidenced by the increasing number of foreign tourist visits to Bali Province each year. Although Bali's tourism trends have continued to increase over the past few years, efforts to improve the quality of Bali tourism need to be made. One way is to do forecasting. To support improvement efforts in Bali's tourism sector, the author created a forecasting system for foreign tourists to Bali province using artificial neural network methods with back propagation algorithms. Artificial Neural Networks with back propagation algorithms are neural netw
21

REHMAN, MUHAMMAD ZUBAIR, and NAZRI MOHD NAWI. "STUDYING THE EFFECT OF ADAPTIVE MOMENTUM IN IMPROVING THE ACCURACY OF GRADIENT DESCENT BACK PROPAGATION ALGORITHM ON CLASSIFICATION PROBLEMS." International Journal of Modern Physics: Conference Series 09 (January 2012): 432–39. http://dx.doi.org/10.1142/s201019451200551x.

Abstract:
Despite being widely used in the practical problems around the world, Gradient Descent Back-propagation algorithm comes with problems like slow convergence and convergence to local minima. Previous researchers have suggested certain modifications to improve the convergence in gradient Descent Back-propagation algorithm such as careful selection of input weights and biases, learning rate, momentum, network topology, activation function and value for 'gain' in the activation function. This research proposed an algorithm for improving the working performance of back-propagation algorithm which is
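For context, the sketch below shows the two baseline ingredients this entry's abstract refers to: a sigmoid with an explicit gain parameter and a gradient-descent update with a momentum term. The adaptive tuning scheme proposed in the paper is not reproduced here, and all constants are assumptions.

```python
# Sketch of a sigmoid with an explicit gain parameter and gradient descent with
# momentum; the adaptive momentum/gain scheme from the paper is not reproduced.
import numpy as np

def sigmoid(z, gain=1.0):
    """Logistic activation; 'gain' scales the slope of the transition region."""
    return 1.0 / (1.0 + np.exp(-gain * z))

def momentum_update(w, grad, prev_dw, lr=0.1, momentum=0.9):
    """delta_w(t) = -lr * gradient + momentum * delta_w(t-1)"""
    dw = -lr * grad + momentum * prev_dw
    return w + dw, dw

# Illustrative run on the quadratic error E(w) = 0.5 * w^2 (gradient = w).
w, prev_dw = 2.0, 0.0
for _ in range(200):
    w, prev_dw = momentum_update(w, grad=w, prev_dw=prev_dw)
print(round(w, 6), sigmoid(0.5, gain=4.0))  # w decays toward 0; gain 4 gives a steeper sigmoid
```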
22

Mazakov, T. Zh, and D. N. Narynbekovna. "DEVELOPMENT OF BIOMETRIC METHODS AND INFORMATION SECURITY TOOLS." PHYSICO-MATHEMATICAL SERIES 2, no. 336 (2021): 121–24. http://dx.doi.org/10.32014/2021.2518-1726.30.

Abstract:
Nowadays, security is a big issue, and the whole world has been working on face recognition techniques, since the face is used for the extraction of facial features. An analysis has been done of the commonly used face recognition techniques. This paper presents a system for face recognition for identification and verification purposes using Principal Component Analysis (PCA) with Back Propagation Neural Networks (BPNN), and the implementation of the face recognition system is done using a neural network. The neural network is used to produce an output pattern from an input pattern. This system
23

Kaur, Jatinder, Mandeep Singh, Pardeep Singh Bains, and Gagandeep Singh. "Analysis of Multi layer Perceptron Network." INTERNATIONAL JOURNAL OF COMPUTERS & TECHNOLOGY 7, no. 2 (2013): 600–606. http://dx.doi.org/10.24297/ijct.v7i2.3462.

Abstract:
In this paper, we introduce the multilayer Perceptron (feedforward) neural network (MLPs) and used it for a function approximation. For the training of MLP, we have used back propagation algorithm principle. The main purpose of this paper lies in changing the number of hidden layers of MLP for achieving minimum value of mean square error.
24

Le, Tuan-Ho, Li Dai, Hyeonae Jang, and Sangmun Shin. "Robust Process Parameter Design Methodology: A New Estimation Approach by Using Feed-Forward Neural Network Structures and Machine Learning Algorithms." Applied Sciences 12, no. 6 (2022): 2904. http://dx.doi.org/10.3390/app12062904.

Abstract:
In robust design (RD) modeling, the response surface methodology (RSM) based on the least-squares method (LSM) is a useful statistical tool for estimating functional relationships between input factors and their associated output responses. Neural network (NN)-based models provide an alternative means of executing input-output functions without the assumptions necessary with LSM-based RSM. However, current NN-based estimation methods do not always provide suitable response functions. Thus, there is room for improvement in the realm of RD modeling. In this study, a new NN-based RD modeling proc
25

Mahajan, Payal, and Zaheeruddin. "Analysis of back propagation and radial basis function neural networks for handover decisions in wireless communication." International Journal of Electrical and Computer Engineering (IJECE) 10, no. 5 (2020): 4835–43. https://doi.org/10.11591/ijece.v10i5.pp4835-4843.

Abstract:
In mobile systems, handoff is a vital process, referring to a process of allocating an ongoing call from one BS to another BS. The handover technique is very important to maintain the Quality of service. Handover algorithms, based on neural networks, fuzzy logic etc. can be used for the same purpose to keep Quality of service as high as possible. In this paper, it is proposed that back propagation networks and radial basis functions may be used for taking handover decision in wireless communication networks. The performance of these classifiers is evaluated on the basis of neurons in hidden la
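Several entries in this list compare back propagation networks with radial basis function networks. The following is a generic sketch of an RBF network (Gaussian hidden units, output weights fitted by linear least squares); the data, centres, and spread are illustrative assumptions, not the handover data used in the paper.

```python
# Generic sketch of a radial basis function (RBF) network: Gaussian hidden units
# with the linear output layer fitted by least squares. Data/centres/spread are
# illustrative assumptions, not taken from the cited paper.
import numpy as np

def rbf_design(X, centres, spread):
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * spread ** 2))            # hidden-layer activations

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))                   # two generic input features
y = np.sin(np.pi * X[:, 0]) + 0.5 * X[:, 1]             # assumed target mapping

centres = X[rng.choice(len(X), 20, replace=False)]      # 20 centres picked from the data
Phi = rbf_design(X, centres, spread=0.5)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)             # linear output weights

y_hat = Phi @ w
print(np.sqrt(np.mean((y_hat - y) ** 2)))               # training RMSE of the RBF model
```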
26

Kao, Chin Ming, Li Chen, Chang Huan Kou, and Shih Wei Ma. "Applying Back-Propagation Neural Network for Estimating the Slump of Concrete." Advanced Materials Research 651 (January 2013): 986–89. http://dx.doi.org/10.4028/www.scientific.net/amr.651.986.

Abstract:
This paper proposes the back-propagation neural network (BPN) and applies it to estimate the slump of high-performance concrete (HPC). It is known that HPC is a highly complex material whose behaviour is difficult to model, especially its slump. The slump is a nonlinear function of the content of all concrete ingredients, including cement, fly ash, blast furnace slag, water, superplasticizer, and coarse and fine aggregate. Therefore, slump estimation is set as a function of the content of these seven concrete ingredients and four additional important ratios. The results show th
27

Liao, Xuan, Tong Zhou, Longlong Zhang, Xiang Hu, and Yuanxi Peng. "A Method for Calculating the Derivative of Activation Functions Based on Piecewise Linear Approximation." Electronics 12, no. 2 (2023): 267. http://dx.doi.org/10.3390/electronics12020267.

Abstract:
Nonlinear functions are widely used as activation functions in artificial neural networks, which have a great impact on the fitting ability of artificial neural networks. Due to the complexity of the activation function, the computation of the activation function and its derivative requires a lot of computing resources and time during training. In order to improve the computational efficiency of the derivatives of the activation function in the back-propagation of artificial neural networks, this paper proposes a method based on piecewise linear approximation method to calculate the derivative
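The sketch below illustrates the general idea described in this entry: replacing the exact derivative of an activation function with a piecewise linear approximation looked up from precomputed breakpoints. The breakpoint count and range are illustrative assumptions, not the paper's design.

```python
# Sketch of a piecewise linear approximation of an activation-function derivative
# (sigmoid derivative here); breakpoint count and range are illustrative.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_deriv_exact(z):
    s = sigmoid(z)
    return s * (1.0 - s)

# Precompute breakpoints once; at run time the derivative becomes a table lookup
# plus linear interpolation instead of exponentials.
knots = np.linspace(-8.0, 8.0, 33)
values = sigmoid_deriv_exact(knots)

def sigmoid_deriv_pwl(z):
    return np.interp(z, knots, values)     # piecewise linear approximation

z = np.linspace(-10, 10, 1001)
err = np.max(np.abs(sigmoid_deriv_pwl(z) - sigmoid_deriv_exact(z)))
print(err)   # small maximum approximation error over the tested range
```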
28

Liu, T. I., and K. S. Anantharaman. "Intelligent Classification and Measurement of Drill Wear." Journal of Engineering for Industry 116, no. 3 (1994): 392–97. http://dx.doi.org/10.1115/1.2901957.

Abstract:
Artificial neural networks are used for on-line classification and measurement of drill wear. The input vector of the neural network is obtained by processing the thrust and torque signals. Outputs are the wear states and flank wear measurements. The learning process can be performed by back propagation along with adaptive activation-function slope. The results of neural networks with and without adaptive activation-function slope, as well as various neural network architectures are compared. On-line classification of drill wear using neural networks has 100 percent reliability. The average fl
29

Thatoi, Dhirendranath, Punyaslok Guru, Prabir Kumar Jena, Sasanka Choudhury, and Harish Chandra Das. "Comparison of CFBP, FFBP, and RBF Networks in the Field of Crack Detection." Modelling and Simulation in Engineering 2014 (2014): 1–13. http://dx.doi.org/10.1155/2014/292175.

Abstract:
The issue of crack detection and its diagnosis has gained widespread industrial interest. Cracks and damage affect industrial economic growth, so early crack detection is an important aspect from the point of view of any industrial growth. In this paper the design tool ANSYS is used to monitor various changes in the vibrational characteristics of thin transverse cracks on a cantilever beam for detecting the crack position and depth, and the results were compared using artificial intelligence techniques. The usage of neural networks is the key point of development in this paper. The three neural networks u
30

Panda, S. S., D. Chakraborty, and S. K. Pal. "Flank wear prediction in drilling using back propagation neural network and radial basis function network." Applied Soft Computing 8, no. 2 (2008): 858–71. http://dx.doi.org/10.1016/j.asoc.2007.07.003.

31

Fan, Qi, Chang Jie Zhu, Bao Hua Wang, and Jian Yu Xiao. "Use ANN to Solve Function Regression Problem in Manufacturing." Applied Mechanics and Materials 44-47 (December 2010): 3119–22. http://dx.doi.org/10.4028/www.scientific.net/amm.44-47.3119.

Abstract:
Engineers often meet function regression problems in manufacturing. In this work, we use an artificial neural network to solve this problem. We choose a typical function as the target function. Since the input value of the function is continuous, the output of the regression model should also have a continuous value range. We implement the feed forward neural network with the back propagation learning algorithm, investigate different network parameters, and compare several different training algorithms. The performance assessment based on the test dataset is also discussed.
32

Wichman, R. F., and J. Alexander. "Using Function Approximation to Determine Neural Network Accuracy." AECL Nuclear Review 2, no. 1 (2013): 89–98. http://dx.doi.org/10.12943/anr.2013.00008.

Abstract:
Many, if not most, control processes demonstrate non-linear behavior in some portion of their operating range and the ability of neural networks to model non-linear dynamics makes them very appealing for control. Control of high reliability safety systems, and autonomous control in process or robotic applications, however, require accurate and consistent control and neural networks are only approximators of various functions so their degree of approximation becomes important. In this paper, the factors affecting the ability of a feed-forward back-propagation neural network to accurately approx
33

Bondarev, V. "Training a digital model of a deep spiking neural network using backpropagation." E3S Web of Conferences 224 (2020): 01026. http://dx.doi.org/10.1051/e3sconf/202022401026.

Abstract:
Deep spiking neural networks are one of the promising event-based sensor signal processing concepts. However, the practical application of such networks is difficult with standard deep neural network training packages. In this paper, we propose a vector-matrix description of a spiking neural network that allows us to adapt the traditional backpropagation algorithm for signals represented as spike time sequences. We represent spike sequences as binary vectors. This enables us to derive expressions for the forward propagation of spikes and the corresponding spike training algorithm based on the bac
34

Safavi, A., M. H. Esteki, S. M. Mirvakili, and M. Khaki. "Comparison of back propagation network and radial basis function network in Departure from Nucleate Boiling Ratio (DNBR) calculation." Kerntechnik 85, no. 1 (2020): 15–25. http://dx.doi.org/10.1515/kern-2020-850105.

Abstract:
Since estimating the minimum departure from nucleate boiling ratio (MDNBR) requires complex calculations, an alternative method has always been considered. One of these methods is neural network. In this study, the Back Propagation Neural network (BPN) and Radial Basis Function Neural network (RBFN) are introduced and compared in order to estimate MDNBR of the VVER-1000 light water reactor. In these networks, the MDNBR were predicted with the inputs including core mass flux, core inlet temperature, pressure, reactor power level and position of the control rods. To obtain the data requ
35

Vimaladevi, M., and B. Kalaavathi. "A microarray gene expression data classification using hybrid back propagation neural network." Genetika 46, no. 3 (2014): 1013–26. http://dx.doi.org/10.2298/gensr1403013v.

Abstract:
Classification of cancer establishes appropriate treatment and helps to decide the diagnosis. Cancer expands progressively from an alteration in a cell’s genetic structure. This change (mutation) results in cells with uncontrolled growth patterns. In cancer classification, the back propagation approach is sufficient, and it is also a universal technique for training artificial neural networks. It is also called a supervised learning method. It needs many input and output datasets to make up the training set. The back propagation method may execute the function of collaborate multiple partie
36

Mathur, Lav Singh, Amit Agrawal, and Dharmendra Kumar Singh. "MODELING OF BREAKDOWN VOLTAGE OF SOLID INSULATING MATERIALS BY ARTIFICIAL NEURAL NETWORK." INTERNATIONAL JOURNAL OF ENGINEERING SCIENCES & RESEARCH TECHNOLOGY 5, no. 6 (2016): 788–96. https://doi.org/10.5281/zenodo.56011.

Abstract:
This paper presents a model to find out the breakdown voltage of solid insulating materials under AC excitation condition by employing the artificial neural network method. The paper gives a brief introduction to multilayer perceptrons and resilient back-propagation. A relation between input variables and output variables i. e. breakdown voltage is demonstrated. The inputs to the neural networks are the thickness of material, diameter of void, depth of void and permittivity of materials. Neural network methodology is the one of the most popular and widely used method for the analysis of voids.
37

Sangram S. Nikam, A. K. Mishra, A. Sarangi, Paresh B. Shirsath, D. K. Singh, and V. Ramasubramanian. "Artificial Neural Network Models to Predict Wheat Crop Evapotranspiration." Journal of Agricultural Engineering (India) 47, no. 2 (2024): 20–25. http://dx.doi.org/10.52151/jae2010472.1406.

Abstract:
The development of Artificial Neural Network (ANN) models for prediction of wheat crop evapotranspiration using measured weather data and lysimeter measured crop evapotranspiration (ETc) for Delhi is described. Eleven meteorological variables were taken into consideration for this study. ANN models were developed in MATLAB© with different network architectures using Feed Forward Back Propagation (FFBP) and Elman Back Propagation (EBP) algorithms. The total length of data record used was 744, out of that 60% was taken for model training, 20% for model testing and remaining 20% for model validat
38

Mahajan, Payal, and Zaheeruddin Zaheeruddin. "Analysis of back propagation and radial basis function neural networks for handover decisions in wireless communication." International Journal of Electrical and Computer Engineering (IJECE) 10, no. 5 (2020): 4835. http://dx.doi.org/10.11591/ijece.v10i5.pp4835-4843.

Abstract:
In mobile systems, handoff is a vital process, referring to a process of allocating an ongoing call from one BS to another BS. The handover technique is very important to maintain the Quality of service. Handover algorithms, based on neural networks, fuzzy logic etc. can be used for the same purpose to keep Quality of service as high as possible. In this paper, it is proposed that back propagation networks and radial basis functions may be used for taking handover decision in wireless communication networks. The performance of these classifiers is evaluated on the basis of neurons in hidden la
39

Chen, Xu Sheng, Chen Peng Xu, and Hong Qi Wang. "Equipment Manufacturing Industry Knowledge Chain Efficiency Prediction Algorithm Based on Improved RBFNN." Applied Mechanics and Materials 441 (December 2013): 776–79. http://dx.doi.org/10.4028/www.scientific.net/amm.441.776.

Abstract:
A new knowledge chain efficiency prediction arithmetic in equipment manufacturing industry in China was proposed, Radial basis function neural network (RBFNN) was designed, and initial temperature numerical calculation arithmetic was adopted to adjust the network weights. MATLAB program was compiled; experiments on related data have been done employing the program. All experiments have shown that the arithmetic can efficiently approach the precision with 10^-4 error, also the learning speed is quick and predictions are ideal. Trainings have been done with other networks in comparison. Back-prop
40

Maibam, Sanju Meetei. "QUANTIFICATION OF METHANE (MARSH) GAS USING RESILIENT BACKPROPAGATION NEURAL NETWORK." Multilogic in science XIII, no. XXXXVI (2023): 709–12. https://doi.org/10.5281/zenodo.7869736.

Abstract:
This study unequivocally demonstrates that resilient back-propagation neural networks can be used to quantify marsh or methane gases. Methane’s lower explosive limit (LEL) is 5.0% (5,000 ppm) and its upper explosive limit is 15.0% (150,000 ppm) by volume in air. The main purpose of this study is to classify concentration levels below the LEL of methane gas present in the air by using the resilient back-propagation neural network, to prevent the various hazards caused by methane. After the suggested network was trained with default free parameters, it provided a very high qu
41

R., Bhuvana, Purushothaman S., Rajeswari R., and Balaji R.G. "Development of combined back propagation algorithm and radial basis function for diagnosing depression patients." International Journal of Engineering & Technology 4, no. 1 (2015): 244. http://dx.doi.org/10.14419/ijet.v4i1.4201.

Abstract:
Depression is a severe and well-known public health challenge. Depression is one of the most common psychological problems affecting nearly everyone either personally or through a family member. This paper proposes neural network algorithm for faster learning of depression data and classifying the depression. Implementation of neural networks methods for depression data mining using Back Propagation Algorithm (BPA) and Radial Basis Function (RBF) are presented. Experimental data were collected with 21 depression variables used as inputs for artificial neural network (ANN) and one desired categ
42

Gevaert, Wouter, Georgi Tsenov, and Valeri Mladenov. "Neural networks used for speech recognition." Journal of Automatic Control 20, no. 1 (2010): 1–7. http://dx.doi.org/10.2298/jac1001001g.

Abstract:
This paper presents an investigation of speech recognition classification performance. The investigation is performed using two standard neural network structures as the classifier: a feed-forward neural network (NN) with the back propagation algorithm and a radial basis function neural network.
43

Li, Jing Jing, and Zhe Cui. "A Modified Back Propagation Algorithm of Neural Network with Global Optimization." Advanced Materials Research 1042 (October 2014): 232–38. http://dx.doi.org/10.4028/www.scientific.net/amr.1042.232.

Abstract:
The advantages and weaknesses of the traditional BP algorithm are briefly analyzed, and an efficient global optimization algorithm is proposed. The basic principle of the algorithm is presented, and a new BP neural network algorithm based on the existing BP algorithm and the new global optimization algorithm is proposed, considering that the new global optimization algorithm can solve the problem of local minima efficiently. To verify the effectiveness of the new BP algorithm, the paper compares the experimental results of various algorithms in solving a function fitting problem.
44

Tan, Shuqiu, Jiahao Pan, Jianxun Zhang, and Yahui Liu. "CASVM: An Efficient Deep Learning Image Classification Method Combined with SVM." Applied Sciences 12, no. 22 (2022): 11690. http://dx.doi.org/10.3390/app122211690.

Abstract:
Recent advances in convolutional neural networks (CNNs) for image feature extraction have achieved extraordinary performance, but back-propagation algorithms tend to fall into local minima. To alleviate this problem, this paper proposes a coordinate attention-support vector machine-convolutional neural network (CASVM). This is proposed to enhance the model’s ability by introducing coordinate attention while obtaining enhanced image features. Training is carried out by back-propagating the loss function of support vector machines (SVMs) to improve the generalization capability, which can effective
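To illustrate in general terms what back-propagating the loss function of a support vector machine means, the sketch below shows a multiclass hinge loss on a network's output scores and its gradient with respect to those scores; this is the generic formulation, not the CASVM architecture itself.

```python
# Generic sketch of a multiclass (Crammer-Singer style) hinge loss on one sample's
# class scores, and the gradient that would be back-propagated into the network.
import numpy as np

def multiclass_hinge(scores, target_idx, margin=1.0):
    """Sum of margin violations of the non-target classes."""
    margins = np.maximum(0.0, scores - scores[target_idx] + margin)
    margins[target_idx] = 0.0
    return margins.sum()

def multiclass_hinge_grad(scores, target_idx, margin=1.0):
    """Gradient w.r.t. the scores: +1 for violating classes, -count for the target."""
    active = (scores - scores[target_idx] + margin > 0).astype(float)
    active[target_idx] = 0.0
    grad = active.copy()
    grad[target_idx] = -active.sum()
    return grad

scores = np.array([2.0, 3.5, 1.0])        # network outputs for 3 classes
print(multiclass_hinge(scores, 0))         # 2.5: class 1 violates the margin
print(multiclass_hinge_grad(scores, 0))    # [-1., 1., 0.]
```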
45

Chen, Jingfeng. "Spam mail classification using back propagation neural networks." Applied and Computational Engineering 5, no. 1 (2023): 438–49. http://dx.doi.org/10.54254/2755-2721/5/20230617.

Abstract:
Mail classification methods based on machine learning have been introduced to combat spam. However, little research focuses on the most powerful machine learning model, neural networks. In this paper, the author trains BP neural networks to detect spam. The inputs of the neural networks are only information about words, punctuation, signs, numbers and illegal words. Five neural networks, differing in the number of neurons and the number of layers, are experimented on. All networks apply Rectified Linear Unit (ReLU) functions and momentum learning technology. The results show that the netwo
46

Ou, Shaoduan, and Luke E. K. Achenie. "Artificial Neural Network Modeling of PEM Fuel Cells." Journal of Fuel Cell Science and Technology 2, no. 4 (2005): 226–33. http://dx.doi.org/10.1115/1.2039951.

Abstract:
Artificial neural network (ANN) approaches for modeling of proton exchange membrane (PEM) fuel cells have been investigated in this study. This type of data-driven approach is capable of inferring functional relationships among process variables (i.e., cell voltage, current density, feed concentration, airflow rate, etc.) in fuel cell systems. In our simulations, ANN models have shown to be accurate for modeling of fuel cell systems. Specifically, different approaches for ANN, including back-propagation feed-forward networks, and radial basis function networks, were considered. The back-propag
47

Chen, Xu Sheng, Wen Jun Yue, and Hong Qi Wang. "A Novel Knowledge Diffusion Efficiency Prediction Arithmetic in Equipment Manufacturing Industry Based on Simulated Annealing Arithmetic." Applied Mechanics and Materials 441 (December 2013): 768–71. http://dx.doi.org/10.4028/www.scientific.net/amm.441.768.

Abstract:
A novel knowledge diffusion efficiency prediction arithmetic in equipment manufacturing industry in China was proposed, Radial basis function neural network (RBFNN) was designed, and simulated annealing arithmetic was adopted to adjust the network weights. MATLAB program was compiled; experiments on related data have been done employing the program. All experiments have shown that the arithmetic can efficiently approach the precision with 10^-4 error, also the learning speed is quick and predictions are ideal. Trainings have been done with other networks in comparison. Back-propagation learning
48

Zhu, Qiuyu, Zikuang He, Tao Zhang, and Wennan Cui. "Improving Classification Performance of Softmax Loss Function Based on Scalable Batch-Normalization." Applied Sciences 10, no. 8 (2020): 2950. http://dx.doi.org/10.3390/app10082950.

Abstract:
Convolutional neural networks (CNNs) have made great achievements on computer vision tasks, especially the image classification. With the improvement of network structure and loss functions, the performance of image classification is getting higher and higher. The classic Softmax + cross-entropy loss has been the norm for training neural networks for years, which is calculated from the output probability of the ground-truth class. Then the network’s weight is updated by gradient calculation of the loss. However, after several epochs of training, the back-propagation errors usually become almos
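As background to this entry, the sketch below shows the standard softmax plus cross-entropy combination and its gradient with respect to the logits, softmax(z) minus the one-hot target; it illustrates why the back-propagated error becomes very small once the network predicts the ground-truth class with high confidence.

```python
# Sketch of the standard softmax + cross-entropy combination: the gradient w.r.t.
# the logits is (softmax(z) - one_hot), so the back-propagated error shrinks as
# the predicted probability of the ground-truth class approaches 1.
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())          # shift for numerical stability
    return e / e.sum()

def cross_entropy_grad(z, target_idx):
    p = softmax(z)
    grad = p.copy()
    grad[target_idx] -= 1.0          # dL/dz = p - y  (y is one-hot)
    return p, grad

z_early = np.array([1.0, 0.5, -0.5])     # early in training: diffuse probabilities
z_late = np.array([8.0, 0.5, -0.5])      # late in training: confident prediction
for z in (z_early, z_late):
    p, g = cross_entropy_grad(z, target_idx=0)
    print(np.round(p, 4), np.round(g, 4))  # gradient magnitude collapses in the second case
```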
49

Alasl, M. Kashefi, M. Khosravi, M. Hosseini, G. R. Pazuki, and R. Nezakati Esmail Zadeh. "Measurement and mathematical modelling of nutrient level and water quality parameters." Water Science and Technology 66, no. 9 (2012): 1962–67. http://dx.doi.org/10.2166/wst.2012.333.

Abstract:
Physico-chemical water quality parameters and nutrient levels such as water temperature, turbidity, saturated oxygen, dissolved oxygen, pH, chlorophyll-a, salinity, conductivity, total nitrogen and total phosphorus, were measured from April to September 2011 in the Karaj dam area, Iran. Total nitrogen in water was modelled using an artificial neural network system. In the proposed system, water temperature, depth, saturated oxygen, dissolved oxygen, pH, chlorophyll-a, salinity, turbidity and conductivity were considered as input data, and the total nitrogen in water was considered as output. T
50

Gill, Sumeet, and Renu Devi. "Enhancing Cloud Data Security using Artificial Neural Networks for Users’ Account Hijacking Security Threats." Indian Journal Of Science And Technology 17, no. 34 (2024): 3538–52. http://dx.doi.org/10.17485/ijst/v17i34.2339.

Abstract:
Objectives: To ensure the security of passwords of cloud users' accounts that cannot be decrypted easily by any software or hackers. Methods: In this manuscript, we have designed an experimental setup using a feed-forward back-propagation algorithm of Artificial Neural Networks techniques to ensure cloud data security. For this purpose, we have utilized password-based datasets created by us. 70% of the datasets are allocated for training and 30% for testing and validation purposes. In this training, TRAINLM training function, LEARNGDM adaptive function, performance function is MSE, and PURELIN