Academic literature on the topic 'Radial basis function networks (RBFNs)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Radial basis function networks (RBFNs).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Radial basis function networks (RBFNs)"

1

Feng, Hsuan Ming, Ching Chang Wong, and Ji Hwei Horng. "RBFNs Nonlinear Control System Design through BFPSO Algorithm." Applied Mechanics and Materials 764-765 (May 2015): 619–23. http://dx.doi.org/10.4028/www.scientific.net/amm.764-765.619.

Abstract:
All parameters are automatically extracted by the bacterial foraging particle swarm optimization (BFPSO) algorithm to approach the desired control system. Three kinds of parameters, the centers and spreads of each radial basis function and the connection weights, define the radial basis function neural network (RBFN) model used to solve the cart-pole system problem. These free parameters of the radial basis functions are automatically tuned under the direction of the specified fitness function. In addition, the proper number of radial basis functions (RBFs) in the constructed RBFN can be chosen by the defined fitness function, which takes this factor into account. The desired multiple objectives of the RBFN control system are to simultaneously approach smaller errors with a smaller number of RBFs. Simulations show that the developed RBFN control systems efficiently reach the desired settings as quickly as possible.
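The BFPSO update equations are not reproduced in the abstract above, so the short sketch below only illustrates the two pieces it does describe: a Gaussian RBFN forward pass parameterized by centers, spreads, and connection weights, and a hypothetical fitness function that trades tracking error against the number of RBFs. The function names and the penalty weight alpha are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def rbfn_output(x, centers, spreads, weights):
    """Gaussian RBFN output: y = sum_k w_k * exp(-||x - c_k||^2 / (2 s_k^2))."""
    d2 = np.sum((centers - x) ** 2, axis=1)   # squared distance from x to each center
    phi = np.exp(-d2 / (2.0 * spreads ** 2))  # Gaussian basis activations
    return phi @ weights                      # weighted sum -> control signal

def fitness(tracking_errors, n_rbfs, alpha=0.01):
    """Hypothetical multi-objective fitness: small control error AND few RBFs.
    alpha (illustrative value) trades accuracy against network size; lower is better."""
    return np.mean(np.square(tracking_errors)) + alpha * n_rbfs
```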
2

Alqezweeni, Mohie Mortadha, Vladimir Ivanovich Gorbachenko, Maxim Valerievich Zhukov, and Mustafa Sadeq Jaafar. "Efficient Solving of Boundary Value Problems Using Radial Basis Function Networks Learned by Trust Region Method." International Journal of Mathematics and Mathematical Sciences 2018 (June 3, 2018): 1–4. http://dx.doi.org/10.1155/2018/9457578.

Abstract:
A method using radial basis function networks (RBFNs) to solve boundary value problems of mathematical physics is presented in this paper. The main advantages of mesh-free methods based on RBFNs are explained here. To train RBFNs, the Trust Region Method (TRM) is proposed, which simplifies the selection of the network structure and reduces the time spent adjusting the network parameters. Application of the proposed algorithm is illustrated by solving the two-dimensional Poisson equation.
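As a rough illustration of the mesh-free idea in the entry above, the sketch below collocates a Gaussian RBFN on the 2-D Poisson equation with Dirichlet boundary data and solves only the linear output weights by least squares. The paper itself learns the network with the Trust Region Method, so the fixed centers and width, the least-squares solve, and the test problem here are simplifying assumptions.

```python
import numpy as np

def phi(r2, s):
    """Gaussian RBF as a function of squared distance r2."""
    return np.exp(-r2 / (2 * s**2))

def lap_phi(r2, s):
    """2-D Laplacian of the Gaussian RBF, also as a function of r2."""
    return phi(r2, s) * (r2 / s**4 - 2 / s**2)

def solve_poisson_rbfn(centers, s, interior, boundary, f, g):
    """Collocation: sum_k w_k lap_phi = f at interior points and
    sum_k w_k phi = g on the boundary; weights w by linear least squares."""
    def sqdist(pts):  # pairwise squared distances, shape (n_points, n_centers)
        return np.sum((pts[:, None, :] - centers[None, :, :]) ** 2, axis=2)
    A = np.vstack([lap_phi(sqdist(interior), s), phi(sqdist(boundary), s)])
    b = np.concatenate([f(interior), g(boundary)])
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return lambda pts: phi(sqdist(pts), s) @ w   # callable approximate solution u(x, y)

# Illustrative usage on the unit square with exact solution u = sin(pi x) sin(pi y),
# so laplacian(u) = -2 pi^2 sin(pi x) sin(pi y) and u = 0 on the boundary.
xs = np.linspace(0.1, 0.9, 7)
centers = np.array([[a, b] for a in xs for b in xs])
edge = np.linspace(0.0, 1.0, 12)
boundary = np.array([[t, e] for t in edge for e in (0.0, 1.0)] +
                    [[e, t] for t in edge for e in (0.0, 1.0)])
f = lambda p: -2 * np.pi**2 * np.sin(np.pi * p[:, 0]) * np.sin(np.pi * p[:, 1])
g = lambda p: np.zeros(len(p))
u = solve_poisson_rbfn(centers, 0.2, centers, boundary, f, g)  # evaluate u at any points
```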
3

Holden, Sean B., and Mahesan Niranjan. "Average-Case Learning Curves for Radial Basis Function Networks." Neural Computation 9, no. 2 (1997): 441–60. http://dx.doi.org/10.1162/neco.1997.9.2.441.

Abstract:
The application of statistical physics to the study of the learning curves of feedforward connectionist networks has to date been concerned mostly with perceptron-like networks. Recent work has extended the theory to networks such as committee machines and parity machines, and an important direction for current and future research is the extension of this body of theory to further connectionist networks. In this article, we use this formalism to investigate the learning curves of Gaussian radial basis function networks (RBFNs) having fixed basis functions. (These networks have also been called generalized linear regression models.) We address the problem of learning linear and nonlinear, realizable and unrealizable, target rules from noise-free training examples using a stochastic training algorithm. Expressions for the generalization error, defined as the expected error for a network with a given set of parameters, are derived for general Gaussian RBFNs, for which all parameters, including centers and spread parameters, are adaptable. Specializing to the case of RBFNs with fixed basis functions (basis functions having parameters chosen without reference to the training examples), we then study the learning curves for these networks in the limit of high temperature.
4

Huang, De-Shuang. "Radial Basis Probabilistic Neural Networks: Model and Application." International Journal of Pattern Recognition and Artificial Intelligence 13, no. 07 (1999): 1083–101. http://dx.doi.org/10.1142/s0218001499000604.

Abstract:
This paper investigates the capabilities of radial basis function networks (RBFNs) and kernel neural networks (KNNs), i.e., a specific type of probabilistic neural network (PNN), and studies their similarities and differences. In order to avoid the huge number of hidden units of the KNNs (or PNNs) and to reduce the training time of the RBFNs, this paper proposes a new feedforward neural network model referred to as the radial basis probabilistic neural network (RBPNN). This new network model inherits the merits of the two older models to a great extent and avoids their defects in some ways. Finally, we apply the new RBPNN to the recognition of one-dimensional cross-images of radar targets (five kinds of aircraft), and the experimental results are given and discussed.
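To make the RBFN-versus-PNN comparison in the entry above concrete, here is a minimal Parzen-window (PNN-style) classifier in which every training sample acts as a Gaussian kernel center and each class score is a per-class kernel average. This is only a textbook illustration of the kernel/probabilistic side of the comparison, not the RBPNN model proposed in the paper; the function name and sigma value are assumptions.

```python
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Parzen-window / PNN-style classifier: every training sample acts as a
    Gaussian kernel center; a test point takes the class whose per-class
    average kernel value is largest. sigma is an illustrative smoothing width."""
    classes = np.unique(y_train)
    d2 = np.sum((X_test[:, None, :] - X_train[None, :, :]) ** 2, axis=2)
    k = np.exp(-d2 / (2 * sigma**2))            # kernel values, shape (n_test, n_train)
    scores = np.stack([k[:, y_train == c].mean(axis=1) for c in classes], axis=1)
    return classes[np.argmax(scores, axis=1)]
```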
5

Gil Pita, R., R. Vicen, M. Rosa, M. P. Jarabo, P. Vera, and J. Curpian. "Ultrasonic Flaw Detection Using Radial Basis Function Networks (RBFNs)." Ultrasonics 42, no. 1-9 (2004): 361–65. http://dx.doi.org/10.1016/j.ultras.2003.11.018.

6

Joug, Shian Ming, Hsuan Ming Feng, and Dong Hui Guo. "Self-Tuning RBFNs Mobile Robot Systems through Bacterial Foraging Particle Swarm Optimization Learning Algorithm." Applied Mechanics and Materials 284-287 (January 2013): 2128–36. http://dx.doi.org/10.4028/www.scientific.net/amm.284-287.2128.

Abstract:
A radial basis function neural network (RBFN) mobile robot control system is automatically developed with image processing and learned by the bacterial foraging particle swarm optimization (BFPSO) algorithm in this paper. The image-based architecture of the robot model is self-generated to travel the routing path in dynamic and complicated environments. Visible omni-directional image sensors capture the surrounding environment to represent the behavior model of the mobile robot system. Three kinds of parameters, the centers and spreads of each radial basis function and the connection weights, define the RBFN model that solves the mobile robot path traveling and routing problems. These free parameters of the radial basis functions are automatically tuned under the direction of the specified fitness function. In addition, the proper number of radial basis functions in the constructed RBFN can be chosen by the defined fitness function, which takes this factor into account. The desired multiple objectives of the RBFN control system are to simultaneously approach a shorter path and avoid unexpected obstacles. Evaluations of PSO and BFPSO show that the developed RBFN robot systems avoid the obstacles and efficiently reach the desired targets as quickly as possible.
7

Dash, Ch Sanjeev Kumar, Ajit Kumar Behera, Satchidananda Dehuri, and Sung-Bae Cho. "Radial basis function neural networks: a topical state-of-the-art survey." Open Computer Science 6, no. 1 (2016): 33–63. http://dx.doi.org/10.1515/comp-2016-0005.

Abstract:
Radial basis function networks (RBFNs) have gained widespread appeal amongst researchers and have shown good performance in a variety of application domains. They have potential for hybridization and demonstrate some interesting emergent behaviors. This paper aims to offer a compendious and sensible survey of RBF networks. The advantages they offer, such as fast training and global approximation capability with local responses, are attracting many researchers to use them in diversified fields. The overall algorithmic development of RBF networks is discussed, with special focus on their learning methods, novel kernels, and fine-tuning of kernel parameters. In addition, we have considered the recent research work on multi-criterion optimization in RBF networks and a range of indicative application areas, along with some open-source RBFN tools.
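The 'fast training' advantage the survey above highlights usually refers to the classic two-stage scheme: fix the hidden layer (centers and spreads), then solve the linear output weights in closed form. A minimal sketch under those assumptions, with randomly sampled centers and a standard spread heuristic as illustrative choices:

```python
import numpy as np

def train_rbfn(X, y, n_centers=20, seed=0):
    """Two-stage RBFN training: sample centers from the data, pick a common
    spread from the center spacing, then solve output weights in closed form."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=n_centers, replace=False)]
    dmax = np.max(np.linalg.norm(centers[:, None] - centers[None, :], axis=2))
    spread = dmax / np.sqrt(2 * n_centers)      # classic heuristic spread
    Phi = np.exp(-np.sum((X[:, None] - centers[None, :]) ** 2, axis=2) / (2 * spread**2))
    weights = np.linalg.pinv(Phi) @ y           # linear least-squares solve
    return centers, spread, weights
```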
8

Mayorga, René V., and Jonathan Carrera. "A Radial Basis Function Network Approach for the Computation of Inverse Continuous Time Variant Functions." International Journal of Neural Systems 17, no. 03 (2007): 149–60. http://dx.doi.org/10.1142/s0129065707001020.

Abstract:
This paper presents an efficient approach for the fast computation of inverse continuous time variant functions with the proper use of Radial Basis Function Networks (RBFNs). The approach is based on implementing RBFNs to compute inverse continuous time variant functions via an overall damped least squares solution that includes a novel null space vector for singularity prevention. The singularity-avoidance null space vector is derived by developing a sufficiency condition for singularity prevention that leads to establishing some characterizing matrices and an associated performance index.
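The paper's specific singularity-avoidance null space vector is not given in the abstract above, so the sketch below only shows the standard damped least-squares-plus-null-space form that such approaches build on; the secondary objective z, the damping value, and the function name are placeholders.

```python
import numpy as np

def dls_step(J, dx, z, damping=0.1):
    """Damped least-squares step dq = J^T (J J^T + lambda^2 I)^(-1) dx plus a
    null-space term (I - J^+ J) z for a secondary (e.g. singularity-avoidance) objective."""
    m, n = J.shape
    dq_task = J.T @ np.linalg.solve(J @ J.T + damping**2 * np.eye(m), dx)
    dq_null = (np.eye(n) - np.linalg.pinv(J) @ J) @ z
    return dq_task + dq_null
```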
9

Huang, De-Shuang. "Application of Generalized Radial Basis Function Networks to Recognition of Radar Targets." International Journal of Pattern Recognition and Artificial Intelligence 13, no. 06 (1999): 945–62. http://dx.doi.org/10.1142/s0218001499000525.

Abstract:
This paper extends general radial basis function networks (RBFNs) with Gaussian kernel functions to generalized radial basis function networks (GRBFNs) with Parzen window functions, and discusses applying the GRBFNs to the recognition of radar targets. The equivalence between RBFN classifiers (RBFNCs) with outer-supervised signals of 0 or 1 and the Parzen-windowed probability density estimate is proved. It is pointed out that the I/O functions of the hidden units in the RBFNC can be extended to general Parzen window functions (also called potential functions). We present the use of a recursive least squares-backpropagation (RLS-BP) learning algorithm to train the GRBFNCs to classify five types of radar targets by means of their one-dimensional cross profiles. The concepts of recognition rate and confidence in the process of testing the classification performance of the GRBFNCs are introduced. Six generalized kernel functions, namely Gaussian, double-exponential, triangle, hyperbolic, sinc, and Cauchy, are used as the hidden I/O functions of the RBFNCs, and the classification performance of the corresponding GRBFNCs for classifying one-dimensional cross profiles of radar targets is discussed.
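The six hidden-unit functions named above can all be written as radial kernels of a normalized distance r = ||x - c|| / s. The exact forms used in the paper are not stated in the abstract, so the definitions below are common textbook versions and should be read as assumptions.

```python
import numpy as np

# Common textbook forms of the kernels named in the abstract (r = ||x - c|| / s).
KERNELS = {
    "gaussian":           lambda r: np.exp(-r**2),
    "double_exponential": lambda r: np.exp(-np.abs(r)),          # Laplacian-shaped
    "triangle":           lambda r: np.maximum(1.0 - np.abs(r), 0.0),
    "hyperbolic":         lambda r: 1.0 / np.cosh(r),            # sech profile (assumed form)
    "sinc":               lambda r: np.sinc(r),                  # sin(pi r) / (pi r)
    "cauchy":             lambda r: 1.0 / (1.0 + r**2),
}

def grbfn_output(x, centers, spread, weights, kernel="cauchy"):
    """Generalized RBFN output with a selectable hidden-unit kernel."""
    r = np.linalg.norm(centers - x, axis=1) / spread
    return KERNELS[kernel](r) @ weights
```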
10

Dash, Ch Sanjeev Kumar, Ajit Kumar Behera, Satchidananda Dehuri, and Sung-Bae Cho. "Differential Evolution-Based Optimization of Kernel Parameters in Radial Basis Function Networks for Classification." International Journal of Applied Evolutionary Computation 4, no. 1 (2013): 56–80. http://dx.doi.org/10.4018/jaec.2013010104.

Abstract:
In this paper, a two-phase learning algorithm with a modified kernel for radial basis function neural networks is proposed for classification. In phase one, a new meta-heuristic approach, differential evolution, is used to reveal the parameters of the modified kernel. The second phase focuses on optimizing the weights for learning the networks. Further, a predefined set of basis functions is taken for empirical analysis of which basis function is better for which kind of domain. The simulation results show that the proposed learning mechanism evidently produces better classification accuracy vis-à-vis radial basis function neural networks (RBFNs) and genetic algorithm-radial basis function (GA-RBF) neural networks.
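As a loose illustration of the two-phase scheme in the last entry, the sketch below uses SciPy's differential evolution to search a kernel width (phase one) and then solves the output weights by least squares (phase two). The paper tunes a modified kernel and evaluates classification accuracy, so the single-width search, the squared-error objective, and the helper names here are simplifying assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

def design_matrix(X, centers, spread):
    """Gaussian design matrix Phi[i, k] = exp(-||x_i - c_k||^2 / (2 spread^2))."""
    d2 = np.sum((X[:, None, :] - centers[None, :, :]) ** 2, axis=2)
    return np.exp(-d2 / (2 * spread**2))

def fit_rbfn_de(X, y, centers):
    """Phase 1: differential evolution searches the kernel width;
    Phase 2: output weights by linear least squares."""
    def objective(params):
        Phi = design_matrix(X, centers, params[0])
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        return np.mean((Phi @ w - y) ** 2)          # training MSE as a stand-in criterion
    result = differential_evolution(objective, bounds=[(0.05, 5.0)], seed=0, maxiter=30)
    spread = result.x[0]
    Phi = design_matrix(X, centers, spread)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return spread, w
```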