Academic literature on the topic 'Neural network: performance'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Neural network: performance.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
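As a rough sketch of what such automatic formatting involves, the following hypothetical Python snippet renders one stored bibliographic record in two simplified styles. The field names, format strings, and the `format_citation` helper are all illustrative assumptions, not the site's actual code, and real style rules (edition-specific punctuation, author inversion, et al. cutoffs) are far more involved.

```python
# Illustrative sketch: render one bibliographic record in simplified
# APA-like and MLA-like styles. All names and formats are assumptions.

def format_citation(rec, style="APA"):
    """Render a record dict as a (much simplified) citation string."""
    if style == "APA":
        return (f"{rec['authors']} ({rec['year']}). {rec['title']}. "
                f"{rec['journal']}, {rec['volume']}({rec['issue']}), "
                f"{rec['pages']}.")
    if style == "MLA":
        return (f"{rec['authors']}. \"{rec['title']}.\" {rec['journal']}, "
                f"vol. {rec['volume']}, no. {rec['issue']}, {rec['year']}, "
                f"pp. {rec['pages']}.")
    raise ValueError(f"unsupported style: {style}")

# Example record, drawn from entry 10 below:
record = {
    "authors": "Wilson, C. L., Blue, J. L., & Omidvar, O. M.",
    "title": "Training Dynamics and Neural Network Performance",
    "journal": "Neural Networks", "volume": 10, "issue": 5,
    "pages": "907-923", "year": 1997,
}
print(format_citation(record, "APA"))
print(format_citation(record, "MLA"))
```

The point of the sketch is that one stored record can be serialized into any number of styles on demand, which is why a single 'Add to bibliography' click can offer APA, MLA, Harvard, Chicago, and Vancouver output at once.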

Journal articles on the topic "Neural network: performance"

1

Panda, Subodh, Bikash Swain, and Sandeep Mishra. "Boiler Performance Optimization Using Process Neural Network." Indian Journal of Applied Research 3, no. 7 (2011): 298–300. http://dx.doi.org/10.15373/2249555x/july2013/93.

2

CR, Dhivyaa, Sudhakar R, Nithya K, and Prabhakar E. "Performance Analysis of Convolutional Neural Network for Retinal Image Classification." International Journal of Psychosocial Rehabilitation 23, no. 4 (2019): 1149–59. http://dx.doi.org/10.37200/ijpr/v23i4/pr190441.

3

Li, Xiao Hu, Feng Xu, Jin Hua Zhang, and Su Nan Wang. "A New Small-World Neural Network with its Performance on Fault Tolerance." Advanced Materials Research 629 (December 2012): 719–24. http://dx.doi.org/10.4028/www.scientific.net/amr.629.719.

Abstract:
Many artificial neural networks are simple simulations of the brain's neural architecture and function. However, how to build new artificial neural networks whose architecture is similar to that of biological neural networks is worth studying. In this study, a new multilayer feedforward small-world neural network is presented using results from research on complex networks. First, a new multilayer feedforward small-world neural network, which depends heavily on the rewiring probability, is built on the basis of the construction ideology of the Watts-Strogatz network model and community structure. Second, fault tolerance is employed to investigate the performance of the new small-world neural network. When a network with connection faults or neuron damage is used to test fault-tolerance performance under different rewiring probabilities, simulation results show that the fault-tolerance capability of the small-world neural network outmatches that of a regular network of the same scale when the fault probability is more than 40%, while the random network has the best fault-tolerance capability.
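The Watts-Strogatz construction the abstract builds on can be sketched in a few lines: start from a ring lattice, then rewire each edge with probability p, which interpolates between a regular network (p = 0) and a random one (p = 1). This is a minimal pure-Python illustration of the general construction only; the function name and parameters are assumptions, not the authors' code.

```python
import random

def watts_strogatz(n, k, p, seed=0):
    """Ring lattice of n nodes, each linked to its k nearest neighbours on
    one side, with every edge then rewired with probability p."""
    rng = random.Random(seed)
    edges = set()
    # Regular ring lattice: node i connects to i+1, ..., i+k (mod n).
    for i in range(n):
        for j in range(1, k + 1):
            edges.add((i, (i + j) % n))
    rewired = set()
    for (u, v) in edges:
        if rng.random() < p:
            # Replace v with a random target, avoiding self-loops
            # and edges we have already placed.
            w = rng.randrange(n)
            while w == u or (u, w) in rewired or (w, u) in rewired:
                w = rng.randrange(n)
            rewired.add((u, w))
        else:
            rewired.add((u, v))
    return rewired

graph = watts_strogatz(n=20, k=2, p=0.3)
print(len(graph))
```

At small p the graph keeps the lattice's local clustering but gains a few long-range shortcuts, which is the "small-world" regime the paper exploits when wiring its feedforward layers.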
4

Yen, Gary G., and Haiming Lu. "Hierarchical Rank Density Genetic Algorithm for Radial-Basis Function Neural Network Design." International Journal of Computational Intelligence and Applications 03, no. 03 (2003): 213–32. http://dx.doi.org/10.1142/s1469026803000975.

Abstract:
In this paper, we propose a genetic-algorithm-based design procedure for a radial-basis function neural network. A Hierarchical Rank Density Genetic Algorithm (HRDGA) is used to evolve the neural network's topology and parameters simultaneously. Compared with traditional genetic-algorithm-based designs for neural networks, the hierarchical approach addresses several deficiencies highlighted in the literature. In addition, the rank-density-based fitness assignment technique is used to optimize the performance and topology of the evolved neural network, dealing with the conflict between training performance and network complexity. Instead of producing a single optimal solution, HRDGA provides a set of near-optimal neural networks to designers so that they have more flexibility in the final decision-making based on their preferences. In terms of searching for a near-complete set of candidate networks with high performance, the networks designed by the proposed algorithm prove competitive with, or even superior to, three other traditional radial-basis function networks for predicting the Mackey–Glass chaotic time series.
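For context, the kind of network HRDGA evolves computes a weighted sum of localized basis functions; the forward pass can be sketched as below. This is a toy 1-D illustration with Gaussian basis functions; the function name, parameter names, and values are illustrative, not taken from the paper.

```python
import math

def rbf_predict(x, centers, widths, weights, bias=0.0):
    """Output of a radial-basis function network: a weighted sum of
    Gaussian bumps centred at `centers` with spreads `widths`."""
    hidden = [math.exp(-((x - c) ** 2) / (2 * s ** 2))
              for c, s in zip(centers, widths)]
    return bias + sum(w * h for w, h in zip(weights, hidden))

# A toy network with two basis functions; at x = 0.5 the two bumps
# contribute equally and their opposite weights cancel.
y = rbf_predict(0.5, centers=[0.0, 1.0], widths=[0.5, 0.5],
                weights=[1.0, -1.0])
print(y)
```

The design problem the paper tackles is choosing how many such basis functions to use and where to place them, which is exactly the topology/parameter search HRDGA performs.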
5

Jeong, Yeongsang, and Sungshin Kim. "A Study of Arrow Performance using Artificial Neural Network." Journal of Korean Institute of Intelligent Systems 24, no. 5 (2014): 548–53. http://dx.doi.org/10.5391/jkiis.2014.24.5.548.

6

Tun, Myat Thida. "Implementation and Performance Evaluation of Neural Network for English Alphabet Recognition System." International Journal of Trend in Scientific Research and Development 2, no. 5 (2018): 474–78. http://dx.doi.org/10.31142/ijtsrd15863.

7

Suhailayani Suhaimi, Nur, Zalinda Othman, and Mohd Ridzwan Yaakub. "Analyzing Prediction Performance between Wavelet Neural Network and Product-Unit Neural Network." Journal of Physics: Conference Series 1432 (January 2020): 012081. http://dx.doi.org/10.1088/1742-6596/1432/1/012081.

8

Stevens, R., J. Ikeda, A. Casillas, J. Palacio-Cayetano, and S. Clyman. "Artificial neural network-based performance assessments." Computers in Human Behavior 15, no. 3-4 (1999): 295–313. http://dx.doi.org/10.1016/s0747-5632(99)00025-4.

9

Jasic, Teo, and Douglas Wood. "Neural network protocols and model performance." Neurocomputing 55, no. 3-4 (2003): 747–53. http://dx.doi.org/10.1016/s0925-2312(03)00437-5.

10

Wilson, Charles L., James L. Blue, and Omid M. Omidvar. "Training Dynamics and Neural Network Performance." Neural Networks 10, no. 5 (1997): 907–23. http://dx.doi.org/10.1016/s0893-6080(96)00119-0.
