Academic literature on the topic 'Multi-layer neural networks'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Multi-layer neural networks.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Multi-layer neural networks"

1

Ban, Jung-Chao, and Chih-Hung Chang. "The layer effect on multi-layer cellular neural networks." Applied Mathematics Letters 26, no. 7 (2013): 706–9. http://dx.doi.org/10.1016/j.aml.2013.01.013.

2

Scherer, Magdalena. "Multi-layer neural networks for sales forecasting." Journal of Applied Mathematics and Computational Mechanics 17, no. 1 (2018): 61–68. http://dx.doi.org/10.17512/jamcm.2018.1.06.

3

Ban, Jung-Chao, and Chih-Hung Chang. "Hausdorff Dimension of Multi-Layer Neural Networks." Advances in Pure Mathematics 3, no. 9 (2013): 9–14. http://dx.doi.org/10.4236/apm.2013.39a1002.

4

Ban, Jung-Chao, and Chih-Hung Chang. "Diamond in multi-layer cellular neural networks." Applied Mathematics and Computation 222 (October 2013): 1–12. http://dx.doi.org/10.1016/j.amc.2013.07.010.

5

Redwan, Renas M. "Neural networks and Sigmoid Activation Function in Multi-Layer Networks." Qubahan Academic Journal 1, no. 2 (2020): 29–43. http://dx.doi.org/10.48161/qaj.v1n2a11.

Abstract:
Back-propagation neural networks are known for computing problems that cannot easily be computed (huge dataset analysis or training) in artificial neural networks. The main idea of this paper is to implement the XOR logic gate with ANNs, using back propagation of errors and the sigmoid activation function. These neural networks map a non-linear threshold gate: the non-linearity is used to classify binary inputs ( ), passing them through the hidden layer for computing ( ); after the errors are computed by ( ), the weights and thetas ( ) are changed according to the errors. Sigmoid …
6

Yen, Gary, and Haiming Lu. "Hierarchical Genetic Algorithm for Near-Optimal Feedforward Neural Network Design." International Journal of Neural Systems 12, no. 1 (2002): 31–43. http://dx.doi.org/10.1142/s0129065702001023.

Abstract:
In this paper, we propose a genetic algorithm based design procedure for a multi-layer feed-forward neural network. A hierarchical genetic algorithm is used to evolve both the neural network's topology and weighting parameters. Compared with traditional genetic algorithm based designs for neural networks, the hierarchical approach addresses several deficiencies, including a feasibility check highlighted in the literature. A multi-objective cost function is used herein to optimize the performance and topology of the evolved neural network simultaneously. In the prediction of the Mackey–Glass chaotic time series …
7

Török, Levente, and Tamás Roska. "Stability of Multi-Layer Cellular Neural/Nonlinear Networks." International Journal of Bifurcation and Chaos 14, no. 10 (2004): 3567–86. http://dx.doi.org/10.1142/s0218127404011582.

Abstract:
We have found a formalism that lets us present generalizations of several stability theorems (see Chua & Roska, 1990; Chua & Wu, 1992; Gilli, 1993; Forti, 2002) on Multi-Layer Cellular Neural/Nonlinear Networks (MLCNN) formerly claimed for Single-Layer Cellular Neural/Nonlinear Networks (CNN). The theorems were selected with special regard to usefulness in engineering applications. Hence, in contrast to many works considering stability on recurrent neural networks, the criteria of the new theorems have clear indications that are easy to verify directly on the template values. Proofs of …
8

Nguyen, Tan Loc, and Yonggwan Won. "Sleep snoring detection using multi-layer neural networks." Bio-Medical Materials and Engineering 26, s1 (2015): S1749–S1755. http://dx.doi.org/10.3233/bme-151475.

9

Svozil, Daniel, Vladimír Kvasnicka, and Jiří Pospíchal. "Introduction to multi-layer feed-forward neural networks." Chemometrics and Intelligent Laboratory Systems 39, no. 1 (1997): 43–62. http://dx.doi.org/10.1016/s0169-7439(97)00061-0.

10

Ban, Jung-Chao, and Chih-Hung Chang. "The learning problem of multi-layer neural networks." Neural Networks 46 (October 2013): 116–23. http://dx.doi.org/10.1016/j.neunet.2013.05.006.

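The back-propagation-with-sigmoid recipe described in the Redwan abstract above (journal entry 5) can be sketched in pure Python. This is a generic sketch, not the paper's implementation: the network shape (2 inputs, 4 hidden units, 1 output), the learning rate, and the epoch count below are illustrative assumptions.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# XOR truth table: two binary inputs, one binary target.
DATA = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
        ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]

N_HIDDEN = 4  # assumed size; the paper's exact topology is not given here
# Each hidden unit holds [w_x0, w_x1, bias]; output unit: 4 weights + bias.
w_hid = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(N_HIDDEN)]
w_out = [random.uniform(-1, 1) for _ in range(N_HIDDEN + 1)]

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_hid]
    y = sigmoid(sum(wo * ho for wo, ho in zip(w_out, h)) + w_out[-1])
    return h, y

def run_epoch(lr=0.5):
    """One pass of online gradient descent; returns the summed squared error."""
    total = 0.0
    for x, t in DATA:
        h, y = forward(x)
        total += 0.5 * (y - t) ** 2
        # Output delta: squared-error gradient through the sigmoid derivative.
        d_out = (y - t) * y * (1.0 - y)
        # Hidden deltas: error back-propagated through the output weights.
        d_hid = [d_out * w_out[j] * h[j] * (1.0 - h[j])
                 for j in range(N_HIDDEN)]
        for j in range(N_HIDDEN):       # update output weights
            w_out[j] -= lr * d_out * h[j]
        w_out[-1] -= lr * d_out         # output bias ("theta")
        for j in range(N_HIDDEN):       # update hidden weights and biases
            w_hid[j][0] -= lr * d_hid[j] * x[0]
            w_hid[j][1] -= lr * d_hid[j] * x[1]
            w_hid[j][2] -= lr * d_hid[j]
    return total

loss_start = run_epoch()
for _ in range(5000):
    loss_end = run_epoch()
print(loss_start, loss_end)  # the error should shrink over training
```

The per-sample (online) weight update mirrors the "errors change the weights and thetas" loop the abstract describes; a batch variant would accumulate the deltas before applying them.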

Dissertations / Theses on the topic "Multi-layer neural networks"

1

Cairns, Graham Andrew. "Learning with analogue VLSI multi-layer perceptrons." Thesis, University of Oxford, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.296901.

2

Ahmed, Zulfiqar. "An hybrid architecture for multi-layer feed-forward neural networks." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape8/PQDD_0009/MQ52500.pdf.

3

Tombs, Jonathan Noel. "Multi-layer neural networks and their implementation in analogue VLSI." Thesis, University of Oxford, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.334293.

4

Zheng, Gonghui. "Design and evaluation of a multi-output-layer perceptron." Thesis, University of Ulster, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.338195.

5

Setyawati, Bina R. "Multi-layer feed forward neural networks for foreign exchange time series forecasting." Morgantown, W. Va. : [West Virginia University Libraries], 2005. https://eidr.wvu.edu/etd/documentdata.eTD?documentid=4180.

Abstract:
Thesis (Ph. D.)--West Virginia University, 2005. Title from document title page. Document formatted into pages; contains xii, 185 p. : ill. (some col.). Includes abstract. Includes bibliographical references (p. 140–146).
6

Bulbuller, Gokhan. "Recognition of in-ear microphone speech data using multi-layer neural networks." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2006. http://library.nps.navy.mil/uhtbin/hyperion/06Mar%5FBulbuller.pdf.

Abstract:
Thesis (M.S. in Electrical Engineering)--Naval Postgraduate School, March 2006. "March 2006." Thesis Advisor(s): Monique P. Fargues, Ravi Vaidyanathan. Includes bibliographical references (p. 159–162). Also available online.
7

Wang, Hao. "A new scheme for training ReLU-based multi-layer feedforward neural networks." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-217384.

Abstract:
A new scheme for training Rectified Linear Unit (ReLU) based feedforward neural networks is examined in this thesis. The project starts with the row-by-row updating strategy designed for Single-hidden Layer Feedforward neural Networks (SLFNs). This strategy exploits the properties held by ReLUs and optimizes each row in the input weight matrix individually, under the common optimization scheme. Then the Direct Updating Strategy (DUS), which has two different versions: Vector-Based Method (VBM) and Matrix-Based Method (MBM), is proposed to optimize the input weight matrix as a whole. Finally, DUS …
8

Penny, William Douglas. "The storage, training and generalization properties of multi-layer logical neural networks." Thesis, Brunel University, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.331996.

9

McGarry, Kenneth J. "Rule extraction and knowledge transfer from radial basis function neural networks." Thesis, University of Sunderland, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.391744.

10

Krasniewicz, Jan A. "The application and analysis of genetic algorithms to discover topological free parameters in multi-layer perceptrons." Thesis, Birmingham City University, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.367474.

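For the ReLU-based multi-layer feedforward networks discussed in the Wang thesis above (entry 7), the basic forward pass can be sketched as follows. This is a generic illustration under assumed, hand-picked layer sizes and weights; it is not the thesis's row-by-row or DUS training scheme.

```python
def relu(x):
    # Rectified Linear Unit: max(0, x).
    return x if x > 0.0 else 0.0

def layer(inputs, weights, biases, act):
    # One fully connected layer: weighted sum plus bias, then activation.
    return [act(sum(w * i for w, i in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# Illustrative 2-2-1 network with hand-picked weights (assumptions).
x = [2.0, 1.0]
hidden = layer(x, [[1.0, -1.0], [0.5, 0.5]], [0.0, -0.5], relu)
output = layer(hidden, [[1.0, 1.0]], [0.0], lambda z: z)  # linear output unit
print(output)  # → [2.0]
```

Stacking more `layer` calls gives deeper networks; the ReLU's piecewise-linear form (zero gradient for negative pre-activations, slope one otherwise) is what the input-weight-matrix strategies in the thesis exploit.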

Books on the topic "Multi-layer neural networks"

1

Shepherd, Adrian J. Second-order methods for neural networks: Fast and reliable training methods for multi-layer perceptrons. Springer, 1997.

2

Penny, William Douglas. The storage, training and generalization properties of multi-layer logical neural networks. Brunel University, 1993.

3

Dissertation: Autonomous Construction of Multi Layer Perceptron Neural Networks. Storming Media, 1997.

4

Shepherd, Adrian J. Second-Order Methods for Neural Networks: Fast and Reliable Training Methods for Multi-Layer Perceptrons. Springer, 2014.

5

Shepherd, Adrian J. Second-Order Methods for Neural Networks: Fast and Reliable Training Methods for Multi-Layer Perceptrons. Springer London, Limited, 2012.


Book chapters on the topic "Multi-layer neural networks"

1

Davalo, Eric, and Patrick Naïm. "Multi-layer Neural Networks." In Neural Networks. Macmillan Education UK, 1991. http://dx.doi.org/10.1007/978-1-349-12312-4_3.

2

Shepherd, Adrian J. "Multi-Layer Perceptron Training." In Second-Order Methods for Neural Networks. Springer London, 1997. http://dx.doi.org/10.1007/978-1-4471-0953-2_1.

3

Lee, Tsu-Chang. "Multi-Layer Feed-Forward Networks." In Structure Level Adaptation for Artificial Neural Networks. Springer US, 1991. http://dx.doi.org/10.1007/978-1-4615-3954-4_3.

4

Widrow, Bernard, and Edward P. Katz. "Backpropagation for Multi-layer Neural Networks." In Springer Series on Bio- and Neurosystems. Springer Nature Switzerland, 2025. https://doi.org/10.1007/978-3-031-80939-2_6.

5

Yang, Shuzhong, Siwei Luo, and Jianyu Li. "Building Multi-layer Small World Neural Network." In Advances in Neural Networks - ISNN 2006. Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11759966_102.

6

Eleuteri, Antonio, Roberto Tagliaferri, and Leopoldo Milano. "Divergence Projections for Variable Selection in Multi-layer Perceptron Networks." In Neural Nets. Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-45216-4_32.

7

Suresh, Sundaram, Narasimhan Sundararajan, and Ramasamy Savitha. "Fully Complex-valued Multi Layer Perceptron Networks." In Supervised Learning with Complex-valued Neural Networks. Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-29491-4_2.

8

Pérez-Miñana, Elena, Peter Ross, and John Hallam. "Multi-layer perceptron design using Delaunay triangulations." In Fuzzy Logic, Neural Networks, and Evolutionary Computation. Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/3-540-61988-7_22.

9

Barber, David, Peter Sollich, and David Saad. "Finite Size Effects in On-Line Learning of Multi-Layer Neural Networks." In Mathematics of Neural Networks. Springer US, 1997. http://dx.doi.org/10.1007/978-1-4615-6099-9_11.

10

Sollich, Peter. "Query Learning for Maximum Information Gain in a Multi-Layer Neural Network." In Mathematics of Neural Networks. Springer US, 1997. http://dx.doi.org/10.1007/978-1-4615-6099-9_59.


Conference papers on the topic "Multi-layer neural networks"

1

Zou, Hanyi, Huifang Xu, Qingchao Kong, Yilin Cao, and Wenji Mao. "Generating Relevant Article Comments via Variational Multi-Layer Fusion." In 2024 International Joint Conference on Neural Networks (IJCNN). IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10650383.

2

Sreenivasan, Prashanth, and Yogesh H. Kulkarni. "Computing Midcurve with Multi-Layer and Convolutional Neural Networks." In 2025 International Conference on Computational, Communication and Information Technology (ICCCIT). IEEE, 2025. https://doi.org/10.1109/icccit62592.2025.10928152.

3

Ruiz, José Luis López, Ángeles Verdejo Espinosa, and Macarena Espinilla Estévez. "Federated Learning Methodology for Indoor Location Systems in Multi-Layer Architecture." In 2024 International Joint Conference on Neural Networks (IJCNN). IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10651450.

4

Han, Chaoqi, Bingcai Chen, Chanjuan Liu, et al. "SlideMLP: A Pure Multi-layer Perceptrons Method For Medical Image Segmentation." In 2024 International Joint Conference on Neural Networks (IJCNN). IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10651002.

5

Liu, Ziao, Xuyang Li, Lin Zhao, Zhongquan Gao, and Yu Zheng. "Industrial Surface Defect Detection via Multi-scale Mask Cross-layer Fusion Network." In 2024 International Joint Conference on Neural Networks (IJCNN). IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10651547.

6

Du, Ruizhong, Yidan Li, Mingyue Li, Jinjia Peng, Yuting Zhu, and Caixia Ma. "Multi-attribute Semantic Adversarial Attack Based on Cross-layer Interpolation for Face Recognition." In 2024 International Joint Conference on Neural Networks (IJCNN). IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10650828.

7

Zhang, Tianhua, and Pianran Guo. "Track Finding for Multi-Layer Silicon Pixel Detectors with GNN." In 2025 5th International Conference on Neural Networks, Information and Communication Engineering (NNICE). IEEE, 2025. https://doi.org/10.1109/nnice64954.2025.11064706.

8

Fujino, Kazushi, Keiki Takadama, and Hiroyuki Sato. "Multi-layer Cortical Learning Algorithm for Forecasting Time-series Data with Smoothly Changing Variation Patterns." In 2024 International Joint Conference on Neural Networks (IJCNN). IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10650432.

9

Sun, Mengyuan. "Music Melody Extraction Algorithm Based on Multi-Layer Perceptron Neural Network." In 2024 4th International Conference on Mobile Networks and Wireless Communications (ICMNWC). IEEE, 2024. https://doi.org/10.1109/icmnwc63764.2024.10871981.

10

Zhang, Pei, Lihua Zhou, Lizhen Wang, Hongmei Chen, and Qing Xiao. "Multi-masks and Bi-spaces Reconstruction based Single-Layer Auto-encoder for Heterogeneous Graph Representation Learning." In 2024 International Joint Conference on Neural Networks (IJCNN). IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10650586.
