
Journal articles on the topic 'Multi-layer neural networks'



Consult the top 50 journal articles for your research on the topic 'Multi-layer neural networks.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1. Ban, Jung-Chao, and Chih-Hung Chang. "The layer effect on multi-layer cellular neural networks." Applied Mathematics Letters 26, no. 7 (2013): 706–709. http://dx.doi.org/10.1016/j.aml.2013.01.013.

2. Scherer, Magdalena. "Multi-layer neural networks for sales forecasting." Journal of Applied Mathematics and Computational Mechanics 17, no. 1 (2018): 61–68. http://dx.doi.org/10.17512/jamcm.2018.1.06.

3. Ban, Jung-Chao, and Chih-Hung Chang. "Hausdorff Dimension of Multi-Layer Neural Networks." Advances in Pure Mathematics 3, no. 9 (2013): 9–14. http://dx.doi.org/10.4236/apm.2013.39a1002.

4. Ban, Jung-Chao, and Chih-Hung Chang. "Diamond in multi-layer cellular neural networks." Applied Mathematics and Computation 222 (October 2013): 1–12. http://dx.doi.org/10.1016/j.amc.2013.07.010.

5. Redwan, Renas M. "Neural networks and Sigmoid Activation Function in Multi-Layer Networks." Qubahan Academic Journal 1, no. 2 (2020): 29–43. http://dx.doi.org/10.48161/qaj.v1n2a11.
Abstract: Back propagation neural networks are known for computing problems that cannot easily be computed (huge dataset analysis or training) in artificial neural networks. The main idea of this paper is to implement the XOR logic gate with ANNs, using back propagation of errors and the sigmoid activation function. The network maps a non-linear threshold gate: the non-linearity is used to classify binary inputs ( ), which pass through a hidden layer for computation ( ); after the errors are computed ( ), the weights and thetas ( ) are updated accordingly. Sigmo…

6. Yen, Gary, and Haiming Lu. "Hierarchical Genetic Algorithm for Near-Optimal Feedforward Neural Network Design." International Journal of Neural Systems 12, no. 1 (2002): 31–43. http://dx.doi.org/10.1142/s0129065702001023.
Abstract: In this paper, we propose a genetic algorithm based design procedure for a multi-layer feed-forward neural network. A hierarchical genetic algorithm is used to evolve both the neural network's topology and weighting parameters. Compared with traditional genetic algorithm based designs for neural networks, the hierarchical approach addresses several deficiencies, including a feasibility check highlighted in literature. A multi-objective cost function is used herein to optimize the performance and topology of the evolved neural network simultaneously. In the prediction of Mackey–Glass chaotic ti…

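The weight-evolution half of such a design can be illustrated with a minimal genetic algorithm that evolves the parameters of a fixed feed-forward network. This is a generic sketch, not Yen and Lu's hierarchical procedure: the 1-3-1 tanh network, the gene encoding, the toy target y = x², and all GA parameters are invented for illustration.

```python
import math
import random

random.seed(1)

# Toy regression target: y = x^2 sampled on [-1, 1]
samples = [(x / 10.0, (x / 10.0) ** 2) for x in range(-10, 11)]

def mse(genome):
    # Genome encodes a fixed 1-3-1 tanh network as 10 real genes:
    # input weights (3), hidden biases (3), output weights (3), output bias (1)
    err = 0.0
    for x, t in samples:
        h = [math.tanh(genome[i] * x + genome[3 + i]) for i in range(3)]
        y = sum(genome[6 + i] * h[i] for i in range(3)) + genome[9]
        err += (y - t) ** 2
    return err / len(samples)

def evolve(pop_size=30, generations=200):
    pop = [[random.uniform(-1, 1) for _ in range(10)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=mse)                 # rank by fitness (lower error is better)
        elite = pop[: pop_size // 3]      # truncation selection keeps the best third
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, 10)                        # one-point crossover
            child = a[:cut] + b[cut:]
            child[random.randrange(10)] += random.gauss(0, 0.2)  # point mutation
            children.append(child)
        pop = elite + children            # elitism: best genomes survive unchanged
    return min(pop, key=mse)

best = evolve()
print(round(mse(best), 4))
```

A hierarchical GA as described in the abstract would additionally carry topology genes (layer and connection choices) alongside these weight genes and penalize network size in a multi-objective cost function.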
7

TÖRÖK, LEVENTE, and TAMÁS ROSKA. "STABILITY OF MULTI-LAYER CELLULAR NEURAL/NONLINEAR NETWORKS." International Journal of Bifurcation and Chaos 14, no. 10 (2004): 3567–86. http://dx.doi.org/10.1142/s0218127404011582.

Full text
Abstract:
We have found a formalism that lets us present generalizations of several stability theorems (see Chua & Roska, 1990; Chua & Wu, 1992; Gilli, 1993; Forti, 2002] on Multi-Layer Cellular Neural/Nonlinear Networks (MLCNN) formerly claimed for Single-Layer Cellular Neural/Nonlinear Networks (CNN). The theorems were selected with special regard to usefulness in engineering applications. Hence, in contrast to many works considering stability on recurrent neural networks, the criteria of the new theorems have clear indications that are easy to verify directly on the template values. Proofs of
APA, Harvard, Vancouver, ISO, and other styles
8. Nguyen, Tan Loc, and Yonggwan Won. "Sleep snoring detection using multi-layer neural networks." Bio-Medical Materials and Engineering 26, s1 (2015): S1749–S1755. http://dx.doi.org/10.3233/bme-151475.

9. Svozil, Daniel, Vladimír Kvasnicka, and Jiří Pospíchal. "Introduction to multi-layer feed-forward neural networks." Chemometrics and Intelligent Laboratory Systems 39, no. 1 (1997): 43–62. http://dx.doi.org/10.1016/s0169-7439(97)00061-0.

10. Ban, Jung-Chao, and Chih-Hung Chang. "The learning problem of multi-layer neural networks." Neural Networks 46 (October 2013): 116–123. http://dx.doi.org/10.1016/j.neunet.2013.05.006.

11. Ban, Jung-Chao, and Chih-Hung Chang. "Realization problem of multi-layer cellular neural networks." Neural Networks 70 (October 2015): 9–17. http://dx.doi.org/10.1016/j.neunet.2015.06.003.

12. Coury, Denis V., and Mário Oleskovicz. "Multi-layer neural networks applied to distance relaying." International Journal of Electrical Power & Energy Systems 20, no. 8 (1998): 539–542. http://dx.doi.org/10.1016/s0142-0615(98)00018-0.

13. Ban, Jung-Chao, Chih-Hung Chang, Song-Sun Lin, and Yin-Heng Lin. "Spatial complexity in multi-layer cellular neural networks." Journal of Differential Equations 246, no. 2 (2009): 552–580. http://dx.doi.org/10.1016/j.jde.2008.05.004.

14. Gurevich, Pavel, and Hannes Stuke. "Gradient conjugate priors and multi-layer neural networks." Artificial Intelligence 278 (January 2020): 103184. http://dx.doi.org/10.1016/j.artint.2019.103184.

15. Subotin, M., W. Marsh, J. McMichael, J. J. Fung, and I. Dvorchik. "Performance of Multi-Layer Feedforward Neural Networks to Predict Liver Transplantation Outcome." Methods of Information in Medicine 35, no. 1 (1996): 12–18. http://dx.doi.org/10.1055/s-0038-1634637.
Abstract: A novel multisolutional clustering and quantization (MCO) algorithm has been developed that provides a flexible way to preprocess data. It was tested whether it would impact the neural network's performance favorably and whether the employment of the proposed algorithm would enable neural networks to handle missing data. This was assessed by comparing the performance of neural networks using a well-documented data set to predict outcome following liver transplantation. This new approach to data preprocessing leads to a statistically significant improvement in network performance when c…

16. Chithra, P. L., and P. Bhavani. "A Novel 3D Multi-Layer Convolutional Neural Networks for Lung Cancer Segmentation in CT Images." Indian Journal Of Science And Technology 17, no. 13 (2024): 1368–1380. http://dx.doi.org/10.17485/ijst/v17i13.2081.
Abstract: Background/Objectives: A novel three-dimensional efficient Multi-Layer Convolutional Neural Network (3D-MLCNN) is proposed for accurately detecting lung tumors in Computerized Tomography (CT) lung tumor images. The proposed K-means segmentation algorithm automatically labels the tumor regions, which are then processed by the 3D-MLCNN model to predict tiny tumors and extract tumor regions accurately. Methods: The goal of the proposed 3D-MLCNN network is to extract the tumor region in CT lung images to classify the lung tumo…

17

Mali, H. Hakem Alameady*. "CLASSIFYING POISONOUS AND EDIBLE MUSHROOMS IN THE AGARICUS AND LEPIOTA FAMILY USING MULTILAYER PERCEPTION." INTERNATIONAL JOURNAL OF ENGINEERING SCIENCES & RESEARCH TECHNOLOGY 6`, no. 1 (2017): 154–64. https://doi.org/10.5281/zenodo.233441.

Full text
Abstract:
Classification is one of the applications of feed-forward Artificial Neural Network (ANN). Classification can map data to predefined classes or groups. It is referred to as a supervised learning, because before examining data the classes are always determined. Multi-Layer Perception, is a supervised neutral networks model that is use to train and test data to build a model. In this experiment. Multi-Layer Perception is used to train the Data set to produce a model to make prediction of classifying .After preparing the Mushrooms data for training, only 8124 of dataset instances used to be train
APA, Harvard, Vancouver, ISO, and other styles
18. Liang, Lixin, Ning Li, Yihong Li, and Lin Lin. "Neural Networks with the Correlative Layer for Multi-label Classification." Journal of Physics: Conference Series 2425, no. 1 (2023): 012034. http://dx.doi.org/10.1088/1742-6596/2425/1/012034.
Abstract: Multi-label classification is a very significant but challenging task. Correlation between labels often exists, so recent works have paid much attention to using the correlation between labels to improve the classification performance. However, how to effectively learn the correlation is still a problem. In this paper, a general framework, i.e., the neural network with the correlative layer (CLNN), is proposed, where the correlative layer is used to express the dependencies between labels. Different from existing work, CLNN first trains a neural network without the correlative layer t…

19. Mühlenbein, H. "Limitations of multi-layer perceptron networks - steps towards genetic neural networks." Parallel Computing 14, no. 3 (1990): 249–260. http://dx.doi.org/10.1016/0167-8191(90)90079-o.

20. Asaad, Renas Rajab, and Rasan I. Ali. "Back Propagation Neural Network (BPNN) and Sigmoid Activation Function in Multi-Layer Networks." Academic Journal of Nawroz University 8, no. 4 (2019): 216. http://dx.doi.org/10.25007/ajnu.v8n4a464.
Abstract: Back propagation neural networks are known for computing problems that cannot easily be computed (huge dataset analysis or training) in artificial neural networks. The main idea of this paper is to implement the XOR logic gate with ANNs, using back propagation of errors and the sigmoid activation function. This neural network maps a non-linear threshold gate: the non-linearity is used to classify binary inputs (x1, x2), passing them through a hidden layer that computes coefficient and gradient errors (Cerrors, Gerrors); after computing the errors by (ei = Output_desir…

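The construction described in the abstract is the textbook one; the following is a minimal sketch of a 2-2-1 sigmoid network trained on XOR by back propagation of errors. The network size, learning rate, epoch count, and random initialization are illustrative choices, not taken from the paper.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# XOR truth table: binary inputs (x1, x2) and desired output
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

# 2-2-1 network: hidden weights w1 / biases b1, output weights w2 / bias b2
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b1 = [random.uniform(-1, 1) for _ in range(2)]
w2 = [random.uniform(-1, 1) for _ in range(2)]
b2 = random.uniform(-1, 1)

def forward(x1, x2):
    h = [sigmoid(w1[j][0] * x1 + w1[j][1] * x2 + b1[j]) for j in range(2)]
    return h, sigmoid(w2[0] * h[0] + w2[1] * h[1] + b2)

def total_error():
    # Sum of squared errors over the four XOR patterns
    return sum((t - forward(x1, x2)[1]) ** 2 for (x1, x2), t in data)

error_before = total_error()
lr = 0.5
for _ in range(10000):
    for (x1, x2), target in data:
        h, y = forward(x1, x2)
        # Output delta uses the sigmoid derivative y * (1 - y)
        dy = (target - y) * y * (1 - y)
        # Hidden deltas propagate the output delta back through w2
        dh = [dy * w2[j] * h[j] * (1 - h[j]) for j in range(2)]
        for j in range(2):
            w2[j] += lr * dy * h[j]
            w1[j][0] += lr * dh[j] * x1
            w1[j][1] += lr * dh[j] * x2
            b1[j] += lr * dh[j]
        b2 += lr * dy

error_after = total_error()
print(round(error_before, 3), round(error_after, 3))
```

Training descends on the squared error, so the total error after training is lower than before; whether the net converges fully on XOR can depend on the random initialization, since small sigmoid networks have local minima.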
21. Takahashi, Kazuhiko. "Remarks on multi layer neural networks involving chaos neurons." International Journal of Applied Electromagnetics and Mechanics 18, no. 1-3 (2003): 165–176. http://dx.doi.org/10.3233/jae-2003-283.

22. Chunmei, He. "Evolutionary Learning Algorithm for Multi-layer Morphological Neural Networks." Information Technology Journal 12, no. 4 (2013): 852–856. http://dx.doi.org/10.3923/itj.2013.852.856.

23. Ditzler, Gregory, Robi Polikar, and Gail Rosen. "Multi-Layer and Recursive Neural Networks for Metagenomic Classification." IEEE Transactions on NanoBioscience 14, no. 6 (2015): 608–616. http://dx.doi.org/10.1109/tnb.2015.2461219.

24. Oh, Sung-Kwun, and Witold Pedrycz. "Genetic optimization-driven multi-layer hybrid fuzzy neural networks." Simulation Modelling Practice and Theory 14, no. 5 (2006): 597–613. http://dx.doi.org/10.1016/j.simpat.2005.10.009.

25. Ban, Jung-Chao, Chih-Hung Chang, and Song-Sun Lin. "On the structure of multi-layer cellular neural networks." Journal of Differential Equations 252, no. 8 (2012): 4563–4597. http://dx.doi.org/10.1016/j.jde.2012.01.006.

26. Liu, Bo, Zhengtao Ding, and Chen Lv. "Distributed Training for Multi-Layer Neural Networks by Consensus." IEEE Transactions on Neural Networks and Learning Systems 31, no. 5 (2020): 1771–1778. http://dx.doi.org/10.1109/tnnls.2019.2921926.

27. Ban, Jung-Chao, and Chih-Hung Chang. "The Spatial Complexity of Inhomogeneous Multi-layer Neural Networks." Neural Processing Letters 43, no. 1 (2014): 31–47. http://dx.doi.org/10.1007/s11063-014-9400-7.

28. Sinaga, Daurat, Cahaya Jatmoko, Suprayogi Suprayogi, and Novi Hedriyanto. "Multi-Layer Convolutional Neural Networks for Batik Image Classification." Scientific Journal of Informatics 11, no. 2 (2024): 477–484. https://doi.org/10.15294/sji.v11i2.3309.
Abstract: Purpose: The purpose of this study is to enhance the classification of batik motifs through the implementation of a novel approach utilizing Multi-Layer Convolutional Neural Networks (CNN). Batik, a traditional Indonesian textile art form, boasts intricate motifs reflecting rich cultural heritage. However, the diverse designs often pose challenges in accurate classification. Leveraging advancements in deep learning, this research proposes a methodological framework employing Multi-Layer CNN to improve classification accuracy. Methods: The methodology integrates Multi-Layer CNN architecture wit…

29. Nortje, Wimpie D., Johann E. W. Holm, Gerhard P. Hancke, Imre J. Rudas, and Laszlo Horvath. "Results of Bias-variance Tests on Multi-layer Perceptron Neural Networks." Journal of Advanced Computational Intelligence and Intelligent Informatics 5, no. 5 (2001): 300–305. http://dx.doi.org/10.20965/jaciii.2001.p0300.
Abstract: Training neural networks involves selection of a set of network parameters, or weights, on account of fitting a non-linear model to data. Due to the bias in the training data and small computational errors, the neural networks' opinions are biased. Some improvement is possible when multiple networks are used to do the classification. This approach is similar to taking the average of a number of biased opinions in order to remove some of the bias that resulted from training. Bayesian networks are effective in removing some of the bias associated with training, but Bayesian techniques are tediou…

30. Ma, Qianli, Zhenxi Lin, Enhuan Chen, and Garrison Cottrell. "Temporal Pyramid Recurrent Neural Network." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 4 (2020): 5061–5068. http://dx.doi.org/10.1609/aaai.v34i04.5947.
Abstract: Learning long-term and multi-scale dependencies in sequential data is a challenging task for recurrent neural networks (RNNs). In this paper, a novel RNN structure called temporal pyramid RNN (TP-RNN) is proposed to achieve these two goals. TP-RNN is a pyramid-like structure and generally has multiple layers. In each layer of the network, there are several sub-pyramids connected by a shortcut path to the output, which can efficiently aggregate historical information from hidden states and provide many gradient feedback short-paths. This avoids back-propagating through many hidden states as in…

31. Afinogentov, A. A., Yu A. Bagdasarova, M. Yu Derevyanov, and Yu E. Pleshivtseva. "Application of Neural Networks to Assess the Resource Value of Oil-Contaminated Waste Storage Facilities." IOP Conference Series: Earth and Environmental Science 988, no. 2 (2022): 022073. http://dx.doi.org/10.1088/1755-1315/988/2/022073.
Abstract: The article presents a methodology for evaluating the efficiency of oil industry waste recycling systems using multi-layer artificial neural networks. As an indicator of the efficiency of the recycling system, the indicator of the resource value of oil-contaminated waste (OCW) is used. For training the neural networks, the data sets are formed using the resource value assessment algorithm based on the Data Envelopment Analysis (DEA) method of multi-factor evaluation of the efficiency of production systems. The development and training of the neural networks are performed using the free softwa…

32. Koyuncu, Hakan. "Determination of positioning accuracies by using fingerprint localisation and artificial neural networks." Thermal Science 23, Suppl. 1 (2019): 99–111. http://dx.doi.org/10.2298/tsci180912334k.
Abstract: Fingerprint localisation is an effective positioning technique that determines object locations indoors by using radio signal strength values. The technique is subject to large positioning errors under challenging environmental conditions. In this paper, a fingerprint localisation technique is first deployed using the classical k-nearest neighborhood method to determine unknown object locations. Additionally, several artificial neural networks are employed using the fingerprint data, such as a single-layer feed-forward neural network, a multi-layer feed-forward neural network,…

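The classical baseline described in this abstract can be sketched in a few lines: rank stored fingerprints by the distance between their received-signal-strength (RSS) vectors and the observed one, then average the positions of the k nearest. The radio map, RSS values, and k below are made up for illustration.

```python
def locate(rss, fingerprints, k=3):
    """Estimate (x, y) as the centroid of the k reference points whose
    stored RSS vectors are nearest (squared Euclidean) to the observed one."""
    ranked = sorted(
        fingerprints,
        key=lambda fp: sum((a - b) ** 2 for a, b in zip(fp[0], rss)),
    )
    nearest = ranked[:k]
    x = sum(p[1][0] for p in nearest) / k
    y = sum(p[1][1] for p in nearest) / k
    return x, y

# Hypothetical radio map: (RSS vector from 3 access points in dBm, known position in m)
radio_map = [
    ((-40, -70, -80), (0.0, 0.0)),
    ((-45, -65, -78), (1.0, 0.0)),
    ((-60, -50, -75), (3.0, 2.0)),
    ((-70, -45, -60), (5.0, 4.0)),
    ((-80, -60, -42), (6.0, 6.0)),
]
print(locate((-44, -66, -79), radio_map, k=2))  # → (0.5, 0.0)
```

The neural variants in the paper replace this lookup with a network trained to map RSS vectors directly to coordinates.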
33. Atanassov, Krassimir, Sotir Sotirov, and Tania Pencheva. "Intuitionistic Fuzzy Deep Neural Network." Mathematics 11, no. 3 (2023): 716. http://dx.doi.org/10.3390/math11030716.
Abstract: The concept of an intuitionistic fuzzy deep neural network (IFDNN) is introduced here as a demonstration of a combined use of artificial neural networks and intuitionistic fuzzy sets, aiming to benefit from the advantages of both methods. The investigation presents in a methodological way the whole process of IFDNN development, starting with the simplest form—an intuitionistic fuzzy neural network (IFNN) with one layer with a single-input neuron, passing through an IFNN with one layer with one multi-input neuron, and a further subsequent complication—an IFNN with one layer with many multi-input neurons,…

34. Le, Thai Hoang. "Applying Artificial Neural Networks for Face Recognition." Advances in Artificial Neural Systems 2011 (November 3, 2011): 1–16. http://dx.doi.org/10.1155/2011/673016.
Abstract: This paper introduces some novel models for all steps of a face recognition system. In the step of face detection, we propose a hybrid model combining AdaBoost and Artificial Neural Network (ABANN) to solve the process efficiently. In the next step, labeled faces detected by ABANN will be aligned by Active Shape Model and Multi Layer Perceptron. In this alignment step, we propose a new 2D local texture model based on Multi Layer Perceptron. The classifier of the model significantly improves the accuracy and the robustness of local searching on faces with expression variation and ambiguous cont…

35. Seilov, Shakhmaran Zh, Vadim Yu Goikhman, Yerden Zhursinbek, Mereilim N. Kassenova, Daniyar S. Shingissov, and Akhmet T. Kuzbayev. "Use of Elements of Artificial Intelligence in the Analysis of Infocommunication Traffic." T-Comm 14, no. 12 (2020): 66–71. http://dx.doi.org/10.36724/2072-8735-2020-14-12-66-71.
Abstract: Modern communication networks are based on multi-service networks, which are a single telecommunications structure that can transmit large volumes of multi-format information (voice, video, data) and provide users with a variety of information and communication services. Traffic transmitted in multiservice networks differs significantly from traditional traffic of telephone or other homogeneous networks. Knowledge of the nature of modern traffic is necessary for the successful construction, operation and development of multi-service communication networks, providing users with high-quality ser…

36. Wang, Ling, and A. D. Hope. "FAULT DIAGNOSIS: Bearing fault diagnosis using multi-layer neural networks." Insight - Non-Destructive Testing and Condition Monitoring 46, no. 8 (2004): 451–455. http://dx.doi.org/10.1784/insi.46.8.451.39377.

37. Markus, E. D., O. U. Okereke, and John T. Agee. "Predicting Telephone Traffic Congestion Using Multi Layer Feedforward Neural Networks." Advanced Materials Research 367 (October 2011): 191–198. http://dx.doi.org/10.4028/www.scientific.net/amr.367.191.
Abstract: Predicting congestion in a telephone network has become part of an efficient network planning operation. The excellent capability of neural networks (NN) to learn complex nonlinear systems makes them suitable for identifying the relationship between traffic congestion and the variables responsible for its occurrence in a time-varying traffic situation. This paper presents an artificial NN model for predicting traffic congestion in a telephone network. The design strategy uses a multilayered feedforward NN with the backpropagation algorithm to model the telephone traffic situation. Matlab was used as…

38. Park, Byoung-Jun, Keon-Jun Park, Dong-Yoon Lee, and Sung-Kwun Oh. "The Design of Genetically Optimized Multi-layer Fuzzy Neural Networks." Journal of Korean Institute of Intelligent Systems 14, no. 5 (2004): 660–665. http://dx.doi.org/10.5391/jkiis.2004.14.5.660.

39. Jiang, Minghu, and Georges Gielen. "The Effects of Quantization on Multi-Layer Feedforward Neural Networks." International Journal of Pattern Recognition and Artificial Intelligence 17, no. 4 (2003): 637–661. http://dx.doi.org/10.1142/s0218001403002514.
Abstract: In this paper we investigate the combined effect of quantization and clipping on multi-layer feedforward neural networks (MLFNN). Statistical models are used to analyze the effects of quantization in a digital implementation. We analyze the performance degradation caused as a function of the number of fixed-point and floating-point quantization bits in the MLFNN. To analyze a true nonlinear neuron, we adopt the uniform and normal probability distributions, compare the training performances with and without weight clipping, and derive in detail the effect of the quantization error on forward an…

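The combined quantization-plus-clipping effect studied in this paper can be mimicked with a toy uniform fixed-point quantizer. This is a generic sketch of the idea, not the paper's statistical models; the weight values and clipping range are invented for illustration.

```python
def quantize(w, bits, clip=1.0):
    # Symmetric uniform quantizer with clipping to [-clip, clip];
    # bits >= 2, with one bit reserved for the sign.
    levels = 2 ** (bits - 1) - 1          # number of positive quantization levels
    w = max(-clip, min(clip, w))          # clipping introduces saturation error
    step = clip / levels                  # fixed-point step size
    return round(w / step) * step         # round to the nearest grid point

weights = [0.73, -1.42, 0.05, 0.999, -0.31]
for bits in (8, 4, 2):
    q = [quantize(w, bits) for w in weights]
    err = max(abs(a - b) for a, b in zip(weights, q))
    print(bits, [round(v, 4) for v in q], round(err, 4))
```

Two error sources show up: rounding error bounded by half a step (shrinking as bits grow), and clipping error for weights outside the clip range (here -1.42), which no number of bits removes.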
40. Lee, In-Soo, Jong-Hyun Lee, Tae-Hyun Cho, Myung-Hwan Jeong, Kyung-Jin Na, and Soo-Young Ha. "Robot Welding Gun Monitoring System Using Multi-layer Neural Networks." Journal of Korean Institute of Information Technology 17, no. 8 (2019): 49–57. http://dx.doi.org/10.14801/jkiit.2019.17.8.49.

41. Neville, R. S. "Reuse of information in multi-layer sigma-pi neural networks." Connection Science 18, no. 1 (2006): 43–59. http://dx.doi.org/10.1080/09540090500132252.

42. Ban, Jung-Chao, and Chih-Hung Chang. "When are two multi-layer cellular neural networks the same?" Neural Networks 79 (July 2016): 12–19. http://dx.doi.org/10.1016/j.neunet.2016.03.005.

43. Arriaza, Oscar Velásquez, Zagaa Tumurkhuyagc, and Dong-Won Kim. "Chatter Identification using Multiple Sensors and Multi-Layer Neural Networks." Procedia Manufacturing 17 (2018): 150–157. http://dx.doi.org/10.1016/j.promfg.2018.10.030.

44. Ban, Jung-Chao, and Chih-Hung Chang. "Solution Structure of Multi-layer Neural Networks with Initial Condition." Journal of Dynamics and Differential Equations 28, no. 1 (2015): 69–92. http://dx.doi.org/10.1007/s10884-015-9471-9.

45. Jou, I. Chang, Shih-Shien You, and Long-Wen Chang. "Analysis of hidden nodes for multi-layer perceptron neural networks." Pattern Recognition 27, no. 6 (1994): 859–864. http://dx.doi.org/10.1016/0031-3203(94)90170-8.

46. Khotanzad, A., and C. Chung. "Application of multi-layer perceptron neural networks to vision problems." Neural Computing & Applications 7, no. 3 (1998): 249–259. http://dx.doi.org/10.1007/bf01414886.

47. Elgharabawy, Ayman, Mukesh Prasad, and Chin-Teng Lin. "Subgroup Preference Neural Network." Sensors 21, no. 18 (2021): 6104. http://dx.doi.org/10.3390/s21186104.
Abstract: Subgroup label ranking, which aims to rank groups of labels using a single ranking model, is a new problem faced in preference learning. This paper introduces the Subgroup Preference Neural Network (SGPNN), which combines multiple networks with different activation functions, learning rates, and output layers into one artificial neural network (ANN) to discover the hidden relation between the subgroups’ multi-labels. The SGPNN is a feedforward (FF), partially connected network that has a single middle layer and uses a stairstep (SS) multi-valued activation function to enhance the prediction’s probability an…

48. Rediniotis, O. K., and G. Chrysanthakopoulos. "Application of Neural Networks and Fuzzy Logic to the Calibration of the Seven-Hole Probe." Journal of Fluids Engineering 120, no. 1 (1998): 95–101. http://dx.doi.org/10.1115/1.2819670.
Abstract: The theory and techniques of Artificial Neural Networks (ANN) and Fuzzy Logic Systems (FLS) are applied toward the formulation of accurate and wide-range calibration methods for such flow-diagnostics instruments as multi-hole probes. Besides introducing new calibration techniques, part of the work’s objective is to: (a) apply fuzzy-logic methods to identify systems whose behavior is described in a “crisp” rather than a “linguistic” framework and (b) compare the two approaches, i.e., neural network versus fuzzy logic approach, and their potential as universal approximators. For the ANN approach…

49. Feng, Yifan, Haoxuan You, Zizhao Zhang, Rongrong Ji, and Yue Gao. "Hypergraph Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 3558–3565. http://dx.doi.org/10.1609/aaai.v33i01.33013558.
Abstract: In this paper, we present a hypergraph neural networks (HGNN) framework for data representation learning, which can encode high-order data correlation in a hypergraph structure. Confronting the challenges of learning representation for complex data in real practice, we propose to incorporate such data structure in a hypergraph, which is more flexible on data modeling, especially when dealing with complex data. In this method, a hyperedge convolution operation is designed to handle the data correlation during representation learning. In this way, the traditional hypergraph learning procedure can be…

50. Aguilar-Fuertes, Jose J., Francisco Noguero-Rodríguez, José C. Jaen Ruiz, Luis M. García-Raffi, and Sergio Hoyas. "Tracking Turbulent Coherent Structures by Means of Neural Networks." Energies 14, no. 4 (2021): 984. http://dx.doi.org/10.3390/en14040984.
Abstract: The behaviours of individual flow structures have become a relevant matter of study in turbulent flows as the computational power needed to make their study feasible has become available. In particular, high instantaneous Reynolds stress events have been found to dominate the behaviour of the logarithmic layer. In this work, we present a viability study where two machine learning solutions are proposed to reduce the computational cost of tracking such structures in large domains. The first one is a Multi-Layer Perceptron. The second one uses Long Short-Term Memory (LSTM). Both of the methods are develo…
