Academic literature on the topic 'Supervised neural networks'

Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Supervised neural networks.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever they are available in the metadata.

Journal articles on the topic "Supervised neural networks"

1

Yeh, I.-Cheng, and Kuan-Cheng Lin. "Supervised Learning Probabilistic Neural Networks." Neural Processing Letters 34, no. 2 (July 22, 2011): 193–208. http://dx.doi.org/10.1007/s11063-011-9191-z.

2

Hush, D. R., and B. G. Horne. "Progress in supervised neural networks." IEEE Signal Processing Magazine 10, no. 1 (January 1993): 8–39. http://dx.doi.org/10.1109/79.180705.

3

Tomasov, Adrian, Martin Holik, Vaclav Oujezsky, Tomas Horvath, and Petr Munster. "GPON PLOAMd Message Analysis Using Supervised Neural Networks." Applied Sciences 10, no. 22 (November 18, 2020): 8139. http://dx.doi.org/10.3390/app10228139.

Abstract:
This paper discusses the possibility of analyzing the orchestration protocol used in gigabit-capable passive optical networks (GPONs). Considering the fact that a GPON is defined by the International Telecommunication Union Telecommunication sector (ITU-T) as a set of recommendations, implementation across device vendors might exhibit few differences, which complicates analysis of such protocols. Therefore, machine learning techniques are used (e.g., neural networks) to evaluate differences in GPONs among various device vendors. As a result, this paper compares three neural network models base
4

Hammer, Barbara. "Neural Smithing – Supervised Learning in Feedforward Artificial Neural Networks." Pattern Analysis & Applications 4, no. 1 (March 2001): 73–74. http://dx.doi.org/10.1007/s100440170029.

5

Sarukkai, Ramesh R. "Supervised Networks That Self-Organize Class Outputs." Neural Computation 9, no. 3 (March 1, 1997): 637–48. http://dx.doi.org/10.1162/neco.1997.9.3.637.

Abstract:
Supervised, neural network, learning algorithms have proved very successful at solving a variety of learning problems; however, they suffer from a common problem of requiring explicit output labels. In this article, it is shown that pattern classification can be achieved, in a multilayered, feedforward, neural network, without requiring explicit output labels, by a process of supervised self-organization. The class projection is achieved by optimizing appropriate within-class uniformity and between-class discernibility criteria. The mapping function and the class labels are developed together
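As a rough illustration of the within-class uniformity and between-class discernibility criteria mentioned in the abstract above, the sketch below trains a small network so that outputs of the same class cluster together while class means move apart, without fixing explicit target vectors. It is a generic toy example in PyTorch; the data, architecture, and trade-off weight are assumptions, and it is not Sarukkai's algorithm.

```python
# Toy sketch (not the paper's method): minimize a within-class uniformity term
# while maximizing a between-class discernibility term, so the output coding
# self-organizes from class membership alone. Data and sizes are invented.
import torch

torch.manual_seed(0)
X = torch.randn(120, 10)           # toy inputs
y = torch.arange(120) % 3          # toy class membership (no target vectors)

net = torch.nn.Sequential(
    torch.nn.Linear(10, 16), torch.nn.Tanh(),
    torch.nn.Linear(16, 3), torch.nn.Tanh(),   # bounded outputs keep the loss stable
)
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

for step in range(200):
    out = net(X)
    means = torch.stack([out[y == c].mean(dim=0) for c in range(3)])
    within = torch.stack([((out[y == c] - means[c]) ** 2).sum(dim=1).mean()
                          for c in range(3)]).mean()     # within-class uniformity
    between = torch.pdist(means).mean()                  # between-class discernibility
    loss = within - 0.5 * between                        # arbitrary trade-off weight
    opt.zero_grad()
    loss.backward()
    opt.step()
# After training, each class occupies its own region of the output space,
# and those emergent regions play the role of class labels.
```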
6

Doyle, J. R. "Supervised learning in N-tuple neural networks." International Journal of Man-Machine Studies 33, no. 1 (July 1990): 21–40. http://dx.doi.org/10.1016/s0020-7373(05)80113-0.

7

Secco, Jacopo, Mauro Poggio, and Fernando Corinto. "Supervised neural networks with memristor binary synapses." International Journal of Circuit Theory and Applications 46, no. 1 (January 2018): 221–33. http://dx.doi.org/10.1002/cta.2429.

8

Sporea, Ioana, and André Grüning. "Supervised Learning in Multilayer Spiking Neural Networks." Neural Computation 25, no. 2 (February 2013): 473–509. http://dx.doi.org/10.1162/neco_a_00396.

Abstract:
We introduce a supervised learning algorithm for multilayer spiking neural networks. The algorithm overcomes a limitation of existing learning algorithms: it can be applied to neurons firing multiple spikes in artificial neural networks with hidden layers. It can also, in principle, be used with any linearizable neuron model and allows different coding schemes of spike train patterns. The algorithm is applied successfully to classic linearly nonseparable benchmarks such as the XOR problem and the Iris data set, as well as to more complex classification and mapping problems. The algorithm has b
9

Wang, Juexin, Anjun Ma, Qin Ma, Dong Xu, and Trupti Joshi. "Inductive inference of gene regulatory network using supervised and semi-supervised graph neural networks." Computational and Structural Biotechnology Journal 18 (2020): 3335–43. http://dx.doi.org/10.1016/j.csbj.2020.10.022.

10

Xu, Jianqiao, Zhaolu Zuo, Danchao Wu, Bing Li, Xiaoni Li, and Deyi Kong. "Bearing Defect Detection with Unsupervised Neural Networks." Shock and Vibration 2021 (August 19, 2021): 1–11. http://dx.doi.org/10.1155/2021/9544809.

Abstract:
Bearings always suffer from surface defects, such as scratches, black spots, and pits. Those surface defects have great effects on the quality and service life of bearings. Therefore, the defect detection of the bearing has always been the focus of the bearing quality control. Deep learning has been successfully applied to object detection due to its excellent performance. However, it is difficult to realize automatic detection of bearing surface defects based on data-driven-based deep learning due to few samples data of bearing defects on the actual production line. Sample preprocessin

Dissertations / Theses on the topic "Supervised neural networks"

1

Sporea, Ioana. "Supervised learning in multilayer spiking neural networks." Thesis, University of Surrey, 2012. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.576119.

Abstract:
In this thesis, a new supervised learning algorithm for multilayer spiking neural networks is presented. Gradient descent learning algorithms have led traditional neural networks with multiple layers to be one of the most powerful and flexible computational models derived from artificial neural networks. However, more recent experimental evidence suggests that biological neural systems use the exact time of single action potentials to encode information. These findings have led to a new way of simulating neural networks based on temporal encoding with single spikes. Analytical demonstrat
2

Graves, Alex. "Supervised sequence labelling with recurrent neural networks." 2008. http://mediatum2.ub.tum.de/doc/673554/673554.pdf.

3

Wang, Yuxuan. "Supervised Speech Separation Using Deep Neural Networks." The Ohio State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=osu1426366690.

4

Hu, Renjie. "Random neural networks for dimensionality reduction and regularized supervised learning." Diss., University of Iowa, 2019. https://ir.uiowa.edu/etd/6960.

Abstract:
This dissertation explores Random Neural Networks (RNNs) in several aspects and their applications. First, novel RNNs have been proposed for dimensionality reduction and visualization. Based on Extreme Learning Machines (ELMs) and Self-Organizing Maps (SOMs), a new method is created to identify the important variables and visualize the data. This technique reduces the curse of dimensionality and furthermore improves the interpretability of the visualization and is tested on real nursing survey datasets. ELM-SOM+ is an autoencoder created to preserve the intrinsic quality of SOM and also brings
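A side note on the Extreme Learning Machine (ELM) building block named in the abstract above: its core recipe is a random, untrained hidden layer followed by a regularized least-squares readout. The NumPy sketch below shows that generic recipe on invented toy data; it is not the dissertation's ELM-SOM+ or dimensionality-reduction method.

```python
# Minimal Extreme Learning Machine (ELM) regressor sketch.
# Toy data, hidden width, and ridge penalty are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                      # toy inputs
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)   # toy targets

n_hidden, ridge = 100, 1e-2
W = rng.normal(size=(X.shape[1], n_hidden))        # random input weights, never trained
b = rng.normal(size=n_hidden)

H = np.tanh(X @ W + b)                             # random hidden-layer features
# Readout from regularized least squares: (H^T H + ridge*I) beta = H^T y
beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ y)

y_hat = H @ beta
print("train MSE:", float(np.mean((y - y_hat) ** 2)))
```

Only the readout weights are solved for; the appeal of the approach is that no gradient-based training of the hidden layer is needed.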
5

Aylas, Victor David Sanchez. "Contributions to Supervised Learning of Real-Valued Functions Using Neural Networks." NSUWorks, 1998. http://nsuworks.nova.edu/gscis_etd/395.

Abstract:
This dissertation presents a new strategy for the automatic design of neural networks. The learning environment addressed is supervised learning from examples. Specifically, Radial Basis Functions (RBF) networks learning real-valued functions of real vectors as in non-linear regression applications are considered. The strategy is based upon the application of strong theoretical relationships between RBF networks and methods from approximation theory, robust statistics, and computational learning theory. The complexity of the network design is examined in detail from the formal definition of th
6

Tatsumi, Keiji. "Studies on supervised learning for neural networks with applications to optimization problems." 京都大学 (Kyoto University), 2006. http://hdl.handle.net/2433/136029.

7

Vančo, Timotej. "Self-supervised učení v aplikacích počítačového vidění" [Self-supervised learning in computer vision applications]. Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2021. http://www.nusl.cz/ntk/nusl-442510.

Abstract:
The aim of the diploma thesis is to make research of the self-supervised learning in computer vision applications, then to choose a suitable test task with an extensive data set, apply self-supervised methods and evaluate. The theoretical part of the work is focused on the description of methods in computer vision, a detailed description of neural and convolution networks and an extensive explanation and division of self-supervised methods. Conclusion of the theoretical part is devoted to practical applications of the Self-supervised methods in practice. The practical part of the diploma thesi
8

Charles, Eugene Yougarajah Andrew. "Supervised and unsupervised weight and delay adaptation learning in temporal coding spiking neural networks." Thesis, Cardiff University, 2006. http://orca.cf.ac.uk/56168/.

Abstract:
Artificial neural networks are learning paradigms which mimic the biological neural system. The temporal coding Spiking Neural Network, a relatively new artificial neural network paradigm, is considered to be computationally more powerful than the conventional neural network. Research on the network of spiking neurons is an emerging field and has potential for wider investigation. This research explores alternative learning models with temporal coding spiking neural networks for clustering and classification tasks. Neurons are known to be operating in two modes namely, as integrators and coinc
9

Tang, Yuxing. "Weakly supervised learning of deformable part models and convolutional neural networks for object detection." Thesis, Lyon, 2016. http://www.theses.fr/2016LYSEC062/document.

Abstract:
In this thesis, we address the problem of weakly supervised object detection. The goal is to recognize and localize objects in images when only images partially annotated at the object level are available during the training phase. To this end, we proposed two methods based on different models. For the first method, we proposed an improvement of the weakly supervised "Deformable Part-based Models" (DPM) approach, stressing the importance of the position and size of the initial root filter specific
10

Pehrson, Jakob, and Sara Lindstrand. "Support Unit Classification through Supervised Machine Learning." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-281537.

Abstract:
The purpose of this article is to evaluate the impact a supervised machine learning classification model can have on the process of internal customer support within a large digitized company. Chatbots are becoming a frequently used utility among digital services, though the true general impact is not always clear. The research is separated into the following two questions: (1) Which supervised machine learning algorithm of naïve Bayes, logistic regression, and neural networks can best predict the correct support a user needs and with what accuracy? And (2) What is the effect on the productivit
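Research question (1) above, comparing naïve Bayes, logistic regression, and a neural network for support classification, can be prototyped in a few lines with scikit-learn. The tickets, labels, and model settings below are invented for illustration and are not the thesis data or setup.

```python
# Hedged sketch: compare naive Bayes, logistic regression, and a small neural
# network on an invented toy corpus of support tickets.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import MultinomialNB
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

texts = [
    "cannot log in to the portal", "password reset link is broken",
    "invoice total looks wrong", "billing address needs updating",
    "app crashes when opening reports", "error 500 on the dashboard",
    "refund not received yet", "charged twice this month",
] * 5                                   # repeated so 5-fold cross-validation works
labels = ["access", "access", "billing", "billing",
          "technical", "technical", "billing", "billing"] * 5

models = {
    "naive_bayes": MultinomialNB(),
    "logistic_regression": LogisticRegression(max_iter=1000),
    "neural_network": MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000),
}
for name, model in models.items():
    pipe = make_pipeline(TfidfVectorizer(), model)
    scores = cross_val_score(pipe, texts, labels, cv=5, scoring="accuracy")
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```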

Books on the topic "Supervised neural networks"

1

Marks, Robert J., ed. Neural smithing: Supervised learning in feedforward artificial neural networks. Cambridge, Mass: The MIT Press, 1999.

2

Suresh, Sundaram. Supervised Learning with Complex-valued Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013.

3

Graves, Alex. Supervised Sequence Labelling with Recurrent Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012.

4

Graves, Alex. Supervised Sequence Labelling with Recurrent Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-24797-2.

5

Suresh, Sundaram, Narasimhan Sundararajan, and Ramasamy Savitha. Supervised Learning with Complex-valued Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-29491-4.

6

Singh, Surinder. Exploratory spatial data analysis using supervised neural networks. London: University of East London, 1994.

7

Supervised and unsupervised pattern recognition: Feature extraction and computational intelligence. Boca Raton, Fla: CRC Press, 2000.

8

SFI/CNLS Workshop on Formal Approaches to Supervised Learning (1992 Santa Fe, N.M.). The mathematics of generalization: The proceedings of the SFI/CNLS Workshop on Formal Approaches to Supervised Learning. Edited by David H. Wolpert. Reading, Mass: Addison-Wesley Pub. Co., 1995.

9

Leung, Wing Kai. The specification, analysis and metrics of supervised feedforward artificial neural networks for applied science and engineering applications. Birmingham: University of Central England in Birmingham, 2002.

10

Supervised Learning with Complex-valued Neural Networks. Springer, 2012.


Book chapters on the topic "Supervised neural networks"

1

Castillo, Oscar, and Patricia Melin. "Supervised Learning Neural Networks." In Soft Computing and Fractal Theory for Intelligent Manufacturing, 47–73. Heidelberg: Physica-Verlag HD, 2003. http://dx.doi.org/10.1007/978-3-7908-1766-9_4.

2

Melin, Patricia, and Oscar Castillo. "Supervised Learning Neural Networks." In Hybrid Intelligent Systems for Pattern Recognition Using Soft Computing, 55–83. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/978-3-540-32378-5_4.

3

Buscema, Massimo. "Supervised Artificial Neural Networks: Backpropagation Neural Networks." In Intelligent Data Mining in Law Enforcement Analytics, 119–35. Dordrecht: Springer Netherlands, 2012. http://dx.doi.org/10.1007/978-94-007-4914-6_7.

4

Behnke, Sven. "Supervised Learning." In Hierarchical Neural Networks for Image Interpretation, 111–26. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-45169-3_6.

5

Hvitfeldt, Emil, and Julia Silge. "Dense neural networks." In Supervised Machine Learning for Text Analysis in R, 231–72. Boca Raton: Chapman and Hall/CRC, 2021. http://dx.doi.org/10.1201/9781003093459-13.

6

Hvitfeldt, Emil, and Julia Silge. "Convolutional neural networks." In Supervised Machine Learning for Text Analysis in R, 303–42. Boca Raton: Chapman and Hall/CRC, 2021. http://dx.doi.org/10.1201/9781003093459-15.

7

Hammer, Barbara, Alexander Hasenfuss, Frank-Michael Schleif, and Thomas Villmann. "Supervised Batch Neural Gas." In Artificial Neural Networks in Pattern Recognition, 33–45. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11829898_4.

8

Brabazon, Anthony, Michael O’Neill, and Seán McGarraghy. "Neural Networks for Supervised Learning." In Natural Computing Algorithms, 221–59. Berlin, Heidelberg: Springer Berlin Heidelberg, 2015. http://dx.doi.org/10.1007/978-3-662-43631-8_13.

9

Fernández-Redondo, Mercedes, Joaquín Torres-Sospedra, and Carlos Hernández-Espinosa. "Training RBFs Networks: A Comparison Among Supervised and Not Supervised Algorithms." In Neural Information Processing, 477–86. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11893028_53.

10

Hajek, Petr, and Vladimir Olej. "Municipal Creditworthiness Modelling by Kernel-Based Approaches with Supervised and Semi-supervised Learning." In Engineering Applications of Neural Networks, 35–44. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-03969-0_4.


Conference papers on the topic "Supervised neural networks"

1

Hagiwara and Nakagawa. "Supervised learning with artificial selection." In International Joint Conference on Neural Networks. IEEE, 1989. http://dx.doi.org/10.1109/ijcnn.1989.118443.

2

Berton, Lilian, Jorge Valverde-Rebaza, and Alneu de Andrade Lopes. "Link prediction in graph construction for supervised and semi-supervised learning." In 2015 International Joint Conference on Neural Networks (IJCNN). IEEE, 2015. http://dx.doi.org/10.1109/ijcnn.2015.7280543.

3

Yin. "On asymptotic properties of supervised learning." In International Joint Conference on Neural Networks. IEEE, 1989. http://dx.doi.org/10.1109/ijcnn.1989.118540.

4

Lamba, Sahil, and Rishab Lamba. "Spiking Neural Networks Vs Convolutional Neural Networks for Supervised Learning." In 2019 International Conference on Computing, Communication, and Intelligent Systems (ICCCIS). IEEE, 2019. http://dx.doi.org/10.1109/icccis48478.2019.8974507.

5

Maggu, Jyoti, and Angshul Majumdar. "Supervised Kernel Transform Learning." In 2019 International Joint Conference on Neural Networks (IJCNN). IEEE, 2019. http://dx.doi.org/10.1109/ijcnn.2019.8852179.

6

Jordanov, Ivan, Nedyalko Petrov, and Alessio Petrozziello. "Supervised radar signal classification." In 2016 International Joint Conference on Neural Networks (IJCNN). IEEE, 2016. http://dx.doi.org/10.1109/ijcnn.2016.7727371.

7

Wang, Jim Jing-Yan, and Xin Gao. "Semi-supervised sparse coding." In 2014 International Joint Conference on Neural Networks (IJCNN). IEEE, 2014. http://dx.doi.org/10.1109/ijcnn.2014.6889449.

8

Shukla, Ankita, Gullal S. Cheema, and Saket Anand. "Semi-Supervised Clustering with Neural Networks." In 2020 IEEE Sixth International Conference on Multimedia Big Data (BigMM). IEEE, 2020. http://dx.doi.org/10.1109/bigmm50055.2020.00030.

9

Harrison, Kyle, and Amit Kumar Mishra. "Supervised Neural Networks for RFI Flagging." In 2019 RFI Workshop - Coexisting with Radio Frequency Interference (RFI). IEEE, 2019. http://dx.doi.org/10.23919/rfi48793.2019.9111748.

10

Zhan, Youqiu. "Self-supervised hamiltonian mechanics neural networks." In 2021 IEEE International Conference on Consumer Electronics and Computer Engineering (ICCECE). IEEE, 2021. http://dx.doi.org/10.1109/iccece51280.2021.9342165.


Reports on the topic "Supervised neural networks"

1

Zhang, Yunchong. Blind Denoising by Self-Supervised Neural Networks in Astronomical Datasets (Noise2Self4Astro). Office of Scientific and Technical Information (OSTI), August 2019. http://dx.doi.org/10.2172/1614728.

2

Farhi, Edward, and Hartmut Neven. Classification with Quantum Neural Networks on Near Term Processors. Web of Open Science, December 2020. http://dx.doi.org/10.37686/qrl.v1i2.80.

Abstract:
We introduce a quantum neural network, QNN, that can represent labeled data, classical or quantum, and be trained by supervised learning. The quantum circuit consists of a sequence of parameter dependent unitary transformations which acts on an input quantum state. For binary classification a single Pauli operator is measured on a designated readout qubit. The measured output is the quantum neural network’s predictor of the binary label of the input state. We show through classical simulation that parameters can be found that allow the QNN to learn to correctly distinguish the two data sets. W
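The readout described in the abstract above (a parameter-dependent circuit followed by a Pauli measurement whose sign predicts the binary label) can be imitated on a single qubit in plain NumPy. The sketch below is a toy stand-in with an assumed angle encoding and finite-difference training; it is not Farhi and Neven's multi-qubit QNN or their training procedure.

```python
# Toy single-qubit "quantum neural network" readout sketch.
# Each scalar input x is encoded as a rotation, two trainable rotations follow,
# and the sign of the Pauli-Z expectation value is the predicted binary label.
import numpy as np

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def rx(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]], dtype=complex)

def z_expectation(x, params):
    """<psi|Z|psi> for |psi> = Rx(p1) Ry(p0) Ry(x) |0>."""
    state = rx(params[1]) @ ry(params[0]) @ ry(x) @ np.array([1.0, 0.0], dtype=complex)
    return float(np.abs(state[0]) ** 2 - np.abs(state[1]) ** 2)

# Invented toy data: label +1/-1 is the sign of a scalar feature.
xs = np.linspace(-1.5, 1.5, 40)
ys = np.where(xs > 0, 1.0, -1.0)

params = np.array([0.1, 0.1])
lr, eps = 0.2, 1e-4
for _ in range(200):                       # crude finite-difference gradient descent
    grad = np.zeros_like(params)
    for i in range(len(params)):
        shift = np.zeros_like(params)
        shift[i] = eps
        loss_plus = np.mean((ys - [z_expectation(x, params + shift) for x in xs]) ** 2)
        loss_minus = np.mean((ys - [z_expectation(x, params - shift) for x in xs]) ** 2)
        grad[i] = (loss_plus - loss_minus) / (2 * eps)
    params -= lr * grad

accuracy = np.mean(np.sign([z_expectation(x, params) for x in xs]) == ys)
print("toy training accuracy:", accuracy)
```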