Journal articles on the topic 'Neural networks'

Consult the top 50 journal articles for your research on the topic 'Neural networks.'

1

Navghare, Tukaram, Aniket Muley, and Vinayak Jadhav. "Siamese Neural Networks for Kinship Prediction: A Deep Convolutional Neural Network Approach." Indian Journal Of Science And Technology 17, no. 4 (2024): 352–58. http://dx.doi.org/10.17485/ijst/v17i4.3018.

2

N, Vikram. "Artificial Neural Networks." International Journal of Research Publication and Reviews 4, no. 4 (2023): 4308–9. http://dx.doi.org/10.55248/gengpi.4.423.37858.

3

Abdelwahed, O. H., and M. El-Sayed Wahed. "Optimizing Single Layer Cellular Neural Network Simulator using Simulated Annealing Technique with Neural Networks." Indian Journal of Applied Research 3, no. 6 (2013): 91–94. http://dx.doi.org/10.15373/2249555x/june2013/31.

4

Perfetti, R. "A neural network to design neural networks." IEEE Transactions on Circuits and Systems 38, no. 9 (1991): 1099–103. http://dx.doi.org/10.1109/31.83884.

5

Veselý, A. "Neural networks in data mining." Agricultural Economics (Zemědělská ekonomika) 49, no. 9 (2012): 427–31. http://dx.doi.org/10.17221/5427-agricecon.

Abstract:
Possessing relevant information is an essential condition for successful enterprise in modern business. Information can be divided into data and knowledge. How to gather, store, and retrieve data is studied in database theory. Knowledge engineering, in turn, centres on knowledge itself and studies methods for its formalization and acquisition. Knowledge can be obtained from experts, specialists in the area of interest, or it can be induced from sets of data. Automatic induction of knowledge from data sets, usually stored in large databases, is called data mining.
6

Sun, Zengguo, Guodong Zhao, Rafał Scherer, Wei Wei, and Marcin Woźniak. "Overview of Capsule Neural Networks." Journal of Internet Technology 23, no. 1 (2022): 33–44. http://dx.doi.org/10.53106/160792642022012301004.

Abstract:
As a vector transmission network structure, the capsule neural network has been one of the research hotspots in deep learning since it was proposed in 2017. In this paper, the latest research progress of capsule networks is analyzed and summarized. Firstly, we summarize the shortcomings of convolutional neural networks and introduce the basic concept of the capsule network. Secondly, we analyze and summarize the improvements in the dynamic routing mechanism and network structure of the capsule network in recent years and the combination of the capsule network with other network structures […]
7

J, Joselin, Dinesh T, and Ashiq M. "A Review on Neural Networks." International Journal of Trend in Scientific Research and Development 2, no. 6 (2018): 565–69. http://dx.doi.org/10.31142/ijtsrd18461.

8

Alle, Kailash. "Sentiment Analysis Using Neural Networks." International Journal of Science and Research (IJSR) 7, no. 12 (2018): 1604–8. http://dx.doi.org/10.21275/sr24716104045.

9

Gumen, O., I. Selina, and D. Miz. "NEURAL NETWORKS. COMPUTER VISUAL RECOGNITION." Modern problems of modeling, no. 26 (June 13, 2024): 95–99. https://doi.org/10.33842/2313125x-2024-26-95-99.

10

Ziroyan, M. A., E. A. Tusova, A. S. Hovakimian, and S. G. Sargsyan. "Neural networks apparatus in biometrics." Contemporary problems of social work 1, no. 2 (2015): 129–37. http://dx.doi.org/10.17922/2412-5466-2015-1-2-129-137.

11

Marton, Sascha, Stefan Lüdtke, and Christian Bartelt. "Explanations for Neural Networks by Neural Networks." Applied Sciences 12, no. 3 (2022): 980. http://dx.doi.org/10.3390/app12030980.

Abstract:
Understanding the function learned by a neural network is crucial in many domains, e.g., to detect a model’s adaption to concept drift in online learning. Existing global surrogate model approaches generate explanations by maximizing the fidelity between the neural network and a surrogate model on a sample-basis, which can be very time-consuming. Therefore, these approaches are not applicable in scenarios where timely or frequent explanations are required. In this paper, we introduce a real-time approach for generating a symbolic representation of the function learned by a neural network. […]
12

Wang, Jun. "Artificial neural networks versus natural neural networks." Decision Support Systems 11, no. 5 (1994): 415–29. http://dx.doi.org/10.1016/0167-9236(94)90016-7.

13

Yashchenko, V. O. "Neural-like growing networks in the development of general intelligence. Neural-like growing networks (P. II)." Mathematical machines and systems 1 (2023): 3–29. http://dx.doi.org/10.34121/1028-9763-2023-1-3-29.

Abstract:
This article is devoted to the development of artificial general intelligence (AGI) based on a new type of neural networks, "neural-like growing networks". It consists of two parts. The first part was published in no. 4, 2022, and describes an artificial neural-like element (artificial neuron) whose functionality is as close as possible to that of a biological neuron. The artificial neural-like element is the main building block of neural-like growing networks. The second part deals with the structures and functions of artificial and natural neural networks. The paper proposes a new […]
14

Tetko, Igor V. "Neural Network Studies. 4. Introduction to Associative Neural Networks." Journal of Chemical Information and Computer Sciences 42, no. 3 (2002): 717–28. http://dx.doi.org/10.1021/ci010379o.

15

Simons, Robert, and J. G. Taylor. "Neural Networks." Journal of the Operational Research Society 47, no. 4 (1996): 596. http://dx.doi.org/10.2307/3010740.

16

Schier, R. "Neural networks." Radiology 191, no. 1 (1994): 291. http://dx.doi.org/10.1148/radiology.191.1.8134593.

17

Tafti, Mohammed H. A. "Neural networks." ACM SIGMIS Database: the DATABASE for Advances in Information Systems 23, no. 1 (1992): 51–54. http://dx.doi.org/10.1145/134347.134361.

18

Turega, M. A. "Neural Networks." Computer Journal 35, no. 3 (1992): 290. http://dx.doi.org/10.1093/comjnl/35.3.290.

19

Jordan, Michael I., and Christopher M. Bishop. "Neural networks." ACM Computing Surveys 28, no. 1 (1996): 73–75. http://dx.doi.org/10.1145/234313.234348.

20

Dory, Robert A. "Neural Networks." Computers in Physics 4, no. 3 (1990): 324. http://dx.doi.org/10.1063/1.4822918.

21

Ganssle, Graham. "Neural networks." Leading Edge 37, no. 8 (2018): 616–19. http://dx.doi.org/10.1190/tle37080616.1.

Abstract:
We've all heard a proselytizing hyperbolist make the artificial-intelligence-is-going-to-steal-my-job speech. If you subscribe, look at the code in the notebook accompanying this tutorial at https://github.com/seg/tutorials-2018 . It demonstrates a small neural network. You'll find a simple system composed chiefly of multiply and add operations. That's really all that happens inside a neural network. Multiply and add. There's no magic here.
22

McGourty, Christine. "Neural networks." Nature 335, no. 6186 (1988): 103. http://dx.doi.org/10.1038/335103b0.

23

Simons, Robert. "Neural Networks." Journal of the Operational Research Society 47, no. 4 (1996): 596–97. http://dx.doi.org/10.1057/jors.1996.70.

24

Beatty, P. C. W. "Neural networks." Current Anaesthesia & Critical Care 9, no. 4 (1998): 168–73. http://dx.doi.org/10.1016/s0953-7112(98)80050-1.

25

Cutler, Adele. "Neural Networks." Technometrics 42, no. 4 (2000): 432. http://dx.doi.org/10.1080/00401706.2000.10485724.

26

Signorini, David F., Jim M. Slattery, S. R. Dodds, Victor Lane, and Peter Littlejohns. "Neural networks." Lancet 346, no. 8988 (1995): 1500–1501. http://dx.doi.org/10.1016/s0140-6736(95)92525-2.

27

Jefferson, Miles F., Neil Pendleton, Sam Lucas, Michael A. Horan, and Lionel Tarassenko. "Neural networks." Lancet 346, no. 8991-8992 (1995): 1712. http://dx.doi.org/10.1016/s0140-6736(95)92880-4.

28

Gutfreund, H. "NEURAL NETWORKS." International Journal of Modern Physics B 04, no. 06 (1990): 1223–39. http://dx.doi.org/10.1142/s0217979290000607.

29

Medoff, Deborah R., and M.-A. Tagamets. "Neural Networks." American Journal of Psychiatry 157, no. 10 (2000): 1571. http://dx.doi.org/10.1176/appi.ajp.157.10.1571.

30

Lewis, David A. "Neural Networks." American Journal of Psychiatry 157, no. 11 (2000): 1752. http://dx.doi.org/10.1176/appi.ajp.157.11.1752.

31

Graybiel, Ann M. "Neural Networks." American Journal of Psychiatry 158, no. 1 (2001): 21. http://dx.doi.org/10.1176/appi.ajp.158.1.21.

32

Tamminga, Carol A., and Henry H. Holcomb. "Neural Networks." American Journal of Psychiatry 158, no. 2 (2001): 185. http://dx.doi.org/10.1176/appi.ajp.158.2.185.

33

Schwindling, Jerome. "Neural Networks." EPJ Web of Conferences 4 (2010): 02002. http://dx.doi.org/10.1051/epjconf/20100402002.

34

Widrow, Bernard, David E. Rumelhart, and Michael A. Lehr. "Neural networks." Communications of the ACM 37, no. 3 (1994): 93–105. http://dx.doi.org/10.1145/175247.175257.

35

Kock, Gerd. "Neural networks." Microprocessing and Microprogramming 38, no. 1-5 (1993): 679. http://dx.doi.org/10.1016/0165-6074(93)90210-c.

36

Titterington, Michael. "Neural networks." WIREs Computational Statistics 2, no. 1 (2009): 1–8. http://dx.doi.org/10.1002/wics.50.

37

Garcia, R. K., K. Moreira Gandra, J. M. Block, and D. Barrera-Arellano. "Neural networks to formulate special fats." Grasas y Aceites 63, no. 3 (2012): 245–52. http://dx.doi.org/10.3989/gya.119011.

38

Veselý, A., and D. Brechlerová. "Neural networks in intrusion detection systems." Agricultural Economics (Zemědělská ekonomika) 50, No. 1 (2012): 35–40. http://dx.doi.org/10.17221/5164-agricecon.

Abstract:
Security is a very important property of an information system, especially today, when computers are interconnected via the internet. Because no system can be absolutely secure, timely and accurate detection of intrusions is necessary. For this purpose, Intrusion Detection Systems (IDS) were designed. There are two basic models of IDS: misuse IDS and anomaly IDS. Misuse systems detect intrusions by looking for activity that corresponds to known signatures of intrusions or vulnerabilities. Anomaly systems detect intrusions by searching for abnormal system activity. […]
39

Mu, Yangzi, Mengxing Huang, Chunyang Ye, and Qingzhou Wu. "Diagnosis Prediction via Recurrent Neural Networks." International Journal of Machine Learning and Computing 8, no. 2 (2018): 117–20. http://dx.doi.org/10.18178/ijmlc.2018.8.2.673.

40

Sandhiya, K., M. Vidhya, M. Shivaranjani, and S. Saranya. "Smart Fruit Classification using Neural Networks." International Journal of Trend in Scientific Research and Development 2, no. 1 (2017): 1298–303. http://dx.doi.org/10.31142/ijtsrd6986.

41

S, Pothumani, and Priya N. "Analysis of RAID in Neural Networks." Journal of Advanced Research in Dynamical and Control Systems 11, no. 0009-SPECIAL ISSUE (2019): 589–94. http://dx.doi.org/10.5373/jardcs/v11/20192609.

42

Tymoshenko, Pavlo, Yevgen Zasoba, Olexander Kovalchuk, and Olexander Pshenychnyy. "NEUROEVOLUTIONARY ALGORITHMS FOR NEURAL NETWORKS GENERATING." Herald of Khmelnytskyi National University. Technical sciences 315, no. 6(1) (2022): 240–44. http://dx.doi.org/10.31891/2307-5732-2022-315-6-240-244.

Abstract:
Solving engineering problems using conventional neural networks requires long-term research on the choice of architecture and hyperparameters. A strong artificial intelligence would be free of such shortcomings. Such research is carried out using a very wide range of approaches: for example, biological (attempts to grow a brain under laboratory conditions), hardware (creating neural processors), or software (using the power of ordinary CPUs and GPUs). The goal of this work is to develop a system that would allow using evolutionary approaches to generate neural networks suitable for solving […]
43

Kumar, G. Prem, and P. Venkataram. "Network restoration using recurrent neural networks." International Journal of Network Management 8, no. 5 (1998): 264–73. http://dx.doi.org/10.1002/(sici)1099-1190(199809/10)8:5<264::aid-nem298>3.0.co;2-o.

44

Wu, Chunmei, Junhao Hu, and Yan Li. "Robustness Analysis of Hybrid Stochastic Neural Networks with Neutral Terms and Time-Varying Delays." Discrete Dynamics in Nature and Society 2015 (2015): 1–12. http://dx.doi.org/10.1155/2015/278571.

Abstract:
We analyze the robustness of global exponential stability of hybrid stochastic neural networks subject to neutral terms and time-varying delays simultaneously. Given globally exponentially stable hybrid stochastic neural networks, we characterize the upper bounds of contraction coefficients of neutral terms and time-varying delays by using the transcendental equation. Moreover, we prove theoretically that, for any globally exponentially stable hybrid stochastic neural network, if the additive neutral terms and time-varying delays are smaller than the upper bounds obtained, then the perturbed […]
45

Medoff, Deborah, and Henry Holcomb. "Neural Networks: Neural Systems II." American Journal of Psychiatry 157, no. 8 (2000): 1212. http://dx.doi.org/10.1176/appi.ajp.157.8.1212.

46

Boesen, Tue, Eldad Haber, and Uri M. Ascher. "Neural DAEs: Constrained Neural Networks." SIAM Journal on Scientific Computing 47, no. 2 (2025): C291–C312. https://doi.org/10.1137/23m1574051.

47

Rodriguez, Nathaniel, Eduardo Izquierdo, and Yong-Yeol Ahn. "Optimal modularity and memory capacity of neural reservoirs." Network Neuroscience 3, no. 2 (2019): 551–66. http://dx.doi.org/10.1162/netn_a_00082.

Abstract:
The neural network is a powerful computing framework that has been exploited by biological evolution and by humans for solving diverse problems. Although the computational capabilities of neural networks are determined by their structure, the current understanding of the relationships between a neural network’s architecture and function is still primitive. Here we reveal that a neural network’s modular architecture plays a vital role in determining the neural dynamics and memory performance of the network of threshold neurons. In particular, we demonstrate that there exists an optimal modularity […]
48

Šourek, Gustav, Filip Železný, and Ondřej Kuželka. "Beyond graph neural networks with lifted relational neural networks." Machine Learning 110, no. 7 (2021): 1695–738. http://dx.doi.org/10.1007/s10994-021-06017-3.

49

Purushothaman, G., and N. B. Karayiannis. "Quantum neural networks (QNNs): inherently fuzzy feedforward neural networks." IEEE Transactions on Neural Networks 8, no. 3 (1997): 679–93. http://dx.doi.org/10.1109/72.572106.

50

Vidyasagar, M. "Are analog neural networks better than binary neural networks?" Circuits, Systems, and Signal Processing 17, no. 2 (1998): 243–70. http://dx.doi.org/10.1007/bf01202855.
