To view the other types of publications on this topic, follow the link: Neural networks.

Journal articles on the topic "Neural networks"

Consult the top 50 journal articles for your research on the topic "Neural networks".

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and a bibliographic reference for the selected work will be generated automatically in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scientific publication as a PDF and read its abstract online whenever the relevant data are available in the metadata.

Browse journal articles from a wide variety of disciplines and compile your bibliography correctly.

1

Navghare, Tukaram, Aniket Muley, and Vinayak Jadhav. "Siamese Neural Networks for Kinship Prediction: A Deep Convolutional Neural Network Approach." Indian Journal of Science and Technology 17, no. 4 (2024): 352–58. http://dx.doi.org/10.17485/ijst/v17i4.3018.

2

N, Vikram. "Artificial Neural Networks." International Journal of Research Publication and Reviews 4, no. 4 (2023): 4308–9. http://dx.doi.org/10.55248/gengpi.4.423.37858.

3

Abdelwahed, O. H., and M. El-Sayed Wahed. "Optimizing Single Layer Cellular Neural Network Simulator using Simulated Annealing Technique with Neural Networks." Indian Journal of Applied Research 3, no. 6 (2013): 91–94. http://dx.doi.org/10.15373/2249555x/june2013/31.

4

Perfetti, R. "A neural network to design neural networks." IEEE Transactions on Circuits and Systems 38, no. 9 (1991): 1099–103. http://dx.doi.org/10.1109/31.83884.

5

Veselý, A. "Neural networks in data mining." Agricultural Economics (Zemědělská ekonomika) 49, no. 9 (2012): 427–31. http://dx.doi.org/10.17221/5427-agricecon.

Abstract:
To possess relevant information is an inevitable condition for successful enterprising in modern business. Information could be parted to data and knowledge. How to gather, store and retrieve data is studied in database theory. In the knowledge engineering, there is in the centre of interest the knowledge and methods of its formalization and gaining are studied. Knowledge could be gained from experts, specialists in the area of interest, or it can be gained by induction from sets of data. Automatic induction of knowledge from data sets, usually stored in large databases, is called data mining.
6

Sun, Zengguo, Guodong Zhao, Rafał Scherer, Wei Wei, and Marcin Woźniak. "Overview of Capsule Neural Networks." 網際網路技術學刊 23, no. 1 (2022): 33–44. http://dx.doi.org/10.53106/160792642022012301004.

Abstract:
As a vector transmission network structure, the capsule neural network has been one of the research hotspots in deep learning since it was proposed in 2017. In this paper, the latest research progress of capsule networks is analyzed and summarized. Firstly, we summarize the shortcomings of convolutional neural networks and introduce the basic concept of capsule network. Secondly, we analyze and summarize the improvements in the dynamic routing mechanism and network structure of the capsule network in recent years and the combination of the capsule network with other network structures
7

J, Joselin, Dinesh T, and Ashiq M. "A Review on Neural Networks." International Journal of Trend in Scientific Research and Development 2, no. 6 (2018): 565–69. http://dx.doi.org/10.31142/ijtsrd18461.

8

Alle, Kailash. "Sentiment Analysis Using Neural Networks." International Journal of Science and Research (IJSR) 7, no. 12 (2018): 1604–8. http://dx.doi.org/10.21275/sr24716104045.

9

Gumen, O., I. Selina, and D. Miz. "Neural Networks. Computer Visual Recognition." Modern problems of modeling, no. 26 (June 13, 2024): 95–99. https://doi.org/10.33842/2313125x-2024-26-95-99.

10

Ziroyan, M. A., E. A. Tusova, A. S. Hovakimian, and S. G. Sargsyan. "Neural networks apparatus in biometrics." Contemporary problems of social work 1, no. 2 (2015): 129–37. http://dx.doi.org/10.17922/2412-5466-2015-1-2-129-137.

11

Marton, Sascha, Stefan Lüdtke, and Christian Bartelt. "Explanations for Neural Networks by Neural Networks." Applied Sciences 12, no. 3 (2022): 980. http://dx.doi.org/10.3390/app12030980.

Abstract:
Understanding the function learned by a neural network is crucial in many domains, e.g., to detect a model’s adaption to concept drift in online learning. Existing global surrogate model approaches generate explanations by maximizing the fidelity between the neural network and a surrogate model on a sample-basis, which can be very time-consuming. Therefore, these approaches are not applicable in scenarios where timely or frequent explanations are required. In this paper, we introduce a real-time approach for generating a symbolic representation of the function learned by a neural network. Our
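The global-surrogate idea this abstract refers to can be illustrated with a small, generic sketch (not the method proposed in the paper): fit an interpretable model to a trained network's predictions and check how faithfully it mimics the network. The dataset, model sizes, and parameters below are illustrative only.

# Generic global-surrogate sketch (illustration, not the cited approach):
# a shallow decision tree is fitted to a neural network's predictions and inspected.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = make_classification(n_samples=2000, n_features=5, random_state=0)

# Black-box model whose learned function we want to explain.
net = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0).fit(X, y)

# The surrogate is trained on the network's outputs, not the true labels,
# so fidelity to the network is what it approximates.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, net.predict(X))

print("fidelity:", (surrogate.predict(X) == net.predict(X)).mean())
print(export_text(surrogate))
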
12

Wang, Jun. "Artificial neural networks versus natural neural networks." Decision Support Systems 11, no. 5 (1994): 415–29. http://dx.doi.org/10.1016/0167-9236(94)90016-7.

13

Yashchenko, V. O. "Neural-like growing networks in the development of general intelligence. Neural-like growing networks (P. II)." Mathematical machines and systems 1 (2023): 3–29. http://dx.doi.org/10.34121/1028-9763-2023-1-3-29.

Abstract:
This article is devoted to the development of general artificial intelligence (AGI) based on a new type of neural networks – “neural-like growing networks”. It consists of two parts. The first one was published in N4, 2022, and describes an artificial neural-like element (artificial neuron) in terms of its functionality, which is as close as possible to a biological neuron. An artificial neural-like element is the main element in building neural-like growing networks. The second part deals with the structures and functions of artificial and natural neural networks. The paper proposes a new app
14

Tetko, Igor V. "Neural Network Studies. 4. Introduction to Associative Neural Networks." Journal of Chemical Information and Computer Sciences 42, no. 3 (2002): 717–28. http://dx.doi.org/10.1021/ci010379o.

15

Simons, Robert, and J. G. Taylor. "Neural Networks." Journal of the Operational Research Society 47, no. 4 (1996): 596. http://dx.doi.org/10.2307/3010740.

16

Schier, R. "Neural networks." Radiology 191, no. 1 (1994): 291. http://dx.doi.org/10.1148/radiology.191.1.8134593.

17

Tafti, Mohammed H. A. "Neural networks." ACM SIGMIS Database: the DATABASE for Advances in Information Systems 23, no. 1 (1992): 51–54. http://dx.doi.org/10.1145/134347.134361.

18

Turega, M. A. "Neural Networks." Computer Journal 35, no. 3 (1992): 290. http://dx.doi.org/10.1093/comjnl/35.3.290.

19

Jordan, Michael I., and Christopher M. Bishop. "Neural networks." ACM Computing Surveys 28, no. 1 (1996): 73–75. http://dx.doi.org/10.1145/234313.234348.

20

Dory, Robert A. "Neural Networks." Computers in Physics 4, no. 3 (1990): 324. http://dx.doi.org/10.1063/1.4822918.

21

Ganssle, Graham. "Neural networks." Leading Edge 37, no. 8 (2018): 616–19. http://dx.doi.org/10.1190/tle37080616.1.

Abstract:
We've all heard a proselytizing hyperbolist make the artificial-intelligence-is-going-to-steal-my-job speech. If you subscribe, look at the code in the notebook accompanying this tutorial at https://github.com/seg/tutorials-2018 . It demonstrates a small neural network. You'll find a simple system composed chiefly of multiply and add operations. That's really all that happens inside a neural network. Multiply and add. There's no magic here.
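The "multiply and add" point above can be made concrete with a minimal sketch (an illustration only, not the code from the linked notebook): a dense layer is nothing more than a matrix–vector product plus a bias, optionally passed through a nonlinearity.

# Minimal forward pass of a two-layer network: just multiply and add.
import numpy as np

def dense(x, W, b):
    # Weighted sum of inputs plus bias, followed by a ReLU nonlinearity.
    return np.maximum(W @ x + b, 0.0)

rng = np.random.default_rng(0)
x = rng.normal(size=3)                        # input vector
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)

hidden = dense(x, W1, b1)
output = W2 @ hidden + b2                     # output layer: again multiply and add
print(output)
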
22

McGourty, Christine. "Neural networks." Nature 335, no. 6186 (1988): 103. http://dx.doi.org/10.1038/335103b0.

23

Simons, Robert. "Neural Networks." Journal of the Operational Research Society 47, no. 4 (1996): 596–97. http://dx.doi.org/10.1057/jors.1996.70.

24

Beatty, P. C. W. "Neural networks." Current Anaesthesia & Critical Care 9, no. 4 (1998): 168–73. http://dx.doi.org/10.1016/s0953-7112(98)80050-1.

25

Cutler, Adele. "Neural Networks." Technometrics 42, no. 4 (2000): 432. http://dx.doi.org/10.1080/00401706.2000.10485724.

26

Signorini, David F., Jim M. Slattery, S. R. Dodds, Victor Lane, and Peter Littlejohns. "Neural networks." Lancet 346, no. 8988 (1995): 1500–1501. http://dx.doi.org/10.1016/s0140-6736(95)92525-2.

27

Jefferson, Miles F., Neil Pendleton, Sam Lucas, Michael A. Horan, and Lionel Tarassenko. "Neural networks." Lancet 346, no. 8991-8992 (1995): 1712. http://dx.doi.org/10.1016/s0140-6736(95)92880-4.

28

Gutfreund, H. "Neural Networks." International Journal of Modern Physics B 4, no. 6 (1990): 1223–39. http://dx.doi.org/10.1142/s0217979290000607.

29

Medoff, Deborah R., and M.-A. Tagamets. "Neural Networks." American Journal of Psychiatry 157, no. 10 (2000): 1571. http://dx.doi.org/10.1176/appi.ajp.157.10.1571.

30

Lewis, David A. "Neural Networks." American Journal of Psychiatry 157, no. 11 (2000): 1752. http://dx.doi.org/10.1176/appi.ajp.157.11.1752.

31

Graybiel, Ann M. "Neural Networks." American Journal of Psychiatry 158, no. 1 (2001): 21. http://dx.doi.org/10.1176/appi.ajp.158.1.21.

32

Tamminga, Carol A., and Henry H. Holcomb. "Neural Networks." American Journal of Psychiatry 158, no. 2 (2001): 185. http://dx.doi.org/10.1176/appi.ajp.158.2.185.

33

Schwindling, Jerome. "Neural Networks." EPJ Web of Conferences 4 (2010): 02002. http://dx.doi.org/10.1051/epjconf/20100402002.

34

Widrow, Bernard, David E. Rumelhart, and Michael A. Lehr. "Neural networks." Communications of the ACM 37, no. 3 (1994): 93–105. http://dx.doi.org/10.1145/175247.175257.

35

Kock, Gerd. "Neural networks." Microprocessing and Microprogramming 38, no. 1-5 (1993): 679. http://dx.doi.org/10.1016/0165-6074(93)90210-c.

36

Titterington, Michael. "Neural networks." WIREs Computational Statistics 2, no. 1 (2009): 1–8. http://dx.doi.org/10.1002/wics.50.

37

Garcia, R. K., K. Moreira Gandra, J. M. Block, and D. Barrera-Arellano. "Neural networks to formulate special fats." Grasas y Aceites 63, no. 3 (2012): 245–52. http://dx.doi.org/10.3989/gya.119011.

38

Veselý, A., and D. Brechlerová. "Neural networks in intrusion detection systems." Agricultural Economics (Zemědělská ekonomika) 50, no. 1 (2012): 35–40. http://dx.doi.org/10.17221/5164-agricecon.

Abstract:
Security of an information system is its very important property, especially today, when computers are interconnected via internet. Because no system can be absolutely secure, the timely and accurate detection of intrusions is necessary. For this purpose, Intrusion Detection Systems (IDS) were designed. There are two basic models of IDS: misuse IDS and anomaly IDS. Misuse systems detect intrusions by looking for activity that corresponds to the known signatures of intrusions or vulnerabilities. Anomaly systems detect intrusions by searching for an abnormal system activity. Most IDS commercial
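The two IDS models named in this abstract can be contrasted with a toy sketch (a generic illustration, not the systems analysed in the paper): the misuse detector matches known signatures, while the anomaly detector flags behaviour far from a learned baseline. The signature set and threshold below are hypothetical.

# Toy contrast between misuse detection and anomaly detection.
import statistics

KNOWN_SIGNATURES = {"../../etc/passwd", "DROP TABLE", "<script>"}  # hypothetical signatures

def misuse_alert(request):
    # Misuse IDS: flag activity matching a known attack signature.
    return any(sig in request for sig in KNOWN_SIGNATURES)

def anomaly_alert(request_len, baseline):
    # Anomaly IDS: flag activity that deviates strongly from normal behaviour,
    # here a request length more than three standard deviations from the baseline mean.
    mean, stdev = statistics.mean(baseline), statistics.pstdev(baseline)
    return abs(request_len - mean) > 3.0 * stdev

normal_lengths = [40, 45, 38, 52, 47, 41, 44, 50]
print(misuse_alert("GET /search?q=<script>alert(1)</script>"))  # True: signature match
print(anomaly_alert(400, normal_lengths))                       # True: far outside baseline
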
39

Mu, Yangzi, Mengxing Huang, Chunyang Ye, and Qingzhou Wu. "Diagnosis Prediction via Recurrent Neural Networks." International Journal of Machine Learning and Computing 8, no. 2 (2018): 117–20. http://dx.doi.org/10.18178/ijmlc.2018.8.2.673.

40

Sandhiya, K., M. Vidhya, and M. Shivaranjani S. Saranya. "Smart Fruit Classification using Neural Networks." International Journal of Trend in Scientific Research and Development 2, no. 1 (2017): 1298–303. http://dx.doi.org/10.31142/ijtsrd6986.

41

S, Pothumani, and Priya N. "Analysis of RAID in Neural Networks." Journal of Advanced Research in Dynamical and Control Systems 11, Special Issue 9 (2019): 589–94. http://dx.doi.org/10.5373/jardcs/v11/20192609.

42

Tymoshenko, Pavlo, Yevgen Zasoba, Olexander Kovalchuk, and Olexander Pshenychnyy. "Neuroevolutionary Algorithms for Neural Networks Generating." Herald of Khmelnytskyi National University. Technical sciences 315, no. 6(1) (2022): 240–44. http://dx.doi.org/10.31891/2307-5732-2022-315-6-240-244.

Abstract:
Solving engineering problems using conventional neural networks requires long-term research on the choice of architecture and hyperparameters. A strong artificial intelligence would be devoid of such shortcomings. Such research is carried out using a very wide range of approaches: for example, biological (attempts to grow a brain in laboratory conditions), hardware (creating neural processors) or software (using the power of ordinary CPUs and GPUs). The goal of the work is to develop such a system that would allow using evolutionary approaches to generate neural networks suitable for solving p
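The evolutionary generation of network architectures described above can be sketched generically (a toy under simple assumptions, not the system developed in the paper): candidate architectures are scored on held-out data and the best one is mutated to form the next generation. The dataset, layer sizes, and mutation step below are illustrative.

# Toy neuroevolution loop over hidden-layer sizes.
import random
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rng = random.Random(0)

def fitness(hidden):
    # Score an architecture by validation accuracy.
    clf = MLPClassifier(hidden_layer_sizes=hidden, max_iter=300, random_state=0)
    return clf.fit(X_tr, y_tr).score(X_te, y_te)

population = [(rng.choice([16, 32, 64]),) for _ in range(4)]   # random initial architectures
for generation in range(3):
    best = max(population, key=fitness)
    # Next generation: keep the best individual and add mutated copies of it.
    population = [best] + [tuple(max(8, h + rng.choice([-8, 8])) for h in best) for _ in range(3)]
    print("generation", generation, "best architecture", best)
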
43

Kumar, G. Prem, and P. Venkataram. "Network restoration using recurrent neural networks." International Journal of Network Management 8, no. 5 (1998): 264–73. http://dx.doi.org/10.1002/(sici)1099-1190(199809/10)8:5<264::aid-nem298>3.0.co;2-o.

44

Wu, Chunmei, Junhao Hu, and Yan Li. "Robustness Analysis of Hybrid Stochastic Neural Networks with Neutral Terms and Time-Varying Delays." Discrete Dynamics in Nature and Society 2015 (2015): 1–12. http://dx.doi.org/10.1155/2015/278571.

Abstract:
We analyze the robustness of global exponential stability of hybrid stochastic neural networks subject to neutral terms and time-varying delays simultaneously. Given globally exponentially stable hybrid stochastic neural networks, we characterize the upper bounds of contraction coefficients of neutral terms and time-varying delays by using the transcendental equation. Moreover, we prove theoretically that, for any globally exponentially stable hybrid stochastic neural networks, if additive neutral terms and time-varying delays are smaller than the upper bounds arrived, then the perturbed neura
45

Medoff, Deborah, and Henry Holcomb. "Neural Networks: Neural Systems II." American Journal of Psychiatry 157, no. 8 (2000): 1212. http://dx.doi.org/10.1176/appi.ajp.157.8.1212.

46

Boesen, Tue, Eldad Haber, and Uri M. Ascher. "Neural DAEs: Constrained Neural Networks." SIAM Journal on Scientific Computing 47, no. 2 (2025): C291–C312. https://doi.org/10.1137/23m1574051.

47

Rodriguez, Nathaniel, Eduardo Izquierdo, and Yong-Yeol Ahn. "Optimal modularity and memory capacity of neural reservoirs." Network Neuroscience 3, no. 2 (2019): 551–66. http://dx.doi.org/10.1162/netn_a_00082.

Abstract:
The neural network is a powerful computing framework that has been exploited by biological evolution and by humans for solving diverse problems. Although the computational capabilities of neural networks are determined by their structure, the current understanding of the relationships between a neural network’s architecture and function is still primitive. Here we reveal that a neural network’s modular architecture plays a vital role in determining the neural dynamics and memory performance of the network of threshold neurons. In particular, we demonstrate that there exists an optimal modulari
48

Šourek, Gustav, Filip Železný, and Ondřej Kuželka. "Beyond graph neural networks with lifted relational neural networks." Machine Learning 110, no. 7 (2021): 1695–738. http://dx.doi.org/10.1007/s10994-021-06017-3.

49

Purushothaman, G., and N. B. Karayiannis. "Quantum neural networks (QNNs): inherently fuzzy feedforward neural networks." IEEE Transactions on Neural Networks 8, no. 3 (1997): 679–93. http://dx.doi.org/10.1109/72.572106.

50

Vidyasagar, M. "Are analog neural networks better than binary neural networks?" Circuits, Systems, and Signal Processing 17, no. 2 (1998): 243–70. http://dx.doi.org/10.1007/bf01202855.
