Academic literature on the topic 'Neural network'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference papers, reports, and other scholarly sources on the topic 'Neural network.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Neural network"

1

Navghare, Tukaram, Aniket Muley, and Vinayak Jadhav. "Siamese Neural Networks for Kinship Prediction: A Deep Convolutional Neural Network Approach." Indian Journal of Science and Technology 17, no. 4 (2024): 352–58. http://dx.doi.org/10.17485/ijst/v17i4.3018.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Abdelwahed, O. H., and M. El-Sayed Wahed. "Optimizing Single Layer Cellular Neural Network Simulator using Simulated Annealing Technique with Neural Networks." Indian Journal of Applied Research 3, no. 6 (2011): 91–94. http://dx.doi.org/10.15373/2249555x/june2013/31.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Tran, Loc. "Directed Hypergraph Neural Network." Journal of Advanced Research in Dynamical and Control Systems 12, SP4 (2020): 1434–41. http://dx.doi.org/10.5373/jardcs/v12sp4/20201622.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Antipova, E. S., and S. A. Rashkovskiy. "Autoassociative Hamming Neural Network." Nelineinaya Dinamika 17, no. 2 (2021): 175–93. http://dx.doi.org/10.20537/nd210204.

Full text
Abstract:
An autoassociative neural network is suggested which is based on the calculation of Hamming distances, while the principle of its operation is similar to that of the Hopfield neural network. Using standard patterns as an example, we compare the efficiency of pattern recognition for the autoassociative Hamming network and the Hopfield network. It is shown that the autoassociative Hamming network successfully recognizes standard patterns with a degree of distortion up to 40% and more than 60%, while the Hopfield network ceases to recognize the same patterns with a degree of distortion of m
APA, Harvard, Vancouver, ISO, and other styles
5

Perfetti, R. "A neural network to design neural networks." IEEE Transactions on Circuits and Systems 38, no. 9 (1991): 1099–1103. http://dx.doi.org/10.1109/31.83884.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Sun, Zengguo, Guodong Zhao, Rafał Scherer, Wei Wei, and Marcin Woźniak. "Overview of Capsule Neural Networks." Journal of Internet Technology 23, no. 1 (2022): 33–44. http://dx.doi.org/10.53106/160792642022012301004.

Full text
Abstract:
As a vector transmission network structure, the capsule neural network has been one of the research hotspots in deep learning since it was proposed in 2017. In this paper, the latest research progress of capsule networks is analyzed and summarized. Firstly, we summarize the shortcomings of convolutional neural networks and introduce the basic concept of capsule network. Secondly, we analyze and summarize the improvements in the dynamic routing mechanism and network structure of the capsule network in recent years and the combination of the capsule network with other network structures
APA, Harvard, Vancouver, ISO, and other styles
7

D, Sreekanth. "Metro Water Fraudulent Prediction in Houses Using Convolutional Neural Network and Recurrent Neural Network." Revista Gestão Inovação e Tecnologias 11, no. 4 (2021): 1177–87. http://dx.doi.org/10.47059/revistageintec.v11i4.2177.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Mahat, Norpah, Nor Idayunie Nording, Jasmani Bidin, Suzanawati Abu Hasan, and Teoh Yeong Kin. "Artificial Neural Network (ANN) to Predict Mathematics Students’ Performance." Journal of Computing Research and Innovation 7, no. 1 (2022): 29–38. http://dx.doi.org/10.24191/jcrinn.v7i1.264.

Full text
Abstract:
Predicting students’ academic performance is very essential to produce high-quality students. The main goal is to continuously help students to increase their ability in the learning process and to help educators as well in improving their teaching skills. Therefore, this study was conducted to predict mathematics students’ performance using Artificial Neural Network (ANN). The secondary data from 382 mathematics students from UCI Machine Learning Repository Data Sets used to train the neural networks. The neural network model built using nntool. Two inputs are used which are the first and the
APA, Harvard, Vancouver, ISO, and other styles
9

Fukushima, Kunihiko. "Neocognitron: Deep Convolutional Neural Network." Journal of Japan Society for Fuzzy Theory and Intelligent Informatics 27, no. 4 (2015): 115–25. http://dx.doi.org/10.3156/jsoft.27.4_115.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

CVS, Rajesh, and Nadikoppula Pardhasaradhi. "Analysis of Artificial Neural-Network." International Journal of Trend in Scientific Research and Development 2, no. 6 (2018): 418–28. http://dx.doi.org/10.31142/ijtsrd18482.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Neural network"

1

Xu, Shuxiang. "Neuron-adaptive neural network models and applications." Thesis, University of Western Sydney, Faculty of Informatics, Science and Technology, 1999. http://handle.uws.edu.au:8081/1959.7/275.

Full text
Abstract:
Artificial Neural Networks have been widely probed by worldwide researchers to cope with the problems such as function approximation and data simulation. This thesis deals with Feed-forward Neural Networks (FNN's) with a new neuron activation function called Neuron-adaptive Activation Function (NAF), and Feed-forward Higher Order Neural Networks (HONN's) with this new neuron activation function. We have designed a new neural network model, the Neuron-Adaptive Neural Network (NANN), and mathematically proved that one NANN can approximate any piecewise continuous function to any desired accuracy
APA, Harvard, Vancouver, ISO, and other styles
2

Ellerbrock, Thomas M. "Multilayer neural networks: learnability, network generation, and network simplification." [S.l.: s.n.], 1999. http://deposit.ddb.de/cgi-bin/dokserv?idn=958467897.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Patterson, Raymond A. "Hybrid neural networks and network design." The Ohio State University, 1995. http://rave.ohiolink.edu/etdc/view.cgi?acc%5Fnum=osu1262707683.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Khliobas. "Neural Network." Thesis, Kyiv, 2018. http://er.nau.edu.ua/handle/NAU/33752.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Rastogi, Preeti. "Assessing Wireless Network Dependability Using Neural Networks." Ohio University / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1129134364.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Chambers, Mark Andrew. "Queuing network construction using artificial neural networks." The Ohio State University, 2000. http://rave.ohiolink.edu/etdc/view?acc_num=osu1488193665234291.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Dunn, Nathan A. "A Novel Neural Network Analysis Method Applied to Biological Neural Networks." Thesis, University of Oregon, 2006. http://proquest.umi.com/pqdweb?did=1251892251&sid=2&Fmt=2&clientId=11238&RQT=309&VName=PQD.

Full text
Abstract:
Thesis (Ph. D.)--University of Oregon, 2006. Typescript. Includes vita and abstract. Includes bibliographical references (leaves 122–131). Also available for download via the World Wide Web; free to University of Oregon users.
APA, Harvard, Vancouver, ISO, and other styles
8

Bruce, William, and Edvin von Otter. "Artificial Neural Network Autonomous Vehicle: Artificial Neural Network controlled vehicle." Thesis, KTH, Maskinkonstruktion (Inst.), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-191192.

Full text
Abstract:
This thesis aims to explain how an Artificial Neural Network algorithm could be used as a means of control for an Autonomous Vehicle. It describes the theory behind neural networks and Autonomous Vehicles, and how a prototype with a camera as its only input can be designed to test and evaluate the algorithm's capabilities, and also drive using it. The thesis will show that the Artificial Neural Network can, with an image resolution of 100 × 100 and a training set of 900 images, make decisions with a 0.78 confidence level.
APA, Harvard, Vancouver, ISO, and other styles
9

De Jongh, Albert. "Neural network ensembles." Thesis, Stellenbosch: Stellenbosch University, 2004. http://hdl.handle.net/10019.1/50035.

Full text
Abstract:
Thesis (MSc)--Stellenbosch University, 2004. ENGLISH ABSTRACT: It is possible to improve on the accuracy of a single neural network by using an ensemble of diverse and accurate networks. This thesis explores diversity in ensembles and looks at the underlying theory and mechanisms employed to generate and combine ensemble members. Bagging and boosting are studied in detail and I explain their success in terms of well-known theoretical instruments. An empirical evaluation of their performance is conducted and I compare them to a single classifier and to each other in terms of accuracy
APA, Harvard, Vancouver, ISO, and other styles
10

Simmen, Martin Walter. "Neural network optimization." Thesis, University of Edinburgh, 1992. http://hdl.handle.net/1842/12942.

Full text
Abstract:
Combinatorial optimization problems arise throughout science, industry, and commerce. The demonstration that analogue neural networks could, in principle, rapidly find near-optimal solutions to such problems - many of which appear computationally intractable - was important both for the novelty of the approach and because these networks are potentially implementable in parallel hardware. However, subsequent research, conducted largely on the travelling salesman problem, revealed problems regarding the original network's parameter sensitivity and tendency to give invalid states. Although this h
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Neural network"

1

De Wilde, Philippe. Neural Network Models. Springer London, 1997. http://dx.doi.org/10.1007/978-1-84628-614-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Taylor, J. G., E. R. Caianiello, R. M. J. Cotterill, and J. W. Clark, eds. Neural Network Dynamics. Springer London, 1992. http://dx.doi.org/10.1007/978-1-4471-2001-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Taylor, J. G., ed. Neural Network Applications. Springer London, 1992. http://dx.doi.org/10.1007/978-1-4471-2003-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Harvey, Robert L. Neural network principles. Prentice-Hall International, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Demuth, Howard B., and Mark H. Beale, eds. Neural network design. PWS Pub., 1996.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Sánchez-Sinencio, Edgar, and Robert W. Newcomb, eds. Neural network hardware. IEEE, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Bharath, Ramachandran. Neural network computing. Windcrest/McGraw-Hill, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Fukuda, Toshio, ed. Neural network applications. IEEE, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Shanmuganathan, Subana, and Sandhya Samarasinghe, eds. Artificial Neural Network Modelling. Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-28495-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Skrzypek, Josef, ed. Neural Network Simulation Environments. Springer US, 1994. http://dx.doi.org/10.1007/978-1-4615-2736-7.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Neural network"

1

D’Addona, Doriana Marilena. "Neural Network." In CIRP Encyclopedia of Production Engineering. Springer Berlin Heidelberg, 2016. http://dx.doi.org/10.1007/978-3-642-35950-7_6563-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

D’Addona, Doriana Marilena. "Neural Network." In CIRP Encyclopedia of Production Engineering. Springer Berlin Heidelberg, 2019. http://dx.doi.org/10.1007/978-3-662-53120-4_6563.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

D’Addona, Doriana Marilena. "Neural Network." In CIRP Encyclopedia of Production Engineering. Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-20617-7_6563.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Kim, Phil. "Neural Network." In MATLAB Deep Learning. Apress, 2017. http://dx.doi.org/10.1007/978-1-4842-2845-6_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Chityala, Ravishankar, and Sridevi Pudipeddi. "Neural Network." In Image Processing and Acquisition using Python. Chapman and Hall/CRC, 2020. http://dx.doi.org/10.1201/9780429243370-11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Weik, Martin H. "neural network." In Computer Science and Communications Dictionary. Springer US, 2000. http://dx.doi.org/10.1007/1-4020-0613-6_12300.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Burgos, José E. "Neural Network." In Encyclopedia of Animal Cognition and Behavior. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-319-47829-6_775-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Burgos, José E. "Neural Network." In Encyclopedia of Animal Cognition and Behavior. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-319-55065-7_775.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Wang, Liang, Jianxin Zhao, and Richard Mortier. "Neural Network." In Undergraduate Topics in Computer Science. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-97645-3_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Tsai, Kao-Tai. "Neural Network." In Machine Learning for Knowledge Discovery with R. Chapman and Hall/CRC, 2021. http://dx.doi.org/10.1201/9781003205685-7.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Neural network"

1

Parto, Midya, Gordon H. Y. Li, Ryoto Sekine, et al. "An Optical Neural Network Based on Nanophotonic Optical Parametric Oscillators." In CLEO: Science and Innovations. Optica Publishing Group, 2024. http://dx.doi.org/10.1364/cleo_si.2024.stu3p.7.

Full text
Abstract:
We experimentally demonstrate a recurrent optical neural network based on a nanophotonic optical parametric oscillator fabricated on thin-film lithium niobate. Our demonstration paves the way for realizing optical neural networks exhibiting ultra-low latencies.
APA, Harvard, Vancouver, ISO, and other styles
2

Reilly, Rosemarie, Xiaoshu Xu, and Jerald Jones. "Neural Network Application to Acoustic Emission Signal Processing." In CORROSION 1992. NACE International, 1992. https://doi.org/10.5006/c1992-92242.

Full text
Abstract:
Abstract Artificial neural systems, also known as neural networks, are an attempt to develop computer systems that emulate the neural reasoning behavior of biological neural systems (e.g. the human brain). As such they are loosely based on biological neural networks. The ANS consists of a series of nodes (neurons) and weighted connections (axons) that, when presented with a specific input pattern, can associate specific output patterns. It is essentially a highly complex, non-linear, mathematical relationship or transform. These constructs have two significant properties that have proven usefu
APA, Harvard, Vancouver, ISO, and other styles
3

Zheng, Shengjie, Lang Qian, Pingsheng Li, Chenggang He, Xiaoqi Qin, and Xiaojian Li. "An Introductory Review of Spiking Neural Network and Artificial Neural Network: From Biological Intelligence to Artificial Intelligence." In 8th International Conference on Artificial Intelligence (ARIN 2022). Academy and Industry Research Collaboration Center (AIRCC), 2022. http://dx.doi.org/10.5121/csit.2022.121010.

Full text
Abstract:
Stemming from the rapid development of artificial intelligence, which has gained expansive success in pattern recognition, robotics, and bioinformatics, neuroscience is also gaining tremendous progress. A kind of spiking neural network with biological interpretability is gradually receiving wide attention, and this kind of neural network is also regarded as one of the directions toward general artificial intelligence. This review summarizes the basic properties of artificial neural networks as well as spiking neural networks. Our focus is on the biological background and theoretical basis of s
APA, Harvard, Vancouver, ISO, and other styles
4

Yang, Zhun, Adam Ishay, and Joohyung Lee. "NeurASP: Embracing Neural Networks into Answer Set Programming." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence (IJCAI-PRICAI-20). International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/243.

Full text
Abstract:
We present NeurASP, a simple extension of answer set programs by embracing neural networks. By treating the neural network output as the probability distribution over atomic facts in answer set programs, NeurASP provides a simple and effective way to integrate sub-symbolic and symbolic computation. We demonstrate how NeurASP can make use of a pre-trained neural network in symbolic computation and how it can improve the neural network's perception result by applying symbolic reasoning in answer set programming. Also, NeurASP can make use of ASP rules to train a neural network better so that a n
APA, Harvard, Vancouver, ISO, and other styles
5

Huynh, Alex V., John F. Walkup, and Thomas F. Krile. "Optical perceptron-based quadratic neural network." In OSA Annual Meeting. Optica Publishing Group, 1991. http://dx.doi.org/10.1364/oam.1991.mii8.

Full text
Abstract:
Optical quadratic neural networks are currently being investigated because of their advantages over linear neural networks [1]. Based on a quadratic neuron already constructed [2], an optical quadratic neural network utilizing four-wave mixing in photorefractive barium titanate (BaTiO3) has been developed. This network implements a feedback loop using a charge-coupled device camera, two monochrome liquid crystal televisions, a computer, and various optical elements. For training, the network employs the supervised quadratic Perceptron algorithm to associate binary-valued input vectors with specified
APA, Harvard, Vancouver, ISO, and other styles
6

Shi, Weijia, Andy Shih, Adnan Darwiche, and Arthur Choi. "On Tractable Representations of Binary Neural Networks." In 17th International Conference on Principles of Knowledge Representation and Reasoning (KR-2020). International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/kr.2020/91.

Full text
Abstract:
We consider the compilation of a binary neural network’s decision function into tractable representations such as Ordered Binary Decision Diagrams (OBDDs) and Sentential Decision Diagrams (SDDs). Obtaining this function as an OBDD/SDD facilitates the explanation and formal verification of a neural network’s behavior. First, we consider the task of verifying the robustness of a neural network, and show how we can compute the expected robustness of a neural network, given an OBDD/SDD representation of it. Next, we consider a more efficient approach for compiling neural networks, based on a pseudo-
APA, Harvard, Vancouver, ISO, and other styles
7

Bian, Shaoping, Kebin Xu, and Jing Hong. "Near neighbor neurons interconnected neural network." In OSA Annual Meeting. Optica Publishing Group, 1989. http://dx.doi.org/10.1364/oam.1989.tht27.

Full text
Abstract:
When the Hopfield neural network is extended to deal with a 2-D image composed of N×N pixels, the weight interconnection is a fourth-rank tensor with N⁴ elements. Each neuron is interconnected with all other neurons of the network. For an image, N will be large. So N⁴, the number of elements of the interconnection tensor, will be so large as to make the neural network's learning time (which corresponds to the precalculation of the interconnection tensor elements) too long. It is also difficult to implement the 2-D Hopfield neural network optically.
APA, Harvard, Vancouver, ISO, and other styles
8

Huynh, Alex V., John F. Walkup, and Thomas F. Krile. "Optical quadratic perceptron neural network." In OSA Annual Meeting. Optica Publishing Group, 1990. http://dx.doi.org/10.1364/oam.1990.thy35.

Full text
Abstract:
Optical quadratic neural networks are currently being investigated because of their advantages with respect to linear neural networks [1]. A quadratic neuron has previously been implemented by using a photorefractive barium titanate crystal [2]. This approach has been improved and enhanced to realize a neural network that implements the perceptron learning algorithm. The input matrix, which is an encoded version of the input vector, is placed on a mask, and the interconnection matrix is computer-generated on a monochrome liquid-crystal television. By performing the four-wave mixing operation, the ba
APA, Harvard, Vancouver, ISO, and other styles
9

Pryor, Connor, Charles Dickens, Eriq Augustine, Alon Albalak, William Yang Wang, and Lise Getoor. "NeuPSL: Neural Probabilistic Soft Logic." In Thirty-Second International Joint Conference on Artificial Intelligence (IJCAI-23). International Joint Conferences on Artificial Intelligence Organization, 2023. http://dx.doi.org/10.24963/ijcai.2023/461.

Full text
Abstract:
In this paper, we introduce Neural Probabilistic Soft Logic (NeuPSL), a novel neuro-symbolic (NeSy) framework that unites state-of-the-art symbolic reasoning with the low-level perception of deep neural networks. To model the boundary between neural and symbolic representations, we propose a family of energy-based models, NeSy Energy-Based Models, and show that they are general enough to include NeuPSL and many other NeSy approaches. Using this framework, we show how to seamlessly integrate neural and symbolic parameter learning and inference in NeuPSL. Through an extensive empirical evaluatio
APA, Harvard, Vancouver, ISO, and other styles
10

Zhan, Tiffany. "Hyper-Parameter Tuning in Deep Neural Network Learning." In 8th International Conference on Artificial Intelligence and Applications (AI 2022). Academy and Industry Research Collaboration Center (AIRCC), 2022. http://dx.doi.org/10.5121/csit.2022.121809.

Full text
Abstract:
Deep learning has been increasingly used in various applications such as image and video recognition, recommender systems, image classification, image segmentation, medical image analysis, natural language processing, brain–computer interfaces, and financial time series. In deep learning, a convolutional neural network (CNN) is regularized versions of multilayer perceptrons. Multilayer perceptrons usually mean fully connected networks, that is, each neuron in one layer is connected to all neurons in the next layer. The full connectivity of these networks makes them prone to overfitting data. T
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Neural network"

1

Pollack, Randy B. Neural Network Technologies. Defense Technical Information Center, 1993. http://dx.doi.org/10.21236/ada262576.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Wilensky, Gregg, Narbik Manukian, Joseph Neuhaus, and Natalie Rivetti. Neural Network Studies. Defense Technical Information Center, 1993. http://dx.doi.org/10.21236/ada271593.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Tarasenko, Andrii O., Yuriy V. Yakimov, and Vladimir N. Soloviev. Convolutional neural networks for image classification. [s.n.], 2020. http://dx.doi.org/10.31812/123456789/3682.

Full text
Abstract:
This paper shows the theoretical basis for the creation of convolutional neural networks for image classification and their application in practice. To achieve the goal, the main types of neural networks were considered, starting from the structure of a simple neuron to the convolutional multilayer network necessary for the solution of this problem. It shows the stages of the structure of training data, the training cycle of the network, as well as calculations of errors in recognition at the stage of training and verification. At the end of the work the results of network training, calculatio
APA, Harvard, Vancouver, ISO, and other styles
4

Barto, Andrew. Adaptive Neural Network Architecture. Defense Technical Information Center, 1987. http://dx.doi.org/10.21236/ada190114.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

McDonnell, John R., and Don Waagen. Evolving Neural Network Architecture. Defense Technical Information Center, 1993. http://dx.doi.org/10.21236/ada264802.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

McDonnell, J. R., and D. Waagen. Evolving Neural Network Connectivity. Defense Technical Information Center, 1993. http://dx.doi.org/10.21236/ada273134.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Saavedra, Gary, and Aidan Thompson. Neural Network Interatomic Potentials. Office of Scientific and Technical Information (OSTI), 2020. http://dx.doi.org/10.2172/1678825.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Shao, Lu. Automatic Seizure Detection based on a Convolutional Neural Network-Recurrent Neural Network Model. Iowa State University, 2022. http://dx.doi.org/10.31274/cc-20240624-269.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Rhode, Mark A. Tampa Electric Neural Network Sootblowing. Office of Scientific and Technical Information (OSTI), 2004. http://dx.doi.org/10.2172/900191.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Rhode, Mark A. Tampa Electric Neural Network Sootblowing. Office of Scientific and Technical Information (OSTI), 2004. http://dx.doi.org/10.2172/900192.

Full text
APA, Harvard, Vancouver, ISO, and other styles