Academic literature on the topic 'Neural networks'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Neural networks.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Neural networks"

1

Kaur, Amritpal, and Yogeshwar Randhawa. "Image Segmentation with Artificial Neural Networs Alongwith Updated Jseg Algorithm." IOSR Journal of Electronics and Communication Engineering 9, no. 4 (2014): 1–13. http://dx.doi.org/10.9790/2834-09420113.

2

Navghare, Tukaram, Aniket Muley, and Vinayak Jadhav. "Siamese Neural Networks for Kinship Prediction: A Deep Convolutional Neural Network Approach." Indian Journal Of Science And Technology 17, no. 4 (2024): 352–58. http://dx.doi.org/10.17485/ijst/v17i4.3018.

3

Abdelwahed, O. H., and M. El-Sayed Wahed. "Optimizing Single Layer Cellular Neural Network Simulator using Simulated Annealing Technique with Neural Networks." Indian Journal of Applied Research 3, no. 6 (2011): 91–94. http://dx.doi.org/10.15373/2249555x/june2013/31.

4

Boonsatit, Nattakan, Santhakumari Rajendran, Chee Peng Lim, Anuwat Jirawattanapanit, and Praneesh Mohandas. "New Adaptive Finite-Time Cluster Synchronization of Neutral-Type Complex-Valued Coupled Neural Networks with Mixed Time Delays." Fractal and Fractional 6, no. 9 (2022): 515. http://dx.doi.org/10.3390/fractalfract6090515.

Abstract:
The issue of adaptive finite-time cluster synchronization corresponding to neutral-type coupled complex-valued neural networks with mixed delays is examined in this research. A neutral-type coupled complex-valued neural network with mixed delays is more general than a traditional neural network, since it considers distributed delays, state delays and coupling delays. In this research, a new adaptive control technique is developed to synchronize neutral-type coupled complex-valued neural networks with mixed delays in finite time. To stabilize the resulting closed-loop system, the Lyapunov …
5

Mahat, Norpah, Nor Idayunie Nording, Jasmani Bidin, Suzanawati Abu Hasan, and Teoh Yeong Kin. "Artificial Neural Network (ANN) to Predict Mathematics Students’ Performance." Journal of Computing Research and Innovation 7, no. 1 (2022): 29–38. http://dx.doi.org/10.24191/jcrinn.v7i1.264.

Abstract:
Predicting students’ academic performance is essential to producing high-quality students. The main goal is to continuously help students increase their ability in the learning process and to help educators improve their teaching skills. Therefore, this study was conducted to predict mathematics students’ performance using an Artificial Neural Network (ANN). Secondary data from 382 mathematics students from the UCI Machine Learning Repository Data Sets were used to train the neural networks. The neural network model was built using nntool. Two inputs are used, which are the first and the …
6

Jiang, Yiming, Chenguang Yang, Shi-lu Dai, and Beibei Ren. "Deterministic learning enhanced neutral network control of unmanned helicopter." International Journal of Advanced Robotic Systems 13, no. 6 (2016): 172988141667111. http://dx.doi.org/10.1177/1729881416671118.

Abstract:
In this article, a neural network–based tracking controller is developed for an unmanned helicopter system with guaranteed global stability in the presence of uncertain system dynamics. Due to the coupling and modeling uncertainties of helicopter systems, neural network approximation techniques are employed to compensate for the unknown dynamics of each subsystem. In order to extend the semiglobal stability achieved by conventional neural control to global stability, a switching mechanism is also integrated into the control design, such that the resulting neural controller is always valid …
7

Sun, Zengguo, Guodong Zhao, Rafał Scherer, Wei Wei, and Marcin Woźniak. "Overview of Capsule Neural Networks." Journal of Internet Technology (網際網路技術學刊) 23, no. 1 (2022): 33–44. http://dx.doi.org/10.53106/160792642022012301004.

Abstract:
As a vector transmission network structure, the capsule neural network has been one of the research hotspots in deep learning since it was proposed in 2017. In this paper, the latest research progress on capsule networks is analyzed and summarized. Firstly, we summarize the shortcomings of convolutional neural networks and introduce the basic concept of the capsule network. Secondly, we analyze and summarize the improvements in the dynamic routing mechanism and network structure of the capsule network in recent years, and the combination of the capsule network with other network structures …
8

N, Vikram. "Artificial Neural Networks." International Journal of Research Publication and Reviews 4, no. 4 (2023): 4308–9. http://dx.doi.org/10.55248/gengpi.4.423.37858.

9

Murugan, S., and M. Jeyakarthic. "Optimal Deep Neural Network based Classification Model for Intrusion Detection in Mobile Adhoc Networks." Journal of Advanced Research in Dynamical and Control Systems 11, no. 10-SPECIAL ISSUE (2019): 1374–87. http://dx.doi.org/10.5373/jardcs/v11sp10/20192983.

10

Botmart, T., N. Yotha, K. Mukdasai, and W. Weera. "Improved Results on Passivity Analysis of Neutral-Type Neural Networks with Mixed Time-Varying Delays." International Journal of Information and Electronics Engineering 8, no. 3 (2018): 30–35. http://dx.doi.org/10.18178/ijiee.2018.8.3.690.


Dissertations / Theses on the topic "Neural networks"

1

Xu, Shuxiang. "Neuron-adaptive neural network models and applications." Thesis, University of Western Sydney, Faculty of Informatics, Science and Technology, 1999. http://handle.uws.edu.au:8081/1959.7/275.

Abstract:
Artificial Neural Networks have been widely probed by worldwide researchers to cope with problems such as function approximation and data simulation. This thesis deals with Feed-forward Neural Networks (FNNs) with a new neuron activation function called the Neuron-adaptive Activation Function (NAF), and Feed-forward Higher Order Neural Networks (HONNs) with this new neuron activation function. We have designed a new neural network model, the Neuron-Adaptive Neural Network (NANN), and mathematically proved that one NANN can approximate any piecewise continuous function to any desired accuracy …
2

Patterson, Raymond A. "Hybrid Neural networks and network design." Connect to resource, 1995. http://rave.ohiolink.edu/etdc/view.cgi?acc%5Fnum=osu1262707683.

3

Ellerbrock, Thomas M. "Multilayer neural networks: learnability, network generation, and network simplification." [S.l.: s.n.], 1999. http://deposit.ddb.de/cgi-bin/dokserv?idn=958467897.

4

Rastogi, Preeti. "Assessing Wireless Network Dependability Using Neural Networks." Ohio University / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1129134364.

5

Chambers, Mark Andrew. "Queuing network construction using artificial neural networks." The Ohio State University, 2000. http://rave.ohiolink.edu/etdc/view?acc_num=osu1488193665234291.

6

Dunn, Nathan A. "A Novel Neural Network Analysis Method Applied to Biological Neural Networks." Thesis, University of Oregon, 2006. http://proquest.umi.com/pqdweb?did=1251892251&sid=2&Fmt=2&clientId=11238&RQT=309&VName=PQD.

Abstract:
Thesis (Ph. D.)--University of Oregon, 2006. Typescript. Includes vita and abstract. Includes bibliographical references (leaves 122–131). Also available for download via the World Wide Web; free to University of Oregon users.
7

Viñoles, Serra Mireia. "Dynamics of Two Neuron Cellular Neural Networks." Doctoral thesis, Universitat Ramon Llull, 2011. http://hdl.handle.net/10803/9154.

Abstract:
Cellular neural networks, also known as CNNs, are a type of dynamical system that relates different elements, called neurons, via templates of parameters. The system is completely determined once the inputs to the network, the outputs, and the parameters or weights are known. In this work we carry out an exhaustive study of this type of network in the simplest case, involving only two neurons. Despite the simplicity of the system, we will see that it can exhibit very rich dynamics. First of all, we review the stability of this system from two …
8

Xu, Shuxiang. "Neuron-adaptive neural network models and applications." Thesis, [Campbelltown, N.S.W. : The Author], 1999. http://handle.uws.edu.au:8081/1959.7/275.

Abstract:
Artificial Neural Networks have been widely probed by worldwide researchers to cope with problems such as function approximation and data simulation. This thesis deals with Feed-forward Neural Networks (FNNs) with a new neuron activation function called the Neuron-adaptive Activation Function (NAF), and Feed-forward Higher Order Neural Networks (HONNs) with this new neuron activation function. We have designed a new neural network model, the Neuron-Adaptive Neural Network (NANN), and mathematically proved that one NANN can approximate any piecewise continuous function to any desired accuracy …
9

Xu, Shuxiang. "Neuron-adaptive neural network models and applications." [Campbelltown, N.S.W.: The Author], 1999. http://library.uws.edu.au/adt-NUWS/public/adt-NUWS20030702.085320/index.html.

10

Post, David L. "Network Management: Assessing Internet Network-Element Fault Status Using Neural Networks." Ohio : Ohio University, 2008. http://www.ohiolink.edu/etd/view.cgi?ohiou1220632155.


Books on the topic "Neural networks"

1

Valentin, Dominique, and Betty Edelman, eds. Neural Networks. Sage Publications, 1999.

2

Abdi, Hervé, Dominique Valentin, and Betty Edelman. Neural Networks. SAGE Publications, Inc., 1999. http://dx.doi.org/10.4135/9781412985277.

3

Davalo, Eric, and Patrick Naïm. Neural Networks. Macmillan Education UK, 1991. http://dx.doi.org/10.1007/978-1-349-12312-4.

4

Müller, Berndt, and Joachim Reinhardt. Neural Networks. Springer Berlin Heidelberg, 1990. http://dx.doi.org/10.1007/978-3-642-97239-3.

5

Rojas, Raúl. Neural Networks. Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-642-61068-4.

6

Müller, Berndt, Joachim Reinhardt, and Michael T. Strickland. Neural Networks. Springer Berlin Heidelberg, 1995. http://dx.doi.org/10.1007/978-3-642-57760-4.

7

Almeida, Luis B., and Christian J. Wellekens, eds. Neural Networks. Springer Berlin Heidelberg, 1990. http://dx.doi.org/10.1007/3-540-52255-7.

8

Taylor, John, and UNICOM Seminars, eds. Neural networks. A. Waller, 1995.

9

Lisboa, P. G. J. Techniques and applications of neural networks. E. Horwood, 1993.

10

Rojas, Raúl. Neural networks: A systematic introduction. Springer-Verlag, 1996.


Book chapters on the topic "Neural networks"

1

Gopinath, Divya, Luca Lungeanu, Ravi Mangal, Corina Păsăreanu, Siqi Xie, and Huanfeng Yu. "Feature-Guided Analysis of Neural Networks." In Fundamental Approaches to Software Engineering. Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-30826-0_7.

Abstract:
Applying standard software engineering practices to neural networks is challenging due to the lack of high-level abstractions describing a neural network’s behavior. To address this challenge, we propose to extract high-level task-specific features from the neural network's internal representation, based on monitoring the neural network's activations. The extracted feature representations can serve as a link to high-level requirements and can be leveraged to enable fundamental software engineering activities, such as automated testing, debugging, requirements analysis, and formal verification, leading to better engineering of neural networks. Using two case studies, we present initial empirical evidence demonstrating the feasibility of our ideas.
2

Zhang, Yumin, Lei Guo, Lingyao Wu, and Chunbo Feng. "On Stochastic Neutral Neural Networks." In Advances in Neural Networks — ISNN 2005. Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11427391_10.

3

Amir, Guy, Haoze Wu, Clark Barrett, and Guy Katz. "An SMT-Based Approach for Verifying Binarized Neural Networks." In Tools and Algorithms for the Construction and Analysis of Systems. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-72013-1_11.

Abstract:
Deep learning has emerged as an effective approach for creating modern software systems, with neural networks often surpassing hand-crafted systems. Unfortunately, neural networks are known to suffer from various safety and security issues. Formal verification is a promising avenue for tackling this difficulty, by formally certifying that networks are correct. We propose an SMT-based technique for verifying binarized neural networks, a popular kind of neural network where some weights have been binarized in order to render the neural network more memory and energy efficient, and quicker to evaluate. One novelty of our technique is that it allows the verification of neural networks that include both binarized and non-binarized components. Neural network verification is computationally very difficult, and so we propose here various optimizations, integrated into our SMT procedure as deduction steps, as well as an approach for parallelizing verification queries. We implement our technique as an extension to the Marabou framework, and use it to evaluate the approach on popular binarized neural network architectures.
4

Bile, Alessandro. "Introduction to Neural Networks: Biological Neural Network." In Solitonic Neural Networks. Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-48655-5_1.

5

Richards, Bryn, and Nwabueze Emekwuru. "Using Machine Learning to Predict Synthetic Fuel Spray Penetration from Limited Experimental Data Without Computational Fluid Dynamics." In Springer Proceedings in Energy. Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-30960-1_6.

Abstract:
Machine Learning (ML) is increasingly used to predict fuel spray characteristics, but ML conventionally requires large datasets for training. There is a problem of limited training data in the field of synthetic fuel sprays. One solution is to reproduce experimental results using Computational Fluid Dynamics (CFD) and then to augment or replace experimental data with more abundant CFD output data. However, this approach can obscure the relationship of the neural network to the training data by introducing new factors, such as CFD grid design, turbulence model, near-wall treatment, and the particle tracking approach. This paper argues that CFD can be eliminated as a data augmentation tool in favour of a systematic treatment of the uncertainty in the neural network training. Confidence intervals are calculated for neural network outputs, and these encompass both (1) uncertainty due to errors in experimental measurements of the neural networks’ training data and (2) uncertainty due to under-training the neural networks with limited experimental data. This approach potentially improves the usefulness of artificial neural networks for predicting the behaviour of sprays in engineering applications. Confidence intervals represent a more rigorous and trustworthy measure of the reliability of a neural network’s predictions than a conventional validation exercise. Furthermore, when data are limited, the best use of all available data is to improve the training of a neural network and our confidence in that training, rather than to reserve data for ad-hoc testing, an exercise that can at best only approximate a confidence interval.
6

Lopez, Diego Manzanas, Sung Woo Choi, Hoang-Dung Tran, and Taylor T. Johnson. "NNV 2.0: The Neural Network Verification Tool." In Computer Aided Verification. Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-37703-7_19.

Abstract:
This manuscript presents the updated version of the Neural Network Verification (NNV) tool. NNV is a formal verification software tool for deep learning models and cyber-physical systems with neural network components. NNV was first introduced as a verification framework for feedforward and convolutional neural networks, as well as for neural network control systems. Since then, numerous works have made significant improvements in the verification of new deep learning models, as well as tackling some of the scalability issues that may arise when verifying complex models. In this new version of NNV, we introduce verification support for multiple deep learning models, including neural ordinary differential equations, semantic segmentation networks and recurrent neural networks, as well as a collection of reachability methods that aim to reduce the computation cost of reachability analysis of complex neural networks. We have also added direct support for standard input verification formats in the community such as VNNLIB (verification properties) and ONNX (neural networks) formats. We present a collection of experiments in which NNV verifies safety and robustness properties of feedforward, convolutional, semantic segmentation and recurrent neural networks, as well as neural ordinary differential equations and neural network control systems. Furthermore, we demonstrate the capabilities of NNV against a commercially available product in a collection of benchmarks from control systems, semantic segmentation, image classification, and time-series data.
7

Wüthrich, Mario V., and Michael Merz. "Recurrent Neural Networks." In Springer Actuarial. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-12409-9_8.

Abstract:
This chapter considers recurrent neural (RN) networks. These are special network architectures that are useful for time-series modeling, e.g., applied to time-series forecasting. We study the most popular RN networks, which are the long short-term memory (LSTM) networks and the gated recurrent unit (GRU) networks. We apply these networks to mortality forecasting.
8

Wüthrich, Mario V., and Michael Merz. "Convolutional Neural Networks." In Springer Actuarial. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-12409-9_9.

Abstract:
This chapter considers convolutional neural (CN) networks. These are special network architectures that are useful for time-series and spatial data modeling, e.g., applied to image recognition problems. Time-series and images have a natural topology, and CN networks try to benefit from this additional structure (over tabular data). We introduce these network architectures and provide insurance-relevant examples related to telematics data and mortality forecasting.
9

Wang, Chaoming, and Yi Zeng. "Network of Recurrent Neural Networks: Design for Emergence." In Neural Information Processing. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-04179-3_8.

10

Kinugasa, Koki, and Koichiro Yamauchi. "Pruning Neural Network Parameters Using Recurrent Neural Networks." In Lecture Notes in Computer Science. Springer Nature Singapore, 2025. https://doi.org/10.1007/978-981-96-6582-2_13.


Conference papers on the topic "Neural networks"

1

Wright, Logan. "Physical Neural Networks Based on Multimode Optical Waves." In Integrated Photonics Research, Silicon and Nanophotonics. Optica Publishing Group, 2024. https://doi.org/10.1364/iprsn.2024.iw2b.4.

Abstract:
Physical neural networks provide a way to realize neural network calculations by leveraging the controllable computations physical systems natively perform. I present an example on-chip physical neural network based on arbitrarily controllable multimode wave propagation. Full-text article not available; see video presentation.
2

Reid, S., G. E. C. Bell, and G. L. Edgemon. "The Use of Skewness, Kurtosis and Neural Networks for Determining Corrosion Mechanism from Electrochemical Noise Data." In CORROSION 1998. NACE International, 1998. https://doi.org/10.5006/c1998-98176.

Abstract:
This paper describes the work undertaken to de-skill the complex procedure of determining corrosion mechanisms derived from electrochemical noise data. The use of neural networks is discussed and applied to the real-time generated electrochemical noise data files with the purpose of determining characteristics particular to individual types of corrosion mechanisms. The electrochemical noise signals can have a wide dynamic range, and various methods of raw data pre-processing prior to neural network analysis were investigated. Normalized data were ultimately used as input to the final …
3

Alekseev, V. I., A. N. Bobryshev, and I. V. Bryksin. "Neural Networs technologies and theirs application in oil prospecting seismology." In Geophysics of the 21st Century - The Leap into the Future. European Association of Geoscientists & Engineers, 2003. http://dx.doi.org/10.3997/2214-4609-pdb.38.f304.

4

Zouaoui, Rania, and Hassen Mekki. "2D visual servoïng of wheeled mobile robot by neural networs." In 2013 International Conference on Individual and Collective Behaviors in Robotics (ICBR). IEEE, 2013. http://dx.doi.org/10.1109/icbr.2013.6729263.

5

Yang, Zhun, Adam Ishay, and Joohyung Lee. "NeurASP: Embracing Neural Networks into Answer Set Programming." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/243.

Abstract:
We present NeurASP, a simple extension of answer set programs by embracing neural networks. By treating the neural network output as the probability distribution over atomic facts in answer set programs, NeurASP provides a simple and effective way to integrate sub-symbolic and symbolic computation. We demonstrate how NeurASP can make use of a pre-trained neural network in symbolic computation and how it can improve the neural network's perception result by applying symbolic reasoning in answer set programming. Also, NeurASP can make use of ASP rules to train a neural network better, so that a …
6

Shi, Weijia, Andy Shih, Adnan Darwiche, and Arthur Choi. "On Tractable Representations of Binary Neural Networks." In 17th International Conference on Principles of Knowledge Representation and Reasoning {KR-2020}. International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/kr.2020/91.

Abstract:
We consider the compilation of a binary neural network’s decision function into tractable representations such as Ordered Binary Decision Diagrams (OBDDs) and Sentential Decision Diagrams (SDDs). Obtaining this function as an OBDD/SDD facilitates the explanation and formal verification of a neural network’s behavior. First, we consider the task of verifying the robustness of a neural network, and show how we can compute the expected robustness of a neural network, given an OBDD/SDD representation of it. Next, we consider a more efficient approach for compiling neural networks, based on a pseudo-…
7

Pryor, Connor, Charles Dickens, Eriq Augustine, Alon Albalak, William Yang Wang, and Lise Getoor. "NeuPSL: Neural Probabilistic Soft Logic." In Thirty-Second International Joint Conference on Artificial Intelligence {IJCAI-23}. International Joint Conferences on Artificial Intelligence Organization, 2023. http://dx.doi.org/10.24963/ijcai.2023/461.

Abstract:
In this paper, we introduce Neural Probabilistic Soft Logic (NeuPSL), a novel neuro-symbolic (NeSy) framework that unites state-of-the-art symbolic reasoning with the low-level perception of deep neural networks. To model the boundary between neural and symbolic representations, we propose a family of energy-based models, NeSy Energy-Based Models, and show that they are general enough to include NeuPSL and many other NeSy approaches. Using this framework, we show how to seamlessly integrate neural and symbolic parameter learning and inference in NeuPSL. Through an extensive empirical evaluation …
8

Farhat, Nabil H., and Mostafa Eldefrawy. "The bifurcating neuron." In OSA Annual Meeting. Optica Publishing Group, 1991. http://dx.doi.org/10.1364/oam.1991.mk3.

Abstract:
Present neural network models ignore temporal considerations, and hence synchronicity in neural networks, by representing neuron response with a transfer function relating frequency of action potentials (firing frequency) to activation potential. Models of living neurons based on the Hodgkin-Huxley model of the excitable membrane of the squid’s axon and its Fitzhugh-Nagumo approximation exhibit much more complex and rich behavior than that described by firing frequency-activation potential models. We describe the theory, operation, and properties of an integrate-and-fire neuron which we call the bifurcating neuron.
9

Ozcan, Neyir. "New results for global stability of neutral-type delayed neural networks." In The 11th International Conference on Integrated Modeling and Analysis in Applied Control and Automation. CAL-TEK srl, 2018. http://dx.doi.org/10.46354/i3m.2018.imaaca.004.

Abstract:
This paper deals with the stability analysis of the class of neutral-type neural networks with constant time delay. By using a suitable Lyapunov functional, some delay-independent sufficient conditions are derived, which ensure the global asymptotic stability of the equilibrium point for this class of neutral-type neural networks with time delays with respect to the Lipschitz activation functions. The presented stability results rely on checking certain properties of matrices. Therefore, it is easy to verify the validity of the constraint conditions on the network parameters of …
10

Zheng, Shengjie, Lang Qian, Pingsheng Li, Chenggang He, Xiaoqi Qin, and Xiaojian Li. "An Introductory Review of Spiking Neural Network and Artificial Neural Network: From Biological Intelligence to Artificial Intelligence." In 8th International Conference on Artificial Intelligence (ARIN 2022). Academy and Industry Research Collaboration Center (AIRCC), 2022. http://dx.doi.org/10.5121/csit.2022.121010.

Abstract:
Stemming from the rapid development of artificial intelligence, which has gained expansive success in pattern recognition, robotics, and bioinformatics, neuroscience is also making tremendous progress. A kind of spiking neural network with biological interpretability is gradually receiving wide attention, and this kind of neural network is also regarded as one of the directions toward general artificial intelligence. This review summarizes the basic properties of artificial neural networks as well as spiking neural networks. Our focus is on the biological background and theoretical basis of …

Reports on the topic "Neural networks"

1

Tarasenko, Andrii O., Yuriy V. Yakimov, and Vladimir N. Soloviev. Convolutional neural networks for image classification. [s.n.], 2020. http://dx.doi.org/10.31812/123456789/3682.

Abstract:
This paper shows the theoretical basis for the creation of convolutional neural networks for image classification and their application in practice. To achieve the goal, the main types of neural networks were considered, starting from the structure of a simple neuron up to the convolutional multilayer network necessary for the solution of this problem. It shows the stages of the structure of training data, the training cycle of the network, as well as calculations of errors in recognition at the stages of training and verification. At the end of the work, the results of network training, calculations …
2

Markova, Oksana, Serhiy Semerikov, and Maiia Popel. CoCalc as a Learning Tool for Neural Network Simulation in the Special Course “Foundations of Mathematic Informatics”. Sun SITE Central Europe, 2018. http://dx.doi.org/10.31812/0564/2250.

Abstract:
The role of neural network modeling in the learning content of the special course “Foundations of Mathematic Informatics” was discussed. The course was developed for students of technical universities, future IT specialists, and directed at bridging the gap between theoretical computer science and its applied applications: software, system and computing engineering. CoCalc was justified as a learning tool of mathematical informatics in general and of neural network modeling in particular. The elements of a technique for using CoCalc when studying the topic “Neural network and pattern recognition” of the special course …
3

Sgurev, Vassil. Artificial Neural Networks as a Network Flow with Capacities. "Prof. Marin Drinov" Publishing House of Bulgarian Academy of Sciences, 2018. http://dx.doi.org/10.7546/crabs.2018.09.12.

4

Smith, Patrick I. Neural Networks. Office of Scientific and Technical Information (OSTI), 2003. http://dx.doi.org/10.2172/815740.

5

Johnson, John L., and C. C. Sung. Neural Networks. Defense Technical Information Center, 1990. http://dx.doi.org/10.21236/ada222110.

6

Levesque, Joseph. Neural network denoising of HED x-ray images, with an introduction to neural networks. Office of Scientific and Technical Information (OSTI), 2023. http://dx.doi.org/10.2172/1970268.

7

Cárdenas-Cárdenas, Julián Alonso, Deicy J. Cristiano-Botia, and Nicolás Martínez-Cortés. Colombian inflation forecast using Long Short-Term Memory approach. Banco de la República, 2023. http://dx.doi.org/10.32468/be.1241.

Abstract:
We use Long Short-Term Memory (LSTM) neural networks, a deep learning technique, to forecast Colombian headline inflation one year ahead through two approaches. The first one uses only information from the target variable, while the second one incorporates additional information from some relevant variables. We employ sample rolling in the traditional neural network construction process, selecting the hyperparameters with criteria for minimizing the forecast error. Our results show a better forecasting capacity of the network with information from additional variables, surpassing both the other …
8

Semerikov, Serhiy O., Illia O. Teplytskyi, Yuliia V. Yechkalo, and Arnold E. Kiv. Computer Simulation of Neural Networks Using Spreadsheets: The Dawn of the Age of Camelot. [s.n.], 2018. http://dx.doi.org/10.31812/123456789/2648.

Abstract:
The article substantiates the necessity to develop training methods for computer simulation of neural networks in the spreadsheet environment. A systematic review of their application to simulating artificial neural networks is performed. The authors distinguish basic approaches to solving the problem of network computer simulation training in the spreadsheet environment: joint application of spreadsheets and tools of neural network simulation; application of third-party add-ins to spreadsheets; development of macros using the embedded languages of spreadsheets; and use of standard spreadsheet add-ins …
9

Pollack, Randy B. Neural Network Technologies. Defense Technical Information Center, 1993. http://dx.doi.org/10.21236/ada262576.

10

Wilensky, Gregg, Narbik Manukian, Joseph Neuhaus, and Natalie Rivetti. Neural Network Studies. Defense Technical Information Center, 1993. http://dx.doi.org/10.21236/ada271593.
