Journal articles on the topic 'Machine Learning, Artificial Neural Networks, Spiking neural networks'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Machine Learning, Artificial Neural Networks, Spiking neural networks.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles covering a wide variety of disciplines and organise your bibliography correctly.

1

AL-Fayyadh, Hayder Rahm Dakheel, Salam Abdulabbas Ganim Ali, and Basim Abood. "Modelling an Adaptive Learning System Using Artificial Intelligence." Webology 19, no. 1 (2021): 1–18. http://dx.doi.org/10.14704/web/v19i1/web19001.

Abstract:
The goal of this paper is to use artificial intelligence to build and evaluate an adaptive learning system where we adopt the basic approaches of spiking neural networks as well as artificial neural networks. Spiking neural networks receive increasing attention due to their advantages over traditional artificial neural networks. They have proven to be energy efficient, biologically plausible, and up to 10^5 times faster if they are simulated on analogue traditional learning systems. Artificial neural network libraries use computational graphs as a pervasive representation; however, spiking models
2

Chunduri, Raghavendra K., and Darshika G. Perera. "Neuromorphic Sentiment Analysis Using Spiking Neural Networks." Sensors 23, no. 18 (2023): 7701. http://dx.doi.org/10.3390/s23187701.

Abstract:
Over the past decade, the artificial neural networks domain has seen considerable adoption of deep neural networks across many applications. However, deep neural networks are typically computationally complex and consume high power, hindering their applicability for resource-constrained applications, such as self-driving vehicles, drones, and robotics. Spiking neural networks, often employed to bridge the gap between machine learning and neuroscience fields, are considered a promising solution for resource-constrained applications. Since deploying spiking neural networks on traditional von
3

Javanshir, Amirhossein, Thanh Thi Nguyen, M. A. Parvez Mahmud, and Abbas Z. Kouzani. "Training Spiking Neural Networks with Metaheuristic Algorithms." Applied Sciences 13, no. 8 (2023): 4809. http://dx.doi.org/10.3390/app13084809.

Abstract:
Taking inspiration from the brain, spiking neural networks (SNNs) have been proposed to understand and diminish the gap between machine learning and neuromorphic computing. Supervised learning is the most commonly used learning algorithm in traditional ANNs. However, directly training SNNs with backpropagation-based supervised learning methods is challenging due to the discontinuous and non-differentiable nature of the spiking neuron. To overcome these problems, this paper proposes a novel metaheuristic-based supervised learning method for SNNs by adapting the temporal error function. We inves
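Because the spiking threshold is non-differentiable, as the abstract above notes, gradient-free search can optimize SNN weights directly against a temporal error. A minimal sketch of the idea, using simple hill climbing on one leaky integrate-and-fire neuron as a stand-in for the paper's metaheuristics (all constants and function names here are illustrative, not taken from the paper):

```python
import random

def lif_spike_time(weight, t_max=100):
    """Simulate one leaky integrate-and-fire neuron driven by a constant
    input scaled by `weight`; return its first spike time (t_max if it
    never fires)."""
    v, tau, v_th = 0.0, 20.0, 1.0
    for t in range(1, t_max + 1):
        v += (-v + weight) / tau      # leaky integration with dt = 1
        if v >= v_th:
            return t
    return t_max

def temporal_error(weight, target_t=30):
    """Temporal error: distance between actual and desired spike times."""
    return abs(lif_spike_time(weight) - target_t)

random.seed(0)
w = 1.5                                # start above threshold so the neuron fires
best = temporal_error(w)
for _ in range(200):                   # hill climbing: a minimal stand-in
    cand = w + random.gauss(0.0, 0.1)  # for the paper's metaheuristics
    err = temporal_error(cand)
    if err < best:                     # keep the perturbation only if it helps
        w, best = cand, err

print(best)                            # never exceeds the initial error of 8
```

The search loop never evaluates a gradient, so the discontinuity of the spike function is no obstacle; only the scalar temporal error is queried.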
4

Verma, Saurabh, Vishal Paranjape, and Saurabh Sharma. "A Bio-Inspired Approach to Enhancing Machine Learning: Integrating Spiking Neural Networks." International Journal of Innovative Research in Computer and Communication Engineering 11, no. 11 (2023): 11793–99. http://dx.doi.org/10.15680/ijircce.2023.1111061.

Abstract:
In this study, we investigate the integration of bio-inspired neural networks, specifically spiking neural networks (SNNs), into conventional machine learning frameworks. Known for their energy efficiency and ability to process temporal information, SNNs present unique advantages over traditional artificial neural networks (ANNs). This paper explores recent advancements in SNNs, focusing on methodologies such as spike-timing-dependent plasticity (STDP), hybrid conversion techniques, and learnable membrane time constants [1][2]. We evaluate the application of SNNs in visual categorization, digi
5

Liu, Zhao. "Research on handwritten digits recognition system based on spiking neuron network." Applied and Computational Engineering 30, no. 1 (2024): 284–91. http://dx.doi.org/10.54254/2755-2721/30/20230052.

Abstract:
In the 21st century, deep learning has revolutionized the fields of machine learning and computer science, attaining high accuracy in tasks such as image recognition. More layers and more parameters are stuffed into the network to achieve higher performance, making the network extremely large. A new, radically different approach was proposed to complete tasks such as image recognition using a spiking neural network (SNN). The spiking neural network is event-driven rather than data-driven, which makes it more physiologically realistic and uses a lot less power. This study reviews the devel
6

Boya Marqas, Ridwan, Saman M. Almufty, Renas R. Asaad, and Tamara Saad Mohamed. "Advancing AI: A Comprehensive Study of Novel Machine Learning Architectures." International Journal of Scientific World 11, no. 1 (2025): 48–85. https://doi.org/10.14419/kwb24564.

Abstract:
The rapid evolution of machine learning (ML) and artificial intelligence (AI) has led to groundbreaking advancements in computational models, empowering applications across diverse domains. This paper provides an in-depth exploration of advanced ML architectures, including transformers, Graph Neural Networks (GNNs), capsule networks, spiking neural networks (SNNs), and hybrid models. These architectures address the limitations of traditional models like convolutional and recurrent neural networks, offering superior accuracy, scalability, and efficiency for complex data. Key applications are di
7

Abdulkadirov, Ruslan, Pavel Lyakhov, and Nikolay Nagornov. "Survey of Optimization Algorithms in Modern Neural Networks." Mathematics 11, no. 11 (2023): 2466. http://dx.doi.org/10.3390/math11112466.

Abstract:
The main goal of machine learning is the creation of self-learning algorithms in many areas of human activity. It enables artificial intelligence to replace humans in efforts to expand production. The theory of artificial neural networks, which have already replaced humans in many problems, remains the most well-utilized branch of machine learning. Thus, one must select appropriate neural network architectures, data processing, and advanced applied mathematics tools. A common challenge for these networks is achieving the highest accuracy in a short time. This problem is solved by m
8

Pietrzak, Paweł, Szymon Szczęsny, Damian Huderek, and Łukasz Przyborowski. "Overview of Spiking Neural Network Learning Approaches and Their Computational Complexities." Sensors 23, no. 6 (2023): 3037. http://dx.doi.org/10.3390/s23063037.

Abstract:
Spiking neural networks (SNNs) are subjects of a topic that is gaining more and more interest nowadays. They more closely resemble actual neural networks in the brain than their second-generation counterparts, artificial neural networks (ANNs). SNNs have the potential to be more energy efficient than ANNs on event-driven neuromorphic hardware. This can yield drastic maintenance cost reduction for neural network models, as the energy consumption would be much lower in comparison to regular deep learning models hosted in the cloud today. However, such hardware is still not yet widely available.
9

Nobukawa, Sou, Haruhiko Nishimura, and Teruya Yamanishi. "Pattern Classification by Spiking Neural Networks Combining Self-Organized and Reward-Related Spike-Timing-Dependent Plasticity." Journal of Artificial Intelligence and Soft Computing Research 9, no. 4 (2019): 283–91. http://dx.doi.org/10.2478/jaiscr-2019-0009.

Abstract:
Many recent studies have applied spiking neural networks with spike-timing-dependent plasticity (STDP) to machine learning problems. The learning abilities of dopamine-modulated STDP (DA-STDP) for reward-related synaptic plasticity have also been gathering attention. Following these studies, we hypothesize that a network structure combining self-organized STDP and reward-related DA-STDP can solve the machine learning problem of pattern classification. Therefore, we studied the ability of a network in which recurrent spiking neural networks are combined with STDP for non-supervised le
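The two plasticity rules named in the abstract above can be sketched compactly. Below is a textbook pair-based STDP window plus a reward-gated (DA-STDP-style) variant; the parameters and the gating form are illustrative, not the cited paper's exact formulation:

```python
import math

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP with an exponential window: potentiate when the
    presynaptic spike precedes the postsynaptic one, depress otherwise."""
    dt = t_post - t_pre
    if dt >= 0:                              # pre before post -> potentiation
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)     # post before pre -> depression

def da_stdp_dw(t_pre, t_post, reward):
    """DA-STDP-style variant: the timing-based change only becomes a
    weight update when a reward (dopamine) signal arrives."""
    return reward * stdp_dw(t_pre, t_post)

print(stdp_dw(10, 15) > 0)    # causal pairing strengthens the synapse
print(stdp_dw(15, 10) < 0)    # anti-causal pairing weakens it
```

With `reward = 0` the gated rule leaves the weight untouched, which is the sense in which DA-STDP implements reward-related rather than purely self-organized learning.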
10

K. P., Vishnupriya, Jwala Jose, Prince Joy, Sritha S., and Gibi K. S. "Brain-Inspired Artificial Intelligence: Revolutionizing Computing and Cognitive Systems." International Journal of Scientific Research in Engineering and Management 08, no. 12 (2024): 1–8. https://doi.org/10.55041/ijsrem39825.

Abstract:
Brain-inspired artificial intelligence (AI) is a rapidly evolving field that seeks to model computational systems after the structure, processes, and functioning of the human brain. By drawing from neuroscience and cognitive science, brain-inspired AI aims to improve the efficiency, scalability, and adaptability of machine learning algorithms. This paper explores the key technologies and advancements in the realm of brain-inspired AI, including neural networks, neuromorphic hardware, brain-computer interfaces, and algorithms inspired by biological learning mechanisms. Additionally, we will ana
11

Zhang, Qian, Chenxi Wu, Adar Kahana, et al. "Artificial to Spiking Neural Networks Conversion with Calibration in Scientific Machine Learning." SIAM Journal on Scientific Computing 47, no. 3 (2025): C559–C577. https://doi.org/10.1137/24m1643232.

12

Lin, Ranxi, Benzhe Dai, Yingkai Zhao, Gang Chen, and Huaxiang Lu. "Constrain Bias Addition to Train Low-Latency Spiking Neural Networks." Brain Sciences 13, no. 2 (2023): 319. http://dx.doi.org/10.3390/brainsci13020319.

Abstract:
In recent years, a third-generation neural network, namely, the spiking neural network, has received a plethora of attention in the broad areas of machine learning and artificial intelligence. In this paper, a novel differential-based encoding method is proposed, and new spike-based learning rules for backpropagation are derived by constraining the addition of bias voltage in spiking neurons. The proposed differential encoding method can effectively exploit the correlation between the data and improve the performance of the proposed model, and the new learning rule can take complete advantage of the m
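A generic delta-modulation encoder illustrates the flavor of the difference-based spike encoding mentioned above; the cited paper's exact rule may well differ, and `threshold` is an illustrative parameter:

```python
def delta_encode(signal, threshold=0.1):
    """Delta-modulation spike encoding: emit a +1/-1 spike whenever the
    signal drifts more than `threshold` away from the last encoded level.
    A generic difference-based scheme, not the paper's specific method."""
    level, spikes = signal[0], []
    for t, x in enumerate(signal[1:], start=1):
        while x - level >= threshold:        # upward change  -> +1 spike
            spikes.append((t, +1))
            level += threshold
        while level - x >= threshold:        # downward change -> -1 spike
            spikes.append((t, -1))
            level -= threshold
    return spikes

print(delta_encode([0.0, 0.25, 0.2, -0.05]))
```

Because only changes are transmitted, a slowly varying input produces few spikes, which is the correlation-exploiting property the abstract points to.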
13

Shen, Jiangrong, Wenyao Ni, Qi Xu, and Huajin Tang. "Efficient Spiking Neural Networks with Sparse Selective Activation for Continual Learning." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 1 (2024): 611–19. http://dx.doi.org/10.1609/aaai.v38i1.27817.

Abstract:
The next generation of machine intelligence requires the capability of continual learning to acquire new knowledge without forgetting the old one while conserving limited computing resources. Spiking neural networks (SNNs), compared to artificial neural networks (ANNs), have more characteristics that align with biological neurons, which may be helpful as a potential gating function for knowledge maintenance in neural networks. Inspired by the selective sparse activation principle of context gating in biological systems, we present a novel SNN model with selective activation to achieve continua
14

Anzueto-Ríos, Álvaro, Felipe Gómez-Castañeda, Luis M. Flores-Nava, and José A. Moreno-Cadenas. "Approaching Optimal Nonlinear Dimensionality Reduction by a Spiking Neural Network." Electronics 10, no. 14 (2021): 1679. http://dx.doi.org/10.3390/electronics10141679.

Abstract:
This work deals with the presentation of a spiking neural network as a means for efficiently solving the reduction of dimensionality of data in a nonlinear manner. The underlying neural model, which can be integrated as neuromorphic hardware, becomes suitable for intelligent processing in edge computing within Internet of Things systems. In this sense, to achieve a meaningful performance with a low complexity one-layer spiking neural network, the training phase uses the metaheuristic Artificial Bee Colony algorithm with an objective function from the principles of machine learning science,
15

Karuna, G., K. Pravallika, K. Anuradha, and V. Srilakshmi. "Convolutional and Spiking Neural Network Models for Crop Yield Forecasting." E3S Web of Conferences 309 (2021): 01162. http://dx.doi.org/10.1051/e3sconf/202130901162.

Abstract:
Crop yield prediction is a primary focus of agricultural research and has a significant effect on decisions such as the import-export, pricing, and distribution of specific crops. Predicting accurately with well-timed forecasts is important, but it is a difficult task due to numerous complex factors. Crops such as wheat, rice, peas, pulses, sugar cane, tea, cotton, greenhouses, corn, and soybean can all be used to forecast crop yields. We considered a corn dataset to predict the yield for 13 different states in the United States. Crop development and progression are strongly affected
16

Pasupuleti, Murali Krishna. "Organoid Intelligence: Integrating Living Neuronal Networks with Silicon Systems for the Next Evolution of Artificial Intelligence." International Journal of Academic and Industrial Research Innovations(IJAIRI) 05, no. 07 (2025): 66–81. https://doi.org/10.62311/nesx/rpj5.

Abstract:
The emergence of Organoid Intelligence (OI) marks a transformative shift in artificial intelligence by integrating living neuronal networks with silicon-based systems. This study explores a bio-digital hybrid framework that combines cerebral organoids—three-dimensional neural tissues derived from human stem cells—with neuromorphic computing architectures to emulate advanced cognitive processes such as learning, memory, and adaptive decision-making. A robust methodological pipeline was implemented involving multi-electrode array (MEA) interfaces, signal transduction layers, and predic
17

Bhargava, Aman, Mohammad R. Rezaei, and Milad Lankarany. "Gradient-Free Neural Network Training via Synaptic-Level Reinforcement Learning." AppliedMath 2, no. 2 (2022): 185–95. http://dx.doi.org/10.3390/appliedmath2020011.

Abstract:
An ongoing challenge in neural information processing is the following question: how do neurons adjust their connectivity to improve network-level task performance over time (i.e., actualize learning)? It is widely believed that there is a consistent, synaptic-level learning mechanism in specific brain regions, such as the basal ganglia, that actualizes learning. However, the exact nature of this mechanism remains unclear. Here, we investigate the use of universal synaptic-level algorithms in training connectionist models. Specifically, we propose an algorithm based on reinforcement learning (
18

Dulundu, Alp. "Advancing Artificial Intelligence: The Potential of Brain-Inspired Architectures and Neuromorphic Computing for Adaptive, Efficient Systems." Next Frontier For Life Sciences and AI 8, no. 1 (2024): 53. http://dx.doi.org/10.62802/tfme0736.

Abstract:
Brain-inspired AI architecture, also known as neuromorphic computing, seeks to emulate the structure and functionality of the human brain to create more efficient, adaptive, and intelligent systems. Unlike traditional AI models that rely on conventional computing frameworks, brain-inspired architectures leverage neural networks and synapse-like connections to perform computations more similarly to biological brains. This approach offers significant advantages, including lower power consumption, improved learning capabilities, and enhanced problem-solving efficiency, particularly in tasks that
19

Yang, Shuangming, Jiangtong Tan, and Badong Chen. "Robust Spike-Based Continual Meta-Learning Improved by Restricted Minimum Error Entropy Criterion." Entropy 24, no. 4 (2022): 455. http://dx.doi.org/10.3390/e24040455.

Abstract:
The spiking neural network (SNN) is regarded as a promising candidate to deal with the great challenges presented by current machine learning techniques, including the high energy consumption induced by deep neural networks. However, there is still a great gap between SNNs and the online meta-learning performance of artificial neural networks. Importantly, existing spike-based online meta-learning models do not target the robust learning based on spatio-temporal dynamics and superior machine learning theory. In this invited article, we propose a novel spike-based framework with minimum error e
20

McMillan, Kyle, Rosa Qiyue So, Camilo Libedinsky, Kai Keng Ang, and Brian Premchand. "Spike-Weighted Spiking Neural Network with Spiking Long Short-Term Memory: A Biomimetic Approach to Decoding Brain Signals." Algorithms 17, no. 4 (2024): 156. http://dx.doi.org/10.3390/a17040156.

Abstract:
Background. Brain–machine interfaces (BMIs) offer users the ability to directly communicate with digital devices through neural signals decoded with machine learning (ML)-based algorithms. Spiking Neural Networks (SNNs) are a type of Artificial Neural Network (ANN) that operate on neural spikes instead of continuous scalar outputs. Compared to traditional ANNs, SNNs perform fewer computations, use less memory, and mimic biological neurons better. However, SNNs only retain information for short durations, limiting their ability to capture long-term dependencies in time-variant data. Here, we pr
21

Al-Yassari, Mohammed Mousa Rashid, and Nadia Adnan Shiltagh Al-Jamali. "Automatic Spike Neural Technique for Slicing Bandwidth Estimated Virtual Buffer-Size in Network Environment." Journal of Engineering 29, no. 6 (2023): 87–97. http://dx.doi.org/10.31026/j.eng.2023.06.07.

Abstract:
Next-generation networks, such as 5G and 6G, require high capacity, low latency, and high dependability. According to experts, one of the most important features of 5G and 6G networks is network slicing. To enhance the Quality of Service (QoS), network operators may now operate many instances on the same infrastructure due to configurable slicing QoS. Each slice may have a varied number of virtualized network resources, such as connection bandwidth, buffer size, and computing functions. Because network resources are limited, virtual resources of th
22

Xie, Ziyi, Junsong Peng, Mariia Sorokina, and Heping Zeng. "Design of Mode-Locked Fibre Laser with Non-Linear Power and Spectrum Width Transfer Functions with a Power Threshold." Applied Sciences 12, no. 20 (2022): 10318. http://dx.doi.org/10.3390/app122010318.

Abstract:
There is a growing demand for higher computational speed and energy efficiency of machine learning approaches and, in particular, neural networks. Optical implementation of neural networks can address this challenge. Compared to other neuromorphic platforms, fibre-based technologies can unlock a wide bandwidth window and offer flexibility in dimensionality and complexity. Moreover, fibre represents a well-studied, low-cost and low-loss material, widely used for signal processing and transmission. At the same time, mode-locked fibre lasers offer flexibility and control, while the mode-locking e
23

Andrés, Eva, Manuel Pegalajar Cuéllar, and Gabriel Navarro. "Brain-Inspired Agents for Quantum Reinforcement Learning." Mathematics 12, no. 8 (2024): 1230. http://dx.doi.org/10.3390/math12081230.

Abstract:
In recent years, advancements in brain science and neuroscience have significantly influenced the field of computer science, particularly in the domain of reinforcement learning (RL). Drawing insights from neurobiology and neuropsychology, researchers have leveraged these findings to develop novel mechanisms for understanding intelligent decision-making processes in the brain. Concurrently, the emergence of quantum computing has opened new frontiers in artificial intelligence, leading to the development of quantum machine learning (QML). This study introduces a novel model that integrates quan
24

Hersam, Mark C. "(Invited) Enabling Bio-Realistic Artificial Intelligence Hardware with Neuromorphic Nanoelectronics." ECS Meeting Abstracts MA2024-01, no. 57 (2024): 3012. http://dx.doi.org/10.1149/ma2024-01573012mtgabs.

Abstract:
The exponentially improving performance of digital computers has recently slowed due to the speed and power consumption issues resulting from the von Neumann bottleneck. In contrast, neuromorphic computing aims to circumvent these limitations by spatially co-locating logic and memory in a manner analogous to biological neuronal networks [1]. Beyond reducing power consumption, neuromorphic devices provide efficient architectures for image recognition, machine learning, and artificial intelligence [2]. This talk will explore how low-dimensional nanoelectronic materials enable gate-tunable neurom
25

Kumbhar, Gaurang. "Synaptic AI: Bridging Neural Dynamics and Deep Learning for Next- Generation Computation." International Scientific Journal of Engineering and Management 04, no. 04 (2025): 1–7. https://doi.org/10.55041/isjem02829.

Abstract:
The escalating computational and power demands of deep learning algorithms challenge traditional von Neumann architectures, which separate memory and processing units. This structural bottleneck, often referred to as the "von Neumann bottleneck," hampers data throughput and energy efficiency—especially in real-time, data-intensive AI applications. Neuromorphic computing, inspired by the human brain's architecture and function, offers a promising alternative. Unlike conventional systems, neuromorphic models integrate processing and memory, enabling highly parallel, event-driven computation. Thi
26

Moshruba, Ayana, Ihsen Alouani, and Maryam Parsa. "Are Neuromorphic Architectures Inherently Privacy-preserving? An Exploratory Study." Proceedings on Privacy Enhancing Technologies 2025, no. 2 (2025): 243–57. https://doi.org/10.56553/popets-2025-0060.

Abstract:
While machine learning (ML) models are becoming mainstream, including in critical application domains, concerns have been raised about the increasing risk of sensitive data leakage. Various privacy attacks, such as membership inference attacks (MIAs), have been developed to extract data from trained ML models, posing significant risks to data confidentiality. While the predominant work in the ML community considers traditional Artificial Neural Networks (ANNs) as the default neural model, neuromorphic architectures, such as Spiking Neural Networks (SNNs), have recently emerged as an attractive
27

Hersam, Mark C. "(Invited) Two-Dimensional Neuromorphic Computing Materials and Devices." ECS Meeting Abstracts MA2023-01, no. 13 (2023): 1317. http://dx.doi.org/10.1149/ma2023-01131317mtgabs.

Abstract:
The exponentially improving performance of digital computers has recently slowed due to the speed and power consumption issues resulting from the von Neumann bottleneck. In contrast, neuromorphic computing aims to circumvent these limitations by spatially co-locating logic and memory. Beyond reducing power consumption, neuromorphic devices provide efficient architectures for image recognition, machine learning, and artificial intelligence [1]. This talk will explore how two-dimensional (2D) nanoelectronic materials enable gate-tunable neuromorphic devices [2]. For example, by utilizing self-al
28

Nasir, Inzamam Mashood, Sara Tehsin, Robertas Damaševičius, and Rytis Maskeliūnas. "Integrating Explanations into CNNs by Adopting Spiking Attention Block for Skin Cancer Detection." Algorithms 17, no. 12 (2024): 557. https://doi.org/10.3390/a17120557.

Abstract:
Lately, there has been a substantial rise in the number of identified individuals with skin cancer, making it the most widespread form of cancer worldwide. Until now, several machine learning methods that utilize skin scans have been directly employed for skin cancer classification, showing encouraging outcomes in terms of enhancing diagnostic precision. In this paper, multimodal Explainable Artificial Intelligence (XAI) is presented that offers explanations that (1) address a gap regarding interpretation by identifying specific dermoscopic features, thereby enabling (2) dermatologists to comp
29

Vanarse, Anup, Adam Osseiran, Alexander Rassau, and Peter van der Made. "Application of Neuromorphic Olfactory Approach for High-Accuracy Classification of Malts." Sensors 22, no. 2 (2022): 440. http://dx.doi.org/10.3390/s22020440.

Abstract:
Current developments in artificial olfactory systems, also known as electronic nose (e-nose) systems, have benefited from advanced machine learning techniques that have significantly improved the conditioning and processing of multivariate feature-rich sensor data. These advancements are complemented by the application of bioinspired algorithms and architectures based on findings from neurophysiological studies focusing on the biological olfactory pathway. The application of spiking neural networks (SNNs), and concepts from neuromorphic engineering in general, are one of the key factors that h
30

Ghosh-Dastidar, Samanwoy, and Hojjat Adeli. "Spiking Neural Networks." International Journal of Neural Systems 19, no. 04 (2009): 295–308. http://dx.doi.org/10.1142/s0129065709002002.

Abstract:
Most current Artificial Neural Network (ANN) models are based on highly simplified brain dynamics. They have been used as powerful computational tools to solve complex pattern recognition, function estimation, and classification problems. ANNs have been evolving towards more powerful and more biologically realistic models. In the past decade, Spiking Neural Networks (SNNs) have been developed which comprise spiking neurons. Information transfer in these neurons mimics the information transfer in biological neurons, i.e., via the precise timing of spikes or a sequence of spikes. To facilitat
31

Tavanaei, Amirhossein, Masoud Ghodrati, Saeed Reza Kheradpisheh, Timothée Masquelier, and Anthony Maida. "Deep learning in spiking neural networks." Neural Networks 111 (March 2019): 47–63. http://dx.doi.org/10.1016/j.neunet.2018.12.002.

32

Belatreche, Ammar, Liam P. Maguire, Martin McGinnity, and Qing Xiang Wu. "Evolutionary Design of Spiking Neural Networks." New Mathematics and Natural Computation 02, no. 03 (2006): 237–53. http://dx.doi.org/10.1142/s179300570600049x.

Abstract:
Unlike traditional artificial neural networks (ANNs), which use a high abstraction of real neurons, spiking neural networks (SNNs) offer a biologically plausible model of realistic neurons. They differ from classical artificial neural networks in that SNNs handle and communicate information by means of the timing of individual pulses, an important feature of neuronal systems that is ignored by models based on a rate coding scheme. However, in order to make the most of these realistic neuronal models, good training algorithms are required. Most existing learning paradigms tune the synaptic weights in a
33

Ponulak, Filip, and Andrzej Kasinski. "Introduction to spiking neural networks: Information processing, learning and applications." Acta Neurobiologiae Experimentalis 71, no. 4 (2011): 409–33. http://dx.doi.org/10.55782/ane-2011-1862.

Abstract:
The concept that neural information is encoded in the firing rate of neurons has been the dominant paradigm in neurobiology for many years. This paradigm has also been adopted by the theory of artificial neural networks. Recent physiological experiments demonstrate, however, that in many parts of the nervous system, neural code is founded on the timing of individual action potentials. This finding has given rise to the emergence of a new class of neural models, called spiking neural networks. In this paper we summarize basic properties of spiking neurons and spiking networks. Our focus is, spe
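The timing-based neural code described above is easy to see in a leaky integrate-and-fire simulation: the neuron's output is a set of spike times, not a scalar rate. A minimal sketch with illustrative constants:

```python
def lif(input_current, dt=1.0, tau=20.0, v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron: returns the list of spike times,
    so information is carried by *when* spikes occur, not only by an
    average firing rate. All constants are illustrative."""
    v, spikes = 0.0, []
    for t, i_t in enumerate(input_current):
        v += dt * (-v + i_t) / tau    # leaky membrane integration
        if v >= v_th:                 # threshold crossing -> spike
            spikes.append(t)
            v = v_reset               # reset after the spike
    return spikes

current = [0.0] * 20 + [1.5] * 80     # step current switched on at t = 20
print(lif(current))                   # regular spike times, all after onset
```

The latency of the first spike after the step onset already encodes the input amplitude, which is the kind of precise-timing information a rate code discards.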
34

Xu, Zenglin. "Tensor Networks Meet Neural Networks." Journal of Physics: Conference Series 2278, no. 1 (2022): 012003. http://dx.doi.org/10.1088/1742-6596/2278/1/012003.

Abstract:
As a simulation of the human cognitive system, deep neural networks have achieved great success in many machine learning tasks and are the main driving force of the current development of artificial intelligence. On the other hand, tensor networks as an approximation of quantum many-body systems in quantum physics are applied to quantum physics, statistical physics, quantum chemistry and machine learning. This talk will first give a brief introduction to neural networks and tensor networks, and then discuss the cross-field research between deep neural networks and tensor networks, suc
35

Liu, Yuxiang, and Wei Pan. "Spiking Neural-Networks-Based Data-Driven Control." Electronics 12, no. 2 (2023): 310. http://dx.doi.org/10.3390/electronics12020310.

Abstract:
Machine learning can be effectively applied in control loops to make optimal control decisions robustly. There is increasing interest in using spiking neural networks (SNNs) as the apparatus for machine learning in control engineering because SNNs can potentially offer high energy efficiency, and new SNN-enabling neuromorphic hardware is being rapidly developed. A defining characteristic of control problems is that environmental reactions and delayed rewards must be considered. Although reinforcement learning (RL) provides the fundamental mechanisms to address such problems, implementing these
36

Sporea, Ioana, and André Grüning. "Supervised Learning in Multilayer Spiking Neural Networks." Neural Computation 25, no. 2 (2013): 473–509. http://dx.doi.org/10.1162/neco_a_00396.

Abstract:
We introduce a supervised learning algorithm for multilayer spiking neural networks. The algorithm overcomes a limitation of existing learning algorithms: it can be applied to neurons firing multiple spikes in artificial neural networks with hidden layers. It can also, in principle, be used with any linearizable neuron model and allows different coding schemes of spike train patterns. The algorithm is applied successfully to classic linearly nonseparable benchmarks such as the XOR problem and the Iris data set, as well as to more complex classification and mapping problems. The algorithm has b
APA, Harvard, Vancouver, ISO, and other styles
37

Zhang, Yongqiang, Haijie Pang, Jinlong Ma, Guilei Ma, Xiaoming Zhang, and Menghua Man. "Research on Anti-Interference Performance of Spiking Neural Network Under Network Connection Damage." Brain Sciences 15, no. 3 (2025): 217. https://doi.org/10.3390/brainsci15030217.

Full text
Abstract:
Background: With the development of artificial intelligence, memristors have become an ideal choice to optimize new neural network architectures and improve computing efficiency and energy efficiency due to their combination of storage and computing power. In this context, spiking neural networks show the ability to resist Gaussian noise, spike interference, and AC electric field interference by adjusting synaptic plasticity. The anti-interference ability of spiking neural networks has become an important direction of electromagnetic protection bionics research. Methods: Therefore, this research
APA, Harvard, Vancouver, ISO, and other styles
38

Zenke, Friedemann, and Surya Ganguli. "SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks." Neural Computation 30, no. 6 (2018): 1514–41. http://dx.doi.org/10.1162/neco_a_01086.

Full text
Abstract:
A vast majority of computation in the brain is performed by spiking neural networks. Despite the ubiquity of such spiking, we currently lack an understanding of how biological spiking neural circuits learn and compute in vivo, as well as how we can instantiate such capabilities in artificial spiking circuits in silico. Here we revisit the problem of supervised learning in temporally coding multilayer spiking neural networks. First, by using a surrogate gradient approach, we derive SuperSpike, a nonlinear voltage-based three-factor learning rule capable of training multilayer networks of determ
APA, Harvard, Vancouver, ISO, and other styles
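The surrogate-gradient approach mentioned in this abstract can be illustrated with a minimal NumPy sketch. The fast-sigmoid surrogate and the `beta` steepness parameter below are illustrative assumptions, not the exact SuperSpike rule:

```python
import numpy as np

def spike(v, threshold=1.0):
    # Forward pass: non-differentiable Heaviside step -- a neuron
    # emits a spike only when its membrane potential crosses threshold.
    return (v >= threshold).astype(float)

def surrogate_grad(v, threshold=1.0, beta=10.0):
    # Backward pass: the Heaviside derivative is zero almost everywhere,
    # so backprop replaces it with a smooth pseudo-derivative (here the
    # derivative of a fast sigmoid centred on the threshold).
    return 1.0 / (1.0 + beta * np.abs(v - threshold)) ** 2

v = np.array([0.2, 0.9, 1.1, 2.0])   # example membrane potentials
print(spike(v))           # spikes fire only at/above threshold
print(surrogate_grad(v))  # smooth gradient, largest near the threshold
```

The forward pass keeps the binary spike semantics, while the backward pass sees a differentiable proxy, which is what allows multilayer spiking networks to be trained with gradient descent.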
39

Wu, Hao, Yueyi Zhang, Wenming Weng, et al. "Training Spiking Neural Networks with Accumulated Spiking Flow." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 12 (2021): 10320–28. http://dx.doi.org/10.1609/aaai.v35i12.17236.

Full text
Abstract:
The fast development of neuromorphic hardware promotes Spiking Neural Networks (SNNs) to a thrilling research avenue. Current SNNs, though much more efficient, are less effective compared with leading Artificial Neural Networks (ANNs), especially in supervised learning tasks. Recent efforts further demonstrate the potential of SNNs in supervised learning by introducing approximated backpropagation (BP) methods. To deal with the non-differentiable spike function in SNNs, these BP methods utilize information from the spatio-temporal domain to adjust the model parameters. With the increasing of time w
APA, Harvard, Vancouver, ISO, and other styles
40

Cachi, Paolo G., Sebastián Ventura, and Krzysztof J. Cios. "Improving Spiking Neural Network Performance with Auxiliary Learning." Machine Learning and Knowledge Extraction 5, no. 3 (2023): 1010–22. http://dx.doi.org/10.3390/make5030052.

Full text
Abstract:
The use of the backpropagation-through-time learning rule enabled the supervised training of deep spiking neural networks to process temporal neuromorphic data. However, their performance is still below that of non-spiking neural networks. Previous work pointed out that one of the main causes is the limited amount of neuromorphic data currently available, which are also difficult to generate. With the goal of overcoming this problem, we explore the usage of auxiliary learning as a means of helping spiking neural networks to identify more general features. Tests are performed on neuromorphic DVS-CIFAR
APA, Harvard, Vancouver, ISO, and other styles
41

Lin, Xianghong, Mengwei Zhang, and Xiangwen Wang. "Supervised Learning Algorithm for Multilayer Spiking Neural Networks with Long-Term Memory Spike Response Model." Computational Intelligence and Neuroscience 2021 (November 24, 2021): 1–16. http://dx.doi.org/10.1155/2021/8592824.

Full text
Abstract:
As a new brain-inspired computational model of artificial neural networks, spiking neural networks transmit and process information via precisely timed spike trains. Constructing efficient learning methods is a significant research field in spiking neural networks. In this paper, we present a supervised learning algorithm for multilayer feedforward spiking neural networks; all neurons can fire multiple spikes in all layers. The feedforward network consists of spiking neurons governed by a biologically plausible long-term memory spike response model, in which the effect of earlier spikes on the r
APA, Harvard, Vancouver, ISO, and other styles
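The idea underlying spike response models, that earlier spikes shape a neuron's later excitability, can be sketched with a minimal leaky integrate-and-fire simulation. All constants below are illustrative assumptions, not the model used in the paper:

```python
def lif_simulate(input_current, tau=10.0, threshold=1.0, dt=1.0):
    """Minimal leaky integrate-and-fire neuron: the membrane potential
    leaks toward zero, integrates input, and resets after each spike,
    so each spike suppresses the neuron's immediate responsiveness."""
    v, spikes = 0.0, []
    for i in input_current:
        v += dt * (-v / tau + i)      # leaky integration step
        if v >= threshold:
            spikes.append(1)          # emit a spike...
            v = 0.0                   # ...and reset the potential
        else:
            spikes.append(0)
    return spikes

train = lif_simulate([0.3] * 20)      # constant drive -> periodic spikes
print(train)
```

Under constant input, the reset after each spike forces the potential to recharge from zero, producing a regular spike train whose rate encodes the input strength.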
42

Soupizet, Thomas, Zalfa Jouni, Siqi Wang, Aziz Benlarbi-Delai, and Pietro M. Ferreira. "Analog Spiking Neural Network Synthesis for the MNIST." Journal of Integrated Circuits and Systems 18, no. 1 (2023): 1–12. http://dx.doi.org/10.29292/jics.v18i1.663.

Full text
Abstract:
Different from the classical artificial neural network, which processes digital data, the spiking neural network (SNN) processes spike trains. Indeed, its event-driven property helps to capture the rich dynamics that neurons have within the brain, and the sparsity of collected spikes helps reduce computational power. A novel synthesis framework is proposed, and an algorithm is detailed to guide designers into deep learning and energy-efficient analog SNN using MNIST. An analog SNN composed of 86 electronic neurons (eNeuron) and 1238 synapses interacting through two hidden layers is illustrated. Three
APA, Harvard, Vancouver, ISO, and other styles
43

Nascimben, Mauro, and Lia Rimondini. "Molecular Toxicity Virtual Screening Applying a Quantized Computational SNN-Based Framework." Molecules 28, no. 3 (2023): 1342. http://dx.doi.org/10.3390/molecules28031342.

Full text
Abstract:
Spiking neural networks are biologically inspired machine learning algorithms attracting researchers’ attention for their applicability to alternative energy-efficient hardware other than traditional computers. In the current work, spiking neural networks have been tested in a quantitative structure–activity analysis targeting the toxicity of molecules. Multiple public-domain databases of compounds have been evaluated with spiking neural networks, achieving accuracies compatible with high-quality frameworks presented in the previous literature. The numerical experiments also included an analys
APA, Harvard, Vancouver, ISO, and other styles
44

Talukder, Ria, Anas Skalli, Xavier Porte, and Daniel Brunner. "Computation and implementation of large scalable Spiking Neural Network." EPJ Web of Conferences 287 (2023): 13006. http://dx.doi.org/10.1051/epjconf/202328713006.

Full text
Abstract:
Photonic neural networks are a highly sought-after area of research due to their potential for high-performance complex computing. Unlike artificial neural networks, which use simple nonlinear maps, biological neurons transmit information and perform computations through spikes that depend on spike time and/or rate. Through comprehensive studies and experiments, a strong foundation has been laid for the development of photonic neural networks. We have recently developed a large-scale spiking neural network, which serves as a proof-of-concept experiment for novel bio-inspired learning concepts.
APA, Harvard, Vancouver, ISO, and other styles
45

Bansal, Deepika, Kavita Khanna, Rita Chhikara, Rakesh Kumar Dua, and Rajeev Malhotra. "Comparative Analysis of Artificial Neural Networks and Deep Neural Networks for Detection of Dementia." International Journal of Social Ecology and Sustainable Development 13, no. 9 (2022): 1–18. http://dx.doi.org/10.4018/ijsesd.313966.

Full text
Abstract:
Dementia is a neurocognitive brain disease that emerged as a worldwide health challenge. Machine learning and deep learning have been effectively applied for the detection of dementia using magnetic resonance imaging. In this work, the performance of both machine learning and deep learning frameworks along with artificial neural networks are assessed for detecting dementia and normal subjects using MRI images. The first-order and second-order hand-crafted features are used as input for machine learning and artificial neural networks. And automatic feature extraction is used in the last framewo
APA, Harvard, Vancouver, ISO, and other styles
46

Hopkins, Michael, Garibaldi Pineda-García, Petruţ A. Bogdan, and Steve B. Furber. "Spiking neural networks for computer vision." Interface Focus 8, no. 4 (2018): 20180007. http://dx.doi.org/10.1098/rsfs.2018.0007.

Full text
Abstract:
State-of-the-art computer vision systems use frame-based cameras that sample the visual scene as a series of high-resolution images. These are then processed using convolutional neural networks using neurons with continuous outputs. Biological vision systems use a quite different approach, where the eyes (cameras) sample the visual scene continuously, often with a non-uniform resolution, and generate neural spike events in response to changes in the scene. The resulting spatio-temporal patterns of events are then processed through networks of spiking neurons. Such event-based processing offers
APA, Harvard, Vancouver, ISO, and other styles
47

Ryndin, E. A., N. V. Andreeva, V. V. Luchinin, K. S. Goncharov, and V. S. Raiimzhonov. "Neuromorphic Functional Modules of Spiking Neural Network." Nano- i Mikrosistemnaya Tehnika 23, no. 6 (2021): 317–26. http://dx.doi.org/10.17587/nmst.23.317-326.

Full text
Abstract:
In the current era, design and development of artificial neural networks exploiting the architecture of the human brain have evolved rapidly. Artificial neural networks effectively solve a wide range of tasks common to artificial intelligence, involving data classification and recognition, prediction, forecasting, and adaptive control of object behavior. Biologically inspired underlying principles of ANN operation have certain advantages over the conventional von Neumann architecture, including unsupervised learning, architectural flexibility and adaptability to environmental change and high per
APA, Harvard, Vancouver, ISO, and other styles
48

Kocsis, Zoltan Tamas. "Artificial Neural Networks in Medicine." Acta Technica Jaurinensis 12, no. 2 (2019): 117–29. http://dx.doi.org/10.14513/actatechjaur.v12.n2.497.

Full text
Abstract:
In recent years, Information Technology has developed in such a way that applications based on Artificial Intelligence have emerged. This development has resulted in machines being able to perform increasingly complex learning processes. The use of Information Technology, including Artificial Intelligence, is becoming more and more widespread in all fields of life. Some common examples are face recognition in smartphones or the programming of washing machines. As you may think, Artificial Intelligence can also be used in medicine. In this study I am presenting the relationship between machine
APA, Harvard, Vancouver, ISO, and other styles
49

Sellier, Jean Michel, and Alexandre Martini. "On Training Spiking Neural Networks by Means of a Novel Quantum Inspired Machine Learning Method." Applied AI Letters 6, no. 2 (2025). https://doi.org/10.1002/ail2.114.

Full text
Abstract:
In spite of the high potential shown by spiking neural networks (e.g., temporal patterns), training them remains an open and complex problem. In practice, while in theory these networks are computationally as powerful as mainstream artificial neural networks, they have not reached the same accuracy levels yet. The major reason for such a situation seems to be the lack of adequate training algorithms for deep spiking neural networks, since spike signals are not differentiable, that is, no direct way to compute a gradient is provided. Recently, a novel training method, bas
APA, Harvard, Vancouver, ISO, and other styles
50

De Geeter, Florent, Damien Ernst, and Guillaume Drion. "Spike-based computation using classical recurrent neural networks." Neuromorphic Computing and Engineering, May 3, 2024. http://dx.doi.org/10.1088/2634-4386/ad473b.

Full text
Abstract:
Spiking neural networks are a type of artificial neural network in which communication between neurons is made only of events, also called spikes. This property allows neural networks to make asynchronous and sparse computations and therefore drastically decrease energy consumption when run on specialised hardware. However, training such networks is known to be difficult, mainly due to the non-differentiability of the spike activation, which prevents the use of classical backpropagation. This is because state-of-the-art spiking neural networks are usually derived from biologically-in
APA, Harvard, Vancouver, ISO, and other styles