Academic literature on the topic 'STDP [Spike Timing Dependent Plasticity]'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'STDP [Spike Timing Dependent Plasticity].'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "STDP [Spike Timing Dependent Plasticity]"
Badoual, Mathilde, Quan Zou, Andrew P. Davison, Michael Rudolph, Thierry Bal, Yves Frégnac, and Alain Destexhe. "Biophysical and Phenomenological Models of Multiple Spike Interactions in Spike-Timing Dependent Plasticity." International Journal of Neural Systems 16, no. 02 (April 2006): 79–97. http://dx.doi.org/10.1142/s0129065706000524.
Uramoto, Takumi, and Hiroyuki Torikai. "A Calcium-Based Simple Model of Multiple Spike Interactions in Spike-Timing-Dependent Plasticity." Neural Computation 25, no. 7 (July 2013): 1853–69. http://dx.doi.org/10.1162/neco_a_00462.
Dan, Yang, and Mu-Ming Poo. "Spike Timing-Dependent Plasticity: From Synapse to Perception." Physiological Reviews 86, no. 3 (July 2006): 1033–48. http://dx.doi.org/10.1152/physrev.00030.2005.
Echeveste, Rodrigo, and Claudius Gros. "Two-Trace Model for Spike-Timing-Dependent Synaptic Plasticity." Neural Computation 27, no. 3 (March 2015): 672–98. http://dx.doi.org/10.1162/neco_a_00707.
Florian, Răzvan V. "Reinforcement Learning Through Modulation of Spike-Timing-Dependent Synaptic Plasticity." Neural Computation 19, no. 6 (June 2007): 1468–502. http://dx.doi.org/10.1162/neco.2007.19.6.1468.
Lightheart, Toby, Steven Grainger, and Tien-Fu Lu. "Spike-Timing-Dependent Construction." Neural Computation 25, no. 10 (October 2013): 2611–45. http://dx.doi.org/10.1162/neco_a_00501.
Lu, Hui, Hyungju Park, and Mu-Ming Poo. "Spike-timing-dependent BDNF secretion and synaptic plasticity." Philosophical Transactions of the Royal Society B: Biological Sciences 369, no. 1633 (January 5, 2014): 20130132. http://dx.doi.org/10.1098/rstb.2013.0132.
Leen, Todd K., and Robert Friel. "Stochastic Perturbation Methods for Spike-Timing-Dependent Plasticity." Neural Computation 24, no. 5 (May 2012): 1109–46. http://dx.doi.org/10.1162/neco_a_00267.
Hunzinger, Jason F., Victor H. Chan, and Robert C. Froemke. "Learning complex temporal patterns with resource-dependent spike timing-dependent plasticity." Journal of Neurophysiology 108, no. 2 (July 15, 2012): 551–66. http://dx.doi.org/10.1152/jn.01150.2011.
Mendes, Alexandre, Gaetan Vignoud, Sylvie Perez, Elodie Perrin, Jonathan Touboul, and Laurent Venance. "Concurrent Thalamostriatal and Corticostriatal Spike-Timing-Dependent Plasticity and Heterosynaptic Interactions Shape Striatal Plasticity Map." Cerebral Cortex 30, no. 8 (March 7, 2020): 4381–401. http://dx.doi.org/10.1093/cercor/bhaa024.
Dissertations / Theses on the topic "STDP [Spike Timing Dependent Plasticity]"
Strain, Thomas. "A spiking neuron training approach using spike timing-dependent plasticity (STDP)." Thesis, Ulster University, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.538954.
Iglesias, Javier. "Emergence of oriented circuits driven by synaptic pruning associated with spike-timing-dependent plasticity (STDP)." Université Joseph Fourier (Grenoble), 2005. https://tel.archives-ouvertes.fr/tel-00010650.
Full textMassive synaptic pruning following over-growth is a general feature of mammalian brain maturation. Pruning starts near time of birth and is completed by time of sexual maturation. Trigger signals able to induce synaptic pruning could be related to dynamic functions that depend on the timing of action potentials. Spike-timing-dependent synaptic plasticity (STDP) is a change in the synaptic strength based on the ordering of pre, and postsynaptic spikes. The relation between synaptic efficacy and synaptic pruning suggests that the weak synapses may be modified and removed through competitive "learning" rules. This plasticity rule might produce the strengthening of the connections among neurons that belong to cell assemblies characterized by recurrent patterns of firing. Conversely, the connections that are not recurrently activated might decrease in efficiency and eventually be eliminated. The main goal of our study is to determine whether or not, and under which conditions, such cell assemblies may emerge out of a locally connected random network of integrate-and-fire units distributed on a 2D lattice receiving background noise and content-related input organized in both temporal and spatial dimensions. The originality of our study stands on the relatively large size of the network, 10,000 units, the duration of the experiment, 1,000,000 time units (one time unit corresponding to the duration of a spike), and the application of an original bio-inspired STDP modification rule compatible with hardware implementation
Humble, James. "Learning, self-organisation and homeostasis in spiking neuron networks using spike-timing dependent plasticity." Thesis, University of Plymouth, 2013. http://hdl.handle.net/10026.1/1499.
René, Alice. "Plasticité synaptique et fonctionnelle dans le cortex visuel primaire : une étude par conditionnement theta - burst in vivo." Paris 6, 2007. http://www.theses.fr/2007PA066656.
Falez, Pierre. "Improving spiking neural networks trained with spike timing dependent plasticity for image recognition." Thesis, Lille 1, 2019. http://www.theses.fr/2019LIL1I101.
Full textComputer vision is a strategic field, in consequence of its great number of potential applications which could have a high impact on society. This area has quickly improved over the last decades, especially thanks to the advances of artificial intelligence and more particularly thanks to the accession of deep learning. Nevertheless, these methods present two main drawbacks in contrast with biological brains: they are extremely energy intensive and they need large labeled training sets. Spiking neural networks are alternative models offering an answer to the energy consumption issue. One attribute of these models is that they can be implemented very efficiently on hardware, in order to build ultra low-power architectures. In return, these models impose certain limitations, such as the use of only local memory and computations. It prevents the use of traditional learning methods, for example the gradient back-propagation. STDP is a learning rule, observed in biology, which can be used in spiking neural networks. This rule reinforces the synapses in which local correlations of spike timing are detected. It also weakens the other synapses. The fact that it is local and unsupervised makes it possible to abide by the constraints of neuromorphic architectures, which means it can be implemented efficiently, but it also provides a solution to the data set labeling issue. However, spiking neural networks trained with the STDP rule are affected by lower performances in comparison to those following a deep learning process. The literature about STDP still uses simple data but the behavior of this rule has seldom been used with more complex data, such as sets made of a large variety of real-world images.The aim of this manuscript is to study the behavior of these spiking models, trained through the STDP rule, on image classification tasks. 
The main goal is to improve the performances of these models, while respecting as much as possible the constraints of neuromorphic architectures. The first contribution focuses on the software simulations of spiking neural networks. Hardware implementation being a long and costly process, using simulation is a good alternative in order to study more quickly the behavior of different models. Then, the contributions focus on the establishment of multi-layered spiking networks; networks made of several layers, such as those in deep learning methods, allow to process more complex data. One of the chapters revolves around the matter of frequency loss seen in several spiking neural networks. This issue prevents the stacking of multiple spiking layers. The center point then switches to a study of STDP behavior on more complex data, especially colored real-world image. Multiple measurements are used, such as the coherence of filters or the sparsity of activations, to better understand the reasons for the performance gap between STDP and the more traditional methods. Lastly, the manuscript describes the making of multi-layered networks. To this end, a new threshold adaptation mechanism is introduced, along with a multi-layer training protocol. It is proven that such networks can improve the state-of-the-art for STDP
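The threshold adaptation mentioned in this abstract can be illustrated with a minimal homeostatic sketch (a generic rate-tracking rule assumed for illustration; the thesis introduces its own, more elaborate mechanism):

```python
def adapt_threshold(threshold, fired, target_rate, lr=0.01):
    """Nudge a neuron's firing threshold up when it fires and down when
    it stays silent, so that its long-run firing rate tracks target_rate.
    This is a generic homeostatic sketch, not the thesis's mechanism."""
    return threshold + lr * ((1.0 if fired else 0.0) - target_rate)
```

Keeping firing rates near a target in this way is one common remedy for the frequency loss that otherwise prevents stacking several spiking layers.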
Jacob, Vincent. "Intégration spatio-temporelle de scènes tactiles et plasticité fonctionnelle dans le cortex à tonneaux du rat." Paris 6, 2007. http://www.theses.fr/2007PA066222.
Full textClassically the connections between whiskers and cortical barrels are considered as independent ways. However, the rat generates complex patterns of contacts during active exploration. The cortical receptive fields (RF) are very large suggesting that multiwhisker information converge on each neuron. In order to study the integration of tactile scenes in the cortex, we have developed a matrix of 25 stimulators. We studied the RFs, their dependence to the omission of a predictable stimulus and the selectivity to global direction generated by sequential deflections of the whiskers. Then primary cortex performs an integrated and non-linear analysis of sensory information. Conditions of activity induce long-term modifications of the RFs. We observed modifications of the sensory response whose sign and intensity depended on the order and the time interval between the stimulations and the post-synaptic activation of the recorded neuron. This result is compatible with STDP rules for which this work is the first validation in the in vivo somatosensory cortex
Bernert, Marie. "Développement d'un réseau de neurones STDP pour le tri en ligne et non-supervisé de potentiels d'action." Thesis, Université Grenoble Alpes (ComUE), 2019. http://www.theses.fr/2019GREAS001.
Pattern recognition is a fundamental task for living beings and is performed very efficiently by the brain. Artificial deep neural networks are making quick progress in reproducing this performance and have many applications, such as image recognition or natural language processing. However, they require extensive training on large datasets and heavy computation. A promising alternative is spiking neural networks, which closely mimic what happens in the brain, with spiking neurons and spike-timing-dependent plasticity (STDP). They are able to perform unsupervised learning and have been used for visual or auditory pattern recognition. For now, however, applications using STDP networks lag far behind classical deep learning. Developing new applications for this kind of network is all the more worthwhile given that they could be implemented in low-power neuromorphic hardware, which is currently undergoing important developments, in particular with analog miniaturized memristive devices able to mimic synaptic plasticity. In this work, we chose to develop an STDP neural network to perform a specific task: spike sorting, a crucial problem in neuroscience. Brain implants based on microelectrode arrays can record the activity of individual neurons, which appears in the recorded signal as peaked potential variations called action potentials. However, several neurons can be recorded by the same electrode. The goal of spike sorting is to extract and separate the activity of different neural cells from a common extracellular recording, taking advantage of the fact that the shape of an action potential on an electrode depends on the neuron it stems from. Spike sorting can thus be seen as an unsupervised pattern recognition task where the goal is to detect and classify different waveforms.
Most classical spike-sorting approaches use three separate steps: detecting all action potentials in the signal, extracting features that characterize their shapes, and separating these features into clusters that should correspond to different neural cells. Though online methods exist, the most widespread spike-sorting methods are offline or require an offline preprocessing step, which is not compatible with online applications such as brain-computer interfaces (BCIs). Moreover, the development of ever-larger microelectrode arrays creates a need for fully automatic and computationally efficient algorithms. Using an STDP network brings a new approach to meeting these requirements. We designed a network that takes the electrode signal as input and outputs spikes corresponding to the spiking activity of the recorded neural cells. It is organized into several layers, designed to achieve different processing steps, connected in a feedforward way. The first layer, composed of neurons acting as sensory neurons, converts the input signal into spike trains. The following layers learn patterns from the previous layer thanks to STDP rules. Each layer implements different mechanisms that improve its performance, such as resource-dependent STDP, intrinsic plasticity, plasticity triggered by inhibition, or neuron models with rebound-spiking properties. An attention mechanism makes the network sensitive only to the parts of the signal containing action potentials. The network was first designed to process data from a single electrode, then adapted to process data from multiple electrodes. It was tested on simulated data, which allowed us to compare the network output to the known ground truth, and on real extracellular recordings associated with intracellular recordings that give an incomplete ground truth. Different versions of the network were evaluated, compared to other spike-sorting algorithms, and found to give very satisfying results.
Following these software simulations, we initiated an FPGA implementation of the method, which constitutes a first step toward an embedded neuromorphic implementation.
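The three classical spike-sorting steps the abstract contrasts with the STDP approach (detection, feature extraction, clustering) can be sketched in miniature. This is a toy illustration under simplifying assumptions (raw snippets as features, naive k-means), not the thesis's algorithm or any standard toolbox:

```python
import numpy as np

def detect_spikes(signal, threshold, window=8):
    """Step 1: find upward threshold crossings, enforcing a dead time
    of one window length between successive detections."""
    crossings = np.flatnonzero((signal[1:] > threshold) & (signal[:-1] <= threshold))
    kept, last = [], -window
    for i in crossings:
        if i - last >= window:
            kept.append(int(i))
            last = i
    return kept

def extract_features(signal, spike_idx, window=8):
    """Step 2: use each raw waveform snippet as its feature vector
    (a stand-in for PCA or wavelet features)."""
    return np.array([signal[i:i + window] for i in spike_idx], dtype=float)

def cluster_spikes(features, k=2, iters=20, seed=0):
    """Step 3: naive k-means; each cluster approximates one neuron."""
    rng = np.random.default_rng(seed)
    centers = features[rng.choice(len(features), size=k, replace=False)].copy()
    labels = np.zeros(len(features), dtype=int)
    for _ in range(iters):
        labels = np.argmin(((features[:, None, :] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = features[labels == j].mean(axis=0)
    return labels
```

An STDP network replaces this pipeline with a single event-driven system: the sensory layer plays the role of detection, and the plastic layers learn the waveform clusters online and without supervision.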
Masquelier, Timothée. "Mécanismes d'apprentissage pour expliquer la rapidité, la sélectivité et l'invariance des réponses dans le cortex visuel." Phd thesis, Université Paul Sabatier - Toulouse III, 2008. http://tel.archives-ouvertes.fr/tel-00271070.
Helson, Pascal. "Étude de la plasticité pour des neurones à décharge en interaction." Thesis, Université Côte d'Azur, 2021. http://www.theses.fr/2021COAZ4013.
Full textIn this thesis, we study a phenomenon that may be responsible for our memory capacity: the synaptic plasticity. It modifies the links between neurons over time. This phenomenon is stochastic: it is the result of a series of diverse and numerous chemical processes. The aim of the thesis is to propose a model of plasticity for interacting spiking neurons. The main difficulty is to find a model that satisfies the following conditions: it must be both consistent with the biological results of the field and simple enough to be studied mathematically and simulated with a large number of neurons.In a first step, from a rather simple model of plasticity, we study the learning of external signals by a neural network as well as the forgetting time of this signal when the network is subjected to other signals (noise). The mathematical analysis allows us to control the probability to misevaluate the signal. From this, we deduce explicit bounds on the time during which a given signal is kept in memory.Next, we propose a model based on stochastic rules of plasticity as a function of the occurrence time of the neural electrical discharges (Spike Timing Dependent Plasticity, STDP). This model is described by a Piecewise Deterministic Markov Process (PDMP). The long time behaviour of such a neural network is studied using a slow-fast analysis. In particular, sufficient conditions are established under which the process associated with synaptic weights is ergodic. Finally, we make the link between two levels of modelling: the microscopic and the macroscopic approaches. Starting from the dynamics presented at a microscopic level (neuron model and its interaction with other neurons), we derive an asymptotic dynamics which represents the evolution of a typical neuron and its incoming synaptic weights: this is the mean field analysis of the model. 
We thus condense the information on the dynamics of the weights and that of the neurons into a single equation, that of a typical neuron
Lecerf, Gwendal. "Développement d'un réseau de neurones impulsionnels sur silicium à synapses memristives." Thesis, Bordeaux, 2014. http://www.theses.fr/2014BORD0219/document.
Supported financially by the ANR MHANN project, this work proposes a spiking neural network architecture for recognizing pictures, a task at which traditional processing units are inefficient. In 2008, a new passive electrical component was discovered: the memristor. Its resistance can be adjusted by applying a potential between its terminals. Behaving intrinsically as artificial synapses, memristive devices can be used inside artificial neural networks. We measured the variation in resistance of a ferroelectric memristor (obtained from UMjCNRS/Thalès) and found it similar to the biological STDP (Spike Timing Dependent Plasticity) law used with spiking neurons. With our measurements on the memristor and our network simulations (aided by INRIA Saclay), we designed two successive versions of the IC, the second driven by the specifications of the first with additional functionalities. The second IC contains two layers of a spiking neural network dedicated to learning a picture of 81 pixels. A demonstrator of hybrid neural networks will be achieved by integrating a memristive crossbar chip interfaced with the second IC.
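The STDP-like behaviour of such a memristive synapse can be caricatured by a soft-bounded conductance update: the device's conductance is nudged by an exponentially decaying function of the pre/post spike-time difference, and the update shrinks as the conductance approaches its physical limits. All parameter values here are illustrative assumptions, not measurements from the ferroelectric device:

```python
import math

class MemristiveSynapse:
    """Toy memristive synapse: conductance g bounded in [g_min, g_max],
    updated by an STDP-shaped function of delta_t = t_post - t_pre (ms).
    Parameters are illustrative, not fitted to any real device."""

    def __init__(self, g=0.5, g_min=0.0, g_max=1.0, a=0.05, tau=20.0):
        self.g, self.g_min, self.g_max = g, g_min, g_max
        self.a, self.tau = a, tau

    def update(self, delta_t):
        if delta_t > 0:
            # potentiation, scaled by the remaining headroom to g_max
            self.g += self.a * math.exp(-delta_t / self.tau) * (self.g_max - self.g)
        else:
            # depression, scaled by the distance to the floor g_min
            self.g -= self.a * math.exp(delta_t / self.tau) * (self.g - self.g_min)
        return self.g
```

The multiplicative bound terms keep the conductance inside its physical range without any explicit clipping, which is one common way to model the saturation observed in real devices.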
Book chapters on the topic "STDP [Spike Timing Dependent Plasticity]"
Börgers, Christoph. "Spike Timing-Dependent Plasticity (STDP)." In An Introduction to Modeling Neuronal Dynamics, 349–59. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-51171-9_40.
Griffith, Thom, Jack Mellor, and Krasimira Tsaneva-Atanasova. "Spike Timing-Dependent Plasticity (STDP), Biophysical Models." In Encyclopedia of Computational Neuroscience, 1–5. New York, NY: Springer New York, 2014. http://dx.doi.org/10.1007/978-1-4614-7320-6_359-1.
Griffith, Thom, Jack Mellor, and Krasimira Tsaneva-Atanasova. "Spike-Timing Dependent Plasticity (STDP), Biophysical Models." In Encyclopedia of Computational Neuroscience, 2803–7. New York, NY: Springer New York, 2015. http://dx.doi.org/10.1007/978-1-4614-6675-8_359.
Koprinkova-Hristova, Petia, Nadejda Bocheva, Simona Nedelcheva, Miroslava Stefanova, Bilyana Genova, Radoslava Kraleva, and Velin Kralev. "STDP Plasticity in TRN Within Hierarchical Spike Timing Model of Visual Information Processing." In IFIP Advances in Information and Communication Technology, 279–90. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-49161-1_24.
Buonomano, D. V., and T. P. Carvalho. "Spike-Timing-Dependent Plasticity (STDP)." In Encyclopedia of Neuroscience, 265–68. Elsevier, 2009. http://dx.doi.org/10.1016/b978-008045046-9.00822-6.
Full textConference papers on the topic "STDP [Spike Timing Dependant Plasticity]"
Kiani, Mahdi, Nan Du, Danilo Burger, Ilona Skorupa, Ramona Ecke, Stefan E. Schulz, and Heidemarie Schmidt. "Electroforming-free BiFeO3 switches for neuromorphic computing: Spike-timing dependent plasticity (STDP) and cycle-number dependent plasticity (CNDP)." In 2019 26th IEEE International Conference on Electronics, Circuits and Systems (ICECS). IEEE, 2019. http://dx.doi.org/10.1109/icecs46596.2019.8965060.
Di Hu, Xu Zhang, Ziye Xu, Silvia Ferrari, and Pinaki Mazumder. "Digital implementation of a spiking neural network (SNN) capable of spike-timing-dependent plasticity (STDP) learning." In 2014 IEEE 14th International Conference on Nanotechnology (IEEE-NANO). IEEE, 2014. http://dx.doi.org/10.1109/nano.2014.6968000.
Marukame, T., R. Ichihara, M. Mori, Y. Nishi, S. Yasuda, T. Tanamoto, and Y. Mitani. "Artificial neuron operations and spike-timing-dependent plasticity (STDP) using memristive devices for brain-inspired computing." In 2017 International Conference on Solid State Devices and Materials. The Japan Society of Applied Physics, 2017. http://dx.doi.org/10.7567/ssdm.2017.d-4-03.
Qi, Yu, Jiangrong Shen, Yueming Wang, Huajin Tang, Hang Yu, Zhaohui Wu, and Gang Pan. "Jointly Learning Network Connections and Link Weights in Spiking Neural Networks." In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/221.
Zhang, Tielin, Yi Zeng, Dongcheng Zhao, and Bo Xu. "Brain-inspired Balanced Tuning for Spiking Neural Networks." In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/229.
Sarim, Mohammad, Thomas Schultz, Manish Kumar, and Rashmi Jha. "An Artificial Brain Mechanism to Develop a Learning Paradigm for Robot Navigation." In ASME 2016 Dynamic Systems and Control Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/dscc2016-9903.