
Journal articles on the topic 'The NEURON simulator'

Consult the top 50 journal articles for your research on the topic 'The NEURON simulator.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

Zeguang, LI, Jun Sun, Chunlin Wei, Zhe Sui, and Xiaoye Qian. "RESEARCH ON THE CROSS-SECTION GENERATING METHOD IN HTGR SIMULATOR BASED ON MACHINE LEARNING METHODS." EPJ Web of Conferences 247 (2021): 02039. http://dx.doi.org/10.1051/epjconf/202124702039.

Full text
Abstract:
With the increasing need for accurate simulation, a 3-D diffusion reactor physics module has been implemented in the HTGR engineering simulator to give better neutron dynamics results than the point kinetics model used in previous nuclear power plant simulators. To meet the real-time calculation requirement of a nuclear power plant simulator, the cross-sections used in the 3-D diffusion module must be calculated very efficiently. Normally, each cross-section in the simulator is calculated as a polynomial function of several variables of concern, whose expression is finalized by multivariate regression over a large scattered database generated by previous calculations. Since the polynomial is explicit and prepared in advance, the cross-sections can be calculated quickly enough in the running simulator and achieve acceptable accuracy, especially in LWR simulations. However, some of the variables of concern in HTGR span large ranges, and their relationships are non-linear and very complex, so it is very hard for a polynomial to achieve full-range accuracy. In this paper, a cross-section generating method for the HTGR simulator is proposed, based on machine learning methods, specifically deep neural networks and tree regression. The method first uses deep neural networks to capture the nonlinear relationships between the variables and then uses tree regression to achieve accurate cross-section results over the full range; the parameters of the deep neural networks and the tree regression are learned automatically from the scattered database generated by VSOP. Numerical tests show that the proposed method yields more accurate cross-section results with a calculation time acceptable to the simulator.
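The polynomial baseline the abstract describes can be sketched as an ordinary least-squares fit over monomial features of the state variables; the function names and variables below are illustrative, not from the paper:

```python
import numpy as np
from itertools import combinations_with_replacement

def fit_polynomial_xs(states, xs, degree=2):
    """Fit a polynomial cross-section model by least squares.

    states : (n_samples, n_vars) array of state variables
             (e.g. fuel temperature, moderator temperature)
    xs     : (n_samples,) cross-section values from the scattered database
    Returns (coefficients, monomial index tuples).
    """
    n, d = states.shape
    cols, terms = [np.ones(n)], [()]
    # Design matrix of monomials: 1, x_i, x_i*x_j, ... up to `degree`
    for deg in range(1, degree + 1):
        for idx in combinations_with_replacement(range(d), deg):
            col = np.ones(n)
            for i in idx:
                col = col * states[:, i]
            cols.append(col)
            terms.append(idx)
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, xs, rcond=None)
    return coef, terms

def eval_polynomial_xs(state, coef, terms):
    """Evaluate the fitted polynomial at one state point (fast at run time)."""
    total = 0.0
    for c, idx in zip(coef, terms):
        v = c
        for i in idx:
            v = v * state[i]
        total += v
    return total
```

When the state variables span large ranges with complex interactions, as in HTGR, a fixed-degree fit like this loses accuracy, which is what motivates the paper's deep-network plus tree-regression approach.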
APA, Harvard, Vancouver, ISO, and other styles
2

Plesser, Hans E., and Markus Diesmann. "Simplicity and Efficiency of Integrate-and-Fire Neuron Models." Neural Computation 21, no. 2 (February 2009): 353–59. http://dx.doi.org/10.1162/neco.2008.03-08-731.

Full text
Abstract:
Lovelace and Cios (2008) recently proposed a very simple spiking neuron (VSSN) model for simulations of large neuronal networks as an efficient replacement for the integrate-and-fire neuron model. We argue that the VSSN model falls behind key advances in neuronal network modeling over the past 20 years, in particular, techniques that permit simulators to compute the state of the neuron without repeated summation over the history of input spikes and to integrate the subthreshold dynamics exactly. State-of-the-art solvers for networks of integrate-and-fire model neurons are substantially more efficient than the VSSN simulator and allow routine simulations of networks of some 10^5 neurons and 10^9 connections on moderate computer clusters.
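The exact-integration technique the authors refer to can be illustrated for a leaky integrate-and-fire neuron: between input events the subthreshold dynamics have a closed-form solution, so no summation over the spike history is required. A minimal sketch under simplifying assumptions (current-based delta synapses; parameter values illustrative):

```python
import numpy as np

def lif_exact_step(V, dt, tau=10.0, E=-70.0, RI=0.0):
    """Advance the membrane potential by dt using the exact solution of
    tau dV/dt = -(V - E) + R*I  for constant input current I."""
    Vinf = E + RI                          # steady-state voltage
    return Vinf + (V - Vinf) * np.exp(-dt / tau)

def run_lif(spike_times, weights, tau=10.0, E=-70.0, RI=0.0,
            Vth=-54.0, Vreset=-70.0):
    """Event-driven LIF: decay exactly between input spikes, then apply
    each synaptic kick; no per-timestep history summation."""
    V, t = E, 0.0
    out_spikes = []
    for ts, w in sorted(zip(spike_times, weights)):
        V = lif_exact_step(V, ts - t, tau, E, RI)  # exact decay to event
        V += w                                      # synaptic increment
        t = ts
        if V >= Vth:
            out_spikes.append(t)
            V = Vreset
    return V, out_spikes
```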
3

Kleijnen, Robert, Markus Robens, Michael Schiek, and Stefan van Waasen. "A Network Simulator for the Estimation of Bandwidth Load and Latency Created by Heterogeneous Spiking Neural Networks on Neuromorphic Computing Communication Networks." Journal of Low Power Electronics and Applications 12, no. 2 (April 21, 2022): 23. http://dx.doi.org/10.3390/jlpea12020023.

Full text
Abstract:
Accelerated simulations of biological neural networks are in demand to discover the principles of biological learning. Novel many-core simulation platforms, e.g., SpiNNaker, BrainScaleS and Neurogrid, allow one to study neuron behavior in the brain at an accelerated rate, with a high level of detail. However, they do not come anywhere near simulating the human brain. The massive amount of spike communication has turned out to be a bottleneck. We specifically developed a network simulator to analyze in high detail the network loads and latencies caused by different network topologies and communication protocols in neuromorphic computing communication networks. This simulator allows simulating the impacts of heterogeneous neural networks and evaluating neuron mapping algorithms, which is a unique feature among state-of-the-art network models and simulators. The simulator was cross-checked by comparing the results of a homogeneous neural network-based run with corresponding bandwidth load results from comparable works. Additionally, the increased level of detail achieved by the new simulator is presented. Then, we show the impact heterogeneous connectivity can have on the network load, first for a small-scale test case, and later for a large-scale test case, and how different neuron mapping algorithms can influence this effect. Finally, we look at the latency estimations performed by the simulator for different mapping algorithms, and the impact of the node size.
4

Holker, Ruchi, and Seba Susan. "Neuroscience-Inspired Parameter Selection of Spiking Neuron Using Hodgkin Huxley Model." International Journal of Software Science and Computational Intelligence 13, no. 2 (April 2021): 89–106. http://dx.doi.org/10.4018/ijssci.2021040105.

Full text
Abstract:
Spiking neural networks (SNN) are currently being researched to design an artificial brain to teach it how to think, perform, and learn like a human brain. This paper focuses on exploring optimal values of parameters of biological spiking neurons for the Hodgkin Huxley (HH) model. The HH model exhibits maximum number of neurocomputational properties as compared to other spiking models, as per previous research. This paper investigates the HH model parameters of Class 1, Class 2, phasic spiking, and integrator neurocomputational properties. For the simulation of spiking neurons, the NEURON simulator is used since it is easy to understand and code.
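For readers unfamiliar with the model, the Hodgkin-Huxley equations the paper explores can be sketched in a few lines (standard squid-axon parameters, forward-Euler integration; the study itself uses the NEURON simulator, not this toy integrator):

```python
import numpy as np

def hh_sim(I=10.0, T=50.0, dt=0.01):
    """Minimal single-compartment Hodgkin-Huxley neuron.
    I in uA/cm^2, T and dt in ms; returns the voltage trace (mV)."""
    gNa, gK, gL = 120.0, 36.0, 0.3        # max conductances, mS/cm^2
    ENa, EK, EL = 50.0, -77.0, -54.4      # reversal potentials, mV
    C = 1.0                                # membrane capacitance, uF/cm^2

    def am(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
    def bm(V): return 4.0 * np.exp(-(V + 65.0) / 18.0)
    def ah(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
    def bh(V): return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
    def an(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
    def bn(V): return 0.125 * np.exp(-(V + 65.0) / 80.0)

    V = -65.0                              # start at rest
    m = am(V) / (am(V) + bm(V))            # gates at steady state
    h = ah(V) / (ah(V) + bh(V))
    n = an(V) / (an(V) + bn(V))
    Vtrace = []
    for _ in range(int(T / dt)):
        INa = gNa * m**3 * h * (V - ENa)
        IK = gK * n**4 * (V - EK)
        IL = gL * (V - EL)
        V += dt * (I - INa - IK - IL) / C
        m += dt * (am(V) * (1.0 - m) - bm(V) * m)
        h += dt * (ah(V) * (1.0 - h) - bh(V) * h)
        n += dt * (an(V) * (1.0 - n) - bn(V) * n)
        Vtrace.append(V)
    return np.array(Vtrace)
```

With a suprathreshold current (e.g. 10 uA/cm^2) the trace shows repetitive spiking; the paper's parameter exploration varies the channel parameters to reproduce Class 1, Class 2, phasic, and integrator behaviors.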
5

Zheng, Zhu An, Chuan Xue Song, Hui Lin, and Si Lun Peng. "Research of the Brake Pedal Feel on Wire-by-Brake-System." Advanced Materials Research 655-657 (January 2013): 1131–35. http://dx.doi.org/10.4028/www.scientific.net/amr.655-657.1131.

Full text
Abstract:
This paper analyzes and compares conventional brake systems with a brake-by-wire system equipped with a pedal stroke simulator. The pedal stroke simulator model is established in the AMESim software and combined with Matlab/Simulink to design a single-neuron adaptive intelligent PID control strategy for the pedal stroke simulator. Simulation verification shows that this brake-by-wire system and its control strategy can meet the brake pedal feel requirements of conventional brake systems and effectively improve comfort during braking.
6

Anghel, Daniel Constantin, and Nadia Belu. "Contributions to Ranking an Ergonomic Workstation, Considering the Human Effort and the Microclimate Parameters, Using Neural Networks." Applied Mechanics and Materials 371 (August 2013): 812–16. http://dx.doi.org/10.4028/www.scientific.net/amm.371.812.

Full text
Abstract:
The paper presents a method of using a feed-forward neural network to rank a workstation in the manufacturing industry. Neural networks excel at capturing difficult non-linear relationships between the inputs and outputs of a system. The neural network is simulated with a simple simulator: SSNN. In this paper, we considered 6 input parameters as relevant for ranking a workstation: temperature, humidity, noise, luminosity, load, and frequency. The neural network designed for the study presented in this paper has 6 input neurons, 13 neurons in the hidden layer, and 1 neuron in the output layer. We also present some experimental results obtained through simulations.
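The 6-13-1 architecture described in the abstract amounts to a small forward pass; the weights below are random placeholders standing in for the trained values, and the sigmoid activation is an assumption:

```python
import numpy as np

def mlp_rank(x, W1, b1, W2, b2):
    """Forward pass of a 6-13-1 feed-forward network: 6 inputs
    (temperature, humidity, noise, luminosity, load, frequency),
    13 hidden sigmoid units, 1 output neuron giving the rank."""
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    hidden = sig(W1 @ x + b1)      # hidden layer, shape (13,)
    return sig(W2 @ hidden + b2)   # output in (0, 1), shape (1,)
```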
7

Lytton, William W., Alexandra H. Seidenstein, Salvador Dura-Bernal, Robert A. McDougal, Felix Schürmann, and Michael L. Hines. "Simulation Neurotechnologies for Advancing Brain Research: Parallelizing Large Networks in NEURON." Neural Computation 28, no. 10 (October 2016): 2063–90. http://dx.doi.org/10.1162/neco_a_00876.

Full text
Abstract:
Large multiscale neuronal network simulations are of increasing value as more big data are gathered about brain wiring and organization under the auspices of a current major research initiative, such as Brain Research through Advancing Innovative Neurotechnologies. The development of these models requires new simulation technologies. We describe here the current use of the NEURON simulator with message passing interface (MPI) for simulation in the domain of moderately large networks on commonly available high-performance computers (HPCs). We discuss the basic layout of such simulations, including the methods of simulation setup, the run-time spike-passing paradigm, and postsimulation data storage and data management approaches. Using the Neuroscience Gateway, a portal for computational neuroscience that provides access to large HPCs, we benchmark simulations of neuronal networks of different sizes (500–100,000 cells), and using different numbers of nodes (1–256). We compare three types of networks, composed of either Izhikevich integrate-and-fire neurons (I&F), single-compartment Hodgkin-Huxley (HH) cells, or a hybrid network with half of each. Results show simulation run time increased approximately linearly with network size and decreased almost linearly with the number of nodes. Networks with I&F neurons were faster than HH networks, although differences were small since all tested cells were point neurons with a single compartment.
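The simulation setup described here typically distributes cells across MPI ranks by global cell ID (gid); round-robin assignment is a common pattern in parallel NEURON models. A plain-Python sketch without MPI, with illustrative names:

```python
def assign_gids(n_cells, n_ranks):
    """Round-robin distribution of global cell IDs across ranks:
    each rank creates only the cells with gid % n_ranks == rank,
    and spikes are later exchanged between ranks by gid."""
    return {rank: [gid for gid in range(n_cells) if gid % n_ranks == rank]
            for rank in range(n_ranks)}
```

In a real run each rank would query its own index from MPI and build only its slice of the network; the dictionary here just makes the partition visible.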
8

Mohamed, Z., M. A Ayub, M. H.M Ramli, M. S.B Shaari, and S. Khusairi. "Assistive Robot Simulator for Multi-Objective Evolutionary Algorithm Application." International Journal of Engineering & Technology 7, no. 4.27 (November 30, 2018): 153. http://dx.doi.org/10.14419/ijet.v7i4.27.22506.

Full text
Abstract:
This paper presents a new assistive robot simulator for multi-objective optimization applications. The main function of the simulator is to simulate the trajectory of the robot arm as it moves from an initial to a goal position in an optimized manner. A multi-objective evolutionary algorithm (MOEA) is utilized to generate the robot arm motion, optimizing three different objective functions: optimum time, distance, and high stability. The generated neuron will be selected from the Pareto optimal set based on the required objective functions. The robot will intelligently choose the best neuron for a specific task. For example, moving a glass of water requires higher stability compared to moving an empty mineral water bottle. The simulator will be connected to the real robot to test the performance in a real environment. The kinematics, mechatronics, and the real robot specification are utilized in the simulator. The performance of the simulator is presented in this paper.
9

HAMMARLUND, PER, BJÖRN LEVIN, and ANDERS LANSNER. "BIOLOGICALLY REALISTIC AND ARTIFICIAL NEURAL NETWORK SIMULATORS ON THE CONNECTION MACHINE." International Journal of Modern Physics C 04, no. 01 (February 1993): 49–63. http://dx.doi.org/10.1142/s0129183193000070.

Full text
Abstract:
We describe two neural network (NN) simulators implemented on the Connection Machine (CM). The first program is aimed at biologically realistic simulations and the second at recurrent artificial NNs. Both programs are currently used as simulation engines in research within the SANS group as well as in other groups. The program for biologically realistic NN simulations on the CM is called BIOSIM. The aim is to simulate NNs in which the neurons are modeled with a high degree of biological realism. The cell model used is a compartmentalized abstraction of the neuron. It includes sodium, potassium, calcium, and calcium dependent potassium channels. Synaptic interaction includes conventional chemical synapses as well as voltage gated NMDA synapses. On a CM with 8K processors the program is typically capable of handling some tens of thousands of compartments and more than ten times as many synapses. The artificial NN simulator implements the SANS model, a recurrent NN model closely related to the Hopfield model. The aim has been to effectively support large network simulations, in the order of 8–16K units, on an 8K CM. To make the simulator optimal for different applications, it supports both fully and sparsely connected networks. The implementation for sparsely connected NNs uses a compacted weight matrix. Both implementations are optimized for sparse activity.
10

Belyaev and Velichko. "A Spiking Neural Network Based on the Model of VO2—Neuron." Electronics 8, no. 10 (September 20, 2019): 1065. http://dx.doi.org/10.3390/electronics8101065.

Full text
Abstract:
In this paper, we present an electrical circuit of a leaky integrate-and-fire neuron with one VO2 switch, which models the properties of biological neurons. Based on VO2 neurons, a two-layer spiking neural network consisting of nine input and three output neurons is modeled in the SPICE simulator. The network contains excitatory and inhibitory couplings, and implements the winner-takes-all principle in pattern recognition. Using a supervised Spike-Timing-Dependent Plasticity training method and a timing method of information coding, the network was trained to recognize three patterns with dimensions of 3 × 3 pixels. The neural network is able to recognize up to 10^5 images per second, and has the potential to increase the recognition speed further.
11

Yalçın, Nedim Aktan, and Fahri Vatansever. "Educational Simulator for Frequency Estimation using ANN." Academic Perspective Procedia 3, no. 1 (October 25, 2020): 331–36. http://dx.doi.org/10.33793/acperpro.03.01.65.

Full text
Abstract:
In this study, an educational simulator for frequency estimation of signals is realized with artificial neural networks. Artificial neural networks are used for training Prony coefficients. The designed simulator is written in MATLAB, and the effect of neural network parameters (cost function, activation function, neuron count, etc.) on learning ability can be compared. Besides educational purposes, the developed simulator can be used by engineers to create frequency estimators in practical studies.
12

Lytton, William W. "Neural Query System: Data-Mining From Within the NEURON Simulator." Neuroinformatics 4, no. 2 (2006): 163–76. http://dx.doi.org/10.1385/ni:4:2:163.

Full text
13

Makino, T. "A Discrete-Event Neural Network Simulator for General Neuron Models." Neural Computing & Applications 11, no. 3-4 (June 1, 2003): 210–23. http://dx.doi.org/10.1007/s00521-003-0358-z.

Full text
14

Ziv, I., D. A. Baxter, and J. H. Byrne. "Simulator for neural networks and action potentials: description and application." Journal of Neurophysiology 71, no. 1 (January 1, 1994): 294–308. http://dx.doi.org/10.1152/jn.1994.71.1.294.

Full text
Abstract:
1. We describe a simulator for neural networks and action potentials (SNNAP) that can simulate up to 30 neurons, each with up to 30 voltage-dependent conductances, 30 electrical synapses, and 30 multicomponent chemical synapses. Voltage-dependent conductances are described by Hodgkin-Huxley type equations, and the contributions of time-dependent synaptic conductances are described by second-order differential equations. The program also incorporates equations for simulating different types of neural modulation and synaptic plasticity. 2. Parameters, initial conditions, and output options for SNNAP are passed to the program through a number of modular ASCII files. These modules can be modified by commonly available text editors that use a conventional (i.e., character based) interface or by an editor incorporated into SNNAP that uses a graphical interface. The modular design facilitates the incorporation of existing modules into new simulations. Thus libraries can be developed of files describing distinctive cell types and files describing distinctive neural networks. 3. Several different types of neurons with distinct biophysical properties and firing properties were simulated by incorporating different combinations of voltage-dependent Na+, Ca2+, and K+ channels as well as Ca(2+)-activated and Ca(2+)-inactivated channels. Simulated cells included those that respond to depolarization with tonic firing, adaptive firing, or plateau potentials as well as endogenous pacemaker and bursting cells. 4. Several types of simple neural networks were simulated that included feed-forward excitatory and inhibitory chemical synaptic connections, a network of electrically coupled cells, and a network with feedback chemical synaptic connections that simulated rhythmic neural activity. In addition, with the use of the equations describing electrical coupling, current flow in a branched neuron with 18 compartments was simulated. 5. Enhancement of excitability and enhancement of transmitter release, produced by modulatory transmitters, were simulated by second-messenger-induced modulation of K+ currents. A depletion model for synaptic depression was also simulated. 6. We also attempted to simulate the features of a more complicated central pattern generator, inspired by the properties of neurons in the buccal ganglia of Aplysia. Dynamic changes in the activity of this central pattern generator were produced by a second-messenger-induced modulation of a slow inward current in one of the neurons.
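The second-order synaptic conductance mentioned in point 1 yields the familiar alpha-shaped conductance waveform. A minimal sketch (not SNNAP code; the impulse scaling is chosen so the peak equals gmax):

```python
import numpy as np

def alpha_synapse(spike_times, T=50.0, dt=0.01, tau=2.0, gmax=1.0):
    """Synaptic conductance from a second-order linear ODE driven by
    spike impulses, integrated as a first-order system with Euler:
        z' = -z/tau,   g' = -g/tau + z
    Each spike adds gmax*e/tau to z, giving g(t) = gmax*(t/tau)*e^(1-t/tau)
    for an isolated spike (peak gmax at t = tau)."""
    n = int(T / dt)
    g = np.zeros(n)
    z, gv = 0.0, 0.0
    spikes = sorted(spike_times)
    k = 0
    for i in range(n):
        t = i * dt
        while k < len(spikes) and spikes[k] <= t:
            z += gmax * np.e / tau      # impulse so that peak(g) = gmax
            k += 1
        z += dt * (-z / tau)
        gv += dt * (-gv / tau + z)
        g[i] = gv
    return g
```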
15

Aonishi, Toru, Hiroyoshi Miyakawa, Masashi Inoue, and Masato Okada. "Sharing Models and Theories via NEURON Simulator: For Understanding the Dendrite." Brain & Neural Networks 12, no. 2 (2005): 100–106. http://dx.doi.org/10.3902/jnns.12.100.

Full text
16

Ahmed, Duraid F., and Ali H. Khalaf. "Development of Artificial Neural Network Model of Crude Oil Distillation Column." Tikrit Journal of Engineering Sciences 22, no. 1 (April 1, 2015): 24–37. http://dx.doi.org/10.25130/tjes.22.1.03.

Full text
Abstract:
An artificial neural network in the MATLAB simulator is used to model the Baiji crude oil distillation unit based on data generated from the Aspen-HYSYS simulator. Thirteen inputs, six outputs, and over 1487 data sets are used to model the actual unit. A nonlinear autoregressive network with exogenous inputs (NARX) and the backpropagation algorithm are used for training. Seventy percent of the data are used for training the network, while the remaining thirty percent are used for testing and validating the network to determine its prediction accuracy. One hidden layer with 34 hidden neurons is used for the proposed network, and an MSE of 0.25 is obtained. The number of neurons is selected based on the lowest MSE for the network. The model was found to predict the optimal operating conditions for different objective functions within the training limits, since ANN models are poor extrapolators: they are usually only reliable within the range of data on which they have been trained.
17

Song, Xiaoxiao, Luis Valencia-Cabrera, Hong Peng, Jun Wang, and Mario J. Pérez-Jiménez. "Spiking Neural P Systems with Delay on Synapses." International Journal of Neural Systems 31, no. 01 (July 23, 2020): 2050042. http://dx.doi.org/10.1142/s0129065720500422.

Full text
Abstract:
Based on the feature and communication of neurons in animal neural systems, spiking neural P systems (SN P systems) were proposed as a kind of powerful computing model. Considering the length of axons and the information transmission speed on synapses, SN P systems with delay on synapses (SNP-DS systems) are proposed in this work. Unlike the traditional SN P systems, where all the postsynaptic neurons receive spikes at the same instant from their presynaptic neuron, the postsynaptic neurons in SNP-DS systems would receive spikes at different instants, depending on the delay time on the synapses connecting them. It is proved that the SNP-DS systems are universal as number generators. Two small universal SNP-DS systems, with standard or extended rules, are constructed to compute functions, using 56 and 36 neurons, respectively. Moreover, a simulator has been provided, in order to check the correctness of these two SNP-DS systems, thus providing an experimental validation of the universality of the systems designed.
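The distinguishing feature of SNP-DS systems, the same spike reaching different postsynaptic neurons at different instants, can be sketched with a small event queue (illustrative only; this does not model the systems' rewriting rules):

```python
import heapq

def simulate_delays(synapses, initial_spikes, t_max=100):
    """Discrete-event sketch of spike delivery with per-synapse delays.
    synapses: list of (src, dst, delay); initial_spikes: list of
    (time, src). Returns sorted (arrival_time, dst) records, showing
    that postsynaptic neurons receive a spike at different instants
    depending on the delay of the synapse connecting them."""
    events = list(initial_spikes)
    heapq.heapify(events)
    delivered = []
    while events:
        t, src = heapq.heappop(events)
        if t > t_max:
            break
        for s, dst, d in synapses:
            if s == src:
                delivered.append((t + d, dst))
    return sorted(delivered)
```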
18

SPILIOTIS, KONSTANTINOS G., and CONSTANTINOS I. SIETTOS. "MULTISCALE COMPUTATIONS ON NEURAL NETWORKS: FROM THE INDIVIDUAL NEURON INTERACTIONS TO THE MACROSCOPIC-LEVEL ANALYSIS." International Journal of Bifurcation and Chaos 20, no. 01 (January 2010): 121–34. http://dx.doi.org/10.1142/s0218127410025442.

Full text
Abstract:
We show how the Equation-Free approach for multiscale computations can be exploited to systematically study the dynamics of neural interactions on a random regular connected graph under a pairwise representation perspective. Using an individual-based microscopic simulator as a black box coarse-grained timestepper and with the aid of Simulated Annealing we compute the coarse-grained equilibrium bifurcation diagram and analyze the stability of the stationary states, sidestepping the necessity of obtaining explicit closures at the macroscopic level. We also exploit the scheme to perform a rare-events analysis by estimating an effective Fokker–Planck equation describing the evolving probability density function of the corresponding coarse-grained observables.
19

Kim, Hyun, Nhayoung Hong, Myungjoon Kim, Sang Yoon, Hyeong Yu, Hyoun-Joong Kong, Su-Jin Kim, et al. "Application of a Perception Neuron® System in Simulation-Based Surgical Training." Journal of Clinical Medicine 8, no. 1 (January 21, 2019): 124. http://dx.doi.org/10.3390/jcm8010124.

Full text
Abstract:
While multiple studies show that simulation methods help in educating surgical trainees, few studies have focused on developing systems that help trainees to adopt the most effective body motions. This is the first study to use a Perception Neuron® system to evaluate the relationship between body motions and simulation scores. Ten medical students participated in this study. All completed two standard tasks with the da Vinci Skills Simulator (dVSS) and five standard tasks with a thyroidectomy training model. This was repeated. Thyroidectomy training was conducted while participants wore the Perception Neuron® suit. A motion capture (MC) score indicating how long the tasks took to complete and how much economy-of-motion each participant used was calculated. Correlations between the three scores were assessed by Pearson's correlation analyses. The 20 trials were categorized as low, moderate, and high overall-proficiency by summing the training model, dVSS, and MC scores. The difference between the low and high overall-proficiency trials in terms of economy-of-motion of the left or right hand was assessed by a two-tailed t-test. Relative to cycle 1, the training model, dVSS, and MC scores all increased significantly in cycle 2. The three scores correlated significantly with each other. Six, eight, and six trials were classified as low, moderate, and high overall-proficiency, respectively. Low- and high-scoring trials differed significantly in terms of right (dominant) hand economy-of-motion (675.2 mm and 369.4 mm, respectively) (p = 0.043). The Perception Neuron® system can be applied to simulation-based training of surgical trainees. The motion analysis score is related to the traditional scoring system.
20

Lytton, William W., Ahmet Omurtag, Samuel A. Neymotin, and Michael L. Hines. "Just-in-Time Connectivity for Large Spiking Networks." Neural Computation 20, no. 11 (November 2008): 2745–56. http://dx.doi.org/10.1162/neco.2008.10-07-622.

Full text
Abstract:
The scale of large neuronal network simulations is memory limited due to the need to store connectivity information: connectivity storage grows as the square of neuron number up to anatomically relevant limits. Using the NEURON simulator as a discrete-event simulator (no integration), we explored the consequences of avoiding the space costs of connectivity through regenerating connectivity parameters when needed: just in time after a presynaptic cell fires. We explored various strategies for automated generation of one or more of the basic static connectivity parameters: delays, postsynaptic cell identities, and weights, as well as run-time connectivity state: the event queue. Comparison of the JitCon implementation to NEURON's standard NetCon connectivity method showed substantial space savings, with associated run-time penalty. Although JitCon saved space by eliminating connectivity parameters, larger simulations were still memory limited due to growth of the synaptic event queue. We therefore designed a JitEvent algorithm that added items to the queue only when required: instead of alerting multiple postsynaptic cells, a spiking presynaptic cell posted a callback event at the shortest synaptic delay time. At the time of the callback, this same presynaptic cell directly notified the first postsynaptic cell and generated another self-callback for the next delay time. The JitEvent implementation yielded substantial additional time and space savings. We conclude that just-in-time strategies are necessary for very large network simulations but that a variety of alternative strategies should be considered whose optimality will depend on the characteristics of the simulation to be run.
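The just-in-time idea can be illustrated by regenerating a cell's outgoing connections from a deterministically seeded random generator each time it fires, instead of storing them. The function and parameter names below are hypothetical, not NEURON's JitCon API:

```python
import numpy as np

def jit_targets(pre_gid, n_cells, fan_out, seed=42):
    """Regenerate a presynaptic cell's connectivity on demand: seeding
    a generator with (seed, pre_gid) makes the postsynaptic targets,
    delays, and weights reproducible every time the cell fires, trading
    a little recomputation for O(1) connectivity memory."""
    rng = np.random.default_rng([seed, pre_gid])
    targets = rng.choice(n_cells, size=fan_out, replace=False)
    delays = rng.uniform(1.0, 5.0, size=fan_out)    # ms, illustrative
    weights = rng.normal(0.5, 0.1, size=fan_out)    # illustrative
    return targets, delays, weights
```

Determinism is the crucial property: the same presynaptic gid must always regenerate the same targets, delays, and weights, or the network would change from spike to spike.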
21

Meamardoost, Saber, Mahasweta Bhattacharya, Eun Jung Hwang, Takaki Komiyama, Claudia Mewes, Linbing Wang, Ying Zhang, and Rudiyanto Gunawan. "FARCI: Fast and Robust Connectome Inference." Brain Sciences 11, no. 12 (November 24, 2021): 1556. http://dx.doi.org/10.3390/brainsci11121556.

Full text
Abstract:
The inference of neuronal connectome from large-scale neuronal activity recordings, such as two-photon Calcium imaging, represents an active area of research in computational neuroscience. In this work, we developed FARCI (Fast and Robust Connectome Inference), a MATLAB package for neuronal connectome inference from high-dimensional two-photon Calcium fluorescence data. We employed partial correlations as a measure of the functional association strength between pairs of neurons to reconstruct a neuronal connectome. We demonstrated using in silico datasets from the Neural Connectomics Challenge (NCC) and those generated using the state-of-the-art simulator of Neural Anatomy and Optimal Microscopy (NAOMi) that FARCI provides an accurate connectome and its performance is robust to network sizes, missing neurons, and noise levels. Moreover, FARCI is computationally efficient and highly scalable to large networks. In comparison with the best performing connectome inference algorithm in the NCC, Generalized Transfer Entropy (GTE), and Fluorescence Single Neuron and Network Analysis Package (FluoroSNNAP), FARCI produces more accurate networks over different network sizes, while providing significantly better computational speed and scaling.
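The partial-correlation measure at the core of FARCI can be sketched via the precision (inverse covariance) matrix; FARCI itself adds preprocessing of the Calcium traces and is implemented in MATLAB, so this is only the bare statistic:

```python
import numpy as np

def partial_correlations(X):
    """Partial correlation matrix from activity traces X
    (n_samples x n_neurons): invert the covariance matrix and
    normalize the precision matrix P, using
        pc[i, j] = -P[i, j] / sqrt(P[i, i] * P[j, j])."""
    P = np.linalg.inv(np.cov(X, rowvar=False))
    d = np.sqrt(np.diag(P))
    pc = -P / np.outer(d, d)
    np.fill_diagonal(pc, 1.0)
    return pc
```

Unlike plain correlation, the partial correlation between two neurons controls for all other recorded neurons, so a chain A→B→C yields a near-zero A–C entry.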
22

Kanazawa, Yusuke, Tetsuya Asai, and Yoshihito Amemiya. "Basic Circuit Design of a Neural Processor: Analog CMOS Implementation of Spiking Neurons and Dynamic Synapses." Journal of Robotics and Mechatronics 15, no. 2 (April 20, 2003): 208–18. http://dx.doi.org/10.20965/jrm.2003.p0208.

Full text
Abstract:
We discuss the integration architecture of spiking neurons, predicted to be next-generation basic circuits of neural processor and dynamic synapse circuits. A key to development of a brain-like processor is to learn from the brain. Learning from the brain, we try to develop circuits implementing neuron and synapse functions while enabling large-scale integration, so large-scale integrated circuits (LSIs) realize functional behavior of neural networks. With such VLSI, we try to construct a large-scale neural network on a single semiconductor chip. With circuit integration now reaching micron levels, however, problems have arisen in dispersion of device performance in analog IC and in the influence of electromagnetic noise. A genuine brain computer should solve such problems on the network level rather than the element level. To achieve such a target, we must develop an architecture that learns brain functions sufficiently and works correctly even in a noisy environment. As the first step, we propose an analog circuit architecture of spiking neurons and dynamic synapses representing the model of artificial neurons and synapses in a form closer to that of the brain. With the proposed circuit, the model of neurons and synapses can be integrated on a silicon chip with metal-oxide-semiconductor (MOS) devices. In the sections that follow, we discuss the dynamic performance of the proposed circuit by using a circuit simulator, HSPICE. As examples of networks using these circuits, we introduce a competitive neural network and an active pattern recognition network by extracting firing frequency information from input information. We also show simulation results of the operation of networks constructed with the proposed circuits.
23

Monreal, Marleni Reyes, Jessica Quintero Pérez, Miguel Felipe Pérez Escalera, Arturo Reyes Lazalde, and María Eugenia Pérez Bonilla. "Development of the L-type CaV / BK Complex Simulator (I): electrophysiological interaction." South Florida Journal of Development 2, no. 2 (May 17, 2021): 1241–57. http://dx.doi.org/10.46932/sfjdv2n2-009.

Full text
Abstract:
Complexes formed by voltage-activated calcium channels (CaV) and high-conductance potassium channels activated by Ca2+ (BK) have been studied in smooth muscle, secretory cells and in synaptic terminals, where they regulate muscle contraction, secretory activity, and neurotransmission. However, the complex formed by L-type CaV channels and BK in the soma has received little attention. Based on immunostaining studies showing the coexistence of these channels in the neuron soma, their possible interaction was theoretically studied. Two simulators based on the Hodgkin and Huxley formalism were developed to perform virtual experiments under current and voltage clamp. The mathematical models were implemented in Visual Basic® 6.0 and were solved numerically. The results indicate that the BK channels were activated by internal Ca2+ at mM concentrations. The BK channels follow the kinetics of the L-type CaVs. The interaction of the L-type CaV–BK complex in the soma produced a decrease in neuronal excitability.
APA, Harvard, Vancouver, ISO, and other styles
24

Romaro, Cecilia, Fernando Araujo Najman, William W. Lytton, Antonio C. Roque, and Salvador Dura-Bernal. "NetPyNE Implementation and Scaling of the Potjans-Diesmann Cortical Microcircuit Model." Neural Computation 33, no. 7 (June 11, 2021): 1993–2032. http://dx.doi.org/10.1162/neco_a_01400.

Full text
Abstract:
The Potjans-Diesmann cortical microcircuit model is a widely used model originally implemented in NEST. Here, we reimplemented the model using NetPyNE, a high-level Python interface to the NEURON simulator, and reproduced the findings of the original publication. We also implemented a method for scaling the network size that preserves first- and second-order statistics, building on existing work on network theory. Our new implementation enabled the use of more detailed neuron models with multicompartmental morphologies and multiple biophysically realistic ion channels. This opens the model to new research, including the study of dendritic processing, the influence of individual channel parameters, the relation to local field potentials, and other multiscale interactions. The scaling method we used provides flexibility to increase or decrease the network size as needed when running these CPU-intensive detailed simulations. Finally, NetPyNE facilitates modifying or extending the model using its declarative language; optimizing model parameters; running efficient, large-scale parallelized simulations; and analyzing the model through built-in methods, including local field potential calculation and information flow measures.
APA, Harvard, Vancouver, ISO, and other styles
25

Sun, Wookyung, Sujin Choi, Bokyung Kim, and Hyungsoon Shin. "Effect of Initial Synaptic State on Pattern Classification Accuracy of 3D Vertical Resistive Random Access Memory (VRRAM) Synapses." Journal of Nanoscience and Nanotechnology 20, no. 8 (August 1, 2020): 4730–34. http://dx.doi.org/10.1166/jnn.2020.17798.

Full text
Abstract:
Amidst the considerable attention artificial intelligence (AI) has attracted in recent years, neuromorphic chips that mimic the biological neuron have emerged as a promising technology. Memristors, or resistive random-access memory (RRAM), are widely used to implement synaptic devices. Recently, 3D vertical RRAM (VRRAM) has become a promising candidate for reducing resistive memory bit cost. This study investigates the operating principle of synapses in the 3D VRRAM architecture. In these devices, the classification response current through a vertical pillar is set by applying a training algorithm to the memristors. The accuracy of neural networks with 3D VRRAM synapses was verified by using the HSPICE simulator to classify the alphabet in 7×7 character images. The simulations demonstrate that 3D VRRAMs are usable as synapses in a neural network system and that a 3D VRRAM synapse should be designed with the initial value of the memristor in mind in order to prepare training conditions for high classification accuracy. These results suggest that synaptic circuits using 3D VRRAM will become a key technology for implementing neural computing hardware.
APA, Harvard, Vancouver, ISO, and other styles
26

WARZECHA, ANNE-KATHRIN, and MARTIN EGELHAAF. "On the performance of biological movement detectors and ideal velocity sensors in the context of optomotor course stabilization." Visual Neuroscience 15, no. 1 (January 1998): 113–22. http://dx.doi.org/10.1017/s0952523898151052.

Full text
Abstract:
It is often assumed that the ultimate goal of a motion-detection system is to faithfully represent the time-dependent velocity of a moving stimulus. This assumption, however, may be an arbitrary standard since the requirements for a motion-detection system depend on the task that is to be solved. In the context of optomotor course stabilization, the performance of a motion-sensitive neuron in the fly's optomotor pathway and of a hypothetical velocity sensor are compared for stimuli as are characteristic of a normal behavioral situation in which the actions and reactions of the animal directly affect its visual input. On average, tethered flies flying in a flight simulator are able to compensate to a large extent the retinal image displacements as are induced by an external disturbance of their flight course. The retinal image motion experienced by the fly under these behavioral closed-loop conditions was replayed in subsequent electrophysiological experiments to the animal while the activity of an identified neuron in the motion pathway was recorded. The velocity fluctuations as well as the corresponding neuronal signals were analyzed with a statistical approach taken from signal-detection theory. An observer scrutinizing either signal performs almost equally well in detecting the external disturbance.
APA, Harvard, Vancouver, ISO, and other styles
27

Oman, Charles M. "Motion sickness: a synthesis and evaluation of the sensory conflict theory." Canadian Journal of Physiology and Pharmacology 68, no. 2 (February 1, 1990): 294–303. http://dx.doi.org/10.1139/y90-044.

Full text
Abstract:
"Motion sickness" is the general term describing a group of common nausea syndromes originally attributed to motion-induced cerebral ischemia, stimulation of abdominal organ afferents, or overstimulation of the vestibular organs of the inner ear. Seasickness, car sickness, and airsickness are commonly experienced examples. However, the identification of other variants such as spectacle sickness and flight simulator sickness in which the physical motion of the head and body is normal or even absent has led to a succession of "sensory conflict" theories that offer a more comprehensive etiologic perspective. Implicit in the conflict theory is the hypothesis that neural and (or) humoral signals originate in regions of the brain subserving spatial orientation, and that these signals somehow traverse to other centers mediating sickness symptoms. Unfortunately, our present understanding of the neurophysiological basis of motion sickness is incomplete. No sensory conflict neuron or process has yet been physiologically identified. This paper reviews the types of stimuli that cause sickness and synthesizes a mathematical statement of the sensory conflict hypothesis based on observer theory from control engineering. A revised mathematical model is presented that describes the dynamic coupling between the putative conflict signals and nausea magnitude estimates. Based on the model, what properties would a conflict neuron be expected to have? Key words: motion sickness, nausea, vestibular, vision, mathematical models.
APA, Harvard, Vancouver, ISO, and other styles
28

O. H. Abdelwahed and M. El-Sayed Wahed. "Optimizing Single Layer Cellular Neural Network Simulator using Simulated Annealing Technique with Neural Networks." Indian Journal of Applied Research 3, no. 6 (October 1, 2011): 91–94. http://dx.doi.org/10.15373/2249555x/june2013/31.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

HERRMANN, CHRISTOPH S., and ANDREAS KLAUS. "AUTAPSE TURNS NEURON INTO OSCILLATOR." International Journal of Bifurcation and Chaos 14, no. 02 (February 2004): 623–33. http://dx.doi.org/10.1142/s0218127404009338.

Full text
Abstract:
Recently, neurobiologists have discovered axons on neurons which synapse on the same neuron's dendrites — so-called autapses. It is not yet clear what functional significance autapses offer for neural behavior. This is an ideal case for using a physical simulation to investigate how an autapse alters the firing of a neuron. We simulated a neural basket cell via the Hodgkin–Huxley equations and implemented an autapse which feeds back onto the soma of the neuron. The behavior of the cell was compared with and without autaptic feedback. Our artificial autapse neuron (AAN) displays oscillatory behavior which is not observed for the same model neuron without autapse. The neuron oscillates between two functional states: one where it fires at high frequency and another where firing is suppressed. This behavior is called "spike bursting" and represents a common pattern recorded from cerebral neurons.
APA, Harvard, Vancouver, ISO, and other styles
30

Pesavento, Michael J., Cynthia D. Rittenhouse, and David J. Pinto. "Response Sensitivity of Barrel Neuron Subpopulations to Simulated Thalamic Input." Journal of Neurophysiology 103, no. 6 (June 2010): 3001–16. http://dx.doi.org/10.1152/jn.01053.2009.

Full text
Abstract:
Our goal is to examine the relationship between neuron- and network-level processing in the context of a well-studied cortical function, the processing of thalamic input by whisker-barrel circuits in rodent neocortex. Here we focus on neuron-level processing and investigate the responses of excitatory and inhibitory barrel neurons to simulated thalamic inputs applied using the dynamic clamp method in brain slices. Simulated inputs are modeled after real thalamic inputs recorded in vivo in response to brief whisker deflections. Our results suggest that inhibitory neurons require more input to reach firing threshold, but then fire earlier, with less variability, and respond to a broader range of inputs than do excitatory neurons. Differences in the responses of barrel neuron subtypes depend on their intrinsic membrane properties. Neurons with a low input resistance require more input to reach threshold but then fire earlier than neurons with a higher input resistance, regardless of the neuron's classification. Our results also suggest that the response properties of excitatory versus inhibitory barrel neurons are consistent with the response sensitivities of the ensemble barrel network. The short response latency of inhibitory neurons may serve to suppress ensemble barrel responses to asynchronous thalamic input. Correspondingly, whereas neurons acting as part of the barrel circuit in vivo are highly selective for temporally correlated thalamic input, excitatory barrel neurons acting alone in vitro are less so. These data suggest that network-level processing of thalamic input in barrel cortex depends on neuron-level processing of the same input by excitatory and inhibitory barrel neurons.
APA, Harvard, Vancouver, ISO, and other styles
31

Hines, M. L., and N. T. Carnevale. "The NEURON Simulation Environment." Neural Computation 9, no. 6 (August 1, 1997): 1179–209. http://dx.doi.org/10.1162/neco.1997.9.6.1179.

Full text
Abstract:
The moment-to-moment processing of information by the nervous system involves the propagation and interaction of electrical and chemical signals that are distributed in space and time. Biologically realistic modeling is needed to test hypotheses about the mechanisms that govern these signals and how nervous system function emerges from the operation of these mechanisms. The NEURON simulation program provides a powerful and flexible environment for implementing such models of individual neurons and small networks of neurons. It is particularly useful when membrane potential is nonuniform and membrane currents are complex. We present the basic ideas that would help informed users make the most efficient use of NEURON.
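The kind of membrane model NEURON integrates can be illustrated in a few lines of plain Python. The sketch below is not NEURON code; it is a minimal forward-Euler integration of the classic Hodgkin–Huxley point neuron with the standard textbook parameters (assumed here for illustration), counting spikes as upward crossings of 0 mV.

```python
import math

def simulate_hh(i_amp=10.0, t_stop=100.0, dt=0.01):
    """Forward-Euler integration of the classic Hodgkin-Huxley point neuron.

    i_amp in uA/cm^2, times in ms, voltages in mV. Returns the spike count
    (upward crossings of 0 mV).
    """
    c_m = 1.0                            # membrane capacitance, uF/cm^2
    g_na, g_k, g_l = 120.0, 36.0, 0.3    # maximal conductances, mS/cm^2
    e_na, e_k, e_l = 50.0, -77.0, -54.4  # reversal potentials, mV

    v, m, h, n = -65.0, 0.05, 0.6, 0.32  # near-resting initial conditions
    spikes, above = 0, False
    for _ in range(int(t_stop / dt)):
        # rate functions (1952 formulation, shifted to the mV convention)
        a_m = 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
        b_m = 4.0 * math.exp(-(v + 65.0) / 18.0)
        a_h = 0.07 * math.exp(-(v + 65.0) / 20.0)
        b_h = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
        a_n = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
        b_n = 0.125 * math.exp(-(v + 65.0) / 80.0)

        i_ion = (g_na * m**3 * h * (v - e_na)
                 + g_k * n**4 * (v - e_k)
                 + g_l * (v - e_l))
        v_new = v + dt * (i_amp - i_ion) / c_m
        m += dt * (a_m * (1.0 - m) - b_m * m)
        h += dt * (a_h * (1.0 - h) - b_h * h)
        n += dt * (a_n * (1.0 - n) - b_n * n)

        if v_new > 0.0 and not above:  # rising edge through 0 mV
            spikes += 1
        above = v_new > 0.0
        v = v_new
    return spikes
```

With a suprathreshold current (here 10 µA/cm²) the model fires repetitively; with zero current it rests. NEURON's value lies in doing this for nonuniform, multicompartment membranes with complex currents, which this single-compartment sketch does not attempt.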
APA, Harvard, Vancouver, ISO, and other styles
32

Andreev, Valery, Valerii Ostrovskii, Timur Karimov, Aleksandra Tutueva, Elena Doynikova, and Denis Butusov. "Synthesis and Analysis of the Fixed-Point Hodgkin–Huxley Neuron Model." Electronics 9, no. 3 (March 5, 2020): 434. http://dx.doi.org/10.3390/electronics9030434.

Full text
Abstract:
In many tasks related to realistic neurons and neural network simulation, the performance of desktop computers is nowhere near enough. To overcome this obstacle, researchers are developing FPGA-based simulators that naturally use fixed-point arithmetic. In these implementations, little attention is usually paid to the choice of numerical method for the discretization of the continuous neuron model. In our study, the implementation accuracy of a neuron described by simplified Hodgkin–Huxley equations in fixed-point arithmetic is under investigation. The principle of constructing a fixed-point neuron model with various numerical methods is described. Interspike diagrams and refractory period analysis are used for the experimental study of the synthesized discrete maps of the simplified Hodgkin–Huxley neuron model. We show that the explicit midpoint method is much better suited to simulate the neuron dynamics on an FPGA than the explicit Euler method which is in common use.
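The accuracy gap between the two explicit methods can be seen on the simplest membrane equation, dv/dt = (E_L - v)/tau, whose exact solution is known. The sketch below uses floating point for brevity (the paper's subject is fixed-point arithmetic), but the order-of-accuracy comparison is the same.

```python
import math

def integrate(method, v0=-65.0, e_l=-70.0, tau=10.0, t_stop=50.0, dt=0.5):
    """Integrate the leaky membrane dv/dt = (e_l - v)/tau with a one-step
    method, 'euler' (1st order) or 'midpoint' (2nd order); returns final v."""
    f = lambda v: (e_l - v) / tau
    v = v0
    for _ in range(int(t_stop / dt)):
        if method == "euler":
            v += dt * f(v)
        else:  # explicit midpoint: evaluate the slope at a half step
            v_half = v + 0.5 * dt * f(v)
            v += dt * f(v_half)
    return v

def exact(v0=-65.0, e_l=-70.0, tau=10.0, t=50.0):
    """Closed-form solution for comparison."""
    return e_l + (v0 - e_l) * math.exp(-t / tau)
```

At the same step size, the midpoint result lands far closer to the exact exponential decay, which is why the paper finds it better suited for FPGA neuron simulation than explicit Euler.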
APA, Harvard, Vancouver, ISO, and other styles
33

D'Haene, Michiel, Benjamin Schrauwen, Jan Van Campenhout, and Dirk Stroobandt. "Accelerating Event-Driven Simulation of Spiking Neurons with Multiple Synaptic Time Constants." Neural Computation 21, no. 4 (April 2009): 1068–99. http://dx.doi.org/10.1162/neco.2008.02-08-707.

Full text
Abstract:
The simulation of spiking neural networks (SNNs) is known to be a very time-consuming task. This limits the size of SNN that can be simulated in reasonable time or forces users to overly limit the complexity of the neuron models. This is one of the driving forces behind much of the recent research on event-driven simulation strategies. Although event-driven simulation allows precise and efficient simulation of certain spiking neuron models, it is not straightforward to generalize the technique to more complex neuron models, mostly because the firing time of these neuron models is computationally expensive to evaluate. Most solutions proposed in literature concentrate on algorithms that can solve this problem efficiently. However, these solutions do not scale well when more state variables are involved in the neuron model, which is, for example, the case when multiple synaptic time constants for each neuron are used. In this letter, we show that an exact prediction of the firing time is not required in order to guarantee exact simulation results. Several techniques are presented that try to do the least possible amount of work to predict the firing times. We propose an elegant algorithm for the simulation of leaky integrate-and-fire (LIF) neurons with an arbitrary number of (unconstrained) synaptic time constants, which is able to combine these algorithmic techniques efficiently, resulting in very high simulation speed. Moreover, our algorithm is highly independent of the complexity (i.e., number of synaptic time constants) of the underlying neuron model.
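The event-driven idea is easiest to see in the one case where the firing time needs no prediction at all: a LIF neuron with instantaneous (delta) synapses, whose state between events has a closed form. This is only a minimal sketch of the strategy, not the authors' algorithm for multiple synaptic time constants; parameters are illustrative.

```python
import math

def lif_event_driven(input_spikes, w=0.4, tau=20.0, v_th=1.0):
    """Event-driven LIF with instantaneous (delta) synapses: between input
    events the membrane decays analytically, so the state is updated only
    at event times. Returns the list of output spike times."""
    v, t_last, out = 0.0, 0.0, []
    for t in input_spikes:
        v *= math.exp(-(t - t_last) / tau)  # exact decay since last event
        t_last = t
        v += w                              # instantaneous synaptic kick
        if v >= v_th:                       # with delta synapses, threshold
            out.append(t)                   # can only be crossed at an event
            v = 0.0
    return out
```

With continuous synaptic currents (one or more time constants), the threshold can be crossed between events, which is exactly the firing-time prediction problem the paper addresses.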
APA, Harvard, Vancouver, ISO, and other styles
34

Antunes, Gabriela, Samuel F. Faria da Silva, and Fabio M. Simoes de Souza. "Mirror Neurons Modeled Through Spike-Timing-Dependent Plasticity are Affected by Channelopathies Associated with Autism Spectrum Disorder." International Journal of Neural Systems 28, no. 05 (April 19, 2018): 1750058. http://dx.doi.org/10.1142/s0129065717500587.

Full text
Abstract:
Mirror neurons fire action potentials both when the agent performs a certain behavior and when it watches someone performing a similar action. Here, we present an original mirror neuron model based on spike-timing-dependent plasticity (STDP) between two morpho-electrical models of neocortical pyramidal neurons. Both neurons fired spontaneously with a basal firing rate that follows a Poisson distribution, and the STDP between them was modeled by the triplet algorithm. Our simulation results demonstrated that STDP is sufficient for the rise of mirror neuron function between pairs of neocortical neurons. This is a proof of concept that pairs of neocortical neurons associating sensory inputs to motor outputs could operate like mirror neurons. In addition, we used the mirror neuron model to investigate whether channelopathies associated with autism spectrum disorder could impair the modeled mirror function. Our simulation results showed that impaired hyperpolarization-activated cationic currents (Ih) affected the mirror function between the pairs of neocortical neurons coupled by STDP.
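As a hedged illustration of the plasticity mechanism: the paper uses the triplet STDP rule, but the sketch below implements only the classical pair-based rule with all-to-all spike interaction, which the triplet rule extends. All parameter values here are illustrative, not the paper's.

```python
import math

def stdp_dw(pre_times, post_times, a_plus=0.01, a_minus=0.012,
            tau_plus=20.0, tau_minus=20.0):
    """Total weight change under the classical pair-based STDP rule:
    pre-before-post pairs potentiate, post-before-pre pairs depress,
    with exponentially decaying dependence on the pair interval."""
    dw = 0.0
    for t_pre in pre_times:
        for t_post in post_times:
            dt = t_post - t_pre
            if dt > 0:      # causal pair -> potentiation
                dw += a_plus * math.exp(-dt / tau_plus)
            elif dt < 0:    # anti-causal pair -> depression
                dw -= a_minus * math.exp(dt / tau_minus)
    return dw
```

A causal pre-then-post pairing yields a positive weight change, the reversed order a negative one; the triplet rule adds terms that also depend on spike triplets, capturing frequency effects the pair rule misses.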
APA, Harvard, Vancouver, ISO, and other styles
35

Torres-Treviño, Luis M., Angel Rodríguez-Liñán, Luis González-Estrada, and Gustavo González-Sanmiguel. "Single Gaussian Chaotic Neuron: Numerical Study and Implementation in an Embedded System." Discrete Dynamics in Nature and Society 2013 (2013): 1–11. http://dx.doi.org/10.1155/2013/318758.

Full text
Abstract:
Artificial Gaussian neurons are very common structures in artificial neural networks such as radial basis function networks. These artificial neurons use a Gaussian activation function with two parameters, called the center of mass (cm) and the sensibility factor (λ). Changes in these parameters determine the behavior of the neuron. When the neuron has a feedback output, complex chaotic behavior is displayed. This paper presents a study and implementation of this particular neuron. Stability of fixed points, bifurcation diagrams, and Lyapunov exponents help to determine the dynamical nature of the neuron, and its implementation on an embedded system illustrates preliminary results toward embedded chaos computation.
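A feedback Gaussian neuron reduces to a one-dimensional map. The sketch below assumes one plausible parameterization, x_{k+1} = exp(-(x_k - cm)² / λ); the paper's exact form of the activation may differ, and whether the orbit settles or wanders depends on cm and λ. The Lyapunov estimate uses the average log-derivative along the orbit.

```python
import math

def gaussian_neuron_orbit(x0=0.1, cm=0.5, lam=0.03, n=500):
    """Iterate the feedback Gaussian neuron map and estimate its Lyapunov
    exponent from the average log of the map's derivative along the orbit.
    Returns (orbit, lyapunov_estimate)."""
    x, log_sum, orbit = x0, 0.0, []
    for _ in range(n):
        fx = math.exp(-(x - cm) ** 2 / lam)
        dfx = -2.0 * (x - cm) / lam * fx  # d/dx of the Gaussian map
        log_sum += math.log(abs(dfx) + 1e-300)  # guard against log(0)
        x = fx
        orbit.append(x)
    return orbit, log_sum / n
```

Since the Gaussian output lies in (0, 1], the orbit is always bounded; a positive Lyapunov estimate at a given (cm, λ) would indicate the chaotic regime the paper maps out with bifurcation diagrams.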
APA, Harvard, Vancouver, ISO, and other styles
36

Lytton, William W., Diego Contreras, Alain Destexhe, and Mircea Steriade. "Dynamic Interactions Determine Partial Thalamic Quiescence in a Computer Network Model of Spike-and-Wave Seizures." Journal of Neurophysiology 77, no. 4 (April 1, 1997): 1679–96. http://dx.doi.org/10.1152/jn.1997.77.4.1679.

Full text
Abstract:
Lytton, William W., Diego Contreras, Alain Destexhe, and Mircea Steriade. Dynamic interactions determine partial thalamic quiescence in a computer network model of spike-and-wave seizures. J. Neurophysiol. 77: 1679–1696, 1997. In vivo intracellular recording from cat thalamus and cortex was performed during spontaneous spike-wave seizures characterized by synchronously firing cortical neurons correlated with the electroencephalogram. During these seizures, thalamic reticular (RE) neurons discharged with long spike bursts riding on a depolarization, whereas thalamocortical (TC) neurons were either entrained into the seizures (40%) or were quiescent (60%). During quiescence, TC neurons showed phasic inhibitory postsynaptic potentials (IPSPs) that coincided with paroxysmal depolarizing shifts in the simultaneously recorded cortical neuron. Computer simulations of a reciprocally connected TC-RE pair showed two major modes of TC-RE interaction. In one mode, a mutual oscillation involved direct TC neuron excitation of the RE neuron leading to a burst that fed back an IPSP into the TC neuron, producing a low-threshold spike. In the other, quiescent mode, the TC neuron was subject to stronger coalescing IPSPs. Simulated cortical stimulation could trigger a transition between the two modes. This transition could go in either direction and was dependent on the precise timing of the input. The transition did not always follow the stimulation immediately. A larger, multicolumnar simulation was set up to assess the role of the TC-RE pair in the context of extensive divergence and convergence. The amount of TC neuron spiking generally correlated with the strength of total inhibitory input, but large variations in the amount of spiking could be seen. Evidence for mutual oscillation could be demonstrated by comparing TC neuron firing with that in reciprocally connected RE neurons. 
An additional mechanism for TC neuron quiescence was assessed with the use of a cooperative model of γ-aminobutyric acid-B (GABAB)-mediated responses. With this model, RE neurons receiving repeated strong excitatory input produced TC neuron quiescence due to burst-duration-associated augmentation of GABAB current. We predict the existence of spatial inhomogeneity in apparently generalized spike-wave seizures, involving a center-surround pattern. In the center, intense cortical and RE neuron activity would be associated with TC neuron quiescence. In the surround, less intense hyperpolarization of TC neurons would allow low-threshold spikes to occur. This surround, an “epileptic penumbra,” would be the forefront of the expanding epileptic wave during the process of initial seizure generalization. Therapeutically, we would then predict that agents that reduce TC neuron activity would have a greater effect on seizure onset than on ongoing spike-wave seizures or other thalamic oscillations.
APA, Harvard, Vancouver, ISO, and other styles
37

Samardzic, Natasa M., Jovan S. Bajic, Dalibor L. Sekulic, and Stanisa Dautovic. "Volatile Memristor in Leaky Integrate-and-Fire Neurons: Circuit Simulation and Experimental Study." Electronics 11, no. 6 (March 13, 2022): 894. http://dx.doi.org/10.3390/electronics11060894.

Full text
Abstract:
In this paper, a circuit implementation of a leaky integrate-and-fire neuron model with a volatile memristor is proposed and simulated in the SPICE simulation environment. We demonstrate that a simple leaky integrate-and-fire (LIF) neuron model composed of a volatile memristor, a membrane capacitance, and a neuron resistance can mimic spatial and temporal integration, the firing function, and signal decay. The leaky term originates from the recovery of the initial resistive state of the memristor in the spontaneous reset cycle, which is essential for emulating the forgetting process in all-memristive neural networks (MNNs). Furthermore, a diffusive perovskite memristor was used to validate the model, where the memristor's intrinsic capacitance acts as the neuron membrane capacitance. Good agreement between experimental and simulation results was observed. Volatility, as an inherent property of specific memristors, eliminates the need for an additional peripheral circuit to reinitialize the device state, thus allowing the development of energy-efficient, large-scale complex memristive neural networks. The presented circuit-level model of LIF neurons can facilitate the design of MNNs.
APA, Harvard, Vancouver, ISO, and other styles
38

Lv, Tao-tao, Yan-jun Mo, Tian-yuan Yu, Shuai Shao, Meng-qian Lu, Yu-ting Luo, Yi Shen, Yu-mo Zhang, and Wong Steven. "Using RNA-Seq to Explore the Repair Mechanism of the Three Methods and Three-Acupoint Technique on DRGs in Sciatic Nerve Injured Rats." Pain Research and Management 2020 (June 8, 2020): 1–12. http://dx.doi.org/10.1155/2020/7531409.

Full text
Abstract:
Objective. To study the effects of the three methods and three-acupoint technique on DRG gene expression in SNI model rats and to elucidate the molecular mechanism by which the three methods and three-acupoint technique promotes recovery from peripheral nerve injury. Methods. 27 male SD rats were randomly divided into three groups: a Sham group, an SNI group, and a Tuina group. The Tuina group was treated with a tuina manipulation simulator to simulate massage on points, controlling both quality and quantity. Point-pressing, plucking, and kneading methods were administered quantitatively at the Yinmen (BL37), Chengshan (BL57), and Yanglingquan (GB34) points on the affected side once a day, beginning 7 days after modeling. Intervention was applied once a day for 10 days, followed by 1 day of rest and 10 more days of intervention, for a total of 20 intervention sessions. The effect of the three methods and three-acupoint technique on the recovery of injured rats was evaluated using behavior analysis. RNA sequencing (RNA-Seq) analysis of differentially expressed genes in the DRGs of the three groups of rats was also performed. GO and KEGG enrichment was analyzed and verified using real-time PCR. Results. RNA-Seq combined with database information showed that the number of differentially expressed genes in DRG was largest in the Tuina group compared with the SNI group, totaling 226. GO function is enriched in the positive regulation of cell processes, ion binding, protein binding, neuron, response to pressure, response to metal ions, neuron projection, and other biological processes. GO function is also enriched in the Wnt, IL-17, and MAPK signaling pathways in the KEGG database. PCR results were consistent with those of RNA sequencing, suggesting that the results of transcriptome sequencing were reliable. Conclusion. The three methods and three-acupoint technique can promote the recovery of SNI model rats by altering gene expression in DRGs.
APA, Harvard, Vancouver, ISO, and other styles
39

Arunachalam, Viswanathan, Raha Akhavan-Tabatabaei, and Cristina Lopez. "Results on a Binding Neuron Model and Their Implications for Modified Hourglass Model for Neuronal Network." Computational and Mathematical Methods in Medicine 2013 (2013): 1–8. http://dx.doi.org/10.1155/2013/374878.

Full text
Abstract:
The classical models of a single neuron, like the Hodgkin-Huxley point neuron or the leaky integrate-and-fire neuron, assume the influence of postsynaptic potentials lasts until the neuron fires. Vidybida (2008), in a refreshing departure, proposed models of binding neurons in which the trace of an input is remembered only for a finite, fixed period of time, after which it is forgotten. Binding neurons conform to the behaviour of real neurons and are applicable to constructing fast recurrent networks for computer modeling. This paper explicitly develops several useful results for a binding neuron, such as the firing time distribution and other statistical characteristics. We also discuss the applicability of the developed results to constructing a modified hourglass network model with interconnected neurons receiving excitatory as well as inhibitory inputs. Limited simulation results for the hourglass network are presented.
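The binding neuron's bounded memory is simple to state in code. A minimal sketch, with an assumed memory span tau and firing threshold N0 (illustrative values, not results from the paper): each input trace survives for tau time units, and the neuron fires and clears its memory whenever N0 traces coexist.

```python
def binding_neuron(input_times, tau=5.0, n0=3):
    """Binding neuron: each input impulse is 'remembered' for tau time
    units; the neuron fires and clears its memory when n0 impulses
    coexist. Returns the output firing times."""
    stored, out = [], []
    for t in sorted(input_times):
        stored = [s for s in stored if t - s < tau]  # forget expired traces
        stored.append(t)
        if len(stored) >= n0:
            out.append(t)
            stored = []  # memory is cleared after firing
    return out
```

Regular input faster than the forgetting time makes the neuron fire on every n0-th impulse, while input sparser than tau never triggers it, which is the qualitative behavior whose firing-time distribution the paper derives analytically.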
APA, Harvard, Vancouver, ISO, and other styles
40

Hokkanen, Henri, Vafa Andalibi, and Simo Vanni. "Controlling Complexity of Cerebral Cortex Simulations—II: Streamlined Microcircuits." Neural Computation 31, no. 6 (June 2019): 1066–84. http://dx.doi.org/10.1162/neco_a_01188.

Full text
Abstract:
Recently, Markram et al. (2015) presented a model of the rat somatosensory microcircuit (Markram model). Their model is high in anatomical and physiological detail, and its simulation requires supercomputers. The lack of neuroinformatics and computing power is an obstacle for using a similar approach to build models of other cortical areas or larger cortical systems. Simplified neuron models offer an attractive alternative to high-fidelity Hodgkin-Huxley-type neuron models, but their validity in modeling cortical circuits is unclear. We simplified the Markram model to a network of exponential integrate-and-fire (EIF) neurons that runs on a single CPU core in reasonable time. We analyzed the electrophysiology and the morphology of the Markram model neurons with eFel and NeuroM tools, provided by the Blue Brain Project. We then constructed neurons with few compartments and averaged parameters from the reference model. We used the CxSystem simulation framework to explore the role of short-term plasticity and GABAB and NMDA synaptic conductances in replicating oscillatory phenomena in the Markram model. We show that having a slow inhibitory synaptic conductance (GABAB) allows replication of oscillatory behavior in the high-calcium state. Furthermore, we show that qualitatively similar dynamics are seen even with a reduced number of cell types (from 55 to 17 types). This reduction halved the computation time. Our results suggest that qualitative dynamics of cortical microcircuits can be studied using limited neuroinformatics and computing resources supporting parameter exploration and simulation of cortical systems. The simplification procedure can easily be adapted to studying other microcircuits for which sparse electrophysiological and morphological data are available.
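A single EIF neuron of the kind used in such reductions can be sketched as follows. The parameters here are illustrative textbook values, not those fitted from the Markram model, and the external drive is expressed directly in mV (i.e., R·I folded into one term).

```python
import math

def simulate_eif(i_drive=20.0, t_stop=200.0, dt=0.01):
    """Exponential integrate-and-fire neuron (illustrative parameters):
    tau dv/dt = -(v - e_l) + delta_t * exp((v - v_t)/delta_t) + i_drive.
    Returns the spike count over t_stop ms."""
    tau, e_l, v_t, delta_t = 10.0, -65.0, -50.0, 2.0  # ms, mV
    v_peak, v_reset = 0.0, -65.0
    v, spikes = e_l, 0
    for _ in range(int(t_stop / dt)):
        dv = (-(v - e_l)
              + delta_t * math.exp((v - v_t) / delta_t)
              + i_drive) / tau
        v += dt * dv
        if v >= v_peak:   # the exponential term diverges at the spike,
            spikes += 1   # so detect and reset before the next step
            v = v_reset
    return spikes
```

The exponential term gives the EIF a soft spike-initiation zone near v_t, which is what lets it approximate the upstroke of detailed Hodgkin-Huxley-type cells far better than a plain LIF at the same cost.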
APA, Harvard, Vancouver, ISO, and other styles
41

MIFTAHOF, ROUSTEM, and N. R. AKHMADEEV. "COMPUTER SIMULATION OF COTRANSMISSION BY EXCITATORY AMINO ACIDS AND ACETYLCHOLINE IN THE ENTERIC NERVOUS SYSTEM." Journal of Mechanics in Medicine and Biology 07, no. 02 (June 2007): 229–46. http://dx.doi.org/10.1142/s0219519407002261.

Full text
Abstract:
The role of cotransmission by α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid (AMPA), L-aspartate, N-methyl-D-aspartate (NMDA), and acetylcholine (ACh) as well as the coexpression of AMPA, NMDA, and nicotinic ACh (nACh) receptors on the electrophysiological activity of the primary sensory (AH) and motor (S) neurons of the enteric nervous system are numerically assessed. Results of computer simulations showed that AMPA and L-Asp alone can induce fast action potentials of short duration on AH and S neurons. Costimulation of nACh and AMPA receptors on the soma of the S neuron resulted in periodic spiking activity. A characteristic biphasic response was recorded from the AH neuron after coactivation of AMPA and NMDA receptors. Glutamate alone acting on NMDA receptors caused prolonged depolarization of the AH neuron and failed to depolarize the S neuron. Cojoint stimulation of the AMPA or nACh receptors was required to produce the effect of glutamate. The overall electrical response of neurons to the activation of NMDA receptors was long-term depolarization. Acetylcholine, AMPA, and glutamate acting alone or cojointly enhanced phasic contraction of the longitudinal smooth muscle. Treatment of neurons with AMPA, NMDA, and nACh receptor antagonists revealed intricate properties of the AH and S neurons. Application of MK-801, D-AP5, and CPP reduced the excitability of the AH neuron and totally abolished electrical activity in the S neuron. The insight gained into the cotransmission by excitatory amino acids and acetylcholine in the enteric nervous system may be beneficial in the development of novel effective therapeutics to treat diseases associated with altered visceral nociception, i.e., irritable bowel syndrome.
APA, Harvard, Vancouver, ISO, and other styles
42

KRASILENKO, VLADIMIR, NATALIYA YURCHUK, and Diana NIKITOVICH. "DESIGN AND SIMULATION OF NEURON-EQUIVALENTORS ARRAY FOR CREATION OF SELF-LEARNING EQUIVALENT-CONVOLUTIONAL NEURAL STRUCTURES (SLECNS)." HERALD OF KHMELNYTSKYI NATIONAL UNIVERSITY 297, no. 3 (July 2, 2021): 58–69. http://dx.doi.org/10.31891/2307-5732-2021-297-3-58-69.

Full text
Abstract:
In the paper, we consider the urgent need for highly efficient hardware accelerators for machine learning algorithms, including convolutional and deep neural networks (CNNs and DNNs), and for associative memory models, clustering, and pattern recognition. We briefly review our related work on the advantages of equivalent models (EMs) for describing and designing bio-inspired systems. The capacity of a neural network based on EMs and their modifications is several times the number of its neurons. Such neural paradigms are very promising for processing, clustering, and recognizing large, strongly correlated, highly noised images, and for creating unsupervised learning machines. Since the basic functional nodes of EMs are vector-matrix or matrix-tensor procedures with continuous-logic operations, such as the normalized vector operations "equivalence" and "nonequivalence", we consider in this paper new conceptual approaches to the design of full-scale arrays of such neuron-equivalentors (NEs) with extended functionality, including different activation functions. Our approach is based on the use of analog and mixed (specially coded) methods for implementing the required operations, building NEs (with 8 to 128 or more synapses) and their base cells, and nodes based on photosensitive elements and CMOS current mirrors. Simulation results show that the energy efficiency of NEs is estimated at not less than 10^12 analog operations per second per watt and can be increased further. The results confirm the correctness of the concept and the possibility of creating NE and MIMO structures on its basis.
APA, Harvard, Vancouver, ISO, and other styles
43

Hu, Xiaoyu, and Chongxin Liu. "Bursting and Synchronization of Coupled Neurons under Electromagnetic Radiation." Complexity 2019 (December 4, 2019): 1–10. http://dx.doi.org/10.1155/2019/4835379.

Full text
Abstract:
Bursting is an important firing activity of neurons, which is caused by a slow process that modulates fast spiking activity. Based on the original second-order Morris-Lecar neuron model, an improved third-order Morris-Lecar neuron model that can produce bursting activity is proposed, in which the effect of electromagnetic radiation is treated as a slow process and the original Morris-Lecar equations as the fast process. Extensive numerical simulation results show that the improved neuron model can produce different types of bursting, and that bursting activity depends strongly on system parameters and electromagnetic radiation parameters. In addition, synchronization transitions of identical as well as non-identical coupled third-order Morris-Lecar neurons are studied. The results show that identical coupled neurons undergo a complex synchronization process and finally reach complete synchronization as the coupling intensity increases. For non-identical coupled neurons, only anti-phase synchronization and in-phase synchronization can be reached. These studies of the bursting activity of single neurons and the synchronization transitions of coupled neurons have important guiding significance for further understanding information processing in neurons and collective behaviors in neuronal networks under electromagnetic radiation.
APA, Harvard, Vancouver, ISO, and other styles
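The fast-slow structure described here can be sketched with the classic two-variable Morris-Lecar equations plus one slow flux-like variable. The feedback form (a memristive conductance rho(phi) = alpha + 3*beta*phi^2) and all parameter values below are illustrative assumptions, not the authors' exact model:

```python
import numpy as np

def morris_lecar_flux(I_ext=90.0, T=500.0, dt=0.01,
                      k0=0.5, k1=0.05, k2=0.5, alpha=0.1, beta=0.02):
    """Fast Morris-Lecar pair (V, w) plus a slow flux-like variable.
    The feedback current -k0 * rho(flux) * V and all constants are
    illustrative, not the parameters of the paper."""
    # Classic Morris-Lecar constants (a standard type-II parameter set)
    C, gL, gCa, gK = 20.0, 2.0, 4.4, 8.0
    VL, VCa, VK = -60.0, 120.0, -84.0
    V1, V2, V3, V4, phi_w = -1.2, 18.0, 2.0, 30.0, 0.04

    n = int(T / dt)
    V = np.empty(n); w = np.empty(n); flux = np.empty(n)
    V[0], w[0], flux[0] = -60.0, 0.0, 0.0
    for t in range(n - 1):
        m_inf = 0.5 * (1.0 + np.tanh((V[t] - V1) / V2))
        w_inf = 0.5 * (1.0 + np.tanh((V[t] - V3) / V4))
        tau_w = 1.0 / np.cosh((V[t] - V3) / (2.0 * V4))
        rho = alpha + 3.0 * beta * flux[t] ** 2        # memristive conductance
        dV = (I_ext - gL * (V[t] - VL) - gCa * m_inf * (V[t] - VCa)
              - gK * w[t] * (V[t] - VK) - k0 * rho * V[t]) / C
        dw = phi_w * (w_inf - w[t]) / tau_w
        dflux = k1 * V[t] - k2 * flux[t]               # slow process
        V[t + 1] = V[t] + dt * dV
        w[t + 1] = w[t] + dt * dw
        flux[t + 1] = flux[t] + dt * dflux
    return V, w, flux
```

Sweeping the radiation parameters `k0`, `k1`, `k2` is how one would explore the dependence of the firing pattern on the slow electromagnetic feedback that the abstract describes.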
44

KATORI, YUICHI, ERIC J. LANG, MIHO ONIZUKA, MITSUO KAWATO, and KAZUYUKI AIHARA. "QUANTITATIVE MODELING OF SPATIO-TEMPORAL DYNAMICS OF INFERIOR OLIVE NEURONS WITH A SIMPLE CONDUCTANCE-BASED MODEL." International Journal of Bifurcation and Chaos 20, no. 03 (March 2010): 583–603. http://dx.doi.org/10.1142/s0218127410025909.

Full text
Abstract:
Inferior olive (IO) neurons project to the cerebellum and contribute to motor control. They can show intriguing spatio-temporal dynamics with rhythmic and synchronized spiking. IO neurons are connected to their neighbors via gap junctions to form an electrically coupled network, and it is considered that this coupling contributes to the characteristic dynamics of this nucleus. Here, we demonstrate that a gap junction-coupled network composed of simple conductance-based model neurons (a simplified version of a Hodgkin–Huxley type neuron) reproduces important aspects of IO activity. The simplified phenomenological model neuron facilitated the analysis of the single-cell and network properties of the IO while still quantitatively reproducing the spiking patterns of complex spike activity observed by simultaneous recording in anesthetized rats. The results imply that both the intrinsic bistability of each neuron and the gap junction coupling among neurons play key roles in the generation of the spatio-temporal dynamics of IO neurons.
APA, Harvard, Vancouver, ISO, and other styles
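Gap-junction coupling of the kind described here is diffusive: each cell receives a current proportional to the voltage difference with its neighbor. A minimal two-cell sketch (illustrative parameters on passive cells, not the paper's conductance-based IO model):

```python
import numpy as np

def voltage_gap(g_c, T=20.0, dt=0.01, tau=10.0, v_rest=-60.0):
    """Two passive cells coupled by a gap junction; each receives the
    diffusive current g_c * (V_other - V_self). Returns |V1 - V2| over
    time. All constants are illustrative."""
    n = int(T / dt)
    v1, v2 = -50.0, -70.0                    # distinct initial conditions
    diff = np.empty(n)
    for s in range(n):
        dv1 = (v_rest - v1) / tau + g_c * (v2 - v1)
        dv2 = (v_rest - v2) / tau + g_c * (v1 - v2)
        v1 += dt * dv1
        v2 += dt * dv2
        diff[s] = abs(v1 - v2)
    return diff
```

Stronger coupling pulls the two membrane potentials together faster, the mechanism behind the synchronized spiking the abstract describes, shown here on passive cells only.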
45

Xing, J., and G. L. Gerstein. "Networks with lateral connectivity. I. dynamic properties mediated by the balance of intrinsic excitation and inhibition." Journal of Neurophysiology 75, no. 1 (January 1, 1996): 184–99. http://dx.doi.org/10.1152/jn.1996.75.1.184.

Full text
Abstract:
1. We studied the rapid dynamic changes of neuron response properties in the somatosensory cortex by the use of computer simulations. The model consists of three feedforward layers of spiking neurons, corresponding to skin, subcortex, and cortex structures. Measurements and analysis of model activity throughout this work are similar to those used in neurophysiological experiments. 2. The effects of various parameters on response properties of model neurons were investigated. The most important parameters were the lateral excitation and inhibition in the simulated cortical network. 3. The balance between excitation and inhibition is a key factor in determining the stability of the network model. There is a large excitation-inhibition (E-I) parameter region within which the model can stably respond to inputs. 4. The input-output relations and receptive field (RF) sizes of simulated neurons are modifiable by the E-I balance. The shapes of RFs are determined by both feedforward projections and the spatial distribution of lateral connections. 5. We simulated changes in temporal and spatial properties of neurons in response to manipulations that mimic bicuculline methiodide or glutamate application to the cortex. Simulation results agreed well with experimental data, suggesting that cortical transmitter levels play an important role in the dynamic responses of the neural net through their effects on E-I balance. 6. With parameters of the model set to an inhibition-dominant scheme, the model was able to reproduce experimentally observed rapid RF expansions that follow cortical lesion or input denervation. Simulation results also suggested that spontaneous inputs to a sensory system can serve as a source of tonic inhibition in the cortex. 7. We conclude that lateral connections could produce and maintain a cortical network having dynamic properties without the need to invoke synaptic plasticity. Individual neuron properties could be modified by changing the balance of cortical layer excitation and inhibition. In a real brain, this could be achieved either by changing levels of cortical transmitter (gamma-aminobutyric acid, for example) or by changing tonic background input to the cortical network.
APA, Harvard, Vancouver, ISO, and other styles
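The role of the E-I balance can be illustrated with a two-population firing-rate reduction. This is an illustrative stand-in for the paper's three-layer spiking model; all weights and constants below are assumptions:

```python
import numpy as np

def steady_excitatory_rate(w_ei, w_ee=1.2, w_ie=1.0, w_ii=0.5,
                           ext=1.0, tau=10.0, dt=0.1, steps=5000):
    """Two-population (excitatory/inhibitory) firing-rate model with a
    saturating gain; returns the excitatory rate after relaxation.
    w_ei sets the strength of lateral inhibition onto excitatory cells."""
    f = lambda x: np.tanh(max(x, 0.0))       # rectified, saturating gain
    r_e, r_i = 0.0, 0.0
    for _ in range(steps):
        dr_e = (-r_e + f(w_ee * r_e - w_ei * r_i + ext)) / tau
        dr_i = (-r_i + f(w_ie * r_e - w_ii * r_i)) / tau
        r_e += dt * dr_e
        r_i += dt * dr_i
    return float(r_e)
```

Raising `w_ei` (shifting the balance toward inhibition) lowers the excitatory response, a rate-model analogue of the input-output and receptive-field changes the abstract attributes to the E-I balance.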
46

JOHNSTON, DAVID, SIMON PETER MEKHAIL, MARY ANN GO, and VINCENT R. DARIA. "MODELING NEURONAL RESPONSE TO SIMULTANEOUS AND SEQUENTIAL MULTI-SITE SYNAPTIC STIMULATION." International Journal of Modern Physics: Conference Series 17 (January 2012): 1–8. http://dx.doi.org/10.1142/s2010194512007878.

Full text
Abstract:
Theories of information flow in the brain hold that each neuron in a network receives synaptic inputs and sends its processed signals on to neighboring neurons. Here, we model these synaptic inputs to understand how each neuron processes them and transmits neurotransmitters to neighboring neurons. We use the NEURON simulation package to stimulate a neuron at multiple synaptic locations along its dendritic tree. Accumulation of multiple synaptic inputs changes the neuron's membrane potential, leading to the firing of an action potential. Our simulations show that simultaneous synaptic stimulation brings the neuron to the firing threshold with fewer inputs than sequential stimulation at multiple sites distributed along several dendritic branches.
APA, Harvard, Vancouver, ISO, and other styles
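The difference between simultaneous and sequential stimulation can be reproduced with a toy leaky point neuron in plain Python. This is a stand-in for the paper's NEURON simulations; the weight, time constant, and input times are illustrative:

```python
def peak_depolarization(input_times, w=2.0, tau=20.0, dt=0.1, T=100.0):
    """Leaky point neuron receiving identical EPSP-like kicks of size w at
    the given times; returns the peak membrane deflection."""
    v, peak = 0.0, 0.0
    for s in range(int(T / dt)):
        t = s * dt
        v -= v / tau * dt                    # passive leak
        for t_in in input_times:
            if abs(t - t_in) < dt / 2:       # deliver each kick once
                v += w
        peak = max(peak, v)
    return peak

# Five inputs arriving together sum fully; spread over 40 ms, earlier
# inputs decay before later ones arrive, so the peak is lower.
peak_sim = peak_depolarization([10.0] * 5)
peak_seq = peak_depolarization([10.0, 20.0, 30.0, 40.0, 50.0])
```

Whichever input pattern first pushes the peak past a fixed threshold fires the action potential, so simultaneous stimulation needs fewer inputs, as the abstract reports.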
47

Reutimann, Jan, Michele Giugliano, and Stefano Fusi. "Event-Driven Simulation of Spiking Neurons with Stochastic Dynamics." Neural Computation 15, no. 4 (April 1, 2003): 811–30. http://dx.doi.org/10.1162/08997660360581912.

Full text
Abstract:
We present a new technique, based on a proposed event-based strategy (Mattia & Del Giudice, 2000), for efficiently simulating large networks of simple model neurons. The strategy is based on the fact that interactions among neurons occur by means of events that are well localized in time (the action potentials) and relatively rare. In the interval between two of these events, the state variables associated with a model neuron or a synapse evolve deterministically and in a predictable way. Here, we extend the event-driven simulation strategy to the case in which the dynamics of the state variables in the inter-event intervals are stochastic. This extension captures both the situation in which the simulated neurons are inherently noisy and the case in which they are embedded in a very large network and receive a huge number of random synaptic inputs. We show how to effectively include the impact of large background populations into neuronal dynamics by means of the numerical evaluation of the statistical properties of single-model neurons under random current injection. The new simulation strategy allows the study of networks of interacting neurons with an arbitrary number of external afferents and inherent stochastic dynamics.
APA, Harvard, Vancouver, ISO, and other styles
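The deterministic inter-event dynamics that the event-driven strategy exploits can be sketched for a single leaky integrate-and-fire neuron, whose decay between input spikes has a closed form. Parameters are illustrative, and this sketch covers only the deterministic case, not the paper's stochastic extension:

```python
import math

def event_driven_lif(spike_times, weights, tau=20.0, v_th=1.0):
    """Event-driven leaky integrate-and-fire neuron: between input events
    the membrane decays in closed form, so the state is advanced only when
    an event arrives, with no fixed time stepping."""
    v, t_last, out = 0.0, 0.0, []
    for t, w in sorted(zip(spike_times, weights)):
        v *= math.exp(-(t - t_last) / tau)   # analytic decay since last event
        v += w                               # instantaneous synaptic kick
        t_last = t
        if v >= v_th:                        # threshold crossing
            out.append(t)
            v = 0.0                          # reset
    return out
```

Because the membrane only changes at events, threshold crossings can occur only at event times, which is the simplification that makes the event-queue approach exact for this model.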
48

Ramanathan, Kiruthika, and Sheng-Uei Guan. "Multiorder Neurons for Evolutionary Higher-Order Clustering and Growth." Neural Computation 19, no. 12 (December 2007): 3369–91. http://dx.doi.org/10.1162/neco.2007.19.12.3369.

Full text
Abstract:
This letter proposes to use multiorder neurons for clustering irregularly shaped data arrangements. Multiorder neurons are an evolutionary extension of the use of higher-order neurons in clustering. Higher-order neurons parametrically model complex neuron shapes by replacing the classic synaptic weight by higher-order tensors. The multiorder neuron goes one step further and eliminates two problems associated with higher-order neurons. First, it uses evolutionary algorithms to select the best neuron order for a given problem. Second, it obtains more information about the underlying data distribution by identifying the correct order for a given cluster of patterns. Empirically, we observed that when clustering accuracy is measured by the correlation of the clusters found with ground-truth information, the proposed evolutionary multiorder neuron method outperforms other related clustering methods. The simulation results from the Iris, Wine, and Glass data sets show significant improvement when compared to the results obtained using self-organizing maps and higher-order neurons. The letter also proposes an intuitive model by which multiorder neurons can be grown, thereby determining the number of clusters in data.
APA, Harvard, Vancouver, ISO, and other styles
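The step from a classic synaptic weight to a higher-order tensor can be illustrated with a second-order neuron, whose response is a quadratic form and whose iso-response surfaces are therefore ellipsoids rather than hyperplanes. This sketches only the underlying idea; the paper's multiorder method additionally evolves the tensor order per cluster:

```python
import numpy as np

def second_order_response(x, c, Q):
    """A second-order neuron: the scalar synaptic weight is replaced by a
    tensor Q, and the response to input x is the quadratic form
    (x - c)^T Q (x - c). With positive-definite Q, one such 'neuron' can
    capture an elliptical cluster centered at c."""
    d = np.asarray(x, dtype=float) - np.asarray(c, dtype=float)
    return float(d @ np.asarray(Q, dtype=float) @ d)
```

Clustering then assigns each pattern to the neuron with the smallest response, just as a self-organizing map does with Euclidean distance, which is the special case Q = I.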
49

Wang, Zhijie, Xia Peng, Fang Han, and Guangxiao Song. "A novel parallel clock-driven algorithm for simulation of neuronal networks based on virtual synapse." SIMULATION 96, no. 4 (February 19, 2020): 415–27. http://dx.doi.org/10.1177/0037549720903804.

Full text
Abstract:
The traditional clock-driven algorithm is very time-consuming when performed on large-scale neuronal networks due to the huge number of synaptic current computations and the low performance of parallel implementations of the algorithm. We find in this paper that the conductance coefficients of all the synapses coming from the same presynaptic neuron (neuron i, for example) do not need to be computed one by one; rather, only one common conductance coefficient needs to be computed for all synapses from this neuron. We then propose the idea of a virtual synapse for neuron i to compute this common conductance coefficient, and thereby have N virtual synapses (where N is the number of neurons in the network) for all presynaptic neurons in the network. Since each common conductance depends only on the spiking activity of the presynaptic neuron i and is independent of the postsynaptic neurons, the computation of the different virtual synapses can be deployed efficiently to different processing units. By introducing a circular data structure for the virtual synapses, we present a novel parallel clock-driven algorithm based on graphics processors for the simulation of neuronal networks. Test results demonstrate that the proposed algorithm greatly reduces memory and time consumption and effectively improves the performance of parallelized large-scale neuronal network simulations.
APA, Harvard, Vancouver, ISO, and other styles
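The virtual-synapse idea reduces the per-step synaptic work from O(N^2) conductance updates to O(N): one shared conductance per presynaptic neuron, reused by every postsynaptic target through the weight matrix. A NumPy sketch of that bookkeeping (the exponential-decay synapse model and all names are illustrative assumptions, not the paper's GPU implementation):

```python
import numpy as np

def synaptic_currents(spike_flags, g_virtual, W, E_syn, V_post,
                      tau_s=5.0, dt=0.1):
    """One 'virtual synapse' conductance per presynaptic neuron:
    g_virtual[j] depends only on neuron j's spiking, so it is updated once
    per step (O(N)) and every postsynaptic current reuses it through the
    weight matrix, instead of updating all N*N synaptic states."""
    # advance the N shared conductances, not N*N per-synapse states
    g_virtual *= np.exp(-dt / tau_s)
    g_virtual += spike_flags.astype(float)        # jump on presynaptic spike
    # postsynaptic current i: sum_j W[i, j] * g_virtual[j] * (E_syn - V_post[i])
    I = (W @ g_virtual) * (E_syn - V_post)
    return I, g_virtual
```

Each step then costs N conductance updates plus one matrix-vector product, and since every virtual synapse depends only on its own presynaptic neuron, the N updates parallelize trivially across processing units.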
50

Taylan, Osman, Mona Abusurrah, Ehsan Eftekhari-Zadeh, Ehsan Nazemi, Farheen Bano, and Ali Roshani. "Controlling Effects of Astrocyte on Neuron Behavior in Tripartite Synapse Using VHDL–AMS." Mathematics 9, no. 21 (October 25, 2021): 2700. http://dx.doi.org/10.3390/math9212700.

Full text
Abstract:
Astrocyte cells form the largest cell population in the brain and can influence neuron behavior. These cells provide appropriate feedback control in regulating neuronal activities in the Central Nervous System (CNS). This paper presents a set of equations as a model to describe the interactions between neurons and astrocytes. A VHDL–AMS-based tripartite synapse model that includes a pre-synaptic neuron, the synaptic terminal, a post-synaptic neuron, and an astrocyte cell is presented. In this model, the astrocyte acts as a controller module for the neurons and can regulate their spiking activity. Simulation results show that by adjusting the coupling coefficients of the astrocyte, the spiking frequency of the neurons can be reduced and the activity of the neuronal cells modulated.
APA, Harvard, Vancouver, ISO, and other styles
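The astrocyte-as-controller loop can be sketched behaviourally in Python: a slow variable integrates the neuron's spikes and feeds back an inhibitory current, so stronger coupling lowers the firing rate. The paper's model is a VHDL–AMS tripartite synapse; everything below is an illustrative reduction:

```python
def spike_count_with_astrocyte(k_astro, I_ext=1.5, T=500.0, dt=0.1,
                               tau_m=10.0, tau_a=100.0, v_th=1.0):
    """Leaky integrate-and-fire neuron plus a slow astrocyte-like variable
    `a` that integrates the neuron's spikes and feeds back the inhibitory
    current -k_astro * a. Returns the number of spikes in T ms."""
    v, a, count = 0.0, 0.0, 0
    for _ in range(int(T / dt)):
        v += dt * (-v + I_ext - k_astro * a) / tau_m
        a += dt * (-a / tau_a)
        if v >= v_th:
            count += 1
            v = 0.0          # reset after a spike
            a += 1.0         # astrocyte senses the spike
    return count
```

Increasing `k_astro` strengthens the astrocytic feedback and lowers the firing rate, mirroring the abstract's observation that the coupling coefficients modulate spiking frequency.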
