Academic literature on the topic 'The NEURON simulator'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'The NEURON simulator.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "The NEURON simulator"

1

Zeguang, LI, Jun Sun, Chunlin Wei, Zhe Sui, and Xiaoye Qian. "RESEARCH ON THE CROSS-SECTION GENERATING METHOD IN HTGR SIMULATOR BASED ON MACHINE LEARNING METHODS." EPJ Web of Conferences 247 (2021): 02039. http://dx.doi.org/10.1051/epjconf/202124702039.

Full text
Abstract:
With the increasing need for accurate simulation, a 3-D diffusion reactor physics module has been implemented in the HTGR engineering simulator to give better neutron dynamics results than the point kinetics model used in previous nuclear power plant simulators. To meet the real-time calculation requirement of a nuclear power plant simulator, the cross-sections used in the 3-D diffusion module must be calculated very efficiently. Normally, each cross-section in the simulator is calculated as a polynomial in several concerned variables, whose expression is finalized by multivariate regression over a large scattered database generated by previous calculations. Since the polynomial is explicit and prepared in advance, the cross-sections can be calculated quickly enough in the running simulator and achieve acceptable accuracy, especially in LWR simulations. However, some of the concerned variables in HTGR span large ranges, and the relationships between these variables are non-linear and very complex, so it is very hard for a polynomial to meet full-range accuracy. In this paper, a cross-section generating method for the HTGR simulator is proposed, based on machine learning methods, especially deep neural networks and tree regression. This method first uses deep neural networks to capture the nonlinear relationships between different variables and then uses tree regression to achieve accurate cross-section results over the full range; the parameters of the deep neural networks and the tree regression are learned automatically from the scattered database generated by VSOP. Numerical tests show that the proposed cross-section generating method obtains more accurate cross-section results with calculation times acceptable to the simulator.
2

Plesser, Hans E., and Markus Diesmann. "Simplicity and Efficiency of Integrate-and-Fire Neuron Models." Neural Computation 21, no. 2 (February 2009): 353–59. http://dx.doi.org/10.1162/neco.2008.03-08-731.

Full text
Abstract:
Lovelace and Cios (2008) recently proposed a very simple spiking neuron (VSSN) model for simulations of large neuronal networks as an efficient replacement for the integrate-and-fire neuron model. We argue that the VSSN model falls behind key advances in neuronal network modeling over the past 20 years, in particular, techniques that permit simulators to compute the state of the neuron without repeated summation over the history of input spikes and to integrate the subthreshold dynamics exactly. State-of-the-art solvers for networks of integrate-and-fire model neurons are substantially more efficient than the VSSN simulator and allow routine simulations of networks of some 10^5 neurons and 10^9 connections on moderate computer clusters.
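The exact-integration idea the authors defend, computing the subthreshold state from a closed-form propagator instead of summing over the input-spike history, can be sketched in a few lines of plain Python (illustrative parameter values and names, not code from the paper):

```python
import math

def lif_exact_step(v, i_ext, dt, tau=0.01, r=1.0e7):
    """One exact-integration step of a leaky integrate-and-fire neuron.

    The subthreshold ODE tau * dV/dt = -V + R*I has a closed-form
    solution over a step of length dt, so the state is advanced without
    any summation over the history of input spikes.
    """
    p = math.exp(-dt / tau)                 # exact decay factor for this step
    return v * p + r * i_ext * (1.0 - p)    # exact update for constant input

def simulate(i_ext, t_end, dt=1e-4, v_th=0.015, v_reset=0.0):
    """Threshold-and-reset loop around the exact subthreshold update."""
    v, spikes, t = 0.0, [], 0.0
    while t < t_end:
        v = lif_exact_step(v, i_ext, dt)
        if v >= v_th:                       # threshold crossing: spike and reset
            spikes.append(t)
            v = v_reset
        t += dt
    return v, spikes
```

With a constant drive below threshold the membrane settles at R*I exactly, whatever the step size; avoiding the per-step history summation is the efficiency advantage over schemes like the VSSN model.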
3

Kleijnen, Robert, Markus Robens, Michael Schiek, and Stefan van Waasen. "A Network Simulator for the Estimation of Bandwidth Load and Latency Created by Heterogeneous Spiking Neural Networks on Neuromorphic Computing Communication Networks." Journal of Low Power Electronics and Applications 12, no. 2 (April 21, 2022): 23. http://dx.doi.org/10.3390/jlpea12020023.

Full text
Abstract:
Accelerated simulations of biological neural networks are in demand to discover the principles of biological learning. Novel many-core simulation platforms, e.g., SpiNNaker, BrainScaleS and Neurogrid, allow one to study neuron behavior in the brain at an accelerated rate, with a high level of detail. However, they do not come anywhere near simulating the human brain. The massive amount of spike communication has turned out to be a bottleneck. We specifically developed a network simulator to analyze in high detail the network loads and latencies caused by different network topologies and communication protocols in neuromorphic computing communication networks. This simulator allows simulating the impacts of heterogeneous neural networks and evaluating neuron mapping algorithms, which is a unique feature among state-of-the-art network models and simulators. The simulator was cross-checked by comparing the results of a homogeneous neural network-based run with corresponding bandwidth load results from comparable works. Additionally, the increased level of detail achieved by the new simulator is presented. Then, we show the impact heterogeneous connectivity can have on the network load, first for a small-scale test case, and later for a large-scale test case, and how different neuron mapping algorithms can influence this effect. Finally, we look at the latency estimations performed by the simulator for different mapping algorithms, and the impact of the node size.
4

Holker, Ruchi, and Seba Susan. "Neuroscience-Inspired Parameter Selection of Spiking Neuron Using Hodgkin Huxley Model." International Journal of Software Science and Computational Intelligence 13, no. 2 (April 2021): 89–106. http://dx.doi.org/10.4018/ijssci.2021040105.

Full text
Abstract:
Spiking neural networks (SNN) are currently being researched to design an artificial brain and teach it how to think, perform, and learn like a human brain. This paper focuses on exploring optimal values of the parameters of biological spiking neurons for the Hodgkin Huxley (HH) model. As per previous research, the HH model exhibits the maximum number of neurocomputational properties compared to other spiking models. This paper investigates the HH model parameters for the Class 1, Class 2, phasic spiking, and integrator neurocomputational properties. For the simulation of spiking neurons, the NEURON simulator is used since it is easy to understand and code.
5

Zheng, Zhu An, Chuan Xue Song, Hui Lin, and Si Lun Peng. "Research of the Brake Pedal Feel on Wire-by-Brake-System." Advanced Materials Research 655-657 (January 2013): 1131–35. http://dx.doi.org/10.4028/www.scientific.net/amr.655-657.1131.

Full text
Abstract:
This paper analyzes and compares conventional brake systems with a brake-by-wire system that uses a pedal stroke simulator. A model of the pedal stroke simulator is established in the AMESim software and coupled with Matlab/Simulink to design a single-neuron adaptive intelligent PID control strategy for the pedal stroke simulator. Simulation verification shows that this brake-by-wire system and control strategy can reproduce the brake pedal feel of conventional brake systems and effectively improve comfort during braking.
6

Anghel, Daniel Constantin, and Nadia Belu. "Contributions to Ranking an Ergonomic Workstation, Considering the Human Effort and the Microclimate Parameters, Using Neural Networks." Applied Mechanics and Materials 371 (August 2013): 812–16. http://dx.doi.org/10.4028/www.scientific.net/amm.371.812.

Full text
Abstract:
The paper presents a method of using a feed-forward neural network to rank a working place in the manufacturing industry. Neural networks excel at capturing difficult non-linear relationships between the inputs and outputs of a system. The neural network is simulated with a simple simulator: SSNN. In this paper, we considered 6 input parameters as relevant for ranking a work place: temperature, humidity, noise, luminosity, load, and frequency. The neural network designed for the study presented in this paper has 6 input neurons, 13 neurons in the hidden layer, and 1 neuron in the output layer. We also present some experimental results obtained through simulations.
7

Lytton, William W., Alexandra H. Seidenstein, Salvador Dura-Bernal, Robert A. McDougal, Felix Schürmann, and Michael L. Hines. "Simulation Neurotechnologies for Advancing Brain Research: Parallelizing Large Networks in NEURON." Neural Computation 28, no. 10 (October 2016): 2063–90. http://dx.doi.org/10.1162/neco_a_00876.

Full text
Abstract:
Large multiscale neuronal network simulations are of increasing value as more big data are gathered about brain wiring and organization under the auspices of a current major research initiative, such as Brain Research through Advancing Innovative Neurotechnologies. The development of these models requires new simulation technologies. We describe here the current use of the NEURON simulator with message passing interface (MPI) for simulation in the domain of moderately large networks on commonly available high-performance computers (HPCs). We discuss the basic layout of such simulations, including the methods of simulation setup, the run-time spike-passing paradigm, and postsimulation data storage and data management approaches. Using the Neuroscience Gateway, a portal for computational neuroscience that provides access to large HPCs, we benchmark simulations of neuronal networks of different sizes (500–100,000 cells), and using different numbers of nodes (1–256). We compare three types of networks, composed of either Izhikevich integrate-and-fire neurons (I&F), single-compartment Hodgkin-Huxley (HH) cells, or a hybrid network with half of each. Results show simulation run time increased approximately linearly with network size and decreased almost linearly with the number of nodes. Networks with I&F neurons were faster than HH networks, although differences were small since all tested cells were point neurons with a single compartment.
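The layout described here, cells distributed across ranks by global id (gid) and spikes exchanged once per minimum synaptic delay, can be mimicked in plain Python without MPI. The function names below are ours, not NEURON's API:

```python
NHOST = 4        # pretend number of MPI ranks
N_CELLS = 10     # global cell count

def gids_on_rank(rank, n_cells=N_CELLS, nhost=NHOST):
    """Round-robin distribution: a rank owns every gid with
    gid % nhost == rank, a simple way to balance cells across hosts."""
    return [gid for gid in range(n_cells) if gid % nhost == rank]

def exchange(spike_buffers):
    """All-to-all spike exchange: each rank contributes its local spike
    buffer and receives the merged global list, mimicking the collective
    step performed once per minimum-delay interval."""
    merged = sorted(s for buf in spike_buffers for s in buf)
    return [list(merged) for _ in spike_buffers]
```

In a real run each entry of `spike_buffers` would live on a different host and the merge would be an MPI collective (e.g. an Allgather); between exchanges every rank integrates only its own cells.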
8

Mohamed, Z., M. A. Ayub, M. H. M. Ramli, M. S. B. Shaari, and S. Khusairi. "Assistive Robot Simulator for Multi-Objective Evolutionary Algorithm Application." International Journal of Engineering & Technology 7, no. 4.27 (November 30, 2018): 153. http://dx.doi.org/10.14419/ijet.v7i4.27.22506.

Full text
Abstract:
This paper presents a new assistive robot simulator for multi-objective optimization applications. The main function of the simulator is to simulate the trajectory of the robot arm as it moves from an initial to a goal position in an optimized manner. A multi-objective evolutionary algorithm (MOEA) is utilized to generate robot arm motions optimizing three different objective functions: optimum time, distance, and high stability. The generated neuron will be selected from the Pareto optimal set based on the required objective functions. The robot will intelligently choose the best neuron for a specific task: for example, moving a glass of water requires higher stability than moving an empty mineral water bottle. The simulator will be connected to the real robot to test its performance in a real environment. The kinematics, mechatronics, and the real robot's specifications are utilized in the simulator. The performance of the simulator is presented in this paper.
9

HAMMARLUND, PER, BJÖRN LEVIN, and ANDERS LANSNER. "BIOLOGICALLY REALISTIC AND ARTIFICIAL NEURAL NETWORK SIMULATORS ON THE CONNECTION MACHINE." International Journal of Modern Physics C 04, no. 01 (February 1993): 49–63. http://dx.doi.org/10.1142/s0129183193000070.

Full text
Abstract:
We describe two neural network (NN) simulators implemented on the Connection Machine (CM). The first program is aimed at biologically realistic simulations and the second at recurrent artificial NNs. Both programs are currently used as simulation engines in research within the SANS group as well as in other groups. The program for biologically realistic NN simulations on the CM is called BIOSIM. The aim is to simulate NNs in which the neurons are modeled with a high degree of biological realism. The cell model used is a compartmentalized abstraction of the neuron. It includes sodium, potassium, calcium, and calcium dependent potassium channels. Synaptic interaction includes conventional chemical synapses as well as voltage gated NMDA synapses. On a CM with 8K processors the program is typically capable of handling some tens of thousands of compartments and more than ten times as many synapses. The artificial NN simulator implements the SANS model, a recurrent NN model closely related to the Hopfield model. The aim has been to effectively support large network simulations, in the order of 8–16K units, on an 8K CM. To make the simulator optimal for different applications, it supports both fully and sparsely connected networks. The implementation for sparsely connected NNs uses a compacted weight matrix. Both implementations are optimized for sparse activity.
10

Belyaev and Velichko. "A Spiking Neural Network Based on the Model of VO2—Neuron." Electronics 8, no. 10 (September 20, 2019): 1065. http://dx.doi.org/10.3390/electronics8101065.

Full text
Abstract:
In this paper, we present an electrical circuit of a leaky integrate-and-fire neuron with one VO2 switch, which models the properties of biological neurons. Based on VO2 neurons, a two-layer spiking neural network consisting of nine input and three output neurons is modeled in the SPICE simulator. The network contains excitatory and inhibitory couplings, and implements the winner-takes-all principle in pattern recognition. Using a supervised Spike-Timing-Dependent Plasticity training method and a timing method of information coding, the network was trained to recognize three patterns with dimensions of 3 × 3 pixels. The neural network is able to recognize up to 10^5 images per second, and has the potential to increase the recognition speed further.
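The winner-takes-all readout of the 9-input, 3-output network described above can be illustrated with a plain-Python stand-in (our sketch, not the SPICE circuit; the weights are abstract numbers standing in for trained synaptic couplings):

```python
def classify(pixels, weights):
    """One pass through a 9-input, 3-output layer with a winner-takes-all
    readout: lateral inhibition leaves only the most strongly driven
    output neuron active, so the classification is the index of the
    output with the largest total drive."""
    drives = [sum(w * p for w, p in zip(ws, pixels)) for ws in weights]
    return max(range(len(drives)), key=lambda k: drives[k])
```

With three 3 × 3 template patterns as weight rows, a noiseless input matching one template drives its own output neuron hardest and wins.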
More sources

Dissertations / Theses on the topic "The NEURON simulator"

1

Liao, James Yu-Chang. "Evaluating Multi-Modal Brain-Computer Interfaces for Controlling Arm Movements Using a Simulator of Human Reaching." Case Western Reserve University School of Graduate Studies / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=case1404138858.

Full text
2

Guillon, Didier. "Noos : Neural Object Oriented Simulator : un simulateur orienté objet d'un neurone biologique." Université Joseph Fourier (Grenoble), 1997. http://www.theses.fr/1997GRE19002.

Full text
3

Xu, Shuxiang. "Neuron-adaptive neural network models and applications." PhD thesis, University of Western Sydney, Faculty of Informatics, Science and Technology, 1999. http://handle.uws.edu.au:8081/1959.7/275.

Full text
Abstract:
Artificial Neural Networks have been widely explored by researchers worldwide to cope with problems such as function approximation and data simulation. This thesis deals with Feed-forward Neural Networks (FNN's) with a new neuron activation function called the Neuron-adaptive Activation Function (NAF), and Feed-forward Higher Order Neural Networks (HONN's) with this new neuron activation function. We have designed a new neural network model, the Neuron-Adaptive Neural Network (NANN), and mathematically proved that one NANN can approximate any piecewise continuous function to any desired accuracy. In the neural network literature, only Zhang proved the universal approximation ability of an FNN Group to any piecewise continuous function. Next, we have developed the approximation properties of Neuron Adaptive Higher Order Neural Networks (NAHONN's), a combination of HONN's and NAF, to any continuous function, functional, and operator. Finally, we have created a software program called MASFinance, which runs on the Solaris system, for the approximation of continuous or discontinuous functions and for the simulation of any continuous or discontinuous data (especially financial data). Our work distinguishes itself from previous work in the following ways: we use a new neuron-adaptive activation function, while the neuron activation functions in most existing work are fixed and cannot be tuned to adapt to different approximation problems; we use only one NANN to approximate any piecewise continuous function, while a neural network group had to be utilised in previous research; we combine HONN's with NAF and investigate the approximation properties to any continuous function, functional, and operator; and we present a new software program, MASFinance, for function approximation and data simulation.
Experiments running MASFinance indicate that the proposed NANN's present several advantages over traditional neuron-fixed networks (such as greatly reduced network size, faster learning, and smaller simulation errors), and that the suggested NANN's can approximate piecewise continuous functions more effectively than neural network groups. Experiments also indicate that NANN's are especially suitable for data simulation.
Doctor of Philosophy (PhD)
4

Preyer, Amanda Jervis. "Coupling and synchrony in neuronal networks electrophysiological experiments /." Diss., Atlanta, Ga. : Georgia Institute of Technology, 2007. http://hdl.handle.net/1853/24799.

Full text
Abstract:
Thesis (Ph.D.)--Electrical and Computer Engineering, Georgia Institute of Technology, 2008.
Committee Chair: Butera, Robert; Committee Member: Canavier, Carmen; Committee Member: DeWeerth, Stephen; Committee Member: Hasler, Paul; Committee Member: Lanterman, Aaron; Committee Member: Prinz, Astrid.
5

Kulakov, Anton. "Multiprocessing neural network simulator." Thesis, University of Southampton, 2013. https://eprints.soton.ac.uk/348420/.

Full text
Abstract:
Over the last few years tremendous progress has been made in neuroscience by employing simulation tools for investigating neural network behaviour. Many simulators have been created during the last few decades, and their number and set of features continually grow due to persistent interest from groups of researchers and engineers. Simulation software that is able to simulate a large-scale neural network has been developed and presented in this work. Based on a highly abstract integrate-and-fire neuron model, a clock-driven sequential simulator has been developed in C++. The created program is able to associate input patterns with output patterns. The novel biologically plausible learning mechanism uses Long Term Potentiation and Long Term Depression to change the strength of the connections between the neurons based on a global binary feedback. Later, the sequentially executed model was extended to a multi-processor system, which executes the described learning algorithm using the event-driven technique on a parallel distributed framework, simulating a neural network asynchronously. This allows the simulation to manage larger-scale neural networks while remaining immune to processor failure and communication problems. The main benefit of the resulting multi-processor neural network simulator is the possibility to simulate large-scale neural networks using highly parallel distributed computing. For that reason the design of the simulator incorporates an efficient weight-adjusting algorithm and an efficient mechanism for asynchronous local communication between processors.
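The learning mechanism summarized above, LTP/LTD driven by a global binary feedback signal, can be sketched as follows (the learning rate and the additive update form are our illustrative assumptions, not details from the thesis):

```python
def update_weights(weights, pre_active, post_active, feedback, lr=0.05):
    """Adjust connection strengths from a global binary feedback signal:
    connections between co-active pre- and post-synaptic neurons are
    potentiated (LTP) when feedback is positive and depressed (LTD)
    when it is negative; all other weights are left unchanged."""
    sign = 1.0 if feedback else -1.0
    return [[w + sign * lr if pre_active[i] and post_active[j] else w
             for j, w in enumerate(row)]
            for i, row in enumerate(weights)]
```

Because the feedback is a single global bit rather than a per-synapse error, the same rule applies uniformly across the network, which is what makes it cheap to run on a distributed framework.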
6

Chen, Prakoon. "The Neural Shell : a neural networks simulator." Connect to resource, 1989. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1228839518.

Full text
7

Boatin, William. "Characterization of neuron models." Thesis, available online, Georgia Institute of Technology, 2005. http://etd.gatech.edu/theses/available/etd-04182005-181732/.

Full text
Abstract:
Thesis (M. S.)--Electrical and Computer Engineering, Georgia Institute of Technology, 2006.
Dr. Robert H. Lee, Committee Member ; Dr. Kurt Wiesenfeld, Committee Member ; Dr Robert J. Butera, Committee Member.
8

Liss, Anders. "Optimizing stochastic simulation of a neuron with parallelization." Thesis, Uppsala universitet, Avdelningen för beräkningsvetenskap, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-324444.

Full text
Abstract:
In order to optimize the solving of stochastic simulations of neuron channels, an attempt has been made to parallelize the solver. The implementation was unsuccessful. However, parallelization is not impossible, and it remains a field of research with great potential for improving the performance of stochastic simulations.
9

Tang, Ping. "Simulation du traitement effectué par certaines cellules étoilées du noyau cochléaire antéroventral et analyse de leur comportement en terme de modulation d'amplitude /." Thèse, Chicoutimi : Université du Québec à Chicoutimi, 1995. http://theses.uqac.ca.

Full text
10

Colbrunn, Robb William. "A Robotic Neuro-Musculoskeletal Simulator for Spine Research." Cleveland State University / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=csu1367977446.

Full text
More sources

Books on the topic "The NEURON simulator"

1

Jayanti, V. R. Neural network simulator interface. Manchester: UMIST, 1995.

Find full text
2

Goh, Wee Jin. Object-oriented neural network simulator. Oxford: Oxford Brookes University, 2003.

Find full text
3

Skrzypek, Josef, ed. Neural Network Simulation Environments. Boston, MA: Springer US, 1994. http://dx.doi.org/10.1007/978-1-4615-2736-7.

Full text
4

Skrzypek, Josef. Neural Network Simulation Environments. Boston, MA: Springer US, 1994.

Find full text
5

Friesen, W. Otto. NeuroDynamix: Computer models for neurophysiology. New York: Oxford University Press, 1994.

Find full text
6

Friesen, Jonathon A., ed. NeuroDynamix: Computer-based neuronal models for neurophysiology. Oxford: Oxford University Press, 1994.

Find full text
7

Tucci, Mario, and Marco Garetti, eds. Proceedings of the third International Workshop of the IFIP WG5.7. Florence: Firenze University Press, 2002. http://dx.doi.org/10.36253/88-8453-042-3.

Full text
Abstract:
The papers presented at the international workshop deal with the wide variety of new computer-based techniques for production planning and control that have become available to the scientific and industrial world in the past few years: formal modeling techniques, artificial neural networks, autonomous agent theory, genetic algorithms, chaos theory, fuzzy logic, simulated annealing, tabu search, simulation, and so on. The approach, while scientifically rigorous, is focused on applicability to the industrial environment.
8

McCormick, David, and Gordon M. Shepherd, eds. Electrophysiology of the neuron: An interactive tutorial. New York: Oxford University Press, 1994.

Find full text
9

Huguenard, John. Electrophysiology of the neuron: An interactive tutorial. New York: Oxford University Press, 1994.

Find full text
10

Neural networks and simulation methods. New York: M. Dekker, 1994.

Find full text
More sources

Book chapters on the topic "The NEURON simulator"

1

de Kamps, Marc, Hugh Osborne, Lukas Deutz, Frank van der Velde, Mikkel Lepperød, Yi Ming Lai, and David Sichau. "MIIND: A Population-Level Neural Simulator Incorporating Stochastic Point Neuron Models." In Encyclopedia of Computational Neuroscience, 1–4. New York, NY: Springer New York, 2019. http://dx.doi.org/10.1007/978-1-4614-7320-6_100680-1.

Full text
2

Stimberg, Marcel, Dan F. M. Goodman, Romain Brette, and Maurizio De Pittà. "Modeling Neuron–Glia Interactions with the Brian 2 Simulator." In Springer Series in Computational Neuroscience, 471–505. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-00817-8_18.

Full text
3

De Schutter, Erik. "Nodus: A User Friendly Neuron Simulator for Macintosh Computers." In Neural Systems: Analysis and Modeling, 113–19. Boston, MA: Springer US, 1993. http://dx.doi.org/10.1007/978-1-4615-3560-7_9.

Full text
4

Chatzikonstantis, George, Diego Jiménez, Esteban Meneses, Christos Strydis, Harry Sidiropoulos, and Dimitrios Soudris. "From Knights Corner to Landing: A Case Study Based on a Hodgkin-Huxley Neuron Simulator." In Lecture Notes in Computer Science, 363–75. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-67630-2_27.

Full text
5

Sridharan, Aadityan, Hemalatha Sasidharakurup, Dhanush Kumar, Nijin Nizar, Bipin Nair, Krishnashree Achuthan, and Shyam Diwakar. "Implementing a Web-Based Simulator with Explicit Neuron and Synapse Models to Aid Experimental Neuroscience and Theoretical Biophysics Education." In Lecture Notes in Electrical Engineering, 57–66. New Delhi: Springer India, 2016. http://dx.doi.org/10.1007/978-81-322-3589-7_6.

Full text
6

Smith, Robert G. "Measurement of simulation speed: its relation to simulation accuracy." In Computation in Neurons and Neural Systems, 59–64. Boston, MA: Springer US, 1994. http://dx.doi.org/10.1007/978-1-4615-2714-5_10.

Full text
7

Beeman, David. "Simulation-based Tutorials for Neuroscience Education." In Computation in Neurons and Neural Systems, 65–70. Boston, MA: Springer US, 1994. http://dx.doi.org/10.1007/978-1-4615-2714-5_11.

Full text
8

Hines, Michael, and Ted Carnevale. "NEURON Simulation Environment." In Encyclopedia of Computational Neuroscience, 1–8. New York, NY: Springer New York, 2014. http://dx.doi.org/10.1007/978-1-4614-7320-6_795-1.

Full text
9

Hines, Michael, Ted Carnevale, and Robert A. McDougal. "NEURON Simulation Environment." In Encyclopedia of Computational Neuroscience, 1–7. New York, NY: Springer New York, 2019. http://dx.doi.org/10.1007/978-1-4614-7320-6_795-2.

Full text
10

Hines, Michael, and Ted Carnevale. "NEURON Simulation Environment." In Encyclopedia of Computational Neuroscience, 2012–17. New York, NY: Springer New York, 2015. http://dx.doi.org/10.1007/978-1-4614-6675-8_795.

Full text

Conference papers on the topic "The NEURON simulator"

1

Chaud, Vitor Martins, and Andre Fabio Kohn. "Development of a User-Friendly Neuron Analyzer and Simulator (NAS)." In 2017 UKSim-AMSS 11th European Modelling Symposium (EMS). IEEE, 2017. http://dx.doi.org/10.1109/ems.2017.19.

Full text
2

Appukuttan, Shailesh, Darshan Mandge, and Rohit Manchanda. "Implementation of Syncytial Models in NEURON Simulator for Improved Efficiency." In 2020 28th Euromicro International Conference on Parallel, Distributed and Network-Based Processing (PDP). IEEE, 2020. http://dx.doi.org/10.1109/pdp50117.2020.00048.

Full text
3

Shehzad, Danish, and Zeki Bozkus. "Optimizing NEURON brain simulator with Remote Memory Access on distributed memory systems." In 2015 International Conference on Emerging Technologies (ICET). IEEE, 2015. http://dx.doi.org/10.1109/icet.2015.7389167.

Full text
4

Vaughan, Neil, Venketesh N. Dubey, Michael Y. K. Wee, and Richard Isaacs. "Artificial Neural Network to Predict Patient Body Circumferences and Ligament Thicknesses." In ASME 2013 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/detc2013-13088.

Full text
Abstract:
An artificial neural network has been implemented and trained with clinical data from 23088 patients. The aim was to predict a patient's body circumferences and ligament thicknesses from patient data. A fully connected feed-forward neural network is used, containing no loops and one hidden layer, and the learning mechanism is back-propagation of error. The neural network inputs were mass, height, age, and gender. There are eight hidden neurons and one output. The network can generate estimates for waist, arm, calf, and thigh circumferences and for the thickness of skin, fat, supraspinous and interspinous ligaments, ligamentum flavum, and epidural space. Data was divided into a training set of 11000 patients and an unseen test data set of 12088 patients. Twenty-five training cycles were completed. After each training cycle, neuron outputs advanced closer to the clinically measured data. Waist circumference was predicted within 3.92 cm (3.10% error), thigh circumference 2.00 cm (2.81% error), arm circumference 1.21 cm (2.48% error), calf circumference 1.41 cm (3.40% error), triceps skinfold 3.43 mm (7.80% error), subscapular skinfold 3.54 mm (8.46% error), and BMI was estimated within 0.46 (0.69% error). The neural network has been extended to predict ligament thicknesses using data from MRI. These predictions will then be used to configure a simulator to offer a patient-specific training experience.
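The forward pass of the 4-input, 8-hidden, 1-output network described above can be written out in a few lines (the logistic hidden activation and all weights are our illustrative assumptions; the study's trained weights are not given here):

```python
import math

def forward(x, w_hidden, w_out):
    """Forward pass of a fully connected 4-8-1 feed-forward network:
    inputs (mass, height, age, gender) -> 8 logistic hidden units ->
    one linear output, e.g. a predicted circumference."""
    hidden = [1.0 / (1.0 + math.exp(-sum(w * xi for w, xi in zip(ws, x))))
              for ws in w_hidden]
    return sum(w * h for w, h in zip(w_out, hidden))
```

Training by back-propagation of error would adjust `w_hidden` and `w_out` after each cycle to move the output toward the clinically measured value.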
5

Li, Leslie, Richard Burton, and Greg Schoenau. "Feasibility Study on the Use of Dynamic Neural Networks (DNN’s) for Modeling a Variable Displacement Load Sensing Pump." In ASME 2006 International Mechanical Engineering Congress and Exposition. ASMEDC, 2006. http://dx.doi.org/10.1115/imece2006-15588.

Full text
Abstract:
The feasibility of using a particular form of neural networks, defined as Dynamic Neural Units (DNU's), to model a pump in a load-sensing system is investigated in this paper. Because of the highly complex structure of the pump, its compensators, and its controlling elements, simulation of load-sensing pump systems poses many challenges to researchers. Several models of pumps, compensators, and valves have been developed and published in the literature, but they are either overly simplified or extremely complex. One modeling approach which can capture the nonlinear dynamic properties of the pump yet still retain reasonable simplicity in its basic form is to use neural network technology. Previous studies have shown some limited success in using feed-forward neurons with dynamic properties introduced through time delays. A problem referred to as error accumulation has prevented these neural-based models from being practical dynamic representations of load-sensing systems. Based on the topology of biological neural systems, several new structures, Dynamic Neural Units (DNU's), have been developed. Only one DNU is necessary to capture or represent some of the dynamics of a plant, which a static (feed-forward) neuron cannot do. The main advantage of the dynamic neuron is that it reduces the network dimension and the amount of computation required, and it has the potential to avoid the error-accumulation problem. The use of Dynamic Neural Networks with Dynamic Neural Units in simulating a variable displacement pump is presented in this paper. Only the pump portion of the load-sensing pump system is considered due to problems of interacting operating points. A DNU structure and a DNN (which is comprised of DNU's) are introduced. The simulation results establish the feasibility of using Dynamic Neural Networks with DNU's to model a simulated nonlinear hydraulic system such as a load-sensing pump.
6

Vaghei, Yasaman, Yashar Sarbaz, and Ahmad Ghanbari. "Modeling and Simulation Method Comparison for the Lotka-Volterra Model." In ASME 2013 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/imece2013-64193.

Abstract:
The Lotka-Volterra or predator-prey models contain a pair of first-order, non-linear differential equations which describe the dynamics of two-species interaction in biological systems. Hence, the development of accurate simulation strategies for these equations is crucial. In this paper, the model equations are first simulated with the ARX, ARMAX and BJ parametric models of the Identification Toolbox in MATLAB. The same simulation is then carried out in the Neural Network Toolbox with Feed-Forward and Elman networks having equal numbers of neurons and layers and the same transfer functions. Finally, the results of these two simulations are compared to identify the best simulation methodology. It is shown that more accurate results are achieved by the Elman network. In addition, the paper demonstrates that the simulation error can be decreased simply by increasing the number of neurons in these neural networks.
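For readers who want to reproduce the underlying dynamics, the following self-contained sketch integrates the Lotka-Volterra equations with a fixed-step RK4 scheme. The coefficient values and initial state are arbitrary illustrative choices, not those used in the cited paper.

```python
# Direct numerical integration of the Lotka-Volterra system
#   dx/dt = a*x - b*x*y,   dy/dt = d*x*y - g*y
# with classical fourth-order Runge-Kutta. All parameters are placeholders.
import math

def lotka_volterra(state, a=1.0, b=0.5, d=0.2, g=0.6):
    x, y = state
    return (a * x - b * x * y, d * x * y - g * y)

def rk4_step(state, h):
    k1 = lotka_volterra(state)
    k2 = lotka_volterra((state[0] + h/2 * k1[0], state[1] + h/2 * k1[1]))
    k3 = lotka_volterra((state[0] + h/2 * k2[0], state[1] + h/2 * k2[1]))
    k4 = lotka_volterra((state[0] + h * k3[0], state[1] + h * k3[1]))
    return (state[0] + h/6 * (k1[0] + 2*k2[0] + 2*k3[0] + k4[0]),
            state[1] + h/6 * (k1[1] + 2*k2[1] + 2*k3[1] + k4[1]))

def invariant(state, a=1.0, b=0.5, d=0.2, g=0.6):
    # The exact flow conserves this quantity; its drift measures solver error.
    x, y = state
    return d*x - g*math.log(x) + b*y - a*math.log(y)

state = (2.0, 1.0)
v0 = invariant(state)
for _ in range(5000):          # integrate to t = 50 with h = 0.01
    state = rk4_step(state, 0.01)
```

Checking the conserved quantity after integration is a convenient accuracy test for any of the simulation methodologies the paper compares.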
7

Khan, Nafisah, Muhammad Ali, Ahmed Hosny, and Rachid Machrafi. "Response Functions of a Boron-Loaded Plastic Scintillator to Neutron and Gamma Radiation." In 2012 20th International Conference on Nuclear Engineering and the ASME 2012 Power Conference. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/icone20-power2012-54676.

Abstract:
A boron-loaded plastic scintillator has been investigated for possible use in neutron spectrometry. The sensor's composition of hydrogen and carbon leads to multiple scattering collisions that are useful for fast neutron spectroscopy, while its boron component can serve as a thermal neutron detector. Both simulation and experimental work have been carried out to investigate the response functions of the detector to neutron and gamma radiation. The response functions of the detector have been simulated using the Monte Carlo N-Particle eXtended (MCNPX) code. For experimental tests, the sensor has been mounted on a photomultiplier tube connected to a compact data acquisition system. The system has been tested in different gamma and neutron fields at the University of Ontario Institute of Technology Neutron Facility. The simulation and experimental results have been compared and analyzed.
8

Li, Sufen, Quanhu Zhang, Yonggang Huo, and Man Zhou. "Research on the Neutron Multiplicity Pulse Trains Computer Simulation." In 2018 26th International Conference on Nuclear Engineering. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/icone26-82262.

Abstract:
Neutron multiplicity counting (NMC) is one of the most advanced Non-Destructive Assay (NDA) techniques. In this paper, the principle and method of generating neutron pulse trains are studied. A model of a 4π cube counter is established for a 238Pu sample; the capture times of neutrons are obtained with a simulation code, and neutron pulse trains are then output for post-processing from the generated fission initial-times. The results from post-processing these neutron pulse trains show a small bias compared with the input value, which is evidence for the validity of the simulation presented in this paper.
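The generation principle described above can be sketched as follows: fission events arrive as a Poisson process, each emits a random number of neutrons, and each neutron is detected after an exponentially distributed capture (die-away) time. The fission rate, die-away time, and multiplicity distribution below are invented placeholders, not the 238Pu data or the 4π counter model of the paper.

```python
# Synthetic neutron pulse train: Poisson fission initial-times, a toy
# multiplicity distribution, and exponential capture delays per neutron.
import random

def pulse_train(fission_rate=100.0, die_away=50e-6, duration=1.0, seed=1):
    rng = random.Random(seed)
    pulses, t = [], 0.0
    while True:
        t += rng.expovariate(fission_rate)   # next fission initial-time
        if t > duration:
            break
        nu = rng.choice([0, 1, 2, 2, 3])     # toy multiplicity distribution
        for _ in range(nu):
            # Each emitted neutron is captured after an exponential delay.
            pulses.append(t + rng.expovariate(1.0 / die_away))
    return sorted(pulses)                    # the detector sees time-ordered pulses

train = pulse_train()
```

A multiplicity analysis code would then scan this sorted train with coincidence gates; the bias check in the paper amounts to comparing the recovered distribution against the one fed into the generator.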
9

Li, C. James, and T. Y. Huang. "Automatic Structure and Parameter Training Methods for Modeling of Mechanical System by Recurrent Neural Networks." In ASME 1997 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 1997. http://dx.doi.org/10.1115/imece1997-0402.

Abstract:
Automatic nonlinear-system identification is very useful in various disciplines, including automatic control, mechanical diagnostics, and financial market prediction. This paper describes a fully automatic structural and weight learning method for recurrent neural networks (RNNs). The basic idea is training with residuals: a single-hidden-neuron RNN is trained to track the residuals of an existing network before being added to that network to form a larger and better one. The network continues to grow until either a desired level of accuracy or a preset maximal number of neurons is reached. The method requires the user to supply neither initial weight values nor the number of neurons in the hidden layer. This new structural and weight learning algorithm is used to find RNN models for a two-degree-of-freedom planar robot, a Van der Pol oscillator and a Mackey-Glass equation using their simulated responses to excitations. In addition, an RNN model is obtained for a real robot using its input and output measurements. The algorithm is effective in all four cases, and the RNN models were shown to be superior to linear models and hybrid models wherever the comparison was made.
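The "training with residuals" idea generalizes beyond recurrent networks. The sketch below illustrates it with trivial one-parameter sine units fitted greedily to a square wave; this is a deliberate simplification of the paper's recurrent-neuron procedure, kept only to show why the residual error can never increase as units are added.

```python
# Grow a model one unit at a time: each new unit is least-squares fitted to
# the residual of the current ensemble, then added to it. The "units" here
# are hypothetical fixed basis functions phi_k(x) = sin(k*x), not RNNs.
import math

def fit_unit(xs, residuals, k):
    # Optimal least-squares weight for the single unit sin(k*x).
    phi = [math.sin(k * x) for x in xs]
    return sum(p * r for p, r in zip(phi, residuals)) / sum(p * p for p in phi)

xs = [i * 2.0 * math.pi / 200 for i in range(200)]
target = [1.0 if math.sin(x) >= 0 else -1.0 for x in xs]   # square wave

ensemble = []                 # the model grows one unit per round
residuals = list(target)
errors = []
for k in range(1, 8):
    w = fit_unit(xs, residuals, k)
    ensemble.append((k, w))
    pred = [sum(wj * math.sin(kj * x) for kj, wj in ensemble) for x in xs]
    residuals = [t - p for t, p in zip(target, pred)]
    errors.append(sum(r * r for r in residuals))
```

Because each unit is an optimal one-dimensional fit to the current residual, every round removes (or at worst leaves unchanged) part of the remaining error, which is the property the growth-until-accuracy stopping rule relies on.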
10

Ishlam Nazrul, Mohammad Nazrul, Carl Tropper, Robert A. McDougal, and William W. Lytton. "Optimizations for Neuron Time Warp (NTW) for stochastic reaction-diffusion models of neurons." In 2017 Winter Simulation Conference (WSC). IEEE, 2017. http://dx.doi.org/10.1109/wsc.2017.8247871.


Reports on the topic "The NEURON simulator"

1

Grelle, A., Y. Cao, Y. Gohar, Y. Park, and T. Wei. Neutron Source Facility Simulator (NSFS). Office of Scientific and Technical Information (OSTI), August 2014. http://dx.doi.org/10.2172/1149679.

2

Cao, Yan, Thomas Y. Wei, Austin L. Grelle, and Yousry Gohar. Plant model of KIPT neutron source facility simulator. Office of Scientific and Technical Information (OSTI), February 2016. http://dx.doi.org/10.2172/1245184.

3

Farhi, Edward, and Hartmut Neven. Classification with Quantum Neural Networks on Near Term Processors. Web of Open Science, December 2020. http://dx.doi.org/10.37686/qrl.v1i2.80.

Abstract:
We introduce a quantum neural network, QNN, that can represent labeled data, classical or quantum, and be trained by supervised learning. The quantum circuit consists of a sequence of parameter-dependent unitary transformations that act on an input quantum state. For binary classification, a single Pauli operator is measured on a designated readout qubit. The measured output is the quantum neural network's predictor of the binary label of the input state. We show through classical simulation that parameters can be found that allow the QNN to learn to correctly distinguish the two data sets. We then discuss presenting the data as quantum superpositions of computational basis states corresponding to different label values. Here we show through simulation that learning is possible. We consider using our QNN to learn the label of a general quantum state. By example we show that this can be done. Our work is exploratory and relies on the classical simulation of small quantum systems. The QNN proposed here was designed with near-term quantum processors in mind. Therefore it will be possible to run this QNN on a near-term gate-model quantum computer, where its power can be explored beyond what can be explored with simulation.
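At its smallest scale, the readout idea can be simulated classically with a single qubit: one trainable rotation Ry(theta) and a Z measurement whose expectation value predicts the binary label. The angle encoding, the toy data, and the training loop below are illustrative assumptions, far simpler than the circuits in the cited report.

```python
# One-qubit classifier: encode scalar input x as Ry(x) on |0>, apply a
# trainable Ry(theta), and read out <Z>. For Ry rotations this composes to
# <Z> = cos(x + theta), so the whole "circuit" has a closed form.
import math

def predict(theta, x):
    return math.cos(x + theta)   # expectation of Z on the readout qubit

def loss(theta, data):
    # data: list of (x, label) with label in {-1, +1}
    return sum((predict(theta, x) - y) ** 2 for x, y in data) / len(data)

# Toy data: inputs near 0 are labeled +1, inputs near pi are labeled -1.
data = [(0.1, 1), (0.3, 1), (math.pi - 0.2, -1), (math.pi + 0.1, -1)]

theta, lr = 0.8, 0.2
for _ in range(200):             # plain gradient descent on the one parameter
    grad = sum(2 * (predict(theta, x) - y) * -math.sin(x + theta)
               for x, y in data) / len(data)
    theta -= lr * grad
```

The sign of the measured expectation then serves as the predicted label, mirroring the report's use of a single Pauli readout for binary classification.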
4

White, Thaddeus. A Modern User Interface for the LANL Neutron Pulse Simulator (NPS). Office of Scientific and Technical Information (OSTI), July 2020. http://dx.doi.org/10.2172/1643903.

5

Heinisch, H. L., and B. N. Singh. Stochastic annealing simulation of copper under neutron irradiation. Office of Scientific and Technical Information (OSTI), March 1998. http://dx.doi.org/10.2172/335409.

6

Johnson, Don H. Simulation of Excitatory/Inhibitory Interactions in Single Auditory Neurons. Fort Belvoir, VA: Defense Technical Information Center, September 1992. http://dx.doi.org/10.21236/ada253614.

7

Shadid, John Nicolas, Robert John Hoekstra, Gary Lee Hennigan, Joseph Pete Jr Castro, and Deborah A. Fixel. Simulation of neutron radiation damage in silicon semiconductor devices. Office of Scientific and Technical Information (OSTI), October 2007. http://dx.doi.org/10.2172/934581.

8

Allgood, G. O. Development of a neural net paradigm that predicts simulator sickness. Office of Scientific and Technical Information (OSTI), March 1993. http://dx.doi.org/10.2172/6178481.

9

Allgood, G. O. Development of a neural net paradigm that predicts simulator sickness. Office of Scientific and Technical Information (OSTI), March 1993. http://dx.doi.org/10.2172/10172277.

10

Kimpland, Robert H., and Steven K. Klein. Neutron Diffusion Model for Prompt Burst Simulation in Fissile Solutions. Office of Scientific and Technical Information (OSTI), August 2013. http://dx.doi.org/10.2172/1091865.
