Journal articles on the topic 'Neural network adaptation'

Consult the top 50 journal articles for your research on the topic 'Neural network adaptation.'

1

Hylton, Todd. "Thermodynamic Neural Network." Entropy 22, no. 3 (2020): 256. http://dx.doi.org/10.3390/e22030256.

Abstract:
A thermodynamically motivated neural network model is described that self-organizes to transport charge associated with internal and external potentials while in contact with a thermal reservoir. The model integrates techniques for rapid, large-scale, reversible, conservative equilibration of node states and slow, small-scale, irreversible, dissipative adaptation of the edge states as a means to create multiscale order. All interactions in the network are local and the network structures can be generic and recurrent. Isolated networks show multiscale dynamics, and externally driven networks ev
2

Vreeswijk, C. van, and D. Hansel. "Patterns of Synchrony in Neural Networks with Spike Adaptation." Neural Computation 13, no. 5 (2001): 959–92. http://dx.doi.org/10.1162/08997660151134280.

Abstract:
We study the emergence of synchronized burst activity in networks of neurons with spike adaptation. We show that networks of tonically firing adapting excitatory neurons can evolve to a state where the neurons burst in a synchronized manner. The mechanism leading to this burst activity is analyzed in a network of integrate-and-fire neurons with spike adaptation. The dependence of this state on the different network parameters is investigated, and it is shown that this mechanism is robust against inhomogeneities, sparseness of the connectivity, and noise. In networks of two populations, one exc
3

Xie, Xurong, Xunying Liu, Tan Lee, and Lan Wang. "Bayesian Learning for Deep Neural Network Adaptation." IEEE/ACM Transactions on Audio, Speech, and Language Processing 29 (2021): 2096–110. http://dx.doi.org/10.1109/taslp.2021.3084072.

4

Patre, P. M., S. Bhasin, Z. D. Wilcox, and W. E. Dixon. "Composite Adaptation for Neural Network-Based Controllers." IEEE Transactions on Automatic Control 55, no. 4 (2010): 944–50. http://dx.doi.org/10.1109/tac.2010.2041682.

5

Yu, D. L., and T. K. Chang. "Adaptation of diagonal recurrent neural network model." Neural Computing and Applications 14, no. 3 (2005): 189–97. http://dx.doi.org/10.1007/s00521-004-0453-9.

6

Joty, Shafiq, Nadir Durrani, Hassan Sajjad, and Ahmed Abdelali. "Domain adaptation using neural network joint model." Computer Speech & Language 45 (September 2017): 161–79. http://dx.doi.org/10.1016/j.csl.2016.12.006.

7

Denker, John S. "Neural network models of learning and adaptation." Physica D: Nonlinear Phenomena 22, no. 1-3 (1986): 216–32. http://dx.doi.org/10.1016/0167-2789(86)90242-3.

8

YAEGER, LARRY S. "IDENTIFYING NEURAL NETWORK TOPOLOGIES THAT FOSTER DYNAMICAL COMPLEXITY." Advances in Complex Systems 16, no. 02n03 (2013): 1350032. http://dx.doi.org/10.1142/s021952591350032x.

Abstract:
We use an ecosystem simulator capable of evolving arbitrary neural network topologies to explore the relationship between an information theoretic measure of the complexity of neural dynamics and several graph theoretical metrics calculated for the underlying network topologies. Evolutionary trends confirm and extend previous results demonstrating an evolutionary selection for complexity and small-world network properties during periods of behavioral adaptation. The resultant mapping of the space of network topologies occupied by the most complex networks yields new insights into the relations
9

Li, Xiaofeng, Suying Xiang, Pengfei Zhu, and Min Wu. "Establishing a Dynamic Self-Adaptation Learning Algorithm of the BP Neural Network and Its Applications." International Journal of Bifurcation and Chaos 25, no. 14 (2015): 1540030. http://dx.doi.org/10.1142/s0218127415400301.

Abstract:
In order to avoid the inherent deficiencies of the traditional BP neural network, such as slow convergence, susceptibility to local minima, poor generalization ability, and difficulty in determining the network structure, a dynamic self-adaptive learning algorithm for the BP neural network is put forward to improve its performance. The new algorithm combines the merits of principal component analysis, particle swarm optimization, correlation analysis, and a self-adaptive model, and hence can effectively solve the problems of selecting structural parameters, initial con…
10

GOLTSEV, ALEXANDER, and DONALD C. WUNSCH. "GENERALIZATION OF FEATURES IN THE ASSEMBLY NEURAL NETWORKS." International Journal of Neural Systems 14, no. 01 (2004): 39–56. http://dx.doi.org/10.1142/s0129065704001838.

Abstract:
The purpose of the paper is an experimental study of the formation of class descriptions, taking place during learning, in assembly neural networks. The assembly neural network is artificially partitioned into several sub-networks according to the number of classes that the network has to recognize. The features extracted from input data are represented in neural column structures of the sub-networks. Hebbian neural assemblies are formed in the column structure of the sub-networks by weight adaptation. A specific class description is formed in each sub-network of the assembly neural network du
11

Hu, Brian, Marina E. Garrett, Peter A. Groblewski, et al. "Adaptation supports short-term memory in a visual change detection task." PLOS Computational Biology 17, no. 9 (2021): e1009246. http://dx.doi.org/10.1371/journal.pcbi.1009246.

Abstract:
The maintenance of short-term memories is critical for survival in a dynamically changing world. Previous studies suggest that this memory can be stored in the form of persistent neural activity or using a synaptic mechanism, such as with short-term plasticity. Here, we compare the predictions of these two mechanisms to neural and behavioral measurements in a visual change detection task. Mice were trained to respond to changes in a repeated sequence of natural images while neural activity was recorded using two-photon calcium imaging. We also trained two types of artificial neural networks on
12

Zhao, S., S. Saha, and X. X. Zhu. "GRAPH NEURAL NETWORK BASED OPEN-SET DOMAIN ADAPTATION." International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLIII-B3-2022 (May 31, 2022): 1407–13. http://dx.doi.org/10.5194/isprs-archives-xliii-b3-2022-1407-2022.

Abstract:
Owing to the presence of many sensors and geographic/seasonal variations, domain adaptation is an important topic in remote sensing. However, most domain adaptation methods focus on closed-set adaptation, i.e., they assume that the source and target domains share the same label space. This assumption often does not hold in practice, as there can be previously unseen classes in the target domain. To circumvent this issue, we propose a method for open-set domain adaptation, where the target domain contains additional unknown classes that are not present in the source domain. To impr…
13

Wang, Miao, Xu Yang, Yunchong Qian, et al. "Adaptive Neural Network Structure Optimization Algorithm Based on Dynamic Nodes." Current Issues in Molecular Biology 44, no. 2 (2022): 817–32. http://dx.doi.org/10.3390/cimb44020056.

Abstract:
Large-scale artificial neural networks have many redundant structures, making the network fall into the issue of local optimization and extended training time. Moreover, existing neural network topology optimization algorithms have the disadvantage of many calculations and complex network structure modeling. We propose a Dynamic Node-based neural network Structure optimization algorithm (DNS) to handle these issues. DNS consists of two steps: the generation step and the pruning step. In the generation step, the network generates hidden layers layer by layer until accuracy reaches the threshold
14

Hsu, Chun-Fei, Ping-Zong Lin, Tsu-Tian Lee, and Chi-Hsu Wang. "Adaptive asymmetric fuzzy neural network controller design via network structuring adaptation." Fuzzy Sets and Systems 159, no. 20 (2008): 2627–49. http://dx.doi.org/10.1016/j.fss.2008.01.034.

15

Khaikine, Maxim, and Klaus Holthausen. "A General Probability Estimation Approach for Neural Computation." Neural Computation 12, no. 2 (2000): 433–50. http://dx.doi.org/10.1162/089976600300015862.

Abstract:
We describe an analytical framework for the adaptation of neural systems that adapt their internal structure on the basis of subjective probabilities constructed by computation of randomly received input signals. A principled approach is provided with the key property that it defines a probability density model that allows studying the convergence of the adaptation process. In particular, the derived algorithm can be applied to approximation problems such as the estimation of probability densities or the recognition of regression functions. These approximation algorithms can be easily extended…
16

Wu, Jian Hui, Guo Li Wang, Jing Wang, and Yu Su. "BP Neural Network and Multiple Linear Regression in Acute Hospitalization Costs in the Comparative Study." Applied Mechanics and Materials 50-51 (February 2011): 959–63. http://dx.doi.org/10.4028/www.scientific.net/amm.50-51.959.

Abstract:
The BP neural network is an important component of artificial neural networks and has gradually become a branch of computational statistics. With characteristics such as large-scale parallel information processing and excellent self-adaptation and self-learning, the BP neural network has been used to solve complex nonlinear dynamic system prediction problems. The BP neural network does not need a precise mathematical model and makes no assumptions about the data itself. Its ability to handle nonlinear problems is stronger than that of traditional statistical methods. By means o…
17

Vinken, K., X. Boix, and G. Kreiman. "Incorporating intrinsic suppression in deep neural networks captures dynamics of adaptation in neurophysiology and perception." Science Advances 6, no. 42 (2020): eabd4205. http://dx.doi.org/10.1126/sciadv.abd4205.

Abstract:
Adaptation is a fundamental property of sensory systems that can change subjective experiences in the context of recent information. Adaptation has been postulated to arise from recurrent circuit mechanisms or as a consequence of neuronally intrinsic suppression. However, it is unclear whether intrinsic suppression by itself can account for effects beyond reduced responses. Here, we test the hypothesis that complex adaptation phenomena can emerge from intrinsic suppression cascading through a feedforward model of visual processing. A deep convolutional neural network with intrinsic suppression
18

Sharma, B. Lungsi, and Richard B. Wells. "A demonstration of using the model reference principle to develop the function-oriented adaptive pulse-coded neural network." SIMULATION 96, no. 2 (2019): 207–19. http://dx.doi.org/10.1177/0037549719860587.

Abstract:
How can one design an adaptive pulsed neural network that is based on psycho-phenomenological foundations? In other words, how can one migrate the adaptive capability of a psychologically modeled neural network to a pulsed network? Neural networks that model psychological phenomena are at a larger scale than physiological models. There is a common presumption that pulse-coded neural network analogs to non-pulsing networks can be obtained by a simple mapping and scaling process of some sort. But the actual in vivo environment of pulse-coded neural network systems produces a much more diverse se
19

Zhu, Liqiang, Ying-Cheng Lai, Frank C. Hoppensteadt, and Jiping He. "Probing Changes in Neural Interaction During Adaptation." Neural Computation 15, no. 10 (2003): 2359–77. http://dx.doi.org/10.1162/089976603322362392.

Abstract:
A procedure is developed to probe the changes in the functional interactions among neurons in primary motor cortex of the monkey brain during adaptation. A monkey is trained to learn a new skill, moving its arm to reach a target under the influence of external perturbations. The spike trains of multiple neurons in the primary motor cortex are recorded simultaneously. We utilize the methodology of directed transfer function, derived from a class of linear stochastic models, to quantify the causal interactions between the neurons. We find that the coupling between the motor neurons tends to incr
20

Li, Xudong, Jianhua Zheng, Mingtao Li, Wenzhen Ma, and Yang Hu. "Frequency-Domain Fusing Convolutional Neural Network: A Unified Architecture Improving Effect of Domain Adaptation for Fault Diagnosis." Sensors 21, no. 2 (2021): 450. http://dx.doi.org/10.3390/s21020450.

Abstract:
In recent years, transfer learning has been widely applied in fault diagnosis for solving the problem of inconsistent distribution of the original training dataset and the online-collecting testing dataset. In particular, the domain adaptation method can solve the problem of the unlabeled testing dataset in transfer learning. Moreover, Convolutional Neural Network (CNN) is the most widely used network among existing domain adaptation approaches due to its powerful feature extraction capability. However, network designing is too empirical, and there is no network designing principle from the fr
22

de Sousa, Celso, and Elder Moreira Hermerly. "ADAPTIVE CONTROL OF MOBILE ROBOTS USING A NEURAL NETWORK." International Journal of Neural Systems 11, no. 03 (2001): 211–18. http://dx.doi.org/10.1142/s0129065701000643.

Abstract:
A neural-network-based control approach for mobile robots is proposed. The weight adaptation is performed online, without previous learning. Several possible situations in robot navigation are considered, including uncertainties in the model and the presence of disturbances. Weight adaptation laws are presented, as well as simulation results.
23

Ribar, Srdjan, Vojislav V. Mitic, and Goran Lazovic. "Neural Networks Application on Human Skin Biophysical Impedance Characterizations." Biophysical Reviews and Letters 16, no. 01 (2021): 9–19. http://dx.doi.org/10.1142/s1793048021500028.

Abstract:
Artificial neural networks (ANNs) are basically the structures that perform input–output mapping. This mapping mimics the signal processing in biological neural networks. The basic element of biological neural network is a neuron. Neurons receive input signals from other neurons or the environment, process them, and generate their output which represents the input to another neuron of the network. Neurons can change their sensitivity to input signals. Each neuron has a simple rule to process an input signal. Biological neural networks have the property that signals are processed through many p
24

Marković, Dimitrije, and Claudius Gros. "Intrinsic Adaptation in Autonomous Recurrent Neural Networks." Neural Computation 24, no. 2 (2012): 523–40. http://dx.doi.org/10.1162/neco_a_00232.

Abstract:
A massively recurrent neural network responds on one side to input stimuli and is autonomously active, on the other side, in the absence of sensory inputs. Stimuli and information processing depend crucially on the qualia of the autonomous-state dynamics of the ongoing neural activity. This default neural activity may be dynamically structured in time and space, showing regular, synchronized, bursting, or chaotic activity patterns. We study the influence of nonsynaptic plasticity on the default dynamical state of recurrent neural networks. The nonsynaptic adaption considered acts on intrinsic
25

Siddikov, I. H., P. I. Kalandarov, and D. B. Yadgarova. "Engineering Calculation And Algorithm Of Adaptation Of Parameters Of A Neuro-Fuzzy Controller." American Journal of Applied Sciences 03, no. 09 (2021): 41–49. http://dx.doi.org/10.37547/tajas/volume03issue09-06.

Abstract:
As part of the study, a control scheme with adaptation of the coefficients of the neuro-fuzzy regulator is implemented. The area difference method is used as the training method for the network. It is improved by adding a rule base, which allows choosing the optimal learning rate for individual neurons of the neural network. The neural network controller is applied as a superstructure of the PID controller in the process control scheme. The dynamic object can function in different modes. This technological process operates in different modes in terms of loading and temperature setpoints. Because of exp…
26

Nerrand, O., P. Roussel-Ragot, L. Personnaz, G. Dreyfus, and S. Marcos. "Neural Networks and Nonlinear Adaptive Filtering: Unifying Concepts and New Algorithms." Neural Computation 5, no. 2 (1993): 165–99. http://dx.doi.org/10.1162/neco.1993.5.2.165.

Abstract:
The paper proposes a general framework that encompasses the training of neural networks and the adaptation of filters. We show that neural networks can be considered as general nonlinear filters that can be trained adaptively, that is, that can undergo continual training with a possibly infinite number of time-ordered examples. We introduce the canonical form of a neural network. This canonical form permits a unified presentation of network architectures and of gradient-based training algorithms for both feedforward networks (transversal filters) and feedback networks (recursive filters). We s
27

Alavash, Mohsen, Sarah Tune, and Jonas Obleser. "Dynamic large-scale connectivity of intrinsic cortical oscillations supports adaptive listening in challenging conditions." PLOS Biology 19, no. 10 (2021): e3001410. http://dx.doi.org/10.1371/journal.pbio.3001410.

Abstract:
In multi-talker situations, individuals adapt behaviorally to this listening challenge mostly with ease, but how do brain neural networks shape this adaptation? We here establish a long-sought link between large-scale neural communications in electrophysiology and behavioral success in the control of attention in difficult listening situations. In an age-varying sample of N = 154 individuals, we find that connectivity between intrinsic neural oscillations extracted from source-reconstructed electroencephalography is regulated according to the listener’s goal during a challenging dual-talker ta
28

Yang, Guochun, Kai Wang, Weizhi Nan, et al. "Distinct Brain Mechanisms for Conflict Adaptation within and across Conflict Types." Journal of Cognitive Neuroscience 34, no. 3 (2022): 445–60. http://dx.doi.org/10.1162/jocn_a_01806.

Abstract:
Cognitive conflict, like other cognitive processes, shows the characteristic of adaptation; that is, conflict effects are attenuated when immediately following a conflicting event, a phenomenon known as the conflict adaptation effect (CAE). One important aspect of the CAE is its sensitivity to the intertrial coherence of conflict type; that is, the behavioral CAE occurs only if consecutive trials are of the same conflict type. Although reliably observed behaviorally, the neural mechanisms underlying this phenomenon remain elusive. With a paradigm combining the classic Simon task and Stroop…
29

Save, Ashwini, and Narendra Shekokar. "Cross Domain Adaptation using A Novel Convolution Neural Network." International Journal of Engineering Research and Technology 13, no. 9 (2020): 2230. http://dx.doi.org/10.37624/ijert/13.9.2020.2230-2238.

30

Pan, Yongping, Qin Gao, and Haoyong Yu. "Fast and low-frequency adaptation in neural network control." IET Control Theory & Applications 8, no. 17 (2014): 2062–69. http://dx.doi.org/10.1049/iet-cta.2014.0449.

31

He, Y., and U. Cilingirogu. "A charge-based on-chip adaptation Kohonen neural network." IEEE Transactions on Neural Networks 4, no. 3 (1993): 462–69. http://dx.doi.org/10.1109/72.217189.

32

Furui, Sadaoki, Daisuke Itoh, and Zhipeng Zhang. "Neural-network-based HMM adaptation for noisy speech recognition." Acoustical Science and Technology 24, no. 2 (2003): 69–75. http://dx.doi.org/10.1250/ast.24.69.

33

Shi, Yangyang, Martha Larson, and Catholijn M. Jonker. "Recurrent neural network language model adaptation with curriculum learning." Computer Speech & Language 33, no. 1 (2015): 136–54. http://dx.doi.org/10.1016/j.csl.2014.11.004.

34

Bereta, Michał. "Kohonen Network-Based Adaptation of Non Sequential Data for Use in Convolutional Neural Networks." Sensors 21, no. 21 (2021): 7221. http://dx.doi.org/10.3390/s21217221.

Abstract:
Convolutional neural networks have become one of the most powerful computing tools of artificial intelligence in recent years. They are especially suitable for the analysis of images and other data that have an inherent sequence structure, such as time series data. In the case of data in the form of vectors of features, the order of which does not matter, the use of convolutional neural networks is not justified. This paper presents a new method of representing non-sequential data as images that can be analyzed by a convolutional network. The well-known Kohonen network was used for this purpos
35

Ziemke, Tom. "Radar Image Segmentation Using Self-Adapting Recurrent Networks." International Journal of Neural Systems 08, no. 01 (1997): 47–54. http://dx.doi.org/10.1142/s0129065797000070.

Abstract:
This paper presents a novel approach to the segmentation and integration of (radar) images using a second-order recurrent artificial neural network architecture consisting of two sub-networks: a function network that classifies radar measurements into four different categories of objects in sea environments (water, oil spills, land and boats), and a context network that dynamically computes the function network's input weights. It is shown that in experiments (using simulated radar images) this mechanism outperforms conventional artificial neural networks since it allows the network to learn t
36

Hylton, Todd. "Thermodynamic Neural Network." Entropy 22, no. 3 (2020): 256. https://doi.org/10.5281/zenodo.3686045.

Abstract:
These videos are referenced in the article titled "Thermodynamic Neural Network" in the journal Entropy. Each video is a network simulation showing the evolution of the network organization.
37

Zhao, Ying. "Evolutionary Neural Network-Based Online Ecological Governance Monitoring of Industrial Water Pollution." International Journal of Swarm Intelligence Research 16, no. 1 (2025): 1–23. https://doi.org/10.4018/ijsir.370397.

Abstract:
This paper proposes ENNOEIGS, an evolutionary neural network-based online ecological industrial governance system that integrates advanced neural architectures with evolutionary optimization for robust pollution monitoring. The framework combines convolutional neural networks for dimensional reduction of sensor data, external attention mechanisms for discovering pollution pattern correlations, and convolutional long short-term memory networks for modeling the spatiotemporal evolution of contaminants. A genetic algorithm continuously optimizes the neural network parameters, enabling adaptation
38

Puga-Guzmán, Sergio A., Carlos Aguilar-Avelar, Javier Moreno-Valenzuela, and Víctor Santibáñez. "Tracking of periodic oscillations in an underactuated system via adaptive neural networks." Journal of Low Frequency Noise, Vibration and Active Control 37, no. 1 (2018): 128–43. http://dx.doi.org/10.1177/1461348417752988.

Abstract:
In this paper, the tracking control of periodic oscillations in an underactuated mechanical system is discussed. The proposed scheme is derived from the feedback linearization control technique and adaptive neural networks are used to estimate the unknown dynamics and to compensate uncertainties. The proposed neural network-based controller is applied to the Furuta pendulum, which is a nonlinear and nonminimum phase underactuated mechanical system with two degrees of freedom. The new neural network-based controller is experimentally compared with respect to its model-based version. Results ind
39

Lin, Baihan. "Regularity Normalization: Neuroscience-Inspired Unsupervised Attention across Neural Network Layers." Entropy 24, no. 1 (2021): 59. http://dx.doi.org/10.3390/e24010059.

Abstract:
Inspired by the adaptation phenomenon of neuronal firing, we propose the regularity normalization (RN) as an unsupervised attention mechanism (UAM) which computes the statistical regularity in the implicit space of neural networks under the Minimum Description Length (MDL) principle. Treating the neural network optimization process as a partially observable model selection problem, the regularity normalization constrains the implicit space by a normalization factor, the universal code length. We compute this universal code incrementally across neural network layers and demonstrate the flexibil
40

Tran, Vu, François Septier, Daisuke Murakami, and Tomoko Matsui. "Spatial–Temporal Temperature Forecasting Using Deep-Neural-Network-Based Domain Adaptation." Atmosphere 15, no. 1 (2024): 90. http://dx.doi.org/10.3390/atmos15010090.

Abstract:
Accurate temperature forecasting is critical for various sectors, yet traditional methods struggle with complex atmospheric dynamics. Deep neural networks (DNNs), especially transformer-based DNNs, offer potential advantages, but face challenges with domain adaptation across different geographical regions. We evaluated the effectiveness of DNN-based domain adaptation for daily maximum temperature forecasting in experimental low-resource settings. We used an attention-based transformer deep learning architecture as the core forecasting framework and used kernel mean matching (KMM) for domain ad
41

Rivera-Rovelo, Jorge, and Eduardo Bayro-Corrochano. "Surface Approximation using Growing Self-Organizing Nets and Gradient Information." Applied Bionics and Biomechanics 4, no. 3 (2007): 125–36. http://dx.doi.org/10.1155/2007/502679.

Abstract:
In this paper we show how to improve the performance of two self-organizing neural networks used to approximate the shape of a 2D or 3D object by incorporating gradient information in the adaptation stage. The methods are based on the growing versions of the Kohonen's map and the neural gas network. Also, we show that in the adaptation stage the network utilizes efficient transformations, expressed as versors in the conformal geometric algebra framework, which build the shape of the object independent of its position in space (coordinate free). Our algorithms were tested with several images, i
42

Maksutova, K., N. Saparkhojayev, and Dusmat Zhamangarin. "DEVELOPMENT OF AN ONTOLOGICAL MODEL OF DEEP LEARNING NEURAL NETWORKS." Bulletin D. Serikbayev of EKTU, no. 1 (March 2024): 190–201. http://dx.doi.org/10.51885/1561-4212_2024_1_190.

Abstract:
This research paper examines the challenges and prospects associated with the integration of artificial neural networks and knowledge bases. The focus is on leveraging this integration to address practical problems. The paper explores the development, training, and integration of artificial neural networks, emphasizing their adaptation to knowledge bases. This adaptation involves processes such as integration, communication, representation of ontological structures, and interpretation by the knowledge base of the artificial neural network's representation through input and output. The pape…
43

Sousa, Miguel Angelo de Abreu de, Edson Lemos Horta, Sergio Takeo Kofuji, and Emilio Del-Moral-Hernandez. "Architecture Analysis of an FPGA-Based Hopfield Neural Network." Advances in Artificial Neural Systems 2014 (December 9, 2014): 1–10. http://dx.doi.org/10.1155/2014/602325.

Abstract:
Interconnections between electronic circuits and neural computation have been a strongly researched topic in the machine learning field in order to approach several practical requirements, including decreasing training and operation times in high performance applications and reducing cost, size, and energy consumption for autonomous or embedded developments. Field programmable gate array (FPGA) hardware shows some inherent features typically associated with neural networks, such as, parallel processing, modular executions, and dynamic adaptation, and works on different types of FPGA-based neur
APA, Harvard, Vancouver, ISO, and other styles
44

Cartling, Bo. "Generation of Associative Processes in a Neural Network with Realistic Features of Architecture and Units." International Journal of Neural Systems 05, no. 03 (1994): 181–94. http://dx.doi.org/10.1142/s0129065794000207.

Full text
Abstract:
A recent neural network model of cortical associative memory incorporating neuronal adaptation by a simplified description of its underlying ionic mechanisms is extended towards more realistic network units and architecture. Excitatory units correspond to groups of adapting pyramidal neurons and inhibitory units to groups of nonadapting interneurons. The network architecture is formed from pairs of one pyramidal and one interneuron unit each with inhibitory connections within and excitatory connections between pairs. The degree of adaptability of the pyramidal units controls the character of t
APA, Harvard, Vancouver, ISO, and other styles
45

Ge, S. S., and T. H. Lee. "Parallel Adaptive Neural Network Control of Robots." Proceedings of the Institution of Mechanical Engineers, Part I: Journal of Systems and Control Engineering 208, no. 4 (1994): 231–37. http://dx.doi.org/10.1243/pime_proc_1994_208_336_02.

Full text
Abstract:
In this paper, a parallel adaptive neural network (NN) control design for robots motivated by the work by Lee and Tan is presented. The controller is based on direct adaptive techniques and an approach of using an additional parallel NN to provide adaptive enhancements to a basic fixed controller, which can be either a NN-based non-linear controller or a model-based non-linear controller. It is shown that, if Gaussian radial basis function networks are used for the additional parallel NN, uniformly stable adaptation is assured and asymptotic tracking of the position reference signal is achieve
APA, Harvard, Vancouver, ISO, and other styles
46

Zhang, Byoung-Tak, Peter Ohm, and Heinz Mühlenbein. "Evolutionary Induction of Sparse Neural Trees." Evolutionary Computation 5, no. 2 (1997): 213–36. http://dx.doi.org/10.1162/evco.1997.5.2.213.

Full text
Abstract:
This paper is concerned with the automatic induction of parsimonious neural networks. In contrast to other program induction situations, network induction entails parametric learning as well as structural adaptation. We present a novel representation scheme called neural trees that allows efficient learning of both network architectures and parameters by genetic search. A hybrid evolutionary method is developed for neural tree induction that combines genetic programming and the breeder genetic algorithm under the unified framework of the minimum description length principle. The method is succ
APA, Harvard, Vancouver, ISO, and other styles
47

Westendorff, Stephanie, Shenbing Kuang, Bahareh Taghizadeh, Opher Donchin, and Alexander Gail. "Asymmetric generalization in adaptation to target displacement errors in humans and in a neural network model." Journal of Neurophysiology 113, no. 7 (2015): 2360–75. http://dx.doi.org/10.1152/jn.00483.2014.

Full text
Abstract:
Different error signals can induce sensorimotor adaptation during visually guided reaching, possibly evoking different neural adaptation mechanisms. Here we investigate reach adaptation induced by visual target errors without perturbing the actual or sensed hand position. We analyzed the spatial generalization of adaptation to target error to compare it with other known generalization patterns and simulated our results with a neural network model trained to minimize target error independent of prediction errors. Subjects reached to different peripheral visual targets and had to adapt to a sudd
APA, Harvard, Vancouver, ISO, and other styles
48

Han, Gang, Haohe Zhang, Zhongliang Zhang, Yan Ma, and Tiantian Yang. "AI-Based Malicious Encrypted Traffic Detection in 5G Data Collection and Secure Sharing." Electronics 14, no. 1 (2024): 51. https://doi.org/10.3390/electronics14010051.

Full text
Abstract:
With the development and widespread application of network information, new technologies led by 5G are emerging, resulting in an increasingly complex network security environment and more diverse attack methods. Unlike traditional networks, 5G networks feature higher connection density, faster data transmission speeds, and lower latency, which are widely applied in scenarios such as smart cities, the Internet of Things, and autonomous driving. The vast amounts of sensitive data generated by these applications become primary targets during the processes of collection and secure sharing, and una
APA, Harvard, Vancouver, ISO, and other styles
49

Wang, Xiaoqing, and Xiangjun Wang. "Unsupervised Domain Adaptation with Coupled Generative Adversarial Autoencoders." Applied Sciences 8, no. 12 (2018): 2529. http://dx.doi.org/10.3390/app8122529.

Full text
Abstract:
When large-scale annotated data are not available for certain image classification tasks, training a deep convolutional neural network model becomes challenging. Some recent domain adaptation methods try to solve this problem using generative adversarial networks and have achieved promising results. However, these methods are based on a shared latent space assumption and they do not consider the situation when shared high level representations in different domains do not exist or are not ideal as they assumed. To overcome this limitation, we propose a neural network structure called coupled ge
APA, Harvard, Vancouver, ISO, and other styles
50

Tunik, Eugene, Paul J. Schmitt, and Scott T. Grafton. "BOLD Coherence Reveals Segregated Functional Neural Interactions When Adapting to Distinct Torque Perturbations." Journal of Neurophysiology 97, no. 3 (2007): 2107–20. http://dx.doi.org/10.1152/jn.00405.2006.

Full text
Abstract:
In the natural world, we experience and adapt to multiple extrinsic perturbations. This poses a challenge to neural circuits in discriminating between different context-appropriate responses. Using event-related fMRI, we characterized the neural dynamics involved in this process by randomly delivering a position- or velocity-dependent torque perturbation to subjects’ arms during a target-capture task. Each perturbation was color-cued during movement preparation to provide contextual information. Although trajectories differed between perturbations, subjects significantly reduced error under bo
APA, Harvard, Vancouver, ISO, and other styles