Academic literature on the topic 'Neural network adaptation'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Neural network adaptation.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Neural network adaptation"

1

Hylton, Todd. "Thermodynamic Neural Network." Entropy 22, no. 3 (2020): 256. http://dx.doi.org/10.3390/e22030256.

Abstract:
A thermodynamically motivated neural network model is described that self-organizes to transport charge associated with internal and external potentials while in contact with a thermal reservoir. The model integrates techniques for rapid, large-scale, reversible, conservative equilibration of node states and slow, small-scale, irreversible, dissipative adaptation of the edge states as a means to create multiscale order. All interactions in the network are local and the network structures can be generic and recurrent. Isolated networks show multiscale dynamics, and externally driven networks …
2

Vreeswijk, C. van, and D. Hansel. "Patterns of Synchrony in Neural Networks with Spike Adaptation." Neural Computation 13, no. 5 (2001): 959–92. http://dx.doi.org/10.1162/08997660151134280.

Abstract:
We study the emergence of synchronized burst activity in networks of neurons with spike adaptation. We show that networks of tonically firing adapting excitatory neurons can evolve to a state where the neurons burst in a synchronized manner. The mechanism leading to this burst activity is analyzed in a network of integrate-and-fire neurons with spike adaptation. The dependence of this state on the different network parameters is investigated, and it is shown that this mechanism is robust against inhomogeneities, sparseness of the connectivity, and noise. In networks of two populations, one …
3

Xie, Xurong, Xunying Liu, Tan Lee, and Lan Wang. "Bayesian Learning for Deep Neural Network Adaptation." IEEE/ACM Transactions on Audio, Speech, and Language Processing 29 (2021): 2096–110. http://dx.doi.org/10.1109/taslp.2021.3084072.

4

Patre, P. M., S. Bhasin, Z. D. Wilcox, and W. E. Dixon. "Composite Adaptation for Neural Network-Based Controllers." IEEE Transactions on Automatic Control 55, no. 4 (2010): 944–50. http://dx.doi.org/10.1109/tac.2010.2041682.

5

Yu, D. L., and T. K. Chang. "Adaptation of diagonal recurrent neural network model." Neural Computing and Applications 14, no. 3 (2005): 189–97. http://dx.doi.org/10.1007/s00521-004-0453-9.

6

Joty, Shafiq, Nadir Durrani, Hassan Sajjad, and Ahmed Abdelali. "Domain adaptation using neural network joint model." Computer Speech & Language 45 (September 2017): 161–79. http://dx.doi.org/10.1016/j.csl.2016.12.006.

7

Denker, John S. "Neural network models of learning and adaptation." Physica D: Nonlinear Phenomena 22, no. 1-3 (1986): 216–32. http://dx.doi.org/10.1016/0167-2789(86)90242-3.

8

Yaeger, Larry S. "Identifying Neural Network Topologies That Foster Dynamical Complexity." Advances in Complex Systems 16, no. 02n03 (2013): 1350032. http://dx.doi.org/10.1142/s021952591350032x.

Abstract:
We use an ecosystem simulator capable of evolving arbitrary neural network topologies to explore the relationship between an information theoretic measure of the complexity of neural dynamics and several graph theoretical metrics calculated for the underlying network topologies. Evolutionary trends confirm and extend previous results demonstrating an evolutionary selection for complexity and small-world network properties during periods of behavioral adaptation. The resultant mapping of the space of network topologies occupied by the most complex networks yields new insights into the relations …
9

Li, Xiaofeng, Suying Xiang, Pengfei Zhu, and Min Wu. "Establishing a Dynamic Self-Adaptation Learning Algorithm of the BP Neural Network and Its Applications." International Journal of Bifurcation and Chaos 25, no. 14 (2015): 1540030. http://dx.doi.org/10.1142/s0218127415400301.

Abstract:
In order to avoid the inherent deficiencies of the traditional BP neural network, such as slow convergence speed that easily leads to local minima, poor generalization ability, and difficulty in determining the network structure, a dynamic self-adaptive learning algorithm for the BP neural network is put forward to improve the network's performance. The new algorithm combines the merits of principal component analysis, particle swarm optimization, correlation analysis, and a self-adaptive model, and hence can effectively solve the problems of selecting structural parameters, initial …
10

Goltsev, Alexander, and Donald C. Wunsch. "Generalization of Features in the Assembly Neural Networks." International Journal of Neural Systems 14, no. 01 (2004): 39–56. http://dx.doi.org/10.1142/s0129065704001838.

Abstract:
The purpose of the paper is an experimental study of the formation of class descriptions, taking place during learning, in assembly neural networks. The assembly neural network is artificially partitioned into several sub-networks according to the number of classes that the network has to recognize. The features extracted from input data are represented in neural column structures of the sub-networks. Hebbian neural assemblies are formed in the column structure of the sub-networks by weight adaptation. A specific class description is formed in each sub-network of the assembly neural network …

Dissertations / Theses on the topic "Neural network adaptation"

1

Donati, Lorenzo. "Domain Adaptation through Deep Neural Networks for Health Informatics." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2017. http://amslaurea.unibo.it/14888/.

Abstract:
The PreventIT project is an EU Horizon 2020 project aimed at preventing early functional decline at younger old age. The analysis of causal links between risk factors and functional decline has been made possible by the cooperation of several research institutes' studies. However, since each research institute collects and delivers different kinds of data in different formats, so far the analysis has been assisted by expert geriatricians whose role is to detect the best candidates among hundreds of fields and offer a semantic interpretation of the values. This manual data harmonization approach …
2

Haskey, Stephen. "A modified One-Class-One-Network ANN architecture for dynamic phoneme adaptation." Thesis, Loughborough University, 1998. https://dspace.lboro.ac.uk/2134/12099.

Abstract:
As computers begin to pervade aspects of our everyday lives, so the problem of communication from man to machine becomes increasingly evident. In recent years, there has been a concerted interest in speech recognition, offering a user the ability to communicate freely with a machine. However, this deceptively simple means for exchanging information is in fact extremely complex. A single utterance can contain a wealth of varied information concerning the speaker's gender, age, dialect and mood. Numerous subtle differences such as intonation, rhythm and stress further add to the complexity, increasing the …
3

Grativol, Ribeiro Lucas. "Neural network compression in the context of federated learning and edge devices." Electronic Thesis or Diss., Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2024. http://www.theses.fr/2024IMTA0444.

Abstract:
Federated learning is a collaborative, decentralized machine learning framework motivated by growing concerns about data privacy. By moving model training to local nodes and keeping the data in place, it promotes a more privacy-friendly approach. However, this method imposes a communication and computation overhead on those who adopt it. In this manuscript, we examine the main challenges of federated learning and propose solutions aimed at increasing efficiency while …
4

Wen, Tsung-Hsien. "Recurrent neural network language generation for dialogue systems." Thesis, University of Cambridge, 2018. https://www.repository.cam.ac.uk/handle/1810/275648.

Abstract:
Language is the principal medium for ideas, while dialogue is the most natural and effective way for humans to interact with and access information from machines. Natural language generation (NLG) is a critical component of spoken dialogue and it has a significant impact on usability and perceived quality. Many commonly used NLG systems employ rules and heuristics, which tend to generate inflexible and stylised responses without the natural variation of human language. However, the frequent repetition of identical output forms can quickly make dialogue become tedious for most real-world users.
5

Gangireddy, Siva Reddy. "Recurrent neural network language models for automatic speech recognition." Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/28990.

Abstract:
The goal of this thesis is to advance the use of recurrent neural network language models (RNNLMs) for large vocabulary continuous speech recognition (LVCSR). RNNLMs are currently state-of-the-art and shown to consistently reduce the word error rates (WERs) of LVCSR tasks when compared to other language models. In this thesis we propose various advances to RNNLMs. The advances are: improved learning procedures for RNNLMs, enhancing the context, and adaptation of RNNLMs. We learned better parameters by a novel pre-training approach and enhanced the context using prosody and syntactic features.
6

Tomashenko, Natalia. "Speaker adaptation of deep neural network acoustic models using Gaussian mixture model framework in automatic speech recognition systems." Thesis, Le Mans, 2017. http://www.theses.fr/2017LEMA1040/document.

Abstract:
Differences between training and test conditions can considerably degrade the quality of the transcriptions produced by an automatic speech recognition (ASR) system. Adaptation is an effective way to reduce the mismatch between the system's models and the data of a particular speaker or acoustic channel. There are two dominant types of acoustic models used in ASR: Gaussian mixture models (GMM) and deep neural networks (DNN). The hidden Markov model (HMM) approach combined with GMMs …
7

Buttar, Sarpreet Singh. "Applying Artificial Neural Networks to Reduce the Adaptation Space in Self-Adaptive Systems : an exploratory work." Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-87117.

Abstract:
Self-adaptive systems have limited time to adjust their configurations whenever their adaptation goals, i.e., quality requirements, are violated due to some runtime uncertainties. Within the available time, they need to analyze their adaptation space, i.e., a set of configurations, to find the best adaptation option, i.e., configuration, that can achieve their adaptation goals. Existing formal analysis approaches find the best adaptation option by analyzing the entire adaptation space. However, exhaustive analysis requires time and resources and is therefore only efficient when the adaptation …
8

Palapelas, Kantola Philip. "Extreme Quantile Estimation of Downlink Radio Channel Quality." Thesis, Linköpings universitet, Artificiell intelligens och integrerade datorsystem, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-177657.

Abstract:
The application area of Fifth Generation New Radio (5G-NR) called Ultra-Reliable and Low-Latency Communication (URLLC) requires a reliability, the probability of receiving and decoding a data packet correctly, of 1 - 10^-5. For this requirement to be fulfilled in a resource-efficient manner, it is necessary to have a good estimation of extremely low quantiles of the channel quality distribution, so that appropriate resources can be distributed to users of the network system. This study proposes and evaluates two methods for estimating extreme quantiles of the downlink channel quality distribution …
9

Fic, Miloslav. "Adaptace parametrů ve fuzzy systémech" [Adaptation of parameters in fuzzy systems]. Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2015. http://www.nusl.cz/ntk/nusl-221163.

Abstract:
This Master's thesis deals with the adaptation of fuzzy system parameters, with a main focus on artificial neural networks. The survey part of the work discusses current methods for combining fuzzy systems with artificial neural networks, as well as related student theses. The chapter on applying these methods verifies the classification ability of the chosen fuzzy-neural network with a Kohonen learning algorithm. A model of a fuzzy system with parameter adaptation based on a fuzzy-neural network with a Kohonen learning algorithm is then presented.
10

Vu, Hien Duc. "Adaptation des méthodes d'apprentissage automatique pour la détection de défauts d'arc électriques." Electronic Thesis or Diss., Université de Lorraine, 2019. http://docnum.univ-lorraine.fr/ulprive/DDOC_T_2019_0152_VU.pdf.

Abstract:
The detection of electric arcs occurring in an electrical network using machine learning approaches is at the heart of the work presented in this thesis. The problem was first treated as the classification of fixed-length time series into two classes: normal and fault. This first part builds on work in the literature, where detection algorithms are organized mainly around a step that transforms the signals acquired on the network, followed by a step that extracts descriptive features, and finally a decision step. The multi-criteria approach …

Books on the topic "Neural network adaptation"

1

Lee, Tsu-Chang. Structure Level Adaptation for Artificial Neural Networks. Kluwer Academic Publishers, 1991.

2

Lee, Tsu-Chang. Structure Level Adaptation for Artificial Neural Networks. Springer US, 1991. http://dx.doi.org/10.1007/978-1-4615-3954-4.

3

Lee, Tsu-Chang. Structure Level Adaptation for Artificial Neural Networks. Springer US, 1991.

4

Stonier, Russel J., and Xing Huo Yu. Complex Systems: Mechanism of Adaptation. IOS Press, 1994.

5

Haykin, Simon S., ed. Kalman Filtering and Neural Networks. Wiley, 2001.

6

Stonier, Russel J., and Xing Huo Yu, eds. Complex Systems: Mechanism of Adaptation. IOS Press, 1994.

7

Focus Symposium on Learning and Adaptation in Stochastic and Statistical Systems (2001: Baden-Baden, Germany). Proceedings of the Focus Symposium on Learning and Adaptation in Stochastic and Statistical Systems. International Institute for Advanced Studies in Systems Research and Cybernetics, 2002.

8

Pucci, Marcello, and Gianpaolo Vitale, eds. Power Converters and AC Electrical Drives with Linear Neural Networks. CRC Press, 2012.

9

Neuronal Adaptation Theory: Including 29 Exercises with Solutions, 43 Essential Ideas, and 108 Partially Coloured Figures, Experiment Explanations, and General Theorems. Peter Lang, 1996.

10

Channel-Mismatch Compensation in Speaker Identification: Feature Selection and Adaptation with Artificial Neural Networks. Storming Media, 1998.


Book chapters on the topic "Neural network adaptation"

1

Ljung, L., J. Sjöberg, and H. Hjalmarsson. "On Neural Network Model Structures in System Identification." In Identification, Adaptation, Learning. Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-662-03295-4_9.

2

Cai, ManJun, JinCun Liu, GuangJun Tian, XueJian Zhang, and TiHua Wu. "Hybrid Neural Network Controller Using Adaptation Algorithm." In Advances in Neural Networks – ISNN 2007. Springer Berlin Heidelberg, 2007. http://dx.doi.org/10.1007/978-3-540-72383-7_19.

3

Patil, Dipali Himmatrao, and Amit Gadekar. "Tuberculosis Detection Using a Deep Neural Network." In Proceedings in Adaptation, Learning and Optimization. Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-31164-2_51.

4

Hozjan, Tomaž, Goran Turk, and Iztok Fister. "Hybrid Artificial Neural Network for Fire Analysis of Steel Frames." In Adaptation, Learning, and Optimization. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-14400-9_7.

5

Kursin, Andrei. "Neural Network: Input Anticipation May Lead to Advanced Adaptation Properties." In Artificial Neural Networks and Neural Information Processing — ICANN/ICONIP 2003. Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/3-540-44989-2_93.

6

Lee, Tsu-Chang. "Application Example: An Adaptive Neural Network Source Coder." In Structure Level Adaptation for Artificial Neural Networks. Springer US, 1991. http://dx.doi.org/10.1007/978-1-4615-3954-4_5.

7

Vidyasagar, M. "An Overview of Computational Learning Theory and Its Applications to Neural Network Training." In Identification, Adaptation, Learning. Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-662-03295-4_10.

8

Yang, Yongxin, and Timothy M. Hospedales. "Unifying Multi-domain Multitask Learning: Tensor and Neural Network Perspectives." In Domain Adaptation in Computer Vision Applications. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-58347-1_16.

9

Zajíc, Zbyněk, Jan Zelinka, Jan Vaněk, and Luděk Müller. "Convolutional Neural Network for Refinement of Speaker Adaptation Transformation." In Speech and Computer. Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-11581-8_20.

10

Bureš, Tomáš, Petr Hnětynka, Martin Kruliš, et al. "Attuning Adaptation Rules via a Rule-Specific Neural Network." In Leveraging Applications of Formal Methods, Verification and Validation. Adaptation and Learning. Springer Nature Switzerland, 2022. http://dx.doi.org/10.1007/978-3-031-19759-8_14.


Conference papers on the topic "Neural network adaptation"

1

Angelini, Christopher F., and Nidhal C. Bouaynaya. "Dynamic Continual Learning: Harnessing Parameter Uncertainty for Improved Network Adaptation." In 2024 International Joint Conference on Neural Networks (IJCNN). IEEE, 2024. http://dx.doi.org/10.1109/ijcnn60899.2024.10650205.

2

Rifat, Shahriar, Jonathan Ashdown, and Francesco Restuccia. "DARDA: Domain-Aware Real-Time Dynamic Neural Network Adaptation." In 2025 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV). IEEE, 2025. https://doi.org/10.1109/wacv61041.2025.00194.

3

Li, Jinyu, Jui-Ting Huang, and Yifan Gong. "Factorized adaptation for deep neural network." In ICASSP 2014 - 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2014. http://dx.doi.org/10.1109/icassp.2014.6854662.

4

Jeong, Jae Hoon, and Soo-Young Lee. "Speaker adaptation based on judge network with small adaptation words." In Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks. IJCNN 2000. Neural Computing: New Challenges and Perspectives for the New Millennium. IEEE, 2000. http://dx.doi.org/10.1109/ijcnn.2000.859377.

5

Steffens Henrique, Alisson, Vinicius Almeida dos Santos, and Rodrigo Lyra. "NEAT Snake: a both evolutionary and neural network adaptation approach." In Computer on the Beach. Universidade do Vale do Itajaí, 2020. http://dx.doi.org/10.14210/cotb.v11n1.p052-053.

Abstract:
There are several challenges when modeling artificial intelligence methods for autonomous players in games (bots). NEAT is one of the models that, combining genetic algorithms and neural networks, seeks to describe bot behavior more intelligently. In NEAT, a neural network is used for decision making, taking relevant inputs from the environment and giving real-time decisions. In a more abstract way, a genetic algorithm is applied for the learning step of the neural networks' weights, layers, and parameters. This paper proposes the use of relative position as the input of the neural network, based on …
6

Wu, Chunwei, Guitao Cao, Wenming Cao, Hong Wang, and He Ren. "Debiased Prototype Network for Adversarial Domain Adaptation." In 2021 International Joint Conference on Neural Networks (IJCNN). IEEE, 2021. http://dx.doi.org/10.1109/ijcnn52387.2021.9533346.

7

Vesely, Karel, Shinji Watanabe, Katerina Zmolikova, Martin Karafiat, Lukas Burget, and Jan Honza Cernocky. "Sequence summarizing neural network for speaker adaptation." In 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2016. http://dx.doi.org/10.1109/icassp.2016.7472692.

8

Patre, Parag M., Shubhendu Bhasin, Zachary D. Wilcox, and Warren E. Dixon. "Composite adaptation for neural network-based controllers." In 2009 Joint 48th IEEE Conference on Decision and Control (CDC) and 28th Chinese Control Conference (CCC). IEEE, 2009. http://dx.doi.org/10.1109/cdc.2009.5400453.

9

Ma, Min, Michael Nirschl, Fadi Biadsy, and Shankar Kumar. "Approaches for Neural-Network Language Model Adaptation." In Interspeech 2017. ISCA, 2017. http://dx.doi.org/10.21437/interspeech.2017-1310.

10

Kimoto, T., Y. Yaginuma, S. Nagata, and K. Asakawa. "Inverse modeling of dynamical system-network architecture with identification network and adaptation network." In 1991 IEEE International Joint Conference on Neural Networks. IEEE, 1991. http://dx.doi.org/10.1109/ijcnn.1991.170460.


Reports on the topic "Neural network adaptation"

1

Miles, Gaines E., Yael Edan, F. Tom Turpin, et al. Expert Sensor for Site Specification Application of Agricultural Chemicals. United States Department of Agriculture, 1995. http://dx.doi.org/10.32747/1995.7570567.bard.

Abstract:
In this work multispectral reflectance images are used in conjunction with a neural network classifier for the purpose of detecting and classifying weeds under real field conditions. Multispectral reflectance images which contained different combinations of weeds and crops were taken under actual field conditions. This multispectral reflectance information was used to develop algorithms that could segment the plants from the background as well as classify them into weeds or crops. In order to segment the plants from the background, the multispectral reflectance of plants and background were …
2

Kosko, Bart. Stability and Adaptation of Neural Networks. Defense Technical Information Center, 1990. http://dx.doi.org/10.21236/ada230108.

3

Yatsymirska, Mariya. KEY IMPRESSIONS OF 2020 IN JOURNALISTIC TEXTS. Ivan Franko National University of Lviv, 2021. http://dx.doi.org/10.30970/vjo.2021.50.11107.

Abstract:
The article explores the key vocabulary of 2020 in the network space of Ukraine. Texts of journalistic, official-business style, analytical publications of well-known journalists on current topics are analyzed. Extralinguistic factors of new word formation, their adaptation to the sphere of special and socio-political vocabulary of the Ukrainian language are determined. Examples show modern impressions in the media, their stylistic use and impact on public opinion in a pandemic. New meanings of foreign expressions, media terminology, peculiarities of translation of neologisms from English into …
4

Seginer, Ido, Louis D. Albright, and Robert W. Langhans. On-line Fault Detection and Diagnosis for Greenhouse Environmental Control. United States Department of Agriculture, 2001. http://dx.doi.org/10.32747/2001.7575271.bard.

Abstract:
Background: Early detection and identification of faulty greenhouse operation is essential if losses are to be minimized by taking immediate corrective actions. Automatic detection and identification would also free the greenhouse manager to tend to his other business. Original objectives: The general objective was to develop a method, or methods, for the detection, identification and accommodation of faults in the greenhouse. More specific objectives were as follows: 1. Develop accurate systems models, which will enable the detection of small deviations from normal behavior (of sensors, …