Academic literature on the topic 'Potts Attractor Neural Network'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Potts Attractor Neural Network.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Dissertations / Theses on the topic "Potts Attractor Neural Network"

1

Seybold, John. "An attractor neural network model of spoken word recognition." Thesis, University of Oxford, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.335839.

2

Pereira, Patrícia. "Attractor Neural Network modelling of the Lifespan Retrieval Curve." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-280732.

Abstract:
Human capability to recall episodic memories depends on how much time has passed since the memory was encoded. This dependency is described by a memory retrieval curve that reflects an interesting phenomenon referred to as a reminiscence bump - a tendency for older people to recall more memories formed during their young adulthood than in other periods of life. This phenomenon can be modelled with an attractor neural network, for example, the firing-rate Bayesian Confidence Propagation Neural Network (BCPNN) with incremental learning. In this work, the mechanisms underlying the reminiscence bump …
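The two theses above model memory recall with attractor neural networks. The BCPNN's Bayesian learning rule is not reproduced here; as a minimal, hedged illustration of the shared principle - memories stored as attractors and retrieved from noisy cues - the sketch below uses a classical Hopfield network (the two-state special case of a Potts network). All names and sizes are illustrative, not taken from the theses.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64                                        # number of binary (+/-1) units
patterns = rng.choice([-1, 1], size=(3, N))   # memories to store

# Hebbian outer-product weights with zero diagonal (classical Hopfield rule)
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def recall(cue, steps=20):
    """Synchronously update the state until it settles on an attractor."""
    s = cue.copy()
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1      # break ties deterministically
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Corrupt 10% of the first pattern and let the dynamics clean it up
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1
retrieved = recall(cue)
overlap = (retrieved @ patterns[0]) / N   # 1.0 means perfect retrieval
```

With only 3 patterns in 64 units the network is far below capacity, so the corrupted cue typically falls back into the stored pattern's basin of attraction.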
3

Ericson, Julia. "Modelling Immediate Serial Recall using a Bayesian Attractor Neural Network." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-291553.

Abstract:
In the last decades, computational models have become useful tools for studying biological neural networks. These models are typically constrained by either behavioural data from neuropsychological studies or by biological data from neuroscience. One model of the latter kind is the Bayesian Confidence Propagating Neural Network (BCPNN) - an attractor network with a Bayesian learning rule which has been proposed as a model for various types of memory. In this thesis, I have further studied the potential of the BCPNN in short-term sequential memory. More specifically, I have investigated if the …
4

Batbayar, Batsukh. "Improving Time Efficiency of Feedforward Neural Network Learning." RMIT University. Electrical and Computer Engineering, 2009. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20090303.114706.

Abstract:
Feedforward neural networks have been widely studied and used in many applications in science and engineering. The training of this type of network is mainly undertaken using the well-known backpropagation-based learning algorithms. One major problem with these algorithms is their slow training convergence speed, which hinders their applications. In order to improve the training convergence speed of this type of algorithm, many researchers have developed different improvements and enhancements. However, the slow convergence problem has not been fully addressed. This thesis makes several …
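The abstract above concerns the slow convergence of backpropagation-based training. The thesis's own enhancements are not described in the truncated abstract; as a hedged stand-in, the sketch below shows one textbook remedy - adding a momentum term to gradient descent - on an ill-conditioned quadratic, where plain gradient descent crawls along the shallow direction. The toy objective and all parameter values are assumptions for illustration only.

```python
import numpy as np

def gd(grad, x0, lr=0.02, momentum=0.0, steps=200):
    """Gradient descent with an optional heavy-ball momentum term."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        v = momentum * v - lr * grad(x)   # velocity accumulates past gradients
        x = x + v
    return x

# Ill-conditioned quadratic: f(x) = 0.5 * (x1^2 + 25 * x2^2)
grad = lambda x: np.array([1.0, 25.0]) * x

plain = gd(grad, [1.0, 1.0], momentum=0.0)   # slow along the flat direction
heavy = gd(grad, [1.0, 1.0], momentum=0.9)   # momentum accelerates it
```

After the same number of steps, the momentum run ends much closer to the minimum at the origin, which is the qualitative point the convergence-speed literature addresses.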
5

Villani, Gianluca. "Analysis of an Attractor Neural Network Model for Working Memory: A Control Theory Approach." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-260079.

Abstract:
Working Memory (WM) is a general-purpose cognitive system responsible for temporarily holding information in service of higher-order cognition systems, e.g. decision making. In this thesis we focus on a non-spiking model belonging to a special family of biologically inspired recurrent Artificial Neural Networks aiming to account for human experimental data on free recall. Considering its modular structure, this thesis gives a networked-system representation of WM in order to analyze its stability and synchronization properties. Furthermore, with the tools provided by bifurcation analysis we investigate …
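The stability analysis mentioned above typically begins by linearizing the network dynamics around a fixed point and inspecting the Jacobian's eigenvalues. The sketch below applies this standard first step to a generic two-unit rate model, tau * dx/dt = -x + W @ tanh(x), around the fixed point x* = 0; the weight matrix is an arbitrary example, not the thesis's model.

```python
import numpy as np

# Rate dynamics: tau * dx/dt = -x + W @ tanh(x), with fixed point x* = 0
W = np.array([[0.5, -0.3],
              [0.2,  0.4]])
tau = 1.0

# Jacobian at x* = 0 is (-I + W) / tau, since d tanh(x)/dx = 1 at x = 0
J = (-np.eye(2) + W) / tau
eigs = np.linalg.eigvals(J)

# The fixed point is asymptotically stable iff all eigenvalues
# have negative real part
stable = bool(np.all(eigs.real < 0))
```

For this W the eigenvalues form a complex pair with real part -0.55, so the origin is a stable focus; bifurcation analysis then tracks how such eigenvalues cross the imaginary axis as parameters vary.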
6

Ferland, Guy J. M. G. "A new paradigm for the classification of patterns: The 'race to the attractor' neural network model." Thesis, University of Ottawa (Canada), 2001. http://hdl.handle.net/10393/9298.

Abstract:
The human brain is arguably the best known classifier around. It can learn complex classification tasks with little apparent effort. It can learn to classify new patterns without forgetting old ones. It can learn a seemingly unlimited number of pattern classes. And it displays amazing resilience through its ability to persevere with reliable classifications despite damage to itself (e.g., dying neurons). These advantages have motivated researchers from many fields in the quest to understand the brain in order to duplicate its ability in an artificial system. And yet, little is known about the …
7

Rosay, Sophie. "A statistical mechanics approach to the modelling and analysis of place-cell activity." Thesis, Paris, Ecole normale supérieure, 2014. http://www.theses.fr/2014ENSU0010/document.

Abstract:
Hippocampal place cells are neurons with intriguing properties, such as the fact that their activity is correlated with the animal's spatial position. It is generally considered that these properties can largely be explained by the collective behaviour of schematic models of interacting neurons. Statistical physics provides tools for studying this collective behaviour analytically and numerically. Here we address the problem of applying these tools within the framework of the "attractor network" paradigm, a theoretical hypothesis on the …
8

Strandqvist, Jonas. "Attractors of autoencoders : Memorization in neural networks." Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-97746.

Abstract:
It is an important question in machine learning to understand how neural networks learn. This thesis sheds further light on this by studying autoencoder neural networks which can memorize data by storing it as attractors. What this means is that an autoencoder can learn a training set and later reproduce parts or all of this training set even when given inputs not belonging to this set. We seek to illuminate how ReLU networks handle memorization when trained with different setups: with and without bias, for different widths and depths, and using two different types of training …
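The abstract above describes memorization as training examples becoming attractors of the learned map: iterating the autoencoder converges back to a stored example. Training a real ReLU autoencoder is outside the scope of a listing page, so the hedged sketch below substitutes a toy attractor map (a softmax-weighted recombination of stored vectors, the so-called modern-Hopfield update) that has the same qualitative behaviour. The stored vectors, dimensions, and the `beta` parameter are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-ins for memorized training examples (unit-normalized rows)
stored = rng.normal(size=(3, 16))
stored /= np.linalg.norm(stored, axis=1, keepdims=True)

def f(x, beta=16.0):
    """Toy attractor map standing in for a trained autoencoder:
    a softmax-weighted recombination of the stored examples.
    Stored points are approximate fixed points of repeated application."""
    logits = beta * stored @ x
    w = np.exp(logits - logits.max())   # numerically stable softmax
    w /= w.sum()
    return w @ stored

# 'Recall': start from a corrupted example and iterate the map
x = stored[0] + 0.1 * rng.normal(size=16)
for _ in range(50):
    x = f(x)

err = np.linalg.norm(x - stored[0])          # distance to the memorized item
residual = np.linalg.norm(f(x) - x)          # how close x is to a fixed point
```

Iterating from a perturbed input lands on a fixed point essentially indistinguishable from the stored example - the "producing the training set from other inputs" behaviour the thesis studies in trained networks.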
9

Martí, Ortega Daniel. "Neural stochastic dynamics of perceptual decision making." Doctoral thesis, Universitat Pompeu Fabra, 2008. http://hdl.handle.net/10803/7552.

Abstract:
Computational models based on large-scale networks of neurobiological inspiration make it possible to describe the neural correlates of decision observed in certain cortical areas as a transition between attractors of the cortical network. Stimulation causes a change in the attractor landscape that favours the transition from the initial neutral attractor to one of the attractors associated with the categorical choices. The noise present in the system introduces indeterminacy into the transitions. In this work we show the existence of two qualitatively different decision mechanisms, each with …
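The abstract above casts decision making as a noise-driven transition from a neutral state to one of two choice attractors. A minimal, hedged caricature of this picture - not the thesis's cortical network model - is one-dimensional Langevin dynamics in a double-well potential, where a small input bias tilts the landscape and noise makes individual decisions stochastic. All parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def decide(bias=0.0, noise=0.3, dt=0.01, steps=4000):
    """Langevin dynamics dx = (x - x**3 + bias) dt + noise dW.
    The wells near x = -1 and x = +1 play the role of the two
    categorical-choice attractors; x = 0 is the unstable neutral state."""
    x = 0.0
    for _ in range(steps):
        x += (x - x**3 + bias) * dt + noise * np.sqrt(dt) * rng.normal()
    return np.sign(x)   # which attractor (choice) the trial ended in

# A positive bias tilts the potential, so most (not all) trials choose +1
choices = [decide(bias=0.2) for _ in range(20)]
frac_plus = np.mean([c > 0 for c in choices])
```

Because the noise term can carry trajectories over the barrier, the biased stimulus yields a majority - but not a certainty - of +1 choices, which is exactly the indeterminacy in transitions the abstract refers to.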
10

Posani, Lorenzo. "Inference and modeling of biological networks : a statistical-physics approach to neural attractors and protein fitness landscapes." Thesis, Paris Sciences et Lettres (ComUE), 2018. http://www.theses.fr/2018PSLEE043/document.

Abstract:
The recent advent of high-throughput experimental procedures has opened a new era for the quantitative study of biological systems. Nowadays, electrophysiological recordings and calcium imaging allow the simultaneous in vivo recording of hundreds to thousands of neurons. In parallel, thanks to automated sequencing procedures, libraries of known functional proteins have expanded from thousands to millions in just a few years. The current abundance of biological data opens up a new set of challenges for theoreticians. …