Academic literature on the topic 'Disordered Systems and Neural Networks'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Disordered Systems and Neural Networks.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Disordered Systems and Neural Networks"

1

Iesari, Fabio, Hiroyuki Setoyama, and Toshihiro Okajima. "Extracting Local Symmetry of Mono-Atomic Systems from Extended X-ray Absorption Fine Structure Using Deep Neural Networks." Symmetry 13, no. 6 (June 15, 2021): 1070. http://dx.doi.org/10.3390/sym13061070.

Full text
Abstract:
In recent years, neural networks have become a new method for the analysis of extended X-ray absorption fine structure data. Due to its sensitivity to local structure, X-ray absorption spectroscopy is often used to study disordered systems, and one of its more interesting properties is its sensitivity not only to the pair distribution function but also to the three-body distribution, which contains information on the local symmetry. In this study, considering the case of Ni, we show that by using neural networks it is possible to obtain not only the radial distribution function but also the bond-angle distribution between the first nearest neighbors. Additionally, by adding appropriate configurations to the training dataset, we show that the neural network is also able to analyze data from disordered phases (liquid and undercooled state), detecting small changes in the local ordering compatible with results obtained through other methods.
APA, Harvard, Vancouver, ISO, and other styles
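The abstract above (entry 1) describes training a deep neural network to map simulated EXAFS spectra onto structural distributions such as the radial distribution function. Purely as an illustrative sketch of that idea, and not the authors' actual pipeline, the snippet below fits a small fully connected regressor from a discretized spectrum to a discretized g(r); the array sizes, network width, optimizer settings and random stand-in data are all placeholder assumptions.

```python
import torch
import torch.nn as nn

# Placeholder sizes: 200 k-points for chi(k), 100 r-bins for g(r).
N_K, N_R = 200, 100

# Small fully connected regressor: spectrum -> radial distribution function.
model = nn.Sequential(
    nn.Linear(N_K, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, N_R),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# In the paper the training pairs come from simulated Ni configurations;
# random tensors stand in here only so that the sketch runs end to end.
chi = torch.randn(512, N_K)    # simulated spectra chi(k)
g = torch.randn(512, N_R)      # target g(r); a bond-angle head would be analogous

for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(chi), g)
    loss.backward()
    optimizer.step()
```

Extending the output with a second head for the bond-angle distribution, as in the paper, changes only the final linear layer and the loss.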
2

Madkhali, Marwah M. M., Conor D. Rankine, and Thomas J. Penfold. "Enhancing the analysis of disorder in X-ray absorption spectra: application of deep neural networks to T-jump-X-ray probe experiments." Physical Chemistry Chemical Physics 23, no. 15 (2021): 9259–69. http://dx.doi.org/10.1039/d0cp06244h.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

BERNARDES, AMÉRICO T., and HANS J. HERRMANN. "A SIMPLE MODEL WITH STRONG ASYMMETRIC COUPLINGS." International Journal of Modern Physics C 04, no. 04 (August 1993): 765–74. http://dx.doi.org/10.1142/s012918319300063x.

Full text
Abstract:
In this paper an Ising model on a random lattice with strongly asymmetric couplings inspired by neural networks is studied. We investigate the phase space structure and find evidence for an ultrametric, "multivalley" structure as observed in disordered magnetic systems. We have calculated the size of the basins of attraction.
APA, Harvard, Vancouver, ISO, and other styles
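Entry 3 above reports the size of basins of attraction in an Ising network with strongly asymmetric couplings. The sketch below is a generic illustration of that kind of measurement, under assumptions of its own (a single stored pattern, a Hebbian coupling matrix made asymmetric by random dilution, zero-temperature parallel dynamics); none of these choices are taken from the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200
xi = rng.choice([-1, 1], size=N)          # one stored pattern

# Hebbian couplings made asymmetric by randomly deleting one direction of each bond.
J = np.outer(xi, xi) / N
J *= rng.random((N, N)) < 0.5             # keep J[i, j] independently of J[j, i]
np.fill_diagonal(J, 0.0)

def recovers(flip_fraction, steps=50):
    """Start from a corrupted pattern; check if parallel dynamics returns to it."""
    s = xi.copy()
    s[rng.random(N) < flip_fraction] *= -1
    for _ in range(steps):
        s = np.sign(J @ s)
        s[s == 0] = 1
    return np.mean(s == xi) > 0.95        # overlap threshold for "recovered"

for f in (0.1, 0.2, 0.3, 0.4):
    rate = np.mean([recovers(f) for _ in range(100)])
    print(f"flip fraction {f:.1f}: recovery rate {rate:.2f}")
```

The recovery rate as a function of the initial corruption gives a rough operational estimate of the basin of attraction.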
4

MEDOFF, D. R. "Neural Networks: Neural Systems I: Neural Systems I." American Journal of Psychiatry 157, no. 6 (June 1, 2000): 877. http://dx.doi.org/10.1176/appi.ajp.157.6.877.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

MEDOFF, DEBORAH, and HENRY HOLCOMB. "Neural Networks: Neural Systems II." American Journal of Psychiatry 157, no. 8 (August 2000): 1212. http://dx.doi.org/10.1176/appi.ajp.157.8.1212.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

KLESSE, ROCHUS, and MARCUS METZLER. "MODELING DISORDERED QUANTUM SYSTEMS WITH DYNAMICAL NETWORKS." International Journal of Modern Physics C 10, no. 04 (June 1999): 577–606. http://dx.doi.org/10.1142/s0129183199000449.

Full text
Abstract:
It is the purpose of the present article to show that so-called network models, originally designed to describe static properties of disordered electronic systems, can be easily generalized to quantum-dynamical models, which then allow for an investigation of dynamical and spectral aspects. This concept is exemplified by the Chalker–Coddington model for the quantum Hall effect and a three-dimensional generalization of it. We simulate phase-coherent diffusion of wave packets and consider spatial and spectral correlations of network eigenstates as well as the distribution of (quasi-)energy levels. Apart from that, it is demonstrated how network models can be used to determine two-point conductances. Our numerical calculations for the three-dimensional model at the metal-insulator transition point deliver, among other quantities, an anomalous diffusion exponent of η = 3 − D₂ = 1.7 ± 0.1. The methods presented here in detail have been used partially in earlier work.
APA, Harvard, Vancouver, ISO, and other styles
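For context on the exponent quoted in the abstract above: at a metal-insulator critical point, the anomalous decay of eigenstate intensity correlations is conventionally parametrized by the correlation dimension D_2, and the value reported there fits this standard form. The relation below is the well-known Chalker scaling, stated here as background rather than quoted from the paper.

```latex
% Decay of critical-eigenstate intensity correlations in d dimensions:
\overline{|\psi(\mathbf{r})|^{2}\,|\psi(\mathbf{r}')|^{2}}
  \;\sim\; |\mathbf{r}-\mathbf{r}'|^{-\eta},
\qquad \eta = d - D_{2},
\qquad d = 3,\ \eta = 1.7 \pm 0.1 \;\Rightarrow\; D_{2} \approx 1.3 .
```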
7

Wang, Jun. "Artificial neural networks versus natural neural networks." Decision Support Systems 11, no. 5 (June 1994): 415–29. http://dx.doi.org/10.1016/0167-9236(94)90016-7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Antsaklis, P. J. "Neural networks for control systems." IEEE Transactions on Neural Networks 1, no. 2 (June 1990): 242–44. http://dx.doi.org/10.1109/72.80237.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Godjevac, Jelena, and Nigel Steele. "Fuzzy Systems and Neural Networks." Intelligent Automation & Soft Computing 4, no. 1 (January 1998): 27–37. http://dx.doi.org/10.1080/10798587.1998.10750719.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Kosko, Bart, and John C. Burgess. "Neural Networks and Fuzzy Systems." Journal of the Acoustical Society of America 103, no. 6 (June 1998): 3131. http://dx.doi.org/10.1121/1.423096.

Full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Disordered Systems and Neural Networks"

1

Laughton, Stephen Nicholas. "Dynamics of neural networks and disordered spin systems." Thesis, University of Oxford, 1995. http://ora.ox.ac.uk/objects/uuid:5531cef6-4682-4750-9c5c-cb69e5e72d64.

Full text
Abstract:
I obtain a number of results for the dynamics of several disordered spin systems, of successively greater complexity. I commence with the generalised Hopfield model trained with an intensive number of patterns, where in the thermodynamic limit macroscopic, deterministic equations of motion can be derived exactly for both the synchronous discrete time and asynchronous continuous time dynamics. I show that for symmetric embedding matrices Lyapunov functions exist at the macroscopic level of description in terms of pattern overlaps. I then show that for asymmetric embedding matrices several types of bifurcation phenomena to complex non-transient dynamics occur, even in this simplest model. Extending a recent result of Coolen and Sherrington, I show how the dynamics of the generalised Hopfield model trained with extensively many patterns and non-trivial embedding matrix can be described by the evolution of a small number of overlaps and the disordered contribution to the 'energy', upon calculation of a noise distribution by the replica method. The evaluation of the noise distribution requires two key assumptions: that the flow equations are self averaging, and that equipartitioning of probability occurs within the macroscopic sub-shells of the ensemble. This method is inexact on intermediate time scales, due to the microscopic information integrated out in order to derive a closed set of equations. I then show how this theory can be improved in a systematic manner by introducing an order parameter function - the joint distribution of spins and local alignment fields, which evolves in time deterministically, according to a driven diffusion type equation. I show how the coefficients in this equation can be evaluated for the generalised Sherrington-Kirkpatrick model, both within the replica symmetric ansatz, and using Parisi's ultrametric ansatz for the replica matrices, upon making once again the two key assumptions (self averaging and equipartitioning). Since the order parameter is now a continuous function, however, the assumption of equipartitioning within the macroscopic sub-shells is much less restricting.
APA, Harvard, Vancouver, ISO, and other styles
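The thesis abstract above phrases the dynamics of the Hopfield model in terms of macroscopic pattern overlaps. The following is a minimal sketch of that bookkeeping for the standard Hopfield model with Hebbian couplings and synchronous zero-temperature updates; system size, pattern number and corruption level are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 500, 5                              # spins and stored patterns (illustrative)
xi = rng.choice([-1, 1], size=(P, N))      # patterns xi^mu_i

J = (xi.T @ xi) / N                        # Hebbian couplings
np.fill_diagonal(J, 0.0)

def overlaps(s):
    """Macroscopic overlaps m_mu = (1/N) sum_i xi^mu_i s_i."""
    return xi @ s / N

# Start close to pattern 0 and follow the overlaps under synchronous dynamics.
s = xi[0].copy()
s[rng.random(N) < 0.2] *= -1               # 20% corruption
for t in range(10):
    print(f"t={t}  m = {np.round(overlaps(s), 3)}")
    s = np.sign(J @ s)
    s[s == 0] = 1
```

Printing the overlap vector at each step gives a direct picture of retrieval at the macroscopic level, which is the level of description at which the thesis derives deterministic flow equations.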
2

Gabrié, Marylou. "Towards an understanding of neural networks : mean-field incursions." Thesis, Paris Sciences et Lettres (ComUE), 2019. http://www.theses.fr/2019PSLEE035.

Full text
Abstract:
Machine learning algorithms relying on deep neural networks have recently allowed a great leap forward in artificial intelligence. Despite the popularity of their applications, the efficiency of these algorithms remains largely unexplained from a theoretical point of view. The mathematical description of learning problems involves very large collections of interacting random variables, difficult to handle analytically as well as numerically. This complexity is precisely the object of study of statistical physics, whose mission, originally directed towards natural systems, is to understand how macroscopic behaviors arise from microscopic laws. In this thesis we propose to take advantage of recent progress in mean-field methods from the statistical physics of disordered systems to derive relevant approximations in this context. We exploit the equivalences and complementarities of message-passing algorithms, high-temperature expansions and the replica method. Following this strategy, we make practical contributions to the unsupervised learning of Boltzmann machines. We also make theoretical contributions by considering the teacher-student paradigm to model supervised learning problems. We develop a framework to characterize the evolution of information during training in these models, and we propose a research direction to generalize the analysis of Bayesian learning in shallow neural networks to their deep counterparts.
APA, Harvard, Vancouver, ISO, and other styles
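The teacher-student paradigm mentioned in the abstract above can be stated very compactly: data are generated by a hidden 'teacher' network and a 'student' is trained on them, so that generalization can be measured against a known ground truth. The sketch below is only a generic illustration of this setting with a teacher perceptron and a logistic-regression student; it is not the mean-field analysis developed in the thesis, and all sizes and learning parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
N, n_train, n_test = 100, 2000, 2000

w_teacher = rng.standard_normal(N)                 # hidden teacher weights

def make_data(n):
    X = rng.standard_normal((n, N)) / np.sqrt(N)
    y = np.sign(X @ w_teacher)                     # teacher labels
    return X, y

X_tr, y_tr = make_data(n_train)
X_te, y_te = make_data(n_test)

# Student: logistic regression trained by plain gradient descent.
w = np.zeros(N)
lr = 0.5
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-X_tr @ w))            # P(y = +1 | x)
    grad = X_tr.T @ (p - (y_tr + 1) / 2) / n_train
    w -= lr * grad

test_acc = np.mean(np.sign(X_te @ w) == y_te)
print(f"generalization accuracy: {test_acc:.3f}")
```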
3

Semerjian, Guilhem. "Mean-field disordered systems : glasses and optimization problems, classical and quantum." Habilitation à diriger des recherches, Ecole Normale Supérieure de Paris - ENS Paris, 2013. http://tel.archives-ouvertes.fr/tel-00785924.

Full text
Abstract:
This habilitation thesis presents my research activities in the field of the statistical mechanics of disordered systems, in particular on finite-connectivity mean-field models. These models exhibit numerous phase transitions in the thermodynamic limit, with applications both to the physics of glasses and to their connections with optimization problems from theoretical computer science. Their behaviour under quantum fluctuations is also discussed, in connection with prospects for quantum computation.
APA, Harvard, Vancouver, ISO, and other styles
4

Doria, Felipe França. "Padrões estruturados e campo aleatório em redes complexas." Biblioteca Digital de Teses e Dissertações da UFRGS, 2016. http://hdl.handle.net/10183/144076.

Full text
Abstract:
This work focuses on the study of two complex networks. The first is a random-field Ising model in which the random field follows a Gaussian or a bimodal distribution. A finite-connectivity technique was used to solve it, and a Monte Carlo method was applied to verify our results. Our results indicate that for the Gaussian distribution the phase transition is always second order. For the bimodal distribution there is a tricritical point that depends on the value of the connectivity; below a certain minimum connectivity, only a second-order transition exists. The second is a metric attractor neural network; more precisely, we study the ability of this model to store structured patterns. In particular, the chosen patterns were taken from fingerprints, which present some local features. Our results show that the lower the activity of the fingerprint patterns, the higher the load ratio and the retrieval quality. A theoretical framework was also developed as a function of five parameters: the load ratio, the connectivity, the density degree of the network, the randomness ratio and the spatial pattern correlation.
APA, Harvard, Vancouver, ISO, and other styles
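As a hedged sketch of the kind of Monte Carlo check mentioned in the abstract above, the snippet below runs Metropolis dynamics for a random-field Ising model on a sparse random graph with Gaussian random fields. Graph size, connectivity, temperature and field width are placeholders, not the values used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(3)
N, c = 500, 4                  # spins and mean connectivity (illustrative)
T, sigma_h = 1.0, 0.5          # temperature and Gaussian field width

# Sparse symmetric random graph with roughly c neighbours per site.
A = (rng.random((N, N)) < c / N).astype(float)
A = np.triu(A, 1)
A = A + A.T

h = sigma_h * rng.standard_normal(N)       # Gaussian random fields
s = rng.choice([-1, 1], size=N)

def metropolis_sweep(s):
    for i in rng.permutation(N):
        dE = 2 * s[i] * (A[i] @ s + h[i])  # energy cost of flipping spin i
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i] = -s[i]
    return s

for sweep in range(100):
    s = metropolis_sweep(s)

print("magnetization per spin:", np.mean(s))
```

Sweeping the temperature and averaging the magnetization over field realizations is the basic way such a simulation is compared against the finite-connectivity theory.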
5

Lopez, Matthias. "Test expérimental de l'universalité de la transition d'Anderson avec des atomes froids: Indépendance de l'exposant critique $\nu$ face aux détails microscopiques." PhD thesis, Université des Sciences et Technologie de Lille - Lille I, 2010. http://tel.archives-ouvertes.fr/tel-00764091.

Full text
Abstract:
In solid-state physics, the study of the effects of disorder has led to the discovery of a phase transition: at weak disorder the solid is a conductor, while at strong disorder it becomes an insulator. This transition is known as the 'Anderson transition' or the 'metal-insulator transition'. It can be characterized by a critical exponent ν, whose value is predicted to be universal, that is, independent of the microscopic details of the disorder and determined only by the symmetries satisfied by the Hamiltonian. The experimental realization of such a system is delicate, as numerous decoherence effects distort the measurement of the critical exponent. To circumvent these difficulties, we realize a kicked rotor with cold atoms; the quantum dynamics of this system is known to be the same as that of an electron in a disordered potential. We then test different sets of parameters governing the microscopic disorder and show that the critical exponent is independent of them. We thus prove experimentally the universality of the transition, as well as its membership in a universality class, the Gaussian orthogonal ensemble. We also describe a major change in the apparatus: the realization of a vertical standing wave and of velocimetric detection by time of flight.
APA, Harvard, Vancouver, ISO, and other styles
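The experiment summarized above relies on an atomic kicked rotor, whose quantum dynamics is commonly simulated with a split-step scheme: free evolution applied in momentum space and the kick applied in angle space. The sketch below is a generic quantum kicked rotor simulation with placeholder parameters; it illustrates the model only, not the experimental sequence or the modulation scheme of the thesis.

```python
import numpy as np

# Quantum kicked rotor, split-step evolution over one kick period:
#   kick  exp(-i K cos(theta) / hbar_eff)  applied in angle space,
#   free  exp(-i p^2 / (2 hbar_eff))       applied in momentum space.
M = 2**11                       # size of the momentum basis (illustrative)
hbar_eff = 2.89                 # effective Planck constant (placeholder)
K = 5.0                         # kick strength (placeholder)
n_kicks = 200

p = hbar_eff * (np.arange(M) - M // 2)          # momentum grid
theta = 2 * np.pi * np.arange(M) / M            # angle grid

psi_p = np.zeros(M, dtype=complex)
psi_p[M // 2] = 1.0                             # start at zero momentum

U_free = np.exp(-1j * p**2 / (2 * hbar_eff))
U_kick = np.exp(-1j * K * np.cos(theta) / hbar_eff)

for _ in range(n_kicks):
    psi_theta = np.fft.ifft(psi_p)              # to angle representation
    psi_theta *= U_kick
    psi_p = np.fft.fft(psi_theta)               # back to momentum representation
    psi_p *= U_free

prob = np.abs(psi_p)**2
prob /= prob.sum()
print("mean kinetic energy <p^2>/2:", np.sum(prob * p**2) / 2)
```

Tracking ⟨p²⟩ as a function of the number of kicks is the standard way to distinguish diffusive, localized and critical behaviour in this model.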
6

Schehr, Grégory. "Des systèmes élastiques désordonnés aux statistiques d'événements rares." Habilitation à diriger des recherches, Université Paris Sud - Paris XI, 2011. http://tel.archives-ouvertes.fr/tel-00640512.

Full text
Abstract:
The results presented in this habilitation thesis are based on the work I have carried out since 2006, when I joined the Laboratoire de Physique Théorique d'Orsay as a CNRS research associate. I have chosen to organize this thesis around three main themes: (i) disordered elastic systems, (ii) persistence properties, (iii) extreme-value statistics. The first part is a continuation of the work I did during my PhD and then my postdoc. It is built around the results I obtained for two disordered elastic models: the SOS model on a disordered substrate in two dimensions, and the vicinity of the depinning transition of an elastic line in 1+1 dimensions in a disordered potential. At the end of my postdoc I started working on persistence problems, to which the second part is devoted. I therefore present my work on persistence, both in out-of-equilibrium dynamical situations and from a somewhat more formal point of view, in connection with the properties of the real roots of random polynomials. Finally, these persistence properties led me to extreme-value statistics, which constitute the largest part of this thesis, as this is currently my main research topic. This last part opens with a fairly long introduction to extreme-value statistics, presenting a number of known results (and others that are less well known).
APA, Harvard, Vancouver, ISO, and other styles
7

Burdin, Sébastien. "Théories de champ moyen pour les systèmes d'électrons à fortes corrélations." Habilitation à diriger des recherches, Université Sciences et Technologies - Bordeaux I, 2012. http://tel.archives-ouvertes.fr/tel-00711167.

Full text
Abstract:
This habilitation thesis presents mean-field theories that I have applied to the study of strongly correlated electron systems. It is based on work I have carried out, some of it in the continuation of my PhD, some in new directions. One of the central issues is that of quantum phase transitions, and the systems considered have in common that they describe quantum impurities, real or artificial. The different mean-field methods used are presented through examples in the framework of specific physical problems. After a first introductory chapter strongly focused on the example of f-electron compounds, this thesis is structured in three main chapters connected respectively to the following three themes: Kondo compounds, spin liquids, and disordered systems.
APA, Harvard, Vancouver, ISO, and other styles
8

Sakellariou, Jason. "Inverse inference in the asymmetric Ising model." PhD thesis, Université Paris Sud - Paris XI, 2013. http://tel.archives-ouvertes.fr/tel-00869738.

Full text
Abstract:
Recent experimental techniques in biology have made it possible to acquire overwhelming amounts of data concerning complex biological networks, such as neural networks, gene regulation networks and protein-protein interaction networks. These techniques are able to record the states of individual components of such networks (neurons, genes, proteins) for a large number of configurations. However, the most biologically relevant information lies in their connectivity and in the way their components interact, information that these techniques are not able to record directly. The aim of this thesis is to study statistical methods for inferring information about the connectivity of complex networks starting from experimental data. The subject is approached from a statistical physics point of view, drawing from the arsenal of methods developed in the study of spin glasses. Spin glasses are prototypes of networks of discrete variables interacting in a complex way and are widely used to model biological networks. After an introduction of the models used and a discussion of the biological motivation of the thesis, all known methods of network inference are introduced and analysed from the point of view of their performance. Then, in the third part of the thesis, a new method is proposed, which relies on the observation that interactions in biology are not necessarily symmetric (i.e. the interaction from node A to node B is not the same as the one from B to A). It is shown that this assumption leads to methods that are both exact and efficient: the interactions can be computed exactly, given a sufficient amount of data, and in a reasonable amount of time. This is an important original contribution, since no other method is known to be both exact and efficient.
APA, Harvard, Vancouver, ISO, and other styles
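To make the inference problem in the abstract above concrete, here is a hedged sketch of one standard route for a kinetic (asymmetric) Ising model: generate a spin time series under parallel Glauber dynamics with known asymmetric couplings, then recover each row of the coupling matrix by per-spin logistic regression on the previous configuration. This illustrates the general setting only; it is not the specific mean-field method developed in the thesis, and all sizes and learning rates are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
N, T_steps, beta = 20, 10000, 1.0

J_true = rng.standard_normal((N, N)) / np.sqrt(N)   # asymmetric couplings
np.fill_diagonal(J_true, 0.0)

# Parallel Glauber dynamics: P(s_i(t+1) = +1 | s(t)) = 1 / (1 + exp(-2 beta h_i(t))).
S = np.empty((T_steps, N))
S[0] = rng.choice([-1, 1], size=N)
for t in range(T_steps - 1):
    h = J_true @ S[t]
    p_up = 1.0 / (1.0 + np.exp(-2 * beta * h))
    S[t + 1] = np.where(rng.random(N) < p_up, 1.0, -1.0)

# Inference: each row J[i, :] by logistic regression of s_i(t+1) on s(t).
X, Y = S[:-1], S[1:]
J_hat = np.zeros((N, N))
lr = 0.1
for i in range(N):
    w = np.zeros(N)
    for _ in range(300):
        p = 1.0 / (1.0 + np.exp(-2 * beta * (X @ w)))
        grad = X.T @ (p - (Y[:, i] + 1) / 2) / len(X)   # gradient up to a constant factor
        w -= lr * grad
    J_hat[i] = w

print("rms reconstruction error:", round(float(np.sqrt(np.mean((J_hat - J_true) ** 2))), 3))
```

Because the couplings are asymmetric, each spin's update can be treated as an independent conditional model, which is what makes exact and efficient inference possible in this setting.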
9

Hatchett, Jonathan Paul Lewis. "Dynamics of disordered physical and biological systems on dilute networks." Thesis, King's College London (University of London), 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.413165.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Tattersall, Graham David. "Neural networks and generalisation." Thesis, University of East Anglia, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.266735.

Full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Books on the topic "Disordered Systems and Neural Networks"

1

Narendra, Kumpati S. Neural networks and dynamical systems. New Haven, CT: Yale University Center for Systems Science, 1989.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Cruse, Holk. Neural networks as cybernetic systems. Stuttgart: G. Thieme Verlag, 1996.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Abe, Shigeo. Neural Networks and Fuzzy Systems. Boston, MA: Springer US, 1997. http://dx.doi.org/10.1007/978-1-4615-6253-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Tutorial on neural systems modeling. Sunderland, Mass: Sinauer Associates, 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Bekey, George A., and Ken Goldberg. Neural networks in robotics. New York: Springer Science+Business Media, LLC, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Neural networks for pattern recognition. Cambridge, Mass: MIT Press, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Zurada, Jacek M. Introduction to artificial neural systems. St. Paul: West, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Elmasry, Mohamed I. VLSI Artificial Neural Networks Engineering. Boston, MA: Springer US, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Eeckman, Frank H. Computation in neurons and neural systems. New York: Springer, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Disordered Systems and Neural Networks"

1

Peretto, P., and J. J. Niez. "Collective Properties of Neural Networks." In Disordered Systems and Biological Organization, 171–85. Berlin, Heidelberg: Springer Berlin Heidelberg, 1986. http://dx.doi.org/10.1007/978-3-642-82657-3_17.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Jackel, L. D., R. E. Howard, H. P. Graf, J. Denker, and B. Straughn. "High Resolution Microfabrication and Neural Networks." In Disordered Systems and Biological Organization, 193–96. Berlin, Heidelberg: Springer Berlin Heidelberg, 1986. http://dx.doi.org/10.1007/978-3-642-82657-3_19.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Noest, A. J. "Percolation and Frustration in Neural Networks." In Disordered Systems and Biological Organization, 381–84. Berlin, Heidelberg: Springer Berlin Heidelberg, 1986. http://dx.doi.org/10.1007/978-3-642-82657-3_37.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Personnaz, L., I. Guyon, and G. Dreyfus. "Neural Network Design for Efficient Information Retrieval." In Disordered Systems and Biological Organization, 227–31. Berlin, Heidelberg: Springer Berlin Heidelberg, 1986. http://dx.doi.org/10.1007/978-3-642-82657-3_23.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Buhmann, J., and K. Schulten. "A Physiological Neural Network as an Autoassociative Memory." In Disordered Systems and Biological Organization, 273–79. Berlin, Heidelberg: Springer Berlin Heidelberg, 1986. http://dx.doi.org/10.1007/978-3-642-82657-3_27.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Comets, Francis. "The Martingale Method for Mean-Field Disordered Systems at High Temperature." In Mathematical Aspects of Spin Glasses and Neural Networks, 91–113. Boston, MA: Birkhäuser Boston, 1998. http://dx.doi.org/10.1007/978-1-4612-4102-7_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Lakshmi Praveena, T., and N. V. Muthu Lakshmi. "An Enhanced Autism Spectrum Disorder Detection Model Using Convolutional Neural Networks and Machine Learning Algorithms." In Lecture Notes in Networks and Systems, 631–39. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-1941-0_63.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Pávez, Rodolfo, Jaime Díaz, Jeferson Arango-López, Danay Ahumada, Carolina Méndez, and Fernando Moreira. "Emotion Recognition in Children with Autism Spectrum Disorder Using Convolutional Neural Networks." In Advances in Intelligent Systems and Computing, 585–95. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-72657-7_56.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Toulouse, Gérard. "Neural Networks and Statistical Mechanics." In Time-Dependent Effects in Disordered Materials, 359–64. Boston, MA: Springer US, 1987. http://dx.doi.org/10.1007/978-1-4684-7476-3_38.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Bungay, Henry R. "Neural Networks." In Environmental Systems Engineering, 121–38. Boston, MA: Springer US, 1998. http://dx.doi.org/10.1007/978-1-4615-5507-0_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Disordered Systems and Neural Networks"

1

Kozłowski, P. "Optimal capacity of Ashkin-Teller neural networks." In Disordered and complex systems. AIP, 2001. http://dx.doi.org/10.1063/1.1358162.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Fukumizu, Kenji. "Geometry of neural networks and models with singularities." In Disordered and complex systems. AIP, 2001. http://dx.doi.org/10.1063/1.1358172.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Bollé, D. "Information content of neural networks with self-control and variable activity." In Disordered and complex systems. AIP, 2001. http://dx.doi.org/10.1063/1.1358156.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Skantzos, N. S. "Random field Ising chain and neural networks with synchronous dynamics." In Disordered and complex systems. AIP, 2001. http://dx.doi.org/10.1063/1.1358170.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Chadha, Ankita N., Mukesh A. Zaveri, and Jignesh N. Sarvaiya. "Isolated Word Recognition using Neural Network for Disordered Speech." In Telehealth and Assistive Technology / 847: Intelligent Systems and Robotics. Calgary, AB, Canada: ACTAPRESS, 2016. http://dx.doi.org/10.2316/p.2016.846-005.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Hayashi, Y., R. Setiono, and K. Yoshida. "Diagnosis of hepatobiliary disorders using rules extracted from artificial neural networks." In Proceedings of 8th International Fuzzy Systems Conference. IEEE, 1999. http://dx.doi.org/10.1109/fuzzy.1999.793263.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Salhi, Lotfi, Talbi Mourad, and Adnene Cherif. "Voice disorders classification using multilayer neural network." In 2008 2nd International Conference on Signals, Circuits and Systems (SCS). IEEE, 2008. http://dx.doi.org/10.1109/icscs.2008.4746953.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Cawley, Gavin C., Steven Hayward, Gareth J. Janacek, and Geoff R. Moore. "Sparse Bayesian prediction of disordered residues and disordered regions based on amino-acid composition." In 2011 International Joint Conference on Neural Networks (IJCNN 2011 - San Jose). IEEE, 2011. http://dx.doi.org/10.1109/ijcnn.2011.6033418.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Neshat, Mehdi, and Abas E. Zadeh. "Hopfield neural network and fuzzy Hopfield neural network for diagnosis of liver disorders." In 2010 5th IEEE International Conference Intelligent Systems (IS). IEEE, 2010. http://dx.doi.org/10.1109/is.2010.5548321.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Kshirsagar, Pravin R., and Sudhir G. Akojwar. "Prediction of neurological disorders using optimized neural network." In 2016 International conference on Signal Processing, Communication, Power and Embedded System (SCOPES). IEEE, 2016. http://dx.doi.org/10.1109/scopes.2016.7955731.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Disordered Systems and Neural Networks"

1

Hirsch, Morris W., Bill Baird, Walter Freeman, and Bernice Gangale. Dynamical Systems, Neural Networks and Cortical Models ASSERT 93. Fort Belvoir, VA: Defense Technical Information Center, November 1994. http://dx.doi.org/10.21236/ada295495.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Morgan, Nelson, Jerome Feldman, and John Wawrzynek. Accelerator Systems for Neural Networks, Speech, and Related Applications. Fort Belvoir, VA: Defense Technical Information Center, April 1995. http://dx.doi.org/10.21236/ada298954.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Priebe, C. E., and D. J. Marchette. Experience with Neural Networks at Naval Ocean Systems Center. Fort Belvoir, VA: Defense Technical Information Center, August 1988. http://dx.doi.org/10.21236/ada198923.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Zhuang, Y., and J. S. Baras. Identification of Infinite Dimensional Systems via Adaptive Wavelet Neural Networks. Fort Belvoir, VA: Defense Technical Information Center, January 1993. http://dx.doi.org/10.21236/ada454923.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Mu, Hong H., Y. P. Kakad, and B. G. Sherlock. Application of Artificial Neural Networks in the Design of Control Systems. Fort Belvoir, VA: Defense Technical Information Center, January 2000. http://dx.doi.org/10.21236/ada384438.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Talathi, S. S. Deep Recurrent Neural Networks for seizure detection and early seizure detection systems. Office of Scientific and Technical Information (OSTI), June 2017. http://dx.doi.org/10.2172/1366924.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Fisher, Andmorgan, Timothy Middleton, Jonathan Cotugno, Elena Sava, Laura Clemente-Harding, Joseph Berger, Allistar Smith, and Teresa Li. Use of convolutional neural networks for semantic image segmentation across different computing systems. Engineer Research and Development Center (U.S.), March 2020. http://dx.doi.org/10.21079/11681/35881.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Farhi, Edward, and Hartmut Neven. Classification with Quantum Neural Networks on Near Term Processors. Web of Open Science, December 2020. http://dx.doi.org/10.37686/qrl.v1i2.80.

Full text
Abstract:
We introduce a quantum neural network, QNN, that can represent labeled data, classical or quantum, and be trained by supervised learning. The quantum circuit consists of a sequence of parameter dependent unitary transformations which acts on an input quantum state. For binary classification a single Pauli operator is measured on a designated readout qubit. The measured output is the quantum neural network’s predictor of the binary label of the input state. We show through classical simulation that parameters can be found that allow the QNN to learn to correctly distinguish the two data sets. We then discuss presenting the data as quantum superpositions of computational basis states corresponding to different label values. Here we show through simulation that learning is possible. We consider using our QNN to learn the label of a general quantum state. By example we show that this can be done. Our work is exploratory and relies on the classical simulation of small quantum systems. The QNN proposed here was designed with near-term quantum processors in mind. Therefore it will be possible to run this QNN on a near term gate model quantum computer where its power can be explored beyond what can be explored with simulation.
APA, Harvard, Vancouver, ISO, and other styles
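The abstract above describes measuring a Pauli operator on a readout qubit after a sequence of parameter-dependent unitaries acting on an input state. Below is a minimal classical simulation of that circuit shape (two data qubits plus one readout qubit, one layer of single-qubit rotations followed by entangling gates, hypothetical parameter values); it mirrors the general structure only, not the specific ansatz or training procedure of the report.

```python
import numpy as np

# Single-qubit gates and helpers for a 3-qubit state-vector simulation.
I2 = np.eye(2)
Z = np.diag([1.0, -1.0])

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def kron(*ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

n_qubits = 3                       # qubits 0 and 1 carry data; qubit 2 is the readout
psi = np.zeros(2**n_qubits)
psi[0b110] = 1.0                   # example computational-basis input |1, 1, 0>

# One parameter-dependent layer: Ry rotations followed by entanglers.
thetas = [0.3, 1.2, 0.7]           # hypothetical trainable parameters
layer = kron(ry(thetas[0]), ry(thetas[1]), ry(thetas[2]))
entangle = kron(CNOT, I2) @ kron(I2, CNOT)   # apply CNOT(1->2) first, then CNOT(0->1)

psi = entangle @ (layer @ psi)

# Predictor: expectation value of Pauli Z on the readout qubit.
Z_readout = kron(I2, I2, Z)
prediction = psi @ (Z_readout @ psi)
print("readout <Z> =", round(float(prediction), 4))
```

In a training loop the sign of this expectation value would serve as the predicted binary label, and the parameters would be adjusted to reduce a loss over the labeled examples.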
9

Nikiforov, Vladimir. Laser technology and integrated technical systems in devices and instruments for ophthalmology using elements of artificial intelligence associated with artificial neural networks. Intellectual Archive, May 2019. http://dx.doi.org/10.32370/iaj.2123.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Nikiforov, Vladimir. Laser equipment and complex technical systems in devices and tools for ophthalmology, that use elements of artificial intelligence interlinked with artificial neural networks. Intellectual Archive, August 2019. http://dx.doi.org/10.32370/iaj.2172.

Full text
APA, Harvard, Vancouver, ISO, and other styles