Follow this link to see other types of publications on the topic: Cellular automata – Computer programs.

Theses on the topic "Cellular automata – Computer programs"

Create an accurate citation in APA, MLA, Chicago, Harvard and other styles


Consult the top 50 theses for your research on the topic "Cellular automata – Computer programs".

Next to each source in the list of references there is an "Add to bibliography" button. Press this button, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication in PDF format and read its abstract online whenever it is available in the metadata.

Explore theses on a wide variety of disciplines and organize your bibliography correctly.

1

Bolduc, Jean-Sébastien. "Cellular-automata based nonlinear adaptive controllers". Thesis, McGill University, 1998. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=20804.

Full text
Abstract
An analytical approach is obviously practical only when we want to study nonlinear systems of low complexity. An alternative for more complex processes that has raised a lot of interest in recent years relies on Artificial Neural Networks (ANNs).
In this work we will explore an alternative avenue to the problems of control and identification, where Cellular Automata (CAs) will be considered in place of ANNs. CAs not only share ANNs' most valuable characteristics but also have interesting characteristics of their own, with a structurally simpler architecture. CA applications so far have been mainly restricted to simulating natural phenomena occurring in a finite homogeneous space.
Concepts relevant to the problems of control and identification will be introduced in the first part of our work. CAs will then be introduced, with a discussion of the issues raised by their application in this context. A working prototype of a CA-based controller is introduced in the last part of the work, which confirms the interest of using CAs to address the problem of nonlinear adaptive control. (Abstract shortened by UMI.)
2

Ratitch, Bohdana. "Continuous function identification with fuzzy cellular automata". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape10/PQDD_0006/MQ44255.pdf.

Full text
3

Hopman, Ryan. "Arbitrary geometry cellular automata for elastodynamics". Thesis, Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/29742.

Full text
Abstract
Thesis (M. S.)--Mechanical Engineering, Georgia Institute of Technology, 2010.
Committee Chair: Dr. Michael Leamy; Committee Member: Dr. Karim Sabra; Committee Member: Dr. Aldo Ferri. Part of the SMARTech Electronic Thesis and Dissertation Collection.
4

Agin, Ruben. "Logic simulation on a cellular automata machine". Thesis, Massachusetts Institute of Technology, 1997. http://hdl.handle.net/1721.1/43474.

Full text
5

Adams, Roxane. "Implementation of cell clustering in cellular automata". Thesis, Stellenbosch : University of Stellenbosch, 2011. http://hdl.handle.net/10019.1/6674.

Full text
Abstract
Thesis (MSc (Mathematical Sciences)) University of Stellenbosch, 2011.
ENGLISH ABSTRACT: Cellular Automata (CA) have become a popular vehicle to study complex dynamical behaviour of systems. CA can be used to model a wide variety of physical, biological, chemical and other systems. Such systems typically consist of subparts that change their state independently, based on the state of their immediate surroundings and some generally shared laws of change. When the CA approach was used to solve the LEGO construction problem, the best solution was found when using a variant of CA allowing for the clustering of cells. The LEGO construction problem concerns the optimal layout of a set of LEGO bricks. The advantages found for using the CA method with clustering in this case are the ease of implementation, the significantly smaller memory usage compared to previously implemented methods, and its trivial extension to construct multicoloured LEGO sculptures which were previously too complex to construct. In our research we propose to explore the definitions of clustering in CA and investigate the implementation and application of this method. We look at the ant sorting method described by Lumer and Faieta, and compare the implementation of this algorithm using regular CA as well as the clustering variation. The ant sorting model is a simple model, in which ants move randomly in space and pick up and deposit objects on the basis of local information.
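The Lumer and Faieta ant-sorting rule mentioned in the abstract is simple enough to sketch. The following Python fragment is an illustrative toy version, not the thesis implementation: ants wander a toroidal grid and pick up or drop items with probabilities driven by a local similarity estimate; the grid size, the constants K1 and K2, and the neighbourhood radius are all assumed values.

```python
import random

# Minimal sketch of Lumer-Faieta ant sorting (illustrative only, not the thesis code).
# Ants wander a torus grid, picking up items where they are locally rare and
# dropping them where similar items are locally common.

SIZE, N_ITEMS, N_ANTS, STEPS, K1, K2 = 40, 300, 20, 50_000, 0.1, 0.15

grid = [[None] * SIZE for _ in range(SIZE)]
for _ in range(N_ITEMS):                      # scatter two item types at random
    x, y = random.randrange(SIZE), random.randrange(SIZE)
    grid[x][y] = random.choice("AB")

def local_fraction(x, y, kind, r=2):
    """Fraction of cells in the (2r+1)^2 neighbourhood holding items of `kind`."""
    hits, cells = 0, 0
    for dx in range(-r, r + 1):
        for dy in range(-r, r + 1):
            cells += 1
            if grid[(x + dx) % SIZE][(y + dy) % SIZE] == kind:
                hits += 1
    return hits / cells

ants = [[random.randrange(SIZE), random.randrange(SIZE), None] for _ in range(N_ANTS)]

for _ in range(STEPS):
    for ant in ants:
        ant[0] = (ant[0] + random.choice((-1, 0, 1))) % SIZE   # random walk
        ant[1] = (ant[1] + random.choice((-1, 0, 1))) % SIZE
        x, y, load = ant
        if load is None and grid[x][y] is not None:
            f = local_fraction(x, y, grid[x][y])
            if random.random() < (K1 / (K1 + f)) ** 2:          # pick up lonely items
                ant[2], grid[x][y] = grid[x][y], None
        elif load is not None and grid[x][y] is None:
            f = local_fraction(x, y, load)
            if random.random() < (f / (K2 + f)) ** 2:           # drop near similar ones
                grid[x][y], ant[2] = load, None
```

The clustering emerges solely from these two probabilistic local decisions, which is what makes the model a natural fit for CA-style implementations.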
6

Serquera, Jaime. "Sound synthesis with cellular automata". Thesis, University of Plymouth, 2012. http://hdl.handle.net/10026.1/1189.

Full text
Abstract
This thesis reports on new music technology research which investigates the use of cellular automata (CA) for the digital synthesis of dynamic sounds. The research addresses the problem of the sound design limitations of synthesis techniques based on CA. These limitations fundamentally stem from the unpredictable and autonomous nature of these computational models. Therefore, the aim of this thesis is to develop a sound synthesis technique based on CA capable of allowing a sound design process. A critical analysis of previous research in this area will be presented in order to justify that this problem has not been previously solved. Also, it will be discussed why this problem is worthwhile to solve. In order to achieve such aim, a novel approach is proposed which considers the output of CA as digital signals and uses DSP procedures to analyse them. This approach opens a large variety of possibilities for better understanding the self-organization process of CA with a view to identifying not only mapping possibilities for making the synthesis of sounds possible, but also control possibilities which enable a sound design process. As a result of this approach, this thesis presents a technique called Histogram Mapping Synthesis (HMS), which is based on the statistical analysis of CA evolutions by histogram measurements. HMS will be studied with four different automatons, and a considerable number of control mechanisms will be presented. These will show that HMS enables a reasonable sound design process. With these control mechanisms it is possible to design and produce in a predictable and controllable manner a variety of timbres. Some of these timbres are imitations of sounds produced by acoustic means and others are novel. All the sounds obtained present dynamic features and many of them, including some of those that are novel, retain important characteristics of sounds produced by acoustic means.
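As a rough illustration of the histogram-mapping idea, one can evolve a multi-state CA, take a histogram of the cell states at each step, and let the bin counts drive the amplitudes of harmonic partials in an additive synthesiser. The sketch below is only a guess at the general shape of such a pipeline; the CA rule, the frame rate and the mapping are assumptions, not the HMS technique as specified in the thesis.

```python
import numpy as np

# Sketch of the histogram-mapping idea: CA state histograms drive additive-synthesis
# partial amplitudes. Illustrative assumptions only; not the HMS algorithm itself.

STATES, WIDTH, STEPS, SR, F0 = 8, 128, 200, 44_100, 110.0
rng = np.random.default_rng(1)

row = rng.integers(0, STATES, WIDTH)          # random initial CA configuration
frames = []
for _ in range(STEPS):
    left, right = np.roll(row, 1), np.roll(row, -1)
    row = (left + row + right + 1) % STATES   # a simple totalistic multi-state rule
    hist = np.bincount(row, minlength=STATES) / WIDTH
    frames.append(hist)                       # one histogram per CA step

# Each histogram bin controls one harmonic partial; histograms are the control signal.
samples_per_frame = SR // 50                  # roughly 20 ms per CA step
t = np.arange(samples_per_frame) / SR
signal = np.concatenate([
    sum(h[k] * np.sin(2 * np.pi * F0 * (k + 1) * t) for k in range(STATES))
    for h in frames
])
signal /= np.max(np.abs(signal)) + 1e-9       # normalise to [-1, 1]
# e.g. scipy.io.wavfile.write("out.wav", SR, signal.astype(np.float32)) to listen
```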
7

Sahota, Parminda. "Evolving cellular automata molecular computer models using genetic algorithms". Thesis, University of Nottingham, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.362898.

Full text
8

Risacher, Daniel R. (Daniel Robert). "Design and implementation of a compiler for cellular automata machines". Thesis, Massachusetts Institute of Technology, 1996. http://hdl.handle.net/1721.1/38807.

Full text
Abstract
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1996.
Includes bibliographical references (p. 152).
by Daniel R. Risacher.
M.Eng.
9

Xie, Jingnan. "Complexity Theoretic Parallels Among Automata, Formal Languages and Real Variables Including Multi-Patterns, L-Systems and Cellular Automata". Thesis, State University of New York at Albany, 2017. http://pqdtopen.proquest.com/#viewpdf?dispub=10272502.

Full text
Abstract

In this dissertation, we emphasize productiveness, not just undecidability, since productiveness implies constructive incompleteness. Analogues of Rice's Theorem for different classes of languages are investigated, refined and generalized. In particular, several sufficient but general conditions are presented for predicates to be as hard as some widely discussed predicates such as "= ∅" and "= {0,1}*". These conditions provide several general methods for proving complexity/productiveness results and apply to a large number of simple and natural predicates. As the first step in applying these general methods, we investigate the complexity/productiveness of the predicates "= ∅", "= {0,1}*" and other predicates that can be useful sources of many-one reductions for different classes of languages. Then we use very efficient many-one reductions of these basic source predicates to prove many new non-polynomial complexity lower bounds and productiveness results. Moreover, we study the complexity/productiveness of predicates for easily recognizable subsets of instances with important semantic properties. Because of the efficiency of our reductions, intuitively these reductions can preserve many levels of complexity. We apply our general methods to pattern languages [1] and multi-pattern languages [2]. Interrelations between multi-pattern languages (or pattern languages) and standard classes of languages such as context-free languages and regular languages are studied. A way to study the descriptional complexity of standard language descriptors (for example, context-free grammars and regular expressions) and multi-patterns is illustrated. We apply our general methods to several generalizations of regular expressions. A productiveness result for the predicate "= {0,1}*" is established for synchronized regular expressions [3]. Because of this, many new productiveness results for synchronized regular expressions follow easily. We also apply our general methods to several classes of Lindenmayer systems [4] and of cellular automata [5]. A way of studying the complexity/productiveness of the 0Lness problem is developed and many new results follow from it. For real time one-way cellular automata, we observe that the predicates "= ∅" and "= {0,1}*" are both productive. Because of this, many more general results are presented. For two-way cellular automata, we prove a strong meta-theorem and give a complete characterization for testing containment of any fixed two-way cellular automaton language. Finally, we generalize our methods and apply them to the theory of functions of real variables. In rings, the equivalence to identically 0 function problem, which is an analogue of "= ∅", is studied. We show that the equivalence to identically 0 function problem for some classes of elementary functions is productive for different domains including open and closed bounded intervals of real numbers. Two initial results for real fields are also presented.

10

Brown, Robert L. "Application of Cellular Automata to Detection of Malicious Network Packets". NSUWorks, 2014. http://nsuworks.nova.edu/gscis_etd/106.

Full text
Abstract
A problem in computer security is identification of attack signatures in network packets. An attack signature is a pattern of bits that characterizes a particular attack. Because there are many kinds of attacks, there are potentially many attack signatures. Furthermore, attackers may seek to avoid detection by altering the attack mechanism so that the bit pattern presented differs from the known signature. Thus, recognizing attack signatures is a problem in approximate string matching. The time to perform an approximate string match depends upon the length of the string and the number of patterns. For constant string length, the time to match n patterns is approximately O(n); the time increases approximately linearly as the number of patterns increases. A binary cellular automaton is a discrete, deterministic system of cells in which each cell can have one of two values. Cellular automata have the property that the next state of each cell can be evaluated independently of the others. If there is a processing element for each cell, the next states of all cells in a cellular automaton can be computed simultaneously. Because there is no programming paradigm for cellular automata, cellular automata to perform specific functions are created ad hoc by hand or discovered using search methods such as genetic algorithms. This research has identified, through evolution by genetic algorithm, cellular automata that can perform approximate string matching for more than one pattern while operating in constant time with respect to the number of patterns, and in the presence of noise. Patterns were recognized by using the bits of a network packet payload as the initial state of a cellular automaton. After a predetermined number of cycles, the ones density of the cellular automaton was computed. Packets for which the ones density was below an experimentally determined threshold were identified as target packets. Six different cellular automaton rules were tested against a corpus of 7.2 million TCP packets in the IDEval data set. No rule produced false negative results, and false positive results were acceptably low.
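The detection scheme described above is straightforward to express in code: load the payload bits into a one-dimensional binary CA, iterate a fixed number of cycles, and flag the packet if the final ones density falls below a threshold. In the sketch below the rule (elementary rule 110), the cycle count and the threshold are placeholders; the thesis evolved its six rules with a genetic algorithm rather than choosing them by hand.

```python
import numpy as np

# Sketch of ones-density classification with a 1D binary CA (radius 1).
# RULE, CYCLES and THRESHOLD are placeholders; the thesis found its rules
# by genetic search rather than fixing them by hand.

RULE, CYCLES, THRESHOLD = 110, 64, 0.35
RULE_TABLE = np.array([(RULE >> i) & 1 for i in range(8)], dtype=np.uint8)

def classify(payload: bytes) -> bool:
    """Return True if the packet is flagged as a target (low final ones density)."""
    cells = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    for _ in range(CYCLES):
        left, right = np.roll(cells, 1), np.roll(cells, -1)
        idx = (left << 2) | (cells << 1) | right      # 3-bit neighbourhood code
        cells = RULE_TABLE[idx]
    return cells.mean() < THRESHOLD

print(classify(b"GET /index.html HTTP/1.1\r\n"))
```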
11

Mazzolini, Ryan. "Procedurally generating surface detail for 3D models using voxel-based cellular automata". Master's thesis, University of Cape Town, 2016. http://hdl.handle.net/11427/20502.

Full text
Abstract
Procedural generation is used extensively in the field of computer graphics to automate content generation and speed up development. One particular area often automated is the generation of additional colour and structural detail for existing 3D models. This empowers artists by providing a tool-set that enhances their existing workflow and saves time. 3D surface structures are traditionally represented by polygon mesh-based models augmented by 2D mapping techniques. These methods can approximate features, such as caves and overhangs; however, they are complex and difficult to modify. As an alternative, a grid of voxels can model 3D shapes and surfaces, similar to how 2D pixels form an image. The regular form of voxel-based models is easier to alter, at the cost of additional computational overhead. One technique for generating and altering voxel content is by using Cellular Automata (CA). CAs are able to produce complex structures from simple rules and also easily map to higher dimensions, such as voxel datasets. However, creating CA rule-sets can be difficult and tedious. This is especially true when creating multidimensional CA. In our work we use a grammar system to create surface detail CA. The grammar we develop is similar to formal grammars used in procedural generation, such as L-systems and shape grammars. Our system is composed of three main sections: a model converter, grammar and CA executor. The model converter changes polygon-mesh models to and from a voxel-based model. The grammar provides a simple language to create CA that can consider 3D neighbourhoods and query parameters, such as colour or structure. Finally, the CA executor interprets the produced grammars into surface-oriented CAs. The final output of this system is a polygon-mesh model, altered by the CA, which is usable for graphics applications. We test the system by replicating a number of CA use-cases with our grammar system. From the results, we conclude that our grammar system is capable of creating a wide range of 3D detail CA. However, the high resolution of resulting meshes and slow processing times make the process more suited to off-line processing and pre-production.
12

Chambless, Jason Daniel. "A 3D computer model investigation of biofilm detachment and protection mechanisms". Thesis, Montana State University, 2008. http://etd.lib.montana.edu/etd/2008/chambless/ChamblessJ0508.pdf.

Full text
Abstract
A biofilm is a dense aggregation of microorganisms attached to each other and a supporting surface. Biofilms are ubiquitous in industrial environments and are also frequently recognized as the source of persistent infections. Biofilm invasions and biofilm-induced infections are often difficult or impossible to remedy. This dissertation presents the results of a 3D hybrid computer model, BacLAB, which was used to simulate detachment and protection mechanisms of biofilms in a cellular automata framework. Protection against antimicrobials afforded by each of four hypothesized protective mechanisms was investigated in order to examine population survival versus antimicrobial exposure time, and the spatial patterns of chemical species and cell types. When compared to each other, the behaviors of the slow penetration, adaptive stress response, substrate limitation, and persister mechanisms produced distinct shapes of killing curves, non-uniform spatial patterns of survival and cell type distribution, and anticipated susceptibility patterns of dispersed biofilm cells. Detachment is an important process that allows an organism the possibility of traveling to and colonizing a new location. Detachment also balances growth and so determines the net accumulation of biomass on the surface. Three hypothetical mechanisms representing various physical and biological influences of detachment were incorporated into BacLAB. The purpose of this investigation was to characterize each of the mechanisms with respect to four criteria: the resulting biofilm structure, the existence of a steady state, the propensity for sloughing events, and the dynamics during starvation. The results showed that varying the detachment mechanism is a critical determinant of biofilm structure and of the dynamics of biofilm accumulation and loss. Phenotypic variants, in the form of dormant cells, can often survive an antimicrobial treatment. The existence of these cells, termed persisters, is one hypothetical explanation for biofilm recalcitrance. Four different combinations of random and substrate-dependent persister mechanisms were simulated through the use of the BacLAB model. The purpose of this study was to determine and compare the effects of differing formation and resuscitation strategies on persister-related protection of biofilms. Analysis of the simulations showed that extended periods of dormancy, without regard to the mechanism, were directly responsible for more tolerant biofilms.
13

Karaca, Igor. "Random precision: some applications of fractals and cellular automata in music composition". Connect to this title online, 2005. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1115254473.

Full text
Abstract
Thesis (D. M. A.)--Ohio State University, 2005.
Title from first page of PDF file. Document formatted into pages; contains vii, 133 p.; also includes graphics (some col.). Includes bibliographical references (p. 47-48). Available online via OhioLINK's ETD Center.
14

Min, Byoung Won. "Trade and war in cellular automata worlds: A computer simulation of interstate interactions". The Ohio State University, 2002. http://rave.ohiolink.edu/etdc/view?acc_num=osu1486459267519449.

Full text
15

Dolzhenko, Egor. "Modeling State Transitions with Automata". Scholar Commons, 2013. http://scholarcommons.usf.edu/etd/4468.

Full text
Abstract
Models based on various types of automata are ubiquitous in modern science. These models allow reasoning about deep theoretical questions and provide a basis for the development of efficient algorithms to solve related computational problems. This work discusses several types of automata used in such models, including cellular automata and mandatory results automata. The first part of this work is dedicated to cellular automata. These automata form an important class of discrete dynamical systems widely used to model physical, biological, and chemical processes. Here we discuss a way to study the dynamics of one-dimensional cellular automata through the theory of two-dimensional picture languages. The connection between cellular automata and picture languages stems from the fact that the set of all space-time diagrams of a cellular automaton defines a picture language. We will discuss a hierarchy of cellular automata based on the complexity of the picture languages that they define. In addition to this, we present a characterization of cellular automata that can be described by finite-state transducers. The second part of this work presents a theory of runtime enforcement based on mechanism models called Mandatory Results Automata (MRAs). MRAs can monitor and transform security-relevant actions and their results. Because previous work could not model general security monitors transforming results, MRAs capture realistic behaviors outside the scope of previous models. MRAs also have a simple but realistic operational semantics that makes it straightforward to define concrete MRAs. Moreover, the definitions of policies and enforcement with MRAs are significantly simpler and more expressive than those of previous models. Putting all these features together, we argue that MRAs make good general models of (synchronous) runtime mechanisms, upon which a theory of runtime enforcement can be based. We develop some enforceability theory by characterizing the policies deterministic and nondeterministic MRAs enforce.
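The connection between one-dimensional CA and two-dimensional pictures is concrete: stacking successive configurations yields the space-time diagram, which can be read as a binary picture. A minimal sketch (elementary rule 110 and a single-seed start are arbitrary choices):

```python
import numpy as np

# A 1D elementary CA's space-time diagram is a 2D binary picture:
# each row is one configuration, time runs downward. Rule 110 is an arbitrary choice.

def space_time_diagram(rule: int, width: int = 64, steps: int = 32) -> np.ndarray:
    table = np.array([(rule >> i) & 1 for i in range(8)], dtype=np.uint8)
    row = np.zeros(width, dtype=np.uint8)
    row[width // 2] = 1                               # single seed cell
    picture = [row]
    for _ in range(steps - 1):
        l, r = np.roll(row, 1), np.roll(row, -1)
        row = table[(l << 2) | (row << 1) | r]        # apply the local rule everywhere
        picture.append(row)
    return np.stack(picture)                          # shape (steps, width)

diagram = space_time_diagram(110)
for line in diagram:
    print("".join("#" if c else "." for c in line))
```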
16

Andersson, Fredrik. "Procedurellt genererade provinskartor för strategispel : En jämförelse mellan Voronoidiagram och Cellular Automata". Thesis, Högskolan i Skövde, Institutionen för kommunikation och information, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-8185.

Full text
Abstract
Procedural generation is a field that has grown considerably in recent years, but it is also an old field. Well-known concepts within this area are map and terrain generation. These concepts are used both to create realistic terrain and to create game maps for the player to explore. This work uses procedural generation to examine the difference between two well-known methods, Voronoi diagrams and Cellular Automata, when creating a province map. Three maps from each method are evaluated by importing them into the game Crusader Kings 2 and then playing them using the game's artificial intelligence. The maps are evaluated by looking at three different factors for how interesting the map is, as well as the performance of creating a map. The results show that there is little difference between the two methods compared, but also that there is very little difference from the standard map. In terms of performance, however, the Voronoi diagram is superior, with roughly twice the time efficiency of Cellular Automata.
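For readers unfamiliar with the first of the two methods, a nearest-seed Voronoi partition of a grid into provinces takes only a few lines. The sketch below is illustrative and does not reproduce the study's generator or its Crusader Kings 2 evaluation step.

```python
import random

# Nearest-seed Voronoi partition of a grid into provinces (illustrative sketch only).

W, H, N_PROVINCES = 80, 50, 12
random.seed(2)
seeds = [(random.randrange(W), random.randrange(H)) for _ in range(N_PROVINCES)]

def province(x, y):
    """Index of the closest seed (squared Euclidean distance)."""
    return min(range(N_PROVINCES),
               key=lambda i: (seeds[i][0] - x) ** 2 + (seeds[i][1] - y) ** 2)

province_map = [[province(x, y) for x in range(W)] for y in range(H)]
print("\n".join("".join(chr(65 + p) for p in row) for row in province_map[:10]))
```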
17

Singhal, Rahul. "Logic Realization Using Regular Structures in Quantum-Dot Cellular Automata (QCA)". PDXScholar, 2011. https://pdxscholar.library.pdx.edu/open_access_etds/196.

Full text
Abstract
The semiconductor industry seems to be approaching a wall where physical geometry and power density issues could possibly render device fabrication infeasible. Quantum-dot Cellular Automata (QCA) is a new nanotechnology that claims to offer the potential of manufacturing even denser integrated circuits, which can operate at high frequencies and low power consumption. In QCA technology, signal propagation occurs as a result of electrostatic interaction among the electrons, as opposed to the flow of electrons in a wire. The basic building block of QCA technology is the QCA cell, which encodes binary information with the relative position of the electrons in it. A QCA cell can be used either as a wire or as logic. In QCA, the directionality of the signal flow is controlled by a phase-shifted electric field generated on a separate layer from the QCA cell layer. This process is called clocking of QCA circuits. Logic realization using regular structures such as PLAs has played a significant role in the semiconductor field due to their manufacturability, behavioral predictability and the ease of logic mapping. Along with these benefits, regular structures in QCAs would allow for a uniform QCA clocking structure. The clocking structure is important because the pioneers of QCA technology propose it to be fabricated in CMOS technology. This thesis presents a detailed design implementation and a comparative analysis of logic realization using regular structures, namely Shannon-Lattices and PLAs, for QCAs. A software tool was developed as a part of this research, which automatically generates complete QCA-Shannon-Lattice and QCA-PLA layouts for single-output Boolean functions based on an input macro-cell library. The equations for latency and throughput for the new QCA-PLA and QCA-Shannon-Lattice design implementations were also formulated. The correctness of the equations was verified by performing simulations of the tool-generated layouts with QCADesigner. A brief design trade-off analysis between the tool-generated regular structure implementation and the unstructured custom layout in QCA is presented for the full-adder circuit.
18

Damera, Prateen Reddy. "A low level analysis of Cellular Automata and Random Boolean Networks as a computational architecture". PDXScholar, 2011. https://pdxscholar.library.pdx.edu/open_access_etds/670.

Full text
Abstract
With the transition from single-core to multi-core computing and CMOS technology reaching its physical limits, new computing architectures which are scalable, robust, and low-power are required. Promising alternatives to conventional computing architectures are Cellular Automata (CA) networks and Random Boolean Networks (RBN), where simple computational nodes combine to form a network that is capable of performing a larger computational task. It has previously been shown that RBNs can offer superior characteristics over mesh networks in terms of robustness, information processing capabilities, and manufacturing costs while the locally connected computing elements of a CA network provide better scalability and low average interconnect length. This study presents a low level hardware analysis of these architectures using a framework which generates the HDL code and netlist of these networks for various network parameters. The HDL code and netlists are then used to simulate these new computing architectures to estimate the latency, area and power consumed when implemented on silicon and performing a pre-determined computation. We show that for RBNs, information processing is faster compared to a CA network, but CA networks are found to have a lower and better distribution of power dissipation than RBNs because of their regular structure. A well-established task to determine the latency of operation for these architectures is presented for a good understanding of the effect of non-local connections in a network. Programming the nodes for this purpose is done externally using a novel self-configuration algorithm requiring minimal hardware. Configuration for RBNs is done by sending in configuration packets through a randomly chosen node. Logic for identifying the topology for the network is implemented for the nodes in the RBN network to enable compilers to analyze and generate the configuration bit stream for that network. On the other hand, the configuration of the CA network is done by passing in configuration data through the inputs on one of the sides of the cell array and shifting it into the network. A study of the overhead of the network configuration and topology identification mechanisms is presented. An analysis of small-world networks in terms of interconnect power and information propagation capability has been presented. It has been shown that small-world networks, whose randomness lies between that of completely regular and completely irregular networks, are realistic while providing good information propagation capability. This study provides valuable information to help designers make decisions for various performance parameters for both RBN and CA networks, and thus to find the best design for the application under consideration.
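For readers unfamiliar with the node model being compared, a random Boolean network update can be sketched in software in a few lines. This is only an illustration of the synchronous update rule with K = 2 inputs per node; the thesis itself works at the HDL and netlist level.

```python
import random

# Software sketch of a random Boolean network (RBN) with K=2 inputs per node.
# The thesis evaluates such networks in hardware (HDL/netlists); this only
# illustrates the synchronous update rule.

N, K = 16, 2
random.seed(0)
inputs = [[random.randrange(N) for _ in range(K)] for _ in range(N)]         # random wiring
tables = [[random.randrange(2) for _ in range(2 ** K)] for _ in range(N)]    # random truth tables
state = [random.randrange(2) for _ in range(N)]

def step(state):
    """Synchronously update every node from its K inputs via its truth table."""
    new = []
    for node in range(N):
        idx = 0
        for src in inputs[node]:
            idx = (idx << 1) | state[src]
        new.append(tables[node][idx])
    return new

for _ in range(8):
    print("".join(map(str, state)))
    state = step(state)
```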
19

Pontecorvo, Carmine. "Edge detection and enhancement using shunting inhibitory cellular neural networks /". Title page, abstract and contents only, 1998. http://web4.library.adelaide.edu.au/theses/09PH/09php814.pdf.

Full text
20

Kapkar, Rohan Viren. "Modeling and Simulation of Altera Logic Array Block using Quantum-Dot Cellular Automata". University of Toledo / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1304616947.

Full text
21

Tang, Wing-shun and 鄧榮信. "Study of power spectrum fluctuation in accretion disc by cellular automaton". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1999. http://hub.hku.hk/bib/B31221695.

Full text
22

Zaggl, Michael A. [Verfasser]. "Computer-based Simulation of Vegetation : Development of a Cellular Automata Model for Grasslands / Michael A. Zaggl". Saarbrücken : VDM Verlag Dr. Müller, 2009. http://www.vdm-verlag.de.

Full text
23

Dunn, Adam. "A model of wildfire propagation using the interacting spatial automata formalism". University of Western Australia. School of Computer Science and Software Engineering, 2007. http://theses.library.uwa.edu.au/adt-WU2007.0071.

Full text
Abstract
[Truncated abstract] In this thesis, I address the modelling and computer simulation of spatial, event-driven systems from a computer science perspective. Spatially explicit models of wildland fire (wildfire) behaviour are addressed as the specific application domain. Wildfire behaviour is expressed as a formal model and the associated simulations are compared to existing models and implementations. It is shown that the interacting spatial automata formalism provides a general framework for modelling spatial event-driven systems and is appropriate to wildfire systems. The challenge addressed is that of physically realistic modelling of wildfire behaviour in heterogeneous environments . . . Many current models do not incorporate the influence of a neighbourhood (the geometry of the fire front local to an unburnt volume of fuel, for example), but rather determine the propagation of fire using only point information. Whilst neighbourhood-based influence of behaviour is common to cellular automata theory, its use is very rare in existing wildfire models. In this thesis, I present the modelling technique and demonstrate its applicability to wildfire systems via a series of simulation experiments, where I reproduce known spatial wildfire dynamics. I conclude that the interacting spatial automata formalism is appropriate as a basis for constructing new computer simulations of wildfire spread behaviour. Simulation results are compared to existing implementations, highlighting the limitations of current models and demonstrating that the new models are capable of greater physical realism.
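A toy neighbourhood-based fire-spread CA illustrates the kind of local rule the abstract contrasts with point-based propagation models. This is a deliberate simplification; the thesis uses the richer interacting spatial automata formalism with heterogeneous fuels, not the rule below.

```python
import random

# Toy fire-spread CA: a cell ignites with a probability that grows with the number
# of burning neighbours. Simplified for illustration; not the thesis's formalism.

UNBURNT, BURNING, BURNT = 0, 1, 2
SIZE, P_SPREAD, STEPS = 30, 0.25, 40
random.seed(3)

grid = [[UNBURNT] * SIZE for _ in range(SIZE)]
grid[SIZE // 2][SIZE // 2] = BURNING                     # ignition point

def neighbours(x, y):
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            if (dx or dy) and 0 <= x + dx < SIZE and 0 <= y + dy < SIZE:
                yield x + dx, y + dy

for _ in range(STEPS):
    new = [row[:] for row in grid]
    for x in range(SIZE):
        for y in range(SIZE):
            if grid[x][y] == BURNING:
                new[x][y] = BURNT                        # burning cells burn out
            elif grid[x][y] == UNBURNT:
                n_burning = sum(grid[i][j] == BURNING for i, j in neighbours(x, y))
                if random.random() < 1 - (1 - P_SPREAD) ** n_burning:
                    new[x][y] = BURNING                  # ignite based on neighbourhood
    grid = new

print(sum(row.count(BURNT) for row in grid) / SIZE ** 2)  # burnt fraction
```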
24

Connell, Kathleen L. "An I/O algorithm and a test algorithm for a reconfigurable cellular array". Thesis, Virginia Polytechnic Institute and State University, 1985. http://hdl.handle.net/10919/90927.

Full text
Abstract
Recent advances in VLSI technology have stimulated research efforts in the area of highly reliable fault tolerant, general purpose computing systems, notably, parallel systems. An automatically reconfigurable, fault-tolerant, parallel architecture is suited to VLSI technology. The architecture, a uniformly interconnected array of identical cells, is capable of functional reconfiguration as well as fault reconfiguration. Microprocessor cells are suggested as the "fabric" for implementation of the array. This thesis also introduces an I/O algorithm as an extension to the reconfiguration process, and outlines the steps by which the array cells construct paths from the active-array to the cellular array I/O ports. Path reconfiguration is presented as the method by which fault-free paths replace faulty paths. A testing algorithm is described for use in the self-testing operation of the array. The types of tests that are conducted on cells are outlined, and the basis by which a cell determines the faulty or fault-free status of a cell is described.
M.S.
25

Yu, Di. "An Application Developed for Simulation of Electrical Excitation and Conduction in a 3D Human Heart". Scholar Commons, 2013. http://scholarcommons.usf.edu/etd/4620.

Full text
Abstract
This thesis first reviews the history of General-Purpose computing on Graphics Processing Units (GPGPU) and then introduces the fundamental problems that are suitable for GPGPU algorithms. The architecture of the GPGPU is compared against modern CPU architecture, and the fundamental difference is outlined. The programming challenges faced by GPGPU and the techniques utilized to overcome these issues are evaluated and discussed. The second part of the thesis presents an application developed with GPGPU technology to simulate the electrical excitation and conduction in a 3D human heart model based on a cellular automata model. The algorithm and implementation are discussed in detail and the performance of the GPU is compared against the CPU.
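The abstract does not give the cell rule, but excitable-media CA of the Greenberg-Hastings type are a common choice for this kind of cardiac simulation, so the following CPU-side sketch is offered purely as an assumed stand-in. Its per-cell, data-parallel update is exactly the pattern that maps well onto a GPU.

```python
import numpy as np

# Greenberg-Hastings excitable-media CA: a common stand-in for cardiac excitation.
# This is an assumption for illustration; the thesis's actual cell rule and 3D heart
# geometry are not reproduced here. Each cell is resting (0), excited (1),
# or refractory (2..R). The per-cell update is embarrassingly parallel,
# which is why such models map well to GPGPU.

R, SIZE, STEPS = 5, 64, 100
rng = np.random.default_rng(0)
state = (rng.random((SIZE, SIZE)) < 0.02).astype(np.int8)    # sparse initial excitation

for _ in range(STEPS):
    excited_neighbour = np.zeros_like(state, dtype=bool)
    for shift, axis in ((1, 0), (-1, 0), (1, 1), (-1, 1)):
        excited_neighbour |= np.roll(state == 1, shift, axis=axis)
    nxt = np.where(state > 0, (state + 1) % (R + 1), 0)       # excited/refractory advance
    nxt = np.where((state == 0) & excited_neighbour, 1, nxt)  # resting cells get excited
    state = nxt.astype(np.int8)

print(float((state == 1).mean()))                             # fraction currently excited
```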
26

Ramineni, Narahari. "Tree Restructuring Approach to Mapping Problem in Cellular Architecture FPGAS". PDXScholar, 1995. https://pdxscholar.library.pdx.edu/open_access_etds/4914.

Full text
Abstract
This thesis presents a new technique for mapping combinational circuits to Fine-Grain Cellular-Architecture FPGAs. We represent the netlist as the binary tree with decision variables associated with each node of the tree. The functionality of the tree nodes is chosen based on the target FPGA architecture. The proposed tree restructuring algorithms preserve local connectivity and allow direct mapping of the trees to the cellular array, thus eliminating the traditional routing phase. Also, predictability of the signal delays is a very important advantage of the developed approach. The developed bus-assignment algorithm efficiently utilizes the medium distance routing resources (buses). The method is general and can be used for any Fine Grain CA-type FPGA. To demonstrate our techniques, ATMEL 6000 series FPGA was used as a target architecture. The area and delay comparison between our methods and commercial tools is presented using a set of MCNC benchmarks. Final layouts of the implemented designs are included. Results show that the proposed techniques outperform the available commercial tools for ATMEL 6000 FPGAs, both in area and delay optimization.
27

Bozkurt, Halil. "Modeling of Socio-Economic Factors and Adverse Events In an Active War Theater By Using a Cellular Automata Simulation Approach". Doctoral diss., University of Central Florida, 2013. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/5771.

Full text
Abstract
The Department of Defense (DoD) implemented the Human Social Cultural and Behavior (HSCB) program to meet the need to develop the capability to understand, predict and shape human behavior among different cultures by developing a knowledge base, building models, and creating training capacity. This capability will allow decision makers to subordinate kinetic operations and promote non-kinetic operations to govern economic programs better in order to initiate efforts and development to address the grievances among those displeased by adverse events. These non-kinetic operations include rebuilding indigenous institutions' bottom-up economic activity and constructing necessary infrastructure since the success in non-kinetic operations depends on understanding and using the social and cultural landscape. This study aims to support decision makers by building a computational model to understand economic factors and their effect on adverse events. In this dissertation, the analysis demonstrates that the use of cellular automata has several significant contributions to support decision makers allocating development funds to stabilize regions with higher adverse event risks, and to better understand the complex socio-economic interactions with adverse events. Thus, this analysis was performed on a set of spatial data representing factors from social and economic data. In studying behavior using cellular automata, cells in the same neighborhood synchronously interact with each other to determine their next states, and small changes in iteration may yield complex formations of adverse event risk after several iterations of time. The modeling methodology of cellular automata for social and economic analysis in this research was designed in two major implementation levels as follows: macro and micro-level. In the macro-level, the modeling framework integrates population, social, and economic sub-systems. The macro-level allows the model to use regionalized representations, while the micro-level analyses help to understand why the events have occurred. Macro-level subsystems support cellular automata rules to generate accurate predictions. The prediction capability of cellular automata is used to model the micro-level interactions between individual actors, which are represented by adverse events. The results of this dissertation demonstrate that the cellular automata model is capable of evaluating socio-economic influences that result in changes in adverse events and identifying the location, time and impact of these events. Secondly, this research indicates that the socio-economic influences have different levels of impact on adverse events, defined by the number of people killed, wounded or hijacked. Thirdly, this research shows that the socio-economic influences and adverse events that occurred in a given district have impacts on adverse events that occur in neighboring districts. The cellular automata modeling approach can be used to enhance the capability to understand and use human, social and behavioral factors by generating what-if scenarios to determine the impact of different infrastructure development projects to predict adverse events. Lastly, adverse events that could occur in upcoming years can be predicted to allow decision makers to deter these events or plan accordingly if these events do occur.
Ph.D.
Doctorate
Industrial Engineering and Management Systems
Engineering and Computer Science
Industrial Engineering
28

Ouyang, Weichen. "A Web-Based Decision Support System For Wildfire Management". University of Cincinnati / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1406901679.

Full text
29

Smal, Eugene. "Automated brick sculpture construction". Thesis, Stellenbosch : Stellenbosch University, 2008. http://hdl.handle.net/10019/1621.

Full text
30

Labrado, Carson. "Exploration of Majority Logic Based Designs for Arithmetic Circuits". UKnowledge, 2017. http://uknowledge.uky.edu/ece_etds/102.

Full text
Abstract
Since its inception, Moore's Law has been a reliable predictor of computational power. This steady increase in computational power has been due to the ability to fit increasing numbers of transistors in a single chip. A consequence of increasing the number of transistors is also an increase in power consumption. The physical properties of CMOS technologies will make this power wall unavoidable and will result in severe restrictions to future progress and applications. A potential solution to the problem of rising power demands is to investigate alternative low power nanotechnologies for implementing logic circuits. The intrinsic properties of these emerging nanotechnologies result in them being low power in nature when compared to current CMOS technologies. This thesis specifically highlights quantum-dot cellular automata (QCA) and nanomagnetic logic (NML) as just two possible technologies. Designs in NML and QCA are explored for simple arithmetic units such as full adders and subtractors. A new multilayer 5-input majority gate design is proposed for use in NML. Designs of reversible adders are proposed which are easily testable for unidirectional stuck-at faults.
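Majority-gate designs such as those explored here reduce arithmetic to the 3-input majority function M(a, b, c). The snippet below checks a textbook majority-gate full-adder decomposition by brute force; it is not necessarily the decomposition used in the thesis.

```python
from itertools import product

# The 3-input majority function underlying QCA/NML logic design.
def M(a: int, b: int, c: int) -> int:
    return 1 if a + b + c >= 2 else 0

# A textbook majority-gate full adder (not necessarily the thesis's design):
# Cout needs one gate, Sum needs two more plus inverters.
def full_adder(a: int, b: int, cin: int):
    cout = M(a, b, cin)
    s = M(M(a, b, 1 - cin), 1 - cout, cin)
    return s, cout

# Brute-force check against ordinary binary addition over all 8 input combinations.
for a, b, cin in product((0, 1), repeat=3):
    s, cout = full_adder(a, b, cin)
    assert (cout << 1) | s == a + b + cin
print("majority-gate full adder verified for all inputs")
```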
31

Shenoi, Sangeetha Chandra. "A Comparative Study on Methods for Stochastic Number Generation". University of Cincinnati / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1511881394773194.

Full text
32

Foote, David W. "The Design, Realization and Testing of the ILU of the CCM2 Using FPGA Technology". PDXScholar, 1994. https://pdxscholar.library.pdx.edu/open_access_etds/4703.

Full text
Abstract
Most existing computers today are built upon a subset of the arithmetic system which is based upon the foundation of set theory. All formal systems can be expressed in terms of arithmetic and logic on current arithmetic computers through an appropriate model, and one can then work with the model using software manipulation. However, severe speed degradation is the price one must pay for using a software-based approach, making several high-level formal systems impractical. To improve the speed at which computers can implement these high-level systems, one must either design special hardware, implementing specific operations much like math and image processing coprocessors, or execute operations upon multiple processors in a parallel fashion. Due to the increase in developing applications for the manipulation of logic functions, an interest in the logic machine has arisen. Many applications such as logic optimization, simulation, pattern recognition and image processing can be better implemented with a logic machine. This thesis proposes the design, hardware realization, and testing of the iterative logic unit (ILU) of the Cube Calculus Machine II (CCM2). The CCM2 is a general purpose computer with an architecture that emphasizes a data path designed to execute operations of cube calculus, a popular algebraic model used in the minimization of Boolean functions. The ILU is an iterative logic array of cells (ITs) using internal distributed control, enabling the execution of basic cube operations, while the Control Unit (CU) handles global signals from the host computer. The ILU of the CCM2 has been realized in hardware using Xilinx Logic Cell Arrays (LCAs). FPGAs offer the logic density and versatility of gate arrays, with the off-the-shelf availability and time-to-market advantages of standard user-programmable devices. These devices can be reconfigured, allowing multiple revisions and future design generations to accommodate the same device, thus saving design and production costs, an ideal solution to the resource and financial problems plaguing the University environment.
33

Wu, Yunhui. "Agent behavior in peer-to-peer shared ride systems /". Connect to thesis, 2007. http://eprints.unimelb.edu.au/archive/00003214.

Full text
34

Lisboa, Leila Sheila Silva. "Cenários de mudanças climáticas usando modelagem dinâmica na Bacia do Alto Taquari". Universidade do Estado do Rio de Janeiro, 2008. http://www.bdtd.uerj.br/tde_busca/arquivo.php?codArquivo=8271.

Full text
Abstract
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
Due to agriculture frontier advance in Centre-Western Brazil in the last 40 years, the region became a major grain and meat producer. Soil and Climate particular characteristics, associated to soil management system brought drastic environmental consequences, such as erosion process, mainly in Upper Taquari Basin (UTB). Approximately 86% of UTB is located in North of Mato Grosso do Sul, however the sediment transport effects are reflected downstream, at Pantanal Basin. This study aimed at modeling meteorological variables and simulating climate change scenarios applying dynamic modeling techniques coupled to geoprocessing tools in UTB in order to support land use planning in the region. IPCC assumptions were adopted to simulate two termopluvial scenarios until 2100 applying TerraME (Modelling Environment) tool. An optimistic scenario considers that yearly average air temperature would be increased by 1C, while pessimistic scenario points out 3C as average temperature elevation. Regarding to annual pluvial precipitation means, an optimistic scenario forecasts 15% of precipitation increment. Reductions of 15% in precipitation are waited in pessimistic foreseen. Isolines spatial distribution was calculated using DEM (Digital Elevation Model) based on SRTM (Shuttle Radar Topography Mission). Scenarios generate different spatial topoclimate patterns in the basin. Prevalent mean temperatures currently vary from 23.6 to 25.7C. After 100 years in simulation, optimistic scenario shows a displacement to thermal range from 22.1 to 23.0C. In the next 40 years, on areas along Taquari river basin, from West to East direction, temperatures will overcome current mean superior thermal limit for the region of UTB, i.e., evapotranspiration rates in riparian zone are likely to increase. This indicates a trend to reduction in stream discharges. In the pessimistic scenarios, these temperatures will be anticipated in 20 years. Scenario with 15% higher pluvial precipitation shows that north part of UTB will receive larger rainfall volumes, what should make erosion problem worse. These scenarios demonstrate spatial-temporal dynamic model potential. Among studied climate variables, air temperature is the most sensitive to express climate change effects in Upper Taquari Basin.
35

Thapliyal, Himanshu. "Design, Synthesis and Test of Reversible Circuits for Emerging Nanotechnologies". Scholar Commons, 2011. http://scholarcommons.usf.edu/etd/3379.

Full text
Abstract
Reversible circuits are similar to conventional logic circuits except that they are built from reversible gates. In reversible gates, there is a unique, one-to-one mapping between the inputs and outputs, not the case with conventional logic. Also, reversible gates require constant ancilla inputs for reconfiguration of gate functions and garbage outputs that help in keeping reversibility. Reversible circuits hold promise in futuristic computing technologies like quantum computing, quantum dot cellular automata, DNA computing, optical computing, etc. Thus, it is important to minimize parameters such as ancilla and garbage bits, quantum cost and delay in the design of reversible circuits. The first contribution of this dissertation is the design of a new reversible gate namely the TR gate (Thapliyal-Ranganathan) which has the unique structure that makes it ideal for the realization of arithmetic circuits such as adders, subtractors and comparators, efficient in terms of the parameters such as ancilla and garbage bits, quantum cost and delay. The second contribution is the development of design methodologies and a synthesis framework to synthesize reversible data path functional units, such as binary and BCD adders, subtractors, adder-subtractors and binary comparators. The objective behind the proposed design methodologies is to synthesize arithmetic and logic functional units optimizing key metrics such as ancilla inputs, garbage outputs, quantum cost and delay. A library of reversible gates such as the Fredkin gate, the Toffoli gate, the TR gate, etc. was developed by coding in Verilog for use during synthesis. The third contribution of this dissertation is the set of methodologies for the design of reversible sequential circuits such as reversible latches, flip-flops and shift registers. The reversible designs of asynchronous set/reset D latch and the D flip-flop are attempted for the first time. It is shown that the designs are optimal in terms of number of garbage outputs while exploring the best possible values for quantum cost and delay. The other important contributions of this dissertation are the applications of reversible logic as well as a special class of reversible logic called conservative reversible logic towards concurrent (online) and offline testing of single as well as multiple faults in traditional and reversible nanoscale VLSI circuits, based on emerging nanotechnologies such as QCA, quantum computing, etc. Nanoelectronic devices tend to have high permanent and transient faults and thus are susceptible to high error rates. Specific contributions include (i) concurrently testable sequential circuits for molecular QCA based on reversible logic, (ii) concurrently testable QCA-based FPGA, (iii) design of self checking conservative logic gates for QCA, (iv) concurrent multiple error detection in emerging nanotechnologies using reversible logic, (v) two-vectors, all 0s and all 1s, testable reversible sequential circuits.
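The Toffoli and Fredkin gates named in the abstract are easy to model and check for reversibility in software; the thesis's own TR gate is its contribution and is deliberately not reproduced here.

```python
from itertools import product

# Classical (Boolean) models of two standard reversible gates cited in the abstract.
# The thesis's TR gate is its own contribution and is intentionally not reproduced.

def toffoli(a: int, b: int, c: int):
    """CCNOT: target c flips iff both controls are 1."""
    return a, b, c ^ (a & b)

def fredkin(a: int, b: int, c: int):
    """CSWAP: b and c swap iff control a is 1."""
    return (a, c, b) if a else (a, b, c)

# Reversibility check: the mapping from inputs to outputs must be a bijection.
for gate in (toffoli, fredkin):
    outputs = {gate(*bits) for bits in product((0, 1), repeat=3)}
    assert len(outputs) == 8, f"{gate.__name__} is not reversible"
print("Toffoli and Fredkin are bijections on 3 bits")
```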
36

Antonijevic, Filip. "PGG - Processuell Grottgenerering : En jämförelse mellan Cellulär Automat, Random Walk och Perlin Noise". Thesis, Högskolan i Skövde, Institutionen för informationsteknologi, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-19983.

Full text
Abstract
This work investigated procedural generation with three algorithms, with the aim of creating cave-like levels and evaluating criteria based on desirable properties regarding time, size, variation and reliability. The algorithms are cellular automata, random walk and Perlin noise. Several different helper functions and algorithms were used for the evaluation of the criteria. The purpose of the work was to find out which of these algorithms would be best suited for use in a roguelike game. The conclusion drawn from the study is that the random walk algorithm gave the best results regarding reliability, variation and the smallest number of regions. Cellular automata gave the best results for generation time and the smallest number of floor tiles. Perlin noise gave the least remarkable results, but allowed relatively better control over the amount of floor tiles than both cellular automata and random walk. Overall, random walk gave the best results for the purpose of creating cave-like levels for roguelike games.
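The cellular-automaton cave generator compared in this work is commonly implemented as a random fill followed by a few smoothing passes of a wall-majority rule. The sketch below shows that common variant; the fill ratio, pass count and threshold are illustrative values, not those evaluated in the study.

```python
import random

# Common CA cave-generation variant: random fill, then smoothing passes where a cell
# becomes wall if at least 5 of its 8 neighbours (counting off-grid as wall) are walls.
# Parameter values are illustrative, not those evaluated in the study.

W, H, FILL, PASSES = 60, 30, 0.45, 5
random.seed(7)
grid = [[random.random() < FILL for _ in range(W)] for _ in range(H)]   # True = wall

def wall_neighbours(grid, x, y):
    count = 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dx == 0 and dy == 0:
                continue
            nx, ny = x + dx, y + dy
            if not (0 <= nx < W and 0 <= ny < H) or grid[ny][nx]:
                count += 1                      # off-grid counts as wall
    return count

for _ in range(PASSES):
    grid = [[wall_neighbours(grid, x, y) >= 5 for x in range(W)] for y in range(H)]

print("\n".join("".join("#" if cell else "." for cell in row) for row in grid))
```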
37

Johnston, Matthew W. "Computer Modeling the Incursion Patterns of Marine Invasive Species". NSUWorks, 2015. http://nsuworks.nova.edu/occ_stuetd/33.

Full text
38

Judice, Sicilia Ferreira Ponce Pasini. "Animação de Fluidos via Modelos do Tipo Lattice Gas e Lattice Boltzmann". Laboratório Nacional de Computação Científica, 2009. http://www.lncc.br/tdmc/tde_busca/arquivo.php?codArquivo=181.

Full text
Abstract
Physically-based techniques for the animation of fluids (gas or liquids) have taken the attention of the computer graphics community. The traditional fluid animation methods rely on a top down viewpoint that uses 2D/3D mesh based approaches motivated by the Eulerian methods of Finite Element (FE) and Finite Difference (FD), in conjunction with Navier-Stokes equations of fluids. Alternatively, lattice methods comprised by the Lattice Gas Cellular Automata (LGCA) and Lattice Boltzmann (LBM) can be used. The basic idea behind these methods is that the macroscopic dynamics of a fluid is the result of the collective behavior of many microscopic particles. Such bottom-up approaches need low computational resources for both the memory allocation and the computation itself. In this work, we consider animation of fluids for computer graphics applications, using a LGCA method called FHP, and a LBM method called D2Q9, both bidimensional models. We propose 3D fluid animation techniques based on the FHP and D2Q9 as well as interpolation methods. Then, we present two animating frameworks based on the mentioned lattice methods, one for a real time implementation and the other for an off-line implementation. In the experimental results we emphasize the simplicity and power of the presented models when combined with efficient techniques for rendering and compare their efficiency.
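As an orientation to the D2Q9 scheme mentioned above, the following NumPy sketch performs one BGK collision-and-streaming step; the lattice velocities and weights are the standard D2Q9 ones, while the grid size, relaxation time and initial condition are illustrative assumptions, not those of the thesis:

import numpy as np

# Standard D2Q9 lattice: 9 discrete velocities and their weights.
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def lbm_step(f, tau=0.6):
    """One D2Q9 step: BGK collision towards equilibrium, then streaming."""
    rho = f.sum(axis=0)                       # density per cell
    u = np.einsum('iab,ij->jab', f, c) / rho  # velocity per cell
    cu = np.einsum('ij,jab->iab', c, u)       # c_i . u
    usq = (u ** 2).sum(axis=0)
    feq = w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)
    f = f - (f - feq) / tau                   # BGK collision
    for i, (cx, cy) in enumerate(c):          # streaming with periodic boundaries
        f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)
    return f

# Tiny demo: uniform fluid with a small density bump that relaxes over time.
nx, ny = 32, 32
f = np.ones((9, nx, ny)) * w[:, None, None]
f[:, nx // 2, ny // 2] *= 1.1
for _ in range(100):
    f = lbm_step(f)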
Los estilos APA, Harvard, Vancouver, ISO, etc.
39

Caux, Jonathan. "Parallélisation et optimisation d'un simulateur de morphogénèse d'organes. Application aux éléments du rein". Phd thesis, Université Blaise Pascal - Clermont-Ferrand II, 2012. http://tel.archives-ouvertes.fr/tel-00932303.

Texto completo
Resumen
For several decades, modeling living systems has been a major challenge requiring ever more work in simulation. It opens the door to a whole range of applications: decision support in environmental science and ecology, teaching support, decision support for physicians, support for the search for new pharmaceutical treatments, so-called "predictive" biology, and so on. Before a problem can be addressed, the biological system concerned must be modeled precisely, making explicit the questions the model is expected to answer. Handling and studying complex systems, of which biological systems are the archetype, raises modeling and simulation problems in general. It is in this context that the company Integrative BioComputing (IBC) has been developing, since the early 2000s, a prototype of a Generic Modeling and Simulation Platform (the PGMS), whose goal is to provide an environment for modeling and simulating, more simply, the processes and biological functions of a complete organism together with its organs. Since the PGMS is a generic platform still under development, it did not have the performance needed to model and simulate large elements in sufficiently short times. To improve its performance drastically, it was therefore decided to parallelize and optimize its implementation, the goal being to allow complete organs to be modeled and simulated in acceptable times. The work carried out during this thesis therefore addressed several aspects of the modeling and simulation of biological systems in order to speed up their processing. The most time-consuming computation when running the PGMS, the calculation of physico-chemical fields, was the subject of a feasibility study for its parallelization. Among the architectures available for parallelizing such an application, we chose GPUs (Graphics Processing Units) used for general-purpose computation, commonly known as GPGPU (General-Purpose computation on Graphics Processing Units). This choice was made because of, among other things, the low cost of the hardware and its very large raw computing power, which makes it one of the most accessible parallel architectures on the market. As the results of the feasibility study were particularly conclusive, the parallelization of the field computation was then integrated into the PGMS. In parallel, we also carried out optimization work to improve the sequential performance of the PGMS. The result of this work is a speed-up of a factor of 18.12 on the longest simulations (from 16 minutes for the unoptimized simulation using a single CPU core to 53 seconds for the optimized version, still using a single CPU core but also a GTX500 GPU). The other major aspect of this work was improving the algorithmic performance of three-dimensional cellular-automata simulation, since cellular automata can be used both to simulate biological behaviors and to implement modeling mechanisms such as multi-scale interactions.
The research work focused mainly on original algorithmic proposals to improve the simulations run by IBC on the PGMS. Software acceleration, through a three-dimensional implementation of the Hash-Life algorithm, and parallelization with GPGPU were studied jointly and led to very significant gains in computation time.
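As a rough illustration of the kind of regular, data-parallel kernel that benefits from GPU offloading, here is a vectorized NumPy sketch of one step of a 3D outer-totalistic cellular automaton; the birth/survival thresholds are arbitrary placeholders and this is not the Hash-Life implementation or the rules used in the thesis:

import numpy as np

def step_3d_ca(grid, birth={6}, survive={5, 6, 7}):
    """One synchronous step of a 3D outer-totalistic CA on a periodic lattice.

    Each cell counts its 26 neighbours; the whole update is expressed as
    whole-array operations, the same structure a GPU kernel would exploit.
    """
    neighbours = np.zeros_like(grid, dtype=np.int32)
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                if dx == dy == dz == 0:
                    continue
                neighbours += np.roll(grid, (dx, dy, dz), axis=(0, 1, 2))
    born = (grid == 0) & np.isin(neighbours, list(birth))
    stays = (grid == 1) & np.isin(neighbours, list(survive))
    return (born | stays).astype(grid.dtype)

rng = np.random.default_rng(0)
grid = (rng.random((32, 32, 32)) < 0.3).astype(np.int8)
for _ in range(10):
    grid = step_3d_ca(grid)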
Los estilos APA, Harvard, Vancouver, ISO, etc.
40

Benítez, César Manuel Vargas. "Contributions to the study of the protein folding problem using bioinspired computation and molecular dynamics". Universidade Tecnológica Federal do Paraná, 2015. http://repositorio.utfpr.edu.br/jspui/handle/1/1211.

Texto completo
Resumen
The Protein Folding Problem (PFP) is considered one of the most important open challenges in Biology and Bioinformatics. In this thesis, a novel approach for simulating protein folding pathways is proposed in which, instead of using the three-dimensional structure of the protein, folding states are represented by Contact Maps (CM). A two-dimensional Cellular Automata (2D-CA) evolver is used to simulate the folding process, where each configuration represents a folding state and is obtained from its predecessor and a transition rule. Since finding transition rules that reproduce a given dynamic behavior is a very difficult task, a distributed Gene Expression Programming (GEP) approach, called pGEP-CA, is proposed, together with specific fitness functions based on similarity and symmetry measures. Furthermore, a heterogeneous parallel ecology-inspired algorithm, called pECO, is proposed and used to reconstruct structures from the CMs using the 3D-AB off-lattice model. Moreover, to the best of our knowledge, the first application of Molecular Dynamics (MD) to the PFP using the same protein model is presented. Experiments were carried out to evaluate the adequacy of the proposed approaches, together with a brief analysis of the load balancing of the parallel architectures. The results are coherent, suggesting that the approaches are adequate for the problem: the transition rules induced by pGEP-CA generate 2D-CA that represent CMs correctly, and the pECO approach shows that combining concurrent evolutionary approaches benefits from both the coevolution effect and the different search strategies. In addition, the MD approach is able to display biological features such as hydrophobic-core formation and protein breathing motion. Parallel processing proved not only justified but essential for obtaining results in reasonable processing time. Finally, concluding remarks and several research directions for future work are presented.
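To make the contact-map representation concrete, here is a small NumPy sketch of our own (not code from the thesis) that derives a binary contact map from residue coordinates with a distance threshold; the threshold value is an illustrative assumption:

import numpy as np

def contact_map(coords, threshold=1.1):
    """Binary contact map: residues i and j are 'in contact' if their
    Euclidean distance is below the threshold (diagonal excluded)."""
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    cm = (dist < threshold) & ~np.eye(len(coords), dtype=bool)
    return cm.astype(np.int8)

# Toy chain of 6 residues on a bent path.
coords = np.array([[0, 0, 0], [1, 0, 0], [2, 0, 0],
                   [2, 1, 0], [1, 1, 0], [0, 1, 0]], dtype=float)
print(contact_map(coords))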
Los estilos APA, Harvard, Vancouver, ISO, etc.
41

Hérault, Alexis. "Création d'un système d'information pour la gestion des risques volcaniques". Phd thesis, Université Paris-Est, 2008. http://tel.archives-ouvertes.fr/tel-00470546.

Texto completo
Resumen
Preventing volcanic risk is a major challenge, in particular for Mount Etna, whose frequent eruptions threaten the province of Catania. The physical elements needed to understand the mechanisms involved in a basaltic lava flow are presented. An information system integrating the main aspects of volcanic risk and allowing the creation of risk maps is then proposed. This system includes a model, based on cellular automata and integrating satellite image processing, that makes it possible to simulate the evolution of a flow as well as its discharge rate. The system is integrated into a Geographic Information System and validated on the 2001, 2006 and 2007 eruptions. Finally, to enrich it, we develop a numerical model of the cooling of a lava flow using Smoothed Particle Hydrodynamics; this model, validated on several test cases, is applied to the cooling of a lava lake and of a lava flow. Keywords: volcanic risk, cellular automata, monitoring system, processed information, geographic information system, Smoothed Particle Hydrodynamics
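As a very rough illustration of cellular-automaton lava-flow modeling (a generic, height-driven toy rule of our own, not the model developed in the thesis), each cell passes a fraction of its lava to lower-lying neighbors at every step:

import numpy as np

def lava_step(thickness, ground, outflow=0.2):
    """Toy CA spreading rule: each cell sends an equal fraction of its lava
    to every von Neumann neighbour whose total height (ground + lava) is lower."""
    total = ground + thickness
    new = thickness.copy()
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        neighbour_total = np.roll(total, (dx, dy), axis=(0, 1))
        lower = total > neighbour_total                 # that neighbour is downhill
        flow = np.where(lower, outflow * thickness / 4.0, 0.0)
        new -= flow                                     # lava leaving this cell
        new += np.roll(flow, (-dx, -dy), axis=(0, 1))   # lava arriving downhill
    return new

# Toy example: a cone-shaped slope with lava emitted at the summit.
n = 41
y, x = np.mgrid[0:n, 0:n]
ground = -0.1 * np.hypot(x - n // 2, y - n // 2)
thickness = np.zeros((n, n))
for _ in range(200):
    thickness[n // 2, n // 2] += 1.0                    # vent keeps erupting
    thickness = lava_step(thickness, ground)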
Los estilos APA, Harvard, Vancouver, ISO, etc.
42

Graf, Brolund Alice. "Compartmental Models in Social Dynamics". Thesis, Uppsala universitet, Avdelningen för systemteknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-448163.

Texto completo
Resumen
The dynamics of many aspects of social behaviour, such as spread of fads and fashion, collective action, group decision-making, homophily and disagreement, have been captured by mathematical models. The power of these models is that they can provide novel insight into the emergent dynamics of groups, e.g. 'epidemics' of memes, tipping points for collective action, wisdom of crowds and leadership by small numbers of individuals, segregation and polarisation. A current weakness in the scientific models is their sheer number. 'New' models are continually 'discovered' by physicists, engineers and mathematicians. The models are analysed mathematically, but very seldom provide predictions that can be tested empirically. In this work, we provide a framework of simple models, based on Lotka's original idea of using chemical reactions to describe social interactions. We show how to formulate models for social epidemics, social recovery, cycles, collective action, group decision-making, segregation and polarisation, which we argue encompass the majority of social dynamics models. We present an open-access tool, written in Python, for specifying social interactions, studying them in terms of mass action, and creating spatial simulations of model dynamics. We argue that the models in this article provide a baseline of empirically testable predictions arising from social dynamics, and that before creating new and more complicated versions of the same idea, researchers should explain how their model differs substantially from our baseline models.
Mathematical models can help us understand many kinds of social phenomena, such as the spread of rumours and memes, group decision-making, segregation and radicalisation. There are today countless models of social behaviour in humans and animals, and more are presented continually. The large number of models makes it hard to navigate the research field, and many of the models are moreover complicated and difficult to verify experimentally. This work proposes a framework of basic models, each of which captures one aspect of social behaviour: social epidemics, cycles, collective action, group decision-making, segregation and polarisation. We argue that these models cover the majority of the verifiable aspects of social behaviour that are studied, and that they should be treated as a baseline when a new model is introduced: which of the baseline mechanisms does the new model contain, and does it differ substantially from the baseline at all? By understanding the basic models well, and by explaining how a new model differs from them, researchers can avoid presenting models that are in practice only more complicated variants of existing ones. This work shows how these basic models can be formulated and studied. The models are built on simple rules about what happens when individuals in a population meet; for example, if a person who knows a rumour meets someone who does not, the rumour can spread. Assumptions about who can meet whom therefore strongly affect the models' results. Each model is studied with two methods: in one, everyone in the population is equally likely to meet anyone else; in the other, the population is represented by a grid where each site corresponds to an individual, so every person has a limited number of neighbours to interact with. The choice between these two methods matters greatly for the behaviours the models predict. As a complement to this work, a tool in the form of a Python program that performs the analysis of the models is provided. It can be used to explore the baseline models presented here, but also to formulate and analyse new models in the same way, so that new models can easily be compared against the baseline. The tool is useful both as an introduction for newcomers to social dynamics and for researchers who want to develop new models and advance the field.
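To illustrate the mass-action style of model described above, here is a minimal sketch of our own in plain Python (not the thesis's open-access tool): an SIR-type "rumour epidemic" integrated with a simple Euler scheme.

import numpy as np

def rumour_epidemic(beta=0.3, recovery=0.05, s0=0.99, i0=0.01,
                    dt=0.1, steps=1000):
    """Mass-action 'social epidemic': S + I -> 2I at rate beta,
    I -> R (loses interest) at rate `recovery`. Euler integration."""
    s, i, r = s0, i0, 0.0
    history = []
    for _ in range(steps):
        new_infections = beta * s * i * dt
        new_recoveries = recovery * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return np.array(history)

traj = rumour_epidemic()
print("fraction ever reached by the rumour:", 1 - traj[-1, 0])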
Los estilos APA, Harvard, Vancouver, ISO, etc.
43

Bouré, Olivier. "" Le simple est-il robuste ? " : une étude de la robustesse des systèmes complexes par les automates cellulaires". Phd thesis, Université de Lorraine, 2013. http://tel.archives-ouvertes.fr/tel-00918545.

Texto completo
Resumen
In this thesis, we study robustness in the context of modeling complex systems with cellular automata. Indeed, if one seeks to reproduce an emergent behavior with a cellular-automaton model, it seems necessary to ask whether the observed behaviors are really the result of interactions between the constituent entities, or whether they depend on a particular definition of the model. We are thus led to consider the robustness of the model, namely the resistance of its behavior to small variations in the attributes of its definition. First, we show the relevance of this approach by considering several possible definitions of a perturbation of the global update and applying them to a simple and representative class of cellular-automaton models, the Elementary Cellular Automata. We observe that, although the perturbations are close to one another and the majority of the models considered do not change behavior, a few particular cases show qualitative changes of behavior, which we study in more detail. Second, we apply this approach to a particular cellular-automaton model that simulates swarm formation, an evolved cellular-automaton model known as a lattice gas. We explore the robustness of the model's behavior under perturbations of two attributes of the model, the shape of the cellular grid and the global update, and draw conclusions about the relationship between the observed behavior and the precise definition of the model.
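A minimal sketch of our own of the kind of update perturbation discussed above: an elementary cellular automaton stepped either synchronously or with an alpha-asynchronous scheme in which each cell is updated independently with probability alpha; the rule number and the value of alpha are arbitrary examples.

import random

def eca_step(cells, rule=110, alpha=1.0, rng=random):
    """One step of an elementary CA on a ring.

    alpha = 1.0 gives the usual synchronous update; alpha < 1.0 updates each
    cell independently with probability alpha (alpha-asynchronous scheme)."""
    n = len(cells)
    table = [(rule >> i) & 1 for i in range(8)]
    new = cells[:]
    for i in range(n):
        if rng.random() <= alpha:
            idx = 4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n]
            new[i] = table[idx]
    return new

random.seed(0)
cells = [random.randint(0, 1) for _ in range(64)]
for _ in range(32):
    print("".join("#" if c else "." for c in cells))
    cells = eca_step(cells, rule=110, alpha=0.9)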
Los estilos APA, Harvard, Vancouver, ISO, etc.
44

Ben, amor Mohamed hedi. "Méthodes numériques et formelles pour l'ingénierie des réseaux biologiques : traitement de l'information par des populations d'oscillateurs. Approches par contraintes et Taxonomie des réseaux biologiques". Phd thesis, Université de Grenoble, 2012. http://tel.archives-ouvertes.fr/tel-00778673.

Texto completo
Resumen
This thesis concerns the engineering of complex systems from a desired dynamics. In particular, we are interested in populations of oscillators and in gene regulatory networks. In the first part, we build on a hypothesis, introduced in neuroscience, that emphasizes the role of neuronal synchronization in the processing of cognitive information, and we propose to use it more broadly to study information processing by populations of oscillators. We discuss the isochrons of several oscillators classified according to their symmetries in state space, which gives a qualitative criterion for choosing an oscillator. We then define procedures for writing, reading and reorganizing information on a population of oscillators. As a perspective, we propose a layered system of Wilson-Cowan oscillators, which suitably juxtaposes synchronization and desynchronization through the use of two forms of coupling: a continuous coupling and a pulse coupling. We conclude this part by proposing an application of this system: edge detection in an image. In the second part, we propose a constraint-based approach to identify gene regulatory networks from partial knowledge of their dynamics and structure. The formalism we use is known as threshold Boolean automata networks, or Hopfield-like networks. We apply this method to determine the regulatory network of floral morphogenesis in Arabidopsis thaliana, and we show that the solution is not unique within the set of valid models (here, 532 models). We show the potential of this approach for determining and classifying models of gene regulatory networks. Overall, this work leads to a number of applications, in particular in the development of new information-storage methods and in the design of unconventional computing systems.
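To illustrate the threshold Boolean automata (Hopfield-like) formalism mentioned above, here is a generic sketch of our own; the weights and thresholds are arbitrary and unrelated to the Arabidopsis thaliana network identified in the thesis.

import numpy as np
from itertools import product

def boolean_threshold_step(state, weights, thresholds):
    """Synchronous update of a threshold Boolean automata network:
    node i becomes 1 iff the weighted sum of its inputs exceeds its threshold."""
    return (weights @ state > thresholds).astype(int)

# Arbitrary 3-node network: mutual activation between nodes 0 and 1, inhibition by node 2.
W = np.array([[0,  1, -1],
              [1,  0, -1],
              [0,  0,  0]])
theta = np.array([0, 0, 0])

# Enumerate the fixed points of the synchronous dynamics.
for bits in product((0, 1), repeat=3):
    s = np.array(bits)
    if np.array_equal(boolean_threshold_step(s, W, theta), s):
        print("fixed point:", s)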
Los estilos APA, Harvard, Vancouver, ISO, etc.
45

Marcovici, Irène. "Automates cellulaires probabilistes et mesures spécifiques sur des espaces symboliques". Phd thesis, Université Paris-Diderot - Paris VII, 2013. http://tel.archives-ouvertes.fr/tel-00933977.

Texto completo
Resumen
A probabilistic cellular automaton (PCA) is a Markov chain on a symbolic space. Time is discrete, the cells evolve synchronously, and the new state of each cell is chosen at random, independently of the other cells, according to a distribution determined by the states of a finite number of cells in its neighborhood. PCA are used in computer science as a model of computation, as well as in biology and physics; they also appear in various contexts in probability theory and combinatorics. A PCA is ergodic if it has a unique invariant measure that is attractive. We prove that for deterministic CA, ergodicity is equivalent to nilpotency, which provides a new proof of the undecidability of ergodicity for PCA. While the invariant measure of an ergodic CA is trivial, the invariant measure of an ergodic PCA can be very complex; we propose an algorithm to sample this measure perfectly. We then consider specific families of PCA having Bernoulli or Markov invariant measures and study the properties of their space-time diagrams. We solve the density classification problem on grids of dimension two and higher and on trees. Finally, we turn to other types of problems: we give a combinatorial characterization of limit measures for random walks on free products of groups, and we study measures of maximal entropy of subshifts of finite type on lattices and on trees, where PCA again play a role.
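A minimal sketch of our own of a probabilistic cellular automaton: each cell applies a majority rule over its three-cell neighborhood and then, independently, makes an error with small probability epsilon. This is a standard toy PCA, not one of the specific families studied in the thesis.

import random

def pca_step(cells, epsilon=0.05, rng=random):
    """One step of a noisy-majority PCA on a ring: each cell takes the
    majority of {left, self, right}, then flips with probability epsilon,
    independently of all other cells."""
    n = len(cells)
    new = []
    for i in range(n):
        majority = int(cells[(i - 1) % n] + cells[i] + cells[(i + 1) % n] >= 2)
        new.append(majority ^ (rng.random() < epsilon))
    return new

random.seed(1)
cells = [int(random.random() < 0.6) for _ in range(200)]  # initial density 0.6
for _ in range(100):
    cells = pca_step(cells)
print("final density of 1s:", sum(cells) / len(cells))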
Los estilos APA, Harvard, Vancouver, ISO, etc.
46

Berthelot, Geoffroy. "L'expansion phénotypique et ses limites". Phd thesis, Université René Descartes - Paris V, 2013. http://tel.archives-ouvertes.fr/tel-00917998.

Texto completo
Resumen
The future development of sporting performance is a subject of myth and of disagreement among experts. An article published in 2004 gave rise to a lively academic debate: it suggests that linear models can be used to predict human performance in sprint events over the long term. Arguments for and against this methodology have been put forward by various scientists, and other work has shown that the development of performance over the past century has been non-linear. Another study also pointed out that performance is linked to the economic and geopolitical context. In this work we studied the following frontiers: the development of performance over time in Olympic and non-Olympic disciplines, and with aging in humans and other species (greyhounds, thoroughbreds, mice). We also studied the development of performance from a broader point of view by analyzing the relationship between performance, lifespan and primary energy consumption. We show that these physiological developments are limited in time and that the previously introduced linear models are poor predictors of the biological and physiological phenomena studied. Three main direct factors of sporting performance are age, technology and climatic conditions (temperature). However, all the observed developments are linked to the international context and to the use of primary energy, the latter being an indirect parameter of the development of performance. We show that when indicators of physiological and societal performance, such as lifespan and population density, depend on primary energy, the energy source, inter-individual competition and mobility are parameters that favor sustainable long-term trajectories. Otherwise, the vast majority (98.7%) of the trajectories studied reach a population density of zero within 15 generations, owing to the degradation of environmental conditions and a low mobility rate. This led us to consider that, in the present turbulent economic context and given the coming energy crisis, societal and physical performance should not be expected to grow indefinitely.
Los estilos APA, Harvard, Vancouver, ISO, etc.
47

Konecny, Filip. "Vérification relationnelle pour des programmes avec des données entières". Phd thesis, Université de Grenoble, 2012. http://tel.archives-ouvertes.fr/tel-00805599.

Texto completo
Resumen
The work presented in this thesis concerns the verification of reachability and termination for programs that manipulate unbounded integer data. We describe a new verification method based on a loop-acceleration technique that computes, exactly, the transitive closure of an arithmetic relation. First, we introduce a loop-acceleration algorithm that can compute, in a few seconds, transitive closures for relations over on the order of a hundred variables. We then present a reachability-analysis method that manipulates relations between the integer variables of a program and applies acceleration to compute the input-output relations of procedures in a modular way. An alternative approach to reachability analysis, also presented in this thesis, integrates acceleration with predicate abstraction in order to address the divergence problem of the latter. Both methods have been evaluated in practice on a large number of examples that were, until now, beyond the reach of existing analysis tools. Finally, we studied the termination problem for certain classes of program loops and proved decidability for the relations considered. For these classes of arithmetic relations, we present an algorithm that runs in at most polynomial time and computes the set of states from which an infinite execution may start. We then integrated this algorithm into a termination-analysis method for programs manipulating integer data.
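A toy illustration of our own of what loop acceleration means for a single counter, far simpler than the relations handled in the thesis: for the loop "while x < N: x += c" the k-fold composition has the closed form x_k = x0 + k*c, so the exact set of reachable values can be computed without unrolling the loop.

def accelerate_counter_loop(x0, c, guard_bound):
    """Acceleration of the loop `while x < guard_bound: x += c` (with c > 0).

    Instead of iterating, use the closed form x_k = x0 + k*c and solve the
    guard for the number of iterations, returning the exact set of reachable
    values as a Python range object."""
    if c <= 0:
        raise ValueError("this toy only accelerates strictly increasing counters")
    if x0 >= guard_bound:
        return range(x0, x0 + 1)             # the loop body never executes
    # Smallest k with x0 + k*c >= guard_bound: the loop exits after k_max steps.
    k_max = -((x0 - guard_bound) // c)       # ceil((guard_bound - x0) / c)
    return range(x0, x0 + k_max * c + 1, c)  # all reachable values x0, x0+c, ...

reachable = accelerate_counter_loop(x0=0, c=3, guard_bound=10)
assert list(reachable) == [0, 3, 6, 9, 12]   # matches a brute-force simulation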
Los estilos APA, Harvard, Vancouver, ISO, etc.
48

Ngô, Van Chan. "Formal verification of a synchronous data-flow compiler : from Signal to C". Phd thesis, Université Rennes 1, 2014. http://tel.archives-ouvertes.fr/tel-01067477.

Texto completo
Resumen
Synchronous languages such as Signal, Lustre and Esterel are dedicated to the design of safety-critical systems. Their compilers are large, complicated programs that may be incorrect in some contexts and may then silently produce bad compiled code from a correct source program. Such faulty compiled code can invalidate the safety properties that were guaranteed on the source program by formal methods. Adopting the translation validation approach, this thesis aims at formally proving the correctness of the highly optimizing, industrial Signal compiler. The correctness proof represents both the source program and the compiled code in a common semantic framework, then formalizes a relation between the source program and its compiled code expressing that the semantics of the source program is preserved in the compiled code.
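As a crude illustration of the translation-validation idea (a sketch of our own, vastly simplified compared to validating a Signal-to-C compiler, and degenerating here to exhaustive checking over a small finite domain rather than a formal proof), a validator compares the observable behavior of a hypothetical source node and its hypothetical compiled step function:

from itertools import product

# Hypothetical 'source' semantics: a synchronous node computing a running sum
# that resets when the boolean input r is true.
def source_step(state, inputs):
    x, r = inputs
    acc = 0 if r else state + x
    return acc, acc                       # (next state, output)

# Hypothetical 'compiled' step function, e.g. produced by a code generator.
def compiled_step(state, inputs):
    x, r = inputs
    acc = state + x
    if r:
        acc = 0
    return acc, acc

def validate(src, tgt, states, inputs):
    """Check that source and compiled code produce identical observable
    behaviour (output and next state) for every state/input pair considered."""
    return all(src(s, i) == tgt(s, i) for s, i in product(states, inputs))

states = range(-8, 9)
inputs = [(x, r) for x in range(-4, 5) for r in (False, True)]
print("validated on the finite domain:", validate(source_step, compiled_step, states, inputs))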
Los estilos APA, Harvard, Vancouver, ISO, etc.
49

James, George R. "Predicting the spatial pattern of urban growth in Honolulu county using the cellular automata SLEUTH urban growth model". Thesis, 2005. http://hdl.handle.net/10125/11626.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
50

Perez, Delgado Carlos Antonio. "Quantum Cellular Automata: Theory and Applications". Thesis, 2007. http://hdl.handle.net/10012/3316.

Texto completo
Resumen
This thesis presents a model of Quantum Cellular Automata (QCA). The presented formalism is a natural quantization of the classical Cellular Automata (CA). It is based on a lattice of qudits, and an update rule consisting of local unitary operators that commute with their own lattice translations. One purpose of this model is to act as a theoretical model of quantum computation, similar to the quantum circuit model. The main advantage that QCA have over quantum circuits is that QCA make considerably fewer demands on the underlying hardware. In particular, as opposed to direct implementations of quantum circuits, the global evolution of the lattice in the QCA model does not assume independent control over individual qudits. Rather, all qudits are to be addressed collectively in parallel. The QCA model is also shown to be an appropriate abstraction for space-homogeneous quantum phenomena, such as quantum lattice gases, spin chains and others. Some results that show the benefits of basing the model on local unitary operators are shown: computational universality, strong connections to the circuit model, simple implementation on quantum hardware, and a series of applications. A detailed discussion will be given on one particular application of QCA that lies outside either computation or simulation: single-spin measurement. This algorithm uses the techniques developed in this thesis to achieve a result normally considered hard in physics. It serves well as an example of why QCA are interesting in their own right.
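As a small illustration of the collective, translation-invariant addressing described above, here is a NumPy sketch of our own of a one-dimensional block-partitioned QCA step: the same two-qubit unitary is applied to all even pairs and then to all odd pairs, with no per-qubit control. The "partial swap" unitary and the chain length are illustrative assumptions, not constructions from the thesis.

import numpy as np

def apply_two_qubit(state, U, i, j, n):
    """Apply the 2-qubit unitary U (4x4) to qubits i and j of an n-qubit state."""
    psi = state.reshape([2] * n)
    U4 = U.reshape(2, 2, 2, 2)
    # Contract U over qubits i and j, then put the resulting axes back in place.
    psi = np.tensordot(U4, psi, axes=([2, 3], [i, j]))
    psi = np.moveaxis(psi, [0, 1], [i, j])
    return psi.reshape(-1)

def qca_step(state, U, n):
    """One step of a 1D block-partitioned QCA: the same local unitary is
    applied to all even pairs, then to all odd pairs (no per-qubit control)."""
    for start in (0, 1):
        for i in range(start, n - 1, 2):
            state = apply_two_qubit(state, U, i, i + 1, n)
    return state

# Example local rule: a 'partial swap' unitary (an assumption, not from the thesis).
theta = np.pi / 5
SWAP = np.array([[1, 0, 0, 0], [0, 0, 1, 0], [0, 1, 0, 0], [0, 0, 0, 1]])
U = np.cos(theta) * np.eye(4) + 1j * np.sin(theta) * SWAP

n = 6
state = np.zeros(2 ** n, dtype=complex)
state[1 << (n - 1)] = 1.0                 # single excitation on qubit 0 (big-endian)
for _ in range(3):
    state = qca_step(state, U, n)
print("norm preserved:", np.isclose(np.linalg.norm(state), 1.0))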
Los estilos APA, Harvard, Vancouver, ISO, etc.