Dissertations / Theses on the topic 'Simulations EM'
Schleder, Gabriel Ravanhani. "Intrinsic self-healing nanocomposites: computational simulations." Repositório Institucional da UFABC, 2017.
Master's dissertation - Universidade Federal do ABC, Programa de Pós-Graduação em Nanociências e Materiais Avançados, 2017.
A structure that can self-heal under ambient conditions is a current challenge and one of the most promising areas in smart materials science. The present project applies theoretical methods to the study of the structural and functional properties of intrinsically self-healing nanocomposites, allowing improved design strategies for novel materials. The simulations are based on Density Functional Theory (DFT). We studied the isolated components that constitute the functional nanocomposite network: diarylbibenzofuranone (DABBF), SHP, and nickel (oxide) nanoparticles. Comparing DABBF bond formation with the arylbenzofuranone (ABF) + O2 reaction (autoxidation), we see that the barrierless DABBF bond formation is preferred over autoxidation because a charge-transfer process results in a weakly bound superoxide. We also performed a systematic ab initio study of the reaction of Ni13 clusters with O2 molecules, dynamically evaluating the effect on their structural, electronic, and magnetic properties and elucidating the mechanism of oxygen chemisorption (the first stage of oxidation). Finally, we studied the interactions between SHP oligomers and the nanoparticles, leading to the self-healing nanocomposite. As future work, we suggest simulating the interactions between all these materials in the self-healing nanocomposite through a multiscale approach combining DFT and molecular dynamics (MD) methods.
Poggi, Davide. "Simulating and interpreting EM side-channel attacks at chip level prior to fabrication." Electronic thesis, Université de Montpellier, 2022. http://www.theses.fr/2022UMONS006.
In the last decades, side-channel attacks (SCA) have demonstrated how dangerous they are for retrieving sensitive data from ICs. Among these attacks, those exploiting the EM radiation of ICs are particularly efficient. Indeed, adversaries need to find only one hotspot (a position of the EM probe over the IC surface) with an exploitable leakage to compromise the security of the circuit. As a result, designing secure ICs that are robust against these attacks is incredibly difficult, because designers must guarantee that there is no hotspot over the whole IC surface. This task is all the more difficult as there is no CAD tool to verify the robustness of ICs against EM SCA at the design stage, i.e. prior to fabrication. In this thesis, a simulation flow that reproduces EM SCA is proposed. The Biot-Savart law is used to model the magnetic field radiated by entire ICs, and an innovative methodology, called Noise-to-Add, is introduced. The latter overcomes the absence of noise in simulations and allows simulated correlation attack results to be interpreted correctly.
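A minimal, self-contained sketch of the two ingredients named in the abstract, under stated assumptions: the field radiated by a current segment is obtained from the Biot-Savart law, and a Pearson-correlation attack is run on the resulting synthetic traces. The wire geometry, probe grid and Hamming-weight leakage model are illustrative assumptions, not the thesis' actual flow or its Noise-to-Add methodology.

```python
# Illustrative sketch only (assumed geometry and leakage model, not the thesis' flow):
# magnetic field of a current segment via the Biot-Savart law, then a Pearson
# correlation attack on the simulated "EM traces".
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (T*m/A)

def biot_savart(points, seg_start, seg_end, current=1.0, n_steps=200):
    """Field (T) at each observation point from one straight segment, integrated
    numerically as the sum of mu0/(4*pi) * I * dl x r_hat / r^2 over small elements."""
    ts = np.linspace(0.0, 1.0, n_steps + 1)
    mids = seg_start + np.outer((ts[:-1] + ts[1:]) / 2.0, seg_end - seg_start)
    dl = (seg_end - seg_start) / n_steps
    B = np.zeros_like(points, dtype=float)
    for m in mids:
        r = points - m                                    # element-to-probe vectors
        r_norm = np.linalg.norm(r, axis=1, keepdims=True)
        B += MU0 / (4 * np.pi) * current * np.cross(dl, r) / r_norm**3
    return B

rng = np.random.default_rng(0)
# Hypothetical layout: one 1 mm power rail, probes on a grid 100 um above it.
seg_a, seg_b = np.array([0.0, 0.0, 0.0]), np.array([1e-3, 0.0, 0.0])
xs, ys = np.meshgrid(np.linspace(-5e-4, 1.5e-3, 8), np.linspace(-5e-4, 5e-4, 8))
probes = np.column_stack([xs.ravel(), ys.ravel(), np.full(xs.size, 1e-4)])
field = np.linalg.norm(biot_savart(probes, seg_a, seg_b), axis=1)   # per-probe field for 1 A

def hamming_weight(x):
    return bin(int(x)).count("1")

# Assumed leakage: the rail current scales with the Hamming weight of a key-dependent
# intermediate value; Gaussian noise stands in for the measurement noise that real
# attacks see and that simulations lack.
secret_key = 0x3A
plaintexts = rng.integers(0, 256, size=500)
hw = np.array([hamming_weight(p ^ secret_key) for p in plaintexts], dtype=float)
traces = np.outer(hw, field)
traces += rng.normal(0.0, 0.3 * field.std() * hw.std(), traces.shape)

def best_guess(traces, plaintexts):
    """Return the key byte whose predicted leakage correlates best at some probe position."""
    scores = []
    for guess in range(256):
        pred = np.array([hamming_weight(p ^ guess) for p in plaintexts], dtype=float)
        corr = [abs(np.corrcoef(pred, traces[:, j])[0, 1]) for j in range(traces.shape[1])]
        scores.append(max(corr))
    return int(np.argmax(scores))

print("recovered key guess:", hex(best_guess(traces, plaintexts)))
```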
Li, Zhiyong. "Data-Driven Adaptive Reynolds-Averaged Navier-Stokes k - ω Models for Turbulent Flow-Field Simulations." UKnowledge, 2017. http://uknowledge.uky.edu/me_etds/93.
Grigoletti, Pablo Souza. "Uma arquitetura baseada em sistemas multiagentes para simulações em geoprocessamento." Biblioteca Digital de Teses e Dissertações da UFRGS, 2007. http://hdl.handle.net/10183/10365.
This work is situated at the intersection of two areas: Multi-Agent Systems and Geocomputing. In Multi-Agent Systems, a system is specified by modelling the environment, the agents, and the mechanisms of interaction and organization of the agents. Although it is a very important aspect, the modelling and representation of the environment has not yet been fully explored. On the other hand, Geocomputing has always focused on the static representation of spatial phenomena. However, some spatial phenomena are dynamic in time and space, and the usual representations adopted in Geocomputing do not capture them well. The aim of this work is to provide an architecture, based on Multi-Agent Systems, for the development and execution of Geocomputing simulations. A very important feature of the proposed architecture is the use of vectorial data from a Geographic Database to represent the environment and the spatial entities that exist in it. Using this continuous and accurate form of representation, it is possible to develop more realistic models, representing geographic features appropriately. Moreover, this architecture allows the representation of phenomena that are dynamic in time and space, a long-standing need of the Geocomputing area. To guide the development of the features of this architecture, an analysis of related work was carried out, and the architecture was developed based on the results of this analysis. A prototype is also presented, with which case studies of different scenarios were performed in order to evaluate and demonstrate the developed model and its features.
Mutisya, Sylvia Mueni. "Atomistic simulations of water confined in cement." Repositório Institucional da UFABC, 2018.
Doctoral thesis - Universidade Federal do ABC, Programa de Pós-Graduação em Nanociências e Materiais Avançados, Santo André, 2018.
Cement paste is a complex, heterogeneous and porous material with outstanding properties that make it a suitable binder for construction applications. The quality and durability of cement have a strong relationship with the water contained in it. While a good understanding of the complex interactions between the cement pores and the confined fluids is necessary to solve the current durability issues, no single model has so far been successful in capturing all the processes, and the role of water in cement remains an area of active research. To contribute to the current understanding of cement structure, this work focuses on the dynamical processes happening at the nanoscale for water confined in cement pores. We employ atomistic simulations ranging from first principles to molecular dynamics to study the main hydration phase, i.e., calcium silicate hydrate (C-S-H), modeled by a defected tobermorite structure. Starting from first principles, the surface morphology of tobermorite, which governs the interactions of water with the C-S-H model, was investigated. It was shown that tobermorite forms pseudohexagonal crystals and that the most stable facet is the (004). To upscale the calculations, the transferability of CLAYFF in the description of cementitious materials was tested through a comparative molecular mechanics and DFT study. Although the frequencies calculated with DFT and CLAYFF differ, the structures and thermodynamic quantities agree quite well, making CLAYFF a reasonable potential for our cement calculations. An important parameter to quantify water dynamics in nanoconfinement is the NMR T2 relaxation time. To underpin the implemented methodology for the theoretical determination of the T2 relaxation time, simulations were performed for water confined within calcite nanopores (1-6 nm). It was revealed that translational dynamics are the main contribution to the spin relaxation of near-surface water molecules. The T2 relaxation time for water molecules directly adsorbed on the surface is short and pore-size independent; however, a bulk-like spin relaxation is observed at the center of pores larger than 3 nm. To disentangle the diverse properties of water nanoconfined in C-S-H pores, the study of water within C-S-H was subdivided into three parts. The first part was dedicated to understanding the relative effect of the C-S-H surface and of progressive confinement on water confined in the interlayer (< 1 nm) and gel pores (1-5 nm) of the C-S-H model. The hydrophilic nature of the C-S-H surface is evidenced by the slow dynamics of the water interacting directly with the surface. Within the interlayer, water is highly immobile and exhibits properties similar to those observed in supercooled water. Next, transport within cement paste was investigated by implementing 2D T2-T2 NMR exchange relaxation between connected 1 nm and 4 nm gel pores. We showed that there is water exchange between gel pores, quantified by an exchange time of 18 ns. Lastly, the particulate nature of C-S-H is taken into consideration to help bridge the gap between the atomistic and the mesoscale understanding of cement. It was shown that water confined within the C-S-H edge-edge environment displays dynamics similar to those at the C-S-H planar surfaces.
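The pore-size dependence of T2 summarized above is what the standard fast-exchange (Brownstein-Tarr) picture predicts: 1/T2 = 1/T2_bulk + rho2 * (S/V), where rho2 is the surface relaxivity and S/V the surface-to-volume ratio of the pore. The snippet below only illustrates that relation for slit-like pores with made-up parameter values; it is not the relaxation machinery actually implemented in the thesis.

```python
# Fast-exchange (Brownstein-Tarr) estimate of T2 versus pore size for slit-like pores:
# 1/T2 = 1/T2_bulk + rho2 * (S/V). The surface relaxivity rho2 and bulk T2 used here
# are generic illustrative values, not quantities fitted to the thesis' simulations.
t2_bulk = 2.0     # bulk water T2 (s)
rho2 = 3e-6       # surface relaxivity (m/s)

for width_nm in (1, 2, 3, 4, 6):
    s_over_v = 2.0 / (width_nm * 1e-9)          # surface-to-volume ratio of a slit pore (1/m)
    t2 = 1.0 / (1.0 / t2_bulk + rho2 * s_over_v)
    print(f"{width_nm} nm pore: T2 ~ {t2 * 1e3:.2f} ms")
```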
Palladino, Fabio Henrique. "Reconstrução 3D de imagens em tomografia por emissão de pósitrons com Câmaras de Cintilação." Universidade de São Paulo, 2004. http://www.teses.usp.br/teses/disponiveis/43/43134/tde-07032014-160312/.
Volumetric reconstruction in gamma-camera-based PET imaging. Positron Emission Tomography (PET) is a very useful tool for diagnosing and following several diseases in Oncology, Neurology and Cardiology. Two types of systems are available for this imaging modality: dedicated systems and those based on gamma camera technology. In this work, we assessed a number of factors affecting quantitation in gamma-camera-based PET imaging, which is characterized by a lower sensitivity compared to dedicated systems. We also evaluated image quantitation under 2D and 3D acquisition/reconstruction modes, for different reconstruction methods and associated corrections. Acquisition data were simulated by the Monte Carlo method, using realistic parameters, and several objects of interest were modelled. We reconstructed slices and volumes using FBP, ART, MLEM and OSEM and also included four corrections: detector sensitivity, detector normalization, scatter and attenuation of annihilation photons. We proposed a method to assess detectability and object contrast recovery using two measurable parameters; visual analysis was also considered. We found that the 3D mode is more effective than 2D for low-contrast recovery when the selected corrections are applied. Detectability of small structures is limited by partial volume effects and by the finite spatial resolution of the device. ART, MLEM and especially 8-subset OSEM are the most adequate methods for quantitative studies in 3D mode. The parameters we have defined may also be used as indicators of suitable conditions for quantitation in images.
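As a concrete reminder of what the iterative methods cited above do, the sketch below runs the multiplicative MLEM update x <- x * A^T(y / (A x)) / A^T(1) on a toy one-dimensional problem. The system matrix, activity map and iteration count are made-up placeholders; the Monte Carlo data, corrections and OSEM subset scheme of the thesis are not reproduced.

```python
# Minimal sketch of the MLEM update used in emission tomography (toy problem only).
import numpy as np

rng = np.random.default_rng(1)
n_pix, n_bins = 32, 48
A = rng.random((n_bins, n_pix))          # hypothetical system matrix (detection probabilities)
x_true = np.zeros(n_pix)
x_true[10:14] = 50.0                     # toy activity map with two hot regions
x_true[22] = 80.0
y = rng.poisson(A @ x_true)              # noisy projection data

x = np.ones(n_pix)                       # MLEM must start from a positive estimate
sens = A.sum(axis=0)                     # sensitivity (sum of system matrix per pixel)
for _ in range(100):
    expected = A @ x
    expected[expected == 0] = 1e-12      # avoid division by zero
    x *= (A.T @ (y / expected)) / sens   # multiplicative MLEM update

print(np.round(x[8:16], 1))              # activity recovered around the first hot region
```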
Depetris, Alessandro. "Desenvolvimento e aplicação de um programa em MatLab Simulink para a simulação do desempenho de veículos rodoviários comerciais em movimento acelerado." Universidade de São Paulo, 2005. http://www.teses.usp.br/teses/disponiveis/3/3149/tde-31102005-093516/.
This work presents the development and application of a simulation program, built in Matlab/Simulink 6.0 as an automotive engineering tool, for studying the performance of commercial vehicles in accelerated and uniform motion. The program takes into account the engine maps of specific fuel consumption and torque curves, as well as the transmission system between the engine and the driven wheels, and it interfaces with another simulation program developed in the EESC-USP Vehicle Computation Laboratory. The simulations used data from a representative commercial vehicle subjected to various situations on level and inclined roads, from which relevant quantities such as average and instantaneous fuel consumption were obtained for each scenario. The work also reports operational data such as engine speed, distance covered, longitudinal velocities and accelerations, adhesion and relative slip of the wheels, among others. The simulation program shows great potential for use in the optimization, development and design phases of commercial vehicles, saving time and financial resources, and also for specifying the most suitable vehicle for a client, considering the specific conditions of its use.
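At its core, a performance model of this kind integrates the longitudinal equation of motion, m*dv/dt = F_traction - F_aero - F_rolling - F_grade. The sketch below does that with a constant tractive force and generic parameter values; the actual program uses engine torque maps and a transmission model, which are not reproduced here.

```python
# Generic longitudinal-dynamics sketch (made-up parameters, not the Matlab/Simulink
# program of the thesis): m*dv/dt = F_traction - F_aero - F_rolling - F_grade.
import math

m, g = 15000.0, 9.81          # vehicle mass (kg), gravity (m/s^2)
cd, area, rho = 0.7, 8.0, 1.2 # drag coefficient, frontal area (m^2), air density (kg/m^3)
crr, grade = 0.008, 0.03      # rolling resistance coefficient, road grade angle (rad)
f_trac = 20000.0              # assumed constant tractive force (N) for the sketch

v, t, dt = 0.0, 0.0, 0.05
while v < 20.0 and t < 300.0:                     # accelerate to 20 m/s (72 km/h)
    f_aero = 0.5 * rho * cd * area * v * v
    f_roll = crr * m * g * math.cos(grade)
    f_grade = m * g * math.sin(grade)
    a = (f_trac - f_aero - f_roll - f_grade) / m  # net longitudinal acceleration
    v += a * dt
    t += dt
print(f"time to reach 72 km/h on a 3% grade: {t:.1f} s")
```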
Sabag, Munir Machado de Sousa. "Transições de fase e criticalidade em modelos estocásticos." Universidade de São Paulo, 2002. http://www.teses.usp.br/teses/disponiveis/43/43134/tde-24022012-141616/.
In this work, we analyzed three lattice models governed by stochastic dynamics. Our main interest lies in the study of the phase transitions and critical behavior of these models. The first of them is the Domany-Kinzel probabilistic cellular automaton, to which we applied the method of series expansions. Next, we studied the long-time behavior of some reaction-diffusion processes by means of numerical simulations; such processes may be relevant to the understanding of granular compaction. Finally, also by means of numerical simulations, we analyzed the conserved contact process, a version of the original model defined in an ensemble where the number of particles is conserved.
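For reference, the Domany-Kinzel automaton mentioned above has a very compact update rule: a site becomes active with probability p1 if exactly one of its two relevant neighbours was active, and with probability p2 if both were. The sketch below simulates that rule directly to estimate the stationary density of active sites; it is a plain Monte Carlo illustration, not the series-expansion method actually used in the thesis, and the parameter scan is arbitrary.

```python
# Direct Monte Carlo simulation of the Domany-Kinzel probabilistic cellular automaton:
# one active neighbour -> active with probability p1, two -> probability p2.
import numpy as np

def dk_density(p1, p2, size=2000, steps=2000, seed=0):
    rng = np.random.default_rng(seed)
    s = np.ones(size, dtype=np.int8)              # fully active initial condition
    for _ in range(steps):
        left, right = s, np.roll(s, -1)           # the two relevant neighbours
        n_active = left + right
        prob = np.where(n_active == 2, p2, np.where(n_active == 1, p1, 0.0))
        s = (rng.random(size) < prob).astype(np.int8)
    return s.mean()                               # stationary density of active sites

for p1 in (0.6, 0.75, 0.9):                       # rough scan across the absorbing transition
    print(f"p1={p1:.2f}, p2=p1: density ~ {dk_density(p1, p1):.3f}")
```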
Geisweiller, Nil. "Étude sur la modélisation et la vérification probabiliste d'architectures de simulations distribuées pour l'évaluation de performances." Toulouse, ENSAE, 2006. http://www.theses.fr/2006ESAE0003.
Monaco, Thiago de Oliveira. "Ensaios sobre o surgimento, evolução e manutenção dos mecanismos de senescência em animais." Universidade de São Paulo, 2011. http://www.teses.usp.br/teses/disponiveis/5/5144/tde-24082011-165404/.
The evolutionary basis of senescence is a long-standing biological problem. Senescence, in the sense of progressive deterioration of an organism, is distinguished from aging, the mere passage of time. Because, at least in mammals, aging and senescence occur simultaneously, the terms are wrongly taken as synonyms. Senescence seems explained by the wear of the organism, which poses a paradox for the mechanisms of natural selection: since biological machinery carries self-repair mechanisms, we would expect their improvement, with gradual elimination of the deterioration associated with aging. Historically, there were attempts to resolve this paradox by supposing that senescence confers an adaptive advantage, but this argument would require that older individuals be previously distinct from younger ones and is, therefore, circular. From the mid-twentieth century, three hypotheses have prevailed. Mutation accumulation, proposed by Medawar (1951), holds that the gradual decay of the force of selection with age favors the accumulation of deleterious genes expressed at advanced ages. Antagonistic pleiotropy, proposed by Williams (1957), argues that genes linked to both beneficial and deleterious traits could, under certain conditions, be favorably selected. Finally, the disposable soma theory, proposed by Kirkwood (1975), suggests that organisms are positively selected when they invest in reproduction even at the expense of somatic maintenance. The dynamic description of the phenomena involved in each mechanism and the relative contribution of each one are still debated. The present work aims to develop stochastic computer models that mimic the conditions for each. We started, in historical order, by testing the mechanism proposed by Medawar, which is the subject of this thesis. The theory of mutation accumulation has found critics who suggest that this mechanism would cause deleterious effects to synchronize at very advanced ages, making senescence a sudden phenomenon limited to those ages. This contradicts experimental observation, because senescence is a gradual process and even wild animals exhibit detectable senescent phenomena at young ages. To keep the model consistent with current knowledge on population genetics, it included the effects of selection, genetic drift and different mutation rates. As predicted, in our simulations the mode of the age of onset of deleterious genes stabilized only at very advanced ages, near the end of the age distribution. Also as expected, these distributions ended at earlier ages in scenarios of higher extrinsic mortality. However, in all our simulations there was a more or less wide distribution of ages of onset around the mode. The distribution is enlarged with increased mutation probabilities, suggesting that the ages of onset may spread throughout life, including, in some scenarios, the maintenance of alleles with manifestation at birth. This is consistent with the infinite alleles model used in population genetics, where different variants compete until an equilibrium between selection, mutation and genetic drift is reached. Considering the demographic criterion for senescence, in which mortality increases with age, we can say that our simulations evolved senescent populations. Although our data do not conflict with the other theories of senescence, they clearly show that the falling force of selection with age is not a sufficient mechanism for senescence. Especially with alleles of late expression, the forces of genetic drift may be prominent and should be taken into account to explain the evolution of senescence.
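A caricature of Medawar's argument that underlies the thesis, under stated assumptions (a single locus with Wright-Fisher dynamics rather than the individual-based model actually used): an allele expressed only at age a is selected against in proportion to the probability of surviving extrinsic mortality to that age, so late-acting deleterious alleles equilibrate at higher frequencies under recurrent mutation and drift.

```python
# Caricature of mutation accumulation (not the thesis' model): the effective selection
# against an allele expressed at age a decays as extrinsic_survival**a, so later onset
# means weaker selection and higher equilibrium frequency under mutation and drift.
import numpy as np

def equilibrium_frequency(age_of_onset, extrinsic_survival=0.8, base_cost=0.2,
                          pop_size=2000, mut_rate=1e-3, generations=20000, seed=0):
    rng = np.random.default_rng(seed)
    s_eff = base_cost * extrinsic_survival ** age_of_onset   # force of selection decays with age
    q = 0.0                                                  # allele frequency
    freqs = []
    for _ in range(generations):
        q = q * (1 - s_eff) / (1 - q * s_eff)                # selection
        q = q * (1 - mut_rate) + (1 - q) * mut_rate          # recurrent mutation
        q = rng.binomial(pop_size, q) / pop_size             # genetic drift
        freqs.append(q)
    return float(np.mean(freqs[generations // 2:]))

for age in (0, 5, 10, 20, 40):
    print(f"age of onset {age:2d}: mean equilibrium frequency ~ {equilibrium_frequency(age):.3f}")
```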
Grosche, Lucas Caetano. "Study of the interactions between emulsion flow and a spectrometer probe based on numerical simulations." Universidade de São Paulo, 2013. http://www.teses.usp.br/teses/disponiveis/3/3137/tde-17102014-114519/.
The present work studies the behavior of an oil-in-water emulsion flowing inside a measurement chamber/duct, with an optical sensor probe as an obstacle in its path; this optical sensor must evaluate, in real time, the stability of the emulsion in which it is inserted. The emulsion is a metalworking cutting fluid, with oil droplet diameters ranging from 100 nanometers to 100 micrometers. The probe used with the UV-Vis light spectrometer follows the concept proposed in the research project called EPM (Emulsion Process Monitor in Metalworking Fluid), carried out under the BRAGECRIM program between the University of São Paulo and the University of Bremen. This study is based on the numerical simulation of the interactions between the emulsion and the proposed measurement system, using Computational Fluid Dynamics (CFD) techniques, and aims to evaluate the effects of the probe geometry, its position relative to the flow field, and the fluid properties, especially the properties to be measured by the spectrometer. Such effects are correlated with changes in droplet concentration and with droplet segregation or size inside the measurement chamber of the optical probe, which can cause changes in the diffuse light intensity readings. Segregation effects due to flow disturbances around the probe can be neglected under normal measurement conditions, with the probe facing forward and its inlet area against the flow; based on the simulation results, even if the probe is displaced, the effect on the measurements is still negligible. Measurements were carried out in the laboratory, and in-situ measurements were also performed using a measurement adapter coupled directly to the cutting-fluid injection pipe of the drilling machine; these tests validated the simulation results, since no segregation effect related to the measurement system was observed. In addition, the possibility of bacterial attachment to the internal glass walls of the probe was included in the study, and it was found that when the flow velocity is large enough to produce a shear stress of about 3-5 Pa, bacterial attachment can be avoided. Building on these results, changes in the probe geometry were proposed in order to reach an isokinetic condition for the flow around and inside the probe, resulting in a higher shear stress at low inlet flow velocities. Finally, an additional study was carried out using a particle-tracking model to understand the relevance of the individual behavior of each particle in the emulsion flow. The results do not indicate any significant effect on the measurements inside the probe, although further studies should be carried out on this topic, considering a population balance model for the oil droplets.
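To put the 3-5 Pa threshold mentioned above in perspective, the sketch below estimates the wall shear stress in a circular duct from the mean velocity using the Darcy friction factor (Blasius correlation for smooth turbulent pipes), tau_w = (f/8)*rho*V^2. The duct diameter and fluid properties are generic assumptions, not the EPM probe geometry.

```python
# Rough wall-shear-stress estimate for flow in a circular duct, tau_w = (f/8)*rho*V^2,
# with the Blasius friction factor f = 0.316*Re**-0.25 for smooth turbulent pipes
# (laminar: f = 64/Re). Diameter and fluid properties are assumptions, not EPM values.
rho, mu, diameter = 1000.0, 1e-3, 0.01    # water-like emulsion (kg/m^3, Pa*s), 10 mm duct

def wall_shear_stress(velocity):
    re = rho * velocity * diameter / mu
    f = 64.0 / re if re < 2300 else 0.316 * re ** -0.25
    return f / 8.0 * rho * velocity ** 2

for v in (0.5, 1.0, 1.5, 2.0, 3.0):
    print(f"V = {v:.1f} m/s -> tau_w ~ {wall_shear_stress(v):.1f} Pa")
```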
Rodovalho, Francielle da Silva. "Simulação numérica de blocos e prismas de alvenaria em situação de incêndio." Universidade de São Paulo, 2018. http://www.teses.usp.br/teses/disponiveis/18/18134/tde-27082018-123952/.
Structural masonry is a very old building system in which the walls have both structural and partition functions. Its use is widespread in Brazil; however, few research programs have addressed its behavior under fire, and the country has not yet developed normative methods for designing structural masonry subjected to fire. The purpose of this research was therefore to verify the performance of concrete-block structural masonry subjected to high temperatures through the simulation of prisms. In the Abaqus software, the behavior of a block and a prism under compression at room temperature, and of the prism under fire with different boundary conditions, was simulated. The compression simulations of the block and prism at room temperature were validated up to the ultimate loads, and the temperature rise of the unexposed faces was well represented by the thermal simulations. In the thermomechanical simulations, the loss of material strength was adopted according to the technical literature. Based on the analyzed examples, it was observed that the prisms perform well with respect to thermal insulation in a fire situation, especially when coated with mortar on both sides. Regarding the mechanical resistance criterion, the numerical results were not validated against experimental ones; nevertheless, it was possible to represent the thermal deterioration of the materials.
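Thermal analyses of this kind are normally driven by a standard fire exposure; assuming the usual ISO 834 curve, the gas temperature follows T(t) = T0 + 345*log10(8t + 1) with t in minutes. The snippet below just tabulates that curve; whether this exact exposure was adopted in the thesis is an assumption.

```python
# ISO 834 standard fire curve, T(t) = T0 + 345*log10(8t + 1), t in minutes.
# Using it here is an assumption; the thesis may have adopted a different exposure.
import math

def iso834(t_min, t0=20.0):
    return t0 + 345.0 * math.log10(8.0 * t_min + 1.0)

for t in (15, 30, 60, 90, 120):
    print(f"{t:3d} min -> {iso834(t):6.0f} deg C")
```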
Sousa, Antonio Higo Moreira de. "Coeficientes de parentesco em espécies florestais." Botucatu, 2018. http://hdl.handle.net/11449/168757.
Full textResumo: Parentesco entre indivíduos em populações naturais é uma informação com múltiplos usos e pode ser acessada por meio de estimadores que usam dados moleculares. Todavia, cada estimador possui pressuposições e, muitas vezes, rigorosas para espécies florestais. Assim, a modelagem dos dados é uma etapa importante e, quase em todos os casos, negligenciada. Este trabalho teve como objetivo avaliar a eficácia de diferentes estimadores de parentesco em Acrocomia aculeata (Jacq.) Lodd. ex Mart., Hymenaea stigonocarpa Mart. ex Hayne e Dipteryx alata Vogel., bem como em quatro diferentes populações simuladas. A partir das coancestrias médias ( ������̅ ) estimadas, foi calculado o erro dos estimadores pressupondo que os indivíduos analisados eram meios-irmãos (������̅=0,125). As estimativas de parentesco oscilaram conforme a espécie e o método utilizado, gerando diferentes valores de erro para cada estimador. A correlação foi observada apenas entre estimadores que possuíam o mesmo método de estimativa ou com pressupostos similares. As populações simuladas tiveram melhores valores estimados e menores erros em comparação com dados reais. Os valores de erro dos estimadores encontrados, demonstram que somente aplicação dos estimadores para a inferência de determinado grau de parentesco, pode gerar resultados viesados e corroborar para ineficácia da tomada de decisão, sendo necessário o uso de informações complementares associadas ao parentesco, como análise do sistema de reprodução, estrutura gen... (Resumo completo, clicar acesso eletrônico abaixo)
Understanding kinship between individuals in natural populations offers useful information that can be assessed with estimators based on molecular data. However, each estimation method rests on assumptions that are often restrictive for forest species; thus, modeling the data is an important step that is almost always neglected. This study evaluates the efficacy of different kinship estimators for Acrocomia aculeata (Jacq.) Lodd. ex Mart., Hymenaea stigonocarpa Mart. ex Hayne and Dipteryx alata Vogel, as well as for four different simulated populations. From the estimated mean coancestries, the error of each estimator was calculated assuming that the analyzed individuals were half-siblings (mean coancestry = 0.125). Estimates of kinship varied according to the species and method used, generating different error values for each estimator. A correlation was observed only between estimators that used the same estimation method or similar assumptions. Simulated populations showed more accurate estimates and lower error values compared to real data. The error values of the estimators demonstrate that applying estimators alone to infer a given degree of kinship can generate biased results and lead to inefficient decision making. The use of complementary information associated with kinship is therefore necessary, such as analysis of the mating system and of the genetic structure of the population, enabling more precise inferences of the kinship between the evaluated individuals.
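The 0.125 reference value used above is the pedigree coancestry of half-siblings: the probability that one allele drawn at random from each of the two individuals is identical by descent. A tiny gene-dropping simulation with labelled founder alleles reproduces it; this is purely illustrative and is not one of the molecular estimators compared in the thesis.

```python
# Gene-dropping check of the pedigree coancestry of half-siblings: label the founder
# alleles, drop them through the pedigree, and estimate the probability that one allele
# drawn at random from each half-sib is identical by descent (expected value: 1/8).
import numpy as np

rng = np.random.default_rng(7)

def half_sib_coancestry(n_pairs=100000):
    mother, father_a, father_b = (0, 1), (2, 3), (4, 5)   # distinct founder allele labels
    hits = 0
    for _ in range(n_pairs):
        child_a = (mother[rng.integers(2)], father_a[rng.integers(2)])
        child_b = (mother[rng.integers(2)], father_b[rng.integers(2)])
        hits += child_a[rng.integers(2)] == child_b[rng.integers(2)]
    return hits / n_pairs

print(f"estimated coancestry of half-sibs: {half_sib_coancestry():.3f} (theory: 0.125)")
```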
Dialetachi, Eva Lemmi Giovanini. "Espalhamento dinâmico de luz em sistemas coloidais diluídos." Universidade de São Paulo, 2017. http://www.teses.usp.br/teses/disponiveis/43/43134/tde-20092017-160805/.
The Dynamic Light Scattering technique, also known as Photon Correlation Spectroscopy, is widely used for the structural characterization of colloidal systems, providing important information on characteristic length scales, correlation times and polarization effects. The relatively simple experimental setup and easy-to-use modeling methods are among the main advantages of this technique. Specifically for dilute systems of particles in solution, one can obtain direct information on the hydrodynamic radius of the particles in the system. However, retrieving this parameter requires modeling and analysis methods that assume intrinsic characteristics of the system and have intrinsic resolution limitations when particles of several sizes, concentrations, etc. are present; in several cases, the same experimental data can be described by several different models. In this project, a systematic study of the limitations of the analysis methods is carried out on simulated and experimental data, in order to investigate the applicability of these methods to several types of system. Monodisperse and polydisperse systems are investigated, composed either of one type of particle (monomodal) or of several types of particles (multimodal). As a result, one can obtain indications of how accurately the modeling methods reproduce the simulated parameters. Finally, real experiments were performed using standard samples in order to test the modeling methods and to calibrate the simulation procedures.
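For the simplest case the abstract refers to (dilute, monodisperse particles), the analysis chain can be shown in a few lines: the intensity autocorrelation follows the Siegert relation g2(tau) = 1 + beta*exp(-2*Gamma*tau), the decay rate gives the diffusion coefficient through Gamma = D*q^2, and the Stokes-Einstein relation converts D into a hydrodynamic radius. The synthetic data and instrument parameters below are assumptions, not the samples or analysis software used in the thesis.

```python
# Sketch of the simplest DLS analysis (dilute, monodisperse case): generate a synthetic
# intensity autocorrelation g2 via the Siegert relation and recover the hydrodynamic
# radius from the decay rate using Stokes-Einstein. Instrument parameters are made up.
import numpy as np

kB = 1.380649e-23
T, eta = 298.15, 0.00089                         # temperature (K), water viscosity (Pa*s)
lam, n, theta = 633e-9, 1.33, np.deg2rad(90.0)   # wavelength, refractive index, angle
q = 4 * np.pi * n * np.sin(theta / 2) / lam      # scattering vector (1/m)

R_true = 50e-9                                   # "unknown" particle radius (m)
D_true = kB * T / (6 * np.pi * eta * R_true)     # Stokes-Einstein diffusion coefficient
tau = np.logspace(-7, -1, 200)                   # lag times (s)
rng = np.random.default_rng(2)
g2 = 1 + 0.9 * np.exp(-2 * D_true * q**2 * tau) + rng.normal(0, 1e-3, tau.size)

# Cumulant-style fit: ln(g2 - 1) is linear in tau with slope -2*Gamma = -2*D*q^2.
mask = g2 > 1.01                                 # keep points with usable signal
slope, _ = np.polyfit(tau[mask], np.log(g2[mask] - 1), 1)
D_fit = -slope / (2 * q**2)
R_fit = kB * T / (6 * np.pi * eta * D_fit)
print(f"recovered hydrodynamic radius: {R_fit * 1e9:.1f} nm (true 50 nm)")
```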
Cerini, Maria Fernanda. "Simulações ambientais e caracterização espectroscópica in situ de potenciais bioassinaturas moleculares para aplicação em missões espaciais." Universidade de São Paulo, 2018. http://www.teses.usp.br/teses/disponiveis/76/76132/tde-24092018-143528/.
Astrobiology is a growing research area in Brazil, which studies the phenomenon of life in the Universe. One of its sub-themes studies biosignatures: substances that evidence the presence of life, past or present. The detectability of biomolecules that are potential molecular biosignatures, and the photostability of their spectroscopic signatures in simulated extraterrestrial environments, were investigated in the laboratory. The experiments were based on irradiation in the ultraviolet, the main range of solar radiation responsible for the evolution and degradation of organic molecules in space environments. The research focused on the biological pigments β-carotene and chlorophyll a, which were irradiated both in pure form and mixed with different inorganic substrates, mimicking the surfaces of rocky planets, satellites and asteroids. The facilities of the Brazilian Synchrotron Light Laboratory (LNLS) were used, especially the TGM beamline in the UV, VUV and EUV range, as well as low-pressure lamps emitting in the UVC range. In the Space and Planetary Simulation Chamber (AstroCam) of the Astrobiology Research Unit of USP, several environmental parameters were controlled to simulate the surface conditions of Mars. High-altitude balloons were used to test the response of biomolecules in the stratosphere, where the conditions are similar to those of the Martian surface, and to validate experiments that can be sent on space missions. Changes in the spectroscopic responses of the biomolecules were measured by UV-Vis and IR absorbance and by Raman scattering, either in situ and in real time or ex situ. The techniques proved adequate for these studies, since they provided information on the photostability of the biomolecules' spectroscopic responses, allowing their potential as biosignatures on different surfaces of the Solar System to be tested. The results can also contribute to space missions, supporting the development and optimization of techniques and procedures, both for the exposure of biomolecules to real space environments (in small, low-cost missions such as CubeSats) and for the actual detection of biosignatures on extraterrestrial planetary surfaces.
Queiroz, Marco Aurélio Lima de. "Business competition dynamics: agent-based modeling simulations of firms in search of economic performance." Repositório Institucional do FGV, 2010. http://hdl.handle.net/10438/8170.
The intent of this work is to explore the dynamics of business competition through agent-based modeling simulations of firms searching for performance in markets configured as fitness landscapes. Building upon a growing number of studies in management science that use simulation methods and analogies to Kauffman's model of biological evolution, we developed a computer model to emulate competition and observe whether different search methods matter under varied conditions. The study also explores potential explanations for the persistence of above- and below-average performance of firms under competition.
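In the Kauffman tradition the abstract invokes, the fitness landscape is usually an NK landscape and firms are bit-string strategies that search it. The sketch below builds such a landscape and compares local one-bit search with random long jumps; N, K, the number of firms and the number of periods are assumptions, not the parameters of the thesis' model.

```python
# Sketch of firms searching an NK fitness landscape (Kauffman-style), comparing local
# one-bit search with random long jumps. Generic illustration, not the thesis' model.
import itertools
import numpy as np

rng = np.random.default_rng(3)
N, K = 10, 3                                    # decision dimensions and interdependence

# Each locus i contributes a fitness that depends on itself and K random other loci.
neighbors = [np.sort(np.r_[i, rng.choice([j for j in range(N) if j != i], K, replace=False)])
             for i in range(N)]
tables = [{bits: rng.random() for bits in itertools.product((0, 1), repeat=K + 1)}
          for _ in range(N)]

def fitness(s):
    return np.mean([tables[i][tuple(s[j] for j in neighbors[i])] for i in range(N)])

def search(method, periods=200, n_firms=50):
    firms = rng.integers(0, 2, size=(n_firms, N))
    for _ in range(periods):
        for f in firms:
            trial = f.copy()
            if method == "local":
                trial[rng.integers(N)] ^= 1          # flip a single decision
            else:
                trial = rng.integers(0, 2, size=N)   # long jump: random reconfiguration
            if fitness(trial) > fitness(f):          # adopt only if performance improves
                f[:] = trial
    return float(np.mean([fitness(f) for f in firms]))

for method in ("local", "long_jump"):
    print(method, round(search(method), 3))
```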
Joioso, Aparecido Luciano Breviglieri. "Aplicação de computação em grade a simulações computacionais de estruturas semicondutoras." Universidade de São Paulo, 2008. http://www.teses.usp.br/teses/disponiveis/76/76132/tde-24042008-114210/.
This work evaluates the use of grid computing for key tasks in Computational Physics simulations, in particular applications involving large-scale matrix diagonalization. The Globus Toolkit open-source project was used to compare the performance of the parallel linear algebra library ScaLAPACK with two versions of the message-passing library: the traditional MPICH and its grid-enabled version, MPICH-G2. Several simulations involving large-scale diagonalization of complex matrices were performed. A speedup of 7.71 was reached with MPICH-G2 for an 8000 x 8000 matrix distributed over 8 processes on 64-bit nodes, very close to the ideal speedup of 8. It was also shown that, for this kind of application, the 64-bit architecture outperforms the 32-bit one in the simulations performed.
Jesus, Alexandre Cerqueira de. "Retroanálise de escorregamentos em solos residuais não saturados." Universidade de São Paulo, 2008. http://www.teses.usp.br/teses/disponiveis/18/18132/tde-10102008-090913/.
Five historical cases of landslides in the city of Salvador, Bahia, were studied through the collection and treatment of pre-existing data. Complementary investigations were carried out through field instrumentation using tensiometers and through laboratory testing. The back-analysis, the main objective of this research, was performed on the geometries of each slope before and after rupture, resulting in the definition of the probable mean shear strength parameters. As a secondary goal, conventional stability analyses were performed to evaluate the safety factor of each slope, with strength parameters for both the unsaturated and the saturated condition. Finally, numerical simulations were carried out aiming to reproduce the conditions that led to the rupture of the Alto do Bom Viver slope, based on the concepts of unsaturated soil mechanics. The results showed that most of the studied slides occur in the mature residual soil horizon, where the mean friction angle is 34 degrees, consistent with the shallow rupture surfaces observed in the field. The parametric analysis suggests that the most probable rupture mechanism is related to the decrease of the apparent cohesion of the soil due to the decrease of suction.
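The mechanism summarized above (loss of suction-derived apparent cohesion) can be made concrete with the classical infinite-slope factor of safety extended for unsaturated soils, in which the apparent cohesion (ua - uw)*tan(phi_b) adds to the effective cohesion. The parameter values below, including phi_b, are generic illustrations and not the Salvador slope data.

```python
# Illustrative infinite-slope factor of safety for an unsaturated residual soil:
# apparent cohesion from matric suction, c_app = (ua - uw) * tan(phi_b), adds to c'.
# All parameters are generic illustrations, not the slope data from the thesis.
import math

def factor_of_safety(suction_kpa, c_eff=2.0, phi_eff=34.0, phi_b=15.0,
                     unit_weight=18.0, depth=2.0, slope_deg=40.0):
    beta = math.radians(slope_deg)
    sigma_n = unit_weight * depth * math.cos(beta) ** 2           # normal stress on slip plane (kPa)
    tau = unit_weight * depth * math.sin(beta) * math.cos(beta)   # driving shear stress (kPa)
    c_app = suction_kpa * math.tan(math.radians(phi_b))           # apparent cohesion from suction
    resist = c_eff + c_app + sigma_n * math.tan(math.radians(phi_eff))
    return resist / tau

for s in (40, 20, 10, 5, 0):   # suction decreasing as rain infiltrates
    print(f"suction {s:2d} kPa -> FS = {factor_of_safety(s):.2f}")
```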
Bhusal, Bhumi Shankar. "Radiofrequency Induced Heating of Implanted Stereo-electroencephalography Electrodes During MRI Scan: Theory, Measurements and Simulations." Case Western Reserve University School of Graduate Studies / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=case1545139929613149.
Santos, Carlos Eduardo Fiore dos. "Sistemas fora do equilíbrio termodinâmico: Um estudo em diferentes abordagens." Universidade de São Paulo, 2006. http://www.teses.usp.br/teses/disponiveis/43/43134/tde-11042007-140207/.
In this PhD thesis, we present a study of several nonequilibrium systems with absorbing states by means of different approaches, such as mean-field analysis, usual numerical simulations, analysis in another ensemble and perturbative series expansions. In a specific part of this thesis, we show that the approach proposed here for describing nonequilibrium systems in the constant-particle-number ensemble can also be used to characterize equilibrium systems, described by the Gibbs probability distribution. Finally, we point out open problems for future research.
Alves, Jozismar Rodrigues. "Estudo dos potenciais termodinâmico na coexistência de fase em modelos de rede através de simulação de Monte Carlo." Universidade de São Paulo, 2018. http://www.teses.usp.br/teses/disponiveis/43/43134/tde-24052018-101917/.
The thermodynamic fields are constant along coexistence isotherms. However, when such systems are studied through Monte Carlo simulation, loops may appear, depending on the ensemble. Chemical potential loops have been known for a long time and are understood as an interface effect; however, to our knowledge, there are no studies discussing the restoration of convexity. In the case of temperature loops, more recent works point to a convex dip in the entropy. In this research, we recover Hill's heuristic argument from the 1960s and numerically demonstrate the equivalence between the canonical and grand canonical ensembles, as well as between the microcanonical and canonical ensembles. Furthermore, we were able to restore the convexity of the chemical potential by reinterpreting the relation between the thermodynamic free energy and the statistical free energy, through the inclusion of the contribution of the interfacial free energy. Our interpretation of the simulation data made it possible to establish a very simple method to calculate the surface tension of the interface. In the literature, there are no well-established or general methods to calculate the equilibrium pressure of lattice models for mixtures in the canonical ensemble; the Gibbs-Duhem method is simple only for pure systems. We propose a method to calculate the pressure of lattice models in the canonical ensemble which can be applied to mixtures. The method is based on a discretization of the free energy with respect to volume, describing the variation in terms of the removal of a lattice line, either empty or occupied by a column of particles. For pure systems, we compared our method to Dickman's and to the Gibbs-Duhem method, and showed that our method approaches the results of the Gibbs-Duhem method, which can be considered exact, as the lattice size increases. We verified that Dickman's method is not suited to the study of phase coexistence, since its isotherms do not present one of the loops that allows the density of the denser phase to be established; moreover, its results at high density are wrong, because the method does not allow the use of periodic boundary conditions in one of the directions. Our investigation was carried out for pure fluids, with both isotropic and orientational interactions, and for mixtures, in the case of isotropic interactions. The results were obtained using the Metropolis algorithm, the Wang-Landau algorithm, and an adaptation of the multicanonical method with the Wang-Landau algorithm; the last two are essential in the study of the equivalence of ensembles.
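A minimal sketch of the kind of fixed-N (canonical-ensemble) lattice-gas sampling discussed above, using particle-hole swap moves so the particle number is conserved. Lattice size, density, interaction strength and temperature are assumptions, and the Wang-Landau / multicanonical machinery and pressure calculation of the thesis are not reproduced.

```python
# Canonical-ensemble Metropolis sketch for a nearest-neighbour lattice gas: swapping a
# particle with a hole conserves N. Parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(4)
L, N_particles, eps, T = 20, 120, 1.0, 0.5       # lattice side, particles, |attraction|, temperature
occ = np.zeros((L, L), dtype=np.int8)
occ[np.unravel_index(rng.choice(L * L, N_particles, replace=False), (L, L))] = 1

def local_energy(occ, i, j):
    """Interaction of site (i, j) with its four neighbours (periodic boundaries)."""
    s = occ[(i + 1) % L, j] + occ[(i - 1) % L, j] + occ[i, (j + 1) % L] + occ[i, (j - 1) % L]
    return -eps * occ[i, j] * s

for _ in range(200000):
    i1, j1 = rng.integers(L), rng.integers(L)
    i2, j2 = rng.integers(L), rng.integers(L)
    if occ[i1, j1] == occ[i2, j2]:
        continue                                  # move only swaps a particle and a hole
    e_old = local_energy(occ, i1, j1) + local_energy(occ, i2, j2)
    occ[i1, j1], occ[i2, j2] = occ[i2, j2], occ[i1, j1]
    e_new = local_energy(occ, i1, j1) + local_energy(occ, i2, j2)
    if rng.random() >= np.exp(-(e_new - e_old) / T):
        occ[i1, j1], occ[i2, j2] = occ[i2, j2], occ[i1, j1]   # reject: undo the swap

energy = sum(local_energy(occ, i, j) for i in range(L) for j in range(L)) / 2.0
print("energy per particle:", round(energy / N_particles, 3))
```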
Paulista Neto, Antenor José. "Simulação dos produtos da oxidação lipídica em bicamadas." Repositório Institucional da UFABC, 2015.
Master's dissertation - Universidade Federal do ABC, Programa de Pós-Graduação em Ciência & Tecnologia - Química, 2015.
Lipid components of cell membranes are common substrates for oxidative attack. During non-enzymatic oxidation, both phospholipids and cholesterol molecules can react with photodynamically generated 1O2 via ene addition, producing phospholipid and cholesterol hydroperoxides, respectively. The cytotoxic and apoptotic effects of these hydroperoxides are related to their influence on the biophysical properties of phospholipid membranes and to their ability to disseminate oxidative stress, and these abilities relate strongly to the chemical structure of the hydroperoxides. However, the underlying molecular mechanisms are not well understood. Here, molecular dynamics simulations of different oxidation products of phospholipids and of cholesterol hydroperoxides in monounsaturated phosphatidylcholine bilayers were performed; the combined effect of phospholipid and cholesterol peroxidation was also investigated. After oxidation, both the rigid sterol ring and the sn-2 acyl chains of the phospholipids reorient, and the inserted hydrophilic groups form hydrogen bonds with the carbonyl-ester groups of the phospholipids. For cholesterol, this tilting caused the loss of its condensing and ordering properties, with possible implications for the formation of lateral domains in the membrane. In the case of cholesterol peroxyl radicals, small changes in orientation were observed depending on the position of the -OO groups on the sterol ring. The possible implications for the ability of the radical to propagate the chain reaction are discussed.
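The reorientation and loss of ordering described above are usually quantified in MD analyses by the tilt angle of the sterol ring axis with respect to the bilayer normal and by the acyl-chain order parameter S_CD = <(3*cos^2(theta) - 1)/2>. The snippet shows how both are computed from direction vectors; the arrays are random placeholders standing in for vectors extracted from a real trajectory.

```python
# How tilt and ordering are usually quantified from an MD trajectory. The coordinate
# arrays below are random placeholders, not data from the thesis' simulations.
import numpy as np

rng = np.random.default_rng(5)
bilayer_normal = np.array([0.0, 0.0, 1.0])

# Placeholder: ring-axis vectors (e.g. C3 -> C17 of cholesterol) for many frames/molecules.
ring_axes = rng.normal(size=(1000, 3)) + 4.0 * bilayer_normal

def tilt_angles_deg(vectors, normal):
    v = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    return np.degrees(np.arccos(np.clip(v @ normal, -1.0, 1.0)))

# Placeholder: C-H bond vectors of one acyl-chain carbon over frames/molecules.
ch_bonds = rng.normal(size=(1000, 3))

def order_parameter(ch_vectors, normal):
    u = ch_vectors / np.linalg.norm(ch_vectors, axis=1, keepdims=True)
    cos2 = (u @ normal) ** 2
    return float(np.mean(1.5 * cos2 - 0.5))

print(f"mean sterol tilt: {tilt_angles_deg(ring_axes, bilayer_normal).mean():.1f} deg")
print(f"S_CD (placeholder data): {order_parameter(ch_bonds, bilayer_normal):.3f}")
```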
Silva, Paulo Salem da. "Verification of behaviourist multi-agent systems by means of formally guided simulations." Thesis, Universidade de São Paulo, 2011. http://www.theses.fr/2011PA112267/document.
Multi-agent systems (MASs) can be used to model phenomena that can be decomposed into several interacting agents which exist within an environment. In particular, they can be used to model human and animal societies, for the purpose of analysing their properties by computational means. This thesis is concerned with the automated analysis of a particular kind of such social models, namely, those based on behaviourist principles, which contrasts with the more dominant cognitive approaches found in the MAS literature. The hallmark of behaviourist theories is the emphasis on the definition of behaviour in terms of the interaction between agents and their environment. In this manner, not merely reflexive actions, but also learning, drives, and emotions can be defined. More specifically, in this thesis we introduce a formal agent architecture (specified with the Z Notation) based on the Behaviour Analysis theory of B. F. Skinner, and provide a suitable formal notion of environment (based on the pi-calculus process algebra) to bring such agents together as a MAS. Simulation is often used to analyse MASs. The techniques involved typically consist in implementing and then simulating a MAS several times to either collect statistics or see what happens through animation. However, simulations can be used in a more verification-oriented manner if one considers that they are actually explorations of large state-spaces. In this thesis we propose a novel verification technique based on this insight, which consists in simulating a MAS in a guided way in order to check whether some hypothesis about it holds or not. To this end, we leverage the prominent position that environments have in the MASs of this thesis: the formal specification of the environment of a MAS serves to compute the possible evolutions of the MAS as a transition system, thereby establishing the state-space to be investigated. In this computation, agents are taken into account by being simulated in order to determine, at each environmental state, what their actions are. Each simulation execution is a sequence of states in this state-space, which is computed on-the-fly, as the simulation progresses. The hypothesis to be investigated, in turn, is given as another transition system, called a simulation purpose, which defines the desirable and undesirable simulations (e.g., "every time the agent does X, it will do Y later"). It is then possible to check whether the MAS satisfies the simulation purpose according to a number of precisely defined notions of satisfiability. Algorithmically, this corresponds to building a synchronous product of these two transition systems (i.e., the MAS's and the simulation purpose) on-the-fly and using it to operate a simulator. That is to say, the simulation purpose is used to guide the simulator, so that only the relevant states are actually simulated. By the end of such an algorithm, it delivers either a conclusive or inconclusive verdict. If conclusive, it becomes known whether the MAS satisfies the simulation purpose w.r.t. the observations made during simulations.
If inconclusive, it is possible to perform some adjustments and try again. In summary, then, in this thesis we provide four novel elements: (i) an agent architecture; (ii) a formal specification of the environment of these agents, so that they can be composed into a MAS; (iii) a structure to describe the property of interest, which we named simulation purpose; and (iv) a technique to formally analyse the resulting MAS with respect to a simulation purpose. These elements are implemented in a tool, called Formally Guided Simulator (FGS). Case studies executable in FGS are provided to illustrate the approach.
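A toy illustration of the core idea described above: explore the MAS state space only along transitions allowed by the simulation purpose, building the synchronous product on the fly and stopping when a verdict state is reached. The two transition systems below are stand-ins invented for the example, not the FGS tool or its Z / pi-calculus machinery.

```python
# Toy "formally guided simulation": on-the-fly synchronous product of a MAS transition
# system and a simulation purpose, pruning transitions the purpose does not allow.
# Both systems here are invented stand-ins for illustration only.

# Hypothetical MAS: from each environment state, the simulated agent may emit events.
mas_transitions = {
    "idle":       [("stimulus", "stimulated"), ("noop", "idle")],
    "stimulated": [("respond", "idle"), ("ignore", "idle")],
}

# Simulation purpose: "every stimulus is eventually followed by a response".
# 'success' / 'failure' are verdict states.
purpose = {
    ("watching", "stimulus"): "obliged",
    ("watching", "noop"):     "watching",
    ("obliged", "respond"):   "success",
    ("obliged", "ignore"):    "failure",
}

def verify(max_depth=6):
    stack = [("idle", "watching", 0)]
    seen = set()
    while stack:
        mas_state, purpose_state, depth = stack.pop()
        if purpose_state == "failure":
            return "purpose violated"
        if purpose_state == "success" or depth >= max_depth:
            continue
        for event, next_mas in mas_transitions[mas_state]:
            next_purpose = purpose.get((purpose_state, event))
            if next_purpose is None:
                continue                       # transition not allowed by the purpose: prune
            node = (next_mas, next_purpose, depth + 1)
            if node not in seen:
                seen.add(node)
                stack.append(node)
    return "no violation found up to the depth bound (inconclusive beyond it)"

print(verify())
```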
CALHEIROS JUNIOR, Eduardo Jorge. "Estimação computacional dos esforços eletromecânicos em transformadores de potência no sistema CHESF." Universidade Federal de Campina Grande, 2014. http://dspace.sti.ufcg.edu.br:8080/jspui/handle/riufcg/257.
Full textMade available in DSpace on 2018-02-06T14:28:39Z (GMT). No. of bitstreams: 1 EDUARDO JORGE CALHEIROS JUNIOR - DISSERTACAO PPGEE 2014.pdf: 3431884 bytes, checksum: 5d6e954510b9e7405666b81ef4f13f30 (MD5) Previous issue date: 2014-06
Equipment suitability analyses are indispensable for electric sector companies to know the conditions under which their equipment operates, thus ensuring the performance of the company's assets and their availability in the National Interconnected System (Sistema Interligado Nacional). Power transformers stand out among the analyzed equipment due to their high cost and importance for the electrical system. Most failures in power transformers are of dielectric origin, related to damage to the conductor insulation caused by the mechanical deformations produced by the high short-circuit currents flowing through the windings, which reduces the life of the equipment. From these considerations, it was proposed to evaluate, through computer simulations, the internal electromechanical forces in some power transformers of the CHESF system in the event of a three-phase short circuit. The simulation methodology consisted in selecting the transformers to be analyzed and collecting the necessary technical information provided by the manufacturers; the system conditions to which the transformers would be subjected were then defined, the resulting short-circuit levels were obtained, and the simulations themselves were performed. The results of the computer simulations showed the electromechanical forces estimated in four power transformers in operation in the CHESF system, resulting from the dynamic effect of the three-phase short-circuit currents flowing through the transformer windings, considering different system configurations. The analysis of the results indicated possible cases in which permissible limits are exceeded and revealed the importance of the constructive aspects of the transformers for their ability to withstand the mechanical stresses caused by short circuits.
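The driving quantity in such studies is the asymmetrical peak of the short-circuit current and the force it produces, which scales with the square of the current. The sketch below estimates both with an IEC 60909-style peak factor and with the force per unit length between two parallel current paths; the numbers are illustrative assumptions, not CHESF transformer data, and a real winding-force calculation uses the leakage-field distribution rather than the parallel-conductor formula.

```python
# Illustrative estimate (assumed values, not CHESF data): asymmetrical peak of the
# three-phase short-circuit current, i_p = kappa*sqrt(2)*I_k with
# kappa = 1.02 + 0.98*exp(-3R/X) (IEC 60909 style), and the force per unit length
# between two parallel current paths, F/l = mu0*i^2/(2*pi*d).
import math

MU0 = 4e-7 * math.pi

def peak_short_circuit_current(i_k_rms, r_over_x):
    kappa = 1.02 + 0.98 * math.exp(-3.0 * r_over_x)
    return kappa * math.sqrt(2.0) * i_k_rms

def force_per_metre(i_amps, separation_m):
    return MU0 * i_amps ** 2 / (2.0 * math.pi * separation_m)

i_k = 25e3                     # assumed symmetrical short-circuit current (A rms)
i_p = peak_short_circuit_current(i_k, r_over_x=0.07)
print(f"peak current: {i_p / 1e3:.1f} kA")
print(f"force per metre between conductors 0.05 m apart: {force_per_metre(i_p, 0.05) / 1e3:.1f} kN/m")
```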
TELES, Ronneesley Moura. "Um estudo de técnicas da Inteligência Artificial aplicadas na distribuição de recursos em áreas geográficas." Universidade Federal de Goiás, 2011. http://repositorio.bc.ufg.br/tede/handle/tde/511.
This work studies the problem of distributing resources over geographic areas. Many organizations, from taxi cooperatives to the military, face this problem in their daily operations. In essence, it tries to answer the question: what are the best places to position my assets in geographic area X, according to a set of constraints Y? To answer this question, representation models for the problem were studied, and well-known Artificial Intelligence and Operations Research techniques were applied, such as exhaustive enumeration of combinations, greedy algorithms, heuristics and genetic algorithms. Algorithms were designed and simulations were performed; in addition, a multi-agent system was developed that implements one of the proposed algorithms. The proposals made in this dissertation were evaluated in the domain of electric utilities, for the task of distributing service vehicles across a city.
Este trabalho estuda um problema que consiste na distribuição de recursos em áreas geográficas. Muitas organizações, desde cooperativas de táxi até as forças armadas, enfrentam este problema em suas operações diárias. Em essência, tenta-se responder a pergunta: quais os melhores lugares em que devo posicionar meus recursos na área geográfica X, de acordo com um conjunto de restrições Y? Para responder a pergunta, foram estudados modelos de representação do problema e foram aplicadas técnicas conhecidas da Inteligência Artificial e da Pesquisa Operacional, tais como: teste de todas as combinações, algoritmos gulosos, heurísticas e algoritmos genéticos. Foram criados algoritmos e foram realizadas simulações. Além disso, foi desenvolvido um Sistema Multiagente que implementa um dos algoritmos criados. Na avaliação das propostas feitas nesta dissertação, foi considerado o domínio das Companhias Elétricas, na tarefa de distribuir viaturas em uma cidade.
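As an illustration of one of the techniques cited above (greedy algorithms), the sketch below places k assets on candidate sites so as to cover as many demand points as possible within a radius; the data, radius and coverage criterion are hypothetical and do not reproduce the dissertation's models:

```python
import math, random

# A minimal greedy sketch of the "where do I place my assets?" question:
# pick k positions from candidate sites so that as many demand points as
# possible lie within a coverage radius. Names and data are illustrative.
random.seed(0)
demand = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(200)]
candidates = [(x, y) for x in range(11) for y in range(11)]
radius, k = 2.0, 4

def covered(site, points):
    return {p for p in points if math.dist(site, p) <= radius}

chosen, uncovered = [], set(demand)
for _ in range(k):
    # greedy step: take the candidate that covers the most still-uncovered demand
    best = max(candidates, key=lambda s: len(covered(s, uncovered)))
    chosen.append(best)
    uncovered -= covered(best, uncovered)

print("chosen sites:", chosen)
print("covered demand points:", len(demand) - len(uncovered))
```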
Vidal, Natália Ferreira. "O uso de simulações virtuais em oficinas de formação para professores de ciências da educação básica." Universidade Federal de Juiz de Fora (UFJF), 2017. https://repositorio.ufjf.br/jspui/handle/ufjf/6521.
Full text
A presente pesquisa investigou o uso das simulações virtuais como alternativa metodológica ao ensino de Ciências. Compreende-se tais simulações como eventos de cibercultura e não apenas como evolução das práticas de ensino potencializadas pelo advento das tecnologias digitais de informação e comunicação. A cibercultura é um termo utilizado na definição dos agenciamentos sociais das comunidades no espaço eletrônico virtual. Para a investigação, realizamos uma oficina de formação com professores de Ciências, a fim de refletir sobre a formação docente desse profissional para atuar com os recursos de Tecnologias Digitais de Informação de Comunicação, mais especificamente com simuladores virtuais na cibercultura. A pesquisa mostra o percurso repleto de dificuldades vivenciadas pela pesquisadora até chegar à oficina, que aconteceu no Infocentro da Faculdade de Educação da UFJF e versou sobre simulações virtuais na plataforma online do PhET. O desenvolvimento da oficina, bem como todo o seu processo de preparação e de execução, foi registrado em relatórios escritos e em arquivos de áudio que compuseram a base de dados para posteriores análises da pesquisadora. O referencial teórico dialogou com a pluralidade das narrativas dos sujeitos da pesquisa, bem como com as reflexões e com as observações da pesquisadora, por meio de um levantamento bibliográfico e também pela fundamentação de estudiosos que tratam da cibercultura como Pierre Levy; da relação entre nativos e imigrantes digitais, dissertada por Marc Prensky e também da metodologia de investigação pesquisa-formação, apresentada nos trabalhos de Edméa Santos. A pesquisa-formação subsidiou o método e a metodologia deste trabalho e alicerçou a produção da oficina de formação, lócus desta investigação. Os resultados produzidos proporcionaram o confronto e a reflexão sobre o uso das simulações virtuais no ensino e também a formação dos professores para atuarem com recursos como os simuladores online. Os dados apontaram para a importância do letramento digital dos professores e sinalizaram as potencialidades do uso de simulação virtual no ensino de Ciências, além de uma reflexão profícua sobre a pesquisa-formação aplicada em ambientes de cibercultura.
This research investigated the use of virtual simulations as a methodological alternative for science teaching. These simulations are understood as events of cyberculture and not only as an evolution of teaching practices enhanced by the rise of digital information and communication technologies. Cyberculture is a term used to describe the social arrangements of communities in the virtual electronic space. For the investigation, we conducted a training workshop with science teachers in order to reflect on the education of these professionals to work with the resources of Digital Information and Communication Technologies, more specifically with virtual simulators, in cyberculture. The research describes the path, full of difficulties experienced by the researcher, that led to the workshop, which took place in the Infocenter of the Faculty of Education of UFJF and dealt with virtual simulations on the PhET online platform. The development of the workshop, as well as its entire preparation and execution, was recorded in written reports and audio files that made up the database for the researcher's later analysis. The theoretical framework dialogued with the plurality of the participants' narratives, as well as with the researcher's reflections and observations, through a bibliographic survey and through authors who deal with cyberculture, such as Pierre Levy; with the relation between digital natives and immigrants, discussed by Marc Prensky; and with the research-formation methodology presented in the works of Edméa Santos. Research-formation underpinned the method and methodology of this work and grounded the design of the training workshop, the locus of this investigation. The results allowed confrontation and reflection on the use of virtual simulations in teaching and on the preparation of teachers to work with resources such as online simulators. The data pointed to the importance of teachers' digital literacy and indicated the potential of virtual simulations in science teaching, in addition to a fruitful reflection on research-formation applied in cyberculture environments.
Melo, Vanio Fragoso. "Modelagem e controle de caimento e dobras em superficies deformaveis." [s.n.], 2004. http://repositorio.unicamp.br/jspui/handle/REPOSIP/260221.
Full text
Tese (doutorado) - Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação
Resumo: Esta tese de doutorado tem como objetivo apresentar uma proposta de modelo computacional de superfície deformável. Há duas vertentes de modelos de superfícies deformáveis fisicamente embasados: modelos de mecânica das partículas e modelos de mecânica dos contínuos. Os modelos de mecânica dos contínuos são mais realísticos e intuitivos, por se basearem numa estrutura geométrica contínua e utilizarem os elementos de Geometria Diferencial para a sua análise. Similar à maioria dos modelos deformáveis com base na mecânica dos contínuos, propomos um modelo de superfície deformável com base na superfície de Cosserat elástica. Diferentemente dos modelos existentes na literatura de Computação Gráfica e Animação, o nosso modelo considera a relação entre as deformações tangenciais e normais, e inclui um novo paradigma linearizável para estimar os vetores normais. Isto possibilita a formação de dobras e rugas a partir da ação de forças tangenciais. Uma implementação é apresentada. Para corrigir alguns tipos de desequilíbrios criados pelo método das diferenças finitas empregado na discretização, propomos fatores de correção das forças internas atuantes nas bordas. Experimentalmente, o modelo proposto foi validado com aplicações para caimentos e para criação de dobras e rugas de tecido de pano.
Abstract: This doctoral thesis presents a proposal for a computational model of a deformable surface. There are two families of physically based deformable surface models: particle-mechanics models and continuum-mechanics models. Continuum-mechanics models are more realistic and intuitive, since the underlying geometric structure is a continuum that can be analyzed with the tools of Differential Geometry. Like most works based on continuum mechanics, we propose a deformable surface model based on an elastic Cosserat surface. Unlike the existing models in the Computer Graphics and Animation literature, our model considers the relation between tangential and normal deformations and incorporates a novel linearizable approach for estimating the normal vectors. This provides an efficient way to simulate folds and wrinkles under the action of tangential forces. An implementation is presented. In order to correct some kinds of force imbalance introduced by the finite-difference method employed in the discretization, we propose correction factors for the internal forces acting on the boundaries. Experimentally, the proposed model was validated in simulations of cloth draping and of folds and wrinkles in cloth.
Doutorado
Engenharia de Computação
Doutor em Engenharia Elétrica
Pastorio, Dioni Paulo. "ATIVIDADES DIDÁTICAS INOVADORAS DE MECÂNICA DE PARTÍCULAS COM DESENVOLVIMENTO DE COMPETÊNCIAS EM UM AMBIENTE DE COMPUTAÇÃO NUMÉRICA." Universidade Federal de Santa Maria, 2014. http://repositorio.ufsm.br/handle/1/6681.
Full text
This study investigated how innovative problem-solving learning activities (AD), based on computer simulations, contribute to the development of conceptual, procedural and attitudinal contents. These AD also aimed to develop skills associated with a numerical computing environment (ACN). To this end, we worked on the design, implementation and evaluation of AD structured around the problem-solving strategy. The implementation took place with a freshman class of the undergraduate course in Meteorology at the Federal University of Santa Maria (UFSM). Data collection involved the following instruments: initial and final questionnaires, records from the in-person meetings, and the solutions of the AD handed in by the students. The analysis of these data was carried out at the evaluation stage and allowed us to conclude that these activities promote the development of conceptual, procedural and attitudinal contents simultaneously, as well as the development of skills associated with a general-purpose mathematical software package. Finally, the assessment performed by the students indicated that these activities are more instructive and interesting than the pencil-and-paper problem-solving strategy.
O presente trabalho procurou investigar como atividades didáticas (AD) inovadoras de resolução de problemas, baseadas em simulações computacionais, contribuem para o desenvolvimento de conteúdos conceituais, procedimentais e atitudinais. Ainda estas AD objetivaram o desenvolvimento de competências associadas a um ambiente de computação numérica (ACN). Para isso, trabalhamos com a elaboração, implementação e avaliação das AD estruturadas a partir da estratégia de resolução de problemas. Este processo de implementação ocorreu junto à turma de ingressantes de um curso de graduação em Meteorologia da Universidade Federal de Santa Maria (UFSM). A obtenção dos dados envolveu os seguintes instrumentos de coleta: questionários inicial e final, registros obtidos a partir dos encontros presenciais realizados e ainda a partir das respostas obtidas nas soluções das AD entregues pelos estudantes. A análise destes dados deu-se no momento de avaliação, e possibilitou-nos concluir que estas atividades proporcionam o desenvolvimento dos conteúdos conceituais, procedimentais e atitudinais concomitantemente, e ainda o desenvolvimento de competências associadas a um software matemático de uso geral. Por fim, a avaliação realizada pelos estudantes indicou que estas atividades são mais instrutivas e interessantes do que a estratégia de resoluções de problema de lápis e papel.
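A hedged example of the kind of particle-mechanics activity that a numerical computing environment makes straightforward (the specific AD of the thesis are not reproduced here): projectile motion with linear air drag, whose range cannot be written in elementary closed form and therefore invites a simple numerical integration:

```python
import numpy as np

# Illustrative activity sketch (assumed parameters): projectile motion with
# linear air drag, integrated with an explicit Euler scheme.
g, b, m = 9.81, 0.1, 0.5          # gravity [m/s^2], drag coefficient [kg/s], mass [kg]
dt = 1e-3
r = np.array([0.0, 0.0])          # position [m]
v = np.array([20.0, 20.0])        # initial velocity [m/s]

xs, ys = [r[0]], [r[1]]
while r[1] >= 0.0:
    a = np.array([0.0, -g]) - (b / m) * v   # acceleration including drag
    v = v + a * dt
    r = r + v * dt
    xs.append(r[0]); ys.append(r[1])

print(f"range with drag ~ {xs[-1]:.1f} m (vacuum range would be {2*20*20/g:.1f} m)")
```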
Cintra, Renata Azevedo. "Interações orais em língua inglesa no laboratório de multimídia com acesso à Internet /." São José do Rio Preto : [s.n.], 2004. http://hdl.handle.net/11449/93924.
Full text
Banca: Marilei Amadeu-Sabino
Banca: Nelson Mitrano Neto
Resumo: Nesta investigação, analisam-se as interações orais (aluno-aluno e aluno-professora), construídas em aulas de língua inglesa, em um laboratório de multimídia, utilizando-se o jogo simulador The Sims e sites da internet, para propiciar o desenvolvimento de tarefas comunicativas orais (Nunan, 1989). Ademais, são analisados alguns fatores que contribuíram para que as interações fossem construídas da maneira descrita. A pesquisa é classificada como de cunho etnográfico (Watson-Gegeo, 1988), havendo uma preocupação com a interação na sala de aula como espaço de aprendizagem (Moita Lopes, 1996). A pesquisadora desempenhou também o papel de professora de um grupo de doze alunos de Licenciatura em Letras, em uma faculdade do noroeste paulista. Destaca-se que esta investigação corresponde à primeira experiência da professora-pesquisadora com a utilização de computadores para o ensino-aprendizagem de língua inglesa, bem como da maioria dos alunos, que era semi-letrada eletronicamente (Buzato, 2001). Os dados, coletados por meio de gravações em áudio e vídeo, se constituíram por momentos de interações orais aluno-aluno e aluno-professora. Tais dados foram triangulados para que pudessem validar as conclusões dessa investigação. Os alunos desenvolveram o letramento eletrônico por meio da construção de andaimes realizada por parceiros mais experientes durante as interações. Alguns fatores interferentes nas interações aluno-aluno foram as atividades desenvolvidas, os temas propostos, as especificidades de cada "material" (jogo e sites), o papel desempenhado pela professora-pesquisadora no laboratório de multimídia e o semi-letramento eletrônico da maioria dos alunos.
Abstract: This research study is of an ethnographic nature (Watson-Gegeo, 1988), concerned with classroom interaction as a learning space (Moita Lopes, 1996). The researcher was also the teacher of twelve learners, undergraduate students and future teachers of English as a foreign language, in a college in the northwest of São Paulo state. The verbal interactions (learner-learner and learner-teacher) analyzed were constructed in a multimedia laboratory, using a simulator game (The Sims) and sites of the Internet in order to propitiate the development of verbal tasks (Nunan, 1989). Some factors that have contributed to construct interactions were analyzed. This study corresponds to the first experience of the teacher-researcher in computer-assisted language learning. The approach used to analyze data was mainly qualitative, with some quantitative analyses. Data were collected by means of audio and video recordings, research diaries, questionnaires, interviews and a research report written by a lesson observer. The preparation and analysis of the questionnaires and interviews were based on Gillham (2000a; 2000b). Data were triangulated in order to validate the results of the study. Learners developed experience in computer-assisted language learning with more experienced peers who provided them with scaffolding during the activities. Some factors which interfered on interactions were the types of activities developed, the topics proposed, the specificity of the materials (game and sites), the role of the researcher-teacher in the laboratory and the learners' lack of experience in computer-assisted language learning.
Mestre
Carvalho, Camila Antunes de. "Avaliação do sombreamento e da iluminação natural em apartamentos de edifícios residenciais verticais multifamiliares de Maceió-AL: o uso de varandas." Universidade Federal de Alagoas, 2010. http://repositorio.ufal.br/handle/riufal/716.
Full text
Fundação de Amparo à Pesquisa do Estado de Alagoas
Nas regiões de clima tropical quente e úmido, como é o caso de Maceió-AL, o aproveitamento das fontes de iluminação e ventilação naturais e o uso do sombreamento devem ser pensados desde as etapas iniciais do projeto. Para favorecer o sombreamento, elementos como brises, cobogós e varandas são indispensáveis às edificações. No caso das varandas, atualmente, percebe-se a sua utilização de forma aleatória quanto às questões ambientais. Como consequência disso e para sanar o desconforto gerado pela ausência de elementos adequados ao clima, intensifica-se o uso de meios artificiais para garantir o conforto nos ambientes, contribuindo para o desperdício de energia. Diante desse contexto, esse trabalho tem como objetivo geral avaliar o uso de varandas em apartamentos de edifícios residenciais verticais multifamiliares localizados no bairro de Ponta Verde, em Maceió-AL, quanto ao sombreamento e à iluminação natural. Para alcançá-lo, a metodologia utilizada consistiu de uma análise comparativa entre tipologias recorrentes de varandas, quanto ao sombreamento e a iluminação natural, através das técnicas de levantamento de dados, geração de máscaras de sombra e simulações nos programas computacionais Sketchup e Troplux. As análises dos resultados possibilitaram a compreensão do comportamento do sombreamento e da iluminação natural nas tipologias examinadas e ainda a análise comparativa entre os resultados das simulações de insolação e da iluminação natural. Constatou-se que, uma vez optando por uma melhor estratégia de sombreamento, utilizando as varandas, nem sempre será obtido um bom resultado no desempenho luminoso dos ambientes internos. Enfim, o uso de varanda nos edifícios residenciais de Maceió pode ser trabalhado para alcançar melhores resultados no conforto ambiental e para enriquecer o aspecto formal dos edifícios, devendo ser pensado desde a fase inicial do projeto arquitetônico
Santos, Valéria Diniz dos. "Drenagem urbana em áreas especiais: o caso da bacia fechada do bairro de Oitizeiro, João Pessoa." Universidade Federal da Paraíba, 2006. http://tede.biblioteca.ufpb.br:8080/handle/tede/5549.
Full text
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES
Nowadays, urban drainage problems grow together with the fraction of urbanized area; in fact, the quality of life in big cities is directly affected by drainage problems. In special areas, such as a closed (endorheic) watershed, the problems are worse, because this kind of watershed has no outlet. This peculiar characteristic of its relief or of its land use raises a variety of questions about hydrology, sedimentology and water quality. This work studies the urban drainage problems of a closed watershed called Lagoa do Buracão, in the city of João Pessoa, State of Paraíba, Brazil. The features of the studied area were characterized and several hydrological simulations were carried out in order to analyze three different land-occupation scenarios, which include changes in the level of the lake. Rainfall data and the physical characteristics of the watershed were used. The results showed that the most satisfactory scenario is the first one. The proposed solution involves important positive impacts, as well as negative ones. The main conclusion is that the proposed solution is highly recommendable for solving the urban drainage problem of Lagoa do Buracão. It adds other functions, such as the creation of a desirable landscape in the composition of the urban space, the valuation of the neighboring areas and the restoration of an environment which is currently degraded.
A problemática das inundações e dos alagamentos urbanos vem crescendo paulatinamente nas últimas décadas como conseqüência, principalmente, do crescimento das áreas urbanizadas. Logo, nota-se o comprometimento da qualidade de vida do habitante urbano, em particular no que tange aos transtornos causados pela presença de água em excesso nas vias e lotes urbanos. Tal situação agrava-se caso se considere a ocorrência desta problemática em bacias fechadas (ou endorreicas), uma vez que neste tipo de bacia o escoamento superficial se acumula em lagos ou sumidouros que não se comunicam por uma rede superficial com outros cursos de água a jusante. Constituem, desse modo, entes especiais na abordagem da gestão do meio urbano por não apresentarem uma saída natural para o escoamento das águas pluviais. A avaliação de áreas especiais fornece subsídios para a gestão do meio urbano, possibilitando o tratamento combinado das questões de drenagem pluvial com outras questões urbanísticas para o desenvolvimento sustentável. Na presente pesquisa, estudou-se a problemática da drenagem urbana de uma bacia fechada denominada Lagoa do Buracão, localizada no bairro de Oitizeiro, município de João Pessoa, Estado da Paraíba. Após caracterização detalhada dos diversos aspectos relativos à área enfocada e de variáveis hidrometeorológicas pertinentes, simulou-se o comportamento hidrológico-hidráulico do conjunto bacia-lagoa a partir da modelagem efetivada com o auxílio do Software ABC 6. São considerados três cenários propostos simulados a partir de um cenário base, que corresponde à conformação atual da área. Os cenários propostos incluem desassoreamento e aprofundamento da lagoa. Os resultados mostraram que a alternativa mais adequada seria a do cenário 1, que abrange desassoreamento da lagoa para profundidade constante de 3 metros e soleira do vertedor na cota 30,5. Com base neste cenário eleito, propõe-se uma solução de drenagem para a lagoa de uso multifuncional, combinando a utilização de técnicas compensatórias de drenagem urbana e a inserção de elementos que propiciem a requalificação da paisagem urbana. São apontados ainda os principais impactos positivos e negativos esperados da implementação desta intervenção na bacia. Conclui-se que a solução proposta é altamente recomendável para a resolução do problema de drenagem urbana da Lagoa do Buracão, agregando outras funções como a criação de uma paisagem que valoriza as áreas do entorno e restaura um ambiente atualmente degradado.
Cabella, Brenno Caetano Troca. "Inferência estatística em métodos de análise de ressonância magnética funcional." Universidade de São Paulo, 2008. http://www.teses.usp.br/teses/disponiveis/59/59135/tde-10062008-072315/.
Full text
In the present work, concepts of statistical inference are used for the application and comparison of different methods of signal analysis in functional magnetic resonance imaging. The central idea is to obtain the probability distribution of the random variable of interest for each method studied, under different values of the signal-to-noise ratio (SNR). This is achieved by means of numerical simulations of the hemodynamic response function (HRF) with Gaussian noise. This procedure allows us to assess the sensitivity and specificity of the methods through the construction of ROC (receiver operating characteristic) curves for different values of SNR. Under specific experimental conditions, we apply classical methods of analysis (Student's t test and correlation), information measures (the Kullback-Leibler distance and its generalized form) and a Bayesian method (the independent pixel method). In particular, we show that the Kullback-Leibler distance D (or relative entropy) and its generalized form are useful measures for signal analysis within the information-theory scenario. These entropies are used as measures of the "distance" between the probability functions p1 and p2 of the signal levels related to stimulus and non-stimulus. In order to avoid undesirable divergences of D, we introduce a small parameter d in the definitions of p1 and p2. We extend this analysis by presenting an original study of the generalized Kullback-Leibler distance Dq (q is the Tsallis parameter). In this case, the appropriate choice of the range 0 < q < 1 ensures that Dq is finite. We obtain the probability densities f(D) and f(Dq) of the sample averages of the variables D and Dq, respectively, calculated over the N epochs of the entire experiment. For small values of N (N < 30), we show that f(D) and f(Dq) are well approximated by Gamma distributions (chi² < 0.0009). Afterwards, we study the independent pixel Bayesian method, considering the a posteriori probability as a random variable and obtaining its distribution for various SNRs and a priori probabilities. The simulation results indicate that correlation and the independent pixel method perform better than the other methods (for SNR > -20 dB). However, one should consider that Student's t test and the entropic methods share the advantage of not requiring a model for the HRF in real data analysis. Finally, we obtain the maps corresponding to real data from an asymptomatic volunteer submitted to an event-related motor stimulus, which show brain activation in the primary and secondary motor areas. We emphasize that the procedure adopted in this study may, in principle, be applied to other methods and under different experimental conditions.
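A minimal sketch of the entropic measures described above, using toy Gaussian signal histograms rather than real fMRI data; the regularizing parameter delta plays the role of the small parameter d mentioned in the abstract, and the generalized form uses one common definition of the Tsallis relative entropy (the thesis' exact convention may differ):

```python
import numpy as np

# Kullback-Leibler distance D between the signal-level distributions p1
# (stimulus) and p2 (rest), and a Tsallis-generalized form Dq with 0 < q < 1.
def kl(p1, p2, delta=1e-6):
    p1 = (p1 + delta) / np.sum(p1 + delta)
    p2 = (p2 + delta) / np.sum(p2 + delta)
    return np.sum(p1 * np.log(p1 / p2))

def kl_tsallis(p1, p2, q=0.8, delta=1e-6):
    p1 = (p1 + delta) / np.sum(p1 + delta)
    p2 = (p2 + delta) / np.sum(p2 + delta)
    return (1.0 - np.sum(p1**q * p2**(1.0 - q))) / (1.0 - q)

# toy signal-level histograms (assumed, not real fMRI data)
rng = np.random.default_rng(1)
stim = rng.normal(1.0, 1.0, 500)   # signal levels during stimulus epochs
rest = rng.normal(0.0, 1.0, 500)   # signal levels during rest epochs
bins = np.linspace(-4, 5, 30)
p1, _ = np.histogram(stim, bins=bins)
p2, _ = np.histogram(rest, bins=bins)
print(f"D   = {kl(p1, p2):.3f}")
print(f"D_q = {kl_tsallis(p1, p2):.3f}")
```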
Bentes, Jennefer Lavor. "Análise dinâmica da ruptura de cabos em torres autoportantes e estaiadas de linhas de transmissão." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2013. http://hdl.handle.net/10183/87342.
Full text
Among the main causes of electric energy transmission failures, the collapse of transmission towers has been a recurring research topic in recent decades, mainly due to the large number of accidents involving transmission lines worldwide. In this work, a dynamic analysis was performed for the loading caused by a broken conductor, which gives rise to the phenomenon known as the cascade effect. To better understand the response of self-supported and guyed lattice towers under this dynamic load, and in an attempt to establish criteria for the longitudinal robustness of transmission line towers, numerical models were developed in ANSYS Mechanical/LS-DYNA, with the structural model discretized in space by the finite element method and the dynamic problem solved in time by direct integration of the equation of motion through Newmark's method. First, static analyses were performed according to the considerations of current design practice. Afterwards, two kinds of dynamic analyses were carried out: a simplified one, in which the loading was applied as a function of time, and another in which the rupture was simulated by deactivating a finite element of the conductor; their results were then interpreted and compared. Structural damping was modeled with Rayleigh's formulation and the catenary of the cables followed the equations of Irvine and Caughey (1974). In order not to restrict the response to a single kind of simulation, nine numerical models were developed by varying the kind of tower, the number of towers per line section, the damping level and the type of analysis. The dynamic responses are shown in terms of the forces in the tower bars, conductors and stays, and the displacements at the tower tops.
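Since the abstract names Newmark's method for the direct time integration, a single-degree-of-freedom sketch of that scheme is given below (average-acceleration variant); the mass, stiffness, damping and load history are assumed values, not the ANSYS/LS-DYNA tower models:

```python
import numpy as np

# Minimal Newmark-beta (average acceleration) time integrator for one DOF.
m, c, k = 100.0, 20.0, 4.0e4          # mass [kg], damping [N.s/m], stiffness [N/m] (assumed)
beta, gamma = 0.25, 0.5               # Newmark constants (unconditionally stable)
dt, nsteps = 1e-3, 2000

def load(t):
    # assumed load history: a sudden longitudinal pull that drops to zero,
    # loosely mimicking a conductor rupture
    return 1.0e3 if t < 0.2 else 0.0

u, v = 0.0, 0.0
a = (load(0.0) - c * v - k * u) / m
keff = k + gamma * c / (beta * dt) + m / (beta * dt**2)
for n in range(1, nsteps + 1):
    t = n * dt
    peff = (load(t)
            + m * (u / (beta * dt**2) + v / (beta * dt) + (0.5 / beta - 1.0) * a)
            + c * (gamma * u / (beta * dt) + (gamma / beta - 1.0) * v
                   + dt * (gamma / (2 * beta) - 1.0) * a))
    u_new = peff / keff
    a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - (0.5 / beta - 1.0) * a
    v_new = v + dt * ((1.0 - gamma) * a + gamma * a_new)
    u, v, a = u_new, v_new, a_new

print(f"displacement after {nsteps*dt:.1f} s: {u:.4e} m")
```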
Silva, Daniel Luiz da. "Controle coerente do processo de absorção de dois fótons em compostos orgânicos." Universidade de São Paulo, 2009. http://www.teses.usp.br/teses/disponiveis/76/76132/tde-09112009-093136/.
Full text
The broad spectral band of ultrashort laser pulses has been used to coherently control the light-matter interaction by acting on the spectral phase of the pulses with so-called pulse shaping methods. This new research area has been responsible for advances in the understanding and control of photo-induced phenomena, especially in nonlinear optics. In this work, we studied the coherent control of two-photon absorption (2PA) processes in organic compounds employing femtosecond pulses. We investigated the 2PA of perylene derivatives using chirped pulses (quadratic phase mask), monitoring the two-photon excited fluorescence. Optimization of 2PA in perylene derivatives was achieved by shaping the pulse with a genetic algorithm, which revealed that Fourier-transform-limited pulses lead to higher 2PA. Quantum chemical calculations using Density Functional Theory were carried out to characterize the electronic structure and determine the allowed two-photon transitions of the perylene derivatives, supporting our experimental results. Furthermore, we also studied the coherent control of 2PA in organic molecules applying a cosine-like phase mask. In this case, we demonstrated that the control efficiency depends on the detuning between the pulse central wavelength and the material's 2PA band. Finally, coherent control of 2PA was explored using a step-like phase mask. Our results indicate that, in this situation, control of 2PA is only attained if a specific ratio between the pulse bandwidth and the 2PA transition bandwidth is used. In conclusion, the results obtained in this work help in understanding coherent control in molecular systems.
Rodrigues, Áttila Leães. "Estudo de transições de fase em sistemas com simetria \"up-down\" e estados absorventes." Universidade de São Paulo, 2014. http://www.teses.usp.br/teses/disponiveis/43/43134/tde-26092014-105955/.
Full text
In this work, we studied a stochastic model with Ising symmetry and two symmetric absorbing configurations, on a three-dimensional cubic lattice and in two dimensions on a triangular lattice. The study employed simple mean-field approximations and Monte Carlo simulations. The results show that the model has a second-order transition from a paramagnetic phase to a ferromagnetic phase and a second-order transition from the ferromagnetic phase to the absorbing one. A first-order phase transition from the paramagnetic phase to the absorbing phase is also observed. In the phase diagram, the two second-order transition lines approach each other at the point where the model behaves like the voter model.
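The abstract's closing reference to the voter model suggests a compact illustration of the Monte Carlo machinery involved; the sketch below simulates the plain voter model on a small periodic square lattice (not the thesis' model) until one of the two absorbing consensus configurations is reached:

```python
import numpy as np

# Voter-model sketch: each update a random site copies the state of a random
# neighbour, so the two uniform configurations are absorbing.
rng = np.random.default_rng(42)
L = 16
spins = rng.choice([-1, 1], size=(L, L))

def mc_step(s):
    i, j = rng.integers(L, size=2)
    di, dj = [(1, 0), (-1, 0), (0, 1), (0, -1)][rng.integers(4)]
    s[i, j] = s[(i + di) % L, (j + dj) % L]    # copy a random neighbour (periodic)

steps = 0
while np.abs(spins.sum()) < L * L:             # stop at an absorbing (consensus) state
    mc_step(spins)
    steps += 1
    if steps > 2_000_000:                      # safety cut-off
        break

print(f"magnetization per site = {spins.sum() / L**2:+.2f} after {steps} updates")
```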
Tita, Volnei. "Contribuição ao estudo de danos e falhas progressivas em estruturas de material compósito polimérico." Universidade de São Paulo, 2003. http://www.teses.usp.br/teses/disponiveis/18/18135/tde-10092015-114215/.
Full text
In this work, material models were proposed to predict the mechanical behavior of composite structures. First, a study was carried out on intra-ply and inter-ply (delamination) damage in composite materials and on analytical and numerical approaches to progressive damage problems in composite structures. Then, specimens were manufactured and experimental tests (tensile, compression, shear and flexural) were performed. The experimental results and information from the literature were used to develop material models, which were implemented in FORTRAN and linked to a commercial finite element program (ABAQUS®) in order to evaluate and calibrate the model parameters. In a first step, computational simulations of the tensile and compression tests were carried out to evaluate the implemented material models. In a second step, the parameters of the material models were calibrated using three case studies (flexural, indentation and impact tests) with different stacking sequences. After that, a methodology was proposed to evaluate low-velocity impact problems in composite structures. This research project therefore not only presents new contributions but also suggests many future investigations.
Nakano, Alvaro. "Simulação de desempenho energético de tecnologias fotovoltaicas em fachada de edifício no município de São Paulo." Universidade de São Paulo, 2017. http://www.teses.usp.br/teses/disponiveis/3/3153/tde-10082017-112621/.
Full text
The technologies most commonly applied in the global photovoltaic market are rigid panels of crystalline silicon cells, due to the price reduction provided by economies of scale. However, the growing demand in the country for new housing units, mainly apartments, requires more appropriate solutions: with a smaller horizontal footprint and installed on the building facade. In this direction, photovoltaic technologies such as thin films are better suited than the emerging third-generation technologies, which are still in a technical maturation phase with few suppliers in the commercial market. Therefore, the scope of this dissertation was limited to thin-film technologies and semitransparent photovoltaic glazing, in addition to the more usual crystalline silicon. The dynamism of the global market has driven an evolution in the performance of these technologies, which justified the literature review. Furthermore, most PV projects have been based on rigid panels with crystalline silicon cells, without considering alternatives based on other technologies. What is noticeable is the designers' lack of deeper understanding of thin-film technologies and of the best options for their application in a building, in terms of performance and spectral behavior. Thus, this work aimed to contribute an energy performance analysis, based on simulations, for the technical decision on the most appropriate photovoltaic cell technologies for systems installed on building facades, aided by an existing software tool on the market, PVSYST. Decision making was viewed from the perspective of electricity generation performance, through a comparative analysis of simulation results for the facade of a hypothetical building in São Paulo. The results showed that technologies based on crystalline silicon are the most appropriate when the yearly peak of energy demand occurs in the summer, as in commercial buildings. On the other hand, systems composed of thin films of the copper indium selenide group are the most suitable for residential buildings, where the period of greatest demand is in the winter.
Cunha, Antonio Rodrigues da. "Estudos teórico e experimental de propriedades estruturais e eletrônicas da molécula emodina em solvente e em bicamadas lipídicas." Universidade de São Paulo, 2014. http://www.teses.usp.br/teses/disponiveis/43/43134/tde-02102014-142757/.
Full text
Emodin (EMH) is one of the most abundant anthraquinone derivatives found in nature. This molecule has been used widely as a research material due to its biological and pharmacological activities, such as antiviral, anticancer, antifungal, digestive and antibacterial activities. It is known that emodin in alkaline aqueous solution can undergo more than one deprotonation, the first one leading to the species EM-. In this PhD thesis, we studied the structural and electronic properties of this molecule in several solvents and in lipid bilayers, in order to characterize its UV-Vis absorption spectroscopy, reactivity and thermodynamics in these environments. Performing quantum mechanics (QM) calculations for all possible deprotonation sites and tautomeric isomers of emodin in vacuum and in water, we identified the sites of the first, second and third deprotonations. We calculated the pKa1 of emodin in water and the pK*a1 in methanol with the free energy perturbation method implemented in Monte Carlo simulations, and with QM calculations in which the solvent was treated as a polarizable continuum medium. Our best values for these solvents were 8.4±0.5 and 10.3±1.5, in very good agreement with the experimental values obtained in this thesis, pKa1 = 8.0±0.2 for water and pK*a1 = 11.1±0.1 for methanol. Additionally, we performed molecular dynamics simulations of both species in fully hydrated DMPC lipid bilayers to investigate, in atomic detail, the molecular mechanism of the interaction of these species with the lipid membrane and their preferred positions in this amphiphilic environment. These simulations show that both species of emodin have a strong tendency to insert into the lipid bilayer, remaining near the glycerol group of DMPC. These results corroborate our measured absorption spectra of the species in the bilayer, which qualitatively show that both species are inside the bilayer, in the lipid headgroup region. Our results also show that the effect of the EM- species on the lipid bilayer structure is stronger than that of EMH, which corroborates our DSC (Differential Scanning Calorimetry) measurements.
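A rough sketch of the free-energy ingredients behind such a pKa estimate, using the Zwanzig (free energy perturbation) estimator on synthetic energy differences and the standard thermodynamic-cycle relation between a relative deprotonation free energy and a pKa shift; all numerical inputs are illustrative, not the thesis' results:

```python
import numpy as np

# Free energy perturbation (Zwanzig) estimator plus the relative-pKa relation
# pKa = pKa_ref + ddG / (RT ln 10). All numbers below are assumed.
kB_T = 0.596            # kT in kcal/mol at ~300 K
R_T_ln10 = 1.372        # RT*ln(10) in kcal/mol at ~300 K

def zwanzig(delta_U):
    """Forward FEP estimate: dG = -kT ln < exp(-dU/kT) >_reference."""
    return -kB_T * np.log(np.mean(np.exp(-np.asarray(delta_U) / kB_T)))

rng = np.random.default_rng(0)
dU_samples = rng.normal(2.0, 0.8, 5000)     # assumed dU distribution [kcal/mol]
dG = zwanzig(dU_samples)

pKa_ref = 4.76                              # reference acid pKa (acetic acid)
ddG = 4.5                                   # assumed dG(solute) - dG(reference) [kcal/mol]
print(f"FEP free energy        : {dG:.2f} kcal/mol")
print(f"estimated pKa of solute: {pKa_ref + ddG / R_T_ln10:.1f}")
```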
Ievsieieva, Ievgeniia. "Simulação com o código GEANT4 de medida de espessura de revestimento metálico em metal por XRF." Universidade do Estado do Rio de Janeiro, 2012. http://www.bdtd.uerj.br/tde_busca/arquivo.php?codArquivo=4154.
Full text
Nesta dissertação são apresentados resultados de simulações Monte Carlo de fluorescência de raios X (XRF), utilizando o programa GEANT4, para medidas de espessura de revestimento metálico (Ni e Zn) em base metálica (Fe). As simulações foram feitas para duas espessuras de cada metal de revestimento (5 μm e 10 μm), com passos de 0,1 μm e 0,001 μm e com 10⁶ histórias. No cálculo da espessura do revestimento foram feitas as aproximações de feixe de raios X monoenergético, com a análise da transmissão apenas da energia do K-alfa e para uma geometria compatível com um sistema real de medição (ARTAX-200). Os resultados mostraram a eficiência da metodologia de simulação e do cálculo da espessura do revestimento, o que permitirá futuros cálculos, inclusive para multirrevestimentos metálicos em base metálica.
This dissertation presents results of Monte Carlo simulations of X-ray fluorescence (XRF), using GEANT4, for the determination of the thickness of metallic coatings (Ni and Zn) on a metallic base (Fe). The simulations were performed for two coating thicknesses of each metal (5 μm and 10 μm), with steps of 0.1 μm and 0.001 μm, and with 10⁶ histories. A monoenergetic X-ray approach was used, analyzing the transmission of the K-alpha line only, in a geometry compatible with a real measurement system (ARTAX-200). The results showed the efficiency of the simulation methodology as well as the ability to measure coating thickness by XRF, making an extension of this study to multilayer metallic coatings on a metallic base promising.
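As a back-of-the-envelope companion to the GEANT4 simulation, the sketch below inverts a Beer-Lambert attenuation of the substrate Fe K-alpha line to recover a coating thickness; the attenuation coefficient and take-off angle are assumed placeholders, not values from the dissertation:

```python
import numpy as np

# Simplified analytical check: the substrate Fe K-alpha intensity is
# attenuated by the coating, so the thickness follows from Beer-Lambert.
mu_rho = 90.0            # assumed mass attenuation coeff. of the coating at Fe K-alpha [cm^2/g]
rho = 8.9                # density of a Ni coating [g/cm^3]
theta = np.radians(45.0) # assumed take-off angle of the detected radiation

def thickness_um(I_ratio):
    """Coating thickness from the measured/unattenuated Fe K-alpha ratio."""
    t_cm = -np.log(I_ratio) * np.sin(theta) / (mu_rho * rho)
    return 1e4 * t_cm

for ratio in (0.9, 0.7, 0.5):
    print(f"I/I0 = {ratio:.1f}  ->  thickness ~ {thickness_um(ratio):.2f} um")
```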
Roque, Victor Raphael de Castro Mourão. "Simulações hidrodinâmicas relativísticas da transição de fase cosmológica quark-hadron." reponame:Repositório Institucional da UFABC, 2015.
Find full text
Tese (doutorado) - Universidade Federal do ABC, Programa de Pós-Graduação em Física, 2015.
Durante sua expansão inicial e consequente resfriamento o Universo passou por diversas transições. Uma delas, conhecida como transição de fase da QCD, prevista pelo modelo cosmológico padrão e pela física de partículas, ocorreu por volta de 10⁻⁵ s após o Big Bang, em temperaturas da ordem de 150-200 MeV, e atuou sobre quarks e glúons inicialmente em estado quase livre, confinando-os em hádrons. A maneira na qual esse processo se deu pode ter gerado inúmeras implicações nas fases subsequentes e relíquias a serem observadas atualmente. Com o intuito de entender esse processo, resolvemos numericamente as equações de Euler no contexto da relatividade restrita com um código numérico multidimensional desenvolvido durante o trabalho, baseado no método de diferenças finitas com esquemas de alta ordem para os casos hidrodinâmicos newtoniano e relativístico. Nele empregamos métodos de reconstrução espacial no espaço característico de até sétima ordem, três diferentes separadores de fluxo e esquemas Runge-Kutta de alta estabilidade de terceira e quarta ordem para evolução temporal. Para implementação do caso multidimensional, utilizamos o método "dimensionally unsplit". Os primeiros trabalhos a respeito dessa época partiam do pressuposto de que a transição era de primeira ordem e faziam uma análise semi-analítica da nucleação e colisão entre bolhas hadrônicas. Nesses trabalhos era conjecturado que a turbulência criada por esses mecanismos teria um perfil de Kolmogorov, ou alguma variação, e a partir disso calculava-se a radiação gravitacional produzida. Contudo, resultados obtidos pelos grandes experimentos de colisão de íons pesados no RHIC e no LHC, cujas condições geradas possuem algumas similaridades àquelas esperadas no Universo Primordial, e os cálculos feitos pela colaboração Wuppertal-Budapest com a teoria da QCD na rede sugerem que, pelos parâmetros provenientes do modelo padrão, essa transição foi analítica, caracterizada por uma transformação suave entre as fases. Nesse cenário, não há mecanismos intrínsecos da transição que possam transferir energia para as escalas maiores para produzir e manter a turbulência no fluido, formando assim um espectro diferente dos analisados em trabalhos anteriores e consequentemente uma outra evolução do plasma primordial. Através de análises estatísticas e espectrais, propusemos entender a dinâmica de um fluido primordial que passa por uma transição analítica, estudando a geração de ondas gravitacionais a partir da evolução de um fluido com estado inicial formado por distribuições randômicas de temperatura e velocidade e comparando-as com a curva de sensibilidade do eLISA/NGO. Procuramos entender também como a presença da instabilidade de Kelvin-Helmholtz bidimensional, engatilhada a partir das flutuações provenientes de outras eras, pode ter influenciado no crescimento de perturbações e da turbulência e suas consequências para o espectro da radiação gravitacional.
During its initial expansion and cooling, the Universe passed through several transitions. One of them, known as the QCD phase transition, occurred around 10⁻⁵ s after the Big Bang, at temperatures of the order of 150-200 MeV, confining into hadrons the quarks and gluons that were initially in a quasi-free state. The study of this transition is important for understanding the evolution of the Universe because, depending on how this process took place, we can expect several types of consequences for the subsequent phases as well as different observable relics. In order to understand this process, we numerically solved the Euler equations in the frame of special relativity with a multidimensional numerical code developed during this work, based on the finite-difference method with high-order schemes. We used spatial reconstruction methods in characteristic space up to seventh order, three different flux splittings and high-stability Runge-Kutta schemes of third and fourth order for the time evolution. To implement the multidimensional case, we used the dimensionally unsplit method. Most previous studies on this epoch were based on the assumption of a first-order transition; several focused on a semi-analytical analysis of the nucleation, growth and collision of bubbles and their relation to the generation of gravitational waves. In these works it was conjectured that the turbulence created by such mechanisms would have a Kolmogorov slope, or some variation of it, and from it the gravitational radiation was estimated. However, results obtained in large heavy-ion collision experiments at RHIC and LHC, whose conditions have some similarities to those expected in the Early Universe, and calculations made by the Wuppertal-Budapest collaboration with lattice QCD suggest that, with parameters from the standard cosmological model, the transition is a crossover characterized by a smooth transformation between phases. In this scenario, no intrinsic mechanisms of the transition can transfer energy to larger scales to produce and maintain turbulence in the fluid, thereby generating a spectrum different from that of previous works and, consequently, another evolution of the primordial plasma. Using statistical and spectral analyses, we study the dynamics of a primordial fluid passing through an analytic (crossover) transition. We study the gravitational waves generated from the motion of a fluid with an initial state consisting of random distributions of temperature and velocity, and compare the results with the sensitivity curve of eLISA/NGO. We also investigate how the presence of the Kelvin-Helmholtz instability may have influenced the growth of perturbations and turbulence, and analyze its consequences for the spectrum of gravitational radiation.
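To make the time-stepping machinery mentioned above concrete, here is a minimal sketch of a strong-stability-preserving third-order Runge-Kutta step combined with a simple upwind finite difference, applied to 1D linear advection rather than to the relativistic Euler equations actually solved in the thesis:

```python
import numpy as np

# SSP-RK3 (Shu-Osher) time stepping with first-order upwind differencing,
# illustrated on 1D linear advection with periodic boundaries.
N, L, a = 200, 1.0, 1.0
dx = L / N
dt = 0.4 * dx / a                       # CFL-limited time step
x = np.arange(N) * dx
u = np.exp(-200 * (x - 0.3)**2)         # initial Gaussian pulse

def rhs(u):
    # upwind flux difference for a > 0 (periodic boundaries)
    return -a * (u - np.roll(u, 1)) / dx

def ssp_rk3(u, dt):
    u1 = u + dt * rhs(u)
    u2 = 0.75 * u + 0.25 * (u1 + dt * rhs(u1))
    return u / 3.0 + 2.0 / 3.0 * (u2 + dt * rhs(u2))

t, t_end = 0.0, 0.5
while t < t_end:
    u = ssp_rk3(u, dt)
    t += dt

print(f"pulse peak after advecting {a*t:.2f} length units: {u.max():.3f}")
```

The high-order characteristic reconstructions cited in the abstract replace the diffusive upwind difference used here; the Runge-Kutta structure is the same.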
Ferrari, Guilherme Gonçalves. "Novos mapas simpléticos para integração de sistemas hamiltonianos com múltiplas escalas de tempo : enfoque em sistemas gravitacionais de N-corpos." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2015. http://hdl.handle.net/10183/127985.
Full text
Symplectic maps are well known for preserving the phase-space volume in Hamiltonian dynamics and are particularly suited to problems that require long integration times. In this thesis we develop approaches based on symplectic maps for the coupling of multiple sub-systems, astrophysical domains and simulation codes, for the efficient integration of self-gravitating N-body systems with large variations in characteristic time-scales. We establish a family of 48 new symplectic maps based on a recursive Hamiltonian splitting, which allows the coupling to occur in a hierarchical manner, thus covering all time-scales of the interactions involved. Our formulation is general enough for the method to be used as a recipe to combine different physical phenomena that can be modeled independently by specialized simulation codes. We also introduce a Keplerian-based Hamiltonian splitting for solving the general gravitational N-body problem as a composition of N² 2-body problems. The resulting method is precise for each individual 2-body solution and produces fast and accurate results for near-Keplerian N-body systems, such as planetary systems or a cluster of stars orbiting a supermassive black hole. The method is also suitable for the integration of N-body systems with intrinsic hierarchies, such as a star cluster with compact binaries. We present the implementation of these algorithms and describe our code tupan, which is publicly available at https://github.com/ggf84/tupan.
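A compact illustration of the simplest symplectic map for gravitational dynamics, the second-order kick-drift-kick leapfrog, tested on a circular two-body orbit in G = 1 units; this is background for the splittings discussed above, not the tupan implementation itself:

```python
import numpy as np

# Kick-drift-kick leapfrog: a second-order symplectic map for N-body dynamics.
def accelerations(pos, mass, eps=0.0):
    acc = np.zeros_like(pos)
    n = len(mass)
    for i in range(n):
        for j in range(n):
            if i != j:
                d = pos[j] - pos[i]
                acc[i] += mass[j] * d / (np.dot(d, d) + eps**2) ** 1.5
    return acc

def kick_drift_kick(pos, vel, mass, dt):
    vel = vel + 0.5 * dt * accelerations(pos, mass)   # half kick
    pos = pos + dt * vel                              # full drift
    vel = vel + 0.5 * dt * accelerations(pos, mass)   # half kick
    return pos, vel

# circular two-body test: total mass 1, separation 1 -> orbital period 2*pi
mass = np.array([0.5, 0.5])
pos = np.array([[-0.5, 0.0, 0.0], [0.5, 0.0, 0.0]])
vel = np.array([[0.0, -0.5, 0.0], [0.0, 0.5, 0.0]])

dt, steps = 0.01, int(2 * np.pi / 0.01)
for _ in range(steps):
    pos, vel = kick_drift_kick(pos, vel, mass, dt)

print("separation after one orbit:", np.linalg.norm(pos[1] - pos[0]))
```

Because the map is symplectic, the energy error remains bounded over many orbits instead of drifting, which is the property the thesis exploits at much larger scale.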
Pegoraro, Fábio. "APLICAÇÃO DA SIMULAÇÃO COMPUTACIONAL E TEORIA DAS RESTRIÇÕES (TOC) PARA A REDUÇÃO DO TEMPO DE ESPERA POR ATENDIMENTO DE URGÊNCIA E EMERGÊNCIA EM UM HOSPITAL DA REGIÃO SUL DO ESTADO DO TOCANTINS." Pontifícia Universidade Católica de Goiás, 2012. http://localhost:8080/tede/handle/tede/2435.
Full text
This work presents the results obtained with the application of computer simulation and the theory of constraints to reduce the waiting time for urgent care at the Hospital Regional de Gurupi (HRG). The simulation model focuses on patients in states of emergency and urgency. A patient in a state of emergency cannot remain in the queue and must be treated immediately, whereas a patient in a state of urgency may wait up to 30 minutes in a queue. The methodology is an exploratory case study supported by semi-structured interviews and questionnaires administered to the hospital's managers and physicians. The current configuration of the hospital (scenario 1) is simulated using the PROMODEL® software. Two additional scenarios are created following the continuous improvement process of the theory of constraints and are also simulated in PROMODEL®. It is observed that, through the application of computer simulation combined with the theory of constraints, the managers of the Emergency Room (ER) of the HRG obtain adequate information to manage it so as to reduce the waiting time for urgent and emergency care.
Este trabalho apresenta os resultados obtidos com a aplicação da simulação computacional e da teoria das restrições para reduzir o tempo de espera por atendimento de urgência no Hospital Regional de Gurupi (HRG). O modelo de simulação tem foco nos pacientes em estado de urgência e emergência. Paciente em estado de emergência não pode permanecer em fila de espera e deve ser atendido imediatamente, já o paciente em estado de urgência pode permanecer até 30 minutos em uma fila de espera. A metodologia usada é exploratória com estudo de caso apoiada em técnicas de entrevista semi-estruturada e questionários aplicados aos gestores e médicos do hospital. O modelo real (cenário 1) do hospital é simulado utilizando o software PROMODEL®. Mais dois cenários são criados seguindo o processo de melhoria contínua da teoria das restrições e simulados utilizando o software PROMODEL®. Observa-se que através da aplicação da simulação computacional aliada à teoria das restrições, os gestores do Pronto Socorro (PS) do HRG possuem informações adequadas para geri-lo de forma a reduzir o tempo de espera por atendimento de urgência e emergência.
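A toy queueing sketch of the kind of question the PROMODEL model addresses: how long urgency patients wait for one of a few doctors under assumed exponential arrival and service times (none of the rates below come from the HRG data):

```python
import random

# Toy queue: patients arrive, wait for the first free doctor, are served FIFO.
random.seed(7)
ARRIVAL_MEAN = 12.0     # mean minutes between urgency arrivals (assumed)
SERVICE_MEAN = 20.0     # mean consultation time in minutes (assumed)
DOCTORS = 2
N_PATIENTS = 5000

free_at = [0.0] * DOCTORS       # time at which each doctor becomes free
clock, waits = 0.0, []
for _ in range(N_PATIENTS):
    clock += random.expovariate(1.0 / ARRIVAL_MEAN)      # next arrival
    doctor = min(range(DOCTORS), key=lambda d: free_at[d])
    start = max(clock, free_at[doctor])                  # wait if doctor busy
    waits.append(start - clock)
    free_at[doctor] = start + random.expovariate(1.0 / SERVICE_MEAN)

print(f"mean wait      : {sum(waits)/len(waits):.1f} min")
print(f"waits > 30 min : {100*sum(w > 30 for w in waits)/len(waits):.1f}% of patients")
```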
Neves, Ubiraci Pereira da Costa. "Processos de polimerização e transição de colapso em polímeros ramificados." Universidade de São Paulo, 1997. http://www.teses.usp.br/teses/disponiveis/76/76131/tde-09102008-133038/.
Full text
The phase diagram and the tricritical point of a collapsing lattice animal are studied through an extended series expansion of the isothermal compressibility KT on a square lattice. As a function of the variables x (fugacity) and y = e^(1/T) (T is the reduced temperature), the series for KT is investigated using the partial differential approximants technique. The characteristic flow pattern of partial differential approximant trajectories is determined for a typical stable fixed point. We obtain satisfactory estimates for the tricritical fugacity xt = 0.024 ± 0.005 and temperature Tt = 0.54 ± 0.04. Taking into account only linear scaling fields, we are also able to obtain the scaling exponent γ = 1.4 ± 0.2 and the crossover exponent Φ = 0.66 ± 0.08. Our results are in good agreement with previous estimates from other methods. We also study ramified polymerization through computational simulations, on the square lattice, of a kinetic growth model generalized to incorporate branching and impurities. The polymer configuration is identified with a bond tree in order to examine its topology. The fractal dimensions of the clusters are obtained at criticality. The simulations also allow the study of the time evolution of the clusters as well as the determination of time autocorrelations and dynamical critical exponents. In regard to finite-size effects, a fourth-order cumulant technique is employed to estimate the critical branching probability bc and the critical exponents ν and β. In the absence of impurities, the surface roughness is described in terms of Hurst exponents. Finally, we simulate this kinetic growth model on the square lattice using a Monte Carlo approach in order to study ramified polymerization with short-range attractive interactions between monomers. The phase boundary separating the finite from the infinite growth regimes is obtained in the (T, b) space (T is the reduced temperature and b is the branching probability). In the thermodynamic limit, we extrapolate the temperature T = 0.102 ± 0.005 below which the phase is found to be always infinite. We also observe the occurrence of a roughening transition at the polymer surface.
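In the spirit of the kinetic growth model described above, the sketch below grows a branching polymer on the square lattice, where each active tip moves to an empty neighbour and branches with probability b; interactions, impurities and the precise growth rules of the thesis are not reproduced:

```python
import random

# Kinetic-growth sketch: active tips advance into empty neighbouring sites,
# branch with probability b, and die when trapped.
def grow(b, max_size=20000, seed=None):
    rng = random.Random(seed)
    occupied = {(0, 0)}
    tips = [(0, 0)]
    while tips and len(occupied) < max_size:
        new_tips = []
        for x, y in tips:
            empty = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                     if (x + dx, y + dy) not in occupied]
            if not empty:
                continue                       # this tip is trapped
            n_new = 2 if (rng.random() < b and len(empty) >= 2) else 1
            for site in rng.sample(empty, n_new):
                occupied.add(site)
                new_tips.append(site)
        tips = new_tips
    return len(occupied)

for b in (0.0, 0.3, 0.6):
    sizes = [grow(b, seed=s) for s in range(20)]
    print(f"b = {b:.1f}: mean cluster size ~ {sum(sizes)/len(sizes):8.1f}")
```

Sweeping b in this way gives a qualitative feel for the boundary between finite and infinite growth regimes discussed in the abstract.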
Gama, Ricardo Dias. "Estudo do comportamento de conectores e cabos do ponto de vista da compatibilidade eletromagnética." reponame:Repositório Institucional da UFABC, 2017.
Find full text
Dissertação (mestrado) - Universidade Federal do ABC, Programa de Pós-Graduação em Engenharia Elétrica, 2017.
Este trabalho consiste na exploração dos problemas e aplicações eletromagnéticas na área de conexão da eletrônica em altas taxas de velocidade de dados. Referências bibliográficas como as equações de Maxwell e suas derivações, técnicas de implementação de conectividades, simplificações como: linhas (striplines), microlinhas (microstrip), placas de circuito impresso empilhadas (stackup) serão abordadas. Também são encontradas bases teóricas como aterramento (grounding) e blindagem (shielding), para aplicações com cabos do tipo coaxial e pares de dados. Os efeitos estudados são reflexão e acoplamentos (crosstalk). Há uma comparação dos resultados em um sistema exemplo com cabo de rede ethernet e conectores para acoplamento e medição, entre simulações computacionais (CST – Computer Simulation Technology, mais exatamente no módulo MWS – Micro Wave Studio), e medições com um VNA (Vector Network Analyzer).
This study covers electromagnetic problems and applications in the area of electronic connections at high data rates. Maxwell's equations and their derivations concerning high-frequency effects, connectivity implementation techniques and simplifications such as striplines, microstrip lines and the stack-up of PCB layers are addressed. Theoretical bases are laid for applications with coaxial cables and twisted pairs and their respective grounding and shielding. The effects studied are reflection and crosstalk. Results are presented for an example consisting of a launcher and an Ethernet cable with its connector, comparing computer simulations (CST – Computer Simulation Technology, MWS – Micro Wave Studio module) with measurements using a VNA (Vector Network Analyzer).
Silva, Renata Costa da. "Avaliação de ferramenta de simulação da transmissão sonora para projetos de isolamento acústico em edificações habitacionais." Universidade Federal de Santa Maria, 2014. http://repositorio.ufsm.br/handle/1/7871.
Full textIn July 2013, Brazil's new standard NBR 15575:2013 began to require minimum thermal, lighting, structural, and acoustic performance levels in new residential buildings. Computer programs that simulate sound transmission in buildings can therefore serve as important tools for professionals, enabling the verification of the virtual acoustic performance of projects quickly and economically. Because they are widely used in European countries, these programs carry databases of the building elements and systems that exist in Europe. Given this, this study aimed to evaluate a computational tool for the simulation of sound transmission against the reality of Brazilian buildings. Twenty dwellings were selected in the city of Santa Maria - RS to be measured in situ and then simulated for airborne and impact noise. The dwellings were chosen according to standard and type of building system. For greater representativeness of Brazilian buildings, dwellings of low, medium, and high standard were chosen. The building systems chosen were structural masonry, with structural concrete or ceramic block walls, and reinforced concrete, with hollow-brick walls, all with massive concrete slabs. The software used for the simulations was SONarchitect, and all the characteristics of the building elements used were entered into the software's database. The values of Ln,T, Dn,T, and RT per frequency band, and their weighted values Ln,T,w, Dn,T,w, and mean RT, were obtained through measurements and simulation. The weighted measured and simulated values were similar; on the other hand, the values obtained per frequency band showed disagreement precisely at the lower frequencies, from 50 Hz to 100 Hz. A relationship was also found between impact sound insulation and the area of the partition, with a significant increase in sound insulation for partitions with larger areas. In addition, coefficients and safety factors were determined for each standard and building system to be applied to the simulation spectra, aiming to bring the simulated values closer to the measured ones. In parallel with the evaluation of the computational tool, questionnaires were sent to 150 professionals who develop acoustic projects in the country. The main objective of this step was to find out which tools and methods are most used by these professionals in the development of such projects. From the 31 received responses, it was determined that 20.88% of the professionals use a computational tool to develop designs. It was also possible to discover characteristics such as field of expertise and graduation year, the regions where they are located, what kind of projects they develop, and whether they have already developed projects based on the new standard.
Com a entrada em vigor, em julho de 2013, da nova norma brasileira, NBR 15575:2013, passaram a ser exigidos nas novas construções de uso multifamiliar níveis mínimos de desempenho, tais como térmico, ilumínico, estrutural e acústico. Dessa maneira, programas computacionais que simulam a transmissão sonora em edificações podem servir como ferramentas importantes aos profissionais, permitindo a verificação do desempenho acústico virtual dos projetos, de forma rápida e econômica. Por serem amplamente utilizados nos países europeus, possuem banco de dados dos elementos e sistemas construtivos existentes na Europa. A partir disso, esse trabalho teve como objetivo a avaliação de uma ferramenta computacional, de simulação da transmissão sonora, para a realidade das construções brasileiras. Foram selecionadas 20 residências, na cidade de Santa Maria - RS, para serem medidas in loco e depois simuladas aos ruídos aéreo e de impacto. As residências foram escolhidas de acordo com padrão e sistemas construtivos. Para maior representatividade das construções brasileiras, foram escolhidas residências de padrão baixo, médio e alto. Os sistemas construtivos escolhidos foram alvenaria estrutural, com paredes de bloco estrutural de concreto ou cerâmico; e concreto armado, com paredes de tijolo vazado; todos com laje maciça de concreto. O programa utilizado para as simulações foi o SONarchitect e todas as características dos elementos construtivos utilizados foram inseridas no banco de dados do software. Foram obtidos os valores de Ln,T, Dn,T e TR por banda de frequência, e seus valores ponderados Ln,T,w, Dn,T,w e TR médio, nas medições e simulações. Os valores ponderados medidos e simulados foram similares; por outro lado, os valores obtidos por banda de frequência apresentaram divergência em relação às frequências mais baixas, precisamente de 50 Hz a 100 Hz. Verificou-se também uma relação da perda de transmissão sonora ao ruído de impacto com o tamanho da área da partição, na qual ocorre um aumento significativo no isolamento sonoro em partições com áreas maiores. Além disso, foram determinados coeficientes e fatores de segurança, para cada padrão e sistema construtivo, a serem aplicados nos espectros das simulações, com o objetivo de aproximar os valores simulados dos valores medidos. Paralelamente à avaliação da ferramenta computacional, foram aplicados questionários com 150 profissionais que desenvolvem projetos acústicos no país. Essa etapa tinha como objetivo principal descobrir quais ferramentas e métodos são mais utilizados pelos profissionais no desenvolvimento desses projetos. A partir de 31 questionários respondidos, foi possível determinar que 20,88% dos profissionais utilizam alguma ferramenta computacional para desenvolver os projetos. Também foi possível descobrir algumas características desses profissionais como curso e ano de formação, quais as regiões em que estão inseridos, que tipo de projetos desenvolvem e se já desenvolveram projetos baseados na nova norma.
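As a rough illustration of how simulated and measured insulation spectra might be compared per frequency band and turned into a safety margin, the sketch below uses purely hypothetical band levels; the function names and numbers are illustrative and are not taken from the thesis.

```python
def band_deviations(measured, simulated):
    """Per-band difference (measured - simulated), in dB, for matching bands.

    `measured` and `simulated` map band centre frequency (Hz) to a level in dB.
    """
    return {f: measured[f] - simulated[f] for f in measured if f in simulated}

def safety_factor(deviations):
    """A simple safety margin: the largest amount by which the simulation
    over-predicted the measured insulation (0 if it never over-predicted)."""
    return max(0.0, -min(deviations.values()))

if __name__ == "__main__":
    # Purely illustrative numbers, not results from the thesis.
    measured  = {50: 32.0, 63: 35.5, 80: 37.0, 100: 39.0}
    simulated = {50: 35.0, 63: 37.0, 80: 37.5, 100: 39.2}
    dev = band_deviations(measured, simulated)
    print("per-band deviation (dB):", dev)
    print("safety margin to subtract from simulated spectra (dB):", safety_factor(dev))
```

The thesis derives such coefficients separately for each construction standard and building system; the sketch only shows the per-band bookkeeping involved.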
Lima, Davi Monteiro Santos de Barros. "Estudo paramétrico do processo de injeção de solventes em poços horizontais para reservatórios de óleos pesados." Universidade Federal do Rio Grande do Norte, 2011. http://repositorio.ufrn.br:8080/jspui/handle/123456789/12959.
Full textCoordenação de Aperfeiçoamento de Pessoal de Nível Superior
The world has many types of oil with a wide range of density and viscosity values, characteristics used to identify whether an oil is light, heavy, or even ultra-heavy. The occurrence of heavy oil has increased significantly, pointing to a need for greater investment in the exploitation of these deposits and, consequently, for new methods to recover this oil. Economic forecasts indicate that by 2025 heavy oil will be the main source of fossil energy in the world. One such method is the use of vaporized solvent, known as VAPEX, a recovery method that consists of two horizontal wells parallel to each other, one injector and one producer, in which a vaporized solvent is injected in order to reduce the viscosity of the oil or bitumen, facilitating its flow to the producing well. This method was proposed by Dr. Roger Butler in 1991. The purpose of this study is to analyze how some operational and reservoir parameters that are important in the VAPEX process influence the cumulative oil produced, the recovery factor, the injection rate, and the production rate. Parameters such as injection rate, spacing between wells, type of injected solvent, vertical permeability, and oil viscosity were addressed in this study. The results showed that the oil viscosity is the parameter with the most statistically significant influence; in addition, choosing heptane as the injected solvent yielded a greater oil recovery than the other solvents considered, and, regarding the spacing between wells, it was shown that a greater distance between the wells produces more oil.
Existem no mundo diversos tipos de óleo que apresentam uma diversidade de valores de densidade e viscosidade, essas são características para identificar se um óleo é leve, pesado ou até mesmo ultrapesado. A ocorrência de óleo pesado vem aumentando sensivelmente e apontando uma necessidade de maiores investimentos na exploração de jazidas e consequentemente em novos métodos de recuperação desse óleo. Existem previsões econômicas de que, para o ano 2025, o óleo pesado seja a principal fonte de energia fóssil no mundo. Um desses novos métodos seria a utilização de solvente vaporizado conhecido como VAPEX, que é um método de recuperação que consiste em dois poços horizontais paralelos entre si, sendo um injetor e outro produtor, que utiliza como injeção solvente vaporizado que tem como propósito reduzir a viscosidade do óleo ou betume, facilitando o escoamento até o poço produtor. Esse método foi proposto por Dr. Roger Butler, em 1991. A importância do presente estudo é analisar como influenciam alguns parâmetros operacionais e de reservatório, importantes no processo VAPEX, tais como o acúmulo de óleo produzido, no fator de recuperação, na vazão de injeção e na taxa de produção. Parâmetros como vazão de injeção, espaçamento entre os poços, tipo do solvente a ser injetado, permeabilidade vertical e a viscosidade do óleo foram abordados neste estudo. Os resultados mostraram que a viscosidade do óleo foi o parâmetro que mais mostrou influência significativa estatisticamente; em seguida, a escolha do heptano como solvente a ser injetado mostrou uma maior recuperação de óleo em relação aos demais solventes escolhidos. Considerando o espaçamento entre os poços, foi mostrado que para uma maior distância entre os poços há uma maior produção de óleo.
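A parametric study like the one described is often organized as a two-level factorial screening over the chosen parameters. The sketch below only enumerates such a design; the level values are illustrative assumptions and run_vapex_case() is a hypothetical placeholder for whatever reservoir simulator a study like this would actually call.

```python
from itertools import product

# Two-level (low/high) screening of the parameters discussed above.
# All level values are illustrative, not taken from the thesis.
LEVELS = {
    "injection_rate":        (50.0, 200.0),        # m3/day (illustrative)
    "well_spacing":          (50.0, 100.0),        # m
    "solvent":               ("propane", "heptane"),
    "vertical_permeability": (100.0, 1000.0),      # mD
    "oil_viscosity":         (1_000.0, 10_000.0),  # cP
}

def run_vapex_case(**params):
    """Hypothetical hook where a reservoir-simulation run would go."""
    raise NotImplementedError("replace with a call to the actual simulator")

def full_factorial(levels):
    """Enumerate every low/high combination of the screening design."""
    names = list(levels)
    for combo in product(*(levels[n] for n in names)):
        yield dict(zip(names, combo))

if __name__ == "__main__":
    cases = list(full_factorial(LEVELS))
    print(f"{len(cases)} cases in the 2^5 screening design")
```

Ranking the simulated responses of such a design is what allows statements like "oil viscosity has the most statistically significant influence" to be made.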
Aguiar, Matheus Araújo. "Um agente jogador de GO com busca em árvore Monte-Carlo aprimorada por memória esparsamente distribuída." Universidade Federal de Uberlândia, 2013. https://repositorio.ufu.br/handle/123456789/12547.
Full textWith more than 4000 years of history, the game of Go is currently one of the most popular board games and represents a great challenge for Artificial Intelligence. Despite its simple rules, the techniques that previously succeeded in other games such as chess and checkers cannot deal satisfactorily with the complex patterns and behaviors that emerge during a game of Go. The present work implements SDM-Go, a competitive Go-playing agent that seeks to reduce the use of supervision in the search for the best move. SDM-Go employs the sparse distributed memory model as an additional resource to the Monte Carlo tree search used by many of the best current automatic players. Based on the open-source player Fuego, SDM-Go's use of sparse distributed memory aims to be an alternative to the strongly supervised process used by that agent. The Monte Carlo tree search performed by the Fuego player uses a set of heuristics coded by human professionals to guide the simulations and also to evaluate new nodes found in the tree. In contrast, SDM-Go implements an unsupervised, domain-independent approach, in which the history of the values of board states previously visited during the search is used to evaluate new board states (nodes of the search tree). In this way, SDM-Go reduces the supervision of the Fuego agent, replacing its heuristics with the sparse distributed memory, which works as a repository of the history of visited board states. Thus, the contributions of SDM-Go are: (1) the use of a sparse distributed memory to replace Fuego's supervised approach for the prior evaluation of new nodes found in the tree; (2) the implementation of a board representation based on bit vectors, so that system performance is not compromised by the boards stored in memory; (3) the extension of the use of the results of the Monte Carlo simulations to update the values of the boards stored in memory. Unlike many other current agents, the use of sparse distributed memory represents a domain-independent approach. The results obtained in tournaments against the well-known open-source agent Fuego show that SDM-Go successfully accomplishes the task of providing a domain-independent, unsupervised approach for the prior evaluation of new nodes found in the search tree. Despite the longer processing time required by the use of the sparse distributed memory, which is central to the agent's performance, SDM-Go maintains a competitive level of play, especially on the 9x9 board.
Master's degree in Computer Science
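To make the sparse-distributed-memory idea concrete, the following toy sketch stores board states as bit vectors and averages the values of previously visited boards within a Hamming-distance radius to provide a prior for a new search-tree node. It only illustrates the concept; the board encoding, radius, and class layout are assumptions, not the data structure used by SDM-Go itself.

```python
import random

class SparseDistributedMemory:
    """Toy sparse distributed memory over fixed-length bit vectors.

    Stored board states within a Hamming-distance radius of a query
    contribute their value estimates; the read is their average.
    """
    def __init__(self, radius=8):
        self.radius = radius
        self.entries = []                  # list of [bits, value_sum, visits]

    @staticmethod
    def hamming(a, b):
        return bin(a ^ b).count("1")

    def write(self, bits, value):
        """Store a board (as an int bit vector) with a simulation outcome."""
        self.entries.append([bits, value, 1])

    def read(self, bits):
        """Average value of stored boards within the activation radius."""
        close = [v / n for b, v, n in self.entries
                 if self.hamming(b, bits) <= self.radius]
        return sum(close) / len(close) if close else 0.5   # neutral prior

if __name__ == "__main__":
    rng = random.Random(1)
    sdm = SparseDistributedMemory(radius=10)
    board = rng.getrandbits(81)            # a 9x9 board encoded as a bit vector (toy encoding)
    sdm.write(board, 1.0)                  # a simulated win from this position
    similar = board ^ (1 << 3) ^ (1 << 40) # a position differing in two bits
    print("prior value for similar board:", sdm.read(similar))
```

The design choice mirrored here is the one the abstract emphasizes: instead of hand-coded heuristics, previously visited positions supply the prior evaluation of newly expanded nodes.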
Martins, Keyll Carlos Ribeiro. "Análises experimental, teórica e computacional do escoamento dos gases de exaustão no conversor catalítico platina/paládio instalado em um motor de combustão interna a etanol." Universidade de São Paulo, 2006. http://www.teses.usp.br/teses/disponiveis/18/18135/tde-21122006-092443/.
Full textThe thesis describes computational, experimental, and theoretical analyses of the exhaust gas flow through a Pt/Pd catalytic converter installed on an internal combustion engine burning ethanol. The values related to energy properties and mass transport obtained in the experimental tests form the basis of the initial values used to develop the computational simulation with the CFX and MFIX software packages. The programs solve a set of conservation equations that allow the analysis of variables such as pressure, temperature, and velocity of the exhaust gases along the monolithic support, as well as the evaluation of the formation of gases and their emission levels resulting from the combustion of the incoming air-fuel mixture and its respective oxidation reactions. The catalytic efficiency through the first monolith of the converter was about 11% for THC, 100% for NOx, and 20% for CO. A small difference of 1.2% between the experimental and the simulated average velocity was also verified. In addition, mathematical models were applied to the study of the pressure drop in the catalytic converter and of the diffusion-reaction behind the consumption and formation of chemical species.
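As a simplified picture of species consumption along a monolith channel, the sketch below uses the standard first-order plug-flow conversion relation X = 1 - exp(-kL/u). The rate constants and dimensions are illustrative assumptions only and are not the CFX/MFIX models used in the thesis.

```python
import math

def plug_flow_conversion(k, length, velocity):
    """Fractional conversion of a species in an isothermal plug-flow channel
    with first-order rate constant k (1/s), channel length (m), and gas
    velocity (m/s): X = 1 - exp(-k * length / velocity)."""
    return 1.0 - math.exp(-k * length / velocity)

if __name__ == "__main__":
    # Illustrative numbers only; effective rate constants for Pt/Pd washcoats
    # depend strongly on temperature and are not taken from the thesis.
    for k in (5.0, 20.0, 80.0):
        x = plug_flow_conversion(k=k, length=0.15, velocity=10.0)
        print(f"k = {k:5.1f} 1/s -> conversion = {100 * x:5.1f} %")
```

A full CFD model couples such kinetics to the momentum and energy equations along the monolith, which is what yields the species-dependent efficiencies reported in the abstract.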
Larson, Kajsa. "On perfect simulation and EM estimation." Doctoral thesis, Umeå : Department of Mathematics and Mathematical Statistics, Umeå University, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-33779.
Full text