
Dissertations / Theses on the topic 'CARTO 3'



Consult the top 50 dissertations / theses for your research on the topic 'CARTO 3.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Fioravanti, Matteo. "Sviluppo di tecniche di elaborazione di dati elettroanatomici per l'analisi dei pattern di attivazione elettrica in fibrillazione atriale." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2017.

Find full text
Abstract:
Although it is clear that the trigger of atrial fibrillation (mainly paroxysmal) is attributable to focal sources located predominantly near the pulmonary veins (in 94% of cases), the mechanisms that sustain this cardiac rhythm disorder are still highly debated. Among the theories reported in the literature, one of the most widely accepted is the rotor theory, according to which the maintenance of AF is attributed to spiral rotation patterns capable of persisting, as observed in the well-known CONFIRM study, for longer than 10 minutes. This thesis therefore aims to contribute to the rotor theory through the construction of phase maps obtained by processing intracavitary signals acquired during transcatheter ablation procedures with the CARTO 3 electroanatomical mapping system and the PentaRay diagnostic catheter (Biosense Webster). After introducing the pathophysiology of atrial fibrillation, with particular attention to the rotor theory, in Chapter 1, and the transcatheter ablation procedure, focusing on CARTO 3 as a real-time mapping system, in Chapter 2, the processing applied to the acquired unipolar intracardiac signals to obtain phase maps is presented; from these maps the possible presence of rotors was investigated in 6 patients, mainly affected by persistent AF (Chapter 3). Finally, the last chapter analyses the results, assessing not only the possible presence of rotors in the regions of the atrial chamber where they have most often been reported in the literature, but also the suitability of the PentaRay catheter for investigating the dynamics underlying spiral rotation patterns.
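The abstract does not spell out how the phase maps are computed, but rotor analyses of this kind are commonly built on the instantaneous phase of each electrogram. The sketch below is a minimal Python illustration of that idea, assuming a Hilbert-transform phase estimate and a closed ring of electrodes; the band-pass limits, sampling rate, electrode count and synthetic signals are illustrative assumptions, not values taken from the thesis.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def instantaneous_phase(egm, fs, band=(4.0, 10.0)):
    """Instantaneous phase of one unipolar electrogram via the Hilbert transform.
    The 4-10 Hz band-pass (around typical AF rates) is an assumed pre-processing
    step, not necessarily the chain used in the thesis."""
    sos = butter(3, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band", output="sos")
    x = sosfiltfilt(sos, egm - np.mean(egm))
    return np.angle(hilbert(x))                 # phase in (-pi, pi] per sample

def winding_number(phase_ring):
    """Sum of wrapped phase differences around a closed loop of electrodes;
    a value near +/-2*pi at a given instant is the usual rotor signature."""
    d = np.diff(np.concatenate([phase_ring, phase_ring[:1]]))
    return ((d + np.pi) % (2 * np.pi) - np.pi).sum()

# Toy data: 20 electrodes, 4 s at 1 kHz, with a synthetic rotating activation.
fs = 1000.0
t = np.arange(0, 4.0, 1.0 / fs)
egms = np.sin(2 * np.pi * 6.0 * t + np.linspace(0, 2 * np.pi, 20)[:, None])
phases = np.array([instantaneous_phase(e, fs) for e in egms])
print(winding_number(phases[:, 2000]))          # close to +2*pi for this pattern
```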
APA, Harvard, Vancouver, ISO, and other styles
2

Yilmaz, Ozhan. "Collaboration Among Small Shippers In Cargo Transportation." Master's thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/3/12611316/index.pdf.

Full text
Abstract:
As a result of the widespread and effective use of the internet, firms tend to collaborate to reduce their operating costs. This thesis analyzes collaboration opportunities for a group of small shippers. A transportation intermediary that determines the optimal actions for arriving shippers, together with a mechanism that allocates the resulting savings to the shippers, is proposed in the thesis. The performance of the intermediary is assessed through computational analyses. An experimental set is formed by varying the parameters that are expected to significantly affect the optimal policy structure and the budget surplus (or deficit). It is seen that increasing variable costs, such as the cross-assignment cost and the waiting cost, increases the comparative performance of the optimal policy over the naïve policy, which is defined by a simple rule, whereas increasing the dispatching cost, which can be considered a fixed cost, leads to the opposite result. The performance of the optimal policy is also assessed against a myopic policy, in which shippers try to maximize their own benefit without considering the overall benefit of the grand coalition.
APA, Harvard, Vancouver, ISO, and other styles
3

Can, Mutan Oya. "Comparison Of Regression Techniques Via Monte Carlo Simulation." Master's thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/3/12605175/index.pdf.

Full text
Abstract:
The ordinary least squares (OLS) is one of the most widely used methods for modelling the functional relationship between variables. However, this estimation procedure counts on some assumptions and the violation of these assumptions may lead to nonrobust estimates. In this study, the simple linear regression model is investigated for conditions in which the distribution of the error terms is Generalised Logistic. Some robust and nonparametric methods such as modified maximum likelihood (MML), least absolute deviations (LAD), Winsorized least squares, least trimmed squares (LTS), Theil and weighted Theil are compared via computer simulation. In order to evaluate the estimator performance, mean, variance, bias, mean square error (MSE) and relative mean square error (RMSE) are computed.
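As a rough illustration of the kind of Monte Carlo comparison the abstract describes, the sketch below simulates a simple linear regression many times and compares the bias, variance and MSE of the OLS slope against the Theil (Theil-Sen) slope. The setup is assumed for illustration only: the thesis uses the Generalised Logistic error family and several additional estimators (MML, LAD, Winsorized and trimmed least squares), while the sketch uses standard logistic errors and two estimators for brevity.

```python
import numpy as np
from scipy.stats import theilslopes

rng = np.random.default_rng(42)
n, reps = 50, 2000
beta0, beta1 = 1.0, 2.0
x = np.linspace(0.0, 1.0, n)

slopes = {"OLS": [], "Theil": []}
for _ in range(reps):
    # Standard logistic errors as a simple stand-in for the Generalised Logistic family
    y = beta0 + beta1 * x + rng.logistic(scale=1.0, size=n)
    slopes["OLS"].append(np.polyfit(x, y, 1)[0])
    slopes["Theil"].append(theilslopes(y, x)[0])

for name, b in slopes.items():
    b = np.asarray(b)
    print(f"{name:5s}  bias={b.mean() - beta1:+.4f}  var={b.var():.4f}  "
          f"MSE={np.mean((b - beta1) ** 2):.4f}")
```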
APA, Harvard, Vancouver, ISO, and other styles
4

Angelino, Elaine Lee. "Accelerating Markov chain Monte Carlo via parallel predictive prefetching." Thesis, Harvard University, 2014. http://nrs.harvard.edu/urn-3:HUL.InstRepos:13070022.

Full text
Abstract:
We present a general framework for accelerating a large class of widely used Markov chain Monte Carlo (MCMC) algorithms. This dissertation demonstrates that MCMC inference can be accelerated in a model of parallel computation that uses speculation to predict and complete computational work ahead of when it is known to be useful. By exploiting fast, iterative approximations to the target density, we can speculatively evaluate many potential future steps of the chain in parallel. In Bayesian inference problems, this approach can accelerate sampling from the target distribution, without compromising exactness, by exploiting subsets of data. It takes advantage of whatever parallel resources are available, but produces results exactly equivalent to standard serial execution. In the initial burn-in phase of chain evaluation, it achieves speedup over serial evaluation that is close to linear in the number of available cores.
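As a hedged sketch of the prefetching idea described above (not the thesis code, and without the predictive weighting based on fast approximations that gives the method its name), the snippet below speculatively generates every proposal a random-walk Metropolis chain could need over the next few steps, evaluates their densities in parallel, and then walks the accept/reject tree serially. The toy target, step size and horizon are illustrative assumptions.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def log_target(x):
    # Toy target (standard normal); stands in for an expensive posterior whose
    # evaluation dominates the cost of each MCMC step.
    return -0.5 * float(np.dot(x, x))

def prefetching_rwm(x0, n_iters, horizon=3, step=0.5, seed=0):
    """Random-walk Metropolis in which, every `horizon` steps, all proposals that
    could possibly be needed are generated up front and their log-densities are
    evaluated in parallel ("prefetched") before the accept/reject path is walked
    serially. This is the naive, non-predictive variant of the idea."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    lp = log_target(x)
    chain = []
    with ProcessPoolExecutor() as pool:
        while len(chain) < n_iters:
            # Binary tree of reachable states: node k has children 2k+1 (its
            # proposal is accepted) and 2k+2 (its proposal is rejected).
            states, proposals = {0: x}, {}
            for k in range(2 ** horizon - 1):
                proposals[k] = states[k] + step * rng.standard_normal(x.shape)
                states[2 * k + 1] = proposals[k]
                states[2 * k + 2] = states[k]
            # Prefetch: evaluate all 2**horizon - 1 proposal densities at once.
            lps = dict(zip(proposals, pool.map(log_target, proposals.values())))
            # Consume: walk down the tree serially using the cached values.
            k = 0
            for _ in range(horizon):
                if len(chain) == n_iters:
                    break
                if np.log(rng.uniform()) < lps[k] - lp:
                    x, lp, k = proposals[k], lps[k], 2 * k + 1
                else:
                    k = 2 * k + 2
                chain.append(x)
    return np.array(chain)

if __name__ == "__main__":
    samples = prefetching_rwm(np.zeros(2), n_iters=2000, horizon=3)
    print(samples.mean(axis=0), samples.std(axis=0))
```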
Engineering and Applied Sciences
APA, Harvard, Vancouver, ISO, and other styles
5

Tak, Hyung Suk. "Topics in Bayesian Hierarchical Modeling and its Monte Carlo Computations." Thesis, Harvard University, 2016. http://nrs.harvard.edu/urn-3:HUL.InstRepos:33493573.

Full text
Abstract:
The first chapter addresses a Beta-Binomial-Logit model that is a Beta-Binomial conjugate hierarchical model with covariate information incorporated via a logistic regression. Various researchers in the literature have unknowingly used improper posterior distributions or have given incorrect statements about posterior propriety because checking posterior propriety can be challenging due to the complicated functional form of a Beta-Binomial-Logit model. We derive data-dependent necessary and sufficient conditions for posterior propriety within a class of hyper-prior distributions that encompass those used in previous studies. Frequency coverage properties of several hyper-prior distributions are also investigated to see when and whether Bayesian interval estimates of random effects meet their nominal confidence levels. The second chapter deals with a time delay estimation problem in astrophysics. When the gravitational field of an intervening galaxy between a quasar and the Earth is strong enough to split light into two or more images, the time delay is defined as the difference between their travel times. The time delay can be used to constrain cosmological parameters and can be inferred from the time series of brightness data of each image. To estimate the time delay, we construct a Gaussian hierarchical model based on a state-space representation for irregularly observed time series generated by a latent continuous-time Ornstein-Uhlenbeck process. Our Bayesian approach jointly infers model parameters via a Gibbs sampler. We also introduce a profile likelihood of the time delay as an approximation of its marginal posterior distribution. The last chapter specifies a repelling-attracting Metropolis algorithm, a new Markov chain Monte Carlo method to explore multi-modal distributions in a simple and fast manner. This algorithm is essentially a Metropolis-Hastings algorithm with a proposal that consists of a downhill move in density that aims to make local modes repelling, followed by an uphill move in density that aims to make local modes attracting. The downhill move is achieved via a reciprocal Metropolis ratio so that the algorithm prefers downward movement. The uphill move does the opposite using the standard Metropolis ratio which prefers upward movement. This down-up movement in density increases the probability of a proposed move to a different mode.
Statistics
APA, Harvard, Vancouver, ISO, and other styles
6

Hellermann, Rolf. "Capacity options for revenue management : theory and applications in the air cargo industry /." Berlin : Springer, 2006. http://dx.doi.org/10.1007/3-540-34420-9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Vale, Rodrigo Telles da Silva. "Localização de Monte Carlo aplicada a robôs submarinos." Universidade de São Paulo, 2014. http://www.teses.usp.br/teses/disponiveis/3/3152/tde-26082015-162614/.

Full text
Abstract:
The task of operating an underwater vehicle during inspection missions in man-made structured environments, such as the conduits of hydroelectric power plants, is performed mostly by means of visual references and a magnetic compass. Some environments of this kind, however, present a combination of low visibility and ferromagnetic anomalies that rules out this type of operation. This work, motivated by the development of a remotely operated underwater vehicle (ROV) intended for such environments, proposes a navigation system that uses prior knowledge of the environment's dimensions to correct the vehicle state by correlating those dimensions with the data from a 2D imaging sonar. This correlation is performed with a particle filter, a nonparametric implementation of the Bayes filter that estimates the state using sequential Monte Carlo methods and handles nonlinear models in a simple way. The drawback of this kind of sensor fusion is its high computational cost, which usually prevents it from being used in real-time applications. To make real-time use of the filter possible, this work proposes a parallel implementation using an NVIDIA graphics processing unit (GPU) and the CUDA architecture. A study of two sensor configurations within the proposed navigation system is also presented.
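For readers unfamiliar with the method in the title, the sketch below shows one predict-weight-resample cycle of a generic particle filter (Monte Carlo localization) in Python. It is a deliberately simplified stand-in: the map model, noise levels and the single range measurement are assumptions for illustration, whereas the thesis correlates full 2D imaging-sonar data with a known map and runs the filter on a GPU.

```python
import numpy as np

def mcl_step(particles, weights, control, meas, map_range, rng,
             motion_noise=0.05, meas_noise=0.2):
    """One predict-weight-resample cycle of a particle filter (Monte Carlo
    localization). `map_range(p)` is the range the sonar would return from
    pose p against the known map."""
    # 1. Predict: propagate each particle through the (noisy) motion model.
    particles = particles + control + motion_noise * rng.standard_normal(particles.shape)
    # 2. Weight: likelihood of the actual measurement under each particle.
    expected = np.apply_along_axis(map_range, 1, particles)
    weights = np.exp(-0.5 * ((meas - expected) / meas_noise) ** 2) + 1e-300
    weights /= weights.sum()
    # 3. Resample: draw particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Toy usage: localize in x-y with a single wall at x = 5 m.
rng = np.random.default_rng(0)
particles = rng.uniform(0.0, 5.0, size=(500, 2))
weights = np.full(500, 1.0 / 500)
particles, weights = mcl_step(particles, weights, control=np.array([0.1, 0.0]),
                              meas=4.0, map_range=lambda p: 5.0 - p[0], rng=rng)
print(particles[:, 0].mean())   # concentrates near x = 1 m (5 m wall - 4 m range)
```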
APA, Harvard, Vancouver, ISO, and other styles
8

Hayward, Robert M. "A coarse mesh transport method for photons and electrons in 3-D." Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/51928.

Full text
Abstract:
A hybrid stochastic-deterministic method, COMET-PE, is developed for dose calculation in radiotherapy. Fast, accurate dose calculation is a key component of successful radiotherapy treatment. To calculate dose, COMET-PE solves the coupled Boltzmann Transport Equations for photons and electrons. The method uses a deterministic iteration to compose response functions that are pre-computed using Monte Carlo. Thus, COMET-PE takes advantage of Monte Carlo physics without incurring the computational costs typically required for statistical convergence. This work extends the method to 3-D problems with realistic source distributions. Additionally, the performance of the deterministic solver is improved, taking advantage of both shared-memory and distributed-memory parallelism to enhance efficiency. To verify the method’s accuracy, it is compared with the DOSXYZnrc (Monte Carlo) method using three different benchmark problems: a heterogeneous slab phantom, a water phantom, and a CT-based lung phantom. For the slab phantom, all errors are less than 1.5% of the maximum dose or less than 3% of local dose. For both the water phantom and the lung phantom, over 97% of voxels receiving greater than 10% of the maximum dose pass a 2% (relative error) / 2 mm (distance-to-agreement) test. Timing comparisons show that COMET-PE is roughly 10-30 times faster than DOSXYZnrc. Thus, the new method provides a fast, accurate alternative to Monte Carlo for dose calculation in radiotherapy treatment planning.
APA, Harvard, Vancouver, ISO, and other styles
9

Nakamura, Y., and J. W. Tucker. "Monte Carlo Study of a Mixed Spin-1 and Spin-3/2 Ising Ferromagnet." IEEE, 2002. http://hdl.handle.net/2237/7158.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Rety, Stephanie R. "A 3-D Monte Carlo Radiative Transfer Model for the Disk of Gamma Cassiopeiae." University of Toledo / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1278970432.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Wang, Lazhi. "Methods in Monte Carlo Computation, Astrophysical Data Analysis and Hypothesis Testing With Multiply-Imputed Data." Thesis, Harvard University, 2015. http://nrs.harvard.edu/urn-3:HUL.InstRepos:17463134.

Full text
Abstract:
We present three topics in this thesis: the next generation warp bridge sampling, Bayesian methods for modeling source intensities, and large-sample hypothesis testing procedures in multiple imputation. Bridge sampling is an effective Monte Carlo method to estimate the ratio of the normalizing constants of two densities. The Monte Carlo errors of the estimator are directly controlled by the overlap between the densities. In Chapter 1, we generalize the warp transformations in Meng and Schilling (2002), and introduce a class of stochastic transformation, called warp-U transformation, which aims at increasing the overlap of the densities of the transformed data without altering the normalizing constants. Warp-U transformation is determined by a Gaussian mixture distribution, which has reasonable amount of overlap with the density of unknown normalizing constant. We show warp-U transformation reduces the f-divergence of two densities, thus bridge sampling with warp-U transformed data has better statistical efficiency than that based on the original data. We then propose a computationally efficient method to find a Gaussian mixture distribution and investigate the performance of the corresponding warp-U bridge sampling. Finally, theoretical and simulation results are provided to shed light on how to choose the tuning parameters in the algorithm. In Chapter 2, we propose a Bayesian hierarchical model to study the distribution of the X-ray intensities of stellar sources. One novelty of the model is its use of a zero-inflated gamma distribution for the source intensities to reflect the possibility of “dark” sources with practically zero luminosity. To quantify the evidence for “dark” sources, we develop a Bayesian hypothesis testing procedure based on the posterior predictive p-value. Statistical properties of the model and the test are investigated via simulation. Finally, we apply our method to a real dataset from Chandra. Chapter 3 presents large-sample hypothesis testing procedures in multiple imputation, a common practice to handle missing data. Several procedures are classified, discussed, and compared in details. We also provide an improvement of a Wald-type procedure and investigate a practical issue of the likelihood-ratio based procedure.
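The first chapter builds on the standard bridge sampling estimator of a ratio of normalizing constants, which the sketch below implements in its common iterative form (without the warp-U transformation that is the chapter's contribution). The Gaussian example and sample sizes are assumptions chosen so that the true ratio (0.5) is known.

```python
import numpy as np

def bridge_ratio(log_q1, log_q2, x1, x2, iters=200):
    """Iterative (Meng-Wong style) bridge sampling estimate of r = c1/c2,
    where q1, q2 are unnormalised densities, x1 ~ p1 and x2 ~ p2."""
    n1, n2 = len(x1), len(x2)
    s1, s2 = n1 / (n1 + n2), n2 / (n1 + n2)
    l1 = np.exp(log_q1(x1) - log_q2(x1))   # q1/q2 at draws from p1
    l2 = np.exp(log_q1(x2) - log_q2(x2))   # q1/q2 at draws from p2
    r = 1.0
    for _ in range(iters):
        r = np.mean(l2 / (s1 * l2 + s2 * r)) / np.mean(1.0 / (s1 * l1 + s2 * r))
    return r

rng = np.random.default_rng(1)
x1 = rng.normal(0.0, 1.0, 5000)            # draws from p1 = N(0, 1)
x2 = rng.normal(0.0, 2.0, 5000)            # draws from p2 = N(0, 4)
log_q1 = lambda x: -0.5 * x**2             # c1 = sqrt(2*pi) ~ 2.507
log_q2 = lambda x: -x**2 / 8.0             # c2 = sqrt(8*pi) ~ 5.013
print(bridge_ratio(log_q1, log_q2, x1, x2))  # should be close to 0.5
```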
Statistics
APA, Harvard, Vancouver, ISO, and other styles
12

Leite, Luiz Felipe de Queiroga Aguiar. "O canto do bode humano: exílio e estranheza na ambivalência trágica da Galileia contemporânea." Universidade Estadual da Paraíba, 2017. http://tede.bc.uepb.edu.br/jspui/handle/tede/2942.

Full text
Abstract:
The present work studies exile and strangeness as one of the most tragic experiences of our time, as problematized by the novel Galileia by the writer Ronaldo Correia de Brito, through the trajectories of its three main characters: Ismael, Davi and Adonias. We begin by discussing the work of Raymond Williams, with attention to the aspects we consider central to his theory: common experience, revolution and sacrifice, and tension. The author's tragic theory allows an orientation that the simple use of the term "tragic" leaves wanting. We then turn to the studies of Zygmunt Bauman, whose theory of Modernity, ambivalence and human waste, in dialogue with Williams' theory, grounds our reflections on the formation of a modern ethos of order and schism that creates contemporary exile and strangeness, one of the most tragic aspects of our time. Starting from the fundamental assumption that the macrosocial and the microsocial are intrinsically linked, we go on to comment briefly on the novel's relationships with Greek tragedy, social tragedy in Williams, and modern ambivalence in Bauman. We bring to the fore reflections that confront theoretical aspects of literary regionalism, assuming a position: Galileia is a contemporary novel, marked by macrosocial pressures and whose narrator is inserted in the tensions of Modernity. We analyse the space of the novel, the space before the arrival of the cousins Ismael, Davi and Adonias at the Galileia farm, and the inner space in which the main characters move and suffer their tensions. We conclude with the trajectory of each of the novel's main characters: Ismael, the stigmatized one whose ambivalence is self-constructed, the outcast rejected by the family; Davi, traumatized by rape in childhood, born with the marks of foreignness (white, blue eyes, blond hair) and loved by a family for which he feels only scorn; and Adonias, the narrator-character, the exile in constant tension, who reaffirms the rejection of the origins he shares with his family and whose configuration in the novel allows an allegory of human waste to be read: the unredeemed state of anti-resilience and chronic ambivalence of the human goat in tension.
APA, Harvard, Vancouver, ISO, and other styles
13

Obodi, G. N. "Monte Carlo studies of gε⁴ scalar field and the Abelian-Higgs theories in 3-dimensions." Thesis, Imperial College London, 1985. http://hdl.handle.net/10044/1/37805.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Eriksson, Patrik. "MONTE CARLO TREE SEARCH OCH MINIMAX : En jämförelse i tidseffektivitet i ett matcha-3-spel." Thesis, Högskolan i Skövde, Institutionen för informationsteknologi, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-15412.

Full text
Abstract:
In this work, two algorithms were implemented and evaluated by playing a match-3 game against each other. The first algorithm was Minimax, which selects its moves by evaluating the possible moves from a given state; to keep the search space from growing too large, the algorithm is limited by a search depth. The second algorithm was an MCTS, which performs many simulations in which random moves are played to the end of the game in order to estimate the outcome of the different moves. The game on which they were evaluated is of the match-3-battle type. Several experiments were then carried out on the two algorithms, in which they played a number of matches against each other on boards of different sizes and with different search depths and exploration constants. The tests showed that, in this implementation, Minimax was superior in these cases. When the search space grew larger, MCTS performed better than in smaller spaces, but it never managed to reach a majority of wins.
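As a point of reference for the two algorithms compared above, the sketch below gives a generic depth-limited minimax routine and the UCB1 rule used for child selection in MCTS. The game-specific callbacks (moves, apply_move, evaluate, is_terminal) are assumed placeholders; the thesis applies these ideas to a match-3-battle game with its own rules and evaluation.

```python
import math

def minimax(state, depth, maximizing, moves, apply_move, evaluate, is_terminal):
    """Depth-limited minimax for a generic two-player game. The callbacks are
    placeholders for the game rules; a match-3 engine would supply them."""
    if depth == 0 or is_terminal(state):
        return evaluate(state), None
    best_val = -math.inf if maximizing else math.inf
    best_move = None
    for m in moves(state):
        val, _ = minimax(apply_move(state, m), depth - 1, not maximizing,
                         moves, apply_move, evaluate, is_terminal)
        if (maximizing and val > best_val) or (not maximizing and val < best_val):
            best_val, best_move = val, m
    return best_val, best_move

def uct_score(child_value, child_visits, parent_visits, c=math.sqrt(2)):
    """UCB1 rule used for child selection in MCTS: exploit the mean reward,
    explore rarely visited children; `c` is the exploration constant."""
    return child_value / child_visits + c * math.sqrt(math.log(parent_visits) / child_visits)
```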
APA, Harvard, Vancouver, ISO, and other styles
15

RAJAONARISON, LYLIANE. "Developpement d'un logiciel particulaire monte carlo 3 dg simulant l'implantation ionique localisee dans les composants microelectroniques." Paris 11, 1992. http://www.theses.fr/1992PA112309.

Full text
Abstract:
A 3D particle Monte Carlo code simulating, on the one hand, ion implantation and, on the other, collision cascades has been developed: the first model handles only the implanted ions, while the second deals with the ions and with the defects created by the implantation. Both models are valid for an amorphous target made of one or several atomic species, consisting of a single layer or of a multilayer structure. Comparing the simulated implantation profiles with the ion profiles obtained from SIMS measurements allowed us to validate the ion implantation model and to determine the value of the adjustable parameter (the correction factor for electronic losses in the LSS model, which depends only on the ion-target pair) for different combinations encountered in silicon technology: (B, As, Sb)-(Si, SiO2, Si3N4). Since ion implantation is the main doping technique for submicron MOSFETs, we applied this code to the simulation of all the implantations required to fabricate a 0.1 × 0.5 µm² self-aligned MOSFET. Taking into account the spatial distribution of impurities (donors, acceptors) resulting from this simulation, we studied the electrical behaviour of the device with a device simulation code (a two-dimensional particle model of the Monte Carlo type). The results show the non-stationary character of electron transport in the channel and a transconductance similar to the measured one.
APA, Harvard, Vancouver, ISO, and other styles
16

Lopes, Denise Adorno. "Caracterização microestrutural de ligas do sistema U-Nb-Zr, no canto rico em urânio." Universidade de São Paulo, 2010. http://www.teses.usp.br/teses/disponiveis/3/3133/tde-28022011-153908/.

Full text
Abstract:
The microstructures of 10 uranium-rich alloys of the uranium-niobium (U-10Nb; U-15Nb; U-20Nb), uranium-zirconium (U-10Zr; U-15Zr; U-20Zr) and uranium-niobium-zirconium (U-2.5Nb-2.5Zr; U-5Nb-5Zr; U-7.5Nb-7.5Zr; U-10Nb-10Zr) systems have been characterized. The studied alloys are candidates for plate-type nuclear fuels used both in nuclear research reactors and in nuclear power reactors. The alloys were melted in a plasma arc furnace with a non-consumable tungsten electrode. After several melting cycles, the samples were given a homogenizing heat treatment at 1000 °C for 96 hours and quenched in water. The homogenized samples were then annealed at 700 and 500 °C and quenched in water. In total, the microstructural characterization covered 40 samples of 10 different alloy compositions in four different conditions: as cast, homogenized at 1000 °C, and aged at 700 and 500 °C. Several complementary techniques were used: optical microscopy; scanning electron microscopy with energy-dispersive X-ray microanalysis; X-ray diffraction with the aid of the Rietveld analysis method; and Vickers microhardness measurements. The results showed that the Nb and Zr additions stabilize the allotropic γ phase of uranium and delay the γ→β transformation, Nb being more effective than Zr in this respect. In addition, the martensitic transformations γ→α′, β→α′ and possibly γ→γ° may occur during cooling. The martensitic transformation start temperature (Ms), which produces the martensitic phase, decreases with the addition of the studied alloying elements; Ms crosses room temperature between the compositions U-5Nb-5Zr and U-7.5Nb-7.5Zr. It was also found that the peritectoid reaction α + γ2 → δ of the U-Zr system has very slow kinetics and could not be detected within the studied times and temperatures. A result of significant technological interest is that, in some alloys, it was possible to retain a ductile martensitic microstructure at room temperature, which allows cold forming.
APA, Harvard, Vancouver, ISO, and other styles
17

Alahmade, Walaa. "EXPERIMENTAL INVESTIGATION AND MONTE CARLO SIMULATION OF QUASIELASTIC ELECTRON SCATTERING FROM HELIUM-3 CLUSTERS IN HELIUM-4." Kent State University / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=kent1619697731858548.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Santos, Caio Rubens Gonçalves. "Dimensionamento e análise do ciclo de vida de pavimentos rodoviários: uma abordagem probabilística." Universidade de São Paulo, 2011. http://www.teses.usp.br/teses/disponiveis/3/3138/tde-25082011-140705/.

Full text
Abstract:
A deterministic approach is often used both for pavement design and for life-cycle cost analysis, which neglects the inherent variability of the relevant parameters. The variability inherent in the parameters governing the construction and performance of a pavement can instead be addressed with a probabilistic approach, in which each variable is characterized by a suitable probability distribution. An economic analysis of a pavement, whether asphalt or rigid, should always cover all relevant costs, starting from construction, and should consider the costs and benefits of both the road agency and the road users. One of the main goals of pavement economic analysis is to support the decision among construction and maintenance alternatives in terms of cost, under given technical and economic conditions. This work focuses on the use of a probabilistic approach in the design and life-cycle cost analysis of road pavements, both asphalt and rigid. Procedures are proposed for determining the reliability of a pavement structure, asphalt or rigid, based on the DNIT and AASHTO methods. For the life-cycle cost analysis, computational models are proposed that use the AASHTO performance equation; user costs were not included in the models. The Monte Carlo method was used in all models. The reliability (and the risk of failure) is determined for the pavement design, and the results also provide a distribution of total costs over the analysis period, allowing a risk analysis. The outputs prove to be important indicators for decision making regarding the allocation of investments among alternative pavement solutions, considering the risks inherent in the variability of the process components considered in this work.
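The probabilistic life-cycle cost analysis described above amounts to sampling uncertain cost inputs, discounting them over the analysis period and reading risk off the resulting distribution. The sketch below shows that pattern with the Monte Carlo method; all distributions, the discount rate and the analysis period are made-up illustrative values, not those of the thesis, and the AASHTO performance model and user costs are omitted.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000                 # Monte Carlo trials
rate = 0.06                 # assumed real discount rate
period = 20                 # assumed analysis period (years)

# Illustrative cost distributions (not the values used in the thesis)
construction = rng.normal(1_000_000, 80_000, n)                    # year 0
maintenance  = rng.normal(10_000, 2_000, (n, period)).clip(min=0)  # yearly
rehab_cost   = rng.triangular(150_000, 200_000, 300_000, n)
rehab_year   = rng.integers(8, 13, n)                              # uncertain timing

years = np.arange(1, period + 1)
npv = (construction
       + (maintenance / (1 + rate) ** years).sum(axis=1)
       + rehab_cost / (1 + rate) ** rehab_year)

print(f"mean NPV: {npv.mean():,.0f}")
print("5th / 95th percentiles:", np.percentile(npv, [5, 95]).round(0))
```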
APA, Harvard, Vancouver, ISO, and other styles
19

Natori, Willian Massashi Hisano. "j = 3/2 Quantum spin-orbital liquids." Universidade de São Paulo, 2018. http://www.teses.usp.br/teses/disponiveis/76/76131/tde-25092018-161836/.

Full text
Abstract:
Quantum spin liquids (QSLs) are strongly correlated systems displaying fascinating phenomena like long-range entanglement and fractionalized excitations. The research on these states has since its beginning followed trends generated by the synthesis of new compounds and the construction of new theoretical tools. In line with this history, a wealth of new results about QSLs was established during the past decade due to studies of the integrable Kitaev model on the honeycomb lattice. This j = 1/2 model displays bond-dependent and anisotropic exchanges that are essential to stabilize its QSL ground state with Majorana fermion excitations and an emergent Z2 gauge field. Even more interestingly, this model is relevant to understanding the magnetism of a certain class of 4/5d5 Mott insulators with specific lattice constraints, t2g orbital degeneracy and strong spin-orbit coupling (SOC). The mechanism defining these so-called Kitaev materials can be applied to similar compounds based on transition metal ions in different electronic configurations. In this thesis, I investigate minimal models for two types of 4/5d1 Mott insulators: those on the ordered double perovskite (ODP) structure and those isostructural to the Kitaev materials. Their effective models generically show bond-dependent and anisotropic interactions involving multipoles of an effective j = 3/2 angular momentum. Such degrees of freedom are conveniently written in terms of pseudospin s and pseudo-orbital τ operators resembling the spin and orbital operators of Kugel-Khomskii models with twofold orbital degeneracy. Despite their anisotropy, the two realistic models display continuous global symmetries in the limit of vanishing Hund's coupling, enhancing quantum fluctuations and possibly stabilizing a QSL phase. Parton mean-field theory was used to propose fermionic QSLs that will be called quantum spin-orbital liquids (QSOLs) due to their dependence on s and τ. On ODPs, I studied a chiral QSOL with Majorana fermion excitations and a gapless spectrum characterized by nodal lines along the edges of the Brillouin zone. These nodal lines are topological defects of a non-Abelian Berry connection and the system exhibits dispersing surface states. Several experimental responses of the chiral QSOL within the mean-field approximation are compared with the experimental data available for the spin liquid candidate Ba2YMoO6. Moreover, based on a symmetry analysis, I discuss the operators involved in resonant inelastic X-ray scattering (RIXS) amplitudes for 4/5d1 Mott insulators and show that the RIXS cross sections allow one to selectively probe pseudospin and pseudo-orbital degrees of freedom. For the chiral spin-orbital liquid in particular, these cross sections provide information about the spectrum of different flavors of Majorana fermions. The model for materials isostructural to the Kitaev materials has an emergent SU(4) symmetry that is made explicit by means of a Klein transformation on the pseudospin degrees of freedom. This model is known to stabilize a QSOL on the honeycomb lattice and motivated the investigation of QSOLs on a generalization of this lattice to three dimensions. Parton mean-field theory was used once again to propose the liquid states, and a variational Monte Carlo (VMC) method was used to compute the energies of the projected wave functions. The numerical results show that the lowest-energy QSOL corresponds to a zero-flux state with a Fermi surface of four-color fermionic partons. Further VMC computations also revealed that this state is stable against the formation of plaquette ordering (tetramerization). The energy of this QSOL is highly competitive even when Hund's coupling-induced perturbations are included, as shown by comparison with simple ordered states. Extensions and perspectives for future work are discussed at the end of this thesis.
APA, Harvard, Vancouver, ISO, and other styles
20

Wagner, Antoine. "Clinical implementation of a Monte Carlo-based platform for the validation of stereotactic and intensity-modulated radiation therapy." Doctoral thesis, Universite Libre de Bruxelles, 2020. https://dipot.ulb.ac.be/dspace/bitstream/2013/312015/3/ToC.pdf.

Full text
Abstract:
In radiation therapy, the accuracy of the dose delivered to the patient over the course of treatment is of great importance for progressing towards improved quality and consistency of outcome data. One of the first steps towards a Clinical Decision Support System (CDSS) is to be able to accurately reconstruct the delivered dose, taking into account the range of factors that can generate significant differences between the planned dose visualized on the dosimetrist's screen and the dose actually accumulated during the treatment sessions. These factors include accelerator output variations, commissioning uncertainties, dose computation errors, patient and organ movement, etc. The objective of this work is to implement and test a Monte Carlo platform for the validation of the Cyberknife and Tomotherapy systems installed at Centre Oscar Lambret. A study of a detector dedicated to small fields (the microLion ionization chamber) is also included, this detector being particularly suited to measurements on the Cyberknife system. The context and theoretical concepts are introduced in the first two chapters. In the third chapter, the Monte Carlo modelling of the Cyberknife and of the microLion detector is detailed. The fourth part describes the Monte Carlo platform Moderato and its evaluation module. In the final chapter, the modelling of the latest, MLC-equipped Cyberknife model (the M6) is described. A new technique is also introduced to accelerate the search for the electron beam parameters of a Monte Carlo model, allowing an easier and more automated integration of new machines into Moderato.
Doctorate in Biomedical and Pharmaceutical Sciences (Medicine)
APA, Harvard, Vancouver, ISO, and other styles
21

Prata, João Miguel Tavares. "Métodos de quantificação da chuva incidente em paredes." Master's thesis, Faculdade de Ciências e Tecnologia, 2012. http://hdl.handle.net/10362/10141.

Full text
Abstract:
Dissertation for the degree of Master in Civil Engineering - Construction Profile
Wind-driven rain is among the most relevant factors affecting the hygrothermal performance of buildings. This type of precipitation is characterized by being affected by the horizontal component of the wind velocity, which drives the rain against the building envelope. The amount of driving rain is correlated with several factors, including the geometry of the building, the topography of the surroundings and the wind speed, to mention only the most relevant. This set of factors makes its quantification a complex process. The phenomenon is responsible for the appearance of construction anomalies, with consequent economic and social losses. This dissertation follows on from previous work and aims to update the state of the art and the new contributions that have appeared since. There are currently three main lines of research, divided into experimental, semi-empirical and numerical methods. The first concerns the establishment of in situ measurement systems, involving instruments such as rain gauges, driving-rain gauges and anemometers, or, alternatively, wind tunnel experiments. The experimental method is the main source of data for implementation in the other two methods. The second method integrates the experimental data with mathematical expressions that allow wind-driven rain to be quantified more quickly and more broadly; it is therefore a model with theoretical foundations whose coefficients rest on experimental observations. The standard ISO 15927-3 is an example of this, as is a parallel procedure developed by Straube and Burnett. Numerical methods, involving the use of computational fluid dynamics tools, were applied to this topic in the early 1990s. By modelling the case under study with grid elements, applying turbulence models and integrating meteorological data, they can predict wind-driven rain with higher quality and resolution. These methods are complementary, so there is interest in continuing research in these areas.
APA, Harvard, Vancouver, ISO, and other styles
22

Shehu, Erald [Verfasser], and A. [Akademischer Betreuer] Cato. "The expression and function of human Anterior Gradient Homolog 3 in Prostate Cancer / Erald Shehu. Betreuer: A. Cato." Karlsruhe : KIT-Bibliothek, 2013. http://d-nb.info/1047383594/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Papanastasiou, Dimitrios. "3 essays on credit risk modeling and the macroeconomic environment." Thesis, University of Edinburgh, 2015. http://hdl.handle.net/1842/22014.

Full text
Abstract:
In the aftermath of the recent financial crisis, the way credit risk is affected by and affects the macroeconomic environment has been the focus of academics, risk practitioners and central bankers alike. In this thesis I approach three distinct questions that aim to provide valuable insight into how corporate defaults, recoveries and credit ratings interact with the conditions in the wider economy. The first question focuses on how well the macroeconomic environment forecasts corporate bond defaults. I approach the question from a macroeconomic perspective and I make full use of the multitude of lengthy macroeconomic time series available. Following the recent literature on data-rich environment modelling, I summarise a large panel of 103 macroeconomic time series into a small set of 6 dynamic factors; the factors capture business cycle, yield curve, credit premia and equity market conditions. Prior studies on dynamic factors use identification schemes based on principal components or recursive short-run restrictions. The main contribution to the body of existing literature is that I provide a novel and more robust identification scheme for the 6 macro-financial stochastic factors, based on a set of over-identifying restrictions. This allows for a more straightforward interpretation of the extracted factors and a more meaningful decomposition of the corporate default dynamics. Furthermore, I use a novel Bayesian estimation scheme based on a Markov chain Monte Carlo algorithm that has not been used before in a credit risk context. I argue that the proposed algorithm provides an efficient and flexible alternative to the simulation-based estimation approaches used in the existing literature. The sampling scheme is used to estimate a state-of-the-art dynamic econometric specification that is able to separate macro-economic fluctuations from unobserved default clustering. Finally, I provide evidence that the macroeconomic factors can lead to significant improvements in default probability forecasting performance. The forecasting performance gains become less pronounced the longer the default forecasting horizon. The second question explores the sensitivity of corporate bond defaults and recoveries to monetary policy and macro-financial shocks. To address the question, I follow a more structural approach to extract theory-based economic shocks and quantify the magnitude of the impact on the two main credit risk drivers. This is the first study that approaches the decomposition of the movements in credit risk metrics from a structural perspective. I introduce a VAR model with a novel semi-structural identification scheme to isolate the various shocks at the macro level. The dynamic econometric specification for defaults and recoveries is similar to the one used to address the first question. The specification is flexible enough to allow for the separation of the macroeconomic movements from the credit risk specific unobserved correlation and, therefore, isolate the different shock transmission mechanisms. I report that the corporate default likelihood is strongly affected by balance sheet and real economy shocks for the cyclical industry sectors, while the effects of monetary policy shocks typically take up to one year to materialise. In contrast, recovery rates tend to be more sensitive to asset price shocks, while real economy shocks mainly affect secured debt recovery values.
The third question shifts the focus to credit ratings and addresses the Through-the- Cycle dynamics of the serial dependence in rating migrations. The existing literature treats the so-called rating momentum as constant through time. I show that the rating momentum is far from constant, it changes with the business cycle and its magnitude exhibits a non-linear dependence on time spent in a given rating grade. Furthermore, I provide robust evidence that the time-varying rating momentum substantially increases actual and Marked-to-Market losses in periods of stress. The impact on regulatory capital for financial institutions is less clear; nevertheless, capital requirements for high credit quality portfolios can be significantly underestimated during economic downturns.
APA, Harvard, Vancouver, ISO, and other styles
24

Höpfner, Sebastian. "Modulation of Cargo Transport and Sorting through Endosome Motility and Positioning." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2005. http://nbn-resolving.de/urn:nbn:de:swb:14-1132927530983-53598.

Full text
Abstract:
Utilizing various systems such as cell-based assays but also multicellular organisms such as Drosophila melanogaster and C.elegans, for example, the endocytic system has been shown to consist of a network of biochemically and morphologically distinct organelles that carry out specialized tasks in the uptake, recycling and catabolism of growth factors and nutrients, serving a plethora of key biological functions (Mellman, 1996). Different classes of endosomes were found to exhibit a characteristic intracellular steady state distribution. This distribution pattern observed at steady state results from a dynamic interaction of endosomes with the actin and the microtubule cytoskeleton. It remains unclear, however, which microtubule-based motors besides Dynein control the intracellular distribution and motility of early endosomes and how their function is integrated with the sorting and transport of cargo. The first part of this thesis research outlines the search for such motor. I describe the identification of KIF16B which functions as a novel endocytic motor protein. This molecular motor, a kinesin-3, transports early endosomes to the plus end of microtubules, in a process regulated by the small GTPase Rab5 and its effector, the phosphatidylinositol-3-OH kinase hVPS34. In vivo, KIF16B overexpression relocated early endosomes to the cell periphery and inhibited transport to the degradative pathway. Conversely, expression of dominant-negative mutants or ablation of KIF16B by RNAi caused the clustering of early endosomes to the peri-nuclear region, delayed receptor recycling to the plasma membrane and accelerated degradation. These results suggest that KIF16B, by regulating the plus end motility of early endosomes, modulates the intracellular localization of early endosomes and the balance between receptor recycling and degradation. In displaying Rab5 and PI(3)P-containing cargo selectivity, a remarkable property of KIF16B is that it is subjected to the same regulatory principles governing the membrane tethering and fusion machinery (Zerial and McBride, 2001). Since KIF16B can modulate growth factor degradation, we propose that this motor could have also important implications for signaling. Importantly, KIF16B has provided novel insight into how intracellular localization of endosomes governs the transport activity of these organelles. The second part of this thesis describes the proof-of-principle of a genome-wide screening strategy aimed at gaining insights into the next level of understanding: How the spatial distribution of organelles is linked to their function in an experimental system which features cellular polarity, for example, a tissue or organ. The suitability of C. elegans as a model organism to identify genes functioning in endocytosis has been demonstrated by previous genetic screens (Grant and Hirsh 1999; Fares and Greenwald, 2001). Offering excellent morphological resolution and polarization, the nematode intestine represents a good system to study the apical sorting of a transmembrane marker. The steady state localization of such a marker is likely the result of a dynamic process that depends on biosynthetic trafficking to the apical surface, apical endocytosis and recycling occurring through apical recycling endosomes. Therefore, mis-sorting of this marker upon RNA-mediated interference will be indicative of a failure in one of the aforementioned processes. 
Furthermore, since it is still largely unclear why apical endosomes maintain their polarized localization, this screen will also monitor the morphology of this endocytic compartment using a second marker. Following image acquisition based on an automated confocal microscope, data can be analyzed using custom-built software allowing objective phenotypic analysis. The successful establishment of the proof-of-principle marks the current state-of-the-art of this large-scale screening project.
APA, Harvard, Vancouver, ISO, and other styles
25

Lopes, Juliana da Serra Costa. "Um modelo integrado de simulação-otimização para suporte ao planejamento e à análise de um negócio de aeronaves de propriedade compartilhada." Universidade de São Paulo, 2011. http://www.teses.usp.br/teses/disponiveis/3/3138/tde-11082011-133833/.

Full text
Abstract:
Esta pesquisa aborda o problema de alocação de jatos executivos compartilhados para casos em que a demanda diária é variável. É proposta uma ferramenta auxiliar de planejamento de uma empresa de operação de jatos compartilhados. São apresentadas as características principais do tipo de negócio que formam o problema estudado neste trabalho. Consideram-se os aspectos de uma empresa que administra jatos de propriedade compartilhada. O cliente adquire uma cota de uma aeronave e quando solicita uma viagem, com poucas horas de antecedência, a empresa deve garantir a realização do voo em uma aeronave da categoria adquirida. Também é de responsabilidade da empresa a gestão da tripulação, o reposicionamento da frota e a manutenção das aeronaves Este trabalho apresenta o desenvolvimento de uma ferramenta para auxiliar na tomada de decisões estratégicas que envolvem a escolha dos locais de base de operação e o dimensionamento da frota. A metodologia de solução é composta de um modelo de simulação e um de otimização. O modelo de simulação utiliza o método de Monte Carlo para obtenção da demanda de voos dia a dia que gera uma programação de clientes a atender. Os dados da simulação são então estruturados como um problema de fluxo em rede de mínimo custo e é realizada a alocação ótima das aeronaves. A ferramenta foi construída em ambiente de planilha eletrônica Microsoft Excel e aplicada em um caso prático de jatos executivos compartilhados com múltiplas bases. Foram testadas diversas configurações de bases e políticas operacionais como frota homogênea, frota heterogênea e frota alugada. Os resultados da ferramenta permitem determinar o impacto que a escolha das bases de operação tem no tamanho da frota e no reposicionamento de aeronaves. A metodologia mostrou-se robusta e, em tempo adequado, a ferramenta encontrou a solução ótima para cada configuração testada.
This research deals with the problem of scheduling jets with fractional ownership in cases where the demand varies daily. A tool has been devised to support the planning phase of a company that operates shared jets. The main characteristics of the fractional ownership market are presented in this manuscript, and the research was developed from the point of view of a fractional ownership provider. A client becomes a partial owner of an aircraft of a specific model and is entitled to a certain amount of flight hours. When the client requests a flight, usually only a few hours ahead, the fractional provider must guarantee that an aircraft of the requested model is available to the owner at the requested time and place. The provider is responsible for all operational considerations, including managing the crew and keeping the fleet well maintained. This work presents the development of a tool to help make decisions involving the choice of the operational bases and the size of the fleet. The solution methodology is composed of a simulation model and an optimization model. Monte Carlo simulation is used to obtain the daily flight demand. The results of the simulation are structured as a minimum cost network flow problem in order to solve the fleet allocation optimally. The tool has been built in a Microsoft Excel spreadsheet environment and applied to a case of fractional jets with multiple bases. Several configurations and operational policies have been tested, such as operations with a homogeneous fleet, with a heterogeneous fleet and with a rented fleet. The results provided by the tool allow the user to evaluate the impact that the choice of the operational bases has on the size of the fleet and on the repositioning of the aircraft. The methodology proved adequate, and the developed tool was able to solve the problem optimally, in acceptable time, for each case.
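As an editorial illustration of the two-stage approach summarised above (Monte Carlo generation of daily demand followed by a minimum cost network flow allocation), the sketch below shows the general idea in Python. It is not the thesis's spreadsheet tool: the base names, repositioning costs, fleet sizes and demand rates are hypothetical, and networkx is used only as a convenient min-cost-flow solver.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(11)

# Hypothetical data (illustrative only, not the thesis's case study).
bases = {"BaseA": 3, "BaseB": 2}                    # aircraft available at each base
airports = ["GRU", "CGH", "VCP"]                    # requested departure airports
cost = {("BaseA", "GRU"): 2, ("BaseA", "CGH"): 1, ("BaseA", "VCP"): 4,
        ("BaseB", "GRU"): 3, ("BaseB", "CGH"): 5, ("BaseB", "VCP"): 1}
PENALTY = 100                                       # cost of an unserved request

# Step 1 - Monte Carlo demand: sample today's flight requests per airport.
requests = {a: int(rng.poisson(1.2)) for a in airports}
supply, demand = sum(bases.values()), sum(requests.values())

# Step 2 - minimum cost network flow: assign available aircraft to requests.
G = nx.DiGraph()
for b, n in bases.items():
    G.add_node(b, demand=-n)                        # negative demand = supply
for a, n in requests.items():
    G.add_node(a, demand=n)
G.add_node("slack", demand=supply - demand)         # balances idle aircraft / unmet demand
for (b, a), c in cost.items():
    G.add_edge(b, a, weight=c, capacity=bases[b])
for b, n in bases.items():
    G.add_edge(b, "slack", weight=0, capacity=n)    # aircraft that stay idle
for a in airports:
    G.add_edge("slack", a, weight=PENALTY, capacity=demand)  # unmet requests

flow = nx.min_cost_flow(G)
print("Sampled requests:", requests)
print("Optimal assignment:", {k: v for k, v in flow.items() if k in bases})
print("Total cost:", nx.cost_of_flow(G, flow))
```

The slack node keeps the flow problem balanced whether the sampled demand exceeds or falls short of the available fleet.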
APA, Harvard, Vancouver, ISO, and other styles
26

Ogata, Paulo Hideshi. "Avaliação do perigo de colisão entre aeronaves em operação de aproximação em pistas de aterrissagem paralelas." Universidade de São Paulo, 2004. http://www.teses.usp.br/teses/disponiveis/3/3141/tde-14042004-160344/.

Full text
Abstract:
Neste trabalho é proposta a modelagem de uma ferramenta de auxílio à tomada de decisão com base na avaliação do nível de perigo de colisão entre duas aeronaves em operação de aproximação em pistas de aterrissagem paralelas (UCSPA - Ultra Closely Spaced Parallel Approaches). A ferramenta computacional utilizada na simulação e na obtenção dos dados numéricos está fundamentada no Método de Monte Carlo.
In this work, the modeling of a decision-support tool is proposed, based on the evaluation of the collision hazard between two aircraft in an UCSPA (Ultra Closely Spaced Parallel Approaches) scenario. The computational tool used in the simulation to obtain the numerical data is based on the Monte Carlo method.
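Purely as an illustration of the kind of Monte Carlo hazard estimate described here, the following sketch samples lateral navigation errors for two aircraft on parallel approaches and counts how often their separation drops below a safety threshold. The runway spacing, error statistics and threshold are invented for the example and do not come from the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical parameters (not from the thesis): runway spacing, deviation
# statistics and a minimum safe separation, all in metres.
RUNWAY_SPACING = 300.0      # nominal lateral distance between the two approach paths
SIGMA_LATERAL = 60.0        # std. dev. of each aircraft's lateral navigation error
MIN_SEPARATION = 150.0      # separation below which a hazard is declared
N_STEPS = 50                # sampled points along the common approach segment
N_TRIALS = 100_000          # Monte Carlo trials

def single_approach_has_hazard() -> bool:
    """Simulate one pair of approaches and check for loss of separation."""
    dev_a = rng.normal(0.0, SIGMA_LATERAL, N_STEPS)   # aircraft A lateral errors
    dev_b = rng.normal(0.0, SIGMA_LATERAL, N_STEPS)   # aircraft B lateral errors
    separation = RUNWAY_SPACING + dev_b - dev_a       # instantaneous lateral separation
    return bool(np.any(separation < MIN_SEPARATION))

hits = sum(single_approach_has_hazard() for _ in range(N_TRIALS))
print(f"Estimated hazard probability per approach pair: {hits / N_TRIALS:.4e}")
```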
APA, Harvard, Vancouver, ISO, and other styles
27

Hammel, Jeffrey Robert. "Development of an unstructured 3-D direct simulation Monte Carlo/particle-in-cell code and the simulation of microthruster flows." Link to electronic thesis, 2002. http://www.wpi.edu/Pubs/ETD/Available/etd-0510102-153614.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Ahmad, Hasan. "The coach-athlete relationship in the Middle East : cultural considerations." Thesis, Loughborough University, 2014. https://dspace.lboro.ac.uk/2134/15237.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Habib, Dayane. "Diffusion de l'hélium-3 hyperpolarisé dans le tissu pulmonaire : évaluation par différentes techniques IRM." Phd thesis, Université Paris Sud - Paris XI, 2007. http://tel.archives-ouvertes.fr/tel-00435916.

Full text
Abstract:
This work presents an experimental study of the effect of restricted diffusion of hyperpolarised helium-3 in the pulmonary acinus, carried out at a low magnetic field of 0.1 T. Several phantoms of different sizes and connectivities, modelling the healthy human acinus and an early stage of emphysema, were built following the Kitaoka model. The signal attenuation deviates from the expected exponential decay in G², G being the gradient strength. This observation points to a certain ambiguity in the possibility of quantifying the apparent diffusion coefficient (ADC) in an absolute way, except in the low-G limit. Monte Carlo simulations are in good agreement with the measurements. Original fast sequences based on the principle of multiple spin echoes were developed in order to access a global ADC value at long times, allowing the gas to explore the entire branching structure of the acinus. Measurements on an animal model of emphysema (rat) were compared with maps obtained from standard small-flip-angle acquisitions; they indicate a systematic and always significant increase of ADC with respect to the healthy control, for several measurement protocols. The global method has a better sensitivity than standard mapping; in addition, it yields a stronger ADC contrast between healthy and emphysematous animals owing to the possibility of using lower G values. These diffusion measurement tools based on MRI and NMR of hyperpolarised gases open promising avenues both for the physics of diffusion and for medical applications.
APA, Harvard, Vancouver, ISO, and other styles
30

Tiftik, Mehmet Emre. "Assessing Domestic Debt Sustainability Of Turkey With A Risk Management Approach." Master's thesis, METU, 2006. http://etd.lib.metu.edu.tr/upload/3/12607632/index.pdf.

Full text
Abstract:
This thesis analyzes the debt dynamics of Turkey and assesses the sustainability of fiscal policy. The assessment of fiscal policy follows the methodology of Garcia and Rigobon (2004). This approach focuses on the concept of debt sustainability from a risk management perspective and incorporates the effects of stochastic shocks to the economy in its assessment. The results suggest that a continuation of the present fiscal stance will lead to fiscal unsustainability in Turkey. Furthermore, the results indicate that the properties of the debt dynamics are closely related to the spreads on both dollar-denominated and YTL-denominated debt. This thesis also provides an application of two traditional methodologies, namely Wilcox's (1989) methodology and Uctum and Wickens' (2000) methodology, in order to assess the fiscal sustainability of Turkey.
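The risk-management approach referred to above rests on simulating many joint paths of macroeconomic shocks and checking how often the debt ratio becomes unsustainable. The sketch below illustrates that idea with a stylised debt accumulation identity; the parameters are illustrative and are not the estimates used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stylized, hypothetical parameters (not the thesis's estimates).
b0 = 0.60                      # initial debt-to-GDP ratio
primary_balance = 0.03         # mean primary surplus (share of GDP)
r_mean, r_std = 0.08, 0.03     # real interest rate: mean and shock std. dev.
g_mean, g_std = 0.04, 0.02     # real growth rate: mean and shock std. dev.
horizon, n_paths = 10, 50_000
threshold = 1.0                # debt ratio regarded as "unsustainable" in this sketch

# Debt accumulation identity: b_{t+1} = b_t * (1 + r_t) / (1 + g_t) - pb_t
r = rng.normal(r_mean, r_std, size=(n_paths, horizon))
g = rng.normal(g_mean, g_std, size=(n_paths, horizon))
pb = rng.normal(primary_balance, 0.01, size=(n_paths, horizon))

b = np.full(n_paths, b0)
exceeded = np.zeros(n_paths, dtype=bool)
for t in range(horizon):
    b = b * (1.0 + r[:, t]) / (1.0 + g[:, t]) - pb[:, t]
    exceeded |= b > threshold

print(f"Share of paths breaching the debt threshold: {exceeded.mean():.3f}")
print(f"Mean terminal debt ratio: {b.mean():.3f}")
```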
APA, Harvard, Vancouver, ISO, and other styles
31

Schilling, Alina [Verfasser], Thomas [Akademischer Betreuer] Ledowski, and Gunnar [Gutachter] Cario. "Untersuchung zur Anwendung von Propofol bei Kindern mit Ei-, Soja-, Erdnuss- oder anderer Hülsenfruchtallergie / Alina Schilling ; Gutachter: Gunnar Cario ; Betreuer: Thomas Ledowski." Kiel : Universitätsbibliothek Kiel, 2020. http://nbn-resolving.de/urn:nbn:de:gbv:8:3-2021-00584-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Matsuyama, Rafael Tsuji. "Avaliação de risco em operações de pouso de aeronaves em pistas paralelas utilizando procedimentos e técnicas CSPA." Universidade de São Paulo, 2011. http://www.teses.usp.br/teses/disponiveis/3/3141/tde-26082011-142622/.

Full text
Abstract:
Historicamente, os sistemas de tráfego aéreo incorporaram níveis de automação nas atividades de controle do espaço aéreo com o intuito de atender à crescente demanda por serviços aéreos e de melhorar os níveis de segurança nos procedimentos de voo. Com o crescimento expressivo previsto para os próximos anos, devido ao aumento nos números de voos e de usuários, as opções tradicionais de expansão da malha aérea e / ou construção de novos aeroportos se tornaram onerosas economicamente, tornando necessária a adoção de alternativas, tais como as técnicas / procedimentos para pousos em pistas paralelas, como forma de aproveitar parte da atual infraestrutura aeroportuária existente, sem a necessidade de enormes aportes financeiros. Para avaliar a viabilidade de projetos de pousos simultâneos em pistas paralelas, um dos fatores importantes a serem analisados é o da avaliação do risco de colisão entre aeronaves associado durante esses procedimentos. Nesse cenário, este trabalho de pesquisa propõe uma extensão no modelo de avaliação de segurança de Ogata para procedimentos de pouso em pistas paralelas, considerando que o modelo original tem o objetivo de medir o nível de risco associado somente para operações de pouso convencionais em pistas paralelas. A extensão deste modelo ocorre no sentido de também permitir a simulação em outros cenários distintos de pouso, o que torna possível tanto a realização de comparativos entre técnicas / procedimentos utilizadas em operações de pouso em pistas paralelas, quanto a avaliação do nível de risco associado. Este modelo estendido de segurança utiliza o método de Monte Carlo, da mesma forma que o original, em que um número elevado de simulações de cenários possíveis de pousos em pistas paralelas é avaliado. Com os resultados obtidos, é analisado o impacto da variação da distância entre as pistas na segurança de pousos em pistas paralelas.
Historically, air traffic control systems have incorporated levels of automation into airspace control procedures in order to meet the growing demand for air transportation services and to improve the safety of flight procedures. With significant growth expected in the coming years due to an increase in the number of flights and passengers, the traditional options of expanding the air traffic network and/or building new airports have become economically burdensome, requiring the adoption of alternatives such as techniques and procedures for landings on parallel runways, as a way of taking advantage of the existing airport infrastructure without requiring enormous financial investment. To assess the feasibility of landing projects on parallel runways, one of the important factors to be analyzed is the risk of collision between aircraft associated with these procedures. In this context, this research proposes an extension of the Ogata safety assessment model to procedures for landing on parallel runways, since the original model aims to measure only the level of risk associated with conventional landing operations on parallel runways. The model is extended to allow the simulation of other, distinct landing scenarios, which makes it possible both to compare the techniques and procedures used in landing operations on parallel runways and to assess the associated risk level. This extended model uses Monte Carlo simulation, as does the original, in which a large number of possible parallel-runway landing scenarios are evaluated. With these results, the impact of varying the distance between the runways on the safety of landings on parallel runways is analyzed.
APA, Harvard, Vancouver, ISO, and other styles
33

Yuksel, Inci. "Single Shot Hit Probability Computation For Air Defense Based On Error Analysis." Master's thesis, METU, 2007. http://etd.lib.metu.edu.tr/upload/3/12608509/index.pdf.

Full text
Abstract:
In this thesis, an error analysis based method is proposed to calculate single shot hit probability (PSSH) values of a fire control system. The proposed method considers that a weapon and a threat are located in three dimensional space. They may or may not have relative motion in three dimensions with respect to each other. The method accounts for the changes in environmental conditions. It is applicable in modeling and simulation as well as in top down design of a fire control system to reduce the design cost. The proposed method is applied to a specific fire control system and it is observed that PSSH values highly depend on the distance between the weapon and the threat, hence they are time varying. Monte Carlo simulation is used to model various defense scenarios in order to evaluate a heuristic developed by Gülez (2007) for weapon-threat assignment and scheduling of weapons' shots. The heuristic uses the proposed method for PSSH and time of flight computation. It is observed that the difference between the results of simulation and heuristic depends on the scenario used.
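As a hedged illustration of how an error-analysis-based single shot hit probability can be estimated by simulation, the sketch below samples angular aiming errors, converts them into miss distances at a given range and counts hits on a circular target. The error budget, range and target size are hypothetical and are not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical error budget (not from the thesis): total angular aiming error
# and a threat represented as a circular target of given radius at a given range.
RANGE_M = 2_000.0          # weapon-to-threat distance in metres
SIGMA_AZ_MRAD = 1.5        # std. dev. of azimuth aiming error, milliradians
SIGMA_EL_MRAD = 1.5        # std. dev. of elevation aiming error, milliradians
TARGET_RADIUS_M = 3.0      # effective radius of the threat
N_SHOTS = 1_000_000

# Convert angular errors into lateral miss distances at the given range.
miss_x = rng.normal(0.0, SIGMA_AZ_MRAD * 1e-3 * RANGE_M, N_SHOTS)
miss_y = rng.normal(0.0, SIGMA_EL_MRAD * 1e-3 * RANGE_M, N_SHOTS)

p_ssh = np.mean(miss_x**2 + miss_y**2 <= TARGET_RADIUS_M**2)
print(f"Estimated single shot hit probability: {p_ssh:.4f}")
```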
APA, Harvard, Vancouver, ISO, and other styles
34

Demirkaya, Gokmen. "Monte Carlo Solution Of A Radiative Heat Transfer Problem In A 3-d Rectangular Enclosure Containing Absorbing, Emitting, And Anisotropically Scattering Medium." Master's thesis, METU, 2003. http://etd.lib.metu.edu.tr/upload/1059138/index.pdf.

Full text
Abstract:
In this study, the application of a Monte Carlo method (MCM) for radiative heat transfer in three-dimensional rectangular enclosures was investigated. The study covers the development of the method from simple surface exchange problems to enclosure problems containing absorbing, emitting and isotropically/anisotropically scattering medium. The accuracy of the MCM was first evaluated by applying the method to cubical enclosure problems. The first one of the cubical enclosure problems was prediction of radiative heat flux vector in a cubical enclosure containing purely, isotropically and anisotropically scattering medium with non-symmetric boundary conditions. Then, the prediction of radiative heat flux vector in an enclosure containing absorbing, emitting, isotropically and anisotropically scattering medium with symmetric boundary conditions was evaluated. The predicted solutions were compared with the solutions of method of lines solution (MOL) of discrete ordinates method (DOM). The method was then applied to predict the incident heat fluxes on the freeboard walls of a bubbling fluidized bed combustor, and the solutions were compared with those of MOL of DOM and experimental measurements. Comparisons show that MCM provides accurate and computationally efficient solutions for modelling of radiative heat transfer in 3-D rectangular enclosures containing absorbing, emitting and scattering media with isotropic and anisotropic scattering properties.
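To illustrate the photon-bundle idea underlying such a Monte Carlo method, the sketch below estimates the view factor between two parallel unit squares by tracing diffusely emitted bundles; this is only the simplest surface-exchange building block, not the absorbing, emitting and scattering enclosure solved in the thesis.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy surface-exchange problem: view factor between two parallel unit squares
# separated by a distance H (an illustrative building block of the MCM, not
# the thesis's enclosure configuration).
H = 1.0
N_BUNDLES = 1_000_000

# Emission points uniformly distributed over the lower plate (z = 0).
x0 = rng.random(N_BUNDLES)
y0 = rng.random(N_BUNDLES)

# Diffuse (Lambertian) emission directions, sampled cosine-weighted.
phi = 2.0 * np.pi * rng.random(N_BUNDLES)
sin_theta = np.sqrt(rng.random(N_BUNDLES))
cos_theta = np.sqrt(1.0 - sin_theta**2)

# Intersection of each ray with the plane z = H of the upper plate.
t = H / cos_theta
x1 = x0 + t * sin_theta * np.cos(phi)
y1 = y0 + t * sin_theta * np.sin(phi)

hit = (x1 >= 0.0) & (x1 <= 1.0) & (y1 >= 0.0) & (y1 <= 1.0)
print(f"Monte Carlo view factor F12 ~ {hit.mean():.4f}")
# The tabulated value for two directly opposed unit squares at H = 1 is about 0.20.
```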
APA, Harvard, Vancouver, ISO, and other styles
35

Hougaz, Augusto Borella. "Análise probabilística de durabilidade aplicada a veículos de carga rodoviária." Universidade de São Paulo, 2005. http://www.teses.usp.br/teses/disponiveis/3/3132/tde-14112005-080943/.

Full text
Abstract:
Em projetos de veículos, prever adequadamente a durabilidade de um componente estrutural é vital para a redução de custos assim como para se estipular prazos de garantia e de manutenção. Por outro lado, em diversas situações, tal previsão é difícil, pois inúmeros parâmetros não estão sob controle preciso do projetista, dentre eles os mais relevantes são: o carregamento ao qual o componente será submetido e as propriedades de vida em fadiga do material. Desta forma, pode-se aplicar, em conjunto com o MEF, tratamento probabilístico do cálculo da vida em fadiga,resultando em um valor de confiabilidade, a ser usado como esteio do critério de projeto, no intuito de conferir maior significado prático à análise e aos resultados obtidos. Assim sendo, o presente trabalho descreve e relaciona os principais aspectos de um tratamento probabilístico para cálculo de vida em fadiga, configurando-se um procedimento completo subdividido nas seguintes etapas: 1. Modelagem do carregamento sobre o veículo, visando análise espectral em MEF no domínio da freqüência; 2. Cálculo da densidade de probabilidade das amplitudes cíclicas de Rainflow para as tensões atuantes na estrutura, a partir dos resultados da análise espectral em MEF no domínio da freqüência; 3. Tratamento probabilístico das propriedades características de vida em fadiga do material; 4. Cálculo da confiabilidade estrutural de vida em fadiga, tomando os resultados prévios dos tratamentos probabilísticos de tensões e de vida em fadiga do material; 5. Demonstração da viabilidade e aplicação prática de tal procedimento, implementando-o para o caso de um semi-reboque tanque autoportante. As conclusões finais obtidas confirmam o fato de haver mais falhas estruturais em veículos no Brasil do que em países do primeiro mundo, pois se evidenciou que a pior qualidade das vias trafegáveis brasileiras exacerba a probabilidade de falha por fadiga. Por fim, a metodologia proposta na presente tese, quando implementada em um programa computacional pós-processador de MEF, que automaticamente transforme os desvios padrão das tensões, resultantes da análise espectral, em probabilidades de falha, pode, mais adequadamente, subsidiar critério de projeto fundamentado na avaliação probabilística de durabilidade de um veículo através de preceitos bem definidos de confiabilidade estrutural.
In structural design, correctly forecasting a part's lifetime is vital to reduce costs and to estimate warranty and maintenance periods. On the other hand, in many situations this forecast is difficult because many parameters are not under the engineer's control; the most relevant of these are the loads acting on the part and the fatigue properties of the material. Therefore, it is possible to apply finite element analysis together with a probabilistic approach to fatigue lifetime calculation, which results in reliability values that can be used as a design criterion, giving broader meaning to the analyses and their results. Hence, this text describes and relates the main aspects of the probabilistic approach to fatigue lifetime calculation, resulting in the definition of a complete procedure divided into the following steps: 1. Modeling of the vehicle loads aimed at spectral analysis in the frequency domain; 2. Calculation of the probability density function of the Rainflow stress amplitudes acting on the structure, using the results of the frequency-domain spectral analysis; 3. Probabilistic treatment of the fatigue properties of the material; 4. Fatigue reliability calculation, using the previous results of the probabilistic treatment of stresses and of the material fatigue properties; 5. Implementation of the proposed methodology for fatigue lifetime prediction of a semi-trailer tank. The conclusions confirm that there are many more structural failures in Brazilian vehicles than in European ones, since the results show that the poorer quality of the roads increases the fatigue failure probability. Finally, when implemented in post-processing software for finite element programs that automatically transforms the stress standard deviations obtained by spectral analysis into fatigue failure probabilities, the methodology proposed in this thesis may more adequately support a design criterion based on the probabilistic evaluation of vehicle durability, by applying well-established structural reliability theory.
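A stylised sketch of the probabilistic fatigue idea summarised above is given below: Rainflow amplitudes are approximated by a Rayleigh distribution (a common narrow-band assumption), the S-N parameters are treated as random, and Miner's rule damage is accumulated over many Monte Carlo trials to estimate a failure probability. All numerical values are illustrative, not the thesis's data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stylized Monte Carlo fatigue reliability estimate (illustrative values only).
N_SAMPLES = 5_000           # Monte Carlo trials
CYCLES_PER_LIFE = 2.0e6     # load cycles over the design life
SIGMA_RMS = 40.0            # RMS of the stress process, MPa (from spectral analysis)

# Basquin S-N curve N = C * S**(-m); the material parameters are treated as random.
m = rng.normal(5.0, 0.25, N_SAMPLES)                  # S-N exponent
log_C = rng.normal(17.0, 0.5, N_SAMPLES)              # log10 of the S-N constant

failures = 0
for i in range(N_SAMPLES):
    # Narrow-band assumption: Rainflow amplitudes ~ Rayleigh(SIGMA_RMS).
    amplitudes = rng.rayleigh(SIGMA_RMS, size=2_000)  # representative cycle block
    n_to_failure = 10.0**log_C[i] * amplitudes**(-m[i])
    block_damage = np.sum(1.0 / n_to_failure)         # Miner's rule for the block
    total_damage = block_damage * (CYCLES_PER_LIFE / amplitudes.size)
    failures += total_damage >= 1.0                   # Miner damage >= 1 -> failure

print(f"Estimated fatigue failure probability: {failures / N_SAMPLES:.4f}")
```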
APA, Harvard, Vancouver, ISO, and other styles
36

Champagne, Julien. "Etude du rôle de la cycline D1 dans la survie cellulaire." Thesis, Montpellier, 2018. http://www.theses.fr/2018MONTT018.

Full text
Abstract:
Chez la femme, le cancer du sein est le cancer le plus fréquemment diagnostiqué. Différents traitements sont disponibles selon le sous-type tumoral. Cependant, certaines patientes sont réfractaires à ces thérapies et restent vulnérables lors de récidives. Le cancer a longtemps été défini par une division aberrante des cellules, mais aujourd'hui, il est évident que la résistance à la mort cellulaire programmée est un paramètre majeur dans l'étiologie de la maladie.Les cyclines de type D régulent le cycle cellulaire en permettant la transition de la phase G1 à la phase S. Pour cela, elles activent les kinases dépendantes des cyclines 4/6 (CDK4/6) qui phosphorylent les protéines du rétinoblastome ce qui libère le facteur de transcription E2F. La Cycline D1 (CycD1) nucléaire est donc centrale dans le contrôle du cycle. Son gène est amplifié dans les cancers humains et la moitié des patientes atteintes d'un cancer du sein ont une surexpression de CycD1. Par l’activation de CDK4, CycD1 est essentielle à l'apparition et à la progression tumorale. Ainsi, des inhibiteurs spécifiques de CDK4/6 ont été développés contre le cancer du sein. Malheureusement, certaines patientes restent insensibles à ce traitement. À ce titre, le ciblage spécifique de CycD1 pourrait représenter une alternative clinique. En effet, en plus de la régulation du cycle, CycD1 est également impliquée, indépendamment de CDK4, dans la survie des cellules cancéreuses. Cependant, aucun mécanisme de l'impact de CycD1 dans le maintien tumoral n'a été établi pour démontrer ce potentiel thérapeutique. En outre, CycD1 a été décrite dans les organes à l’âge adulte pour réguler le métabolisme du glucose et l'hématopoïèse. Par conséquent, pour éviter tout effet secondaire indésirable, nous avons décidé d’évaluer l’implication potentielle de CycD1 dans les organes adultes. Grâce au Tandem-HTRF, basé sur le transfert d'énergie entre deux anticorps, nous avons révélé la dynamique inattendue de CycD1 dans chaque organe adulte. De plus, nous avons montré que l’altération de l'expression de CycD1 conduit à une diminution des capacités de survie des cellules saines post-mitotiques.Au vu de ces limitations, nous avons développé une nouvelle approche d'ARN interférence spécifique des cellules cancéreuses appelée TAG-RNAi. Cette technologie permet de cibler CycD1 uniquement dans la tumeur afin d'épargner les cellules saines. Cette approche innovante consiste à cibler un tag présent uniquement sur l’ARNm de CycD1 des cellules cancéreuses. Ainsi, nous avons découvert que le ciblage spécifique de CycD1 induit une régression rapide et spontanée des tumeurs dépendantes des oncogènes RAS ou ERBB2. Par protéomique in vivo, j'ai découvert que lors de stress pro-apoptotiques, CycD1 cytoplasmique interagit avec la procaspase-3 et bloque son activation pour empêcher l'apoptose des cellules. Ces travaux démontrent la valeur clinique du ciblage spécifique de CycD1 dans les cancers afin d'améliorer l'efficacité des chimiothérapies.Par conséquent, il restait à déterminer comment appliquer le TAG-RNAi contre CycD1 uniquement dans les cellules cancéreuses des patientes. Puisque le tag exotique présent sur le gène Ccnd1 chez la souris nous a permis de cibler spécifiquement les cellules cancéreuses, nous avons pensé que des mutations retrouvées dans les cancers humains représentaient une option de ciblage. 
Ainsi, nous avons étendu le concept TAG-RNAi aux mutations somatiques caractéristiques des cancers pour cibler avec succès l'expression des mutants KRAS-G12V ou BRAF-V600E comme exemples. L'idée est donc d'identifier les mutations de Ccnd1 chez les patientes afin d'appliquer le TAG-RNAi comme une thérapie personnalisée afin d’éviter les effets secondaires. Enfin, l'expression de CycD1 représente un nouveau biomarqueur pour le cancer et les troubles liés à l'âge: de faibles taux prédisposent aux maladies dégénératives tandis que des taux élevés indiquent une susceptibilité accrue au cancer
Breast cancer is the most frequently diagnosed cancer in women. This cancer is the leading cause of death in women aged from 35 to 65 years old. Different treatments are now available depending on tumor subtypes. However, some patients are still refractory to these therapies and are at risk of disease relapse. Cancer research has long focused on aberrant cancer cell division but today it is evident that the resistance to programmed cell death is also a major characteristic of the disease.D-type cyclins regulate cell cycle by allowing the transition from the G1-phase to the S-phase. These regulatory subunits activate the Cyclin-Dependent Kinases 4/6 (CDK4/6) that phosphorylate the retinoblastoma proteins which then release the E2F transcription factors. Nuclear Cyclin D1 (CycD1) is therefore central in the control of division. The Ccnd1 gene is amplified in human cancers and half of breast cancer patients bare an overexpression of CycD1. CycD1 is required for mammary carcinoma onset and progression in a CDK4 kinase-dependent manner. Hence, specific CDK4/6 inhibitors have been developed and authorized in the clinics against breast cancer. Unfortunately, some patients remain insensitive to this treatment. In this frame, the specific targeting of CycD1 could represent a strategic alternative in clinics to overcome these pitfalls. Indeed, in addition to cell cycle regulation with CDK4, CycD1 is also involved in CDK4-independent features of cancer cells like cell survival. However, to date, no clear mechanism for the impact of CycD1 in tumor maintenance is established to demonstrate the therapeutic value of its targeting.Moreover, recent studies have demonstrated the participation of CycD1 in adult organs to regulate glucose metabolism and hematopoiesis. As a consequence, to avoid any undesirable side effects, we decided to gauge the potential CycD1 implication in post-mitotic organs body-wide. We set up a new hypersensitive technology named Tandem-HTRF based on the energy transfer between two antibodies to reveal the unexpected dynamics of CycD1 expression in adult organ. Then, we discovered that alterations of CycD1 expression induced dramatic functional consequences on the survival capacities of healthy adult post-mitotic cells.Based on these limitations, we developed a novel RNAi approach specific to cancer cells named TAG-RNAi. This technology allows the silencing of CycD1 in cancer cells only to spare healthy cells. This innovative approach consists in the targeting of a mRNA tag only present on CycD1 from cancer cells. Using this technique, we found that the specific silencing of CycD1 induces a rapid and spontaneous regression of tumors driven by the RAS or ERBB2 oncogenes. Then, thanks to a proteomics screening in vivo, I discovered that under pro-apoptotic stresses the cytoplasmic CycD1 interacts with the procaspase-3 protein and blocks its activation to prevent cancer cell apoptosis. Altogether, my work demonstrates the clinical value of the specific targeting of CycD1 in cancers to increase the efficacy of chemotherapeutic treatments.Hence, it remained to be determined how to apply in patients RNAi against CycD1 only in cancer cells. Because the exotic tagging of its gene was instrumental in mice cancer models, we reasoned that human cancer mutations could represent such a specific tag. We have extended the concept of TAG-RNAi to somatic mutations characteristic of human cancers to successfully target the expression of KRAS-G12V or BRAF-V600E mutants as examples. 
The idea is therefore to identify Ccnd1 mutations in cancer patients in order to apply TAG-RNAi as a custom therapeutic approach that will manage side effects. More unanticipated, CycD1 expression represents a new biomarker for both cancer and age-related disorders: low CycD1 levels predispose to degenerative complications while high CycD1 levels indicate increased susceptibility to cancer and resistance to treatment
APA, Harvard, Vancouver, ISO, and other styles
37

Mahajan, Thejus. "Excitation and fragmentation of CnN⁺ (n=1-3) molecules in collisions with He atoms at intermediate velocity ; fundamental aspects and application to astrochemistry." Thesis, Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLS311/document.

Full text
Abstract:
Dans cette thèse nous avons étudié des collisions entre des projectiles CnN⁺ (n=0,1,2,3) et des atomes d’Hélium à vitesse intermédiaire (2.25 u.a). A cette vitesse, proche de la vitesse des électrons sur les couches de valence externe des atomes et molécules, de nombreux processus électroniques prennent place avec une forte probabilité : ionisation (simple et multiple), excitation électronique, capture d’électron (simple et double). Nous avons mesuré les sections efficaces absolues de tous ces processus. Un autre aspect intéressant de la collision concerne la fragmentation des molécules excitées, que nous avons également mesurée précisément grâce à un dispositif dédié. Les expériences ont été effectuées auprès de l’accélérateur Tandem d’Orsay avec des faisceaux de quelques MeV d’énergie cinétique. Le dispositif AGAT a permis de réaliser les collisions (en condition de collision unique) et de mesurer tout à la fois les sections efficaces des processus et la fragmentation associée. Parallèlement nous avons simulé ces collisions d’un point de vue théorique en utilisant le modèle à Atomes et Electrons Indépendants (IAE) couplé à des calculs CTMC (Classical trajectory Monte Carlo). Sur cette base, nous avons prédit les sections efficaces qui se sont trouvées être en bon accord avec les mesures, à l’exception de la double capture d’électrons. Par ailleurs les rapports de branchement de dissociation des CnN⁺ après excitation électronique sont bien reproduits en utilisant la distribution d’énergie interne des espèces calculées avec le même modèle IAE/CTMC. Ces expériences nous ont permis de construire des « Breakdown Curves » (BDC), véritables cartes d’identité des molécules qui permettent de prévoir, dans le cadre d’une fragmentation statistique comment va fragmenter un système dont on connait l’énergie interne. Avec ces BDC nous avons pu prédire et recommander des rapports de branchement pour des voies de sortie de processus physiques et chimiques d’intérêt astrochimique. Ces données seront insérées dans la base internationale d'astrochimie the Kinetic Data Base for Astrochemistry KIDA. Cette thèse a été réalisée dans le cadre de l’Ecole Doctorale Ondes et Matière (EDOM) à l’Institut des Sciences Moléculaires d’Orsay (ISMO), à l’Université Paris-Sud Paris Saclay
This thesis studies the aftermath of collision between singly positively charged Nitrogenated carbon species CnN⁺ (n=0,1,2,3) and neutral Helium atom at a velocity of 2.25 au. At this velocity, close to the velocity of outer electrons in atoms and molecules, several electronic processes take place and are near their maximum of probability such as ionisation (single, double, triple …), electronic excitation and electron capture (single and double). We looked at their cross sections and how their evolution with the molecule size. Following the collision the molecule can fragment, which leads to another interesting aspect, the fragmentation branching ratios. Collision experiments were done using a Tandem accelerator at Orsay that produced the CnN⁺ projectiles and a dedicated set-up, AGAT, to capture the flying fragments/intact molecule after collision according to their charge to mass ratio. Knowing the number of particles that are shot and the fact that our set-up allows no loss of fragments/intact molecule, we could get the probabilities of various fragments formed. Using these probabilities and a knowledge of the Helium jet profile used, we could measure their cross sections. The probabilities alone are sufficient to obtain the fragmentation branching ratios.The next step was to use a theoretical model to simulate the collision. We used Independent Atom and Electron (IAE) model coupled with Classical Trajectory Monte Carlo (CTMC) method to calculate the desired cross sections. A general good agreement was obtained, with the exception of double electron capture. The model could also predict, through the calculation of the species internal energy, the fragmentation branching ratios of cations CnN⁺ after electronic excitation. Also, the branching ratios were used to construct semi-empirical Breakdown Curves (BDCs), which are internal energy dependent dissociation branching ratios specific to each molecule, type, size and charge. With those, we could recommend products branching ratios to be used for various processes of astrochemical interest. The products branching ratios will be made available for a wider network of researchers under the international Kinetic Database for Astrochemistry (KIDA).This thesis was realized under the doctoral programme of Ecole Doctorale Ondes et Matiere (EDOM) with Institut des Sciences Moléculaires d’Orsay (ISMO) where the author was given an office and Université Paris-Sud where the author is formally enrolled
APA, Harvard, Vancouver, ISO, and other styles
38

Yilmaz, Ercan. "Characteristic X-ray, Photoelectron And Compton-scattered Photon Escape From A Hpge Detector." Phd thesis, METU, 2003. http://etd.lib.metu.edu.tr/upload/3/1210061/index.pdf.

Full text
Abstract:
Escape of photoelectrons, Compton-scattered photons and Ge X-rays from a HPGe detector was studied as a function of energy in the range 8-52 keV. A variable-energy source producing Cu, Rb, Mo, Ag, Ba, and Tb X-rays was used. All three mechanisms for energy loss were observed in the same experiment for Ba and Tb, while only X-ray and photoelectron escapes were evident in the spectra for Ag, Mo, Rb, and Cu. Spectral features and possible mechanisms for partial energy deposition were investigated. A Monte Carlo program was used to simulate the relevant interactions and to estimate the escape probabilities.
APA, Harvard, Vancouver, ISO, and other styles
39

Dias, Pedro Augusto Parente. "Entregas noturnas no município de São Paulo: percepções dos motoristas e recebedores." Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/3/3148/tde-17012017-073605/.

Full text
Abstract:
As pesquisas para o desenvolvimento de uma cidade mais sustentável, com menos trânsito e mais qualidade de vida, apontam a necessidade de elaboração de políticas voltadas para o transporte de carga. As entregas urbanas de carga sendo feitas no período noturno pode ajudar a melhorar o trânsito da cidade, e evitar acidentes com ciclistas, pedestres, ônibus e carros. Ao mesmo tempo, entregar mercadorias a noite pode ser vantajoso tanto para os varejistas quanto para os transportadores. Da perspectiva dos varejistas, pode haver mais agilidade em conferir e armazenar mercadorias, melhora do nível de atendimento ao cliente, e mais certeza quanto à pontualidade do horário da chegada do caminhão. Quanto aos transportadores, o tempo de rota pode diminuir, assim como tempo para buscar uma vaga para estacionar; as filas de caminhões podem diminuir, oferecendo mais eficiência ao roteiro de entregas. Entretanto, alguns problemas podem surgir ao executar os descarregamentos noturnos, relacionados à emissão de ruídos e ao risco de assaltos. O objetivo desta pesquisa é assinalar os principais aspectos que influenciam na eficiência das entregas noturnas no âmbito operacional, e identificar quais são os problemas relacionados à execução das entregas noturnas. Para cumprir estes objetivos foi aplicado um questionário a 100 motoristas, e outro questionário a 84 varejistas que realizam operações com entregas/recebimento de mercadorias no período noturno. A partir destes dados, análises estatísticas foram feitas para assinalar quais variáveis de análise estão mais associadas à escolha por executar as atividades de frete no período noturno. Os resultados mostraram que os ruídos são mais críticos para a tomada da decisão do horário de entrega, para os varejistas. Quanto aos resultados atrelados à eficiência logística, a vantagem em fazer entregas noturnas está na maior agilidade para conferir e armazenar as mercadorias e maior assertividade quanto ao horário de entrega, devido às condições do trânsito e da facilidade em estacionar o veículo para efetuar o descarregamento.
Research findings on building a more sustainable city, with less traffic and better quality of life, point to the need for new policies for cargo transport. Urban overnight deliveries can improve city traffic and prevent accidents involving cyclists, pedestrians, buses and cars. At the same time, overnight deliveries may be favorable both for retailers and for carriers. From the perspective of retailers, overnight deliveries can make checking and storing goods more efficient, be more punctual and improve the customer service level. For drivers, route times and the time spent looking for a parking place may decrease; truck queues may be reduced, making the delivery route more efficient. However, some problems may arise when performing night freight operations, related to noise emission and the risk of robbery. The objective of this research is to identify the main aspects that influence the efficiency of night deliveries in the operational context and to identify the problems related to night freight. In order to meet these objectives, a questionnaire was applied to 100 drivers and another to 84 retailers. Statistical analyses indicated that, for retailers, noise is the most critical aspect of night deliveries. As for the results related to logistic effectiveness, the advantage of making overnight deliveries lies in the greater agility in checking and storing the goods and the greater accuracy of the delivery schedule, owing to traffic conditions and the ease of parking the vehicle to perform the unloading. In addition, a multivariate analysis associates the quality of customer service with the punctuality of the truck and the agility in checking and storing goods.
APA, Harvard, Vancouver, ISO, and other styles
40

CARVALHO, Thiago Milograno de. "Monte Carlo quântico aplicado ao estudo do comportamento quântico-clássico do Neônio." Universidade Federal de Goiás, 2009. http://repositorio.bc.ufg.br/tede/handle/tde/810.

Full text
Abstract:
In this work we have applied the finite-temperature Quantum Monte Carlo method known as Path Integral Monte Carlo (PIMC) to study the quantum-classical behavior of Neon. We have calculated the one-body density matrix as well as the atomic momentum distribution, which has been shown to be significantly different from the classical Maxwell-Boltzmann distribution in the range of densities and temperatures studied. The deviations from a classical Gaussian are substantial, but they decrease at temperatures above T = 35 K or densities below ρ = 20 nm⁻³. Furthermore, at low temperature the results show that there are more low-momentum atoms than in a classical Gaussian distribution.
Neste trabalho aplicamos o método de Monte Carlo Quântico à temperatura finita conhecido como Path Integral Monte Carlo (PIMC) a fim de estudar o comportamento quântico-clássico do Neônio. Calculamos a matriz densidade de um corpo, bem como a distribuição de momento atômica que mostrou ser significativamente diferente da distribuição clássica de Maxwell-Boltzmann nos intervalos de densidade e temperatura estudados. Os desvios de uma gaussiana clássica são substanciais porém esses desvios diminuem para temperaturas acima de T = 35 K ou densidades abaixo de p= 20 nm−3. Além disso, para baixas temperaturas os resultados mostram que há mais átomos com momentos menores do que na distribuição clássica gaussiana.
APA, Harvard, Vancouver, ISO, and other styles
41

Andrade, Luís Emmanuel Carvalho de. "Um estudo sobre terminais intermodais para granéis sólidos." Universidade de São Paulo, 2003. http://www.teses.usp.br/teses/disponiveis/3/3135/tde-19042007-164002/.

Full text
Abstract:
Este trabalho está relacionado com a implantação de adequados terminais intermodais para granéis sólidos nas margens das hidrovias, de modo a aumentar a participação do modal fluvial na matriz de transporte do país. Apresenta-se, inicialmente, um panorama da distribuição modal de transportes em que se enfatiza a deficiência dos terminais fluviais existentes como um fator responsável por esta distribuição. É estabelecido, então, como objetivo do trabalho o desenvolvimento do projeto de um terminal intermodal para a hidrovia Tietê-Paraná. O primeiro passo para o projeto é um estudo de demandas e de capacidade da via onde se define: o tipo de carga, quantidade de carga movimentada atualmente e a projeção de demanda, bem como a distribuição da movimentação ao longo do ano. E, em seguida, estima-se a capacidade de movimentação da via. Para desenvolver o projeto é feita uma compilação de diretrizes propostas para dimensionamento de terminais e de seus subsistemas (hidroviário, ferroviário, rodoviário, de armazenagem e de movimentação). Além disto, são analisados os critérios para avaliação do terminal. Antes de desenvolver o projeto, faz-se uma análise crítica dos terminais existentes na hidrovia, assinalando-se suas falhas. Para o dimensionamento do terminal são formuladas diversas configurações que envolvem combinações de diferentes taxas de movimentação e diferentes capacidades de armazenagem. Utilizando a técnica de simulação, com o emprego do software ARENA, obtém-se o desempenho destas configurações em termos de tempo de permanência para cada modal, que mede o nível de serviço oferecido pelo terminal. As alternativas geradas são, então, avaliadas em função do nível de serviço oferecido e do valor presente líquido do investimento, o que conduz à escolha da melhor solução.
This thesis is related to the implementation of suitable intermodal terminals for solid bulk cargo along inland waterways, with the purpose of increasing the participation of the waterway mode in the country's transportation matrix. Firstly, a general view of the modal transportation distribution, heavily based on the highway mode, is presented, in which the deficiency of the existing inland terminals is stressed as a factor responsible for this distribution. The purpose of the thesis is then defined as the development of the design of an intermodal terminal for the Tietê-Paraná waterway. As the first step of the study, an analysis of cargo demand and waterway capacity is performed, defining the type of cargo, the amount of cargo currently transported and its distribution along the year; the future cargo demand is also estimated and remains below the waterway capacity. To prepare the design, a compilation of procedures recommended for multimodal terminals and their subsystems (waterway, railway and highway ends, storage systems and cargo handling equipment) is carried out. In addition, criteria to evaluate terminal performance are presented. A critical analysis of the existing terminals on the waterway is then presented and their drawbacks are pointed out. For the terminal design, several configurations involving combinations of different handling rates and storage capacities are formulated. The performance of each option, in terms of vehicle stay time at the terminal, is evaluated by a probabilistic simulation technique, employing the commercial software ARENA. The generated options are then compared in terms of service level and net present value of the investment, leading to the choice of the best configuration.
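The thesis evaluates terminal configurations with the commercial simulator ARENA; as a minimal open illustration of the same performance measure (vehicle stay time), the sketch below simulates a single unloading berth with the Lindley recursion. The arrival and unloading rates are hypothetical, not the waterway data used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(5)

# Minimal single-berth queue sketch (illustrative rates, not the thesis's data):
# barges arrive at the terminal and wait for one unloading berth (FIFO).
N_BARGES = 100_000
MEAN_INTERARRIVAL_H = 10.0      # hours between barge arrivals (exponential)
MEAN_UNLOAD_H = 7.0             # hours to unload one barge (exponential)

interarrival = rng.exponential(MEAN_INTERARRIVAL_H, N_BARGES)
service = rng.exponential(MEAN_UNLOAD_H, N_BARGES)

# Lindley recursion for the waiting time of successive barges:
# W[n+1] = max(0, W[n] + S[n] - A[n+1])
wait = np.zeros(N_BARGES)
for n in range(N_BARGES - 1):
    wait[n + 1] = max(0.0, wait[n] + service[n] - interarrival[n + 1])

stay = wait + service                      # stay time = waiting + unloading
print(f"Mean stay time at the terminal: {stay.mean():.1f} h")
print(f"95th percentile of stay time:   {np.percentile(stay, 95):.1f} h")
```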
APA, Harvard, Vancouver, ISO, and other styles
42

Borges, Daliana Gomes. "Aproveitamento de embalagens cartonadas em compósito de polietileno de baixa densidade." Universidade de São Paulo, 2007. http://www.teses.usp.br/teses/disponiveis/3/3133/tde-08012008-110235/.

Full text
Abstract:
A produção de materiais plásticos tem se elevado para 169 milhões de toneladas no mundo, no ano de 2003. Uma parte considerável destes polímeros sintéticos, 36% na Europa são destinados ao setor de embalagens. Estas matérias primas são utilizadas por um período de tempo bastante curto e geram um volume de descartes importante. Mesmo com um desenvolvimento considerável de linhas de gestão dos descartes, seu tratamento e sua eliminação colocam ainda problemas provenientes da dificuldade em reutilizar tais embalagens na forma em que são geradas, seja pelo estado de limpeza em que são descartadas, seja pela composição multi-material que é utilizada para sua produção. O presente trabalho busca o reaproveitamento de Embalagens Cartonadas pós-consumo como reforço em compósito com Polietileno de baixa densidade para i) maximizar o teor de ELV empregada no compósito e ii) melhorar o aspecto visual do compósito por meio de incorporação de concentrado de cores e corantes. Por meio do processo de extrusão, preparou-se o compósito na forma granulada. O material granulado foi moldado por injeção para obtenção dos corpos-de-prova para ensaios de tração. As propriedades viscosimétricas do compósito foram avaliadas por meio do Índice de Fluidez e a morfologia do compósito foi avaliada por meio de microscopia eletrônica de varredura e espectroscopia de energia dispersiva de raio X. Nas condições do trabalho pode-se afirmar que o teor de Embalagem Cartonada que apresenta o melhor conjunto de propriedades de tração está entre 20 e 25% (p/p) no compósito; a utilização de concentrado de cor para melhoria do aspecto visual do compósito não tem influência no conjunto de propriedades de tração do mesmo.
World production of plastic materials reached 169 million tons in 2003. A great part of these synthetic polymers, 36% in Europe, is used in packaging applications. These raw materials are used for a very short time and generate a large volume of waste. Even with considerable development of waste management schemes, treatment and disposal still pose problems arising from the difficulty of reusing these packages in the form in which they are discarded, whether because of the state of dirtiness in which they are discarded or because of their multi-material composition. The goal of this work is the reuse of post-consumer Carton Packages (CP) as reinforcement in a Low Density PolyEthylene (LDPE) composite, in order to i) maximize the content of CP in the composite and ii) improve the visual aspect of the composite by means of color masterbatches. The composite was prepared in granulated form by extrusion. The granulate was then injection-molded to obtain test specimens for mechanical testing. The viscosimetric properties and morphology of the composites were evaluated by means of the Melt Flow Index, Scanning Electron Microscopy (SEM) and Energy Dispersive X-Ray Spectroscopy (EDS). Under the conditions of this work, it can be stated that the CP content giving the best set of tensile properties lies between 20 and 25% (w/w), and that the use of color masterbatches to improve the visual aspect of the composite has no influence on its mechanical properties.
APA, Harvard, Vancouver, ISO, and other styles
43

Colnot, Julie. "Risques de complications associés à la radiothérapie externe : étude comparative des doses délivrées aux tissus sains par les techniques avancées de radiothérapie." Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLS299/document.

Full text
Abstract:
Les techniques modernes de radiothérapie externe permettent de délivrer précisément la dose à la tumeur. Ce gain en précision se fait cependant au prix de l’irradiation d’un volume plus important de tissus sains alors susceptibles de développer des lésions radio-induites. Aujourd’hui, les risques de complications représentent un enjeu sociétal important, car l’efficacité des traitements permet aux patients une espérance de vie plus longue, augmentant ainsi la probabilité d’effets secondaires à moyen et à long terme. Cependant, l’estimation des risques est conditionnée par une connaissance précise des doses délivrées aux organes sains, directement corrélées aux risques de complications. Ces doses restent encore méconnues, car renseignées de manière incomplète et imprécise par les systèmes de planification de traitement (TPS). Dans ce contexte, l’objectif de la thèse est d’évaluer avec précision les doses délivrées aux tissus sains par les techniques avancées de radiothérapie. D’une part, une étude comparative des doses délivrées aux tissus sains par différentes techniques avancées a été réalisée et, d’autre part, les performances, en termes d’évaluation des doses aux tissus sains des algorithmes des TPS ont été évaluées. Des méthodes numériques et expérimentales ont donc été développées. Tout d’abord, un modèle Monte-Carlo PENELOPE de l’accélérateur Cyberknife a été étendu et validé en 1D et 2D pour l’évaluation des doses hors champ. Ce modèle a ensuite été utilisé pour déterminer les doses délivrées aux tissus sains lors d’un traitement de la région pulmonaire. Cette étude a ainsi permis de fournir des données d’entrée pour les modèles de risque et enfin, de mettre en évidence l’apport en précision de la simulation Monte-Carlo détaillée par rapport au TPS. De plus, un outil expérimental de reconstruction de la dose en 3D à partir de mesures par films radiochromiques a été développé. Un protocole de dosimétrie par gel dosimétrique a également été mis en place. Après validation en 2D et en 3D, l’outil de reconstruction a été mis en œuvre pour comparer les doses délivrées aux tissus sains par trois techniques de radiothérapie (conformationnelle, VMAT et tomothérapie) pour un traitement rénal pédiatrique. Bien que les techniques avancées offrent une excellente conformation, les tissus sains reçoivent des doses jusqu’à 3 fois plus élevées en comparaison avec la radiothérapie conformationnelle. La tomothérapie, disposant d’un blindage supplémentaire, épargne mieux les tissus que le VMAT. Finalement, contrairement à Eclipse™, le TPS de la tomothérapie détermine précisément des doses délivrées jusqu’à 30 cm du champ
Advanced radiotherapy techniques enable highly conformal dose distribution to the tumor. This higher precision is made at the cost of an increased tissue volume receiving low doses. The exposed organs are then susceptible to develop radio-induced lesions. Nowadays, risks of complications represent an important societal challenge as survival rates are increasing due to treatment efficacy and therefore the risk for a subsequent effect also increases. However, risk assessment requires a precise knowledge of the doses delivered to healthy organs, directly correlated to the risk of complications. Those doses are still unknown as calculated incorrectly by the treatment planning systems (TPS). Within this context, this thesis aims at precisely determining the doses delivered to normal tissues by advanced radiotherapy techniques. On the one hand, a comparative study of the doses delivered by different modern techniques was performed and on the other hand, the performance of the TPS dose computation algorithms was evaluated in terms of healthy tissue doses. Thus, numerical and experimental tools have been developed in this work. First, a PENELOPE Monte-Carlo model of a CyberKnife system has been extended and validated in 1-D and 2-D to determine out-of-field doses. This model was then used to evaluate the doses delivered to healthy tissue by a pulmonary treatment. This study provides requisite dosimetric data to evaluate the risks associated to the treatment and finally, it highlights the important precision of detailed Monte-Carlo simulation in comparison with the TPS. Moreover, an experimental 3-D reconstruction tool was developed thanks to radiochromic film measurements. A protocol of gel dosimetry was also established. After 2-D and 3-D validation, the 3-D tool was applied to compare the doses delivered by three radiotherapy techniques (conformational, VMAT and tomotherapy) in a pediatric renal treatment. While advanced techniques deliver highly conformal dose distribution, the doses to organs located at distance of the target are considerably increased up to a factor 3 in comparison with conformal radiotherapy. The tomotherapy spares the healthy tissues compared to VMAT due to its additional shielding. Finally, unlike Eclipse™, the TPS Tomotherapy enables a precise dose evaluation up to 30 cm from the field edge
APA, Harvard, Vancouver, ISO, and other styles
44

Zhu, Wei. "Molecular dynamics simulation of electrolyte solution flow in nanochannels and Monte Carlo simulation of low density CH 3 Cl monolayer on graphite." Columbus, Ohio : Ohio State University, 2004. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1072284612.

Full text
Abstract:
Thesis (Ph. D.)--Ohio State University, 2004.
Title from first page of PDF file. Document formatted into pages; contains xiv, 90 p.; also includes graphics. Includes abstract and vita. Advisor: Sherwin J. Singer, Dept. of Chemistry. Includes bibliographical references (p. 86-90).
APA, Harvard, Vancouver, ISO, and other styles
45

Okumura, Shintaro. "New Ring-opening Reactions of Four-membered Carbo- and Sila-cyclic Compounds and Synthesis of 2-Alkoxy-1,3-dienes from Propargylic Alcohol Derivatives." Kyoto University, 2018. http://hdl.handle.net/2433/232487.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Aslan, Serdar. "Nonlinear Estimation Techniques Applied To Econometric." Master's thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/3/12605649/index.pdf.

Full text
Abstract:
This thesis considers the filtering and prediction problems of nonlinear noisy econometric systems. As a filter/predictor, the standard tool, the Extended Kalman Filter, and the newer approaches, the Discrete Quantization Filter and the Sequential Importance Resampling Filter, are used. The algorithms are compared by using the Monte Carlo simulation technique. The advantages of the new algorithms over the Extended Kalman Filter are shown.
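As an illustration of the Sequential Importance Resampling idea mentioned above, the sketch below runs a bootstrap particle filter on a standard toy nonlinear state-space model; the model and its parameters are illustrative and are not the econometric system studied in the thesis.

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy nonlinear, noisy state-space model (illustrative, not the thesis's system):
#   x_t = 0.5*x_{t-1} + 25*x_{t-1}/(1+x_{t-1}**2) + v_t,  v_t ~ N(0, 1)
#   y_t = x_t**2 / 20 + w_t,                               w_t ~ N(0, 1)
T, N_PARTICLES = 100, 2_000

# Simulate a synthetic trajectory and observations.
x_true = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.5*x_true[t-1] + 25*x_true[t-1]/(1+x_true[t-1]**2) + rng.normal()
    y[t] = x_true[t]**2 / 20 + rng.normal()

# Sequential Importance Resampling (bootstrap) particle filter.
particles = rng.normal(0.0, 2.0, N_PARTICLES)
estimates = np.zeros(T)
for t in range(1, T):
    # Propagate particles through the state equation (proposal = prior).
    particles = 0.5*particles + 25*particles/(1+particles**2) + rng.normal(size=N_PARTICLES)
    # Weight by the observation likelihood and normalise.
    weights = np.exp(-0.5 * (y[t] - particles**2 / 20)**2) + 1e-300
    weights /= weights.sum()
    estimates[t] = np.sum(weights * particles)
    # Resample (multinomial) to avoid weight degeneracy.
    particles = rng.choice(particles, size=N_PARTICLES, replace=True, p=weights)

print(f"RMSE of the particle filter estimate: {np.sqrt(np.mean((estimates - x_true)**2)):.3f}")
```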
APA, Harvard, Vancouver, ISO, and other styles
47

Yousef, Diana O. "Structural and functional characterization of the lumenal portion of putative cargo receptor, yp24A/Emp24p /." Access full-text from WCMC:, 2007. http://proquest.umi.com/pqdweb?did=1296098021&sid=3&Fmt=2&clientId=8424&RQT=309&VName=PQD.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Nascimento, Carlos Augusto Dornellas do. "Gerenciamento de prazos: uma revisão crítica das técnicas em uso em empreendimentos em regime EPC." Universidade de São Paulo, 2007. http://www.teses.usp.br/teses/disponiveis/3/3135/tde-27072007-152858/.

Full text
Abstract:
O segmento de Consultoria em Engenharia, ao lado das Universidades e Centros de Pesquisas, é um dos pilares do desenvolvimento científico-tecnológico do país e é considerado um dos responsáveis pela otimização de investimentos, redução dos prazos e dos custos de implantação de empreendimentos públicos e privados, industrial ou de infra-estrutura. Atualmente, esses empreendimentos vêm passando por modificações significativas, exigindo cada vez um melhor desempenho de seus contratados, tanto nos aspectos de qualidade, desempenho, prazo e custos, quanto nos aspectos associados aos riscos contratuais, pois tornaram-se freqüentes contratações nas modalidades EPC (engineering, procurement, construction) e "turn-key". Neste novo cenário, as organizações precisam ser mais competitivas para sobreviverem às condições impostas pelo mercado, e, neste novo contexto, o gerenciamento de prazos tem-se tornado um fator crítico de sucesso. Esta dissertação desenvolve uma extensa revisão das técnicas de gerenciamento de prazos, com ênfase na etapa de programação, e uma análise comparativa das principais técnicas utilizadas no gerenciamento de projetos. Para isto, tomou-se como referência o caso de um projeto real, contratado mediante a modalidade EPC, em desenvolvimento por uma Empresa de Consultoria em Engenharia. Foi realizada a comparação entre os Métodos do Caminho Crítico, Corrente Crítica, Análises Probabilísticas e de Monte Carlo, abordando a aplicabilidade dessas técnicas neste caso. Ao final é feita uma síntese dos principais resultados alcançados, bem como dos requisitos demandados na aplicação de cada uma destas técnicas.
The segment of Consultancy in Engineering, as well as the Universities and the Research Centers, are some of the pillars of the scientific-technological development in the country and they have been responsible for optimizing the investments, reducing the deadlines and the costs of project implementation in the public and private sectors, along with the industrial and infra-structure sectors. Nowadays, these projects have been going through major changes, which have demanded better performances from those who are contracted, in areas such as the quality control, performances, time and costs, as well as the contract risks, which more often than ever have fallen into the EPC category (engineering, procurement, and construction) and the 'turn-key' category. In this new scenario, companies need to be more competitive to live through the conditions imposed by the market, and, in this new context, time management has turned into a critical success factor. This dissertation develops an extensive review of the time management techniques focusing on scheduling and also a comparative analysis of the most important techniques used in Project Management. To do that, a real project was taken as reference case, contracted under the EPC category and developed by an Engineering Consultancy. A comparison among the Critical Path Method, the Critical Chain, the Probability Analysis Method and the Monte Carlo Method was made, and it approached the applicability of these techniques in this case. At the end, there is a summary of the most important outcomes, as well as the necessary conditions to apply each of these techniques.
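As a rough sketch of how a Monte Carlo schedule analysis differs from a deterministic Critical Path calculation, the Python fragment below simulates a small, hypothetical activity network with three-point duration estimates. The activities, durations and percentile chosen are illustrative assumptions only, not data from the EPC case study discussed in the dissertation.

import numpy as np

# Minimal Monte Carlo schedule-risk sketch for a toy activity network:
# activity A precedes B and C (which run in parallel); both precede D.
rng = np.random.default_rng(42)
n_runs = 10_000

def sample(optimistic, most_likely, pessimistic, size):
    # Triangular three-point estimates, as commonly used in schedule risk analysis.
    return rng.triangular(optimistic, most_likely, pessimistic, size)

a = sample(8, 10, 15, n_runs)
b = sample(18, 20, 30, n_runs)
c = sample(12, 15, 25, n_runs)
d = sample(4, 5, 9, n_runs)

project = a + np.maximum(b, c) + d        # duration of the longest path in each run
cpm = 10 + max(20, 15) + 5                # deterministic CPM using most likely durations

print(f"Deterministic CPM duration:  {cpm} days")
print(f"Monte Carlo mean duration:   {project.mean():.1f} days")
print(f"P80 (80% confidence) finish: {np.percentile(project, 80):.1f} days")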
APA, Harvard, Vancouver, ISO, and other styles
49

Ozkan, Pelin. "Analysis Of Stochastic And Non-stochastic Volatility Models." Master's thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/3/12605421/index.pdf.

Full text
Abstract:
Changes in variance, or volatility, over time can be modeled deterministically by using autoregressive conditional heteroscedastic (ARCH) type models, or stochastically by using stochastic volatility (SV) models. This study compares these two kinds of models, estimated on Turkish / USA exchange rate data. First, a GARCH(1,1) model is fitted to the data using the EViews package, and then a Bayesian estimation procedure is used for estimating an appropriate SV model with the help of Ox code. In order to compare these models, the LR test statistic calculated for non-nested hypotheses is obtained.
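The contrast between the two model classes can be sketched in Python as follows: in GARCH(1,1) the conditional variance is a deterministic recursion on past data, while in the basic SV model the log-variance is driven by its own latent noise. All parameter values are arbitrary illustrations, not estimates from the Turkish / USA exchange rate data.

import numpy as np

# Illustrative simulation of the two volatility models compared in the thesis.
rng = np.random.default_rng(1)
T = 1000

# GARCH(1,1): sigma2_t = omega + alpha * eps_{t-1}^2 + beta * sigma2_{t-1}
omega, alpha, beta = 0.05, 0.08, 0.90
eps_garch = np.zeros(T)
sigma2 = np.full(T, omega / (1 - alpha - beta))   # start at the unconditional variance
for t in range(1, T):
    sigma2[t] = omega + alpha * eps_garch[t - 1] ** 2 + beta * sigma2[t - 1]
    eps_garch[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

# Basic SV model: h_t = mu + phi * (h_{t-1} - mu) + eta_t,  y_t = exp(h_t / 2) * eps_t
mu, phi, sigma_eta = -1.0, 0.97, 0.15
h = np.full(T, mu)
for t in range(1, T):
    h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.standard_normal()
y_sv = np.exp(h / 2) * rng.standard_normal(T)

# Both processes show the heavy tails typical of exchange rate returns.
print("GARCH sample kurtosis:", float(np.mean(eps_garch**4) / np.mean(eps_garch**2)**2))
print("SV sample kurtosis:   ", float(np.mean(y_sv**4) / np.mean(y_sv**2)**2))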
APA, Harvard, Vancouver, ISO, and other styles
50

Fernandez, Marcelo Luiz Alves. "Aplicações de documentação fiscal eletrônica em sistemas logísticos: casos práticos." Universidade de São Paulo, 2018. http://www.teses.usp.br/teses/disponiveis/3/3143/tde-22052018-142117/.

Full text
Abstract:
Os documentos fiscais eletrônicos estão implantados no Brasil desde 2006, e significaram relevante modernização na forma como as operações comerciais são documentadas no país. Contudo, apesar de possuírem informações que transcendem a fiscalização tributária, a legislação brasileira impõe o chamado "sigilo fiscal", restringindo o acesso a esses documentos apenas à administração tributária. Nesse sentido, o trabalho tem os objetivos de analisar o conceito e os contornos do sigilo fiscal imposto pela legislação brasileira e propor aplicações práticas de como as informações constantes nos documentos fiscais eletrônicos poderiam ser utilizadas pelos demais órgãos públicos e pela iniciativa privada. Dentre os diversos usos potenciais o trabalho foca no tema de sistemas logísticos, mais especificamente no que se refere ao rastreamento de veículos e mercadorias. Nesse sentido são apresentados e detalhados quatro casos práticos de interesse para a iniciativa privada e outros órgãos de fiscalização: identificação do trânsito de veículos de carga com excesso de peso em rodovias; identificação do transporte de produtos perigosos em áreas urbanas; rastreamento de mercadorias de alto valor agregado pelas empresas; e o conhecimento prévio do fluxo de mercadorias, para fins de planejamento de carga e descarga em armazéns, portos e aeroportos. Aborda, também, um caso prático de rastreamento útil para a própria administração tributária, as chamadas "fronteiras virtuais". Apresenta, por fim, a especificação técnica mínima para a implantação dos quatro casos práticos de rastreamento, que servirá tanto para a Secretaria da Fazenda como para os órgãos e empresas interessados nas informações contidas nos documentos eletrônicos.
Electronic tax documents have been implemented in Brazil since 2006, and have meant a relevant modernization in the way commercial operations are documented in the country. However, despite carrying information that transcends tax inspection, Brazilian law imposes so-called "fiscal secrecy", restricting access to these documents to the tax administration alone. In this sense, the objectives of this paper are to analyze the concept and the contours of fiscal secrecy imposed by Brazilian legislation and to propose practical applications of how the information contained in electronic tax documents can be used by other public agencies and by the private sector. Among the various potential uses, the work focuses on the topic of logistics systems, more specifically on the tracking of vehicles and goods. In this sense, four practical cases of interest to the private sector and other inspection bodies are identified and detailed: identification of the traffic of overloaded vehicles on highways; identification of the transport of hazardous products in urban areas; tracking of high value-added goods by companies; and prior knowledge of the flow of goods for loading and unloading planning in warehouses, ports and airports. It also addresses a practical tracking case useful for the tax administration itself, the so-called "virtual borders". Finally, it presents the minimum technical specification for the implementation of the four practical tracking cases, which will serve both the Treasury Department and the agencies and companies interested in the information contained in the electronic documents.
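A minimal sketch of the first practical case (flagging overweight cargo vehicles from declared weights in electronic transport documents) might look like the Python fragment below. The record layout, field names and the 45,000 kg legal limit are hypothetical assumptions for illustration, not part of the specification proposed in the dissertation.

from collections import defaultdict

# Hypothetical sketch: aggregate the gross weights declared in electronic
# transport documents that share the same vehicle plate and trip, and flag
# totals above an assumed legal limit. All records and fields are illustrative.
LEGAL_LIMIT_KG = 45_000

documents = [
    {"plate": "ABC1234", "trip": "2018-03-01", "declared_weight_kg": 28_000},
    {"plate": "ABC1234", "trip": "2018-03-01", "declared_weight_kg": 22_500},
    {"plate": "XYZ9876", "trip": "2018-03-01", "declared_weight_kg": 18_000},
]

totals = defaultdict(float)
for doc in documents:
    totals[(doc["plate"], doc["trip"])] += doc["declared_weight_kg"]

for (plate, trip), weight in totals.items():
    status = "OVERWEIGHT" if weight > LEGAL_LIMIT_KG else "ok"
    print(f"{plate} on {trip}: {weight:.0f} kg -> {status}")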
APA, Harvard, Vancouver, ISO, and other styles