Dissertations / Theses on the topic 'Generative Principle'

Consult the top 50 dissertations / theses for your research on the topic 'Generative Principle.'

1

Verbeeck, Kenny. "Randomness as a generative principle in art and architecture." Thesis, Massachusetts Institute of Technology, 2006. http://hdl.handle.net/1721.1/35124.

Full text
Abstract:
Thesis (S.M.), Massachusetts Institute of Technology, Dept. of Architecture, 2006. Includes bibliographical references (leaves [87]-[98]).

As designers have become more eloquent in exploiting the powerful yet generic calculating capabilities of the computer, contemporary architectural practice seems to have set its mind on creating a logic machine that designs from predetermined constraints. Generating form from mathematical formulae thus gives the design process a scientific twist that allows the design to present itself as the outcome of a rigorous and objective process. Several designer-computer relations have been explored so far; the common models are often described as either pre-rational or post-rational. Yet another approach is the irrational. The hypothesis is that the early design process needs the unexpected rather than iron logic. This research investigated how randomness as a generative principle could present the designer with a creative design environment. The analysis and reading of randomness in art and architecture takes as examples works in which the artist or designer treated uncertainty or unpredictability as an intricate part of the process. Most of the selected works embed an instigating and an interpreting party in the making of the work; the negotiation of boundaries between the two parties determines the development of the work. Crucial to the selected works was the surrendering of control or choice from one party to another - whether human, machine or nature - as a generative principle. Jackson Pollock serves as the analog example of a scattered computation: an indefinite number of calculations, each with a degree of randomness, that relate in a rhizomic manner. Pollock responds to each of these outcomes, allowing the painting to form from intentions rather than expectations. This looking-and-acting aspect of Pollock's approach is illustrated in the Jackson Pollock shape grammar. Ultimately the investigation of randomness in art is translated to architecture by comparing Pollock's approach in his drip paintings to Greg Lynn's digital design process in the Port Authority Gateway project. In the Pollock approach to digital design, agency is given to the tools at hand, yet at the same time the sheer indefinite number of designer-system interactions allows the design to emerge from that constructive dialogue in an intuitive manner.
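The "scattered computation" reading of Pollock's process - a random gesture followed by an interpreting response - can be sketched as a toy random walk in which each step mixes chance with a corrective reaction to the work so far. The weights and update rule below are invented for illustration and are not taken from the thesis:

```python
import random

def drip_painting(steps=200, seed=42):
    """Toy 'scattered computation': each gesture has a random component
    (the instigating party), and an interpreting response nudges the next
    gesture toward the running centroid of the work so far -- intention
    rather than expectation. Weights are invented for illustration."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    cx, cy = 0.0, 0.0            # running centroid the 'artist' responds to
    points = [(x, y)]
    for i in range(steps):
        dx, dy = rng.uniform(-1, 1), rng.uniform(-1, 1)   # random gesture
        x += dx + 0.1 * (cx - x)                          # interpretive pull
        y += dy + 0.1 * (cy - y)
        points.append((x, y))
        cx += (x - cx) / (i + 2)                          # update centroid
        cy += (y - cy) / (i + 2)
    return points

canvas = drip_painting()
```

Fixing the seed makes the run reproducible while each step remains locally random - a crude analogue of responding to outcomes rather than prescribing them.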
2

Noh, H. Gerrey. "My Mother's Death; Generative Principles in Schenkerian Performance Expression." Kent State University / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=kent1373383456.

Full text
3

LeBlanc, David C. "The generation of phrase-structure representations from principles." Thesis, University of British Columbia, 1990. http://hdl.handle.net/2429/29338.

Full text
Abstract:
Implementations of grammatical theory have traditionally been based upon Context-Free Grammar (CFG) formalisms, which all but ignore questions of learnability. Even implementations based upon theories of Generative Grammar (GG), a paradigm supposedly motivated by learnability, rarely address such questions. In this thesis we examine a GG theory formulated primarily to address questions of learnability and present an implementation based upon it. The theory argues from Chomsky's definition of epistemological priority that principles which match elements and structures from prelinguistic systems with elements and structures in linguistic systems are preferable to those which are defined purely linguistically or non-linguistically. A procedure for constructing phrase-structure representations from prelinguistic relations using principles of node percolation (rather than the traditional X-bar theory of GG or the phrase-structure rules of CFG) is presented, and this procedure is integrated into a left-right, primarily bottom-up parsing mechanism. Specifically, we present a parsing mechanism which derives phrase-structure representations of sentences from Case- and theta-relations using a small number of Percolation Principles. These Percolation Principles simply determine the categorial features of the dominant node of any two adjacent nodes in a representational tree, doing away with explicit phrase-structure rules altogether. The parsing mechanism also instantiates appropriate empty categories using a filler-driven paradigm for leftward argument and non-argument movement. Procedures modelling learnability are not implemented in this work, but the applicability of the presented model to a computational model of language is discussed. (Faculty of Science, Department of Computer Science.)
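The idea of deciding which of two adjacent nodes projects its categorial features to the dominating node, without explicit phrase-structure rules, can be illustrated with a toy left-to-right, bottom-up combiner. The category inventory and the percolation table here are simplified stand-ins, not the thesis's actual Case- and theta-based principles:

```python
# A toy shift-reduce combiner: a small table of "percolation principles"
# decides which of two adjacent nodes projects its category upward --
# no explicit phrase-structure rules. Categories and table entries are
# simplified illustrations only.

PERCOLATE = {
    ("V", "N"): "V",   # verb + object: V projects
    ("N", "V"): "V",   # subject + verb: V projects
    ("P", "N"): "P",   # preposition + object: P projects
    ("D", "N"): "N",   # determiner + noun: N projects (simplified)
}

def parse(categories):
    """Scan lexical categories left to right, greedily reducing adjacent
    pairs via the percolation table; returns the remaining node stack."""
    stack = [categories[0]]
    for cat in categories[1:]:
        stack.append(cat)
        # reduce while a percolation principle applies to the top two nodes
        while len(stack) >= 2 and (stack[-2], stack[-1]) in PERCOLATE:
            right = stack.pop()
            left = stack.pop()
            stack.append(PERCOLATE[(left, right)])
    return stack

root = parse(["D", "N", "V", "D", "N"])   # e.g. "the dog chased the cat"
```

A full reduction to a single verbal projection (`["V"]`) plays the role of a successful parse; unreduced stacks signal a structure the toy table cannot license.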
4

Erazo, Karlborg Misha. "Generation Zs syn på och attityd till hållbarhetsredovisningen." Thesis, Högskolan i Borås, Akademin för textil, teknik och ekonomi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:hb:diva-23792.

Full text
Abstract:
The sustainability report can be seen as a communication tool for companies towards their stakeholders. Since consumers are generally a company's primary source of income, they are an important stakeholder group. Generation Z is a consumer group that, starting this year, is expected to be the largest and to account for almost 40% of the world's consumption. This study aims to examine and understand Generation Z's perception of the sustainability reports of two of Sweden's biggest fast fashion companies, H&M and Gina Tricot, based on the 3C principle. Furthermore, it investigates whether consumers' attitudes towards each fast fashion company change after they have read the sustainability reports. The consumers' attitudes were analyzed using Solomon et al.'s (2016) ABC model. Ten semi-structured interviews were conducted in which the participants first shared their general opinions about the companies, then received the sustainability reports, and answered additional questions the following day. The results showed that consumers perceive each sustainability report as clear and comparable. For the participants who found a sustainability report credible, the recurring reason was that it contained more information. Consumers' attitudes towards the fast fashion companies did not change in many respects after reading the sustainability reports. However, participants who had a positive feeling towards a company beforehand, and who in addition developed a positive feeling towards its sustainability report, reported reinforced behavior; in other words, they were willing to shop more at that company. Since attitude and behavior do not always align, though, one can only assume that they will act accordingly. The thesis is written in Swedish.
5

Elliott, S. M. "The role of the female principle in Aristotle's de Generatione Animalium." Thesis, University of Cambridge, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.598815.

Full text
Abstract:
This thesis examines Aristotle's account of the female role in reproduction in his de Generatione Animalium (GA). Aristotle says that the female contributes matter, while the male contributes form and movement to the offspring. In the beginning of the five-book treatise, Aristotle concentrates on the role of the male parent and emphasises the importance of form by using numerous comparisons with production in crafts: the male contributes like the carpenter who imposes his idea onto inert and passive matter. By reading only Books One and Two, many get the impression that the female contribution is passive and inert and contributes nothing to the type of animal that is to be produced. I argue that this misrepresents Aristotle's theory of generation. Only by reading the treatise in its entirety does the reader understand that matter in animal generation is different from the matter used in crafts. The female matter is poised to become the type of animal it will be because it contributes specific movements and potentialities. The mother's soul has worked up the material to be the type of matter needed for her own life or that of her offspring. My thesis maintains that Aristotle never meant to indicate by his craft analogies that animal generation is similar to craft production in all respects. The first two Books do not emphasise the complexities of biological matter; in this section of the treatise, Aristotle is preoccupied with his refutation of a theory that posits two identical parental seeds. For this particular argument, he need only show that male and female contributions are not identical. Although he thinks that male and female contributions are dissimilar, he does not think they are absolutely asymmetrical.
6

Trivedi, Vrinda. "Gamification Principles Applied in an Undergraduate Lecture Environment." University of Cincinnati / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1584001510904202.

Full text
7

Palmgren, Myrna. "Optimal Truck Scheduling : Mathematical Modeling and Solution by the Column Generation Principle." Doctoral thesis, Linköping : Linköpings universitet, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-3590.

Full text
8

Bertocchi, Matteo. "First principles Second-Harmonic Generation in quantum confined silicon-based systems." Palaiseau, Ecole polytechnique, 2013. http://pastel.archives-ouvertes.fr/docs/00/79/69/33/PDF/tesi-bertocchi.pdf.

Full text
Abstract:
In this thesis I address the ab initio description of the second-harmonic generation (SHG) process, a nonlinear optical property of materials, focusing in particular on quantum-confined, silicon-based systems. In recent decades, the accuracy and scope of ab initio studies have proved highly relevant for both the interpretation and the prediction of material properties, so improving the first-principles description of nonlinear optical processes such as SHG is essential. Owing to nontrivial difficulties, nonlinear optics has not yet reached the accuracy and maturity of linear phenomena. The state of the art of ab initio SHG calculations includes many-body effects such as crystal local fields (LF) and the electron-hole interaction, but the most widely used approach today is the independent-particle approximation (IPA), the only one able to handle calculations on complex structures such as surfaces and interfaces. Whereas IPA can be a good approximation for bulk systems, in discontinuous materials other effects may be predominant, so their description is of great relevance despite the lack of studies. This thesis gives a first analysis of the SHG process in more complex systems, namely interfaces and Si-confined systems, inferring new insights into the physical mechanism and its link with the nature of the system. I use an efficient formalism based on Time-Dependent Density Functional Theory (TDDFT), in which many-body effects are included via an appropriate choice of TDDFT kernels. Both the formalism and the code were developed during the thesis work, permitting the study of complex materials. The research focused on the Si(111)/CaF2 (T4 B-type) interface as a case study. Convergence studies show the importance of the semiconductor material with respect to the insulator: the response is characteristic of a deep region beyond the Si interface, whereas CaF2 converges soon after the first interface layers. Moreover, the signal is sensitive to the electronic-state modifications induced far below the interface, and not to the Si ionic structure, which quickly recovers the bulk configuration. A normalization procedure for comparison with experiment is proposed. The SHG spectra were calculated in the IPA and then with local-field and excitonic interactions included. New behaviors were observed with respect to SHG in strained silicon, GaAs or SiC, showing in particular the importance of crystal local-field effects relative to both the IPA and excitons. Whereas IPA can describe the positions of the main SHG peaks, and excitonic effects only slightly modify the total intensity, only LF correctly reproduce the spectral shape and the relative intensities of the peaks. This underlines how strongly SHG and the effects involved depend on the nature of the materials. New methods of analysis of the response are proposed: the direct link between peak positions and transition energies is lost in SHG calculations, since the signal comes from a second-order Dyson equation in which linear and nonlinear response functions at different frequencies are mixed. Furthermore, the complexity of the system allowed me to extend the study to a large variety of materials, such as multilayers and confined silicon slabs. The results show good agreement with experiment, confirming the proposed T4 B-type interface structure. This underlines the accuracy of the formalism and the possibility of improving our knowledge of these complex materials beyond standard approaches, and confirms that ab initio SHG simulations can be employed as a predictive technique, supporting and guiding experiments and technological developments. Preliminary results on Si/Ge superlattices are also presented.
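The physical core of SHG - a quadratic response radiating at twice the driving frequency - can be seen with a few lines of standard-library Python. This toy spectrum of a chi(2)*E^2 term has nothing to do with the thesis's TDDFT machinery; it only shows why the signal appears at 2w:

```python
import math

def dft_magnitudes(signal):
    """Magnitudes of the discrete Fourier transform (naive O(N^2), stdlib only)."""
    n = len(signal)
    mags = []
    for k in range(n // 2):
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(-signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mags.append(math.hypot(re, im))
    return mags

n, pump_freq = 256, 8
pump = [math.cos(2 * math.pi * pump_freq * t / n) for t in range(n)]
# quadratic polarization ~ chi2 * E^2: since cos^2 = 1/2 + cos(2wt)/2,
# the oscillating part of the response sits at twice the pump frequency
response = [0.5 * e * e for e in pump]
spectrum = dft_magnitudes(response)
second_harmonic = max(range(1, len(spectrum)), key=lambda k: spectrum[k])
```

Apart from a DC offset, the dominant spectral line of the quadratic response lands at exactly twice the pump's frequency bin.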
9

Sharp, Randall Martin. "A model of grammar based on principles of government and binding." Thesis, University of British Columbia, 1985. http://hdl.handle.net/2429/24917.

Full text
Abstract:
This thesis describes an implementation of a model of natural language grammar based on current theories of transformational grammar, collectively referred to as Government and Binding (GB) theory. A description is presented of the principles of GB, including X-bar syntax and the theories of Case, Theta, Binding, Bounding, and Government. The principles, in effect, constitute an embodiment of "universal grammar" (UG), i.e. the abstract characterization of the innately endowed human language faculty. Associated with the principles is a set of parameters that alter the effect of the principles. The "core grammar" of a specific language is an instantiation of UG with the parameters set in a particular way. To demonstrate the cross-linguistic nature of the theory, a subset of the "core grammars" of Spanish and English is implemented, including their parametric values and certain language-specific transformations required to characterize grammatical sentences. Sentences in one language are read in and converted through a series of reverse transformations to a base representation in the target language. To this representation, transformations are applied that produce a set of output sentences. The well-formedness of these sentences is verified by the general principles of UG as controlled by the parameters. Any that fail to meet the conditions are rejected, so that only grammatical sentences are displayed. The model is written in the Prolog programming language. (Faculty of Science, Department of Computer Science.)
10

Davies, Tom Graham. "Sustainability assessment : towards a new generation of policy assessment, principles and process." Thesis, University of Auckland, 2010. http://hdl.handle.net/2292/5723.

Full text
Abstract:
This thesis examines the current state of Strategic Environmental Assessment (SEA) in relation to its explicit aim of Sustainable Development (SD). The argument developed is that with increasing levels of theoretical and practical interest in Strong Sustainability (SS), the current, largely retrospective model of SEA, and its application, need to be re-envisioned. The thesis therefore improves this existing model of SEA to ensure greater sensitivity to concerns for stronger sustainability. The new model is referred to as Strategic Sustainability Development (SSD). This model is tested against the current Auckland Regional Growth Strategy (ARGS) to determine whether an SEA model with explicit Strong Sustainability references can generate a workable process for governments to achieve SD-related goals. In particular, the new model is assessed to see whether applying it to the issue of Climate Change would result in substantive environmental gains in Auckland, New Zealand. The ARGS as it stands in 2008 has taken a small but important step towards Sustainability by recognizing the need for limiting spatial growth. To this end, the Auckland Regional Growth Forum has instigated the creation of Auckland's Metropolitan Urban Limit and internal growth conurbations. These developments, while positive, will however fail to address the key environmental issues facing Auckland. The ARGS, by adopting a framework that conforms to the standard of only Weak Sustainability, will continue to encourage a social and economic growth discourse that promotes unsustainable consumption, social dysfunction, and environmental problems such as air pollution. This thesis therefore argues that the ARGS should employ a model of SEA based on SS, namely the model developed in this thesis, SSD. Had a model such as SSD been applied to Auckland's development over the last decade, significant positive environmental outcomes may have been achieved.
Recent developments, such as the signing of the Kyoto Protocol, represent a qualitative change in the way environmental issues are now taken seriously by governments and publics alike. Environmentalism has thus reached a tipping point where governments clearly have a mandate to give substantive attention to environmental issues. This thesis provides a clear model that can be applied by governments to achieve sustainable environmental outcomes.
11

Zheng, Y. "Three-dimensional mesh generation and visualization for finite elements : principles and practice." Thesis, Swansea University, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.636734.

Full text
Abstract:
Mesh generation and visualization is a bottleneck problem in all finite element based numerical simulations. The research work reported in this thesis is a step towards an integrated 3D unstructured mesh generation toolkit; both theoretical and implementational aspects are addressed. The general concepts of mesh generation are presented, and in the light of these concepts the fundamental aspects of Delaunay triangulation for the unstructured mesh generator are formulated. The Steiner point creation algorithm is examined by means of numerical experiments with respect to mesh control parameters. The concepts of a point spacing tensor and a point insertion criterion are introduced to deal with anisotropic meshes. Surface meshes are created from 2D geometries by means of coordinate mapping; triangular and quadrilateral patches in linear and quadratic forms, as well as Non-Uniform Rational B-Spline (NURBS) patches, are used to define surface geometry. Mesh quality improvement is discussed in terms of parametric-plane stretching, diagonal swapping and smoothing procedures. Volume meshes are generated through 3D triangulation and interior point creation based on the surface meshes. Boundary surface conformity is gained via edge swapping and boundary edge and surface recovery, and the robustness of the algorithm is discussed in view of the accuracy of geometric judgements. Quality metrics for tetrahedral elements, a mesh smoothing technique, visual quality assessment, and an adaptivity extension are discussed. Data conversion between this mesh generator and some existing CAD packages is considered, and a scheme for interactive topology specification is put forward for a 3D multiblock mesh generator. Finally, a finite element visualization facility, FEView, is presented. This visualization tool is implemented on top of an object-oriented graphics library and works as an external module to the interactive program Geomview. Within FEView, a mesh can be viewed as a collection of faces with edges, as a wire frame, or as point and element clouds, and FE analysis results can be visualized via color shading and field icons. FEView provides functions such as domain control, animation control, and local analysis. Numerical results for 2D cases can be shown with 3D effects by using the values of the scalar field.
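The diagonal swapping mentioned above rests on the Delaunay incircle predicate: a diagonal of a quadrilateral is flipped when the fourth vertex falls inside the circumcircle of the adjacent triangle. A minimal 2D floating-point sketch (without the robust geometric-judgement safeguards the thesis discusses) looks like this:

```python
def in_circumcircle(a, b, c, d):
    """True if point d lies strictly inside the circumcircle of triangle
    (a, b, c), given in counter-clockwise order -- the Delaunay incircle
    test, as a 3x3 determinant of lifted coordinates."""
    ax, ay = a[0] - d[0], a[1] - d[1]
    bx, by = b[0] - d[0], b[1] - d[1]
    cx, cy = c[0] - d[0], c[1] - d[1]
    det = (
        (ax * ax + ay * ay) * (bx * cy - cx * by)
        - (bx * bx + by * by) * (ax * cy - cx * ay)
        + (cx * cx + cy * cy) * (ax * by - bx * ay)
    )
    return det > 0

def should_swap_diagonal(a, b, c, d):
    """For a quadrilateral a-b-c-d triangulated as (a,b,c) + (a,c,d):
    swap diagonal a-c for b-d when d violates the incircle criterion."""
    return in_circumcircle(a, b, c, d)
```

With exact arithmetic this predicate is unambiguous; in floating point, nearly cocircular configurations make `det` hover near zero, which is exactly where the robustness concerns raised in the thesis arise.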
12

Jonasson, Anna, Ben Kneppers, and Brendan Moore. "Principles-Based Comparison Framework for Renewable Electricity Options." Thesis, Blekinge Tekniska Högskola, Avdelningen för maskinteknik, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4154.

Full text
Abstract:
Electricity generation is both a major contributor to the root causes of environmental unsustainability and an energy source that will likely play an important role in the transition to a sustainable society. Because renewable sources of electricity generation are seen as sustainable as a group, there is a danger that investments will be made in renewable technologies that do not effectively move society towards sustainability. We propose the use of a scientific, principles-based definition of sustainability to compare current and future renewable electricity options on their sustainability potential. This study presents a pilot decision-support comparison tool, Guide for Sustainable Energy Decisions (GSED), designed to give investors, policy makers, and manufacturers strategic guidance on the most effective renewable technologies to invest in for sustainability. The tool is based on a modified version of life cycle assessment (LCA) that allows comparisons of the upstream and downstream effects of generation technologies from a whole-systems sustainability perspective. Early feedback by experts suggests that the tool has strong potential to serve as an effective comparison tool and help decision-makers make strategic investments for sustainability.
13

Holman, Ryan Richard. "Identifying and comparing differences in the values of elementary school principals among baby boomers and generation Xers /." La Verne, Calif. : University of La Verne, 2003. http://0-wwwlib.umi.com.garfield.ulv.edu/dissertations/fullcit/3075266.

Full text
14

Motloung, Setumo Victor. "Intense pulsed neutron generation based on the principle of Plasma Immersion Ion Implantation (PI3) technique." Thesis, University of the Western Cape, 2006. http://etd.uwc.ac.za/index.php?module=etd&action=viewtitle&id=gen8Srv25Nme4_9599_1182748458.

Full text
Abstract:
The development of a deuterium-deuterium/tritium-deuterium (D-D/D-T) pulsed neutron generator based on the principle of the Plasma Immersion Ion Implantation (PI3) technique is presented, in terms of investigating the development of a compact system to generate an ultra-short burst of mono-energetic neutrons (of order 10^10 per second) during a short period of time (< 20 μs) at repetition rates up to 1 kHz. The system will facilitate neutron detection techniques such as neutron back-scattering, neutron radiography and time-of-flight activation analysis.

Aspects addressed in developing the system include (a) characterizing the neutron spectra generated as a function of the target configuration/design, to ensure a sustained intense neutron flux over long periods of time, and (b) characterizing the system as a function of power supply operating conditions such as voltage, current, gas pressure and plasma density.
15

Zhang, Ru. "Testing Two Models of Paired-Associate Learning Incorporating the Principle of Encoding Specificity." Ohio University / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1311256036.

Full text
16

Roozbeh, Amir. "Toward Next-generation Data Centers : Principles of Software-Defined “Hardware” Infrastructures and Resource Disaggregation." Licentiate thesis, KTH, Kommunikationssystem, CoS, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-249618.

Full text
Abstract:
The cloud is evolving due to additional demands introduced by new technological advancements and the wide movement toward digitalization. Therefore, next-generation data centers (DCs) and clouds are expected (and need) to become cheaper, more efficient, and capable of offering more predictable services. Aligned with this, we examine the concept of software-defined “hardware” infrastructures (SDHI) based on hardware resource disaggregation as one possible way of realizing next-generation DCs. We start with an overview of the functional architecture of a cloud based on SDHI. Following this, we discuss a series of use-cases and deployment scenarios enabled by SDHI and explore the role of each functional block of SDHI’s architecture, i.e., cloud infrastructure, cloud platforms, cloud execution environments, and applications. Next, we propose a framework to evaluate the impact of SDHI on techno-economic efficiency of DCs, specifically focusing on application profiling, hardware dimensioning, and total cost of ownership (TCO). Our study shows that combining resource disaggregation and software-defined capabilities makes DCs less expensive and easier to expand; hence they can rapidly follow the exponential demand growth. Additionally, we elaborate on technologies behind SDHI, its challenges, and its potential future directions. Finally, to identify a suitable memory management scheme for SDHI and show its advantages, we focus on the management of Last Level Cache (LLC) in currently available Intel processors. Aligned with this, we investigate how better management of LLC can provide higher performance, more predictable response time, and improved isolation between threads. More specifically, we take advantage of LLC’s non-uniform cache architecture (NUCA) in which the LLC is divided into “slices,” where access by the core to which it closer is faster than access to other slices. 
Based upon this, we introduce a new memory management scheme, called slice-aware memory management, which carefully maps allocated memory to LLC slices based on their access-time latency, rather than the de facto scheme that maps memory uniformly across slices. Many applications can benefit from our memory management scheme with relatively small changes. As an example, we show the potential benefits that Key-Value Store (KVS) applications gain by utilizing our memory management scheme. Moreover, we discuss how this scheme could be used to provide explicit CPU slicing – which is one of the expectations of SDHI and hardware resource disaggregation.<br><p>QC 20190415</p>
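The slice-aware idea above — data in the LLC slice nearest a core is cheaper to access than data in remote slices — can be illustrated with a toy latency model (a sketch only; the slice count and latency numbers below are invented for illustration, not measurements from the thesis):

```python
# Toy model of slice-aware vs. uniform LLC placement (illustrative only;
# latencies and slice count are assumed numbers, not measurements).
import random

N_SLICES = 8
NEAR_LATENCY = 20      # cycles to the slice closest to the core (assumed)
FAR_LATENCY = 44       # cycles to any other slice (assumed)

def avg_latency(placement, core_slice, accesses=10_000, seed=0):
    """Average access latency when a thread's data lives in the slices listed
    in `placement` and the thread runs on the core nearest `core_slice`."""
    rng = random.Random(seed)
    total = 0
    for _ in range(accesses):
        s = rng.choice(placement)  # each access hits a random slice of the placement
        total += NEAR_LATENCY if s == core_slice else FAR_LATENCY
    return total / accesses

core_slice = 3
uniform = list(range(N_SLICES))   # de facto scheme: data spread over all slices
slice_aware = [core_slice]        # map the hot data to the nearest slice only

print(avg_latency(uniform, core_slice))      # a mix of near and far latencies
print(avg_latency(slice_aware, core_slice))  # only near-slice accesses
```

In the actual scheme, placement is achieved by choosing physical memory whose addresses hash to the desired slice; the model above only captures the resulting latency asymmetry.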
APA, Harvard, Vancouver, ISO, and other styles
17

Puzikov, Yevgeniy [Verfasser], Iryna [Akademischer Betreuer] Gurevych, Ido [Akademischer Betreuer] Dagan, and Claire [Akademischer Betreuer] Gardent. "Principled Approach to Natural Language Generation / Yevgeniy Puzikov ; Iryna Gurevych, Ido Dagan, Claire Gardent." Darmstadt : Universitäts- und Landesbibliothek, 2021. http://d-nb.info/1237414946/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Copper, Mark L. "Principal component and second generation wavelet analysis of treasury yield curve evolution." FIU Digital Commons, 2004. http://digitalcommons.fiu.edu/etd/2521.

Full text
Abstract:
Prices of U.S. Treasury securities vary over time and across maturities. When the market in Treasurys is sufficiently complete and frictionless, these prices may be modeled by a function of time and maturity. A cross-section of this function for time held fixed is called the yield curve; the aggregate of these sections is the evolution of the yield curve. This dissertation studies aspects of this evolution. There are two complementary approaches to the study of yield curve evolution here. The first is principal components analysis; the second is wavelet analysis. In both approaches, the time and maturity variables are discretized. In principal components analysis, the vectors of yield curve shifts are viewed as observations of a multivariate normal distribution. The resulting covariance matrix is diagonalized; the resulting eigenvalues and eigenvectors (the principal components) are used to draw inferences about the yield curve evolution. In wavelet analysis, the vectors of shifts are resolved into hierarchies of localized fundamental shifts (wavelets) that leave specified global properties invariant (average change and duration change). The hierarchies relate to the degree of localization, with movements restricted to a single maturity at the base and general movements at the apex. Second generation wavelet techniques allow better adaptation of the model to economic observables. Statistically, the wavelet approach is inherently nonparametric, while the wavelets themselves are better adapted to describing a complete market. Principal components analysis provides information on the dimension of the yield curve process. While there is no clear demarcation between operative factors and noise, the top six principal components pick up 99% of total interest rate variation 95% of the time.
An economically justified basis for this process is hard to find; for example, a simple linear model will not suffice for the first principal component, and the shape of this component is nonstationary. Wavelet analysis works more directly with yield curve observations than principal components analysis. In fact, the complete process from bond data to multiresolution is presented, including the dedicated Perl programs and the details of the portfolio metrics and specially adapted wavelet construction. The result is more robust statistics, which provide balance to the more fragile principal components analysis.
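The principal-components step the abstract describes — diagonalizing the covariance matrix of discretized yield-curve shifts and ranking the eigenvectors — can be sketched as follows (a generic illustration with synthetic data, not the dissertation's code; the "level plus slope" construction of the shifts is an assumption made for demonstration):

```python
# PCA of (synthetic) yield-curve shifts: diagonalize the covariance matrix of
# daily changes across maturities, then rank components by explained variance.
import numpy as np

rng = np.random.default_rng(0)
n_days, n_maturities = 500, 10

# Synthetic shifts: a dominant parallel ("level") move, a smaller "slope"
# move, and idiosyncratic noise per maturity.
level = rng.normal(size=(n_days, 1)) * np.ones((1, n_maturities))
slope = rng.normal(size=(n_days, 1)) * np.linspace(-1, 1, n_maturities)
shifts = level + 0.3 * slope + 0.05 * rng.normal(size=(n_days, n_maturities))

cov = np.cov(shifts, rowvar=False)                  # maturities x maturities
eigvals, eigvecs = np.linalg.eigh(cov)              # eigh returns ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # reorder to descending

explained = eigvals / eigvals.sum()
print(explained[:3])  # the level component dominates in this construction
```

On the dissertation's real data, the quoted finding corresponds to the first six entries of `explained` summing to at least 0.99 on 95% of samples.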
APA, Harvard, Vancouver, ISO, and other styles
19

Keilwagen, Jens [Verfasser], Ivo [Akademischer Betreuer] Grosse, and Gunnar [Akademischer Betreuer] Rätsch. "Predicting DNA binding sites using generative, discriminative, and hybrid learning principles / Jens Keilwagen. Betreuer: Ivo Grosse ; Gunnar Rätsch." Halle, Saale : Universitäts- und Landesbibliothek Sachsen-Anhalt, 2010. http://d-nb.info/1024976939/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Sachs, Gregory (Gregory Dennis). "A principle based system architecture framework applied for defining, modeling & designing next generation smart grid systems." Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/62773.

Full text
Abstract:
Thesis (S.M. in Engineering and Management)--Massachusetts Institute of Technology, Engineering Systems Division, 2010.<br>Cataloged from PDF version of thesis.<br>Includes bibliographical references (p. 81).<br>A strong and growing desire exists, throughout society, to consume electricity from clean and renewable energy sources, such as solar, wind, biomass, geothermal, and others. Due to the intermittent and variable nature of electricity from these sources, our current electricity grid is incapable of collecting, transmitting, and distributing this energy effectively. The "Smart Grid" is a term which has come to represent this 'next generation' grid, capable of delivering not only environmental benefits but also key economic, reliability, and energy security benefits. Due to the high complexity of the electricity grid, a principle-based System Architecture framework is presented as a tool for analyzing, defining, and outlining potential pathways for infrastructure transformation. Through applying this framework to the Smart Grid, beneficiaries and stakeholders are identified, upstream and downstream influences on design are analyzed, and a succinct outline of benefits and functions is produced. The first phase of grid transformation is establishing a robust communications and measurement network. This network will enable customer participation and increase energy efficiency through smart metering, real-time pricing, and demand response programs. As penetration of renewables increases, the high variability and uncontrollability of additional energy sources will cause significant operation and control challenges. To mitigate this variability, reserve margins will be adjusted and grid-scale energy storage (such as compressed air, flow batteries, and plug-in hybrid electric vehicles or PHEVs) will begin to be introduced. Achieving over 15% renewable energy penetration marks the second phase of transformation.
The third phase is enabling mass adoption, whereby over 40% of our energy will come from renewable sources. This level of penetration will only be achieved through fast supply and demand balancing controls and large scale storage. Robust modeling must be developed to test various portfolio configurations.<br>by Gregory Sachs.<br>S.M.in Engineering and Management
APA, Harvard, Vancouver, ISO, and other styles
21

Saenz, Marshall. "Networks of Interaction: Writing Course Design through Fourth Generation Activity Theory and Principles of Play." Bowling Green State University / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1555108005074448.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Carter, Freeman Darnell. "Generation X and Millennial Generation Assistant Principals' Perceptions of the Challenges and Rewards of the Principalship A Qualitative Study." Diss., Virginia Tech, 2016. http://hdl.handle.net/10919/81877.

Full text
Abstract:
Employment figures and population demographics indicate that Baby Boomers (1946-1964) hold a small and shrinking share of school principalships. The oldest Baby Boomer principals began to retire during the middle of the 1990s, and their void created an opportunity for younger Baby Boomers and members of Generation X (1965-1981) to replace them. The youngest Baby Boomer principals are beginning to retire, and Millennial Generation (1982-2000) administrators are stepping up to fill the ranks. Millennial Generation educators have been in the field long enough to develop the requisite classroom teaching experience, graduate-school master's-level education, and training needed to obtain administrative positions. Principals develop their leadership skills through the assistant principal experience, and because Millennials are a relatively new addition to the ranks of assistant principals, little is known about their perceptions of the challenges and rewards of the principalship. Generational differences between Generation Xers and Baby Boomers have been investigated by other researchers, but this study was unique because it directly compared Generation X and Millennial Generation assistant principals. The study explored Generation X and Millennial Generation assistant principals' perceptions of the challenges and rewards of the principalship. This qualitative study involved 12 assistant principal participants, and the analysis of the coded interview transcript data produced major coded themes with valuable implications regarding the participants' motivations, career ambitions, professional development needs, and perceptions of the principalship. This study indicated that Generation X and Millennial Generation assistant principals have distinct similarities and differences, and school division superintendents who understand the generational differences may make more informed leadership and personnel decisions about their future principals.
The findings and implications were intended to assist superintendents and personnel/human resource directors in their efforts to recruit, select, support, and ultimately promote Generation X and Millennial Generation assistant principals to the principalship. The findings of this study suggested opportunities for researchers to continue the investigation of the topic.<br>Ed. D.
APA, Harvard, Vancouver, ISO, and other styles
23

Åberg, Erik. "Review of an industrially implemented model of zoning principles for electricity distribution and energy production." Thesis, KTH, Industriella informations- och styrsystem, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-81329.

Full text
Abstract:
The interconnection of components of industrial automation and control systems (IACS) and enterprise systems involved in processes ranging from generation and transmission to billing within electric utilities poses challenges regarding cyber security as well as the division of organisational responsibility. One means of organising these components and systems is to use a zone model in which they are segmented, offering layered defences as well as a logical grouping. One such zone model is the model under review, presented by Zerbst et al. in a CIRED paper from 2009. This master thesis reviews that zone model and compares it to other industry-standard zone models, which have been found to fall into either function-based models or layered-defence models. The outcome is a rough definition of what kind of content fits in the various zones of the reviewed model, as well as a normalised zone model to be used for comparison. A suggested method for dividing system components into zones is based on the 4R method, considering the response time, resolution, reliability and reparability of the system component, although its accuracy has not been empirically tested.
APA, Harvard, Vancouver, ISO, and other styles
24

Derosier, Jonathan A. "Internet topology generation based on reverse-engineered design principles performance tradeoffs between heuristic and optimization-based approaches." Thesis, Monterey, Calif. : Naval Postgraduate School, 2008. http://bosun.nps.edu/uhtbin/hyperion-image.exe/08Jun%5FDerosier.pdf.

Full text
Abstract:
Thesis (M.S. in Operations Research)--Naval Postgraduate School, June 2008.<br>Thesis Advisor(s): Alderson, David L. "June 2008." Description based on title screen as viewed on August 25, 2008. Includes bibliographical references (p. 69). Also available in print.
APA, Harvard, Vancouver, ISO, and other styles
25

Kelley, Mary Christena. "The generative power of the holonomic process in architecture : an analysis of its origin, its meaning, and its principles of application." Thesis, Georgia Institute of Technology, 1987. http://hdl.handle.net/1853/22987.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Krogh, Robert. "Building and generating facial textures using Eigen faces." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-129935.

Full text
Abstract:
With the evolution of the game industry and other virtual environments, demands on what ships with an application are higher than ever before. This leads many companies to procedurally generate content in order to save storage space and obtain a wider variety of content. It has become essential to infuse immersion into such applications, and some companies have even gone so far as to let the player recreate him- or herself as the hero or heroine of the game. Even so, many AAA companies refrain from using face segmentation software, as it gives end users the power to add game content, which may increase the risk of offensive content that violates company standards and policy entering their application. By taking the concept of procedural generation and applying it together with face segmentation, using a Principal Component Analysis (PCA) based texturization model, we allow for controlled yet functional face texturization in a run-time virtual environment. In this project we use MatLab to create a controlled Eigen space, infuse this into an application built in Unity 3D using UMA, and let small recreation vectors, spanning a few kilobytes at most, create textures at run-time. In doing so, we can project faces onto the Eigen space and get fully functioning, texturized characters, able to use ready animations and controllers of the developer’s choice. These Eigen spaces may cost more storage space and loading time up to a limit, but can in turn dynamically generate a seemingly endless variation of textural content. In order to see what potential users prioritize when it comes to applications like these, we conducted a survey in which the respondents saw variations of this technique and were able to express their views on the attributes expected from a “good” (from their point of view) application.
In the end we have a UMA-ready set of scripts and a one-time-use system for creating the Eigen spaces the applications use. We worked in close relation with Högström’s Selfie to Avatar face segmentation software and proved the concept in Unity 3D applications.
APA, Harvard, Vancouver, ISO, and other styles
27

Uliyar, Hithesh Sanjiva. "FAULT DIAGNOSIS OF VEHICULAR ELECTRIC POWER GENERATION AND STORAGE." Wright State University / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=wright1284602099.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Fiedler, Katja [Verfasser], Michael J. [Akademischer Betreuer] Herrmann, Theo [Akademischer Betreuer] Geisel, and Florentin [Akademischer Betreuer] Wörgötter. "Generation principles for arm movements in humans / Katja Fiedler. Gutachter: Theo Geisel ; Florentin Wörgötter. Betreuer: Michael J. Herrmann." Göttingen : Niedersächsische Staats- und Universitätsbibliothek Göttingen, 2012. http://d-nb.info/1044045485/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Sun, Wei. "Maximising renewable hosting capacity in electricity networks." Thesis, University of Edinburgh, 2015. http://hdl.handle.net/1842/10483.

Full text
Abstract:
The electricity network is undergoing significant changes in the transition to a low-carbon system. The growth of renewable distributed generation (DG) creates a number of technical and economic challenges in the electricity network. While the development of the smart grid promises alternative ways to manage network constraints, their impact on the ability of the network to accommodate DG – the ‘hosting capacity’ – is not fully understood. It is of significance for both DNOs and DG developers to quantify the hosting capacity according to given technical or commercial objectives while subject to a set of predefined limits. The combinatorial nature of the hosting capacity problem, together with the intermittent nature of renewable generation and the complex actions of smart control systems, means evaluation of hosting capacity requires appropriate optimisation techniques. This thesis extends the knowledge of hosting capacity. Three specific but related areas are examined to fill the gaps identified in existing knowledge. New evaluation methods are developed that allow the study of hosting capacity (1) under different curtailment priority rules, (2) with harmonic distortion limits, and (3) alongside energy storage systems. Together these works improve DG planning in two directions: demonstrating the benefit provided by a range of smart grid solutions, and evaluating extensive impacts to ensure compliance with all relevant planning standards and grid codes. As an outcome, the methods developed can help both DNOs and DG developers make sound and practical decisions, facilitating the integration of renewable DG in a more cost-effective way.
APA, Harvard, Vancouver, ISO, and other styles
30

Hollstrand, Paulina. "Supporting Pre-Production in Game Development : Process Mapping and Principles for a Procedural Prototyping Tool." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-273926.

Full text
Abstract:
Game development combines traditional software activities with creative work. As a result, game design practices are characterized by an extensive process of iterative development and evaluation, where prototyping is a major component for testing and evaluating the player experience. Content creation for the virtual world the players inhabit is one of the most time-consuming aspects of production. This experimental research study focuses on analyzing and formulating challenges and desired properties in a prototyping tool based on Procedural Content Generation to assist game designers in their early ideation process. To investigate this, a proof of concept was iteratively developed based on information gathered from interviews and evaluations with world designers during a conceptual and design study. The final user study assessed the tool’s functionalities and indicated its potential utility in enhancing the designers’ content exploration and risk management during pre-production activities. Key guidelines for the tool’s architecture can be distilled into: (1) A modular design approach supports balance between content controllability and creativity. (2) Design levels and feature representation should combine and range from Micro (specific) to Macro (high-level, abstract). The results revealed challenges in combining exploration of the design space with optimization and refinement of content. However, the thesis concentrated on one specific type of content, city generation, as representative of world-design content generation. To fully understand the generalizable aspects, different types of game content would need to be covered in further research.
APA, Harvard, Vancouver, ISO, and other styles
31

Seipert, Karen Greene. "A correlational analysis of the values of Baby Boomer and Generation X rural public school principals." Thesis, University of Phoenix, 2013. http://pqdtopen.proquest.com/#viewpdf?dispub=3571485.

Full text
Abstract:
<p> A quantitative, correlational study was used to analyze seven dimensions of work values of Baby Boom and Generation X rural school principals in North Carolina to aid school district administrators in principal motivation and retention. The purpose of the current research study was to determine whether principals from different generational cohorts differ in their work values; it was based on the generational characteristics and traits of employees in business organizations. The study focused on Baby Boom and Generation X rural public school principals from two school districts. A Likert-type online survey based on seven dimensions of work values was administered to 50 principals and assistant principals within the two districts. Forty usable responses were received. The results of the study indicated that while there are no significant generational differences between Baby Boom and Generation X principals in collaboration, leadership, training, loyalty, commitment, or motivation, Baby Boomers scored lower in all areas except training. Baby Boomers scored significantly lower in technology and approached significance in motivation. Future research using a much larger sample size may find significant differences in other areas.</p>
APA, Harvard, Vancouver, ISO, and other styles
32

Mayer, Rosirene. "A linguagem de Oscar Niemeyer." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2003. http://hdl.handle.net/10183/6693.

Full text
Abstract:
This work aims at describing the elements that characterize Oscar Niemeyer’s singular architectural language. It argues that the identification of these elements requires the scrutiny of non-visible aspects of his work. The identification was made possible by the analysis of buildings characterized by a curvilinear profile and by the construction of a model that associates the compositional elements used by Niemeyer with a Shape Grammar. The use of the model made it possible to reveal the generative principles – the set of rules, vocabulary, and geometric relations – that characterize Niemeyer’s style, or architectural language. It also helped to show how Niemeyer’s language combines, in an original way, transformation operations such as rotation, reflection, and translation with a vocabulary of curves. The association is parameterized by a regulating line based on the golden section. In its conclusions, the work suggests possibilities for developing this grammar for all the figures used by Niemeyer and for applying generative principles in the teaching of architecture.
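The generative operations identified in the analysis — rotation, reflection, and translation applied to a vocabulary of curves, with parameters tied to the golden section — can be illustrated by a generic sketch (the base arc, the rule sequence, and the spacings below are invented for illustration and are not Niemeyer's actual grammar):

```python
# Generic shape-grammar flavor: apply rotation / reflection / translation rules
# to a base curve, with translation steps scaled by the golden ratio.
import numpy as np

PHI = (1 + 5 ** 0.5) / 2                    # the golden section

def rotate(points, angle):
    """Rotate 2-D points about the origin by `angle` radians."""
    c, s = np.cos(angle), np.sin(angle)
    return points @ np.array([[c, -s], [s, c]]).T

def reflect_x(points):
    """Mirror 2-D points across the x-axis."""
    return points * np.array([1.0, -1.0])

def translate(points, dx, dy):
    return points + np.array([dx, dy])

# Base vocabulary element: a half-circle arc sampled as 2-D points.
t = np.linspace(0, np.pi, 50)
arc = np.stack([np.cos(t), np.sin(t)], axis=1)

# One derivation: mirror the arc, then place copies at golden-ratio spacings.
composition = [
    arc,
    translate(reflect_x(arc), PHI, 0.0),
    translate(rotate(arc, np.pi / 2), PHI ** 2, 0.0),
]
print(len(composition), composition[1].shape)
```

Each rule application is a rigid transformation of a vocabulary element; a full shape grammar would additionally specify which rules may fire in which configurations.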
APA, Harvard, Vancouver, ISO, and other styles
33

Fallahzadeh, Pardis. "Goal-directed Imitation In Pre-school And Elementary School Children." Master's thesis, METU, 2011. http://etd.lib.metu.edu.tr/upload/12613884/index.pdf.

Full text
Abstract:
Imitation is a fundamental way of acquiring knowledge in human development. In their theory of goal-directed imitation (GOADI), Wohlschläger et al. (2003) divide the representation of observed movements into hierarchically organized aspects, the highest of which is usually the goal. In a face-to-face imitation task, young children usually copy the (spatial) goal of the body movement in terms of perceptual mirror symmetry rather than match it conceptually onto their own body, as adults do. We refer to these imitation schemes as “mirroring” and “matching” respectively. In the present study, we investigate the effects of age and of the perspective of the child with respect to the experimenter (0°, 90°, 180°) in two imitation tasks, a hand-to-ear and a cup-grasping task. Moreover, we evaluate the developmental changes in the imitative behavior of children from a dynamical systems perspective. Children were supposed to imitate the movements of the experimenter. Tasks were conducted on 4.5- to 11-year-old Iranian pre-school and elementary school children (81 female, 84 male). Imitation scores for the spatial goal were analyzed in terms of mirroring or matching. Imitation schemes varied according to age and perspective in both tasks. Overall, older children’s imitations of movements were more adult-like, as established by an adult Iranian control group, than those of the younger ones. They matched rather than mirrored observed movements. In the 180° and 90° conditions the mirroring scheme was predominant, but in the 0° condition matching was predominant. GOADI was confirmed; however, it was qualified by the child’s perspective on the experimenter. Children’s imitations showed a non-linear shift from perceptually-based mirroring to conceptually-based matching of observed movements onto their own body. This shift happens between 6 and 8-9 years of age.
The amount of matching depends not only on age but also on control parameters such as spatial perspective, task demands, and exposure.
APA, Harvard, Vancouver, ISO, and other styles
34

Lins, de Azevedo Costa Bhianca. "Generation of high drug loading amorphous solid dispersions by different manufacturing processes." Thesis, Ecole nationale des Mines d'Albi-Carmaux, 2018. http://www.theses.fr/2018EMAC0012/document.

Full text
Abstract:
La principale difficulté lors de l'administration orale d'un ingrédient pharmaceutique actif (API) est de garantir que la dose clinique de l’API sera dissoute dans le volume disponible de liquides gastro-intestinaux. Toutefois, environ 40% des API sur le marché et près de 90% des molécules en cours de développement sont peu solubles dans l’eau et présentent une faible absorption par voie orale, ce qui entraîne une faible biodisponibilité. Les dispersions solides amorphes (ASD) sont considérées comme l’une des stratégies plus efficaces pour résoudre des problèmes de solubilité des principes actifs peu solubles dans l’eau et, ainsi, améliorer leur biodisponibilité orale. En dépit de leur introduction il y a plus de 50 ans comme stratégie pour améliorer l’administration orale des API, la formation et la stabilité physique des ASD font toujours l'objet de recherches approfondies. En effet, plusieurs facteurs peuvent influer sur la stabilité physique des ASD pendant le stockage, parmi lesquels la température de transition vitreuse du mélange binaire API-polymère, la solubilité apparente de l'API dans le polymère, les interactions entre l'API et le polymère et le procédé de fabrication. Cette thèse consistait en deux parties qui avaient pour objectif le développement de nouvelles formulations sous forme d’ASD d'un antirétroviral, l'Efavirenz (EFV), dispersé dans un polymère amphiphile, le Soluplus, en utilisant deux procédés différents, le séchage par atomisation (SD) et l'extrusion à chaud (HME). EFV est l’API BCS de classe II de notre choix car c’est un API qui représente un défi pour les nouvelles formulations. En effet, il a besoin d’ASD plus fortement concentrées, pour lesquelles la stabilité chimique et physique pendant le stockage et la dissolution seront essentielles. 
The main difficulty when an Active Pharmaceutical Ingredient (API) is orally administered is to guarantee that the clinical dose of the API will dissolve in the available volume of gastrointestinal fluids. However, about 40% of APIs with market approval and nearly 90% of molecules in the discovery pipeline are poorly water-soluble and exhibit poor oral absorption, which leads to weak bioavailability. Amorphous solid dispersions (ASD) are considered one of the most effective strategies for overcoming the solubility limitations of poorly water-soluble compounds and hence enhancing their oral bioavailability. Although they were introduced as a strategy to enhance oral API bioavailability more than 50 years ago, ASD formation and physical stability remain subjects of intense research. Indeed, several factors can influence the physical storage stability of an ASD, among them the glass transition temperature of the API-carrier binary mixture, the apparent solubility of the API in the carrier, interactions between API and carrier, and the manufacturing process. This thesis consisted of two parts aimed at developing new ASD formulations of an antiretroviral API, Efavirenz (EFV), dispersed in an amphiphilic polymer, Soluplus, using two different processes, spray drying (SD) and hot-melt extrusion (HME). EFV, a BCS class II API, was chosen because it is challenging for new formulations: it requires highly dosed ASDs, for which chemical and physical stability during storage and dissolution are critical. Aiming at a rational development of high-loaded EFV-Soluplus ASDs, the first part focused on the construction of a temperature-composition EFV-Soluplus phase diagram. The phase diagram was constructed from a thermal study of the recrystallization of a supersaturated ASD (85 wt% EFV) generated by spray drying. To our knowledge, this is the first study reporting a phase diagram for this binary system. This phase diagram is very useful and demonstrates that the solubility of EFV in Soluplus ranges from 20 wt% (25 °C) to 30 wt% (40 °C). ASDs of EFV in Soluplus containing more than 30 wt% of EFV should be monitored during storage under typical temperature conditions. The phase diagram can be considered a preformulation tool for researchers studying novel ASDs of EFV in Soluplus, to predict their (thermodynamic and kinetic) stability. ASDs prepared by different techniques can display differences in their physicochemical properties. The second part of this thesis therefore focused on the manufacture of ASDs by HME and SD processes. This study clearly shows that ASD formation is a useful formulation strategy to improve the aqueous solubility and the dissolution rate of EFV from EFV-Soluplus binary mixtures. Both HME and SD proved efficient in generating ASDs over a wide range of EFV loads, and the EFV-to-Soluplus ratio can be optimized to tailor the release kinetics of the ASD. Choosing a high EFV load exceeding the thermodynamic solubility of EFV in Soluplus is possible, but its kinetic stability over time must be taken into account.
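The reported solubility limits (20 wt% at 25 °C, 30 wt% at 40 °C) lend themselves to a quick pre-formulation check of any proposed drug load. A minimal sketch, assuming a linear phase boundary between the two reported points — an illustrative simplification, not the measured diagram from the thesis:

```python
def efv_solubility_wt(temp_c):
    """EFV solubility in Soluplus (wt%), linearly interpolated between
    the two reported points: 20 wt% at 25 C and 30 wt% at 40 C.
    Linearity is an illustrative assumption, not the measured boundary."""
    t0, s0 = 25.0, 20.0
    t1, s1 = 40.0, 30.0
    return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

def needs_monitoring(load_wt, temp_c):
    """True if an ASD's EFV load exceeds the interpolated solubility
    limit at the storage temperature, i.e. the dispersion is
    supersaturated and its kinetic stability should be watched."""
    return load_wt > efv_solubility_wt(temp_c)

print(efv_solubility_wt(25.0))       # 20.0
print(needs_monitoring(35.0, 25.0))  # True: 35 wt% exceeds the limit
```

Under this reading, any load above the interpolated boundary at the expected storage temperature flags the formulation for stability monitoring.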
APA, Harvard, Vancouver, ISO, and other styles
35

Matijošaitis, Darius. "Automatizuotasis plieno jungčių ir mazgų brėžinių generavimas." Master's thesis, Lithuanian Academic Libraries Network (LABT), 2010. http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2010~D_20100701_092602-45229.

Full text
Abstract:
The aim of this final work is to develop a software module for the automated generation of steel joint and connection drawings. To reach this goal, the work investigates object-oriented design techniques and their applicability to building automated drawing-generation systems. The analytical part examines the principles of modular design and object-oriented design patterns. The design and implementation part describes the modular program structure, analyses the initial data structures, presents the separately designed parts of the system, and then the complete automated system for generating steel joint and connection drawings. The drawing of the component elements of steel joints and the extension of the user interface are briefly discussed. The work consists of three parts: problem analysis and formulation, theoretical grounding, and system design and implementation; a published article is included as a separate section. The thesis comprises 58 pages of text without appendices, 24 figures, and 39 bibliographic references.
APA, Harvard, Vancouver, ISO, and other styles
36

Carlan, Carina Prina. "Princípios criativos concebidos a partir das noções de pré-coisas e da atividade de transver de Manoel de Barros." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2014. http://hdl.handle.net/10183/110099.

Full text
Abstract:
This dissertation studies the poetry of Manoel de Barros as a contribution to the designer's creative process in the early stages of a design project. Special attention is given to two notions in the poet's work: the 'pre-things' (pré-coisas) and the activity of transver, which refer, respectively, to processes still without meaning or form, and to things seen in other ways. From these concepts, relationships were established with existing creative principles and with tools that support idea generation, and the possibility of deriving new creative principles from the author's poetry was identified. To support this hypothesis, theoretical grounding was sought in studies of semiology, in approaches to creativity, and in the relations of design with the form, function, material, and sensations attributed to products. The research thus resulted in three creative principles derived from the notions of 'pre-things' and the activity of transver. These were named, respectively, the Principle of Arquissema, based on thinking about "the before"; the Principle of Comparamento, which proposes similarities; and the Principle of Constativo, which seeks to expand the meaning of objects. To better understand how these principles relate to the designer's creative process, two examples were developed in which ideas were generated with the aid of the proposed principles. One used an object still at the conceptual stage, called by Manoel de Barros a 'desobjeto', the "dawn opener" (abridor de amanhecer), offering suggestions for turning that desobjeto into an object. Finally, an occasional theme, rain, was also problematized in order to provide still-conceptual ideas addressing the consequences it may cause.
APA, Harvard, Vancouver, ISO, and other styles
37

Jahirul, Md Islam. "Experimental and statistical investigation of Australian native plants for second-generation biodiesel production." Thesis, Queensland University of Technology, 2015. https://eprints.qut.edu.au/83778/9/Jahirul_Islam_Thesis.pdf.

Full text
Abstract:
This work explores the potential of Australian native plants as a source of second-generation biodiesel for internal combustion engine applications. Biodiesels were evaluated from a number of non-edible oil seeds that grow naturally in Queensland, Australia. The quality of the produced biodiesels was investigated by several experimental and numerical methods. The research methodology and numerical model developed in this study can be applied to a broad range of biodiesel feedstocks and to the future development of renewable native biodiesel in Australia.
APA, Harvard, Vancouver, ISO, and other styles
38

Haidar, Ahmad. "Responsible Artificial Intelligence : Designing Frameworks for Ethical, Sustainable, and Risk-Aware Practices." Electronic Thesis or Diss., université Paris-Saclay, 2024. https://www.biblio.univ-evry.fr/theses/2024/interne/2024UPASI008.pdf.

Full text
Abstract:
Artificial Intelligence (AI) is rapidly transforming the world, redefining the relationship between technology and society. This thesis investigates the critical need for responsible and sustainable development, governance, and use of AI and Generative AI (GAI). It addresses the ethical risks, regulatory gaps, and challenges associated with AI systems, while proposing actionable frameworks for fostering Responsible Artificial Intelligence (RAI) and Responsible Digital Innovation (RDI). The thesis begins with a comprehensive review of 27 global AI ethics declarations to identify dominant principles such as transparency, fairness, accountability, and sustainability. Despite their significance, these principles often lack the tools needed for practical implementation. To address this gap, the second study presents an integrative framework for RAI based on four dimensions: technical, AI for sustainability, legal, and responsible innovation management. The third part of the thesis focuses on RDI through a qualitative study of 18 interviews with managers from diverse sectors. Five key dimensions are identified: strategy, digital-specific challenges, organizational KPIs, end-user impact, and catalysts. These dimensions enable companies to adopt sustainable and responsible innovation practices while overcoming obstacles to implementation. The fourth study analyzes emerging risks from GAI, such as misinformation, disinformation, bias, privacy breaches, environmental concerns, and job displacement. Using a dataset of 858 incidents, it employs binary logistic regression to examine the societal impact of these risks. The results highlight the urgent need for stronger regulatory frameworks, corporate digital responsibility, and ethical AI governance. This thesis thus makes critical contributions to the fields of RDI and RAI by evaluating ethical principles, proposing integrative frameworks, and identifying emerging risks. It emphasizes the importance of aligning AI governance with international standards to ensure that AI technologies serve humanity sustainably and equitably.
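The binary logistic regression used in the fourth study can be illustrated in miniature. A sketch with made-up toy data and a plain gradient-descent fit; the feature encoding and the numbers are hypothetical, not drawn from the 858-incident dataset:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.5, epochs=2000):
    """Fit y ~ sigmoid(w*x + b) on a 1-D feature by batch gradient
    descent on the log-loss."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            err = sigmoid(w * x + b) - y  # prediction error for (x, y)
            gw += err * x
            gb += err
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

# Hypothetical encoding: x = a severity score for an AI incident,
# y = 1 if the incident had broad societal impact, else 0.
xs = [0.1, 0.3, 0.4, 0.6, 0.8, 0.9]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(xs, ys)
print(sigmoid(w * 0.9 + b) > 0.5)  # True: high severity classified as 1
```

The fitted coefficients play the role of the thesis's regression estimates: a positive weight on a risk feature indicates it raises the odds of societal impact.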
APA, Harvard, Vancouver, ISO, and other styles
39

Briglauer, Wolfgang, and Christian Holzleitner. "Efficient contracts for government intervention in promoting next generation communications networks." Forschungsinstitut für Regulierungsökonomie, WU Vienna University of Economics and Business, 2012. http://epub.wu.ac.at/3641/1/briglauer_holzleitner_efficient_contracts.pdf.

Full text
Abstract:
Although the future socio-economic benefits of a new fibre-based ("next generation access", NGA) telecommunications infrastructure seem uncontroversial, most countries have to date undertaken NGA investments only on a small scale. Accordingly, universal NGA coverage appears to be a rather unrealistic objective without government intervention. Indeed, many governments have already initiated diverse subsidy programs to stimulate NGA infrastructure deployment. We contend, however, that the current contract practice of fixing ex-ante targets for network expansion is inefficient, given the uncertainty about future returns on NGA infrastructure-based services and the public authorities' incomplete information about the capital costs of the network provider. This paper proposes delegating the choice of network expansion to the NGA provider. Simple linear profit-sharing contracts can be designed to control the NGA provider's incentives and to balance the public objectives of network expansion and limitation of public expenditure. (author's abstract)
Series: Working Papers / Research Institute for Regulatory Economics
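The incentive logic of such a linear profit-sharing contract can be sketched with purely hypothetical functional forms — a quadratic return function and linear costs chosen only for illustration, not taken from the paper:

```python
def provider_optimal_coverage(share, a=10.0, c=2.0):
    """Coverage q chosen by a provider that keeps a fraction `share`
    of the returns R(q) = a*q - q**2 and bears deployment cost c*q.
    It maximizes share*R(q) - c*q; the first-order condition
    share*(a - 2*q) = c gives q = (a - c/share) / 2."""
    return max(0.0, (a - c / share) / 2.0)

# A larger profit share induces more network expansion:
print(provider_optimal_coverage(0.5))  # 3.0
print(provider_optimal_coverage(1.0))  # 4.0
```

The toy model reproduces the qualitative point: by tuning the sharing rate, the public authority steers the expansion the provider chooses without fixing an ex-ante target.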
APA, Harvard, Vancouver, ISO, and other styles
40

Luu, Keurcien. "Application de l'Analyse en Composantes Principales pour étudier l'adaptation biologique en génomique des populations." Thesis, Université Grenoble Alpes (ComUE), 2017. http://www.theses.fr/2017GREAS053/document.

Full text
Abstract:
Identifying genes that have allowed populations to adapt to their local environment is a major question in population genetics. Current statistical methods for genome scans are no longer suited to Next Generation Sequencing (NGS) data. In this thesis we propose new statistics, adapted to these data volumes, for detecting genes under selection. Our methods rely exclusively on Principal Component Analysis, whose use in population genetics we justify and discuss extensively. We also explain why our approaches generalize existing statistical methods, and demonstrate the interest of a PCA-based approach by comparing our methods with the state of the art. This work notably led to the development of pcadapt, an R package enabling the use of our detection statistics on a variety of genetic data.
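The PCA-based detection idea can be sketched in a few dozen lines. This is a schematic illustration only — pcadapt itself is an R package with its own, more elaborate statistics — and the toy genotype data are invented:

```python
# Schematic PCA genome scan in pure Python: find the first principal
# component of a small genotype matrix by power iteration, then score
# each SNP by its squared correlation with the PC scores.

def center(col):
    m = sum(col) / len(col)
    return [v - m for v in col]

def first_pc_scores(rows, iters=200):
    """Power iteration on the individuals-by-individuals Gram matrix."""
    n = len(rows)
    gram = [[sum(a * b for a, b in zip(rows[i], rows[j])) for j in range(n)]
            for i in range(n)]
    v = [float(i + 1) for i in range(n)]  # generic start vector
    for _ in range(iters):
        w = [sum(gram[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        if norm == 0.0:
            break
        v = [x / norm for x in w]
    return v

def snp_scores(genotypes):
    """genotypes: one row per individual, 0/1/2 allele counts per SNP."""
    cols = [center(list(c)) for c in zip(*genotypes)]  # center each SNP
    rows = [list(r) for r in zip(*cols)]
    pc = first_pc_scores(rows)  # unit-norm PC1 scores
    scores = []
    for col in cols:
        num = sum(g * p for g, p in zip(col, pc))
        den = sum(g * g for g in col) ** 0.5 or 1.0
        scores.append((num / den) ** 2)  # squared correlation with PC1
    return scores

# Toy data: SNP 2 separates the two groups, so it gets the top score.
genos = [[0, 1, 2, 0], [0, 0, 2, 1], [1, 1, 2, 0],
         [0, 1, 0, 1], [1, 0, 0, 0], [0, 1, 0, 1]]
scores = snp_scores(genos)
print(scores.index(max(scores)))  # 2
```

SNPs whose variation is strongly aligned with the leading axis of population structure stand out as candidate targets of selection, which is the core intuition behind the thesis's statistics.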
APA, Harvard, Vancouver, ISO, and other styles
41

Wächter, Thomas. "Semi-automated Ontology Generation for Biocuration and Semantic Search." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2011. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-64838.

Full text
Abstract:
Background: In the life sciences, the amount of literature and experimental data grows at a tremendous rate. In order to effectively access and integrate these data, biomedical ontologies – controlled, hierarchical vocabularies – are being developed. Creating and maintaining such ontologies is a difficult, labour-intensive, manual process. Many computational methods which can support ontology construction have been proposed in the past. However, good, validated systems are largely missing. Motivation: The biocuration community plays a central role in the development of ontologies. Any method that can support their efforts has the potential to have a huge impact in the life sciences. Recently, a number of semantic search engines were created that make use of biomedical ontologies for document retrieval. To transfer the technology to other knowledge domains, suitable ontologies need to be created. One area where ontologies may prove particularly useful is the search for alternative methods to animal testing, an area where comprehensive search is of special interest to determine the availability or unavailability of alternative methods. Results: The Dresden Ontology Generator for Directed Acyclic Graphs (DOG4DAG) developed in this thesis is a system which supports the creation and extension of ontologies by semi-automatically generating terms, definitions, and parent-child relations from text in PubMed, the web, and PDF repositories. The system is seamlessly integrated into OBO-Edit and Protégé, two widely used ontology editors in the life sciences. DOG4DAG generates terms by identifying statistically significant noun-phrases in text. For definitions and parent-child relations it employs pattern-based web searches. Each generation step has been systematically evaluated using manually validated benchmarks. The term generation leads to high quality terms also found in manually created ontologies. 
Definitions can be retrieved for up to 78% of terms, and child-ancestor relations for up to 54%. No other validated system achieves comparable results. To improve the search for information on alternative methods to animal testing, an ontology was developed that contains 17,151 terms, of which 10% were newly created and 90% were re-used from existing resources. This ontology is the core of Go3R, the first semantic search engine in this field. When a user performs a search query with Go3R, the search engine expands the request using the structure and terminology of the ontology. The machine classification employed in Go3R is capable of distinguishing documents related to alternative methods from those which are not, with an F-measure of 90% on a manual benchmark. Approximately 200,000 of the 19 million documents listed in PubMed were identified as relevant, either because a specific term was contained or through the automatic classification. The Go3R search engine is available online at www.Go3R.org.
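The noun-phrase significance idea behind the term generation step can be caricatured with standard-library Python. A sketch only: DOG4DAG proper uses noun-phrase chunking and real significance testing, whereas here candidates are raw unigrams and bigrams ranked by a simple log-ratio of relative frequencies, and the example texts are invented:

```python
from collections import Counter
import math

def candidate_terms(text):
    """Crude candidates: lowercase unigrams and bigrams. (A real
    system would extract proper noun phrases.)"""
    words = [w.strip(".,;:()").lower() for w in text.split()]
    words = [w for w in words if w]
    return words + [" ".join(p) for p in zip(words, words[1:])]

def significant_terms(domain_text, background_text, top=3):
    """Rank domain terms by frequency times the log-ratio of their
    relative frequency against a background corpus, a simple stand-in
    for a statistical significance test."""
    d = Counter(candidate_terms(domain_text))
    b = Counter(candidate_terms(background_text))
    nd, nb = sum(d.values()), sum(b.values())
    def score(t):
        return d[t] / nd * math.log((d[t] / nd) / ((b[t] + 1) / (nb + 1)))
    return sorted(d, key=score, reverse=True)[:top]

# Invented example texts: domain-specific phrases rise to the top.
domain = ("gene expression profiles show that gene expression changes ; "
          "gene expression analysis confirms the changes")
background = ("the city shows that traffic changes ; "
              "analysis confirms the changes in the city")
top5 = significant_terms(domain, background, top=5)
```

Phrases frequent in the domain text but rare in the background surface first — the same contrast that makes a phrase a plausible ontology term.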
APA, Harvard, Vancouver, ISO, and other styles
42

Bodnár, Eszter. "Electrospraying of polymer solutions for the generation of micro-particles, nano-structures, and granular films." Doctoral thesis, Universitat Rovira i Virgili, 2016. http://hdl.handle.net/10803/379820.

Full text
Abstract:
A study has been made of the mechanisms underlying the formation of polymeric microparticles and of their granular films by the drying of electrospray microdroplets. The study focuses on solutions of three water-insoluble polymers: poly(methyl methacrylate), polystyrene, and ethyl cellulose. The drying of such electrosprays results in diverse particle morphologies, which have been determined by scanning electron microscopy and characterized as a function of the solvent, polymer concentration, polymer molecular weight, and ambient relative humidity. The morphologies obtained include a variety of globular and filamented particle structures which, at elevated relative humidity, can develop porosity. These morphological features have been explained using qualitative models involving fluid-dynamic and phase-separation phenomena known to occur in closely related systems. One of the key fluid-dynamic phenomena involved is the coulombic instability of electrically charged droplets. In addition, the non-solvent interaction of water with the precipitating polymer can lead to porous textures on the particle surfaces; the different kinds of textures have been explained by reference to breath-figure formation (BFF) and vapor-induced phase separation (VIPS) phenomena. We have also studied the growth of the granular films formed from such polymer particles, and show that the electrical charge transported by the particles to the film has a strong influence on the film growth dynamics. A better understanding of the mechanisms studied in this thesis should help in the design of new electrospray-based manufacturing processes for particles and coatings.
APA, Harvard, Vancouver, ISO, and other styles
43

Beltran, Royo César. "Generalized unit commitment by the radar multiplier method." Doctoral thesis, Universitat Politècnica de Catalunya, 2001. http://hdl.handle.net/10803/6501.

Full text
Abstract:
This operations research thesis is situated in the field of the power generation industry. The general objective of this work is to efficiently solve the Generalized Unit Commitment (GUC) problem by means of specialized software. The GUC problem generalizes the Unit Commitment (UC) problem by simultaneously solving the associated Optimal Power Flow (OPF) problem. There are many approaches to solving the UC and OPF problems separately, but approaches to solving them jointly, i.e. to solving the GUC problem, are quite scarce. One of these GUC approaches is due to professors Batut and Renaud, whose methodology has been taken as a starting point for the methodology presented herein.
This thesis report is structured as follows. Chapter 1 describes the state of the art of the UC and GUC problems. The formulations of the classical short-term power planning problems related to the GUC problem, namely the economic dispatch problem, the OPF problem, and the UC problem, are reviewed. Special attention is paid to the UC literature and to the traditional methods for solving the UC problem. In chapter 2 we extend the OPF model developed by professors Heredia and Nabona to obtain our GUC model. The variables used and the modelling of the thermal, hydraulic, and transmission systems are introduced, as is the objective function. Chapter 3 deals with the Variable Duplication (VD) method, which is used to decompose the GUC problem as an alternative to the Classical Lagrangian Relaxation (CLR) method. Furthermore, in chapter 3 the dual bounds provided by the VD and CLR methods are theoretically compared.
Throughout chapters 4, 5, and 6 our solution methodology, the Radar Multiplier (RM) method, is designed and tested. Three independent matters are studied. First, the auxiliary problem principle method, used by Batut and Renaud to treat the inseparable augmented Lagrangian, is compared with the block coordinate descent method from both theoretical and practical points of view. Second, the Radar Subgradient (RS) method, a new Lagrange multiplier updating method, is proposed and computationally compared with the classical subgradient method. Third, we study the local character of the optimizers computed by the Augmented Lagrangian Relaxation (ALR) method when solving the GUC problem; a heuristic to improve the local ALR optimizers is designed and tested.
Chapter 7 is devoted to our computational implementation of the RM method, the MACH code. First, the design of MACH is reviewed briefly, and then its performance is tested by solving real-life large-scale UC and GUC instances. Solutions computed using our VD formulation of the GUC problem are only partially primal feasible, since they do not necessarily fulfil the spinning reserve constraints. In chapter 8 we study how to modify this GUC formulation with the aim of obtaining fully primal feasible solutions; a successful test based on a simple UC problem is reported. The conclusions, contributions of the thesis, and proposed further research can be found in chapter 9.
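The classical subgradient method against which the Radar Subgradient method is compared rests on a simple iteration, sketched here on a toy one-constraint problem. The problem and step size are invented for illustration; this shows only the classical update, not the Radar variants from the thesis:

```python
def dual_subgradient(alpha=0.1, iters=200):
    """Classical subgradient ascent on the dual of
        min x1^2 + x2^2   s.t.   x1 + x2 >= 2.
    For a fixed multiplier lam, the inner minimum of the Lagrangian
    x1^2 + x2^2 + lam*(2 - x1 - x2) is x1 = x2 = lam/2, and a
    subgradient of the dual function is the constraint violation."""
    lam = 0.0
    for _ in range(iters):
        x1 = x2 = lam / 2.0              # minimizer of the Lagrangian
        g = 2.0 - x1 - x2                # subgradient = constraint slack
        lam = max(0.0, lam + alpha * g)  # projected ascent step
    return lam, (x1, x2)

lam, x = dual_subgradient()
print(round(lam, 3))  # 2.0, the optimal multiplier for this toy problem
```

The multiplier converges to its optimal value 2, and the primal iterates approach the constrained optimum (1, 1); multiplier-updating schemes such as RS aim to reach this point in fewer dual iterations.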
APA, Harvard, Vancouver, ISO, and other styles
44

LECUNA, TOVAR MARICARMEN LEANDRA. "Scale down of a dynamic generator of VOC reference gas mixtures." Doctoral thesis, Politecnico di Torino, 2017. http://hdl.handle.net/11583/2676926.

Full text
Abstract:
A system for the dynamic preparation of reference gas mixtures based on the diffusion technique has been developed by the National Metrology Institute of Italy, the Istituto Nazionale di Ricerca Metrologica (INRIM). The gravimetric method used to estimate the diffusion rate, and hence the generated concentration, qualifies the system as a primary standard. The system can generate mixtures with low uncertainty and high stability in the 20 nmol·mol⁻¹ to 2.5 µmol·mol⁻¹ concentration range, with a 5% (k = 2) expanded uncertainty for mixtures of acetone in air. Based on this system, a transportable device for generating VOC reference gas mixtures, intended for use as a calibration standard, was designed and developed. The scale-down methodology comprised several steps. An initial characterization and modelling of the primary device was carried out using computational tools. Based on the response of the computational model to the different physical quantities, a set of design parameters was identified; thresholds for these parameters were established and translated into design criteria to be met in order to preserve the target metrological performance. After the design and development of the transportable device, a metrological characterization was carried out to verify its capabilities. This characterization was performed at the Dutch National Metrology Institute, the Van Swinden Laboratory (VSL), through Cavity Ring-Down Spectroscopy (CRDS) analyses to evaluate linearity, reproducibility and short-term stability. For the generation of methanol mixtures with molar fractions in the 80-150 nmol·mol⁻¹ range, the results were 99.6% linear, with a reproducibility within 2.9% after 3 days and a short-term stability better than 1% per hour. Repeatable measurements of the generated concentration were obtained for three different molar fractions, using both CRDS (VSL) and GC/FID (INRIM).
A flow of the desired dry, pure carrier gas can be connected to the device. The presence of water in the system has not been taken into account, and further analyses should be carried out before introducing it; water might affect the adsorption rate and, consequently, the flushing time required before normal operation. This transportable device is able to perform in-situ calibration of instruments and has been designed to generate gas mixtures of up to four species at a time.
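The abstract above relies on the standard relation, in diffusion-based dynamic generation, between the gravimetrically measured mass-loss rate of the diffusion source and the molar fraction delivered in the carrier gas. A minimal sketch of that calculation follows; the numeric values and function names are illustrative assumptions, not figures taken from the thesis:

```python
# Hypothetical sketch of diffusion-based reference gas generation:
# the analyte molar fraction follows from the gravimetric mass-loss
# rate of the diffusion vial and the molar flow of the carrier gas.

M_ACETONE = 58.08      # g/mol, molar mass of acetone
MOLAR_VOLUME = 22.414  # L/mol, ideal gas at 0 degC and 101.325 kPa

def molar_fraction(mass_loss_rate_ug_min: float,
                   carrier_flow_l_min: float,
                   molar_mass_g_mol: float = M_ACETONE) -> float:
    """Return the generated molar fraction (mol/mol).

    mass_loss_rate_ug_min -- diffusion (mass-loss) rate, in ug/min
    carrier_flow_l_min    -- carrier flow at reference conditions, in L/min
    """
    analyte_mol_min = (mass_loss_rate_ug_min * 1e-6) / molar_mass_g_mol
    carrier_mol_min = carrier_flow_l_min / MOLAR_VOLUME
    return analyte_mol_min / carrier_mol_min

# Illustrative example: 0.1 ug/min of acetone into 1 L/min of air
x = molar_fraction(0.1, 1.0)
print(f"{x * 1e9:.1f} nmol/mol")  # ~38.6 nmol/mol
```

With these illustrative numbers the generated fraction lands within the 20 nmol·mol⁻¹ to 2.5 µmol·mol⁻¹ range quoted in the abstract, which is why the diffusion rate and carrier flow are the key design parameters for such a generator.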
APA, Harvard, Vancouver, ISO, and other styles
45

Cimolino, Emmanuelle. "Omnis Aetas - Les âges de la vie chez les historiens de l’époque républicaine et chez Tite-Live : définitions, représentations, enjeux." Thesis, Paris Est, 2011. http://www.theses.fr/2011PEST0013.

Full text
Abstract:
The purpose of this work is to investigate the representation of age groups and their relationships in Titus Livius's Ab Vrbe condita, through a comparison with other historical accounts dating back to the Republican period and the early Principate. Rather than examining how to define age groups by means of the different gradus aetatum, the study focuses on the comparative study of Livy's, Sallust's, Caesar's and the Republican historians' own visions of the ages of life. It introduces a definition of what, in modern anthropological terms, is considered an age category, while taking into account the contrast between masculine and feminine terms, the large number of different words for a single category, and the use of certain terms for ideological purposes. It also presents a typology of the relationships between the different age groups, documenting the part they play in structuring collective life as well as individual interactions. The representation of these relationships, ranging from an ideal of obedience and harmony to long-lasting conflicts, also allows an analysis of reflections on what supposedly characterized Roman society of the past. The interest of the study also lies in the period of upheaval and restoration in which the works of the corpus were written, when the Principate succeeded the troubles of the late Republic and sought to restore traditional Roman values; this moment of redefinition of values implied a reflection on what defined them, and hence a necessary innovation in the definitions. Comparing the different representations of the ages of life thus touches on the study of a representation of political and social organisation in Rome as well as on the study of mentalities.
APA, Harvard, Vancouver, ISO, and other styles
46

Herpich, Juliane [Author], Christian [Academic Supervisor] Tetzlaff, Christian [Reviewer] Tetzlaff, et al. "The Principles of Self-Organization of Memories in Neural Networks for Generating and Performing Cognitive Strategies / Juliane Herpich ; Gutachter: Christian Tetzlaff, Stefan Klumpp, Robert Gütig, Jörg Enderlein, Dieter Klopfenstein, Alexander Gail ; Betreuer: Christian Tetzlaff." Göttingen : Niedersächsische Staats- und Universitätsbibliothek Göttingen, 2018. http://d-nb.info/1173420797/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Tello, Puerta Fernando. "Deductibility of the Spending Linked to the Expenditures Incurred as a Result of an Extrajudicial Transaction: Are We in the Face of a True Act of Liberality?" Derecho & Sociedad, 2015. http://repositorio.pucp.edu.pe/index/handle/123456789/118858.

Full text
Abstract:
This essay aims to offer a juridical analysis of recent pronouncements of the Peruvian Tax Administration, by virtue of which that entity states that disbursements linked to extrajudicial settlements do not constitute deductible expenses for the calculation of the Peruvian Income Tax. On that premise, we offer a civil-law analysis of the real nature of such disbursements and then address the issue of their relation to the generation of taxable income for Income Tax purposes.
APA, Harvard, Vancouver, ISO, and other styles
48

Xavier, Gildete Rocha. "Portugues brasileiro como segunda lingua : um estudo sobre o sujeito nulo." [s.n.], 2006. http://repositorio.unicamp.br/jspui/handle/REPOSIP/269157.

Full text
Abstract:
Advisors: Mary Aizawa Kato, Maria Cecilia Perroni. Doctoral thesis, Universidade Estadual de Campinas, Instituto de Estudos da Linguagem. Doctorate in Linguistics.
The aim of this study is to investigate the acquisition of the null subject in Brazilian Portuguese (BP) as a second language (L2) by adult native speakers of English and Italian in a situation of total immersion. The research was developed within generative grammar, in the framework of the Principles and Parameters Theory (Chomsky 1981, 1986) and the Minimalist Program (Chomsky 1993, 1995, 2000). The research questions concern access to Universal Grammar (UG) by L2 learners: whether the learners analysed here have access to UG and, if so, what form that access takes. The results of the data analysis confirmed (a) the hypothesis of direct access to UG, through the use of the default value of the pro-drop parameter (null or overt subjects plus the one-person agreement verbal form) in the production of English and Italian speakers in the initial phase of acquisition, and (b) the hypothesis of indirect access to UG via the L1 in the production of English and Italian speakers in the non-initial phase of acquisition. In addition, considering that pro-drop languages do not constitute a single type, it was hypothesized that, based on data from the input, the learners would exhibit the pro-drop of BP once they had acquired the agreement system of that language, which was confirmed. The thesis confirms Roeper's (1999) "universal bilingualism" hypothesis, not only for the initial stage of acquisition but also for the intermediate and final stages.
APA, Harvard, Vancouver, ISO, and other styles
49

Renaudot, Raphaël. "Conception, fabrication de puces microfluidiques à géométrie programmable et reconfigurable reposant sur les principes d’électromouillage sur diélectrique et de diélectrophorèse liquide." Thesis, Grenoble, 2013. http://www.theses.fr/2013GRENY080/document.

Full text
Abstract:
In the field of lab-on-a-chip (LOC) systems, the channel geometry of a microfluidic chip is often specific to a given protocol. The geometry is defined at the design step, before the (generally time-consuming and expensive) fabrication steps, and cannot be modified afterwards. This becomes an issue when the geometry does not satisfactorily meet the specifications and a new fabrication batch has to be started to resize the chip. To overcome this drawback, we propose to develop a new generation of microfluidic chips with a programmable and reconfigurable geometry. The concept is largely based on two digital microfluidic techniques: electrowetting on dielectrics (EWOD) and liquid dielectrophoresis (LDEP).
The first line of investigation focuses on the LDEP technique. First, an electromechanical model describing liquid behaviour during EWOD or LDEP actuation is established. This model is then used as a basis for the design and fabrication of LDEP patterns, which are tested to identify the geometries and dielectric layer stacks that yield optimized LDEP actuation. Taking a broad range of parameters into account, the study shows that, with a specific setup and conditions, LDEP actuation performs at least as well on some points, and better on others, than everything reported in the scientific literature so far. Finally, a surface-functionalization protocol based on the LDEP technology, producing polymer spots with diameters from a few microns to several dozen microns, is described; this method is likely to compete directly with standard functionalization tools.
The second line of investigation deals with the programmable and reconfigurable geometry concept, using microfluidic platforms that combine EWOD and LDEP effects on the same component. First, platforms in an open (single-plate) configuration provide master molds with a programmable geometry for the fabrication of PDMS microfluidic chips; this promising study leads, among other results, to complex channel geometries typical of the microfluidic field ("T" junctions and "Quake"-type valves). Second, the most advanced results of this work concern the programmable and reconfigurable geometry concept using paraffin: a specific protocol that judiciously exploits EWOD and LDEP liquid displacement produces a large number of microfluidic chips with varied and complex channel geometries. In both cases, a single generic digital microfluidic platform can generate a wide variety of geometries, which can subsequently be partially or totally modified. The results open up original and promising prospects, some of which are addressed beyond the initial objectives. The first is the continuation of the programmable and reconfigurable concept with a low-cost technology based on a flexible Kapton substrate and inkjet printing of silver-nanoparticle electrodes. The second investigates the compatibility of MEMS/NEMS resonating structures with LDEP metal structures (in polysilicon) at the submicron scale.
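The electromechanical modelling of EWOD actuation mentioned in the abstract typically starts from the Lippmann-Young relation, which links the apparent contact angle of a droplet to the applied voltage. A generic form, standard in the field and not reproduced from the thesis itself, is:

```latex
\cos\theta(V) = \cos\theta_0 + \frac{\varepsilon_0 \varepsilon_r}{2\,\gamma_{lv}\, d}\, V^2
```

where \(\theta_0\) is the contact angle at zero voltage, \(\varepsilon_r\) and \(d\) are the relative permittivity and thickness of the dielectric stack, \(\gamma_{lv}\) is the liquid-vapour surface tension, and \(V\) is the applied voltage. The dependence on \(\varepsilon_r/d\) is what makes the dielectric layer stack a central design parameter in studies such as this one.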
APA, Harvard, Vancouver, ISO, and other styles
50

Lima, Maria Francisca Morais de. "O humor como estratégia de compreensão e produção de charges: um estudo inferencial das charges de Myrria." Pontifícia Universidade Católica de São Paulo, 2016. https://tede2.pucsp.br/handle/handle/14378.

Full text
Abstract:
Understanding opinion texts such as political cartoons requires the reader to develop contextual skills capable of generating meaning. This thesis therefore discusses the importance of the inferential process as a strategy for understanding humor in the political cartoon, taking as a basis the textuality principles of Beaugrande and Dressler (1981) and the inferential categorization framework elaborated by Marcuschi (2012). The research problem consisted in analysing inferential processes and their importance for the critical analysis of humorous texts. To this end, the following objectives were set: to analyse the inferential procedures that contribute to understanding the humor present in the cartoon; to trace a theoretical path through the earliest studies on laughter, as well as the perception of humor and its use as a vehicle for social criticism; and to identify how the inferential process may contribute to the perception of the political criticism constituted in the cartoon genre.
The study of the inferential process for understanding cartoons is justified, since the reader, while reading a cartoon, uses inference to fill the gaps of meaning left, sometimes on purpose, by the author; such gaps are evidenced by the incongruity intentionally introduced by the cartoonist. The thesis is divided into four chapters: the first three present the theoretical framework that guided the analysis of the research corpus, made up of cartoons published in the opinion section of the newspaper Acrítica from February to November 2013. As the object of analysis, ten (10) cartoons by Myrria were chosen and organized into five groups according to the similarity of the subjects presented.
Methodologically, phenomenology was adopted as the method of investigation, whose premises allow an understanding based on visions of man and the world, together with content analysis. As a standard of comprehension for the cartoons, the skills of locating and inferring explicit and implicit information in the text and of establishing relations between expressive resources and effects of meaning were used, thus enabling the reader not only to go beyond the surface structure of the text but also to perceive the relations built in the interdiscourse and intertext of cartoon texts.
APA, Harvard, Vancouver, ISO, and other styles
