Dissertations / Theses on the topic 'Gridders'
Consult the top 39 dissertations / theses for your research on the topic 'Gridders.'
Shadman, K. (Khashayar) 1972. "The gridded electromagnet probe." Thesis, Massachusetts Institute of Technology, 2003. http://hdl.handle.net/1721.1/17041.
Includes bibliographical references.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
We attempted to measure the anisotropy in the electron distribution function in magnetized plasma by exploiting the adiabatic invariance of the electron's magnetic moment with a probe comprising a grid, a collector, and an inertially cooled electromagnet. The electric mirror force of the grid, which is located at the electromagnet throat, reduces the demand on the magnetic mirror force and thereby on the electromagnet current, which then allows for the construction of a compact probe that can be inserted inside the plasma chamber. An analysis of the effects of space charge inside the grid-collector cavity revealed that the size of the probe's entrance aperture, b, which gives the size of the plasma beam inside the probe, should be chosen to be within a factor of ten of the electron Debye length λ_De. In addition, an analysis of the discrete structure of the grid showed that the mesh wire spacing d should be chosen to be much less than λ_De. Also, the wire thickness t should be chosen to be much less than d. We built a probe with a grid of tungsten wires with dimensions t = 5 µm and d = 200 µm. We then tested this probe in a hydrogen plasma immersed in a background magnetic field of B ≈ 1 kG. The plasma was heated by microwaves via the electron cyclotron resonance. It was characterized by a density and temperature equal to n_e ≈ 10¹⁰ cm⁻³ and T_e ≈ 10 eV, respectively, which gave λ_De ≈ 300 µm. The collector's current-voltage characteristic demonstrated the interaction between the electric barrier at the collector and the hybrid electric-magnetic barrier at the grid, thereby establishing the basic principles of the probe.
(cont.) The characteristic also revealed the non-ideal behaviors associated with the electric hole in the mesh and the effects of space charge. These effects in conjunction with the poor signal-to-noise level of the data prevented the measurement of the distribution function. Still, we were able to extract the temperature anisotropy for an assumed two-temperature Maxwellian distribution. The value for this ratio was found to be greater than one (greater temperature for the perpendicular gyro-motion), which is plausible given the way in which the plasma is heated.
by Khashayar Shadman.
Ph.D.
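As a quick aside for readers weighing the design criteria quoted in the abstract above (entrance aperture within a factor of ten of λ_De, mesh spacing d much less than λ_De), the quoted Debye length follows directly from the stated plasma parameters. The back-of-envelope check below is ours, not code from the thesis.

```python
# Quick check of the electron Debye length quoted in the abstract above
# (independent back-of-envelope estimate, not code from the thesis).
import math

EPS0 = 8.854e-12   # vacuum permittivity, F/m
E = 1.602e-19      # elementary charge, C

def debye_length_m(T_e_eV, n_e_per_cm3):
    """lambda_De = sqrt(eps0 * kT_e / (n_e * e^2)); with T_e in eV, kT_e = e * T_e[eV]."""
    n_e = n_e_per_cm3 * 1e6          # convert cm^-3 to m^-3
    return math.sqrt(EPS0 * T_e_eV * E / (n_e * E * E))

lam = debye_length_m(T_e_eV=10.0, n_e_per_cm3=1e10)
print(f"lambda_De ~ {lam * 1e6:.0f} um")   # ~235 um, consistent with the ~300 um quoted
print("mesh spacing d = 200 um  ->  d <~ lambda_De, as the design rule requires")
```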
Kindberg, Peter. "Development of a miniature Gridded ion thruster." Thesis, Luleå tekniska universitet, Rymdteknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-65750.
Lee, Cameron C. "The Development of a Gridded Weather Typing Classification Scheme." Thesis, Kent State University, 2014. http://pqdtopen.proquest.com/#viewpdf?dispub=3618946.
Since their development in the 1990s, gridded reanalysis data sets have proven quite useful for a broad range of synoptic climatological analyses, especially those utilizing a map pattern classification approach. However, their use in broad-scale, surface weather typing classifications and applications has not yet been explored. This research details the development of such a gridded weather typing classification (GWTC) scheme using North American Regional Reanalysis data for 1979-2010 for the continental United States.
Utilizing eight-times daily observations of temperature, dew point, pressure, cloud cover, u-wind and v-wind components, the GWTC categorizes the daily surface weather of 2,070 locations into one of 11 discrete weather types, nine core types and two transitional types, that remain consistent throughout the domain. Due to the use of an automated deseasonalized z-score initial typing procedure, the character of each type is both geographically and seasonally relative, allowing each core weather type to occur at every location, at any time of the year. Diagnostic statistics reveal a high degree of spatial cohesion among the weather types classified at neighboring locations, along with an effective partitioning of the climate variability of individual locations (via a Variability Skill Score metric) into these 11 weather types. Daily maps of the spatial distribution of GWTC weather types across the United States correspond well to traditional surface weather maps, and comparisons of the GWTC with the Spatial Synoptic Classification are also favorable.
While the potential future utility of the classification is expected to be primarily for the resultant calendars of daily weather types at specific locations, the automation of the methodology allows the classification to be easily repeatable, and therefore, easily transportable to other locations, atmospheric levels, and data sets (including output from gridded general circulation models). Further, the enhanced spatial resolution of the GWTC may also allow for new applications of surface weather typing classifications in mountainous and rural areas not well represented by airport weather stations.
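The deseasonalized z-score step mentioned in the abstract above is the easiest part of the GWTC procedure to illustrate in code. The sketch below is a generic, hypothetical illustration of standardizing a daily series against a day-of-year climatology; it is not the GWTC implementation, and the ±15-day window is an assumption.

```python
# Minimal illustration of a deseasonalized z-score for one variable at one grid
# point (synthetic data); not the GWTC code, just the general idea.
import numpy as np

def deseasonalized_z(values, day_of_year, window=15):
    """Standardize each daily value against a +/-window-day day-of-year climatology."""
    values = np.asarray(values, dtype=float)
    day_of_year = np.asarray(day_of_year)
    z = np.empty_like(values)
    for d in np.unique(day_of_year):
        # all samples whose day-of-year falls within the window (wrapping the calendar)
        diff = np.minimum(np.abs(day_of_year - d), 365 - np.abs(day_of_year - d))
        pool = values[diff <= window]
        sel = day_of_year == d
        z[sel] = (values[sel] - pool.mean()) / pool.std(ddof=1)
    return z

# toy example: 10 years of synthetic daily temperature with a seasonal cycle
doy = np.tile(np.arange(1, 366), 10)
temp = 15 + 12 * np.sin(2 * np.pi * (doy - 100) / 365) + np.random.normal(0, 3, doy.size)
print(deseasonalized_z(temp, doy)[:5])
```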
Liu, Chao. "Variations of Global Ocean Salinity from Multiple Gridded Argo Products." Scholar Commons, 2019. https://scholarcommons.usf.edu/etd/7848.
Nchaba, Teboho. "Verification of gridded seasonal wind speed forecasts over South Africa." Master's thesis, University of Cape Town, 2013. http://hdl.handle.net/11427/4970.
Includes bibliographical references.
The Climate System Analysis Group (CSAG) at the University of Cape Town produces provisional global and Southern African seasonal wind forecasts generated using the United Kingdom Meteorological Office Atmospheric General Circulation Model (AGCM) HadAM3P (a non-standard version of HadAM3). This study examines the quality of the seasonal wind speed forecasts through a forecast verification process for continuous variables, using reanalysis products of the National Centers for Environmental Prediction and the Department of Energy (NCEP-DOE) as observational data. The verification analyses are performed using the summary measures Mean Error (ME), Mean Absolute Error (MAE), Mean Squared Error (MSE), correlation coefficients, and Linear Error in Probability Space (LEPS), together with the exploratory methods of scatter and conditional quantile plots. These methods are used to determine the aspects of forecast quality, namely bias, accuracy, reliability, resolution, and skill, over a 20-year period (1991 to 2010). The results of the study show that using both accuracy and skill measures in the verification analyses provides more information about the quality of the forecasts than using only one of them. In all provinces, the highest quality seasonal wind speed forecasts are made at 500 hPa and the lowest quality forecasts at 1000 hPa. Furthermore, regions, pressure levels, and seasons with the highest forecast quality share the common characteristic that their wind speeds are relatively high. The forecasts add value to the climatology and are thus a useful tool for wind assessment at a seasonal scale. It is suggested that adding spatial resolution to the forecasts through downscaling may prepare them for use in applications such as wind power output forecasting.
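For readers unfamiliar with the summary measures named above, a minimal sketch of the simpler ones (ME, MAE, MSE and the correlation coefficient) on paired forecast/observation samples follows; it uses synthetic data and omits LEPS and the graphical diagnostics.

```python
# Simple continuous-forecast verification measures on paired samples
# (synthetic data for illustration; LEPS and conditional-quantile plots omitted).
import numpy as np

def verify(forecast, observed):
    f, o = np.asarray(forecast, float), np.asarray(observed, float)
    err = f - o
    return {
        "ME":  err.mean(),               # bias
        "MAE": np.abs(err).mean(),       # accuracy
        "MSE": (err ** 2).mean(),        # accuracy (squared errors)
        "r":   np.corrcoef(f, o)[0, 1],  # association
    }

rng = np.random.default_rng(0)
obs = rng.gamma(shape=4.0, scale=2.0, size=1000)          # "observed" wind speeds, m/s
fcst = 0.9 * obs + rng.normal(0.5, 1.5, size=obs.size)    # a biased, noisy "forecast"
print(verify(fcst, obs))
```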
Hennig, Benjamin D. "Rediscovering the world : gridded cartograms of human and physical space." Thesis, University of Sheffield, 2011. http://etheses.whiterose.ac.uk/1671/.
Barthol, Christopher John. "Characterization and Modeling of Non-Volatile SONOS Semiconductor Memories with Gridded Capacitors." The Ohio State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=osu1418813120.
Kotsakis, Christophoros. "Multiresolution aspects of linear approximation methods in Hilbert spaces using gridded data." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape4/PQDD_0016/NQ54794.pdf.
Hardy, Benjamin Arik. "A New Method for the Rapid Calculation of Finely-Gridded Reservoir Simulation Pressures." Diss., Brigham Young University, 2005. http://contentdm.lib.byu.edu/ETD/image/etd1123.pdf.
Kluver, Daria B. "Characteristics and trends in North American snowfall from a comprehensive gridded data set." Access to citation, abstract and download form provided by ProQuest Information and Learning Company; downloadable PDF file, 159 p, 2007. http://proquest.umi.com/pqdweb?did=1338873801&sid=15&Fmt=1&clientId=8331&RQT=309&VName=PQD.
Swenson, Anthony P. "The Field-Programmable Gate Array Design of the Gridded Retarding Ion Distribution Sensor." DigitalCommons@USU, 2017. https://digitalcommons.usu.edu/etd/6876.
Weiss, Jeremy, and Michael Crimmins. "Better Coverage of Arizona's Weather and Climate: Gridded Datasets of Daily Surface Meteorological Variables." College of Agriculture, University of Arizona (Tucson, AZ), 2016. http://hdl.handle.net/10150/625414.
Many areas that use agricultural and environmental science for management and planning – ecosystem conservation, crop and livestock systems, water resources, forestry and wildland fire management, urban horticulture – often need historical records of daily weather for activities that range from modeling forage production to determining the frequency of freezing temperatures or heavy rainfall. In the past, such applications primarily have used station-based observations of meteorological variables like temperature and precipitation. However, weather stations are sparsely and irregularly located throughout Arizona, and due to the highly variable terrain across the state (Figure 1), information recorded at these sites may not represent meteorological conditions at distant, non-instrumented locations or over broad areas. This issue, along with others related to quality, length, and completeness of station records, can hinder the use of weather and climate data for agricultural and natural resources applications. In response to an increasing demand for spatially and temporally complete meteorological data as well as the potential constraints of station-based records, the number of gridded daily surface weather datasets is expanding. This bulletin reviews a current suite of these datasets, particularly those that integrate both atmospheric and topographic information in order to better model temperature and precipitation on relatively fine spatial scales, and is intended for readers with knowledge of weather, climate, and geospatial data. In addition to addressing how these datasets are developed and what their spatial domain and resolution, record length, and variables are, this bulletin also summarizes where and how to access these datasets, as well as the general suitability of these datasets for different uses.
Yi, Yuchan. "Determination of Gridded Mean Sea Surface From Altimeter Data of Topex, ERS-1 and GEOSAT /." The Ohio State University, 1995. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487931512620891.
Fabbiani-Leon, Angelique Marie. "Comparison method between gridded and simulated snow water equivalent estimates to in-situ snow sensor readings." Thesis, University of California, Davis, 2015. http://pqdtopen.proquest.com/#viewpdf?dispub=1604056.
California Department of Water Resources (DWR) Snow Surveys Section has recently explored the potential use of recently developed hydrologic models to estimate snow water equivalent (SWE) for the Sierra Nevada mountain range. DWR Snow Surveys Section's initial step is to determine how well these hydrologic models compare to the trusted regression equations, currently used by DWR Snow Surveys Section. A comparison scheme was ultimately developed between estimation measures for SWE by interpreting model results for the Feather River Basin from: a) National Aeronautics and Space Administration (NASA) Jet Propulsion Laboratory (JPL) gridded SWE reconstruction product, b) United States Geological Survey (USGS) Precipitation-Runoff Modeling System (PRMS), and c) DWR Snow Surveys Section regression equations. Daily SWE estimates were extracted from gridded results by computing an average SWE based on 1,000 ft elevation band increments from 3,000 to 10,000 ft (i.e. an elevation band would be from 3,000 to 4,000 ft). The dates used for processing average SWE estimates were cloud-free satellite image dates during snow ablation months, March to August, for years 2000–2012. The average SWE for each elevation band was linearly interpolated for each snow sensor elevation. The model SWE estimates were then compared to the snow sensor readings used to produce the snow index in DWR's regression equations. In addition to comparing JPL's SWE estimate to snow sensor readings, PRMS SWE variable for select hydrologic response units (HRU) were also compared to snow sensor readings. Research concluded with the application of statistical methods to determine the reliability in the JPL products and PRMS simulated SWE variable, with results varying depending on time duration being analyzed and elevation range.
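The elevation-band averaging and interpolation described above reduces to a few lines of array code. The sketch below uses made-up arrays and is only a schematic of that processing step, not the DWR, JPL or PRMS code.

```python
# Schematic of the elevation-band averaging and interpolation described above
# (made-up arrays; not the DWR/JPL/PRMS processing code).
import numpy as np

def swe_at_sensors(cell_elev_ft, cell_swe, sensor_elev_ft,
                   band_edges_ft=np.arange(3000, 11000, 1000)):
    """Average gridded SWE in 1,000 ft elevation bands, then linearly
    interpolate the band means to each snow-sensor elevation."""
    band_centers, band_means = [], []
    for lo, hi in zip(band_edges_ft[:-1], band_edges_ft[1:]):
        mask = (cell_elev_ft >= lo) & (cell_elev_ft < hi)
        if mask.any():
            band_centers.append(0.5 * (lo + hi))
            band_means.append(cell_swe[mask].mean())
    return np.interp(sensor_elev_ft, band_centers, band_means)

rng = np.random.default_rng(1)
elev = rng.uniform(3000, 10000, 5000)            # grid-cell elevations, ft
swe = np.clip(0.01 * (elev - 4500), 0, None)     # toy SWE that increases with elevation
print(swe_at_sensors(elev, swe, sensor_elev_ft=[5200, 6800, 8150]))
```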
Cui, Wenjun, Xiquan Dong, Baike Xi, and Aaron Kennedy. "Evaluation of Reanalyzed Precipitation Variability and Trends Using the Gridded Gauge-Based Analysis over the CONUS." AMER METEOROLOGICAL SOC, 2017. http://hdl.handle.net/10150/625780.
Meredith, Shaun Lee. "Construction of a gridded energy analyzer for measurements of ion energy distribution in the versatile toroidal facility." Thesis, Massachusetts Institute of Technology, 1998. http://hdl.handle.net/1721.1/50545.
Includes bibliographical references (p. 80-82).
The Versatile Toroidal Facility (VTF) at MIT's Plasma Science and Fusion Center provides a laboratory environment for studying ionospheric plasmas. Various plasma diagnostic devices have been created and used to study the VTF plasma since 1991. An accurate method for measuring VTF's ion characteristics has never been designed or installed in the laboratory facility. Gridded Energy Analyzers (GEA) are useful diagnostic tools for determining plasma ion energy distributions and ion temperature. Research was done on the theory behind Gridded Energy Analyzers and their applicability for use in the Versatile Toroidal Facility. A design and method for constructing a miniaturized GEA for VTF was developed and documented. The construction method covers material selection, machining, and assembly of VTF's miniature GEA. The miniature GEA is a non-perturbing probe used in VTF's plasma, which is approximately 3 cm in diameter. The GEA was constructed and preliminary experimental data was obtained. From this data VTF's ion temperature was found to be approximately 8eV and an ion distribution function was determined to be roughly Maxwellian in nature.
by Shaun L. Meredith.
S.M.
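The standard way a gridded energy analyzer of this kind yields an ion temperature is from the exponential fall-off of collector current with retarding voltage for a Maxwellian population. The sketch below fits that slope on synthetic data with an assumed plasma potential; it is illustrative only and is not the analysis code used in the thesis.

```python
# Standard retarding-potential analysis for a gridded energy analyzer:
# for Maxwellian ions, I(V) ~ I0 * exp(-e(V - V_p) / kT_i) once V exceeds the
# plasma potential, so T_i follows from the slope of ln(I) versus V.
# Synthetic data; not the analysis code from the thesis.
import numpy as np

T_i_true = 8.0      # eV, similar to the value reported above
V_plasma = 2.0      # V (assumed plasma potential)
V = np.linspace(3.0, 40.0, 80)                    # retarding grid bias, V
I = 1e-3 * np.exp(-(V - V_plasma) / T_i_true)     # ideal collector current, A
I *= 1 + np.random.default_rng(2).normal(0, 0.02, V.size)   # add 2% noise

slope, _ = np.polyfit(V, np.log(I), 1)            # ln I = const - V / T_i  (T_i in eV)
print(f"fitted T_i ~ {-1.0 / slope:.1f} eV (true {T_i_true} eV)")
```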
Derksen, Christopher Peter. "Integrating passive microwave remotely sensed imagery and gridded atmospheric data, a study of North American Prairie snow cover." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp05/NQ65231.pdf.
Hofstra, Nynke. "Development and evaluation of a European daily high-resolution gridded dataset of surface temperature and precipitation for 1950-2006." Thesis, University of Oxford, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.504022.
Nazemi, Jonathan H. (Jonathan Hesam) 1974. "Determination of the ion distribution function during magnetic reconnection in the versatile toroidal facility with a gridded energy analyzer." Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/16906.
Includes bibliographical references (leaf 44).
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
A gridded energy analyzer (GEA) diagnostic and associated electronics are designed and built to explore the evolution of the ion distribution function during driven magnetic reconnection in the Versatile Toroidal Facility. The temporal evolution of the ion characteristic is measured at different locations throughout the reconnection region, for a number of magnetic field configurations. The measured ion characteristics are found to be in excellent agreement with a theoretical fit constructed from a double Maxwellian distribution, from which the temperatures and drift velocities are found as functions of space and time. It is found that the ion temperature of each Maxwellian exhibits minor temporal variations during reconnection which are negligible in terms of ion heating. Additionally, the temperatures do not change significantly with varying radial position or magnetic cusp strength. The drift velocities are observed to evolve in time, to scale with magnetic cusp strength, and to depend on the exact location throughout the reconnection region. The ions are thus subject to acceleration by the electric field induced by the ohmic drive. The appearance of a double Maxwellian distribution during the reconnection drive is hypothesized to be due to double ionization of argon atoms. Repetition of the experiment with a hydrogen plasma confirmed that this scenario is the most probable.
by Jonathan H. Nazemi.
S.M.
Caetano, André Francisco Morielo. "Griddler: uma estratégia configurável para armazenamento distribuído de objetos peer-to-peer que combina replicação e erasure coding com sistema de cache" [Griddler: a configurable strategy for distributed peer-to-peer object storage that combines replication and erasure coding with a cache system]. São José do Rio Preto, 2017. http://hdl.handle.net/11449/151383.
Committee member: Geraldo Francisco Donega Zafalon
Committee member: Pedro Luiz Pizzigatti Correa
Resumo: Sistemas de gerenciamento de banco de dados, na sua essência, almejam garantir o armazenamento confiável da informação. Também é tarefa de um sistema de gerenciamento de banco de dados oferecer agilidade no acesso às informações. Nesse contexto, é de grande interesse considerar alguns fenômenos recentes: a progressiva geração de conteúdo não-estruturado, como imagens e vídeo, o decorrente aumento do volume de dados em formato digital nas mais diversas mídias e o grande número de requisições por parte de usuários cada vez mais exigentes. Esses fenômenos fazem parte de uma nova realidade, denominada Big Data, que impõe aos projetistas de bancos de dados um aumento nos requisitos de flexibilidade, escalabilidade, resiliência e velocidade dos seus sistemas. Para suportar dados não-estruturados foi preciso se desprender de algumas limitações dos bancos de dados convencionais e definir novas arquiteturas de armazenamento. Essas arquiteturas definem padrões para gerenciamento dos dados, mas um sistema de armazenamento deve ter suas especificidades ajustadas em cada nível de implementação. Em termos de escalabilidade, por exemplo, cabe a escolha entre sistemas com algum tipo de centralização ou totalmente descentralizados. Por outro lado, em termos de resiliência, algumas soluções utilizam um esquema de replicação para preservar a integridade dos dados por meio de cópias, enquanto outras técnicas visam a otimização do volume de dados armazenados. Por fim, ao mesmo tempo que são...
Abstract: Database management systems, in essence, aim to ensure the reliable storage of information. It is also the task of a database management system to provide agility in accessing information. In this context, it is of great interest to consider some recent phenomena: the progressive generation of unstructured content such as images and video, the consequent increase in the volume of data in digital format in the most diverse media and the large number of requests by users increasingly demanding. These phenomena are part of a new reality, named Big Data, that imposes on database designers an increase in the flexibility, scalability, resiliency, and speed requirements of their systems. To support unstructured data, it was necessary to get rid of some limitations of conventional databases and define new storage architectures. These architectures define standards for data management, but a storage system must have its specificities adjusted at each level of implementation. In terms of scalability, for example, it is up to the choice between systems with some type of centralization or totally decentralized. On the other hand, in terms of resiliency, some solutions utilize a replication scheme to preserve the integrity of the data through copies, while other techniques are aimed at optimizing the volume of stored data. Finally, at the same time that new network and disk technologies are being developed, one might think of using caching to optimize access to what is stored. This work explores and analyzes the different levels in the development of distributed storage systems. This work objective is to present an architecture that combines different resilience techniques. The scientific contribution of this work is, in addition to a totally decentralized suggestion of data allocation, the use of an access cache structure with adaptive algorithms in this environment
Master's (Mestre)
Caetano, André Francisco Morielo [UNESP]. "Griddler: uma estratégia configurável para armazenamento distribuído de objetos peer-to-peer que combina replicação e erasure coding com sistema de cache" [Griddler: a configurable strategy for distributed peer-to-peer object storage that combines replication and erasure coding with a cache system]. Universidade Estadual Paulista (UNESP), 2017. http://hdl.handle.net/11449/151383.
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Sistemas de gerenciamento de banco de dados, na sua essência, almejam garantir o armazenamento confiável da informação. Também é tarefa de um sistema de gerenciamento de banco de dados oferecer agilidade no acesso às informações. Nesse contexto, é de grande interesse considerar alguns fenômenos recentes: a progressiva geração de conteúdo não-estruturado, como imagens e vídeo, o decorrente aumento do volume de dados em formato digital nas mais diversas mídias e o grande número de requisições por parte de usuários cada vez mais exigentes. Esses fenômenos fazem parte de uma nova realidade, denominada Big Data, que impõe aos projetistas de bancos de dados um aumento nos requisitos de flexibilidade, escalabilidade, resiliência e velocidade dos seus sistemas. Para suportar dados não-estruturados foi preciso se desprender de algumas limitações dos bancos de dados convencionais e definir novas arquiteturas de armazenamento. Essas arquiteturas definem padrões para gerenciamento dos dados, mas um sistema de armazenamento deve ter suas especificidades ajustadas em cada nível de implementação. Em termos de escalabilidade, por exemplo, cabe a escolha entre sistemas com algum tipo de centralização ou totalmente descentralizados. Por outro lado, em termos de resiliência, algumas soluções utilizam um esquema de replicação para preservar a integridade dos dados por meio de cópias, enquanto outras técnicas visam a otimização do volume de dados armazenados. Por fim, ao mesmo tempo que são desenvolvidas novas tecnologias de rede e disco, pode-se pensar na utilização de caching para otimizar o acesso ao que está armazenado. Este trabalho explora e analisa os diferentes níveis no desenvolvimento de sistemas de armazenamento distribuído. O objetivo deste trabalho é apresentar uma arquitetura que combina diferentes técnicas de resiliência. A contribuição científica deste trabalho é, além de uma sugestão totalmente descentralizada de alocação dos dados, o uso de uma estrutura de cache de acesso nesse ambiente, com algoritmos adaptáveis.
Database management systems, in essence, aim to ensure the reliable storage of information. It is also the task of a database management system to provide agility in accessing information. In this context, it is of great interest to consider some recent phenomena: the progressive generation of unstructured content such as images and video, the consequent increase in the volume of data in digital format in the most diverse media and the large number of requests by users increasingly demanding. These phenomena are part of a new reality, named Big Data, that imposes on database designers an increase in the flexibility, scalability, resiliency, and speed requirements of their systems. To support unstructured data, it was necessary to get rid of some limitations of conventional databases and define new storage architectures. These architectures define standards for data management, but a storage system must have its specificities adjusted at each level of implementation. In terms of scalability, for example, it is up to the choice between systems with some type of centralization or totally decentralized. On the other hand, in terms of resiliency, some solutions utilize a replication scheme to preserve the integrity of the data through copies, while other techniques are aimed at optimizing the volume of stored data. Finally, at the same time that new network and disk technologies are being developed, one might think of using caching to optimize access to what is stored. This work explores and analyzes the different levels in the development of distributed storage systems. This work objective is to present an architecture that combines different resilience techniques. The scientific contribution of this work is, in addition to a totally decentralized suggestion of data allocation, the use of an access cache structure with adaptive algorithms in this environment.
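To make the replication-versus-erasure-coding trade-off discussed above concrete, the small comparison below contrasts the storage overhead and loss tolerance of plain replication with a generic (k, m) erasure code. The parameters are illustrative and are not Griddler's actual configuration.

```python
# Storage overhead and loss tolerance of plain replication versus a generic
# Reed-Solomon-style (k, m) erasure code; generic arithmetic, not Griddler's
# actual configuration.
def replication(copies):
    return {"overhead_x": copies, "fragments_lost_tolerated": copies - 1}

def erasure_code(k_data, m_parity):
    return {"overhead_x": (k_data + m_parity) / k_data,
            "fragments_lost_tolerated": m_parity}

print("3x replication:", replication(3))        # 3.0x storage, survives 2 lost copies
print("RS(10, 4)     :", erasure_code(10, 4))   # 1.4x storage, survives 4 lost fragments
```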
Dobkevicius, Mantas. "Modelling and design of inductively coupled radio frequency gridded ion thrusters with an application to Ion Beam Shepherd type space missions." Thesis, University of Southampton, 2017. https://eprints.soton.ac.uk/413768/.
Mao, Ganquan [Verfasser], and Harald [Akademischer Betreuer] Kunstmann. "Stochastic Bias Correction of Dynamically Downscaled Precipitation Fields for Germany Through Copula-based Integration of Gridded Observation Data / Ganquan Mao. Betreuer: Harald Kunstmann." Augsburg : Universität Augsburg, 2016. http://d-nb.info/111353480X/34.
Baro, Johanna. "Modélisation multi-échelles de la morphologie urbaine à partir de données carroyées de population et de bâti" [Multi-scale modelling of urban morphology from gridded population and building data]. Thesis, Paris Est, 2015. http://www.theses.fr/2015PEST1004/document.
For the past couple of decades, the relationships between urban form and travel patterns have been central to thinking on sustainable urban planning and transport policy. In this context, the growing availability of regular grid data offers a new perspective for modeling urban structures from density measurements freed from the constraints of administrative divisions. Population density data are now available on 200-meter grids covering France. We complement these data with built-area densities in order to propose two types of classified images adapted to the study of travel patterns and urban development: classifications of urban fabrics and classifications of morphotypes of urban development. The construction of such classified images rests on theoretical and experimental grounds that raise methodological issues regarding the classification of statistically diverse urban spaces. To process those spaces exhaustively, we propose a per-pixel classification method for urban fabrics based on supervised transfer learning, in which hidden Markov random fields are used to take the dependencies in the spatial data into account. The classifications of morphotypes are then obtained by enriching the knowledge of urban fabrics; these classifications are formalized from chorematic theoretical models and implemented by qualitative spatial reasoning. The analysis of these classifications with methods of quantitative spatial reasoning and factor analysis allowed us to reveal the morphological diversity of 50 metropolitan areas, and it highlights the relevance of these classifications for characterizing urban areas with respect to various development issues related to density and multipolar development.
Bek, Jeremy. "Design, simulation, and testing of an electric propulsion cluster frame." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-300970.
In general, electric propulsion offers high efficiency but relatively low thrust. To remedy this, several ion thrusters can be assembled in a cluster configuration and operated in parallel. This requires the careful design of a frame to house the individual thrusters. The frame must be modular so that it can be used in different cluster sizes, and it must satisfy thermal and mechanical requirements to ensure the nominal operation of the thrusters. This report presents the design process of such a frame, from preliminary modelling to the experimental study of a prototype. The document gives an overview of the iterative design process, driven by thermal simulations performed with COMSOL Multiphysics, that led to the conception of a 2-thruster and a 4-thruster frame. A lumped element model of the ion thruster was also created to model its complex thermal behaviour. In addition, the 2-thruster frame was studied mechanically with analytical calculations and simulations of simple load cases in SolidWorks. Finally, a prototype based on the 2-thruster frame model was assembled. The prototype was used to take temperature measurements while hosting two ion thrusters in a vacuum chamber. The temperature distribution in the cluster was measured and compared with simulation results. Thermal simulations of the 2-thruster and 4-thruster frames showed promising results, while mechanical simulations of the 2-thruster version satisfied all requirements. Furthermore, the experimental results largely agreed with the thermal simulations of the prototype. Finally, the lumped element model was very useful for calibrating the other models thanks to its high flexibility and fast computation time.
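A "lumped element model" of the kind mentioned above treats each body as a single thermal node with a heat capacity, linked to other nodes by conductances. The two-node sketch below (thruster body and frame, with generic values, integrated by forward Euler) is only meant to fix the idea; it is not the model built in the thesis.

```python
# Generic two-node lumped-element thermal network (thruster body <-> frame),
# integrated with forward Euler, to fix what a "lumped element model" means.
# Generic values; not the model built in the thesis.
import numpy as np

C = np.array([300.0, 800.0])      # heat capacities, J/K (node 0: thruster, 1: frame)
G = 0.8                           # thermal conductance between the nodes, W/K
G_env = np.array([0.05, 0.3])     # conductance from each node to the environment, W/K
P = np.array([25.0, 0.0])         # dissipated power, W
T_env, dt, steps = 293.0, 1.0, 20000

T = np.array([T_env, T_env])
for _ in range(steps):
    q_link = G * (T[0] - T[1])                                  # heat flow node 0 -> 1
    dT0 = (P[0] - q_link - G_env[0] * (T[0] - T_env)) / C[0]
    dT1 = (P[1] + q_link - G_env[1] * (T[1] - T_env)) / C[1]
    T = T + dt * np.array([dT0, dT1])

print(f"near-steady temperatures: thruster {T[0]:.0f} K, frame {T[1]:.0f} K")
```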
Holzer, Nicolai. "Development of an interface for the conversion of geodata in a NetCDF data model and publication of this data by the use of the web application DChart, related to the CEOP-AEGIS project." Master's thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2011. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-71492.
The Tibetan Plateau, with an extent of 2.5 million square kilometres and an average elevation of more than 4,700 metres, substantially influences the Asian monsoon and, with its snow and ice reserves, regulates the water balance of the upper reaches of the seven most important rivers of Southeast Asia. 1.4 billion people depend on this water supply, as do agriculture, the economy, and the entire ecosystem of the region. As the increasing number of droughts and floods shows, these seasonally controlled water reserves appear to be affected by climate change, with negative consequences for the downstream river basins and hence for food security there. The international cooperation project CEOP-AEGIS, funded by the European Commission under the Seventh Framework Programme, therefore aims to further investigate the hydrology and meteorology of this plateau in order to reach a deeper understanding of its role with respect to the climate, the monsoon, and the increasing number of extreme weather events. Within this project, a wide variety of Earth observation data from remote sensing systems, numerical simulations, and ground station measurements are collected and evaluated. All end products of the CEOP-AEGIS project are made available to the scientific community through an internet-accessible database, which is a contribution to the GEOSS initiative (Global Earth Observing System of Systems). Behind the scenes, the CEOP-AEGIS data portal is based on a Dapper OPeNDAP web server, which serves the data, stored in the NetCDF file format, to the user-facing web-based DChart interface via the OPeNDAP protocol. Input data from project partners are heterogeneous not only in their content but also in their data management and metadata description. The data and metadata of the project's end products stored in NetCDF, however, must follow international conventions on a standardized basis so that a high degree of interoperability can be achieved. In view of these quality requirements, the technical capabilities of NetCDF, OPeNDAP, Dapper, and DChart were thoroughly examined in this thesis, so that a sound decision could be derived regarding the implementation of an interoperable NetCDF data model suitable for CEOP-AEGIS data, intended to ensure maximum compatibility and functionality with OPeNDAP and Dapper / DChart. This NetCDF implementation is part of a newly developed data interface that converts and aggregates heterogeneous data from project partners into standardized NetCDF data sets so that they can be delivered via OPeNDAP to the CEOP-AEGIS data portal based on the Dapper / DChart technology. Particular emphasis in the development of this data interface was placed on an intermediate data and metadata layer that allows its elements to be modified, and thus standardized NetCDF files to be produced, in a simple way and with little effort.
In view of the considerable and highly diverse geodata of this project, it was ultimately essential to develop a high-quality data interface for converting heterogeneous input data from project partners into standardized and aggregated NetCDF output files, in order to achieve maximum compatibility and functionality with the CEOP-AEGIS data portal and, consequently, a high degree of interoperability within the scientific community.
Sexton, Christopher G. "Vectorization of gridded urban land use data." 2007. http://www.lib.ncsu.edu/theses/available/etd-11082007-161933/unrestricted/etd.pdf.
Hsu, Yi-hsiang, and 許益祥. "An Effective Crosstalk Optimizer in Gridded Channel Routing." Thesis, 2002. http://ndltd.ncl.edu.tw/handle/15240217195902731878.
Full text中原大學
電子工程研究所
90
Wire-to-wire spacing in a VLSI chip shrinks as fabrication technology rapidly evolves. With VLSI design entering the deep sub-micron regime, feature sizes continue to decrease and operating frequencies continue to rise, and many digital and analog circuits are integrated into a single chip. When these circuits operate in a low-voltage system, the chip becomes very sensitive to noise. Crosstalk arising from the coupling capacitance between two parallel wires becomes a major source of signal delay: the coupling affects the speed of signal transitions, and the resulting crosstalk can cause the chip to malfunction. In this thesis, we propose a timing-driven gridded channel routing algorithm for minimizing the crosstalk problem. Compared with previous works, the main distinction of our approach is that it considers signal arrival times, so that the actual delay degradation caused by crosstalk can be estimated precisely. The algorithm was implemented in C++ and run on several benchmark circuits. Experimental data consistently show that our approach is effective and efficient for crosstalk minimization.
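Two quantities such a timing-driven, crosstalk-aware router must weigh are the parallel-run overlap of adjacent wires (which sets the coupling capacitance) and whether the coupled nets switch in overlapping time windows. The sketch below illustrates both with toy numbers; it is not the algorithm from the thesis.

```python
# Minimal sketch of two quantities a timing-driven, crosstalk-aware channel
# router weighs: parallel overlap length of adjacent wires (coupling capacitance
# grows with it) and whether the nets' switching windows overlap in time.
# Illustrative only; not the algorithm from the thesis.
def overlap(a_start, a_end, b_start, b_end):
    """Length of the overlap of two 1-D intervals (0 if disjoint)."""
    return max(0.0, min(a_end, b_end) - max(a_start, b_start))

def coupling_risk(net_a, net_b, cap_per_unit=0.05):
    """Crude risk score: coupling capacitance if the switching windows overlap, else 0."""
    c_couple = cap_per_unit * overlap(net_a["x0"], net_a["x1"], net_b["x0"], net_b["x1"])
    same_window = overlap(net_a["t0"], net_a["t1"], net_b["t0"], net_b["t1"]) > 0
    return c_couple if same_window else 0.0

a = {"x0": 0, "x1": 120, "t0": 0.0, "t1": 0.4}   # wire span (um) and switching window (ns)
b = {"x0": 40, "x1": 200, "t0": 0.3, "t1": 0.7}
print(coupling_risk(a, b))   # nonzero: 80 um of parallel run and overlapping windows
```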
Seglenieks, Frank. "Creation of a gridded time series of hydrological variables for Canada." Thesis, 2009. http://hdl.handle.net/10012/4513.
Σαλαμαλίκης, Βασίλειος. "Ανάπτυξη και αξιολόγηση μεθοδολογίας για τη δημιουργία πλεγματικών (gridded) ισοτοπικών δεδομένων" [Development and evaluation of a methodology for the creation of gridded isotopic data]. Thesis, 2010. http://nemertes.lis.upatras.gr/jspui/handle/10889/4252.
Several climatic, hydrological and environmental studies require accurate knowledge of the spatial distribution of stable isotopes in precipitation. Since the rain sampling stations used for isotope analysis are few and not evenly distributed around the globe, the global distribution of stable isotopes can be calculated via the production of gridded isotopic data sets. Several methods have been proposed for this purpose; some of them use empirical equations and geostatistical methods in order to minimize errors due to interpolation. In this work a methodology is proposed for the development of 10΄ × 10΄ gridded isotopic data for precipitation in the Central and Eastern Mediterranean. Statistical models are developed that take geographical and meteorological parameters as independent variables. The initial methodology takes into account only the altitude and latitude of an area; however, since the isotopic composition of precipitation also depends on longitude, the existing models have been modified by adding meteorological parameters as independent variables. A series of models is proposed that takes into account some or a combination of the above-mentioned variables. The models are validated using the Thin Plate Smoothing Splines (TPSS) and Ordinary Kriging (OK) methods.
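The first modelling step described above amounts to a multiple regression of an isotopic value on geographical predictors such as altitude and latitude. The toy least-squares fit below, on synthetic data, only illustrates that step; it is not the thesis's fitted model.

```python
# Toy version of the first modelling step described above: regress an isotopic
# value (e.g. delta-18O of precipitation) on station altitude and latitude by
# ordinary least squares. Synthetic data; not the thesis's fitted model.
import numpy as np

rng = np.random.default_rng(4)
alt = rng.uniform(0, 2500, 200)          # station altitude, m
lat = rng.uniform(30, 45, 200)           # station latitude, degrees N
d18o = -4.0 - 0.0025 * alt - 0.3 * (lat - 30) + rng.normal(0, 0.5, 200)

X = np.column_stack([np.ones_like(alt), alt, lat])   # design matrix with intercept
coef, *_ = np.linalg.lstsq(X, d18o, rcond=None)
print("intercept, altitude and latitude coefficients:", np.round(coef, 4))
```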
Hsu, Pei-Hsuan, and 徐沛諠. "Cut Redistribution and Guiding Template Assignment Using DSA for 1D Gridded Layout." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/72854648861832210698.
Chung Yuan Christian University
Graduate Institute of Information Engineering
Academic year 105 (ROC calendar, 2016/17)
Many semiconductor companies are gradually transitioning from complex 2D designs to highly regular 1D gridded designs. In this transition, the placement of cuts has become the most important challenge: for advanced nanometer designs, cuts may be too dense to be printed by 193 nm immersion (193i) lithography. Several techniques have been proposed, e.g. Self-Aligned Double Patterning (SADP), high-resolution E-beam lithography (EBL) and Directed Self-Assembly (DSA); DSA has shown great potential in recent years. In a 1D gridded design, metal cuts divide the dense parallel lines into real wires and dummy wires. Real wires implement the function of the target circuit, and the remaining segments are called dummy wires. To preserve the original layout function, a cut may only be moved within a dummy wire. If the distance between two cuts on the same real wire is less than the design rule and the cuts do not belong to the same DSA template, the pair is called a conflict; the conflict condition between two templates is similar. This thesis discusses how to pick appropriate DSA patterns to realize the cuts and produce a 1D gridded design with the same function as the original circuit. The DSA pattern shape and size follow the assigned guiding template. In our experiments, a simulated annealing algorithm is applied to select patterns so as to minimize the number of conflicts, and a post-processing phase after the SA-based matching maximizes the number of double cuts. In terms of the number of conflicts, we achieve an improvement rate of 104150% compared to the result of Xiao et al. [14] and a 56% improvement compared to Li's study [7]. In the double-cut stage, the cut count increases by 12.18592593% on average.
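The conflict rule described above reduces to a simple geometric test on each real wire. The sketch below checks it for a hypothetical list of cut positions and template assignments; it is illustrative only and is not the thesis's simulated-annealing matcher.

```python
# Geometric reading of the conflict rule described above: two cuts on the same
# real wire conflict when they are closer than the minimum printable spacing and
# are not grouped into the same DSA guiding template. Illustrative sketch only.
def conflicts(cut_positions, template_of, min_spacing):
    """cut_positions: x-coordinates of cuts on one wire (same units as min_spacing);
    template_of: template id per cut (None = unassigned / single cut)."""
    bad = []
    for i in range(len(cut_positions)):
        for j in range(i + 1, len(cut_positions)):
            too_close = abs(cut_positions[i] - cut_positions[j]) < min_spacing
            same_template = (template_of[i] is not None
                             and template_of[i] == template_of[j])
            if too_close and not same_template:
                bad.append((i, j))
    return bad

cuts = [100, 140, 400]              # nm along one real wire
templates = ["T1", None, None]      # cut 0 sits in a template, cuts 1 and 2 are single
print(conflicts(cuts, templates, min_spacing=80))   # [(0, 1)]: 40 nm apart, not grouped
```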
黎鈺琦. "Hybrid Lithography Optimization with DSA and E-beam for 1D Gridded Layout." Thesis, 2014. http://ndltd.ncl.edu.tw/handle/34445087566108858225.
National Tsing Hua University
Department of Computer Science
Academic year 103 (ROC calendar, 2014/15)
As the integrated circuit industry continues to scale down the minimum feature size, design style is moving from complex 2D design to highly regular 1D design with the "lines and cuts" process. While dense lines can be fully optimized by their regularity, randomly distributed cuts become a major challenge for the fabrication of 1D layout. In this thesis, we consider a hybrid lithography process combining directed self-assembly (DSA) and E-Beam for 1D gridded layout, and present a cut redistribution algorithm to maximize the number of cuts matched with DSA templates and minimize the number of cuts printed by E-Beam. The robustness of our algorithm is demonstrated through encouraging experimental results.
Erzeybek, Balan Selin. "Characterization and modeling of paleokarst reservoirs using multiple-point statistics on a non-gridded basis." 2012. http://hdl.handle.net/2152/19604.
Thormodsgard, June M. "Analysis of the gridded finite element surface interpolation for geometric rectification and mosaicking of landsat data." 1986. http://catalog.hathitrust.org/api/volumes/oclc/15581760.html.
Typescript. Description based on print version record. Includes bibliographical references (leaves 41-43).
Pan, H. B., and 潘宏彬. "The measurement and analysis of wafer type gridded energy analyzer in a pulse-time modulated inductively coupled plasma." Thesis, 2001. http://ndltd.ncl.edu.tw/handle/17626706899889892071.
Glover, Timothy Ward. "Measurement of plasma parameters in the exhaust of a magnetoplasma rocket by gridded energy analyzer and emissive Langmuir probe." Thesis, 2002. http://hdl.handle.net/1911/18084.
Shafer, Phillip Edmond. "Developing gridded forecast guidance for warm season lightning over Florida using the perfect prognosis method and mesoscale model output." 2007. http://etd.lib.fsu.edu/theses/available/etd-04022007-161314.
Advisor: Henry E. Fuelberg, Florida State University, College of Arts and Sciences, Dept. of Meteorology. Title and description from dissertation home page (viewed July 24, 2007). Document formatted into pages; contains xiii, 87 pages. Includes bibliographical references.
Bhowmik, Avit Kumar. "Evaluation of Spatial interpolation techniques for mapping climate variables with low sample density: a case study using a new gridded dataset of Bangladesh." Master's thesis, 2012. http://hdl.handle.net/10362/8321.
This study explores and analyses the impact of sample density on the performance of spatial interpolation techniques. It evaluates two deterministic techniques, Thin Plate Spline and Inverse Distance Weighting, and two stochastic techniques, Ordinary Kriging and Universal Kriging, for interpolating two climate indices, Annual Total Precipitation in Wet Days and the Yearly Maximum Value of the Daily Maximum Temperature, in a low sample density region, Bangladesh, over 60 years (1948 to 2007). It applies the approach of Spatially Shifted Years to create mean variograms that cope with the low sample density. Seven performance measurements - Mean Absolute Error, Root Mean Square Error, Systematic Root Mean Square Error, Unsystematic Root Mean Square Error, Index of Agreement, Coefficient of Variation of Prediction and Confidence of Prediction - are applied to evaluate the performance of the spatial interpolation techniques. The resulting measurements indicate that for most years the Universal Kriging method performs better for interpolating total precipitation, and the Ordinary Kriging method performs better for interpolating the maximum temperature. Although the difference surfaces indicate very little difference in the estimating ability of the four spatial interpolation techniques, the residual plots reveal differences among the interpolated surfaces in terms of over- and under-estimation. The results also indicate that the Inverse Distance Weighting method performs better for both indices when the sample density is very low, but its performance is compromised by the inclusion of measurement errors in the interpolated surfaces. All the error measurements show a decreasing trend with increasing sample density, and the index of agreement and confidence of prediction show an increasing trend over the years. Finally, the strong correlation between the sample coefficient of variation and the performance measurements implies that the more representative the samples are of the climate phenomenon, the better the performance of the spatial interpolation techniques. The correlation between the sample coefficient of variation and the number of samples implies that high representativity of the sample is attainable with an increased sample density.
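Of the four interpolators compared above, Inverse Distance Weighting is the most compact to write down. The snippet below interpolates a synthetic precipitation field from sparse "stations" and is shown only to fix ideas; the study itself also used splines and kriging, which are not reproduced here.

```python
# Compact inverse distance weighting (IDW), one of the four interpolators the
# study compares; synthetic points, shown only to fix ideas.
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    # pairwise distances between query points and known points, shape (n_query, n_known)
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, eps) ** power
    return (w @ z_known) / w.sum(axis=1)

rng = np.random.default_rng(3)
stations = rng.uniform(0, 100, (25, 2))                        # sparse "station" locations
precip = 1800 + 8 * stations[:, 0] + rng.normal(0, 50, 25)     # toy precipitation values
grid = np.array([[10.0, 20.0], [50.0, 50.0], [90.0, 80.0]])    # query points
print(idw(stations, precip, grid))
```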
Ζάγουρας, Αθανάσιος. "Μέθοδοι εξαγωγής και ψηφιακής επεξεργασίας περιβαλλοντικών σημάτων και εικόνων – Εφαρμογή στην αυτόματη ταξινόμηση χαρτών καιρού" [Methods for the extraction and digital processing of environmental signals and images – Application to the automatic classification of weather maps]. Thesis, 2012. http://hdl.handle.net/10889/6076.
The synoptic classification of weather systems is involved in a variety of environmental applications. Recently, synoptic classification has been found to be relevant to the field of air pollution: knowing the synoptic climatology of a region allows the prediction, and possibly the prevention, of pollution incidents originating either from local sources or from the transport of pollutants. This knowledge is greatly enhanced by the categorization (classification) of the synoptic conditions in a given area. In recent years 'automatic', non-empirical, classification methods have been developed using computers; so far these efforts have been based on classical statistical methods. The aim of this PhD thesis is the development of methods and the implementation of algorithms for extracting and digitally processing environmental signals and images. On this basis, expert systems for the synoptic classification of weather systems are created using methods drawn from image processing, data analysis and clustering, pattern recognition and graph theory. The originality of this research lies in the fact that the presented techniques have successfully addressed a number of classification problems across different topics. It is the first time that such methods have been applied in Greece to meteorology, climatology and environmental physics, namely to the synoptic classification of weather systems. The modern methods proposed in this PhD thesis are competitive both in classification quality and in computational time.