
Dissertations / Theses on the topic 'Gridders'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 39 dissertations / theses for your research on the topic 'Gridders.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Shadman, K. (Khashayar) 1972. "The gridded electromagnet probe." Thesis, Massachusetts Institute of Technology, 2003. http://hdl.handle.net/1721.1/17041.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Nuclear Engineering, 2003.
Includes bibliographical references.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
We attempted to measure the anisotropy in the electron distribution function in a magnetized plasma by exploiting the adiabatic invariance of the electron's magnetic moment with a probe comprising a grid, a collector, and an inertially cooled electromagnet. The electric mirror force of the grid, which is located at the electromagnet throat, reduces the demand on the magnetic mirror force and thereby on the electromagnet current, which then allows for the construction of a compact probe that can be inserted inside the plasma chamber. An analysis of the effects of space charge inside the grid-collector cavity revealed that the size of the probe's entrance aperture, b, which gives the size of the plasma beam inside the probe, should be chosen to be within a factor of ten of the electron Debye length λ_De. In addition, an analysis of the discrete structure of the grid showed that the mesh wire spacing d should be chosen to be much less than λ_De. Also, the wire thickness t should be chosen to be much less than d. We built a probe with a grid of tungsten wires with dimensions t = 5 μm and d = 200 μm. We then tested this probe in a hydrogen plasma immersed in a background magnetic field of B ≈ 1 kG. The plasma was heated by microwaves via the electron cyclotron resonance. It was characterized by a density and temperature of n_e ≈ 10¹⁰ cm⁻³ and T_e ≈ 10 eV, respectively, which gave λ_De ≈ 300 μm. The collector's current-voltage characteristic demonstrated the interaction between the electric barrier at the collector and the hybrid electric-magnetic barrier at the grid, thereby establishing the basic principles of the probe.
(cont.) The characteristic also revealed the non-ideal behaviors associated with the electric hole in the mesh and the effects of space charge. These effects in conjunction with the poor signal-to-noise level of the data prevented the measurement of the distribution function. Still, we were able to extract the temperature anisotropy for an assumed two-temperature Maxwellian distribution. The value for this ratio was found to be greater than one (greater temperature for the perpendicular gyro-motion), which is plausible given the way in which the plasma is heated.
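As a quick consistency check on the parameters quoted in this abstract, the electron Debye length follows from the standard formula λ_De = √(ε₀kT_e / n_e e²). The sketch below is not from the thesis; it simply plugs in the abstract's values for n_e and T_e.

```python
import math

# Physical constants (SI)
EPS0 = 8.854e-12   # vacuum permittivity, F/m
QE = 1.602e-19     # elementary charge, C

def debye_length_m(T_e_eV, n_e_per_cm3):
    """Electron Debye length sqrt(eps0 * kT_e / (n_e * e^2)) in metres.

    With kT_e expressed in eV, one factor of e cancels:
    lambda_De = sqrt(eps0 * T_e[eV] / (n_e[m^-3] * e)).
    """
    n_e_m3 = n_e_per_cm3 * 1e6   # convert cm^-3 -> m^-3
    return math.sqrt(EPS0 * T_e_eV / (n_e_m3 * QE))

# Plasma parameters quoted in the abstract
lam = debye_length_m(T_e_eV=10.0, n_e_per_cm3=1e10)

# Grid dimensions quoted in the abstract: t = 5 um, d = 200 um
t, d = 5e-6, 200e-6
print(f"lambda_De = {lam * 1e6:.0f} um")   # ~235 um, of order the ~300 um quoted
```

The computed value agrees with the quoted λ_De ≈ 300 μm to within the rounding of the quoted n_e and T_e.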
by Khashayar Shadman.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
2

Kindberg, Peter. "Development of a miniature Gridded ion thruster." Thesis, Luleå tekniska universitet, Rymdteknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-65750.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Lee, Cameron C. "The Development of a Gridded Weather Typing Classification Scheme." Thesis, Kent State University, 2014. http://pqdtopen.proquest.com/#viewpdf?dispub=3618946.

Full text
Abstract:

Since their development in the 1990s, gridded reanalysis data sets have proven quite useful for a broad range of synoptic climatological analyses, especially those utilizing a map pattern classification approach. However, their use in broad-scale surface weather typing classifications and applications has not yet been explored. This research details the development of such a gridded weather typing classification (GWTC) scheme using North American Regional Reanalysis data for 1979-2010 for the continental United States.

Utilizing eight-times daily observations of temperature, dew point, pressure, cloud cover, u-wind and v-wind components, the GWTC categorizes the daily surface weather of 2,070 locations into one of 11 discrete weather types, nine core types and two transitional types, that remain consistent throughout the domain. Due to the use of an automated deseasonalized z-score initial typing procedure, the character of each type is both geographically and seasonally relative, allowing each core weather type to occur at every location, at any time of the year. Diagnostic statistics reveal a high degree of spatial cohesion among the weather types classified at neighboring locations, along with an effective partitioning of the climate variability of individual locations (via a Variability Skill Score metric) into these 11 weather types. Daily maps of the spatial distribution of GWTC weather types across the United States correspond well to traditional surface weather maps, and comparisons of the GWTC with the Spatial Synoptic Classification are also favorable.

While the potential future utility of the classification is expected to be primarily for the resultant calendars of daily weather types at specific locations, the automation of the methodology allows the classification to be easily repeatable, and therefore, easily transportable to other locations, atmospheric levels, and data sets (including output from gridded general circulation models). Further, the enhanced spatial resolution of the GWTC may also allow for new applications of surface weather typing classifications in mountainous and rural areas not well represented by airport weather stations.
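The deseasonalized z-score step at the heart of the initial typing procedure can be illustrated with a minimal sketch. The windowed day-of-year climatology below is an assumption for illustration only; the GWTC's actual procedure (eight-times-daily data, multiple variables, 2,070 locations) is considerably more involved.

```python
import numpy as np

def deseasonalized_z(values, day_of_year, window=15):
    """Z-score each daily value against its day-of-year climatology.

    For every calendar day, pool all observations within +/- `window`
    days (wrapping around the year), then standardize that day's values
    by the pool's mean and standard deviation. This makes each value
    seasonally relative, as in the GWTC's initial typing step.
    """
    values = np.asarray(values, dtype=float)
    doy = np.asarray(day_of_year)
    z = np.empty_like(values)
    for d in np.unique(doy):
        # circular day-of-year distance, so late December neighbors early January
        dist = np.minimum(np.abs(doy - d), 365 - np.abs(doy - d))
        pool = values[dist <= window]
        z[doy == d] = (values[doy == d] - pool.mean()) / pool.std()
    return z
```

Because every location is standardized against its own climatology, the same core weather types can occur anywhere, at any time of year, which is the property the abstract emphasizes.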

APA, Harvard, Vancouver, ISO, and other styles
4

Liu, Chao. "Variations of Global Ocean Salinity from Multiple Gridded Argo Products." Scholar Commons, 2019. https://scholarcommons.usf.edu/etd/7848.

Full text
Abstract:
Salinity is one of the fundamental ocean state variables. Variations of ocean salinity can be used to infer changes in the global water cycle and air-sea freshwater exchange. Many institutions have developed gridded Argo products of global coverage. However, the existing gridded salinity products have not yet been systematically intercompared and assessed. In this study, the mean state, annual and interannual variability, and decadal changes of ocean salinity from five Argo-based gridded salinity products, available from the UK Met Office, JAMSTEC, Scripps Institution of Oceanography, China Second Institute of Oceanography, and International Pacific Research Center, are examined and compared for their overlapping period of 2005-2015 within two depth intervals (0-700 m and 700-2000 m), as well as at the sea surface. Though some global and regional features are relatively reproducible, obvious discrepancies are found, particularly for the deeper layer. These discrepancies are not apparent in the 11-year climatological mean or the trend patterns, but are readily evident in the temporal variations. Potentially undersampled current systems in the North Atlantic and Southern Ocean are among the main reasons for the observed discrepancies. The gridded products from Scripps, JAMSTEC, and the Met Office show large deviations from the ensemble mean, particularly in regions like the Atlantic Ocean and the tropical Pacific. Large disagreements are found in the first and final years, which can lead to different estimates of decadal trends. This study can serve as a useful reference for how to utilize and improve the existing gridded salinity products.
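The deviation-from-ensemble-mean comparison described above can be sketched as follows. The product names echo three of the five sources, but the numbers are invented stand-ins, not data from the study.

```python
import numpy as np

# Toy stand-in: each product's basin-average salinity anomaly (psu)
# over four years. Values are illustrative only.
products = {
    "UKMO":    np.array([0.010, 0.020, 0.000, -0.010]),
    "JAMSTEC": np.array([0.020, 0.010, 0.010,  0.000]),
    "Scripps": np.array([0.000, 0.030, -0.020, 0.010]),
}

# Ensemble mean across products at each time step
ensemble_mean = np.mean(list(products.values()), axis=0)

# RMS deviation of each product from the ensemble mean
rms_dev = {name: float(np.sqrt(np.mean((s - ensemble_mean) ** 2)))
           for name, s in products.items()}
```

A product with a large RMS deviation is the kind the study flags as disagreeing with the ensemble in particular regions or years.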
APA, Harvard, Vancouver, ISO, and other styles
5

Nchaba, Teboho. "Verification of gridded seasonal wind speed forecasts over South Africa." Master's thesis, University of Cape Town, 2013. http://hdl.handle.net/11427/4970.

Full text
Abstract:
Includes abstract.
Includes bibliographical references.
The Climate System Analysis Group (CSAG) at the University of Cape Town produces provisional global and Southern African seasonal wind forecasts generated using the United Kingdom Meteorological Office Atmospheric General Circulation Model (AGCM) HadAM3P (a non-standard version of HadAM3). This study examines the quality of the seasonal wind speed forecasts through a forecast verification process for continuous variables, using reanalysis products of the National Centers for Environmental Prediction and the Department of Energy (NCEP-DOE) as observation data. The verification analyses are performed using the summary measures Mean Error (ME), Mean Absolute Error (MAE), Mean Squared Error (MSE), correlation coefficients, and Linear Error in Probability Space (LEPS), together with the exploratory methods of scatter and conditional quantile plots. These methods are used to determine the aspects of forecast quality, namely bias, accuracy, reliability, resolution, and skill, over a 20-year period (1991 to 2010). The results show that using both accuracy and skill measures in the verification analyses provides more information about the quality of the forecasts than using only one of them. In all provinces, the highest quality seasonal wind speed forecasts are made at 500 hPa and the lowest quality forecasts at 1000 hPa. Furthermore, regions, pressure levels, and seasons with the highest forecast quality share the common characteristic that their wind speeds are relatively high. The forecasts add value to the climatology and thus are a useful tool for wind assessment at a seasonal scale. It is suggested that adding spatial resolution to the forecasts through downscaling may prepare them for use in applications such as wind power output forecasting.
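The summary measures named in this abstract follow their standard definitions; a minimal sketch is below. LEPS is omitted because it requires the observed climatological distribution, which this sketch does not assume.

```python
import numpy as np

def verification_summary(forecast, observed):
    """Continuous-variable verification measures (standard definitions).

    ME measures bias, MAE and MSE measure accuracy, and the Pearson
    correlation measures linear association between forecasts and
    observations.
    """
    f = np.asarray(forecast, dtype=float)
    o = np.asarray(observed, dtype=float)
    err = f - o
    return {
        "ME":   err.mean(),                # mean error (bias)
        "MAE":  np.abs(err).mean(),        # mean absolute error
        "MSE":  (err ** 2).mean(),         # mean squared error
        "corr": np.corrcoef(f, o)[0, 1],   # Pearson correlation
    }
```

A forecast that is uniformly 1 m/s too fast, for example, yields ME = MAE = 1 and a correlation of 1: biased but perfectly associated, which is why the study argues for using accuracy and skill measures together.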
APA, Harvard, Vancouver, ISO, and other styles
6

Hennig, Benjamin D. "Rediscovering the world : gridded cartograms of human and physical space." Thesis, University of Sheffield, 2011. http://etheses.whiterose.ac.uk/1671/.

Full text
Abstract:
'We need new maps' is the central claim made in this thesis. In a world increasingly influenced by human action and interaction, we still rely heavily on mapping techniques that were invented to discover unknown places and explore our physical environment. Although the traditional concept of a map is currently being revived in digital environments, the underlying mapping approaches are not capable of making the complexity of human-environment relationships fully comprehensible. Starting from how people can be put on the map in new ways, this thesis outlines the development of a novel technique that stretches a map according to quantitative data, such as population. The new maps are called gridded cartograms as the method is based on a grid onto which a density-equalising cartogram technique is applied. The underlying grid ensures the preservation of an accurate geographic reference to the real world. It allows the gridded cartograms to be used as basemaps onto which other information can be mapped. This applies to any geographic information from the human and physical environment. As demonstrated through the examples presented in this thesis, the new maps are not limited to showing population as a defining element for the transformation, but can show any quantitative geospatial data, such as wealth, rainfall, or even the environmental conditions of the oceans. The new maps also work at various scales, from a global perspective down to the scale of urban environments. The gridded cartogram technique is proposed as a new global and local map projection that is a viable and versatile alternative to other conventional map projections. The maps based on this technique open up a wide range of potential new applications to rediscover the diverse geographies of the world. They have the potential to allow us to gain new perspectives through detailed cartographic depictions.
APA, Harvard, Vancouver, ISO, and other styles
7

Barthol, Christopher John. "Characterization and Modeling of Non-Volatile SONOS Semiconductor Memories with Gridded Capacitors." The Ohio State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=osu1418813120.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Kotsakis, Christophoros. "Multiresolution aspects of linear approximation methods in Hilbert spaces using gridded data." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape4/PQDD_0016/NQ54794.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Hardy, Benjamin Arik. "A New Method for the Rapid Calculation of Finely-Gridded Reservoir Simulation Pressures." Diss., CLICK HERE for online access, 2005. http://contentdm.lib.byu.edu/ETD/image/etd1123.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Kluver, Daria B. "Characteristics and trends in North American snowfall from a comprehensive gridded data set." Access to citation, abstract and download form provided by ProQuest Information and Learning Company; downloadable PDF file, 159 p, 2007. http://proquest.umi.com/pqdweb?did=1338873801&sid=15&Fmt=1&clientId=8331&RQT=309&VName=PQD.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Swenson, Anthony P. "The Field-Programmable Gate Array Design of the Gridded Retarding Ion Distribution Sensor." DigitalCommons@USU, 2017. https://digitalcommons.usu.edu/etd/6876.

Full text
Abstract:
Mankind's ability to predict weather on Earth has been greatly enhanced by new instrumentation technology, and the ability to predict space weather similarly benefits from new technologies. Just as increasing the number of atmospheric measurements on Earth improved terrestrial weather prediction, many scientists expect that expanding the number of plasma measurements in space could be key to enhancing space weather prediction. Small satellites are one of these new technologies with the potential to greatly enhance our ability to predict space weather: utilizing many low-cost small satellites allows scientists to take data from more locations than is possible with a few high-cost large satellites. Two instruments with a long history of plasma measurement are the Retarding Potential Analyzer and the Ion Drift Meter. Previous work has combined the functionality of both of these instruments into one unit suitable for a small satellite in terms of size, power, and mass. An electrical and mechanical design has been completed to this end, and the new instrument is called the Gridded Retarding Ion Distribution Sensor. This thesis describes the design and testing of the FPGA code that runs this new instrument.
APA, Harvard, Vancouver, ISO, and other styles
12

Weiss, Jeremy, and Michael Crimmins. "Better Coverage of Arizona's Weather and Climate: Gridded Datasets of Daily Surface Meteorological Variables." College of Agriculture, University of Arizona (Tucson, AZ), 2016. http://hdl.handle.net/10150/625414.

Full text
Abstract:
7 pp.
Many areas that use agricultural and environmental science for management and planning – ecosystem conservation, crop and livestock systems, water resources, forestry and wildland fire management, urban horticulture – often need historical records of daily weather for activities that range from modeling forage production to determining the frequency of freezing temperatures or heavy rainfall. In the past, such applications primarily have used station-based observations of meteorological variables like temperature and precipitation. However, weather stations are sparsely and irregularly located throughout Arizona, and due to the highly variable terrain across the state (Figure 1), information recorded at these sites may not represent meteorological conditions at distant, non-instrumented locations or over broad areas. This issue, along with others related to quality, length, and completeness of station records, can hinder the use of weather and climate data for agricultural and natural resources applications. In response to an increasing demand for spatially and temporally complete meteorological data as well as the potential constraints of station-based records, the number of gridded daily surface weather datasets is expanding. This bulletin reviews a current suite of these datasets, particularly those that integrate both atmospheric and topographic information in order to better model temperature and precipitation on relatively fine spatial scales, and is intended for readers with knowledge of weather, climate, and geospatial data. In addition to addressing how these datasets are developed and what their spatial domain and resolution, record length, and variables are, this bulletin also summarizes where and how to access these datasets, as well as the general suitability of these datasets for different uses.
APA, Harvard, Vancouver, ISO, and other styles
13

Yi, Yuchan. "Determination of Gridded Mean Sea Surface From Altimeter Data of Topex, ERS-1 and GEOSAT /." The Ohio State University, 1995. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487931512620891.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Fabbiani-Leon, Angelique Marie. "Comparison method between gridded and simulated snow water equivalent estimates to in-situ snow sensor readings." Thesis, University of California, Davis, 2015. http://pqdtopen.proquest.com/#viewpdf?dispub=1604056.

Full text
Abstract:

California Department of Water Resources (DWR) Snow Surveys Section has explored the potential use of recently developed hydrologic models to estimate snow water equivalent (SWE) for the Sierra Nevada mountain range. The Section's initial step is to determine how well these hydrologic models compare to the trusted regression equations it currently uses. A comparison scheme was ultimately developed between estimation measures for SWE by interpreting model results for the Feather River Basin from: a) the National Aeronautics and Space Administration (NASA) Jet Propulsion Laboratory (JPL) gridded SWE reconstruction product, b) the United States Geological Survey (USGS) Precipitation-Runoff Modeling System (PRMS), and c) DWR Snow Surveys Section regression equations. Daily SWE estimates were extracted from gridded results by computing an average SWE over 1,000 ft elevation band increments from 3,000 to 10,000 ft (i.e., an elevation band would span 3,000 to 4,000 ft). The dates used for processing average SWE estimates were cloud-free satellite image dates during the snow ablation months, March to August, for the years 2000-2012. The average SWE for each elevation band was linearly interpolated to each snow sensor elevation. The model SWE estimates were then compared to the snow sensor readings used to produce the snow index in DWR's regression equations. In addition to comparing JPL's SWE estimates to snow sensor readings, the PRMS SWE variable for select hydrologic response units (HRUs) was also compared to snow sensor readings. The research concluded with the application of statistical methods to determine the reliability of the JPL products and the PRMS-simulated SWE variable, with results varying depending on the time duration analyzed and the elevation range.
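The elevation-band interpolation step can be sketched with `numpy.interp`. The band-average SWE values below are hypothetical, and anchoring each band average at the band midpoint is an assumption for illustration, not necessarily the report's exact convention.

```python
import numpy as np

# Hypothetical band-average SWE (inches) for the 1,000 ft elevation bands
# from 3,000 to 10,000 ft, anchored at band midpoints (3,500 ... 9,500 ft).
band_mid_ft = np.arange(3500, 10000, 1000)
band_swe    = np.array([2.0, 5.5, 9.0, 14.0, 20.0, 26.0, 30.0])

def swe_at_sensor(sensor_elev_ft):
    """Linearly interpolate band-average SWE to a snow sensor's elevation.

    np.interp clamps to the end values outside the band range, so a
    sensor below 3,500 ft gets the lowest band's average.
    """
    return np.interp(sensor_elev_ft, band_mid_ft, band_swe)
```

For example, a sensor at 4,000 ft, halfway between the 3,000-4,000 and 4,000-5,000 ft band midpoints, would be assigned the average of those two band values.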

APA, Harvard, Vancouver, ISO, and other styles
15

Cui, Wenjun, Xiquan Dong, Baike Xi, and Aaron Kennedy. "Evaluation of Reanalyzed Precipitation Variability and Trends Using the Gridded Gauge-Based Analysis over the CONUS." AMER METEOROLOGICAL SOC, 2017. http://hdl.handle.net/10150/625780.

Full text
Abstract:
Atmospheric reanalyses have been used in many studies to investigate the variabilities and trends of precipitation because of their global coverage and long record; however, their results must be properly analyzed and their uncertainties must be understood. In this study, precipitation estimates from five global reanalyses [ERA-Interim; MERRA, version 2 (MERRA2); JRA-55; CFSR; and 20CR, version 2c (20CRv2c)] and one regional reanalysis (NARR) are compared against the CPC Unified Gauge-Based Analysis (CPCUGA) and GPCP over the contiguous United States (CONUS) during the period 1980-2013. Reanalyses capture the variability of the precipitation distribution over the CONUS as observed in CPCUGA and GPCP, but large regional and seasonal differences exist. Compared with CPCUGA, global reanalyses generally overestimate the precipitation over the western part of the country throughout the year and over the northeastern CONUS during the fall and winter seasons. These issues may be associated with the difficulties models have in accurately simulating precipitation over complex terrain and during snowfall events. Furthermore, systematic errors found in the five global reanalyses suggest that their physical processes in modeling precipitation need to be improved. Even though negative biases exist in NARR, its spatial variability is similar to both CPCUGA and GPCP; this is anticipated because it assimilates observed precipitation, unlike the global reanalyses. Based on CPCUGA, there is an average decreasing trend of −1.38 mm yr⁻¹ over the CONUS, which varies depending on the region, with only the north-central to northeastern parts of the country having positive trends. Although all reanalyses exhibit interannual variation similar to that observed in CPCUGA, their estimated precipitation trends, both linear and spatial, are distinct from CPCUGA.
APA, Harvard, Vancouver, ISO, and other styles
16

Meredith, Shaun Lee. "Construction of a gridded energy analyzer for measurements of ion energy distribution in the versatile toroidal facility." Thesis, Massachusetts Institute of Technology, 1998. http://hdl.handle.net/1721.1/50545.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Nuclear Engineering, 1998.
Includes bibliographical references (p. 80-82).
The Versatile Toroidal Facility (VTF) at MIT's Plasma Science and Fusion Center provides a laboratory environment for studying ionospheric plasmas. Various plasma diagnostic devices have been created and used to study the VTF plasma since 1991, but an accurate method for measuring VTF's ion characteristics had never been designed or installed in the facility. Gridded Energy Analyzers (GEAs) are useful diagnostic tools for determining plasma ion energy distributions and ion temperature. Research was done on the theory behind Gridded Energy Analyzers and their applicability for use in the Versatile Toroidal Facility. A design and method for constructing a miniaturized GEA for VTF was developed and documented. The construction method covers material selection, machining, and assembly of VTF's miniature GEA. The miniature GEA, approximately 3 cm in diameter, is a non-perturbing probe used in VTF's plasma. The GEA was constructed and preliminary experimental data were obtained. From these data, VTF's ion temperature was found to be approximately 8 eV, and the ion distribution function was determined to be roughly Maxwellian in nature.
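The ion temperature in this kind of measurement is typically extracted from the retarding region of the GEA's current-voltage characteristic, where a Maxwellian population gives I(V) = I₀ exp(−V/T_i) with T_i expressed in volts (i.e., eV). The sketch below is a standard log-linear fit applied to a synthetic characteristic, not the thesis's own analysis code.

```python
import numpy as np

def fit_ion_temperature_eV(volts, current):
    """Estimate T_i (eV) from the exponentially decaying part of a
    GEA I-V characteristic via a log-linear least-squares fit."""
    v = np.asarray(volts, dtype=float)
    i = np.asarray(current, dtype=float)
    slope, _ = np.polyfit(v, np.log(i), 1)   # log I = log I0 - V / T_i
    return -1.0 / slope

# Synthetic characteristic with T_i = 8 eV, the value reported for VTF
v = np.linspace(0.0, 40.0, 50)
i = 1e-3 * np.exp(-v / 8.0)
```

On real data one would restrict the fit to the retarding region above the plasma potential and below the noise floor; the synthetic data here are noiseless for clarity.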
by Shaun L. Meredith.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
17

Derksen, Christopher Peter. "Integrating passive microwave remotely sensed imagery and gridded atmospheric data, a study of North American Prairie snow cover." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp05/NQ65231.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Hofstra, Nynke. "Development and evaluation of a European daily high-resolution gridded dataset of surface temperature and precipitation for 1950-2006." Thesis, University of Oxford, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.504022.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Nazemi, Jonathan H. (Jonathan Hesam) 1974. "Determination of the ion distribution function during magnetic reconnection in the versatile toroidal facility with a gridded energy analyzer." Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/16906.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Physics, 2002.
Includes bibliographical references (leaf 44).
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
A gridded energy analyzer (GEA) diagnostic and associated electronics are designed and built to explore the evolution of the ion distribution function during driven magnetic reconnection in the Versatile Toroidal Facility. The temporal evolution of the ion characteristic is measured at different locations throughout the reconnection region, for a number of magnetic field configurations. The measured ion characteristics are found to be in excellent agreement with a theoretical fit constructed from a double Maxwellian distribution, from which the temperatures and drift velocities are found as functions of space and time. It is found that the ion temperature of each Maxwellian exhibits minor temporal variations during reconnection, which are negligible in terms of ion heating. Additionally, the temperatures do not change significantly with varying radial position or magnetic cusp strength. The drift velocities are observed to evolve in time, to scale with magnetic cusp strength, and to depend on the exact location within the reconnection region. The ions are thus subject to acceleration by the electric field induced by the ohmic drive. The appearance of a double Maxwellian distribution during the reconnection drive is hypothesized to be due to double ionization of argon atoms; repetition of the experiment with a hydrogen plasma verified that this scenario is most probable.
by Jonathan H. Nazemi.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
20

Caetano, André Francisco Morielo. "Griddler : uma estratégia configurável para armazenamento distribuído de objetos peer-to-peer que combina replicação e erasure coding com sistema de cache /." São José do Rio Preto, 2017. http://hdl.handle.net/11449/151383.

Full text
Abstract:
Advisor: Carlos Roberto Valêncio
Committee: Geraldo Francisco Donega Zafalon
Committee: Pedro Luiz Pizzigatti Correa
Abstract: Database management systems, in essence, aim to ensure the reliable storage of information. It is also the task of a database management system to provide agility in accessing information. In this context, it is of great interest to consider some recent phenomena: the progressive generation of unstructured content such as images and video, the consequent increase in the volume of data in digital format across the most diverse media, and the large number of requests from increasingly demanding users. These phenomena are part of a new reality, named Big Data, that imposes on database designers increased requirements for flexibility, scalability, resiliency, and speed in their systems. To support unstructured data, it was necessary to shed some limitations of conventional databases and define new storage architectures. These architectures define standards for data management, but a storage system must have its specifics adjusted at each level of implementation. In terms of scalability, for example, there is a choice between systems with some degree of centralization and totally decentralized ones. In terms of resiliency, some solutions use a replication scheme to preserve data integrity through copies, while other techniques aim to optimize the volume of stored data. Finally, as new network and disk technologies are developed, caching can be used to optimize access to what is stored. This work explores and analyzes the different levels in the development of distributed storage systems. Its objective is to present an architecture that combines different resilience techniques. The scientific contribution of this work is, in addition to a totally decentralized approach to data allocation, the use of an access cache structure with adaptive algorithms in this environment.
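As a sketch of the trade-off between the two resilience techniques the thesis combines, the following compares the raw-storage overhead of n-way replication with that of a (k data + m parity) erasure code. The parameter values are illustrative, not taken from the thesis.

```python
def replication_overhead(copies):
    """Raw bytes stored per logical byte with simple replication.

    n-way replication tolerates the loss of n - 1 copies at a cost of
    n times the logical data size.
    """
    return float(copies)

def erasure_overhead(k, m):
    """Raw bytes stored per logical byte with a (k, m) erasure code.

    The object is split into k data fragments plus m parity fragments;
    any k of the k + m fragments suffice to rebuild it, so the code
    tolerates the loss of any m fragments at (k + m) / k overhead.
    """
    return (k + m) / k

# 3-way replication tolerates 2 losses at 3.0x overhead;
# a (10, 4) code tolerates 4 losses at only 1.4x overhead.
```

This storage-versus-repair trade-off (erasure coding is cheaper on disk but costlier to reconstruct) is one motivation for combining both schemes with a cache, as the thesis proposes.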
Master's
APA, Harvard, Vancouver, ISO, and other styles
21

Caetano, André Francisco Morielo [UNESP]. "Griddler: uma estratégia configurável para armazenamento distribuído de objetos peer-to-peer que combina replicação e erasure coding com sistema de cache." Universidade Estadual Paulista (UNESP), 2017. http://hdl.handle.net/11449/151383.

Full text
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Database management systems, in essence, aim to ensure the reliable storage of information. It is also the task of a database management system to provide agile access to that information. In this context, it is of great interest to consider some recent phenomena: the progressive generation of unstructured content such as images and video, the consequent increase in the volume of data in digital format across the most diverse media, and the large number of requests from increasingly demanding users. These phenomena are part of a new reality, named Big Data, that imposes on database designers increased requirements of flexibility, scalability, resiliency, and speed for their systems. To support unstructured data, it was necessary to shed some limitations of conventional databases and define new storage architectures. These architectures define standards for data management, but a storage system must have its specificities adjusted at each level of implementation. In terms of scalability, for example, one must choose between systems with some type of centralization and totally decentralized ones. In terms of resiliency, some solutions use a replication scheme to preserve the integrity of the data through copies, while other techniques aim at optimizing the volume of stored data. Finally, as new network and disk technologies are developed, caching can be used to optimize access to what is stored. This work explores and analyzes the different levels in the development of distributed storage systems. Its objective is to present an architecture that combines different resilience techniques. The scientific contribution of this work is, in addition to a totally decentralized data-allocation scheme, the use of an access cache structure with adaptive algorithms in this environment.
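Totally decentralized data allocation of the kind this abstract describes is commonly implemented with consistent hashing. The thesis's own algorithm is not reproduced here, so the sketch below is only an illustrative minimal version (the class, node names, and virtual-node count are invented for the example):

```python
import bisect
import hashlib

def _hash(key: str) -> int:
    # Stable 64-bit value derived from MD5 (an illustrative choice).
    return int(hashlib.md5(key.encode()).hexdigest()[:16], 16)

class HashRing:
    """Minimal consistent-hash ring with virtual nodes."""

    def __init__(self, nodes, vnodes=64):
        # Each physical node is placed on the ring many times ("virtual
        # nodes") to spread load more evenly.
        self._ring = sorted(
            (_hash(f"{node}#{i}"), node)
            for node in nodes
            for i in range(vnodes)
        )
        self._keys = [h for h, _ in self._ring]

    def locate(self, key: str) -> str:
        # The owner is the first virtual node clockwise from the key's hash.
        idx = bisect.bisect(self._keys, _hash(key)) % len(self._ring)
        return self._ring[idx][1]

ring = HashRing(["node-a", "node-b", "node-c"])
owner = ring.locate("object-42")
```

Looking up the same key always returns the same node, and adding or removing a node remaps only roughly 1/N of the keys, which is what makes the scheme attractive for fully decentralized storage.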
APA, Harvard, Vancouver, ISO, and other styles
22

Dobkevicius, Mantas. "Modelling and design of inductively coupled radio frequency gridded ion thrusters with an application to Ion Beam Shepherd type space missions." Thesis, University of Southampton, 2017. https://eprints.soton.ac.uk/413768/.

Full text
Abstract:
Recently proposed space missions such as Darwin, LISA and NGGM have encouraged the development of electric propulsion thrusters capable of operating in the micro-Newton (µN) thrust range. To meet these requirements, radio frequency (RF) gridded ion thrusters need to be scaled down to a few centimetres in size. Due to the small size of these thrusters, it is important to accurately determine the thermal and performance parameters. To achieve this, an RF ion thruster model has been developed, composed of plasma discharge, 2D axisymmetric ion extraction, 3D electromagnetic, 3D thermal and RF circuit models. The plasma discharge model itself is represented using 0D global, 2D axisymmetric and 3D molecular neutral gas, and Boltzmann electron transport sub-models. This is the first time such a holistic/comprehensive model has been created. The model was successfully validated against experimental data from the RIT 3.5 thruster, developed for the NGGM mission. Afterwards, the computational model was used to design an RF gridded ion thruster for an Ion Beam Shepherd (IBS) type space debris removal mission. Normally, the IBS method requires two thrusters: one for impulse transfer (IT) and one for impulse compensation (IC). This thesis proposes a novel thruster concept for IBS type missions in which a single Double-Sided Thruster (DST), simultaneously producing ion beams for the IT and IC purposes, is used. The advantage of the DST design is that it requires approximately half the RF power compared with two single-ended thrusters, and it has a much simpler sub-system architecture, lower cost, and lower total mass. Such a DST thruster was designed, built and tested, with the requirements and constraints taken from the LEOSWEEP space debris removal mission. During the experimental campaign, a successful extraction of two ion beams was achieved.
The thesis has shown that it is possible to control the thrust magnitudes from the IT and IC sides by varying the number of apertures in each ion optics system, proving that the DST concept is a viable alternative for the LEOSWEEP mission.
APA, Harvard, Vancouver, ISO, and other styles
23

Mao, Ganquan [Verfasser], and Harald [Akademischer Betreuer] Kunstmann. "Stochastic Bias Correction of Dynamically Downscaled Precipitation Fields for Germany Through Copula-based Integration of Gridded Observation Data / Ganquan Mao. Betreuer: Harald Kunstmann." Augsburg : Universität Augsburg, 2016. http://d-nb.info/111353480X/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Baro, Johanna. "Modélisation multi-échelles de la morphologie urbaine à partir de données carroyées de population et de bâti." Thesis, Paris Est, 2015. http://www.theses.fr/2015PEST1004/document.

Full text
Abstract:
For the past two decades, the relationships between urban form and travel patterns have been central to reflection on sustainable urban planning and transport policy. The increasing distribution of regular grid data offers, in this context, a new perspective for modeling urban structures from density measurements freed from the constraints of administrative divisions. Population density data are now available on 200-metre grids covering France. We complete these data with built-area densities in order to propose two types of classified images adapted to the study of travel patterns and urban development: classifications of urban fabrics and classifications of morphotypes of urban development. The construction of such classified images is based on theoretical and experimental modeling, which raises methodological issues regarding the classification of statistically diverse urban spaces. To process these spaces exhaustively, we propose a per-pixel classification method for urban fabrics based on supervised transfer learning. Hidden Markov random fields are used to take into account the dependencies in the spatial data. The classifications into morphotypes are then obtained by enriching these first classified images; this enrichment is formalized from chorematic theoretical models and implemented by qualitative spatial reasoning. The analysis of these classifications by methods of quantitative spatial reasoning and factor analysis allowed us to reveal the morphological diversity of 50 French metropolitan areas. It highlights the relevance of these classifications for characterizing urban areas in accordance with various planning issues related to density or multipolar development.
APA, Harvard, Vancouver, ISO, and other styles
25

Bek, Jeremy. "Design, simulation, and testing of an electric propulsion cluster frame." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-300970.

Full text
Abstract:
In general, electric propulsion offers very high efficiency but relatively low thrust. To remedy this, several ion engines can be assembled in a clustered configuration and operated in parallel. This requires the careful design of a frame to accommodate the individual propulsion systems. The frame must be modular so it can be used in different cluster sizes, and it must meet thermal and mechanical requirements to ensure the nominal operation of the thrusters. The present report aims to show the design process of such a frame, from preliminary modelling to the experimental study of a prototype. This document features an overview of the iterative design process driven by thermal simulations rendered in COMSOL Multiphysics. This process led to the conception of a 2-thruster and a 4-thruster cluster frame. A lumped-parameter model of the electric propulsion system was also created to model its complex thermal behaviour. In addition, the 2-thruster frame was studied mechanically with analytical calculations and simulations of simple load cases in SolidWorks. Lastly, a prototype based on the 2-thruster frame model was assembled. The prototype was used to conduct temperature measurements while hosting two operating thrusters inside a vacuum chamber. The temperature distribution in the cluster was measured and compared to simulation results. Thermal simulations of the 2-thruster and 4-thruster frames showed promising results, while mechanical simulations of the 2-thruster version met all requirements. Moreover, experimental results largely agreed with thermal simulations of the prototype. Finally, the lumped-element model proved instrumental in calibrating the models, thanks to its high flexibility and quick computation time.
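A lumped-parameter (lumped-element) thermal model of the kind the report describes treats each component as a single node with a heat capacity, connected to other nodes and to the ambient by thermal conductances. The minimal sketch below, with made-up capacities, conductances, and heat loads rather than the report's calibrated values, integrates such a network with explicit Euler:

```python
import numpy as np

def simulate_lumped_thermal(C, G, G_amb, Q, T_amb=300.0, dt=0.1, steps=50000):
    """Explicit-Euler integration of a lumped thermal network.

    C     : heat capacity per node [J/K]
    G     : {(i, j): conductance [W/K]} between node pairs
    G_amb : conductance from each node to ambient [W/K] (linearized losses)
    Q     : constant heat input per node [W]
    """
    n = len(C)
    A = np.zeros((n, n))
    for (i, j), g in G.items():
        # Symmetric conduction term between nodes i and j.
        A[i, i] += g; A[j, j] += g
        A[i, j] -= g; A[j, i] -= g
    C = np.asarray(C, float)
    G_amb = np.asarray(G_amb, float)
    Q = np.asarray(Q, float)
    T = np.full(n, T_amb)             # start at ambient temperature
    for _ in range(steps):
        dT = (Q - A @ T - G_amb * (T - T_amb)) / C
        T = T + dt * dT
    return T

# Two-node toy cluster: a thruster node heated with 10 W, coupled to a
# passive frame node that leaks heat to the chamber walls.
T = simulate_lumped_thermal(
    C=[50.0, 200.0],        # J/K
    G={(0, 1): 0.5},        # W/K thruster-frame link
    G_amb=[0.1, 0.4],       # W/K losses to ambient
    Q=[10.0, 0.0],          # W
)
```

At steady state this toy network settles near 331 K for the heated node and 317 K for the frame, which is the kind of balance such a model lets one check quickly before running a full COMSOL simulation.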
APA, Harvard, Vancouver, ISO, and other styles
26

Holzer, Nicolai. "Development of an interface for the conversion of geodata in a NetCDF data model and publication of this data by the use of the web application DChart, related to the CEOP-AEGIS project." Master's thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2011. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-71492.

Full text
Abstract:
The Tibetan Plateau, with an extent of about 2.5 million square kilometers at an average altitude higher than 4,700 meters, has a significant impact on the Asian monsoon and regulates with its snow and ice reserves the upstream headwaters of seven major south-east Asian rivers. Over 1.4 billion people, the agriculture, the economy, and the entire ecosystem in this region depend upon the water supply of these rivers. As the increasing number of floods and droughts shows, these seasonal water reserves are likely to be influenced by climate change, with negative effects for the downstream water supply and subsequently for food security. The international cooperation project CEOP-AEGIS – funded by the European Commission under the Seventh Framework Programme – therefore aims to improve the knowledge of the hydrology and meteorology of the Qinghai-Tibetan Plateau to further understand its role in climate, monsoon and increasingly extreme meteorological events. Within the framework of this project, a large variety of earth observation datasets from remote sensing products, model outputs and in-situ ground station measurements are collected and evaluated. All foreground products of CEOP-AEGIS will be made available to the scientific community through an online data repository, which is a contribution to the Global Earth Observation System of Systems (GEOSS). The back-end of the CEOP-AEGIS Data Portal relies on a Dapper OPeNDAP web server that serves data stored in the NetCDF file format to a DChart client front-end as web-based user interface. Data from project partners are heterogeneous in content, and also in type of storage and metadata description. However, NetCDF project output data and metadata have to be standardized and must follow international conventions to achieve a high level of interoperability.
Out of these needs, the capabilities of NetCDF, OPeNDAP, Dapper and DChart were thoroughly evaluated in order to take correct decisions for implementing a suitable and interoperable NetCDF data model for CEOP-AEGIS data that allows a maximum of compatibility and functionality with OPeNDAP and Dapper / DChart as well. This NetCDF implementation is part of a newly developed upstream data interface that converts and aggregates heterogeneous input data of project partners into standardized NetCDF datasets, so that they can be fed via OPeNDAP to the CEOP-AEGIS Data Portal based on the Dapper / DChart technology. A particular focus in the design of this data interface was placed on an intermediate data and metadata representation that allows its elements to be modified easily, with the aim of producing standardized NetCDF files in a simple way. Considering the extensive variety and amount of data within this project, it was essential to properly design a data interface that converts heterogeneous input data of project partners into standardized and aggregated NetCDF output files in order to ensure maximum compatibility and functionality within the CEOP-AEGIS Data Portal and, subsequently, interoperability within the scientific community.
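Standardizing NetCDF metadata against conventions such as CF can be automated with a simple validator over an intermediate data/metadata representation. The sketch below is a generic illustration only; the required-attribute subset and the example dataset contents are invented, not the project's actual schema:

```python
# Minimal intermediate representation: each variable must carry the
# attributes a CF-style convention expects (illustrative subset only).
REQUIRED_VAR_ATTRS = ("units", "long_name")
REQUIRED_GLOBAL_ATTRS = ("title", "institution", "Conventions")

def validate_cf_metadata(dataset: dict) -> list:
    """Return a list of human-readable problems; empty means it looks valid."""
    problems = []
    for attr in REQUIRED_GLOBAL_ATTRS:
        if attr not in dataset.get("attrs", {}):
            problems.append(f"missing global attribute: {attr}")
    for name, var in dataset.get("variables", {}).items():
        for attr in REQUIRED_VAR_ATTRS:
            if attr not in var.get("attrs", {}):
                problems.append(f"variable {name}: missing attribute {attr}")
        for dim in var.get("dims", ()):
            if dim not in dataset.get("dimensions", {}):
                problems.append(f"variable {name}: undefined dimension {dim}")
    return problems

example = {
    "attrs": {"title": "LST composite", "institution": "example",
              "Conventions": "CF-1.6"},
    "dimensions": {"time": 12, "lat": 180, "lon": 360},
    "variables": {
        "lst": {"dims": ("time", "lat", "lon"),
                "attrs": {"units": "K",
                          "long_name": "land surface temperature"}},
    },
}
issues = validate_cf_metadata(example)
```

Running every partner's converted dataset through such a check before publication is one simple way to guarantee the "high level of interoperability" the abstract calls for.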
APA, Harvard, Vancouver, ISO, and other styles
27

Sexton, Christopher G. "Vectorization of gridded urban land use data." 2007. http://www.lib.ncsu.edu/theses/available/etd-11082007-161933/unrestricted/etd.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Hsu, Yi-hsiang, and 許益祥. "An Effective Crosstalk Optimizer in Gridded Channel Routing." Thesis, 2002. http://ndltd.ncl.edu.tw/handle/15240217195902731878.

Full text
Abstract:
Master's
Chung Yuan Christian University
Institute of Electronic Engineering
90
Wire-to-wire spacing in a VLSI chip has become smaller as fabrication technology has rapidly evolved in recent years. As VLSI design enters the deep sub-micron era, feature sizes continue to decrease and operating frequencies keep rising, and many digital and analog circuits are integrated into a single chip. When the circuits operate in a low-voltage system, the chip becomes very sensitive to noise. Crosstalk arises from the coupling capacitance between two parallel wires and has become a major source of signal delay: the coupling effect slows signal transitions, and crosstalk can thus cause functional faults in the chip. In this thesis, we propose a timing-driven gridded channel routing algorithm for the minimization of crosstalk. Compared with previous works, the main distinction of our approach is that it considers signal arrival times, so that we can precisely estimate the actual delay degradation caused by the crosstalk effect. The algorithm was implemented in C++ and run on different benchmark circuits. Experimental data consistently show that our approach is effective and efficient for crosstalk minimization.
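The key idea of timing-driven crosstalk estimation is that coupling capacitance grows with the parallel run length of adjacent wires, but it only degrades delay when the two nets switch at overlapping times. The small cost function below illustrates this; the interval representation and the `c_unit` constant are our own simplifications, not the thesis's actual delay model:

```python
def overlap(a, b):
    """Length of the overlap of two closed intervals given as (lo, hi)."""
    return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

def crosstalk_cost(net_a, net_b, c_unit=1.0):
    """Timing-aware coupling cost between two nets on adjacent tracks.

    Each net is {"span": (x_lo, x_hi) horizontal extent,
                 "switch": (t_lo, t_hi) switching window from arrival times}.
    Coupling grows with the parallel run length, but it only degrades
    delay when the two switching windows overlap in time.
    """
    run = overlap(net_a["span"], net_b["span"])
    if run == 0.0 or overlap(net_a["switch"], net_b["switch"]) == 0.0:
        return 0.0
    return c_unit * run

a = {"span": (0.0, 10.0), "switch": (1.0, 3.0)}
b = {"span": (4.0, 12.0), "switch": (2.5, 4.0)}
cost = crosstalk_cost(a, b)  # parallel run of 6.0 with overlapping windows
```

A purely geometric cost would penalize these two nets even if their switching windows were disjoint; gating the penalty on arrival times is what distinguishes the timing-driven formulation.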
APA, Harvard, Vancouver, ISO, and other styles
29

Seglenieks, Frank. "Creation of a gridded time series of hydrological variables for Canada." Thesis, 2009. http://hdl.handle.net/10012/4513.

Full text
Abstract:
There is a lack of measured, long-term, reliable, and well-distributed hydrological variables in Canada. These hydrological variables include, but are not limited to: temperature, precipitation, ground runoff, evapotranspiration, soil moisture, and snow water equivalent. The objective of this thesis was to establish the best possible distributed estimates of these hydrological variables for Canada over the period of 1961-2000. The first step was to interpolate measured temperature and precipitation across the country. These interpolated values were then used to calculate the other hydrological variables using the Waterloo Flood Forecasting Model (WATFLOOD). The Waterloo Mapping technique (WATMAP) was developed to use topographic and land cover databases to automatically and systematically derive the information needed to create the drainage database. WATFLOOD was calibrated with the Dynamically Dimensioned Search (DDS) algorithm using the difference between the measured and simulated streamflow as the objective function. After a final calibration of 100 separate DDS runs, distributed time series for the hydrological variables were created. A simple assessment was made for the predictive uncertainty in the simulated streamflow results based on the results of the final calibration. As well, the implications of various climate change scenarios were examined in the context of how they would change the hydrological variables. The major recommendations for future study included: finding other gridded datasets that could be used to verify the ones that were created in this study and examining further the magnitudes of the different kinds of predictive uncertainty (data, model, and parameter). The results of this thesis fit in well with the goals of the study on Predictions in Ungauged Basins. This thesis was organized along the principle of “design the process, not the product”. 
As such, although a set of final products is presented at the end, the most important part of the thesis was the process that produced these products. Thus it is not assumed that every technique designed in this thesis will be applicable to every other researcher, but it is hoped that most researchers in the field will be able to use at least some parts of the techniques developed here.
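The first step mentioned above, interpolating measured temperature and precipitation from stations onto a grid, can be illustrated with simple inverse-distance weighting. The thesis does not specify this exact scheme, so the function below is an assumption-laden sketch:

```python
def idw_grid(stations, grid_points, power=2.0):
    """Inverse-distance-weighted interpolation of station values onto points.

    stations    : list of (x, y, value) measurement locations
    grid_points : list of (x, y) target grid cells
    """
    out = []
    for gx, gy in grid_points:
        num = den = 0.0
        exact = None
        for sx, sy, v in stations:
            d2 = (gx - sx) ** 2 + (gy - sy) ** 2
            if d2 == 0.0:
                exact = v          # grid point coincides with a station
                break
            w = 1.0 / d2 ** (power / 2.0)
            num += w * v
            den += w
        out.append(exact if exact is not None else num / den)
    return out

# Two stations on a line: the midpoint gets the average of their values.
stations = [(0.0, 0.0, 10.0), (1.0, 0.0, 20.0)]
vals = idw_grid(stations, [(0.0, 0.0), (0.5, 0.0)])
```

Real gridding of Canadian temperature and precipitation would add elevation corrections and cross-validation, but the weighting principle is the same.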
APA, Harvard, Vancouver, ISO, and other styles
30

Σαλαμαλίκης, Βασίλειος. "Ανάπτυξη και αξιολόγηση μεθοδολογίας για τη δημιουργία πλεγματικών (gridded) ισοτοπικών δεδομένων." Thesis, 2010. http://nemertes.lis.upatras.gr/jspui/handle/10889/4252.

Full text
Abstract:
Several climatic, hydrological and environmental studies require accurate knowledge of the spatial distribution of stable isotopes in precipitation. Since the number of rain sampling stations for isotope analysis is small and not evenly distributed around the globe, the global distribution of stable isotopes can be calculated via the production of gridded isotopic data sets. Several methods have been proposed for this purpose. Some of them use empirical equations and geostatistical methods in order to minimize potential errors due to interpolation. In this work a methodology is proposed for the development of 10΄ × 10΄ gridded isotopic data of precipitation in the Central and Eastern Mediterranean. Statistical models are developed taking into account geographical and meteorological parameters as independent variables. The initial methodology takes into account only the altitude and latitude of an area. Since, however, the isotopic composition of precipitation also depends on longitude, the existing models have been modified by adding meteorological parameters as independent variables as well. A series of models is proposed taking into account some or a combination of the above-mentioned variables. The models are validated using the Thin Plate Smoothing Splines (TPSS) and Ordinary Kriging (OK) methods.
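For illustration, a basic thin-plate spline interpolator of the kind underlying the TPSS validation can be written in a few lines of NumPy. The station coordinates and the δ18O-like values below are made up, and this exact-interpolation form omits the smoothing parameter of full TPSS:

```python
import numpy as np

def tps_fit(points, values):
    """Fit a 2-D thin-plate spline
    f(x, y) = sum_i w_i * phi(|p - p_i|) + a0 + a1*x + a2*y,
    with phi(r) = r^2 log r (and phi(0) = 0). Returns a callable."""
    pts = np.asarray(points, float)
    n = len(pts)

    def phi(r):
        with np.errstate(divide="ignore", invalid="ignore"):
            return np.where(r > 0, r * r * np.log(r), 0.0)

    # Assemble the standard TPS linear system [[K, P], [P^T, 0]].
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    K = phi(d)
    P = np.hstack([np.ones((n, 1)), pts])
    A = np.zeros((n + 3, n + 3))
    A[:n, :n] = K
    A[:n, n:] = P
    A[n:, :n] = P.T
    b = np.concatenate([np.asarray(values, float), np.zeros(3)])
    coef = np.linalg.solve(A, b)
    w, a = coef[:n], coef[n:]

    def f(x, y):
        r = np.linalg.norm(pts - np.array([x, y], float), axis=-1)
        return float(phi(r) @ w + a[0] + a[1] * x + a[2] * y)

    return f

# Four hypothetical stations with linearly varying isotopic values.
f = tps_fit([(0, 0), (1, 0), (0, 1), (1, 1)], [-4.0, -6.0, -5.0, -7.0])
```

Because these toy values vary exactly linearly with position, the spline reduces to its polynomial part and interpolates the midpoint to -5.5.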
APA, Harvard, Vancouver, ISO, and other styles
31

Hsu, Pei-Hsuan, and 徐沛諠. "Cut Redistribution and Guiding Template Assignment Using DSA for 1D Gridded Layout." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/72854648861832210698.

Full text
Abstract:
Master's
Chung Yuan Christian University
Institute of Information Engineering
105
Many semiconductor companies are gradually transitioning from complex 2D design to highly regular 1D gridded design. In this transition, the placement of cuts has become the most important challenge. For more advanced nanometer designs, cuts may be too dense to be printed by 193 nm immersion (193i) lithography. Many techniques have been proposed, e.g. self-aligned double patterning (SADP), high-resolution E-beam lithography (EBL) and directed self-assembly (DSA). DSA is an option with great potential that has stood out in recent years. In 1D gridded design, metal cuts divide the dense parallel lines into real wires and dummy wires. Real wires implement the function of the target circuit, and the remaining lines are called dummy wires. In order to maintain the original layout function, a cut location can only be moved within a dummy wire. In the layout, if the distance between two cuts on the same real wire is less than the design rule and the cuts do not belong to the same DSA template, this is called a "conflict"; the conditions for a conflict between two templates are similar. In this thesis, we discuss how to pick appropriate DSA patterns to make the cuts and produce a 1D gridded design with the same function as the original circuit. With the assignment of a different guiding template, the DSA pattern shape or size follows that assignment. In our experiments, a simulated annealing algorithm is applied to pick patterns so as to minimize the number of conflicts. In the post-processing phase after SA-based matching, the number of double cuts is maximized. In terms of the number of conflicts, we achieve an improvement rate of 104150% compared to the result of Xiao et al. [14], and a 56% improvement compared to Li's study [7]. In the double-cut stage, the number of cuts increases by an average of 12.18592593%.
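The SA-based matching phase can be sketched as a generic simulated-annealing loop over template choices, with the conflict count as the cost function. Everything below (the move type, cooling schedule, and the toy conflict function) is an illustrative assumption, not the thesis's actual implementation:

```python
import math
import random

def anneal(assignment, candidates, conflicts,
           t0=5.0, cooling=0.995, steps=20000, seed=1):
    """Generic simulated annealing for template assignment (a sketch).

    assignment : dict cut -> chosen template index (mutated in place)
    candidates : dict cut -> number of available template choices
    conflicts  : function(assignment) -> conflict count to minimize
    """
    rng = random.Random(seed)
    best = dict(assignment)
    best_cost = cost = conflicts(assignment)
    t = t0
    for _ in range(steps):
        cut = rng.choice(list(assignment))
        old = assignment[cut]
        assignment[cut] = rng.randrange(candidates[cut])  # random move
        new_cost = conflicts(assignment)
        # Accept improvements always, worsenings with Boltzmann probability.
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / t):
            cost = new_cost
            if cost < best_cost:
                best, best_cost = dict(assignment), cost
        else:
            assignment[cut] = old  # undo rejected move
        t *= cooling
    return best, best_cost

# Toy instance: two neighbouring cuts conflict unless they pick the
# same template choice.
cand = {"c1": 2, "c2": 2}
conf = lambda a: 0 if a["c1"] == a["c2"] else 1
best, n_conf = anneal({"c1": 0, "c2": 1}, cand, conf)
```

The post-processing step that maximizes double cuts would then run over the conflict-free assignment returned here.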
APA, Harvard, Vancouver, ISO, and other styles
32

黎鈺琦. "Hybrid Lithography Optimization with DSA and E-beam for 1D Gridded Layout." Thesis, 2014. http://ndltd.ncl.edu.tw/handle/34445087566108858225.

Full text
Abstract:
Master's
National Tsing Hua University
Department of Computer Science
103
As the integrated circuit industry continues to scale down the minimum feature size, design style is moving from complex 2D design to highly regular 1D design with the "lines and cuts" process. While dense lines can be fully optimized thanks to their regularity, randomly distributed cuts become a major challenge for the fabrication of 1D layout. In this thesis, we consider a hybrid lithography process combining directed self-assembly (DSA) and E-beam for 1D gridded layout, and present a cut redistribution algorithm to maximize the number of cuts matched with DSA templates and minimize the number of cuts printed by E-beam. The robustness of our algorithm is demonstrated through encouraging experimental results.
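One simple way to realize the stated objective, maximizing cuts matched with DSA templates and leaving the rest to E-beam, is a greedy left-to-right pairing of cuts whose feasible position ranges admit a legal template spacing. The sketch below is our own illustration, not the algorithm from the thesis:

```python
def assign_templates(cuts, dmin, dmax):
    """Greedy left-to-right pairing of cuts into double-cut DSA templates.

    cuts : sorted list of (lo, hi) feasible positions; each cut may slide
           within its dummy-wire segment. Two cuts fit one template if they
           can be placed between dmin and dmax apart. Unpaired cuts fall
           back to E-beam printing.
    Returns (pairs, singles) as lists of cut indices.
    """
    pairs, singles = [], []
    i = 0
    while i < len(cuts):
        if i + 1 < len(cuts):
            lo1, hi1 = cuts[i]
            lo2, hi2 = cuts[i + 1]
            # Achievable separations span [lo2 - hi1, hi2 - lo1]; pair the
            # cuts if that range intersects the legal window [dmin, dmax].
            if lo2 - hi1 <= dmax and hi2 - lo1 >= dmin:
                pairs.append((i, i + 1))
                i += 2
                continue
        singles.append(i)
        i += 1
    return pairs, singles

cuts = [(0, 2), (3, 5), (40, 41), (100, 101), (103, 104)]
pairs, singles = assign_templates(cuts, dmin=1, dmax=6)
```

Here the first two and the last two cuts can share templates, while the isolated middle cut must be written by E-beam; an optimal redistribution algorithm would solve this matching globally rather than greedily.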
APA, Harvard, Vancouver, ISO, and other styles
33

Erzeybek, Balan Selin. "Characterization and modeling of paleokarst reservoirs using multiple-point statistics on a non-gridded basis." 2012. http://hdl.handle.net/2152/19604.

Full text
Abstract:
Paleokarst reservoirs consist of complex cave networks, which are formed by various mechanisms and associated collapsed cave facies. Traditionally, cave structures are defined using variogram-based methods in flow models and this description does not precisely represent the reservoir geology. Algorithms based on multiple-point statistics (MPS) are widely used in modeling complex geologic structures. Statistics required for these algorithms are inferred from gridded training images. However, structures like modern cave networks are represented by point data sets. Thus, it is not practical to apply rigid and gridded templates and training images for the simulation of such features. Therefore, a quantitative algorithm to characterize and model paleokarst reservoirs based on physical and geological attributes is needed. In this study, a unique non-gridded MPS analysis and pattern simulation algorithms are developed to infer statistics from modern cave networks and simulate distribution of cave structures in paleokarst reservoirs. Non-gridded MPS technique is practical by eliminating use of grids and gridding procedure, which is challenging to apply on cave network due to its complex structure. Statistics are calculated using commonly available cave networks, which are only represented by central line coordinates sampled along the accessible cave passages. Once the statistics are calibrated, a cave network is simulated by using a pattern simulation algorithm in which the simulation is conditioned to sparse data in the form of locations with cave facies or coordinates of cave structures. To get an accurate model for the spatial extent of the cave facies, an algorithm is also developed to simulate cave zone thickness while simulating the network. The proposed techniques are first implemented to represent connectivity statistics for synthetic data sets, which are used as point-set training images and are analogous to the data typically available for a cave network. 
Once the applicability of the algorithms is verified, non-gridded MPS analysis and pattern simulation are conducted for Wind Cave in South Dakota. The developed algorithms successfully characterize and model cave networks that can only be described by point sets. Subsequently, a cave network is simulated for the Yates Field in West Texas, a paleokarst reservoir. Well locations with cave facies and identified cave-zone thickness values condition the pattern simulation, which uses the multiple-point histograms calibrated for Wind Cave. The simulated cave network is then implemented in flow simulation models to understand the effects of cave structures on fluid flow. Calibration of the flow model against primary production data is attempted to demonstrate that the pattern simulation algorithm yields a detailed description of the spatial distribution of cave facies. Moreover, the impact of accurately representing network connectivity on flow responses is explored through a water injection case: fluid flow responses are compared for models whose cave networks are constructed by non-gridded MPS and by a traditional modeling workflow using sequential indicator simulation. The Yates Field applications show that the cave network and the corresponding cave facies are successfully modeled with non-gridded MPS, and that a detailed description of cave facies in the reservoir yields accurate flow simulation results and better future predictions.
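The core idea of inferring multiple-point statistics directly from a point set, rather than from a gridded training image, can be illustrated with a toy sketch. None of this code comes from the thesis; the function names (`mp_histogram`, `direction_bin`), the k-nearest-neighbour pattern definition, and the toy "centreline" are all illustrative assumptions, standing in for whatever pattern statistics the actual algorithm calibrates:

```python
# Hypothetical sketch of the non-gridded MPS idea: instead of scanning a
# gridded training image with a rigid template, local connectivity
# statistics are read directly from a point set (cave centreline
# coordinates). Here each point's pattern is the set of angular sectors
# occupied by its k nearest neighbours, and the multi-point histogram
# counts how often each pattern occurs.
import math
from collections import Counter

def direction_bin(dx, dy, n_bins=8):
    """Quantise a displacement vector into one of n_bins angular sectors."""
    angle = math.atan2(dy, dx) % (2 * math.pi)
    return int(angle / (2 * math.pi / n_bins)) % n_bins

def mp_histogram(points, k=2, n_bins=8):
    """Multi-point histogram over k-nearest-neighbour direction patterns."""
    hist = Counter()
    for i, (x, y) in enumerate(points):
        # (distance, direction-bin) pairs to every other point, nearest first
        others = sorted(
            (math.hypot(px - x, py - y), direction_bin(px - x, py - y, n_bins))
            for j, (px, py) in enumerate(points) if j != i
        )
        pattern = tuple(sorted(b for _, b in others[:k]))
        hist[pattern] += 1
    return hist

# A toy "cave centreline": points along a straight east-west passage.
centreline = [(float(i), 0.0) for i in range(6)]
h = mp_histogram(centreline, k=2)
# Interior points see one neighbour east (bin 0) and one west (bin 4);
# the two passage endpoints see both neighbours on the same side.
```

A simulation step would then place new points so that the patterns they create reproduce this calibrated histogram, conditioned to the wells where cave facies were observed.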
APA, Harvard, Vancouver, ISO, and other styles
34

Thormodsgard, June M. "Analysis of the gridded finite element surface interpolation for geometric rectification and mosaicking of landsat data." 1986. http://catalog.hathitrust.org/api/volumes/oclc/15581760.html.

Full text
Abstract:
Thesis (M.A.)--University of Wisconsin--Madison, 1986.
Typescript. eContent provider-neutral record in process. Description based on print version record. Includes bibliographical references (leaves 41-43).
APA, Harvard, Vancouver, ISO, and other styles
35

Pan, H. B., and 潘宏彬. "The measurement and analysis of wafer type gridded energy analyzer in a pulse-time modulated inductively coupled plasma." Thesis, 2001. http://ndltd.ncl.edu.tw/handle/17626706899889892071.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Glover, Timothy Ward. "Measurement of plasma parameters in the exhaust of a magnetoplasma rocket by gridded energy analyzer and emissive Langmuir probe." Thesis, 2002. http://hdl.handle.net/1911/18084.

Full text
Abstract:
The 10 kilowatt prototype of the Variable Specific Impulse Magnetoplasma Rocket (VASIMR) engine, abbreviated as VX-10, is designed to eject plasma at exhaust velocities of tens of kilometers per second. In this device, energy is imparted to the plasma ions by two mechanisms: ion cyclotron resonant heating (ICRH), and acceleration in an ambipolar electric field. Measurements from two different electrostatic probes are combined to determine how much each mechanism contributes to the total ion energy. The first probe is a gridded retarding potential analyzer (RPA) that incorporates a multi-channel collimator to obtain precise measurement of the ion and electron parallel energy distributions. The second is an emissive Langmuir probe that measures the DC and RF components of the plasma potential. The plasma potential obtained from the emitting probe allows calculation of the parallel velocity distribution once the parallel energy distribution is obtained from the energy analyzer data. Biasing the RPA housing is shown to minimize the plasma perturbation, as monitored by an auxiliary probe. When this minimization is done, the RPA measurements become compatible with the emissive probe's measurement of plasma potential. The collimated RPA and emissive probe have been used to examine the effects of a double dual half-turn (DDHT) antenna encircling the plasma. When power at the ion cyclotron frequency is applied, changes are seen in the saturation current and mean ion energy of the collimated RPA characteristic. The evolution of these changes as the RPA is moved downstream from the antenna is interpreted as firm evidence of ion cyclotron heating, albeit at absorbed energies of less than 1 electronvolt per ion. The emissive probe shows that, within experimental error, all of the increased ion energy is accounted for by an increase in the plasma potential that occurs when the ICRF power is applied. 
The combined RPA and emissive probe data also show that there is a jet of flowing plasma in the VX-10 when operated with the helicon source alone but that the signal from this jet is overwhelmed by a rapidly growing stationary plasma within the first second of the discharge.
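The principle behind the retarding potential analyzer measurement can be sketched numerically. This is not the thesis's analysis code; the synthetic exponential characteristic, the 10 eV temperature, and the function names are assumptions used only to show how the retarding branch of an RPA trace encodes a temperature:

```python
# Illustration of how an RPA characteristic encodes the parallel energy
# distribution: only particles with parallel energy above the retarding
# bias eV reach the collector, so I(V) integrates f(E) above eV and
# f(E) ~ -dI/dV. For a Maxwellian at temperature Te (in eV) the retarding
# branch is exponential, and Te follows from the slope of ln I versus V.
import math

def rpa_current(V, I0=1.0, Te=10.0):
    """Synthetic retarding-branch current for a Maxwellian, Te in eV."""
    return I0 * math.exp(-V / Te)

def fit_temperature(volts, currents):
    """Least-squares slope of ln I vs V gives -1/Te."""
    n = len(volts)
    mv = sum(volts) / n
    ml = sum(math.log(i) for i in currents) / n
    num = sum((v - mv) * (math.log(i) - ml) for v, i in zip(volts, currents))
    den = sum((v - mv) ** 2 for v in volts)
    slope = num / den
    return -1.0 / slope

volts = [2.0 * v for v in range(11)]            # 0 .. 20 V retarding bias
currents = [rpa_current(v, Te=10.0) for v in volts]
Te_est = fit_temperature(volts, currents)       # recovers ~10 eV
```

In practice the measured characteristic is shifted by the local plasma potential, which is why the thesis pairs the RPA with an emissive probe: subtracting the emissive-probe potential separates thermal energy from ambipolar acceleration.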
APA, Harvard, Vancouver, ISO, and other styles
37

Shafer, Phillip Edmond. "Developing gridded forecast guidance for warm season lightning over Florida using the perfect prognosis method and mesoscale model output." 2007. http://etd.lib.fsu.edu/theses/available/etd-04022007-161314.

Full text
Abstract:
Thesis (Ph. D.)--Florida State University, 2007.
Advisor: Henry E. Fuelberg, Florida State University, College of Arts and Sciences, Dept. of Meteorology. Title and description from dissertation home page (viewed July 24, 2007). Document formatted into pages; contains xiii, 87 pages. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
38

Bhowmik, Avit Kumar. "Evaluation of Spatial interpolation techniques for mapping climate variables with low sample density: a case study using a new gridded dataset of Bangladesh." Master's thesis, 2012. http://hdl.handle.net/10362/8321.

Full text
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.
This study explores and analyses the impact of sample density on the performance of spatial interpolation techniques. It evaluates two deterministic techniques, Thin Plate Spline and Inverse Distance Weighting, and two stochastic techniques, Ordinary Kriging and Universal Kriging, for interpolating two climate indices, the Annual Total Precipitation in Wet Days and the Yearly Maximum Value of the Daily Maximum Temperature, in a low-sample-density region, Bangladesh, over 60 years (1948 to 2007). It applies the approach of spatially shifted years to create mean variograms that compensate for the low sample density. Seven performance measures are used to evaluate the interpolation techniques: Mean Absolute Error, Root Mean Square Error, Systematic Root Mean Square Error, Unsystematic Root Mean Square Error, Index of Agreement, Coefficient of Variation of Prediction, and Confidence of Prediction. The results indicate that for most years Universal Kriging performs best for total precipitation and Ordinary Kriging performs best for maximum temperature. Although the difference surfaces show only small differences in the estimating ability of the four techniques, the residual plots reveal differences among the interpolated surfaces in terms of over- and under-estimation. The results also indicate that Inverse Distance Weighting performs better for both indices when the sample density is very low, although this performance is called into question by the inclusion of measurement errors in the interpolated surfaces. All error measures show a decreasing trend with increasing sample density, and the Index of Agreement and Confidence of Prediction show an increasing trend over the years. 
Finally, the strong correlation between the sample coefficient of variation and the performance measures implies that the more representative the samples are of the climate phenomenon, the better the spatial interpolation techniques perform. The correlation between the sample coefficient of variation and the number of samples implies that high sample representativity is attainable with an increased sample density.
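Two of the simpler ingredients of such a comparison, inverse distance weighting and cross-validated error measures, can be sketched in a few lines. This is a minimal illustration, not the thesis code: the station coordinates, the power parameter, and the leave-one-out scheme are assumptions chosen only to show how MAE and RMSE are produced:

```python
# Inverse distance weighting (IDW) with leave-one-out cross-validation,
# yielding two of the performance measures used in the study: mean
# absolute error (MAE) and root mean square error (RMSE).
import math

def idw(x, y, samples, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from (xi, yi, zi) samples."""
    num = den = 0.0
    for xi, yi, zi in samples:
        d = math.hypot(x - xi, y - yi)
        if d == 0.0:
            return zi                  # exact hit: honour the sample value
        w = 1.0 / d ** power
        num += w * zi
        den += w
    return num / den

def loo_errors(samples, power=2.0):
    """Leave-one-out cross-validation: MAE and RMSE of the IDW estimates."""
    residuals = []
    for i, (x, y, z) in enumerate(samples):
        rest = samples[:i] + samples[i + 1:]
        residuals.append(idw(x, y, rest, power) - z)
    n = len(residuals)
    mae = sum(abs(r) for r in residuals) / n
    rmse = math.sqrt(sum(r * r for r in residuals) / n)
    return mae, rmse

# Toy precipitation "stations" sampling the plane z = x + y.
stations = [(0, 0, 0.0), (1, 0, 1.0), (0, 1, 1.0), (1, 1, 2.0), (0.5, 0.5, 1.0)]
mae, rmse = loo_errors(stations)
```

Because RMSE squares the residuals, it always equals or exceeds MAE; comparing the two (as the study's systematic/unsystematic split does more finely) indicates whether a few large misfits or many small ones dominate.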
APA, Harvard, Vancouver, ISO, and other styles
39

Ζάγουρας, Αθανάσιος. "Μέθοδοι εξαγωγής και ψηφιακής επεξεργασίας περιβαλλοντικών σημάτων και εικόνων – Εφαρμογή στην αυτόματη ταξινόμηση χαρτών καιρού." Thesis, 2012. http://hdl.handle.net/10889/6076.

Full text
Abstract:
The synoptic classification of weather systems underpins a variety of environmental applications. Recently, synoptic classification has become particularly relevant to the field of air pollution: knowing the synoptic climatology of a region allows the prediction, and possibly the prevention, of pollution episodes caused either by local sources or by the transport of pollutants. This knowledge is greatly enhanced by categorizing (classifying) the synoptic conditions that prevail in a given area. In recent years, 'automatic', non-empirical classification methods have been developed using computers; so far these efforts have centred on classical statistical methods. The aim of this PhD thesis is the development of methods and the implementation of algorithms for the extraction and digital processing of environmental signals and images. These lead to expert systems for the synoptic classification of weather systems, based on methods from image processing, data analysis and clustering, pattern recognition, and graph theory. The originality of this research lies in the fact that the presented techniques, which have successfully addressed a range of classification problems in other fields, are applied for the first time in Greece to Meteorology, Climatology, and Environmental Physics, specifically to the synoptic classification of weather systems. The characteristics of the modern image processing, graph theory, and data analysis methods proposed in this thesis make the approaches competitive in both classification quality and computational time.
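The "automatic" classification idea can be illustrated with the simplest clustering approach: flatten each daily gridded field into a vector and group the vectors. This sketch is not from the thesis; plain k-means with a deterministic initialization, the toy pressure values, and the function name are all assumptions standing in for the clustering methods the work actually develops:

```python
# Hedged illustration of automatic synoptic typing: each daily weather map
# (a gridded sea-level-pressure field) is flattened into a vector, and
# maps are grouped with k-means so that each cluster is one synoptic type.
def kmeans(vectors, k, iters=10):
    """Plain k-means on equal-length vectors; deterministic spread init."""
    centers = [list(vectors[i * len(vectors) // k]) for i in range(k)]
    labels = [0] * len(vectors)
    for _ in range(iters):
        # assignment step: nearest centre by squared Euclidean distance
        for i, v in enumerate(vectors):
            labels[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(v, centers[c])),
            )
        # update step: each centre becomes the mean of its members
        for c in range(k):
            members = [v for v, lab in zip(vectors, labels) if lab == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return labels

# Six toy daily "maps" on a 4-cell grid: three low-pressure days (~1000 hPa)
# followed by three high-pressure days (~1030 hPa).
maps = [[1000.0 + i] * 4 for i in range(3)] + [[1030.0 + i] * 4 for i in range(3)]
types = kmeans(maps, k=2)
```

Real synoptic classifications typically reduce the fields first (e.g. by principal components) and use more robust clustering or graph-based partitioning, which is where the thesis's image processing and graph theory methods come in.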
APA, Harvard, Vancouver, ISO, and other styles