
Dissertations / Theses on the topic 'GPS data processing'



Consult the top 50 dissertations / theses for your research on the topic 'GPS data processing.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Griffin, Terry W. "GPS CaPPture: a System for GPS Trajectory Collection, Processing, and Destination Prediction." Thesis, University of North Texas, 2012. https://digital.library.unt.edu/ark:/67531/metadc115089/.

Full text
Abstract:
In the United States, smartphone ownership surpassed 69.5 million in February 2011, with a large portion of those users (20%) downloading applications (apps) that enhance the usability of a device by adding functionality. A large percentage of apps are written specifically to utilize the geographical position of a mobile device. One of the prime factors in developing location prediction models is the use of historical data to train such a model. With larger sets of training data, prediction algorithms become more accurate; however, the use of historical data can quickly become a downfall if the GPS stream is not collected or processed correctly. Inaccurate, incomplete, or improperly interpreted historical data can lead to the inability to develop accurately performing prediction algorithms. As GPS chipsets become the standard in the ever-increasing number of mobile devices, the opportunity for the collection of GPS data increases remarkably. The goal of this study is to build a comprehensive system that addresses the following challenges: (1) collection of GPS data streams in a manner such that the data is highly usable and has a reduction in errors; (2) processing and reduction of the collected data in order to prepare it and make it highly usable for the creation of prediction algorithms; (3) creation of prediction/labeling algorithms at such a level that they are viable for commercial use. This study identifies the key research problems toward building the CaPPture (collection, processing, prediction) system.
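CaPPture's own collection and prediction algorithms are the subject of the thesis; purely as a hedged illustration of the prediction stage, a first-order Markov model over coarsely discretized trajectory cells is a common baseline next-location predictor. Everything below (function names, the 0.01-degree grid) is an assumption for the sketch, not the system described above.

```python
from collections import Counter, defaultdict

def grid_cell(lat, lon, size=0.01):
    # Roughly 1 km cells at mid-latitudes; the resolution is an arbitrary choice here.
    return (round(lat / size), round(lon / size))

def train_markov(trips):
    """trips: iterable of trips, each an ordered list of (lat, lon) fixes.
    Accumulates counts approximating P(next cell | current cell)."""
    counts = defaultdict(Counter)
    for trip in trips:
        cells = [grid_cell(lat, lon) for lat, lon in trip]
        for a, b in zip(cells, cells[1:]):
            if a != b:          # ignore dwell points that stay in the same cell
                counts[a][b] += 1
    return counts

def predict_next_cell(counts, lat, lon):
    c = counts.get(grid_cell(lat, lon))
    return c.most_common(1)[0][0] if c else None
```

More (and cleaner) training trajectories sharpen the transition counts, which mirrors the abstract's point that larger, correctly processed training sets yield more accurate predictors.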
APA, Harvard, Vancouver, ISO, and other styles
2

Hart, Dennis L., Johnny J. Pappas, and John E. Lindegren. "Desktop GPS Analyst Standardized GPS Data Processing and Analysis on a Personal Computer." International Foundation for Telemetering, 1996. http://hdl.handle.net/10150/611424.

Full text
Abstract:
International Telemetering Conference Proceedings / October 28-31, 1996 / Town and Country Hotel and Convention Center, San Diego, California
In the last few years there has been a proliferation of GPS receivers and receiver manufacturers. Coupled with this is a growing number of DoD test programs requiring high-accuracy Time-Space-Position-Information (TSPI) with diminishing test support funds and/or needing a wide-area, low-altitude, or surface tracking capability. The Air Force Development Test Center (AFDTC) recognized the growing requirements for using GPS in test programs and the need for a low-cost, portable TSPI processing capability, which sparked the development of the Desktop GPS Analyst. The Desktop GPS Analyst is a personal computer (PC) based software application for the generation of GPS-based TSPI.
APA, Harvard, Vancouver, ISO, and other styles
3

Satirapod, Chalermchon, Surveying & Spatial Information Systems, Faculty of Engineering, UNSW. "Improving the GPS Data Processing Algorithm for Precise Static Relative Positioning." Awarded by: University of New South Wales, School of Surveying and Spatial Information Systems, 2002. http://handle.unsw.edu.au/1959.4/18244.

Full text
Abstract:
Since its introduction in the early 1980s, the Global Positioning System (GPS) has become an important tool for high-precision surveying and geodetic applications. Carrier phase measurements are the key to achieving high-accuracy positioning results. This research addresses one of the most challenging aspects of the GPS data processing algorithm, especially for precise GPS static positioning, namely the definition of a realistic stochastic model. The major contributions of this research are: (a) A comparison of the two data quality indicators widely used to assist in the definition of the stochastic model for GPS observations has been carried out. Based on the results obtained from a series of tests, neither the satellite elevation angle nor the signal-to-noise ratio information always reflects reality. (b) A simplified MINQUE procedure for the estimation of the variance-covariance components of GPS observations has been proposed. The proposed procedure has been shown to produce results similar to those from the standard MINQUE procedure; however, the computational load and time are significantly reduced, and in addition the effect of a changing number of satellites on the computations is effectively dealt with. (c) An iterative stochastic modelling procedure has been developed in which all error features in the GPS observations are taken into account. Experimental results show that by applying the proposed procedure, both the certainty and the accuracy of the positioning results are improved. In addition, the quality of ambiguity resolution can be more realistically evaluated. (d) A segmented stochastic modelling procedure has been developed to effectively deal with data sets from long observation periods and to reduce the computational load. This procedure also takes into account the temporal correlations in the GPS measurements. Test results obtained from both simulated and real data sets indicate that the proposed procedure can improve the accuracy of the positioning results to the millimetre level. (e) A novel approach to GPS analysis based on a combination of the wavelet decomposition technique and the simplified MINQUE procedure has been proposed. With this new approach, the certainty of ambiguity resolution and the accuracy of the positioning results are improved.
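For context, the conventional baseline that elevation-angle-based stochastic models use is the variance function

\sigma_i^2 = a^2 + \frac{b^2}{\sin^2 E_i}

where E_i is the elevation angle of satellite i and a, b are empirical constants (often a few millimetres). This is the standard textbook form, not a result of the thesis; MINQUE-type procedures instead estimate the variance-covariance components of the observations directly from the residuals.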
APA, Harvard, Vancouver, ISO, and other styles
4

Zhao, Xiaoyun. "Road network and GPS tracking with data processing and quality assessment." Licentiate thesis, Högskolan Dalarna, Mikrodataanalys, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:du-17354.

Full text
Abstract:
GPS technology is nowadays embedded into portable, low-cost electronic devices to track the movements of mobile objects. This has greatly impacted the transportation field by creating a novel and rich source of traffic data on the road network. Although the promise offered by GPS devices to overcome problems like underreporting, respondent fatigue, inaccuracies and other human errors in data collection is significant, the technology is still relatively new and raises many issues for potential users. These issues tend to revolve around the following areas: reliability, data processing and the related applications. This thesis aims to study GPS tracking from the methodological, technical and practical aspects. It first evaluates the reliability of GPS-based traffic data using data from an experiment involving three different traffic modes (car, bike and bus) traveling along the road network. It then outlines the general procedure for processing GPS tracking data and discusses related issues that are uncovered using real-world GPS tracking data from 316 cars. Thirdly, it investigates the influence of road network density on finding optimal locations for enhancing travel efficiency and decreasing travel cost. The results show that the geographical positioning is reliable. Velocity is slightly underestimated, whereas altitude measurements are unreliable. Post-processing techniques with auxiliary information are found necessary and important for resolving the inaccuracy of GPS data. The density of the road network influences the finding of optimal locations; the influence stabilizes at a certain level and does not deteriorate when the node density is higher.
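As a concrete illustration of one elementary step in such a processing pipeline, per-segment speeds can be derived from consecutive fixes and compared against reference measurements as a basic reliability check. This is a minimal sketch under assumed names and a spherical-Earth approximation, not the thesis's actual pipeline, which adds filtering, auxiliary information and map-matching.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 fixes given in degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000.0 * asin(sqrt(a))

def segment_speeds(track):
    """track: chronologically ordered list of (timestamp_s, lat, lon) fixes.
    Returns the implied speed in m/s for each segment."""
    speeds = []
    for (t0, la0, lo0), (t1, la1, lo1) in zip(track, track[1:]):
        if t1 > t0:
            speeds.append(haversine_m(la0, lo0, la1, lo1) / (t1 - t0))
    return speeds
```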
APA, Harvard, Vancouver, ISO, and other styles
5

Wolf, Jean Louise. "Using GPS data loggers to replace travel diaries in the collection of travel data." Diss., Georgia Institute of Technology, 2000. http://hdl.handle.net/1853/20203.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Sun, Bingxia. "Identifying activity type and trip purpose from data collected by passive GPS." HKBU Institutional Repository, 2012. https://repository.hkbu.edu.hk/etd_ra/1385.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Zimba, Robert. "High precision GPS data processing for the survey of South African tide gauges." Master's thesis, University of Cape Town, 2000. http://hdl.handle.net/11427/4977.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

King, Peter Haywood. "A Low Cost Localization Solution Using a Kalman Filter for Data Fusion." Thesis, Virginia Tech, 2008. http://hdl.handle.net/10919/32384.

Full text
Abstract:
Position in the environment is essential in any autonomous system. As increased accuracy is required, the costs escalate accordingly. This paper presents a simple way to systematically integrate sensory data to provide a drivable and accurate position solution at a low cost. The data fusion is handled by a Kalman filter tracking five states and an undetermined number of asynchronous measurements. This implementation allows the user to define additional adjustments to improve the overall behavior of the filter. The filter is tested using a suite of inexpensive sensors and then compared to a differential GPS position. The output of the filter is indeed a drivable solution that tracks the reference position remarkably well. This approach takes advantage of the short-term accuracy of odometry measurements and the long-term fix of a GPS unit. A maximum deviation of two meters from the reference is shown for a complex path lasting over two minutes and more than 100 meters in length.
Master of Science
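The thesis's filter tracks five states and handles asynchronous measurements; as a much-reduced sketch of the same predict/update pattern, a two-state (position, velocity) filter on one axis can fuse GPS position fixes with odometry speed. All noise values below are placeholder assumptions, not the thesis's tuning.

```python
import numpy as np

class GpsOdomKF:
    """Constant-velocity Kalman filter on one axis, state x = [position, velocity].
    Call predict(dt) to advance, then update_gps/update_odom as measurements arrive."""

    def __init__(self, q=0.5, r_gps=4.0, r_odom=0.05):
        self.x = np.zeros(2)
        self.P = np.eye(2) * 100.0            # large initial uncertainty
        self.q, self.r_gps, self.r_odom = q, r_gps, r_odom

    def predict(self, dt):
        F = np.array([[1.0, dt], [0.0, 1.0]])
        Q = self.q * np.array([[dt**3 / 3, dt**2 / 2],
                               [dt**2 / 2, dt]])
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def _update(self, z, H, r):
        y = z - H @ self.x                    # innovation
        S = H @ self.P @ H.T + r              # innovation variance (1x1)
        K = self.P @ H.T / S                  # Kalman gain
        self.x = self.x + K.ravel() * y
        self.P = (np.eye(2) - K @ H) @ self.P

    def update_gps(self, position):           # long-term drift-free, but noisy
        self._update(position, np.array([[1.0, 0.0]]), self.r_gps)

    def update_odom(self, speed):             # short-term accurate, drifts over time
        self._update(speed, np.array([[0.0, 1.0]]), self.r_odom)
```

The GPS update corrects the slow drift accumulated from odometry, while the odometry update keeps the estimate drivable between fixes, which is exactly the complementarity the abstract describes.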
APA, Harvard, Vancouver, ISO, and other styles
9

Gentek, Anna. "Activity Recognition Using Supervised Machine Learning and GPS Sensors." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-295600.

Full text
Abstract:
Human Activity Recognition has become a popular research topic among data scientists. Over the years, multiple studies regarding humans and their daily motion habits have been carried out for many different purposes. This is not surprising given all the opportunities and applications that the results of these algorithms enable. In this project we implement a system that can effectively collect sensor data from mobile devices, process it, and, using supervised machine learning, successfully predict the class of a performed activity. The project was executed based on datasets and features extracted from GPS sensors. The system was trained using various machine learning algorithms and Python SciKit to guarantee optimal solutions with accurate predictions. Finally, we applied a majority-vote rule to secure the best possible accuracy of the activity classification process. As a result, we were able to identify various activities, including walking, cycling, driving, and the public transport modes bus and metro, with over 90% accuracy.
Bachelor's thesis in electrical engineering 2020, KTH, Stockholm
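A minimal sketch of the classification stage described above using scikit-learn ('Python SciKit'): several classifiers are combined with a hard majority vote. The synthetic data stands in for GPS-derived features (e.g. speed and acceleration statistics), and the estimator choices are illustrative, not the thesis's exact configuration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Stand-in for GPS features and 5 activity labels (walk, cycle, car, bus, metro).
X, y = make_classification(n_samples=500, n_features=10, n_informative=6,
                           n_classes=5, random_state=0)

clf = VotingClassifier(estimators=[
    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ("knn", KNeighborsClassifier(n_neighbors=7)),
    ("svm", SVC()),
], voting="hard")  # majority vote over the individual predictions

print(cross_val_score(clf, X, y, cv=5).mean())
```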
APA, Harvard, Vancouver, ISO, and other styles
10

Grejner-Brzezinska, Dorota A. "Analysis of GPS Data Processing Techniques: In Search of Optimized Strategy of Orbit and Earth Rotation Parameter Recovery /." The Ohio State University, 1995. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487929745335624.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Abreu, Mário Alexandre de. "Análise da qualidade dos dados GPS: estudo de caso da estação de Cananéia." Universidade de São Paulo, 2007. http://www.teses.usp.br/teses/disponiveis/3/3138/tde-08012008-144750/.

Full text
Abstract:
In Geodesy, when the determination of coordinates with great accuracy and precision is desired, relative positioning is used. This technique consists of the use of two or more receivers, where one receiver is placed on a station of known coordinates (base) and one or more remote receivers (rover) are placed on the stations to be determined. To help and increase the productivity of users of relative positioning, Active Control Systems (SCA) were created, which allow the user to carry out relative positioning with only one receiver, provided that data collected simultaneously from a continuous station of an active network is available. In Brazil, the active network is materialized by the RBMC - Brazilian Network for Continuous Monitoring of GPS, managed by the IBGE. Data from 24 stations distributed throughout the country are collected uninterruptedly and made available to users. Due to the great increase in the use of networks of continuous monitoring, it is highly important that the observations undergo quality control, assuring the trustworthiness of the collected data. This work presents a detailed analysis of the data quality of station NEIA (Cananéia/São Paulo State/Brazil), belonging to the RBMC, for the period between January 2002 and April 2006, as well as the temporal variation of its coordinates (latitude, longitude and geometric height) when compared with the fiducial coordinates of this station. These analyses were carried out based on the results obtained by processing with the following software: TEQC, BERNESE V. 5.0 and the AUSPOS on-line processing service.
APA, Harvard, Vancouver, ISO, and other styles
12

Demeulemeester, Kilian. "Needle Localization in Ultrasound Images : FULL NEEDLE AXIS AND TIP LOCALIZATION IN ULTRASOUND IMAGES USING GPS DATA AND IMAGE PROCESSING." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-180454.

Full text
Abstract:
Many medical interventions involve ultrasound-based imaging systems to safely localize and navigate instruments into the patient's body. To facilitate visual tracking of the instruments, we investigate the techniques and methodologies best suited for solving the problem of needle localization in ultrasound images. We propose a robust procedure that automatically determines the position of a needle in 2D ultrasound images. The task is decomposed into the localization of the needle axis and of its tip. A first estimate of the axis position is computed with the help of multiple position sensors, including one embedded in the transducer and another in the needle. Based on this, the needle axis is computed using a RANSAC algorithm. The tip is detected by analyzing the intensity along the axis, and a Kalman filter is added to compensate for measurement uncertainties. The algorithms were experimentally verified on real ultrasound images acquired by a 2D scanner scanning a portion of a cryogel phantom containing a thin metallic needle. The experiments show that the algorithms are capable of detecting a needle with millimeter accuracy. A computational time of the order of milliseconds permits real-time needle localization.
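A minimal sketch of the RANSAC axis-fitting step on candidate needle pixels follows; the thresholds, iteration count and function names are assumptions, and the thesis additionally seeds the search with the sensor-based estimate and adds tip detection plus Kalman filtering.

```python
import numpy as np

def ransac_line(points, n_iter=200, tol=1.0, seed=0):
    """Fit a 2D line to candidate needle pixels.
    points: (N, 2) array of (x, y) coordinates.
    Returns (point_on_line, unit_direction, inlier_mask)."""
    rng = np.random.default_rng(seed)
    best_mask, best_fit = None, None
    for _ in range(n_iter):
        p1, p2 = points[rng.choice(len(points), size=2, replace=False)]
        d = p2 - p1
        norm = np.linalg.norm(d)
        if norm == 0.0:
            continue
        d = d / norm
        r = points - p1
        # Perpendicular distance of every point to the candidate line.
        dist = np.abs(r[:, 0] * d[1] - r[:, 1] * d[0])
        mask = dist < tol
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask, best_fit = mask, (p1, d)
    return best_fit[0], best_fit[1], best_mask
```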
APA, Harvard, Vancouver, ISO, and other styles
13

Larocca, Ana Paula Camargo. "Análise de estratégias para processamento de redes geodésicas com o sistema de posicionamento global - GPS." Universidade de São Paulo, 2000. http://www.teses.usp.br/teses/disponiveis/18/18137/tde-24052006-101143/.

Full text
Abstract:
The present work presents a methodology for the study, elaboration and analysis of strategies for processing GPS observables for geodetic networks. In the development of this work, GPS data from the geodetic network of the State of São Paulo, concluded in 1994, were used. This network is composed of twenty-four points scattered across the State, plus the vertex CHUA, which is the fundamental point of the triangulation of the Brazilian Geodetic System. Through the strategies elaborated, several factors of major importance for GPS data processing are analyzed, such as: the influence of meteorological data on the processing of long baselines; the results of data processing with broadcast and precise ephemerides; the results of data processing with baselines of homogeneous lengths smaller than or equal to 150 km; and the results of data processing considering only two hours and thirty minutes of the total duration of the observation sessions. The results of the adjustment of these strategies are compared to each other, followed by analyses and conclusions about the influence of these factors on data processing.
APA, Harvard, Vancouver, ISO, and other styles
14

Opperman, B. D. L. "Reconstructing ionospheric TEC over South Africa using signals from a regional GPS network." Thesis, Rhodes University, 2008. http://hdl.handle.net/10962/d1005273.

Full text
Abstract:
Radio signals transmitted by GPS satellites orbiting the Earth are modulated as they propagate through the electrically charged plasmasphere and ionosphere in the near-Earth space environment. Through a linear combination of GPS range and phase measurements observed on two carrier frequencies by terrestrial-based GPS receivers, the ionospheric total electron content (TEC) along oblique GPS signal paths may be quantified. Simultaneous observations of signals transmitted by multiple GPS satellites and observed by a network of South African dual-frequency GPS receivers constitute a spatially dense ionospheric measurement source over the region. A new methodology, based on an adjusted spherical harmonic (ASHA) expansion, was developed to estimate diurnal vertical TEC over the region using GPS observations. The performance of the ASHA methodology in estimating diurnal TEC and satellite and receiver differential clock biases (DCBs) for a single GPS receiver was first tested with simulation data and subsequently applied to observed GPS data. The resulting diurnal TEC profiles estimated from GPS observations compared favourably to measurements from three South African ionosondes and two other GPS-based methodologies for 2006 solstice and equinox dates. The ASHA methodology was applied to calculating diurnal two-dimensional TEC maps from multiple receivers in the South African GPS network. The space physics application of the newly developed methodology was demonstrated by investigating the ionosphere's behaviour during a severe geomagnetic storm and investigating the long-term ionospheric stability in support of the proposed Square Kilometre Array (SKA) radio astronomy project. The feasibility of employing the newly developed technique in an operational near real-time system for estimating and disseminating TEC values over Southern Africa, using observations from a regional GPS receiver network, was investigated.
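For readers unfamiliar with the measurement model, GPS-based TEC work of this kind starts from the standard geometry-free (dual-frequency) combination: the slant TEC along a receiver-satellite path follows from the differenced pseudoranges as

\mathrm{STEC} = \frac{1}{40.3}\,\frac{f_1^2 f_2^2}{f_1^2 - f_2^2}\,(P_2 - P_1)

in electrons per m^2 (divide by 10^{16} for TECU), with f_1 = 1575.42 MHz and f_2 = 1227.60 MHz. This is the textbook relation, not the thesis's contribution; the ASHA expansion, the mapping to vertical TEC, and the estimation of satellite and receiver DCBs are what the work itself develops.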
APA, Harvard, Vancouver, ISO, and other styles
15

Štrba, Peter. "Určení přetvoření železničního svršku." Master's thesis, Vysoké učení technické v Brně. Fakulta stavební, 2014. http://www.nusl.cz/ntk/nusl-226760.

Full text
Abstract:
The aim of this diploma thesis is to determine the displacements and deformations of the railway track axis on bridge structures in the towns of Zábřeh na Moravě and Břeclav. The bridge structures covered have large expansion lengths; therefore it is necessary to monitor the movements of the railway tracks depending on climatic conditions. GPS methods have been used throughout the measurement and data processing. The results also include a comparison of GPS and conventional methods. The outcome of the thesis is the detection of proven displacements. One of the goals of the thesis is the comparison of the displacements and the accuracy obtained using the method chosen by the author with those obtained using classical geodetic methods.
APA, Harvard, Vancouver, ISO, and other styles
16

Elango, Vetri Venthan. "Methodology to model activity participation using longitudinal travel variability and spatial extent of activity." Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/54290.

Full text
Abstract:
Macroscopic changes in the urban environment and in the built transportation infrastructure, as well as changes in household demographics and socio-economics, can lead to spatio-temporal variations in household travel patterns and therefore in regional travel demand. Dynamics in travel behavior may also simply arise from the randomness associated with the values, perceptions, attitudes, needs, preferences and decision-making processes of individual travelers. Most urban travel behavior models and analyses seek to explain variations in travel behavior in terms of characteristics of the individuals and their environment. The spatial extent and temporal variation of an individual's travel pattern may represent a measure of the individual's spatial appetite for activity and the variability-seeking nature of his or her travel behavior. The objective of this dissertation is to develop a methodology to predict activity participation using revealed spatial extents and temporal variability as variables that represent the spatial appetite and variability-seeking nature associated with individual households. Activity participation is defined as the set of activities in which an individual or household takes part to satisfy the sustenance, maintenance and discretionary needs of the household. To accomplish the goals of the dissertation, longitudinal travel data collected from the Commute Atlanta Study are used. The raw Global Positioning System (GPS) data are processed to summarize trip data by household travel day and by individual travel day. A methodology was developed to automatically identify the activity at the end of each trip. Methods were then developed to estimate travel behavior variability that can represent the variability-seeking nature of the individual. Existing methods to estimate activity space were reviewed, and a new Modified Kernel Density area method was developed to address issues with current methods. Finally, activity participation models were developed using structural equation modeling methods, and the effects of the variability-seeking nature and spatial extent of activities were applied to the models. The variability-seeking nature was represented in the activity participation model as a latent variable, with the coefficients of variation of trips and distance as indicator variables. The dissertation research found that the inclusion of activity space variables can improve the activity participation modeling process to better explain travel behavior.
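The Modified Kernel Density area method itself is developed in the dissertation; the sketch below only illustrates the generic underlying idea of measuring an activity space as the area of a kernel-density super-level set capturing a chosen share of the activity mass. All parameters and names are assumptions for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde

def activity_space_area(xy, coverage=0.95, grid_n=200, pad=500.0):
    """xy: (N, 2) projected activity locations in metres (e.g. UTM).
    Returns the approximate area (m^2) of the densest region holding
    `coverage` of the estimated activity density mass."""
    kde = gaussian_kde(xy.T)
    xmin, ymin = xy.min(axis=0) - pad
    xmax, ymax = xy.max(axis=0) + pad
    gx, gy = np.meshgrid(np.linspace(xmin, xmax, grid_n),
                         np.linspace(ymin, ymax, grid_n))
    dens = kde(np.vstack([gx.ravel(), gy.ravel()]))
    cell = ((xmax - xmin) / grid_n) * ((ymax - ymin) / grid_n)
    order = np.sort(dens)[::-1]                  # densest cells first
    mass = np.cumsum(order) * cell
    thresh = order[np.searchsorted(mass, coverage * mass[-1])]
    return float(np.count_nonzero(dens >= thresh) * cell)
```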
APA, Harvard, Vancouver, ISO, and other styles
17

Dabiri, Sina. "Application of Deep Learning in Intelligent Transportation Systems." Diss., Virginia Tech, 2019. http://hdl.handle.net/10919/87409.

Full text
Abstract:
The rapid growth of population and the permanent increase in the number of vehicles engender several issues in transportation systems, which in turn call for an intelligent and cost-effective approach to resolve the problems in an efficient manner. A cost-effective approach to improving and optimizing transportation-related problems is to unlock hidden knowledge in the ever-increasing spatiotemporal and crowdsourced information collected from various sources such as mobile phone sensors (e.g., GPS sensors) and social media networks (e.g., Twitter). Data mining and machine learning techniques are the major tools for analyzing the collected data and extracting useful knowledge on traffic conditions and mobility behaviors. Deep learning is an advanced branch of machine learning that has enjoyed a lot of success in computer vision and natural language processing in recent years. However, deep learning techniques have been applied to only a small number of transportation applications, such as traffic flow and speed prediction. Accordingly, my main objective in this dissertation is to develop state-of-the-art deep learning architectures for transport-related applications that have not been treated by deep learning architectures in much detail, including (1) travel mode detection, (2) vehicle classification, and (3) traffic information systems. To this end, an efficient representation for spatiotemporal and crowdsourced data (e.g., GPS trajectories) must also be designed in such a way that it is not only adaptable to deep learning architectures but also contains sufficient information for solving the task at hand. Furthermore, since the good performance of a deep learning algorithm is primarily contingent on access to a large volume of training samples, efficient data collection and labeling strategies are developed for different data types and applications. Finally, the performance of the proposed representations and models is evaluated by comparison with several state-of-the-art techniques in the literature. The experimental results clearly and consistently demonstrate the superiority of the proposed deep-learning-based framework for each application.
PhD
APA, Harvard, Vancouver, ISO, and other styles
18

Leung, Chi-man, and 梁志文. "Integration of modern GIS into orienteering course planning and map making." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2003. http://hub.hku.hk/bib/B2977813X.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Yu, Donggang, and dyu@venus.it.swin.edu.au. "Processing and recognition of document and GIS images." Swinburne University of Technology, 2005. http://adt.lib.swin.edu.au./public/adt-VSWT20050812.095914.

Full text
Abstract:
In intelligent document processing systems and geographical information systems (GIS), image processing and recognition play an important role. This thesis deals with various problems in processing document and GIS images: image smoothing, filling, linearization and extraction of contour features, extraction of structural points, separation and recognition of spurious segments in handwritten digits, reconstruction and recognition of broken digits, and separation and recognition of colour document and GIS images. These approaches are also called Optical Character Recognition (OCR). A new smoothing technique is developed to smooth-follow the contours of an image. With the new smoothing algorithms, spurious pixels (points) of contours are removed based on smooth patterns, and smooth-followed contours are found. Also, skeletons of an image can be smoothed between neighboring 'end' and 'junction' points. Smooth following makes linearization of smoothed contours possible based on Freeman codes. A new filling algorithm for contours, project filling, is described based on two kinds of structural patterns. By this method, any complicated contours of images can be filled correctly. Unlike other linearization methods, linearization and feature extraction of smoothed contours are based on difference chain codes. Curvature and bend angles of linearized lines are found, and the convexity and concavity of linearized lines are described. In this way, a series of description features of contours is formed. Structural points are new and useful features to describe morphological structures between neighboring linearized lines. Extraction of structural points is based on structural patterns which are determined by element chain codes. Extended Freeman codes are also used in this thesis. Structural points make description and recognition of contours possible. In order to recognize handwritten digits in document processing systems, separation of spurious segments, reconstruction of broken digits and recognition of handwritten digits are investigated. Experiments with a large testing data set show satisfactory results for these algorithms. Separation and recognition of colour document and GIS images are discussed. Object images in document and GIS images are extracted based on the description of shape structures, prior knowledge and colour information, which are associated with each other. Colour images can be described by a limited number of colours in colour document and GIS images. Therefore, separation of a colour image is done by a colour reduction method, and recognition of object images is based on structural patterns, prior knowledge and colour information. It can be seen that specific information should be considered in many practical problems to achieve better processing results.
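As a small illustration of the chain-code machinery underlying the linearization described above, Freeman codes and their first differences (which relate to bend angles and convexity/concavity) can be computed from an ordered closed contour as follows. This is a generic sketch of standard 8-direction Freeman coding, not the thesis's extended code set.

```python
# 8-direction Freeman codes for a step (dx, dy), with y increasing upwards.
FREEMAN = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
           (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def chain_code(contour):
    """contour: ordered list of (x, y) pixels of a closed 8-connected contour.
    Returns the Freeman codes and their first differences (mod 8)."""
    codes = [FREEMAN[(x1 - x0, y1 - y0)]
             for (x0, y0), (x1, y1) in zip(contour, contour[1:] + contour[:1])]
    diffs = [(b - a) % 8 for a, b in zip(codes, codes[1:] + codes[:1])]
    return codes, diffs
```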
APA, Harvard, Vancouver, ISO, and other styles
20

Hitchcock, Jonathan James. "Automated processing and analysis of gas chromatography/mass spectrometry screening data." Thesis, University of Bedfordshire, 2009. http://hdl.handle.net/10547/134940.

Full text
Abstract:
The work presented is a substantial addition to the established methods of analysing the data generated by gas chromatography and low-resolution mass spectrometry. It has applications where these techniques are used on a large scale for screening complex mixtures, including urine samples for sports drug surveillance. The analysis of such data is usually automated to detect peaks in the chromatograms and to search a library of mass spectra of banned or unwanted substances. The mass spectra are usually not exactly the same as those in the library, so to avoid false negatives the search must report many doubtful matches. Nearly all the samples in this type of screening are actually negative, so the process of checking the results is tedious and time-consuming. A novel method, called scaled subtraction, takes each scan from the test sample and subtracts a mass spectrum taken from a second similar sample. The aim is that the signal from any substance common to the two samples will be eliminated. Provided that the second sample does not contain the specified substances, any that are present in the first sample can be more easily detected in the subtracted data. The spectrum being subtracted is automatically scaled to allow for compounds that are common to both samples but with different concentrations. Scaled subtraction is implemented as part of a systematic approach to preprocessing the data. This includes a new spectrum-based alignment method that is able to precisely adjust the retention times so that corresponding scans of the second sample can be chosen for the subtraction. This approach includes the selection of samples based on their chromatograms. For this, new measures of similarity or dissimilarity are defined. The thesis presents the theoretical foundation for such measures based on mass spectral similarity. A new type of difference plot can highlight significant differences. The approach has been tested, with the encouraging result that there are fewer than half as many false matches compared with when the library search is applied to the original data. True matches of compounds of interest are still reported by the library search of the subtracted data.
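The exact scaling criterion is part of the thesis; one plausible, hedged sketch of the idea is to choose the scale factor by least squares and clip it, so that a substance common to both samples cancels without letting a noisy reference dominate.

```python
import numpy as np

def scaled_subtract(test_scan, ref_scan, max_scale=5.0):
    """Subtract a scaled reference mass spectrum from a test scan.
    Both inputs are 1-D intensity arrays on the same m/z grid."""
    denom = float(np.dot(ref_scan, ref_scan))
    a = float(np.dot(test_scan, ref_scan)) / denom if denom > 0 else 0.0
    a = float(np.clip(a, 0.0, max_scale))   # guard against runaway scaling
    residual = test_scan - a * ref_scan
    return np.maximum(residual, 0.0)        # intensities cannot be negative
```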
APA, Harvard, Vancouver, ISO, and other styles
21

Zheng, Xinmin, and 鄭新敏. "A fuzzy genetic algorithms (GAs) model for time-cost optimization in construction." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2003. http://hub.hku.hk/bib/B27510839.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Meyer, K. C. (Kobus Cornelius). "Development of a GIS for sea rescue." Thesis, Stellenbosch : Stellenbosch University, 2003. http://hdl.handle.net/10019.1/53360.

Full text
Abstract:
Thesis (MA)--Stellenbosch University, 2003.
ENGLISH ABSTRACT: Saving the life of another person cannot be measured in monetary terms. It is also impossible to describe the satisfaction of carrying out a successful rescue to anybody. However, the disappointment and sense of failure when a rescue mission fails and a life is lost is devastating. Many rescue workers, including those of the National Sea Rescue Institute (NSRI), have experienced this overwhelming sense of failure. Rescue workers often dwell on failed rescue attempts, wishing that they could have arrived on the scene earlier or had known where to start looking for people. The fact that lives are still lost, despite the best efforts of rescue workers, points to the need to improve on life-saving techniques, procedures, equipment and technology. Providing the NSRI with a workable tool to help them manage and allocate resources, plan a rescue, determine drift speed and distance or create search patterns may one day be just enough to save one more life. With this goal in mind, a search and rescue application called RescueView was developed utilising ArcView 3.2a. This application was specifically designed for use by the NSRI, and it will be used as a command centre in all NSRI control rooms and for all rescue efforts.
APA, Harvard, Vancouver, ISO, and other styles
23

Bansal, Reeshidev. "Discrimination and Enhancement of Fracture Signals on Surface Seismic Data." Thesis, Virginia Tech, 2003. http://hdl.handle.net/10919/33336.

Full text
Abstract:
Fracture patterns control flow and transport properties in a tight gas reservoir and therefore play a great role in siting production wells. Hence, it is very important that the exact location and orientation of fractures or fracture swarms are known. Numerical models show that fractures may be manifested on seismograms as discrete events. A number of data processing workflows were designed and examined to enhance these fracture signals and to suppress the reflections in seismic data. The workflows were first tested on a 2D synthetic data set and then applied to 3D field data from the San Juan Basin in New Mexico. All these workflows combine conventional processing tools, which makes them easily applicable. Use of conventional P-wave data may also make this approach to locating fractures more economical than other currently available technology, which often requires an S-wave survey or computationally intensive inversion of data. Diode filtering and dip-filtering in the common-offset domain yield good results and work very well in the presence of flat reflectors. The NMO-dip filter depends on the NMO velocity of the subsurface, but removes both flat and slightly dipping reflectors without affecting the fracture signals. Prior application of dip-moveout correction (DMO) did not make any difference to the reflections, but introduced some incoherent noise into the data. The eigenvector filter performed very well on flat or near-flat reflectors and left the fracture signals almost intact, but introduced some incoherent noise in the presence of steeply dipping reflectors. Harlan's scheme and Radon filtering are very sensitive with regard to parameter selection, but perform exceptionally well on flat or near-flat reflectors. The dip filter, eigenvector filter, and Radon filter were also tested on 3D land data. The dip filter and eigenvector filter suppressed strong reflections with slight perturbations to the fracture signals. The Radon filter did not produce satisfactory results due to the small residual moveout difference between reflectors and fracture signals.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
24

Guggenmoser, Tobias [Verfasser]. "Data Processing and Trace Gas Retrievals for the GLORIA Limb Sounder / Tobias Guggenmoser." Wuppertal : Universitätsbibliothek Wuppertal, 2015. http://d-nb.info/1071413910/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Oller, Moreno Sergio. "Data processing for Life Sciences measurements with hyphenated Gas Chromatography-Ion Mobility Spectrometry." Doctoral thesis, Universitat de Barcelona, 2018. http://hdl.handle.net/10803/523539.

Full text
Abstract:
Recent progress in analytical chemistry instrumentation has increased the amount of data available for analysis. This progress has been accompanied by computational improvements that have enabled new possibilities to analyze larger amounts of data. These two factors have allowed the analysis of more complex samples in multiple life science fields, such as biology, medicine, pharmacology, or food science. One of the techniques that has benefited from these improvements is Gas Chromatography - Ion Mobility Spectrometry (GC-IMS). This technique is useful for the detection of Volatile Organic Compounds (VOCs) in complex samples. Ion Mobility Spectrometry is an analytical technique for characterizing chemical substances based on the velocity of gas-phase ions in an electric field. It is able to detect trace levels of volatile chemicals, reaching ppb concentrations for some analytes. While the instrument has moderate selectivity, it is very fast: an ion mobility spectrum can be acquired in tenths of milliseconds. As it operates at ambient pressure, it is found not only as laboratory instrumentation but also on-site, to perform screening applications. For instance, it is often used in airports for the detection of drugs and explosives. To enhance the selectivity of the IMS, especially for the analysis of complex samples, a gas chromatograph can be used for sample pre-separation, at the expense of the length of the analysis. While there is better instrumentation and more computational power, better algorithms are still needed to exploit and extract all the information present in the samples. In particular, GC-IMS has not received much attention compared to other analytical techniques. In this work we address some of the data analysis issues for GC-IMS. With respect to pre-processing, we explore several baseline estimation methods and suggest a variation of Asymmetric Least Squares, a popular baseline estimation technique, that is able to cope with signals that present large peaks or a large dynamic range. This baseline estimation method is used on Gas Chromatography - Mass Spectrometry signals as well, as it suits both techniques. Furthermore, we characterize spectral misalignments in a study several months long, and propose an alignment method based on monotonic cubic splines for their correction. Based on the misalignment characterization, we propose an optimal time span between consecutive calibrant samples. We then explore the use of Multivariate Curve Resolution methods for the deconvolution of overlapped peaks and their extraction into pure components. We propose the use of a sliding window along the retention time axis to extract the pure components from smaller windows, tracking the pure components across windows. This approach is able to extract analytes with a lower response than MCR alone, i.e. compounds that have a low variance in the overall matrix. Finally, we apply some of these developments to real-world applications: a dataset for the prevention of fraud and quality control in the classification of olive oils, measured with GC-IMS, and data for biomarker discovery of prostate cancer, obtained by analyzing the headspace of urine samples with a GC-MS instrument.
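For reference, the widely used Asymmetric Least Squares baseline of Eilers and Boelens, which the thesis takes as its starting point, can be sketched as below; the thesis's own variation for large peaks and wide dynamic range is not reproduced here.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def als_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric Least Squares baseline (Eilers & Boelens).
    lam controls smoothness; p controls asymmetry (points above
    the baseline are strongly down-weighted)."""
    n = len(y)
    D = sparse.diags([1.0, -2.0, 1.0], [0, -1, -2], shape=(n, n - 2))
    w = np.ones(n)
    z = y
    for _ in range(n_iter):
        W = sparse.spdiags(w, 0, n, n)
        Z = sparse.csc_matrix(W + lam * (D @ D.T))
        z = spsolve(Z, w * y)
        w = p * (y > z) + (1.0 - p) * (y <= z)
    return z
```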
APA, Harvard, Vancouver, ISO, and other styles
26

Pradhan, Anup. "The Geospatial Metadata Server : on-line metadata database, data conversion and GIS processing." Thesis, University of Edinburgh, 2000. http://hdl.handle.net/1842/30655.

Full text
Abstract:
This research proposes an on-line software demonstrator called the Geospatial Metadata Server (GMS), which is designed to catalogue, enter, maintain and query metadata using a centralised metadatabase. The system also converts geospatial data to and from a variety of different formats, as well as proactively searching for data throughout the Web using an automated hyperlink-retrieval program. GMS is divided into six components, three of which constitute a Metadata Cataloguing tool. The metadatabase is implemented within an RDBMS, which is capable of querying large quantities of on-line metadata in a standardised format. The database schema used to store the metadata was patterned in part on the Content Standard for Digital Geographic Metadata, which provides geospatial data providers and users with a common assemblage of descriptive terminology. Because of the excessive length of metadata records, GMS is equipped with a parsing algorithm and database entry utility that is used to enter delimited on-line metadata text files into the metadatabase. Alternatively, the system provides a metadata entry and update utility, which is divided into a series of HTML forms, each corresponding to a metadatabase table. The utility cleverly allows users to maintain their personal metadata records over the Web. The other three GMS components constitute a Metadata Querying tool. GMS includes a search engine that can access its metadatabase, examine retrieved information and use the information within on-line server-side software. The search engine integrates the metadatabase, on-line geospatial data and GIS software (i.e. the GMS conversion utilities) over the Web. The on-line conversion utilities are capable of converting geospatial data from anywhere on the Web, solving many of the problems associated with the interoperability of vendor-specific GIS data formats. Because the conversion utilities operate over the Web, the technology is easier to use and more accessible to a greater number of GIS users. Also integrated into the search engine is a Web robot designed to automatically seek out geospatial data from remote Web sites and index their hypertext links within a file or database.
APA, Harvard, Vancouver, ISO, and other styles
27

Pierce, Karisa M. "Objectively obtaining information from gas chromatographic separations of complex samples using novel data processing and chemometric techniques /." Thesis, Connect to this title online; UW restricted, 2007. http://hdl.handle.net/1773/8575.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Leung, Tsui-shan, and 梁翠珊. "A functional analysis of GIS for slope management in Hong Kong." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2000. http://hub.hku.hk/bib/B31223072.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

McElroy, William John. "Development of geophysical mapping and data processing methods applied to base metal ore deposits in Ireland." Thesis, Queen's University Belfast, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.296823.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Hobson, Alan George Cawood. "Optimising the renewal of natural gas reticulation pipes using GIS." Thesis, Stellenbosch : Stellenbosch University, 2002. http://hdl.handle.net/10019.1/52980.

Full text
Abstract:
Thesis (MA)--University of Stellenbosch, 2002.
ENGLISH ABSTRACT: A major concern for Energex, Australia's largest energy utility in South East Queensland, is the escape of natural gas from its reticulation systems. Within many of the older areas of Brisbane, these networks operate primarily at low and medium pressure, with a significant percentage of mains being cast iron or steel. Over many years pipes in these networks have been replaced, yet reports show that unaccounted-for gas from the same networks remains high. Furthermore, operation and maintenance budgets for these networks are high, with many of these pipes close to the end of their economic life. When operation and maintenance costs exceed the costs of replacement, the Energex gas utility initiates projects to renew reticulation networks with polyethylene pipes. Making decisions about pipe renewal requires an evaluation of historical records from a number of sources, namely:
• gas consumption figures,
• history of leaks,
• maintenance and other related costs, and
• the loss of revenue contributed by unaccounted-for gas.
Financial justification of capital expenditure has always been a requirement for renewal projects at the Energex gas utility; however, the impact of deregulation in the energy utility market has necessitated a review of the financial assessment of capital projects. The Energex gas utility has developed an application that evaluates the financial viability of renewal projects. This research demonstrates the role of GIS integration with the Energex financial application. The results of this study showed that a GIS-integrated renewal planning approach incorporates significant benefits, including:
• efficient selection of a sub-network based on pipe connectivity,
• discovery of hidden relationships between spatially enabled alphanumeric data and environmental information that improves decision making, and
• enhanced testing of proposed renewal design options by scrutinizing the attributes of spatial data.
APA, Harvard, Vancouver, ISO, and other styles
31

Wei, Li. "Processing and Interpretation of Three-Component Borehole/Surface Seismic Data over Gabor Gas Storage Field." Wright State University / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=wright1441043179.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Mao, Chiu-dik William, and 毛照逖. "Real estate geographic information systems (GIS) for urban redevelopment through the perspective of a real estate developer." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2003. http://hub.hku.hk/bib/B27050270.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Lalevée, André. "Towards highly flexible hardware architectures for high-speed data processing : a 100 Gbps network case study." Thesis, Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2017. http://www.theses.fr/2017IMTA0054/document.

Full text
Abstract:
The increase in both the size of modern networks and the diversity of the applications using them is making traditional computing architectures limited. Indeed, purely software architectures cannot sustain typical throughputs, while purely hardware ones severely lack the flexibility needed to adapt to the diversity of applications. Programmable hardware, such as Field Programmable Gate Arrays (FPGAs), has therefore been investigated. These architectures are usually considered a good tradeoff between performance and flexibility, mainly thanks to Dynamic Partial Reconfiguration (DPR), which allows part of the design to be reconfigured at run-time. However, this technique can have several drawbacks, especially regarding the storage of the configuration files, called bitstreams. To solve this issue, bitstream relocation can be deployed, which decreases the number of configuration files required. However, this technique is tedious, error-prone, and requires specific FPGA knowledge. A fully automated design flow has been developed to ease its use. In order to provide flexibility regarding the sequence of treatments to be performed on the architecture, a flexible, high-throughput communication structure is also required. Thus, a study and characterization of Networks-on-Chip has been carried out with respect to network processing and bitstream relocation requirements. Finally, a case study has been developed in order to validate the approach.
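As a rough illustration of what bitstream relocation involves, the sketch below rewrites frame addresses in a deliberately simplified packet format; the command word and layout are hypothetical, and a real FPGA bitstream (with CRCs and device-specific framing) is far more involved.

```python
# Minimal sketch of the bitstream-relocation idea, using a simplified,
# hypothetical packet format rather than any vendor's real encoding.
FAR_WRITE = 0x30002001  # hypothetical "write frame address" command word

def relocate(bitstream_words, region_offset):
    """Rewrite every frame address so the same partial bitstream can be
    loaded into a different reconfigurable region."""
    out = list(bitstream_words)
    i = 0
    while i < len(out) - 1:
        if out[i] == FAR_WRITE:
            out[i + 1] += region_offset  # shift the target frame address
            i += 2
        else:
            i += 1
    return out

original = [0xAAAA5566, FAR_WRITE, 0x00000400, 0xDEADBEEF]
print([hex(w) for w in relocate(original, 0x00010000)])
```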
APA, Harvard, Vancouver, ISO, and other styles
34

Harris, Jeff R. "Processing and integration of geochemical data for mineral exploration: Application of statistics, geostatistics and GIS technology." Thesis, University of Ottawa (Canada), 2002. http://hdl.handle.net/10393/6421.

Full text
Abstract:
Geographic Information Systems (GIS), used in concert with statistical and geostatistical software, provide the geologist with a powerful tool for processing, visualizing and analysing geoscience data for mineral exploration applications. This thesis focuses on different methods for analysing, visualizing and integrating geochemical data sampled from various media (rock, till, soil, humus) with other types of geoscience data. Different methods are investigated for defining geochemical anomalies and for separating anomalies due to mineralization from those due to lithologic or surficial factors (i.e. true from false anomalies). With respect to lithogeochemical data, this includes methods to distinguish between altered and unaltered samples, methods (normalization) for separating lithologic from mineralization effects, and various statistical and visual methods for distinguishing anomalous geochemical concentrations from background. With respect to surficial geochemical data, methods for identifying bedrock signatures and scavenging effects are presented. In addition, a new algorithm, the dispersal train identification algorithm (DTIA), is presented, which helps to identify and characterize anisotropies in till data due to glacial dispersion and, more specifically, identifies potential dispersal trains using a number of statistical parameters. The issue of interpolation of geochemical data is addressed, and methods for determining whether geochemical data should or should not be interpolated are presented. New methods for visualizing geochemical data using red-green-blue (RGB) ternary displays are illustrated. Finally, techniques for integrating geochemical data with other geoscience data to produce mineral prospectivity maps are demonstrated. Both data-driven and knowledge-driven GIS modeling methodologies are used (and compared) for producing prospectivity maps. New ways of preparing geochemical data for input to modeling are demonstrated, with the aim of getting the most out of the data for mineral exploration purposes. Processing geochemical data by sub-populations, either by geographic unit (i.e., lithology) or by geochemical classification and alteration style, was useful for better identification of geochemical anomalies with respect to background and for assessing varying alteration styles. Normal probability plots of geochemical concentrations based on spatial (lithologic) divisions and Principal Component Analysis (PCA) were found to be particularly useful for identifying geochemical anomalies and for identifying associations between major oxide elements that in turn reflect different alteration styles. (Abstract shortened by UMI.)
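By way of illustration, a minimal sketch of two of the techniques named above, PCA for element associations and a percentile threshold for anomaly screening, could look like this; the data are synthetic and the element columns, threshold and workflow are invented, not the thesis's actual procedure.

```python
# Illustrative sketch: PCA to expose element associations, plus a simple
# percentile threshold to separate anomalies from background. Synthetic data.
import numpy as np

rng = np.random.default_rng(0)
# Rows = samples, columns = element concentrations (e.g. SiO2, Al2O3, K2O, Cu)
X = rng.lognormal(mean=1.0, sigma=0.4, size=(200, 4))
X[:5, 3] *= 8  # a few copper-enriched "anomalies"

# PCA via SVD on standardized data
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt.T  # principal-component scores per sample

# Flag anomalies: Cu values above the 98th percentile of the data set
threshold = np.percentile(X[:, 3], 98)
anomalous = np.where(X[:, 3] > threshold)[0]
print("explained variance ratio:", (S**2 / (S**2).sum()).round(2))
print("first PC scores:", scores[:3, 0].round(2))
print("anomalous sample indices:", anomalous)
```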
APA, Harvard, Vancouver, ISO, and other styles
35

Hofuku, Yoyoi, Shinya Cho, Tomohiro Nishida, and Susumu Kanemune. "Why is programming difficult? : proposal for learning programming in “small steps” and a prototype tool for detecting “gaps”." Universität Potsdam, 2013. http://opus.kobv.de/ubp/volltexte/2013/6445/.

Full text
Abstract:
In this article, we propose a model of the understanding process that learners can follow while studying programming. We focus on the “small step” method, in which students learn only a few new concepts per program, so that learning programming does not become overwhelming. We also analyze, on the basis of the model, how the order of presentation differs between several C programming textbooks. We developed a tool to detect “gaps” (programs that require many concepts to be learned at once) in programming textbooks.
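A toy sketch of the gap-detection idea, under the assumption that each program can be tagged with the set of concepts it requires (the concept tags and threshold below are invented):

```python
# Toy sketch: flag textbook programs that introduce many concepts at once.
textbook = [
    ("hello world", {"printf"}),
    ("variables", {"int", "assignment", "printf"}),
    ("loops", {"for", "while", "++", "scanf", "arrays"}),  # big jump
]

MAX_NEW_CONCEPTS = 3  # threshold for a "small step"
known = set()
for title, concepts in textbook:
    new = concepts - known
    if len(new) > MAX_NEW_CONCEPTS:
        print(f"gap at '{title}': {len(new)} new concepts {sorted(new)}")
    known |= concepts
```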
APA, Harvard, Vancouver, ISO, and other styles
36

Arora, Kush. "GIS AVADIT." Akron, OH : University of Akron, 2005. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=akron1113880406.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Kishore, Annapoorni. "AN INTERNSHIP WITH ENVIRONMENTAL SYSTEMS RESEARCH INSTITUTE." Miami University / OhioLINK, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=miami1209153230.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

麥淑嫻 and Shuk-han Ann Mak. "Automating knowledge acquisition and site-selection in a generic knowledge-based GIS system: a theoreticalstudy." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1999. http://hub.hku.hk/bib/B31240720.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

陳柏慧 and Pak-wai Patty Chan. "Applications of the GIS to urban design in Hong Kong." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2000. http://hub.hku.hk/bib/B31980338.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Ghous, Abid Petroleum Engineering Faculty of Engineering UNSW. "Digital formation evaluation via x-ray micro-computed tomography." Awarded by: University of New South Wales. School of Petroleum Engineering, 2005. http://handle.unsw.edu.au/1959.4/20581.

Full text
Abstract:
Machined fragments of 10 core plugs from offshore reservoirs have been analysed using a high resolution X-ray micro-computed tomography (micro-CT) facility. The facility includes a system capable of acquiring 3D images made up of 2000³ voxels on core plugs up to 6 cm in diameter with resolutions down to 2 µm. The cores analysed include six cores from a gas reservoir and four cores from an oil reservoir. The cores exhibit a very broad range of pore and grain sizes, porosity, permeability and mineralogy. The petrological data, available only for the gas reservoir cores, is compared with the data obtained from the tomographic images. Computational results made directly on the digitized tomographic images are presented for the permeability, formation factor, resistivity index and drainage capillary pressure across a range of porosities. We show that data over a range of porosity can be computed from a single fragment. We compare the computations of petrophysical data on fragments to conventional laboratory measurements on the full plug. Permeability predictions from digital and conventional core analysis are consistent. It is shown that a characteristic length scale can be defined as a quality control parameter for the estimation of permeability. Results for formation factor, drainage capillary pressure and resistivity index are encouraging. The results demonstrate the potential to predict petrophysical properties from core material not suited for laboratory testing (e.g., sidewall or damaged core and drill cuttings) and the feasibility of combining digitized images with numerical calculations to predict properties and derive correlations for specific rock lithologies. The small sample size required for analysis makes it possible to produce multiple measurements on a single plug. This represents a potential multiplier on the quantity of core data, allowing meaningful distributions or spreads in petrophysical properties to be estimated. We discuss the current limitations of the methodology and suggest improvements; in particular the need to calibrate the simulated data against parallel laboratory core measurements. We also describe the potential to extend the methodology to a wider range of petrophysical properties. This development could lead to a more systematic study of the assumptions, interpretations and analysis methods commonly applied within industry and lead to better correlations between petrophysical properties and log measurements.
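As a minimal illustration of one quantity computed from such images, porosity can be estimated directly from a segmented voxel volume as the fraction of pore voxels; the volume below is synthetic, not a real tomogram.

```python
# Minimal sketch: porosity from a segmented (binary) micro-CT volume.
import numpy as np

rng = np.random.default_rng(1)
# 1 = pore voxel, 0 = grain voxel; a real volume would come from segmentation
volume = (rng.random((64, 64, 64)) < 0.18).astype(np.uint8)

porosity = volume.mean()  # pore voxels / total voxels
print(f"estimated porosity: {porosity:.3f}")
```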
APA, Harvard, Vancouver, ISO, and other styles
41

拓也, 桑原, and Takuya Kuwahara. "Characterization of gas-liquid two-phase flow regimes using Magnetic fluid : setup, measurements, signal processing and data analysis." Thesis, https://doors.doshisha.ac.jp/opac/opac_link/bibid/BB10268912/?lang=0, 2008. https://doors.doshisha.ac.jp/opac/opac_link/bibid/BB10268912/?lang=0.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Griesbach, Christopher James. "Improving LiDAR Data Post-Processing Techniques for Archaeological Site Management and Analysis: A Case Study from Canaveral National Seashore Park." Scholar Commons, 2015. https://scholarcommons.usf.edu/etd/5491.

Full text
Abstract:
Methods used to process raw Light Detection and Ranging (LiDAR) data can sometimes obscure the digital signatures indicative of an archaeological site. This thesis explains the negative effects that certain LiDAR data processing procedures can have on the preservation of an archaeological site. This thesis also presents methods for effectively integrating LiDAR with other forms of mapping data in a Geographic Information Systems (GIS) environment in order to improve LiDAR archaeological signatures by examining several pre-Columbian Native American shell middens located in Canaveral National Seashore Park (CANA).
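As a small illustration of the kind of processing choice at issue, the sketch below grids a point cloud into a raster by keeping the lowest return per cell, a crude ground-surface approximation; the points, cell size and workflow are invented, not the thesis's procedure. Note how the cell size alone can smooth away subtle relief such as a low midden mound.

```python
# Hedged sketch: rasterize a LiDAR point cloud by minimum elevation per cell.
import numpy as np

rng = np.random.default_rng(2)
pts = rng.random((5000, 3)) * [100.0, 100.0, 5.0]  # x, y, z in metres

cell = 10.0  # raster resolution; coarse cells can smear subtle midden relief
nx = ny = int(100.0 / cell)
dem = np.full((ny, nx), np.nan)

cols = (pts[:, 0] // cell).astype(int).clip(0, nx - 1)
rows = (pts[:, 1] // cell).astype(int).clip(0, ny - 1)
for r, c, z in zip(rows, cols, pts[:, 2]):
    if np.isnan(dem[r, c]) or z < dem[r, c]:
        dem[r, c] = z  # keep the minimum elevation in each cell

print("DEM cell minima, first row:", np.round(dem[0], 2))
```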
APA, Harvard, Vancouver, ISO, and other styles
43

Merlo, Stefania. "Contextualising intra-site spatial analysis : the role of three-dimensional GIS modelling in understanding excavation data." Thesis, University of Cambridge, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.609386.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Shea, Geoffrey Yu Kai Surveying & Spatial Information Systems Faculty of Engineering UNSW. "A Web-Based Approach to the Integration of Diverse Data Sources for GIS." Awarded by: University of New South Wales. Surveying and Spatial Information Systems, 2001. http://handle.unsw.edu.au/1959.4/17855.

Full text
Abstract:
The vigorous development of GIS over the past decades has enabled application developers to create powerful systems that facilitate the management of spatial data. Unfortunately, each of these systems is specific to a local service, with little or no interconnection with services in other locales. This makes it virtually impossible to perform dynamic and interactive GIS operations across multiple locales with similar or dissimilar system configurations. The Spatial Data Transfer Standard (SDTS) partially resolved these problems by offering an excellent conceptual and logical abstraction model for data exchange. Recent advancements of the Internet have shown the GIS community how an ideal concept of information interchange could be realized. A suite of new technologies embracing Extensible Markup Language (XML), Scalable Vector Graphics (SVG), Portable Network Graphics (PNG) and Java creates a powerful new perspective that can be applied to all phases of online GIS system development. Online GIS is a Web-based approach to integrating diverse spatial data sources for GIS applications. To address the spatial data integration options and implications of the Web-based approach, the investigation was undertaken in five phases: (1) determine the mapping requirements of graphic and non-graphic spatial data for online GIS applications; (2) analyze the requirements of spatial data integration for online environments; (3) investigate a suitable method for integrating different formats of spatial data; (4) study the feasibility and applicability of setting up the online GIS; and (5) develop a prototype for online sharing of teaching resources. Following a critical review of current Internet technology, a conceptual framework for spatial data integration was proposed. This framework was based on the emerging Internet technologies XML, SVG, PNG and Java. It comprised four loosely coupled modules, namely the Application Interface, Presentation, Integrator and Data modules. This loosely coupled framework provides an environment that is independent of the underlying GIS data structure and makes it easy to change or update the system as new tasks or knowledge are acquired. A feasibility study was conducted to test the applicability of the proposed conceptual framework. A detailed set of user requirements and a system specification were devised from the feasibility study. These provided guidelines for online GIS application development, expressed in terms of six aspects: (1) users; (2) teaching resources management; (3) data; (4) cartography; (5) functions; and (6) software development configuration. A prototype system based on some of the devised system specifications was developed. In the prototype software design, a three-tier client-server architecture was adopted. Owing to the lack of native support for SVG and PNG in the Web browsers available at the time, the prototype was implemented in HTML, Java and a vendor-specific vector format. The prototype demonstrated how teaching resources from a variety of sources and formats (including map data and non-map resources) were integrated and shared. The implementation of the prototype revealed that the Web remains an ideal medium for cost-effectively providing wider access to geographical information for a large number of users through a corporate intranet or the Internet.
The investigation concluded that current WWW technology is limited in its capability for spatial data integration and for delivering online functionality. However, the development of an XML-based GIS data model and of the SVG and PNG graphics standards for structuring and transferring spatial data on the Internet appears to offer solutions to these limitations. It is believed that the ideal of everyone retrieving spatial information contextually through a Web browser, regardless of the information's format and location, will eventually become reality.
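As a small illustration of the presentation layer the framework proposes, the sketch below emits a map polygon as SVG; the coordinates and styling are invented.

```python
# Simple sketch: emit map geometry as SVG, one of the standards the thesis
# proposes for online GIS. Coordinates and styling are hypothetical.
def polygon_to_svg(coords, width=200, height=200):
    points = " ".join(f"{x},{y}" for x, y in coords)
    return (
        f'<svg xmlns="http://www.w3.org/2000/svg" '
        f'width="{width}" height="{height}">'
        f'<polygon points="{points}" fill="lightblue" stroke="navy"/>'
        f"</svg>"
    )

parcel = [(10, 10), (180, 20), (160, 170), (30, 150)]
print(polygon_to_svg(parcel))
```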
APA, Harvard, Vancouver, ISO, and other styles
45

Yip, Kin-man Ernest, and 葉健文. "Developing a city skyline for Hong Kong using GIS and urban design guidelines." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2003. http://hub.hku.hk/bib/B27049802.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Harper, Robert T. "Determination of the proton affinities of gas phase peptides by mass spectrometry and computational chemistry." Scholarly Commons, 2007. https://scholarlycommons.pacific.edu/uop_etds/673.

Full text
Abstract:
Helices in proteins have substantial permanent dipole moments arising from the nearly perfect alignment of the individual dipole moments of each peptide bond. Interaction with this helix "macrodipole" is thought to perturb the pKa values of basic or acidic residues at the helix termini. The goal of this project is to investigate the effect of the helix conformation on the proton affinities of basic amino acids placed at the N- or C-terminus of helical model peptides in the gas phase. Several series of model peptides having a basic residue, lysine (K) or 2,3-diaminopropionic acid (Dap), located at either terminus were synthesized by solid phase peptide synthesis using conventional techniques or the amino acid fluoride approach. Proton affinities were determined for several basic amino acids and peptides using mass spectrometry by applying the extended Cooks' kinetic method. Favorable conformations and theoretical proton affinities were probed using computational chemistry. The proton affinities determined for Nα-acetyl-(L)-lysine, Ac-AK, Ac-KA, and Ac-KAA are 236.8 ± 1.9 kcal mol⁻¹, 249.4 ± 2.0 kcal mol⁻¹, 241.5 ± 1.9 kcal mol⁻¹, and 244.4 ± 2.0 kcal mol⁻¹ respectively. The large negative entropy changes for each of the peptides upon protonation (−11.2 to −21.7 cal mol⁻¹ K⁻¹) are consistent with globular conformations adopted by the protonated peptides due to extensive intramolecular hydrogen bonding. The measured proton affinities of the peptides increased with the size of the peptide as expected. However, the measured proton affinity of the peptide with C-terminal lysine, Ac-AK, is substantially higher than that of the corresponding peptide with N-terminal lysine, Ac-KA, contrary to expectations. Proton affinities determined for these compounds using computational chemistry are in reasonable agreement with experimental results. Additionally, proton affinities calculated for helical polyalanine and Aib (α-aminoisobutyric acid) modified polyalanine peptides with C-terminal basic residues (Ac-AnK and Ac-(AibA)n-Dap) are much larger than proton affinities calculated for the corresponding peptides with N-terminal basic residues. These results indicate that the helix dipole has a substantial effect on the basicity of residues at the helix termini.
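For context, the simplest (entropy-free) form of the kinetic method on which these measurements build relates the fragment-ion intensities from a dissociating proton-bound dimer [B1·H·B2]+ to the proton affinity difference; the extended method used in the thesis adds an entropy correction, consistent with the large negative entropy changes reported above.

```latex
\ln\frac{I_{[B_1\mathrm{H}]^+}}{I_{[B_2\mathrm{H}]^+}}
  \;\approx\;
\frac{\mathrm{PA}(B_1)-\mathrm{PA}(B_2)}{R\,T_{\mathrm{eff}}}
```

Here T_eff is the effective temperature of the activated dimer.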
APA, Harvard, Vancouver, ISO, and other styles
47

Breytenbach, Andre. "GIS-based land suitability assessment and allocation decision-making in a degraded rural environment." Thesis, Stellenbosch : University of Stellenbosch, 2011. http://hdl.handle.net/10019.1/16599.

Full text
Abstract:
Thesis (MSc)--University of Stellenbosch, 2006.
ENGLISH ABSTRACT: Rural development problems faced by the impoverished communities of the Transkei, South Africa, are numerous, and environmental degradation has already taken much of its toll. By working at micro-catchment level, both the socio-economic and biophysical appreciation of the land resources were captured, as encapsulated in the concept of resource management domains. Participatory decision-making allowed functional land use goals and evaluation criteria to be incorporated into computerised multi-criteria evaluation and multi-objective land use allocation models in order to reach an idealised or more sustainable land use situation. In the execution of the decision-making process seven procedural steps were followed; these are discussed in detail and applied in the case study. Synthesis of the results emphasised the envisaged rural planning potential of the methods used.
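To make the multi-criteria evaluation step concrete, a minimal weighted-linear-combination sketch follows; the criterion layers, weights and grid are invented, not the study's actual model.

```python
# Illustrative sketch: multi-criteria evaluation as a weighted linear
# combination of standardized criterion layers. Layers/weights are invented.
import numpy as np

rng = np.random.default_rng(3)
slope = rng.random((50, 50))        # standardized 0..1 (1 = most suitable)
soil_quality = rng.random((50, 50))
water_access = rng.random((50, 50))

weights = {"slope": 0.2, "soil": 0.5, "water": 0.3}  # must sum to 1
suitability = (
    weights["slope"] * slope
    + weights["soil"] * soil_quality
    + weights["water"] * water_access
)
print("best cell:", np.unravel_index(suitability.argmax(), suitability.shape))
```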
APA, Harvard, Vancouver, ISO, and other styles
48

Ho, Lee-kin Joe, and 何利堅. "Incorporating GIS and CAD technologies in the modelling of three-dimensional urban landscape of Hong Kong." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1999. http://hub.hku.hk/bib/B42575229.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Schreiber, Werner. "GIS and EUREPGAP : applying GIS to increase effective farm management in accordance with GAP requirements." Thesis, Stellenbosch : Stellenbosch University, 2003. http://hdl.handle.net/10019.1/53440.

Full text
Abstract:
Thesis (MSc)--Stellenbosch University, 2003.
ENGLISH ABSTRACT: With the inception of precision farming techniques during the last decade, agricultural efficiency has improved, leading to greater productivity and enhanced economic benefits associated with agriculture. The awareness of health risks associated with food borne diseases has also increased. Systems such as Hazard Analysis and Critical Control Points (HACCP) in the USA and Good Agricultural Practices (GAP) in Europe are trying to ensure that no food showing signs of microbial contamination associated with production techniques is allowed onto the export market. Growers participating in exporting are thus being forced to conform to the requirements set by international customers. The aim of this study was to compile a computerized record keeping system that would aid farmers with the implementation of GAP on farms, by making use of GIS capabilities. A database consisting of GAP-specific data was developed. ArcView GIS was used to implement the database, while customized analysis procedures written in Avenue assisted in GAP-specific farming decisions. An agricultural area focusing on the export market was needed for this study, and the nut-producing Levubu district was identified as ideal. By making use of ArcView GIS, distinct relationships between different data sets were portrayed in tabular, graphical, geographical and report format. GAP requirements state that growers must base decisions on timely, relevant information. With information available in the above-mentioned formats, decisions regarding actions taken can be justified. By analysing the complex interaction between datasets, the influences that agronomical inputs have on production were portrayed, moving beyond the standard requirements of GAP. Agricultural activities produce enormous quantities of data, and GIS proved to be an indispensable tool because of its ability to analyse and manipulate data with a spatial component. The implementation of good agricultural practices lends itself to the use of GIS. With the correct information available at the right time, better decisions can promote optimal cropping, whilst minimizing the negative effects on the consumer and environment.
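As a hypothetical example of the kind of timely record check such a system could automate (pre-harvest intervals are a common GAP control, though not one named in the abstract):

```python
# Hypothetical sketch: flag orchard blocks where the pre-harvest interval
# (PHI) of a spray application has not yet elapsed. Records are invented.
from datetime import date, timedelta

sprays = [  # (block_id, product, application_date, phi_days)
    ("B1", "fungicide-X", date(2024, 3, 1), 28),
    ("B2", "insecticide-Y", date(2024, 3, 20), 3),
]

harvest_date = date(2024, 3, 25)
for block, product, applied, phi in sprays:
    if applied + timedelta(days=phi) > harvest_date:
        print(f"block {block}: {product} PHI not met before {harvest_date}")
```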
APA, Harvard, Vancouver, ISO, and other styles
50

Thorell, Marcus, and Mattias Andersson. "Implementering av HAZUS-MH i Sverige : Möjligheter och hinder." Thesis, Karlstads universitet, Fakulteten för hälsa, natur- och teknikvetenskap (from 2013), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-72666.

Full text
Abstract:
When modeling risks from natural disasters, GIS is a fundamental tool. HAZUS-MH is a GIS-based risk analysis tool developed by the American authority FEMA. HAZUS-MH has a well-developed methodology for modeling natural disasters, which is something that is demanded at the European level within the flood directive framework. Hence, there is an interest in implementing HAZUS-MH for non-US conditions. The aim of the study is to deepen the knowledge needed for the implementation and use of HAZUS-MH in Sweden. To enable implementation, Swedish data must be processed to match the data structure of HAZUS-MH. The methods of this study are a literature review of previous studies and manuals, and data processing. Experiences from the data processing were collected to build a manual for data processing and to evaluate opportunities and obstacles for implementation in Sweden. The result shows what the system requirements and other settings for using HAZUS-MH look like; the other settings include the connection to the HAZUS-MH database, et cetera. For the adaptation of Swedish data, the requirements, including data (administrative division, inventory data and hydrological data), data processing (a recommended workflow for filling shapefiles and attribute tables with information) and data import, are described. The result also describes the application of HAZUS-MH with Swedish data. This study identifies several possibilities of HAZUS-MH, of which the opportunities for creating risk and vulnerability maps and for data import are the largest. The time required to perform the adaptation of Swedish data was approximately 15 working days. This study estimates that, with the help of manuals for the adaptation, this time could be shortened to approximately 3 working days. If the process of adapting data were automated, this time could be shortened further. The largest obstacle identified in this study is the data collection process: to use the full potential of HAZUS-MH, extensive data collection is needed. Another obstacle is the limitation of hydrological data; external hydrological data is necessary for as accurate an analysis as possible. Further research in the field should, according to this study, focus on methods of collecting data and on the development of an automatic process for managing data.
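A hedged sketch of the data-adaptation step, reprojecting a source layer and renaming fields to a target schema with geopandas, might look as follows; the paths, field names and target schema are hypothetical and do not reflect the actual HAZUS-MH specification.

```python
# Hedged sketch: reshape a Swedish inventory shapefile so its fields and
# projection match what a HAZUS-MH import expects. Paths, field names and
# the target schema are hypothetical, not the real HAZUS-MH specification.
import geopandas as gpd

buildings = gpd.read_file("swedish_buildings.shp")  # hypothetical input

# Reproject to the coordinate system the target database expects.
buildings = buildings.to_crs(epsg=4326)

# Map local attribute names onto the (hypothetical) target schema.
field_map = {"byggnad_id": "BldgID", "anvandning": "Occupancy", "yta": "Area"}
buildings = buildings.rename(columns=field_map)

buildings.to_file("hzBldg.shp")  # ready for import into the HAZUS database
```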
APA, Harvard, Vancouver, ISO, and other styles