Dissertations / Theses on the topic 'Command and Data Handling'

Consult the top 50 dissertations / theses for your research on the topic 'Command and Data Handling.'

1

Lokken, Patrick Bucknam. "Command and data handling systems for a multi-instrument sounding rocket payload." Thesis, Montana State University, 2011. http://etd.lib.montana.edu/etd/2011/lokken/LokkenP0511.pdf.

Full text
Abstract:
To improve our physical understanding of the solar transition region, the Multi-Order Solar EUV Spectrograph (MOSES) has been developed at Montana State University to perform imaging spectroscopy of the sun at extreme ultraviolet wavelengths. Launched first in 2006, the instrument performed simultaneous imaging and spectroscopy over a narrow band centered around 30.4 nm. The amount of science that can be accomplished with the instrument is limited by its optical bandwidth, which must be small to reduce ambiguity in its data. This limitation can be overcome by launching an array of instruments operating at different wavelengths, piecing together a comprehensive view of transition region activity on a wide bandwidth. However, the command and data handling (C&DH) system on the current MOSES payload cannot support multiple instruments, and a new system must be built to support the addition of more instrumentation. To this end, designs based on a centralized computer topology and a distributed computer topology have been created for a new C&DH system to support the existing MOSES instrument, an EUV Snapshot Imaging Spectrograph (ESIS) and an improved guide scope. It was found that the frame rate of the entire electro-optical system was limited mostly by the low sensitivity of the optics, and that the choice between the two is driven by the mass of the system and the amount of engineering effort required to build the system. The centralized system was found to be both lighter and easier to build, and therefore a centralized system is recommended for the second revision of the MOSES payload.
2

Olsen, Douglas. "Implementation of CCSDS Telemetry and Command Standards for the Fast Auroral Snapshot (FAST) Small Explorer Mission." International Foundation for Telemetering, 1993. http://hdl.handle.net/10150/611870.

Full text
Abstract:
International Telemetering Conference Proceedings / October 25-28, 1993 / Riviera Hotel and Convention Center, Las Vegas, Nevada
Recommendations of the Consultative Committee for Space Data Systems (CCSDS) provide a standard approach for implementing spacecraft packet telemetry and command interfaces. The Fast Auroral Snapshot (FAST) Small Explorer mission relies heavily on the CCSDS virtual channel and packetization concepts to achieve near real-time commanding and distribution of telemetry between separate space borne science and spacecraft processors and multiple ground stations. Use of the CCSDS recommendations allows the FAST mission to realize significant re-use of ground systems developed for the first Small Explorer mission, and also simplifies system interfaces and interactions between flight software developers, spacecraft integrators, and ground system operators.
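The packetization and virtual channel concepts referred to above are easiest to picture at the level of the CCSDS space packet header. The sketch below packs the standard 6-byte primary header; it follows the generic CCSDS space packet layout rather than any FAST-specific implementation, and the APID, sequence count and payload are arbitrary illustrative values.

```python
import struct

def ccsds_primary_header(apid, seq_count, data, packet_type=0, sec_hdr=0):
    """Pack a 6-byte CCSDS space packet primary header (generic layout)."""
    version = 0
    word1 = (version << 13) | (packet_type << 12) | (sec_hdr << 11) | (apid & 0x7FF)
    word2 = (0b11 << 14) | (seq_count & 0x3FFF)   # '11' = unsegmented user data
    word3 = len(data) - 1                         # data length field is (octets - 1)
    return struct.pack(">HHH", word1, word2, word3)

# Example: a small housekeeping telemetry packet on an arbitrary APID
payload = bytes(range(16))
packet = ccsds_primary_header(apid=0x123, seq_count=42, data=payload) + payload
print(packet.hex())
```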
3

DeBoy, Christopher C., Paul D. Schwartz, and Richard K. Huebschman. "Midcourse Space Experiment Spacecraft and Ground Segment Telemetry Design and Implementation." International Foundation for Telemetering, 1996. http://hdl.handle.net/10150/608390.

Full text
Abstract:
International Telemetering Conference Proceedings / October 28-31, 1996 / Town and Country Hotel and Convention Center, San Diego, California
This paper reviews the performance requirements that provided the baseline for development of the onboard data system, RF transmission system, and ground segment receiving system of the Midcourse Space Experiment (MSX) spacecraft. The onboard Command and Data Handling (C&DH) System was designed to support the high data outputs of the three imaging sensor systems onboard the spacecraft and the requirement for large volumes of data storage. Because of the high data rates, it was necessary to construct a dedicated X-band ground receiver system at The Johns Hopkins University Applied Physics Laboratory (APL) and implement a tape recorder system for recording and downlinking sensor and spacecraft data. The system uses two onboard tape recorders to provide redundancy and backup capabilities. The storage capability of each tape recorder is 54 gigabits. The MSX C&DH System can record data at 25 Mbps or 5 Mbps. To meet the redundancy requirements of the high-priority experiments, the data can also be recorded in parallel on both tape recorders. To provide longer onboard recording, the data can also be recorded serially on the two recorders. The reproduce (playback) mode is at 25 Mbps. A unique requirement of the C&DH System is to multiplex and commutate the different output rates of the sensors and housekeeping signals into a common data stream for recording. The system also supports 1-Mbps real-time sensor data and 16-kbps real-time housekeeping data transmission to the dedicated ground site and through the U.S. Air Force Satellite Control Network ground stations. The primary ground receiving site for the telemetry is the MSX Tracking System (MTS) at APL. A dedicated 10-m X-band antenna is used to track the satellite during overhead passes and acquire the 25-Mbps telemetry downlinks, along with the 1-Mbps and 16-kbps real-time transmissions. This paper discusses some of the key technology trade-offs that were made in the design of the system to meet requirements for reliability, performance, and development schedule. It also presents some of the lessons learned during development and the impact these lessons will have on development of future systems.
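As a quick sanity check of the figures quoted above, the recording time per 54-gigabit tape recorder follows directly from the record rates (a back-of-the-envelope sketch assuming decimal gigabits and Mbps; the actual MSX bookkeeping conventions may differ):

```python
capacity_bits = 54e9                      # one tape recorder, 54 gigabits

for rate_bps in (25e6, 5e6):              # the two quoted record rates
    minutes = capacity_bits / rate_bps / 60.0
    print(f"record at {rate_bps / 1e6:.0f} Mbps -> about {minutes:.0f} min per recorder")

# record at 25 Mbps -> about 36 min per recorder
# record at 5 Mbps -> about 180 min per recorder
```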
4

Landberg, Fredrik. "Flexible role-handling in command and control systems." Thesis, Linköping University, Department of Electrical Engineering, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-7880.

Full text
Abstract:

In organizations, the permissions a member holds are determined not by who the person is, but by the functions they hold within the organization. This is also the approach taken in military command and control systems. Military operations are often characterized by friction and uncontrollable factors; people being absent when needed is one such problem.

This thesis has examined how roles are handled in three Swedish command and control systems. The result is a model for handling vacant roles, with the possibility, in some situations, of overriding ordinary rules.

5

Rajyalakshmi, P. S., and R. K. Rajangam. "Data Handling System for IRS." International Foundation for Telemetering, 1987. http://hdl.handle.net/10150/615329.

Full text
Abstract:
International Telemetering Conference Proceedings / October 26-29, 1987 / Town and Country Hotel, San Diego, California
The three-axis stabilized Indian Remote Sensing Satellite will image the earth from a 904 km polar sun-synchronous orbit. The payload is a set of CCD cameras which collect data in four bands in the visible and near-infrared region. The payload data from two cameras, each at 10.4 megabits per second, is transmitted as balanced QPSK in X-band. Before transmission the payload data is formatted using major and minor frame synchronizing codes. The two formatted data streams are differentially encoded to resolve the four-phase ambiguity introduced by QPSK transmission. This paper describes the design and development aspects of such a Data Handling System. It also highlights the environmental qualification tests that were carried out to meet the requirement of a three-year operational life for the satellite.
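Differential encoding resolves carrier phase ambiguity because information is carried in transitions between symbols rather than in absolute symbol values. The sketch below shows the one-dimensional idea only; the actual IRS encoder works jointly on the two QPSK streams to remove the full four-phase ambiguity, and all names here are illustrative.

```python
def diff_encode(bits, initial=0):
    """Encode so that a '1' is a transition and a '0' is no transition."""
    out, prev = [], initial
    for b in bits:
        prev = prev ^ b
        out.append(prev)
    return out

def diff_decode(bits, initial=0):
    """Recover the data by comparing each received bit with the previous one."""
    out, prev = [], initial
    for b in bits:
        out.append(prev ^ b)
        prev = b
    return out

data = [1, 0, 1, 1, 0, 0, 1]
encoded = diff_encode(data)
inverted = [b ^ 1 for b in encoded]        # a constant 180-degree phase error
print(diff_decode(encoded))                # [1, 0, 1, 1, 0, 0, 1]
print(diff_decode(inverted))               # identical except possibly the first bit
```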
6

Benoit, Éric. "Capteurs symboliques et capteurs flous : un nouveau pas vers l'intelligence." Grenoble 1, 1993. http://www.theses.fr/1993GRE10003.

Full text
Abstract:
The evolution of control systems that integrate artificial intelligence techniques has led us to define a new type of sensor that can be interfaced directly with such systems. Going further in this direction, and in order to integrate the most advanced functions of the intelligent sensor concept, we propose to equip the sensor with a decision-making component that itself uses artificial intelligence techniques. The objective of this work is therefore to define the foundations of these new types of sensors, called symbolic sensors. A symbolic sensor is an interactive digital sensor that can produce, receive and manipulate information in symbolic form and can therefore be used directly by knowledge-based control systems. Fuzzy symbolic sensors, or simply fuzzy sensors, are considered a special case of symbolic sensors. They manipulate gradual information represented by fuzzy subsets and can, for example, be interfaced with fuzzy expert systems or fuzzy controllers. A study of intelligent sensors is first carried out, based on a synthesis of existing concepts on the one hand and on the elaboration of a global definition on the other. A formalism and a set of tools are then proposed for creating and using symbolic and fuzzy sensors. A language for manipulating these tools, called PLICS, has been defined, and its compiler, intended to be embedded in these sensors, has been implemented. Finally, an example of a fuzzy sensor for colour recognition validates this work.
7

Serbessa, Yonatan Kebede. "Handling Data Flows of Streaming Internet of Things Data." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-302102.

Full text
Abstract:
Streaming data in various formats is generated at very high rates and must be processed and analyzed before it loses its value. Existing technology provides the tools to process these data and extract more meaningful information from them. This thesis has two parts: theoretical and practical. The theoretical part investigates which tools are suitable for stream data flow processing and analysis. It starts by studying one of the main sources of large volumes of streaming data, the Internet of Things, covering the technologies behind it, common use cases, challenges, and solutions. This is followed by an overview of selected tools, namely Apache NiFi, Apache Spark Streaming and Apache Storm, examining their key features, main components, and architecture. After the tools are studied, five parameters are selected to review how each tool handles them, which can be useful when choosing a tool for a given use case. The second part of the thesis involves Twitter data analysis using Apache NiFi, one of the tools studied. The purpose is to show how NiFi can be used for processing data from ingestion through to delivery to storage systems, and how it communicates with external storage, search, and indexing systems.
8

Hanslo, Monique. "Techniques for handling clustered binary data." Master's thesis, University of Cape Town, 2002. http://hdl.handle.net/11427/6950.

Full text
Abstract:
Bibliography : leaves 143-153.
Over the past few decades there has been increasing interest in clustered studies and hence much research has gone into the analysis of data arising from these studies. It is erroneous to treat clustered data, where observations within a cluster are correlated with each other, as one would treat independent data. It has been found that point estimates are not as greatly affected by clustering as are the standard deviations of the estimates. But as a consequence, confidence intervals and hypothesis testing are severely affected. Therefore one has to approach the analysis of clustered data with caution. Methods that specifically deal with correlated data have been developed. Analysis may be further complicated when the outcome variable of interest is binary rather than continuous. Methods for estimation of proportions, their variances, calculation of confidence intervals and a variety of techniques for testing the homogeneity of proportions have been developed over the years (Donner and Klar, 1993; Donner, 1989; and Rao and Scott, 1992). The methods developed within the context of experimental design generally involve incorporating the effect of clustering in the analysis. This cluster effect is quantified by the intracluster correlation and needs to be taken into account when estimating proportions, comparing proportions and in sample size calculations. In the context of observational studies, the effect of clustering is expressed by the design effect, which is the inflation in the variance of an estimate that is due to selecting a cluster sample rather than an independent sample. Another important aspect of the analysis of complex sample data that is often neglected is sampling weights. One needs to recognise that each individual may not have the same probability of being selected. These weights adjust for this fact (Little et al, 1997). Methods for modelling correlated binary data have also been discussed quite extensively. Among the many models which have been proposed for analyzing binary clustered data are two approaches which have been studied and compared: the population-averaged and cluster-specific approach. The population-averaged model focuses on estimating the effect of a set of covariates on the marginal expectation of the response. One example of the population-averaged approach for parameter estimation is known as generalized estimating equations, proposed by Liang and Zeger (1986). It involves assuming that elements within a cluster are independent and then imposing a correlation structure on the set of responses. This is a useful application in longitudinal studies where a subject is regarded as a cluster. Then the parameters describe how the population-averaged response rather than a specific subject's response depends on the covariates of interest. On the other hand, cluster specific models introduce cluster to cluster variability in the model by including random effects terms, which are specific to the cluster, as linear predictors in the regression model (Neuhaus et al, 1991). Unlike the special case of correlated Gaussian responses, the parameters for the cluster specific model obtained for binary data describe different effects on the responses compared to those obtained from the population-averaged model. For longitudinal data, the parameters of a cluster-specific model describe how a specific individual's probability of a response depends on the covariates. The decision to use either of these modelling methods depends on the questions of interest.
Cluster-specific models are useful for studying the effects of cluster-varying covariates and when an individual's response rather than an average population's response is the focus. The population-averaged model is useful when interest lies in how the average response across clusters changes with covariates. A criticism of this approach is that there may be no individual with the characteristics of the population-averaged model.
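The design effect mentioned above has a simple standard form, DEFF = 1 + (m − 1)ρ, where m is the average cluster size and ρ the intracluster correlation. A small sketch of how it inflates the standard error of an estimated proportion (the sample sizes and ICC below are invented for illustration):

```python
def design_effect(mean_cluster_size, icc):
    """Variance inflation due to cluster sampling: DEFF = 1 + (m - 1) * rho."""
    return 1.0 + (mean_cluster_size - 1.0) * icc

def se_of_proportion(p, n_total, mean_cluster_size=1, icc=0.0):
    """Standard error of a proportion, inflated by the design effect."""
    deff = design_effect(mean_cluster_size, icc)
    return (p * (1.0 - p) / n_total * deff) ** 0.5

naive = se_of_proportion(0.3, 1000)                                   # ignores clustering
proper = se_of_proportion(0.3, 1000, mean_cluster_size=20, icc=0.05)
print(round(naive, 4), round(proper, 4))   # clustering inflates the SE by roughly 40%
```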
9

Smith, Gene A. "Space Station-Era Ground Data Handling." International Foundation for Telemetering, 1989. http://hdl.handle.net/10150/614694.

Full text
Abstract:
International Telemetering Conference Proceedings / October 30-November 02, 1989 / Town & Country Hotel & Convention Center, San Diego, California
To support the Space Station-era space data flows through ground facilities, plans to handle peak return link data rates ranging from 300 Mbps to 1200 Mbps and average rates growing from 50 Mbps to hundreds of Mbps are being made. These numbers represent orders of magnitude greater rates than are handled today. Simply relaying the data to destinations, however, is not sufficient (nor so straightforward). Because of multiplexing of data, on-board tape recording and playback, noise, and other problems with the space-to-ground link, these data must be reassembled for users into the sequences in which the data were originally produced on-board with error checking, retransmission, correction, or flagging as required to eliminate or tag erroneous data. In the past these services (called Level Zero Processing) have required large operations staffs and have involved delays of 30 to 90 days for final formatting and shipping of data tapes to users. NASA's expectations for improving the SS-era operations depend on providing time ordered, error corrected or flagged data sets with no redundant data packets within 24 hours of receipt on the ground with backup of data for one week. These data sets would be transmitted electronically to data centers for higher level processing and would require no more operations personnel than are required today for systems processing less than 1/100 of the data. To support a variety of user requirements, some of the data will be provided in real time or, if recorded on-board, as priority playback data. Other data sets will be created from on-board system engineering or housekeeping data combined with attitude, position, and time parameters into ancillary data packets. On the ground enhancement of the on-board ancillary data packets will provide standard calibrations and transformations not available on-board. Remote access to an interactive ancillary database will allow users to select and withdraw specified parameters based on user-defined criteria. The collection of these services is referred to as ground data handling and will be a critical component of the Space Station-era ground data operations and mission management system under development at Goddard Space Flight Center for NASA institutional support of Space Station-compatible missions. Challenges represented by this need for more ground processing capability include: * High speed, high rate multipath processors capable of continuous, real-time operation. * High volume data storage systems with high rate data ingest, rapid access to separate segments of data sets, and high rate data output. * Sophisticated information and system management services to provide system configuration monitoring and control, user support, and minimal human interaction. * Interactive database structures with traceable parameter updating and self-identified, standard data set formatting.
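Stripped to its essentials, the Level Zero Processing described here amounts to restoring the original packet order, removing redundant packets from overlapping real-time and playback streams, and flagging data that cannot be corrected. A deliberately simplified sketch (the packet fields are assumptions; sequence-counter wrap-around, gap accounting and ancillary data handling are not modelled):

```python
from collections import namedtuple

Packet = namedtuple("Packet", "apid seq_count data crc_ok")

def level_zero_process(packets):
    """Toy Level Zero Processing: time-order, de-duplicate and flag packets."""
    seen = {}
    for pkt in packets:
        key = (pkt.apid, pkt.seq_count)
        if key not in seen:                  # drop redundant playback copies
            seen[key] = pkt
    ordered = [seen[key] for key in sorted(seen)]
    flagged = [p for p in ordered if not p.crc_ok]
    return ordered, flagged

pkts = [Packet(5, 2, b"b", True), Packet(5, 1, b"a", True),
        Packet(5, 2, b"b", True), Packet(5, 3, b"c", False)]
ordered, flagged = level_zero_process(pkts)
print([p.seq_count for p in ordered], len(flagged))   # [1, 2, 3] 1
```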
10

Barrera, Raymond C. "Command and Control Data dissemination using IP multicast." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1999. http://handle.dtic.mil/100.2/ADA376707.

Full text
Abstract:
Thesis (M.S. in Software Engineering) Naval Postgraduate School, December 1999.
"December 1999". Thesis advisor(s): Bert Lundy. Includes bibliographical references (p. 75-76). Also available online.
11

Schwier, Jason Montgomery. "Pattern recognition for command and control data systems." Clemson University, 2009. http://etd.lib.clemson.edu/documents/1252424695/.

Full text
12

Gordon, Michael. "SGLS COMMAND DATA ENCODING USING DIRECT DIGITAL SYNTHESIS." International Foundation for Telemetering, 1992. http://hdl.handle.net/10150/608937.

Full text
Abstract:
International Telemetering Conference Proceedings / October 26-29, 1992 / Town and Country Hotel and Convention Center, San Diego, California
The Space Ground Link Subsystem (SGLS) provides full duplex communications for commanding, tracking, telemetry and ranging between spacecraft and ground stations. The up-link command signal is an S-band carrier phase modulated with the frequency shift keyed (FSK) command data. The command data format is a ternary (S, 1, 0) signal. Command data rates of 1, 2, and 10 kbps are used. The method presented uses direct digital synthesis (DDS) to generate the SGLS command data and clock signals. The ternary command data and clock signals are input to the encoder, and an FSK subcarrier with an amplitude modulated clock is digitally generated. The command data rate determines the frequencies of the S, 1, 0 tones. DDS ensures that phase continuity will be maintained, and frequency stability will be determined by the microprocessor crystal accuracy. Frequency resolution can be maintained to within a few Hz from DC to over 2 MHz. This allows for the generation of the 1 and 2 kbps command data formats as well as the newer 10 kbps format. Additional formats could be accommodated through software modifications. The use of digital technology provides for encoder self-testing and more comprehensive error reporting.
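The heart of such an encoder is a phase accumulator whose increment (the tuning word) is switched according to the current command symbol; because the accumulator is never reset, the output tone is phase-continuous across symbol boundaries. A minimal sketch with placeholder sample rates and tone frequencies (not the actual SGLS tone plan):

```python
import math

def dds_fsk(symbols, tone_freqs, f_clk=1.0e6, samples_per_symbol=100):
    """Toy DDS FSK generator: one phase accumulator, per-symbol tuning words."""
    acc_bits = 32
    acc, out = 0, []
    for sym in symbols:
        tuning_word = int(tone_freqs[sym] * (1 << acc_bits) / f_clk)
        for _ in range(samples_per_symbol):
            acc = (acc + tuning_word) % (1 << acc_bits)
            out.append(math.sin(2.0 * math.pi * acc / (1 << acc_bits)))
    return out

# Ternary command symbols with made-up tone frequencies in Hz
samples = dds_fsk(["S", "1", "0", "1"], {"S": 65_000, "1": 95_000, "0": 76_000})
print(len(samples))
```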
13

Portell i de Mora, Jordi. "Payload data handling, telemetry and data compression systems for Gaia." Doctoral thesis, Universitat Politècnica de Catalunya, 2005. http://hdl.handle.net/10803/6585.

Full text
Abstract:
Gaia, ESA's new astrometric mission with a launch planned for 2011, will observe more than a billion stars and other objects with unprecedented accuracy. Its ambitious goals go far beyond those of rival missions from other agencies. At the end of its operational life it will have produced the largest and most complete three-dimensional map of our Galaxy.
A mission like this demands major technological and design efforts, since hundreds of stars must be detected, selected and measured every second, and their data then transmitted to the Earth from more than one and a half million kilometres away. This thesis focuses on this side of the mission, proposing designs for the payload data handling, science telemetry and data compression systems. Our final goal is to make it possible to transmit to the ground station the immense amount of data generated by the instruments, taking into account the limited capacity of the communications channel. This requires the design of a lossless data compression system that offers the best compression ratios and guarantees the integrity of the transmitted data. All of this poses a great challenge for information theory methods and for the design of data compression systems.
These technological aspects had yet to be studied, or only preliminary drafts were available, since the mission itself was at a preliminary stage when we started this thesis. Our work has therefore been received with enthusiasm by scientists and engineers of the project.
We first review the operational environment of our study, described in the first part of the thesis. This includes the reference systems and conventions that we have proposed in order to unify measurements, data references and designs. This proposal has been used as an initial reference in the mission and is currently being extended and improved by other scientists. We have also compiled the main characteristics of the astrometric instrument (on which we have focused our study) and reviewed its operational guidelines, which has also been taken into account by other teams.
In the second part of the thesis we describe our proposal for the Gaia payload data handling system, which has been used to present the scientific requirements to the industrial teams and represents in itself a feasible (although simplified) implementation option. In the following part we study the science telemetry, compiling the data fields to be generated by the instruments and proposing an optimized coding and transmission scheme, which reduces the occupation of the communications channel and is ready to include an optimized data compression system. The latter is described in the fourth and last part of the thesis, where we show that our proposal almost completely fulfils the compression requirements, doubling the compression ratios offered by the best standard systems. Our design represents the best solution currently available for Gaia and its performance has been adopted as the baseline by other teams.
The results of our work go beyond the publication of a thesis report, complementing it with software applications that we have developed to help us design, optimize and verify the operation of the systems proposed here. The complexity of our work was also increased by the need to continuously update it to the changes in the mission design during the five years of the doctorate. Finally, we are satisfied with the results of our work, since most of them have been (or are being) taken into account by many teams involved in the mission and by the European Space Agency itself in the final design.
14

Sun, Jiake, and Wenjie Jiang. "Analysis Tool for Warehouse Material Handling Data." Thesis, Högskolan i Halmstad, Sektionen för Informationsvetenskap, Data– och Elektroteknik (IDE), 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-15205.

Full text
Abstract:
Effective material handling plays a key role in cutting costs. Well-organized material handling can cut production cost by optimizing product transfer paths, decreasing the damage rate and increasing the utilization of storage space. This report presents the development of an analysis system for StoraEnso Hylte's paper reel database. The system extracts and classifies key points from the database which are related to material handling, such as attributes related to the product (paper reel), forklift truck information and storage cell utilization. The analysis based on paper reels includes the damage rate and transfer paths of paper reels. A mathematical model is also presented, which shows that the probability of damage per transport is more important than the number of transports for paper reel handling. The effect of decreasing non-optimal transportation (optimizing the path) is very small.
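The conclusion that the per-transport damage probability matters more than the number of transports can be illustrated with the usual independence assumption (the probabilities and transport counts below are invented, not Hylte figures):

```python
def damage_probability(p_per_transport, n_transports):
    """Probability that a reel is damaged at least once over n independent transports."""
    return 1.0 - (1.0 - p_per_transport) ** n_transports

base = damage_probability(0.01, 6)          # six transports, 1% risk each
fewer_moves = damage_probability(0.01, 5)   # optimise the path: one move fewer
safer_moves = damage_probability(0.005, 6)  # halve the per-transport risk instead
print(round(base, 4), round(fewer_moves, 4), round(safer_moves, 4))
# 0.0585 0.049 0.0296 -> halving p helps far more than removing one transport
```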
15

Sánchez Pinsach, David. "Handling Missing Data in Clinical Decision Support." Doctoral thesis, Universitat Autònoma de Barcelona, 2020. http://hdl.handle.net/10803/671318.

Full text
Abstract:
Deciding which are the best treatments is a complex task when patients suffer multiple impairments and when a multidisciplinary team is involved in the intervention. There is always more than a unique treatment option and the results sometimes can be viewed in a short period or only be capable to be measured when the treatment is finished. In this context, the design of effective Clinical Decision Support Systems (CDSS) to help clinicians to select most appropriate interventions is still a challenge. The amount of available data is not always the same for all patients, especially in early treatment stages, hindering the inference in CDSS. To improve the capabilities of CDSS, different components are proposed within a CDSS framework for long-term treatments. A first component is focused on improving the quality of the inferences in missing data scenarios. The Dynamic Multiple Imputation (DMI) algorithm is presented as an effective methodology for data enhancement in CDSS. DMI is capable to adapt to different scenarios with a low or high percentage of missing data. Several experiments conducted reveal that DMI is competitive with regression problems. A second component is devoted to weigh confidence measures, given the uncertainty associated to missing information, by incorporating Mutual Information measures in confidence existing estimators. A third component, based on a community detection algorithm, is proposed to find relationships between clinical decisions that are not explicit. Finally, to illustrate the applicability of different proposed components, two real clinical use cases with chronic patients are presented. The first in the hospital context and the other in the home context.
16

Embleton, Nina Lois. "Handling sparse spatial data in ecological applications." Thesis, University of Birmingham, 2015. http://etheses.bham.ac.uk//id/eprint/5840/.

Full text
Abstract:
Estimating the size of an insect pest population in an agricultural field is an integral part of insect pest monitoring. An abundance estimate can be used to decide if action is needed to bring the population size under control, and accuracy is important in ensuring that the correct decision is made. Conventionally, statistical techniques are used to formulate an estimate from population density data obtained via sampling. This thesis thoroughly investigates an alternative approach of applying numerical integration techniques. We show that when the pest population is spread over the entire field, numerical integration methods provide more accurate results than the statistical counterpart. Meanwhile, when the spatial distribution is more aggregated, the error behaves as a random variable and the conventional error estimates do not hold. We thus present a new probabilistic approach to assessing integration accuracy for such functions, and formulate a mathematically rigorous estimate of the minimum number of sample units required for accurate abundance evaluation in terms of the species diffusion rate. We show that the integration error dominates the error introduced by noise in the density data and thus demonstrate the importance of formulating numerical integration techniques which provide accurate results for sparse spatial data.
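As a toy illustration of the numerical integration approach, the sketch below estimates total abundance by applying the composite trapezoidal rule to density values sampled on a coarse grid (the field size, grid and density surface are all invented for the example):

```python
import numpy as np

# Illustrative only: integrate a sampled pest density field (insects per square
# metre) over a 100 m x 100 m field to estimate total abundance.
x = np.linspace(0.0, 100.0, 11)               # sample positions along each axis (m)
y = np.linspace(0.0, 100.0, 11)
X, Y = np.meshgrid(x, y, indexing="ij")
density = 0.5 + 0.3 * np.exp(-((X - 40.0) ** 2 + (Y - 60.0) ** 2) / 500.0)

def trapz(values, coords, axis):
    """Composite trapezoidal rule along one axis of a sampled array."""
    v = np.moveaxis(values, axis, -1)
    return np.sum((v[..., 1:] + v[..., :-1]) * np.diff(coords) / 2.0, axis=-1)

abundance = trapz(trapz(density, y, axis=1), x, axis=0)
print(f"estimated abundance: {abundance:.0f} insects")
```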
17

Prasad, Rohini, and Gerta Malaj. "Analysis of inefficiencies in shipment data handling." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/112861.

Full text
Abstract:
Thesis: M. Eng. in Supply Chain Management, Massachusetts Institute of Technology, Supply Chain Management Program, 2017.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 113-116).
Supply chain visibility is critical for businesses to manage their operational risks. Availability of high quality and timely data regarding shipments is a precursor for supply chain visibility. This thesis analyses the errors that occur in shipment data for a freight forwarder. In this study, two types of errors are analyzed: system errors, arising from violations of business rules defined in the software system, and operational errors, which violate business rules or requirements defined outside the software. We consolidated multifarious shipment data from multiple sources and identified the relationship between errors and the shipment attributes such as source or destination country. Data errors can be costly, both from a human rework perspective as well as from the perspective of increased risk due to supply chain visibility loss. Therefore, the results of this thesis will enable companies to focus their efforts and resources on the most promising error avoidance initiatives for shipment data entry and tracking. We use several descriptive analytical techniques, ranging from basic data exploration guided by plots and charts to multidimensional visualizations, to identify the relationship between error occurrences and shipment attributes. Further, we look at classification models to categorize data entries that have a high error probability, given certain attributes of a shipment. We employ clustering techniques (K-means clustering) to group shipments that have similar properties, thereby allowing us to extrapolate behaviors of erroneous data records to future records. Finally, we develop predictive models using Naive-Bayes classifiers and Neural Networks to predict the likelihood of errors in a record. The results of the error analysis in the shipment data are discussed for a freight forwarder. A similar approach can be employed for supply chains of any organization that engages in physical movement of goods, in order to manage the quality of the shipment data inputs, thereby managing their supply chain risks more effectively.
by Rohini Prasad and Gerta Malaj.
M. Eng. in Supply Chain Management
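To make the classification step concrete, the following sketch trains a Naive Bayes classifier to score the error probability of a shipment record from two categorical attributes. The encoding, attributes and data are synthetic stand-ins, not the freight forwarder's records or the exact models used in the thesis.

```python
import numpy as np
from sklearn.naive_bayes import CategoricalNB

# Synthetic records: columns are integer-encoded origin country and transport mode;
# the label is 1 if the record contained a data-entry error.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1], [2, 0], [2, 1], [0, 0], [1, 1]])
y = np.array([0, 1, 0, 1, 1, 1, 0, 1])

model = CategoricalNB().fit(X, y)
print(model.predict_proba([[2, 1]]))   # estimated error probability for a new record
```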
18

Bring, Marcus. "Handling XML Data Using Multi-threaded Solutions." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-153960.

Full text
Abstract:
This project was performed as part of a master's degree in Computer Science and Engineering at the Royal Institute of Technology in Stockholm. Scania AB acted as an employer and provided both guidance and mentors. This thesis explores the concept of performing computations on XML data and how multi-threaded programming could improve the performance of such calculations. The thesis begins by discussing the basics of the XML language and continues with basic concepts of multi-threaded programming. During the project two simulations were developed and benchmarked: one generic simulation to prove the validity of the method, and a more realistic simulation intended to indicate whether the techniques scale in more realistic environments. The benchmark results show good potential and act as a base for further discussion of the limitations of the system at hand; multi-threaded solutions could provide large improvements in efficiency.
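As a minimal sketch of the general approach (not the Scania simulations), the snippet below fans a set of small XML documents out to a thread pool and aggregates a per-document result; whether threads actually help depends on where the bottleneck lies, e.g. I/O versus CPU-bound parsing.

```python
import xml.etree.ElementTree as ET
from concurrent.futures import ThreadPoolExecutor

# Placeholder documents; a real workload would read them from files or a queue.
docs = [f"<order><qty>{i}</qty><price>{2 * i}</price></order>" for i in range(1, 9)]

def process(doc: str) -> float:
    """Parse one document and compute a simple per-document value."""
    root = ET.fromstring(doc)
    return int(root.findtext("qty")) * float(root.findtext("price"))

with ThreadPoolExecutor(max_workers=4) as pool:
    totals = list(pool.map(process, docs))

print(sum(totals))
```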
19

Barber, Mark H., and Paul R. Richey. "Naval Supply Systems Command: Data Administration planning and implementation." Thesis, Monterey, California. Naval Postgraduate School, 1989. http://hdl.handle.net/10945/27154.

Full text
Abstract:
The Paperwork Reduction Act of 1980 mandated that federal government activities establish and enforce Information Resource Management policies. It also recommended the establishment of a Data Administration Branch within federal activities to provide an organizational entity devoted to effective information management. This thesis presents guidelines for the successful implementation of Data Administration, describes a standard for an Information Resources Dictionary System (the Data Administrator's primary tool), and makes recommendations for planning an Information Resources Dictionary System implementation.
20

Van der Merwe, Benjamin. "Micro-satellite data handling : a unified information model." Thesis, Stellenbosch : Stellenbosch University, 2001. http://hdl.handle.net/10019.1/52336.

Full text
Abstract:
Thesis (MScEng)--University of Stellenbosch, 2001.
ENGLISH ABSTRACT: This thesis describes various software technologies implemented, or specifically developed, for the SUNSAT micro-satellite mission. With the discussion centered on the Mission Operations System functions of Data Handling and Mission Control, particular emphasis is placed on data processing aspects such as the deployed database schema, and the data communications mechanisms implemented as part of the communications protocol stack. Both the groundsystem architecture and the Flight Software are discussed, their constituent components are analysed, and recommendations are made for improvement. Finally, a Unified Information Model for the design and operation of future, integrated satellite groundsystems is proposed, with suitable implementation technologies being identified and introduced.
21

Ren, Yueming. "Vector data topological encoding error handling and generalization." Dissertation (Geography), Carleton University, Ottawa, 1992.

Find full text
22

Wang, Xue. "Handling missing data problems in criminology: an introduction." Thesis, University of Macau, 2016. http://umaclib3.umac.mo/record=b3570879.

Full text
23

Morais de Lira, Ana Karina. "Separating variables in the context of data handling." Thesis, University College London (University of London), 2000. http://discovery.ucl.ac.uk/10020359/.

Full text
24

Rothschedl, Christopher, Roland Ritt, Paul O'Leary, Matthew Harker, Michael Habacher, and Michael Brandner. "Real-time-data analytics in raw materials handling." TU Bergakademie Freiberg, 2017. https://tubaf.qucosa.de/id/qucosa%3A23195.

Full text
Abstract:
This paper proposes a system for the ingestion and analysis of real-time sensor and actuator data from bulk materials handling plants and machinery. It references issues concerning the mining of sensor data in cyber-physical systems (CPS) as addressed in O’Leary et al. [2015].
25

Nordmark, Martin. "Optimizing the Process of Handling Road Network Data." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-396819.

Full text
Abstract:
Route optimization for different types of land-based vehicles is the process of finding the most efficient route. To achieve this, all necessary calculations throughout the process require road network data of the relevant surroundings. In order to prevent erroneous calculations and results, it is crucial that the data is correctly prepared and adapted for the algorithms and systems performing the route optimization. The company B&M Systemutveckling is active in this area, and its process of handling road network data is troublesome in certain respects, which may hinder further progress in some fields. In order to improve the process, this thesis analyses three problematic factors: time consumption, correctness and the attention required from the user. This was done by analysing all parts of the process and determining whether and how these factors could be improved. With the results of the analysis in mind, a new improved system was designed, with the main focus on improving the three factors considered especially problematic in the process. A prototype was then built from this design, which turned out to be a powerful upgrade of the currently used process, improving all of the main problematic factors. The time consumption decreased from at least one week to one hour. The attention required from the user decreased immensely, resulting in a decreased risk of human error. Utilization of more suitable data structures and algorithms, combined with the decreased risk of human error, had a beneficial impact on the correctness.
26

Jayawardena, D. P. W. "The role of triangulation in spatial data handling." Thesis, Keele University, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.385666.

Full text
27

Rothschedl, Christopher, Roland Ritt, Paul O'Leary, Matthew Harker, Michael Habacher, and Michael Brandner. "Real-time-data analytics in raw materials handling." Technische Universitaet Bergakademie Freiberg Universitaetsbibliothek "Georgius Agricola", 2018. http://nbn-resolving.de/urn:nbn:de:bsz:105-qucosa-231350.

Full text
Abstract:
This paper proposes a system for the ingestion and analysis of real-time sensor and actuator data from bulk materials handling plants and machinery. It references issues concerning the mining of sensor data in cyber-physical systems (CPS) as addressed in O’Leary et al. [2015].
28

Grunzke, Richard. "Generic Metadata Handling in Scientific Data Life Cycles." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2016. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-202070.

Full text
Abstract:
Scientific data life cycles define how data is created, handled, accessed, and analyzed by users. Such data life cycles become increasingly sophisticated as the sciences they deal with become more and more demanding and complex with the coming advent of exascale data and computing. The overarching data life cycle management background includes multiple abstraction categories with data sources, data and metadata management, computing and workflow management, security, data sinks, and methods on how to enable utilization. Challenges in this context are manifold. One is to hide the complexity from the user and to enable seamless and efficient use of resources. Another is to enable generic metadata management that is not restricted to one use case but can be adapted with limited effort to further ones. Metadata management is essential to enable scientists to save time by avoiding the need to manually keep track of data, for example by its content and location. As the number of files grows into the millions, managing data without metadata becomes increasingly difficult. Thus, the solution is to employ metadata management to enable the organization of data based on information about it. Previously, use cases tended to support only highly specific metadata management, or none at all. Now, a generic metadata management concept is available that can be used to efficiently integrate metadata capabilities with use cases. The concept was implemented within the MoSGrid data life cycle, which enables molecular simulations on distributed HPC-enabled data and computing infrastructures. The implementation enables easy-to-use and effective metadata management. Automated extraction, annotation, and indexing of metadata were designed, developed, and integrated, and search capabilities were provided via a seamless user interface. Further analysis runs can be started directly from search results. A complete evaluation of the concept, both in general and along the example implementation, is presented. In conclusion, the generic metadata management concept advances the state of the art in scientific data life cycle management.
29

Randahl, David. "Raoul: An R-Package for Handling Missing Data." Thesis, Uppsala universitet, Statistiska institutionen, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297051.

Full text
30

Betz, Hariolf, Frank Raiser, and Thom Frühwirth. "Persistent constraints in constraint handling rules." Universität Potsdam, 2010. http://opus.kobv.de/ubp/volltexte/2010/4154/.

Full text
Abstract:
In the most abstract definition of its operational semantics, the declarative and concurrent programming language CHR is trivially non-terminating for a significant class of programs. Common refinements of this definition, in closing the gap to real-world implementations, compromise on declarativity and/or concurrency. Building on recent work and the notion of persistent constraints, we introduce an operational semantics avoiding trivial non-termination without compromising on its essential features.
31

Abdennadher, Slim, Haythem Ismail, and Frederick Khoury. "Transforming imperative algorithms to constraint handling rules." Universität Potsdam, 2010. http://opus.kobv.de/ubp/volltexte/2010/4153/.

Full text
Abstract:
Different properties of programs, implemented in Constraint Handling Rules (CHR), have already been investigated. Proving these properties in CHR is fairly simpler than proving them in any type of imperative programming language, which triggered the proposal of a methodology to map imperative programs into equivalent CHR. The equivalence of both programs implies that if a property is satisfied for one, then it is satisfied for the other. The mapping methodology could be put to other beneficial uses. One such use is the automatic generation of global constraints, at an attempt to demonstrate the benefits of having a rule-based implementation for constraint solvers.
32

Witta, Eleanor Lea. "Seven methods of handling missing data using samples from a national data base." Diss., Virginia Polytechnic Institute and State University, 1992. http://scholar.lib.vt.edu/theses/available/etd-06062008-170840/.

Full text
33

Ding, Yuanyuan. "Handling complex, high dimensional data for classification and clustering /." Full text available from ProQuest UM Digital Dissertations, 2007. http://0-proquest.umi.com.umiss.lib.olemiss.edu/pqdweb?index=0&did=1400971141&SrchMode=2&sid=1&Fmt=2&VInst=PROD&VType=PQD&RQT=309&VName=PQD&TS=1219343482&clientId=22256.

Full text
34

Shum, L. L. "Topology control and data handling in wireless sensor networks." Thesis, University College London (University of London), 2009. http://discovery.ucl.ac.uk/18577/.

Full text
Abstract:
Our work in this thesis has provided two distinctive contributions to WSNs, in the areas of data handling and topology control. In the area of data handling, we have demonstrated a solution that improves power efficiency while preserving the important data features, through data compression and the use of an adaptive sampling strategy, applicable to the specific oceanography monitoring application required by the SECOAS project. Our work on oceanographic data analysis is important for understanding the data we are dealing with, so that suitable strategies can be deployed and system performance can be analysed. The Basic Adaptive Sampling Scheduler (BASS) algorithm uses the statistics of the data to adjust the sampling behaviour in a sensor node according to the environment, in order to conserve energy and minimise detection delay. The motivation of topology control (TC) is to maintain the connectivity of the network, to reduce node degree to ease congestion in a collision-based medium access scheme, and to reduce power consumption in the sensor nodes. We have developed an algorithm, Subgraph Topology Control (STC), that is distributed and does not require additional equipment to be implemented on the SECOAS nodes. STC uses a metric called the subgraph number, which measures the 2-hop connectivity in the neighbourhood of a node. It is found that STC consistently forms topologies that have lower node degrees and higher probabilities of connectivity, compared to k-Neighbours, an alternative algorithm that does not rely on special hardware on the sensor nodes. Moreover, STC also gives better results in terms of the minimum degree in the network, which implies that the network structure is more robust to a single point of failure. As STC is an iterative algorithm, it is very scalable and adaptive and is well suited to the SECOAS applications.
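To make the idea of a 2-hop connectivity metric concrete, the sketch below computes a simple proxy (the number of distinct nodes reachable within two hops) on a toy random geometric graph; this proxy is an assumption for illustration and is not the exact subgraph number defined in the thesis.

```python
import networkx as nx

def two_hop_metric(g, node):
    """Count the distinct nodes reachable from `node` within two hops."""
    reachable = set(g.neighbors(node))
    for nbr in list(reachable):
        reachable.update(g.neighbors(nbr))
    reachable.discard(node)
    return len(reachable)

g = nx.random_geometric_graph(30, radius=0.35, seed=1)   # toy sensor field
scores = {n: two_hop_metric(g, n) for n in g}
print(min(scores.values()), max(scores.values()))
```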
35

Twala, Bhekisipho. "Effective techniques for handling incomplete data using decision trees." Thesis, Open University, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.418465.

Full text
36

Herman, James Stackpole. "A sail force dynamometer : design, implementation and data handling." Thesis, Massachusetts Institute of Technology, 1989. http://hdl.handle.net/1721.1/14491.

Full text
37

Shimakawa, Hiromitsu. "Studies on Real-Time Handling of Time Dependent Data." Kyoto University, 1999. http://hdl.handle.net/2433/157059.

Full text
Abstract:
The full-text data is a PDF conversion of image files produced in the National Diet Library's FY2010 digitisation of doctoral dissertations.
Kyoto University (京都大学)
0048
New degree system, doctorate by coursework
Doctor of Engineering (博士(工学))
Kou No. 7850 (甲第7850号)
Kogaku-haku No. 1830 (工博第1830号)
新制||工||1141 (University Library)
UT51-99-G444
Kyoto University, Graduate School of Engineering, Department of Information Science
(Chief examiner) Professor 上林 彌彦; Professor 岩間 一雄; Professor 湯淺 太一
Meets the requirements of Article 4, Paragraph 1 of the Degree Regulations
38

Ferreira, Carlos André Marques Viana. "Handling data access latency in distributed medical imaging environments." Doctoral thesis, Universidade de Aveiro, 2015. http://hdl.handle.net/10773/15491.

Full text
Abstract:
Doctorate in Computer Science
Web-based technologies have been increasingly used in Picture Archive and Communication Systems (PACS), in services related to storage, distribution and visualization of medical images. Nowadays, many healthcare institutions are federating services and outsourcing their repositories to the Cloud. However, managing communications between multiple geo-distributed locations is still challenging due to the complexity of dealing with huge volumes of data and bandwidth limitations. Communication latency is a critical issue that still hinders the adoption of this paradigm. In order to improve the performance of distributed medical imaging networks, routing mechanisms with cache and prefetching can be used. This doctorate proposes a cache architecture based on static rules together with pattern recognition for both cache eviction and prefetching.
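The combination of cache eviction and prefetching can be pictured with a small sketch: an LRU cache that accepts a pluggable prefetch rule. The 'fetch the next image' rule and all names below are illustrative assumptions, not the pattern-recognition design proposed in the thesis.

```python
from collections import OrderedDict

class PrefetchingCache:
    """Toy LRU cache that warms itself using a pluggable prefetch rule."""

    def __init__(self, capacity, fetch, prefetch_rule=lambda key: []):
        self.capacity, self.fetch = capacity, fetch
        self.prefetch_rule = prefetch_rule
        self.store = OrderedDict()

    def get(self, key):
        if key not in self.store:
            self._insert(key, self.fetch(key))          # cache miss: fetch remotely
            for nxt in self.prefetch_rule(key):         # then warm likely next requests
                if nxt not in self.store:
                    self._insert(nxt, self.fetch(nxt))
        self.store.move_to_end(key)
        return self.store[key]

    def _insert(self, key, value):
        self.store[key] = value
        while len(self.store) > self.capacity:
            self.store.popitem(last=False)               # evict least recently used

cache = PrefetchingCache(capacity=4, fetch=lambda k: f"image-{k}",
                         prefetch_rule=lambda k: [k + 1])   # also fetch the next slice
print(cache.get(10), cache.get(11))                      # second access is a cache hit
```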
39

Ang, Su-Shin. "Handling data dependent memory accesses in custom hardware applications." Thesis, Imperial College London, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.501122.

Full text
40

Whitelaw, Virginia A. "Telemetry Handling on the Space Station Data Management System." International Foundation for Telemetering, 1987. http://hdl.handle.net/10150/615260.

Full text
Abstract:
International Telemetering Conference Proceedings / October 26-29, 1987 / Town and Country Hotel, San Diego, California
Traditional space telemetry has generally been handled as an asynchronous data stream fed into a time division multiplexed channel on a point-to-point radio frequency (RF) link between space and ground. The data handling concepts emerging for the Space Station challenge each of these precepts. According to current concepts, telemetry data on the Space Station will be packetized. It will be transported asynchronously through onboard networks. The space-to-ground link will not be time division multiplexed, but rather will have flexibly managed virtual channels, and finally, the routing of telemetry data must potentially traverse multiple ground distribution networks. Appropriately, the communication standards for handling telemetry are changing to support the highly networked Space Station environment. While a companion paper (1. W. Marker, "Telemetry Formats for the Space Station RF Links") examines the emerging telemetry concepts and formats for the RF link, this paper focuses on the impact of telemetry handling on the design of the onboard networks that are part of the Data Management System (DMS). The DMS will provide the connectivity between most telemetry sources and the onboard node for transmission to the ground. By far the bulk of the data transported by the DMS will be telemetry; however, not all telemetry will place the same demands on the communication system, and the DMS must also satisfy a rich array of services in support of distributed Space Station operations. These services include file transfer, database access, application messaging and several others. The DMS communications architecture, which will follow the International Standards Organization (ISO) Reference Model, must support both the high throughput needed for telemetry transport and the rich services needed for distributed computer systems. This paper discusses an architectural approach to satisfying the dual set of requirements and discusses several of the functionality vs. performance trade-offs that must be made in developing an optimized mechanism for handling telemetry data in the DMS.
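As background for the packetized-telemetry discussion above, the sketch below decodes the 6-byte primary header of a CCSDS Space Packet as the format is standardised today. The Space Station formats under discussion in 1987 were still evolving, so this layout is illustrative rather than a description of the paper's design, and the example field values are invented.

```python
import struct

def parse_space_packet_header(packet: bytes) -> dict:
    """Decode the 6-byte CCSDS Space Packet primary header (big-endian fields)."""
    word1, word2, length = struct.unpack(">HHH", packet[:6])
    return {
        "version":        (word1 >> 13) & 0x07,
        "type":           (word1 >> 12) & 0x01,    # 0 = telemetry, 1 = telecommand
        "sec_hdr_flag":   (word1 >> 11) & 0x01,
        "apid":            word1 & 0x7FF,          # application process identifier
        "sequence_flags": (word2 >> 14) & 0x03,
        "sequence_count":  word2 & 0x3FFF,
        "data_length":     length + 1,             # field stores (data length in bytes) - 1
    }

# Invented example: a telemetry packet with APID 0x123, unsegmented, sequence count 42, 10 data bytes.
header = struct.pack(">HHH", (0 << 12) | (1 << 11) | 0x123, (0b11 << 14) | 42, 9)
print(parse_space_packet_header(header))
```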
APA, Harvard, Vancouver, ISO, and other styles
41

Almeida, Miguel Alexandre Dias. "Planning, operations and data handling of planetary science missions." Master's thesis, FCT-UNL, 2011. http://hdl.handle.net/10362/7081.

Full text
Abstract:
Dissertation submitted to obtain the Master's degree in Engineering Physics
Since the dawn of humankind the celestial sphere has had a special place in our imagination. I always felt the same passion for the cosmos. In particular, fuelled by the journeys of discovery of the Solar System, I always dreamed of dedicating myself to planetary exploration. To follow that idea I studied physics and, in 1999, I finally got the opportunity to work in Planetary Science research. It all started at the Lisbon Observatory, where I was able to analyse infrared data from the Jupiter orbiter, Galileo, collected by its Near-Infrared Mapping Spectrometer (NIMS). At this stage I learned some of the methods used in data analysis. I continued my career at the European Space Agency, in the science planning of the SMART-1 lunar spacecraft. I remained on this mission for the following seven years and saw my responsibilities grow. I started by giving technical support to the Project Scientist. By the end of the mission I had been a major player in the setup of the planning system and had worked in all capacities within the SMART-1 Science and Technology Operations Centre and on the Advanced Moon Micro-Imager Experiment (AMIE). Finally, in 2006, I started working on the Venus Express project as a Liaison Scientist for the Venus Monitoring Camera (VMC) instrument. I was able to re-work the VMC planning system with the Principal Investigator in order to gather more valuable science data. While this optimization was carried out, and since the spacecraft was already orbiting Venus, my duties also included, and still include, planning and operating the VMC camera. Throughout my career I have had two side projects running alongside my main tasks. I always had an interest in innovation, and some of my ideas developed into integral parts of my main projects. I also kept up an interest in data analysis, as time permitted, working with NIMS, AMIE and VMC data.
APA, Harvard, Vancouver, ISO, and other styles
42

Gugliermo, Simona. "Occlusion handling in Augmented Reality context." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-262679.

Full text
Abstract:
Handling occlusion between real and virtual objects is a challenging problem in Augmented Reality (AR) applications. Incorrect or inaccurate occlusion handling may cause confusion in users' perception, which leads to non-realistic and non-immersive AR experiences. Even though there are important research examples and implementations on this topic, they typically suffer from significant limitations. The method presented in this thesis takes both raw depth and color information from an RGB-D sensor in order to solve occlusion in unknown AR contexts. The novelty of this method is the automatic trimap generation, which is based on the color distribution information of the real scene. Moreover, the proposed method can operate using the information from only the latest frame, or efficiently use the information from multiple frames. In the latter case, knowledge of the previous frame's output is used to improve the trimap generation for the latest frame. Experimental evaluations of several scenes demonstrate that this approach largely improves automatic trimap generation using both the RGB and depth information. Furthermore, the final proposed method is compared with some recent state-of-the-art approaches in terms of quality and accuracy, showing that it overcomes some of the known limitations.
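As a deliberately simplified illustration of what a trimap is, the sketch below labels pixels as occluder, background or uncertain using only a depth threshold with an uncertainty band. The thesis's actual contribution additionally exploits the colour distribution of the real scene, which is not modelled here, and the function name, band width and toy depth map are assumptions.

```python
import numpy as np

def depth_trimap(depth_m, virtual_depth_m, band_m=0.05):
    """Label each pixel: 1.0 = real surface clearly in front of the virtual object
    (occluder), 0.0 = clearly behind it, 0.5 = uncertain region near the boundary."""
    trimap = np.full(depth_m.shape, 0.5, dtype=np.float32)
    valid = depth_m > 0                                      # zero often marks missing depth
    trimap[valid & (depth_m < virtual_depth_m - band_m)] = 1.0
    trimap[valid & (depth_m > virtual_depth_m + band_m)] = 0.0
    return trimap

# Example: a toy 2x3 depth map (metres) against a virtual object placed at 1.0 m.
depth = np.array([[0.80, 0.98, 1.20],
                  [0.00, 1.02, 1.50]])
print(depth_trimap(depth, virtual_depth_m=1.0))
```

The uncertain band is exactly the region that a subsequent matting or refinement step, such as the colour-based one described in the abstract, would resolve.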
APA, Harvard, Vancouver, ISO, and other styles
43

Pratt, Everett S. "Data compression standards and applications to Global Command and Control System." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1994. http://handle.dtic.mil/100.2/ADA293262.

Full text
Abstract:
Thesis (M.S. in Electrical Engineering and M.S. in Systems Technology), Naval Postgraduate School, December 1994.
Thesis advisor(s): Murali Tummala, Paul H. Moose. "December 1994." Includes bibliographical references. Also available online.
APA, Harvard, Vancouver, ISO, and other styles
44

Davis, Rodney, Greg Hupf, and Chad Woolf. "Advanced Data Description Exchange Services for Heterogeneous Systems." International Foundation for Telemetering, 2004. http://hdl.handle.net/10150/605341.

Full text
Abstract:
International Telemetering Conference Proceedings / October 18-21, 2004 / Town & Country Resort, San Diego, California
CCT is conducting research to provide a cross-platform software capability that enables a common semantic for control and monitoring of highly distributed systems-of-systems C^2 architectures by auto-generating semantic processing services from standardized metadata specifications. This new capability is significant because it will reduce development, operations, and support costs for legacy and future systems that are part of ground- and space-based distributed command and control systems. It will also establish a space systems information exchange model that can support future highly interoperable and mobile software systems.
APA, Harvard, Vancouver, ISO, and other styles
45

Khailtash, Amal. "Handling large data storage in synthesis of multiple FPGA systems." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape8/PQDD_0007/MQ43653.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Kift, M. H. "The use of hypertext in handling process engineering design data." Thesis, Swansea University, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.637796.

Full text
Abstract:
This thesis aims to investigate aspects of data management associated with process engineering design, with a view to proposing improved methods and models for data storage, manipulation and representation, on which a foundation for new process design environments may be based. Several aspects of integrated process design database systems are examined, including data models, data handling, program control and integration. Process design is an increasingly complex activity, with this complexity being compounded by moves towards the adoption of concurrent engineering and "round-the-clock" activities in design offices around the world. It is evident that better support for the management of advanced data types and data control is needed to underpin the design activity. The main focus of this research is the use of hypertext in an integrated engineering database system, for handling complex information during process design. The research investigates the potential that hypertext offers for easy control of information, and the powerful tools it provides for constructing data relationships. In addition to hypertext, this research investigates the strengths and weaknesses of current database models for engineering and hypertext usage. This leads to recommendations on the models best suited for these tasks. Finally, methods of data and program integration in current integrated systems are examined. Several new methods of integration are investigated, using advanced features of the Microsoft Windows operating system. These new methods aim to achieve close integration of both engineering and conventional applications. The ideas proposed by the research are manifested in the form of computer packages. These have been used in the thesis to demonstrate, by means of realistic examples, the use of hypertext for handling the design process, the employment of an advanced data model for handling complex multimedia information, and program integration.
APA, Harvard, Vancouver, ISO, and other styles
47

Johansson, Åsa M. "Methodology for Handling Missing Data in Nonlinear Mixed Effects Modelling." Doctoral thesis, Uppsala universitet, Institutionen för farmaceutisk biovetenskap, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-224098.

Full text
Abstract:
To obtain a better understanding of the pharmacokinetic and/or pharmacodynamic characteristics of an investigated treatment, clinical data is often analysed with nonlinear mixed effects modelling. The developed models can be used to design future clinical trials or to guide individualised drug treatment. Missing data is a frequently encountered problem in analyses of clinical data, and so as not to compromise the predictability of the developed model, it is of great importance that the method chosen to handle the missing data is adequate for its purpose. The overall aim of this thesis was to develop methods for handling missing data in the context of nonlinear mixed effects models and to compare strategies for handling missing data, in order to provide guidance on efficient handling and on the consequences of inappropriate handling of missing data. In accordance with missing data theory, all missing data can be divided into three categories: missing completely at random (MCAR), missing at random (MAR) and missing not at random (MNAR). When data are MCAR, the underlying missing data mechanism does not depend on any observed or unobserved data; when data are MAR, the underlying missing data mechanism depends on observed data but not on unobserved data; when data are MNAR, the underlying missing data mechanism depends on the unobserved data itself. Strategies and methods for handling missing observation data and missing covariate data were evaluated. These evaluations showed that the most frequently used estimation algorithm in nonlinear mixed effects modelling (first-order conditional estimation) resulted in biased parameter estimates independently of the missing data mechanism. However, expectation maximization (EM) algorithms (e.g. importance sampling) resulted in unbiased and precise parameter estimates as long as data were MCAR or MAR. When the observation data are MNAR, a proper method for handling the missing data has to be applied to obtain unbiased and precise parameter estimates, independently of the estimation algorithm. The evaluation of different methods for handling missing covariate data showed that a correctly implemented multiple imputation method and full maximum likelihood modelling methods resulted in unbiased and precise parameter estimates when covariate data were MCAR or MAR. When the covariate data were MNAR, the only method resulting in unbiased and precise parameter estimates was a full maximum likelihood modelling method in which an extra parameter was estimated, correcting for the unknown missing data mechanism's dependence on the missing data. This thesis presents new insight into the dynamics of missing data in nonlinear mixed effects modelling. Strategies for handling different types of missing data have been developed and compared in order to provide guidance on efficient handling and on the consequences of inappropriate handling of missing data.
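As a small, self-contained illustration of the three missing-data mechanisms defined in the abstract (and not of the thesis's nonlinear mixed effects methodology), the sketch below simulates MCAR, MAR and MNAR missingness for a hypothetical concentration variable. All variable names, coefficients and thresholds are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
weight = rng.normal(70, 10, n)                     # covariate, always observed
conc = 2.0 + 0.05 * weight + rng.normal(0, 1, n)   # observation of interest

# MCAR: probability of being missing is unrelated to any observed or unobserved data
mcar = rng.random(n) < 0.2

# MAR: probability of being missing depends only on the observed covariate
mar = rng.random(n) < 1 / (1 + np.exp(-(weight - 80) / 5))

# MNAR: probability of being missing depends on the unobserved value itself
mnar = rng.random(n) < 1 / (1 + np.exp(-(conc - 6)))

for name, mask in [("MCAR", mcar), ("MAR", mar), ("MNAR", mnar)]:
    print(name, "fraction missing:", round(mask.mean(), 2))
```

Only in the MNAR case does the missingness carry information about the unobserved values themselves, which is why it requires the dedicated modelling treatment described above.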
APA, Harvard, Vancouver, ISO, and other styles
48

Brint, Andrew Timothy. "Matching algorithms for handling three dimensional molecular co-ordinate data." Thesis, University of Sheffield, 1988. http://etheses.whiterose.ac.uk/15147/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

König, Hans-Henrik. "Calphad data handling for generic precipitation modelling coupled with FEM." Thesis, KTH, Materialvetenskap, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-280039.

Full text
Abstract:
To enable a generic modelling tool for precipitation kinetics in non-homogeneous components, efficient data handling is required to facilitate the integration of models on different length scales, and to decrease the computational time and the use of resources. In this work an automated method to generate, curate and transform Calphad-based thermodynamic and kinetic data to facilitate precipitation models integrated in FEM codes is developed and tested. The open-source Python library, pycalphad, is employed to access Calphad databases. Python scripts are utilized to calculate the thermodynamic and kinetic parameters required to supply a precipitation model. The obtained data is stored with an open-source software infrastructure. The Cu-Co binary is the chosen model alloy in this work and the corresponding parameters are calculated and stored. The obtained results show that pycalphad can be used to supply the required thermodynamic and kinetic parameters for a precipitation model. Further refinement of the presented source code is required to enable application over the whole composition range.
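As a hedged sketch of the kind of pycalphad-based data extraction the abstract describes, the snippet below computes equilibrium molar Gibbs energies for a Cu-Co mixture over a temperature sweep. The database file name, phase list and conditions are assumptions for illustration and do not reproduce the thesis's actual workflow or storage infrastructure.

```python
from pycalphad import Database, equilibrium, variables as v

dbf = Database("cu-co.tdb")                 # hypothetical TDB file for the Cu-Co system
comps = ["CU", "CO", "VA"]                  # components, including vacancies
phases = ["FCC_A1", "LIQUID"]               # assumed phase selection

# Equilibrium calculation at fixed composition and pressure over a temperature range.
conditions = {v.X("CO"): 0.05, v.T: (600, 1400, 50), v.P: 101325, v.N: 1}
eq = equilibrium(dbf, comps, phases, conditions)

# Molar Gibbs energy at each temperature step, as an array ready to be stored downstream.
print(eq.GM.squeeze().values)
```

The result is an xarray Dataset, so quantities such as GM can be sliced, transformed and written out to whatever storage infrastructure feeds the FEM-coupled precipitation model.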
APA, Harvard, Vancouver, ISO, and other styles
50

Selvaraj, Poorani. "Group Method of Data Handling – How Does it Measure Up?" Ohio University / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1479421385631538.

Full text
APA, Harvard, Vancouver, ISO, and other styles
