Dissertations / Theses on the topic 'Command and Data Handling'
Below are the top 50 dissertations and theses for research on the topic 'Command and Data Handling.'
Lokken, Patrick Bucknam. "Command and data handling systems for a multi-instrument sounding rocket payload." Thesis, Montana State University, 2011. http://etd.lib.montana.edu/etd/2011/lokken/LokkenP0511.pdf.
Olsen, Douglas. "Implementation of CCSDS Telemetry and Command Standards for the Fast Auroral Snapshot (FAST) Small Explorer Mission." International Foundation for Telemetering, 1993. http://hdl.handle.net/10150/611870.
Recommendations of the Consultative Committee for Space Data Systems (CCSDS) provide a standard approach for implementing spacecraft packet telemetry and command interfaces. The Fast Auroral Snapshot (FAST) Small Explorer mission relies heavily on the CCSDS virtual channel and packetization concepts to achieve near real-time commanding and distribution of telemetry between separate spaceborne science and spacecraft processors and multiple ground stations. Use of the CCSDS recommendations allows the FAST mission to realize significant re-use of ground systems developed for the first Small Explorer mission, and also simplifies system interfaces and interactions between flight software developers, spacecraft integrators, and ground system operators.
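As background for the packetization concept this abstract leans on: a CCSDS Space Packet starts with a fixed 6-byte primary header (3-bit version, 1-bit type, 1-bit secondary-header flag, 11-bit APID, 2-bit sequence flags, 14-bit sequence count, 16-bit length-minus-one). A minimal sketch of that bit layout, with an arbitrary example APID and payload (not FAST's actual values):

```python
import struct

def encode_space_packet(apid, seq_count, data, version=0, pkt_type=0,
                        sec_hdr=0, seq_flags=0b11):
    """Pack the 6-byte CCSDS Space Packet primary header in front of data.

    Big-endian bit layout (per the public CCSDS 133.0-B recommendation):
    3-bit version | 1-bit type (0 = telemetry) | 1-bit secondary-header
    flag | 11-bit APID, then 2-bit sequence flags (0b11 = unsegmented) |
    14-bit sequence count, then 16 bits holding (data octets - 1).
    """
    word1 = (version << 13) | (pkt_type << 12) | (sec_hdr << 11) | (apid & 0x7FF)
    word2 = (seq_flags << 14) | (seq_count & 0x3FFF)
    word3 = len(data) - 1
    return struct.pack(">HHH", word1, word2, word3) + data

# Example: telemetry packet, APID 0x123, sequence count 42, 4 payload bytes
pkt = encode_space_packet(apid=0x123, seq_count=42, data=b"\x01\x02\x03\x04")
assert pkt == b"\x01\x23\xc0\x2a\x00\x03\x01\x02\x03\x04"
```

Virtual channels then multiplex streams of such packets onto the link, which is what lets a mission route science and housekeeping data to different processors and ground destinations.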
DeBoy, Christopher C., Paul D. Schwartz, and Richard K. Huebschman. "Midcourse Space Experiment Spacecraft and Ground Segment Telemetry Design and Implementation." International Foundation for Telemetering, 1996. http://hdl.handle.net/10150/608390.
This paper reviews the performance requirements that provided the baseline for development of the onboard data system, RF transmission system, and ground segment receiving system of the Midcourse Space Experiment (MSX) spacecraft. The onboard Command and Data Handling (C&DH) System was designed to support the high data outputs of the three imaging sensor systems onboard the spacecraft and the requirement for large volumes of data storage. Because of the high data rates, it was necessary to construct a dedicated X-band ground receiver system at The Johns Hopkins University Applied Physics Laboratory (APL) and implement a tape recorder system for recording and downlinking sensor and spacecraft data. The system uses two onboard tape recorders to provide redundancy and backup capabilities. The storage capability of each tape recorder is 54 gigabits. The MSX C&DH System can record data at 25 Mbps or 5 Mbps. To meet the redundancy requirements of the high-priority experiments, the data can also be recorded in parallel on both tape recorders. To provide longer onboard recording, the data can also be recorded serially on the two recorders. The reproduce (playback) mode is at 25 Mbps. A unique requirement of the C&DH System is to multiplex and commutate the different output rates of the sensors and housekeeping signals into a common data stream for recording. The system also supports 1-Mbps real-time sensor data and 16-kbps real-time housekeeping data transmission to the dedicated ground site and through the U.S. Air Force Satellite Control Network ground stations. The primary ground receiving site for the telemetry is the MSX Tracking System (MTS) at APL. A dedicated 10-m X-band antenna is used to track the satellite during overhead passes and acquire the 25-Mbps telemetry downlinks, along with the 1-Mbps and 16-kbps real-time transmissions.
This paper discusses some of the key technology trade-offs that were made in the design of the system to meet requirements for reliability, performance, and development schedule. It also presents some of the lessons learned during development and the impact these lessons will have on development of future systems.
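The capacity and rate figures above fix the recording budget; a quick arithmetic check, using only numbers taken from the abstract:

```python
CAPACITY_BITS = 54e9            # 54 gigabits per tape recorder
RATES_BPS = {"high": 25e6, "low": 5e6}

def record_minutes(rate_bps, recorders_in_series=1):
    """Minutes of recording before the tape (or serial pair) is full."""
    return recorders_in_series * CAPACITY_BITS / rate_bps / 60

assert record_minutes(RATES_BPS["high"]) == 36.0       # one recorder at 25 Mbps
assert record_minutes(RATES_BPS["low"]) == 180.0       # one recorder at 5 Mbps
assert record_minutes(RATES_BPS["high"], 2) == 72.0    # serial use of both
```

So a full recorder also takes 36 minutes to play back at the 25-Mbps reproduce rate, which is why a dedicated high-rate ground site was needed.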
Landberg, Fredrik. "Flexible role-handling in command and control systems." Thesis, Linköping University, Department of Electrical Engineering, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-7880.
In organizations, the permissions a member holds are determined not by who they are but by the functions they perform within the organization. This is also the approach taken in military command and control systems. Military operations are often characterized by friction and uncontrollable factors; people being absent when needed is one such problem.
This thesis has examined how roles are handled in three Swedish command and control systems. The result is a model for handling vacant roles with the possibility, in some situations, to override ordinary rules.
Rajyalakshmi, P. S., and R. K. Rajangam. "Data Handling System for IRS." International Foundation for Telemetering, 1987. http://hdl.handle.net/10150/615329.
The three-axis-stabilized Indian Remote Sensing Satellite will image the earth from a 904 km polar sun-synchronous orbit. The payload is a set of CCD cameras that collect data in four bands in the visible and near-infrared region. The payload data from two cameras, each at 10.4 megabits per second, are transmitted as balanced QPSK in X-band. Before transmission, the payload data are formatted with major and minor frame synchronization codes. The two formatted data streams are differentially encoded to resolve the 4-phase ambiguity introduced by QPSK transmission. This paper describes the design and development of the Data Handling System and highlights the environmental qualification tests carried out to meet the three-year operational life requirement of the satellite.
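Differential encoding resolves the QPSK 4-phase ambiguity because the information rides on symbol-to-symbol phase steps rather than on absolute phase. A small sketch with 2-bit symbols (the dibit mapping here is illustrative, not the IRS convention):

```python
def diff_encode(symbols):
    """Differentially encode 2-bit symbols (0..3): transmit the running
    sum mod 4, so information is carried in phase *steps*."""
    out, prev = [], 0
    for s in symbols:
        prev = (prev + s) % 4
        out.append(prev)
    return out

def diff_decode(received):
    """Recover symbols by differencing; a constant rotation k (the
    4-phase ambiguity of carrier recovery) cancels out."""
    out, prev = [], 0
    for r in received:
        out.append((r - prev) % 4)
        prev = r
    return out

data = [1, 3, 0, 2, 2, 1]
for k in range(4):  # receiver may lock with any of four phase offsets
    rotated = [(s + k) % 4 for s in diff_encode(data)]
    assert diff_decode(rotated)[1:] == data[1:]  # only symbol 0 is affected
```

Whatever constant rotation the carrier-recovery loop settles on, every difference between consecutive received symbols is unchanged, so the data survive.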
Benoit, Éric. "Capteurs symboliques et capteurs flous : un nouveau pas vers l'intelligence." Grenoble 1, 1993. http://www.theses.fr/1993GRE10003.
Serbessa, Yonatan Kebede. "Handling Data Flows of Streaming Internet of Things Data." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-302102.
Hanslo, Monique. "Techniques for handling clustered binary data." Master's thesis, University of Cape Town, 2002. http://hdl.handle.net/11427/6950.
Over the past few decades there has been increasing interest in clustered studies and hence much research has gone into the analysis of data arising from these studies. It is erroneous to treat clustered data, where observations within a cluster are correlated with each other, as one would treat independent data. It has been found that point estimates are not as greatly affected by clustering as are the standard deviations of the estimates. But as a consequence, confidence intervals and hypothesis testing are severely affected. Therefore one has to approach the analysis of clustered data with caution. Methods that specifically deal with correlated data have been developed. Analysis may be further complicated when the outcome variable of interest is binary rather than continuous. Methods for estimation of proportions, their variances, calculation of confidence intervals and a variety of techniques for testing the homogeneity of proportions have been developed over the years (Donner and Klar, 1993; Donner, 1989; Rao and Scott, 1992). The methods developed within the context of experimental design generally involve incorporating the effect of clustering in the analysis. This cluster effect is quantified by the intracluster correlation and needs to be taken into account when estimating proportions, comparing proportions and in sample size calculations. In the context of observational studies, the effect of clustering is expressed by the design effect, which is the inflation in the variance of an estimate that is due to selecting a cluster sample rather than an independent sample. Another important aspect of the analysis of complex sample data that is often neglected is sampling weights. One needs to recognise that each individual may not have the same probability of being selected, and these weights adjust for that fact (Little et al., 1997). Methods for modelling correlated binary data have also been discussed quite extensively.
Among the many models which have been proposed for analyzing binary clustered data are two approaches which have been studied and compared: the population-averaged and cluster-specific approach. The population-averaged model focuses on estimating the effect of a set of covariates on the marginal expectation of the response. One example of the population-averaged approach for parameter estimation is known as generalized estimating equations, proposed by Liang and Zeger (1986). It involves assuming that elements within a cluster are independent and then imposing a correlation structure on the set of responses. This is a useful application in longitudinal studies where a subject is regarded as a cluster. Then the parameters describe how the population-averaged response rather than a specific subject's response depends on the covariates of interest. On the other hand, cluster-specific models introduce cluster-to-cluster variability in the model by including random effects terms, which are specific to the cluster, as linear predictors in the regression model (Neuhaus et al., 1991). Unlike the special case of correlated Gaussian responses, the parameters for the cluster-specific model obtained for binary data describe different effects on the responses compared to those obtained from the population-averaged model. For longitudinal data, the parameters of a cluster-specific model describe how a specific individual's probability of a response depends on the covariates. The decision to use either of these modelling methods depends on the questions of interest. Cluster-specific models are useful for studying the effects of cluster-varying covariates and when an individual's response rather than the population-averaged response is the focus. The population-averaged model is useful when interest lies in how the average response across clusters changes with covariates.
A criticism of this approach is that there may be no individual with the characteristics of the population-averaged model.
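The design effect mentioned in the abstract has a standard closed form, DEFF = 1 + (m − 1)ρ for clusters of size m with intracluster correlation ρ; a small illustration with invented numbers:

```python
def design_effect(cluster_size, icc):
    """Variance inflation from cluster sampling: DEFF = 1 + (m - 1) * rho."""
    return 1 + (cluster_size - 1) * icc

def effective_sample_size(n, cluster_size, icc):
    """Number of independent observations the clustered sample is worth."""
    return n / design_effect(cluster_size, icc)

# Even a modest intracluster correlation is costly with large clusters:
assert round(design_effect(cluster_size=20, icc=0.05), 2) == 1.95
assert round(effective_sample_size(1000, 20, 0.05)) == 513
```

This is why naive confidence intervals on clustered data are too narrow: 1000 clustered observations here carry only about as much information as 513 independent ones.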
Smith, Gene A. "Space Station-Era Ground Data Handling." International Foundation for Telemetering, 1989. http://hdl.handle.net/10150/614694.
To support Space Station-era space data flows through ground facilities, plans are being made to handle peak return-link data rates ranging from 300 Mbps to 1200 Mbps and average rates growing from 50 Mbps to hundreds of Mbps. These numbers represent rates orders of magnitude greater than are handled today. Simply relaying the data to destinations, however, is not sufficient (nor so straightforward). Because of multiplexing of data, on-board tape recording and playback, noise, and other problems with the space-to-ground link, these data must be reassembled for users into the sequences in which the data were originally produced on-board, with error checking, retransmission, correction, or flagging as required to eliminate or tag erroneous data. In the past these services (called Level Zero Processing) have required large operations staffs and have involved delays of 30 to 90 days for final formatting and shipping of data tapes to users. NASA's expectations for improving the SS-era operations depend on providing time-ordered, error-corrected or flagged data sets with no redundant data packets within 24 hours of receipt on the ground, with backup of data for one week. These data sets would be transmitted electronically to data centers for higher-level processing and would require no more operations personnel than are required today for systems processing less than 1/100 of the data. To support a variety of user requirements, some of the data will be provided in real time or, if recorded on-board, as priority playback data. Other data sets will be created from on-board system engineering or housekeeping data combined with attitude, position, and time parameters into ancillary data packets. On the ground, enhancement of the on-board ancillary data packets will provide standard calibrations and transformations not available on-board.
Remote access to an interactive ancillary database will allow users to select and withdraw specified parameters based on user-defined criteria. The collection of these services is referred to as ground data handling and will be a critical component of the Space Station-era ground data operations and mission management system under development at Goddard Space Flight Center for NASA institutional support of Space Station-compatible missions. Challenges represented by this need for more ground processing capability include:
* High-speed, high-rate multipath processors capable of continuous, real-time operation.
* High-volume data storage systems with high-rate data ingest, rapid access to separate segments of data sets, and high-rate data output.
* Sophisticated information and system management services to provide system configuration monitoring and control, user support, and minimal human interaction.
* Interactive database structures with traceable parameter updating and self-identified, standard data set formatting.
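The core Level Zero Processing services named above (time-ordering, duplicate removal, gap flagging) reduce to a few lines. This toy version works on (sequence count, payload) pairs and ignores the counter wrap-around a real system must handle:

```python
def level_zero(packets):
    """Toy Level Zero Processing: time-order packets by sequence count,
    drop redundant copies, and flag gaps where packets are missing.
    Returns (ordered payloads, list of (first, last) missing ranges)."""
    seen = {}
    for seq, payload in packets:
        seen.setdefault(seq, payload)            # keep first copy, drop duplicates
    seqs = sorted(seen)
    gaps = [(a + 1, b - 1) for a, b in zip(seqs, seqs[1:]) if b - a > 1]
    return [seen[s] for s in seqs], gaps

# Playback interleaved with real-time data: out of order, one duplicate, one gap
pkts = [(3, b"C"), (1, b"A"), (2, b"B"), (3, b"C"), (6, b"F")]
data, gaps = level_zero(pkts)
assert data == [b"A", b"B", b"C", b"F"]
assert gaps == [(4, 5)]
```

The flagged gaps are what would drive retransmission requests or quality annotations in the delivered data set.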
Barrera, Raymond C. "Command and Control Data dissemination using IP multicast." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1999. http://handle.dtic.mil/100.2/ADA376707.
"December 1999". Thesis advisor(s): Bert Lundy. Includes bibliographical references (p. 75-76). Also available online.
Schwier, Jason Montgomery. "Pattern recognition for command and control data systems." Connect to this title online, 2009. http://etd.lib.clemson.edu/documents/1252424695/.
Gordon, Michael. "SGLS Command Data Encoding Using Direct Digital Synthesis." International Foundation for Telemetering, 1992. http://hdl.handle.net/10150/608937.
The Space Ground Link Subsystem (SGLS) provides full-duplex communications for commanding, tracking, telemetry, and ranging between spacecraft and ground stations. The uplink command signal is an S-band carrier phase-modulated with the frequency-shift-keyed (FSK) command data. The command data format is a ternary (S, 1, 0) signal, and command data rates of 1, 2, and 10 kbps are used. The method presented uses direct digital synthesis (DDS) to generate the SGLS command data and clock signals. The ternary command data and clock signals are input to the encoder, and an FSK subcarrier with an amplitude-modulated clock is digitally generated. The command data rate determines the frequencies of the S, 1, and 0 tones. DDS ensures that phase continuity is maintained, and frequency stability is determined by the accuracy of the microprocessor crystal. Frequency resolution can be maintained to within a few Hz from DC to over 2 MHz. This allows for the generation of the 1 and 2 kbps command data formats as well as the newer 10 kbps format, and additional formats could be accommodated through software modifications. The use of digital technology provides for encoder self-testing and more comprehensive error reporting.
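The key property claimed for DDS, phase continuity across frequency changes, comes from keeping a single phase accumulator and changing only its per-sample increment. A sketch (the sample rate and tone frequencies below are illustrative assumptions, not the SGLS specification):

```python
import math

def dds_fsk(symbols, tones_hz, fs_hz, samples_per_symbol):
    """Direct digital synthesis of an FSK subcarrier: one phase
    accumulator, stepped by a per-symbol tuning word, so frequency
    changes are phase-continuous (no jump at symbol boundaries)."""
    acc, out = 0.0, []
    for sym in symbols:
        step = 2 * math.pi * tones_hz[sym] / fs_hz  # phase increment/sample
        for _ in range(samples_per_symbol):
            out.append(math.sin(acc))
            acc = (acc + step) % (2 * math.pi)      # accumulator carries over
    return out

# Ternary S/1/0 alphabet with three illustrative tone frequencies
samples = dds_fsk("S10", {"S": 65000.0, "1": 76000.0, "0": 95000.0},
                  fs_hz=1_000_000, samples_per_symbol=100)
assert len(samples) == 300
# Phase continuity: no sample-to-sample jump exceeds the largest phase step
assert max(abs(b - a) for a, b in zip(samples, samples[1:])) < 0.6
```

Because the accumulator is never reset, switching tones mid-stream cannot produce the phase discontinuities (and spectral splatter) of switching between free-running oscillators.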
Portell i de Mora, Jordi. "Payload data handling, telemetry and data compression systems for Gaia." Doctoral thesis, Universitat Politècnica de Catalunya, 2005. http://hdl.handle.net/10803/6585.
A mission like this entails a major technological and design effort, since hundreds of stars must be detected, selected and measured every second, and the resulting data then transmitted to Earth from more than one and a half million kilometres away. This thesis focuses on that side of the mission, proposing designs for the payload data handling, science telemetry and data compression systems. Our final goal is to make it possible to transmit to the ground station the immense amount of data generated by the instruments, given the limited capacity of the communications channel. This requires the design of a lossless data compression system that offers the best possible compression ratios and guarantees the integrity of the transmitted data, which poses a great challenge for the methods of information theory and for data compression system design.
These technological aspects had yet to be studied, or only preliminary drafts were available, since the mission itself was at a preliminary stage when we began this thesis. Our work has therefore been received with enthusiasm by the project's scientists and engineers.
We first reviewed the operational environment of our study, described in the first part of the thesis. This includes the several reference systems and the conventions we proposed in order to unify measurements, data references and designs. This proposal has been used as an initial reference in the mission and is currently being extended and improved by other scientists. We also compiled the main characteristics of the astrometric instrument (on which we focused our study) and reviewed its operational guidelines, which other teams have also taken into account.
In the second part of the thesis we describe our proposal for the Gaia payload data handling system, which has been used to present the scientific requirements to the industrial teams and is in itself a feasible (though simplified) implementation option. In the next part we study the science telemetry, compiling the data fields to be generated by the instruments and proposing an optimized coding and transmission scheme, which reduces the occupation of the communications channel and is ready to include an optimized data compression system. The latter is described in the fourth and final part of the thesis, where we show that our proposal fulfils the compression requirements almost completely, roughly doubling the compression ratios offered by the best standard systems. Our design is the best solution currently available for Gaia, and its performance has been adopted as the baseline by other teams.
The results of our work go beyond a written dissertation: they are complemented by software applications that we developed to help design, optimize and verify the operation of the systems proposed here. The complexity of the work was increased by the need to keep it continuously up to date with the changes the mission design underwent during the five years of the doctorate. Finally, we are satisfied with the results, since most of them have been (or are being) taken into account by many of the teams involved in the mission, and by the European Space Agency itself in the final design.
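As a generic illustration of why instrument-aware preprocessing can substantially improve on what a standard codec achieves alone (this is not the thesis's actual Gaia scheme), compare zlib on raw 16-bit samples versus delta-encoded ones:

```python
import random
import struct
import zlib

random.seed(1)
# Simulated slowly varying 16-bit sample stream (invented, not Gaia data)
samples, level = [], 1000
for _ in range(10000):
    level = max(0, min(65535, level + random.randint(-3, 3)))
    samples.append(level)

raw = struct.pack("<10000H", *samples)

# Delta preprocessing: store first sample, then wrapped differences
deltas = [samples[0]] + [(b - a) & 0xFFFF for a, b in zip(samples, samples[1:])]
pre = struct.pack("<10000H", *deltas)

ratio_plain = len(raw) / len(zlib.compress(raw, 9))
ratio_delta = len(raw) / len(zlib.compress(pre, 9))
assert ratio_delta > ratio_plain   # preprocessing pays off on smooth signals
```

The deltas concentrate nearly all the signal's entropy into a handful of small values, so the generic entropy coder downstream has far less work to do; mission-specific schemes exploit the instrument's statistics much more aggressively.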
Sun, Jiake, and Wenjie Jiang. "Analysis Tool for Warehouse Material Handling Data." Thesis, Högskolan i Halmstad, Sektionen för Informationsvetenskap, Data– och Elektroteknik (IDE), 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-15205.
Sánchez Pinsach, David. "Handling Missing Data in Clinical Decision Support." Doctoral thesis, Universitat Autònoma de Barcelona, 2020. http://hdl.handle.net/10803/671318.
Deciding which treatments are best is a complex task when patients suffer multiple impairments and a multidisciplinary team is involved in the intervention. There is always more than one treatment option, and the results can sometimes be seen over a short period or can only be measured once the treatment is finished. In this context, the design of effective Clinical Decision Support Systems (CDSS) to help clinicians select the most appropriate interventions is still a challenge. The amount of available data is not always the same for all patients, especially in early treatment stages, hindering inference in CDSS. To improve the capabilities of CDSS, different components are proposed within a CDSS framework for long-term treatments. A first component focuses on improving the quality of inferences in missing-data scenarios. The Dynamic Multiple Imputation (DMI) algorithm is presented as an effective methodology for data enhancement in CDSS. DMI is able to adapt to different scenarios with a low or high percentage of missing data, and the experiments conducted reveal that it is especially competitive in regression problems. A second component is devoted to weighting confidence measures, given the uncertainty associated with missing information, by incorporating Mutual Information measures into existing confidence estimators. A third component, based on a community detection algorithm, is proposed to find relationships between clinical decisions that are not explicit. Finally, to illustrate the applicability of the proposed components, two real clinical use cases with chronic patients are presented: one in the hospital context and the other in the home context.
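DMI itself is not reproduced here, but the underlying multiple-imputation idea can be sketched generically: fill the gaps several times from the observed data, pool the point estimates, and read the between-imputation spread as imputation uncertainty (the data values are invented):

```python
import random
import statistics

def multiple_imputation_mean(values, m=20, seed=0):
    """Generic multiple imputation (not the thesis's DMI algorithm):
    fill each None by sampling from the observed values, repeat m times,
    and pool the per-imputation means. The spread across imputations
    signals how much the missingness matters for the estimate."""
    rng = random.Random(seed)
    observed = [v for v in values if v is not None]
    estimates = []
    for _ in range(m):
        filled = [v if v is not None else rng.choice(observed) for v in values]
        estimates.append(statistics.mean(filled))
    return statistics.mean(estimates), statistics.stdev(estimates)

data = [4.1, None, 3.8, 4.4, None, 4.0, 3.9]
pooled, spread = multiple_imputation_mean(data)
assert 3.8 <= pooled <= 4.4   # pooled estimate stays within observed range
assert spread >= 0
```

Compared with single (e.g. mean) imputation, repeating the fill-in step keeps the estimate honest about how much was guessed.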
Embleton, Nina Lois. "Handling sparse spatial data in ecological applications." Thesis, University of Birmingham, 2015. http://etheses.bham.ac.uk//id/eprint/5840/.
Prasad, Rohini, and Gerta Malaj. "Analysis of inefficiencies in shipment data handling." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/112861.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 113-116).
Supply chain visibility is critical for businesses to manage their operational risks. Availability of high quality and timely data regarding shipments is a precursor for supply chain visibility. This thesis analyses the errors that occur in shipment data for a freight forwarder. In this study, two types of errors are analyzed: system errors, arising from violations of business rules defined in the software system, and operational errors, which violate business rules or requirements defined outside the software. We consolidated multifarious shipment data from multiple sources and identified the relationship between errors and the shipment attributes such as source or destination country. Data errors can be costly, both from a human rework perspective as well as from the perspective of increased risk due to supply chain visibility loss. Therefore, the results of this thesis will enable companies to focus their efforts and resources on the most promising error avoidance initiatives for shipment data entry and tracking. We use several descriptive analytical techniques, ranging from basic data exploration guided by plots and charts to multidimensional visualizations, to identify the relationship between error occurrences and shipment attributes. Further, we look at classification models to categorize data entries that have a high error probability, given certain attributes of a shipment. We employ clustering techniques (K-means clustering) to group shipments that have similar properties, thereby allowing us to extrapolate behaviors of erroneous data records to future records. Finally, we develop predictive models using Naive-Bayes classifiers and Neural Networks to predict the likelihood of errors in a record. The results of the error analysis in the shipment data are discussed for a freight forwarder. 
A similar approach can be employed for supply chains of any organization that engages in physical movement of goods, in order to manage the quality of the shipment data inputs, thereby managing their supply chain risks more effectively.
by Rohini Prasad and Gerta Malaj.
M. Eng. in Supply Chain Management
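In the spirit of the Naive Bayes error-prediction step described above (the shipment attributes and training records below are invented for illustration):

```python
from collections import Counter, defaultdict

def train_nb(records):
    """Tiny categorical Naive Bayes for shipment-error prediction.
    records: list of (attributes_dict, has_error_bool) pairs."""
    class_counts = Counter(label for _, label in records)
    feat_counts = defaultdict(Counter)
    for attrs, label in records:
        for k, v in attrs.items():
            feat_counts[(label, k)][v] += 1

    def prob_error(attrs):
        scores = {}
        for label in (True, False):
            p = class_counts[label] / len(records)          # class prior
            for k, v in attrs.items():
                c = feat_counts[(label, k)]
                p *= (c[v] + 1) / (sum(c.values()) + 2)     # Laplace smoothing
            scores[label] = p
        return scores[True] / (scores[True] + scores[False])

    return prob_error

shipments = [({"origin": "DE", "mode": "air"}, True),
             ({"origin": "DE", "mode": "sea"}, True),
             ({"origin": "US", "mode": "sea"}, False),
             ({"origin": "US", "mode": "air"}, False),
             ({"origin": "US", "mode": "sea"}, False)]
p = train_nb(shipments)
assert p({"origin": "DE", "mode": "sea"}) > 0.5   # DE shipments erred in training
assert p({"origin": "US", "mode": "sea"}) < 0.5
```

The per-attribute factorization is what makes the model cheap enough to score every incoming record, flagging likely-erroneous entries for human review before they propagate.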
Bring, Marcus. "Handling XML Data Using Multi-threaded Solutions." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-153960.
This project was carried out as part of a Master's degree in Computer Science at KTH Royal Institute of Technology in Stockholm. Scania AB acted as the commissioning employer and contributed both guidance and supervisors. The thesis explores concepts around computation on XML data and how multi-threaded programming can help reduce execution times. The report begins by explaining the basics of the XML language and then presents fundamental concepts of multi-threaded programming. During the project, two simulations were developed and performance-tested: a general simulation intended to show that the proposed methods work, and a more realistic simulation intended to indicate whether the methods also hold up under more lifelike conditions. The performance tests show good potential and form the basis for a further discussion of the system's limitations.
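The parallelization pattern explored in work like this, independent XML documents handed to a pool of threads, can be sketched as follows. Note that in CPython pure-Python parsing is limited by the GIL, so the gains are largest for I/O-bound or GIL-releasing workloads:

```python
import concurrent.futures as cf
import xml.etree.ElementTree as ET

# Invented sample workload: many small, independent XML documents
docs = [f"<order><qty>{i}</qty></order>" for i in range(1000)]

def process(doc):
    """Parse one XML document and extract a value; since documents are
    independent, the workload is embarrassingly parallel."""
    return int(ET.fromstring(doc).findtext("qty"))

# A thread pool maps the worker over the documents concurrently
with cf.ThreadPoolExecutor(max_workers=4) as pool:
    totals = list(pool.map(process, docs))

assert sum(totals) == sum(range(1000))
```

Splitting at document (or record) boundaries sidesteps the harder problem of parallelizing a single parser over one large file.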
Barber, Mark H., and Paul R. Richey. "Naval Supply Systems Command: Data Administration planning and implementation." Thesis, Monterey, California. Naval Postgraduate School, 1989. http://hdl.handle.net/10945/27154.
Van der Merwe, Benjamin. "Micro-satellite data handling: a unified information model." Thesis, Stellenbosch: Stellenbosch University, 2001. http://hdl.handle.net/10019.1/52336.
ENGLISH ABSTRACT: This thesis describes various software technologies implemented, or specifically developed, for the SUNSAT micro-satellite mission. With the discussion centered on the Mission Operations System functions of Data Handling and Mission Control, particular emphasis is placed on data processing aspects such as the deployed database schema, and the data communications mechanisms implemented as part of the communications protocol stack. Both the groundsystem architecture and the Flight Software are discussed, their constituent components are analysed, and recommendations are made for improvement. Finally, a Unified Information Model for the design and operation of future, integrated satellite groundsystems is proposed, with suitable implementation technologies being identified and introduced.
Ren, Yueming. "Vector data topological encoding error handling and generalization." Dissertation, Carleton University, Ottawa, 1992.
Wang, Xue. "Handling missing data problems in criminology: an introduction." Thesis, University of Macau, 2016. http://umaclib3.umac.mo/record=b3570879.
Morais de Lira, Ana Karina. "Separating variables in the context of data handling." Thesis, University College London (University of London), 2000. http://discovery.ucl.ac.uk/10020359/.
Rothschedl, Christopher, Roland Ritt, Paul O'Leary, Matthew Harker, Michael Habacher, and Michael Brandner. "Real-time-data analytics in raw materials handling." TU Bergakademie Freiberg, 2017. https://tubaf.qucosa.de/id/qucosa%3A23195.
Nordmark, Martin. "Optimizing the Process of Handling Road Network Data." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-396819.
Jayawardena, D. P. W. "The role of triangulation in spatial data handling." Thesis, Keele University, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.385666.
Rothschedl, Christopher, Roland Ritt, Paul O'Leary, Matthew Harker, Michael Habacher, and Michael Brandner. "Real-time-data analytics in raw materials handling." Technische Universitaet Bergakademie Freiberg Universitaetsbibliothek "Georgius Agricola", 2018. http://nbn-resolving.de/urn:nbn:de:bsz:105-qucosa-231350.
Grunzke, Richard. "Generic Metadata Handling in Scientific Data Life Cycles." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2016. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-202070.
Randahl, David. "Raoul: An R-Package for Handling Missing Data." Thesis, Uppsala universitet, Statistiska institutionen, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297051.
Betz, Hariolf, Frank Raiser, and Thom Frühwirth. "Persistent constraints in constraint handling rules." Universität Potsdam, 2010. http://opus.kobv.de/ubp/volltexte/2010/4154/.
Abdennadher, Slim, Haythem Ismail, and Frederick Khoury. "Transforming imperative algorithms to constraint handling rules." Universität Potsdam, 2010. http://opus.kobv.de/ubp/volltexte/2010/4153/.
Witta, Eleanor Lea. "Seven methods of handling missing data using samples from a national data base." Diss., This resource online, 1992. http://scholar.lib.vt.edu/theses/available/etd-06062008-170840/.
Ding, Yuanyuan. "Handling complex, high dimensional data for classification and clustering." Full text available from ProQuest UM Digital Dissertations, 2007. http://0-proquest.umi.com.umiss.lib.olemiss.edu/pqdweb?index=0&did=1400971141&SrchMode=2&sid=1&Fmt=2&VInst=PROD&VType=PQD&RQT=309&VName=PQD&TS=1219343482&clientId=22256.
Full textShum, L. L. "Topology control and data handling in wireless sensor networks." Thesis, University College London (University of London), 2009. http://discovery.ucl.ac.uk/18577/.
Full textTwala, Bhekisipho. "Effective techniques for handling incomplete data using decision trees." Thesis, Open University, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.418465.
Full textHerman, James Stackpole. "A sail force dynamometer : design, implementation and data handling." Thesis, Massachusetts Institute of Technology, 1989. http://hdl.handle.net/1721.1/14491.
Full textShimakawa, Hiromitsu. "Studies on Real-Time Handling of Time Dependent Data." Kyoto University, 1999. http://hdl.handle.net/2433/157059.
Full textKyoto University (京都大学)
0048
新制・課程博士
博士(工学)
甲第7850号
工博第1830号
新制||工||1141(附属図書館)
UT51-99-G444
京都大学大学院工学研究科情報工学専攻
(主査)教授 上林 彌彦, 教授 岩間 一雄, 教授 湯淺 太一
学位規則第4条第1項該当
Ferreira, Carlos André Marques Viana. "Handling data access latency in distributed medical imaging environments." Doctoral thesis, Universidade de Aveiro, 2015. http://hdl.handle.net/10773/15491.
Full text
Web-based technologies have been increasingly used in Picture Archiving and Communication Systems (PACS), in services related to the storage, distribution and visualization of medical images. Nowadays, many healthcare institutions are federating services and outsourcing their repositories to the Cloud. However, managing communications between multiple geo-distributed locations is still challenging due to the complexity of dealing with huge volumes of data and bandwidth limitations. Communication latency is a critical issue that still hinders the adoption of this paradigm. In order to improve the performance of distributed medical imaging networks, routing mechanisms with caching and prefetching can be used. This thesis proposes a cache architecture based on static rules together with pattern recognition, applied to both cache eviction and prefetching.
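As an aside, the caching idea summarized in this abstract (static eviction rules combined with pattern-based prefetching) can be sketched in a few lines of Python. Everything below is an illustrative toy, not code from the thesis; the class name, the pinning rule and the Markov-style next-access predictor are all assumptions:

```python
from collections import OrderedDict, defaultdict

class ImagingCache:
    """Toy cache for imaging objects: a static eviction rule (pinned
    studies are never evicted) plus a first-order access-pattern
    predictor used for prefetching. Hypothetical sketch only."""

    def __init__(self, capacity, fetch):
        self.capacity = capacity     # max number of cached studies
        self.fetch = fetch           # callable: study_id -> bytes (remote fetch)
        self.store = OrderedDict()   # study_id -> data, kept in LRU order
        self.pinned = set()          # static rule: these ids are never evicted
        self.transitions = defaultdict(lambda: defaultdict(int))  # A -> {B: count}
        self.last_access = None

    def _evict_if_needed(self):
        while len(self.store) > self.capacity:
            for sid in self.store:               # oldest first (LRU order)
                if sid not in self.pinned:
                    del self.store[sid]
                    break
            else:
                break                            # everything pinned; give up

    def _predict(self, study_id):
        followers = self.transitions.get(study_id)
        return max(followers, key=followers.get) if followers else None

    def get(self, study_id):
        # Record the observed access pattern as transition counts.
        if self.last_access is not None:
            self.transitions[self.last_access][study_id] += 1
        self.last_access = study_id

        if study_id in self.store:
            self.store.move_to_end(study_id)             # cache hit
        else:
            self.store[study_id] = self.fetch(study_id)  # miss: remote fetch
            self._evict_if_needed()

        # Prefetch the most likely next study, if the pattern is known.
        nxt = self._predict(study_id)
        if nxt is not None and nxt not in self.store:
            self.store[nxt] = self.fetch(nxt)
            self._evict_if_needed()
        return self.store[study_id]
```

In this toy, repeated accesses of study B after study A teach the cache to pull B back in as soon as A is touched, even after B has been evicted.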
Ang, Su-Shin. "Handling data dependent memory accesses in custom hardware applications." Thesis, Imperial College London, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.501122.
Full text
Whitelaw, Virginia A. "Telemetry Handling on the Space Station Data Management System." International Foundation for Telemetering, 1987. http://hdl.handle.net/10150/615260.
Full textTraditional space telemetry has generally been handled as asynchronous data stream fed into a time division multiplexed channel on a point-to-point radio frequency (RF) link between space and ground. The data handling concepts emerging for the Space Station challenge each of these precepts. According to current concepts, telemetry data on the Space Station will be packetized. It will be transported asynchronously through onboard networks. The space-to-ground link will not be time division multiplexed, but rather will have flexibly managed virtual channels, and finally, the routing of telemetry data must potentially traverse multiple ground distribution networks. Appropriately, the communication standards for handling telemetry are changing to support the highly networked Space Station environment. While a companion paper (1. W. Marker, "Telemetry Formats for the Space Station RF Links") examines the emerging telemetry concepts and formats for the RF link, this paper focuses on the impact of telemetry handling on the design of the onboard networks that are part of the Data Management System (DMS). The DMS will provide the connectivity between most telemetry sources and the onboard node for transmission to the ground. By far the bulk of data transported by DMS will be telemetry, however, not all telemetry will place the same demands on the communication system and DMS must also satisfy a rich array of services in support of distributed Space Station operations. These services include file transfer, data base access, application messaging and several others. The DMS communications architecture, which will follow the International Standards Organization (ISO) Reference Model, must support both the high throughput needed for telemetry transport, as well as the rich services needed for distributed computer systems. This paper discusses an architectural approach to satisfying the dual set of requirements and discusses several of the functionality vs. 
performance trade-offs that must be made in developing an optimized mechanism for handling telemetry data in the DMS.
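The packetized telemetry described in this abstract follows the scheme later standardized by CCSDS (and used by the FAST mission cited above). As a generic illustration of the CCSDS Space Packet layout (per CCSDS 133.0-B), not code from any of these papers, the 6-octet primary header can be packed like this:

```python
import struct

def ccsds_primary_header(apid, seq_count, data_length,
                         packet_type=0, sec_hdr=0, seq_flags=0b11):
    """Pack the 6-octet CCSDS Space Packet primary header.

    apid        : 11-bit Application Process Identifier
    seq_count   : 14-bit packet sequence count
    data_length : number of octets in the packet data field
    The encoded length field is (data_length - 1), per CCSDS 133.0-B.
    """
    version = 0  # packet version number '000'
    word1 = (version << 13) | (packet_type << 12) | (sec_hdr << 11) | (apid & 0x7FF)
    word2 = (seq_flags << 14) | (seq_count & 0x3FFF)
    word3 = data_length - 1
    return struct.pack(">HHH", word1, word2, word3)  # big-endian, 3 x 16 bits

def make_tm_packet(apid, seq_count, payload):
    """Prepend a primary header to a telemetry payload (source data)."""
    return ccsds_primary_header(apid, seq_count, len(payload)) + payload
```

The APID plays the role of the "virtual channel"-level routing key discussed above: ground systems can demultiplex packets from many onboard sources by APID alone, without a fixed time-division schedule.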
Almeida, Miguel Alexandre Dias. "Planning, operations and data handling of planetary science missions." Master's thesis, FCT-UNL, 2011. http://hdl.handle.net/10362/7081.
Full text
Since the dawn of humankind, the celestial sphere has had a special place in our imagination, and I have always felt that same passion for the cosmos. In particular, fuelled by the journeys of discovery of the Solar System, I always dreamed of dedicating myself to planetary exploration. To follow that idea I studied physics and, in 1999, I finally got the opportunity to work in Planetary Science research. It all started at the Lisbon Observatory, where I analysed infrared data from the Jupiter orbiter Galileo, collected by its Near-Infrared Mapping Spectrometer (NIMS). At this stage I learned some of the methods used in data analysis. I continued my career at the European Space Agency in the science planning of the SMART-1 lunar spacecraft. I remained on this mission for the following seven years and saw my responsibilities grow: I started by giving technical support to the Project Scientist, and by the end of the mission I had been a major player in the setup of the planning system and had worked in all capacities within the SMART-1 Science and Technology Operations Centre and on the Advanced Moon Micro-Imager Experiment (AMIE). Finally, in 2006, I started working on the Venus Express project as Liaison Scientist for the Venus Monitoring Camera (VMC) instrument. Together with the Principal Investigator, I reworked the VMC planning system in order to gather more valuable science data. While this optimization was carried out, and since the spacecraft was already orbiting Venus, my duties also included, and still do, planning and operating the VMC camera. Throughout my career I have had two side projects running alongside my main tasks. I have always had an interest in innovation, and some of my ideas developed into integral parts of my main projects. I also kept an interest in data analysis, as time permitted, which I pursued with NIMS, AMIE and VMC.
Gugliermo, Simona. "Occlusion handling in Augmented Reality context." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-262679.
Full text
Handling occlusion between real and virtual objects is a challenging problem in augmented reality (AR) applications. Incorrect occlusion handling can confuse the user's perception, leading to unrealistic AR experiences. Although successful research examples and implementations exist in this area, they often have significant limitations. The method presented in this work takes both raw depth and colour information from an RGB-D sensor to resolve occlusion in unknown AR contexts. The novelty of the method is its automatic trimap generation, which builds on the colour distribution of the real scene. Moreover, the proposed method can operate using information from only the latest frame, or efficiently exploit information from several frames; in the latter case, information from the previous frame's output is used to improve the trimap generation for the latest frame. Experimental evaluations on several scenes show that the method substantially improves automatic trimap generation using both RGB and depth information. Furthermore, the proposed method is compared with several state-of-the-art approaches in terms of quality and accuracy, and is shown to overcome some of their known limitations.
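To make the trimap idea concrete: a depth-seeded trimap, to be refined afterwards by colour-based matting as the abstract describes, could be generated roughly as follows. The labels, margin and function are hypothetical illustrations, not the author's implementation:

```python
FG, BG, UNKNOWN = 0, 1, 2  # trimap labels

def generate_trimap(sensor_depth, virtual_depth, margin=0.05):
    """Seed a per-pixel trimap for occlusion handling.

    sensor_depth  : 2-D list of measured real-scene depths (0.0 = invalid)
    virtual_depth : 2-D list of rendered virtual-object depths (None = no object)
    margin        : depth band (metres) treated as ambiguous

    Pixels where the real surface is clearly nearer than the virtual
    object become foreground (the real occluder must be drawn over the
    virtual content); clearly farther pixels are background; invalid or
    ambiguous depths go to the unknown region, to be resolved later by
    colour-distribution-based matting.
    """
    trimap = []
    for d_row, v_row in zip(sensor_depth, virtual_depth):
        row = []
        for d, v in zip(d_row, v_row):
            if v is None:
                row.append(BG)        # no virtual content at this pixel
            elif d == 0.0 or abs(d - v) < margin:
                row.append(UNKNOWN)   # invalid or ambiguous depth
            elif d < v:
                row.append(FG)        # real surface occludes the virtual object
            else:
                row.append(BG)        # virtual object is in front
        trimap.append(row)
    return trimap
```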
Pratt, Everett S. "Data compression standards and applications to Global Command and Control System." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1994. http://handle.dtic.mil/100.2/ADA293262.
Full text
Thesis advisor(s): Murali Tummala, Paul H. Moose. "December 1994." Includes bibliographical references. Also available online.
Davis, Rodney, Greg Hupf, and Chad Woolf. "Advanced Data Description Exchange Services for Heterogeneous Systems." International Foundation for Telemetering, 2004. http://hdl.handle.net/10150/605341.
Full text
CCT is conducting research to provide a cross-platform software capability that enables a common semantic for control and monitoring of highly distributed systems-of-systems C2 architectures by auto-generating semantic processing services from standardized metadata specifications. This new capability is significant because it will reduce development, operations, and support costs for legacy and future systems that are part of ground- and space-based distributed command and control systems. It will also establish a space systems information exchange model that can support future highly interoperable and mobile software systems.
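The auto-generation idea in this abstract can be illustrated with a toy: a decoder function generated from a standardized metadata specification, so that every system interprets a structure with the same semantics. The schema and field names below are invented for illustration and are not CCT's format:

```python
import struct

# Hypothetical metadata specification for one telemetry structure.
SPEC = {
    "name": "PowerStatus",
    "byte_order": ">",  # big-endian
    "fields": [
        {"name": "bus_voltage", "fmt": "H", "scale": 0.01},   # raw counts -> volts
        {"name": "bus_current", "fmt": "h", "scale": 0.001},  # raw counts -> amps
        {"name": "mode",        "fmt": "B", "scale": 1},
    ],
}

def make_decoder(spec):
    """Auto-generate a decoder from a metadata specification, so the
    semantics live in the (exchangeable) spec rather than in hand-written
    per-system parsing code."""
    fmt = spec["byte_order"] + "".join(f["fmt"] for f in spec["fields"])
    names = [f["name"] for f in spec["fields"]]
    scales = [f["scale"] for f in spec["fields"]]

    def decode(raw):
        values = struct.unpack(fmt, raw)
        return {n: v * s for n, v, s in zip(names, values, scales)}
    return decode
```

Exchanging `SPEC` between heterogeneous systems, rather than code, is the essence of the approach: each side regenerates its own processing services from the same description.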
Khailtash, Amal. "Handling large data storage in synthesis of multiple FPGA systems." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape8/PQDD_0007/MQ43653.pdf.
Full textKift, M. H. "The use of hypertext in handling process engineering design data." Thesis, Swansea University, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.637796.
Full textJohansson, Åsa M. "Methodology for Handling Missing Data in Nonlinear Mixed Effects Modelling." Doctoral thesis, Uppsala universitet, Institutionen för farmaceutisk biovetenskap, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-224098.
Full textBrint, Andrew Timothy. "Matching algorithms for handling three dimensional molecular co-ordinate data." Thesis, University of Sheffield, 1988. http://etheses.whiterose.ac.uk/15147/.
Full textKönig, Hans-Henrik. "Calphad data handling for generic precipitation modelling coupled with FEM." Thesis, KTH, Materialvetenskap, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-280039.
Full text
For the development of a generic modelling tool for precipitation kinetics in inhomogeneous components, efficient data handling is required that enables the integration of models across length scales and reduces computation time and resource consumption. In this thesis, an automated method for generating, curating and transforming thermodynamic and kinetic Calphad data is developed and tested, enabling the integration of precipitation models into finite element method (FEM) codes. Pycalphad, an open-source tool, is used to access Calphad databases, and a Python script computes the thermodynamic and kinetic parameters used in the precipitation model. The data are stored in an open-source infrastructure. The developed method is demonstrated by generating, curating and transforming data for the binary model system Cu-Co. The results show that Pycalphad can be used to provide the necessary thermodynamic and kinetic parameters for precipitation models. Further improvement of the presented source code is needed to enable application across the entire composition range.
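As an illustration of the curation step described here, Calphad-derived quantities can be tabulated on a temperature grid and stored in a portable record that an FEM code interpolates at run time, avoiding repeated equilibrium calculations. The record layout and callables below are hypothetical; in the thesis the quantities come from Pycalphad:

```python
import json

def curate_parameters(system, temperatures, gibbs_fn, mobility_fn):
    """Tabulate thermodynamic and kinetic quantities on a temperature
    grid and serialize them for consumption by an FEM precipitation
    model. Illustrative sketch: `gibbs_fn` and `mobility_fn` stand in
    for Calphad-based calculations (e.g. via Pycalphad).
    """
    record = {
        "system": system,
        "temperature_K": list(temperatures),
        "driving_force_J_per_mol": [gibbs_fn(T) for T in temperatures],
        "mobility_m2_per_Js": [mobility_fn(T) for T in temperatures],
    }
    return json.dumps(record, indent=2)
```

Storing plain tabulated values in an open format keeps the FEM side independent of the Calphad tooling, which is one motivation for a curation layer in the first place.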
Selvaraj, Poorani. "Group Method of Data Handling – How Does it Measure Up?" Ohio University / OhioLINK, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1479421385631538.
Full text
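For readers unfamiliar with the method named in this last title: the Group Method of Data Handling builds candidate polynomial models from pairs of inputs and selects among them using an external (validation) criterion. A minimal single-layer sketch with linear partial models, not the implementation evaluated in the thesis:

```python
from itertools import combinations

def _lstsq(A, y):
    """Solve the normal equations (A^T A) w = A^T y by Gaussian
    elimination with partial pivoting (A given as a list of rows)."""
    n = len(A[0])
    M = [[sum(A[r][i] * A[r][j] for r in range(len(A))) for j in range(n)]
         for i in range(n)]
    b = [sum(A[r][i] * y[r] for r in range(len(A))) for i in range(n)]
    for c in range(n):                       # forward elimination
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        b[c], b[p] = b[p], b[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n):
                M[r][k] -= f * M[c][k]
            b[r] -= f * b[c]
    w = [0.0] * n                            # back substitution
    for c in range(n - 1, -1, -1):
        w[c] = (b[c] - sum(M[c][k] * w[k] for k in range(c + 1, n))) / M[c][c]
    return w

def gmdh_layer(X_train, y_train, X_valid, y_valid):
    """One GMDH selection layer: fit a partial model y ~ a + b*xi + c*xj
    for every pair of inputs on the training split, rank the models by
    squared error on the validation split (the external criterion), and
    return the best as (validation_error, feature_pair, coefficients)."""
    best = None
    for i, j in combinations(range(len(X_train[0])), 2):
        A = [[1.0, row[i], row[j]] for row in X_train]
        w = _lstsq(A, y_train)
        err = sum((w[0] + w[1] * row[i] + w[2] * row[j] - yv) ** 2
                  for row, yv in zip(X_valid, y_valid))
        if best is None or err < best[0]:
            best = (err, (i, j), w)
    return best
```

A full GMDH stacks such layers, feeding the surviving models' outputs forward as new inputs until the external criterion stops improving; classical formulations also use quadratic rather than linear partial models.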