To see the other types of publications on this topic, follow the link: Data controllers.

Dissertations / Theses on the topic 'Data controllers'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 dissertations / theses for your research on the topic 'Data controllers.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Rossi, Mauro. "Low order controllers for sampled-data systems." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape8/PQDD_0021/NQ38263.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Shultz, James Edward Jr. "Programmable Logic Controllers and Supervisory Control and Data Acquisition: A System Design for Increased Availability." Ohio University / OhioLINK, 1991. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1239733126.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Shultz, James Edward. "Programmable logic controllers and supervisory control and data acquisition a system design for increased availability." Ohio : Ohio University, 1991. http://www.ohiolink.edu/etd/view.cgi?ohiou1239733126.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Aronsson, Claes. "Evolution of Neural Controllers for Robot Teams." Thesis, University of Skövde, Department of Computer Science, 2002. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-732.

Full text
Abstract:

This dissertation evaluates evolutionary methods for evolving cooperative teams of robots. Cooperative robotics is a challenging research area in the field of artificial intelligence. Individual, autonomous robots may, by cooperating, perform better than they can separately. The challenge of cooperative robotics is that performance relies on interactions between robots. These interactions are not always fully understood, which makes the design of hardware and software systems complex. Robotic soccer, such as the RoboCup competitions, offers an unpredictable, dynamic environment for competing robot teams and thus encourages research into these complexities. Instead of trying to solve these problems by designing and implementing the behavior by hand, the robots can learn how to behave through evolutionary methods. For this reason, this dissertation evaluates the evolution of neural controllers for a team of two robots in a competitive soccer environment. The idea is that evolutionary methods may be a solution to the complexities of creating cooperative robots. The methods used in the experiments are influenced by research on evolutionary algorithms for single autonomous robots and on robotic soccer. The results show that robot teams with simple reactive behaviors can evolve a form of cooperative behavior by relying on self-adaptation, with little supervision or human interference.
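The self-adaptive evolution the abstract describes can be sketched as a minimal genetic loop over controller weight vectors. Everything below (population size, mutation scale, and the `fitness` stub, which stands in for a simulated soccer match) is an illustrative assumption, not taken from the dissertation.

```python
import random

random.seed(1)  # reproducible illustrative run

def mutate(weights, sigma=0.1):
    # Gaussian perturbation of every connection weight
    return [w + random.gauss(0.0, sigma) for w in weights]

def fitness(weights):
    # Placeholder objective: in the dissertation this would be the
    # outcome of a simulated soccer match for the two-robot team.
    return -sum(w * w for w in weights)

def evolve(n_weights=8, pop_size=20, generations=50):
    population = [[random.uniform(-1.0, 1.0) for _ in range(n_weights)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=fitness, reverse=True)
        elite = scored[:pop_size // 4]          # keep the best quarter unchanged
        population = elite + [mutate(random.choice(elite))
                              for _ in range(pop_size - len(elite))]
    return max(population, key=fitness)

best = evolve()
```

Because the elite survive unmutated, the best fitness never degrades from one generation to the next, which is the "little supervision" property the abstract relies on.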

APA, Harvard, Vancouver, ISO, and other styles
5

Khatra, Ajit Paal Singh. "Implementation of a Multi-Layered Fuzzy Controller on an FPGA." Wright State University / OhioLINK, 2006. http://rave.ohiolink.edu/etdc/view?acc_num=wright1153505966.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Weston, Mindy. "The Right to Be Forgotten: Analyzing Conflicts Between Free Expression and Privacy Rights." BYU ScholarsArchive, 2017. https://scholarsarchive.byu.edu/etd/6453.

Full text
Abstract:
As modern technology continues to affect civilization, the issue of electronic rights grows in a global conversation. The right to be forgotten is a data protection regulation specific to the European Union, but its consequences are creating an international stir in the fields of mass communication and law. Freedom of expression and privacy rights are both founding values of the United States, protected by constitutional amendments written before the internet changed those fields as well. In a study that analyzes the legal process when these two fundamental values collide, this research offers insight into both personal and judicial views of informational priority. This thesis conducts a legal analysis of cases that cite the famous precedents of Melvin v. Reid and Sidis v. F-R Pub. Corp. to examine the factors on which U.S. courts of law determine whether freedom or privacy rules.
APA, Harvard, Vancouver, ISO, and other styles
7

Lind-Hård, Viktor. "What meets the eye : Naturalistic observations of air traffic controllers eye-movements during arrivals using eye-tracking." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-159934.

Full text
Abstract:
How do air traffic controllers (ATCos) distribute visual attention, and can it vary between controllers? This study explores these questions using primarily eye-tracking data and a couple of on-site interviews. Two ATCos with the most similar landings had their eye movements recorded with Tobii Pro Glasses 2, and every fixation during four landings was then categorized into different areas of interest. Two more ATCos were interviewed briefly during an observational visit to the control tower. The results showed that the ATCos distributed their attention fairly equally between the outside and the inside of the control tower. When attending outside, the runway was usually the focus; when attention was inside the control tower, the radar usually was. The ATCos differed in their attention distribution: the presumably more experienced ATCo directed attention outside the control tower more than the presumably less experienced ATCo. A large number of fixations were not categorized, which calls into question the method of dividing the ATCos' eye-tracking view into areas of interest.
APA, Harvard, Vancouver, ISO, and other styles
8

Schildt, Alessandro Nakoneczny. "Síntese de controladores ressonantes baseado em dados aplicado a fontes ininterruptas de energia." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2014. http://hdl.handle.net/10183/109164.

Full text
Abstract:
This work discusses controller tuning methods based on plant data. The proposal is to tune resonant controllers for application to the frequency inverters found in uninterruptible power supplies, with the goal of tracking sinusoidal voltage references. Within this context, the Virtual Reference Feedback Tuning (VRFT) algorithm is used: a data-driven controller identification method that is not iterative and does not require a system model to identify the controller. From data obtained from the plant, together with a reference model defined by the designer, the method estimates the parameters of a previously fixed controller structure by minimizing a cost function defined by the error between the desired and actual outputs. Moreover, a current feedback is required in the control loop, whose proportional gain is defined by empirical experiment. To demonstrate the method's application, simulated and practical results for a 5 kVA uninterruptible power supply are presented, employing linear and nonlinear loads. Performance is evaluated in terms of the quality of the actual output signal obtained with controllers tuned from different reference models, and distinct excitation signals are also used to feed the VRFT algorithm. The experimental results are obtained on a single-phase inverter with a real-time platform based on the dSPACE DS1104 data acquisition board. The results show that, with respect to international standards, the proposed control system tracks its reference well, operating at no load or with a linear load.
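The VRFT procedure the abstract outlines can be illustrated in a few lines: invert an assumed reference model to build a virtual reference from recorded plant data, form the virtual tracking error, and fit a fixed controller structure by one-shot least squares. The first-order plant, reference model, and PI structure below are illustrative assumptions, not the UPS resonant-controller setup from the thesis.

```python
import numpy as np

def vrft_pi(u, y, a=0.6):
    """One-shot VRFT fit of a PI controller u = Kp*e + Ki*cumsum(e).

    Assumed reference model M(z) = (1 - a)/(z - a), so the virtual
    reference satisfies y[t+1] = a*y[t] + (1 - a)*rbar[t].
    y must have one more sample than u.
    """
    rbar = (y[1:] - a * y[:-1]) / (1.0 - a)    # virtual reference
    e = rbar - y[:-1]                          # virtual tracking error
    phi = np.column_stack([e, np.cumsum(e)])   # PI regressor [e, sum(e)]
    theta, *_ = np.linalg.lstsq(phi, u, rcond=None)
    return theta                               # [Kp, Ki]

# Open-loop data from a toy first-order plant y[t+1] = 0.9*y[t] + 0.1*u[t]
rng = np.random.default_rng(0)
u = rng.standard_normal(500)
y = np.zeros(501)
for t in range(500):
    y[t + 1] = 0.9 * y[t] + 0.1 * u[t]
kp, ki = vrft_pi(u, y, a=0.6)
```

For this toy plant the ideal controller achieving M happens to be a PI, so the fit recovers it exactly from open-loop data alone; with real UPS data the resonant controller structure from the thesis would replace the PI regressor.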
APA, Harvard, Vancouver, ISO, and other styles
9

Da, Silva De Aguiar Raquel Stella. "Optimization-based design of structured LTI controllers for uncertain and infinite-dimensional systems." Thesis, Toulouse, ISAE, 2018. http://www.theses.fr/2018ESAE0020/document.

Full text
Abstract:
Non-smooth optimization techniques make it possible to solve difficult control-engineering problems that were out of reach for classical techniques, in particular control or filtering problems involving multiple models or structural constraints imposed to reduce cost and complexity. As a result, these techniques are better suited to providing realistic solutions to difficult practical problems. European aeronautics and space manufacturers have recently shown particular interest in these new techniques: they are sometimes part of the industrial process (THALES, AIRBUS DS Satellite, DASSAULT A) or are used in design offices (SAGEM, AIRBUS Transport). Studies are also under way, such as the one concerning atmospheric flight control of future launchers such as Ariane VI. The objective of this thesis is the exploration, specialization, and development of non-smooth optimization techniques and tools for engineering problems that are not yet satisfactorily solved: uncertainties of various kinds; optimization of observability and controllability; and simultaneous plant and controller design. A further aim is to assess the potential of these techniques against the state of the art, with aeronautics, space, and large-scale power systems as application domains that provide a particularly demanding setting.
APA, Harvard, Vancouver, ISO, and other styles
10

Papadaki, Evangelia. "What amendments need to be made to the current EU legal framework to better address the security obligations of data controllers?" Thesis, University of Southampton, 2018. https://eprints.soton.ac.uk/421046/.

Full text
Abstract:
The overall objective of this thesis is to identify the gaps in the current EU legal framework surrounding the security obligations of data controllers and make recommendations to help advance the discussions around the possible ways of effectively addressing the problem of cyber insecurity. The thesis adopts an interdisciplinary approach to data security, which involves legal analysis enriched with considerations from the fields of Computer Science and Managerial Economics. In response to the rapidly changing landscape of emerging technologies, which challenges the conventional thinking of regulators, the thesis calls for a shift in the data security regulation paradigm. The contribution of the thesis to knowledge in this field lies in reframing the elements that need to be incorporated into the laws regulating the security obligations of data controllers. The thesis proposes a holistic, dynamic, hybrid and layered approach to data security, which systematically tailors the security obligations of data controllers to the level of re-identification risk involved in data processing operations, and suggests security measures depending on the security level required while laying down the security objectives to be achieved. The proposed regulatory model can serve as guidance for regulators on the law-making process concerning the security obligations of data controllers. The proposed model aspires to provide adequate clarity to data controllers in terms of the initial phase of the design of security measures, while abstaining from imposing technology specific security requirements in order to grant flexibility to data controllers to adapt the security mechanisms to their particular business model and the given data environment.
APA, Harvard, Vancouver, ISO, and other styles
11

Olofsson, Jakob. "Input and Display of Text for Virtual Reality Head-Mounted Displays and Hand-held Positionally Tracked Controllers." Thesis, Luleå tekniska universitet, Institutionen för system- och rymdteknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-64620.

Full text
Abstract:
The recent increase in affordable virtual reality (VR) head-mounted displays has led to many new video games and applications being developed for virtual reality environments. The improvements to VR technology have introduced many new possibilities, but also new problems to solve in order to make VR software as comfortable and as effective as possible. In this report, different methods of displaying text and receiving text input in VR environments are investigated and measured. An interactive user study was conducted to evaluate and compare the performance and user opinion of three different text display methods and four separate virtual keyboard solutions. Results revealed that the distance between text and user, at the same relative text size, significantly affected the ease of reading the text, and that designing a good virtual keyboard for VR requires balancing multiple factors, for example precise control against the amount of physical exertion required. Additionally, the results suggest that the amount of previous experience with virtual reality equipment, and typing skill with regular physical keyboards, can meaningfully affect which solution is most appropriate.
APA, Harvard, Vancouver, ISO, and other styles
12

Groom, Eddie L. "Ethernet controller design for an embedded system using FPGA technology." Birmingham, Ala. : University of Alabama at Birmingham, 2008. https://www.mhsl.uab.edu/dt/2008m/groom.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Gudjonsson, Ludvik. "Comparison of two methods for evolving recurrent artificial neural networks for." Thesis, University of Skövde, University of Skövde, 1998. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-155.

Full text
Abstract:

In this dissertation a comparison is made between two evolutionary methods for evolving ANNs for robot control. The methods compared are SANE with enforced sub-populations and delta-coding, and marker-based encoding. In an attempt to speed up evolution, marker-based encoding is extended with delta-coding. The task selected for the comparison is the hunter-prey task. This task requires the robot controller to possess some form of memory, as the prey can move out of sensor range. Incremental evolution is used to evolve the complex behaviour required to handle this task successfully. The comparison is based on the computational power needed for evolution, and on the complexity, robustness, and generalisation of the resulting ANNs. The results show that marker-based encoding is the most efficient method tested and does not need delta-coding to speed up the evolution process. Additionally, the results indicate that delta-coding does not increase the speed of evolution with marker-based encoding.

APA, Harvard, Vancouver, ISO, and other styles
14

Rydman, Oskar. "Data processing of Controlled Source Audio Magnetotelluric (CSAMT) Data." Thesis, Uppsala universitet, Geofysik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-387246.

Full text
Abstract:
During this project three distinct methods to improve the processing of Controlled Source Audio Magnetotellurics (CSAMT) data are implemented and their advantages and disadvantages are discussed. The methods in question are:

• Detrending the time series in the time domain, instead of detrending in the frequency domain.

• Implementing a coherency test to pinpoint data segments of low quality and remove these data from the calculations.

• Implementing a method to detect and remove transients from the time series to reduce background noise in the frequency spectra.

Both the detrending in the time domain and the transient removal show potential to improve data quality, even if the improvements are small (both in the 1-10% range). Due to technical limitations no coherency test was implemented. Overall, the processes discussed in the report did improve the data quality and may serve as groundwork for further improvements.
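The detrending and transient-removal steps can be sketched generically. The least-squares detrend and the median/MAD despike below are common signal-processing stand-ins; the thresholds and the exact criteria used in the project are assumptions.

```python
import numpy as np

def detrend_time_domain(x):
    """Remove a best-fit straight line from the series before any FFT,
    so the trend does not leak into low-frequency spectral estimates."""
    t = np.arange(len(x))
    slope, intercept = np.polyfit(t, x, 1)
    return x - (slope * t + intercept)

def remove_transients(x, threshold=5.0):
    """Replace samples deviating more than `threshold` robust standard
    deviations from the median with the median (a simple despike)."""
    med = np.median(x)
    mad = np.median(np.abs(x - med)) * 1.4826  # robust sigma estimate
    spikes = np.abs(x - med) > threshold * mad
    out = x.copy()
    out[spikes] = med
    return out

# Illustrative series: a sinusoid plus linear drift plus one transient
t = np.linspace(0.0, 10.0, 200)
x = np.sin(2 * np.pi * t) + 0.3 * t
x[50] += 50.0
clean = remove_transients(detrend_time_domain(x))
```

Using the median absolute deviation rather than the sample standard deviation keeps the spike itself from inflating the rejection threshold.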
APA, Harvard, Vancouver, ISO, and other styles
15

Tka, Mouna. "Génération automatique de test pour les contrôleurs logiques programmables synchrones." Thesis, Université Grenoble Alpes (ComUE), 2016. http://www.theses.fr/2016GREAM020/document.

Full text
Abstract:
This thesis work, done in the context of the FUI Minalogic Bluesky project, concerns the automated functional testing of a particular class of programmable logic controllers (em4) produced by InnoVista Sensors. These are synchronous systems that are programmed by means of an integrated development environment (IDE). The people who use and program these controllers are not necessarily expert programmers, so the development of software applications should be simple and intuitive. The same should hold for testing. Although the applications defined by these users need not be very critical, it is important to test them adequately and effectively. A simulator included in the IDE allows programmers to test their programs in a way that remains informal and interactive, by manually entering test data. Based on previous research in the area of testing synchronous programs, we propose a new test specification language, called SPTL (Synchronous Programs Testing Language), which makes it possible to express simple test scenarios that can be executed on the fly to automatically generate test input sequences. It also allows describing the environment in which the system evolves, placing conditions on the inputs in order to arrive at realistic test data and limit unnecessary inputs. SPTL facilitates this testing task by introducing concepts such as usage profiles, groups, and categories. We have designed and developed a prototype named "Testium", which translates an SPTL program into a set of constraints used by a Prolog solver that randomly selects the test inputs; test data generation thus relies on constraint logic programming techniques. To assess the approach, we experimented with this method on realistic and typical examples of em4 applications. Although SPTL was evaluated on em4, its use can be envisaged for the validation of other types of synchronous controllers or systems.
APA, Harvard, Vancouver, ISO, and other styles
16

Løseth, Lars Ole. "Modelling of Controlled Source Electromagnetic Data." Doctoral thesis, Norwegian University of Science and Technology, Department of Physics, 2007. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-1502.

Full text
Abstract:

This work treats modelling of electromagnetic fields from controlled sources in geophysical applications. The focus is on modelling the marine CSEM (controlled source electromagnetic) method in planarly layered media. The recent introduction of SeaBed Logging (SBL) as an application of the marine CSEM method for direct hydrocarbon identification has resulted in increased survey activity, and has both expanded and renewed interest in investigating electromagnetic field propagation in the subsurface of the earth.

The material within this document consists of a short introduction to the CSEM and SBL methods and four self-contained papers:

• Low-frequency electromagnetic fields in applied geophysics: Waves or diffusion? treats propagation of low-frequency fields in conductive media, and compares their behaviour to nondistorted field propagation in lossless media.

• The first test of the SeaBed Logging method describes the first laboratory test of this method. The scaled experiment was performed in order to verify whether thin resistive layers within conductive surrounding media can be detected.

• Asymptotic evaluations of the marine CSEM field integrals elaborates on how electromagnetic signals propagate in an idealized stratified earth model. To this end, the method of steepest descents is applied in order to separate the various wavemodes.

• Electromagnetic fields in planarly layered anisotropic media formulates a mathematical description of the field propagation in stratified media with arbitrary anisotropy. The field equations are solved by using the matrix propagator technique.

Even though electromagnetic field propagation in layered media is a rather mature research subject, the current development of the CSEM and SBL methods demands reinvestigation and new theoretical insights. Optimal survey planning and solid interpretation rely on a thorough understanding of the signal propagation in the subsurface. The main motivation in this thesis is to contribute to increased knowledge of how electromagnetic fields travel in the earth.

APA, Harvard, Vancouver, ISO, and other styles
17

Chang, Wei-Chieh. "Transputer-based robot controller /." Online version of thesis, 1990. http://hdl.handle.net/1850/11557.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Ustunturk, Ahmet. "Digital Controller Design For Sampled-data Nonlinear Systems." Phd thesis, METU, 2012. http://etd.lib.metu.edu.tr/upload/12614267/index.pdf.

Full text
Abstract:
In this thesis, digital controller design methods for sampled-data nonlinear systems are considered. Although sampled-data nonlinear control has attracted much attention in recent years, controller design methods for sampled-data nonlinear systems are still limited. Therefore, a range of controller design methods for sampled-data nonlinear systems is developed, including backstepping, adaptive and robust backstepping, and reduced-order observer-based output feedback designs, all based on the Euler approximate model. These controllers are designed to compensate for the effects of the discrepancy between the Euler approximate model and the exact discrete-time model, of the parameter estimation error in adaptive control, and of the observer error in output feedback control, all of which behave as disturbances. A dual-rate control scheme is presented for output-feedback stabilization of sampled-data nonlinear systems. It is shown that the designed controllers semiglobally practically asymptotically (SPA) stabilize the closed-loop sampled-data nonlinear system. Moreover, various applications of these methods are given and their performance is analyzed through simulations.
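The Euler approximate model underlying these designs replaces the exact discretization with x[k+1] = x[k] + T·f(x[k], u[k]). A minimal scalar sketch (the plant xdot = -x + u is an illustrative choice, not from the thesis) shows the per-step discrepancy, of order T², that the controllers must tolerate as a disturbance:

```python
import math

def euler_step(f, x, u, T):
    # Euler approximate model: x[k+1] = x[k] + T * f(x[k], u[k])
    return x + T * f(x, u)

# Scalar test system xdot = -x + u, whose exact discretization is known:
# x[k+1] = exp(-T)*x[k] + (1 - exp(-T))*u[k]
f = lambda x, u: -x + u

def exact_step(x, u, T):
    return math.exp(-T) * x + (1.0 - math.exp(-T)) * u

T = 0.01
x_euler = euler_step(f, 1.0, 0.0, T)
x_exact = exact_step(1.0, 0.0, T)
gap = abs(x_euler - x_exact)  # local discretization error, O(T^2)
```

Shrinking T shrinks the gap quadratically, which is why SPA stability (rather than exact asymptotic stability) is the natural guarantee for designs based on the approximate model.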
APA, Harvard, Vancouver, ISO, and other styles
19

Bai, Shuanghua. "Assessment of controller performance with embedded data reconciliation." Thesis, University of Ottawa (Canada), 2003. http://hdl.handle.net/10393/26439.

Full text
Abstract:
Process measurements contain some degree of error that can be random or systematic. Random measurement noise is usually of high frequency, and results in high-frequency oscillations of manipulated variables that deteriorate the performance of the control system. Data reconciliation is a procedure that makes use of process models along with process measurements to give more precise and consistent estimates of process variables. Data reconciliation has been traditionally used to provide a more representative set of data to calculate steady-state inventories and process yields. For dynamic systems, the use of data reconciliation is relatively nascent. This work examined the potential use of data reconciliation in closed-loop control as a filter to attenuate the noise in measurements of the controlled variables so that the controllers can act on less variable, more accurate inputs. Data reconciliation filters were implemented in simulations of a PID control system for a binary distillation column. The presence of measurement noise usually results in detuned controllers in order to prevent excessive high-frequency variations of manipulated variables. In this work, a Dynamic Data Reconciliation (DDR) algorithm, embedded within the structure of feedback control loops, was developed to reconcile noisy measurements. (Abstract shortened by UMI.)
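Steady-state data reconciliation of the kind described above is a weighted least-squares projection of the measurements onto the balance constraints. The sketch below, for a hypothetical three-stream flow split obeying linear balances A·x = 0, is the generic textbook form, not the DDR algorithm developed in the thesis.

```python
import numpy as np

def reconcile(y, A, sigma):
    """Adjust measurements y minimally (weighted by variances sigma**2)
    so the reconciled values satisfy the linear balances A @ x = 0."""
    S = np.diag(sigma ** 2)
    correction = S @ A.T @ np.linalg.solve(A @ S @ A.T, A @ y)
    return y - correction

# Toy network: stream 1 splits into streams 2 and 3, so x1 - x2 - x3 = 0
A = np.array([[1.0, -1.0, -1.0]])
y = np.array([10.3, 5.1, 4.8])        # noisy measurements (imbalance 0.4)
sigma = np.array([0.2, 0.1, 0.1])     # assumed measurement std deviations
x_hat = reconcile(y, A, sigma)
```

Less precise measurements (larger sigma) absorb more of the correction, which is what makes the reconciled estimates both consistent with the balances and closer to the true values than the raw data.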
APA, Harvard, Vancouver, ISO, and other styles
20

Paradesi, Sharon M. (Sharon Myrtle) 1986. "User-controlled privacy for personal mobile data." Thesis, Massachusetts Institute of Technology, 2014. http://hdl.handle.net/1721.1/93839.

Full text
Abstract:
Thesis: Elec. E. in Computer Science, Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2014.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 81-82).
Smartphones collect a wide range of sensor data, ranging from the basic, such as location, accelerometer, and Bluetooth, to the more advanced, such as heart rate. Mobile apps on the Android and iOS platforms provide users with "all-or-nothing" controls during installation to get permission for data collection and use. Users have to either agree to have the app collect and use all the requested data or not use the app at all. This is slowly changing with the iOS framework, which now allows users to turn off location sharing with specific apps even after installation. The MIT Living Lab platform is a mobile app development platform that uses openPDS to provide MIT users with personal data stores but currently lacks user controls for privacy. This thesis presents PrivacyMate, a suite of tools for MIT Living Labs that provides user-controllable privacy mechanisms for mobile apps. PrivacyMate aims to enable users to maintain better control over their mobile personal data. It extends the iOS model and allows users to select or deselect various types of data (more than just location information) for collection and use by apps. Users can also provide temporal and spatial specifications to indicate a context in which they are comfortable sharing their data with certain apps. We incorporate the privacy mechanisms offered by PrivacyMate into two mobile apps built on the MIT Living Lab platform: ScheduleME and MIT-FIT. ScheduleME enables users to schedule meetings without disclosing either their locations or points of interest. MIT-FIT enables users to track personal and aggregate high-activity regions and times, as well as view personalized fitness-related event recommendations. The MIT Living Lab team is planning to eventually deploy PrivacyMate and MIT-FIT to the entire MIT community.
by Sharon Myrtle Paradesi.
Elec. E. in Computer Science
APA, Harvard, Vancouver, ISO, and other styles
21

Wernberg, Max. "Security and Privacy of Controller Pilot Data Link Communication." Thesis, Linköpings universitet, Kommunikations- och transportsystem, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-156337.

Full text
Abstract:
Newly implemented technologies within aviation lack, according to recent studies, built-in security measures to protect them against outside interference. In this thesis we study the security and privacy status of the digital wireless Controller Pilot Data Link Communication (CPDLC) used in air traffic management alongside other systems to increase the safety and traffic capacity of controlled airspaces. The findings show that CPDLC is currently insecure and exposed to attacks, and any solution to remedy this must accommodate its low levels of performance. Elliptic Curve Cryptography, Protected ACARS and Host Identity Protocol have been identified as valid solutions to the system's security drawbacks, and all three are possible to implement in the present state of CPDLC.
APA, Harvard, Vancouver, ISO, and other styles
22

Feng, Jianyuan. "Controller Performance Assessment and Data Reconciliation for Artificial Pancreas." Thesis, Illinois Institute of Technology, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10640735.

Full text
Abstract:

Artificial pancreas (AP) systems are implemented as a treatment for type 1 diabetes (T1D) patients to regulate blood glucose concentration (BGC). With continuous glucose monitoring (CGM), information related to BGC can be measured at a high frequency. It is widely known that besides meals, BGC is also influenced by many other factors such as exercise, sleep, and stress. To capture these factors, measurements such as heart rate and acceleration, along with derived variables such as energy expenditure (EE), should also be collected using equipment such as armband and chest-band devices and used as inputs to the AP system. With adequate information about the patient, BGC, and other related factors, the controller in the AP system calculates the insulin infusion rate based on its model and control algorithm, and the insulin pump delivers the calculated amount of insulin to the patient's body, closing the BGC regulation loop.

For AP systems, the performance of model-based control depends on the accuracy of the models and may degrade when large dynamic changes occur in the human body or when equipment performance varies, since such factors can move the operating conditions away from those used in developing the models and designing the control system. Sensor errors such as signal bias and missing data can mislead or halt the calculation of the insulin infusion rate. All of these possible failures can make AP systems unreliable and endanger the safety of patients.

This project aims to develop additional modules focused on fault detection and diagnosis for the controller and the sensors of the AP system. A controller performance assessment module (CPAM) is developed to generate several indexes that monitor different aspects of controller performance and to retune the controller parameters according to different types of performance deterioration. A sensor error detection and reconciliation module (SED&RM) is developed to detect sensor errors in CGM measurements. The SED&RM is based on two model estimation techniques, the outlier-robust Kalman filter (ORKF) and locally weighted partial least squares (LW-PLS), which replace the erroneous sensor signal with a model-estimated value. A novel method, nominal angle analysis (NAA), is introduced to solve the problems of false positives and candidate selection for signal reconciliation. The SED&RM is extended to a multi-sensor error detection and reconciliation module (MSED&RM), which also covers other sensor signals, such as galvanic skin response (GSR), and values derived from the original signals, such as EE. A multi-level supervision and controller modification (ML-SCM) module integrates the CPAM and MSED&RM and extends controller modification to different time scales: sample level, period level, and day level.

The CPAM is tested with a single-input single-output (SISO) version of the AP system in the UVa/Padova simulator. The results indicate that a generalized predictive controller (GPC) with the proposed CPAM keeps glucose concentration within a safer range and gives more reasonable insulin suggestions than a GPC without controller retuning guided by the CPAM. The performance of the SED&RM and MSED&RM is tested with data from clinical experiments. The results indicate that the proposed system can successfully detect most erroneous signals and substitute them with reasonable model-estimated values. The ML-SCM is tested with both simulation and clinical experiments. The results indicate that the AP system with the ML-SCM module has a safer distribution of glucose concentrations and more reasonable insulin infusion rate suggestions than an AP system without it.
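The reconciliation idea above (detect an erroneous sensor sample, then substitute a model estimate) can be illustrated with a deliberately simple sketch. The windowed-mean model, threshold, and data below are illustrative assumptions, not the thesis's ORKF/LW-PLS implementation.

```python
import numpy as np

def detect_and_reconcile(signal, window=10, k=5.0, min_scale=1.0):
    """Flag samples whose residual from a sliding-window model exceeds
    k times the window scatter, and replace them with the model estimate.
    A deliberately simple stand-in for an ORKF/LW-PLS-style scheme."""
    x = np.asarray(signal, dtype=float).copy()
    flags = np.zeros(len(x), dtype=bool)
    for i in range(window, len(x)):
        hist = x[i - window:i]              # recent (already reconciled) history
        est = hist.mean()                   # local model estimate
        scale = max(hist.std(), min_scale)  # floored to avoid zero scatter
        if abs(x[i] - est) > k * scale:
            flags[i] = True
            x[i] = est                      # reconcile with the estimate
    return x, flags

# A CGM-like trace (mg/dL, invented) with one spurious spike at index 20
cgm = np.concatenate([np.full(20, 100.0), [300.0], np.full(10, 101.0)])
reconciled, flags = detect_and_reconcile(cgm)
```

The spike is flagged and replaced by the local estimate, while the small step from 100 to 101 mg/dL passes untouched; a real system would use a far richer model than a windowed mean.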

APA, Harvard, Vancouver, ISO, and other styles
23

Dougherty, Paul Xavier. "Controlled exchange of configuration management data by industry." Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1996. http://handle.dtic.mil/100.2/ADA324535.

Full text
Abstract:
Thesis (M.S. in Management) Naval Postgraduate School, December 1996.
Thesis advisor(s): Jane N. Feitler, William J. Haga. "December 1996." Includes bibliographical references (p. 79-80). Also available online.
APA, Harvard, Vancouver, ISO, and other styles
24

Lu, Xinyou. "Inversion of controlled-source audio-frequency magnetotelluric data /." Thesis, Connect to this title online; UW restricted, 1999. http://hdl.handle.net/1773/6799.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Zhang, Simin. "Automated advanced analytics on vehicle data using AI." Thesis, KTH, Fordonsdynamik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-285672.

Full text
Abstract:
The evolution of electrification and autonomous driving in automotive leads to increasing complexity of the in-vehicle electrical network, which poses a new challenge for testers doing troubleshooting work in massive log files. This thesis project aims to develop a predictive technique for anomaly detection focusing on user-function-level failures using machine learning technologies. Specifically, it investigates the performance of point anomaly detection models and temporally dependent anomaly detection models on Controller Area Network (CAN) data obtained from software-in-the-loop simulation. For point anomaly detection, the Isolation Forest, multivariate normal distribution, and Local Outlier Factor models are implemented. For temporally dependent anomaly detection, an encoder-decoder neural network using Long Short-Term Memory (LSTM) units is implemented, as is a stacking hybrid detector combining the LSTM encoder and the Local Outlier Factor. After comparing the overall performance of the proposed models, the LSTM autoencoder is selected for detecting anomalies in sequential data from CAN logs. The experimental results show promising detection performance of the LSTM autoencoder on the studied functional failures and suggest that it could be deployed for real-time automated anomaly detection on vehicle systems.
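One of the point-anomaly models named above, the multivariate normal distribution, amounts to thresholding each sample's Mahalanobis distance under a Gaussian fitted to nominal data. A minimal sketch follows; the synthetic data and the regularization term are invented for illustration.

```python
import numpy as np

def mahalanobis_scores(X, train):
    """Score each row of X by its Mahalanobis distance to the Gaussian
    fitted on the nominal training window; large scores flag anomalies."""
    mu = train.mean(axis=0)
    cov = np.cov(train, rowvar=False)
    inv = np.linalg.inv(cov + 1e-6 * np.eye(cov.shape[0]))  # regularized inverse
    d = X - mu
    # quadratic form d_i^T inv d_i for every row i
    return np.sqrt(np.einsum('ij,jk,ik->i', d, inv, d))

rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, size=(500, 3))                 # nominal signal window
test = np.vstack([np.zeros((1, 3)), np.full((1, 3), 8.0)])  # normal row + anomaly
scores = mahalanobis_scores(test, train)
```

A detector then compares `scores` against a threshold, e.g. a chi-square quantile for the chosen dimensionality.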
APA, Harvard, Vancouver, ISO, and other styles
26

Holma, Erik. "Data Requirements for a Look-Ahead System." Thesis, Linköping University, Department of Electrical Engineering, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-10197.

Full text
Abstract:

Look-ahead cruise control uses recorded topographic road data combined with GPS positioning to control vehicle speed. The purpose is to save fuel without increasing travel time for a given road. This thesis explores the sensitivity of look-ahead systems to different disturbances. Two systems are investigated: one using a simple precalculated speed trajectory without feedback, and a second based on a model predictive control scheme with dynamic programming as the optimization algorithm.

Defective input data, such as inaccurate positioning, disturbed angle data, faults in mass estimation, and wrong wheel radius, are discussed in this thesis, along with errors in the systems' environmental model. Simulations over real road profiles are performed with two different quantizations of the road slope data. Results from quantizing the angle data are important, since quantization is unavoidable in an implementation of a topographic road map.

The simulation results show that disturbances of the fictive road profiles lead to quite large deviations from the optimal case, while for the recorded real road sections the differences are close to zero. Finally, conclusions are drawn about how large deviations from real-world data a look-ahead system can tolerate.
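The quantization of road-slope data discussed above can be sketched in a few lines; the step sizes and toy profile are invented, but they illustrate why the worst-case slope error is bounded by half the quantization step.

```python
import numpy as np

def quantize(slopes_pct, step):
    """Round road grades (in percent) to the nearest quantization step,
    as a stored topographic road map would."""
    return np.round(np.asarray(slopes_pct) / step) * step

profile = np.array([0.0, 0.4, 1.3, 2.6, 1.1, -0.7, -2.2])  # grades in %
coarse = quantize(profile, step=1.0)      # 1 % resolution map
fine = quantize(profile, step=0.1)        # 0.1 % resolution map
worst = np.max(np.abs(coarse - profile))  # worst-case slope error, <= step/2
```

A look-ahead controller fed `coarse` instead of `profile` sees slope errors up to half a percent, which is the kind of disturbance whose effect on fuel consumption the thesis quantifies.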

APA, Harvard, Vancouver, ISO, and other styles
27

Fridlund, Julia. "Processing of Noisy Controlled Source Audio Magnetotelluric (CSAMT) Data." Thesis, Uppsala universitet, Geofysik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-396255.

Full text
Abstract:
Controlled Source Audio Magnetotellurics (CSAMT) is a geophysical method for characterizing the resistivity of the subsurface with the help of electromagnetic waves. The method is used for various purposes, such as geothermal and hydrocarbon exploration, mineral prospecting and investigation of groundwater resources. Electromagnetic fields are created by running an alternating current in a grounded electric dipole, and by varying the frequency, different depths can be targeted. Orthogonal components of the electromagnetic fields are measured at receiver stations a few kilometers away from the source. From these field components, so-called magnetotelluric transfer functions are estimated, which can be used to invert for the resistivity of the subsurface. The data used in this project are from a survey conducted in 2014 and 2016 in Kiruna by Uppsala University and the mining company LKAB. Measurements were made at 31 stations along two orthogonal profiles. The data have been processed earlier, but due to noise, especially at the lower frequencies, a significant part of the data set could not be inverted. The aim of this project was to improve the results by analyzing the data and testing different methods to remove noise. First, robust regression was used to account for possible non-Gaussian noise in the estimation of the magnetotelluric transfer functions. Except for one station on profile 1, the robust method did not improve the results, which suggests that the noise is mostly Gaussian. Then modified versions of least squares, each affected by a different bias, were used to estimate the transfer functions. Where there is more noise, the estimates should differ more due to their different biases. The estimates differed most for low frequencies, and especially on the part of profile 2 that was measured in 2014. It was investigated whether the railway network could explain part of the low-frequency noise.
Measures were taken to reduce spectral leakage from the railway signal at 16 ⅔ Hz into the closest transmitter frequencies, 14 Hz and 20 Hz, but no clear improvement was seen and more detailed studies are needed to settle this matter. Finally, a method based on the ratio of short-term and long-term averages was tested for removing transients in the measured time series of the electromagnetic field components. This proved difficult to implement due to the variability of the time series' behavior between stations, frequencies and field components. However, the method showed some potential for stations 9 and 10 on profile 1, and could probably be developed further to remove transients more efficiently and thus improve the data.
Magnetotellurics with a controlled source (abbreviated CSAMT in English) is a method in which electromagnetic fields are used to investigate the resistivity of the ground. Resistivity is a measure of how well or poorly the ground conducts electric currents. The method is used, for example, to measure the depth to the bedrock, which usually has higher resistivity (poorer conductivity) than the ground above it. One can also find metals, such as gold and copper, which have very low resistivity (good conductivity). Electromagnetic waves are created by passing an alternating current through a long wire. The waves first travel through the air and then down into the ground. How deep they reach depends on the frequency of the alternating current; at low frequencies the waves reach deeper into the ground than at high frequencies. Below the surface, the electromagnetic waves induce electric currents, so-called telluric currents (i.e., earth currents). The currents weaken the farther they travel, and how quickly they decay in strength depends on the resistivity of the earth. The currents also create new electric and magnetic fields that travel back towards the surface. At the surface, the field strengths are measured for different frequencies, which gives information about the resistivity at different depths. From the measurements one often derives so-called magnetotelluric transfer functions, which make it easier to interpret the data and determine the resistivity of the ground. In this project, CSAMT data from a survey in Kiruna carried out by Uppsala University and the mining company LKAB were used. The data had been processed earlier, but because of substantial noise in the measurements the results were not as good as expected. Noise can come from anything that generates electromagnetic fields, for example power lines, railway lines or natural variations in the Earth's own magnetic field. The goal of the project was to improve the results by analyzing the data and testing different methods for removing noise. 
The most common method for computing the transfer functions assumes that the magnetic field is free of noise. This is not necessarily true and can lead to bias, i.e., a skewed result. Other ways of computing the transfer functions give different biases. This can be exploited to see how much noise there is in the data: if there is no noise at all, all transfer functions are equal, whereas if there is a lot of noise they differ more. In this way it was discovered that there was more noise at the frequencies 14 and 20 Hz (where 1 Hz is one oscillation per second). One explanation may be that railway lines, which generate electromagnetic fields at 16.67 Hz, are close in frequency and disturb these signals. To reduce the influence of the noise, so-called robust processing was tested, which means giving less weight to measurements that appear very different from other measurements (i.e., contain more noise). Unfortunately this strategy did not noticeably improve the results. Finally, a method was developed for removing transients, i.e., short-lived high-intensity noise. Transients can come, for example, from lightning strikes, which are short electrical discharges. This turned out not to be entirely straightforward, as it was hard to tell what was noise and what was just natural variation in the electromagnetic fields. In a few cases, however, the noise could be distinguished, and continued work on this method therefore seems likely to give even better results.
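The transient-removal method described above, comparing short-term and long-term averages, is the classic STA/LTA trigger. A minimal sketch with invented window lengths and a synthetic trace:

```python
import numpy as np

def sta_lta(x, n_sta, n_lta):
    """Ratio of short-term to long-term moving averages of signal power.
    Values well above 1 indicate a transient (e.g. from lightning)."""
    power = np.asarray(x, dtype=float) ** 2
    csum = np.concatenate([[0.0], np.cumsum(power)])
    sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta   # short-term average
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta   # long-term average
    n = min(len(sta), len(lta))
    return sta[-n:] / (lta[-n:] + 1e-12)           # align the trailing samples

rng = np.random.default_rng(1)
trace = rng.normal(0.0, 1.0, 2000)   # stationary background field
trace[1500:1505] += 30.0             # inject a short, intense transient
ratio = sta_lta(trace, n_sta=10, n_lta=500)
```

Samples where `ratio` exceeds a chosen threshold would be cut or down-weighted before the transfer functions are estimated; the thesis notes that picking such a threshold robustly across stations and frequencies is the hard part.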
APA, Harvard, Vancouver, ISO, and other styles
28

Shan, Chunling. "Natural and Controlled Source Magnetotelluric Data Processing and Modeling." Doctoral thesis, Uppsala universitet, Geofysik, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-229917.

Full text
Abstract:
In this thesis, four studies using different geophysical electromagnetic methods are presented. The first study, dealing with airborne measurements, investigates the noise due to the rotation of the aircraft, and the effect of the aircraft itself as a conductive metallic body, on the Earth's electromagnetic response in the very low frequency (VLF) and low frequency (LF) bands. The magnetic fields are independent of the aircraft in the VLF band and part of the LF band, but at higher frequencies (above 100 kHz) the signals are more influenced by the aircraft. The aircraft also generates its own noise frequencies, which mix with the radio transmitter signals. The second and third studies are applications of radio and controlled-source magnetotellurics and electrical resistivity tomography at a quick-clay landslide site in southwest Sweden. The data are processed and modeled in 2D and 3D, and the models are compared with high-resolution seismic and geotechnical data. The obtained results were further validated and refined by performing synthetic tests in the second study. The third study shows that the 3D models delineate a larger and more continuous volume of the quick-clay structure than traditional 2D models. Both studies show that integrated application of geophysical methods to landslides is ideal. Quick clays often overlie coarse-grained layers, which appear as an increase in resistivity values in the models. In the fourth study, a new audio-magnetotelluric data acquisition technique is developed, named moving magnetotellurics (MMT). In this technique, the magnetic sensors are placed on the ground and only 15 to 20 minutes of data are acquired for each station, which is usually enough to cover the frequency range 30-300 Hz. The new technique is more efficient and convenient than the traditional magnetotelluric method, and test measurements have shown that it is applicable to shallow-depth studies.
APA, Harvard, Vancouver, ISO, and other styles
29

Wargo, William D., and Howard Eckstein. "An Advanced, Programmable Data Acquisition System." International Foundation for Telemetering, 1992. http://hdl.handle.net/10150/611965.

Full text
Abstract:
International Telemetering Conference Proceedings / October 26-29, 1992 / Town and Country Hotel and Convention Center, San Diego, California
The MicroDAS-1000 is an airborne Data Acquisition System (DAS) designed to meet the growing needs of airframe manufacturers for extensive test data accumulation, processing and evaluation. As such, the system has been designed with emphasis on modularity, miniaturization and ease of operator usage and expansion. The MicroDAS product line includes a series of components used as building blocks to configure systems of virtually any size. The modular design of these components allows considerable latitude to the instrumentation engineer in configuring systems for simple or complex applications. The modular concept has been extended to the design of plug-in modules for different functional requirements and system applications. All units are under software control to allow rapid reconfiguration and setup as requirements for instrumentation and data gathering change.
APA, Harvard, Vancouver, ISO, and other styles
30

Lim, Dongwon. "Synthesis of PID controller from empirical data and guaranteeing performance specifications." [College Station, Tex. : Texas A&M University, 2008. http://hdl.handle.net/1969.1/ETD-TAMU-2773.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Kim, Jung Hoon. "Performance Analysis and Sampled-Data Controller Synthesis for Bounded Persistent Disturbances." 京都大学 (Kyoto University), 2015. http://hdl.handle.net/2433/199317.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Schumacher, Gary A. "APPLICATIONS FOR A PORTABLE PC/104 BASED INSTRUMENTATION CONTROLLER." International Foundation for Telemetering, 1996. http://hdl.handle.net/10150/607587.

Full text
Abstract:
International Telemetering Conference Proceedings / October 28-31, 1996 / Town and Country Hotel and Convention Center, San Diego, California
PC-based instrumentation and telemetry processing systems are attractive because of their ease of use, familiarity, and affordability. The evolution of PC computing power has resulted in a telemetry processing system easily up to most tasks, even for control of and processing of data from a very complex system such as the Common Airborne Instrumentation System (CAIS) used on the new Lockheed-Martin F-22. A complete system including decommutators, bit synchronizers, IRIG time code readers, simulators, DACs, live video, and tape units for logging can be installed in a rackmount, desktop, or even portable enclosure. The PC/104 standard represents another step forward in the PC industry's evolution towards the goals of lower power consumption, smaller size, and greater capacity. The advent of this standard and the availability of processors and peripherals in this form factor have made possible the development of a new generation of portable low-cost test equipment. This paper will outline the advantages and applications offered by a full-function, standalone, rugged, and portable instrumentation controller. Applications of this small (5.25"H x 8.0"W x 9.5"L) unit could include: flight line instrumentation check-out, onboard aircraft data monitoring, automotive testing, small craft testing, helicopter testing, and just about any other application where small size, affordability, and capability are required.
APA, Harvard, Vancouver, ISO, and other styles
33

Júlio, Fábio José Correia. "A layer 2 multipath fabric using a centralized controller." Master's thesis, Faculdade de Ciências e Tecnologia, 2013. http://hdl.handle.net/10362/11139.

Full text
Abstract:
Dissertation for the degree of Master in Electrical and Computer Engineering
Ethernet is the most used L2 protocol in modern datacenter networks. These networks often serve as the underlying infrastructure for highly virtualised cloud computing services. To support such services, the underlying network needs to be prepared to support host mobility and multi-tenant isolation for a high number of hosts while using the available bandwidth efficiently and keeping the inherent costs low. These important properties are not ensured by Ethernet protocols. Bandwidth is wasted because the spanning tree protocol is used to calculate paths, and scalability can be an issue because the MAC learning process is based on frame flooding. On layer 3 some of these problems can be solved, but layer 3 is harder to configure, poses difficulties for host mobility and is more expensive. Recent efforts try to bring the advantages of layer 3 to layer 2. Most of them are based on some form of Equal-Cost Multipath (ECMP) to calculate paths in the data center network. The solution proposed in this document uses a different approach: paths are calculated using a non-ECMP, policy-based control plane implemented in an OpenFlow controller. OpenFlow is a protocol developed to help researchers test new ideas on real networks without interfering with real traffic. To do that, OpenFlow has to be supported by the network's switches; the communication between systems is done over SSL, and all switch features are available to the controller. The non-ECMP, policy-based algorithm is a different way to do routing: instead of using unit metrics on each link, one policy is chosen for each link. The use of policies opens the possibility of considering very different paths as having the same forwarding preference, increasing the number of used paths. Our approach uses the recent Provider Backbone Bridging (PBB) standard, which adds extra header information to the Ethernet frame and provides isolation between customer and network address space, improving scalability.
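The non-ECMP, policy-based path computation described above can be illustrated with a toy control-plane calculation: each link carries a policy rank instead of a unit metric, and a path's preference is taken here as its worst link rank, so topologically very different paths can end up equally preferred. The topology, ranks, and bottleneck rule are illustrative assumptions, not the dissertation's actual policy model.

```python
import heapq

def policy_route(graph, src, dst):
    """Dijkstra variant where a path's cost is the worst (maximum)
    policy rank along it, rather than a sum of unit metrics."""
    best = {src: 0}
    prev = {}
    heap = [(0, src)]
    while heap:
        cost, u = heapq.heappop(heap)
        if u == dst:
            break
        if cost > best.get(u, float('inf')):
            continue  # stale heap entry
        for v, rank in graph.get(u, []):
            c = max(cost, rank)              # policy cost, not additive cost
            if c < best.get(v, float('inf')):
                best[v] = c
                prev[v] = u
                heapq.heappush(heap, (c, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return best[dst], path[::-1]

# Two very different paths (s1-s2-s4 and s1-s3-s4) share the same
# worst-case policy rank, so both are equally valid forwarding choices.
fabric = {
    's1': [('s2', 1), ('s3', 2)],
    's2': [('s4', 2)],
    's3': [('s4', 1)],
    's4': [],
}
cost, path = policy_route(fabric, 's1', 's4')
```

A centralized controller running such a computation could then install flow entries spreading traffic across all paths of equal preference, which is the multipath benefit the abstract claims over spanning-tree forwarding.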
APA, Harvard, Vancouver, ISO, and other styles
34

Thorve, Swapna. "EpiViewer: An Epidemiological Application For Exploring Time Series Data." Thesis, Virginia Tech, 2018. http://hdl.handle.net/10919/86829.

Full text
Abstract:
Visualization plays an important role in epidemic time series analysis and forecasting. Viewing time series data plotted on a graph can help researchers identify anomalies and unexpected trends that could be overlooked if the data were reviewed in tabular form. However, there are challenges in reviewing data sets from multiple sources: data can be aggregated in different ways and measure different criteria, which can make a direct comparison between time series difficult. In the face of an emerging epidemic, the ability to visualize time series from various sources and organizations, and to reconcile these datasets based on different criteria, could be key to developing accurate forecasts and identifying effective interventions. Many tools have been developed for visualizing temporal data; however, none yet supports all the functionality needed for easy collaborative visualization and analysis of epidemic data. In this thesis, we develop EpiViewer, a time series exploration dashboard where users can upload epidemiological time series data from a variety of sources and compare, organize, and track how data evolve as an epidemic progresses. EpiViewer provides an easy-to-use web interface for visualizing temporal datasets as either line charts or bar charts. The application provides enhanced features for visual analysis, such as hierarchical categorization, zooming, and filtering, to enable detailed inspection and comparison of multiple time series on a single canvas. Finally, EpiViewer provides a built-in statistical Epi-features module to help users interpret the epidemiological curves.
Master of Science
We present EpiViewer, a time series exploration dashboard where users can upload epidemiological time series data from a variety of sources and compare, organize, and track how data evolves as an epidemic progresses. EpiViewer is a single page web application that provides a framework for exploring, comparing, and organizing temporal datasets. It offers a variety of features for convenient filtering and analysis of epicurves based on meta-attribute tagging. EpiViewer also provides a platform for sharing data between groups for better comparison and analysis.
APA, Harvard, Vancouver, ISO, and other styles
35

Haddock, Paul C. "TELEMETERY DATA COLLECTION FROM OSCAR SATELLITES." International Foundation for Telemetering, 1998. http://hdl.handle.net/10150/607347.

Full text
Abstract:
International Telemetering Conference Proceedings / October 26-29, 1998 / Town & Country Resort Hotel and Convention Center, San Diego, California
This paper discusses the design, configuration, and operation of a satellite station built for the Center for Space Telemetering and Telecommunications Laboratory in the Klipsch School of Electrical and Computer Engineering at New Mexico State University (NMSU). This satellite station consists of a computer-controlled antenna tracking system, a 2m/70cm transceiver, satellite tracking software, and a demodulator. The station receives satellite telemetry, allows for voice communications, and will be used in future classes. Currently it is receiving telemetry from an amateur radio satellite, UoSAT-OSCAR-11. Amateur radio satellites are referred to as Orbiting Satellites Carrying Amateur Radio (OSCAR) satellites.
APA, Harvard, Vancouver, ISO, and other styles
36

Rumpfhuber, Eva-Maria. "An integrated analysis of controlled-and passive source seismic data /." To access this resource online via ProQuest Dissertations and Theses @ UTEP, 2008. http://0-proquest.umi.com.lib.utep.edu/login?COPT=REJTPTU0YmImSU5UPTAmVkVSPTI=&clientId=2515.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Morris, Edward C. "Fast imaging techniques of marine controlled source electromagnetic (CSEM) data." Thesis, University of Southampton, 2008. https://eprints.soton.ac.uk/66357/.

Full text
Abstract:
Obtaining information regarding the resistivity structure of the subsurface from marine CSEM data involves complex processes. 1D and 2D forward and inverse modelling are currently the standard approaches used to produce geoelectrical models, with 3D inversion fast becoming a realizable method. However, these methods are time consuming, require expert knowledge to produce reliable results, and suffer from the non-uniqueness of the EM problem. There is therefore considerable scope for developing imaging techniques for marine CSEM data that do not require lengthy, time-consuming computations, but make use of entire datasets. These could provide a “first look” at possible structural information conveyed by the data, and may provide starting points or other constraints for inversion. In this thesis, a number of different imaging techniques for marine CSEM data are assessed, with particular reference to applications in hydrocarbon exploration. T-X and F-K imaging are widely used seismic reflection processing techniques that can be applied to CSEM data. Features produced in the T-X and F-K domains by 1D subsurface resistivity structures are investigated. The dip of an arrival corresponding to a subsurface resistive feature is found to depend on its resistivity, with a reduction in resistivity producing steeper dipping events. The separation of arrivals according to their dips in the T-X domain is used as a basis for an attempted separation of the airwave by filtering in the F-K domain; however, this does not prove useful. Secondly, in an adaptation of the F-K migration method used in seismic processing, EM migration is investigated, following the approach of Tompkins (2004b). The results of the migration method are compared and contrasted with a 1D smooth inversion algorithm. 
It is found that the migration depends mostly on the conductivity contrast across a geoelectrical boundary, whereas the inversion recovers the resistivity-thickness product (transverse resistance). Hence, EM migration is a viable alternative to inversion and usefully complements it in regions of large conductivity contrasts. Normalized ElectroMagnetic Imaging (NEMI) extends the standard approach of normalizing the recorded electric field data by a 1D background model to identify large lateral resistivity variations over a survey area. This is achieved by first sorting the data based on sensitivity to the target layer, and then distributing the normalized anomaly in the horizontal plane between source and receiver using a simple quasi-tomographical approach. In some scenarios this provides a reasonable estimate of the lateral extent of a 3D resistive body buried in a conductive background. Lastly, Apparent Resistivity Imaging (ARI) is adapted for use with the marine CSEM method. This generates pseudo-sections in which offsets are mapped into apparent depths. This study shows that whilst the vertical resolution of resistive bodies is poor, the lateral resolution is high and provides a good estimate of the true extent of a target body. Apparent resistivity pseudo-sections therefore provide a very effective means of “first look” imaging and assessment of marine CSEM data.
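The mapping of offsets and frequencies to depths in such imaging rests on the electromagnetic skin depth, the distance over which a diffusing field decays by a factor of 1/e: delta = sqrt(2*rho/(mu0*omega)), roughly 503*sqrt(rho/f) metres. A short sketch of this standard relation (the example values are illustrative, not from the thesis):

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (H/m)

def skin_depth(resistivity_ohm_m, freq_hz):
    """EM skin depth: delta = sqrt(2*rho / (mu0 * omega)) in metres."""
    omega = 2 * math.pi * freq_hz
    return math.sqrt(2 * resistivity_ohm_m / (MU0 * omega))

# Lower frequency gives deeper penetration, the basis of frequency sounding
shallow = skin_depth(1.0, 1000.0)  # ~16 m in a 1 ohm-m medium at 1 kHz
deep = skin_depth(1.0, 0.1)        # ~1.6 km in the same medium at 0.1 Hz
```

The approximate constant 503 follows from sqrt(2/mu0)/(2*pi) with mu0 = 4e-7*pi H/m; conductive seawater and sediments therefore confine high frequencies to shallow depths, which is why marine CSEM surveys use sub-hertz transmissions.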
APA, Harvard, Vancouver, ISO, and other styles
38

Vieira, da Silva Nuno Miguel. "Three-dimensional modelling and inversion of controlled source electromagnetic data." Thesis, Imperial College London, 2011. http://hdl.handle.net/10044/1/9120.

Full text
Abstract:
The marine Controlled Source Electromagnetic (CSEM) method is an important and almost self-contained discipline in the toolkit of methods used by geophysicists for probing the earth. It has increasingly attracted attention from industry during the past decade due to its potential in detecting valuable natural resources such as oil and gas. A method for three-dimensional CSEM modelling in the frequency domain is presented. The electric field is decomposed into primary and secondary components, as this leads to a more stable solution near the source position. The primary field is computed using a resistivity model for which a closed-form solution exists, for example a homogeneous or layered resistivity model. The secondary electric field is computed by discretizing a second-order partial differential equation for the electric field, also referred to in the literature as the vector Helmholtz equation, using the edge finite element method. A range of methods for the solution of the linear system derived from the edge finite element discretization are investigated. The magnetic field is computed subsequently, from the solution for the electric field, using a local finite difference approximation of Faraday’s law and an interpolation method. Tests comparing the solution obtained using the presented method with solutions computed using alternative codes for 1D and 3D synthetic models show that the implemented approach is suitable for CSEM forward modelling and is an alternative to existing codes. An algorithm for 3D inversion of CSEM data in the frequency domain was developed and implemented. The inverse problem is solved using the L-BFGS method and is regularized with a smoothing constraint. The inversion algorithm uses the presented forward modelling scheme for the computation of the field responses and the adjoint field for the computation of the gradient of the misfit function.
The presented algorithm was tested on a synthetic example, showing that it is capable of reconstructing a resistivity model which fits the synthetic data and is close to the original resistivity model in the least-squares sense. Inversion of CSEM data is known to lead to images with low spatial resolution, and it is well known that integration with complementary data sets mitigates this problem. An algorithm is therefore presented for integrating an acoustic velocity model, known a priori, into the inversion scheme. The algorithm was tested on a synthetic example and the results demonstrate that the presented methodology is promising for improving resistivity models obtained from CSEM data.
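The inversion structure this abstract describes (L-BFGS minimization of a data misfit plus a smoothing constraint, with the gradient supplied separately) can be caricatured with a toy linear forward operator standing in for the 3D finite element solver. Everything below is an illustrative assumption: the operator `G`, the smoothing weight `lam`, and the analytic gradient that plays the role of the adjoint-field computation.

```python
import numpy as np
from scipy.optimize import minimize

# Toy linear forward problem d = G m standing in for the CSEM responses.
rng = np.random.default_rng(0)
G = rng.normal(size=(20, 10))
m_true = np.linspace(1.0, 2.0, 10)   # "true" model (e.g. log-resistivities)
d_obs = G @ m_true                   # noise-free synthetic data

lam = 1e-2                           # smoothing (regularization) weight
D = np.diff(np.eye(10), axis=0)      # first-difference roughness operator

def misfit(m):
    """Data misfit plus smoothing penalty."""
    r = G @ m - d_obs
    return 0.5 * r @ r + 0.5 * lam * np.sum((D @ m) ** 2)

def gradient(m):
    """Analytic gradient; in the thesis this comes from the adjoint field."""
    return G.T @ (G @ m - d_obs) + lam * D.T @ (D @ m)

res = minimize(misfit, np.ones(10), jac=gradient, method="L-BFGS-B")
```

With a small smoothing weight and clean data, `res.x` recovers a model close to `m_true` in the least-squares sense, mirroring the synthetic test reported in the abstract.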
APA, Harvard, Vancouver, ISO, and other styles
39

Smith, Kimberly Ann. "A micro-coded controller for a medium data rate satellite payload simulator." Thesis, Massachusetts Institute of Technology, 1994. http://hdl.handle.net/1721.1/34999.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Samadi, Khah Pouya. "Performance Modeling of OpenStack Controller." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-195649.

Full text
Abstract:
OpenStack is currently the most popular open-source platform for Infrastructure as a Service (IaaS) clouds. OpenStack lets users deploy virtual machines and other instances, which handle different tasks for managing a cloud environment on the fly. A lot of cloud platform offerings, including the Ericsson Cloud System, are based on OpenStack. Despite the popularity of OpenStack, there is currently a limited understanding of how much resource is consumed/needed by components of OpenStack under different operating conditions, such as the number of compute nodes, the number of running VMs, the number of users, and the rate of requests to the various services. This master thesis attempts to model the resource demand of the various components of OpenStack as a function of different operating conditions, identify correlations, and evaluate how accurate the predictions are. For this purpose, a physical OpenStack cluster was set up with one powerful controller node and eight compute nodes. All the experiments and measurements were performed on virtual OpenStack components on top of the main physical one. In conclusion, a simple model is generated for the idle behaviour of OpenStack and for starting and stopping a Virtual Machine (VM) via API calls, which predicts the total CPU utilization based on the number of compute nodes and VMs.
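The kind of simple predictive model described in the conclusion can be sketched as a least-squares fit of CPU utilization against compute-node and VM counts. The sample measurements below are invented for illustration and are not results from the thesis.

```python
import numpy as np

# Hypothetical measurements: (compute nodes, running VMs, controller CPU %).
samples = np.array([
    [1, 0, 5.0],
    [2, 4, 9.0],
    [4, 8, 15.0],
    [8, 16, 27.0],
])
nodes, vms, cpu = samples.T

# Fit cpu ≈ a*nodes + b*vms + c by least squares, mirroring a simple
# idle-load model that is linear in the operating conditions.
A = np.column_stack([nodes, vms, np.ones_like(nodes)])
coef, *_ = np.linalg.lstsq(A, cpu, rcond=None)
predicted = A @ coef
```

Because the invented samples happen to lie on a plane, the fit is exact here (`a = 2`, `b = 0.5`, `c = 3`); with real measurements the residuals would quantify how accurate the predictions are.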
APA, Harvard, Vancouver, ISO, and other styles
41

Hjärtén, Martin. "Master thesis in interpretation of controlled-source radiomagnetotelluric data from Hallandsåsen." Thesis, Uppsala University, Geophysics, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-8015.

Full text
Abstract:

Controlled Source Tensor Magnetotelluric (CSTMT) ground measurements were executed on the Hallandsåsen horst, where a major tunnel is under construction. The instrument system EnviroMT was used for this purpose. The major research aspect of this thesis has been to form an opinion of the effectiveness of the method by comparing the results from the CSTMT survey with a prior investigation performed with the DC resistivity method. Another important part of this thesis has been to compile the basic and fundamental CSTMT and RMT theory in a way that people outside the EM community can easily be introduced to the subject.

When comparing the different inversion models from the CSTMT and DC resistivity surveys, one can see differences in the depth at which the conductors are resolved. In the CSTMT inversion models (TE+TM) there are two conductors that could possibly reach the depth of the tunnel under construction. These conductors are not resolved at the deeper structures in the DC resistivity models. Whether the conductors in the CSTMT inversions (TE+TM) truly extend to the depth at which they are modelled, or whether their deeper parts are artificial effects of regularisation in the inversion, cannot be said for sure. Given the low frequencies utilised in the TE mode, there are strong arguments that the deep conductors seen in the CSTMT model are real.

The TE-mode models have been shown to be much less affected by the complex problems of near-field effects than the TM-mode models. Evidence of the near-field effects is very prominent in the TM-mode phase, but no such tendencies can be seen in the TE-mode phase. However, a discontinuity appears in the same part of three profile lines, which shows that the data are disturbed, though not nearly as much as in the TM mode. The apparent resistivity seems to be overall less affected by the near-field effects. In the apparent resistivity of the TE mode, no near-field effects can be discerned at all.

In the TM mode, the apparent resistivity is higher than the true apparent resistivity in the near field. To obtain more information about the deeper structures, lower controlled-source frequencies were allowed in the TE-mode than in the TM-mode inversion models. The RMS of the TE-mode inversions has not deteriorated, which is another indication that the TE mode is not strongly disturbed by the near-field effects.

The RMT inversion models are shown to be heavily biased in the deeper parts, to which the RMT data are insensitive, with regularisation determining the outcome of the inversion. One can also see that regularisation influences the whole inversion model. In the shallow subsurface the inversion models should be the same for CSTMT and RMT, but one can see differences in resistivity between the models.

The real induction arrows show features that are not as clearly displayed in either the phase or the apparent resistivity. The real induction arrows appear to be better than the phase and apparent resistivity at detecting lateral differences in conductivity in more resistive media.

APA, Harvard, Vancouver, ISO, and other styles
42

Van, Luinen Steven M. "Lossless statistical data service over Asynchronous Transfer Mode." Curtin University of Technology, Australian Telecommunications Research Institute, 1999. http://espace.library.curtin.edu.au:80/R/?func=dbin-jump-full&object_id=9898.

Full text
Abstract:
Asynchronous Transfer Mode (ATM) can provide deterministic channels, as required for real-time signals, as well as statistical multiplexing. For this reason, ATM has been chosen as the underlying technology for providing a Broadband Integrated Services Digital Network (B-ISDN). Two main classes of services are expected to be supported over a B-ISDN: real-time services and data services. Data services include computer communications (Local Area Network (LAN) interconnections) and general non-real-time traffic, such as file transfer and small transactions. The provision of data services over ATM is better served by statistical multiplexing, provided that the service is loss-free. For multiplexing to be loss-free and still statistical while the maximum service rate is fixed, the multiplexer tributaries must be flow-controlled to ensure that the multiplexing buffer never overflows. Provision of a service over ATM is accomplished by an ATM layer Transfer Capability (ATC). This thesis investigates and reports on the operating characteristics of an ATM layer Transfer Capability proposed to the International Telecommunications Union (ITU), called Controlled Cell Transfer (CCT). CCT uses credit-window-based flow control on links and quota-based control in switches, and gives loss-free statistical multiplexing for data. Other ITU-defined ATCs are examined with regard to data service provision and compared with CCT. It is found that only CCT can provide a data service that is both fast and efficient. The thesis also examines the impact that support of the CCT capability would have on an ATM switch, through determination of the required functionality and mapping of the required functions into a switch design. Finally, an architecture and implementation of an ATM switch is described that would support CCT as well as the Deterministic Bit Rate (DBR) transfer capability, and would provide efficient data and real-time services.
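The credit-window flow control that makes CCT loss-free can be sketched as a toy model: a sender holds one credit per free cell slot in the receiver's buffer, spends a credit per cell sent, and regains one when the receiver drains a cell. This is an illustration of the general mechanism only, not the CCT specification; the class and buffer size are invented.

```python
class CreditLink:
    """Toy per-link credit-window flow control: a cell may be sent only
    while credits remain, so the receiver buffer can never overflow."""

    def __init__(self, buffer_cells):
        self.credits = buffer_cells  # initial window = receiver buffer size
        self.queued = 0              # cells currently held at the receiver

    def try_send(self):
        if self.credits == 0:
            return False             # sending now would overflow: hold the cell
        self.credits -= 1
        self.queued += 1
        return True

    def drain(self):
        if self.queued:
            self.queued -= 1
            self.credits += 1        # credit returns to the sender

link = CreditLink(buffer_cells=2)
sent = [link.try_send() for _ in range(3)]  # third attempt is blocked
link.drain()                                # one cell drained, one credit back
```

Statistical multiplexing stays loss-free because blocking happens at the sender (where the cell can wait) rather than by dropping at a full buffer.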
APA, Harvard, Vancouver, ISO, and other styles
43

Law, Adam. "Novel uses of high-density pre-critical reflection data from the Baltic Shield." Thesis, University of Cambridge, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.309926.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Wiggins, John Sterling. "Design and specification of a PC-based, open architecture environment controller." Thesis, Georgia Institute of Technology, 2000. http://hdl.handle.net/1853/17299.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Ward, Roxanne E. "Examining Methods and Practices of Source Data Verification in Canadian Critical Care Randomized Controlled Trials." Thèse, Université d'Ottawa / University of Ottawa, 2013. http://hdl.handle.net/10393/23974.

Full text
Abstract:
Statement of the Problem: Source data verification (SDV) is the process of comparing data collected at the source to data recorded on a Case Report Form, either paper or electronic (1), to ensure that the data are complete, accurate, and verifiable. Good Clinical Practice (GCP) guidelines are vague and lack evidence regarding the degree of SDV required and whether SDV affects study outcomes. Methods of Investigation: We performed systematic reviews to establish the published evidence base for methods of SDV and to examine the effect of SDV on study outcomes. We then conducted a national survey of Canadian Critical Care investigators and research coordinators regarding their attitudes and beliefs about SDV, followed by an audit of the completed and in-progress Randomized Controlled Trials (RCTs) of the Canadian Critical Care Trials Group (CCCTG). Results: Systematic Review of Methods of SDV: The most commonly reported or recommended frequency of source data verification (10/14 - 71%) was either based on level of risk, or that it be conducted early (i.e. after the first patient is enrolled). The amount of SDV recommended or reported varied from 5-100%. Systematic Review of Impact of SDV on Study Outcomes: There was no difference in study outcomes for one trial, and we were unable to assess the other. National Survey of Critical Care Investigators and Research Coordinators: Data from the survey found that 95.8% (115/120) of respondents believed that SDV was an important part of Quality Assurance; 73.3% (88/120) felt that academic studies should do more SDV; and 62.5% (75/120) felt that there is insufficient funding available for SDV. Audit of Source Data Verification Practices in CCCTG RCTs: In the national audit of in-progress and completed CCCTG RCTs, 9/15 (60%) included a plan for SDV and 8/15 (53%) actually conducted SDV. Of the 9 completed published trials, 44% (4/9) conducted SDV. Conclusion: There is little evidence base for the methods and effect of SDV on study outcomes.
Based on the results of the systematic review, survey, and audit, more research is needed to support the evidence base for the methods and effect of SDV on study outcomes.
APA, Harvard, Vancouver, ISO, and other styles
46

Lin, Jong-Der, and 林宗德. "Digitally Redesigned Dual-Rate Controllers for A Sampled-Data System." Thesis, 1996. http://ndltd.ncl.edu.tw/handle/03747300907696359374.

Full text
Abstract:
碩士
國立成功大學
電機工程研究所
84
This thesis first proposes digitally redesigned dual-rate controllers. Because the characteristics of different systems vary, simply halving the original sampling period is not always optimal. We therefore develop dual-rate sampling, i.e., inserting a tunable sampling point within the original sampling period. Using dual-rate sampling together with the principle of equivalent areas, the discrete-time input signals are exactly evaluated so that the desired digitally redesigned feedback gain and forward gain can be obtained. Furthermore, we present an efficient digital redesign method, based on the improved block-pulse function approach and dual-rate sampling, for input time-delay systems. The state responses of the continuous-time system closely match those of the digitally redesigned system when the tuning parameters N and i are selected to give the smallest difference error. With dual-rate sampling, the coefficients of the block-pulse function expansion are exactly evaluated by integrating both sides of the continuous-time designed closed-loop system, so that the desired digitally redesigned feedback gain and forward gain are obtained.
APA, Harvard, Vancouver, ISO, and other styles
47

Chen, Shin-Hung, and 陳信宏. "Design of Low Data Transmission Rate Controllers for Networked Control Systems." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/22723770380094393435.

Full text
Abstract:
碩士
國立臺灣海洋大學
電機工程學系
97
In this thesis, we consider the problem of reducing the data transmission rate in networked control systems with periodic sensing. Firstly, for an L2-gain-rendering controller, we derive a deadband transmission condition for sensor nodes. The system maintains internal stability and the L2-gain specification if the sensor transmits measured data only when the condition holds. In this case the transmission rate is significantly reduced, since not every measurement needs to be transmitted. Secondly, when a data transmission fails, we calculate the re-transmission interval; stability and the L2-gain specification are still guaranteed if one successful transmission occurs within it. Finally, a deadband transmission condition guaranteeing robust internal stability and a robust L2-gain specification is derived when there are uncertainties in the considered systems. Several examples are provided for illustration.
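A deadband transmission rule of the general kind this abstract describes can be sketched as follows: transmit only when the measurement has drifted from the last transmitted value by more than a fraction of the current state norm. The threshold form and the value of `delta` below are illustrative assumptions; the thesis derives its condition from the L2-gain specification.

```python
import numpy as np

def should_transmit(x, x_last_sent, delta=0.2):
    """Event-triggered (deadband) rule sketch: send the new measurement x
    only if it differs from the last transmitted value by more than a
    fraction delta of the current state norm."""
    return bool(np.linalg.norm(x - x_last_sent) > delta * np.linalg.norm(x))

x_last = np.array([1.0, 0.0])
small_drift = should_transmit(np.array([1.05, 0.0]), x_last)  # inside deadband
large_drift = should_transmit(np.array([1.5, 0.5]), x_last)   # must transmit
```

Between transmissions the controller keeps using `x_last`, which is how the network load drops: small fluctuations inside the deadband never reach the channel.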
APA, Harvard, Vancouver, ISO, and other styles
48

Cheng-Ming, Huang. "Optimal Digital Redesigns of PAM and PWM Controllers for Sampled-Data Time-Delay System." 2002. http://www.cetd.com.tw/ec/thesisdetail.aspx?etdun=U0022-1409200710323328.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Lin, Su-Ken, and 林樹根. "Robust Control of Sampled-Data Uncertain Input Time-Delay Systems Using Digitally Redesigned Controllers." Thesis, 1998. http://ndltd.ncl.edu.tw/handle/07206409483541722270.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Fu, En-Ping, and 傅恩平. "Design of 2DoF PID Controllers Directly from Plant Data for Stable, Integrating, and Unstable Processes." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/a3tgj8.

Full text
Abstract:
碩士
國立臺北科技大學
化學工程研究所
101
PID controllers are the most widely used controllers in the chemical process industries, and various model-based PID design methods have therefore been developed. A major drawback of the model-based methods is that their effectiveness degrades for high-order process dynamics because of the inevitable modelling error. Consequently, it is an attractive alternative to design PID controllers directly from a set of process input and output data, without resorting to a process model. This study proposes a novel method of two-degree-of-freedom (2DoF) PID controller design for stable, integrating, and unstable processes directly using plant data available from a plant test. The proposed method first derives the PID parameters so that the resulting control system behaves as closely as possible to the prescribed reference models. The reference models can be defined to describe the desired closed-loop dynamics for either set-point tracking or disturbance rejection. A simple one-dimensional optimization problem is formulated to determine an appropriate reference model for the controlled process, with robustness also taken into consideration. Achieving good control performance for both disturbance rejection and set-point tracking simultaneously is not possible with a one-degree-of-freedom (1DoF) PID controller. Therefore, the 2DoF PID controller with set-point weighting is subsequently designed to improve the set-point response. The set-point weighting parameters for the proportional and derivative modes are obtained so that the set-point response follows a prescribed reference trajectory. Extensive simulation results show the superiority of the proposed method over existing (model-based) PID design methods for various types of process dynamics.
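The set-point weighting idea behind a 2DoF PID controller can be sketched in discrete time: weights `b` and `c` apply the set point only partially to the proportional and derivative terms, detuning the set-point response without altering disturbance rejection (which is governed by the full-error integral term). The gains and weights below are illustrative, not values from the thesis.

```python
class TwoDofPid:
    """Minimal discrete-time 2DoF PID with set-point weighting (sketch)."""

    def __init__(self, kp, ki, kd, b=0.5, c=0.0, dt=0.1):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.b, self.c = b, c          # set-point weights (P and D modes)
        self.dt = dt
        self.integral = 0.0
        self.prev_ed = None

    def update(self, setpoint, y):
        ep = self.b * setpoint - y     # weighted proportional error
        ei = setpoint - y              # full error drives the integral
        ed = self.c * setpoint - y     # weighted derivative error
        self.integral += self.ki * ei * self.dt
        deriv = 0.0 if self.prev_ed is None else (ed - self.prev_ed) / self.dt
        self.prev_ed = ed
        return self.kp * ep + self.integral + self.kd * deriv

pid = TwoDofPid(kp=2.0, ki=1.0, kd=0.0, b=0.5, c=0.0, dt=0.1)
u = pid.update(setpoint=1.0, y=0.0)  # first control move after a step set point
```

With `b = 1` and `c = 1` this reduces to a 1DoF PID; lowering `b` softens the proportional kick on a set-point step, which is the mechanism the abstract uses to shape the set-point response independently.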
APA, Harvard, Vancouver, ISO, and other styles