
Dissertations / Theses on the topic 'Calibration software'



Consult the top 46 dissertations / theses for your research on the topic 'Calibration software.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

O'Kennedy, Brian James. "Stereo camera calibration." Thesis, Stellenbosch : Stellenbosch University, 2002. http://hdl.handle.net/10019.1/53063.

Full text
Abstract:
Thesis (MScEng)--Stellenbosch University, 2002.
ENGLISH ABSTRACT: We present all the components needed for a fully-fledged stereo vision system, ranging from object detection through camera calibration to depth perception. We propose an efficient, automatic and practical method to calibrate cameras for use in 3D machine vision metrology. We develop an automated stereo calibration system that only requires a series of views of a manufactured calibration object in unknown positions. The system is tested against real and synthetic data, and we investigate the robustness of the proposed method compared to standard calibration practice. All the aspects of 3D stereo reconstruction are dealt with, and we present the necessary algorithms to perform epipolar rectification on images as well as to solve the correspondence and triangulation problems. It was found that the system performs well even in the presence of noise, and calibration is easy and requires no specialist knowledge.
AFRIKAANSE OPSOMMING: Ons beskryf al die komponente van 'n omvattende stereo visie sisteem. Die kern van die sisteem is 'n effektiewe, ge-outomatiseerde en praktiese metode om kameras te kalibreer vir gebruik in 3D rekenaarvisie. Ons ontwikkel 'n outomatiese, stereo kamerakalibrasie sisteem wat slegs 'n reeks beelde van 'n kalibrasie voorwerp in onbekende posisies vereis. Die sisteem word getoets met reële en sintetiese data, en ons vergelyk die robuustheid van die metode met die standaard algoritmes. Al die aspekte van die 3D stereo rekonstruksie word behandel en ons beskryf die nodige algoritmes om epipolêre rektifikasie op beelde te doen sowel as metodes om die korrespondensie- en diepte probleme op te los. Ons wys dat die sisteem goeie resultate lewer in die aanwesigheid van ruis en dat kamerakalibrasie outomaties kan geskied sonder dat enige spesialis kennis benodig word.
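For readers who want a concrete picture of the pipeline summarised above, the sketch below shows a generic stereo calibration, rectification and triangulation flow using OpenCV; it is not the author's system, and the chessboard pattern size and image file names are assumptions made for illustration.

    # Generic stereo calibration + triangulation sketch with OpenCV (not the thesis code).
    import cv2
    import numpy as np

    PATTERN = (9, 6)                                   # assumed chessboard inner-corner count
    objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)   # planar target, unit squares

    pairs = [("left_00.png", "right_00.png")]          # assumed file names; many views are needed in practice
    obj_pts, left_pts, right_pts = [], [], []
    for left_file, right_file in pairs:
        imgL = cv2.imread(left_file, cv2.IMREAD_GRAYSCALE)
        imgR = cv2.imread(right_file, cv2.IMREAD_GRAYSCALE)
        okL, cornersL = cv2.findChessboardCorners(imgL, PATTERN)
        okR, cornersR = cv2.findChessboardCorners(imgR, PATTERN)
        if okL and okR:
            obj_pts.append(objp); left_pts.append(cornersL); right_pts.append(cornersR)

    size = imgL.shape[::-1]                            # (width, height)
    _, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, left_pts, size, None, None)
    _, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, right_pts, size, None, None)
    _, K1, d1, K2, d2, R, T, E, F = cv2.stereoCalibrate(
        obj_pts, left_pts, right_pts, K1, d1, K2, d2, size, flags=cv2.CALIB_FIX_INTRINSIC)

    # Epipolar rectification, then triangulation of the already-corresponding target corners
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, d1, K2, d2, size, R, T)
    ptsL = cv2.undistortPoints(left_pts[-1], K1, d1, R=R1, P=P1).reshape(-1, 2).T
    ptsR = cv2.undistortPoints(right_pts[-1], K2, d2, R=R2, P=P2).reshape(-1, 2).T
    pts4d = cv2.triangulatePoints(P1, P2, ptsL, ptsR)
    pts3d = (pts4d[:3] / pts4d[3]).T                   # homogeneous -> Euclidean 3D points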
APA, Harvard, Vancouver, ISO, and other styles
2

Herbepin, Christian. "Flight Test Instrumentation Manager Software." International Foundation for Telemetering, 2008. http://hdl.handle.net/10150/606200.

Full text
Abstract:
ITC/USA 2008 Conference Proceedings / The Forty-Fourth Annual International Telemetering Conference and Technical Exhibition / October 27-30, 2008 / Town and Country Resort & Convention Center, San Diego, California
This paper presents the Flight Test Instrumentation Manager Software application internally developed and used inside the Eurocopter Flight Test department. This fully integrated and user-friendly tool covers all management requirements for the entire life cycle of the flight test instrumentation equipment and configuration, tracking all the main events: ordering, calibration, configuration, service and repair, and final disposal. FTIManager serves as a central hub between the instrumentation team and the post-processing and analysis teams.
APA, Harvard, Vancouver, ISO, and other styles
3

Sjölin, Anders. "Utvärdering av Beamex CMX Calibration Software: med fokus på användbarhet." Thesis, Högskolan i Gävle, Avdelningen för Industriell utveckling, IT och Samhällsbyggnad, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:hig:diva-11353.

Full text
Abstract:
In industry, the main reason for calibration is that even the best instruments lose their ability to provide accurate measurements over time. With calibration software as support, the calibration process and the handling of its data become more efficient. The instrument technicians at Vallviks Bruk work in the CMX Calibration Software system from Beamex, and I was assigned to evaluate its usability. The thesis focuses primarily on the users and how they experience the system's user interface. Furthermore, the results of the evaluation are intended to lead to proposals for improvements to the system. The evaluation consists of a user test and a heuristic evaluation. A quantitative study was designed for the user test, in which data were collected to measure effectiveness, efficiency and satisfaction. After the user test, I performed an expert evaluation in which the usability problems were identified on the basis of ten predefined heuristics. The results of the evaluation indicated that the system's usability had problems with access to system operations, since it was adapted to experienced users. From the results of the expert evaluation, these problems could be related to the heuristic Flexibility and efficiency of use, since the system did not cater to novices as well as to experienced users. The test participants reacted to the lack of icons and buttons when navigating through the system. Based on the test participants' results and the expert evaluation, a proposal was formulated to expose the hard-to-reach system operations in the design and make them accessible to novices. Other possible improvements related to the heuristic Consistency and standards, but were of minor importance.
APA, Harvard, Vancouver, ISO, and other styles
4

Kupferschmidt, Benjamin. "INTEGRATING ENGINEERING UNIT CONVERSIONS AND SENSOR CALIBRATION INTO INSTRUMENTATION SETUP SOFTWARE." International Foundation for Telemetering, 2007. http://hdl.handle.net/10150/604520.

Full text
Abstract:
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada
Historically, different aspects of the configuration of an airborne instrumentation system were specified in a variety of different software applications. Instrumentation setup software handled the definition of measurements and PCM Formats while separate applications handled pre-flight checkout, calibration and post-flight data analysis. This led to the manual entry of the same data multiple times. Industry standards such as TMATS strive to address this problem by creating a data-interchange format for passing setup information from one application to another. However, a better alternative is to input all of the relevant setup information about the sensor and the measurement when it is initially created in the instrumentation vendor’s software. Furthermore, an additional performance enhancement can be achieved by adding the ability to perform sensor calibration and engineering unit conversions to pre-flight data visualization software that is tightly coupled with the instrumentation setup software. All of the setup information can then be transferred to the ground station for post-flight processing and data reduction. Detailed reports can also be generated for each measurement. This paper describes the flow of data through an integrated airborne instrumentation setup application that allows sensors and measurements to be defined, acquired, calibrated and converted from raw counts to engineering units. The process of performing a sensor calibration, configuring engineering unit conversions, and importing calibration and transducer data sheets will also be discussed.
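As a rough illustration of the raw-counts-to-engineering-units step the paper discusses (not the vendor software itself), a calibration curve can be fitted from data-sheet points and then applied to telemetry samples; the calibration points below are invented for the example.

    # Fit a sensor calibration curve and convert raw counts to engineering units (illustrative values).
    import numpy as np

    raw_counts = np.array([0, 1024, 2048, 3072, 4095])      # ADC counts recorded at known stimuli
    eng_values = np.array([0.0, 25.0, 50.0, 75.0, 100.0])   # e.g. pressure in psi (invented)

    coeffs = np.polyfit(raw_counts, eng_values, deg=1)      # linear fit; raise deg for nonlinear sensors
    to_eu = np.poly1d(coeffs)

    samples = np.array([512, 1500, 4000])                   # raw telemetry samples
    print(to_eu(samples))                                   # the same samples in engineering units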
APA, Harvard, Vancouver, ISO, and other styles
5

Civelek, Utku. "A Software Tool For Vehicle Calibration, Diagnosis And Test Via Controller Area Network." Master's thesis, METU, 2012. http://etd.lib.metu.edu.tr/upload/12614836/index.pdf.

Full text
Abstract:
Controller Area Networks (CANs) in vehicles need highly sophisticated software tools to be designed and tested in the development and production phases. These tools consume a lot of computer resources and usually have complex user interfaces. Therefore, they are not feasible for vehicle service stations, where low-performance computers are used and the workers employed are not very familiar with software. In this thesis, we develop a measurement, calibration, test and diagnosis program -diaCAN- that is suitable for service stations. diaCAN can transmit and receive messages over 3 CAN bus channels. It can display and plot the data received from the bus, import network message and Electronic Control Unit (ECU) configurations, and record bus traffic with standard file formats. Moreover, diaCAN can calibrate ECU values, acquire fault records and test vehicle components with CAN Calibration Protocol functions. All of these capabilities are verified and evaluated on a test bed with a real CAN bus and ECUs.
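The bus-level part of what the abstract describes (sending, receiving and logging CAN frames) can be sketched with the python-can package; the channel name, message identifier and payload below are placeholders, and the CAN Calibration Protocol layer built on top of such frames is not shown.

    # Minimal CAN transmit/receive sketch with python-can; channel, ID and payload are placeholders.
    import can

    bus = can.interface.Bus(channel="can0", interface="socketcan")   # assumed SocketCAN channel

    request = can.Message(arbitration_id=0x7E0,                      # placeholder request identifier
                          data=[0x02, 0x10, 0x01, 0x00, 0x00, 0x00, 0x00, 0x00],
                          is_extended_id=False)
    bus.send(request)

    reply = bus.recv(timeout=1.0)                                    # wait up to 1 s for any frame
    if reply is not None:
        print(f"ID=0x{reply.arbitration_id:X} data={reply.data.hex()}")

    bus.shutdown()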
APA, Harvard, Vancouver, ISO, and other styles
6

Mariotti, Gilles <1985>. "An Integrated Transmission-Media Noise Calibration Software For Deep-Space Radio Science Experiments." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amsdottorato.unibo.it/6519/.

Full text
Abstract:
The thesis describes the implementation of calibration, format-translation and data conditioning software for radiometric tracking data of deep-space spacecraft. All propagation-media noise rejection techniques available as features in the code are covered in terms of their mathematical formulation, performance and software implementation. Some techniques are retrieved from the literature and the current state of the art, while other algorithms have been conceived ex novo. All three typical deep-space refractive environments (solar plasma, ionosphere, troposphere) are dealt with by employing specific subroutines. Specific attention has been given to the GNSS-based tropospheric path delay calibration subroutine, since it is the largest module of the software suite in terms of both lines of code and development time. The software is currently in its final stage of development and, once completed, will serve as a pre-processing stage for orbit determination codes. Calibration of transmission-media noise sources in radiometric observables proved to be an essential operation on radiometric data in order to meet the increasingly demanding error budget requirements of modern deep-space missions. A completely autonomous and all-around propagation-media calibration software package is a novelty in orbit determination, although standalone codes are currently employed by ESA and NASA. The described software is planned to be compatible with the current standards for tropospheric noise calibration used by both agencies, such as the AMC, TSAC and ESA IFMS weather data, and it natively works with the Tracking Data Message (TDM) file format adopted by CCSDS as a standard aimed at promoting and simplifying inter-agency collaboration.
APA, Harvard, Vancouver, ISO, and other styles
7

Mirpour, Sasha. "A Comparison of 3D Camera Tracking Software." Thesis, University of Gävle, Department of Mathematics, Natural and Computer Sciences, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:hig:diva-779.

Full text
Abstract:

In the past decade, computer-generated images have become widely used in the visual effects industry. One of the main reasons is the ability to seamlessly blend three-dimensional (3D) animation with live-action footage. In this study, different 3D camera tracking (also referred to as matchmoving) software packages are compared, focusing on workflow, user-friendliness and production quality.

APA, Harvard, Vancouver, ISO, and other styles
8

Bauer, Zachary Obenour. "A Calibration Method for a Controlled Reception Pattern Antenna and Software Defined Radio Configuration." Ohio University / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1357402542.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Rogers, Craig N. "Object-oriented design of an automated calibration system for an analog I/O process control device." [Denver, Colo.] : Regis University, 2006. http://165.236.235.140/lib/CRogers2007.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Hlavínek, Jakub. "Software WEST pro výpočet čistíren odpadních vod." Master's thesis, Vysoké učení technické v Brně. Fakulta stavební, 2019. http://www.nusl.cz/ntk/nusl-392043.

Full text
Abstract:
In its theoretical part, this master's thesis covers selected software packages for mathematical modelling of wastewater treatment. It summarizes and compares the software available on the market, its use, advantages and disadvantages. The practical part deals with an example evaluation of a selected wastewater treatment plant according to Czech normative standards and an evaluation of the same plant using a mathematical model created in the WEST program. The assembly of the wastewater treatment plant model consisted of studies, creation of the layout, insertion of the provided data, calibration and model evaluation.
APA, Harvard, Vancouver, ISO, and other styles
11

Bao, Rui He. "Case-based reasoning for automotive engine electronic control unit calibration." Thesis, University of Macau, 2009. http://umaclib3.umac.mo/record=b2099648.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Shadle, Daryl Allen. "An investigation into the long-term impact of the calibration of software estimation models using raw historical data." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1994. http://handle.dtic.mil/100.2/ADA286112.

Full text
Abstract:
Thesis (M.S. in Information Technology Management) Naval Postgraduate School, September 1994.
Thesis advisors, T. Hamid and Keebom Kang. "September 1994." Bibliography: p. 127-128. Also available online.
APA, Harvard, Vancouver, ISO, and other styles
13

Náplavová, Eva. "Kalibrace hydraulického modelu vodovodní sítě." Master's thesis, Vysoké učení technické v Brně. Fakulta stavební, 2020. http://www.nusl.cz/ntk/nusl-409704.

Full text
Abstract:
This diploma thesis deals with the calibration of hydraulic simulation models, especially with the methods used for calibration and the parameters that are modified during calibration. A literature review covering mathematical modelling, the basic principles applied in hydraulic modelling, and current approaches to calibration and data collection is presented in the theoretical part. In the practical part of the thesis, a hydraulic model of the Horní Dunajovice group water supply system is built and subsequently calibrated. The calibration is first performed manually for the normal operational state and then, for a load case with high velocities, using calibration software created for this purpose.
APA, Harvard, Vancouver, ISO, and other styles
14

Shetty, Keerthan, and Venkata Sai Nikhil Epuri. "Virtual vehicle capabilities towards verification, validation and calibration of vehicle motion control functions." Thesis, KTH, Fordonsdynamik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-293412.

Full text
Abstract:
Passenger safety and comfort are important aspects in the process of vehicle development. The world is heading towards developing the safest possible vehicle on the road. Using vehicle motion control functions is one of the ways to enhance vehicle stability. These motion control functions need to be developed in an energy-optimised way. By complementing some of the development process with virtual models, both development time and cost could be minimised. Hence, a sustainable way of control function development could be achieved. In order to verify, validate and calibrate vehicle motion control functions, an accurate model of the virtual vehicle is required. Hence, a research question on how good the virtual model needs to be for this purpose has been addressed. This report suggests a framework to determine the capabilities of a virtual vehicle. In this report, a comparison study has been carried out by exciting the real car and the virtual model of a Volvo XC90, with a focus on covering the six degrees of freedom (yaw, pitch, roll, longitudinal, lateral and vertical). A semi-automated framework that possesses the capability of automating the testing in a virtual platform has been established. From the test results, the virtual vehicle capabilities were determined. Further, in the second part of the report, an example use case has been considered by taking two calibration sets of the Electronic Stability Control (ESC) system in order to verify the previously established framework. The analysis includes various levels of plant and controller complexity, such as Model-in-loop, Software-in-loop and Hardware-in-loop, and two different road surfaces, low friction and high friction. From the observations, the virtual models considered correlate well for the purpose of verification and validation. However, for the purpose of calibration, the models need to be fine-tuned in the virtual platform. Furthermore, the correlation on the low-friction road surface could be improved by simulating the tests using an advanced tyre model. Overall, this study helps in choosing the correct complexity of the various subsystems in a vehicle for the purpose of verification, validation and calibration of vehicle motion control functions.
Passagerarsäkerhet och komfort är viktiga aspekter i utvecklingen av ett fordon. Världen är på väg mot att utveckla säkraste möjliga fordon på vägen. Användning av fordonetse rörelsekontrollfunktioner är ett av sätten att förbättra fordonets stabilitet. Dessa rörelsekontrollfunktioner måste utvecklas på ett energioptimerat sätt. Genom att komplettera en del av utvecklingsprocessen med virtuella modeller kan både utvecklingstid och kostnad minimeras. Därför kan ett hållbart sätt att utveckla funktionerna för kontrollfunktioner uppnås. För att verifiera, validera och kalibrera fordonets rörelsekontrollfunktioner krävs en detaljerad modell av ett virtuellt fordon. Därför har en forskningsfråga om hur bra den virtuella modellen måste vara för ändamålet behandlats. Denna rapport föreslår ett ramverk för att bestämma funktionerna hos virtuella fordon.I denna rapport har en jämförelsestudie genomförts genom att excitera den verkliga bilen och den virtuella modellen av en Volvo XC90 med fokus på att täcka de sex frihetsgraderna (gir, nick, roll, längs, lateral, vertikal). Ett semi-automatiserat ramverk som har förmågan att automatisera testningen i en virtuell plattform har skapats. Från testresultaten bestämdes de virtuella fordonsfunktionerna. Vidare har i den andra delen av rapporten ett exempel på användningsfall beaktats genom att man tar två kalibreringsuppsättningar av ESC-system (Electronic Stability Control) för att verifiera det tidigare etablerade ramverket.Analysen innefattar olika nivåer av modell- och styrenhetskomplexitet såsom Model-in-loop, Software-in-loop och Hardware-in-loop och på två olika vägytor, låg friktion och hög friktion. Enligt observationerna är de virtuella modellerna väl korrelerade för verifiering och validering. För kalibreringen måste dock modellerna finjusteras på den virtuella plattformen. Dessutom kunde korrelationen på lågfriktionsvägytan förbättras genom att simulera testerna med hjälp av en avancerad däckmodell. Sammantaget hjälper den här studien att välja rätt komplexitet hos olika delsystem i ett fordon för verifiering, validering och kalibrering av fordonets rörelsekontrollfunktioner.
APA, Harvard, Vancouver, ISO, and other styles
15

Verma, Rashmi <1982>. "Towards an all-sky continuum survey with a new K-band multi-feed receiver: system characterization, calibration, software development and pilot survey." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2011. http://amsdottorato.unibo.it/3840/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Truong, Charles. "Détection de ruptures multiples – application aux signaux physiologiques." Thesis, Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLN030/document.

Full text
Abstract:
Ce travail s’intéresse au problème de détection de ruptures multiples dans des signaux physiologiques (univariés ou multivariés). Ce type de signaux comprend par exemple les électrocardiogrammes (ECG), électroencéphalogrammes (EEG), les mesures inertielles (accélérations, vitesses de rotation, etc.). L’objectif de cette thèse est de fournir des algorithmes de détection de ruptures capables (i) de gérer de long signaux, (ii) d’être appliqués dans de nombreux scénarios réels, et (iii) d’intégrer la connaissance d’experts médicaux. Par ailleurs, les méthodes totalement automatiques, qui peuvent être utilisées dans un cadre clinique, font l’objet d’une attention particulière. Dans cette optique, des procédures robustes de détection et des stratégies supervisées de calibration sont décrites, et une librairie Python open-source et documentée, est mise en ligne.La première contribution de cette thèse est un algorithme sous-optimal de détection de ruptures, capable de s’adapter à des contraintes sur temps de calcul, tout en conservant la robustesse des procédures optimales. Cet algorithme est séquentiel et alterne entre les deux étapes suivantes : une rupture est détectée, puis retranchée du signal grâce à une projection. Dans le cadre de sauts de moyenne, la consistance asymptotique des instants estimés de ruptures est démontrée. Nous prouvons également que cette stratégie gloutonne peut facilement être étendue à d’autres types de ruptures, à l’aide d’espaces de Hilbert à noyau reproduisant. Grâce à cette approche, des hypothèses fortes sur le modèle génératif des données ne sont pas nécessaires pour gérer des signaux physiologiques. Les expériences numériques effectuées sur des séries temporelles réelles montrent que ces méthodes gloutonnes sont plus précises que les méthodes sous-optimales standards et plus rapides que les algorithmes optimaux.La seconde contribution de cette thèse comprend deux algorithmes supervisés de calibration automatique. Ils utilisent tous les deux des exemples annotés, ce qui dans notre contexte correspond à des signaux segmentés. La première approche apprend le paramètre de lissage pour la détection pénalisée d’un nombre inconnu de ruptures. La seconde procédure apprend une transformation non-paramétrique de l’espace de représentation, qui améliore les performances de détection. Ces deux approches supervisées produisent des algorithmes finement calibrés, capables de reproduire la stratégie de segmentation d’un expert. Des résultats numériques montrent que les algorithmes supervisés surpassent les algorithmes non-supervisés, particulièrement dans le cas des signaux physiologiques, où la notion de rupture dépend fortement du phénomène physiologique d’intérêt.Toutes les contributions algorithmiques de cette thèse sont dans "ruptures", une librairie Python open-source, disponible en ligne. Entièrement documentée, "ruptures" dispose également une interface consistante pour toutes les méthodes
This work addresses the problem of detecting multiple change points in (univariate or multivariate) physiological signals. Well-known examples of such signals include the electrocardiogram (ECG), the electroencephalogram (EEG), and inertial measurements (acceleration, angular velocities, etc.). The objective of this thesis is to provide change point detection algorithms that (i) can handle long signals, (ii) can be applied to a wide range of real-world scenarios, and (iii) can incorporate the knowledge of medical experts. In particular, a greater emphasis is placed on fully automatic procedures which can be used in daily clinical practice. To that end, robust detection methods as well as supervised calibration strategies are described, and a documented open-source Python package is released. The first contribution of this thesis is a sub-optimal change point detection algorithm that can accommodate time complexity constraints while retaining most of the robustness of optimal procedures. This algorithm is sequential and alternates between the two following steps: a change point is estimated, then its contribution to the signal is projected out. In the context of mean-shifts, asymptotic consistency of the estimated change points is obtained. We prove that this greedy strategy can easily be extended to other types of changes by using reproducing kernel Hilbert spaces. Thanks to this novel approach, physiological signals can be handled without making assumptions on the generative model of the data. Experiments on real-world signals show that those approaches are more accurate than standard sub-optimal algorithms and faster than optimal algorithms. The second contribution of this thesis consists of two supervised algorithms for automatic calibration. Both rely on labeled examples, which in our context consist of segmented signals. The first approach learns the smoothing parameter for the penalized detection of an unknown number of changes. The second procedure learns a non-parametric transformation of the representation space that improves detection performance. Both supervised procedures yield finely tuned detection algorithms that are able to replicate the segmentation strategy of an expert. Results show that those supervised algorithms outperform unsupervised algorithms, especially in the case of physiological signals, where the notion of change heavily depends on the physiological phenomenon of interest. All algorithmic contributions of this thesis can be found in "ruptures", an open-source Python library, available online. Thoroughly documented, "ruptures" also comes with a consistent interface for all methods.
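Because the library produced by this work, ruptures, is publicly available, a short usage sketch can be given; the synthetic signal and the penalty value below are illustrative only and do not reproduce the thesis experiments.

    # Kernel change point detection with the 'ruptures' package on a synthetic signal.
    import ruptures as rpt

    # Piecewise-constant synthetic signal with noise (a stand-in for a physiological recording)
    signal, true_bkps = rpt.pw_constant(n_samples=500, n_features=1, n_bkps=3, noise_std=1.0)

    algo = rpt.KernelCPD(kernel="rbf").fit(signal)    # kernel-based detection
    est_bkps = algo.predict(pen=10)                   # penalty chosen by hand for this example

    print("true:", true_bkps, "estimated:", est_bkps)
    rpt.display(signal, true_bkps, est_bkps)          # quick visual check (requires matplotlib)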
APA, Harvard, Vancouver, ISO, and other styles
17

Arsalan, Muhammad. "Future Tuning Process For Embedded Control Systems." Thesis, Blekinge Tekniska Högskola, Sektionen för ingenjörsvetenskap, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-6065.

Full text
Abstract:
This master's thesis concerns the development of embedded control systems. The development process for embedded control systems involves several steps, such as control design, rapid prototyping, fixed-point implementation and hardware-in-the-loop simulations. Another step, which Volvo was not using within climate control at the time of writing (September 2009), is on-line tuning. One reason for not using this technique is that the available tools for this task (ATI Vision, INCA from ETAS or CalDesk from dSPACE) do not handle parameter dependencies in a satisfactory way. With these constraints, it is not possible to use on-line tuning, and the controller development process is more laborious and time-consuming. The main task of this thesis is to solve the problem with parameter dependencies and to make on-line tuning possible.
APA, Harvard, Vancouver, ISO, and other styles
18

Galata, Marek. "Tvorba softwarové podpory k zařízení pro absolutní kalibraci GNSS antén." Master's thesis, Vysoké učení technické v Brně. Fakulta stavební, 2016. http://www.nusl.cz/ntk/nusl-390181.

Full text
Abstract:
The diploma thesis deals with the creation of software support for a robotic device used for absolute GNSS antenna calibration, which is being developed by the Institute of Geodesy at the Faculty of Civil Engineering, Brno University of Technology. The thesis begins with a general discussion of absolute GNSS antenna calibration. The text then continues with a description of the calibration device itself, together with a summary of the testing performed so far. The following sections focus on the main content: a test calibration process is described, followed by all the individual steps that lead the procedure towards obtaining calibration parameters for removing errors arising from the instability of the antenna's phase centre position.
APA, Harvard, Vancouver, ISO, and other styles
19

Stormyrbakken, Christer. "Automatic compensation for inaccuracies in quadrature mixers." Thesis, Stellenbosch : University of Stellenbosch, 2005. http://hdl.handle.net/10019.1/2334.

Full text
Abstract:
Thesis (MScEng (Electrical and Electronic Engineering))--University of Stellenbosch, 2005.
In an ideal software defined radio (SDR), all parameters are defined in software, which means the radio can be reconfigured to handle any communications standard. A major technical challenge that needs to be overcome before this SDR can be realised is the design of an RF front end that can convert any digital signal to an analogue signal at any carrier frequency and vice versa. Quadrature mixing (QM) can be used to implement an analogue front end that performs up- and down-conversion between the complex baseband centred around 0 Hz and the carrier frequency. By separating the tasks of frequency conversion and digital-to-analogue conversion, the latter can be performed at a much lower sample rate, greatly reducing the demands on the hardware. Furthermore, as QM can handle variable carrier frequency and signal bandwidth, this can be done without sacrificing reconfigurability. Using QM as an analogue front end may therefore be the solution to implementing SDR handsets.
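A small numerical sketch of the quadrature up- and down-conversion the abstract refers to is given below; the carrier frequency, sample rate and test tone are arbitrary, and the thesis's compensation of mixer gain and phase imbalance is not reproduced.

    # Ideal quadrature up- and down-conversion of a complex baseband signal (arbitrary test values).
    import numpy as np

    fs, fc = 1_000_000, 100_000                  # sample rate and carrier frequency in Hz
    t = np.arange(0, 1e-3, 1 / fs)
    baseband = np.exp(2j * np.pi * 5_000 * t)    # 5 kHz complex baseband test tone

    # Up-conversion: the I branch drives cos, the Q branch drives -sin,
    # i.e. the transmitted RF signal is the real part of baseband * exp(j*2*pi*fc*t).
    rf = np.real(baseband * np.exp(2j * np.pi * fc * t))

    # Ideal down-conversion back to complex baseband (no gain/phase imbalance modelled).
    recovered = 2 * rf * np.exp(-2j * np.pi * fc * t)
    # A low-pass filter would normally follow to remove the image at -2*fc; omitted for brevity.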
APA, Harvard, Vancouver, ISO, and other styles
20

Vinci, Joseph J. "Sparse Aperture Measurement in a Non-Ideal Semi-Anechoic Chamber." University of Dayton / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=dayton1557426154482334.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Reiss, Mário Luiz Lopes. "Reconstrução tridimensional digital de objetos à curta distância por meio de luz estruturada." Biblioteca Digital de Teses e Dissertações da UFRGS, 2007. http://hdl.handle.net/10183/10072.

Full text
Abstract:
Neste trabalho apresenta-se o desenvolvimento e avaliação de um sistema de reconstrução 3D por luz estruturada. O sistema denominado de Scan3DSL é baseado em uma câmara digital de pequeno formato e um projetor de padrões. O modelo matemático para a reconstrução 3D é baseado na equação paramétrica da reta formada pelo raio de luz projetado combinado com as equações de colinearidade. Uma estratégia de codificação de padrões foi desenvolvida para permitir o reconhecimento dos padrões projetados em um processo automático. Uma metodologia de calibração permite a determinação dos vetores diretores de cada padrão projetado e as coordenadas do centro de perspectiva do projetor de padrões. O processo de calibração é realizado com a aquisição de múltiplas imagens em um plano de calibração com tomadas em diferentes orientações e posições. Um conjunto de algoritmos de processamento de imagens foi implementado para propiciar a localização precisa dos padrões e de algumas feições, como o centro de massa e quinas. Para avaliar a precisão e as potencialidades da metodologia, um protótipo foi construído, integrando uma única câmara e um projetor de padrões. Experimentos mostram que um modelo de superfície pode ser obtido em um tempo total de processamento inferior a 10 segundos, e com erro absoluto em profundidade em torno de 0,2 mm. Evidencia-se com isso a potencialidade de uso em várias aplicações.
The purpose of this work is to present the structured light system that was developed. The system, named Scan3DSL, is based on off-the-shelf digital cameras and a projector of patterns. The mathematical model for 3D reconstruction is based on the parametric equation of the projected straight line combined with the collinearity equations. A pattern codification strategy was developed to allow fully automatic pattern recognition. A calibration methodology enables the determination of the direction vector of each pattern and the coordinates of the perspective centre of the pattern projector. The calibration process is carried out with the acquisition of several images of a flat surface from different distances and orientations. Several processes were combined to provide a reliable solution for pattern location. In order to assess the accuracy and the potential of the methodology, a prototype was built, integrating a projector of patterns and a digital camera in a single mount. The experiments using reconstructed surfaces with real data indicated that a relative accuracy of 0.2 mm in depth could be achieved, with a processing time of less than 10 seconds.
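The core geometric step described above, intersecting the camera's viewing ray with the projected light ray, amounts to finding the point closest to two 3D lines; the generic least-squares sketch below uses made-up ray origins and directions and is not the Scan3DSL code, where these quantities come from the calibration.

    # Least-squares "intersection" of two 3D rays (camera viewing ray and projected-pattern ray).
    # The ray origins and directions are made up; in the real system they come from calibration.
    import numpy as np

    def closest_point_to_rays(origins, directions):
        """Return the point minimizing the summed squared distance to each ray origin + t*direction."""
        A = np.zeros((3, 3))
        b = np.zeros(3)
        for o, d in zip(origins, directions):
            d = d / np.linalg.norm(d)
            P = np.eye(3) - np.outer(d, d)       # projects onto the plane orthogonal to the ray
            A += P
            b += P @ o
        return np.linalg.solve(A, b)

    cam_origin = np.array([0.0, 0.0, 0.0])       # camera perspective centre
    cam_dir = np.array([0.10, 0.05, 1.0])        # viewing ray through the detected image point
    proj_origin = np.array([0.20, 0.00, 0.0])    # projector perspective centre
    proj_dir = np.array([-0.05, 0.05, 1.0])      # direction vector of the projected pattern

    print(closest_point_to_rays([cam_origin, proj_origin], [cam_dir, proj_dir]))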
APA, Harvard, Vancouver, ISO, and other styles
22

Cagni, Júnior Elói. "Software inteligente embarcado aplicado à correção de erro na medição de vazão em gás natural." Universidade Federal do Rio Grande do Norte, 2007. http://repositorio.ufrn.br:8080/jspui/handle/123456789/15164.

Full text
Abstract:
This study developed software routines, in a system composed basically of a digital signal processing (DSP) board and a supervisory system, whose main function was to correct the information measured by a turbine flow meter. This correction is based on the use of an intelligent algorithm formed by an artificial neural network. The routines were implemented both in the supervisory environment and in the DSP environment and cover three main items: processing, communication and supervision.
Este trabalho desenvolveu rotinas de software, em um sistema composto basicamente de placa processadora de sinais (DSP) e supervisório, cuja finalidade principal foi corrigir a informação medida por um medidor de vazão do tipo turbina. Essa correção se baseia na utilização de um algoritmo inteligente formado por uma rede neural artificial. As rotinas foram implementadas tanto no ambiente do supervisório quanto do DSP e tratam de três itens principais: processamento, comunicação e supervisão
APA, Harvard, Vancouver, ISO, and other styles
23

Troth, Bill. "TRADEOFFS TO CONSIDER WHEN SELECTING AN AIRBORNE DATA ACQUISITION SYSTEM." International Foundation for Telemetering, 2000. http://hdl.handle.net/10150/606787.

Full text
Abstract:
International Telemetering Conference Proceedings / October 23-26, 2000 / Town & Country Hotel and Conference Center, San Diego, California
Selecting an airborne data acquisition system involves compromises. No single data acquisition system can be, at the same time, the lowest in cost, the smallest, the easiest to use and the most accurate. The only way to come to a reasonable decision is to carefully plan the project, taking into account what measurements will be required, what physical environments are involved, what personnel and resources will be needed and, of course, how much money is available in the budget. Getting the right mix of equipment, resources and people to do the job within the schedule and the budget is going to involve a number of tradeoffs. A good plan and a thorough knowledge of available resources and equipment will allow you to make the necessary decisions. Hopefully, this paper will offer some suggestions that will aid in preparing your plan and give some insight into available system alternatives.
APA, Harvard, Vancouver, ISO, and other styles
24

Angarita, Soto Angie. "Design Philosophy for User Friendly Parameter Handler." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-16187.

Full text
Abstract:
DCU2 (Drive Control Unit 2) is an important control system used in applications for train systems that are configured by a set of parameters. Traditionally, parameterization is conducted using an Excel workbook during software development: the parameters are set up and then exported to the compilation step. Such an approach has a number of disadvantages, e.g., delays in the validation and verification steps, system configuration overhead, and suboptimal system reliability generated by the parameter configurations. To improve the parameterization process, this thesis implements a model-based software architecture approach and automotive industry standards via rapid prototyping using the Scrum methodology. We do this by using Matlab/Simulink, TDL (Time Description Language) and UML (Unified Modeling Language) architectural description languages to enable different views of the software architecture. We then develop, in every Scrum sprint, prototypes that implement ASAM (Association for Standardization of Automation and Measuring Systems) standards such as the XCP protocol over Ethernet (ASAM MCD-1 XCP V1.1.0) and ASAP2 (ASAM MCD-2 MC). An evaluation then shows that the thesis successfully implements the previously defined standards using commercial tools from, e.g., Vector, proving that the control unit's parameters can be handled via online calibration and measurement, leading to a significant improvement in Bombardier's software development process in a distributed development environment.
APA, Harvard, Vancouver, ISO, and other styles
25

Nelson, Wade, and Diana Shurtleff. "Bridging The Gap Between Telemetry and the PC." International Foundation for Telemetering, 1988. http://hdl.handle.net/10150/615216.

Full text
Abstract:
International Telemetering Conference Proceedings / October 17-20, 1988 / Riviera Hotel, Las Vegas, Nevada
The explosive use and extensive development of software and hardware for the IBM PC and PC clones over the past few years has positioned the PC as one of many viable alternatives for system designers configuring systems for both data acquisition and data analysis. Hardware abounds for capturing signals to be digitized and analyzed by software developed for the PC. Communication software has improved to the point where system developers can easily link instrumentation devices together to form integrated test environments for analyzing and displaying data. Telemetry systems, notably those developed for lab calibration and ground station environments, are one of many applications which can profit from the rapid development of data acquisition techniques for the PC. Recently developed for the ADS100A telemetry processor is a data acquisition module which allows the system to be linked into the PC world. The MUX-I/O module was designed to allow the PC access to telemetry data acquired through the ADS100A, as well as to provide a method by which data can be input into the telemetry environment from a host PC or an equivalent RS-232 or GPIB interface. Signals captured and digitized by the ADS100A can be passed on to the PC for further processing and/or report generation. Providing interfaces of this form to the PC greatly enhances the functionality and scope of the abilities already provided by the ADS100A as one of the major front-end processors used in telemetry processing today. The MUX-I/O module helps "bridge the gap" between telemetry and the PC amid an ever-increasing demand for improving the quantity and quality of processing power required by today's telemetry environment. This paper focuses on two distinct topics: how to transfer data to and from the PC, and what off-the-shelf software is available to provide communication links and analysis of incoming data. Major areas of discussion will include software protocols, pre- vs. post-processing, static vs. dynamic processing environments, and the major data analysis and acquisition packages available for the PC today, such as DaDisp and Lotus Measure, which aid the system designer in analyzing and displaying telemetry data. Novel applications of the telemetry-to-PC link will be discussed.
APA, Harvard, Vancouver, ISO, and other styles
26

Rodriguez-Ferreira, Julian. "Étalonnage au sol de l’instrument SIMBIO-SYS à bord de la mission ESA/BEPICOLOMBO." Thesis, Paris 11, 2015. http://www.theses.fr/2015PA112011/document.

Full text
Abstract:
La mission BepiColombo est une des pierres angulaires du programme scientifique de l'ESA. Elle permettra l'étude de la planète Mercure grâce à deux sondes mises en orbite autour de la planète. Une des deux sondes, Mercury Planetary Orbiter (MPO) développée par l'ESA, sera dédiée à l'étude de la surface et de l'intérieur de la planète. La mission est prévue pour un lancement en 2016 et une arrivée sur Mercure en janvier 2024. L’IAS est responsable de l’étalonnage de l'ensemble d'imageurs SIMBIO-SYS (Spectrometer and Imagers for MPO BepiColombo-Integrated Observatory SYStem) composé d’une caméra haute résolution (HRIC), d’une caméra stéréoscopique (STC) et d’un imageur hyperspectral visible et proche-infrarouge (VIHI). Ces instruments devraient profondément modifier nos connaissances de la composition et de la géomorphologie de la surface de Mercure. Ma thèse a consisté à participer à la définition et à la mise en place des caractéristiques et des fonctionnalités du dispositif expérimental d'étalonnage qui se compose principalement d’une cuve à vide contenant les instruments, d’un banc optique rassemblant les sources d'étalonnage et les éléments optiques qui reconstituent les conditions d'observation de Mercure, des interfaces mécaniques permettant le positionnement de l'expérience à l'intérieur de la cuve, des interfaces thermiques visant à explorer les températures de fonctionnement des différentes parties des expériences, des interfaces informatiques assurant la communication avec l'expérience et le pilotage du dispositif d'étalonnage en fonction des tests à réaliser. J’ai modélisés et validé expérimentalement certaines performances du dispositif. Enfin, j’ai défini en étroite collaboration avec les équipes italiennes co-responsables des trois instruments les différentes séquences d’étalonnage qui seront utilisées lors de l’étalonnage
BepiColombo is one of the cornerstones of the scientific program of ESA. It will study the planet Mercury with two spacecraft in orbit around the planet. One of the two spacecraft, the Mercury Planetary Orbiter (MPO), will be dedicated to the study of the surface and interior of the planet. The mission is scheduled for launch in 2016 and arrival at Mercury in January 2024. IAS is responsible for the calibration of the imaging system SIMBIO-SYS (Spectrometers and Imagers for MPO BepiColombo Integrated Observatory-SYStem), which consists of a high-resolution camera (HRIC), a stereoscopic camera (STC) and a visible and near-infrared hyperspectral imager (VIHI). These instruments should deeply change our understanding of the composition and geomorphology of Mercury's surface. My research subject allowed me to participate in all the activities concerning the definition, implementation and validation of the calibration facilities at IAS. These facilities are divided into different sub-systems: a thermal vacuum chamber containing the instrument throughout the calibration campaign and simulating the environmental conditions (temperature and pressure), an optical bench with optical components and radiometrically calibrated sources reproducing the observational conditions as they will be seen by the instrument once in orbit around Mercury, mechanical interfaces allowing the positioning and guidance of the instrument inside the vacuum chamber with the required precision and accuracy, thermal interfaces facilitating the thermal excursion of the detectors, and software interfaces to automate and control the entire system. I developed a radiometric model of the calibration system and instrument to refine the calibration sources. In parallel, I performed several measurements of some subsystems so as to validate the optical assembly and to improve its control. Finally, as a result of a close collaboration with the three Italian scientific teams of the instrument, I elaborated the full package of calibration sequences and the detailed instrument configuration that will be used during the calibration campaign.
APA, Harvard, Vancouver, ISO, and other styles
27

Björkén, Gustaf. "Mjukvara för mätning av etanolhalt i våt- och torrgas." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-230619.

Full text
Abstract:
På Nationellt Forensiskt Centrum (NFC) i Linköping, sker årligen kalibrering av de bevisinstrument som idag används vid trafiknykterhetskontroller i Sverige, Norge och Finland. Vid kalibreringen används en simulatorlösning som har beretts internt på NFC. Etanolhalten i simulatorlösningen kontrolleras med hjälp av ett referensinstrument placerad på NFC. Under de senaste åren har ett nytt referensinstrument tagits fram som ska ersätta det befintliga. Vid utvecklandet av det nya referensinstrumentet har fokus varit att skapa så bra hårdvara som möjligt och mindre tid har lagts på utvecklingen av mjukvara. Syftet med denna studie är att utveckla en central mjukvara, till det nya referensinstrumentet, för att hantera mätning av etanolhalt i våt- och torrgas. En central mjukvara förenklar och effektiviserar arbetet, i gruppen för alkoholutandning hos NFC, genom att valbara typer av mätningar kan utföras via ett och samma gränssnitt. Studien inleds med en litteraturstudie för att få bra förståelse för det berörda området alkoholutandning med fokus på användningsområdet för referensinstrumentet. Parallellt med litteraturstudien genomförs en förstudie, där delar av befintlig mjukvara observeras, översiktliga krav för den nya mjukvaran tas fram och en första version av denna utvecklas. Efter förstudien och litteraturstudien vidareutvecklas mjukvaran utifrån en iterativ systemutvecklingsmodell i nära kontakt med kontaktperson tillika kravställare på NFC. Utvecklingen av mjukvaran fortgår till att alla framtagna krav för mjukvaran anses vara uppfyllda. Studien har resulterat i en mjukvara, för det nya referensinstrumentet, som hanterar mätning av etanolhalt i våt- och torrgas. Mjukvaran är utvecklad med arkitekturen Model-View-Controller i programmeringsspråket Java. Den framtagna mjukvaran består av ett grafiskt användargränssnitt med funktionalitet för att bland annat utföra olika typer av mätningar av etanolhalt i våt- och torrgas, samt funktionalitet för att visa avlästa och beräknade värden både som text och i grafer. Mjukvaran består även av konfigurationsfiler för lagring av defaultvärden, csv-filer för lagring av mätvärden och provresultat i samband med mätningar samt loggfiler för registrering av viktiga händelser.
At the National Forensic Centre (NFC) in Linköping, the evidential breath-alcohol instruments used in traffic sobriety controls in Sweden, Norway and Finland are calibrated annually. During the calibration, a simulator solution prepared internally at NFC is used. The ethanol content of the simulator solution is checked using a reference instrument located at NFC. In recent years, a new reference instrument has been developed that will replace the existing one. In developing the new reference instrument, the focus has been on creating hardware that is as good as possible, and less time has been spent on software development. The purpose of this study is to develop central software for the new reference instrument to handle the measurement of ethanol content in wet and dry gas. A central piece of software simplifies and streamlines the work in the breath-alcohol group at NFC by making the selectable types of measurement available through a single interface. The study begins with a literature study in order to gain a good understanding of the breath-alcohol field, focusing on the use of the reference instrument. In parallel with the literature study, a preliminary study is conducted in which parts of the existing software are observed, high-level requirements for the new software are drawn up and a first version is developed. After the literature study and the preliminary study, the software is developed further according to an iterative system development model, in close interaction with the contact person and requirements owner at NFC. The study has resulted in software for the new reference instrument that handles the measurement of ethanol content in wet and dry gas. The software is developed with the Model-View-Controller architecture in the Java programming language. It consists of a graphical user interface with functionality for, among other things, performing different types of measurements of ethanol content in wet and dry gas, as well as for displaying read and calculated values both as text and in graphs. The software also includes configuration files for storing default values, CSV files for storing measurement values and sample results, and log files for recording important events.
APA, Harvard, Vancouver, ISO, and other styles
28

Lang, Kathrin. "Software for calibrating a digital image processing." Master's thesis, Pontificia Universidad Católica del Perú, 2014. http://tesis.pucp.edu.pe/repositorio/handle/123456789/5350.

Full text
Abstract:
This work is about a learning tool which provides the necessary parameters for a program controlling robots of type LUKAS at the Faculty of Mechanical Engineering. The robot control program needs various parameters depending on its environment, such as the light intensity distribution, and camera settings such as exposure time and gain raw. These values have to be transmitted from the learning tool to the robot control software. Chapter one introduces the robots of type LUKAS, which were created for the RoboCup Small Size League, as well as the camera used for image processing. The second chapter explains the learning process according to Christoph UBfeller and derives the requirements for this work. The third chapter explains the theoretical basics of image processing that are fundamental for this work. Chapter four describes the developed learning tool, which is used for the learning process and generates the required parameters for the robot control software. Chapter five presents practical tests with two persons. The sixth and last chapter summarizes the results.
APA, Harvard, Vancouver, ISO, and other styles
29

Chouquet, Julie. "Development of a method for building life cycle analysis at an early design phase Implementation in a tool - Sensitivity and uncertainty of such a method in comparison to detailed LCA software = Calibration of new flavor tagging algorithms using Bs oscillations /." [S.l. : s.n.], 2007. http://digbib.ubka.uni-karlsruhe.de/volltexte/1000009290.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Jelínek, Vít. "Kalibrace skleněných měřítek." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2015. http://www.nusl.cz/ntk/nusl-232162.

Full text
Abstract:
This thesis deals with a more labour- and time-efficient method of calibrating standard glass scales, with practical use at the Czech Metrology Institute Regional Inspectorate in Brno. The desired streamlining of the calibration was achieved by using a Micro-Vu Excel 4520 3D coordinate measuring machine. In the InSpec service software, six measuring programs were designed for use with a standard glass scale of the SIP brand. The measurement uncertainties of this calibration were calculated and presented. The thesis also draws up a draft calibration procedure and a formalized calibration document.
APA, Harvard, Vancouver, ISO, and other styles
31

Pereira, José Cristiano. "Modelo causal para análise probabilística de risco de falhas de motores a jato em situação operacional de fabricação." Niterói, 2017. https://app.uff.br/riuff/handle/1/4078.

Full text
Abstract:
O processo de fabricação de motores a jato é complexo. Perigos e riscos e muitos elementos críticos estão presentes em milhares de atividades necessárias para fabricar um motor. Na investigação realizada nota-se a inexistência de um modelo específico para calcular quantitativamente a probabilidade de falha operacional de um motor à jato. O objetivo da tese foi desenvolver um modelo causal para análise de risco probabilística de falhas de motores a jato em situação operacional de fabricação. O modelo se caracteriza pela aplicação de rede Bayesiana associada à árvore de falha / árvore de evento e elicitação de probabilidades por especialistas para quantificar a probabilidade de falha. Para a concepção da construção do modelo, foi inicialmente desenvolvida uma pesquisa bibliométrica, através da consulta aos principais motores de busca nacionais e internacionais, em periódicos científicos e técnicos, bancos de dissertações/teses e eventos técnicos relacionados ao tema, para estabelecimento dos estado-da-arte e da técnica. Para a estimativa das probabilidades associadas aos cenários de falhas propostos, foi desenvolvido um processo de elicitação de probabilidade a partir da consulta a especialistas e técnicos. Na concepção do modelo foram consideradas três áreas de influência para a confiabilidade do sistema: humana, software e calibração. Como resultado foi desenvolvido o modelo CAPEMO, que é suportado por um aplicativo que utiliza a teoria das probabilidades (Lei de Bayes) para modelar incerteza. A probabilidade de falha estimada ao final da processo de fabricação, antes do motor ser colocado em operação, contribui no processo de tomada de decisão, melhoria da segurança do sistema e redução de riscos de falha do motor em operação
The process of manufacturing jet engines is complex: hazards, risks and many critical elements are present in the thousands of activities required to manufacture an engine. The investigation conducted here observes the lack of a specific model for quantitatively estimating the probability of operational failure of a jet engine. The goal of this thesis is to develop a causal model for probabilistic risk analysis of jet engine failures arising from the manufacturing situation. The model is characterized by the application of a Bayesian network, associated with fault trees and event trees and with probabilities elicited from experts, to quantify the probability of failure. To establish the state of the art and of the technique, and for the conception and construction of the model, bibliometric research was conducted in the main national and international search engines, in scientific and technical journals, in dissertation/thesis databases and at technical events related to the topic. To estimate the probabilities associated with the proposed fault scenarios, a probability elicitation process involving technicians and experts was developed. In the design of the model, three areas of influence on system reliability were considered: human, software and calibration. As a result, the CAPEMO model was developed, supported by a software application that uses probability theory (Bayes' rule) to model uncertainty. The probability of engine failure estimated at the end of the manufacturing process, before the engine is put into operation, supports resource allocation in the decision-making process and improves system safety, reducing the risk of engine failure in operation.
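As a toy illustration of the fault-tree quantification that underlies such a model (not the CAPEMO model itself), independent basic-event probabilities can be combined through OR and AND gates; the event names and numbers below are invented.

    # Toy fault-tree quantification with independent basic events (all names and numbers invented).
    from math import prod

    def and_gate(*p):                    # output fails only if every input fails
        return prod(p)

    def or_gate(*p):                     # output fails if at least one input fails
        return 1 - prod(1 - x for x in p)

    p_human_error     = 1e-3             # illustrative basic-event probabilities
    p_software_fault  = 5e-4
    p_miscalibration  = 2e-4
    p_inspection_miss = 1e-2

    # A defect introduced by any of the three influence areas AND missed by final inspection
    p_defect  = or_gate(p_human_error, p_software_fault, p_miscalibration)
    p_failure = and_gate(p_defect, p_inspection_miss)
    print(f"P(operational failure) ~ {p_failure:.2e}")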
APA, Harvard, Vancouver, ISO, and other styles
32

Green, Steven. "Calorimetry at a future Linear Collider." Thesis, University of Cambridge, 2017. https://www.repository.cam.ac.uk/handle/1810/269648.

Full text
Abstract:
This thesis describes the optimisation of the calorimeter design for collider experiments at the future Compact Linear Collider (CLIC) and the International Linear Collider (ILC). The detector design of these experiments is built around high-granularity Particle Flow Calorimetry that, in contrast to traditional calorimetry, uses the energy measurements for charged particles from the tracking detectors. This can only be realised if calorimetric energy deposits from charged particles can be separated from those of neutral particles. This is made possible with fine granularity calorimeters and sophisticated pattern recognition software, which is provided by the PandoraPFA algorithm. This thesis presents results on Particle Flow Calorimetry performance for a number of detector configurations. To obtain these results, a new calibration procedure was developed and applied to the detector simulation and reconstruction to ensure optimal performance was achieved for each detector configuration considered. This thesis also describes the development of a software compensation technique that vastly improves the intrinsic energy resolution of a Particle Flow Calorimetry detector. This technique is implemented within the PandoraPFA framework and demonstrates the gains that can be made by fully exploiting the information provided by the fine granularity calorimeters envisaged at a future linear collider. A study of the sensitivity of the CLIC experiment to anomalous gauge couplings that affect vector boson scattering processes is presented. These anomalous couplings provide insight into possible beyond-standard-model physics. This study, which utilises the excellent jet energy resolution from Particle Flow Calorimetry, was performed at centre-of-mass energies of 1.4 TeV and 3 TeV with integrated luminosities of 1.5 ab⁻¹ and 2 ab⁻¹ respectively. The precision achievable at CLIC is shown to be approximately one to two orders of magnitude better than that currently offered by the LHC. In addition, a study into various technology options for the CLIC vertex detector is described.
APA, Harvard, Vancouver, ISO, and other styles
33

Shakoori, Moghadam Monfared Shaghayegh. "Design and prototyping of indoor positioning systems for Internet-of-Things sensor networks." Doctoral thesis, Universite Libre de Bruxelles, 2021. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/316363.

Full text
Abstract:
Accurate indoor positioning of narrowband Internet-of-Things (IoT) sensors has drawn more attention in recent years. The introduction of Bluetooth Low Energy (BLE) technology is one of the latest developments of IoT and especially applicable for Ultra-Low Power (ULP) applications. BLE is an attractive technology for indoor positioning systems because of its low-cost deployment and reasonable accuracy. Efficient indoor positioning can be achieved by deducing the sensor position from the estimated signal Angle-of-Arrival (AoA) at multiple anchors. An anchor is a base station of known position and equipped with a narrowband multi-antenna array. However, the design and implementation of indoor positioning systems based on AoA measurements involve multiple challenges. The first part of this thesis mainly addresses the impact of hardware impairments on the accuracy of AoA measurements. In practice, the subspace-based algorithms such as Multiple Signal Classification (MUSIC) suffer from sensitivity to array calibration errors coming from hardware imperfections. A detailed experimental implementation is performed using a Software Defined Radio (SDR) platform to precisely evaluate the accuracy of AoA measurements. For this purpose, a new Over-the-Air (OTA) calibration method is proposed and the array calibration error is investigated. The experimental results are compared with the theoretical analysis. These results show that array calibration errors can cause some degree of uncertainty in AoA estimation. Moreover, we propose iterative positioning algorithms based on AoA measurements for low-capacity IoT sensors with high accuracy and fair computational complexity. Efficient positioning accuracy is obtained by iterating between the angle and position estimation steps. We first develop a Data-Aided Maximum a Posteriori (DA-MAP) estimator based on the preamble of the transmitted signal. The DA-MAP estimator relies on knowledge of the transmitted signal, which makes it impractical for narrowband communications where the preamble is short. For this reason, a Non-Data-Aided Maximum a Posteriori (NDA-MAP) estimator is developed to improve the AoA accuracy. The iterative positioning algorithms are therefore classified as Data-Aided Iterative (DA-It) and Non-Data-Aided Iterative (NDA-It) depending on the knowledge of the transmitted signal that is used for estimation. Both numerical and experimental analyses are carried out to evaluate the performance of the proposed algorithms. The results show that DA-MAP and NDA-MAP estimators are more accurate than MUSIC. The results also show that DA-It comes very close to the performance of the optimal approach that directly estimates the position based on the observation of the received signal, known as Direct Position Estimation (DPE). Furthermore, the NDA-It algorithm significantly outperforms the DA-It because it can use a much higher number of samples; however, it needs more iterations to converge. In addition, we evaluate the computational savings achieved by the iterative schemes compared to DPE through a detailed complexity analysis. Finally, we investigate the performance degradation of the proposed iterative algorithms due to the impact of multipath and NLOS propagation in indoor environments. Therefore, we develop an enhanced iterative positioning algorithm with an anchor selection method in order to identify and exclude NLOS anchors.
The numerical results show that applying the anchor selection strategy significantly improves the positioning accuracy in indoor environments.
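For readers unfamiliar with the subspace approach that the proposed MAP estimators are compared against, the following is a minimal MUSIC sketch for a uniform linear array. The array size, element spacing, snapshot count, noise level and true angle are arbitrary assumptions, not the parameters of the SDR testbed described above.

    # Minimal MUSIC angle-of-arrival sketch for a uniform linear array (ULA).
    # Array size, spacing, noise level and the true angle are illustrative assumptions.
    import numpy as np

    M, d, n_snap = 8, 0.5, 200            # antennas, spacing in wavelengths, snapshots
    true_deg = 23.0
    grid = np.deg2rad(np.arange(-90, 90, 0.5))

    def steering(theta):
        return np.exp(-2j * np.pi * d * np.arange(M) * np.sin(theta))

    # Simulated narrowband snapshots: one source plus white noise.
    rng = np.random.default_rng(0)
    s = rng.standard_normal(n_snap) + 1j * rng.standard_normal(n_snap)
    X = np.outer(steering(np.deg2rad(true_deg)), s)
    X += 0.1 * (rng.standard_normal(X.shape) + 1j * rng.standard_normal(X.shape))

    R = X @ X.conj().T / n_snap                      # sample covariance matrix
    eigval, eigvec = np.linalg.eigh(R)
    En = eigvec[:, :-1]                              # noise subspace (one source assumed)

    spectrum = [1.0 / np.linalg.norm(En.conj().T @ steering(t)) ** 2 for t in grid]
    print("estimated AoA:", np.rad2deg(grid[int(np.argmax(spectrum))]), "deg")

In practice the steering model itself has to be corrected first with an array calibration such as the OTA method above, since gain and phase mismatches between antenna branches distort the noise-subspace projection.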
Doctorate in Engineering Sciences and Technology
APA, Harvard, Vancouver, ISO, and other styles
34

Juhas, Miroslav. "Vizuální detekce elektronických součástek." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2010. http://www.nusl.cz/ntk/nusl-218587.

Full text
Abstract:
This thesis describes the application of image processing for precise distance measurement in the automated production of tips for AFM microscopes. The main goal is to measure distances between assembly parts during the fabrication process, in order to acquire data for an automated assembly line that has to replace an inaccurate and unrepeatable manual assembly process. The assembly process consists of three technological steps. In the first two steps the tungsten wire is glued to the cantilever; distance measurement is necessary in all axes for proper alignment of the parts. In the third step the sharp tip is etched in a KOH solution, and the correct distance between the liquid level and the cantilever must be maintained. A high-resolution camera with a macro objective is used to acquire the images. The camera image is then calibrated to suppress distortion and to account for the scene position with respect to the camera, and a length conversion coefficient is computed. Object recognition and distance measurement are based on standard computer vision methods, mainly adaptive thresholding, moments, image statistics, the Canny edge detector and the Hough transform. The proposed algorithms have been implemented in C++ using the Intel OpenCV library. The final achieved distance resolution is about 10 µm per pixel. The algorithm output was successfully used to assemble a few test tips.
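A rough Python/OpenCV analogue of the measurement pipeline is sketched below; the thesis itself is implemented in C++ with the Intel OpenCV library, and the synthetic image, object sizes and pixel-to-micrometre scale factor here are placeholders for a calibrated camera frame.

    # Illustrative Python/OpenCV pipeline: adaptive threshold, moments, Canny, Hough.
    # The synthetic frame and the scale factor are placeholders, not thesis data.
    import cv2
    import numpy as np

    scale_um_per_px = 10.0                               # from the calibration step

    # Synthetic frame: a bright bar (the cantilever) and a small blob (the wire tip).
    img = np.zeros((480, 640), np.uint8)
    cv2.rectangle(img, (100, 260), (540, 300), 180, -1)
    cv2.circle(img, (320, 140), 10, 255, -1)

    # Adaptive thresholding copes with the uneven illumination of real images.
    binary = cv2.adaptiveThreshold(img, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                   cv2.THRESH_BINARY, 51, -10)

    # Wire-tip centroid from image moments of the smallest significant blob.
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    tip = min((c for c in contours if cv2.contourArea(c) > 50), key=cv2.contourArea)
    m = cv2.moments(tip)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]

    # Cantilever edge via Canny edge detection and the probabilistic Hough transform.
    edges = cv2.Canny(img, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=200, maxLineGap=10)

    # Vertical gap between the tip centroid and the nearest long edge, in micrometres.
    nearest_y = min(l[0][1] for l in lines)
    print(f"tip-to-cantilever gap = {abs(cy - nearest_y) * scale_um_per_px:.0f} um")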
APA, Harvard, Vancouver, ISO, and other styles
35

Hofman, Jiří. "Testovací metody pro hodnocení radiačních efektů v přesných analogových a signálově smíšených obvodech pro aplikace v kosmické elektronice." Doctoral thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2019. http://www.nusl.cz/ntk/nusl-401588.

Full text
Abstract:
The traditional radiation testing of space electronics has been used for more than fifty years to support radiation hardness assurance. Its typical goal is to ensure reliable operation of the spacecraft in the harsh environment of space. This PhD research looks at radiation testing from a different perspective: the goal is to develop radiation testing methods that focus not only on the reliability of the components but also on the continuous radiation-induced degradation of their performance. Such data are crucial for understanding the impact of radiation on the measurement uncertainty of data acquisition systems on board research space missions.
APA, Harvard, Vancouver, ISO, and other styles
36

Berka, Petr. "Snímání scény pomocí USB a FireWire kamer." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2008. http://www.nusl.cz/ntk/nusl-217203.

Full text
Abstract:
This diploma thesis deals with interfacing USB and FireWire cameras to a computer through a technology called DirectShow. The work mostly uses the MontiVision development kit, which cooperates with programming environments such as Microsoft Visual C++. The thesis shows how to use direct pixel access, how to extract single frames from a video stream, how to set up and calibrate a camera, and what a particular image-processing application can look like. Beyond the scope of the original assignment, an introduction to stereo vision is also given. The thesis is written as a manual for students and includes the author's personal experience and experiments.
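The frame grabbing, direct pixel access and calibration steps described above can be approximated today with a short Python/OpenCV sketch; the thesis itself works through DirectShow and the MontiVision kit in Visual C++, and the camera index and 9x6 chessboard below are assumptions rather than values from the thesis.

    # Python/OpenCV sketch of frame capture, pixel access and single-camera calibration.
    # Camera index 0 and the 9x6 chessboard pattern are assumptions.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)          # typically a DirectShow/MSMF backend on Windows
    ok, frame = cap.read()
    if ok:
        h, w = frame.shape[:2]
        print("centre pixel (B,G,R):", frame[h // 2, w // 2])   # direct pixel access

    # Chessboard-based calibration from a handful of views (move the board between grabs).
    pattern = (9, 6)
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

    obj_pts, img_pts = [], []
    for _ in range(15):
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)

    if obj_pts:
        rms, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts,
                                                 gray.shape[::-1], None, None)
        print("reprojection RMS:", rms)
        print("camera matrix:\n", K)
    cap.release()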
APA, Harvard, Vancouver, ISO, and other styles
37

Brossier, David. "Élaboration et validation d'une base de données haute résolution destinée à la calibration d'un patient virtuel utilisable pour l'enseignement et la prise en charge personnalisée des patients en réanimation pédiatrique Perpetual and Virtual Patients for Cardiorespiratory Physiological Studies Creating a High-Frequency Electronic Database in the PICU: The Perpetual Patient Qualitative subjective assessment of a high-resolution database in a paediatric intensive care unit-Elaborating the perpetual patient's ID card Validation Process of a High-Resolution Database in a Pediatric Intensive Care Unit – Describing the Perpetual Patient’s Validation Evaluation of SIMULRESP©: a simulation software of child and teenager cardiorespiratory physiology." Thesis, Normandie, 2019. http://www.theses.fr/2019NORMC428.

Full text
Abstract:
The complexity of patients in the intensive care unit justifies the use of clinical decision support systems. These systems bring together automated management protocols that enable adherence to guidelines, and virtual physiological or patient simulators that can be used to safely personalise management. Such devices, which operate from algorithms and mathematical equations, can only be developed from a large amount of patient data. The main objective of this work was the creation of a high-resolution database, automatically collected from critically ill children, that will be used to develop and validate a physiological simulator called SimulResp©. This manuscript presents the whole process of setting up the database, from concept to use.
APA, Harvard, Vancouver, ISO, and other styles
38

Huang, Chin-Hung, and 黃進鴻. "Rapid Calibration of White Organic Light-emitting Diodes Layer Thickness by ETFOS Software." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/39s36z.

Full text
Abstract:
Master's thesis
National Formosa University
Graduate Institute of Electro-Optical and Materials Technology
99
In this study, deviations in the organic layer thicknesses of white organic light-emitting diodes caused by human error in the laboratory are rapidly corrected using the optical simulation software ETFOS. Comparison of the experimental and simulated spectra shows that the main emitting layer (EL) film thickness is in error by a factor of 1.5 and the electron transport layer (ETL) thickness by a factor of 1.75. The experimental results also show that the device exhibits a colour shift at different driving voltages: the light-emitting recombination zone (Location) lies at 90% of the distance to the hole transport layer (HTL)/electron transport layer (ETL) interface at 6 V, at 85% at 7 V, at 74% at 8 V, and at 55% at 9 V. Finally, the simulation software (ETFOS) gives a best-adjusted emitting layer (EL) thickness of 15 nm.
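The underlying correction idea, sweeping a candidate layer thickness until the simulated spectrum matches the measured one, can be sketched as follows. The spectrum model is a crude stand-in for ETFOS and the "measured" data are synthetic, so only the fitting procedure itself is illustrative.

    # Toy illustration of thickness correction by spectral matching.
    # toy_spectrum() is NOT a physical OLED model; it stands in for an ETFOS run,
    # and the "measured" spectrum is synthetic.
    import numpy as np

    wavelengths = np.linspace(400, 700, 301)              # nm

    def toy_spectrum(thickness_nm):
        return 1.0 + 0.3 * np.cos(4 * np.pi * 1.8 * thickness_nm / wavelengths)

    measured = toy_spectrum(15.0) + np.random.default_rng(1).normal(0, 0.01, wavelengths.size)

    candidates = np.arange(5.0, 30.5, 0.5)                # nm
    errors = [np.mean((toy_spectrum(t) - measured) ** 2) for t in candidates]
    best = candidates[int(np.argmin(errors))]
    print(f"best-fit EL thickness = {best:.1f} nm")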
APA, Harvard, Vancouver, ISO, and other styles
39

Goo, Justin M. W. "The effect of negative feedback on confidence calibration and error reduction in spreadsheet development." 2002. http://proquest.umi.com/pqdweb?index=0&did=727403601&SrchMode=1&sid=5&Fmt=2&VInst=PROD&VType=PQD&RQT=309&VName=PQD&TS=1179198165&clientId=23440.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Gavigan, Patrick. "Design, Test, Calibration and Qualification of Satellite Sun Sensors, Power Systems and Supporting Software Development." Thesis, 2011. http://hdl.handle.net/1807/27338.

Full text
Abstract:
This thesis describes technologies developed for nanosatellites at the Space Flight Laboratory. A critical ground station component, the Terminal Node Controller, was upgraded in order to support Generic Nanosatellite Bus and future missions. Sun sensor requirements and operation were reviewed, followed by details of the author's work in executing the acceptance testing on these parts, including thermal shock testing, thermal functional testing, calibration, system level testing and on orbit commissioning. A new calibration test process was developed, along with supporting structure and software to ease the testing process, producing accurate calibration parameters and expected performance results for the sensors. A thermal qualification campaign was completed, demonstrating that sun sensors are capable of functioning with negligible performance degradation after exposure to extreme temperatures. A process for installing the sun sensor pin hole was developed using photolithography. Finally, power subsystem analysis for the NEMO-AM mission is presented.
APA, Harvard, Vancouver, ISO, and other styles
41

Roopa, Variza Daya. "The calibration of a software programme to assess ceramic crown preparations in a pre-clinical setting." Thesis, 2016. http://hdl.handle.net/10539/21537.

Full text
Abstract:
A research report submitted to the Faculty of Health Sciences, University of the Witwatersrand, in partial fulfilment of the requirements for the degree of Master of Science in Dentistry. School of Oral Health Sciences, Faculty of Health Sciences, University of the Witwatersrand, Johannesburg, South Africa, 2016.
Purpose: The PrepCheck (v.1.1) software (Sirona Dental Systems GmbH, Germany) is used to make the assessment of student tooth preparations digital, objective, accurate, consistent and reliable, so that students can work independently. The software has its default parameters configured to arbitrary values that are not based on clinical evidence of tooth preparations. The aim of this study was to set new parameters for the software so that it realistically assesses the quality of clinically acceptable ceramic crown preparations.

Method: Based on evidence in the literature, a new assessment rubric for the evaluation of ceramic tooth preparations was created which allowed preparations to be graded as Acceptable, Requires Modification or Unacceptable. Sixty preparations were made on typodont teeth for tooth numbers 11, 13, 15, 16 and 36 (FDI system). For each tooth, four preparations were made to meet all the requirements of the Acceptable category, four with variations in taper, incisal/occlusal reduction and axial reduction to be categorised as Requires Modification, and four had further variations made so that they fell under the Unacceptable category. The sixty preparations were assessed by five faculty instructors (acting as raters) at baseline and again after two weeks to assess intra- and inter-rater reliability. Once sufficient agreement had been reached, the software's parameters were adjusted and the preparations were scanned and compared with the categorical assessment from the instructors. This process was repeated once to test whether the software had been successfully calibrated.

Results: The intra-rater agreement was substantial, with two raters having excellent intra-rater agreement. However, two raters had poor inter-rater agreement and were excluded, after which the inter-rater reliability measured by Cohen's kappa was 0.71, corresponding to 'substantial' agreement on both occasions. The majority-decision rating from these assessments accurately resembled the intended rating and was used to compare the ratings of the software assessment using the default and the new (rubric) parameter settings. The default settings performed poorly, whereas the new settings resulted in substantial agreement with the majority decision of the instructors (raters).

Conclusions: It was found that the default parameters of the software were unrealistic and not clinically based. The parameters required considerable modification to assist in the development of clinically acceptable preparations. The software shows great promise, but the parameters have to be modified to assess preparations that are more realistic for the clinical situation.
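The inter-rater reliability quoted above is a standard Cohen's kappa computation; a minimal sketch with made-up ratings on the three rubric categories is given below.

    # Cohen's kappa between two raters scoring preparations as Acceptable (A),
    # Requires Modification (M) or Unacceptable (U). The ratings are made up.
    from sklearn.metrics import cohen_kappa_score

    rater_1 = ["A", "A", "M", "U", "A", "M", "M", "U", "A", "U"]
    rater_2 = ["A", "M", "M", "U", "A", "M", "A", "U", "A", "U"]

    kappa = cohen_kappa_score(rater_1, rater_2)
    print(f"Cohen's kappa = {kappa:.2f}")   # 0.61-0.80 is conventionally read as 'substantial'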
APA, Harvard, Vancouver, ISO, and other styles
42

Jie-Cheng, Lin, and 林傑澄. "The Software, Firmware, Calibration, and Running of the BGO Background/Luminosity Monitor in BEAST2 for SuperKEKB Commissioning." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/5qcp7s.

Full text
Abstract:
Master's thesis
National Taiwan University
Graduate Institute of Physics
105
Beam commissioning of the SuperKEKB collider began in 2016. The Beam Exorcism for A STable experiment II (BEAST II) project is specifically designed to measure the beam backgrounds around the interaction point of the SuperKEKB collider for the Belle II experiment. We developed a system using undoped bismuth germanium oxide (BGO) crystals with optical fiber connections to a multianode photomultiplier tube and a field-programmable gate array embedded DAQ (data acquisition) board for real-time beam radiation background monitoring. The radiation sensitivity of the BGO system is calibrated as 2.2E−12 ± 11.75% Gy/ADU (analog-to-digital unit) at the 700-V operation voltage with the nominal 10-m-long fibers for transmission. Our γ-ray irradiation study of the BGO system shows that the BGO crystals suffered from radiation damage: the light yields of the BGO crystals dropped by ∼40% after receiving a 4.5 krad dose in 2.5 h, which agrees with the results of the radiation hardness study we have reported. The irradiation study also proves that the BGO system is very reliable, being able to function at fairly high radiation levels without serious saturation or other problems. In addition, the running of the BGO system in BEAST II was very successful; it has provided much useful data for the beam background study, and the data that the BGO system provided will facilitate the development of the entire BEAST II project. My study covers the design of the firmware and software, the calibration of the device, the analysis of the results of the irradiation study, and the integration of the data obtained during the running in BEAST II. In this thesis, I give a comprehensive portrait of the BGO system, including the design, calibration, tests, and the results of the beam background study in BEAST II.
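Given the calibrated sensitivity quoted above, converting a raw reading to dose is a single multiplication; the sample ADU values and the readout interval in this sketch are invented.

    # Converting raw BGO readings (ADU) to dose with the calibrated sensitivity above.
    # The sample readings and the readout interval are invented placeholders.
    SENSITIVITY_GY_PER_ADU = 2.2e-12        # at 700 V with the nominal 10 m fibres
    READOUT_INTERVAL_S = 1.0                # assumed integration window

    for adu in (1.2e6, 0.9e6, 2.4e6):       # hypothetical per-interval sums
        dose = adu * SENSITIVITY_GY_PER_ADU
        print(f"{adu:.2e} ADU -> {dose:.2e} Gy ({dose / READOUT_INTERVAL_S:.2e} Gy/s)")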
APA, Harvard, Vancouver, ISO, and other styles
43

Fehr, Felix [Verfasser]. "Systematic studies, calibration, and software development for event reconstruction and data analysis using the ANTARES deep-sea neutrino telescope / vorgelegt von Felix Fehr." 2010. http://d-nb.info/1001257103/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Nyman, Joakim. "Joint Calibration of a Cladding Oxidation and a Hydrogen Pick-up Model for Westinghouse Electric Sweden AB." Thesis, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-417910.

Full text
Abstract:
Knowledge of a nuclear power plant's potential and limitations is of utmost importance when working in the nuclear field. One way to extend this knowledge is to use fuel performance codes that mimic, to the best of their ability, the real-world phenomena. Fuel performance codes involve a system of interlinked and complex models to predict the thermo-mechanical behaviour of the fuel rods. These models use several different model parameters that can be imprecise, and the parameters therefore need to be fitted/calibrated against measurement data. This thesis presents two methods to calibrate model parameters in the presence of unknown sources of uncertainty. The case on which these methods have been tested is the oxidation and hydrogen pick-up of the zirconium cladding around the fuel rods. Initially, training and testing data were sampled using the Dakota software in combination with the nuclear simulation program TRANSURANUS so that a Gaussian process surrogate model could be built. The model parameters were then calibrated in a Bayesian way by an MCMC algorithm. Additionally, two models are presented to handle unknown sources of uncertainty that may arise from model inadequacies, nuisance parameters or hidden measurement errors: the Marginal likelihood optimization method and the Margin method. To calibrate the model parameters, data from two sources were used: one source had only oxide-thickness data, but that data was extensive; the other had both oxide and hydrogen-concentration data, but less data was available. The model parameters were calibrated using the presented methods, but an unforeseen non-linearity in the joint oxidation and hydrogen pick-up case, when predicting the correlation of the model parameters, made this result unreliable.
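The surrogate-plus-MCMC workflow described above can be sketched generically: fit a Gaussian-process surrogate to a small design of simulator runs, then run a Metropolis chain on the surrogate. The "simulator", training design, measurement and prior below are synthetic stand-ins, not the Dakota/TRANSURANUS setup, and the sketch does not implement the Marginal likelihood optimization or Margin methods.

    # Generic Bayesian calibration sketch: GP surrogate + Metropolis MCMC.
    # Everything here (simulator, design, data, prior) is a synthetic stand-in.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)

    def simulator(theta):                    # stand-in for an expensive code run
        return 1.5 * theta + 0.3 * np.sin(2 * theta)

    # 1. Train the surrogate on a small design of simulator runs.
    X_train = np.linspace(0, 2, 15).reshape(-1, 1)
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
    gp.fit(X_train, simulator(X_train).ravel())

    # 2. Metropolis MCMC on the surrogate against one noisy "measurement".
    theta_true, sigma = 1.3, 0.05
    y_obs = simulator(theta_true) + rng.normal(0, sigma)

    def log_post(theta):
        if not 0.0 <= theta <= 2.0:          # uniform prior on [0, 2]
            return -np.inf
        pred = gp.predict(np.array([[theta]]))[0]
        return -0.5 * ((y_obs - pred) / sigma) ** 2

    chain, theta = [], 1.0
    for _ in range(5000):
        prop = theta + rng.normal(0, 0.1)
        if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
            theta = prop
        chain.append(theta)

    print("posterior mean of theta:", np.mean(chain[1000:]))

A sketch like this assumes the measurement error model is fully known; the Marginal likelihood and Margin methods studied in the thesis are aimed at exactly the situation where unidentified sources of uncertainty would otherwise make such a posterior overconfident.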
APA, Harvard, Vancouver, ISO, and other styles
45

Qin, Yajie. "Low Power Analog Interface Circuits toward Software Defined Sensors." Doctoral thesis, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-179671.

Full text
Abstract:
The Internet of Things is expanding into areas such as healthcare, home management, industry and agriculture, and is becoming pervasive in our lives, resulting in improved efficiency, accuracy and economic benefits. Smart sensors with embedded interfacing integrated circuits (ICs) are important enablers, and a variety of smart sensors is therefore required. However, each type of sensor requires specific interfacing chips, which divides the huge market for sensor interface chips into many niche markets, resulting in high development cost and a long time-to-market for each type. The software-defined sensor is regarded as a promising solution: it is expected to use a flexible interface platform to cover different sensors, deliver specificity through software programming, and integrate easily into the Internet of Things. In this work, research is carried out on the design and implementation of ultra-low-power analog interface circuits toward software-defined sensors for healthcare services based on the Internet of Things.

This thesis first explores architectures and circuit techniques for energy-efficient and flexible analog-to-digital conversion. A time-spreading digital calibration technique, which corrects errors due to finite gain and capacitor mismatch in multi-bit/stage pipelined converters, is developed with a short convergence time. The effectiveness of the proposed technique is demonstrated with intensive simulations. Two novel circuit-level techniques, which can be combined with digital calibration to further improve the energy efficiency of the converters, are also presented. One is the Common-Mode-Sensing-and-Input-Interchanging (CSII) operational-transconductance-amplifier (OTA) sharing technique, which eliminates potential memory effects. The other is a workload-balanced multiplying digital-to-analog converter (MDAC) architecture that improves the settling efficiency of a highly linear multi-bit stage. Two prototype converters have been designed and fabricated in a 0.13 μm CMOS technology. The first is a 14-bit 50 MS/s digitally calibrated pipelined analog-to-digital converter that employs the workload-balanced MDAC architecture and the time-spreading digital calibration technique to achieve an improved power-linearity trade-off. The second is a 1.2 V 12-bit 5~45 MS/s speed- and power-scalable ADC incorporating the CSII OTA-sharing technique, a sample-and-hold-amplifier-free topology and adjustable current bias of the building blocks to minimize the power consumption. The detailed measurement results of both converters are reported and deliver the experimental verification of the proposed techniques.

Secondly, this research investigates ultra-low-power analog front-end circuits that provide programmability and are suitable for different types of sensors. A pulse-width-modulation-based architecture with a folded reference is proposed and proven in a 0.18 μm technology to achieve high sensitivity and an enlarged dynamic range when sensing weak current signals. An 8-channel bio-electric sensing front-end, fabricated in a 0.35 μm CMOS technology, is also presented; it achieves an input impedance of 1 GΩ, an input-referred noise of 0.97 μVrms and a common-mode rejection ratio of 114 dB. With programmable gain and cut-off frequency, the front-end can be configured for long-term monitoring of a variety of bio-electric signals, such as electrooculogram (EOG), electromyogram (EMG), electroencephalogram (EEG) and electrocardiogram (ECG) signals.

The proposed front-end is integrated with dry electrodes, a microprocessor and a wireless link to build a battery-powered E-patch for long-term and continuous monitoring. In-vivo field trials with dry electrodes while sitting, standing, walking and running slowly show that the quality of the ECG signal sensed by the E-patch satisfies the requirements for preventive cardiac care.

Finally, a wireless multimodal bio-electric sensor system is presented. Enabled by a customized flexible mixed-signal system-on-chip (SoC), this bio-electric sensor system can be configured for ECG/EMG/EEG recording, bio-impedance sensing, weak-current stimulation, and other functions with biofeedback. The customized SoC, fabricated in a 0.18 μm CMOS technology, integrates a tunable analog front-end, a 10-bit ADC, a 14-bit sigma-delta digital-to-current converter, a 12-bit digital-to-voltage converter, a digital accelerator for wavelet transformation and data compression, and a serial communication protocol. Measurement results indicate that the SoC can support the versatile bio-electric sensor in various applications according to specific requirements.
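The motivation for digital calibration of a pipelined stage can be shown with a deliberately simplified, foreground-style sketch: the interstage gain deviates from its nominal value, and a digital correction factor estimated from known inputs restores the accuracy. This is only a generic illustration, not the time-spreading technique developed in the thesis.

    # Simplified foreground gain-error calibration of one 2-bit pipelined stage.
    # The gain values and the foreground scheme are illustrative, not the thesis method.
    import numpy as np

    rng = np.random.default_rng(0)
    G_NOMINAL, g_actual = 4.0, 3.82          # nominal vs. actual interstage gain

    def first_stage(x):
        """2-bit sub-ADC decision plus amplified residue, input x in [0, 1)."""
        d = np.minimum(3, np.floor(4 * x)).astype(int)
        return d, g_actual * (x - d / 4.0)

    # Foreground calibration: apply known inputs and estimate the actual gain.
    x_cal = rng.uniform(0, 1, 1000)
    d, res = first_stage(x_cal)
    mask = np.abs(x_cal - d / 4.0) > 1e-3    # avoid near-zero residues
    g_est = np.mean(res[mask] / (x_cal[mask] - d[mask] / 4.0))

    # Reconstruct a test signal with and without the digital correction.
    x = rng.uniform(0, 1, 10000)
    d, res = first_stage(x)
    x_nominal = d / 4.0 + res / G_NOMINAL
    x_corrected = d / 4.0 + res / g_est
    print("rms error, nominal gain   :", np.sqrt(np.mean((x_nominal - x) ** 2)))
    print("rms error, calibrated gain:", np.sqrt(np.mean((x_corrected - x) ** 2)))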

APA, Harvard, Vancouver, ISO, and other styles
46

Gómez, Arias Jaime Andrés. "Modelación y calibración de tránsito usando el software PTV VISSIM: estudio de caso de una intersección vial en la ciudad de Guimarães, Portugal." Master's thesis, 2017. http://hdl.handle.net/1822/70681.

Full text
Abstract:
Integrated master's dissertation in Civil Engineering (area of specialisation: Transport Planning and Infrastructure)
Traffic micro-simulation is a tool that facilitates the study and analysis of a city's road network in order to evaluate and improve its operational performance and increase the capacity of the roads. It also helps to plan the road network of cities better and to make the best decisions when there are several options for the construction or redesign of roads, intersections or interchanges. The present work presents the study of a road intersection controlled by traffic lights and priority rules in the city of Guimarães, Portugal, by means of traffic micro-simulation with the PTV VISSIM 9.0 software. To carry out the analysis, it was necessary to collect field data, adjust values for the main parameters used by the software, and perform an adequate calibration and validation of the model created. With the calibrated and validated model, it was possible to propose solutions and evaluate alternative scenarios regarding increases in vehicular volume and improvements in the timing of the traffic-light phases.
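One common acceptance check when calibrating such a micro-simulation model, although the abstract does not state which criterion was used here, is the GEH statistic between simulated and observed link volumes; the counts below are invented placeholders.

    # GEH check between simulated and observed hourly volumes (veh/h).
    # Link names and counts are invented placeholders.
    import math

    observed  = {"approach_N": 820, "approach_S": 640, "approach_E": 410}
    simulated = {"approach_N": 790, "approach_S": 700, "approach_E": 395}

    def geh(m, c):
        """GEH statistic for simulated volume m and observed count c."""
        return math.sqrt(2 * (m - c) ** 2 / (m + c))

    for link, c in observed.items():
        g = geh(simulated[link], c)
        print(f"{link}: GEH = {g:.2f} -> {'OK' if g < 5 else 're-calibrate'}")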
APA, Harvard, Vancouver, ISO, and other styles
