Dissertations / Theses on the topic 'Data processing pipeline'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 31 dissertations / theses for your research on the topic 'Data processing pipeline.'
Jakubiuk, Wiktor. "High performance data processing pipeline for connectome segmentation." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/106122.
Full text"December 2015." Cataloged from PDF version of thesis.
Includes bibliographical references (pages 83-88).
By investigating neural connections, neuroscientists try to understand the brain and reconstruct its connectome. Automated connectome reconstruction from high-resolution electron microscopy is a challenging problem, as all neurons and synapses in a volume have to be detected. One cubic millimeter of high-resolution brain tissue takes roughly a petabyte of space, which state-of-the-art pipelines are unable to process to date. A high-performance, fully automated image processing pipeline is proposed. Using a combination of image processing and machine learning algorithms (convolutional neural networks and random forests), the pipeline constructs a 3-dimensional connectome from 2-dimensional cross-sections of a mammal's brain. The proposed system achieves a low error rate (comparable with the state of the art) and is capable of processing volumes hundreds of gigabytes in size. The main contributions of this thesis are multiple algorithmic techniques for 2-dimensional pixel classification at varying accuracy and speed trade-offs, as well as a fast object segmentation algorithm. The majority of the system is parallelized for multi-core machines, and with minor additional modification it is expected to work in a distributed setting.
by Wiktor Jakubiuk.
M. Eng. in Computer Science and Engineering
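As a rough illustration of the 2-dimensional pixel classification idea this abstract describes, the sketch below trains a random forest on per-pixel intensity features, one of the classifier families the thesis combines with convolutional networks. The image, labels, and feature choices are invented stand-ins, not the thesis's actual pipeline.

```python
# Illustrative sketch only: per-pixel membrane classification on a 2D
# EM section with a random forest. Data and features are hypothetical.
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
section = rng.random((64, 64))            # stand-in for an EM cross-section
labels = (section > 0.5).astype(int)      # stand-in for ground-truth membranes

# Per-pixel features: raw intensity plus two Gaussian-smoothed scales.
features = np.stack(
    [section, gaussian_filter(section, 1), gaussian_filter(section, 2)],
    axis=-1,
).reshape(-1, 3)

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(features, labels.ravel())
membrane_prob = clf.predict_proba(features)[:, 1].reshape(section.shape)
print(membrane_prob.shape)  # (64, 64) probability map fed to segmentation
```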
Nakane, Takanori. "Data processing pipeline for serial femtosecond crystallography at SACLA." Kyoto University, 2017. http://hdl.handle.net/2433/217997.
Gu, Wenyu. "Improving the performance of stream processing pipeline for vehicle data." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-284547.
The growing amount of position-dependent data (containing both geo-position data (i.e., latitude, longitude) and vehicle/driver-related information) collected from sensors on vehicles poses a challenge for computer programs that must process the total volume of data from many vehicles. While handling this growing amount of data, the programs must exhibit low latency and high throughput; otherwise the value of the processing results decreases. As a solution, big data and cloud computing technologies have been widely adopted by industry. This thesis examines a cloud-based processing pipeline that processes vehicle location data. The system receives vehicle data in real time and processes it in a streaming fashion. The goal is to improve the performance of this streaming pipeline, mainly with respect to latency and cost. The work began by examining the current solution built on AWS Kinesis and AWS Lambda. A benchmarking environment was created and used to measure the current system's performance. In addition, a literature study was conducted to find a processing framework that best meets both industrial and academic requirements. After a comparison, Flink was chosen as the new framework, and a new solution was designed around it. The performance of the current solution and the new Flink solution was then compared in the same benchmarking environment. The conclusion is that the new Flink solution has 86.2% lower latency while supporting triple the capacity of the current system at nearly the same cost.
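A minimal sketch of the kind of end-to-end latency measurement such a benchmarking environment relies on; the event format and processing stage are hypothetical stand-ins, not the AWS Kinesis/Lambda or Flink pipelines themselves.

```python
# Toy latency benchmark: timestamp each event at creation, run it through
# a stand-in processing stage, and report percentile latencies.
import time
import statistics

def process(event):            # hypothetical pipeline stage
    return {**event, "enriched": True}

latencies = []
for i in range(1000):
    event = {"vehicle_id": i % 10, "created_at": time.perf_counter()}
    process(event)
    latencies.append(time.perf_counter() - event["created_at"])

print(f"p50 latency: {statistics.median(latencies) * 1e6:.1f} us")
print(f"p99 latency: {statistics.quantiles(latencies, n=100)[98] * 1e6:.1f} us")
```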
González, Alejandro. "A Swedish Natural Language Processing Pipeline For Building Knowledge Graphs." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-254363.
Knowing about knowledge is part of what defines the modern human (who knows that she knows). Immaterial concepts independent of material attributes are part of the evidence that the human is a spiritual being that is, to some extent, independent of the material. Current research efforts in artificial intelligence try to mimic human behavior with computers by "teaching" them how to read and understand human language, using machine learning techniques related to human language processing. However, a significant number of challenges remain, for example how to represent this knowledge so that a machine can use it to draw conclusions or provide answers. This thesis presents a study on the use of Natural Language Processing in a pipeline that can generate a knowledge representation of information using the Swedish language as its basis. The result is a system that, given Swedish text in raw format, builds a representation of the knowledge or information in that text in the form of a knowledge graph.
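To illustrate the final step of such a pipeline, the toy sketch below assembles already-extracted (subject, relation, object) triples into a small knowledge graph; the triples and the adjacency-list representation are invented, and the hard part (parsing Swedish text into triples) is assumed to be done upstream.

```python
# Toy knowledge-graph assembly from hypothetical extracted triples.
from collections import defaultdict

triples = [
    ("Stockholm", "is_capital_of", "Sverige"),
    ("Sverige", "is_part_of", "Skandinavien"),
]

graph = defaultdict(list)          # adjacency-list knowledge graph
for subj, rel, obj in triples:
    graph[subj].append((rel, obj))

for subj, edges in graph.items():
    for rel, obj in edges:
        print(f"({subj}) -[{rel}]-> ({obj})")
```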
SHARMA, DIVYA. "APPLICATION OF ML TO MAKE SENCE OF BIOLOGICAL BIG DATA IN DRUG DISCOVERY PROCESS." Thesis, DELHI TECHNOLOGICAL UNIVERSITY, 2021. http://dspace.dtu.ac.in:8080/jspui/handle/repository/18378.
Patuzzi, Ilaria. "16S rRNA gene sequencing sparse count matrices: a count data simulator and optimal pre-processing pipelines." Doctoral thesis, Università degli studi di Padova, 2018. http://hdl.handle.net/11577/3426369.
The study of microbial communities has changed profoundly since it was first proposed in the 17th century. When the fundamental role of microbes in regulating and causing human disease became evident, researchers began developing a variety of techniques to isolate and culture bacteria in the laboratory, with the aim of characterizing and classifying them. In the late 1970s, a breakthrough in how bacterial communities were studied came with the discovery that the genes encoding ribosomal RNA (rRNA) could be used as molecular markers for classifying organisms. A few decades later, the advent of DNA sequencing technology revolutionized the study of microbial communities, enabling a culture-independent, comprehensive view of the community contained in a sample. Today, one of the most widespread approaches to microbial community profiling is based on sequencing the gene encoding the 16S subunit of the prokaryotic ribosome (the 16S rRNA gene). Because the ribosome plays an essential role in prokaryotic life, it is ubiquitous across bacteria, but its exact DNA sequence is unique to each species; it is therefore used as a kind of molecular fingerprint to assign a taxonomic characterization to each member of the community. The advent of Next Generation Sequencing (NGS) platforms, capable of producing enormous amounts of data while reducing time and cost, has made 16S rRNA gene sequencing (16S rDNA-Seq) the method of choice for microbiome studies. Nevertheless, the continuous development of both experimental and computational procedures for 16S rDNA-Seq has led to an inevitable lack of standardization in the treatment and analysis of sequencing data. This is further complicated by the peculiar characteristics of the matrix in which sample information is typically summarized after sequencing. The instrumental limit on the maximum number of obtainable sequences makes 16S rDNA-Seq data compositional, that is, data in which the detected abundance of each bacterial species depends on the presence levels of the other populations in the sample. Moreover, 16S rDNA-Seq matrices are typically very sparse (70-95% null values), owing both to biological diversity across samples and to the loss of information on rare species during sequencing, an effect that depends strongly on the usually skewed distribution of species abundances in microbiomes and on the number of samples sequenced in the same run (the so-called multiplexing level). These peculiarities make the common practice of borrowing tools and approaches from bulk RNA sequencing inadequate for analyzing 16S rDNA-Seq count matrices. In particular, non-specific pre-processing steps such as normalization risk introducing strong biases in the case of very sparse matrices. The main objective of this thesis was to identify optimal analysis pipelines that fill these gaps in order to obtain solid and reliable conclusions from 16S rRNA-Seq data analysis.
Among all the steps included in a typical analysis pipeline, this project focused on the pre-processing of count matrices obtained from 16S rDNA-Seq experiments. This goal was reached through several steps. First, state-of-the-art methods for pre-processing 16S rDNA-Seq count data were identified through a thorough literature search, which revealed a minimal availability of specific tools and the complete absence, in the usual 16S rDNA-Seq analysis pipeline, of a pre-processing step that recovers the information lost to sequencing (zero-imputation). The literature search also showed that no specific simulators were available to directly generate synthetic 16S rDNA-Seq count data on which to identify optimal pre-processing pipelines. Consequently, a simulator of sparse 16S rDNA-Seq count matrices was developed that accounts for the compositional nature of these data. A comprehensive comparative analysis of forty-nine pre-processing pipelines was then designed and performed to evaluate the performance of the most commonly used and most recent pre-processing approaches and to verify the appropriateness of including a zero-imputation step in 16S rDNA-Seq analyses. Overall, this thesis addresses the problem of pre-processing 16S rDNA-Seq data and provides a useful guide to robust data pre-processing in a 16S rDNA-Seq analysis. The proposed simulator could also be a valuable stimulus and tool for researchers developing and testing bioinformatics methods, helping to fill the lack of tools specific to 16S rDNA-Seq data.
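A minimal sketch of the core mechanism such a 16S count-data simulator has to reproduce: skewed true abundances sampled at a fixed sequencing depth yield compositional, highly sparse count matrices. All parameters are illustrative; this is not the simulator proposed in the thesis.

```python
# Fixed-depth multinomial sampling of skewed abundances produces
# compositional, sparse counts, mimicking 16S rDNA-Seq matrices.
import numpy as np

rng = np.random.default_rng(42)
n_samples, n_species, depth = 20, 500, 10_000

# Skewed "true" relative abundances drawn from a lognormal.
abund = rng.lognormal(mean=0.0, sigma=2.0, size=(n_samples, n_species))
abund /= abund.sum(axis=1, keepdims=True)

# Fixed sequencing depth: rare species often drop to zero counts.
counts = np.stack([rng.multinomial(depth, p) for p in abund])
print(f"zero fraction: {(counts == 0).mean():.1%}")  # high, as in real data
```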
NIGRI, ANNA. "Quality data assessment and improvement in pre-processing pipeline to minimize impact of spurious signals in functional magnetic imaging (fMRI)." Doctoral thesis, Politecnico di Torino, 2017. http://hdl.handle.net/11583/2911412.
Torkler, Phillipp [Verfasser], and Johannes [Akademischer Betreuer] Söding. "STAMMP : A statistical model and processing pipeline for PAR-CLIP data reveals transcriptome maps of mRNP biogenesis factors / Phillipp Torkler. Betreuer: Johannes Söding." München : Universitätsbibliothek der Ludwig-Maximilians-Universität, 2015. http://d-nb.info/1072376628/34.
Maarouf, Marwan Younes. "XML Integrated Environment For Service-Oriented Data Management." Wright State University / OhioLINK, 2007. http://rave.ohiolink.edu/etdc/view?acc_num=wright1180450288.
Severini, Nicola. "Analysis, Development and Experimentation of a Cognitive Discovery Pipeline for the Generation of Insights from Informal Knowledge." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2020. http://amslaurea.unibo.it/21013/.
Lundgren, Therese. "Digitizing the Parthenon using 3D Scanning : Managing Huge Datasets." Thesis, Linköping University, Department of Science and Technology, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2636.
Digitizing objects and environments from the real world has become an important part of creating realistic computer graphics. Through the use of structured lighting and laser time-of-flight measurements, the capturing of geometric models is now a common process. The results are visualizations where viewers gain new possibilities for both visual and intellectual experiences.
This thesis presents the reconstruction of the Parthenon temple and its environment in Athens, Greece, using a 3D laser-scanning technique.
In order to reconstruct a realistic model using 3D scanning techniques, the acquired datasets have to be processed in various phases. The data has to be organized, registered, and integrated, in addition to pre- and post-processing. This thesis describes the development of a suitable and efficient data processing pipeline for the given data.
The approach differs from previous scanning projects in that this large-scale object is digitized at very high resolution. In particular, the issue of managing and processing huge datasets is described.
Finally, the processing of the datasets in the different phases and the resulting 3D model of the Parthenon are presented and evaluated.
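One standard tactic for the huge-dataset problem described here is spatial subsampling before registration; the sketch below voxel-grids a synthetic point cloud and keeps one point per occupied voxel. The data and voxel size are hypothetical, not the thesis's pipeline.

```python
# Voxel-grid subsampling: bucket points by voxel index and keep one
# representative per occupied voxel to shrink a scan before registration.
import numpy as np

rng = np.random.default_rng(1)
points = rng.random((1_000_000, 3)) * 100.0   # stand-in for one laser scan

voxel = 0.5                                   # voxel edge length (meters)
keys = np.floor(points / voxel).astype(np.int64)
_, first_idx = np.unique(keys, axis=0, return_index=True)
reduced = points[first_idx]
print(f"{len(points)} -> {len(reduced)} points")
```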
Ayala Cabrera, David. "Characterization of components of water supply systems from GPR images and tools of intelligent data analysis." Doctoral thesis, Universitat Politècnica de València, 2015. http://hdl.handle.net/10251/59235.
Over time, and as a result of many operational and maintenance activities, the networks of water supply systems (WSSs) undergo interventions and modifications, or are even decommissioned, without these activities being properly recorded in many cases. Knowledge of the layouts and characteristics (condition and age, among others) of the pipes in WSSs is clearly necessary for efficient and dynamic management of such systems. Added to this problem is the detection and control of water leaks. Access to reliable information about leaks is a complex task; in many cases, leaks are detected only when damage to the network is already considerable, which entails high social and economic costs. Non-destructive methods (for example, ground penetrating radar - GPR) can be an answer to these problems, since they make it possible, as this thesis shows, to locate pipe layouts, identify characteristics of components, and detect water leaks while they are still insignificant. The selection of GPR in this work is justified by its characteristics as a non-destructive technique that can study both metallic and non-metallic objects. Although capturing information with GPR is usually successful, the capture configuration, the large volume of information, and the use and interpretation of that information demand a high level of skill and experience from the personnel. This doctoral thesis is conceived as a step towards tools that address the lack of knowledge about the buried assets of WSSs. Its main objective is therefore to generate tools and evaluate the feasibility of applying them to characterize the components of a WSS from GPR images. We carried out laboratory tests specifically designed to propose, develop, and evaluate methodologies for characterizing buried WSS components, as well as field tests that made it possible to determine the feasibility of applying such methodologies under uncontrolled conditions. The methodologies developed are based on intelligent data analysis techniques. The basic principle of this work has been the proper treatment of the data obtained with GPR in order to extract information useful to WSSs about their components, with special emphasis on pipes. After numerous activities, it can be concluded that it is feasible to obtain more information from GPR images than is currently obtained through typical hyperbola identification, and that this information can be observed directly and more simply by means of the methodologies proposed in this doctoral work. These methodologies also proved that it is feasible to identify patterns (especially pre-processing with the Agent race algorithm) that provide a fairly accurate approximation of the location of water leaks in WSSs, and, in the case of pipes, to obtain further characteristics such as diameter and material. As a result of this thesis, a series of tools have been developed to visualize, identify, and locate WSS components from GPR images.
The most interesting result is that the outputs are synthesized and reduced in a way that preserves the characteristics of the different components recorded in the GPR images. The ultimate goal is for the developed tools to support decision-making in the technical management of WSSs and to be operable even by personnel with limited experience in handling non-destructive methodologies, specifically GPR.
Ayala Cabrera, D. (2015). Characterization of components of water supply systems from GPR images and tools of intelligent data analysis [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/59235
Award-winning thesis.
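In the spirit of extracting more from GPR B-scans than hyperbola picking, the toy sketch below flags traces whose energy deviates from the scan's baseline, a crude stand-in for the pattern-recognition ideas (such as the Agent race pre-processing) explored in the thesis; the B-scan is synthetic.

```python
# Flag anomalous GPR traces (columns) by z-scoring per-trace energy.
import numpy as np

rng = np.random.default_rng(7)
bscan = rng.normal(0, 1, size=(256, 200))     # rows: time samples, cols: traces
bscan[:, 90:110] += 3.0                       # synthetic anomaly (e.g., a leak)

energy = (bscan ** 2).sum(axis=0)
z = (energy - energy.mean()) / energy.std()
suspect = np.where(z > 2.0)[0]
print(f"suspect traces: {suspect.min()}..{suspect.max()}")
```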
Eriksson, Caroline, and Emilia Kallis. "NLP-Assisted Workflow Improving Bug Ticket Handling." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-301248.
In software development, substantial resources go into troubleshooting, a process in which previous solutions can help resolve current problems. Reading the bug reports that contain this information is often time-consuming. To minimize the time spent on troubleshooting and to ensure that knowledge from previous solutions is preserved within the company, we evaluated whether summaries could make this more efficient. Abstractive and extractive summarization models were tested for the task, and the bert-extractive-summarizer was fine-tuned. The generated summaries were compared with respect to perceived quality, generation speed, similarity to one another, and summary length. The average summary contained parts of the most important information, and the proposed solution was either well documented or did not address the problem description at all. The fine-tuned BERT and the abstractive model BART showed good potential for generating summaries containing all of the most important information.
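A minimal usage sketch of the extractive baseline the thesis fine-tunes, assuming the bert-extractive-summarizer package is installed (it downloads a pretrained BERT on first use); the bug-ticket text is invented.

```python
# Extract the most salient sentence from a (made-up) bug ticket.
from summarizer import Summarizer

ticket = (
    "The exporter crashes when the input file is empty. "
    "A null check in the parser was missing. "
    "Adding a guard before tokenization fixed the crash."
)

model = Summarizer()
summary = model(ticket, num_sentences=1)   # keep one key sentence
print(summary)
```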
Heidar, Ryad. "Architectures pour le traitement d'images." Grenoble INPG, 1989. http://www.theses.fr/1989INPG0076.
Roy, Simon A. "Data processing pipelines tailored for imaging Fourier-transform spectrometers." Thesis, Université Laval, 2008. http://www.theses.ulaval.ca/2008/25682/25682.pdf.
Giovanelli, Joseph. "AutoML: A new methodology to automate data pre-processing pipelines." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2020. http://amslaurea.unibo.it/20422/.
Tallberg, Sebastian. "A COMPARISON OF DATA INGESTION PLATFORMS IN REAL-TIME STREAM PROCESSING PIPELINES." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-48744.
Harrison, William. "Malleability, obliviousness and aspects for broadcast service attachment." Universität Potsdam, 2010. http://opus.kobv.de/ubp/volltexte/2010/4138/.
Du, Wei. "Advanced middleware support for distributed data-intensive applications." Connect to resource, 2005. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1126208308.
Title from first page of PDF file. Document formatted into pages; contains xix, 183 p.; also includes graphics (some col.). Includes bibliographical references (p. 170-183). Available online via OhioLINK's ETD Center.
Li, Yunming. "Machine vision algorithms for mining equipment automation." Thesis, Queensland University of Technology, 2000.
Hobson, Alan George Cawood. "Optimising the renewal of natural gas reticulation pipes using GIS." Thesis, Stellenbosch : Stellenbosch University, 2002. http://hdl.handle.net/10019.1/52980.
A major concern for Energex, Australia's largest energy utility in South East Queensland, is the escape of natural gas from its reticulation systems. Within many of the older areas of Brisbane, these networks operate primarily at low and medium pressure, with a significant percentage of mains being cast iron or steel. Over many years pipes in these networks have been replaced, yet reports show that unaccounted-for gas from the same networks remains high. Furthermore, operation and maintenance budgets for these networks are high, with many of these pipes close to the end of their economic life. When operation and maintenance costs exceed the costs of replacement, the Energex gas utility initiates projects to renew reticulation networks with polyethylene pipes. Making decisions about pipe renewal requires an evaluation of historical records from a number of sources, namely:
• gas consumption figures,
• history of leaks,
• maintenance and other related costs, and
• the loss of revenue contributed by unaccounted-for gas.
Financial justification of capital expenditure has always been a requirement for renewal projects at the Energex gas utility; however, the impact of deregulation in the energy utility market has necessitated a review of their financial assessment for capital projects. The Energex gas utility has developed an application that evaluates the financial viability of renewal projects. This research demonstrates the role of GIS integration with the Energex financial application. The results of this study showed that a GIS-integrated renewal planning approach brings significant benefits, including:
• efficient selection of a sub-network based on pipe connectivity,
• discovery of hidden relationships between spatially enabled alphanumeric data and environmental information that improves decision making, and
• enhanced testing of proposed renewal design options by scrutinizing the attributes of spatial data.
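The renewal decision described in this abstract reduces to comparing the discounted cost of continued maintenance against replacement; the sketch below shows that arithmetic with entirely hypothetical figures and discount rate, not the Energex application itself.

```python
# Present-value comparison: keep maintaining an old main vs. renew it.
def present_value(annual_cost: float, rate: float, years: int) -> float:
    return sum(annual_cost / (1 + rate) ** t for t in range(1, years + 1))

maintain_pv = present_value(annual_cost=12_000, rate=0.06, years=15)
replace_pv = 95_000 + present_value(annual_cost=1_500, rate=0.06, years=15)

print(f"maintain: {maintain_pv:,.0f}  replace: {replace_pv:,.0f}")
print("renew" if replace_pv < maintain_pv else "keep maintaining")
```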
Corista, Pedro André da Silva. "IoT data processing pipeline in FoF perspective." Master's thesis, 2017. http://hdl.handle.net/10362/28225.
Santos, João Guilherme Basílio dos. "Photometry Data Processing for ESA's CHEOPS space mission." Master's thesis, 2018. http://hdl.handle.net/10316/86206.
The research surrounding the search for and study of extra-solar planets is growing quickly and is defined as one of the main priorities of the European Space Agency (ESA) for the Cosmic Vision 2015-2025 programme, with missions such as CHEOPS to be launched in the beginning of 2019, PLATO (PLAnetary Transits and Oscillations of stars) around 2026 and ARIEL (Atmospheric Remote-sensing Exoplanet Large-survey) in 2028. These missions will be fundamental for the comprehension and study of the planetary science field. The role of Charge-Coupled Devices (CCDs) in this field is well established, as these detectors have been and will be extensively used in both space and ground-based surveys. Their high efficiency in the optical region of the electromagnetic spectrum is one of the most important factors for the use of this detector. In this work the CCD and its main characteristics are presented. Throughout the years it has been necessary not only to collect and store the information gathered by these detectors, but also to pre-process the collected raw data, removing errors that naturally arise from the detector's and electronics' intrinsic characteristics, as well as from environmental effects. This correction process later facilitates the job of data analysis done by the science teams. The CHEOPS mission is no exception, since a data reduction pipeline (hereafter DRP) has been developed for it. This pipeline is a software package developed in the Python programming language that corrects the raw images for undesired instrumental and astrophysical signals and then extracts the light curve of the target star. The light curve represents the variation of the flux received from a star with time. It may be used in exoplanet science to study the flux changes created by a planet transiting its host star.
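A minimal sketch of the DRP's final step as described here: bias-correcting simulated frames and extracting a normalized light curve by aperture photometry. Frame sizes, bias level, and the fake 1% transit are all invented stand-ins for illustration.

```python
# Aperture photometry on simulated frames: correct, sum flux, normalize.
import numpy as np

rng = np.random.default_rng(3)
n_frames, size, bias = 100, 32, 300.0
yy, xx = np.mgrid[:size, :size]
aperture = (yy - 16) ** 2 + (xx - 16) ** 2 <= 5 ** 2   # circular aperture

flux = []
for t in range(n_frames):
    star = 1000.0 * (0.99 if 40 <= t < 50 else 1.0)    # fake 1% transit
    frame = bias + rng.normal(0, 2, (size, size))      # bias + read noise
    frame[aperture] += star / aperture.sum()
    corrected = frame - bias                           # bias correction
    flux.append(corrected[aperture].sum())

flux = np.array(flux) / np.median(flux)                # normalized light curve
print(f"transit depth ~ {1 - flux[40:50].mean():.3%}")
```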
Morris, Joseph P. "An analysis pipeline for the processing, annotation, and dissemination of expressed sequence tags." 2009. http://etd.louisville.edu/data/UofL0482t2009.pdf.
Title and description from thesis home page (viewed May 22, 2009). Department of Computer Engineering and Computer Science. Vita. "May 2009." Includes bibliographical references (p. 39-41).
Betrabet, Siddhant Srinath. "Data Acquisition and Processing Pipeline for E-Scooter Tracking Using 3D LIDAR and Multi-Camera Setup." Thesis, 2021.
Analyzing the behavior of objects on the road is a complex task that requires data from various sensors and their fusion to recreate the movement of objects with a high degree of accuracy. A data collection and processing system is thus needed to track the objects accurately in order to make an accurate and clear map of the trajectories of objects relative to various coordinate frame(s) of interest in the map. Detection and tracking of moving objects (DATMO) and simultaneous localization and mapping (SLAM) are the tasks that need to be achieved in conjunction to create a clear map of the road comprising the moving and static objects.
These computational problems are commonly solved and used to aid scenario reconstruction for the objects of interest. The tracking of objects can be done in various ways, utilizing sensors such as monocular or stereo cameras, Light Detection and Ranging (LIDAR) sensors, and Inertial Navigation System (INS) rigs. One relatively common method for solving DATMO and SLAM involves utilizing a 3D LIDAR and multiple monocular cameras in conjunction with an inertial measurement unit (IMU), which allows for redundancies to maintain object classification and tracking with the help of sensor fusion in cases when sensor-specific traditional algorithms prove ineffectual because either sensor falls short due to its limitations. The use of the IMU and sensor fusion methods largely eliminates the need for an expensive INS rig. Fusion of these sensors allows for more effective tracking that utilizes the maximum potential of each sensor while allowing for methods to increase perceptual accuracy.
The focus of this thesis is the dock-less e-scooter, and the primary goal is to track its movements effectively and accurately with respect to cars on the road and the world. Since it is relatively more common to observe a car on the road than an e-scooter, we propose a data collection system that can be built on top of an e-scooter, together with an offline processing pipeline that can be used to collect data in order to understand the behaviors of the e-scooters themselves. In this thesis, we explore a data collection system involving a 3D LIDAR sensor, multiple monocular cameras, and an IMU on an e-scooter, as well as an offline method for processing the data to aid scenario reconstruction.
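As a small illustration of the offline processing such a multi-sensor rig needs, the sketch below pairs each LIDAR sweep with the nearest camera frame by timestamp; the rates and timestamps are synthetic, and the real pipeline also fuses IMU data.

```python
# Nearest-timestamp association between 10 Hz LIDAR and 30 Hz camera streams.
import bisect

lidar_ts = [0.00, 0.10, 0.20, 0.30]            # 10 Hz LIDAR sweeps
camera_ts = [i * 1 / 30 for i in range(12)]    # 30 Hz camera frames

def nearest(sorted_ts, t):
    i = bisect.bisect_left(sorted_ts, t)
    candidates = sorted_ts[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda c: abs(c - t))

for lt in lidar_ts:
    ct = nearest(camera_ts, lt)
    print(f"lidar @ {lt:.3f}s  <->  camera @ {ct:.3f}s")
```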
Betrabet, Siddhant S. "Data Acquisition and Processing Pipeline for E-Scooter Tracking Using 3d Lidar and Multi-Camera Setup." Thesis, 2020. http://hdl.handle.net/1805/24776.
Kaever, Peter, Wolfgang Oertel, Axel Renno, Peter Seidel, Markus Meyer, Markus Reuter, and Stefan König. "A Versatile Sensor Data Processing Framework for Resource Technology." 2021. https://htw-dresden.qucosa.de/id/qucosa%3A75233.
Novel sensors with the ability to collect qualitatively new information offer the potential to improve experimental infrastructure and methods in the field of research technology. In order to get full access to this information, the entire range from detector readout and data transfer over proper data and knowledge models up to complex application functions has to be covered. The extension of existing scientific instruments comprises the integration of diverse sensor information into existing hardware, based on the expansion of pivotal event schemes and data models. Due to its flexible approach, the proposed framework has the potential to integrate additional sensor types and offers migration capabilities to high-performance computing platforms. Two different implementation setups prove the flexibility of this approach: one extends the material-analyzing capabilities of a secondary ion mass spectrometry device, the other implements a functional prototype setup for the online analysis of recyclate. Both setups can be regarded as two complementary parts of a highly topical and ground-breaking unique scientific application field. The requirements and possibilities resulting from different hardware concepts on one hand and diverse application fields on the other are the basis for the development of a versatile software framework. In order to support complex and efficient application functions under heterogeneous and flexible technical conditions, a software technology is proposed that offers modular processing pipeline structures with internal and external data interfaces, backed by a knowledge base with respective configuration and conclusion mechanisms.
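A minimal sketch of the modular processing-pipeline idea the framework is built around: independently swappable stages chained over a shared record. Stage names and the record format are invented, not the framework's API.

```python
# Pluggable pipeline stages chained over a shared data record.
from typing import Callable, Dict, List

Record = Dict[str, float]
Stage = Callable[[Record], Record]

def calibrate(rec: Record) -> Record:
    return {**rec, "value": rec["value"] * rec.get("gain", 1.0)}

def threshold(rec: Record) -> Record:
    return {**rec, "event": float(rec["value"] > 0.8)}

def run_pipeline(stages: List[Stage], rec: Record) -> Record:
    for stage in stages:         # each stage is swappable/configurable
        rec = stage(rec)
    return rec

print(run_pipeline([calibrate, threshold], {"value": 0.6, "gain": 1.5}))
```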
Kikta, Marcel. "Vyhodnocování relačních dotazů v proudově orientovaném prostředí." Master's thesis, 2014. http://www.nusl.cz/ntk/nusl-341206.
Karlsson, Christoffer. "The performance impact from processing clipped triangles in state-of-the-art games." Thesis, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:bth:diva-16853.
Wilson, Derek Alan. "A Dredging Knowledge-Base Expert System for Pipeline Dredges with Comparison to Field Data." 2010. http://hdl.handle.net/1969.1/ETD-TAMU-2010-12-8653.
Full text"Data processing pipelines tailored for imaging Fourier-transform spectrometers." Thesis, Université Laval, 2008. http://www.theses.ulaval.ca/2008/25682/25682.pdf.