Dissertations / Theses on the topic 'Raw data'

Consult the top 50 dissertations / theses for your research on the topic 'Raw data'.

1

Julardzija, Mirhet. "Processing RAW image data in mobile units." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-27724.

2

Akbar, Adnan. "Extracting knowledge from raw IoT data streams." Thesis, University of Surrey, 2018. http://epubs.surrey.ac.uk/845475/.

Abstract:
As sensors are adopted in almost every field of life, the Internet of Things (IoT) is triggering a massive influx of data. This large amount of data is of little value until it is processed intelligently to extract high-level knowledge that can be used to make decisions. The process of knowledge extraction from data streams is complex, predominantly due to heterogeneous data sources, unreliable networks and real-time processing requirements. Several recent studies have shown that solutions based on complex event processing (CEP) have the potential to extract high-level knowledge from these data streams. However, the use of CEP for IoT applications is still in an early phase and faces many challenges. First, CEP applications are intended to provide reactive solutions by correlating data streams using predefined rules as the events happen. As the notion of many IoT applications is changing from reactive to proactive, where complex events can be predicted before they actually happen, solutions based on CEP require an extension to address this issue. To this end, this work proposes a proactive method based on CEP and machine learning (ML), in which historical data is exploited by the ML component and combined with the real-time flow of CEP to provide the basis for predictive event processing. Second, systems based on CEP deploy static rules, with no means of updating the rules automatically according to the current context. To address this issue, this thesis proposes a novel ML-based method to find CEP rules automatically and update them according to the current context. Third, in state-of-the-art CEP systems, events are correlated using absolute rules, so a detected complex event is either true or false. Given the sporadic nature of IoT, missing and uncertain data is a common phenomenon, and today's CEP systems are unable to take this inherent uncertainty of real-world events into account when making decisions. This thesis addresses the issue by proposing a probabilistic event processing approach that extends state-of-the-art CEP with Bayesian networks (BNs). Finally, the size and complexity of IoT data present a generic challenge that is addressed throughout the thesis. The above-mentioned contributions were evaluated using real-world data collected from heterogeneous sources to prove the accuracy and reliability of the proposed methods, and their feasibility and applicability were demonstrated by implementing them for real-world applications. The work presented in this thesis significantly improves on state-of-the-art methods and provides a fundamental building block towards extracting knowledge from raw IoT data streams.
3

Rothschedl, Christopher, Roland Ritt, Paul O'Leary, Matthew Harker, Michael Habacher, and Michael Brandner. "Real-time-data analytics in raw materials handling." Technische Universität Bergakademie Freiberg, Universitätsbibliothek "Georgius Agricola", 2018. http://nbn-resolving.de/urn:nbn:de:bsz:105-qucosa-231350.

Abstract:
This paper proposes a system for the ingestion and analysis of real-time sensor and actuator data from bulk materials handling plants and machinery. It references issues that concern mining sensor data in cyber-physical systems (CPS) as addressed in O’Leary et al. [2015].
4

Tse, Rebecca. "Three-dimensional building reconstruction from raw LIDAR data." Thesis, University of South Wales, 2008. https://pure.southwales.ac.uk/en/studentthesis/threedimensional-building-reconstruction-from-raw-lidar-data(d486c0a1-d4bd-4eb3-a81b-39ab0b23007e).html.

Abstract:
Airborne Laser Scanning is an advanced surveying technology (also called Light Detection and Ranging, LIDAR) that mounts a laser scanner on an aircraft. The aircraft scans the Earth's surface and captures data by emitting light pulses onto terrain objects and receiving their reflections. The captured data are three-dimensional (3D); however, no extra information is provided to describe them, so additional algorithms are needed to extract meaningful and useful information from the data. The popularity of LIDAR has attracted researchers to develop algorithms for 3D building reconstruction in Geographical Information Systems. The limited information provided by the data makes building-boundary and roof-structure extraction essential tasks when analysing it. This research examines the limitations of different algorithms for extracting building outlines and remodelling roof structures solely from LIDAR data, and suggests an alternative approach for reconstructing buildings from raw LIDAR data. Most current methods use additional data sources (e.g. cadastral data, aerial photos, or satellite images) and pre-defined building models to reconstruct 3D buildings. The proposed extraction method starts by re-sampling the captured data into a lower-resolution index layer, with the aim of searching for vertical wall segments that separate high and low areas. The wall segments found are connected and modified to form closed building outlines and corners. The suggested roof-remodelling system starts by creating a triangulation from the extracted data points that lie inside the building boundaries. Three clustering methods are used to separate the triangles into groups that share the same properties (e.g. orientation and geographical location); each group of triangles represents a plane on the roof. Plane-to-plane relationships are found, and the building corners and roof ridges are calculated by three-plane intersection. Finally, the building is reconstructed from the terrain model using a set of well-developed toolkits that extend the TIN model with preserved topological connectivity. Real LIDAR data, captured in Bournemouth by the Ordnance Survey UK, are used to evaluate the capability and validity of the developed algorithms. In conclusion, several suggestions are made to improve the algorithms for future development.
5

Rothschedl, Christopher, Roland Ritt, Paul O'Leary, Matthew Harker, Michael Habacher, and Michael Brandner. "Real-time-data analytics in raw materials handling." TU Bergakademie Freiberg, 2017. https://tubaf.qucosa.de/id/qucosa%3A23195.

Abstract:
This paper proposes a system for the ingestion and analysis of real-time sensor and actuator data from bulk materials handling plants and machinery. It references issues that concern mining sensor data in cyber-physical systems (CPS) as addressed in O’Leary et al. [2015].
6

Herzberg, Nico, and Mathias Weske. "Enriching raw events to enable process intelligence : research challenges." Universität Potsdam, 2013. http://opus.kobv.de/ubp/volltexte/2013/6401/.

Abstract:
Business processes are performed as part of a company's daily business, and valuable data about their execution is produced. The quantity and quality of these data depend strongly on the process execution environment, which ranges from predominantly manual to fully automated. Process improvement is an essential cornerstone of business process management for ensuring a company's competitiveness, and it relies on information about process execution. Especially in manual process environments, data directly related to the process execution are sparse and incomplete. In this paper, we present an approach that supports the usage and enrichment of process execution data with context data (data that exist orthogonally to business process data) and with knowledge from the corresponding process models, in order to provide a high-quality event base for process intelligence, subsuming, among others, process monitoring, process analysis, and process mining. Further, we discuss open issues and challenges that are subject to our future work.
7

Aronsson, Joakim. "I Want to Breathe You In : Data as Raw Commodity." Thesis, Konstfack, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:konstfack:diva-7977.

Abstract:
In this paper, I look at the history of the internet and online advertising. The internet is inextricably linked to capitalism and is fueled by advertising. As a result, companies like Facebook, Apple, Microsoft, Google, and Amazon collect data in large volumes to improve targeted advertising. I investigate the new power structures that have emerged with the internet and how they dominate its future and ours. My creative practice lies between art and technology: by merging new technologies like artificial intelligence with humor and graphic design, I try to shine a light on the subject.
8

Saputra, Michael Wijaya. "Water and Fat Image Reconstruction from MRI Raw Multi Coil Data." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-372138.

Abstract:
In MRI, robust separation of the water and fat signals is often helpful in diagnosis. Reliable separation of water and fat helps the doctor make accurate assessments, such as the size of a tumour, and fat images can also help in diagnosing liver and heart conditions. To perform water and fat separation, multiple echoes, i.e. measurements of the raw MR signal at different time points, are required; by utilizing knowledge of the expected signal evolution, it is possible to perform the separation. The main magnetic field used in MRI is not perfectly homogeneous, and estimating the inhomogeneities is crucial for correcting the separated signals. This thesis used the method of Iterative Decomposition of water and fat with Echo Asymmetry and Least-squares estimation (IDEAL). The aims of the thesis were to develop a method that reconstructs fat or water MRI images from raw multi-coil image data, and to evaluate the method's accuracy and speed by comparison with an available, implemented reconstruction method; in particular, the stability against so-called swap artefacts was analysed. Estimating the field map, i.e. the inhomogeneity field, is an important and essential step, but the estimation problem has multiple local minima. To avoid choosing an incorrect minimum, the initial estimate of the field map has to be close to the actual field-map value; since the inhomogeneity field varies smoothly, neighbouring pixels have similar field-map values. We therefore combined the IDEAL algorithm with a region-growing method, and implemented it to perform water and fat separation from raw multi-coil, multi-echo image data. The proposed method was tested, and the region-growing method shows a significantly improved separation of water and fat compared to the traditional method without region growing.
9

Moruzzi, Davide <1997>. "Cyber data intelligence tool: from raw data breaches to structured storing, performant searching and user-friendly visualisation." Master's Degree Thesis, Università Ca' Foscari Venezia, 2021. http://hdl.handle.net/10579/19916.

Abstract:
The thesis summarises and motivates the cyber intelligence project carried out during an internship at a local company. The scenario in which the developed software runs is presented, the technologies and difficulties encountered are highlighted, and the final product that the company can use for its own purposes is shown. Almost anything can be found by surfing the web, including large quantities of stolen credentials. Every day, people upload to the web data that come from breaches caused by a lack of security in network infrastructures, malware attacks, or human error. Cyber security companies exploit these data breaches to obtain information and implement security activities for their customers. By exploring the deep web, analysts are able to download massive amounts of unauthorised data that are very often raw or poorly structured. Hence the need to develop a tool that can ingest and structure any kind of data into a database and link it to a data visualiser in order to perform optimised targeted searches. The final step is to protect customers by alerting them to possible threats arising from leaks of confidential and sensitive credentials.
10

Verner, Alexander. "LSTM Networks for Detection and Classification of Anomalies in Raw Sensor Data." Diss., NSUWorks, 2019. https://nsuworks.nova.edu/gscis_etd/1074.

Abstract:
In order to ensure the validity of sensor data, it must be thoroughly analyzed for various types of anomalies. Traditional machine learning methods of anomaly detection in sensor data are based on domain-specific feature engineering. A typical approach is to use domain knowledge to analyze sensor data and manually create statistics-based features, which are then used to train machine learning models to detect and classify the anomalies. Although this methodology is used in practice, it has a significant drawback: feature extraction is usually labor-intensive and requires considerable effort from domain experts. An alternative approach is to use deep learning algorithms. Research has shown that modern deep neural networks are very effective in automated extraction of abstract features from raw data in classification tasks. Long short-term memory networks, or LSTMs for short, are a special kind of recurrent neural network capable of learning long-term dependencies. These networks have proved especially effective in the classification of raw time-series data in various domains. This dissertation systematically investigates the effectiveness of the LSTM model for anomaly detection and classification in raw time-series sensor data. As a proof of concept, this work used time-series data from sensors that measure blood glucose levels. A large number of time-series sequences was created from a genuine medical diabetes dataset, and anomalous series were constructed by six methods that interspersed patterns of common anomaly types in the data. An LSTM network model was trained with k-fold cross-validation on both anomalous and valid series to classify raw time-series sequences into one of seven classes: non-anomalous, and classes corresponding to each of the six anomaly types. As a control, the detection and classification accuracy of the LSTM was compared to that of four traditional machine learning classifiers: support vector machines, random forests, naive Bayes, and shallow neural networks. The performance of all the classifiers was evaluated on nine metrics: precision, recall, and the F1-score, each measured from the micro, macro, and weighted perspectives. While the traditional models were trained on feature vectors derived from the raw data based on knowledge of common sources of anomaly, the LSTM was trained on raw time-series data. Experimental results indicate that the performance of the LSTM was comparable to the best traditional classifiers, achieving 99% accuracy in all nine metrics. The model requires no labor-intensive feature engineering, and the fine-tuning of its architecture and hyper-parameters can be done in a fully automated way. This study therefore finds LSTM networks an effective solution to anomaly detection and classification in sensor data.
11

TAKAKU, Masao, Yuka EGUSA, Hitomi SAITO, and Hitoshi TERAI. "An Application of the NTCIR-WEB Raw-data Archive Dataset for User Experiments." National Institute of Informatics (NII), 2007. http://hdl.handle.net/2237/8815.

12

Yang, Yun-zhi, Shun-ji Huang, and Jian-guo Wang. "The Realization Analysis of SAR Raw Data With Block Adaptive Vector Quantization Algorithm." International Foundation for Telemetering, 2003. http://hdl.handle.net/10150/605596.

Abstract:
International Telemetering Conference Proceedings / October 20-23, 2003 / Riviera Hotel and Convention Center, Las Vegas, Nevada.
In this paper, we discuss a Block Adaptive Vector Quantization (BAVQ) algorithm for Synthetic Aperture Radar (SAR), and a method for realizing the BAVQ algorithm for SAR raw data compression on a digital signal processor. Using the algorithm and the digital signal processor, we have compressed SIR-C/X-SAR data.
13

Deller, Yannick. "Raw Data for Peace and Security - The Extraction and Mining of People's Behaviour." Thesis, Malmö universitet, Fakulteten för kultur och samhälle (KS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-22599.

Abstract:
In 2015, the United Nations Global Pulse launched an experimentation process assessing the viability of big data and artificial intelligence analysis to support peace and security. The proposition of using such analysis, and thereby creating early warning systems based on real-time monitoring, warrants a critical assessment. This thesis engages in an explanatory critique of the discursive (re-)definitions of peace and security as well as big data and artificial intelligence in the United Nations Global Pulse Lab Kampala report Experimenting with Big Data and Artificial Intelligence to Support Peace and Security. The paper follows a qualitative design and utilises critical discourse analysis as its methodology while using instrumentarian violence as a theoretical lens. The study argues that the use of big data and artificial intelligence analysis, in conjunction with data mining on social media and radio broadcasts for the purposes of early warning systems, creates and manifests social relations marked by asymmetric power and knowledge dynamics. The analysis suggests that the report’s discursive and social practices indicate a conceptualisation of peace and security rooted in the notion of social control through prediction. The study reflects on the consequences for social identities, social relations, and the social world itself and suggests potential areas for future research.
14

Lopez Marino, Maria Emilia. "Big data analysis interrogating raw material variability and the impact on process performance." Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/122400.

Abstract:
Thesis: M.B.A., Massachusetts Institute of Technology, Sloan School of Management, in conjunction with the Leaders for Global Operations Program at MIT, 2019. Thesis: S.M., Massachusetts Institute of Technology, Department of Civil and Environmental Engineering, in conjunction with the Leaders for Global Operations Program at MIT, 2019. Cataloged from the PDF version of the thesis. Includes bibliographical references (pages 96-102).
Within the biopharmaceutical industry, material science is a rapidly growing field for continuing to ensure reliable production and delivery of medicines. Consequently, there is an ongoing need to evaluate and assess new materials, driven by novel process technologies and new modalities. Finding a solution to technically assess the impact of raw material attributes on the manufacturing process represents a significant opportunity to ensure supply. This study seeks to develop a novel predictive framework to assess the impact of raw material variability on the performance of commercial biologic manufacturing processes. Through machine learning techniques, the impact of two strategic raw materials is evaluated by modeling and predicting the outcomes of critical process performance variables and product quality attributes. As part of this research, we aimed to equip Amgen Inc. with a novel learning tool with the potential to uncover a deeper understanding of material variability which: (1) ensures reliable supply through consistent performance, (2) provides insights into material attributes, and (3) delivers the capability to solve material-related investigations more efficiently. Models trained via machine learning showed 89% average accuracy on predictions for new data. In addition to the demonstrated predictive power, the models developed were highly interpretable and illustrated correlations with several material attributes. The framework developed is thus the starting point of a novel methodology for understanding input material variability. The predictive framework was implemented as a web tool and is currently being piloted at Amgen Inc. The modular design of the predictive models and the web tool enables application to other production processes and associated raw materials, and could be generalized across the industry.
15

Blizard, Katherine S. "Shark Sim: A Procedural Method of Animating Leopard Sharks Based on Raw Location Data." DigitalCommons@CalPoly, 2013. https://digitalcommons.calpoly.edu/theses/938.

Abstract:
Fish such as the leopard shark (Triakis semifasciata) can be tagged on the fin, released back into the wild, and have their location tracked through technologies such as autonomous robots, with timestamped location data about the target being stored. We present a way to procedurally generate an animated simulation of T. semifasciata using only these timestamped location points. The simulation utilizes several components. Input timestamps dictate a monotonic time-space curve mapping the simulation clock to the space curve. The space curve connects all the location points as a spline without any sharp folds that would be too implausible for shark traversal. We create a model leopard shark with convincing kinematics that respond to the space curve. This is achieved by acquiring a skinned model and applying T. semifasciata motion kinematics that respond to velocity and turn commands. These kinematics affect the spine and all fins that control locomotion and direction. Kinematics-based procedural keyframes added onto a queue are interpolated while the shark model traverses the path. The simulation tool generates animation sequences that can be viewed in real time. A user study of 27 individuals measured the perceived realism of the sequences by contrasting five different film sequences. Results of the study show that, on average, viewers perceive our simulation as more realistic than not.
16

Wilhelmy, Jochen [Verfasser], and Willi A. [Akademischer Betreuer] Kalender. "Lossless and Lossy Raw Data Compression in CT Imaging / Jochen Wilhelmy. Betreuer: Willi A. Kalender." Erlangen : Universitätsbibliothek der Universität Erlangen-Nürnberg, 2012. http://d-nb.info/1029374414/34.

17

Carlsson, Jesper. "Enhancement of Positioning and Attitude Estimation Using Raw GPS Data in an Extended Kalman Filter." Thesis, Linköpings universitet, Reglerteknik, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-109336.

Abstract:
A Global Positioning System (GPS) can be used to estimate an object's position, given that the object has a GPS antenna. However, the system requires information from at least four independent satellites in order to give a position estimate. If two GPS antennas and a carrier-phase GPS measurement unit are used, an estimate of the object's heading can be calculated by determining the baseline between the two antennas. The method is called GPS Attitude Determination (GPSAD) and requires that an Integer Ambiguity Problem (IAP) is solved. This method is cheaper than more traditional ways of calculating the heading, but depends on undisturbed GPS reception. Through support from an Inertial Measurement Unit (IMU), containing accelerometers and gyroscopes, the system can be enhanced. In Thorstenson [2012], data from GPS, GPSAD and IMU were integrated in an Extended Kalman Filter (EKF) to enhance the performance. This thesis is an extension of Thorstenson's work and is divided into two separate problems: enhancement of positioning when fewer than four satellites are available, and the possibility of integrating the EKF with the search for the correct integers of the IAP in order to enhance the attitude estimate. For both problems an implementation has been made, and the performance has been enhanced on simulated data. For the first problem it has also been possible to enhance the performance on real data, while this has not been possible for the second problem. A number of proposals are given on how to enhance performance for the second problem using real data.
18

Ainul, Azyan Zuliyanti Hanizan. "An investigation into the practicality of using a digital camera's RAW data in print publishing applications." Link to online version, 2005. https://ritdml.rit.edu/dspace/handle/1850/1110.

19

Shokat, Imran. "Computational Analyses of Scientific Publications Using Raw and Manually Curated Data with Applications to Text Visualization." Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-78995.

Abstract:
Text visualization is a field dedicated to the visual representation of textual data using computer technology. A large number of visualization techniques are available, and it is becoming harder for researchers and practitioners to choose an optimal technique for a particular task among the existing ones. To overcome this problem, the ISOVIS Group developed an interactive survey browser for text visualization techniques. ISOVIS researchers gathered papers that describe text visualization techniques or tools and categorized them according to a taxonomy, manually assigning several categories to each visualization technique. In this thesis, we analyze the dataset of this browser. We carried out several analyses to find temporal trends and correlations among the categories present in the browser dataset, and compared these categories with a computational approach. Our results show that some categories have become more popular than before, whereas others have declined in popularity. Cases of positive and negative correlation between various categories were found and analyzed. Comparisons between the manually labeled dataset and the results of the computational text analyses were presented to the experts, with an opportunity to refine the dataset. The data analyzed in this thesis project are specific to the text visualization field; however, the methods used in the analyses can be generalized to other datasets of scientific literature surveys or, more generally, other manually curated collections of textual documents.
20

Rintala, Jonathan. "Speech Emotion Recognition from Raw Audio using Deep Learning." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-278858.

Abstract:
Traditionally, in speech emotion recognition, models require a large number of manually engineered features and intermediate representations such as spectrograms for training. However, hand-engineering such features often requires both expert domain knowledge and resources. Recently, with the emerging paradigm of deep learning, end-to-end models that extract features themselves and learn directly from the raw speech signal have been explored. A previous approach has been to combine multiple parallel CNNs with different filter lengths to extract multiple temporal features from the audio signal, and then feed the resulting sequence to a recurrent block. Other recent work also reports high accuracies when utilizing local feature learning blocks (LFLBs) to reduce the dimensionality of the raw audio signal and extract the most important information. This study therefore combines the idea of LFLBs for feature extraction with a block of parallel CNNs with different filter lengths for capturing multi-temporal features, finally fed into an LSTM layer for global contextual feature learning. To the best of our knowledge, such a combined architecture has not yet been properly investigated. Further, this study investigates different configurations of such an architecture. The proposed model is trained and evaluated on the well-known speech databases EmoDB and RAVDESS, in both a speaker-dependent and a speaker-independent manner. The results indicate that the proposed architecture can produce results comparable with the state of the art, despite excluding data augmentation and advanced pre-processing. Three parallel CNN pipes yielded the highest accuracy, together with a series of modified LFLBs that utilize average pooling and ReLU activation. This shows the power of leaving the feature learning up to the network, and opens up interesting future research on time complexity and the trade-off between introducing complexity in the pre-processing or in the model architecture itself.
21

Jarick, Ivonne [Verfasser], and Helmut [Akademischer Betreuer] Schäfer. "Strategies for Genome-Wide Association Analyses of Raw Copy Number Variation Data / Ivonne Jarick. Betreuer: Helmut Schäfer." Marburg : Philipps-Universität Marburg, 2013. http://d-nb.info/1045729884/34.

22

Prette, Nicola. "Advanced methods and deep learning for video and satellite data compression." Doctoral thesis, Politecnico di Torino, 2022. http://hdl.handle.net/11583/2971610.

23

de, Beste Eugene. "Enabling the processing of bioinformatics workflows where data is located through the use of cloud and container technologies." University of the Western Cape, 2019. http://hdl.handle.net/11394/6767.

Abstract:
Magister Scientiae - MSc.
The growing size of raw data and the inability of internet communication technology to keep up with that growth are introducing unique challenges for academic researchers, especially those residing in rural areas or countries with sub-par telecommunication infrastructure. In this project I investigate the usefulness of cloud computing technology, data analysis workflow languages and portable computation for institutions that generate data. I introduce the concept of a software solution that could be used to simplify the way researchers execute their analyses on data sets at remote sources, rather than having to move the data. The scope of this project involved conceptualising and designing a software system to simplify the use of a cloud environment, as well as implementing a working prototype of said software for the OpenStack cloud computing platform. I conclude that it is possible to improve the performance of research pipelines by removing the need for researchers to have operating-system or cloud-computing knowledge, and that utilising technologies such as this can ease the burden of moving data.
24

Shadle, Daryl Allen. "An investigation into the long-term impact of the calibration of software estimation models using raw historical data." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1994. http://handle.dtic.mil/100.2/ADA286112.

Abstract:
Thesis (M.S. in Information Technology Management), Naval Postgraduate School, September 1994. Thesis advisors: T. Hamid and Keebom Kang. Bibliography: p. 127-128. Also available online.
25

Klewin, Sebastian [Verfasser], and Johanna [Akademischer Betreuer] Stachel. "Development of the FPGA-based Raw Data Preprocessor for the TPC Readout Upgrade in ALICE / Sebastian Klewin ; Betreuer: Johanna Stachel." Heidelberg : Universitätsbibliothek Heidelberg, 2019. http://d-nb.info/1185170944/34.

26

Hürtgen, Gisela [Verfasser], Achim [Akademischer Betreuer] Stahl, and Michael J. [Akademischer Betreuer] Eble. "Determination of lung tumour motion from PET raw data used for accelerometer based motion prediction / Gisela Hürtgen ; Achim Stahl, Michael J. Eble." Aachen : Universitätsbibliothek der RWTH Aachen, 2018. http://d-nb.info/1171323948/34.

27

Schultz, Johan. "Sensordatafusion av IR- och radarbilder." Thesis, Linköping University, Department of Electrical Engineering, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2193.

Abstract:
This thesis describes and evaluates a number of algorithms for multi-sensor fusion of radar and IR/TV data. The fusion is performed at the raw-data level, that is, prior to attribute extraction; the idea is that less information is lost compared to attribute-level fusion. Two methods are presented. The first transforms the radar image to the IR view and vice versa; the images, now sharing the same dimensions, are then fused together. The second fuses the original images into a three-dimensional volume spanned by the three dimensions represented in the source images. A version using stereo vision is also presented. The results show that stereo vision can be used with good performance and gives a more general solution to the problem.
28

Giljum, Stefan, Hanspeter Wieland, Franz Stephan Lutter, Nina Eisenmenger, Heinz Schandl, and Anne Owen. "The impacts of data deviations between MRIO models on material footprints: A comparison of EXIOBASE, Eora, and ICIO." Wiley, 2019. http://dx.doi.org/10.1111/jiec.12833.

Abstract:
In various international policy processes, such as the UN Sustainable Development Goals, an urgent demand for robust consumption-based indicators of material flows, or material footprints (MFs), has emerged over the past years. Yet MFs for national economies diverge when calculated with different Global Multiregional Input-Output (GMRIO) databases, constituting a significant barrier to broad policy uptake of these indicators. The objective of this paper is to quantify the impact of data deviations between GMRIO databases on the resulting MF. We use two methods, structural decomposition analysis and structural production layer decomposition, and apply them in a pairwise assessment of three GMRIO databases, EXIOBASE, Eora, and the OECD Inter-Country Input-Output (ICIO) database, using an identical set of material extensions. Although all three GMRIO databases agree on the directionality of footprint results, that is, whether a country's final demand depends on net imports of raw materials from abroad or is a net exporter, they sometimes show significant differences in the level and composition of material flows. Decomposing the effects from the Leontief matrices (economic structures), we observe that a few sectors at the very first stages of the supply chain, that is, raw material extraction and basic processing, explain 60% of the total deviations stemming from the technology matrices. We conclude that further development of methods to align results from GMRIOs, in particular for material-intensive sectors and supply chains, should be an important research priority. This will be vital to strengthen the uptake of demand-based material flow indicators in the resource policy context.
29

Goeta, Samuel. "Instaurer des données, instaurer des publics : une enquête sociologique dans les coulisses de l'open data." Thesis, Paris, ENST, 2016. http://www.theses.fr/2016ENST0045/document.

Abstract:
As more than fifty countries have launched an open data policy, this doctoral dissertation investigates the emergence and implementation of such policies. It is based on the analysis of public sources and on an ethnographic inquiry conducted in seven French local authorities and institutions. By retracing six moments of definition of the "open data principles" and their translation into public policy by a French institution, Etalab, this work shows how open data has brought attention to data, particularly in their raw form, considered an untapped resource, the "new oil" lying under organisations. The inquiry shows that the process of opening generally begins with a phase of identification marked by progressive and uncertain explorations, through which administrative management files are progressively instantiated as data. Their circulation provokes frictions: to leave the sociotechnical networks of the organisation, data generally have to pass through validation circuits and chains of treatment. Moreover, data must often undergo important transformations before their opening in order to become intelligible to machines as well as to humans. This thesis also shows that publics are instantiated, as they are expected to visualize, inspect and exploit the open data; the instantiation of publics through a wide variety of instruments constitutes another part of the invisible work of open data policies. Finally, it appears from this work that a possible legal obligation to open public data raises a salient and fundamental question: "what is data?" Rather than reducing data to a relative category that would apply to all sorts of informational materials, the cases studied show that the term is generally applied when data are the starting point of sociotechnical networks dedicated to their circulation, exploitation and visibility.
30

Goëta, Samuel. "Instaurer des données, instaurer des publics : une enquête sociologique dans les coulisses de l'open data." Electronic Thesis or Diss., Paris, ENST, 2016. http://www.theses.fr/2016ENST0045.

Abstract:
As more than fifty countries have launched an open data policy, this doctoral dissertation investigates the emergence and implementation of such policies. It is based on the analysis of public sources and on an ethnographic inquiry conducted in seven French local authorities and institutions. By retracing six moments of definition of the "open data principles" and their translation into public policy by a French institution, Etalab, this work shows how open data has brought attention to data, particularly in their raw form, considered an untapped resource, the "new oil" lying under organisations. The inquiry shows that the process of opening generally begins with a phase of identification marked by progressive and uncertain explorations, through which administrative management files are progressively instantiated as data. Their circulation provokes frictions: to leave the sociotechnical networks of the organisation, data generally have to pass through validation circuits and chains of treatment. Moreover, data must often undergo important transformations before their opening in order to become intelligible to machines as well as to humans. This thesis also shows that publics are instantiated, as they are expected to visualize, inspect and exploit the open data; the instantiation of publics through a wide variety of instruments constitutes another part of the invisible work of open data policies. Finally, it appears from this work that a possible legal obligation to open public data raises a salient and fundamental question: "what is data?" Rather than reducing data to a relative category that would apply to all sorts of informational materials, the cases studied show that the term is generally applied when data are the starting point of sociotechnical networks dedicated to their circulation, exploitation and visibility.
31

Bhattaram, Sneha. "Signal Compression Methods for a Wear Debris Sensor." University of Akron / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=akron1399201029.

32

Herterich, Rebecka, and Anna Sumarokova. "Coil Sensitivity Estimation and Intensity Normalisation for Magnetic Resonance Imaging." Thesis, KTH, Medicinteknik och hälsosystem, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-263149.

Abstract:
The quest for improved efficiency in magnetic resonance imaging has motivated the development of strategies like parallel imaging, where arrays of multiple receiver coils are operated simultaneously. The objective of this project was to estimate the sensitivity profiles of phased-array coils from magnetic resonance images of the human body. These sensitivity maps can then be used to perform intensity inhomogeneity correction of the images. Through investigative work in Matlab, a script was developed that uses data embedded in the raw data from a magnetic resonance scan to generate coil sensitivities for each voxel of the volume of interest, and recalculates them into two-dimensional sensitivity maps for the corresponding diagnostic images. The resulting mapped sensitivity profiles can be used in Sensitivity Encoding, where a more exact solution can be obtained using the carefully estimated sensitivity maps of the images.
33

Çek, Mehmet Emre, and Ferit Acar Savacı. "Analysis of observed chaotic data." [s.l.]: [s.n.], 2004. http://library.iyte.edu.tr/tezler/master/elektronikvehaberlesme/T000493.rar.

34

Di Pietra, Vincenzo. "Seamless Positioning and Navigation in Urban Environment." Doctoral thesis, Politecnico di Torino, 2019. http://hdl.handle.net/11583/2732878.

35

Özsevim, Emrah, and Halis Püskülcü. "Comparison of different algorithms for exploiting the hidden trends in data sources." [s.l.]: [s.n.], 2003. http://library.iyte.edu.tr/tezler/master/bilgisayaryazilimi/T000258.rar.

36

Gauthier, Marianne. "Etude de l’influence de l’entrée artérielle tumorale par modélisation numérique et in vitro en imagerie de contraste ultrasonore. : application clinique pour l’évaluation des thérapies ciblées en cancérologie." Thesis, Paris 11, 2011. http://www.theses.fr/2011PA11T088.

Abstract:
Dynamic contrast-enhanced ultrasonography (DCE-US) is currently used as a functional imaging technique for evaluating anti-angiogenic therapies. A mathematical model has been developed by the UPRES EA 4040 (Université Paris-Sud 11) and the Gustave Roussy Institute to evaluate semi-quantitative microvascularization parameters directly from time-intensity curves. But DCE-US evaluation of these parameters does not yet take into account the physiological state of the patient or the way the contrast agent is injected, as opposed to other functional modalities (dynamic contrast-enhanced magnetic resonance imaging or perfusion CT). The aim of this PhD was therefore to extend to DCE-US the deconvolution process used routinely in those other modalities: deconvolving the tumor contrast-uptake curve by the arterial input function removes the dependence on the conditions above and gives access to quantitatively defined microvascularization parameters, namely the tumor blood flow, the tumor blood volume and the mean transit time. The work was organized around three main goals. First, we developed a deconvolution method dedicated to DCE-US, building a quantification tool and validating it through studies of microvascularization-parameter variability; comparisons of intra-operator variability showed the coefficients of variation of the microvascularization parameters dropping from 30% to 13% when the parameters were extracted with deconvolution. Second, we evaluated the sources of variation influencing the microvascularization parameters, covering both the experimental conditions and the physiological conditions of the tumor. Finally, we performed a retrospective study of 12 patients in which we assessed the benefit of deconvolution by comparing the evolution of the quantitative and semi-quantitative microvascularization parameters against tumor responses evaluated with the RECIST criteria from a scan performed after 2 months. Deconvolution is a promising process that may allow an earlier and more robust evaluation of anti-angiogenic treatments than the DCE-US methodology in current clinical use.
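To make the deconvolution step concrete, the sketch below shows the truncated-SVD approach that is standard for indicator-dilution deconvolution in perfusion imaging; the thesis does not publish its implementation, so every name here is illustrative and the curves are assumed to be regularly sampled. The tumor time-intensity curve is modeled as the convolution of the arterial input function (AIF) with an impulse response whose peak, area, and area-to-peak ratio give blood flow, blood volume, and mean transit time.

    import numpy as np

    def deconvolve_tic(tumor_tic, aif, dt, sv_cutoff=0.15):
        """Recover the impulse response h from tumor_tic = dt * conv(aif, h)
        by truncated-SVD inversion of the convolution matrix (a sketch)."""
        n = len(aif)
        # Lower-triangular Toeplitz matrix implementing causal convolution with the AIF
        A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                           for i in range(n)])
        U, s, Vt = np.linalg.svd(A)
        # Zero out small singular values: the inversion is ill-posed for noisy curves
        s_inv = np.where(s > sv_cutoff * s[0], 1.0 / s, 0.0)
        h = Vt.T @ (s_inv * (U.T @ tumor_tic))
        blood_flow = h.max()              # BF: peak of the impulse response
        blood_volume = h.sum() * dt       # BV: area under the impulse response
        return blood_flow, blood_volume, blood_volume / blood_flow  # ..., MTT

The singular-value cutoff plays the role of the regularization that any practical deconvolution of noisy contrast curves requires; the 0.15 default is an assumption, not a value from the thesis.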
APA, Harvard, Vancouver, ISO, and other styles
37

Romanenko, Ilya. "Novel image processing algorithms and methods for improving their robustness and operational performance." Thesis, Loughborough University, 2014. https://dspace.lboro.ac.uk/2134/16340.

Full text
Abstract:
Image processing algorithms have developed rapidly in recent years. Imaging functions are becoming more common in electronic devices, demanding better image quality and more robust image capture in challenging conditions. Increasingly complicated algorithms are being developed to achieve better signal-to-noise characteristics, more accurate colours, and wider dynamic range, approaching the performance of the human visual system.
APA, Harvard, Vancouver, ISO, and other styles
38

Falch, Thomas Løfsgaard. "3D Visualization of X-ray Diffraction Data." Thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for datateknikk og informasjonsvitenskap, 2012. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-18903.

Full text
Abstract:
X-ray diffraction experiments are used extensively in the sciences to study the structure, chemical composition and physical properties of materials. The output of such experiments are samples of the diffraction pattern, which essentially constitute a 3D unstructured dataset. In this thesis, we develop a method for visualizing such datasets. Our visualization method is based on volume ray casting, but operates directly on the unstructured samples, rather than resampling them to form voxels. We estimate the intensity of the X-ray diffraction pattern at points along the rays by interpolation using nearby samples, taking advantage of an octree to facilitate efficient range search. The method is implemented on both the CPU and the GPU. To test our method, actual X-ray diffraction datasets are used, consisting of up to 120M samples. We are able to generate images of good quality. The rendering time varies dramatically, between 5 s and 200 s, depending upon the dataset and the settings used. A simple performance model is developed and empirically tested to better understand this variation. Our implementation scales exceptionally well to more CPU cores, with a speedup of 5.9 on a 6-core CPU. Furthermore, the GPU implementation achieves a speedup of around 4.6 compared to the CPU version.
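As a rough illustration of the interpolation-along-rays idea, the sketch below estimates intensity at points stepped along one ray from nearby unstructured samples, using a k-d tree where the thesis uses an octree for the range search; the step size, ray length and search radius are assumptions, not values from the thesis.

    import numpy as np
    from scipy.spatial import cKDTree

    def render_ray(origin, direction, points, intensities, tree,
                   step=0.01, n_steps=500, radius=0.02):
        """Accumulate interpolated diffraction intensity along one ray (a sketch)."""
        pos = np.asarray(origin, dtype=float)
        d = np.asarray(direction, dtype=float)
        d /= np.linalg.norm(d)
        accum = 0.0
        for _ in range(n_steps):
            idx = tree.query_ball_point(pos, radius)   # range search (the octree's job)
            if idx:
                dist = np.linalg.norm(points[idx] - pos, axis=1)
                w = 1.0 / np.maximum(dist, 1e-9)       # inverse-distance weighting
                accum += np.dot(w, intensities[idx]) / w.sum() * step
            pos = pos + step * d
        return accum

    # tree = cKDTree(points) is built once; render_ray then runs once per pixel,
    # which is the embarrassingly parallel part mapped to CPU cores and the GPU.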
APA, Harvard, Vancouver, ISO, and other styles
39

Stoner, D. C., and T. F. F. Eklund. "Static RAM Data Recorder for Flight Tests." International Foundation for Telemetering, 1988. http://hdl.handle.net/10150/615241.

Full text
Abstract:
International Telemetering Conference Proceedings / October 17-20, 1988 / Riviera Hotel, Las Vegas, Nevada

A static Random Access Memory (RAM) data recorder has been developed to recover strain and acceleration data during development tests of high-speed earth-penetrating vehicles. Bi-level inputs are also available for continuity measurements. An iteration of this system was modified for use in water-entry evaluations.
APA, Harvard, Vancouver, ISO, and other styles
40

Henrique, Lygia Maria Moreno Molina. "Proteção de dados pessoais: um direito relevante no mundo digital." Pontifícia Universidade Católica de São Paulo, 2016. https://tede2.pucsp.br/handle/handle/7009.

Full text
Abstract:
This dissertation centers on the right to protection of personal data and on how this right relates to the flow of personal data driven by the new and dynamic Internet economy. We analyze issues relevant to the topic and to the present moment, starting with a broad social approach that leads into the evolution of personal data protection in both international and Brazilian law. We also study the use of data as a raw material for the services offered by dot-com companies, employed to create innovation and sharpen the competition between them, and we show what options of control and protection over personal data the consumer/user has to safeguard his or her privacy. Finally, by evaluating the Brazilian legislative proposals on personal data protection, we offer a critical and reflective assessment of the failures and successes of each proposal with respect to the topics relevant to personal data protection.
APA, Harvard, Vancouver, ISO, and other styles
41

Delahaye, Franck. "From accurate atomic data to elaborate stellar modeling: structure and collisional data, opacities, radiative accelerations." Connect to resource, 2005. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1126315887.

Full text
Abstract:
Thesis (Ph. D.)--Ohio State University, 2005. Title from first page of PDF file. Document formatted into pages; contains xx, 198 p.; also includes graphics (some col.). Includes bibliographical references (p. 191-198). Available online via OhioLINK's ETD Center.
APA, Harvard, Vancouver, ISO, and other styles
42

Leonte, Daniela (School of Mathematics, UNSW). "Flexible Bayesian modelling of gamma ray count data." Awarded by: University of New South Wales, School of Mathematics, 2003. http://handle.unsw.edu.au/1959.4/19147.

Full text
Abstract:
Bayesian approaches to prediction and the assessment of predictive uncertainty in generalized linear models are often based on averaging predictions over different models, and this requires methods for accounting for model uncertainty. In this thesis we describe computational methods for Bayesian inference and model selection for generalized linear models, which improve on existing techniques. These methods are applied to the building of flexible models for gamma ray count data (data measuring the natural radioactivity of rocks) at the Castlereagh Waste Management Centre, which served as a hazardous waste disposal facility for the Sydney region between March 1978 and August 1998. Bayesian model selection methods for generalized linear models enable us to approach problems of smoothing, change point detection and spatial prediction for these data within a common methodological and computational framework, by considering appropriate basis expansions of a mean function. The data at Castlereagh were collected in the following way. A number of boreholes were drilled at the site, and for each borehole a gamma ray detector recorded gamma ray emissions at different depths as the detector was raised gradually from the bottom of the borehole to ground level. The profile of intensity of gamma counts can be informative about the geology at each location, and estimation of intensity profiles raises problems of smoothing and change point detection for count data. The gamma count profiles can also be modelled spatially, to inform the geological profile across the site. Understanding the geological structure of the site is important for modelling the transport of chemical contaminants beneath the waste disposal area. The structure of the thesis is as follows. Chapter 1 describes the Castlereagh hazardous waste site and the geophysical data, which motivated the methodology developed in this research. We summarise the principles of Gamma Ray (GR) logging, a method routinely employed by geophysicists and environmental engineers in the detailed evaluation of hazardous site geology, and detail the use of the Castlereagh data in this research. In Chapter 2 we review some fundamental ideas of Bayesian inference and computation and discuss them in the context of generalised linear models. Chapter 3 details the theoretical basis of our work. Here we give a new Markov chain Monte Carlo sampling scheme for Bayesian variable selection in generalized linear models, which is analogous to the well-known Swendsen-Wang algorithm for the Ising model. Special cases of this sampling scheme are used throughout the rest of the thesis. In Chapter 4 we discuss the use of methods for Bayesian model selection in generalized linear models in two specific applications, which we implement on the Castlereagh data. First, we consider smoothing problems where we flexibly estimate the dependence of a response variable on one or more predictors, and we apply these ideas to locally adaptive smoothing of gamma ray count data. Second, we discuss how the problem of multiple change point detection can be cast as one of model selection in a generalized linear model, and consider application to change point detection for gamma ray count data. 
In Chapter 5 we consider spatial models based on partitioning a spatial region of interest into cells via a Voronoi tessellation, where the number of cells and the positions of their centres are unknown, and show how these models can be formulated in the framework of established methods for Bayesian model selection in generalized linear models. We implement the spatial partition modelling approach in the spatial analysis of gamma ray data, showing how the posterior distribution of the number of cells, cell centres and cell means provides us with an estimate of the mean response function describing spatial variability across the site. Chapter 6 presents some conclusions and suggests directions for future research. A paper based on the work of Chapter 3 has been accepted for publication in the Journal of Computational and Graphical Statistics, and a paper based on the work in Chapter 4 has been accepted for publication in Mathematical Geology. A paper based on the spatial modelling of Chapter 5 is in preparation and will be submitted for publication shortly. The work in this thesis was collaborative, to a greater or lesser extent in its various components. I authored Chapters 1 and 2 entirely, including definition of the problem in the context of the CWMC site, data gathering and preparation for analysis, and review of the literature on computational methods for Bayesian inference and model selection for generalized linear models. I also authored Chapters 4 and 5, benefiting from some of Dr Nott's assistance in developing the algorithms. In Chapter 3, Dr Nott led the development of sampling scheme B (corresponding to having non-zero interaction parameters in our Swendsen-Wang type algorithm). I developed the algorithm for sampling scheme A (corresponding to setting all interaction parameters to zero in our Swendsen-Wang type algorithm), and performed the comparison of the performance of the two sampling schemes. The final discussion in Chapter 6 and the directions for further research in the case-study context are also my work.
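The thesis's samplers average over many candidate models; as a minimal, self-contained illustration of the underlying idea — casting change point detection for count data as model selection in a Poisson generalized linear model — the sketch below scores a single change point with BIC instead of full posterior model probabilities. It is a simplification of, not an excerpt from, the thesis's method.

    import numpy as np

    def poisson_loglik(y):
        """Maximized Poisson log-likelihood of a constant-rate segment
        (the log(y!) term is dropped since it cancels between models)."""
        lam = y.mean()
        return 0.0 if lam == 0 else np.sum(y * np.log(lam) - lam)

    def best_change_point(counts):
        """Compare the no-change model against every two-segment split via BIC."""
        counts = np.asarray(counts, dtype=float)
        n = len(counts)
        best_k, best_bic = None, -2 * poisson_loglik(counts) + np.log(n)  # 1 rate
        for k in range(1, n):
            ll = poisson_loglik(counts[:k]) + poisson_loglik(counts[k:])
            bic = -2 * ll + 2 * np.log(n)                                 # 2 rates
            if bic < best_bic:
                best_k, best_bic = k, bic
        return best_k, best_bic   # None means the constant-rate model wins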
APA, Harvard, Vancouver, ISO, and other styles
43

林吉雄 and Kat-hung Lam. "Geometric object reconstruction from orthogonal ray sum data." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1993. http://hub.hku.hk/bib/B31210855.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Reinhard, Erik. "Scheduling and data management for parallel ray tracing." Thesis, University of Bristol, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.302169.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Lam, Kat-hung. "Geometric object reconstruction from orthogonal ray sum data." [Hong Kong: University of Hong Kong], 1993. http://sunzi.lib.hku.hk/hkuto/record.jsp?B13458747.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Jun, Zhang, Zhang Qishan, Zhang Zhihui, and Huang Jian. "A New Approach to Telemetry Data Decomposition and Analysis Based on Large-Capacity Semiconductor RAM." International Foundation for Telemetering, 1993. http://hdl.handle.net/10150/611858.

Full text
Abstract:
International Telemetering Conference Proceedings / October 25-28, 1993 / Riviera Hotel and Convention Center, Las Vegas, Nevada

With the development of microelectronics and computer technology, telemetry computer systems are required to provide larger storage capacity and higher storage data rates than ever before. This paper considers the various factors of a high-speed PCM fiber-optic telemetry system, such as data format, data rate, data storage capacity, storage width and storage data rate. These considerations lead to a new scheme built around a large-capacity semiconductor RAM and a dedicated program. The scheme uses 1-Mbit or 4-Mbit static RAM chips to implement a telemetry data storage device with a total capacity of 4 Mbytes, 16 Mbytes, or 64 Mbytes. The software, running on a COMPAQ 386/25M or compatible machine and written in Turbo C 2.0, fetches, decomposes, displays and processes the data stored in the large-capacity RAM. Its main task is to identify the flag words of the frame sync code pattern and then demultiplex the stream into separate channel data to be stored on disk. Besides recognizing a specific data format, the software can also rectify data confusion to some extent. The scheme has been proved efficient at receiving large volumes of data at high data rates, with high-speed storage completed in a short time.
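The decomposition step — find the frame sync flag words, then commutate the frame payload into channels — can be sketched as follows; the sync pattern, frame length and channel count are invented for illustration, not taken from the paper.

    FRAME_SYNC = 0xEB90        # illustrative 16-bit sync pattern
    WORDS_PER_FRAME = 64       # assumed frame length in words

    def demultiplex(raw_words, n_channels=8):
        """Locate sync words in a flat word stream and split each frame's
        payload into per-channel lists (a sketch of the decomposition step)."""
        channels = [[] for _ in range(n_channels)]
        i = 0
        while i + WORDS_PER_FRAME <= len(raw_words):
            if raw_words[i] == FRAME_SYNC:
                payload = raw_words[i + 1 : i + WORDS_PER_FRAME]
                for j, word in enumerate(payload):
                    channels[j % n_channels].append(word)  # commutated channels
                i += WORDS_PER_FRAME
            else:
                i += 1   # slip one word and keep searching: a crude resync,
                         # loosely in the spirit of "rectifying data confusion"
        return channels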
APA, Harvard, Vancouver, ISO, and other styles
47

Scheidt, November. "A facial animation driven by X-ray microbeam data." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0021/MQ54745.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Zhang, Shuang Nan. "Instrumentation and data analysis for hard X-ray astronomy." Thesis, University of Southampton, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.252689.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Dehmelt, Chris. "THE RAH-66 COMANCHE NETWORKED BASED DATA ACQUISITION SYSTEM." International Foundation for Telemetering, 2004. http://hdl.handle.net/10150/605766.

Full text
Abstract:
International Telemetering Conference Proceedings / October 18-21, 2004 / Town & Country Resort, San Diego, California

Serial interfaces (RS232, RS422/485) have been the standard method of communications in traditional data acquisition systems. The role of these interfaces has been to supply a simple setup and control path between a host and the data acquisition master, and little else. Today's distributed data acquisition systems (DAS), which comprise many types of components, including Ground Support Computers (GSC), Pilot Control Units (PCU), Data System Control Units (DSCU), Solid State Recorders (SSR), Data Acquisition Units (DAU) and Cockpit Instrumentation Data Systems (CIDS), are ideally suited to the use of Ethernet not only for setup functions, but also for the distribution of acquired data and status to an unlimited number of users. Besides the obvious advantage of higher data rates, Ethernet provides other benefits such as greater data integrity, multi-host capability, and common programming interfaces. This paper details the integration of new L3 Communications - Telemetry East (L3-TE) Ethernet-based software and hardware components that are part of the Comanche Data Systems equipment suite.
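The multi-host benefit is easy to picture with a few lines of socket code: one sender can reach any number of subscribed ground stations at once, which a point-to-point serial link cannot. The multicast address and port below are placeholders, and this is a generic illustration of the idea, not the Comanche suite's actual protocol.

    import socket

    MCAST_GRP, MCAST_PORT = "239.1.1.1", 5005   # placeholder multicast endpoint

    def publish(sample_bytes):
        """Send one acquired data sample to every subscribed host at once."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
        sock.sendto(sample_bytes, (MCAST_GRP, MCAST_PORT))
        sock.close()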
APA, Harvard, Vancouver, ISO, and other styles
50

Krause, Lennard. "Assessment of Single Crystal X-ray Diffraction Data Quality." Doctoral thesis, Niedersächsische Staats- und Universitätsbibliothek Göttingen, 2017. http://hdl.handle.net/11858/00-1735-0000-0023-3DD4-A.

Full text
APA, Harvard, Vancouver, ISO, and other styles