
Dissertations / Theses on the topic 'Data Quality Model'

Consult the top 50 dissertations / theses for your research on the topic 'Data Quality Model.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Rudraraju, Nitesh Varma, and Varun Boyanapally. "Data Quality Model for Machine Learning." Thesis, Blekinge Tekniska Högskola, Institutionen för programvaruteknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-18498.

Full text
Abstract:
Context: Machine learning is a branch of artificial intelligence that is growing continuously. Most internet-related services, such as social media, email spam filtering, e-commerce sites and search engines, now use machine learning. The quality of machine learning output depends on the input data, so good-quality input data is crucial and can give a better outcome for the machine learning system. To achieve quality data, a data scientist can apply a data quality model to the data used for machine learning; such a model helps data scientists monitor and control the input data. However, little research has been done on data quality attributes and data quality models for machine learning. Objectives: The primary objectives of this thesis are to find and understand the state of the art and state of practice on data quality attributes for machine learning, and to develop a data quality model for machine learning in collaboration with data scientists. Methods: The work consists of two studies: 1) a literature review across different databases to identify literature on data quality attributes and data quality models for machine learning; and 2) an in-depth interview study to better understand and verify the data quality attributes identified in the literature review, carried out in collaboration with data scientists from multiple locations. In total, 15 interviews were performed, and based on the results a data quality model was proposed from the interviewees' perspective. Result: We identified 16 data quality attributes as important, based on the perspective of the experienced data scientists interviewed in this study. With these selected attributes we proposed a data quality model with which the quality of data for machine learning can be monitored and improved by data scientists; the effects of these attributes on machine learning are also stated. Conclusion: This study underlines the importance of data quality, for which we propose a data quality model for machine learning based on the industrial experience of data scientists. Closing this research gap benefits all machine learning practitioners and data scientists who intend to identify quality data for machine learning. To establish that the attributes in the model are indeed important, a further experiment could be conducted, which is proposed as future work.
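As a purely illustrative aside (not taken from the thesis, whose 16 attributes are not reproduced here), the sketch below shows how a few commonly cited data quality attributes for machine-learning input data, such as completeness, uniqueness and range validity, might be computed for a tabular dataset; the column names and thresholds are invented for the example.

```python
import pandas as pd

def quality_report(df: pd.DataFrame, valid_ranges: dict) -> dict:
    """Compute a few simple data quality attributes for a tabular dataset."""
    report = {}
    # Completeness: share of non-missing cells per column
    report["completeness"] = (1 - df.isna().mean()).to_dict()
    # Uniqueness: share of exactly duplicated rows
    report["duplicate_rows"] = float(df.duplicated().mean())
    # Validity: share of values inside an expected numeric range
    report["validity"] = {
        col: float(df[col].between(lo, hi).mean())
        for col, (lo, hi) in valid_ranges.items() if col in df
    }
    return report

if __name__ == "__main__":
    data = pd.DataFrame({"age": [25, None, 41, 300], "income": [30e3, 42e3, 42e3, -5.0]})
    print(quality_report(data, {"age": (0, 120), "income": (0, 1e7)}))
```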
APA, Harvard, Vancouver, ISO, and other styles
2

He, Ying. "Spatial data quality management." Thesis, University of New South Wales, Surveying & Spatial Information Systems, 2008. http://handle.unsw.edu.au/1959.4/43323.

Full text
Abstract:
The applications of geographic information systems (GIS) in various areas have highlighted the importance of data quality. Data quality research has been given a priority by GIS academics for three decades. However, the outcomes of data quality research have not been sufficiently translated into practical applications. Users still need a GIS capable of storing, managing and manipulating data quality information. To fill this gap, this research aims to investigate how we can develop a tool that effectively and efficiently manages data quality information to aid data users in better understanding and assessing the quality of their GIS outputs. Specifically, this thesis aims: 1. To develop a framework for establishing a systematic linkage between data quality indicators and appropriate uncertainty models; 2. To propose an object-oriented data quality model for organising and documenting data quality information; 3. To create data quality schemas for defining and storing the contents of metadata databases; 4. To develop a new conceptual model of data quality management; 5. To develop and implement a prototype system for enhancing the capability of data quality management in commercial GIS. Based on reviews of error and uncertainty modelling in the literature, a conceptual framework has been developed to establish the systematic linkage between data quality elements and appropriate error and uncertainty models. To overcome the limitations identified in the review and satisfy a series of requirements for representing data quality, a new object-oriented data quality model has been proposed. It enables data quality information to be documented and stored in a multi-level structure and to be integrally linked with spatial data to allow access, processing and graphic visualisation. A conceptual model for data quality management is proposed in which a data quality storage model, uncertainty models and visualisation methods are the three basic components. This model establishes the processes involved when managing data quality, emphasising the integration of uncertainty modelling and visualisation techniques. The above studies lay the theoretical foundations for the development of a prototype system with the ability to manage data quality. An object-oriented approach, database technology and programming technology have been integrated to design and implement the prototype system within the ESRI ArcGIS software. The object-oriented approach allows the prototype to be developed in a more flexible and easily maintained manner. The prototype allows users to browse and access data quality information at different levels. Moreover, a set of error and uncertainty models is embedded within the system. With the prototype, data quality elements can be extracted from the database and automatically linked with the appropriate error and uncertainty models, as well as with their implications in the form of simple maps. This function offers users a set of different uncertainty models from which to choose when assessing how the uncertainty inherent in the data can affect their specific application. It will significantly increase users' confidence in using data for a particular situation. To demonstrate the enhanced capability of the prototype, the system has been tested against real data. The implementation has shown that the prototype can efficiently assist data users, especially non-expert users, to better understand data quality and utilise it in a more practical way.
The methodologies and approaches for managing quality information presented in this thesis should serve as an impetus for supporting further research.
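As a hedged illustration of the multi-level idea described above (not code from the thesis, which implements its model inside ESRI ArcGIS), the sketch below attaches quality elements at feature-class and feature level, with feature-level values overriding class-level defaults; all class and attribute names are assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class QualityElement:
    name: str          # e.g. "positional_accuracy"
    value: float       # e.g. RMSE in metres
    method: str = ""   # how the value was determined

@dataclass
class QualityRecord:
    elements: List[QualityElement] = field(default_factory=list)

    def get(self, name: str) -> Optional[QualityElement]:
        return next((e for e in self.elements if e.name == name), None)

@dataclass
class Feature:
    feature_id: int
    quality: QualityRecord = field(default_factory=QualityRecord)

@dataclass
class FeatureClass:
    name: str
    features: Dict[int, Feature] = field(default_factory=dict)
    quality: QualityRecord = field(default_factory=QualityRecord)

    def quality_for(self, feature_id: int, element: str) -> Optional[QualityElement]:
        # Multi-level lookup: a feature-level value overrides the class-level default.
        feat = self.features.get(feature_id)
        if feat and feat.quality.get(element):
            return feat.quality.get(element)
        return self.quality.get(element)
```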
APA, Harvard, Vancouver, ISO, and other styles
3

Tsai, Eva Y. (Eva Yi-hua). "Inter-database data quality management : a relational-model based approach." Thesis, Massachusetts Institute of Technology, 1996. http://hdl.handle.net/1721.1/40202.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Mallur, Vikram. "A Model for Managing Data Integrity." Thesis, Université d'Ottawa / University of Ottawa, 2011. http://hdl.handle.net/10393/20233.

Full text
Abstract:
Consistent, accurate and timely data are essential to the functioning of a modern organization. Managing the integrity of an organization’s data assets in a systematic manner is a challenging task in the face of continuous update, transformation and processing to support business operations. Classic approaches to constraint-based integrity focus on logical consistency within a database and reject any transaction that violates consistency, but leave unresolved how to fix or manage violations. More ad hoc approaches focus on the accuracy of the data and attempt to clean data assets after the fact, using queries to flag records with potential violations and using manual efforts to repair. Neither approach satisfactorily addresses the problem from an organizational point of view. In this thesis, we provide a conceptual model of constraint-based integrity management (CBIM) that flexibly combines both approaches in a systematic manner to provide improved integrity management. We perform a gap analysis that examines the criteria that are desirable for efficient management of data integrity. Our approach involves creating a Data Integrity Zone and an On Deck Zone in the database for separating the clean data from data that violates integrity constraints. We provide tool support for specifying constraints in a tabular form and generating triggers that flag violations of dependencies. We validate this by performing case studies on two systems used to manage healthcare data: PAL-IS and iMED-Learn. Our case studies show that using views to implement the zones does not cause any significant increase in the running time of a process.
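The two-zone idea lends itself to a short sketch. The thesis realises the zones with database views and generated triggers; the hypothetical Python below only illustrates the routing logic, with invented constraint names and records.

```python
from typing import Callable, Dict, List, Tuple

Record = Dict[str, object]
Constraint = Tuple[str, Callable[[Record], bool]]

def partition(records: List[Record], constraints: List[Constraint]):
    """Route records satisfying every constraint to the integrity zone;
    annotate the rest with their violations for the on-deck zone."""
    integrity_zone, on_deck_zone = [], []
    for rec in records:
        violations = [name for name, check in constraints if not check(rec)]
        if violations:
            on_deck_zone.append({**rec, "_violations": violations})
        else:
            integrity_zone.append(rec)
    return integrity_zone, on_deck_zone

constraints = [
    ("age_range", lambda r: isinstance(r.get("age"), (int, float)) and 0 <= r["age"] <= 120),
    ("id_present", lambda r: r.get("patient_id") is not None),
]
clean, flagged = partition(
    [{"patient_id": 1, "age": 34}, {"patient_id": None, "age": 250}], constraints
)
print(len(clean), "clean record(s),", len(flagged), "flagged record(s)")
```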
APA, Harvard, Vancouver, ISO, and other styles
5

Akintoye, Samson Busuyi. "Quality of service in cloud computing: Data model; resource allocation; and data availability and security." University of the Western Cape, 2019. http://hdl.handle.net/11394/7066.

Full text
Abstract:
Philosophiae Doctor - PhD
Recently, massive migration of enterprise applications to the cloud has been recorded in the Information Technology (IT) world. The number of cloud providers offering their services and the number of cloud customers interested in using such services are rapidly increasing. However, one of the challenges of cloud computing is Quality-of-Service management, which denotes the level of performance, reliability, and availability offered by cloud service providers. Quality-of-Service is fundamental to cloud service providers, who must find the right tradeoff between Quality-of-Service levels and operational cost. In order to find the optimal tradeoff, cloud service providers need to comply with service level agreement contracts, which define an agreement between cloud service providers and cloud customers. Service level agreements are expressed in terms of quality of service (QoS) parameters such as availability, scalability, performance and service cost. On the other hand, if the cloud service provider violates the service level agreement contract, the cloud customer can claim damages and penalties, which can result in revenue losses and possibly damage to the provider's reputation. Thus, the goal of any cloud service provider is to meet the service level agreements while reducing the total cost of offering its services.
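As a loosely related illustration (not from the thesis), the snippet below checks one QoS parameter, availability, against an SLA target and computes a simple service credit; the penalty scheme, rates and cap are assumptions for demonstration only.

```python
def sla_penalty(measured_availability: float,
                target_availability: float = 0.999,
                monthly_fee: float = 1000.0,
                credit_rate_per_0_1_pct: float = 0.05) -> float:
    """Return the service credit owed if the availability SLA target is missed."""
    if measured_availability >= target_availability:
        return 0.0
    shortfall_pct = (target_availability - measured_availability) * 100
    # Assumed scheme: 5% of the monthly fee per 0.1 percentage point of shortfall.
    credit = (shortfall_pct / 0.1) * credit_rate_per_0_1_pct * monthly_fee
    return min(credit, monthly_fee)  # cap the credit at the full monthly fee

print(sla_penalty(0.9971))  # 0.19 percentage points short of 99.9% -> credit owed
```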
APA, Harvard, Vancouver, ISO, and other styles
6

Malazizi, Ladan. "Development of Artificial Intelligence-based In-Silico Toxicity Models. Data Quality Analysis and Model Performance Enhancement through Data Generation." Thesis, University of Bradford, 2008. http://hdl.handle.net/10454/4262.

Full text
Abstract:
Toxic compounds, such as pesticides, are routinely tested against a range of aquatic, avian and mammalian species as part of the registration process. The need for reducing dependence on animal testing has led to an increasing interest in alternative methods such as in silico modelling. QSAR (Quantitative Structure Activity Relationship)-based models are already in use for predicting physicochemical properties, environmental fate, eco-toxicological effects, and specific biological endpoints for a wide range of chemicals. Data plays an important role in modelling QSARs and also in result analysis for toxicity testing processes. This research addresses a number of issues in predictive toxicology. One issue is the problem of data quality. Although a large amount of toxicity data is available from online sources, this data may contain unreliable samples and may be of low quality. Its presentation may also be inconsistent across sources, which makes access, interpretation and comparison of the information difficult. To address this issue, we started with a detailed investigation and experimental work on DEMETRA data. The DEMETRA datasets were produced by the EC-funded project DEMETRA. Based on the investigation, experiments and the results obtained, the author identified a number of data quality criteria in order to provide a solution for data evaluation in the toxicology domain. An algorithm has also been proposed to assess data quality before modelling. Another issue considered in the thesis was missing values in toxicology datasets. A least squares method for paired datasets and serial correlation for single-version datasets provided solutions in two different situations, and a procedural algorithm using these two methods has been proposed to overcome the problem of missing values. A further issue addressed in this thesis was the modelling of multi-class datasets in which a severely imbalanced class distribution exists. Imbalanced data affect the performance of classifiers during the classification process. We have shown that as long as we understand how class members are constructed in the dimensional space of each cluster, we can reform the distribution and provide more domain knowledge for the classifier.
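For the missing-value issue described above, a minimal sketch of least-squares imputation on a paired dataset is given below; the thesis's exact procedure (and its serial-correlation counterpart for single-version datasets) is not reproduced, and the toxicity values are invented.

```python
import numpy as np

def impute_paired(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Fill NaNs in b from a via a least-squares fit b ~ slope * a + intercept."""
    complete = ~np.isnan(a) & ~np.isnan(b)
    slope, intercept = np.polyfit(a[complete], b[complete], deg=1)
    b_filled = b.copy()
    missing = np.isnan(b) & ~np.isnan(a)
    b_filled[missing] = slope * a[missing] + intercept
    return b_filled

# Hypothetical paired toxicity endpoints (e.g. two species), with gaps in b
a = np.array([1.2, 2.0, 3.1, 4.0, 5.2])
b = np.array([2.3, 3.9, np.nan, 8.1, np.nan])
print(impute_paired(a, b))
```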
APA, Harvard, Vancouver, ISO, and other styles
7

Gol, Murat. "A New Field-Data Based EAF Model Applied to Power Quality Studies." Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/12611083/index.pdf.

Full text
Abstract:
Electric Arc Furnace (EAF) modeling has been a common research area to date. This thesis proposes a new field-data based, EAF-specific model. The data used in developing the proposed model are obtained from field measurements of instantaneous voltages and currents of EAF plants. The model captures the dynamic behavior of the EAF system including all its parts: the EAF transformer, the secondary circuit, the moving electrodes and the arc itself. It consists of a cascade-connected variable resistance and inductance combination to represent the variation in time of the fundamental frequency component, and a current source in parallel with it to inject the harmonic and interharmonic content of the EAF current. The proposed model is capable of representing both AC and DC EAFs, whose controllers' set points are the impedance values seen from the low voltage (LV) side of the EAF transformer. The validity of the proposed model has been verified by comparing EMTDC/PSCAD simulations of the model with the field measurements. The results show quite satisfactory correlation between the behavior of the proposed model and the actual EAF operation. To show the advantages of the model when developing FACTS solutions for mitigating power quality (PQ) problems at a busbar supplying single- or multi-EAF installations, various applications are presented.
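Purely as an illustration of the model structure described above (not the EMTDC/PSCAD implementation), the snippet below synthesises an EAF-like current as a fundamental component with slowly varying amplitude plus an injected harmonic/interharmonic term; all amplitudes and frequencies are arbitrary example values.

```python
import numpy as np

fs, f1, duration = 10_000, 50.0, 0.2        # sample rate (Hz), fundamental (Hz), seconds
t = np.arange(0.0, duration, 1.0 / fs)

# Fundamental with slowly varying amplitude (stand-in for the variable R-L branch)
i_fundamental = (1.0 + 0.2 * np.sin(2 * np.pi * 5 * t)) * np.sin(2 * np.pi * f1 * t)

# Injected content, e.g. a 3rd harmonic plus a 117 Hz interharmonic (current-source branch)
i_injected = 0.15 * np.sin(2 * np.pi * 3 * f1 * t) + 0.08 * np.sin(2 * np.pi * 117 * t)

i_eaf = i_fundamental + i_injected          # total EAF phase current, per unit
print(i_eaf[:5])
```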
APA, Harvard, Vancouver, ISO, and other styles
8

Castillo, Luis Felipe, Carlos Raymundo, and Francisco Dominguez Mateos. "Information architecture model for data governance initiatives in Peruvian universities." Association for Computing Machinery, Inc, 2017. http://hdl.handle.net/10757/656361.

Full text
Abstract:
The full text of this work is not available in the UPC Academic Repository due to restrictions imposed by the publisher.
This research revealed the need to design an information architecture model for data governance in order to reduce the gap between information technology and information management. The model is designed to strike a balance between the need to invest in technology and the ability to manage the information that originates from the use of those technologies, as well as to measure with greater precision the generation of IT value through the use of quality information and user satisfaction. To test the model, a case study was conducted in the higher education sector in Peru to demonstrate successful data governance projects using this model.
APA, Harvard, Vancouver, ISO, and other styles
9

Borglund, Erik. "A predictive model for attaining quality in recordkeeping." Licentiate thesis, Mid Sweden University, Department of Information Technology and Media, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-42.

Full text
Abstract:

Records are a subset of information and recordkeeping requirements demand that a record is managed with maintained authenticity and reliability, i.e. with high quality. Records are evidence of transactions and are used and managed in daily work processes. Records may be preserved for anything from milliseconds to eternity. With computer based information systems the electronic record was born: a record that is born digital. With electronic records problems regarding maintenance of authenticity and reliability have been identified. Electronic records are no longer physical entities as traditional records were. An electronic record is a logical entity that can be spread over different locations in a computer based information system. In this research the aim is to improve the possibility of reaching high quality in recordkeeping systems, i.e. to maintain reliability and authenticity of electronic records, which is necessary if electronic records are to be usable as evidence of transactions. Based on case studies and literature studies, a recordkeeping quality model is presented: a predictive model for attaining quality in recordkeeping. The recordkeeping quality model consists of four major concepts which are interrelated with each other: Electronic records, Records use, Electronic record quality, and Multidimensional perspective. The model is proposed for use when designing and developing computer based information systems which are required to be recordkeeping, systems which manage electronic records. In this research two results beside the recordkeeping quality model are emphasized. The first is that quality in recordkeeping must be seen in a multidimensional perspective, and the second is that recordkeeping systems are information systems with a partially unknown purpose.

APA, Harvard, Vancouver, ISO, and other styles
10

Rogers, David R. "A model based approach for determining data quality metrics in combustion pressure measurement. A study into a quantitative based improvement in data quality." Thesis, University of Bradford, 2014. http://hdl.handle.net/10454/14100.

Full text
Abstract:
This thesis details a process for the development of reliable metrics that could be used to assess the quality of combustion pressure measurement data - important data used in the development of internal combustion engines. The approach employed in this study was a model-based technique, in conjunction with a simulation environment - producing data-based models from a number of strategically defined measurement points. A simulation environment was used to generate error data sets, from which models of calculated result responses were built. This data was then analysed to determine the results with the best response to error stimulation. The methodology developed allows a rapid prototyping phase where newly developed result calculations may be simulated, tested and evaluated quickly and efficiently. Adopting these newly developed processes and procedures allowed an effective evaluation of several groups of result classifications with respect to the major sources of error encountered in typical combustion measurement procedures. In summary, the output gained from this work was that certain result groups could be stated as having an unreliable response to error simulation and could therefore be discounted quickly. These results were clearly identifiable from the data and hence, for the given errors, alternative methods to identify the error sources are proposed within this thesis. However, other results had a predictable response to certain error stimuli; hence, it was feasible to state the possibility of using these results in data quality assessment, or at least to establish boundaries surrounding their application for this usage. Interactions in responses were also clearly visible using the model-based sensitivity analysis proposed. The output of this work provides a solid foundation of information from which further work and investigation would be feasible, in order to achieve the ultimate goal of a full set of metrics by which combustion data quality could be accurately and objectively assessed.
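A hedged sketch of the error-injection idea is shown below: a defined error (here a pressure offset) is applied to a synthetic cylinder-pressure trace and the relative response of two simple results is recorded. The thesis's error types and result calculations are far more extensive; everything in the snippet is illustrative.

```python
import numpy as np

theta = np.linspace(-180, 180, 721)                    # crank angle, degrees
p_true = 20 + 40 * np.exp(-((theta - 10) / 25) ** 2)   # synthetic pressure trace, bar

def results(p):
    return {"p_max": float(p.max()), "p_mean": float(p.mean())}

baseline = results(p_true)
for offset in (-1.0, -0.5, 0.5, 1.0):                  # injected zero-line error, bar
    r = results(p_true + offset)
    response = {k: round((r[k] - baseline[k]) / baseline[k], 4) for k in r}
    print(f"offset {offset:+.1f} bar -> relative response {response}")
```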
APA, Harvard, Vancouver, ISO, and other styles
11

Lotter, Norman Owen. "A quality control model for the development of high-confidence flotation test data." Master's thesis, University of Cape Town, 1995. http://hdl.handle.net/11427/18298.

Full text
Abstract:
This thesis addresses the problem of obtaining reliable laboratory-scale flotation test data for the Merensky ore type found in the Bushveld complex of South Africa. The complex nature of the platinum-group element (PGE) deportment in this ore renders the normally practiced procedures inappropriate for this particular testwork. A more robust and thorough procedure is necessary because of the diverse mineralogical forms in which the PGE are found. The evaluation of the mass and value balances accordingly has to take these factors into account. The major features of the evaluation of input and output errors across the laboratory-scale flotation test are analysed. It is found that unless the size-by-size variance of PGE in a conventionally crushed mill feed is taken into account, the mill feed sample size is underestimated by some 176%. Further, the preparation of a reference distribution of assayed head material is necessary to provide the 95% confidence limits of the grade estimate. The need for repeating flotation tests and compositing the adjudicated products is discussed, concluding that quintuplicates are suitable to achieve a desirable level of confidence in the built-up head grade. The sample preparation of the flotation products has a critical role in minimising evaluation errors, as is the case with fire assaying of samples, for which minimum numbers of replicate determinations have been calculated. An outlier rejection model for adjudication of the replicate built-up head grades is proposed, and a complete flowsheet of the quality control model is developed from first principles. The effect of this model on the PGE total balance is analysed. It is concluded that workable controls are defined, since a metal balance with < 1% error has been achieved.
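As an illustrative sketch only (the thesis develops its own adjudication model), the snippet below screens quintuplicate built-up head grades with a Grubbs-style outlier test and reports a 95% confidence interval for the mean; the grades and the choice of test are assumptions.

```python
import numpy as np
from scipy import stats

grades = np.array([4.62, 4.71, 4.58, 5.40, 4.66])   # replicate built-up head grades, g/t

def grubbs_reject(x, alpha=0.05):
    """Drop the single most extreme value if it fails a two-sided Grubbs test."""
    n = len(x)
    g = np.abs(x - x.mean()).max() / x.std(ddof=1)
    t2 = stats.t.ppf(1 - alpha / (2 * n), n - 2) ** 2
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t2 / (n - 2 + t2))
    if g > g_crit:
        return np.delete(x, np.abs(x - x.mean()).argmax())
    return x

kept = grubbs_reject(grades)
ci = stats.t.interval(0.95, len(kept) - 1, loc=kept.mean(), scale=stats.sem(kept))
print(f"accepted mean = {kept.mean():.2f} g/t, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```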
APA, Harvard, Vancouver, ISO, and other styles
12

Iyer, Vasanth. "Ensemble Stream Model for Data-Cleaning in Sensor Networks." FIU Digital Commons, 2013. http://digitalcommons.fiu.edu/etd/973.

Full text
Abstract:
Ensemble stream modeling and data cleaning are sensor information processing systems with different training and testing methods by which their goals are cross-validated. This research examines a mechanism which seeks to extract novel patterns by generating ensembles from data. The main goal of label-less stream processing is to process the sensed events so as to eliminate uncorrelated noise and choose the most likely model without overfitting, thus obtaining higher model confidence. Higher-quality streams can be realized by combining many short streams into an ensemble which has the desired quality. The framework for the investigation is an existing data mining tool. First, to accommodate feature extraction for a bush or natural forest-fire event, we take the burnt area (BA*), sensed ground truth obtained from logs, as our target variable. Even though this is an obvious model choice, the results are disappointing. There are two reasons for this: first, the histogram of fire activity is highly skewed; second, the measured sensor parameters are highly correlated. Since using non-descriptive features does not yield good results, we resort to temporal features. By doing so we carefully eliminate the averaging effects; the resulting histogram is more satisfactory and conceptual knowledge is learned from the sensor streams. Second is the process of feature induction by cross-validating attributes with single or multi-target variables to minimize training error. We use the F-measure score, which combines precision and recall, to determine the false alarm rate of fire events. The multi-target data-cleaning trees use information purity of the target leaf nodes to learn higher-order features. A sensitive variance measure such as the f-test is performed during each node's split to select the best attribute. The ensemble stream model approach proved to improve when using complicated features with a simpler tree classifier. The ensemble framework for data cleaning and the enhancements to quantify quality of fitness (30% spatial, 10% temporal, and 90% mobility reduction) of sensors led to the formation of streams for sensor-enabled applications, which further motivates the novelty of stream quality labeling and its importance in handling the vast amounts of real-time mobile streams generated today.
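For reference, the F-measure mentioned above combines precision and recall into a single score; the small sketch below computes it for invented fire-event detection counts.

```python
def f_measure(tp: int, fp: int, fn: int, beta: float = 1.0) -> float:
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# e.g. 42 correctly detected fire events, 7 false alarms, 11 missed events
print(f"F1 = {f_measure(tp=42, fp=7, fn=11):.3f}")
```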
APA, Harvard, Vancouver, ISO, and other styles
13

Krause, Lennard. "Assessment of Single Crystal X-ray Diffraction Data Quality." Doctoral thesis, Niedersächsische Staats- und Universitätsbibliothek Göttingen, 2017. http://hdl.handle.net/11858/00-1735-0000-0023-3DD4-A.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Omotoso, Toyin. "Water quality profiling of rivers in a data-poor area, southwest Nigeria." Thesis, University of Manchester, 2016. https://www.research.manchester.ac.uk/portal/en/theses/water-quality-profiling-of-rivers-in-a-datapoor-area-southwest-nigeria(5b376bc7-0ff3-402c-ba2a-484ec2951e85).html.

Full text
Abstract:
The current state of the art in water quality profiling is reviewed to lay a foundation for addressing concerns over poor data in developing countries, which have not been adequately covered by previous models. A particular focus is placed on the Ogbese River, southwest Nigeria, as a case study. A process-based model with data-filling capability is proposed, which transforms processes into an event as a reasonably easy way of assessing and predicting river-water quality under constraints in data collection. The structure of the study involves: (i) hydrologic modelling, (ii) hydraulic load modelling and (iii) instream water quality modelling. The hydrologic modelling assesses and makes use of satellite-based rainfall estimates subject to processing and reliability tests. A modification to the conceptual relationship of the rainfall distribution frequency was derived which makes the model output sensitive to the season. The hydraulic load modelling integrates diffuse sources of pollutant as spatial data in combination with the catchment runoff. A distance decay weighting factor was introduced into the export coefficient to better determine the effective load delivered into the stream. The utility of the model, implemented on the WASP platform, was demonstrated by showing how it can be used for scenario testing. Different modelling concepts were evaluated in view of their ability to produce predictions under changing circumstances, using the predictions as a guide to management. This study promotes a knowledge base in water quality processes by evaluating the processes which lead to the end product rather than relying on data monitoring. The study structures understanding of the phenomena that characterise river water quality and tailors it towards regulatory applications and catchment planning. It also provides a sustainable strategy to predict river water quality, evaluate the risks, and take proactive action in setting up an early warning system for data-poor regions.
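A minimal sketch of the distance-decay idea, assuming an exponential weighting (the thesis derives its own factor), is shown below; the export coefficients, areas and distances are placeholder values.

```python
import math

# (export coefficient kg/ha/yr, contributing area ha, distance to the stream km)
sources = [
    (12.0, 35.0, 0.4),
    (7.5, 80.0, 1.6),
    (20.0, 10.0, 3.0),
]

def effective_load(sources, k=0.8):
    """Total diffuse load delivered to the stream, weighted by exp(-k * distance)."""
    return sum(e * a * math.exp(-k * d) for e, a, d in sources)

print(f"effective load = {effective_load(sources):.1f} kg/yr")
```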
APA, Harvard, Vancouver, ISO, and other styles
15

Nilsson, Kevin. "Material modeling in Sheet Metal Forming Simulations: Quality comparison between commonly used material models." Thesis, Blekinge Tekniska Högskola, Institutionen för maskinteknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-18227.

Full text
Abstract:
In today's automotive industry, many different simulation programs are used to optimize parts before they go into production. This has created a market for complex material models that give the best possible approximation of reality in the simulation environment. Several industries still use older material models that cannot give acceptable accuracy for the materials currently in use, as they are based on much simpler and older materials. The problem is that there is no direct comparison between the material models, which leads to several sheet metal forming companies still holding on to older models like Hill'48. The purpose of this work is to create a comparison of sheet material models from a user perspective in order to provide recommendations. Different models are tested for different materials, based on AutoForm's recommendations. AutoForm is an FEM-based sheet metal forming simulation program used by large names in the automotive industry. The recommended models are Vegter 2017, BBC 2005 or Hill'48 for steel and Vegter 2017, BBC 2005 or Barlat'89 for aluminum. The work is carried out by comparing experimental data from a Limiting Dome Height (LDH) test with a simulation of this test for all material models and then comparing the results. The compared data consist of the major and minor strain in the sheet as well as the punch force. These parameters are chosen because they give an overview of each model's applicability and accuracy. The test is performed on all materials available in Volvo Cars' material library to create a broader overview of all material models. The material models are also evaluated for user-friendliness by analyzing what types of input data are required to perform a simulation. The results show that BBC 2005 should be recommended for aluminum and steel for companies that have access to biaxial data and put optimization in focus. Hill'48 proved far too deviant in the results for steel and should not be used if other models are available. Vegter 2017 proved well suited for steel simulations, as the results were good and the necessary material data can be obtained through standardized tensile tests. The results also show that Vegter 2017 should not be used for aluminum, since its results deviated too much from the experimental data in both shape approximation and strain magnitude. Barlat'89 gave accurate results with only data from a tensile test, which makes it a preferred model when working with aluminum. The conclusion from this work is that the choice of material model depends strongly on the conditions at hand, as very few industries have access to the tests required by the BBC 2005 model. A further conclusion is that Barlat'89 for aluminum and Vegter 2017 for steel can be preferred when working within a small timeframe or when little test data is available.
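As an illustrative aside (not part of the study), one simple way to rank material models against the LDH experiment is a normalised RMSE between predicted and measured curves, as in the sketch below; the force values are placeholders, not data from the work.

```python
import numpy as np

def nrmse(simulated: np.ndarray, measured: np.ndarray) -> float:
    """Root-mean-square error normalised by the measured range."""
    rmse = np.sqrt(np.mean((simulated - measured) ** 2))
    return float(rmse / (measured.max() - measured.min()))

measured_force = np.array([0.0, 12.0, 25.0, 38.0, 47.0])   # punch force (kN), experiment
predictions = {                                             # placeholder simulation output
    "BBC2005":    np.array([0.0, 11.5, 24.6, 37.8, 46.5]),
    "Hill48":     np.array([0.0, 10.1, 22.0, 34.5, 42.0]),
    "Vegter2017": np.array([0.0, 11.8, 24.9, 38.2, 47.1]),
}
for model in sorted(predictions, key=lambda m: nrmse(predictions[m], measured_force)):
    print(f"{model}: NRMSE = {nrmse(predictions[m], measured_force):.3f}")
```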
APA, Harvard, Vancouver, ISO, and other styles
16

Medina, Calderon Richard. "Using a radiative transfer model in conjunction with UV-MFRSR irradiance data for studying aerosols in El Paso-Juarez airshed." To access this resource online via ProQuest Dissertations and Theses @ UTEP, 2009. http://0-proquest.umi.com.lib.utep.edu/login?COPT=REJTPTU0YmImSU5UPTAmVkVSPTI=&clientId=2515.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Abdioskouei, Maryam. "Improving Air Quality Prediction Through Characterizing the Model Errors Using Data from Comprehensive Field Experiments." Thesis, The University of Iowa, 2019. http://pqdtopen.proquest.com/#viewpdf?dispub=13420451.

Full text
Abstract:

Uncertainty in the emission estimates is one of the main reasons for shortcomings in Chemistry Transport Models (CTMs), which can reduce the confidence level of impact assessments of anthropogenic activities on air quality and climate. This dissertation focuses on understanding the uncertainties within the CTMs and reducing these uncertainties by improving emission estimates.

The first part of this dissertation focuses on reducing the uncertainties around the emission estimates from oil and Natural Gas (NG) operations by using various observations and high-resolution CTMs. To achieve this goal, we used the Weather Research and Forecasting with Chemistry (WRF-Chem) model in conjunction with extensive measurements from two major field campaigns in Colorado. Ethane was used as the indicator of oil and NG emissions to explore the sensitivity of ethane to different physical parametrizations and simulation set-ups in the WRF-Chem model using the U.S. EPA National Emission Inventory (NEI-2011). The sensitivity analysis shows up to 57.3% variability in the modeled ethane normalized mean bias (NMB) across the simulations, which highlights the important role of model configuration in model performance.

Comparison between airborne measurements and the sensitivity simulations shows a model-measurement bias of ethane of up to -15 ppb (NMB of -80%) in regions close to oil and NG activities. Under-prediction of ethane concentration in all sensitivity runs suggests an actual under-estimation of the oil and NG emissions in the NEI-2011 in Colorado. To reduce the error in the emission inventory, we developed a three-dimensional variational inversion technique. Through this method, optimal scaling factors of up to 6 for ethane emission rates were calculated. Overall, the inversion method estimated between 11% and 15% higher ethane emission rates in the Denver-Julesburg basin compared to the NEI-2011. This method can be extended to constrain oil and NG emissions in other regions of the US using the available measurement datasets.

The second part of the dissertation discusses the University of Iowa high-resolution chemical weather forecast framework using WRF-Chem, designed for the Lake Michigan Ozone Study (LMOS-2017). The LMOS field campaign took place during summer 2017 to address high ozone episodes in coastal communities surrounding Lake Michigan. Evaluation of the model performance for clouds, on-shore flows, and surface- and aircraft-sampled ozone and NOx concentrations found that the model successfully captured much of the observed synoptic variability of onshore flows. Selection of the High-Resolution Rapid Refresh (HRRR) model for the initial and boundary conditions, and of the Noah land surface model, significantly improved the comparison of meteorological variables to both ground-based and aircraft data. The model consistently underestimated the daily maximum concentration of ozone, and emission sensitivity analysis suggests that an increase in hydrocarbon (HC) emissions may be needed. A variational inversion method, together with measurements by the GeoTAS and TROPOMI instruments and airborne and ground-based measurements, can be used to constrain NOx emissions in the region.

APA, Harvard, Vancouver, ISO, and other styles
18

Abdioskouei, Maryam. "Improving air quality prediction through characterizing the model errors using data from comprehensive field experiments." Diss., University of Iowa, 2018. https://ir.uiowa.edu/etd/6535.

Full text
Abstract:
Uncertainty in the emission estimates is one of the main reasons for shortcomings in Chemistry Transport Models (CTMs), which can reduce the confidence level of impact assessments of anthropogenic activities on air quality and climate. This dissertation focuses on understanding the uncertainties within the CTMs and reducing these uncertainties by improving emission estimates. The first part of this dissertation focuses on reducing the uncertainties around the emission estimates from oil and Natural Gas (NG) operations by using various observations and high-resolution CTMs. To achieve this goal, we used the Weather Research and Forecasting with Chemistry (WRF-Chem) model in conjunction with extensive measurements from two major field campaigns in Colorado. Ethane was used as the indicator of oil and NG emissions to explore the sensitivity of ethane to different physical parametrizations and simulation set-ups in the WRF-Chem model using the U.S. EPA National Emission Inventory (NEI-2011). The sensitivity analysis shows up to 57.3% variability in the modeled ethane normalized mean bias (NMB) across the simulations, which highlights the important role of model configuration in model performance. Comparison between airborne measurements and the sensitivity simulations shows a model-measurement bias of ethane of up to -15 ppb (NMB of -80%) in regions close to oil and NG activities. Under-prediction of ethane concentration in all sensitivity runs suggests an actual under-estimation of the oil and NG emissions in the NEI-2011 in Colorado. To reduce the error in the emission inventory, we developed a three-dimensional variational inversion technique. Through this method, optimal scaling factors of up to 6 for ethane emission rates were calculated. Overall, the inversion method estimated between 11% and 15% higher ethane emission rates in the Denver-Julesburg basin compared to the NEI-2011. This method can be extended to constrain oil and NG emissions in other regions of the US using the available measurement datasets. The second part of the dissertation discusses the University of Iowa high-resolution chemical weather forecast framework using WRF-Chem, designed for the Lake Michigan Ozone Study (LMOS-2017). The LMOS field campaign took place during summer 2017 to address high ozone episodes in coastal communities surrounding Lake Michigan. Evaluation of the model performance for clouds, on-shore flows, and surface- and aircraft-sampled ozone and NOx concentrations found that the model successfully captured much of the observed synoptic variability of onshore flows. Selection of the High-Resolution Rapid Refresh (HRRR) model for the initial and boundary conditions, and of the Noah land surface model, significantly improved the comparison of meteorological variables to both ground-based and aircraft data. The model consistently underestimated the daily maximum concentration of ozone, and emission sensitivity analysis suggests that an increase in hydrocarbon (HC) emissions may be needed. A variational inversion method, together with measurements by the GeoTAS and TROPOMI instruments and airborne and ground-based measurements, can be used to constrain NOx emissions in the region.
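A drastically simplified, scalar analogue of the variational inversion described above is sketched below: a single scaling factor s is found by minimising a cost that penalises both the model-observation mismatch and departure from the prior inventory (s = 1). The real inversion is three-dimensional, and all numbers here are invented.

```python
import numpy as np
from scipy.optimize import minimize_scalar

obs = np.array([4.1, 5.3, 4.8])        # observed ethane, ppb (invented)
modeled = np.array([0.9, 1.2, 1.0])    # modeled ethane with the prior inventory, ppb
sigma_o, sigma_b = 0.5, 2.0            # observation / background uncertainties

def cost(s):
    # Observation term + background term pulling s towards the prior (s = 1)
    return (np.sum((s * modeled - obs) ** 2) / sigma_o ** 2
            + (s - 1.0) ** 2 / sigma_b ** 2)

s_opt = minimize_scalar(cost, bounds=(0.1, 10.0), method="bounded").x
print(f"optimal emission scaling factor: {s_opt:.2f}")
```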
APA, Harvard, Vancouver, ISO, and other styles
19

Lee, Ming-Chieh. "Assessing environmental equivalents for water quality trading." Diss., Manhattan, Kan. : Kansas State University, 2009. http://hdl.handle.net/2097/2245.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Paulo, Evânio Mascarenhas. "Determination of Employment Quality Level: An Essay in Dynamic Panel Model." Universidade Federal do Ceará, 2015. http://www.teses.ufc.br/tde_busca/arquivo.php?codArquivo=13667.

Full text
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico
This essay addresses questions concerning the level of quality of employment in Brazil in recent years, especially rural employment. The starting point of the research is the perception of changes in the relations of production in the countryside and their impacts on labor relations. Theoretical studies point to a process of convergence between the urban and rural spheres, from the point of view of both production and labor relations, characterizing an urbanization that extends beyond the limits of the cities. A specific methodology is therefore applied in this study to capture possible determinants of employment quality. In addition, an index is proposed which is believed to give an idea of the quality level of the labor markets analyzed here. A set of four panel-data equations is then estimated to identify possible determinants of the proposed employment quality index. The index revealed a profound heterogeneity in labor relations, both across economic activities and across the areas studied. In general, agricultural workers face more precarious working conditions than their non-agricultural counterparts. A further asymmetry in the labor market is the difference between urban and rural employment: the rural setting remains more precarious than the urban one, although the differences have been decreasing over time. Regarding the response of employment quality to economic growth, agricultural employment responds inversely to growth in agriculture, since it is in this sector that the worst levels of employment quality are found, and positively to the growth of non-agricultural activities, where hiring arrangements are less precarious. In the case of non-agricultural employment, economic growth has apparently not had major impacts on employment quality; growth in this case merely extends to newly hired workers the same forms of contract that already exist, without changing the average level of employment quality in the labor market as a whole. The research also showed that growth in labor income and in the average educational level of workers are important instruments not only for raising levels of employment quality, but also as a strategy for overcoming the "dilemmas" of the markets surveyed, such as the heterogeneity observed among the groups studied.
APA, Harvard, Vancouver, ISO, and other styles
21

Zhao, Xiaoyun. "Road network and GPS tracking with data processing and quality assessment." Licentiate thesis, Högskolan Dalarna, Mikrodataanalys, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:du-17354.

Full text
Abstract:
GPS technology is nowadays embedded into portable, low-cost electronic devices to track the movements of mobile objects. This has greatly impacted the transportation field by creating a novel and rich source of traffic data on the road network. Although GPS devices hold significant promise for overcoming problems like underreporting, respondent fatigue, inaccuracies and other human errors in data collection, the technology is still relatively new and raises many issues for potential users. These issues tend to revolve around the following areas: reliability, data processing and related applications. This thesis aims to study GPS tracking from the methodological, technical and practical aspects. It first evaluates the reliability of GPS-based traffic data using data from an experiment involving three different traffic modes (car, bike and bus) travelling along the road network. It then outlines the general procedure for processing GPS tracking data and discusses related issues uncovered by using real-world GPS tracking data from 316 cars. Thirdly, it investigates the influence of road network density on finding optimal locations for enhancing travel efficiency and decreasing travel cost. The results show that the geographical positioning is reliable. Velocity is slightly underestimated, whereas altitude measurements are unreliable. Post-processing techniques with auxiliary information are found to be necessary and important when addressing the inaccuracy of GPS data. The density of the road network influences the finding of optimal locations; the influence stabilizes at a certain level and does not deteriorate when the node density is higher.
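One routine step in processing such tracking data, deriving speed between consecutive fixes and flagging implausible jumps, is illustrated below; the coordinates and the speed threshold are assumptions, not values from the thesis.

```python
import math
from datetime import datetime

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

fixes = [  # (timestamp, latitude, longitude) -- invented example points
    (datetime(2015, 5, 1, 8, 0, 0), 60.4840, 15.4370),
    (datetime(2015, 5, 1, 8, 0, 10), 60.4846, 15.4392),
]
for (t0, la0, lo0), (t1, la1, lo1) in zip(fixes, fixes[1:]):
    d = haversine_m(la0, lo0, la1, lo1)
    v = d / (t1 - t0).total_seconds() * 3.6   # km/h
    print(f"{d:.1f} m, {v:.1f} km/h", "(implausible)" if v > 200 else "")
```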
APA, Harvard, Vancouver, ISO, and other styles
22

Fair, Martin Lynn. "Geo-demographic analysis in support of the United States Army Reserve (USAR) Unit Positioning and Quality Assessment Model (UPQUAM)." Thesis, Monterey California. Naval Postgraduate School, 2004. http://hdl.handle.net/10945/1592.

Full text
Abstract:
Manning United States Army Reserve (USAR) units is fundamentally different from manning Regular Army (RA) units. A soldier assigned to a USAR unit must live within 75 miles or a 90-minute commute of his Reserve Center (RC). This makes reserve unit positioning a key factor in the ability to recruit to fill the unit. This thesis automates, documents, reconciles, and assembles data on over 30,000 ZIP Codes, over 800 RCs, and over 260 Military Occupational Specialties (MOSs), drawing on and integrating over a dozen disparate databases. This effort produces a single data file with demographic, vocational, and economic data on every ZIP Code in America, along with six-year results of its RA, USAR, and sister-service recruit production, and MOS suitability for each of the 264 MOSs. Preliminary model development accounts for about 70% of recruit-production variation by ZIP Code. This thesis also develops models for the top five MOSs to predict the maximum number of recruits obtainable from a ZIP Code for each MOS. Examples illustrate that ZIP Codes vary in their ability to provide recruits with sufficient aptitude for technical fields. Two subsequent theses will use these results. One completes the MOS models. The second uses the models as constraints in an optimization model to position RCs. An initial version of the optimization model is developed in this thesis. Together, the three theses will provide a powerful tool for analysis of strategic-based optimal reserve force stationing.
Lieutenant Colonel, United States Army
APA, Harvard, Vancouver, ISO, and other styles
23

Fasold, Mario. "Hybridization biases of microarray expression data - A model-based analysis of RNA quality and sequence effects." Doctoral thesis, Universitätsbibliothek Leipzig, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-116957.

Full text
Abstract:
Modern high-throughput technologies like DNA microarrays are powerful tools that are widely used in biomedical research. They target a variety of genomics applications ranging from gene expression profiling over DNA genotyping to gene regulation studies. However, the recent discovery of false positives among prominent research findings indicates a lack of awareness or understanding of the non-biological factors negatively affecting the accuracy of data produced using these technologies. The aim of this thesis is to study the origins, effects and potential correction methods for selected methodical biases in microarray data. The two-species Langmuir model serves as the basal physicochemical model of microarray hybridization, describing the fluorescence signal response of oligonucleotide probes. The so-called hook method allows essential model parameters to be estimated and summary parameters characterizing a particular microarray sample to be computed. We show that this method can be applied successfully to various types of microarrays which share the same basic mechanism of multiplexed nucleic acid hybridization. Using appropriate modifications of the model, we study RNA quality and sequence effects using publicly available data from Affymetrix GeneChip expression arrays. Varying amounts of hybridized RNA result in systematic changes of raw intensity signals and of appropriate indicator variables computed from them. Varying RNA quality strongly affects the intensity signals of probes which are located at the 3' end of transcripts. We develop new methods that help in assessing the RNA quality of a particular microarray sample. A new metric for determining RNA quality, the degradation index, is proposed, which improves on previous RNA quality metrics. Furthermore, we present a method for the correction of the 3' intensity bias. These functionalities have been implemented in the freely available program package AffyRNADegradation. We show that microarray probe signals are affected by sequence effects which are studied systematically using position-dependent nearest-neighbor models. Analysis of the resulting sensitivity profiles reveals that specific sequence patterns such as runs of guanines at the solution end of the probes have a strong impact on the probe signals. The sequence effects differ for different chip and target types, probe types and hybridization modes. Theoretical and practical solutions for the correction of the introduced sequence bias are provided. Assessment of RNA quality and sequence biases in a representative ensemble of over 8000 available microarray samples reveals that RNA quality issues are prevalent: about 10% of the samples have critically low RNA quality. Sequence effects exhibit considerable variation within the investigated samples but have limited impact on the most common patterns in the expression space. Variations in RNA quality and quantity, in contrast, have a significant impact on the obtained expression measurements. These hybridization biases should be considered and controlled in every microarray experiment to ensure reliable results. Application of rigorous quality control and signal correction methods is strongly advised to avoid erroneous findings. Also, incremental refinement of physicochemical models is a promising way to improve signal calibration, paralleled by the opportunity to better understand the fundamental processes in microarray hybridization.
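As a hedged illustration of the Langmuir-type signal response underlying the above (the hook method itself is not reproduced), the sketch below shows probe intensity saturating with increasing specific and non-specific target occupancy; all parameter values are arbitrary.

```python
import numpy as np

def probe_intensity(x_specific, x_nonspecific=0.0, m=10_000.0, background=50.0):
    """Langmuir-type response: intensity = background + M * fractional occupancy."""
    x_total = np.asarray(x_specific) + x_nonspecific
    theta = x_total / (1.0 + x_total)     # probe occupancy saturates towards 1
    return background + m * theta

x_s = np.logspace(-3, 2, 6)               # specific target "concentration", arbitrary units
print(probe_intensity(x_s, x_nonspecific=0.01))
```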
APA, Harvard, Vancouver, ISO, and other styles
24

Murray, Denise A. "A Data-Based Practice Model For Pessary Treatment Of Pelvic Organ Prolapse: A Quality Improvement Project." Diss., The University of Arizona, 2014. http://hdl.handle.net/10150/338875.

Full text
Abstract:
Background: Pelvic organ prolapse (POP) can be treated surgically or, more conservatively, with use of a pessary. Objective: To determine whether the population of women treated for POP with a pessary in one Nurse Practitioner's (NP) practice demonstrated health outcomes rated as better, the same, or needing improvement, through use of a data-based practice model applied to encounter data extracted from the electronic health record (EHR). Design: The project was a quality improvement (QI) project, descriptive in nature. One Plan-Do-Study-Act (PDSA) cycle was conducted. Setting: An NP-managed specialty clinic in urban southwestern Arizona that provides services to women with POP. Patients: Ten randomly selected women who had been treated conservatively for POP with use of a pessary were identified as two subpopulations and evaluated: women who received professional management of the pessary and women who were patient managed. Intervention: The intervention was the development of a data-based practice model, using patient profile data elements derived from the documented EHR encounters of the 10 women. Measurements: Twelve scales were developed to evaluate the patient profile data elements, generating numeric scores for each encounter. Two decision rules were then used to evaluate numeric scores by encounter, creating primary and secondary health outcomes. Limitations: Two limitations were identified. The QI project was limited by the small sample size of 10 patients; this is, however, true to PDSA guidelines that recommend small-scale cycles. The data were limited in that only documented data were used. Conclusions: In general, the expected outcome was the outcome observed; the provider was unaware of any women in this QI project who were not successfully treated with a pessary for POP. The value of this data-based practice model is that outcomes can be aggregated across populations rather than relying on recall of individual outcomes, and it therefore has potential to be used regularly and systematically as a quality feedback loop, as well as on a larger scale in future PDSA cycles to determine other outcomes beyond a single provider in this or other similar clinical populations.
APA, Harvard, Vancouver, ISO, and other styles
25

Smith, Edwin L. "A system dynamics computer model for long-term water quality planning." Thesis, Virginia Tech, 1985. http://hdl.handle.net/10919/41562.

Full text
Abstract:

The objective of this study was to develop a comprehensive, basin-wide, water-quality-planning model using system dynamics methodology. Later, the model was to be interfaced with a more conventional system dynamics model: one simulating social, technological, economic, and political interactions. By doing so, it is envisioned that such management policies as zoning, abatement facilities, and best management practices may be simulated together.


Master of Science
APA, Harvard, Vancouver, ISO, and other styles
26

Breuler, Lindsay Mildred. "Developing Ohio 4-H Horse Project Quality Indicators through the Analysis of Enrollment Data and Volunteer Leader Discourse: A Mixed Model Approach." The Ohio State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=osu1429549736.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Dargie, Waltenegus. "Impact of Random Deployment on Operation and Data Quality of Sensor Networks." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2010. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-32911.

Full text
Abstract:
Several applications have been proposed for wireless sensor networks, including habitat monitoring, structural health monitoring, pipeline monitoring, and precision agriculture. Among the desirable features of wireless sensor networks, one is the ease of deployment. Since the nodes are capable of self-organization, they can be placed easily in areas that are otherwise inaccessible to or impractical for other types of sensing systems. In fact, some have proposed the deployment of wireless sensor networks by dropping nodes from a plane, delivering them in an artillery shell, or launching them via a catapult from onboard a ship. There are also reports of actual aerial deployments, for example, the one carried out using an unmanned aerial vehicle (UAV) at a Marine Corps combat centre in California; the nodes were able to establish a time-synchronized, multi-hop communication network for tracking vehicles that passed along a dirt road. While this has practical relevance for some civil applications (such as rescue operations), a more realistic deployment involves the careful planning and placement of sensors. Even then, nodes may not be placed optimally to ensure that the network is fully connected and high-quality data pertaining to the phenomena being monitored can be extracted from the network. This work aims to address the problem of random deployment through two complementary approaches: The first approach aims to address the problem of random deployment from a communication perspective. It begins by establishing a comprehensive mathematical model to quantify the energy cost of various concerns of a fully operational wireless sensor network. Based on the analytic model, an energy-efficient topology control protocol is developed. The protocol sets an eligibility metric to establish and maintain a multi-hop communication path and to ensure that all nodes exhaust their energy in a uniform manner. The second approach focuses on addressing the problem of imperfect sensing from a signal processing perspective. It investigates the impact of deployment errors (calibration, placement, and orientation errors) on the quality of the sensed data and attempts to identify robust and error-agnostic features. If random placement is unavoidable and dense deployment cannot be supported, robust and error-agnostic features enable one to recognize interesting events from erroneous or imperfect data.
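The eligibility metric itself is not spelled out in the abstract; purely as an illustrative sketch (the weighting, the field names and the inverse-square cost term are assumptions, not the protocol from the thesis), a next-hop choice that balances residual energy against radio cost could look like this in Python.

def eligibility(node, sink, alpha=0.5):
    """Illustrative eligibility score: favour neighbours with high residual
    energy and low radio cost toward the sink (cost grows with squared distance)."""
    dist2 = (node["x"] - sink["x"]) ** 2 + (node["y"] - sink["y"]) ** 2
    energy_term = node["residual_energy"] / node["initial_energy"]
    cost_term = 1.0 / (1.0 + dist2)
    return alpha * energy_term + (1.0 - alpha) * cost_term

sink = {"x": 0.0, "y": 0.0}
neighbours = [
    {"id": 1, "x": 3.0, "y": 4.0, "residual_energy": 0.9, "initial_energy": 1.0},
    {"id": 2, "x": 1.0, "y": 1.0, "residual_energy": 0.3, "initial_energy": 1.0},
]
next_hop = max(neighbours, key=lambda n: eligibility(n, sink))
print("forward via node", next_hop["id"])

Weighting the energy term more heavily is one generic way to push nodes toward exhausting their batteries uniformly, which is the stated goal of the protocol.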
APA, Harvard, Vancouver, ISO, and other styles
28

Castillo, Luis Felipe, Carlos Raymundo, and Francisco Dominguez Mateos. "Information architecture model for the successful data governance initiative in the peruvian higher education sector." Institute of Electrical and Electronics Engineers Inc, 2017. http://hdl.handle.net/10757/656364.

Full text
Abstract:
The full text of this work is not available in the UPC Academic Repository due to restrictions imposed by the publisher.
The research revealed the need to design an information architecture model for a Data Governance initiative that can serve as a bridge between current IT/IS management trends: Information technology (IT) management and information management. A model is needed that strikes a balance between the need to invest in technology and the ability to manage the information that originates from the use of those technologies, as well as to measure with greater precision the generation of IT value through the use of quality information and user satisfaction, using the technologies that make it possible for that information to reach the users who need it in their daily work.
APA, Harvard, Vancouver, ISO, and other styles
29

Fair, Martin Lynn. "Geo-Demographic analysis in support of the United States Army Reserve (USAR) Unit Positioning and Quality Assessment Model (UPQUAM)." Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2004. http://library.nps.navy.mil/uhtbin/hyperion/04Jun%5FFair.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Xiong, Hui. "Combining Subject Expert Experimental Data with Standard Data in Bayesian Mixture Modeling." The Ohio State University, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=osu1312214048.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Ding, Deng. "An integrated modeling framework of socio-economic, biophysical, and hydrological processes in Midwest landscapes: remote sensing data, agro-hydrological model, and agent-based model." Diss., University of Iowa, 2014. https://ir.uiowa.edu/etd/1840.

Full text
Abstract:
Intensive human-environment interactions are taking place in Midwestern agricultural systems. An integrated modeling framework is suitable for predicting dynamics of key variables of the socio-economic, biophysical, and hydrological processes as well as exploring the potential transitions of system states in response to changes of the driving factors. The purpose of this dissertation is to address issues concerning the interacting processes and consequent changes in land use, water balance, and water quality using an integrated modeling framework. This dissertation is composed of three studies in the same agricultural watershed, the Clear Creek watershed in East-Central Iowa. In the first study, a parsimonious hydrologic model, the Threshold-Exceedance-Lagrangian Model (TELM), is further developed into RS-TELM (Remote Sensing TELM) to integrate remote sensing vegetation data for estimating evapotranspiration. The goodness of fit of RS-TELM is comparable to a well-calibrated SWAT (Soil and Water Assessment Tool) and even slightly superior in capturing intra-seasonal variability of stream flow. The integration of RS LAI (Leaf Area Index) data improves the model's performance especially over agriculture-dominated landscapes. The input of rainfall datasets with spatially explicit information plays a critical role in increasing the model's goodness of fit. In the second study, an agent-based model is developed to simulate farmers' decisions on crop type and fertilizer application in response to commodity and biofuel crop prices. The comparison of simulated crop land percentage and crop rotations with satellite-based land cover data suggests that farmers may be underestimating the effects that continuous corn production has on yields (yield drag). The simulation results given alternative market scenarios based on a survey of agricultural land owners and operators in the Clear Creek Watershed show that farmers see cellulosic biofuel feedstock production in the form of perennial grasses or corn stover as a more risky enterprise than their current crop production systems, likely because of market and production risks and lock-in effects. As a result, farmers do not follow a simple farm-profit maximization rule. In the third study, the consequent water quantity and quality change of the potential land use transitions given alternative biofuel crop market scenarios is explored in a case study in the Clear Creek watershed. A computer program is developed to implement the loose-coupling strategy to couple an agent-based land use model with SWAT. The simulation results show that watershed-scale water quantity (water yield and runoff) and quality variables (sediment and nutrient loads) decrease in value as switchgrass price increases. However, neglecting farmers' risk aversion towards biofuel crop adoption would cause overestimation of the impacts of switchgrass price on water quantity and quality.
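The thesis defines TELM/RS-TELM in full; purely as an illustration of the threshold-exceedance idea with LAI-scaled evapotranspiration (the storage capacity, reference LAI and all numbers below are made-up assumptions, not parameters from the study), a single daily water-balance step might look like this.

def daily_bucket_step(storage, rain, pet, lai, capacity=150.0, lai_ref=3.0):
    """Toy threshold-exceedance water balance (mm/day): ET scales with remotely
    sensed LAI, and runoff is the storage excess above a capacity threshold."""
    et = pet * min(1.0, lai / lai_ref)       # vegetation-limited evapotranspiration
    storage = max(0.0, storage + rain - et)  # update the soil store
    runoff = max(0.0, storage - capacity)    # threshold exceedance generates runoff
    return storage - runoff, runoff, et

storage = 100.0
for rain, pet, lai in [(0.0, 4.0, 2.5), (35.0, 3.0, 2.5), (60.0, 2.0, 3.2)]:
    storage, runoff, et = daily_bucket_step(storage, rain, pet, lai)
    print(round(storage, 1), round(runoff, 1), round(et, 1))

In RS-TELM the remotely sensed LAI plays a role analogous to the vegetation term here, constraining how much water leaves the store as evapotranspiration before runoff is generated.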
APA, Harvard, Vancouver, ISO, and other styles
32

Tseng, Hsin-Wu, Jiahua Fan, and Matthew A. Kupinski. "Design of a practical model-observer-based image quality assessment method for x-ray computed tomography imaging systems." SPIE-SOC PHOTO-OPTICAL INSTRUMENTATION ENGINEERS, 2016. http://hdl.handle.net/10150/622347.

Full text
Abstract:
The use of a channelization mechanism on model observers not only makes mimicking human visual behavior possible, but also reduces the amount of image data needed to estimate the model observer parameters. The channelized Hotelling observer (CHO) and channelized scanning linear observer (CSLO) have recently been used to assess CT image quality for detection tasks and combined detection/estimation tasks, respectively. Although the use of channels substantially reduces the amount of data required to compute image quality, the number of scans required for CT imaging is still not practical for routine use. It is our desire to further reduce the number of scans required to make CHO or CSLO an image quality tool for routine and frequent system validations and evaluations. This work explores different data-reduction schemes and designs an approach that requires only a few CT scans. Three different kinds of approaches are included in this study: a conventional CHO/CSLO technique with a large sample size, a conventional CHO/CSLO technique with fewer samples, and an approach that we will show requires fewer samples to mimic conventional performance with a large sample size. The mean value and standard deviation of areas under ROC/EROC curve were estimated using the well-validated shuffle approach. The results indicate that an 80% data reduction can be achieved without loss of accuracy. This substantial data reduction is a step toward a practical tool for routine-task-based QA/QC CT system assessment. (C) 2016 Society of Photo-Optical Instrumentation Engineers (SPIE)
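For readers unfamiliar with the channelized Hotelling observer mentioned above, a minimal numerical sketch follows (random channels and synthetic images are placeholders only; a real assessment would use Gabor or Laguerre-Gauss channels and reconstructed CT slices, and this is not the authors' reduced-sample scheme).

import numpy as np

def cho_template(signal_imgs, background_imgs, channels):
    """Channelized Hotelling observer: project images onto channels, then form
    w = S^-1 (mean_signal - mean_background) from the channel outputs."""
    v_s = signal_imgs @ channels            # shape (n_signal, n_channels)
    v_b = background_imgs @ channels
    dmean = v_s.mean(axis=0) - v_b.mean(axis=0)
    S = 0.5 * (np.cov(v_s, rowvar=False) + np.cov(v_b, rowvar=False))
    return np.linalg.solve(S, dmean)

rng = np.random.default_rng(0)
n_pix, n_ch = 64 * 64, 10
channels = rng.normal(size=(n_pix, n_ch))          # placeholder channel set
background = rng.normal(size=(60, n_pix))
signal = rng.normal(size=(60, n_pix)) + 0.1        # weak additive "lesion"
w = cho_template(signal, background, channels)
scores_s = signal @ channels @ w
scores_b = background @ channels @ w
print("score separation:", round(float(scores_s.mean() - scores_b.mean()), 3))

The separation of the two score distributions is what feeds the ROC/EROC areas reported in the paper; the data-reduction question studied there is how few such images are needed to estimate w and those areas reliably.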
APA, Harvard, Vancouver, ISO, and other styles
33

Franz, Markus [Verfasser], Oliver [Akademischer Betreuer] Hinz, and Alexander [Akademischer Betreuer] Benlian. "Improving Data Quality, Model Functionalities and Optimizing User Interfaces in Decision Support Systems / Markus Franz ; Oliver Hinz, Alexander Benlian." Darmstadt : Universitäts- und Landesbibliothek Darmstadt, 2017. http://d-nb.info/1128309963/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Luca, Matthieu. "Quality Timber Strength Grading : A prediction of strength using scanned surface grain data and FE-analyses." Thesis, Linnéuniversitetet, Institutionen för teknik, TEK, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-14037.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Polańska, Julia, and Michał Zyznarski. "Elaboration of a method for comparison of Business Intelligence Systems which support data mining process." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-2078.

Full text
Abstract:
Business Intelligence Systems have become more and more popular in recent years. This is driven by the need to reuse data in order to gain potentially useful business information from it. Those systems are advanced sets of tools, which results in high purchase and licensing prices. Therefore, it is important to choose the system which best fits particular business needs. The aim of this thesis is to elaborate a method for comparison of existing Business Intelligence Systems that support data mining. The method consists of a quality model, built according to existing standards, and a set of steps which should be taken to choose a Business Intelligence System according to the particular requirements of its future user. The first part of the thesis focuses on the analysis of existing works providing a way to compare those software products. It is shown here that there is no existing systematic approach resolving this problem. However, criteria presented in those works, along with the description of quality model standards, were used for creating the quality model and proposing a set of basic measures. The phases of the evaluation process were also identified. The next part of the research is a case study whose purpose is to show the usefulness of the proposed evaluation method. The example is simple, but it has proven that the method can be easily modified for specific needs and used for comparison of real Business Intelligence Systems. The quality level measured in the case study turned out to be very similar for each system. The evaluation method may be extended in future work with more advanced measures or additional characteristics which were not taken into account in this research.
APA, Harvard, Vancouver, ISO, and other styles
36

Thomson, C. P. H. "From point cloud to building information model : capturing and processing survey data towards automation for high quality 3D models to aid a BIM process." Thesis, University College London (University of London), 2016. http://discovery.ucl.ac.uk/1485847/.

Full text
Abstract:
Building Information Modelling has, more than any previous initiative, established itself as the process by which operational change can occur, driven by a desire to eradicate the inefficiencies in time and value and requiring a change of approach to the whole lifecycle of construction from design through construction to operation and eventual demolition. BIM should provide a common digital platform which allows different stakeholders to supply and retrieve information thereby reducing waste through enhanced decision making. Through the provision of measurement and representative digital geometry for construction and management purposes, surveying is very much a part of BIM. Given that all professions that are involved with construction have to consider the way in which they handle data to fit with the BIM process, it stands to reason that Geomatic or Land Surveyors play a key part. This is further encouraged by the fact that 3D laser scanning has been adopted as the primary measurement technique for geometry capture for BIM. Also it is supported by a laser scanning work stream from the UK Government backed BIM Task Group. Against this backdrop, the research in this thesis investigates the 3D modelling aspects of BIM, from initial geometry capture in the real world, to the generation and storage of the virtual world model, while keeping the workflow and outputs compatible with the BIM process. The focus will be made on a key part of the workflow for capturing as-built conditions: the geometry creation from point clouds. This area is considered a bottleneck in the BIM process for existing assets not helped by their often poor or non-existent documentation. Automated modelling is seen as desirable commercially with the goal of reducing time, and therefore cost, and making laser scanning a more viable proposition for a range of tasks in the lifecycle.
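Automating geometry creation from point clouds usually starts with primitive fitting; as a minimal, generic illustration (not the pipeline developed in the thesis), the RANSAC plane fit below extracts the dominant planar surface, e.g. a wall or floor slab, from a noisy scan. The tolerances and the synthetic data are assumptions.

import numpy as np

def ransac_plane(points, n_iter=200, tol=0.02, seed=1):
    """Minimal RANSAC plane fit: repeatedly fit a plane through 3 random points
    and keep the candidate with the most inliers within tol of the plane."""
    rng = np.random.default_rng(seed)
    best_count, best_plane = 0, None
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), size=3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        length = np.linalg.norm(normal)
        if length < 1e-9:                     # degenerate (collinear) sample
            continue
        normal = normal / length
        distances = np.abs((points - p0) @ normal)
        count = int((distances < tol).sum())
        if count > best_count:
            best_count, best_plane = count, (normal, p0)
    return best_plane, best_count

# Synthetic scan: 400 points near the z = 0 plane plus 100 points of clutter
pts = np.random.default_rng(2).uniform(-1.0, 1.0, size=(500, 3))
pts[:400, 2] *= 0.01
plane, inliers = ransac_plane(pts)
print("plane inliers:", inliers)

Real scan-to-BIM workflows chain many such fits (planes, cylinders, extrusions) and then reconcile them against object libraries, which is where much of the manual effort the thesis targets currently sits.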
APA, Harvard, Vancouver, ISO, and other styles
37

Banik, Paromita. "A Model for Evaluating the Effectiveness of an Undergraduate Curriculum in Teaching Software Code Quality." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-279687.

Full text
Abstract:
Developers build entire software systems by writing quality code, the failure of which can risk organizational reputation, financial wellbeing, and even the lives and security of people. The ability to develop high quality code is therefore a key learning goal of foundational undergraduate computing programmes. However, how effective these undergraduate computing curriculums are in teaching software code quality is a matter of concern, as it shapes the knowledge of future software developers. Right now, there is a lack of models to evaluate this effectiveness of undergraduate curriculum in providing the knowledge of software code quality to students. In this thesis, we suggest a model for evaluating the effectiveness of undergraduate curriculum in teaching software code quality, which we call ScQUc (Software Code Quality in Undergraduate Curriculum). Our goal is to raise the knowledge and awareness of teaching software code quality in education and make the undergraduate computing curriculum more software code quality centric. Due to the fact that there is a lack of models for evaluating the effectiveness of undergraduate curriculum in teaching software code quality, we had no closely related work to base our research on. We had to rely on other works related to code quality in programming courses that were indirectly connected to our research field. Hence, we dare infer that the ScQUc model was created from scratch. The research method used was qualitative and followed the framework of design science. Data collection was conducted via literature study and via interviews with two experts in the field. The interviewees were selected with the convenience sampling method enhanced with a predefined selection criterion. Data evaluation, on the other hand, was conducted using an evaluation model. The evaluation model included the following evaluation criteria: (1) interviewee credibility, (2) correctness, (3) flexibility, (4) usefulness and (5) experience. One round of interviews was conducted with each of the interviewees, which resulted in the improvement of the ScQUc model. The ScQUc model consists of a checklist of code quality characteristics and a sequence of activities to be conducted in order to evaluate the effectiveness of an undergraduate curriculum in teaching software code quality. The results of the evaluation indicated that the ScQUc model was correct and useful for academia. Hence, we believe that the ScQUc model is a valid solution for evaluating the effectiveness of undergraduate curriculum in teaching software code quality. Still, however, the ScQUc model needs further evaluation and extension in the form of specifications aiding a deeper understanding of code quality, how to teach code quality, and to what extent to teach code quality.
APA, Harvard, Vancouver, ISO, and other styles
38

Sasaki, Noriko. "Development and Validation of an Acute Heart Failure-Specific Mortality Predictive Model Based on Administrative Data." 京都大学 (Kyoto University), 2014. http://hdl.handle.net/2433/188706.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Robinson-Bryant, Federica. "Defining a Stakeholder-Relative Model to Measure Academic Department Efficiency at Achieving Quality in Higher Education." Doctoral diss., University of Central Florida, 2013. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/5842.

Full text
Abstract:
In a time of strained resources and dynamic environments, the importance of effective and efficient systems is critical. This dissertation was developed to address the need to use feedback from multiple stakeholder groups to define quality and assess an entity's efficiency at achieving such quality. A decision support model with applicability to diverse domains was introduced to outline the approach. Three phases, (1) quality model development, (2) input-output selection and (3) relative efficiency assessment, captured the essence of the process which also delineates the approach per tool applied. This decision support model was adapted in higher education to assess academic departmental efficiency at achieving stakeholder-relative quality. Phase 1 was accomplished through a three round, Delphi-like study which involved user group refinement. Those results were compared to the criteria of an engineering accreditation body (ABET) to support the model's validity to capture quality in the College of Engineering & Computer Science, its departments and programs. In Phase 2 the Analytic Hierarchy Process (AHP) was applied to the validated model to quantify the perspective of students, administrators, faculty and employers (SAFE). Using the composite preferences for the collective group (n=74), the model was limited to the top 7 attributes which accounted for about 55% of total preferences. Data corresponding to the resulting variables, referred to as key performance indicators, was collected using various information sources and infused in the data envelopment analysis (DEA) methodology (Phase 3). This process revealed both efficient and inefficient departments while offering transparency of opportunities to maximize quality outputs. Findings validate the potential of the Delphi-like, analytic hierarchical, data envelopment analysis approach for administrative decision-making in higher education. However, the availability of more meaningful metrics and data is required to adapt the model for decision making purposes. Several recommendations were included to improve the usability of the decision support model and future research opportunities were identified to extend the analyses inherent and apply the model to alternative areas.
Ph.D.
Doctorate
Industrial Engineering and Management Systems
Engineering and Computer Science
Industrial Engineering
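As a generic illustration of the Phase 3 relative-efficiency assessment described in this entry's abstract, the sketch below runs an input-oriented CCR data envelopment analysis for a handful of hypothetical departments. It assumes SciPy is available, and the inputs, outputs and all numbers are invented; they are not the dissertation's key performance indicators.

import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, o):
    """Input-oriented CCR DEA for unit o: minimise theta such that some convex
    combination of all units uses at most theta * o's inputs while producing
    at least o's outputs. X: (units, inputs), Y: (units, outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)                                  # variables: [theta, lambda_1..n]
    c[0] = 1.0
    A_inputs = np.hstack([-X[o].reshape(m, 1), X.T])     # sum lam*x <= theta * x_o
    A_outputs = np.hstack([np.zeros((s, 1)), -Y.T])      # sum lam*y >= y_o
    A_ub = np.vstack([A_inputs, A_outputs])
    b_ub = np.concatenate([np.zeros(m), -Y[o]])
    bounds = [(None, None)] + [(0.0, None)] * n
    return linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds).fun

X = np.array([[5.0, 14.0], [8.0, 15.0], [7.0, 12.0]])    # e.g. faculty FTE, budget
Y = np.array([[9.0], [5.0], [4.0]])                      # e.g. weighted quality score
for unit in range(len(X)):
    print("department", unit, "efficiency:", round(dea_efficiency(X, Y, unit), 3))

Scores of 1 mark departments on the empirical frontier; scores below 1 quantify how much input could, in principle, be saved while sustaining the observed quality outputs.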
APA, Harvard, Vancouver, ISO, and other styles
40

Liu, Chenang. "Smart Additive Manufacturing Using Advanced Data Analytics and Closed Loop Control." Diss., Virginia Tech, 2019. http://hdl.handle.net/10919/91900.

Full text
Abstract:
Additive manufacturing (AM) is a powerful emerging technology for fabrication of components with complex geometries using a variety of materials. However, despite promising potential, due to the complexity of the process dynamics, how to ensure product quality and consistency of AM parts efficiently during the process still remains challenging. Therefore, the objective of this dissertation is to develop effective methodologies for online automatic quality monitoring and improvement, i.e., to build a basis for smart additive manufacturing. The fast-growing sensor technology can easily generate a massive amount of real-time process data, which provides excellent opportunities to address the barriers of online quality assurance in AM through data-driven perspectives. Although this direction is very promising, the online sensing data typically have high dimensionality and complex inherent structure, which causes the tasks of real-time data-driven analytics and decision-making to be very challenging. To address these challenges, multiple data-driven approaches have been developed in this dissertation to achieve effective feature extraction, process modeling, and closed-loop quality control. These methods are successfully validated by a typical AM process, namely, fused filament fabrication (FFF). Specifically, four new methodologies are proposed and developed as listed below, (1) To capture the variation of hidden patterns in sensor signals, a feature extraction approach based on spectral graph theory is developed for defect detection in online quality monitoring of AM. The most informative feature is extracted and integrated with a statistical control chart, which can effectively detect the anomalies caused by cyber-physical attack. (2) To understand the underlying structure of high dimensional sensor data, an effective dimension reduction method based on an integrated manifold learning approach termed multi-kernel metric learning embedded isometric feature mapping (MKML-ISOMAP) is proposed for online process monitoring and defect diagnosis of AM. Based on the proposed method, process defects can be accurately identified by supervised classification algorithms. (3) To quantify the layer-wise quality correlation in AM by taking into consideration of reheating effects, a novel bilateral time series modeling approach termed extended autoregressive (EAR) model is proposed, which successfully correlates the quality characteristics of the current layer with not only past but also future layers. The resulting model is able to online predict the defects in a layer-wise manner. (4) To achieve online defect mitigation for AM process, a closed-loop quality control system is implemented using an image analysis-based proportional-integral-derivative (PID) controller, which can mitigate the defects by adaptively adjusting machine parameters during the printing process in a timely manner. By fully utilizing the online sensor data with innovative data analytics and closed-loop control approaches, the above-proposed methodologies are expected to have excellent performance in online quality assurance for AM. In addition, these methodologies are inherently integrated into a generic framework. Thus, they can be easily transformed for applications in other advanced manufacturing processes.
Doctor of Philosophy
Additive manufacturing (AM) technology is rapidly changing the industry; and online sensor-based data analytics is one of the most effective enabling techniques to further improve AM product quality. The objective of this dissertation is to develop methodologies for online quality assurance of AM processes using sensor technology, advanced data analytics, and closed-loop control. It aims to build a basis for the implementation of smart additive manufacturing. The proposed new methodologies in this dissertation are focused to address the quality issues in AM through effective feature extraction, advanced statistical modeling, and closed-loop control. To validate their effectiveness and efficiency, a widely used AM process, namely, fused filament fabrication (FFF), is selected as the experimental platform for testing and validation. The results demonstrate that the proposed methods are very promising to detect and mitigate quality defects during AM operations. Consequently, with the research outcome in this dissertation, our capability of online defect detection, diagnosis, and mitigation for the AM process is significantly improved. However, the future applications of the accomplished work in this dissertation are not just limited to AM. The developed generic methodological framework can be further extended to many other types of advanced manufacturing processes.
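Point (4) of the abstract for this dissertation refers to an image-analysis-based PID loop; the following is only a textbook discrete PID sketch to make that idea concrete (the gains, setpoint and crude first-order plant response are assumptions, not values from the work).

class PIDController:
    """Textbook discrete PID loop of the kind used for closed-loop defect
    mitigation; gains and setpoint here are purely illustrative."""
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement, dt=1.0):
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# e.g. drive an image-derived track-width measurement toward its nominal value
pid = PIDController(kp=0.6, ki=0.1, kd=0.05, setpoint=0.40)
measurement = 0.48
for _ in range(5):
    correction = pid.update(measurement)
    measurement += 0.5 * correction        # crude stand-in for the printer's response
    print(round(measurement, 3))

In the actual system the measurement would come from layer-wise image analysis and the correction would be applied to machine parameters during printing, as described in the abstract.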
APA, Harvard, Vancouver, ISO, and other styles
41

Masuraha, Anand. "Evaluation of the AERMOD Model and Examination of Required Length of Meteorological Data for Computing Concentrations in Urban Areas." Connect to Online Resource-OhioLINK, 2006. http://www.ohiolink.edu/etd/view.cgi?toledo1145653382.

Full text
Abstract:
Thesis (M.S.C.E.)--University of Toledo, 2006.
Typescript. "A thesis [submitted] as partial fulfillment of the requirements of the Master of Science degree in Civil Engineering." Bibliography: leaves 104-108.
APA, Harvard, Vancouver, ISO, and other styles
42

Cheah, Chin Hong Civil &amp Environmental Engineering Faculty of Engineering UNSW. "Kinematic wave modelling of surface runoff quantity and quality for small urban catchments in Sydney." Awarded By:University of New South Wales. Civil & Environmental Engineering, 2009. http://handle.unsw.edu.au/1959.4/44618.

Full text
Abstract:
Extensive research has been undertaken to improve the robustness of runoff quantity predictions for urban catchments. However, equally robust predictions for runoff quality have yet to be attained. Past studies addressing this issue have typically been confined to the use of simple conceptual or empirical models which forgo the tedious steps of providing a physical representation of the actual system to be modelled. Consequently, even if the modelling results for the test catchments are satisfactory, the reliability and applicability of these models for other catchments remain uncertain. It is deemed that by employing process-based, deterministic models, many of these uncertainties can be eliminated. A lack of understanding of the hydrological processes occurring during storm events and the absence of good calibration data, however, hamper the advancement of such models and limit their use in the field. This research proposes that the development of a hydrologic model based on the kinematic wave equations linked to an advection-dispersion model that simulates pollutant detachment and transport will improve both runoff quantity and quality simulations and enhance the robustness of the predictions. At the very worst, a model of this type could still highlight the underlying issues that inhibit models from reproducing the recorded historical hydrographs and pollutographs. In actual fact, this approach has already been applied by various modellers to simulate the entrainment of pollutants from urban catchments. Also, the paradigm shift to using the Water Sensitive Urban Design (WSUD) approach in designing urban stormwater systems has prompted the need to differentiate the various sources of pollutants in urban catchments such as roads, roofs and other impervious surfaces. The primary objective of the study reported herein is to model runoff quantity and quality from small urban catchments, facilitated by the procurement of the necessary field data to calibrate and validate the model via implementation of a comprehensive field exercise based in Sydney. From a water quality perspective, trace metals were selected as the foci. The study outcomes include the formulation of a linkage of models capable of providing accurate and reliable runoff quantity and quality predictions for the study catchments by taking into consideration:
- The different availability of pollutants from urban catchments, i.e. roads vs. roofs;
- The build-up characteristics of pollutants on the distinct urban surfaces and their spatial distribution;
- The contribution of rainwater to urban runoff pollution;
- The partitioning of pollutants according to particulate bound and dissolved phases;
- The respective role of rainfall and runoff in the detachment and entrainment of pollutants;
- The influence of particle properties such as particle size distribution and density on pollutant transport; and
- The relationship associating particulate bound metals to suspended solids.
The simulation results obtained using the proposed model were found to be suitable for modelling the detachment and transport of pollutants for small urban catchments. Interpretation of these results reveals several key findings which could help to rectify shortcomings of existing modelling approaches.
Even though the robustness of the model presented here may not translate into a significant improvement in the overall robustness of model predictions, the physical basis on which this process-based model was developed nevertheless provides the flexibility necessary for implementation at alternative sites. It is also shown that the availability of reliable runoff data is essential for implementation of the model for other similar urban catchments. In conclusion, the proposed model in this study will serve as a worthy tool in future urban catchment management studies.
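As context for the kinematic wave formulation named above, a minimal explicit sketch of 1-D overland flow is given below. The slope, Manning roughness, grid spacing and rainfall numbers are illustrative assumptions, and the coupled advection-dispersion water-quality component of the thesis is not reproduced here.

import numpy as np

def kinematic_wave_step(h, rain_excess, dx, dt, slope=0.01, n_manning=0.015, m=5.0 / 3.0):
    """One explicit upwind step of the 1-D kinematic wave for overland flow:
    dh/dt + dq/dx = rain excess, with q = (sqrt(S)/n) * h**m (Manning)."""
    alpha = np.sqrt(slope) / n_manning
    q = alpha * h ** m                            # unit-width discharge (m^2/s)
    dqdx = np.diff(q, prepend=0.0) / dx           # upwind spatial difference
    return np.maximum(0.0, h + dt * (rain_excess - dqdx))

h = np.zeros(20)                                  # flow depth along a 20-cell plane
for _ in range(600):                              # 10 minutes at dt = 1 s
    h = kinematic_wave_step(h, rain_excess=1e-5, dx=5.0, dt=1.0)
print("outlet depth (m):", round(float(h[-1]), 5))

In the full linked model, the depth and discharge computed this way would drive the detachment and advection-dispersion of pollutants, which is the quality half of the problem the thesis addresses.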
APA, Harvard, Vancouver, ISO, and other styles
43

Budree, Adheesh. "A Conceptual Model for determining the Value of Business Intelligence Systems." University of the Western Cape, 2014. http://hdl.handle.net/11394/8376.

Full text
Abstract:
Philosophiae Doctor - PhD
Business Intelligence refers to the use of Information Systems to enable raw data to be collated into information that can be reported, with the end goal of using this information to enhance the business decision-making process. Business Intelligence is enabled by making use of information that is complete, relevant, accurate, timely and accessible. There are currently a number of documented perspectives that can be used to gauge the value of Business Intelligence systems; however, from an overall business value perspective the most robust method would be to identify and analyse the most commonly identified factors that impact the value assigned to Business Intelligence Systems by a company, and the correlation of each of these factors to calculate the overall value. The importance of deriving a conceptual model, representing the major factors identified from literature and moderated by the quantitative research conducted, lies in its enabling companies and government bodies to assess the true value addition of Business Intelligence systems, and to understand the return on investment of these systems for organisations. In doing so, companies can justify or reject any further expenditure on Business Intelligence. The quantitative research for this thesis was conducted together with a project that was run between the University of the Western Cape and the Hochschule Neu-Ulm University of Applied Sciences in Germany. The research was conducted simultaneously across organisations in South Africa and Germany on the use of BI Systems and Corporate Performance Management. The respondents for the research were Chief Executive Officers, Chief Information Officers and Business Intelligence Managers in selected organisations. A Direct Oblimin-factor analysis was conducted on the online survey responses. The survey was conducted on a sample of approximately 1500 Business Intelligence specialists across South Africa and Germany, and 113 responses were gathered. The factor analysis reduced the key factors identified in the literature to a few major factors, namely: Information Quality, Management and Accessibility, Information Usage, and Knowledge-sharing Culture. Thereafter, a Structural-Equation-Modelling analysis was completed using the Partial-least-Squares method. The results indicate that there is a strong relationship between the factor Information Quality, Management and Accessibility, and the Value of Business Intelligence. It was found that while there was no strong impact from Information Usage and Culture, there was a strong correlation between Information Usage and Culture and Information Quality, Management and Accessibility. The research findings are significant for academic researchers, information technology experts, Business Intelligence specialists and Business Intelligence users. This study contributes to the body of knowledge by bringing together disparate factors that have been identified in academic journals, and assessing the relationship each has with the value of Business Intelligence, as well as the correlations that exist between these factors. From this, the final conceptual model was derived using factors that were identified and tested through the Factor Analysis and the PLS-SEM.
The following conclusions can be drawn from the research: (1) The assurance of quality information in the form of complete, accurate, relevant and timeous information that is efficiently managed is the paramount factor in an organisation deriving value from Business Intelligence systems; (2) information accessibility is key to realising the value of Business Intelligence systems in organisations; and (3) Business Intelligence systems cannot add value to an organisation if a culture of information use and sharing is absent within that organisation. The derived model can be practically implemented as a checklist for organisations to assess Business Intelligence system investments as well as current implementations.
APA, Harvard, Vancouver, ISO, and other styles
44

Lysáček, Jakub. "Návrh a aplikace modelu pro testování implementace nové části DWH na platformě Teradata." Master's thesis, Vysoká škola ekonomická v Praze, 2017. http://www.nusl.cz/ntk/nusl-359079.

Full text
Abstract:
The thesis focuses on the application of software testing theory in the data warehousing area. The main goal of the thesis is to introduce selected software testing theory and its analogical use in the data warehousing environment. Part of the main goal is the introduction of an architectural model of the testing process, with a later focus on the part of testing which is problematic in the data warehousing area. A partial goal of the thesis is the validation of the problematic part of the model using a practical scenario. This partial goal is divided into two parts. The first part focuses on requirements gathering and categorizing their priority. The second part focuses on demonstrating how project requirements, available resources and the Teradata-specific environment shape the testing process. The theoretical part summarizes selected software testing theory which is later applied in the area of data warehouse testing. The chapter introduces the phases of data warehouse testing and the specific goals of each testing phase. The chapter also describes the FURPS model, which is used to classify software quality dimensions, and selected methods of requirements priority classification. An architectural model of the testing process and its entities is described in the second part of the theory. The theory then focuses on the problematic part of the model, which is requirements gathering and the classification of their priority, and demonstrates the influence of time, quality requirements and available resources on the overall process of testing. The practical part introduces a real-life scenario which demonstrates an application of the described theory, namely requirements gathering, classification of requirements priorities and assigning dimensions of quality. The next part demonstrates the influence of available resources and requirements on the scope of testing. The outcome of the practical part of the thesis is that requirements gathering and the classification of their priorities allow us to organize the scope of testing into logical and clear units, assign roles and their responsibilities, and react flexibly to changes in project requirements. The thesis also points out that there are risks associated with changes of scope and emphasizes the need for their evaluation.
APA, Harvard, Vancouver, ISO, and other styles
45

Nordström, Lars. "Use of the CIM framework for data management in maintenance of electricity distribution networks." Doctoral thesis, KTH, Industriella informations- och styrsystem, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-3985.

Full text
Abstract:
Aging infrastructure and personnel, combined with stricter financial constraints has put maintenance, or more popular Asset Management, at the top of the agenda for most power utilities. At the same time the industry reports that this area is not properly supported by information systems. Today’s power utilities have very comprehensive and complex portfolios of information systems that serve many different purposes. A common problem in such heterogeneous system architectures is data management, e.g. data in the systems do not represent the true status of the equipment in the power grid or several sources of data are contradictory. The research presented in this thesis concerns how this industrial problem can be better understood and approached by novel use of the ontology standardized in the Common Information Model defined in IEC standards 61970 & 61968. The theoretical framework for the research is that of data management using ontology based frameworks. This notion is not new, but is receiving renewed attention due to emerging technologies, e.g. Service Oriented Architectures, that support implementation of such ontological frameworks. The work presented is empirical in nature and takes its origin in the ontology available in the Common Information Model. The scope of the research is the applicability of the CIM ontology, not as it was intended i.e. in systems integration, but for analysis of business processes, legacy systems and data. The work has involved significant interaction with power distribution utilities in Sweden, in order to validate the framework developed around the CIM ontology. Results from the research have been published continuously, this thesis consists of an introduction and summary and papers describing the main contribution of the work. The main contribution of the work presented in this thesis is the validation of the proposition to use the CIM ontology as a basis for analysis existing legacy systems. By using the data models defined in the standards and combining them with established modeling techniques we propose a framework for information system management. The framework is appropriate for analyzing data quality problems related to power systems maintenance at power distribution utilities. As part of validating the results, the proposed framework has been applied in a case study involving medium voltage overhead line inspection. In addition to the main contribution, a classification of the state of the practice system support for power system maintenance at utilities has been created. Second, the work includes an analysis and classification of how high performance Wide Area communication technologies can be used to improve power system maintenance including improving data quality.
APA, Harvard, Vancouver, ISO, and other styles
46

Cheema, Maliha. "Business Analysis for A Video Conference solution developed by Telenor." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-204885.

Full text
Abstract:
There has been an increased demand on the mobile infrastructure, driven by enhancements in technology, content delivery, and regulatory changes on access and competition rules. According to Ericsson, mobile data traffic is expected to grow at an annual rate of about 45% from 2013 until 2019. These changes are related to the fact that the service providers' traditional pricing schemes are diminishing, and they enable new business models. The two main drivers behind the growth of mobile data traffic are (1) an increased number of mobile data subscribers and (2) an increased average volume per subscriber. At the same time, revenues are growing extremely slowly in comparison with data usage; the revenues from mobile broadband are limited to just a few cents of ARPU. One of the operators looking at different solutions in order to reduce the revenue gap is Telenor. Telenor wants to incorporate videoconferencing, with QoE taken into account, into its business model. It has therefore developed a new service called Appear.in. Appear.in is challenging the existing video conferencing market. The main goal of the thesis is to investigate how the business model for videoconference services changes when quality of experience is taken into account, and what the effect is of having QoE in the business model. The primary steps needed to achieve this goal are to identify the ecosystem of videoconference solutions and to analyze the business models that can be considered for deploying videoconference systems. Several workshops and a survey were conducted. Moreover, two different Business Model Canvases were created, with and without QoE. A comparison was made, and based on this comparison one can conclude that not much differs. When QoE is considered, three out of the nine building blocks in the Business Model Canvas are affected: value proposition, customer relationship and customer segment. These building blocks involve customers in one way or another. Including QoE in the Business Model Canvas showed a way to keep customers satisfied. Furthermore, companies would have the advantage of monitoring and tracking every movement, allowing for quick decisions and guaranteeing video quality by preventing the development of video problems. A number of investigations have been done in the area of videoconferencing from a technology point of view, but one can find very little research related to business models. General studies in the field of business models over QoE are being carried out. This thesis is a first step towards a wider and deeper investigation in the area of videoconferencing with QoE from a business model point of view.
APA, Harvard, Vancouver, ISO, and other styles
47

Kaynak, Burcak. "Assimilation of trace gas retrievals obtained from satellite (SCIAMACHY), aircraft and ground observations into a regional scale air quality model (CMAQ-DDM/3D)." Diss., Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/37134.

Full text
Abstract:
A major opportunity for using satellite observations of tropospheric chemical concentrations is to improve our scientific understanding of atmospheric processes by integrated analysis of satellite, aircraft, and ground-based observations with global and regional scale models. One endpoint of such efforts is to reduce modeling biases and uncertainties. The idea of coupling these observations with a regional scale air quality model was the starting point of this research. The overall objective of this research was to improve the NOₓ emission inventories by integrating observations from different platforms and regional air quality modeling. Specific objectives were: 1) Comparison of satellite NO₂ retrievals with NO₂ simulated by the regional air quality model; 2) Comparison of tropospheric gas concentrations simulated by the regional air quality model with aircraft and ground-based observations; 3) Assessment of the uncertainties in comparing satellite NO₂ retrievals with NOₓ emissions estimates and model simulations; 4) Identification of biases in emission inventories by data assimilation of satellite NO₂ retrievals, and ground-based NO, NO₂ and O₃ observations with an iterative inverse method using the regional air quality model coupled with sensitivity calculations; 5) Improvement of our understanding of NOₓ emissions, and the interaction between regional and global air pollution by an integrated analysis of satellite NO₂ retrievals with the regional air quality model. Along with these objectives, a lightning NOₓ emission inventory was prepared for two months of summer 2004 to account for a significant upper level NOₓ source. Spatially-resolved weekly NO₂ variations from satellite retrievals were compared with estimated NOₓ emissions for different region types. Data assimilation of satellite NO₂ retrievals, and ground-based NO, NO₂ and O₃ observations was performed to evaluate the NOₓ emission inventory. This research contributes to a better understanding of the use of satellite NO₂ retrievals in air quality modeling, and improvements in the NOₓ emission inventories by correcting some of the inconsistencies that were found in the inventories. Therefore, it may provide guidance on areas for improvement to groups that develop emissions estimates. In addition, this research indicates the weaknesses and the strengths of the satellite NO₂ retrievals and offers suggestions to improve the quality of the retrievals for further use in tropospheric air pollution research.
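The iterative inverse adjustment named in objective 4 is, in its simplest generic form, a damped scaling of each emission cell by the ratio of observed to modelled NO₂ columns; the sketch below illustrates only that idea (the cells, numbers and damping factor are invented, and the actual method additionally relies on DDM-3D sensitivity calculations and a full model re-run per iteration).

def iterative_emission_update(emissions, observed, modelled, n_iter=5, damping=0.5):
    """Illustrative iterative inverse adjustment of NOx emissions: scale each
    cell by the damped ratio of observed to modelled NO2 columns and let the
    ratio relax toward 1 as the fields converge."""
    for _ in range(n_iter):
        for cell in emissions:
            ratio = observed[cell] / modelled[cell]
            emissions[cell] *= ratio ** damping
            # in a real system the air quality model would be re-run here
            modelled[cell] *= ratio ** damping
    return emissions

emissions = {"urban": 120.0, "rural": 15.0}        # t/day, made-up numbers
observed = {"urban": 8.0e15, "rural": 1.2e15}      # NO2 columns, molecules/cm^2
modelled = {"urban": 1.1e16, "rural": 0.9e15}
print(iterative_emission_update(emissions, observed, modelled))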
APA, Harvard, Vancouver, ISO, and other styles
48

Fasold, Mario [Verfasser], Hans [Akademischer Betreuer] Binder, Peter [Gutachter] Stadler, and Andrew [Gutachter] Harrison. "Hybridization biases of microarray expression data - A model-based analysis of RNA quality and sequence effects / Mario Fasold ; Gutachter: Peter Stadler, Andrew Harrison ; Betreuer: Hans Binder." Leipzig : Universitätsbibliothek Leipzig, 2013. http://d-nb.info/1238367623/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Gebler, Sebastian Thomas [Verfasser], Hendrikus Johannes [Akademischer Betreuer] Hendricks-Franssen, Valentijn [Akademischer Betreuer] Pauwels, and Bernd [Akademischer Betreuer] Diekkrüger. "Inverse conditioning of a high resolution integrated terrestrial model at the hillslope scale: the role of input data quality and model structural errors / Sebastian Thomas Gebler ; Hendrikus Johannes Hendricks-Franssen, Valentijn Pauwels, Bernd Diekkrüger." Aachen : Universitätsbibliothek der RWTH Aachen, 2018. http://d-nb.info/1193656605/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Lavender, R. Gregory. "The explication of process-product relationships in DoD-STD-2167 and DoD-STD-2168 via an Augmented Data Flow Diagram model." Thesis, Virginia Tech, 1988. http://hdl.handle.net/10919/45908.

Full text
Abstract:
The research reported in this thesis is an extension and application of the results first introduced by the Procedural Approach to the Evaluation of Software Development Methodologies. The evaluation procedure offers a unique perspective based on the philosophy that a software development methodology should espouse a set of objectives that are achieved by employing certain principles throughout the software development process, such that the products generated possess certain attributes deemed desirable. Further, definite linkages exist between objectives and principles, and principles and attributes. The work described herein adopts the perspective offered by the evaluation procedure and applies a critical analysis to the process-product relationships in DoD-STD-2167 and DoD-STD-2168. In support of the analysis, Augmented Data Flow Diagrams are introduced as an effective tool for concisely capturing the information in both standards. The results of the analysis offer a deeper insight into the requirements for defense system software development, such that one is able to better understand the development process, and more intelligently assess the quality of the software and documentation produced.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
