To see the other types of publications on this topic, follow the link: Data gap.

Dissertations / Theses on the topic 'Data gap'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 dissertations / theses for your research on the topic 'Data gap.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Yang, Liqiang. "Statistical Inference for Gap Data." NCSU, 2000. http://www.lib.ncsu.edu/theses/available/etd-20001110-173900.

Full text
Abstract:
This thesis research is motivated by a special type of missing data, gap data, which was first encountered in a cardiology study conducted at Duke Medical School. This type of data includes multiple observations of a certain event time (in this medical study the event is the reopening of a certain artery); some observations may have one or more missing periods, called "gaps", before the first observed event. For those observations, the observed first event may not be the true first event, because the true first event might have happened in one of the missing gaps. Due to this missing information, estimating the survival function of the true first event becomes very difficult, and no research or discussion of this type of data had been published before. In this thesis, the author introduces a new nonparametric estimating method, currently called the Imputed Empirical Estimating (IEE) method, to solve this problem. According to the simulation studies, the IEE method provides a very good estimate of the survival function of the true first event and significantly outperforms all the existing estimating approaches. Besides the new IEE method, this thesis also explores the maximum likelihood estimate in the gap data case. Gap data are introduced as a special type of interval censored data for the first time. The dependence between the censoring interval (in the gap data case, the observed first event time point) and the event (the true first event) makes gap data different from the well-studied regular interval censored data. This thesis points out this one difference between gap data and regular interval censored data and provides an MLE for gap data under certain assumptions. The third estimating method discussed in this thesis is the Weighted Estimating Equation (WEE) method, a popular nonparametric approach currently used in many survival analysis studies. The consistency and asymptotic properties of the WEE estimate applied to gap data are discussed. Finally, in the gap data case, the WEE estimate is shown to be equivalent to the Kaplan-Meier estimate. Numerical examples are provided to illustrate the algorithms of the IEE and MLE approaches, and the author also provides an IEE estimate of the survival function based on the real-life data from Duke Medical School. A series of simulation studies is conducted to assess the goodness of fit of the new IEE estimate; plots and tables of the results are presented in the second chapter of this thesis.
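The thesis shows that, for gap data, the WEE estimate coincides with the Kaplan-Meier estimate. As background, here is a minimal sketch of the Kaplan-Meier product-limit estimator for right-censored event times; the data are made up and the code is illustrative only, not the IEE or WEE procedure itself.

```python
import numpy as np

def kaplan_meier(times, observed):
    """Product-limit estimate of the survival function S(t).

    times    : event or censoring times
    observed : 1 if the event was observed, 0 if right-censored
    """
    order = np.argsort(times)
    times = np.asarray(times)[order]
    observed = np.asarray(observed)[order]
    n, s, surv = len(times), 1.0, {}
    for i, (t, d) in enumerate(zip(times, observed)):
        if d:                      # event at t: multiply in (1 - 1/n_at_risk)
            s *= 1.0 - 1.0 / (n - i)
        surv[t] = s
    return surv

# Hypothetical artery-reopening times in days (0 = censored)
print(kaplan_meier([5, 8, 8, 12, 15], [1, 1, 0, 1, 0]))
```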
APA, Harvard, Vancouver, ISO, and other styles
2

Lam, Yan-ki Jacky. "Developmental normative data for the random gap detection test." Click to view the E-thesis via HKU Scholars Hub, 2005. http://lookup.lib.hku.hk/lookup/bib/B38279289.

Full text
Abstract:
Thesis (B.Sc.)--University of Hong Kong, 2005. "A dissertation submitted in partial fulfilment of the requirements for the Bachelor of Science (Speech and Hearing Sciences), The University of Hong Kong, June 30, 2005." Also available in print.
APA, Harvard, Vancouver, ISO, and other styles
3

Hussain, Etikaf. "Transit spatial gap identification: Exploiting big transit and traffic data." Thesis, Queensland University of Technology, 2022. https://eprints.qut.edu.au/229971/1/Etikaf_Hussain_Thesis.pdf.

Full text
Abstract:
This PhD exploits big datasets from the road transport network to accurately model transit supply and demand, knowledge that is then used to identify evidence-based transit gaps. The effectiveness of the proposed novel modelling is demonstrated with a case study on the Brisbane network. The developed tool assists transit stakeholders in identifying regions where transit services can be significantly improved, leading to more efficient and reliable transit services.
APA, Harvard, Vancouver, ISO, and other styles
4

Alirezaie, Marjan. "Bridging the Semantic Gap between Sensor Data and Ontological Knowledge." Doctoral thesis, Örebro universitet, Institutionen för naturvetenskap och teknik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:oru:diva-45908.

Full text
Abstract:
The rapid growth of sensor data can potentially enable a better awareness of the environment for humans. In this regard, the interpretation of data needs to be human-understandable, so data interpretation may include semantic annotations that hold the meaning of the numeric data. This thesis is about bridging the gap between quantitative data and qualitative knowledge to enrich the interpretation of data. A number of challenges make the automation of the interpretation process non-trivial, including the complexity of sensor data, the amount of available structured knowledge and the inherent uncertainty in data. Under the premise that high-level knowledge is contained in ontologies, this thesis investigates the use of current techniques in ontological knowledge representation and reasoning to confront these challenges. Our research is divided into three phases. The focus of the first phase is on the interpretation of data for domains which are semantically poor in terms of available structured knowledge. During the second phase, we studied publicly available ontological knowledge for the task of annotating multivariate data; our contribution in this phase concerns applying a diagnostic reasoning algorithm to available ontologies. Our studies during the last phase focused on the design and development of a domain-independent ontological representation model equipped with a non-monotonic reasoning approach for the purpose of annotating time-series data. Our last contribution relates to coupling an OWL-DL ontology with a non-monotonic reasoner. The experimental platforms used for validation consist of a network of sensors which includes gas sensors whose generated data is complex. A secondary data set includes time-series medical signals representing physiological data, as well as a number of publicly available ontologies, such as those in the NCBO BioPortal repository.
APA, Harvard, Vancouver, ISO, and other styles
5

Carvalho, Ana Margarida de Almeida Bastos. "Support of operational processes in the Data Warehouse: the gap between theory and practice." Master's thesis, Instituto Superior de Economia e Gestão, 2007. http://hdl.handle.net/10400.5/3701.

Full text
Abstract:
Master's in Information Systems Management. Data Warehouses have been traditionally accepted as subject-oriented, integrated, non-volatile, time-variant collections of data in support of management's decisions. They support the decision-making processes of top and middle management and the organization's strategic planning processes. Their use to support operational data requirements as well has been somewhat controversial, being either supported or criticized by different authors. This thesis focuses on identifying, for a Data Warehouse containing some level of operational activities, the reasons that may be driving this kind of activities into the Data Warehouse. The thesis is made up of two parts. The first part proceeds from a literature review of Data Warehouse concepts, characteristics, usage and role in organizations to the general opinion concerning the support of operational activities in Data Warehouse environments. The second part describes a single case study used to search for evidence of operational support in a Data Warehouse environment and to identify possible reasons that may be forcing the Data Warehouse to support operational activities. The research findings show that, for the specific case of the organization studied, there is evidence of the support of operational activities in the organization's Data Warehouse, and some evidence was collected concerning the reasons that motivate the localization of these activities. Finally, we discuss findings and opportunities for further research.
APA, Harvard, Vancouver, ISO, and other styles
6

Nelson, Wade, and Diana Shurtleff. "Bridging The Gap Between Telemetry and the PC." International Foundation for Telemetering, 1988. http://hdl.handle.net/10150/615216.

Full text
Abstract:
International Telemetering Conference Proceedings / October 17-20, 1988 / Riviera Hotel, Las Vegas, Nevada. The explosive use and extensive development of software and hardware for the IBM PC and PC clones over the past few years has positioned the PC as one of many viable alternatives for system designers configuring systems for both data acquisition and data analysis. Hardware abounds for capturing signals to be digitized and analyzed by software developed for the PC. Communication software has improved to where system developers can easily link instrumentation devices together to form integrated test environments for analyzing and displaying data. Telemetry systems, notably those developed for lab calibration and ground station environments, are one of many applications which can profit from the rapid development of data acquisition techniques for the PC. Recently developed for the ADS100A telemetry processor is a data acquisition module which allows the system to be linked into the PC world. The MUX-I/O module was designed to allow the PC access to telemetry data acquired through the ADS100A, as well as to provide a method by which data can be input into the telemetry environment from a host PC or equivalent RS-232 or GPIB interface. Signals captured and digitized by the ADS100A can be passed on to the PC for further processing and/or report generation. Providing interfaces of this form to the PC greatly enhances the functionality and scope of the abilities already provided by the ADS100A as one of the major front-end processors used in telemetry processing today. The MUX-I/O module helps "bridge the gap" between telemetry and the PC in an ever increasing demand for improving the quantity and quality of processing power required by today's telemetry environment. This paper focuses on two distinct topics: how to transfer data to and from the PC, and what off-the-shelf software is available to provide communication links and analysis of incoming data. Major areas of discussion include software protocols, pre- vs. post-processing, static vs. dynamic processing environments, and the major data analysis and acquisition packages available for the PC today, such as DaDisp and Lotus Measure, which aid the system designer in analyzing and displaying telemetry data. Novel applications of the telemetry-to-PC link are also discussed.
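As a flavour of the kind of PC-side data transfer the paper discusses, here is a minimal modern sketch of reading newline-delimited telemetry samples over an RS-232 serial link using the pyserial package; the port name, baud rate and ASCII framing are assumptions, not details of the MUX-I/O module.

```python
import serial  # pip install pyserial

# Open a hypothetical serial port; all parameters are illustrative.
with serial.Serial("COM1", baudrate=9600, timeout=1.0) as port:
    for _ in range(100):           # read 100 frames, then stop
        line = port.readline()     # one ASCII sample per line (assumed)
        if not line:
            continue               # timed out with no data
        sample = float(line.decode("ascii").strip())
        print(sample)
```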
APA, Harvard, Vancouver, ISO, and other styles
7

Groero, Jaroslav. "East and West Germany after the Unification: The Wage Gap Analysis." Master's thesis, Vysoká škola ekonomická v Praze, 2014. http://www.nusl.cz/ntk/nusl-193373.

Full text
Abstract:
Under socialism, workers had their wages set by the central planners. In my thesis I use panel data from the SHARELIFE questionnaire to analyze how returns to East German human capital variables changed after the reunification in 1990. I also compare these returns to West German returns to human capital variables. Before 1990 the returns to experience and education were lower in East Germany than in West Germany. After the reunification, East German returns to education and to experience obtained before 1990 decreased. I find a significant decrease in returns for highly educated workers who spent 15 or more years in the East German educational system. East German returns to both human capital variables are smaller than West German ones before the reunification, and the difference is more pronounced after the reunification.
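Returns to human capital of this kind are conventionally estimated with a Mincer-type wage regression. Below is a minimal sketch of that standard specification on made-up data; the thesis's actual model for the East/West comparison will differ in variables and data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
educ = rng.integers(8, 19, 300).astype(float)    # years of schooling
exper = rng.integers(0, 31, 300).astype(float)   # years of experience
log_wage = (0.5 + 0.07 * educ + 0.03 * exper
            - 0.0005 * exper**2 + rng.normal(0, 0.3, 300))

# log wage on education, experience and experience squared
X = sm.add_constant(np.column_stack([educ, exper, exper**2]))
fit = sm.OLS(log_wage, X).fit()
print(fit.params)   # constant, return to education, experience profile
```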
APA, Harvard, Vancouver, ISO, and other styles
8

Park, Soyoun. "Penalized method based on representatives and nonparametric analysis of gap data." Diss., Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/37307.

Full text
Abstract:
When there are a large number of predictors and few observations, building a regression model to explain the behavior of a response variable, such as a patient's medical condition, is very challenging. This is a "p ≫ n" variable selection problem, encountered often in modern applied statistics and data mining. Chapter one of this thesis proposes a rigorous procedure which groups predictors into clusters of highly correlated variables, selects a representative from each cluster, and uses a subset of the representatives for regression modeling. The proposed Penalized method based on Representatives (PR) extends the Lasso to p ≫ n data with highly correlated variables, building a sparse, practically interpretable model while maintaining prediction quality. Moreover, we provide the PR-Sequential Grouped Regression (PR-SGR) to make computation of the PR procedure efficient. Simulation studies show the proposed method outperforms existing methods such as the Lasso/Lars. A real-life example from a mental health diagnosis illustrates the applicability of the PR-SGR. In the second part of the thesis, we study the analysis of time-to-event data, called gap data, in which missing time intervals (gaps) may occur prior to the first observed event time. If a gap occurs prior to the first observed event, then the first observed event may or may not be the first true event. This incomplete knowledge makes gap data different from the well-studied regular interval censored data. We propose a Non-Parametric Estimate for the Gap data (NPEG) to estimate the survival function for the first true event time, derive its analytic properties and demonstrate its performance in simulations. We also extend the Imputed Empirical Estimating method (IEE), an existing nonparametric method for gap data with up to one gap, to handle gap data with multiple gaps.
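To make the representatives idea concrete, here is a toy sketch: greedily group predictors whose pairwise correlation exceeds a threshold, keep one representative per group, and fit the Lasso on the representatives only. The threshold, the synthetic data and the greedy grouping are illustrative assumptions; the actual PR/PR-SGR procedures are more refined.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 200))                    # p >> n toy data
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=40)    # a highly correlated pair
y = 2 * X[:, 0] - X[:, 5] + rng.normal(size=40)

corr = np.corrcoef(X, rowvar=False)
threshold, reps, assigned = 0.9, [], set()
for j in range(X.shape[1]):                       # greedy correlation clustering
    if j in assigned:
        continue
    cluster = np.where(np.abs(corr[j]) > threshold)[0]
    assigned.update(cluster.tolist())
    reps.append(j)                                # first member represents cluster

model = Lasso(alpha=0.1).fit(X[:, reps], y)
print(len(reps), "representatives,", np.sum(model.coef_ != 0), "selected")
```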
APA, Harvard, Vancouver, ISO, and other styles
9

Moosavi, Seyyed Ali. "TECTAS : bridging the gap between collaborative tagging systems and structured data." Thesis, University of British Columbia, 2010. http://hdl.handle.net/2429/29554.

Full text
Abstract:
Ontologies are a core building block of the emerging semantic web, and taxonomies, which contain class-subclass relationships between concepts, are a key component of ontologies. A taxonomy that relates the tags in a collaborative tagging system makes the system's underlying structure easier to understand. Automatic construction of taxonomies from various data sources, such as text data and collaborative tagging systems, has been an interesting topic in the field of data mining. This thesis introduces a new algorithm for building a taxonomy of keywords from tags in collaborative tagging systems, which is also capable of detecting has-a relationships between tags. The proposed method, the TECTAS algorithm, uses association rule mining to detect is-a relationships between tags and can be used in an automatic or semi-automatic framework. The TECTAS algorithm is based on the hypothesis that users tend to assign both "child" and "parent" tags to a resource. The proposed method leverages association rule mining algorithms, bi-gram pruning using search engines, discovery of relationships when pairs of tags have a common child, and lexico-syntactic patterns to detect meronyms. In addition to proposing the TECTAS algorithm, several experiments are reported using four real data sets: Del.icio.us, LibraryThing, CiteULike, and IMDb. Based on these experiments, the following topics are addressed in this thesis: (1) verify the necessity of building domain-specific taxonomies; (2) analyze the tagging behavior of users in collaborative tagging systems; (3) verify the effectiveness of our algorithm compared to previous approaches; (4) use additional quality and richness metrics for the evaluation of automatically extracted taxonomies.
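As a toy illustration of the co-occurrence statistics underlying this kind of is-a detection: if tag b appears in most resources carrying tag a, the rule a -> b has high confidence and b becomes a parent candidate. The tag sets and the confidence threshold below are made up; TECTAS itself adds pruning and further evidence sources.

```python
from itertools import combinations
from collections import Counter

resources = [                      # hypothetical tag sets per resource
    {"python", "programming"}, {"python", "programming", "tutorial"},
    {"java", "programming"}, {"python"}, {"java", "programming"},
]
single, pair = Counter(), Counter()
for tags in resources:
    single.update(tags)
    pair.update(frozenset(p) for p in combinations(tags, 2))

for p, n in pair.items():
    a, b = tuple(p)
    for child, parent in ((a, b), (b, a)):
        confidence = n / single[child]   # support(child & parent) / support(child)
        if confidence >= 0.66:
            print(f"is-a candidate: {child} -> {parent} ({confidence:.2f})")
```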
APA, Harvard, Vancouver, ISO, and other styles
10

Giese, Holger, Stephan Hildebrandt, and Leen Lambers. "Toward bridging the gap between formal semantics and implementation of triple graph grammars." Universität Potsdam, 2010. http://opus.kobv.de/ubp/volltexte/2010/4521/.

Full text
Abstract:
The correctness of model transformations is a crucial element for the model-driven engineering of high quality software. A prerequisite for verifying model transformations at the level of the model transformation specification is that an unambiguous formal semantics exists and that the employed implementation of the model transformation language adheres to this semantics. However, for existing relational model transformation approaches it is usually not clear under which constraints particular implementations actually conform to the formal semantics. In this paper, we bridge this gap for the formal semantics of triple graph grammars (TGG) and an existing efficient implementation. Whereas the formal semantics assumes backtracking and ignores non-determinism, practical implementations do not support backtracking, require rule sets that ensure determinism, and include further optimizations. Therefore, we capture how the considered TGG implementation realizes the transformation by means of operational rules, define required criteria and show conformance to the formal semantics if these criteria are fulfilled. We further outline how static analysis can be employed to guarantee these criteria.
APA, Harvard, Vancouver, ISO, and other styles
11

Tardivo, Gianmarco. "Methods for gap filling in long term meteorological series and correlation analysis of meteorological networks." Doctoral thesis, Università degli studi di Padova, 2013. http://hdl.handle.net/11577/3422634.

Full text
Abstract:
Climate data are very useful in many fields of scientific research. Nowadays these data are often available through giant databases, typically produced by automatic meteorological networks. To make research analyses and the running of computational models possible, these databases need to be validated, homogenized, and free of missing values. Validation and homogenization are common operations nowadays: the organizations that manage these databases provide these services. The main problem remains the reconstruction of missing data. This dissertation deals with two main topics: (a) the reconstruction of missing values in daily precipitation and temperature datasets; (b) a basic analysis of the correlation in time and space between stations of a meteorological network. (a) First, a new adaptive method to reconstruct temperature data is described and compared with a non-adaptive one. A detailed analysis of the effects of the number of predictors for a regression-based approach to reconstructing daily temperature data, and of their search strategy, is then presented. Precipitation and temperature are the most important climatological variables, so a method to reconstruct daily precipitation data is also chosen, through a comparison of four techniques. (b) The methods selected in phase (a) make it possible to reconstruct the two databases (precipitation and temperature) used for the final part of the work: the correlation analysis, through time and space, of the network data.
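A minimal sketch of the regression-based gap-filling idea on synthetic data: predict a station's missing daily temperatures from neighbouring stations that recorded on the same days. Using two fixed neighbours is an assumption; the thesis studies precisely how many predictors to use and how to search for them.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
base = 15 + 10 * np.sin(np.linspace(0, 2 * np.pi, 365))  # seasonal cycle
target = base + rng.normal(0, 1, 365)   # station with a gap to fill
nbr1 = base + rng.normal(0, 1, 365)     # neighbouring stations
nbr2 = base + rng.normal(0, 1, 365)

missing = np.zeros(365, bool)
missing[100:120] = True                 # a 20-day gap to reconstruct

X = np.column_stack([nbr1, nbr2])
model = LinearRegression().fit(X[~missing], target[~missing])
target[missing] = model.predict(X[missing])
print("gap filled; first reconstructed value:", round(target[100], 2))
```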
APA, Harvard, Vancouver, ISO, and other styles
12

Brody, Samuel. "Closing the gap in WSD : supervised results with unsupervised methods." Thesis, University of Edinburgh, 2009. http://hdl.handle.net/1842/3981.

Full text
Abstract:
Word-Sense Disambiguation (WSD) holds promise for many NLP applications requiring broad-coverage language understanding, such as summarization (Barzilay and Elhadad, 1997) and question answering (Ramakrishnan et al., 2003). Recent studies have also shown that WSD can benefit machine translation (Vickrey et al., 2005) and information retrieval (Stokoe, 2005). Much work has focused on the computational treatment of sense ambiguity, primarily using data-driven methods. The most accurate WSD systems to date are supervised and rely on the availability of sense-labeled training data. This restriction poses a significant barrier to widespread use of WSD in practice, since such data is extremely expensive to acquire for new languages and domains. Unsupervised WSD holds the key to enabling such applications, as it does not require sense-labeled data. However, unsupervised methods fall far behind supervised ones in terms of accuracy and ease of use. In this thesis we explore the reasons for this, and present solutions to remedy this situation. We hypothesize that one of the main problems with unsupervised WSD is its lack of a standard formulation and of general purpose tools common to supervised methods. As a first step, we examine existing approaches to unsupervised WSD, with the aim of detecting independent principles that can be utilized in a general framework. We investigate ways of leveraging the diversity of existing methods, using ensembles, a common tool in the supervised learning framework. This approach allows us to achieve accuracy beyond that of the individual methods, without need for extensive modification of the underlying systems. Our examination of existing unsupervised approaches highlights the importance of using the predominant sense in case of uncertainty, and the effectiveness of statistical similarity methods as a tool for WSD. However, it also serves to emphasize the need for a way to merge and combine learning elements, and the potential of a supervised-style approach to the problem. Relying on existing methods does not take full advantage of the insights gained from the supervised framework. We therefore present an unsupervised WSD system which circumvents the question of the actual disambiguation method, which is the main source of discrepancy in unsupervised WSD, and deals directly with the data. Our method uses statistical and semantic similarity measures to produce labeled training data in a completely unsupervised fashion. This allows the training and use of any standard supervised classifier for the actual disambiguation. Classifiers trained with our method significantly outperform those using other methods of data generation, and represent a big step in bridging the accuracy gap between supervised and unsupervised methods. Finally, we address a major drawback of classical unsupervised systems – their reliance on a fixed sense inventory and lexical resources. This dependence represents a substantial setback for unsupervised methods in cases where such resources are unavailable. Unfortunately, these are exactly the areas in which unsupervised methods are most needed. Unsupervised sense discrimination, which does not share those restrictions, presents a promising solution to the problem. We therefore develop an unsupervised sense discrimination system. We base our system on a well-studied probabilistic generative model, Latent Dirichlet Allocation (Blei et al., 2003), which has many of the advantages of supervised frameworks.
The model’s probabilistic nature lends itself to easy combination and extension, and its generative aspect is well suited to linguistic tasks. Our model achieves state-of-the-art performance on the unsupervised sense induction task, while remaining independent of any fixed sense inventory, and thus represents a fully unsupervised, general purpose, WSD tool.
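For a flavour of the approach, here is a minimal sense-induction sketch in the spirit of the final system: treat the context words around each occurrence of an ambiguous word as a document and read the induced topics as senses. The tiny corpus is made up, and this plain LDA omits the thesis's task-specific extensions.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

contexts = [                           # contexts of the ambiguous word "bank"
    "river water fishing shore mud",
    "money loan interest account deposit",
    "account cash loan credit money",
    "shore river boat water fishing",
]
X = CountVectorizer().fit_transform(contexts)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
print(lda.transform(X).argmax(axis=1))  # induced sense label per occurrence
```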
APA, Harvard, Vancouver, ISO, and other styles
13

Karlsson, Christoffer. "Control of critical data flows : Automated monitoring of insurance data." Thesis, KTH, Skolan för elektro- och systemteknik (EES), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-187733.

Full text
Abstract:
EU insurance companies are working on implementing the Solvency II directive, which calls for a stronger focus on data quality and information controls. Information controls are procedures that can validate data at rest and data in motion to detect errors and anomalies. In this master thesis a case study was carried out at AMF, a Swedish pension insurance company, to identify and investigate their critical data flows and the controls performed in the respective flows. One purpose of this project was to help AMF demonstrate fulfilment of the data quality requirements imposed by the Financial Supervisory Authority. The thesis was conducted at AMF between September and December 2015, and included tasks such as carrying out interviews, Enterprise Architecture modeling, analysis, prototyping, product evaluation and calculation of a business case. A gap analysis was carried out to analyze the need for changes to the existing information controls at AMF, in which different states of the company are documented and analyzed. The current state corresponds to the present situation at the company, including attributes to be improved, while the future state outlines the target condition that the company wants to achieve. A gap between the current state and future state is identified, and the elements that make up the gap are presented in the gap description. Lastly, possible remedies for bridging the gap between the current and future state are presented. Furthermore, a prototype of an automated control tool from a company called Infogix has been implemented and analyzed regarding usability, governance and cost. A benefits evaluation was carried out on the information control tool to see whether an investment would be beneficial for AMF. The benefits evaluation was carried out using the PENG method, a Swedish model developed by three senior consultants that has been specially adjusted for the evaluation of IT investments. The evaluation showed that such an investment would become beneficial during the second year after the investment.
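For intuition, here is a minimal sketch of the kind of automated information control discussed above: reconcile a data flow by comparing record counts and a content checksum between its source and target extracts. The file names and the CSV format are assumptions, not details of the Infogix tool.

```python
import csv
import hashlib

def profile(path):
    """Return (row count, order-independent content checksum) for a CSV file."""
    with open(path, newline="") as f:
        rows = sorted(",".join(r) for r in csv.reader(f))
    digest = hashlib.sha256("\n".join(rows).encode()).hexdigest()
    return len(rows), digest

src, dst = profile("extract.csv"), profile("loaded.csv")  # hypothetical files
if src != dst:
    print(f"control failed: source {src[0]} rows vs target {dst[0]} rows")
else:
    print("control passed: counts and checksums match")
```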
APA, Harvard, Vancouver, ISO, and other styles
14

El, Mosleh M. (Mohamad). "The use of remote sensing to fill the gap data in lake water balance." Master's thesis, University of Oulu, 2017. http://urn.fi/URN:NBN:fi:oulu-201709062800.

Full text
Abstract:
In arid and semi-arid regions, closed lakes are extensively used to satisfy high water demands, which threatens their existence. Bakhtegan lake is considered one of the biggest lakes in Iran and has been drying out in recent years due to intensive irrigation in the region. The multiple modifications applied to the Kor river feeding Bakhtegan have drastically affected the lake's water intake, so that the lake dries out completely during the irrigation season and hot summer days. River inflow data are missing between the years 1984 and 1997 due to the destruction of the closest gauge station, which was installed above the lake. The aim of this study is to estimate the flow data during this period by combining water balance simulation and remote sensing techniques. Remote sensing is considered a very efficient tool for analyzing water features in many water resources engineering applications. Based on the remotely sensed images of Bakhtegan and the water balance equation for the period 1998–2000, we obtained all the effective components of the water balance equation (inflows, e.g. river inflow and rainfall; outflows, e.g. evaporation) and consequently developed the area-volume-depth curve of the lake. The water balance simulation based on the developed area-volume-depth curve was validated for the period 2001–2003, for which real hydrological and climate data were available. Finally, the gap data were filled by running the water balance simulation based on the results of remote sensing.
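In equation form, the bookkeeping behind this approach is a daily balance such as V_{t+1} = V_t + Q_in + (P - E) * A(V_t), which can be inverted to back out the unrecorded inflow Q_in. A minimal sketch, with a hypothetical area-volume relation standing in for the remotely sensed curve:

```python
def lake_area(volume_m3):
    """Hypothetical area-volume relation fitted from satellite imagery (m^2)."""
    return 2.0e6 * (volume_m3 / 1.0e8) ** 0.7

def river_inflow(v_today, v_tomorrow, rain_m, evap_m):
    """Back out Q_in from the daily water balance (all volumes in m^3)."""
    area = lake_area(v_today)
    return v_tomorrow - v_today - (rain_m - evap_m) * area

# volumes from the area-volume-depth curve; rain/evaporation in metres/day
print(river_inflow(9.0e7, 9.2e7, 0.001, 0.006), "m^3/day")
```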
APA, Harvard, Vancouver, ISO, and other styles
15

Campbell, Cory A. "The Changing Landscape of Finance in Higher Education: Bridging the Gap Through Data Analytics." Case Western Reserve University School of Graduate Studies / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=case1523021768570795.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Garringer, James. "The Role of Protocol Analysis in Cybersecurity: Closing the Gap on Undetected Data Breaches." Thesis, Utica College, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10974156.

Full text
Abstract:
Organizations of all sizes are targets for a cyberattack. Undetected data breaches result in the catastrophic loss of personally identifiable information (PII), causing considerable financial and reputational harm to organizations, while also imposing a risk of identity fraud on consumers. The purpose of this study was to consider the impact that undetected data breaches have on organizations, with an additional focus on shortening the gap between the time of data breach and the time of detection through manual protocol analysis and intrusion detection system (IDS) solutions. This research reviewed the available literature detailing the effects of undetected data breaches on organizations, as well as the advanced exploitation of protocols and anomaly detection through manual protocol analysis and IDS. Manual protocol analysis provides situational anomaly detection when compared to baseline network traffic, but implies privacy concerns and does not allow timely detection of most cyberattacks. Automated IDS stream-based flows allow quicker detection of cyberattacks. Network flow-based IDS misses hidden attacks due to the lack of a data payload, requiring manual analysis instead, while host-based IDS adversely affects the performance of the host computer but successfully identifies anomalies based on known signatures. This study recommended a complementary defense-in-depth solution which employs manual protocol analysis and both host-based and network-based IDS solutions as a viable strategy for reducing the time between data breach and detection. This study additionally recommended that security operations center personnel and IT departments receive protocol analysis training to support manual detection against a known network traffic baseline.
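To make the baseline idea concrete, here is a minimal protocol-analysis sketch using the scapy packet library: capture some TCP traffic and flag destination ports that fall outside a known baseline. The baseline set and packet count are assumptions, and real baselining considers far more than ports.

```python
from collections import Counter
from scapy.all import sniff, TCP   # pip install scapy; needs capture privileges

BASELINE_PORTS = {80, 443, 53, 22}  # assumed "normal" destination ports
counts = Counter()

def tally(pkt):
    if TCP in pkt:
        counts[pkt[TCP].dport] += 1

sniff(prn=tally, count=200, store=False)   # capture 200 packets, then stop
for port, n in counts.most_common():
    flag = "" if port in BASELINE_PORTS else "  <-- not in baseline"
    print(f"dport {port}: {n}{flag}")
```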
APA, Harvard, Vancouver, ISO, and other styles
17

Archer, Elizabeth. "Bridging the gap : optimising a feedback system for monitoring learner performance." Thesis, University of Pretoria, 2010. http://hdl.handle.net/2263/26608.

Full text
Abstract:
Globally, a wealth of educational data has been collected on learner performance in a bid to improve and monitor the quality of education. Unfortunately, the data seem to have had only limited influence on learning and teaching in classrooms. This thesis aimed to bridge this gap between the availability of learner performance data and their use in informing planning and action in schools. A design research approach was used to optimise the feedback system for the South African Monitoring system for Primary schools (SAMP). Design research aims to produce both an intervention to address a complex real-world challenge and to develop design guidelines to support other designers faced with similar challenges in their own context. In this research, the process of developing and improving the feedback system was also used to examine ways of facilitating the use of the feedback. Multiple cycles of design, implementation and evaluation of four different prototypes of the feedback system were conducted, employing evaluations from both experts (e.g. Dutch and South African academics, research and educational psychologists, instrument designers and teacher trainers) as well as school users (teachers, principals and HoDs). Mixed methods were employed throughout the study, with different sub-samples of school users sampled from the population of 22 schools (English, Afrikaans and Sepedi) in the Tshwane region participating in SAMP. The various research cycles incorporated interviews, observations, journals, questionnaires, the Delphi technique and expert evaluations to examine not only data-use, but also aspects such as problem-solving, planning, data-literacy and attitudes towards evidence-based practice in the schools. Data was analysed using Rasch Modelling, descriptive statistics and computer-aided qualitative data analysis. The study showed that an effective feedback system facilitates appropriate use through a gradual process of enlightenment, is flexible and responsive to user inputs, values collaboration and includes instrument, reporting and support components in its design. An optimum feedback system also positively influences school feedback and monitoring culture by providing opportunities for positive experiences with feedback and increasing data-literacy. This improves the chances of feedback being used for planning, decision-making and action in the schools. An effective feedback system must also offer a comprehensive package to accommodate different users, with various levels of data sophistication, functioning in diverse contexts. The research also showed that an effective feedback system mediates thinking about educational instruction and curriculum and can therefore be a potent change agent. Use of clear, simple, intuitive data presentation in the feedback system allows for experiential learning to increase user data-literacy. The design research approach employed in this study offers an appropriate and powerful approach to adapting, developing and optimising a feedback system. User involvement in design research ensures greater contextualisation and familiarity with the system, while engendering trust and a greater sense of ownership, all of which increase the receptiveness and responsiveness of users to feedback. 
Finally, the research also contributed design guidelines for other developers of feedback systems, an integrated conceptual framework for the use of monitoring feedback, and a functioning feedback system employed by 22 schools in the Tshwane region. Thesis (PhD)--University of Pretoria, 2011. Science, Mathematics and Technology Education.
APA, Harvard, Vancouver, ISO, and other styles
18

Yan, Mingjin. "Methods of Determining the Number of Clusters in a Data Set and a New Clustering Criterion." Diss., Virginia Tech, 2005. http://hdl.handle.net/10919/29957.

Full text
Abstract:
In cluster analysis, a fundamental problem is to determine the best estimate of the number of clusters, which has a deterministic effect on the clustering results. However, a limitation in current applications is that no convincingly acceptable solution to the best-number-of-clusters problem is available, due to the high complexity of real data sets. In this dissertation, we tackle this problem of estimating the number of clusters, with particular orientation toward processing very complicated data which may contain multiple types of cluster structure. Two new methods of choosing the number of clusters are proposed, which have been shown empirically to be highly effective given clear and distinct cluster structure in a data set. In addition, we propose a sequential type of clustering approach, called multi-layer clustering, obtained by combining these two methods. Multi-layer clustering not only functions as an efficient method of estimating the number of clusters but also, by superimposing a sequential idea, improves the flexibility and effectiveness of any arbitrary existing one-layer clustering method. Empirical studies have shown that multi-layer clustering has higher efficiency than one-layer clustering approaches, especially in detecting clusters in complicated data sets. The multi-layer clustering approach has been successfully applied to the WTCHP microarray data, and the results can be interpreted very well based on known biological knowledge. Choosing an appropriate clustering method is another critical step in clustering. K-means clustering is one of the most popular clustering techniques used in practice. However, the k-means method tends to generate clusters containing a nearly equal number of objects, which is referred to as the "equal-size" problem. We propose a clustering method which competes with the k-means method. Our newly defined method is aimed at overcoming the so-called "equal-size" problem associated with the k-means method, while maintaining its advantage of computational simplicity. Advantages of the proposed method over k-means clustering have been demonstrated empirically using simulated data with low dimensionality. Ph.D.
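The "equal-size" tendency is easy to reproduce: on an elongated majority cluster plus a tiny distant one, k-means with k = 2 often prefers to split the big cluster rather than isolate the small one, because that reduces within-cluster variance more. A synthetic sketch (not the dissertation's own experiment):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
big = rng.normal(0.0, 1.0, size=(500, 2)) * [6.0, 1.0]  # elongated, 500 points
small = rng.normal([8.0, 8.0], 0.3, size=(20, 2))        # tiny distant cluster
X = np.vstack([big, small])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels))   # roughly balanced sizes, not 500 vs 20
```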
APA, Harvard, Vancouver, ISO, and other styles
19

Ohme, Frank. "Bridging the gap between post-Newtonian theory and numerical relativity in gravitational-wave data analysis." PhD thesis, Universität Potsdam, 2012. http://opus.kobv.de/ubp/volltexte/2012/6034/.

Full text
Abstract:
One of the most exciting predictions of Einstein's theory of gravitation that has not yet been proven experimentally by a direct detection is gravitational waves. These are tiny distortions of spacetime itself, and a world-wide effort to directly measure them for the first time with a network of large-scale laser interferometers is currently ongoing and expected to provide positive results within this decade. One potential source of measurable gravitational waves is the inspiral and merger of two compact objects, such as binary black holes. Successfully finding their signature in the noise-dominated data of the detectors crucially relies on accurate predictions of what we are looking for. In this thesis, we present a detailed study of how the most complete waveform templates can be constructed by combining the results from (A) analytical expansions within the post-Newtonian framework and (B) numerical simulations of the full relativistic dynamics. We analyze various strategies to construct complete hybrid waveforms that consist of a post-Newtonian inspiral part matched to numerical-relativity data. We elaborate on existing approaches for nonspinning systems by extending the accessible parameter space and introducing an alternative scheme based in the Fourier domain. Our methods can now be readily applied to multiple spherical-harmonic modes and precessing systems. In addition to that, we analyze in detail the accuracy of hybrid waveforms with the goal of quantifying how numerous sources of error in the approximation techniques affect the application of such templates in real gravitational-wave searches. This is of major importance for the future construction of improved models, but also for the correct interpretation of gravitational-wave observations that are made utilizing any complete waveform family. In particular, we comprehensively discuss how long the numerical-relativity contribution to the signal has to be in order to make the resulting hybrids accurate enough, and for currently feasible simulation lengths we assess the physics one can potentially do with template-based searches.
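The core of hybrid construction is stitching an analytic inspiral onto numerical data over a matching window. A deliberately simplified sketch of that blending step on toy signals (real hybrids first align time, phase and amplitude, and often work in the Fourier domain):

```python
import numpy as np

t = np.linspace(0, 10, 2000)
pn = np.sin(t**2)                       # stand-in "post-Newtonian inspiral"
nr = np.sin(t**2) * (1 + 0.01 * t)      # stand-in "numerical-relativity data"

t1, t2 = 6.0, 7.0                       # matching window (assumed)
w = np.clip((t - t1) / (t2 - t1), 0.0, 1.0)   # 0 before t1, 1 after t2
hybrid = (1 - w) * pn + w * nr          # PN early, NR late, blended between
print(hybrid.shape)
```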
APA, Harvard, Vancouver, ISO, and other styles
20

Kolb, Dominik. "Printing the invisible : bridging the gap between data and matter through voxel-based 3D printing." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/112911.

Full text
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2017. Includes bibliographical references (pages 74-79). Scientific visualizations are central to the representation and communication of data in ways that are at once efficient and effective. Numerous data types have established unique formats of representation. In the context of three-dimensional (3D) data sets, such information is often presented as a 3D rendering, a video or an interactive application. The purpose of such visualizations is often to emulate the physical, three-dimensional world; however, they remain inherently virtual. Recent advancements in additive manufacturing are making it possible to 'physicalize' three-dimensional data through 3D printing. Still, most 3D printing methods are geared towards single-material printing workflows, devoid of the ability to physically visualize volumetric data with high fidelity matching their virtual origin. As a result, information and detail are compromised. To overcome this limitation, I propose, design and evaluate a workflow to 'physicalize' such data through multi-material 3D printing. The thesis focuses on methods for voxel-based additive fabrication, at high spatial resolution, of three-dimensional data sets including, but not limited to, point clouds, volumes, lines and graphs, and image stacks. This is achieved while maintaining the original data with high fidelity. I demonstrate that various data sets, often visualized through rasterization on screen, can be translated into physical, materially heterogeneous objects by means of multi-material, voxel-based 3D printing. This workflow, and the tools, techniques and technologies contained herein, enables bridging the gap between digital information presentation and physical material composition. The developed methods are experimentally tested with various data across scales, disciplines and problem contexts, including application domains such as biomedicine, physics and archeology.
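As a toy illustration of the voxel-level translation involved, the sketch below maps a scalar volume onto discrete material indices by thresholding each voxel. The thresholds and the three-way palette are assumptions; the thesis's workflow preserves far richer per-voxel information.

```python
import numpy as np

# Stand-in volumetric data, e.g. a normalized image stack (64^3 voxels).
vol = np.random.default_rng(0).random((64, 64, 64))

# Per-voxel material assignment: 0 = empty, 1 = soft, 2 = rigid.
materials = np.digitize(vol, bins=[0.4, 0.8])
print("voxels per material:", np.bincount(materials.ravel()))
```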
APA, Harvard, Vancouver, ISO, and other styles
21

Tashakor, Ghazal. "Delivering Business Intelligence Performance by Data Warehouse and ETL Tuning." Thesis, Mittuniversitetet, Institutionen för informationsteknologi och medier, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-20062.

Full text
Abstract:
The aim of this thesis is to show how organizations such as CGI attempt to introduce BI solutions through IT and other operational methods in order to serve large companies that want to strengthen their competitive market position. This aim is achieved by gap analysis of the BI roadmap and the available data warehouses, based on one of the company's projects, which was handed over to CGI from Lithuania. The fundamentals in achieving BI solutions through IT, which form the methodology of the thesis, are data warehousing, content analytics and performance management, data movement (Extract, Transform and Load), the CGI BI methodology, business process management, the TeliaSonera Maintenance Management Model (TSM3) and the CGI AM model at a high level. Part of the thesis requires research and practical work on Informatica PowerCenter and Microsoft SQL Server Management Studio, and on low-level details such as database tuning, DBMS tuning implementation and ETL workflow optimization. Keywords: BI, ETL, DW, DBMS, TSM3, AM, gap analysis
APA, Harvard, Vancouver, ISO, and other styles
22

Drago, Christian. "PresentAction: a disruptive mobile system designed for bridging the physical-digital gap in information sharing events." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-187030.

Full text
Abstract:
Since 1960, when the first slide projectors were manufactured and people started using slides to support the delivery of information to audiences in educational and institutional environments, not much has changed. Nowadays slide presentations are still based on the traditional interaction methods between presenter and audience, which limit the experience to the live event. Considering the worldwide rising number of smartphone users and the increasing interest of researchers in mobile education, this research aims to contribute to a marginally explored field: the space of mobile presentation systems. A mobile presentation system is intended to be a fully mobile service that supports presenter and audience during all the phases of a slide-based event and enhances the user experience by enabling interaction between physical and digital presentation spaces. The first goal of this work was to understand whether there is an underserved demand for innovation in the current presentation interaction model and what the perceived needs and wants are. People familiar with this field took part in interviews and surveys, conducted from both the presenter and the audience perspectives. The results showed that about 87% of the participants had previously experienced visibility problems during presentations, causing frustration and a lack of concentration that harmed both audience and presenter. 80% of the people use email to share their slide set with the audience, an ineffective way of reaching all the attendees. The ability to send/get the material was considered critical from both the presenter's and the audience's points of view, rated 3.3/5 and 3.9/5 respectively. Other major problems of a technical nature (software and hardware) emerged as often experienced by almost half the participants. This preliminary qualitative investigation built the basis of the PresentAction concept, which was conceived to solve the raised problems and to support presenter and audience during all the phases of a presentation (before, during and after the live session) in a single solution. The concept was the guideline for the designed and developed system. PresentAction lets any connected smartphone user, regardless of his/her technical skills, run slideshows from already existing slide decks anywhere and at any time, and automatically creates complete, high-quality, interactive, updatable learning objects delivered to the attending users through smart content delivery methods. The system was experimentally validated, both technically (several performance measurements) and from the end-user experience point of view (system trial and surveys). The users were very satisfied, rating the developed system at 4.25/5, and more than 80% of the participants were willing to adopt the system both as presenters and as audience. The users emphasized in particular the usefulness of being able to run presentations even without the support of a projector (connecting presenter and audience devices directly) and the possibility of creating a personal, mobile portfolio of "talking" slides from the events they attended. The performed experiments also demonstrated that pre-fetching-based slide distribution is fundamental for increasing the number of concurrent users in the audience while keeping a real-time feel for the presentation, with limited delays in the transition between slides on their mobile devices. Finally, a new concept of slides as multimedia learning objects, merging images with audio and supporting a variety of digital on-demand services, was developed and demonstrated to be a superior solution, in terms of both communication costs and user experience, to accessing a complete raw video of the presentation event. Experimental evidence showed that this approach can require as little as 1/945th of the communication resources of current video-based solutions.
APA, Harvard, Vancouver, ISO, and other styles
23

Dumas, Jeremiah Percy. "A spatial decision support system utilizing data from the Gap Analysis Program and a Bayesian Belief Network." Master's thesis, Mississippi State : Mississippi State University, 2005. http://library.msstate.edu/etd/show.asp?etd=etd-07072005-104946.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Akinpelu, Mobolaji Olatokunbo. "Scaling Success : learning from education intervention programs to close the racial education achievement gap." Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/104821.

Full text
Abstract:
Thesis: S.M. in Technology and Policy, Massachusetts Institute of Technology, School of Engineering, Institute for Data, Systems, and Society, Technology and Policy Program, 2016. Includes bibliographical references (pages 105-110). An overview of American education reveals a concerning pattern: when outcomes are disaggregated by race, students from certain racial minority groups often have poorer outcomes than White students. This pattern, the racial education achievement gap, can be seen in different sorts of measures from the literature, including in the low representation of minority students at elite public institutions. To address this low representation, and to keep universities racially diverse, administrators and policymakers often turn to race-based affirmative action, the explicit (and contentious) consideration of an applicant's race in admissions decisions. College-centered education intervention programs are another tool administrators and policymakers use to address the gap reflected in elite college enrollment and to keep campuses diverse. This thesis asks how do, and how can, appropriately designed college-based education intervention programs help both to keep racial diversity and to close the racial educational achievement gap in America's colleges? To this end, chapter one lays out the motivating issues: the gap, affirmative action, and education intervention programs; chapter two contains the case study of two successful programs, focusing on the programs' designs, the participants' experiences, and the conditions that foster academic excellence in minority students; chapter three, in part using causal loop diagrams from system dynamics modelling, makes the case for appreciating education as a complex system, one with interlocking political, economic, pedagogic, and sociocultural forces, and thus urges caution in drawing conclusions from chapter two; and chapter four, drawing from the two preceding chapters, proposes three policy recommendations to improve not just the presence of minority students at selective institutions but, more importantly, their overall academic thriving.
APA, Harvard, Vancouver, ISO, and other styles
25

Schreiber, Werner. "GIS and EUREPGAP : applying GIS to increase effective farm management in accordance GAP requirements." Thesis, Stellenbosch : Stellenbosch University, 2003. http://hdl.handle.net/10019.1/53440.

Full text
Abstract:
Thesis (MSc)--Stellenbosch University, 2003.
With the inception of precision farming techniques during the last decade, agricultural efficiency has improved, leading to greater productivity and enhanced economic benefits associated with agriculture. Awareness of the health risks associated with food-borne diseases has also increased. Systems such as Hazard Analysis and Critical Control Points (HACCP) in the USA and Good Agricultural Practices (GAP) in Europe try to ensure that no food showing signs of microbial contamination associated with production techniques is allowed onto the export market. Growers participating in exporting are thus being forced to conform to the requirements set by international customers. The aim of this study was to compile a computerized record-keeping system that would aid farmers with the implementation of GAP on farms by making use of GIS capabilities. A database consisting of GAP-specific data was developed. ArcView GIS was used to implement the database, while customized analysis procedures, created with Avenue, assisted in GAP-specific farming-related decisions. An agricultural area focused on the export market was needed for this study, and the nut-producing Levubu district was identified as ideal. By making use of ArcView GIS, distinct relationships between different data sets were portrayed in tabular, graphical, geographical and report format. GAP requirements state that growers must base decisions on timely, relevant information. With information available in the above-mentioned formats, decisions regarding actions taken can be justified. By analysing the complex interaction between datasets, the influences that agronomic inputs have on production were portrayed, moving beyond the standard requirements of GAP. Agricultural activities produce enormous quantities of data, and GIS proved to be an indispensable tool because of its ability to analyse and manipulate data with a spatial component. The implementation of good agricultural practices lends itself to the use of GIS. With the correct information available at the right time, better decisions can promote optimal cropping whilst minimizing the negative effects on the consumer and environment.
APA, Harvard, Vancouver, ISO, and other styles
26

King, Caleb B. "Bridging the Gap: Selected Problems in Model Specification, Estimation, and Optimal Design from Reliability and Lifetime Data Analysis." Diss., Virginia Tech, 2015. http://hdl.handle.net/10919/73165.

Full text
Abstract:
Understanding the lifetime behavior of their products is crucial to the success of any company in the manufacturing and engineering industries. Statistical methods for lifetime data are a key component in achieving this level of understanding. Sometimes a statistical procedure must be updated to be adequate for modeling specific data, as is discussed in Chapter 2. However, there are cases in which the methods used in industrial standards are themselves inadequate. This is distressing, as more appropriate statistical methods are available but remain unused. The research in Chapter 4 deals with such a situation. The research in Chapter 3 serves as a combination of both scenarios and represents how statisticians and engineers from industry can join together to yield beautiful results. After introducing basic concepts and notation in Chapter 1, Chapter 2 focuses on lifetime prediction for a product consisting of multiple components. During the production period, some components may be upgraded or replaced, resulting in a new "generation" of component. Incorporating this information into a competing risks model can greatly improve the accuracy of lifetime prediction. A generalized competing risks model is proposed and simulation is used to assess its performance. In Chapter 3, optimal and compromise test plans are proposed for constant amplitude fatigue testing. These test plans are based on a nonlinear physical model from the fatigue literature that better captures the nonlinear behavior of fatigue life and accounts for effects from the testing environment. Sensitivity to the design parameters and modeling assumptions is investigated, and planning strategies are suggested. Chapter 4 considers the analysis of ADDT data for the purpose of estimating a thermal index. The current industry standards use a two-step procedure involving least squares regression in each step. The methodology preferred in the statistical literature is the maximum likelihood procedure. A comparison of the procedures is performed and two published datasets are used as motivating examples. The maximum likelihood procedure is presented as a more viable alternative to the two-step procedure due to its ability to quantify uncertainty in data inference and its modeling flexibility.
Ph. D.
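The least-squares versus maximum-likelihood contrast mentioned at the end of this abstract can be illustrated generically. The sketch below is not the thesis's ADDT model; it simply fits a Weibull lifetime distribution by maximum likelihood with scipy, on failure times simulated for the example.

```python
# Minimal illustration of maximum-likelihood fitting for lifetime data.
# The data are simulated; the "true" parameters are assumptions chosen
# only so the recovered estimates can be checked by eye.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
true_shape, true_scale = 2.0, 1000.0
lifetimes = true_scale * rng.weibull(true_shape, size=200)

# scipy maximizes the likelihood internally; floc=0 pins the location.
shape, loc, scale = stats.weibull_min.fit(lifetimes, floc=0)
print(f"MLE shape={shape:.2f}, scale={scale:.0f}")

# Unlike a two-step least-squares fit, the likelihood gives direct
# access to uncertainty, e.g. through the log-likelihood surface.
loglik = np.sum(stats.weibull_min.logpdf(lifetimes, shape, loc, scale))
print(f"log-likelihood at the MLE: {loglik:.1f}")
```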
APA, Harvard, Vancouver, ISO, and other styles
27

Gulati, Mayank. "Bridging Sim-to-Real Gap in Offline Reinforcement Learning for Antenna Tilt Control in Cellular Networks." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-292948.

Full text
Abstract:
Antenna tilt is the angle between the antenna's radiation beam and the horizontal plane. This angle plays a vital role in determining the coverage of the network and its interference with neighbouring cells and adjacent base stations. Traditional methods for network optimization rely on rule-based heuristics for antenna-tilt decision making to achieve desired network characteristics. However, these methods are quite brittle and incapable of capturing the dynamics of communication traffic. Recent advancements in reinforcement learning have made it a viable solution to this problem, but even this learning approach is either limited to its simulation environment or limited to off-policy offline learning. So far, no effort has been made to overcome these limitations and make it applicable in the real world. This work proposes a method for transferring reinforcement learning policies from a simulated environment to a real environment, i.e. sim-to-real transfer, through the use of offline learning. The approach makes use of a simulated environment and a fixed dataset to compensate for the aforementioned limitations. The proposed sim-to-real transfer technique utilizes a hybrid policy model, composed of a portion trained in simulation and a portion trained on offline real-world data from cellular networks. This makes it possible to merge samples from the real-world data into the simulated environment, consequently modifying the standard reinforcement learning training procedure through knowledge sharing between the two environments' representations. On the one hand, simulation achieves better generalization performance than conventional offline learning, as it complements offline learning with learning through unseen simulated trajectories. On the other hand, the offline learning procedure helps close the sim-to-real gap by exposing the agent to real-world data samples. Consequently, this transfer learning regime enables us to establish optimal antenna-tilt control, which in turn results in improved coverage and reduced interference with neighbouring cells in the cellular network.
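The hybrid training idea described in this abstract, mixing a fixed real-world dataset with fresh simulator transitions during policy learning, can be sketched in miniature. Everything below (the toy environment, the discrete tilt actions, the reward and the tabular Q-learning update) is an illustrative assumption, not the thesis's actual method.

```python
# Toy sketch of hybrid sim + offline training: half the Q-learning
# updates are drawn from a fixed "real-world" dataset, half from a
# stand-in simulator, mirroring the knowledge sharing described above.
import random

TILTS = range(5)                 # discrete antenna-tilt actions (assumption)

def simulate_step(state, action):
    """Stand-in simulator: reward peaks when the tilt matches the state."""
    reward = -abs(state - action)
    next_state = random.choice(list(TILTS))
    return reward, next_state

# Fixed offline dataset of (state, action, reward, next_state) tuples,
# standing in for logged transitions from a real cellular network.
real_data = [(s, a, -abs(s - a) + random.uniform(-0.5, 0.5),
              random.choice(list(TILTS)))
             for s in TILTS for a in TILTS]

Q = {(s, a): 0.0 for s in TILTS for a in TILTS}
alpha, gamma = 0.1, 0.9

for step in range(5000):
    if step % 2 == 0:                      # sample from real logged data
        s, a, r, s2 = random.choice(real_data)
    else:                                  # sample a fresh simulated step
        s, a = random.choice(list(TILTS)), random.choice(list(TILTS))
        r, s2 = simulate_step(s, a)
    best_next = max(Q[(s2, a2)] for a2 in TILTS)
    Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])

policy = {s: max(TILTS, key=lambda a: Q[(s, a)]) for s in TILTS}
print(policy)   # learned tilt choice per state
```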
APA, Harvard, Vancouver, ISO, and other styles
28

Lennmor, Lynn. "Mind the Gap… A Case Study about Cross-functional Collaboration between Teams in Game Development." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-254998.

Full text
Abstract:
Game development today is a complex process that differs from traditional software development by presenting unique challenges stemming from a multidisciplinary structured process, including teams from multiple fields such as art, sound, programming, design, human factors and more. This, together with the growth of the industry in recent years, has increased the need for more efficient cross-functional collaboration and understanding between these teams. This study focuses on the collaboration and understanding between two distinct teams, User Research (UR) and Development, in order to shed some light on an emerging challenge: a gap in understanding that exists between the two fields. A case study was conducted at an established game company in Sweden, where a UR team was closely observed and analyzed. The results showed that the issues and practices could be grouped into three areas, Process, Communication, and Understanding, that affected each other in different ways, with the majority of the issues relating to Communication and Understanding. The findings provide a glimpse of the gap in understanding in a game development process, the problems it can entail, and the possible solutions that could streamline the process. However, in order to fully understand and fill this gap, more thorough observations over a longer period of time are required.
APA, Harvard, Vancouver, ISO, and other styles
29

MacKintosh, Hamish. "Developing the silviculture of continuous cover forestry : using the data and experience collected from the Glentress Trial Area." Thesis, University of Edinburgh, 2013. http://hdl.handle.net/1842/7943.

Full text
Abstract:
Continuous Cover Forestry (CCF) has become increasingly popular since the early 1990s. CCF utilises several silvicultural techniques in order to promote and enhance forest structural diversity, and it favours natural regeneration. As CCF is relatively new to the UK, there are still areas of knowledge regarding management interventions that need to be improved. This study applies simple models, seedling physiology measurements and a hybrid gap model to the Glentress Trial Area, which has been under transformation from even-aged forestry since 1952. These efforts have led to an improved understanding of thinning interventions and the effects they may have on future stand structure. Since the formation of the Forestry Commission in 1919, clearfell-replant forestry has been the main form of management practised in the UK. CCF management differs in several respects and is commonly practised using expert knowledge in continental Europe. In the UK the knowledge base is still growing, and therefore simple models can prove useful for guiding management. This study investigated the use of the idealised reverse-J diameter distribution and the Equilibrium Growing Stock (EGS). The study found that the reverse-J shaped diameter distribution is maintained at the Trial, Block and sub-Block scale, indicating that an irregular structure is being approached. In addition, the diminution coefficient, a parameter of the reverse-J distribution, falls within values typical of continental Europe. Comparison of the actual diameter-frequency distribution against an ideal reverse-J distribution can inform both thinning intensity and which diameter classes to target. The EGS, which is a volume-diameter distribution, examines standing volume and how that volume is distributed across three broad diameter classes. Typical distributions from the Swiss Jura indicate that percentage volume should be split 20:30:50 across the diameter classes. The EGS analysis showed that standing volume in the Trial Area is much lower than European values at just 174 m3 ha-1. In addition, the classic 20:30:50 percent split was not observed: the 1990 data set showed a 49:43:8 distribution, but by 2008 it was 40:41:19. As natural regeneration is favoured in CCF, a better understanding of seedling physiology is essential. This study established open (15-35 m2 ha-1) and closed (>35 m2 ha-1) canopy plots. Plot characteristics were recorded, and seedlings were selected for physical measurements, chlorophyll fluorescence and gas-exchange measurements. There were clear differences in physical characteristics, with a mean Apical Dominance Ratio (APR) of 1.41 for the open plots and 0.9 for the closed plots, consistent with previous studies suggesting that an APR of 1 is needed for successful regeneration. The chlorophyll fluorescence measurements showed a linear relationship with PAR. However, although the gas-exchange measurements showed an increase in photosynthetic rates with PAR in the open plots, there was no obvious relationship in the closed plots. As a result, the study did not find a linear relationship between photosynthetic rate and chlorophyll fluorescence. Finally, a complex hybrid gap model was used to investigate the effects of management on predicted future stand structure. The hybrid gap model, PICUS v1.41, was parameterised for Sitka spruce and used to explore different management scenarios for stand structure over two time periods: 1954-2008 and 1952-2075. The output from the group selection with underplanting scenario, which resembled the actual management, was realistic and comparable to the stand characteristics measured during the 2008 assessment. The output from the 1952-2075 runs suggested that thinning to a residual basal area low enough to allow natural regeneration (<30 m2 ha-1), or a group selection with underplanting, were the best management options for maintaining structural diversity.
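For reference, one common formulation of the reverse-J diameter distribution and its diminution coefficient is de Liocourt's geometric series; the notation below is chosen for illustration and is not taken from the thesis.

```latex
% Stem count N_i in the i-th diameter class falls by a constant
% factor q (the diminution coefficient) from one class to the next:
\[
  N_{i+1} = \frac{N_i}{q}
  \qquad\Longrightarrow\qquad
  N_i = N_1 \, q^{-(i-1)} .
\]
% Equivalently, as a continuous negative exponential in diameter d,
% with class width \Delta d and fitted constants k and a:
\[
  N(d) = k \, e^{-a d},
  \qquad
  q = e^{a \, \Delta d} .
\]
```

On this formulation, comparing an observed diameter-frequency distribution against the ideal curve, as the study does, amounts to checking how stable q is across diameter classes.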
APA, Harvard, Vancouver, ISO, and other styles
30

Ren, Xiaoyuan S. M. (Xiaoyuan Charlene) Massachusetts Institute of Technology. "Mining the gap : pathways towards an integrated water, sanitation and health framework for outbreak control in rural India." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/111396.

Full text
Abstract:
Thesis: S.M. in Technology and Policy, Massachusetts Institute of Technology, School of Engineering, Institute for Data, Systems, and Society, Technology and Policy Program, 2017. Cataloged from student-submitted PDF version of thesis. Includes bibliographical references (pages 173-176).
The scientific connection between sanitation, water quality and health is well established. However, in the present Indian scenario, monitoring and governance of the three sectors are handled separately. At present, the need to integrate sanitation, water quality, and health is felt during waterborne disease outbreaks such as large-scale diarrhea, typhoid or cholera. Despite the general interest shown in a cross-sector integrated framework for outbreak control, numerous administrative and technical gaps prevent the implementation of such a framework. This study attempts to address these implementation barriers through an analysis of governing institutions and data integration of large public databases for selected districts of Gujarat, India. The interagency collaboration barrier is analyzed through a comprehensive institutional analysis of the water, sanitation and health monitoring sectors. The lack of administrative incentive due to the narrow definition of monitoring targets is identified as the primary barrier to collaboration. Districts that have already achieved 100% open-defecation-free status are identified as key entry points for potential pilot implementation of an integrated framework. The National Informatics Centre and the Water and Sanitation Management Organization (WASMO) are considered key nodal points for building channels of interagency connection. Data integration and utilization barriers are analyzed through habitation-level matching of the three separate monitoring databases - namely, the Swachh Bharat Mission (SBM) database for sanitation, the Integrated Management Information System (IMIS) database for rural drinking water quality and the Integrated Disease Surveillance Programme (IDSP) for outbreak data. The most critical data barrier is the discrepancy between administrative units across the databases, resulting in 25% mismatched habitation data and variables with 30% contradictory data entries. Quality concerns over inconsistent and missing data are also raised, especially for data collected by grassroots workers. A decision support model based on the integrated database is constructed through a Driver-Pressure-State-Exposure-Effect-Action (DPSEEA) framework. A significant correlation is observed along the chains connecting sanitation initiatives and water quality. Significant risk factors associated with outbreak occurrence cannot be identified at the current stage. Even though implementing this model is within reach, and doing so promises to offer an efficient tool for integrated governance of the three sectors, incomplete datasets are currently the key barrier to a comprehensive assessment of model effectiveness.
by Xiaoyuan "Charlene" Ren. S.M. in Technology and Policy
APA, Harvard, Vancouver, ISO, and other styles
31

Nordström, Fanny, and Claudia Järvelä. "Digital Competencies and Data Literacy in Digital Transformations : Experience from the Technology Consultants." Thesis, Uppsala universitet, Företagsekonomiska institutionen, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-450947.

Full text
Abstract:
The digital revolution is challenging both individuals and organizations to become more comfortable using various digital technologies. Digital technologies enable and generate large amounts of data, but people are not very good at interpreting or making sense of it. This study aimed to explore the role of digital competencies and data literacy in digital transformations and to identify the consequences that a lack of digital competencies and data literacy can cause within digital transformation projects. The authors studied the perspectives of technology consultants with experience in digital transformation projects, using an exploratory qualitative research design built on empirical data gathered from semi-structured interviews. The authors found that the technology consultants perceived digital competencies as crucial skills for individuals in digital transformations, while data literacy was not considered a crucial skill in that context. Regarding the consequences of a digital skills gap, the technology consultants saw issues with project implementation, delays, and indirect waste of resources such as monetary assets.
APA, Harvard, Vancouver, ISO, and other styles
32

Hughes, Bridget Y. "Collective impact: Closing the gap in educational outcomes for Aboriginal and Torres Strait Islander peoples in Queensland." Thesis, Queensland University of Technology, 2022. https://eprints.qut.edu.au/230011/1/Bridget_Hughes_Thesis.pdf.

Full text
Abstract:
This thesis examined the educational outcomes of Indigenous children enrolled in Queensland state (public) primary schools from the perspective of the collective and social impact of programs and services. The study used quantifiable data to show that the gap is not closing, despite improvements in attendance and in literacy and numeracy achievement levels in certain regions of Queensland.
APA, Harvard, Vancouver, ISO, and other styles
33

Ohme, Frank [Verfasser], and Bernard F. [Akademischer Betreuer] Schutz. "Bridging the gap between post-Newtonian theory and numerical relativity in gravitational-wave data analysis / Frank Ohme. Betreuer: Bernard F. Schutz." Potsdam : Universitätsbibliothek der Universität Potsdam, 2012. http://d-nb.info/1024613771/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Slámová, Gabriela. "Analýza zpracování osobních údajů podle Nařízení GDPR." Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2018. http://www.nusl.cz/ntk/nusl-377967.

Full text
Abstract:
This diploma thesis deals with the proposal of a personal data protection system according to the General Data Protection Regulation in the organization Dentalife s.r.o. The proposal was developed on the basis of an analysis of the current situation, which revealed serious shortcomings in compliance with the General Data Protection Regulation. Based on the identified deficiencies, a recommendation has been drawn up which, if subsequently implemented, will bring the current situation into line with the Regulation. The topic of the diploma thesis was selected primarily because of its topicality and the lack of materials describing and explaining the individual steps of the whole process of analysis and implementation.
APA, Harvard, Vancouver, ISO, and other styles
35

Campos, Maria Manuel Trindade. "Public sector wage gap and fiscal adjustments on the run-up to the euro area." Master's thesis, Instituto Superior de Economia e Gestão, 2011. http://hdl.handle.net/10400.5/4310.

Full text
Abstract:
Mestrado em Econometria Aplicada e Previsão.
This study examines the fiscal adjustments that took place in the run-up to the euro area and how they were reflected in the functioning of public sector labour markets in euro area countries. OECD data are used to identify and characterize episodes of fiscal consolidation in a broad set of countries within the 1983-2001 time-frame, focusing in particular on those corresponding to the euro area founding Member States and to the 1993-1997 period. To assess developments in compensation of employees, and how the occurrence of these episodes affected public sector employment and wage growth in countries that in the 1990s were engaged in fulfilling the Maastricht criteria, microeconomic data drawn from the European Community Household Panel are used. These data are also employed to estimate the public-private wage gap, using a novel approach that allows the estimation of quantile regressions accounting for individual-specific fixed effects. Results suggest that, in the run-up to the euro area, macroeconomic and interest rate conditions made it easier to comply with the Maastricht criteria without requiring particularly strong primary expenditure cuts. Regarding, more specifically, the expenditure on compensation of employees, there is evidence of relative moderation in the admission of civil servants, in wage growth and in the evolution of public-private wage gaps, but it is not striking and was reversed shortly after the assessment of the criteria. This may explain why none of the fiscal adjustments identified in euro area countries in 1993-1997 was successful in persistently reducing public debt ratios.
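The quantile-regression idea behind the wage-gap estimates here can be illustrated with a generic baseline. The thesis's fixed-effects quantile estimator is described as novel, so the sketch below only approximates the task: it estimates a public-sector premium at several quantiles with statsmodels, absorbing individual effects with dummies; the file name and variable names are invented.

```python
# Baseline illustration: public-sector wage premium at three quantiles,
# with individual fixed effects absorbed naively via C(person_id) dummies.
# This is a simple stand-in, not the thesis's estimator.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("echp_sample.csv")   # hypothetical ECHP extract

for q in (0.1, 0.5, 0.9):
    fit = smf.quantreg(
        "log_wage ~ public + experience + C(person_id)", df
    ).fit(q=q)
    print(f"q={q}: public-sector premium = {fit.params['public']:.3f}")
```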
APA, Harvard, Vancouver, ISO, and other styles
36

Stauch, Vanessa Juliane. "Data-led methods for the analysis and interpretation of eddy covariance observations." PhD thesis, kostenfrei, 2006. http://opus.kobv.de/ubp/volltexte/2007/1238/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Picher, Andrea. "The technology gap and the emergence of French and German industrial policy in the domain of data processing and computers, 1960--1970." Thesis, University of Ottawa (Canada), 2003. http://hdl.handle.net/10393/26389.

Full text
Abstract:
The idea that a technology gap existed between the United States of America and Western Europe emerged in the early 1960s. Western Europeans attributed the gap to a dramatic increase in direct American investment, government support for R&D, firm size, and the brain drain, while American scholars argued that the roots of the gap were the archaic educational systems and the hierarchical social structures of Western Europe. In order to support their national computer industries against American competition, French and German policy makers chose to counter the technology gap by developing national support programs. Although both countries responded to the same socio-economic problem, the resulting industrial programs differed fundamentally. The purpose of this study is to gain an understanding of industrial policy in the 1960s and, through a comparative analysis, to show how industrial policy is shaped by political and cultural aspects within individual countries.
APA, Harvard, Vancouver, ISO, and other styles
38

Patrick, Michele Colleen. "'CLOSING THE GAP' Negotiating Alignment with Australia's First Peoples. A Comparative Discourse Analysis of the 2017 speeches presented by Australian Political Leaders." Thesis, Griffith University, 2018. http://hdl.handle.net/10072/378152.

Full text
Abstract:
In Australia, Closing the Gap is a high-profile federal government policy aimed at closing the gap of disadvantage between Australia's First Peoples and non-Indigenous Australians. The policy comprises a yearly report providing statistical data on the progress of the initiative. As a significant parliamentary contribution towards the ideology of reconciliation in Australia, political leaders present a national address responding to the statistical data of the report. This thesis presents a combined discourse analysis of the speeches presented in 2017 by Prime Minister Malcolm Turnbull and Leader of the Opposition Bill Shorten. Being a political discourse analysis, it focuses on the language features used by Australian political leaders to support their political ideology. Michèle Koven (2002) presented a model that explained how political leaders align (or misalign) themselves with other social actors. This research adapts that model to identify how these leaders position themselves ideologically through their Closing the Gap speeches. Using critical discourse analysis, it then presents a typology of the discursive strategies used in such political discourses when negotiating an ideological alignment with Australia's First Peoples. These two approaches are further justified with two supporting analyses. This comparative analysis contributes to a clearer understanding of how political language is used in Australia. Additionally, it contributes to the surprisingly minimal literature on Australian political discourse analysis surrounding Indigenous issues, reconciliation and the Closing the Gap policy itself. Through the analysis of such political speeches, reflection, engagement and empowerment gain the capacity to influence institutionalised notions of racism, poverty and class consciousness, with a view to rectifying them.
Thesis (Masters). Master of Arts Research (MARes). School of Hum, Lang & Soc Sc. Arts, Education and Law. Full Text
APA, Harvard, Vancouver, ISO, and other styles
39

Aich, Sudipto. "Evaluation of Driver Performance While Making Unprotected Intersection Turns Utilizing Naturalistic Data Integration Methods." Thesis, Virginia Tech, 2011. http://hdl.handle.net/10919/76892.

Full text
Abstract:
Among all vehicle crashes that occur annually, intersection-related crashes are over-represented. The research conducted here uses an empirical approach to study driver behavior at intersections in a naturalistic paradigm. A data-mining algorithm was used to aggregate data from two different naturalistic databases to obtain instances of unprotected turns at the same intersection. Several dependent variables were analyzed, including visual entropy, mean duration of glances to locations in the driver's view, and gap acceptance/rejection time. Kinematic dependent variables included peak/average speed and peak longitudinal and lateral acceleration. Results indicated that visual entropy and peak speed differ amongst drivers of the three age groups (older, middle-age, teen) in the presence of traffic in the intersecting streams while negotiating a left turn. Differences in gap acceptance times were not significant but approached significance, with older drivers accepting larger gaps than the younger teen drivers. Significant differences were observed for peak speed and average speed during a left turn, with younger drivers exhibiting higher values for both. Overall, this research has contributed to two types of engineering application. First, the analyses of traffic levels, gap acceptance, and gap non-acceptance represented exploratory efforts that ventured into new areas of technical content using newly available naturalistic driving data. Second, the findings from this thesis are among the few that can be used to inform the further development, refinement, and testing of technology (and training) solutions intended to assist drivers in making successful turns and avoiding crashes at intersections.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
40

Sokolova, Karina. "Bridging the gap between Privacy by Design and mobile systems by patterns." Thesis, Troyes, 2016. http://www.theses.fr/2016TROY0008/document.

Full text
Abstract:
Nowadays, smartphones and smart tablets generate, receive, store and transfer substantial quantities of data, providing services for all possible user needs through easily installable programs, also known as mobile applications. A number of sensors integrated into smartphones allow the devices to collect very precise information about the owner and his environment at any time. The resulting flow of personal and business data becomes hard to manage. The "Privacy by Design" approach, with its seven privacy principles, states that privacy can be integrated into any system from the software design stage. In Europe, the Data Protection Directive (Directive 95/46/EC) includes "Privacy by Design" principles. The new General Data Protection Regulation enforces privacy protection in the European Union, taking into account modern technologies such as mobile systems and making "Privacy by Design" not only a benefit for users but also a legal obligation for system designers and developers. The goal of this thesis is to propose pattern-oriented solutions to cope with mobile privacy problems, such as lack of transparency, lack of consent, poor security and disregard for purpose limitation, thus giving mobile systems more Privacy by (re)Design.
APA, Harvard, Vancouver, ISO, and other styles
41

Dahan, Danielle (Danielle S. ). "Whose fault is it anyway? evaluating the energy efficiency gap in commercial buildings & measuring energy savings associated with fault detection and diagnostics." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/115034.

Full text
Abstract:
Thesis: S.M. in Technology and Policy, Massachusetts Institute of Technology, School of Engineering, Institute for Data, Systems, and Society, Technology and Policy Program, 2017. Cataloged from PDF version of thesis. Includes bibliographical references.
According to the International Energy Agency, energy efficiency programs make up 72% of global greenhouse gas abatement strategies. However, extensive literature shows compelling evidence for an "energy efficiency gap" in which expected savings from energy efficiency programs are not realized. Given the importance of energy efficiency in global climate mitigation, as well as the significant federal, state, and local budgets for energy efficiency, there is a clear need for further research to evaluate the energy efficiency gap and prioritize methods for reducing it. Further, there is significantly less research on the gap as it applies to commercial buildings; the majority of research does not take advantage of advances in available statistical modeling techniques; and there is very limited research evaluating the gap as it applies to the new field of fault detection and diagnostics (FDD). With FDD, building owners are able to closely monitor, on an ongoing basis, any faults that begin to occur in a commercial building that can waste energy and contribute to the energy efficiency gap. However, there has been very little research evaluating these systems in real buildings and calculating their energy efficiency impact. This thesis proposes and tests a modeling approach using novel machine learning algorithms to estimate counterfactual energy usage in real buildings and calculate the energy savings associated with an existing FDD system. I take advantage of high-frequency 15-minute interval electricity, chilled water, and steam energy usage data over several years in four campus buildings. I then compare the accuracy of these models, applied to brand-new data, using three machine learning modeling techniques: the Lasso model, ridge regression, and an elastic net model. Finally, I applied these models to 8 time periods in which the existing FDD system identified a fault, thus isolating the energy impact of the fault. With this approach, I found that each of the three modeling techniques outperformed the other two in at least one of the models, indicating a likely benefit from using all three approaches in building energy modeling. Further, I found that the models are likely able to isolate the energy increase associated with these faults, with some models yielding higher confidence than others. In addition to the overall average increase in energy, the faults showed consistent shifts in the daily load profile after the fault occurred. Overall, the faults yielded monthly energy cost increases of $800-$1600 each. This methodology could therefore be used in more buildings and with different types of fault detection and diagnostics systems to better evaluate the benefits of FDD software across applications. By using this method more extensively, we can better inform policy that can in turn aim to reduce the energy efficiency gap in commercial buildings.
by Danielle Dahan. S.M. in Technology and Policy
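The counterfactual modelling strategy this abstract describes can be sketched with standard tools. The following is a minimal illustration, not the thesis's pipeline: it fits the three regularized regressions with scikit-learn on pre-fault data, selects the best on a held-out window, and uses it to predict what the building would have used after the fault. The file name, covariates and fault date are assumptions.

```python
# Minimal counterfactual sketch: train Lasso/ridge/elastic-net models on
# pre-fault energy data, pick the best on a held-out window, then
# predict post-fault usage to estimate the fault's excess energy.
import pandas as pd
from sklearn.linear_model import LassoCV, RidgeCV, ElasticNetCV
from sklearn.metrics import mean_absolute_error

df = pd.read_csv("building_15min.csv", parse_dates=["timestamp"])  # hypothetical
df["hour"] = df["timestamp"].dt.hour
df["weekday"] = df["timestamp"].dt.weekday
features = ["hour", "weekday", "outdoor_temp"]        # assumed covariates
target = "electricity_kwh"                            # assumed meter column

fault_start = pd.Timestamp("2016-03-01")              # assumed fault date
pre = df[df["timestamp"] < fault_start]
post = df[df["timestamp"] >= fault_start]
cut = int(len(pre) * 0.8)                             # time-ordered holdout
train, holdout = pre.iloc[:cut], pre.iloc[cut:]

best_name, best_model, best_err = None, None, float("inf")
for name, model in {"lasso": LassoCV(cv=5), "ridge": RidgeCV(),
                    "elastic net": ElasticNetCV(cv=5)}.items():
    model.fit(train[features], train[target])
    err = mean_absolute_error(holdout[target],
                              model.predict(holdout[features]))
    if err < best_err:
        best_name, best_model, best_err = name, model, err

# Counterfactual: predicted no-fault usage; the gap is the fault's cost.
excess_kwh = (post[target] - best_model.predict(post[features])).sum()
print(f"best model: {best_name}; estimated excess energy: {excess_kwh:.0f} kWh")
```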
APA, Harvard, Vancouver, ISO, and other styles
42

Baitalmal, Mohammad Hamza. "A Grounded Theory Model of the Relationship between Big Data and an Analytics Driven Supply Chain Competitive Strategy." Thesis, University of North Texas, 2018. https://digital.library.unt.edu/ark:/67531/metadc1404511/.

Full text
Abstract:
The technology for storing and using big data is evolving rapidly, and those who can keep pace are likely to garner additional competitive advantages. One approach to uncovering existing practice in a manner that provides insights for building theory is the use of grounded theory. The current research employs qualitative methods following a grounded theory approach to explore the gap in understanding of the relationship between big data (BD) and the supply chain (SC). In this study eight constructs emerged: organizational and environmental factors, big data and supply chain analytics, alignment, data governance, big data capabilities, cost of quality, risk analysis and supply chain performance. This research contributes a new theoretical framework that provides researchers and practitioners with the ability to visualize the relationship between the collection and use of BD and the SC. The framework provides a model for future researchers to test the relationships posited and to continue extending understanding of how BD can benefit SC practice. While it is anticipated that the proposed framework will evolve as a result of future examination and enhanced understanding of the relationships shown, the framework presented represents a critical first step for moving the literature and practice forward.
APA, Harvard, Vancouver, ISO, and other styles
43

Kransell, Martin. "The Value of Data Regarding Traceable Attributes in a New Era of Agriculture : Bridging the Information Gap Between Consumers and Producers of Organic Meat." Thesis, Linnéuniversitetet, Institutionen för informatik (IK), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-35089.

Full text
Abstract:
Purpose – This study aims to explore, and suggest solutions to, the gap between the supply of information from organic meat producers and consumers' demand for information regarding traceable characteristics (attributes) of meat in a limited geographical area, in order to maximize the utilization and value of collected data.
Design/methodology/approach – A mixed methods research design is applied to collect both quantitative data from consumers and qualitative data from suppliers, producing empirical results on the supply and demand of information. A theoretical framework of organic food purchase intent is used for the quantitative study, as well as the correlation between consumers' perceived importance of attributes and their willingness-to-pay for meat. The results of the empirical studies are compared to each other in an effort to expose a possible gap, using a gap analysis.
Findings – Meat is shifting from a price-based commodity to a product based on characteristics. This study reveals that there is now a gap between the information made available by organic meat producers and the information demanded by consumers, which needs to be recognized in order to maximize the value of collected data. Information regarding the environmental impact of raising and transporting the animals is not extensively collected. A substantial amount of data about attributes of perceived importance, such as safety and handling, animal welfare and medication or other treatments, is collected but not extensively shared with consumers.
Research limitations/implications – The small sample size in a unique area and the scope of the survey data do not provide results that can be truly generalized. It is therefore suggested that future studies produce results from a larger sample that incorporates the perceived accessibility of important information for consumers.
Practical implications – This study contributes to the emerging literature on organic food production by comparing both the supply and the demand of information regarding attributes of meat. This information is valuable to organic meat producers and marketers as well as to developers of agricultural systems and databases, who should shift their focus to consumer-oriented traceability systems.
Originality/value – This study goes beyond the substantial body of literature regarding attributes of organic food and consumer preferences by comparing these factors to the supply of information made available by meat producers and by suggesting solutions to bridge the gap between them.
Keywords – Organic meat, Organic agriculture, e-Agriculture, Traceability, Traceability systems, Consumer oriented, Consumer behavior, Willingness-to-pay, Supply and demand, Information gap, Gap analysis, Business development, United States of America, Sense-making theory, Mixed methods
Paper type – Research paper, Bachelor's thesis
APA, Harvard, Vancouver, ISO, and other styles
44

Lindbergh, Patric. "Heuristiker för digitala spel : En studie om Game Approachability Principles." Thesis, Högskolan i Skövde, Institutionen för informationsteknologi, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-9886.

Full text
Abstract:
This work was designed to answer the research question: "Can Game Approachability Principles (GAP) detect usability, playability and accessibility problems in the early stages of a game's development process?" Theories on usability, accessibility, playability, various evaluation methods and user testing, as well as the Game Approachability Principles themselves, were studied to build a theoretical foundation in the area. Two prototypes of a game were created: a hi-fi prototype and a lo-fi prototype. The first represents the later stage of a game's development and the second the earlier stage. User testing was carried out on the hi-fi prototype, and two different evaluations were performed on the lo-fi prototype, to gather data on how well GAP works. The results show that the Game Approachability Principles do detect usability, playability and accessibility problems in the early stages of a game's development process.
APA, Harvard, Vancouver, ISO, and other styles
45

Bien, Nicole Ma. "Primary school achievement gaps and school decisions to support the academic achievement of disadvantaged students with data: A cross-country comparative study." Thesis, The University of Sydney, 2016. http://hdl.handle.net/2123/15742.

Full text
Abstract:
Reducing the academic disadvantage of all students is a significant educational goal for many countries. Increasingly, education reforms around the world, including those in Australia and the United States, have sought to reduce achievement gaps by adopting a strategy of embedding accountability anchored by standardised assessment. Whether to meet federal and state educational requirements, to provide transparency to the general public, or to inform curriculum and instruction at individual schools, policy makers rely on assessment data and data-driven practice to make a difference. Although external forces such as policy expectations are generally the first step in creating social change, the internal beliefs of change agents can affect their course of action. As agents of change, school educators can choose to adopt data-driven practice merely for compliance, or to engage with data for continuous improvement. Applying efficacy theory and the theory of planned behaviour from the social cognitive tradition, this thesis examined educators' belief mechanisms regarding embracing data practice, and considered the direct and indirect benefits of data engagement, as well as the costs that ensued for teaching and learning. Using standardised assessment results from 2008–2013 in Australia and two counties in California, and six case studies across New South Wales, California, and Hawaii, the present mixed methods research found evidence of progress in raising the proficiency of disadvantaged students, but not in narrowing achievement gaps between advantaged and disadvantaged students. The case studies suggest a positive relationship between academic proficiency progress and data engagement. This can be explained by the structural design and operational procedures of the data-driven process enhancing educators' attitudes, intention, perceived efficacy beliefs, and perceived behavioural control relating to the challenging task of raising the educational outcomes of disadvantaged students. As a result, participants could see beyond mere compliance with data-driven practice to its potential for professional improvement.
APA, Harvard, Vancouver, ISO, and other styles
46

Kruger, Wandi. "Addressing application software package project failure : bridging the information technology gap by aligning business processes and package functionality." Thesis, Stellenbosch : Stellenbosch University, 2011. http://hdl.handle.net/10019.1/17868.

Full text
Abstract:
Thesis (MComm)--Stellenbosch University, 2011.
An application software package implementation is a complex endeavour, and as such it requires the proper understanding, evaluation and redefining of the current business processes to ensure that the project delivers on the objectives set at the start of the project. Numerous factors may contribute to the unsuccessful implementation of application software package projects. However, the most significant contributor to the failure of an application software package project lies in the misalignment of the organisation's business processes with the functionality of the application software package. Misalignment is attributed to a gap between the business processes of an organisation and the functionality the application software package offers to translate those business processes into digital form when the package is implemented and configured. This gap is commonly referred to as the information technology (IT) gap. The purpose of this assignment is to examine and discuss to what degree a supporting framework such as the PRojects IN Controlled Environments (PRINCE2) methodology assists in aligning the organisation's business processes with the functionality of the end product, as so many projects still fail even though a supporting framework is available to assist organisations with the implementation of an application software package. This assignment defines and discusses the IT gap. Furthermore, it identifies shortcomings and weaknesses in the PRINCE2 methodology that may contribute to misalignment between the business processes of the organisation and the functionality of the application software package. Shortcomings and weaknesses in the PRINCE2 methodology were identified by:
• preparing a matrix table summarising the reasons for application software package failures, based on a literature study;
• mapping the reasons from the literature study to those listed as reasons for project failure by the Office of Government Commerce (the publishers of the PRINCE2 methodology);
• mapping all the above reasons to the PRINCE2 methodology to determine whether the reasons identified are adequately addressed in it.
This assignment concludes by proposing recommendations for aligning the business processes with the functionality of the application software package (addressing the IT gap), as well as recommendations for addressing the weaknesses identified in the PRINCE2 methodology. By adopting these recommendations in conjunction with the PRINCE2 methodology, proper alignment between business processes and the functionality of the application software package may be achieved. The end result will be more successful application software package project implementations.
APA, Harvard, Vancouver, ISO, and other styles
47

Bonde-Hansen, Martin. "The Dynamics of Rent Gap Formation in Copenhagen : An empirical look into international investments in the rental market." Thesis, Malmö universitet, Malmö högskola, Institutionen för Urbana Studier (US), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-41157.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Nchotindoh, Lewis, and Armelle Kemoum. "Enterprise Systems : Achieving an appropriate fit between ERP systems and business processes." Thesis, Jönköping University, JIBS, Business Informatics, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-10423.

Full text
Abstract:
Enterprise resource planning (ERP) systems, which first emerged in the early 1990s, have become so popular today that almost every large business corporation uses one. Recent years have seen increased spending on this class of software application package, but sadly not all organisations which invest in ERP systems have gotten their money's worth. There have been many instances of unsuccessful ERP implementation attempts and, in some cases, total failure. Researchers and scholars have long studied these systems in a bid to explain the complexity and failure of implementation attempts. This has led to the establishment of critical success factors (CSFs), which vary slightly between authors but share some strong common points. Prominent among these CSFs is alignment between business processes and the ERP system's built-in processes.
This project therefore focuses on alignment issues that arise between the software package and the organisation's business processes during ERP implementation efforts. The main purpose of the work is to establish a set of 'best practices' that must be considered or executed in order to secure a good alignment between the business processes and the software package.
To achieve this goal, a case study approach has been used. The case in question is a small manufacturing firm in Vetlanda called T-Emballage, which uses the ERP system Jeeves. Using scientific research techniques comprising a deductive approach and a mixed-mode data collection method, some valuable conclusions are drawn from the analysed data. The critical points worth considering during ERP implementation when trying to achieve alignment include: carrying out a proper gap analysis; top management leading and following up; choosing software according to needs; simplicity; flexibility; communication; and reliability of data.
The project further investigates the advantages and disadvantages of adapting or altering the software package to fit the organisation's business processes and vice versa. Finally, the work assesses the alignment maturity level attained in the organisation by applying Luftman's alignment maturity model.
APA, Harvard, Vancouver, ISO, and other styles
49

Rodríguez, Espigares Ismael 1990. "GPCRmd - A web platform for collection, visualization and analysis of molecular dynamics data for G protein-coupled receptors : Bridging the gap between dynamics and receptor functionality." Doctoral thesis, Universitat Pompeu Fabra, 2018. http://hdl.handle.net/10803/664667.

Full text
Abstract:
In this thesis, we present GPCRmd, an online repository with a submission system and visualization platform specifically designed for storing and providing easy access to molecular dynamics (MD) data of G protein-coupled receptors (GPCRs). The database stores MD trajectories together with the metadata needed for posterior analysis (e.g. force field, simulation software, integration time-step), ensuring data reproducibility and integrity. Importantly, we demonstrate the usefulness of the implemented analysis tools in two case studies related to GPCR signaling bias and membrane-induced GPCR modulation. These tools enabled us to detect an important structural rearrangement in the initial phase of β-arrestin signaling at the δ-opioid receptor. In addition, we captured relevant molecular mechanisms responsible for the cholesterol-induced modulation of the 5-HT2A receptor.
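The abstract's pairing of deposited trajectories with reproducibility metadata maps naturally onto a simple record type. The sketch below is purely illustrative, assuming hypothetical field names and example values (it is not the actual GPCRmd schema):

```python
# A minimal sketch of the kind of metadata record a repository like GPCRmd
# might store alongside each deposited MD trajectory so that simulations
# remain reproducible. All field names and values are illustrative.
from dataclasses import dataclass

@dataclass
class TrajectoryMetadata:
    receptor: str         # the simulated GPCR, e.g. the delta-opioid receptor
    force_field: str      # force field used for the simulation
    software: str         # simulation software package
    timestep_fs: float    # integration time-step, in femtoseconds
    trajectory_file: str  # identifier of the deposited trajectory file

entry = TrajectoryMetadata(
    receptor="5-HT2A",
    force_field="CHARMM36",
    software="GROMACS",
    timestep_fs=2.0,
    trajectory_file="traj_0001.xtc",
)
print(entry)
```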
APA, Harvard, Vancouver, ISO, and other styles
50

Mohapatra, Avantika. "Designing for AI : A collaborative framework to bridge the gap between designers and data scientists, and enabling designers to create human-centered AI products and services." Thesis, KTH, Integrerad produktutveckling, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-286016.

Full text
Abstract:
Emerging advances in the realm of Artificial Intelligence (AI) have had a tremendous impact on various fields around us and on society as a whole. As technologies continue to evolve, so will the role of designers when it comes to using AI: it has the potential to be the next tool designers use to create human-centered products and services. To truly understand AI and harness its capabilities, it is crucial to demystify the term and its inner workings. This thesis is explorative research that sheds light on collaborative intelligence and on how designers can harness the capabilities of AI. It further explores how to integrate the principles of design and AI to create AI-driven products and services. In addition to background research conducted on both design and AI, the intersection of these two fields was also investigated. The project followed the principles of the Double Diamond design process, consisting of four phases: discover, define, develop and deliver. This process was then used again to design a framework that bridges the gap between AI and design principles. The research aimed to explore how designers could use AI to develop new products and services. The project resulted in a framework that guides designers on how to get acclimated to AI and that uses a specific set of principles to design for AI. It contains the concepts necessary to understand the different aspects of AI and aims to build a common language amongst all AI practitioners. The framework also serves as a basic outline for a workshop providing various design methods that AI practitioners can use to ideate AI-driven solutions.
APA, Harvard, Vancouver, ISO, and other styles
