Dissertations / Theses on the topic 'Additive Models'


1

Belitz, Christiane. "Model Selection in Generalised Structured Additive Regression Models." Diss., lmu, 2007. http://nbn-resolving.de/urn:nbn:de:bvb:19-78896.

2

Hofner, Benjamin. "Boosting in structured additive models." Diss., lmu, 2011. http://nbn-resolving.de/urn:nbn:de:bvb:19-138053.

3

Pya, Natalya. "Additive models with shape constraints." Thesis, University of Bath, 2010. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.527433.

Abstract:
In many practical situations, when analyzing the dependence of a response variable on one or more explanatory variables, it is essential to assume that the relationship of interest obeys certain shape constraints, such as monotonicity, or monotonicity and convexity/concavity. In this thesis a new approach to shape-preserving smoothing within generalized additive models has been developed. In contrast with previous quadratic-programming-based methods, the project develops intermediate-rank penalized smoothers with shape restrictions, based on re-parameterized B-splines with penalties following the P-spline ideas of Eilers and Marx (1996). Smoothing under monotonicity constraints, and under monotonicity together with convexity/concavity, is considered for univariate smooths; for bivariate functions, smoothing with monotonicity restrictions on both covariates or on only one of them is considered. The proposed shape constrained smoothing has been incorporated into generalized additive models with a mixture of unconstrained and shape restricted smooth terms (mono-GAM). A fitting procedure for mono-GAM is developed. Since a major challenge of any flexible regression method is its implementation in a computationally efficient and stable manner, issues such as convergence, rank deficiency of the working model matrix, initialization, and others have been thoroughly dealt with. The question of the limiting posterior distribution of the model parameters is solved, which allows us to construct Bayesian confidence intervals of the mono-GAM smooth terms by means of the delta method. The performance of these confidence intervals is examined by assessing realized coverage probabilities using simulation studies. The proposed modelling approach has been implemented in an R package monogam. The model setup is the same as in mgcv(gam) with the addition of shape constrained smooths. In order to be consistent with the unconstrained GAM, the package provides key functions similar to those associated with mgcv(gam). Performance and timing comparisons of mono-GAM with other alternative methods have been undertaken. The simulation studies show that the new method has practical advantages over the alternatives considered. Applications of mono-GAM to various data sets are presented which demonstrate its ability to model many practical situations.
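As a sketch of the re-parameterization idea mentioned in the abstract (the notation below is ours, and the exact construction used in the thesis may differ), a monotonically increasing P-spline smooth can be obtained by writing

```latex
% Monotone smooth: non-decreasing B-spline coefficients via exponentiation
f(x) = \sum_{j=1}^{q} \gamma_j B_j(x), \qquad
\gamma_j = \gamma_1 + \sum_{k=2}^{j} \exp(\beta_k), \quad j = 2, \dots, q,
```

with an Eilers-Marx difference penalty placed on the working parameters β_k. Because the exponential map makes every increment positive, monotonicity holds by construction, so no quadratic-programming inequality constraints are needed and the usual penalized likelihood machinery applies.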
4

Joshi, Miland. "Applications of generalized additive models." Thesis, University of Warwick, 2011. http://wrap.warwick.ac.uk/47759/.

Abstract:
Main purpose: The study is primarily a contribution to a question of strategy rather than the development of a new method. It explores the circumstances in which the use of generalized additive models (GAMs) can be recommended, and is thus a contribution to answering the question "When is it a good idea (or not so good an idea) to use GAMs?" Content: Following an introductory exposition in which GAMs are compared to generalized linear models, subsequent chapters deal with evidence that could support possible recommendations: 1. A survey of recent studies in which GAMs have been used and recommended, regarded with greater reserve, or compared to other methods. 2. Original case studies in which the applicability of GAMs is investigated, namely: (a) receiver operating characteristic curves used in medical diagnostic testing, the associated diagnostic likelihood ratios, and the modelling of the risk score; (b) a study of a possible heat-wave effect on mortality in London; (c) shorter studies, including a study of factors influencing the length of stay in hospital in Queensland, Australia, and a simulation study. 3. Diagnostics, looking in particular at concurvity and the problems of defining and detecting it. The study ends with recommendations for the use of GAMs and possible areas for further research. The appendices include a glossary, technical appendices and R code for computations involved in the project.
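For orientation, the structural difference between the two model classes compared in the introductory exposition can be summarized as follows (standard definitions, not text from the thesis):

```latex
% Generalized linear model: a linear predictor through a link g
g\big(E[Y]\big) = \beta_0 + \beta_1 x_1 + \dots + \beta_p x_p
% Generalized additive model: smooth, data-driven component functions
g\big(E[Y]\big) = \beta_0 + f_1(x_1) + \dots + f_p(x_p)
```

The f_j are smooth functions estimated from the data, which is what gives GAMs their flexibility. Concurvity, the diagnostic issue examined in item 3, is the nonparametric analogue of collinearity and arises when one smooth term can be closely approximated by functions of the others.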
5

Piccirilli, Marco. "Additive models for energy markets." Doctoral thesis, Università degli studi di Padova, 2018. http://hdl.handle.net/11577/3426712.

Abstract:
This Dissertation explores the capability of additive models to describe prices in energy markets, focusing in particular on the specific cases of electricity and natural gas. In Chapter 1 we study a dynamic portfolio optimization problem designed for intraday electricity trading. In Chapter 2 we introduce a tractable no-arbitrage framework based on the Heath-Jarrow-Morton approach for a multicommodity energy forward market. Chapter 3 deals with a thorough empirical study of a two-factor model derived from the framework of Chapter 2, with an application to the German power futures market. Finally, in Chapter 4 we discuss option pricing for additive factor models by Fourier transform methods. We introduce a two-factor futures price model with jumps in order to capture the implied volatility smile of European electricity options. An application to the European Energy Exchange Power Derivatives market is presented.
6

譚維新 and Wai-san Wilson Tam. "Implementation and applications of additive models." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1999. http://hub.hku.hk/bib/B31221671.

7

Tam, Wai-san Wilson. "Implementation and applications of additive models /." Hong Kong : University of Hong Kong, 1999. http://sunzi.lib.hku.hk/hkuto/record.jsp?B20715444.

8

Zhang, Xiangmin. "Nonconvex selection in nonparametric additive models." Diss., University of Iowa, 2014. https://ir.uiowa.edu/etd/1523.

Abstract:
High-dimensional data offers researchers increased ability to find useful factors in predicting a response. However, determination of the most important factors requires careful selection of the explanatory variables. In order to tackle this challenge, much work has been done on single or grouped variable selection under the penalized regression framework. Although the topic of variable selection has been extensively studied under the parametric framework, its applications to more flexible nonparametric models are yet to be explored. To implement variable selection in nonparametric additive models, I introduce and study two nonconvex selection methods under the penalized regression framework, namely the group MCP and the adaptive group LASSO, aiming at improvements on the selection performance of the more widely known group LASSO method in such models. One major part of the dissertation focuses on the theoretical properties of the group MCP and the adaptive group LASSO. I derive their selection and estimation properties. The application of the proposed methods to nonparametric additive models is further examined using simulation. Their applications to areas such as economics and genomics are presented as well. In both the simulation studies and the data applications, the group MCP and the adaptive group LASSO have shown their advantages over the more traditionally used group LASSO method. For the proposed adaptive group LASSO, which uses newly proposed weights whose recursive application has not been studied before, I also derive theoretical properties under a very general framework. Simulation studies under linear regression are included. In addition to the theoretical and empirical investigations, throughout the dissertation several other important issues are briefly discussed, including the computing algorithms and different ways of selecting tuning parameters.
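For reference, the two penalties studied in the dissertation have the following standard forms (as defined in the penalized regression literature; the groupwise rendering is our paraphrase):

```latex
% Minimax concave penalty (MCP) on a scalar t >= 0:
\rho_{\lambda,\gamma}(t) =
  \begin{cases}
    \lambda t - t^2/(2\gamma), & t \le \gamma\lambda,\\
    \gamma\lambda^2/2,         & t > \gamma\lambda,
  \end{cases}
\qquad
% group MCP and adaptive group LASSO add, respectively,
\sum_{g=1}^{G} \rho_{\lambda,\gamma}\big(\lVert \beta_g \rVert_2\big)
\quad\text{and}\quad
\lambda \sum_{g=1}^{G} w_g \lVert \beta_g \rVert_2
```

to the least-squares criterion. In the nonparametric additive model, each group g collects the spline basis coefficients of one component function, so selecting or dropping a group amounts to selecting or dropping an entire covariate.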
9

Läuter, Henning. "Estimation in partly parametric additive Cox models." Universität Potsdam, 2003. http://opus.kobv.de/ubp/volltexte/2011/5150/.

Abstract:
The dependence between survival times and covariates is described, e.g., by proportional hazard models. We consider partly parametric Cox models and discuss the estimation of the parameters of interest. We present the maximum likelihood approach and extend the results of Huang (1999) from linear to nonlinear parameters. Then we investigate least squares estimation and formulate conditions for the a.s. boundedness and consistency of these estimators.
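In generic notation (ours, not the paper's), the models considered have hazards of the form

```latex
% Partly parametric additive Cox model:
\lambda(t \mid x, z) = \lambda_0(t)\,
  \exp\!\Big( \eta(x; \theta) + \textstyle\sum_j f_j(z_j) \Big)
```

with an unspecified baseline hazard λ0, a finite-dimensional parametric part η(x; θ) — linear in Huang (1999), nonlinear in this extension — and nonparametric additive components f_j.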
10

Hofner, Benjamin [Verfasser]. "Boosting in Structured Additive Models / Benjamin Hofner." München : Verlag Dr. Hut, 2012. http://d-nb.info/1020299223/34.

11

Heinzl, Felix. "Clustering in linear and additive mixed models." Diss., Ludwig-Maximilians-Universität München, 2013. http://nbn-resolving.de/urn:nbn:de:bvb:19-157169.

12

Berglund, Daniel. "Models for Additive and Sufficient Cause Interaction." Licentiate thesis, KTH, Matematisk statistik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-259608.

Abstract:
The aim of this thesis is to develop and explore models in, and related to, the sufficient cause framework and additive interaction. Additive interaction is closely connected with public health interventions and can be used to make inferences about the sufficient causes in order to find the mechanisms behind an outcome, for instance a disease. In paper A we extend additive interaction, and interventions, to include continuous exposures. We show that there is no model that does not lead to inconsistent conclusions about the interaction. The sufficient cause framework can also be expressed using Boolean functions, which is expanded upon in paper B. In this paper we define a new model based on the multifactor potential outcome model (MFPO) and independence of causal influence (ICI) models. In paper C we discuss the modeling and estimation of additive interaction in relation to whether the exposures are harmful or protective conditional on some other exposure. If there is uncertainty about an effect's direction, there can be errors in the testing of the interaction effect.

Examiner: Professor Henrik Hult, Mathematics, KTH
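For two binary exposures, the additive interaction that papers A and C build on is conventionally measured as follows (standard epidemiological definitions, summarized here for orientation):

```latex
% Interaction contrast on the risk scale, with p_{ab} = P(Y=1 | A=a, B=b):
IC = p_{11} - p_{10} - p_{01} + p_{00}
% Relative excess risk due to interaction, on the relative-risk scale:
RERI = RR_{11} - RR_{10} - RR_{01} + 1
```

Under suitable monotonicity assumptions, a strictly positive value implies the existence of a sufficient cause in which both exposures participate, which is the bridge between additive interaction and the sufficient cause framework that the thesis works with.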

13

Busolin, Francesco <1995>. "Document pruning strategies for additive Ranking models." Master's Degree Thesis, Università Ca' Foscari Venezia, 2020. http://hdl.handle.net/10579/18164.

Abstract:
Most modern search engines rely on additive machine learning models to assess the relevance of documents to a query. Since these models are composed of many sub-models, the total scoring cost depends directly on their number, and it therefore also affects the responsiveness of the system and the query response time. This thesis discusses strategies for stopping the scoring of documents that are unlikely to turn out to be relevant. These strategies were tested using a forest of regression trees as the model and the well-known and widely used Microsoft Learning to Rank dataset. We show that speedups of over 2x can be obtained with minimal loss in result quality, evaluated with the NDCG@10 measure.
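The early-exit idea can be illustrated with a short sketch (ours, with invented checkpoint thresholds; the thesis evaluates more refined strategies for choosing the cut-offs):

```python
import numpy as np

def score_with_early_exit(doc_features, trees, exit_points, thresholds):
    """Score one document with an additive tree ensemble, stopping early
    when the partial score at a checkpoint falls below its threshold.

    doc_features : 1-D numpy array of features for one document
    trees        : list of fitted single-tree regressors (the sub-models)
    exit_points  : tree indices at which the partial score is inspected
    thresholds   : minimum partial score required to continue at each point
    """
    partial = 0.0
    checkpoints = dict(zip(exit_points, thresholds))
    for i, tree in enumerate(trees):
        partial += tree.predict(doc_features.reshape(1, -1))[0]
        if i in checkpoints and partial < checkpoints[i]:
            return partial, True      # pruned: unlikely to reach the top-k
    return partial, False             # fully scored
```

Because each tree adds a bounded contribution, a document whose partial score falls far below the running cut-off at a checkpoint can be abandoned with little effect on the final ranking, which is where the reported speedup comes from.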
14

Marra, Giampiero. "Some problems in model specification and inference for generalized additive models." Thesis, University of Bath, 2010. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.527788.

Abstract:
Regression models describing the dependence between a univariate response and a set of covariates play a fundamental role in statistics. In the last two decades, a tremendous effort has been made in developing flexible regression techniques such as generalized additive models (GAMs), with the aim of modelling the expected value of a response variable as a sum of smooth unspecified functions of predictors. Many nonparametric regression methodologies exist, including locally weighted regression and smoothing splines. Here the focus is on penalized regression spline methods, which can be viewed as a generalization of smoothing splines with a more flexible choice of bases and penalties. This thesis addresses three issues. First, the problem of model misspecification is treated by extending the instrumental variable approach to the GAM context. Second, we study the theoretical and empirical properties of the confidence intervals for the smooth component functions of a GAM. Third, we consider the problem of variable selection within this flexible class of models. All results are supported by theoretical arguments and extensive simulation experiments which shed light on the practical performance of the methods discussed in this thesis.
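Penalized regression splines of the kind used throughout the thesis estimate the smooths by minimizing a penalized criterion such as (standard GAM notation, added here for reference):

```latex
% Penalized least squares for a Gaussian additive model:
\min_{\beta}\; \lVert y - X\beta \rVert^2
  + \sum_{j} \lambda_j\, \beta^\top S_j \beta
```

where X holds the evaluated basis functions, each S_j penalizes the wiggliness of one smooth term, and the λ_j are smoothing parameters; the choice of basis and penalty is what distinguishes P-splines, thin-plate regression splines and so on.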
15

Brezger, Andreas. "Bayesian P-Splines in Structured Additive Regression Models." Diss., lmu, 2005. http://nbn-resolving.de/urn:nbn:de:bvb:19-39420.

16

Mohammadi, Mahdi. "Heterogeneity in additive and multiplicative event history models." Thesis, University of Newcastle Upon Tyne, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.556138.

Abstract:
Heterogeneity in survival and recurrent event data is often due to unknown, unmeasured, or immeasurable factors. Subjects may experience heterogeneous failure times or event rates due to different levels of vulnerability to the event of interest. The more prone the subjects, the shorter the survival times and the higher the event rates. Furthermore, the presence of cured subjects who are not susceptible to the event contributes to the heterogeneity. Frailty and cure models can take into account the unexplained variation due to heterogeneity and cured fractions. This research explores the ideas of these models for failure and recurrent event data. The models are checked by simulation studies and they are applied to three data sets wherever applicable. For survival data, we investigate by simulation the results of a frailty mixture model which includes frailty and cure models. Even for a small sample size (e.g. 100), this model fits well to data from either frailty or cure models. We also explore misspecification of the Cox and frailty models, theoretically and by simulation, when data are generated from the cure model. Although the regression parameters are underestimated under the misspecified Cox model, the frailty model fits well to simulated data with a cured fraction. Furthermore, regression parameters are underestimated under a misspecified cure model when the frailty model holds. In the case of a high rate of administrative censoring (80%), the bias is small in all misspecified models. The Aalen and Cox frailty models for failure times are compared in terms of frailty parameter estimates. Under both the Cox and Aalen frailty models, the frailty variance is underestimated. However, the frailty variance is estimated to be smaller under the Aalen frailty model because this model, as opposed to the Cox frailty model, allows for time-dependent regression parameters which can explain part of the random processes. We include a time-constant frailty term in the Aalen intensity model to construct an individual time-constant frailty (ITCF) model for recurrent event data and suggest a dynamic procedure to estimate the parameters. Estimated frailty and regression parameters are unbiased in the simulation study. Although a misspecified Aalen model ignores heterogeneity, unbiased regression parameters are obtained. However, the intensity and residuals are not estimated appropriately. Several models for clustered recurrent event data are suggested. The models can be used to estimate the correlation between subjects within clusters and the heterogeneity between them. One of the models can also consider cured fractions at the cluster and individual levels. This model can make a difference to the significance of results when a cluster is event-free or when the rate of event-free subjects differs considerably across levels of a covariate. A time-dependent frailty model is also explored. This model assumes that at each time there is a frailty term with variance ξ, but that there is correlation between different times. The correlation between the frailties at times u and v is assumed to be ρ^|u−v|. We use an approximation for small values of ξ to estimate the parameters. Simulation studies confirm that the results are good and the bias is negligible for both frailty and regression parameters. This model includes the ITCF model when ρ = 1, and a misspecified ITCF model underestimates ξ. When ρ is not close to 1 (e.g. 0.8), the two models can be differentiated by composite likelihood.
Three data sets are used throughout the thesis. The first (leukaemia) has single event survival times whereas the second (patient controlled analgesia) and third (Blue bay diarrhoea) have recurrent events. Clustering is present in the third dataset.
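In our notation, the time-dependent frailty structure described in the abstract can be written as

```latex
% Aalen additive intensity modulated by a frailty process Z_i(t):
\lambda_i(t) = Z_i(t) \Big( \beta_0(t) + \textstyle\sum_j \beta_j(t)\, x_{ij} \Big),
\qquad
\mathrm{Var}\,Z_i(t) = \xi,
\quad
\mathrm{Corr}\big(Z_i(u), Z_i(v)\big) = \rho^{\lvert u-v \rvert}
```

(whether the frailty enters multiplicatively, as written here, is our reading of the abstract). The ITCF model is then the boundary case ρ = 1, in which the frailty collapses to a single time-constant random effect per individual.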
17

Filippou, Panagiota. "Penalized likelihood estimation of trivariate additive binary models." Thesis, University College London (University of London), 2018. http://discovery.ucl.ac.uk/10042688/.

Abstract:
In many empirical situations, modelling simultaneously three or more outcomes as well as their dependence structure can be of considerable relevance. Trivariate modelling is continually gaining in popularity (e.g., Genest et al., 2013; Król et al., 2016; Zhong et al., 2012) because of its appealing ability to account for the endogeneity issue and non-random sample selection bias, two issues that commonly arise in empirical studies (e.g., Zhang et al., 2015; Radice et al., 2013; Marra et al., 2017; Bärnighausen et al., 2011). The applied and methodological interest in trivariate modelling motivates the current thesis, whose aim is to develop and estimate a generalized trivariate binary regression model which accounts for several types of covariate effects (such as linear, nonlinear, random and spatial effects), as well as error correlations. In particular, the thesis focuses on the following targets. First, we address the issue of accurately estimating the correlation coefficients, which characterize the dependence of the binary responses conditional on regressors. We found that inaccurate estimation is not an unusual occurrence for trivariate binary models, and as far as we know this limitation is neither discussed nor dealt with elsewhere. Based on this framework, we develop models for dealing with data suffering from endogeneity and/or non-random sample selection. Moreover, we propose trivariate Gaussian copula models where the link functions can in principle be derived from any parametric distribution and the parameters describing the association between the responses can be made dependent on several types of covariate effects. All the coefficients of the model are estimated simultaneously within a penalized likelihood framework based on a carefully structured trust region algorithm with integrated automatic multiple smoothing parameter selection. The developments have been incorporated in the function SemiParTRIV()/gjrm() in the R package GJRM (Marra & Radice, 2017). The extensive use of simulated data as well as real datasets illustrates each development in detail and completes the analysis.
18

Bech, Katarzyna. "On nonparametric additive error models with discrete regressors." Thesis, University of Southampton, 2015. https://eprints.soton.ac.uk/389713/.

Abstract:
This thesis contributes to the literature on nonparametric additive error models with discrete explanatory variables. Although nonparametric methods have become very popular in recent decades, research on the impact of the discreteness of regressors is sparse. The main interest is in an unknown nonparametric conditional mean function in the presence of endogenous explanatory variables. Under endogeneity, the identifying power of the model depends on the number of support points of the discrete instrument relative to that of the regressor. Under nonparametric identification failure, we show that some linear functionals of the conditional mean function are point-identified, while some are completely unconstrained. A test for point identification is suggested. Observing that the simple nonparametric model can be interpreted as a linear regression, new approaches to testing for exogeneity of the regressor(s) are proposed. For the point-identified case, the test is an adapted version of the familiar Durbin-Wu-Hausman approach. This extends the work of Blundell and Horowitz (2007) to the case of discrete regressors and instruments. For the partially identified case, the Durbin-Wu-Hausman approach is not available, and the test statistic is derived from a constrained minimization problem. In this second case, the asymptotic null distribution is non-standard, and a simple device is suggested to compute the critical values in practical applications. Both tests are shown to be consistent, and a simulation study reveals that both have satisfactory finite-sample properties. The practicability of the suggested testing procedures is illustrated in applications to the modelling of returns to education.
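With discrete X and a discrete instrument W, the identification problem has a transparent linear-algebra form (a standard way to see it; notation ours):

```latex
% The moment condition E[Y - g(X) | W] = 0 becomes, for each support point w,
\sum_{x \in \mathrm{supp}(X)} g(x)\, P(X = x \mid W = w) \;=\; E[Y \mid W = w]
```

i.e. a linear system Pg = m with P of dimension |supp(W)| × |supp(X)|. The function g is point-identified exactly when P has full column rank, which requires the instrument to have at least as many support points as the regressor; otherwise only certain linear functionals of g are pinned down, as the abstract describes.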
19

Feng, Zhenghui. "Estimation and selection in additive and generalized linear models." HKBU Institutional Repository, 2012. https://repository.hkbu.edu.hk/etd_ra/1435.

20

Koehn, Sebastian. "Generalized additive models in the context of shipping economics." Thesis, University of Leicester, 2009. http://hdl.handle.net/2381/4172.

Abstract:
This thesis addresses three current issues in maritime economics through semi-parametric estimation within a generalized additive model framework. First, it shows that there are vessel- and contract-specific differences in time charter rates for dry bulk vessels. The literature on microeconomic factors in time charter rates established the emergence of a two-tier tanker market during the post-OPA90 period, but previous results do not allow any safe conclusions about the existence of a two-tier dry bulk market. This thesis extends previous research by showing that a good part of the variation in physical time charter rates is due to microeconomic factors. It empirically proves the existence of a two-tier dry bulk market. Controlling for a variety of contract-specific effects as well as vessel-specific factors, the presented model quantifies quality-induced differences in physical dry bulk charter rates. Second, the literature on the formation of ship prices focuses exclusively on rather homogeneous shipping segments, such as tankers and dry bulk carriers. Due to the comparatively low number of sales and the complexity of the ships, vessel valuation in highly specialised and small sectors, such as chemical tankers, is a much more challenging task. The empirical results of this thesis confirm the findings in recent literature that ship valuation is a non-linear function of size, age and market conditions, whilst other parameters particular to the chemicals market also play a significant role. The third topic addresses the recent increase in the operational expenses (opex) of merchant vessels. The available literature can neither explain this development nor provide information at the individual-vessel level. This thesis presents a quantitative model of opex that is particularly successful in explaining the variation in opex across vessels of different type, size, age and specification. The results confirm that differences in opex are due to the behaviour of a vessel's operator and to regulatory requirements. Furthermore, it shows that there are significant differences in opex due to the earnings and employment status of a vessel.
21

Elgmati, Entisar. "Additive intensity models for discrete time recurrent event data." Thesis, University of Newcastle Upon Tyne, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.556142.

Abstract:
The thesis considers the Aalen additive regression model for recurrent event data. The model itself, estimation of the cumulative regression functions, testing procedures, checking goodness of fit and the inclusion of dynamic covariates in the model are reviewed. A disadvantage of this model is that estimates of the conditional probabilities are not constrained to lie between zero and one, so a model with logistic intensity is also considered. Results under the logistic model are shown to be qualitatively similar to those under the additive model. The additive model is extended to incorporate the possibility of spatial or spatio-temporal clustering, possibly caused by unobserved environmental factors or infectivity. Various tests for the presence of clustering are described and implemented. The issue of frailty modelling and its connection to dynamic modelling is presented and examined. We show that frailty and dynamic models are almost indistinguishable in terms of residual summary plots. A graphical procedure based on the property that the covariance between martingale residuals at times t₀ and t > t₀ is independent of t is proposed and supplemented by a formal test statistic to investigate the adequacy of the fitted models. The results can be used to compare models and to check the validity of the model being tested. We also investigate properties under various types of model misspecification. All our work is illustrated using two sets of data measuring daily prevalence and incidence of infant diarrhoea in Salvador, Brazil. Significant clustering is identified in the data. We investigate risk factors for diarrhoea, and there is strong evidence that dynamic effects are important, implying heterogeneity between individuals not explained by measured socio-economic and environmental factors.
22

Martens, Robert. "Strategies for Adopting Additive Manufacturing Technology Into Business Models." ScholarWorks, 2018. https://scholarworks.waldenu.edu/dissertations/5572.

Abstract:
Additive manufacturing (AM), also called 3-dimensional printing (3DP), emerged as a disruptive technology affecting multiple organizations' business models and supply chains and endangering incumbents' financial health, or even rendering them obsolete. The world market for products created by AM has increased more than 25% year over year. Using Christensen's theory of disruptive innovation as a conceptual framework, the purpose of this multiple case study was to explore the successful strategies that 4 individual managers, 1 at each of 4 different light and high-tech manufacturing companies in the Netherlands, used to adopt AM technology into their business models. Participant firms originated from 3 provinces and included a value-added logistics service provider and 3 machine shops serving various industries, including the automotive and medical sectors. Data were collected through semistructured interviews, member checking, and analysis of company documents that provided information about the adoption of 3DP into business models. Using Yin's 5-step data analysis approach, data were compiled, disassembled, reassembled, interpreted, and concluded until 3 major themes emerged: identify business opportunities for AM technology, experiment with AM technology, and embed AM technology. Because of the design freedom the use of AM enables, in combination with its environmental efficiency, the implications for positive social change include possibilities for increasing local employment, improving the environment, and enhancing healthcare for the prosperity of local and global citizens by providing potential solutions that managers could use to deploy AM technology.
23

Utami, Zuliana Sri. "Penalized regression methods with application to generalized linear models, generalized additive models, and smoothing." Thesis, University of Essex, 2017. http://repository.essex.ac.uk/20908/.

Abstract:
Recently, penalized regression has been used to deal with problems encountered in maximum likelihood estimation, such as correlated parameters and a large number of predictors. The main issue in this type of regression is how to select the optimal model. In this thesis, Schall's algorithm is proposed for automatic selection of the penalty weight. The algorithm has two steps. First, the coefficient estimates are obtained with an arbitrary penalty weight. Second, an estimate of the penalty weight λ is calculated as the ratio of the error variance to the coefficient variance. The iteration continues from step one until the estimate of the penalty weight converges. The computational cost is minimized because the optimal penalty weight can be obtained within a small number of iterations. In this thesis, Schall's algorithm is investigated for ridge regression, lasso regression and two-dimensional histogram smoothing. The proposed algorithms are applied to real and simulated data sets. In addition, a new algorithm for lasso regression is proposed. The performance of the algorithm was almost comparable in all applications. Schall's algorithm can be an efficient algorithm for selecting the penalty weight.
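For the ridge case, the two-step iteration reads as follows in a minimal numpy sketch (our rendering of the abstract's description; the thesis's exact variance estimators may differ):

```python
import numpy as np

def schall_ridge(X, y, lam=1.0, tol=1e-8, max_iter=200):
    """Select the ridge penalty by a Schall-type iteration: fit with the
    current penalty, then update the penalty as the ratio of the error
    variance to the coefficient variance."""
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    for _ in range(max_iter):
        A = XtX + lam * np.eye(p)
        beta = np.linalg.solve(A, Xty)            # step 1: ridge fit
        edf = np.trace(np.linalg.solve(A, XtX))   # effective degrees of freedom
        resid = y - X @ beta
        sigma2_e = resid @ resid / (n - edf)      # error variance
        sigma2_b = beta @ beta / edf              # coefficient variance
        lam_new = sigma2_e / sigma2_b             # step 2: Schall update
        if abs(lam_new - lam) <= tol * max(lam, 1.0):
            break
        lam = lam_new
    return beta, lam
```

Each pass costs one ridge fit, so convergence of λ within a handful of passes is what keeps the computational cost low, as the abstract notes.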
24

Hofner, Benjamin [Verfasser], and Torsten [Akademischer Betreuer] Hothorn. "Boosting in structured additive models / Benjamin Hofner. Betreuer: Torsten Hothorn." München : Universitätsbibliothek der Ludwig-Maximilians-Universität, 2011. http://d-nb.info/1020362065/34.

25

Pan, Yiyang. "A robust fit for generalized partial linear partial additive models." Thesis, University of British Columbia, 2013. http://hdl.handle.net/2429/44647.

Abstract:
In regression studies, semi-parametric models provide both flexibility and interpretability. In this thesis, we focus on a robust model fitting algorithm for a family of semi-parametric models – the Generalized Partial Linear Partial Additive Models (GAPLMs), which is a hybrid of the widely-used Generalized Linear Models (GLMs) and Generalized Additive Models (GAMs). Traditional model fitting algorithms are mainly based on likelihood procedures. However, the resulting fits can be severely distorted by the presence of a small portion of atypical observations (also known as "outliers") which deviate from the assumed model. Furthermore, traditional model diagnostic methods might also fail to detect outliers. In order to systematically solve these problems, we develop a robust model fitting algorithm which is resistant to the effect of outliers. Our method combines the backfitting algorithm and the generalized Speckman estimator to fit the "partial linear partial additive" styled models. Instead of using the likelihood-based weights and adjusted response from the generalized local scoring algorithm (GLSA), we apply the robust weights and adjusted response derived from the robust quasi-likelihood proposed by Cantoni and Ronchetti (2001). We also extend previous methods by proposing a model prediction algorithm for GAPLMs. To compare our robust method with the non-robust one given by the R function gam::gam(), which uses the backfitting algorithm and the GLSA, we report the results of a simulation study. The simulation results show that our robust fit can effectively resist the damage of outliers and performs similarly to the non-robust fit on clean datasets. Moreover, our robust algorithm is observed to be helpful in identifying outliers, by comparing the fitted values with the observed response variable. In the end, we apply our method to analyze the global phytoplankton data. We interpret the outliers reported by our robust fit with an exploratory analysis, and we see some interesting patterns among those outliers in the dataset. We believe our results can provide more information for related research.
26

Alshanbari, Huda Mohammed H. "Additive Cox proportional hazards models for next-generation sequencing data." Thesis, University of Leeds, 2017. http://etheses.whiterose.ac.uk/19739/.

Abstract:
Eighty-nine Non-Small Cell Lung Cancer (NSCLC) patients experienced chromosomal rearrangements called Copy Number Alterations (CNAs), in which cells have an abnormal number of copies in one or more regions of their genome; such genetic alterations are known to drive cancer development. An important aim of this thesis is to propose a way to combine the clinical covariates as fixed predictors with CNA genomic windows as smoothing terms using the penalized additive Cox Proportional Hazards (PH) model. Most of the proposed prediction methods assume linearity of the CNA genomic windows along with the clinical covariates. However, continuous covariates can affect the hazard via more complicated nonlinear functional forms, so a Cox PH model with continuous covariates is likely misspecified, because it does not fit the correct functional form for the continuous covariates. Some of the work on combining clinical covariates with high-dimensional genomic data in clinical genomic prediction is based on the standard Cox PH model, and most of it focuses on applying variable selection to high-dimensional CNA genomic data. Our main interest is to propose a variable selection procedure to select important nonlinear effects from CNA genomic windows. Two different approaches to feature selection are presented: discrete and shrinkage. Discrete feature selection is based on penalized univariate variable selection, which identifies the subset of CNA genomic windows with the strongest effects on survival time, while feature selection by shrinkage works by adding a second penalty to the penalized partial log-likelihood that penalizes the smoothing coefficients in the model, with the result that some of the smoothing coefficients are set to zero. For the NSCLC dataset, we find that the size of the tumor and the spread of cancer into the lymph nodes are significant factors that increase the hazard for patient survival, and the estimated smooth log hazard ratio curves identify significant CNA genomic windows across the genome that contribute a higher or lower hazard of death.
27

Ramirez, Girly Manguba. "Prediction and variable selection in sparse ultrahigh dimensional additive models." Diss., Kansas State University, 2013. http://hdl.handle.net/2097/15989.

Abstract:
Doctor of Philosophy
Department of Statistics
Haiyan Wang
The advance in technologies has enabled many fields to collect datasets where the number of covariates (p) tends to be much bigger than the number of observations (n), the so-called ultrahigh dimensionality. In this setting, classical regression methodologies are invalid. There is a great need to develop methods that can explain the variations of the response variable using only a parsimonious set of covariates. In the recent years, there have been significant developments of variable selection procedures. However, these available procedures usually result in the selection of too many false variables. In addition, most of the available procedures are appropriate only when the response variable is linearly associated with the covariates. Motivated by these concerns, we propose another procedure for variable selection in ultrahigh dimensional setting which has the ability to reduce the number of false positive variables. Moreover, this procedure can be applied when the response variable is continuous or binary, and when the response variable is linearly or non-linearly related to the covariates. Inspired by the Least Angle Regression approach, we develop two multi-step algorithms to select variables in sparse ultrahigh dimensional additive models. The variables go through a series of nonlinear dependence evaluation following a Most Significant Regression (MSR) algorithm. In addition, the MSR algorithm is also designed to implement prediction of the response variable. The first algorithm called MSR-continuous (MSRc) is appropriate for a dataset with a response variable that is continuous. Simulation results demonstrate that this algorithm works well. Comparisons with other methods such as greedy-INIS by Fan et al. (2011) and generalized correlation procedure by Hall and Miller (2009) showed that MSRc not only has false positive rate that is significantly less than both methods, but also has accuracy and true positive rate comparable with greedy-INIS. The second algorithm called MSR-binary (MSRb) is appropriate when the response variable is binary. Simulations demonstrate that MSRb is competitive in terms of prediction accuracy and true positive rate, and better than GLMNET in terms of false positive rate. Application of MSRb to real datasets is also presented. In general, MSR algorithm usually selects fewer variables while preserving the accuracy of predictions.
28

Hofner, Benjamin [Verfasser], and Torsten [Akademischer Betreuer] Hothorn. "Boosting in structured additive models / Benjamin Hofner. Betreuer: Torsten Hothorn." München : Universitätsbibliothek der Ludwig-Maximilians-Universität, 2011. http://nbn-resolving.de/urn:nbn:de:bvb:19-138053.

29

Hinton, Thomas James. "Rapid Prototyping Tissue Models of Mammary Duct Epithelium." Research Showcase @ CMU, 2017. http://repository.cmu.edu/dissertations/876.

Abstract:
Ductal Carcinoma in Situ (DCIS) does not have a clinically useful indicator of malignancy, and it is often benign, except in 20% of cases. Even more important, it has a cure – removal of the affected breast. DCIS patients overwhelmingly elect for invasive therapies to escape that 20% malignant chance. Overtreatment such as this costs the patients, and it highlights the need for a DCIS model capable of distinguishing the 20% in need of treatment. Some labs have taken steps toward three-dimensional, complex, and biomimetic models of mammary tissues using a variety of endogenous and synthetic gels and 3D printing. We developed FRESH (Freeform Reversible Embedding of Suspended Hydrogels) as the first method capable of 3D printing highly biomimetic shapes from endogenous gels. Utilizing FRESH, we aim to rapid prototype models of mammary duct epithelia that are biomimetic, parametric, and capable of iterative evolution. First, we investigate the principles of 3D printers modified for extruding fluids and construct a comprehensive hardware and software platform for printing gelling fluids. Second, we apply the FRESH method to 3D print collagen and alginate hydrogels, demonstrating patency of printed vascular models, topological fidelity, and the synergistic combination of hydrogel properties in multi-material prints. Finally, we rapid prototype an epithelial monolayer by seeding a 3D printed collagen manifold, and we demonstrate maintenance of the tissue’s geometry across a week of culture. We provide evidence of fidelity in prints such as an epithelial tree printed at 200% scale using unmodified collagen type I, and we investigate the combination of hydrogel properties in multi-material prints by utilizing a second hydrogel (alginate) to reinforce and preserve the fidelity of this collagen tree during handling. Our approach utilizes faster (>40 mm/s), cheaper (
30

Durban, Reguera Maria L. "Modelling spatial trends and local competition effects using semiparametric additive models." Thesis, Heriot-Watt University, 1998. http://hdl.handle.net/10399/1287.

31

Greven, Sonja. "Non-standard problems in inference for additive and linear mixed models." Göttingen: Cuvillier, 2007.

32

Ma, Pulong. "Hierarchical Additive Spatial and Spatio-Temporal Process Models for Massive Datasets." University of Cincinnati / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1535635193581096.

33

Porat, Ingrid, and Klara Hovstadius. "A Business Model Perspective on Additive Manufacturing." Thesis, KTH, Industriell Management, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-239665.

Abstract:
Additive manufacturing (AM) is an immature manufacturing technology which is often considered to have the potential of disrupting the manufacturing industry, and many industrial companies are currently investigating how they can position themselves within the AM market. Technological innovations alone are often insufficient to fully exploit the benefits of new technology and need to be accompanied by business model innovation. Consequently, companies face challenges in finding guidance on the application of AM: what to offer and to whom (value proposition), how to deliver such an offering (value creation) and how to capture the profit (value capture) – that is, how to structure an AM business model. Therefore, this research investigates how large incumbent manufacturing companies tackle the emerging AM market from a business model perspective. The research unpacks the common themes within three business model components (value proposition, value creation and value capture) in the context of an AM business model, where theme 5 is contradicted by theory and by several other themes:

1. Immature demand
2. Internal cases as a starting point
3. Knowledge offerings
4. End-to-end solutions
5. Broad customer focus
6. Start in a technology niche, then expand
7. Invest in machines to learn AM
8. Change in designer mindset required
9. Partnerships to drive the AM market forward
10. A shift in power
11. Close customer relations
12. It is a race to the market

The research is based on a multiple-case study consisting of 16 interviews at six different companies and two universities.
34

Hercz, Daniel. "Flexible modeling with generalized additive models and generalized linear mixed models: comprehensive simulation and case studies." Thesis, McGill University, 2013. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=114300.

Abstract:
This thesis compares GAMs and GLMMs in the context of modeling nonlinear curves. The study contains a comprehensive simulation and a few real-life data analyses. The simulation uses thousands of generated datasets to compare and contrast the two models' fit (with linear models as a benchmark), the extent of nonlinearity, and the shape of the resulting curves. The data analyses extend the results of the simulation to GLMM/GAM curves of lung function with measures of smoking as the independent variable. An additional, larger real-life data analysis with dichotomous outcomes rounds out the study and allows for more representative results.
35

Heinzl, Felix [Verfasser], and Gerhard [Akademischer Betreuer] Tutz. "Clustering in linear and additive mixed models / Felix Heinzl. Betreuer: Gerhard Tutz." München : Universitätsbibliothek der Ludwig-Maximilians-Universität, 2013. http://d-nb.info/1035066823/34.

36

Heinzl, Felix [Verfasser], and Gerhard [Akademischer Betreuer] Tutz. "Clustering in linear and additive mixed models / Felix Heinzl. Betreuer: Gerhard Tutz." München : Universitätsbibliothek der Ludwig-Maximilians-Universität, 2013. http://d-nb.info/1035066823/34.

37

Fu, Liang. "Consumption and investment decision an analysis of aggregate and time-additive models /." [Gainesville, Fla.] : University of Florida, 2009. http://purl.fcla.edu/fcla/etd/UFE0024971.

38

De Zan, Martina <1994>. "ExplainableAI: on explaining forest of decision trees by using generalized additive models." Master's Degree Thesis, Università Ca' Foscari Venezia, 2021. http://hdl.handle.net/10579/18604.

Abstract:
In recent years, decision support systems have become more and more pervasive in our society, playing an important role in our everyday life. But these systems, often called black-box models, are extremely complex, and it may be impossible to understand or explain how they work in a human-interpretable way. This lack of explainability is an issue: ethically, because we have to be sure that our system is fair and reasonable; practically, because people tend to trust more what they understand. However, substituting a black-box model with a more interpretable one in the process of decision making may be impossible: the interpretable model may not work as well as the original one, or the training data may no longer be available. In this thesis we focus on forests of decision trees, which are particular cases of black-box models. In fact, individual trees are interpretable models, but forests are composed of thousands of trees that cooperate to take decisions, making the final model too complex for its behavior to be comprehended. In this work we show that Generalized Additive Models (GAMs) can be used to explain forests of decision trees with a good level of accuracy. GAMs are linear combinations of single-feature or pair-feature models, called shape functions. Since shape functions can only be one- or two-dimensional functions, they can be easily visualized and interpreted by the user. At the same time, shape functions can be arbitrarily complex, making GAMs as powerful as other more complex models.
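One concrete way to realize this program — offered as an illustration, not as the thesis's actual procedure — is to distill a fitted forest into per-feature shape functions by backfitting one-dimensional smoothers against the forest's own predictions:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor

def distill_to_gam(forest, X, n_rounds=20, max_depth=3):
    """Approximate a fitted forest with an additive model: one shape
    function per feature, learned by backfitting shallow 1-D trees
    on the forest's predictions (the 'teacher' signal)."""
    y_hat = forest.predict(X)
    n, p = X.shape
    intercept = y_hat.mean()
    contrib = [np.zeros(n) for _ in range(p)]   # current f_j(x_j) values
    smoothers = [None] * p
    for _ in range(n_rounds):
        for j in range(p):
            # partial residual: what is left for feature j to explain
            partial = y_hat - intercept - (sum(contrib) - contrib[j])
            t = DecisionTreeRegressor(max_depth=max_depth).fit(X[:, [j]], partial)
            smoothers[j] = t
            contrib[j] = t.predict(X[:, [j]])
            contrib[j] -= contrib[j].mean()     # keep shape functions centred
    return intercept, smoothers

# Usage sketch on synthetic data:
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=500)
forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
b0, fs = distill_to_gam(forest, X)
gam_pred = b0 + sum(f.predict(X[:, [j]]) for j, f in enumerate(fs))
```

Each smoother is one-dimensional, so it can be plotted directly against its feature, which is precisely the interpretability argument the abstract makes for GAMs.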
39

Vitrano, Angela. "Modelling Spatio-Temporal Elephant Movement Data: a Generalized Additive Mixed Models Framework." Doctoral thesis, Università degli Studi di Palermo, 2014. http://hdl.handle.net/10447/90988.

Abstract:
This thesis focuses on understanding how environmental factors influence elephant movement and on investigating the spatio-temporal patterns. The thesis analyses movement data of African elephants (Loxodonta africana) living in the Kruger National Park and its associated private reserves in South Africa. Due to heterogeneity among elephants, and nonlinear relationships between elephant movement and environmental variables, Generalized Additive Mixed Models (GAMMs) were employed. Results showed delayed effects of rainfall and temperature and particular trends in time and space.
40

Sánchez, Rocha Martín. "Wall-models for large eddy simulation based on a generic additive-filter formulation." Atlanta, Ga. : Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/28086.

Abstract:
Thesis (M. S.)--Aerospace Engineering, Georgia Institute of Technology, 2009.
Committee Chair: Menon, Suresh; Committee Member: Cvitanović, Predrag; Committee Member: Sankar, Lakshmi N.; Committee Member: Smith, Marilyn J.; Committee Member: Yeung, Pui-Kuen
41

Hart, Derrick N. "Finite Field Models of Roth's Theorem in One and Two Dimensions." Thesis, Georgia Institute of Technology, 2006. http://hdl.handle.net/1853/11516.

Abstract:
Recent work on many problems in additive combinatorics, such as Roth's theorem, has shown the usefulness of first studying the problem in a finite field environment. Using the techniques of Bourgain to give a result in other settings such as general abelian groups, the author gives a walk-through, including proof, of Roth's theorem in both the one-dimensional and two-dimensional cases (it would be more accurate to refer to the two-dimensional case as Shkredov's theorem). In the one-dimensional case the argument is at its base Meshulam's, but the structure will be essentially Green's. Let F_p^n, p ≠ 2, be the finite field of cardinality N = p^n. For large N, any subset A ⊂ F_p^n of cardinality |A| ≳ N / log N must contain a triple of the form {x, x + d, x + 2d} for x, d ∈ F_p^n, d ≠ 0. In the two-dimensional case the argument is that of Lacey and McClain, who made considerable refinements to the argument of Green, who brought the argument to the finite field case from a paper of Shkredov. Let F_2^n be the finite field of cardinality N = 2^n. For all large N, any subset A ⊂ F_2^n × F_2^n of cardinality |A| ≳ N^2 (log n)^{-ε}, ε < 1, must contain a corner {(x, y), (x + d, y), (x, y + d)} for x, y, d ∈ F_2^n and d ≠ 0.
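The one-dimensional statement is easy to test by brute force for small n (an illustration we add here; note that p = 2 is excluded because x + 2d collapses to x in characteristic 2):

```python
import itertools
import numpy as np

def has_nontrivial_3ap(A, p):
    """Return True if A, a set of points of F_p^n given as the rows of an
    integer array (entries mod p, rows distinct), contains a triple
    x, x + d, x + 2d with d != 0."""
    S = {tuple(row) for row in A.tolist()}
    for x, y in itertools.permutations(A.tolist(), 2):
        # y plays the role of x + d; then x + 2d = 2y - x (mod p)
        z = tuple((2 * np.array(y) - np.array(x)) % p)
        if z in S:
            return True
    return False

# e.g. in F_5^2, {(0,0), (1,2), (2,4)} is a 3-AP with common difference (1,2)
A = np.array([[0, 0], [1, 2], [2, 4]])
print(has_nontrivial_3ap(A, 5))   # True
```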
42

Kirichenko, L., I. Ivanisenko, and T. Radivilova. "Investigation of Multifractal Properties of Additive Data Stream." Thesis, 1st IEEE International Conference on Data Stream Mining & Processing, 2016. http://openarchive.nure.ua/handle/document/3810.

Abstract:
The work presents the results of a numerical study of the fractal characteristics of a multifractal stream when a stream without multifractal properties is added to it. The results show that the generalized Hurst exponent of the total stream tends to that of the original multifractal stream as the signal-to-noise ratio increases.
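The generalized Hurst exponent referred to here is the standard multifractal scaling exponent (standard definition, not quoted from the paper):

```latex
% q-th order structure-function scaling of a stream X(t):
E\,\lvert X(t+\tau) - X(t) \rvert^{q} \;\propto\; \tau^{\,q H(q)}
```

A monofractal stream has H(q) constant in q, while genuine dependence of H(q) on q signals multifractality; the paper's finding is that this signature of the original stream survives the addition of a non-multifractal stream once the signal-to-noise ratio is large.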
43

Sánchez, Rocha Martín. "Wall-models for large eddy simulation based on a generic additive-filter formulation." Diss., Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/28086.

Abstract:
In this work, the mathematical implications of merging two different turbulence modeling approaches are addressed by deriving the exact hybrid RANS/LES Navier-Stokes equations. These equations are derived by introducing an additive filter, which linearly combines the RANS and LES operators with a blending function. The equations derived predict additional hybrid terms, which represent the interactions between RANS and LES formulations. Theoretically, the prediction of the hybrid terms demonstrates that the hybridization of the two approaches cannot be accomplished by the turbulence model equations alone, as is claimed in current hybrid RANS/LES models. The importance of the exact hybrid RANS/LES equations is demonstrated by conducting numerical calculations on a turbulent flat-plate boundary layer. Results indicate that the hybrid terms help to maintain an equilibrated model transition when the hybrid formulation switches from RANS to LES. Results also indicate that, when the hybrid terms are not included, the accuracy of the calculations strongly relies on the blending function implemented in the additive filter. On the other hand, if the exact equations are resolved, results are only weakly affected by the characteristics of the blending function. Unfortunately, for practical applications the hybrid terms cannot be computed exactly. Consequently, a reconstruction procedure is proposed to approximate these terms. Results show that the proposed model is able to mimic the exact hybrid terms, enhancing the accuracy of current hybrid RANS/LES approaches. In a second effort, the Two Level Simulation (TLS) approach is proposed as a near-wall model for LES. Here, TLS is first extended to compressible flows by deriving the small-scale equations required by the model. The full compressible TLS formulation and the hybrid TLS/LES approach are validated by simulating the flow over a flat-plate turbulent boundary layer. Overall, results are found to be in reasonable agreement with experimental data and LES calculations.
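The additive filter at the center of the derivation blends the RANS average ⟨·⟩ and the LES filter (denoted by a tilde) through a blending function k (our sketch of the construction named in the title; the details follow the thesis):

```latex
% Additive hybrid filter applied to a flow variable phi:
\overline{\phi}^{\,H} = k\,\langle \phi \rangle + (1 - k)\,\widetilde{\phi},
\qquad 0 \le k \le 1
```

Because this combined operator does not commute with products, applying it to the nonlinear advection term of the Navier-Stokes equations generates the additional hybrid terms discussed above, coupling the RANS and LES fields wherever 0 < k < 1.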
44

Degenhardt, Regina [Verfasser]. "Advanced Lattice Boltzmann Models for the Simulation of Additive Manufacturing Processes / Regina Degenhardt." München : Verlag Dr. Hut, 2017. http://d-nb.info/1149579137/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Rügamer, David [Verfasser], and Sonja [Akademischer Betreuer] Greven. "Estimation, model choice and subsequent inference: methods for additive and functional regression models / David Rügamer ; Betreuer: Sonja Greven." München : Universitätsbibliothek der Ludwig-Maximilians-Universität, 2018. http://d-nb.info/1161670874/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Nian, Gaowei. "A score test of homogeneity in generalized additive models for zero-inflated count data." Kansas State University, 2014. http://hdl.handle.net/2097/18230.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Master of Science
Department of Statistics
Wei-Wen Hsu
Zero-Inflated Poisson (ZIP) models are often used to analyze count data with excess zeros. In the ZIP model, the Poisson mean and the mixing weight are often assumed to depend on covariates through regression techniques. In other words, the effect of covariates on the Poisson mean or the mixing weight is specified using a proper link function coupled with a linear predictor, which is simply a linear combination of unknown regression coefficients and covariates. In practice, however, this predictor may not be linear in the regression parameters but curvilinear or nonlinear. In such situations, a more general and flexible approach should be considered. One popular method in the literature is Zero-Inflated Generalized Additive Models (ZIGAM), which extend zero-inflated models to incorporate Generalized Additive Models (GAM) and can accommodate a nonlinear predictor in the link function. For ZIGAM, it is also of interest to conduct inference on the mixing weight, particularly to evaluate whether the mixing weight equals zero. Many methodologies have been proposed to examine this question, but all of them are developed under classical zero-inflated models rather than ZIGAM. In this report, we propose a generalized score test to evaluate whether the mixing weight is equal to zero under the framework of ZIGAM with a Poisson model. Technically, the proposed score test is developed based on a novel transformation for the mixing weight coupled with proportional constraints on the ZIGAM, which assume that the smooth components of the covariates in the Poisson mean and the mixing weight have a proportional relationship. An intensive simulation study indicates that the proposed score test outperforms the other existing tests when the mixing weight and the Poisson mean truly involve a nonlinear predictor. The recreational fisheries data from the Marine Recreational Information Program (MRIP) survey conducted by the National Oceanic and Atmospheric Administration (NOAA) are used to illustrate the proposed methodology.
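As background for the test, a ZIP regression with GAM-type predictors can be written as follows (a generic formulation for orientation; the report's exact parameterization of the proportionality constraint may differ):

\[
P(Y = y \mid x) \;=\;
\begin{cases}
\omega(x) + \bigl(1-\omega(x)\bigr)\,e^{-\lambda(x)}, & y = 0,\\[4pt]
\bigl(1-\omega(x)\bigr)\,\dfrac{e^{-\lambda(x)}\,\lambda(x)^{y}}{y!}, & y = 1, 2, \dots,
\end{cases}
\]
with
\[
\log \lambda(x) \;=\; \beta_0 + \sum_{j} f_j(x_j),
\qquad
\operatorname{logit}\,\omega(x) \;=\; \gamma_0 + \delta \sum_{j} f_j(x_j),
\]
so that the smooth components in the mean and the mixing weight are proportional; the score test then assesses the homogeneity hypothesis \(H_0\colon \omega(x) \equiv 0\) (no zero inflation).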
47

Asadollahiyazdi, Elnaz. "Integrated Design of Additive Manufacturing Based on Design for Manufacturing and Skin-skeleton Models." Thesis, Troyes, 2018. http://www.theses.fr/2018TROY0026.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Nowadays, Additive Manufacturing (AM) is transforming the manufacturing world through its capability to produce complex shapes layer by layer. The Design for Manufacturing (DFM) approach helps to take AM constraints into account and to master product features across the product life cycle. Several studies are devoted to integrated design approaches for AM, but no existing approach considers all product life-cycle steps at the optimization level for the product and its manufacturing process. This thesis therefore provides a DFM approach for AM that simultaneously investigates the different attributes, constraints, and criteria of design and manufacturing from the first definition of the product. The skin-skeleton approach models this first definition; it comprises a functional analysis, a usage model, and a manufacturing model. In this work, a novel interface processing engine, acting at the interface between the product model and the manufacturing model, is developed through an analysis of AM technologies and their parameters and criteria. The engine relies on a bi-objective optimization problem that minimizes production time and material mass under constraints on the mechanical properties and surface roughness of the product, yielding the optimal manufacturing parameters. This methodology makes it possible to define the product model. The approach is implemented for Fused Deposition Modeling (FDM) and verified through two case studies.
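The bi-objective trade-off described above can be illustrated with a small Python sketch. All cost and constraint models below are hypothetical toys chosen only to exhibit the time-versus-mass conflict; they are not the thesis's engine or its actual process models.

import numpy as np

def build_time(t, rho):      # hours (toy): thin layers print slowly
    return 1.0 / t + 2.0 * rho

def material_mass(rho):      # grams (toy): denser infill uses more material
    return 20.0 + 80.0 * rho

def strength(t, rho):        # MPa (toy): infill and thin-layer bonding both help
    return 60.0 * rho + 30.0 * (0.3 - t)

def roughness(t):            # microns (toy): thick layers leave rougher walls
    return 40.0 * t

def dominates(a, b):         # a Pareto-dominates b on (time, mass)
    return a[0] <= b[0] and a[1] <= b[1] and (a[0] < b[0] or a[1] < b[1])

# Grid scan of the design space (layer thickness t [mm], infill density rho),
# keeping only designs that satisfy the strength and roughness constraints.
candidates = [(build_time(t, r), material_mass(r), t, r)
              for t in np.linspace(0.10, 0.25, 16)
              for r in np.linspace(0.10, 1.00, 19)
              if strength(t, r) >= 40.0 and roughness(t) <= 10.0]

# The Pareto-optimal designs are those not dominated by any other candidate.
pareto = [c for c in candidates if not any(dominates(o, c) for o in candidates)]
for tm, m, t, r in sorted(pareto):
    print(f"time={tm:5.2f} h  mass={m:5.1f} g  <-  layer={t:.2f} mm, infill={r:.2f}")

Here fast builds require thick layers, which (in this toy strength model) must be compensated by denser, heavier infill, so the printed Pareto front exposes exactly the time/mass compromise the optimization engine is meant to resolve.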
48

Martof, Ashley Nicole. "Analysis of Business Models for the Use of Additive Manufacturing for Maintenance and Sustainment." Youngstown State University / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=ysu1494940467559894.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Agharkar, Amal. "Model Validation and Comparative Performance Evaluation of MOVES/CALINE4 and Generalized Additive Models for Near-Road Black Carbon Prediction." University of Cincinnati / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1490350586489513.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Campher, Susanna Elisabeth Sophia. "Comparing generalised additive neural networks with decision trees and alternating conditional expectations / Susanna E. S. Campher." Thesis, North-West University, 2008. http://hdl.handle.net/10394/2025.

Full text
APA, Harvard, Vancouver, ISO, and other styles
