Academic literature on the topic 'Regression analysis model'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Regression analysis model.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Regression analysis model"

1

Murugan, N. Senthil Vel, V. Vallinayagam, and K. Senthamarai Kannan. "Multiple Regression Model and Similarity Analysis – A Comparison Study." Indian Journal of Applied Research 4, no. 8 (2011): 430–32. http://dx.doi.org/10.15373/2249555x/august2014/109.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Zvára, Karel. "Analysis of variance as regression model with a reparametrization restriction." Applications of Mathematics 37, no. 6 (1992): 453–58. http://dx.doi.org/10.21136/am.1992.104523.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Ng, Meei Pyng, and Gary K. Grunwald. "Nonlinear Regression Analysis of the Joint-Regression Model." Biometrics 53, no. 4 (1997): 1366. http://dx.doi.org/10.2307/2533503.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Singh, Amit, and Khushbu Babbar. "A Mutation Testing Analysis and Regression Testing." International Journal on Foundations of Computer Science & Technology (IJFCST) 5, no. 3 (2023): 7. https://doi.org/10.5281/zenodo.8344960.

Full text
Abstract:
Software testing is conducted to provide information to the client about the quality of the product under test. Software testing can also provide an objective, independent view of the software to allow the business to appreciate and understand the risks of software implementation. In this paper we focus on two main types of software testing: mutation testing and regression testing. Mutation testing is a procedural testing method, i.e. the structure of the code is used to guide the test program. A mutation is a small change in a program. Such changes are applied to model low-level defects that occur in the process of coding systems. Ideally, mutations should model low-level defect creation. Mutation testing is a process in which the code is modified and the mutated code is then tested against the test suites. The mutations applied to the source code are designed to mimic common programming errors. A good unit test typically detects the program mutation and fails as a result. Mutation testing is used on many different platforms, including Java, C++, C# and Ruby. Regression testing is a type of software testing that seeks to uncover new software bugs, or regressions, in existing functional and non-functional areas of a system after changes such as enhancements, patches or configuration changes have been made to them. When defects are found during testing, they are fixed and that part of the software works as needed again. However, a fix may introduce or uncover a different defect in the software. Regression testing is the way to detect these unexpected bugs and fix them. The main focus of regression testing is to verify that changes to the software or program have not produced any adverse side effects and that the software still meets its requirements. Regression tests are run whenever changes are made to the software, for example when functions are modified.
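To make the two ideas concrete, here is a small, hypothetical Python sketch (not from the paper): a deliberately mutated copy of a toy function, a unit test that "kills" the mutant, and the same test rerun as a regression check. The function names and values are invented for illustration.

```python
# Hypothetical illustration of mutation and regression testing (not from the cited paper).

def price_with_tax(price: float, rate: float = 0.1) -> float:
    """Original implementation: add tax to a price."""
    return price * (1 + rate)

def price_with_tax_mutant(price: float, rate: float = 0.1) -> float:
    """Mutant: a small, deliberate change ('+' -> '-') modelling a low-level coding defect."""
    return price * (1 - rate)

def test_price_with_tax(fn) -> bool:
    """A unit test; a good test 'kills' the mutant by failing on it."""
    return abs(fn(100.0) - 110.0) < 1e-9

if __name__ == "__main__":
    # Mutation testing: the test suite should pass on the original and fail on the mutant.
    print("original passes:", test_price_with_tax(price_with_tax))            # True
    print("mutant killed:", not test_price_with_tax(price_with_tax_mutant))   # True
    # Regression testing: after any later change to price_with_tax, rerun the same
    # test suite to check that previously working behaviour has not regressed.
    print("regression check:", test_price_with_tax(price_with_tax))
```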
APA, Harvard, Vancouver, ISO, and other styles
5

Yousif, Omaima A., Adil N. Abed, and Hamid A. Awad. "Modal Split Model Using Multiple Linear Regression Analysis." Anbar Journal for Engineering Sciences 12, no. 2 (2021): 222–28. http://dx.doi.org/10.37649/aengs.2021.171190.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Kumar, Nand Kishor, Raj Kumar Shah, and Suresh Kumar Sahani. "Regression Analysis and Forecasting with Regression Model in Economics." Mikailalsys Journal of Advanced Engineering International 2, no. 2 (2025): 159–70. https://doi.org/10.58578/mjaei.v2i2.5401.

Full text
Abstract:
This work aims to provide a mathematical model that defines this relationship and can be applied to prediction. It helps economists understand how different factors influence economic indicators such as GDP, inflation, unemployment, and market trends. Forecasting using regression models provides valuable insights for policy-making, business strategies, and economic planning.
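A minimal sketch of the kind of regression-based forecasting described above, fitted with statsmodels on invented figures; the predictors (inflation, unemployment) and the response (GDP growth) are illustrative assumptions, not data from the article.

```python
# Toy regression forecast of an economic indicator (illustrative data, not from the paper).
import numpy as np
import statsmodels.api as sm

# Invented yearly observations: two predictors and the response (GDP growth, %).
inflation    = np.array([2.1, 2.5, 3.0, 1.8, 2.2, 2.9, 3.4, 2.6])
unemployment = np.array([5.0, 4.8, 5.5, 6.1, 5.9, 5.2, 4.9, 5.0])
gdp_growth   = np.array([3.1, 3.0, 2.4, 1.9, 2.1, 2.6, 2.8, 2.7])

X = sm.add_constant(np.column_stack([inflation, unemployment]))  # design matrix with intercept
model = sm.OLS(gdp_growth, X).fit()                              # ordinary least squares fit
print(model.params)                                              # intercept and slopes

# Forecast GDP growth for a hypothetical next year with 2.4% inflation and 5.3% unemployment.
x_new = np.array([[1.0, 2.4, 5.3]])
print("forecast:", model.predict(x_new))
```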
APA, Harvard, Vancouver, ISO, and other styles
7

Miroshnychenko, V. O. "Residual analysis in regression mixture model." Bulletin of Taras Shevchenko National University of Kyiv. Series: Physics and Mathematics, no. 3 (2019): 8–16. http://dx.doi.org/10.17721/1812-5409.2019/3.1.

Full text
Abstract:
We consider data in which each observed subject belongs to one of several different subpopulations (components). The component to which a subject truly belongs is unknown, but the researcher knows the probabilities that a subject belongs to a given component (the concentration of the component in the mixture). The concentrations are different for different observations, so the distribution of the observed data is a mixture of the components’ distributions with varying concentrations. A set of variables is observed for each subject. Dependence between these variables is described by a nonlinear regression model whose coefficients are different for different components. An estimator of these regression coefficients is proposed based on least squares and generalized estimating equations. Consistency of this estimator is demonstrated under general assumptions. A mixture of logistic regression models with continuous response is considered as an example. It is shown that the general consistency conditions are satisfied for this model under very mild assumptions. The performance of the estimator is assessed by simulations, and the method is applied to sociological data analysis. Q-Q diagrams are built for visual comparison of the residuals’ distributions.
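For illustration only, the sketch below fits a two-component mixture of linear regressions in which the observation-specific concentrations are known, using a generic EM algorithm with weighted least squares in the M-step. This is a simplified stand-in, not the least-squares/GEE estimator studied in the paper.

```python
# Generic EM fit of a two-component mixture of linear regressions with known,
# observation-specific concentrations (prior membership probabilities).
# A sketch in the spirit of the abstract, not the paper's estimator.
import numpy as np

rng = np.random.default_rng(1)
n = 1000
x = rng.uniform(-2, 2, n)
X = np.column_stack([np.ones(n), x])
p1 = rng.uniform(0.1, 0.9, n)                 # known concentrations of component 1
z = rng.uniform(size=n) < p1                  # latent labels, used only to simulate data
beta_true = {1: np.array([1.0, 2.0]), 2: np.array([-1.0, 0.5])}
y = np.where(z, X @ beta_true[1], X @ beta_true[2]) + rng.normal(0, 0.3, n)

def wls(X, y, w):
    """Weighted least squares via the normal equations."""
    return np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))

b1, b2, sigma = np.zeros(2), np.ones(2), 1.0
for _ in range(200):                          # EM iterations
    d1 = np.exp(-(y - X @ b1) ** 2 / (2 * sigma ** 2)) * p1
    d2 = np.exp(-(y - X @ b2) ** 2 / (2 * sigma ** 2)) * (1 - p1)
    r = d1 / (d1 + d2)                        # E-step: posterior membership probabilities
    b1, b2 = wls(X, y, r), wls(X, y, 1 - r)   # M-step: weighted LS per component
    resid2 = r * (y - X @ b1) ** 2 + (1 - r) * (y - X @ b2) ** 2
    sigma = np.sqrt(resid2.mean())

print("component 1 estimate:", b1, " component 2 estimate:", b2)
```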
APA, Harvard, Vancouver, ISO, and other styles
8

Katsetos, Anastasios A., and Andrew C. Brendler. "NBTI model development with regression analysis." Microelectronics Reliability 49, no. 12 (2009): 1498–502. http://dx.doi.org/10.1016/j.microrel.2009.06.009.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Liu, Jin, Yingying Ma, and Hansheng Wang. "Semiparametric model for covariance regression analysis." Computational Statistics & Data Analysis 142 (February 2020): 106815. http://dx.doi.org/10.1016/j.csda.2019.106815.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Cai, Tianxi, and Yingye Zheng. "Model Checking for ROC Regression Analysis." Biometrics 63, no. 1 (2007): 152–63. http://dx.doi.org/10.1111/j.1541-0420.2006.00620.x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Regression analysis model"

1

Li, Lingzhu. "Model checking for general parametric regression models." HKBU Institutional Repository, 2019. https://repository.hkbu.edu.hk/etd_oa/654.

Full text
Abstract:
Model checking for regressions has drawn considerable attention in the last three decades. Compared with global smoothing tests, local smoothing tests, which are more sensitive to high-frequency alternatives, can only detect local alternatives distinct from the null model at a much slower rate when the dimension of the predictor is high. When the number of covariates is large, nonparametric estimations used in local smoothing tests lack efficiency. Corresponding tests then have trouble in maintaining the significance level and detecting the alternatives. To tackle the issue, we propose two methods under a high but fixed dimension framework. Further, we investigate a model checking test under divergent dimension, where the numbers of covariates and unknown parameters go divergent with the sample size n. The first proposed test is constructed upon a typical kernel-based local smoothing test using the projection method. Employing projection and integration, the resulting test statistic has a closed form that depends only on the residuals and distances of the sample points. A merit of the developed test is that the distance is easy to implement compared with the kernel estimation, especially when the dimension is high. Moreover, the test inherits some features of local smoothing tests owing to its construction. Although it is eventually similar to an Integrated Conditional Moment test in spirit, it leads to a test with a weight function that helps to collect more information from the samples than the Integrated Conditional Moment test. Simulations and real data analysis justify the power of the test. The second test, which is a synthesis of local and global smoothing tests, aims at solving the slow convergence rate caused by nonparametric estimation in local smoothing tests. A significant feature of this approach is that it allows nonparametric estimation-based tests, under the alternatives, to also share the merits of existing empirical process-based tests. The proposed hybrid test can detect local alternatives at the fastest possible rate like the empirical process-based ones, and simultaneously retains the sensitivity to high-frequency alternatives of the nonparametric estimation-based ones. This feature is achieved by utilizing an indicative dimension in the field of dimension reduction. As a by-product, we have a systematic study of a residual-related central subspace for model adaptation, showing when alternative models can be indicated and when they cannot. Numerical studies are conducted to verify its application. Since the data volume nowadays is increasing, the numbers of predictors and unknown parameters are probably divergent as the sample size n goes to infinity. Model checking under divergent dimension, however, is almost uncharted in the literature. In this thesis, an adaptive-to-model test is proposed to handle the divergent dimension based on the two previously introduced tests. Theoretical results show that, to obtain the asymptotic normality of the parameter estimator, the number of unknown parameters should be of the order o(n^(1/3)). Also, as a spinoff, we demonstrate the asymptotic properties of the estimates of the residual-related central subspace and the central mean subspace under different hypotheses.
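As a rough, generic illustration of residual-and-distance-based model checking (in the spirit of an Integrated Conditional Moment test, not the projected adaptive-to-model tests developed in the thesis), the following sketch computes a kernel-weighted quadratic form of OLS residuals and calibrates it with a wild bootstrap; the data and the kernel choice are invented.

```python
# Toy ICM-style specification test for a linear null model, using pairwise-distance
# weights and a wild bootstrap. Generic illustration only, not the thesis's tests.
import numpy as np

rng = np.random.default_rng(2)
n, p = 200, 3
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, -0.5, 0.0]) + 0.5 * X[:, 0] ** 2 + rng.normal(0, 1, n)  # true model is nonlinear

def ols_resid(X, y):
    """Fit OLS with an intercept; return residuals and fitted values."""
    Xc = np.column_stack([np.ones(len(y)), X])
    beta = np.linalg.lstsq(Xc, y, rcond=None)[0]
    return y - Xc @ beta, Xc @ beta

def icm_stat(e, X):
    """Quadratic form of residuals with a Gaussian weight on pairwise distances."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-0.5 * d2)
    return (e @ K @ e) / len(e)

e, fitted = ols_resid(X, y)
T_obs = icm_stat(e, X)

# Wild bootstrap under the null: perturb residuals with Rademacher multipliers and refit.
B, count = 500, 0
for _ in range(B):
    y_star = fitted + e * rng.choice([-1.0, 1.0], size=n)
    e_star, _ = ols_resid(X, y_star)
    count += icm_stat(e_star, X) >= T_obs
print("ICM-type statistic:", T_obs, " bootstrap p-value:", count / B)
```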
APA, Harvard, Vancouver, ISO, and other styles
2

Lo, Sau Yee. "Measurement error in logistic regression model /." View abstract or full-text, 2004. http://library.ust.hk/cgi/db/thesis.pl?MATH%202004%20LO.

Full text
Abstract:
Thesis (M. Phil.)--Hong Kong University of Science and Technology, 2004. Includes bibliographical references (leaves 82-83). Also available in electronic version. Access restricted to campus users.
APA, Harvard, Vancouver, ISO, and other styles
3

Gandy, Axel. "Directed model checks for regression models from survival analysis." Berlin Logos-Ver, 2005. http://deposit.ddb.de/cgi-bin/dokserv?id=2766731&prov=M&dok_var=1&dok_ext=htm.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Gandy, Axel. "Directed model checks for regression models from survival analysis /." Berlin : Logos-Ver, 2006. http://deposit.ddb.de/cgi-bin/dokserv?id=2766731&prov=M&dok_var=1&dok_ext=htm.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Ranganai, Edmore. "Aspects of model development using regression quantiles and elemental regressions." Thesis, Stellenbosch : Stellenbosch University, 2007. http://hdl.handle.net/10019.1/18668.

Full text
Abstract:
Dissertation (PhD)--University of Stellenbosch, 2007. It is well known that ordinary least squares (OLS) procedures are sensitive to deviations from the classical Gaussian assumptions (outliers) as well as data aberrations in the design space. The two major data aberrations in the design space are collinearity and high leverage. Leverage points can also induce or hide collinearity in the design space. Such leverage points are referred to as collinearity influential points. As a consequence, over the years, many diagnostic tools to detect these anomalies as well as alternative procedures to counter them were developed. To counter deviations from the classical Gaussian assumptions many robust procedures have been proposed. One such class of procedures is the Koenker and Bassett (1978) Regression Quantiles (RQs), which are natural extensions of order statistics to the linear model. RQs can be found as solutions to linear programming problems (LPs). The basic optimal solutions to these LPs (which are RQs) correspond to elemental subset (ES) regressions, which consist of subsets of minimum size to estimate the necessary parameters of the model. On the one hand, some ESs correspond to RQs. On the other hand, in the literature it is shown that many OLS statistics (estimators) are related to ES regression statistics (estimators). Therefore there is an inherent relationship amongst the three sets of procedures. The relationship between the ES procedure and the RQ one has been noted almost “casually” in the literature, while the latter has been fairly widely explored. Using these existing relationships between the ES procedure and the OLS one, as well as new ones, collinearity, leverage and outlier problems in the RQ scenario were investigated. Also, a lasso procedure was proposed as a variable selection technique in the RQ scenario and some tentative results were given for it. These results are promising. Single case diagnostics were considered as well as their relationships to multiple case ones. In particular, multiple cases of the minimum size to estimate the necessary parameters of the model were considered, corresponding to a RQ (ES). In this way regression diagnostics were developed for both ESs and RQs. The main problems that affect RQs adversely are collinearity and leverage, due to the nature of the computational procedures and the fact that RQs’ influence functions are unbounded in the design space but bounded in the response variable. As a consequence of this, RQs have a high affinity for leverage points and a high exclusion rate of outliers. The influential picture exhibited in the presence of both leverage points and outliers is the net result of these two antagonistic forces. Although RQs are bounded in the response variable (and therefore fairly robust to outliers), outlier diagnostics were also considered in order to have a more holistic picture. The investigations comprised analytic means as well as simulation. Furthermore, applications were made to artificial, computer-generated data sets as well as standard data sets from the literature. These revealed that the ES-based statistics can be used to address problems arising in the RQ scenario with some degree of success. However, due to the interdependence between the different aspects, viz. the one between leverage and collinearity and the one between leverage and outliers, “solutions” are often dependent on the particular situation. In spite of this complexity, the research did produce some fairly general guidelines that can be fruitfully used in practice.
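A minimal sketch of fitting Koenker–Bassett regression quantiles, here with statsmodels' QuantReg on simulated heavy-tailed data; it illustrates only the RQ estimator itself, not the elemental-subset diagnostics or the lasso-based selection explored in the dissertation.

```python
# Fitting Koenker-Bassett regression quantiles with statsmodels (illustrative data).
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(3)
n = 300
x = rng.uniform(0, 10, n)
y = 1.0 + 0.5 * x + rng.standard_t(df=3, size=n)   # heavy-tailed errors, where RQs are robust
X = sm.add_constant(x)

for q in (0.25, 0.5, 0.75):
    fit = QuantReg(y, X).fit(q=q)                  # each RQ solves a linear programming problem
    print(f"tau = {q}: intercept = {fit.params[0]:.3f}, slope = {fit.params[1]:.3f}")
```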
APA, Harvard, Vancouver, ISO, and other styles
6

Tan, Falong. "Projected adaptive-to-model tests for regression models." HKBU Institutional Repository, 2017. https://repository.hkbu.edu.hk/etd_oa/390.

Full text
Abstract:
This thesis investigates Goodness-of-Fit tests for parametric regression models. With the help of sufficient dimension reduction techniques, we develop adaptive-to-model tests using projection in both the fixed dimension settings and the diverging dimension settings. The first part of the thesis develops a globally smoothing test in the fixed dimension settings for a parametric single index model. When the dimension p of covariates is larger than 1, existing empirical process-based tests either have non-tractable limiting null distributions or are not omnibus. To attack this problem, we propose a projected adaptive-to-model approach. If the null hypothesis is a parametric single index model, our method can fully utilize the dimension reduction structure under the null as if the regressors were one-dimensional. Then a martingale transformation proposed by Stute, Thies, and Zhu (1998) leads our test to be asymptotically distribution-free. Moreover, our test can automatically adapt to the underlying alternative models such that it can be omnibus and thus detect all alternative models departing from the null at the fastest possible convergence rate in hypothesis testing. A comparative simulation is conducted to check the performance of our test. We also apply our test to a self-noise mechanisms data set for illustration. The second part of the thesis proposes a globally smoothing test for parametric single-index models in the diverging dimension settings. In high dimensional data analysis, the dimension p of covariates is often large even though it may still be small compared with the sample size n. Thus we should regard p as a diverging number as n goes to infinity. With this in mind, we develop an adaptive-to-model empirical process as the basis of our test statistic, when the dimension p of covariates diverges to infinity as the sample size n tends to infinity. We also show that the martingale transformation proposed by Stute, Thies, and Zhu (1998) still works in the diverging dimension settings. The limiting distributions of the adaptive-to-model empirical process under both the null and the alternative are discussed in this new situation. Simulation examples are conducted to show the performance of this test when p grows with the sample size n. The last chapter of the thesis considers the same problem as in the second part. Bierens (1982) first constructed tests based on projection pursuit techniques and obtained an integrated conditional moment (ICM) test. We notice that Bierens's (1982) test performs very badly for large p, although it may be viewed as a globally smoothing test. With the help of sufficient dimension reduction techniques, we propose an adaptive-to-model integrated conditional moment test for regression models in the diverging dimension setting. We also give the asymptotic properties of the new tests under both the null and alternative hypotheses in this new situation. When p grows with the sample size n, simulation studies show that our new tests perform much better than Bierens's (1982) original test.
APA, Harvard, Vancouver, ISO, and other styles
7

Volinsky, Christopher T. "Bayesian model averaging for censored survival models /." Thesis, Connect to this title online; UW restricted, 1997. http://hdl.handle.net/1773/8944.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Liu, Hai. "Semiparametric regression analysis of zero-inflated data." Diss., University of Iowa, 2009. https://ir.uiowa.edu/etd/308.

Full text
Abstract:
Zero-inflated data abound in ecological studies as well as in other scientific and quantitative fields. Nonparametric regression with zero-inflated response may be studied via the zero-inflated generalized additive model (ZIGAM). ZIGAM assumes that the conditional distribution of the response variable belongs to the zero-inflated 1-parameter exponential family which is a probabilistic mixture of the zero atom and the 1-parameter exponential family, where the zero atom accounts for an excess of zeroes in the data. We propose the constrained zero-inflated generalized additive model (COZIGAM) for analyzing zero-inflated data, with the further assumption that the probability of non-zero-inflation is some monotone function of the (non-zero-inflated) exponential family distribution mean. When the latter assumption obtains, the new approach provides a unified framework for modeling zero-inflated data, which is more parsimonious and efficient than the unconstrained ZIGAM. We develop an iterative algorithm for model estimation based on the penalized likelihood approach, and derive formulas for constructing confidence intervals of the maximum penalized likelihood estimator. Some asymptotic properties including the consistency of the regression function estimator and the limiting distribution of the parametric estimator are derived. We also propose a Bayesian model selection criterion for choosing between the unconstrained and the constrained ZIGAMs. We consider several useful extensions of the COZIGAM, including imposing additive-component-specific proportional and partial constraints, and incorporating threshold effects to account for regime shift phenomena. The new methods are illustrated with both simulated data and real applications. An R package COZIGAM has been developed for model fitting and model selection with zero-inflated data.
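As a simpler, fully parametric cousin of the ZIGAM described above, the sketch below fits a zero-inflated Poisson regression with statsmodels on simulated data; it is only meant to illustrate the zero-atom-plus-count-model idea, not the constrained ZIGAM (COZIGAM) or its R implementation.

```python
# Zero-inflated Poisson regression on simulated data (parametric illustration only).
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(4)
n = 1000
x = rng.uniform(-1, 1, n)
X = sm.add_constant(x)

lam = np.exp(0.5 + 1.0 * x)                      # Poisson mean of the non-inflated part
p_zero = 1 / (1 + np.exp(-(-0.5 + 1.5 * x)))     # probability of a structural (excess) zero
y = np.where(rng.uniform(size=n) < p_zero, 0, rng.poisson(lam))

model = ZeroInflatedPoisson(y, X, exog_infl=X, inflation='logit')
result = model.fit(maxiter=200, disp=False)
print(result.params)   # inflation (logit) coefficients followed by count (log-linear) coefficients
```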
APA, Harvard, Vancouver, ISO, and other styles
9

Roualdes, Edward A. "New Results in ℓ1 Penalized Regression." UKnowledge, 2015. http://uknowledge.uky.edu/statistics_etds/13.

Full text
Abstract:
Here we consider penalized regression methods and extend the results surrounding the ℓ1 norm penalty. We address a more recent development that generalizes previous methods by penalizing a linear transformation of the coefficients of interest instead of penalizing just the coefficients themselves. We introduce an approximate algorithm to fit this generalization and a fully Bayesian hierarchical model that is a direct analogue of the frequentist version. A number of benefits are derived from the Bayesian perspective; most notably choice of the tuning parameter and natural means to estimate the variation of estimates – a notoriously difficult task for the frequentist formulation. We then introduce Bayesian trend filtering, which exemplifies the benefits of our Bayesian version. Bayesian trend filtering is shown to be an empirically strong technique for fitting univariate, nonparametric regression. Through a simulation study, we show that Bayesian trend filtering reduces prediction error and attains more accurate coverage probabilities than the frequentist method. We then apply Bayesian trend filtering to real data sets, where our method is quite competitive against a number of other popular nonparametric methods.
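For readers unfamiliar with the baseline being generalized, here is a minimal ℓ1-penalized regression (lasso) fit with scikit-learn on simulated sparse data; the generalized penalty on a linear transformation of the coefficients and the Bayesian trend-filtering method from the thesis are not shown.

```python
# Minimal l1-penalized regression (the lasso) on simulated sparse data.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(5)
n, p = 100, 20
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [3.0, -2.0, 1.5]                      # sparse true coefficients
y = X @ beta + rng.normal(0, 1, n)

fit = Lasso(alpha=0.1).fit(X, y)                 # alpha is the l1 penalty weight
print("nonzero coefficients:", np.flatnonzero(fit.coef_))
print("first five estimates:", np.round(fit.coef_[:5], 2))
```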
APA, Harvard, Vancouver, ISO, and other styles
10

Bunea, Florentina. "A model selection approach to partially linear regression /." Thesis, Connect to this title online; UW restricted, 2000. http://hdl.handle.net/1773/8971.

Full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Books on the topic "Regression analysis model"

1

Draper, Norman Richard. Model selection problems. University of Toronto, Dept. of Statistics, 1986.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Borowiak, Dale S. Model discrimination for nonlinear regression models. M. Dekker, 1989.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Bera, Anil K. Specification test for a linear regression model with arch process. University of Illinois at Urbana-Champaign, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Krämer, Walter. The linear regression model under test. Physica-Verlag, 1986.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Özçam, Ahmet. The risk properties of a pre-test estimator for Zellner's seemingly unrelated regression model. Bureau of Economic and Business Research, 1990.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Özçam, Ahmet. The risk properties of a pre-test estimator for Zellner's seemingly unrelated regression model. Bureau of Economic and Business Research, 1990.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Kim, Chang-Jin. In search of a model that an ARCH-type model may be approximating: The Markov model of heteroskedasticity. York University, Dept. of Economics, 1990.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Jiang, Jiming. Robust Mixed Model Analysis. World Scientific Publishing Co Pte Ltd, 2019.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Sankoh, O. A. Influential observations in the linear regression model and Trenkler's iteration estimator. Cuvillier, 1996.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Choi, ByoungSeon. ARMA model identification. Springer-Verlag, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Regression analysis model"

1

Westfall, Peter H., and Andrea L. Arias. "Estimating Regression Model Parameters." In Understanding Regression Analysis. Chapman and Hall/CRC, 2020. http://dx.doi.org/10.1201/9781003025764-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Westfall, Peter H., and Andrea L. Arias. "The Multiple Regression Model." In Understanding Regression Analysis. Chapman and Hall/CRC, 2020. http://dx.doi.org/10.1201/9781003025764-6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Westfall, Peter H., and Andrea L. Arias. "The Classical Model and Its Consequences." In Understanding Regression Analysis. Chapman and Hall/CRC, 2020. http://dx.doi.org/10.1201/9781003025764-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Barrie Wetherill, G., P. Duncombe, M. Kenward, J. Köllerström, S. R. Paul, and B. J. Vowden. "Choosing a regression model." In Regression Analysis with Applications. Springer Netherlands, 1986. http://dx.doi.org/10.1007/978-94-009-4105-2_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Bolin, Jocelyn H. "Model Comparisons and Hierarchical Regression." In Regression Analysis in R. Chapman and Hall/CRC, 2022. http://dx.doi.org/10.1201/9780429295843-7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Takezawa, Kunio. "Linear Mixed Model." In Learning Regression Analysis by Simulation. Springer Japan, 2013. http://dx.doi.org/10.1007/978-4-431-54321-3_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Hox, Joop J., Mirjam Moerbeek, and Rens van de Schoot. "The Basic Two-Level Regression Model." In Multilevel Analysis. Routledge, 2017. http://dx.doi.org/10.4324/9781315650982-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Fedorov, Valerii V., and Peter Hackl. "Some Facts From Regression Analysis." In Model-Oriented Design of Experiments. Springer New York, 1997. http://dx.doi.org/10.1007/978-1-4612-0703-0_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Boutalbi, Rafika, Lazhar Labiod, and Mohamed Nadif. "Latent Block Regression Model." In Studies in Classification, Data Analysis, and Knowledge Organization. Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-09034-9_9.

Full text
Abstract:
When dealing with high dimensional sparse data, such as in recommender systems, co-clustering turns out to be more beneficial than one-sided clustering, even if one is interested in clustering along one dimension only. Thereby, co-clusterwise is a natural extension of clusterwise. Unfortunately, all of the existing approaches do not consider covariates on both dimensions of a data matrix. In this paper, we propose a Latent Block Regression Model (LBRM) overcoming this limit. For inference, we propose an algorithm performing simultaneously co-clustering and regression where a linear regression model characterizes each block. Placing the estimate of the model parameters under the maximum likelihood approach, we derive a Variational Expectation–Maximization (VEM) algorithm for estimating the model’s parameters. The finality of the proposed VEM-LBRM is illustrated through simulated datasets.
APA, Harvard, Vancouver, ISO, and other styles
10

Carter, Walter H., Galen L. Wampler, and Donald M. Stablein. "The Logistic Regression Model." In Regression Analysis of Survival Data in Cancer Chemotherapy. CRC Press, 2024. http://dx.doi.org/10.1201/9781003573531-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Regression analysis model"

1

Al Siddik Shammo, Basid, and Biswaranjan Acharya. "Esports Earning Analysis Using Regression Model." In 2024 International Conference on Artificial Intelligence and Quantum Computation-Based Sensor Application (ICAIQSA). IEEE, 2024. https://doi.org/10.1109/icaiqsa64000.2024.10882303.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Xiao, Yichen, Guanwen Yan, Yushi Yan, and Yuhao Xiang. "Sentiment analysis of movie reviews based on logistic regression model." In Seventh International Conference on Advanced Electronic Materials, Computers, and Software Engineering (AEMCSE 2024), edited by Lvqing Yang. SPIE, 2024. http://dx.doi.org/10.1117/12.3038109.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Yang, Heng, Xinyu Li, and Shixun Li. "Prediction and Analysis of Olympic Medals Based on Probit Regression and Tobit Model." In 2025 International Conference on Digital Analysis and Processing, Intelligent Computation (DAPIC). IEEE, 2025. https://doi.org/10.1109/dapic66097.2025.00158.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Li, Zhetong. "Flood Hazard Prediction Model Based on K-means Clustering Analysis and AdaBoost Regression Model." In 2025 International Conference on Electrical Drives, Power Electronics & Engineering (EDPEE). IEEE, 2025. https://doi.org/10.1109/edpee65754.2025.00036.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Yu, Zijun. "Research on Olympic Medal Distribution Prediction Based on Cluster Regression and CUSUM Analysis Model." In 2025 International Conference on Digital Analysis and Processing, Intelligent Computation (DAPIC). IEEE, 2025. https://doi.org/10.1109/dapic66097.2025.00148.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Duller, Christine. "Model selection for logistic regression models." In NUMERICAL ANALYSIS AND APPLIED MATHEMATICS ICNAAM 2012: International Conference of Numerical Analysis and Applied Mathematics. AIP, 2012. http://dx.doi.org/10.1063/1.4756152.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Khan, Mohiuddeen, and Kanishk Srivastava. "Regression Model for Better Generalization and Regression Analysis." In ICMLSC 2020: The 4th International Conference on Machine Learning and Soft Computing. ACM, 2020. http://dx.doi.org/10.1145/3380688.3380691.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Araveeporn, Autcha, and Choojai Kuharatanachai. "Comparing Penalized Regression Analysis of Logistic Regression Model with Multicollinearity." In the 2019 2nd International Conference. ACM Press, 2019. http://dx.doi.org/10.1145/3343485.3343487.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Wang, Yuanyuan. "Regression Analysis of Executive Shareholding and Corporate Earnings Management." In 2019 International Conference on Economic Management and Model Engineering (ICEMME). IEEE, 2019. http://dx.doi.org/10.1109/icemme49371.2019.00090.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Cavusoglu, Behiye, Kemal Cek, and Serife Z. Eyupoglu. "Modelling job satisfaction using a logistic regression model." In INTERNATIONAL CONFERENCE ON ANALYSIS AND APPLIED MATHEMATICS (ICAAM 2020). AIP Publishing, 2021. http://dx.doi.org/10.1063/5.0040383.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Regression analysis model"

1

Meloncelli, Daniel. Regression Analysis using SPSS. Instats Inc., 2025. https://doi.org/10.61700/9zohmz8j1gzcj1476.

Full text
Abstract:
This seminar provides a comprehensive introduction to regression analysis using SPSS, equipping researchers with the skills to apply linear and logistic regression techniques in their work. Participants will gain practical experience in model fitting, assumption checking, and interpreting results, enhancing their ability to leverage data effectively in their respective fields.
APA, Harvard, Vancouver, ISO, and other styles
2

Meloncelli, Daniel. Regression Analysis using R. Instats Inc., 2025. https://doi.org/10.61700/j3s1r521de2ee1472.

Full text
Abstract:
This seminar provides a comprehensive introduction to regression analysis using R. Participants will explore the fundamentals of correlation, simple linear regression, multiple regression, and logistic regression. The seminar emphasises practical application, guiding attendees through data preparation, model fitting, assumption checking, and interpretation of results. Hands-on sessions will enable participants to apply regression techniques to real-world datasets, enhancing their analytical skills for research purposes.
APA, Harvard, Vancouver, ISO, and other styles
3

Hutny, W. P., and J. T. Price. Analysis and regression model of blast furnace coal injection. Natural Resources Canada/ESS/Scientific and Technical Publishing Services, 1987. http://dx.doi.org/10.4095/304361.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Krishnaiah, P. R., and S. Sarkar. Principal Component Analysis Under Correlated Multivariate Regression Equations Model. Defense Technical Information Center, 1985. http://dx.doi.org/10.21236/ada160266.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Marchese, Malvina. Regression Analysis: Everything You Need to Know. Instats Inc., 2023. http://dx.doi.org/10.61700/758g2bp367fhc469.

Full text
Abstract:
Regression is one of the most important modelling tools used in a variety of different research fields. In many cases, the plausibility of an empirical finding depends on the robustness of the regression results. This two-day seminar offers an in-depth introduction to linear regression models for cross-sectional and time series data, covering all aspects of regression modelling, from model and variable selection, to dummy variables, multicollinearity and endogeneity, to prediction (in sample and out of sample). An official Instats certificate of completion is provided at the conclusion of the seminar, and European PhD students receive 2 ECTS Equivalent points.
APA, Harvard, Vancouver, ISO, and other styles
6

Marchese, Malvina. Regression Analysis: Everything You Need to Know. Instats Inc., 2023. http://dx.doi.org/10.61700/s16tb8g7d585g469.

Full text
Abstract:
Regression is one of the most important modelling tools used in a variety of different research fields. In many cases, the plausibility of an empirical finding depends on the robustness of the regression results. This two-day seminar offers an in-depth introduction to linear regression models for cross-sectional and time series data, covering all aspects of regression modelling, from model and variable selection, to dummy variables, multicollinearity and endogeneity, to prediction (in sample and out of sample). An official Instats certificate of completion is provided at the conclusion of the seminar, and European PhD students receive 2 ECTS Equivalent points.
APA, Harvard, Vancouver, ISO, and other styles
7

Sun, T. C. Using Regression Analysis Method to Develop a Material Outgassing Model. Office of Scientific and Technical Information (OSTI), 2019. http://dx.doi.org/10.2172/1499976.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Moeyaert, Mariola. Advanced Meta-Analysis. Instats Inc., 2023. http://dx.doi.org/10.61700/ttn9i9ntp8uvj469.

Full text
Abstract:
This seminar will introduce you to advanced meta-analytic methods. Commonly encountered meta-analytic topics and issues will be covered, including meta-regression models, methods for handling multiple effect sizes per study (i.e., dependent effect sizes), missing data, publication bias, meta-analysis SEM, and single-case experiments meta-analysis. During this seminar, participants will learn how to use RStudio to model these commonly encountered complexities. Hands-on exercises will be incorporated throughout the seminar. An official Instats certificate of completion is provided at the conclusion of the seminar. For European PhD students, each seminar offers 2 ECTS Equivalent points.
APA, Harvard, Vancouver, ISO, and other styles
9

Moeyaert, Mariola. Advanced Meta-Analysis. Instats Inc., 2023. http://dx.doi.org/10.61700/k4me5g0k92l56469.

Full text
Abstract:
This seminar will introduce you to advanced meta-analytic methods. Commonly encountered meta-analytic topics and issues will be covered, including meta-regression models, methods for handling multiple effect sizes per study (i.e., dependent effect sizes), missing data, publication bias, meta-analysis SEM, and single-case experiments meta-analysis. During this seminar, participants will learn how to use RStudio to model these commonly encountered complexities. Hands-on exercises will be incorporated throughout the seminar. An official Instats certificate of completion is provided at the conclusion of the seminar. For European PhD students, each seminar offers 2 ECTS Equivalent points.
APA, Harvard, Vancouver, ISO, and other styles
10

Kott, Phillip S. The Role of Weights in Regression Modeling and Imputation. RTI Press, 2022. http://dx.doi.org/10.3768/rtipress.2022.mr.0047.2203.

Full text
Abstract:
When fitting observations from a complex survey, the standard regression model assumes that the expected value of the difference between the dependent variable and its model-based prediction is zero, regardless of the values of the explanatory variables. A rarely failing extended regression model assumes only that the model error is uncorrelated with the model’s explanatory variables. When the standard model holds, it is possible to create alternative analysis weights that retain the consistency of the model-parameter estimates while increasing their efficiency by scaling the inverse-probability weights by an appropriately chosen function of the explanatory variables. When a regression model is used to impute for missing item values in a complex survey and when item missingness is a function of the explanatory variables of the regression model and not the item value itself, near unbiasedness of an estimated item mean requires that either the standard regression model for the item in the population holds or the analysis weights incorporate a correctly specified and consistently estimated probability of item response. By estimating the parameters of the probability of item response with a calibration equation, one can sometimes account for item missingness that is (partially) a function of the item value itself.
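A small sketch of the role of analysis weights in regression fitting: the same model estimated by OLS and by WLS with inverse-probability weights in statsmodels. The selection mechanism and data are invented, and the sketch does not implement the report's calibration approach to item nonresponse.

```python
# OLS versus inverse-probability-weighted WLS on a simulated complex-survey-style sample.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 2000
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(0, 1, n)

# Unequal selection probabilities that depend on x; sampled units get weight 1/pi.
pi = 1 / (1 + np.exp(-(0.5 + 1.0 * x)))
sampled = rng.uniform(size=n) < pi
X_s = sm.add_constant(x[sampled])
y_s, w_s = y[sampled], 1.0 / pi[sampled]

# When the standard regression model holds (as here), the two fits agree closely;
# the weighted fit stays consistent under the extended model even when it does not.
print("OLS on the sample:     ", sm.OLS(y_s, X_s).fit().params)
print("WLS with 1/pi weights: ", sm.WLS(y_s, X_s, weights=w_s).fit().params)
```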
APA, Harvard, Vancouver, ISO, and other styles