Academic literature on the topic 'Classification and regression trees'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Classification and regression trees.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Classification and regression trees"

1

Krzywinski, Martin, and Naomi Altman. "Classification and regression trees." Nature Methods 14, no. 8 (2017): 757–58. http://dx.doi.org/10.1038/nmeth.4370.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Speybroeck, N. "Classification and regression trees." International Journal of Public Health 57, no. 1 (2011): 243–46. http://dx.doi.org/10.1007/s00038-011-0315-z.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Van Ryzin, John, Leo Breiman, Jerome H. Friedman, Richard A. Olshen, and Charles J. Stone. "Classification and Regression Trees." Journal of the American Statistical Association 81, no. 393 (1986): 253. http://dx.doi.org/10.2307/2288003.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Praagman, J. "Classification and regression trees." European Journal of Operational Research 19, no. 1 (1985): 144. http://dx.doi.org/10.1016/0377-2217(85)90321-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Loh, Wei‐Yin. "Classification and regression trees." WIREs Data Mining and Knowledge Discovery 1, no. 1 (2011): 14–23. http://dx.doi.org/10.1002/widm.8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Fearn, Tom. "Classification and Regression Trees (CART)." NIR news 17, no. 6 (2006): 13–14. http://dx.doi.org/10.1255/nirn.917.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Yeh, Chyon-Hwa. "Classification and regression trees (CART)." Chemometrics and Intelligent Laboratory Systems 12, no. 1 (1991): 95–96. http://dx.doi.org/10.1016/0169-7439(91)80113-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Krzywinski, Martin, and Naomi Altman. "Erratum: Corrigendum: Classification and regression trees." Nature Methods 14, no. 9 (2017): 928. http://dx.doi.org/10.1038/nmeth0917-928a.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Akbilgic, Oguz. "Classification trees aided mixed regression model." Journal of Applied Statistics 42, no. 8 (2015): 1773–81. http://dx.doi.org/10.1080/02664763.2015.1006394.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Sambasivan, Rajiv, and Sourish Das. "Classification and regression using augmented trees." International Journal of Data Science and Analytics 7, no. 4 (2018): 259–76. http://dx.doi.org/10.1007/s41060-018-0146-6.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Classification and regression trees"

1

Nemčíková, Lucia. "Classification and Regression Trees in R." Master's thesis, Vysoká škola ekonomická v Praze, 2014. http://www.nusl.cz/ntk/nusl-194149.

Full text
Abstract:
Tree-based methods are a useful complement to traditional statistical methods for solving classification and regression problems. The aim of this master's thesis is not to judge which approach is better, but rather to give an overview of these methods and apply them to real data using R. The focus is on the basic methodology of tree-based models and their application in specific software, so as to give the reader a wide range of tools for using these methods. One part of the thesis touches on advanced tree-based methods to provide a full picture of the possibilities.
APA, Harvard, Vancouver, ISO, and other styles
2

Bremner, Alexandra. "Localised splitting criteria for classification and regression trees." Murdoch University, 2004. http://wwwlib.murdoch.edu.au/adt/browse/view/adt-MU20040606.142949.

Full text
Abstract:
This thesis presents a modification of existing entropy-based splitting criteria for classification and regression trees. Trees are typically grown using splitting criteria that choose optimal splits without taking future splits into account. This thesis examines localised splitting criteria that are based on local averaging in regression trees or local proportions in classification trees. The use of a localised criterion is motivated by the fact that future splits result in leaves that contain local observations, and hence local deviances provide a better approximation of the deviance of the fully grown tree. While most recent research has focussed on tree-averaging techniques that are aimed at taking a moderately successful splitting criterion and improving its predictive power, this thesis concentrates on improving the splitting criterion. Use of a localised splitting criterion captures local structures and enables later splits to capitalise on the placement of earlier splits when growing a tree. Using the localised splitting criterion results in much simpler trees for pure interaction data (data with no main effects) and can produce trees with fewer errors and lower residual mean deviances than those produced using a global splitting criterion when applied to real data sets with strong interaction effects. The superiority of the localised splitting criterion can persist when multiple trees are grown and averaged using simple methods. Although a single tree grown using the localised splitting criterion can outperform tree averaging using the global criterion, generally improvements in predictive performance are achieved by utilising the localised splitting criterion's property of detecting local discontinuities and averaging over sets of trees grown by placing splits where the deviance is locally minimal. 
Predictive performance improves further when the degree of localisation of the splitting criterion is randomly selected and weighted randomisation is used with locally minimal deviances to produce sets of trees to average over. Although state of the art methods quickly average very large numbers of trees, thus making the performance of the splitting criterion less critical, predictive performance when the localised criterion is used in bagging indicates that different splitting methods warrant investigation. The localised splitting criterion is most useful for growing one tree or a small number of trees to examine structure in the data. Structurally different trees can be obtained by simply splitting the data where the localised splitting criterion is locally optimal.
APA, Harvard, Vancouver, ISO, and other styles
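The "global" splitting criterion that the thesis above sets out to modify can be illustrated concretely. The sketch below is an illustrative Python implementation (the function names are ours, not the thesis author's): it picks the binary split of a single numeric feature that minimises the summed squared deviance of the two child nodes, without any of the localisation the thesis proposes.

```python
def node_deviance(y):
    """Regression deviance of a node: sum of squared deviations from the node mean."""
    if not y:
        return 0.0
    mean = sum(y) / len(y)
    return sum((v - mean) ** 2 for v in y)

def best_split(x, y):
    """Return (threshold, child deviance) for the best binary split of one
    numeric feature under the global deviance criterion."""
    pairs = sorted(zip(x, y))
    best_thr, best_dev = None, float("inf")
    for i in range(1, len(pairs)):
        left = [v for _, v in pairs[:i]]
        right = [v for _, v in pairs[i:]]
        dev = node_deviance(left) + node_deviance(right)
        if dev < best_dev:
            # Place the threshold midway between adjacent feature values.
            best_thr = (pairs[i - 1][0] + pairs[i][0]) / 2
            best_dev = dev
    return best_thr, best_dev
```

For x = [1, 2, 3, 10, 11, 12] and y = [1, 1, 1, 5, 5, 5], the chosen threshold is 6.5, which separates the two response levels exactly and drives the child deviance to zero.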
3

Bremner, Alexandra. "Localised splitting criteria for classification and regression trees." PhD thesis, Murdoch University, 2004. http://researchrepository.murdoch.edu.au/440/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Nappa, Dario. "Bayesian classification using Bayesian additive and regression trees." Ann Arbor, Mich. : ProQuest, 2008. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3336814.

Full text
Abstract:
Thesis (Ph.D. in Statistical Sciences), S.M.U. Title from PDF title page (viewed Mar. 16, 2009). Source: Dissertation Abstracts International, Volume 69-12, Section B. Adviser: Xinlei Wang. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
5

Bremner, Alexandra P. "Localised splitting criteria for classification and regression trees." PhD thesis, Murdoch University, 2004. https://researchrepository.murdoch.edu.au/id/eprint/440/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Bremner, Alexandra P. "Localised splitting criteria for classification and regression trees /." Access via Murdoch University Digital Theses Project, 2004. http://wwwlib.murdoch.edu.au/adt/browse/view/adt-MU20040606.142949.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Rusch, Thomas, and Achim Zeileis. "Discussion on Fifty Years of Classification and Regression Trees." Wiley, 2014. http://dx.doi.org/10.1111/insr.12062.

Full text
Abstract:
In this discussion paper, we argue that the literature on tree algorithms is very fragmented. We identify possible causes and discuss good and bad sides of this situation. Among the latter is the lack of free open-source implementations for many algorithms. We argue that if the community adopts a standard of creating and sharing free open-source implementations for their developed algorithms and creates easy access to these programs the bad sides of the fragmentation will be actively combated and will benefit the whole scientific community. (authors' abstract)
APA, Harvard, Vancouver, ISO, and other styles
8

Simonds, Jonah. "Roots of Conflict: Classification and Regression Trees and the Complexity of Organized Violence." Thesis, Uppsala universitet, Institutionen för freds- och konfliktforskning, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-444122.

Full text
Abstract:
Conflict researchers have validated many different theories on the causes of organized violence, but there are significant gaps in knowledge concerning how these theories interact with one another. In this thesis, I identify a body of the most prominent theories of organized violence and model them in an environment suitable for capturing these complex interactions. I formulate six causal categories to which these theories belong: Geography; Economy; Conflict History & Insecurity; Liberty & Inclusion; Natural Resources; and Structures of Governance. I then construct a cross-national, time-series sample of country-year observations and create a general model of organized violence using a machine learning technique called Classification and Regression Trees (CART). The results from this first model indicate a substantial negative effect owing to Peace Years, a count of the number of years since the country last experienced an internal conflict. Subsequently, I construct three more models, each investigating different subsets of country-year observations based on their Peace Years value. My models indicate that the country-years most likely to experience a high number of deaths from organized violence are those where conflict occurred in the previous year, the population size is high, and the net rate of male secondary school enrollment is low. The models also reveal several novel results under the presence of certain conditions, including: nonlinear relationships between deaths from organized violence and both oil exports and mass education; and a negative relationship between economic inequality and deaths from organized violence, wherein higher inequality results in fewer deaths. These findings highlight the importance of complexity-based modeling for both future conflict research and policymaking oriented towards violence reduction.
APA, Harvard, Vancouver, ISO, and other styles
9

Fowdar, Navindra Jay. "Inducing fuzzy decision trees to solve classification and regression problems in non-deterministic domains." Thesis, Manchester Metropolitan University, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.426942.

Full text
Abstract:
Most decision tree induction methods used for extracting knowledge in classification problems are unable to deal with uncertainties embedded within the data, associated with human thinking and perception. This thesis describes the development of a novel tree induction algorithm which improves the classification accuracy of decision trees in non-deterministic domains. A novel algorithm, Fuzzy CIA, is presented which applies the principles of fuzzy theory to decision tree algorithms in order to soften the sharp decision boundaries which are inherent in these induction techniques. Fuzzy CIA extrapolates rules from a crisply induced tree, fuzzifies the decision nodes, and combines membership grades using fuzzy inference. A novel approach is also proposed to manage the defuzzification of regression trees. The application of fuzzy logic to decision trees can represent classification knowledge more naturally, in line with human thinking, and creates trees that are more robust when handling imprecise, missing, or conflicting information. A series of experiments, using real-world datasets, was performed to compare the performance of Fuzzy CIA with crisp trees. The results have shown that Fuzzy CIA can significantly improve classification and prediction performance compared to crisp trees. The amount of improvement is found to be dependent upon the data domain, the method by which fuzzification is applied, and the fuzzy inference technique used to combine information from the tree.
APA, Harvard, Vancouver, ISO, and other styles
10

Crisanti, Mark. "The Predictive Accuracy of Boosted Classification Trees Relative to Discriminant Analysis and Logistic Regression." University of Cincinnati / OhioLINK, 2007. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1178566287.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Classification and regression trees"

1

Breiman, Leo, Jerome H. Friedman, Richard A. Olshen, and Charles J. Stone. Classification and regression trees. Chapman & Hall, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Abdolell, Mohamed. Poisson regression trees. National Library of Canada, 1995.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Abdolell, Mohamed. Poisson regression trees. University of Toronto, Dept. of Statistics, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Hastie, Trevor. Flexible discriminant analysis: Adaptive classification. University of Toronto, Dept. of Statistics, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Tibshirani, Robert. "Coaching" variables for regression and classification. University of Toronto, Dept. of Statistics, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Denison, David G. T., ed. Bayesian methods for nonlinear classification and regression. Wiley, 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Drezet, P. Directly optimized support vector machines for classification and regression. University of Sheffield, Dept. of Automatic Control and Systems Engineering, 1998.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Izenman, Alan Julian. Modern multivariate statistical techniques: Regression, classification, and manifold learning. Springer, 2008.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Meshbesher, Wendy, ed. Animal classification: Do cats have family trees? Raintree, 2015.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Espinosa, Laura Yáñez. Las principales familias de árboles en México. Universidad Autónoma Chapingo, División de Ciencias Forestales, 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Classification and regression trees"

1

Klemelä, Jussi, Sigbert Klinke, and Hizir Sofyan. "Classification and Regression Trees." In XploRe® - Application Guide. Springer Berlin Heidelberg, 2000. http://dx.doi.org/10.1007/978-3-642-57292-0_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

West, Robert M. "Regression and Classification Trees." In Modern Methods for Epidemiology. Springer Netherlands, 2012. http://dx.doi.org/10.1007/978-94-007-3024-3_16.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Trendowicz, Adam, and Ross Jeffery. "Classification and Regression Trees." In Software Project Effort Estimation. Springer International Publishing, 2013. http://dx.doi.org/10.1007/978-3-319-03629-8_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Torgo, Luis. "Computationally Efficient Linear Regression Trees." In Classification, Clustering, and Data Analysis. Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/978-3-642-56181-8_45.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Berk, Richard A. "Classification and Regression Trees (CART)." In Statistical Learning from a Regression Perspective. Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-44048-4_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Berk, Richard A. "Classification and Regression Trees (CART)." In Statistical Learning from a Regression Perspective. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-40189-4_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

O'Leary, Rebecca A., Samantha Low Choy, Wenbiao Hu, and Kerrie L. Mengersen. "Bayesian Classification and Regression Trees." In Case Studies in Bayesian Statistical Modelling and Analysis. John Wiley & Sons, Ltd, 2012. http://dx.doi.org/10.1002/9781118394472.ch19.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Gey, Servane, and Elodie Nedelec. "Risk Bounds for CART Regression Trees." In Nonlinear Estimation and Classification. Springer New York, 2003. http://dx.doi.org/10.1007/978-0-387-21579-2_24.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Giordano, Gianfranco, and Massimo Aria. "Regression Trees with Moderating Effects." In Studies in Classification, Data Analysis, and Knowledge Organization. Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-11363-5_45.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Garson, G. David. "Classification and regression trees in R." In Data Analytics for the Social Sciences. Routledge, 2021. http://dx.doi.org/10.4324/9781003109396-4.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Classification and regression trees"

1

Bertsimas, Dimitris, Jack Dunn, and Aris Paschalidis. "Regression and classification using optimal decision trees." In 2017 IEEE MIT Undergraduate Research Technology Conference (URTC). IEEE, 2017. http://dx.doi.org/10.1109/urtc.2017.8284195.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Engel, Joachim, Tim Erickson, and Laura Martignon. "Teaching about decision trees for classification problems." In Decision Making Based on Data. International Association for Statistical Education, 2019. http://dx.doi.org/10.52041/srap.19303.

Full text
Abstract:
In times of big data, tree-based algorithms are an important machine learning method supporting decision making in medicine, finance, public policy, and many other fields. Trees are a versatile method to represent decision processes that mirror human decision-making more closely than sophisticated traditional statistical methods like multivariate regression or neural networks. We introduce and illustrate ARBOR, a digital learning tool built as a plug-in to the freely available data science education software CODAP. It is designed to let learners critically appreciate and explore the steps of automatically generated decision trees.
APA, Harvard, Vancouver, ISO, and other styles
3

Moghadam, Armin, and Fatemeh Davoudi Kakhki. "Comparative Study of Decision Tree Models for Bearing Fault Detection and Classification." In Intelligent Human Systems Integration (IHSI 2022) Integrating People and Intelligent Systems. AHFE International, 2022. http://dx.doi.org/10.54941/ahfe100968.

Full text
Abstract:
Fault diagnosis of bearings is essential in reducing failures and improving the functionality and reliability of rotating machines. As vibration signals are non-linear and non-stationary, extracting features for dimension reduction and efficient fault detection is challenging. This study aims at evaluating the performance of decision tree-based machine learning models in the detection and classification of bearing fault data. A machine learning approach combining tree-based classifiers with derived statistical features is proposed for localized fault classification. Statistical features are extracted from normal and faulty vibration signals through time-domain analysis to develop the tree-based models AdaBoost (AD), classification and regression trees (CART), LogitBoost trees (LBT), and Random Forest trees (RF). The results confirm that the machine learning classifiers have satisfactory performance and strong generalization ability in fault detection, and provide practical models for classifying the running state of the bearing.
APA, Harvard, Vancouver, ISO, and other styles
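Several of the conference papers above apply the classical CART splitting rule for classification, which chooses splits by the decrease in Gini impurity. A minimal, hypothetical Python sketch of that criterion (our own illustration, not code from any of the cited papers):

```python
from collections import Counter

def gini(labels):
    """Gini impurity of a set of class labels: 1 - sum over classes of p_k^2."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((count / n) ** 2 for count in Counter(labels).values())

def split_gain(parent, left, right):
    """Impurity decrease achieved by splitting `parent` into `left` and `right`,
    weighting each child's impurity by its share of the parent's observations."""
    n = len(parent)
    weighted = (len(left) / n) * gini(left) + (len(right) / n) * gini(right)
    return gini(parent) - weighted
```

A perfectly class-separating split of a balanced two-class node, e.g. splitting ["a", "a", "b", "b"] into ["a", "a"] and ["b", "b"], yields the maximum possible gain of 0.5; CART greedily selects the candidate split with the largest such gain at each node.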
4

Li, Jing, Xinpu Ji, Yuhan Jia, et al. "Hard Drive Failure Prediction Using Classification and Regression Trees." In 2014 44th Annual IEEE/IFIP International Conference on Dependable Systems and Networks (DSN). IEEE, 2014. http://dx.doi.org/10.1109/dsn.2014.44.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Ramadhan, Taufik, Agung Wahana, Dian Sa'Adillah Maylawati, Nur Lukman, Ichsan Taufik, and Ichsan Budiman. "Pneumonia Prediction System Using Classification and Regression Trees Algorithm." In 2022 8th International Conference on Wireless and Telematics (ICWT). IEEE, 2022. http://dx.doi.org/10.1109/icwt55831.2022.9935412.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Vempada, Ramu Reddy, and K. Sreenivasa Rao. "Modeling the intensity of syllables using classification and Regression Trees." In 2012 National Conference on Communications (NCC). IEEE, 2012. http://dx.doi.org/10.1109/ncc.2012.6176824.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Xu, Yanyan, Qing-Jie Kong, and Yuncai Liu. "Short-term traffic volume prediction using classification and regression trees." In 2013 IEEE Intelligent Vehicles Symposium (IV). IEEE, 2013. http://dx.doi.org/10.1109/ivs.2013.6629516.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Amiri, Amir Mohammad, and Giuliano Armano. "Early diagnosis of heart disease using classification and regression trees." In 2013 International Joint Conference on Neural Networks (IJCNN 2013 - Dallas). IEEE, 2013. http://dx.doi.org/10.1109/ijcnn.2013.6707080.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Polat, Cahit. "Accuracies of Logistic Regression, Linear Discriminant Analysis, and Classification and Regression Trees Under Controlled Conditions." In 2019 AERA Annual Meeting. AERA, 2019. http://dx.doi.org/10.3102/1445626.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Campbel, P. R. J., H. Fathulla, and F. Ahmed. "FuzzyCART: A novel Fuzzy Logic based Classification & Regression Trees Algorithm." In 2009 International Conference on Innovations in Information Technology (IIT). IEEE, 2009. http://dx.doi.org/10.1109/iit.2009.5413763.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Classification and regression trees"

1

Asher, Sam, Denis Nekipelov, Paul Novosad, and Stephen Ryan. Classification Trees for Heterogeneous Moment-Based Models. National Bureau of Economic Research, 2016. http://dx.doi.org/10.3386/w22976.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Friedman, Jerome H. Classification and Multiple Regression through Projection Pursuit. Office of Scientific and Technical Information (OSTI), 1985. http://dx.doi.org/10.2172/1447844.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Friedman, Jerome H. Classification and Multiple Regression through Projection Pursuit. Defense Technical Information Center, 1985. http://dx.doi.org/10.21236/ada150797.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Hauzenberger, Niko, Florian Huber, Gary Koop, and James Mitchell. Bayesian modeling of time-varying parameters using regression trees. Federal Reserve Bank of Cleveland, 2023. http://dx.doi.org/10.26509/frbc-wp-202305.

Full text
Abstract:
In light of widespread evidence of parameter instability in macroeconomic models, many time-varying parameter (TVP) models have been proposed. This paper proposes a nonparametric TVP-VAR model using Bayesian additive regression trees (BART). The novelty of this model stems from the fact that the law of motion driving the parameters is treated nonparametrically. This leads to great flexibility in the nature and extent of parameter change, both in the conditional mean and in the conditional variance. In contrast to other nonparametric and machine learning methods that are black box, inference using our model is straightforward because, in treating the parameters rather than the variables nonparametrically, the model remains conditionally linear in the mean. Parsimony is achieved through adopting nonparametric factor structures and use of shrinkage priors. In an application to US macroeconomic data, we illustrate the use of our model in tracking both the evolving nature of the Phillips curve and how the effects of business cycle shocks on inflationary measures vary nonlinearly with movements in uncertainty.
APA, Harvard, Vancouver, ISO, and other styles
5

Abell, Caitlyn E., Anna K. Johnson, Locke A. Karriker, Suzanne T. Millman, and Kenneth J. Stalder. Using Classification Trees to Detect Lameness in Sows. Iowa State University, 2013. http://dx.doi.org/10.31274/ans_air-180814-930.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Vogt, Michael, and Oliver Linton. Classification of nonparametric regression functions in heterogeneous panels. IFS, 2015. http://dx.doi.org/10.1920/wp.cem.2015.0615.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Aha, David W., and Leonard A. Breslow. Comparing Simplification Procedures for Decision Trees on an Economics Classification. Defense Technical Information Center, 1998. http://dx.doi.org/10.21236/ada343512.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Steed, Chad A., J. Edward Swan II, Patrick J. Fitzpatrick, and T. J. Jankun-Kelly. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis. Office of Scientific and Technical Information (OSTI), 2012. http://dx.doi.org/10.2172/1035521.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Sharma, Tarkeshwari, Adrian Silvescu, and Vasant Honavar. Learning Classification Trees from Distributed Horizontally and Vertically Fragmented Data Sets. Defense Technical Information Center, 2000. http://dx.doi.org/10.21236/ada638206.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Carin, Lawrence. FLIR Image Classification Using Hidden Markov Trees, with Comparisons to More-Traditional Algorithms. Defense Technical Information Center, 2002. http://dx.doi.org/10.21236/ada407122.

Full text
APA, Harvard, Vancouver, ISO, and other styles