Academic literature on the topic 'Variable selection bias'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Variable selection bias.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Variable selection bias"
Canan, Chelsea, Catherine Lesko, and Bryan Lau. "Instrumental Variable Analyses and Selection Bias." Epidemiology 28, no. 3 (May 2017): 396–98. http://dx.doi.org/10.1097/ede.0000000000000639.
Shin, Sung-Chul, Yeon-Joo Jeong, and Moon Sup Song. "Bias Reduction in Split Variable Selection in C4.5." Communications for Statistical Applications and Methods 10, no. 3 (December 1, 2003): 627–35. http://dx.doi.org/10.5351/ckss.2003.10.3.627.
Choi, Byeong Yeob, Jason P. Fine, and M. Alan Brookhart. "Bias testing, bias correction, and confounder selection using an instrumental variable model." Statistics in Medicine 39, no. 29 (August 27, 2020): 4386–404. http://dx.doi.org/10.1002/sim.8730.
Shih, Yu-Shan, and Hsin-Wen Tsai. "Variable selection bias in regression trees with constant fits." Computational Statistics & Data Analysis 45, no. 3 (April 2004): 595–607. http://dx.doi.org/10.1016/s0167-9473(03)00036-7.
García, O. "Estimating top height with variable plot sizes." Canadian Journal of Forest Research 28, no. 10 (October 1, 1998): 1509–17. http://dx.doi.org/10.1139/x98-128.
Zhao, Pei Xin. "Penalized Estimation Based Variable Selection for Semiparametric Regression Models with Endogenous Covariates." Advanced Materials Research 1079-1080 (December 2014): 843–46. http://dx.doi.org/10.4028/www.scientific.net/amr.1079-1080.843.
Swanson, Sonja A. "A Practical Guide to Selection Bias in Instrumental Variable Analyses." Epidemiology 30, no. 3 (May 2019): 345–49. http://dx.doi.org/10.1097/ede.0000000000000973.
Marshall, Andrew, Leilei Tang, and Alistair Milne. "Variable reduction, sample selection bias and bank retail credit scoring." Journal of Empirical Finance 17, no. 3 (June 2010): 501–12. http://dx.doi.org/10.1016/j.jempfin.2009.12.003.
Qin, Xiao, and Junhee Han. "Variable Selection Issues in Tree-Based Regression Models." Transportation Research Record: Journal of the Transportation Research Board 2061, no. 1 (January 2008): 30–38. http://dx.doi.org/10.3141/2061-04.
Nishi, Hayato, Yasushi Asami, and Chihiro Shimizu. "Housing features and rent: estimating the microstructures of rental housing." International Journal of Housing Markets and Analysis 12, no. 2 (April 1, 2019): 210–25. http://dx.doi.org/10.1108/ijhma-09-2018-0067.
Full textDissertations / Theses on the topic "Variable selection bias"
Strobl, Carolin, Anne-Laure Boulesteix, Achim Zeileis, and Torsten Hothorn. "Bias in Random Forest Variable Importance Measures: Illustrations, Sources and a Solution." Department of Statistics and Mathematics, WU Vienna University of Economics and Business, 2006. http://epub.wu.ac.at/1274/1/document.pdf.
Series: Research Report Series / Department of Statistics and Mathematics
Tseng, Shih-Hsien. "Bayesian and Semi-Bayesian regression applied to manufacturing wooden products." The Ohio State University, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=osu1199240473.
Full textDitrich, Josef. "Možnosti redukce výběrového zkreslení v ratingových modelech." Doctoral thesis, Vysoká škola ekonomická v Praze, 2009. http://www.nusl.cz/ntk/nusl-201116.
Full textCai, Mingxuan. "BIVAS: a scalable Bayesian method for bi-level variable selection." HKBU Institutional Repository, 2018. https://repository.hkbu.edu.hk/etd_oa/482.
Full textXie, Diqiong. "Bias and variance of treatment effect estimators using propensity-score matching." Diss., University of Iowa, 2011. https://ir.uiowa.edu/etd/4980.
Full textRomero, Merino Enrique. "Learning with Feed-forward Neural Networks: Three Schemes to Deal with the Bias/Variance Trade-off." Doctoral thesis, Universitat Politècnica de Catalunya, 2004. http://hdl.handle.net/10803/6644.
Full textIn this work we present three schemes related to the control of the Bias/Variance decomposition for Feed-forward Neural Networks (FNNs) with the (sometimes modified) quadratic loss function:
1. An algorithm for sequential approximation with FNNs, named Sequential Approximation with Optimal Coefficients and Interacting Frequencies (SAOCIF). Most of the sequential approximations proposed in the literature select the new frequencies (the non-linear weights) guided by the approximation of the residue of the partial approximation. We propose a sequential algorithm in which the new frequency is selected taking into account its interactions with the previously selected ones. The interactions are discovered by means of their optimal coefficients (the linear weights). A number of heuristics can be used to select the new frequencies. The aim is to achieve the same level of approximation with fewer hidden units than if we only tried to match the residue as well as possible. In terms of the Bias/Variance decomposition, this makes it possible to obtain simpler models with the same bias. The idea behind SAOCIF can be extended to approximation in Hilbert spaces, maintaining orthogonal-like properties. In this case, the importance of the interacting frequencies lies in the expected increase in the rate of approximation. Experimental results show that the idea of interacting frequencies allows better approximations to be constructed than matching the residue alone.
2. A study and comparison of different criteria for performing Feature Selection (FS) with Multi-Layer Perceptrons (MLPs) and the Sequential Backward Selection (SBS) procedure within the wrapper approach (a minimal code sketch of this procedure is given after the abstract). FS procedures control the Bias/Variance decomposition by means of the input dimension, establishing a clear connection with the curse of dimensionality. Several critical decision points are studied and compared. First, the stopping criterion. Second, the data set on which the value of the loss function is measured. Finally, we also compare two ways of computing the saliency (i.e., the relative importance) of a feature: either train a single network and then temporarily remove every feature, or train a different network with every feature temporarily removed. The experiments are performed for linear and non-linear models. Experimental results suggest that the increased computational cost of retraining a different network with every feature temporarily removed, prior to computing the saliency, can be rewarded with a significant performance improvement, especially when non-linear models are used. Although this idea may seem very intuitive, it has hardly been used in practice. Regarding the data set on which the value of the loss function is measured, it seems clear that the SBS procedure for MLPs benefits from measuring the loss function on a validation set. A somewhat non-intuitive conclusion emerges from the stopping criterion, where it can be seen that forcing overtraining may be as useful as early stopping.
3. A modification of the quadratic loss function for classification problems, inspired by Support Vector Machines (SVMs) and the AdaBoost algorithm, named the Weighted Quadratic Loss (WQL) function. The modification consists of weighting the contribution of every example to the total error. In the linearly separable case, the solution of the hard-margin SVM also minimizes the proposed loss function. The hardness of the resulting solution can be controlled, as in SVMs, so the scheme may also be used in the non-linearly separable case. The error weighting proposed in WQL forces the training procedure to pay more attention to the points with a smaller margin. Variance is therefore controlled by not attempting to overfit points that are already well classified. The model shares several properties with the SVM framework, with some additional advantages. On the one hand, the final solution is neither restricted to an architecture with as many hidden units as there are points (or support vectors) in the data set, nor to the use of kernel functions, and the frequencies are not restricted to be a subset of the data set. On the other hand, it handles multiclass and multilabel problems in a natural way. Experimental results confirming these claims are shown.
Extensive experimental work has been carried out with the proposed schemes, including artificial data sets, well-known benchmark data sets, and two real-world problems from the Natural Language Processing domain. In addition to widely used activation functions, such as the hyperbolic tangent or the Gaussian function, other activation functions have been tested. In particular, sinusoidal MLPs showed very good behavior. The experimental results can be considered very satisfactory. The schemes presented in this work proved very competitive when compared to other existing schemes described in the literature. In addition, they can be combined with one another, since they deal with complementary aspects of the whole learning process.
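As a concrete illustration of the wrapper-style Sequential Backward Selection described in point 2 of the abstract, the following is a minimal sketch that retrains a fresh MLP for each temporarily removed feature and scores it on a validation set. It is not the thesis's implementation: the dataset (scikit-learn's iris), the network size, and the simple "keep the best subset seen" rule are illustrative assumptions.

```python
# Minimal sketch of wrapper-style Sequential Backward Selection (SBS) with an MLP.
# For each candidate feature, a fresh network is retrained with that feature
# temporarily removed; the feature whose removal hurts validation error the least
# is dropped. Dataset and hyperparameters are illustrative assumptions.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

def val_error(features):
    """Train a fresh MLP on the given feature subset; return validation error."""
    net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    net.fit(X_tr[:, features], y_tr)
    return 1.0 - net.score(X_val[:, features], y_val)

selected = list(range(X.shape[1]))
history = [(tuple(selected), val_error(selected))]
while len(selected) > 1:
    # Saliency of a feature = validation error of a network retrained without it.
    errors = {f: val_error([g for g in selected if g != f]) for f in selected}
    least_salient = min(errors, key=errors.get)  # cheapest feature to remove
    selected.remove(least_salient)
    history.append((tuple(selected), errors[least_salient]))

best_subset, best_err = min(history, key=lambda t: t[1])
print("best subset:", best_subset, "validation error:", round(best_err, 3))
```

The sketch runs the backward pass down to a single feature and then keeps the subset with the lowest validation error; the abstract's comparison of stopping criteria and of training-set versus validation-set evaluation would replace these two fixed choices.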
Bhatti, Sajjad Haider. "Estimation of the Mincerian wage model addressing its specification and different econometric issues." PhD thesis, Université de Bourgogne, 2012. http://tel.archives-ouvertes.fr/tel-00780563.
Shandilya, Sharad. "Assessment and Prediction of Cardiovascular Status during Cardiac Arrest through Machine Learning and Dynamical Time-Series Analysis." VCU Scholars Compass, 2013. http://scholarscompass.vcu.edu/etd/3198.
Full text"Addressing the Variable Selection Bias and Local Optimum Limitations of Longitudinal Recursive Partitioning with Time-Efficient Approximations." Doctoral diss., 2019. http://hdl.handle.net/2286/R.I.54792.
Doctoral dissertation, Psychology, 2019.
Lutu, P. E. N. (Patricia Elizabeth Nalwoga). "Dataset selection for aggregate model implementation in predictive data mining." Thesis, 2010. http://hdl.handle.net/2263/29486.
Thesis (PhD), Computer Science, University of Pretoria, 2010.
Books on the topic "Variable selection bias"
Boudreau, Joseph F., and Eric S. Swanson. Monte Carlo methods. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198708636.003.0007.
Book chapters on the topic "Variable selection bias"
Baskin, Igor I., Gilles Marcou, Dragos Horvath, and Alexandre Varnek. "Cross-Validation and the Variable Selection Bias." In Tutorials in Chemoinformatics, 163–73. Chichester, UK: John Wiley & Sons, Ltd, 2017. http://dx.doi.org/10.1002/9781119161110.ch10.
Richards, Joseph W. "Overcoming Sample Selection Bias in Variable Star Classification." In Astrostatistics and Data Mining, 213–21. New York, NY: Springer New York, 2012. http://dx.doi.org/10.1007/978-1-4614-3323-1_22.
Marquis, Bastien, and Maarten Jansen. "Correction for Optimisation Bias in Structured Sparse High-Dimensional Variable Selection." In Springer Proceedings in Mathematics & Statistics, 357–65. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-57306-5_32.
Munson, M. Arthur, and Rich Caruana. "On Feature Selection, Bias-Variance, and Bagging." In Machine Learning and Knowledge Discovery in Databases, 144–59. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04174-7_10.
Rosales-Pérez, Alejandro, Hugo Jair Escalante, Jesus A. Gonzalez, Carlos A. Reyes-Garcia, and Carlos A. Coello Coello. "Bias and Variance Multi-objective Optimization for Support Vector Machines Model Selection." In Pattern Recognition and Image Analysis, 108–16. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38628-2_12.
Arreola, Julio, Damián Gibaja, J. Agustín Franco, and Marcelo Sánchez-Oro. "Comparison of the Bias and Weighting of Variables in Neural Networks (ANN) for the Selection of the Type of Housing in Spain and Mexico." In Studies in Computational Intelligence, 19–34. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-72065-0_2.
Hankin, David G., Michael S. Mohr, and Ken B. Newman. "Ratio and regression estimation." In Sampling Theory, 104–39. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198815792.003.0007.
Poast, Paul. "Analyzing Alliance Treaty Negotiation Outcomes." In Arguing about Alliances, 64–106. Cornell University Press, 2019. http://dx.doi.org/10.7591/cornell/9781501740244.003.0004.
"An Algorithm for Causal Inference in the Presence of Latent Variables and Selection Bias." In Computation, Causation, and Discovery. The MIT Press, 1999. http://dx.doi.org/10.7551/mitpress/2006.003.0009.
Hankin, David G., Michael S. Mohr, and Ken B. Newman. "Basic concepts." In Sampling Theory, 11–22. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198815792.003.0002.
Full textConference papers on the topic "Variable selection bias"
"Variable Liquidity and Selection Bias in Transaction Indices of institutional Commercial Property." In 9th European Real Estate Society Conference: ERES Conference 2002. ERES, 2002. http://dx.doi.org/10.15396/eres2002_167.
Full textSimon, Donald L., and Sanjay Garg. "Optimal Tuner Selection for Kalman Filter-Based Aircraft Engine Performance Estimation." In ASME Turbo Expo 2009: Power for Land, Sea, and Air. ASMEDC, 2009. http://dx.doi.org/10.1115/gt2009-59684.
Full textSmart, Lucinda, Yanping Li, J. Bruce Nestleroth, and Suzanne Ward. "Interaction Rule Guidance for Corrosion Features Reported by ILI." In 2018 12th International Pipeline Conference. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/ipc2018-78284.
Full textLall, Amrita, Hamid Khakpour Nejadkhaki, and John Hall. "An Integrative Framework for Design and Control Optimization of a Variable-Ratio Gearbox in a Wind Turbine With Active Blades." In ASME 2016 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/detc2016-60244.
Full textXia, Rui, Zhenchun Pan, and Feng Xu. "Instance Weighting with Applications to Cross-domain Text Classification via Trading off Sample Selection Bias and Variance." In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/624.
Full textMcShane, Michael J., Gerard L. Cote, and Clifford H. Spiegelman. "Variable selection for quantitative determination of glucose concentration with near-infrared spectroscopy." In BiOS '97, Part of Photonics West, edited by Alexander V. Priezzhev, Toshimitsu Asakura, and Robert C. Leif. SPIE, 1997. http://dx.doi.org/10.1117/12.273614.
Full textBronson, Robert J., Hans R. Depold, Ravi Rajamani, Somnath Deb, William H. Morrison, and Krishna R. Pattipati. "Data Normalization for Engine Health Monitoring." In ASME Turbo Expo 2005: Power for Land, Sea, and Air. ASMEDC, 2005. http://dx.doi.org/10.1115/gt2005-68039.
Full textHe, Shuangchi, and Jitendra Tugnait. "On Bias-Variance Trade-Off in Superimposed Training-Based Doubly Selective Channel Estimation." In 2006 40th Annual Conference on Information Sciences and Systems. IEEE, 2006. http://dx.doi.org/10.1109/ciss.2006.286666.
Full textHenriksson, M., S. Borguet, O. Le´onard, and T. Gro¨nstedt. "On Inverse Problems in Turbine Engine Parameter Estimation." In ASME Turbo Expo 2007: Power for Land, Sea, and Air. ASMEDC, 2007. http://dx.doi.org/10.1115/gt2007-27756.
Full textGormez, Z., O. Kursun, A. Sertbas, N. Aydin, and H. Seker. "Statistical bias and variance of gene selection and cross validation methods: A case study on hypertension prediction." In 2012 IEEE-EMBS International Conference on Biomedical and Health Informatics (BHI). IEEE, 2012. http://dx.doi.org/10.1109/bhi.2012.6211658.
Full textReports on the topic "Variable selection bias"
Brown, H. C., K. Ganesan, and R. K. Dhar. Enolboration 3. An Examination of the Effect of Variable Steric Requirements of R on the Stereoselective Enolboration of Ketones with R2BCl/Et3N. Bis(Bicyclo(2.2.2)Octyl)Chloroborane/Triethylamine - A New Reagent Which Achieves the Selective Generation of E Enolborinates from Representative Ketones. Fort Belvoir, VA: Defense Technical Information Center, April 1992. http://dx.doi.org/10.21236/ada250066.