Journal articles on the topic 'Selection Criterion'

Consult the top 50 journal articles for your research on the topic 'Selection Criterion.'

2

Rahman, M. S., and Maxwell L. King. "Improved model selection criterion." Communications in Statistics - Simulation and Computation 28, no. 1 (January 1999): 51–71. http://dx.doi.org/10.1080/03610919908813535.

3

Kritikaa, Sr. "Ayurvedic Medical Destination - Selection Criterion Preference of Coimbatore City." International Journal of Trend in Scientific Research and Development 3, no. 3 (April 30, 2019): 1120–23. http://dx.doi.org/10.31142/ijtsrd23176.

4

Das Gupta, Ujjwal. "Parameter Selection for EM Clustering Using Information Criterion and PDDP." International Journal of Engineering and Technology 2, no. 4 (2010): 340–44. http://dx.doi.org/10.7763/ijet.2010.v2.144.

5

McQuarrie, Allan, Robert Shumway, and Chih-Ling Tsai. "The model selection criterion AICu." Statistics & Probability Letters 34, no. 3 (June 1997): 285–92. http://dx.doi.org/10.1016/s0167-7152(96)00192-7.

6

Tseng, Chih-Yuan. "Entropic criterion for model selection." Physica A: Statistical Mechanics and its Applications 370, no. 2 (October 2006): 530–38. http://dx.doi.org/10.1016/j.physa.2006.03.024.

7

Spinks, Nelda, and Barron Wells. "Readability: A Textbook Selection Criterion." Journal of Education for Business 69, no. 2 (December 1993): 83–87. http://dx.doi.org/10.1080/08832323.1993.10117662.

8

Ural, Hasan, and Bo Yang. "A structural test selection criterion." Information Processing Letters 28, no. 3 (July 1988): 157–63. http://dx.doi.org/10.1016/0020-0190(88)90162-7.

9

Maity, Arnab Kumar, Sanjib Basu, and Santu Ghosh. "Bayesian criterion‐based variable selection." Journal of the Royal Statistical Society: Series C (Applied Statistics) 70, no. 4 (April 27, 2021): 835–57. http://dx.doi.org/10.1111/rssc.12488.

10

Steuer, Ralph E., Maximilian Wimmer, and Markus Hirschberger. "Overviewing the transition of Markowitz bi-criterion portfolio selection to tri-criterion portfolio selection." Journal of Business Economics 83, no. 1 (January 19, 2013): 61–85. http://dx.doi.org/10.1007/s11573-012-0642-4.

11

Liu, Ming. "Construction of Supplier Selection Criterion System of Green Supply Chain Based on Information Gain Analysis." Advanced Materials Research 933 (May 2014): 980–83. http://dx.doi.org/10.4028/www.scientific.net/amr.933.980.

Abstract:
Excellent suppliers are the guarantee of smooth operation of a green supply chain, while a scientific green-supplier evaluation system is the foundation of supplier selection. The article analyzes the characteristics of the supplier selection criterion system in a green supply chain and lists the common evaluation criteria. To construct a more reasonable evaluation system, the method of information gain analysis is introduced, which makes it possible to select the criteria reasonably.
12

Yoshida, Makoto, Satoshi Ikeda, Takao Hinoi, Masanori Yoshimitu, Yuji Takakura, Daisuke Sumitani, Haruka Takeda, et al. "A Useful Simple Selection Criterion for HNPCC Patient Selection." Nippon Daicho Komonbyo Gakkai Zasshi 62, no. 5 (2009): 323–27. http://dx.doi.org/10.3862/jcoloproctology.62.323.

13

Kuiper, R. M., H. Hoijtink, and M. J. Silvapulle. "An Akaike-type information criterion for model selection under inequality constraints." Biometrika 98, no. 2 (April 22, 2011): 495–501. http://dx.doi.org/10.1093/biomet/asr002.

Abstract:
The Akaike information criterion for model selection presupposes that the parameter space is not subject to order restrictions or inequality constraints. Anraku (1999) proposed a modified version of this criterion, called the order-restricted information criterion, for model selection in the one-way analysis of variance model when the population means are monotonic. We propose a generalization of this to the case when the population means may be restricted by a mixture of linear equality and inequality constraints. If the model has no inequality constraints, then the generalized order-restricted information criterion coincides with the Akaike information criterion. Thus, the former extends the applicability of the latter to model selection in multi-way analysis of variance models when some models may have inequality constraints while others may not. Simulation shows that the information criterion proposed in this paper performs well in selecting the correct model.
14

Huang, HanChen. "Weight Analysis of Criterion and Sub-Criterion for Supplier Selection." Journal of Next Generation Information Technology 4, no. 5 (July 31, 2013): 55–62. http://dx.doi.org/10.4156/jnit.vol4.issue5.7.

15

Li, Wentian, and Dale R. Nyholt. "Marker Selection by Akaike Information Criterion and Bayesian Information Criterion." Genetic Epidemiology 21, S1 (2001): S272–S277. http://dx.doi.org/10.1002/gepi.2001.21.s1.s272.

16

Kaper, Bertrand P. "Surgical Skills as Resident-Selection Criterion." Journal of Bone and Joint Surgery-American Volume 85, no. 7 (July 2003): 1400. http://dx.doi.org/10.2106/00004623-200307000-00051.

17

Bernstein, Adam D., and Joseph D. Zuckerman. "Surgical Skills as Resident-Selection Criterion." Journal of Bone and Joint Surgery-American Volume 85, no. 7 (July 2003): 1400. http://dx.doi.org/10.2106/00004623-200307000-00052.

18

Schelokov, Ya M., V. G. Lisienko, Yu N. Chesnokov, and A. V. Lapteva. "A comprehensive criterion for BAT selection." Ferrous Metallurgy. Bulletin of Scientific, Technical and Economic Information 75, no. 12 (December 27, 2019): 1385–91. http://dx.doi.org/10.32339/0135-5910-2019-12-1385-1391.

19

Pham, Hoang. "A New Criterion for Model Selection." Mathematics 7, no. 12 (December 10, 2019): 1215. http://dx.doi.org/10.3390/math7121215.

Abstract:
Selecting the best model from a set of candidates for a given set of data is obviously not an easy task. In this paper, we propose a new criterion that takes into account a larger penalty when adding too many coefficients (or estimated parameters) in the model from too small a sample in the presence of too much noise, in addition to minimizing the sum of squares error. We discuss several real applications that illustrate the proposed criterion and compare its results to some existing criteria based on a simulated data set and some real datasets including advertising budget data, newly collected heart blood pressure health data sets and software failure data.
20

Philips, R., and I. Guttman. "A new criterion for variable selection." Statistics & Probability Letters 38, no. 1 (May 1998): 11–19. http://dx.doi.org/10.1016/s0167-7152(97)00148-x.

21

Sugiyama, Masashi, and Hidemitsu Ogawa. "Subspace Information Criterion for Model Selection." Neural Computation 13, no. 8 (August 1, 2001): 1863–89. http://dx.doi.org/10.1162/08997660152469387.

Abstract:
The problem of model selection is considerably important for acquiring higher levels of generalization capability in supervised learning. In this article, we propose a new criterion for model selection, the subspace information criterion (SIC), which is a generalization of Mallows's CL. It is assumed that the learning target function belongs to a specified functional Hilbert space and the generalization error is defined as the Hilbert space squared norm of the difference between the learning result function and target function. SIC gives an unbiased estimate of the generalization error so defined. SIC assumes the availability of an unbiased estimate of the target function and the noise covariance matrix, which are generally unknown. A practical calculation method of SIC for least-mean-squares learning is provided under the assumption that the dimension of the Hilbert space is less than the number of training examples. Finally, computer simulations in two examples show that SIC works well even when the number of training examples is small.
22

VanderWeele, Tyler J., and Ilya Shpitser. "A New Criterion for Confounder Selection." Biometrics 67, no. 4 (May 31, 2011): 1406–13. http://dx.doi.org/10.1111/j.1541-0420.2011.01619.x.

23

Yuan, A., and B. Clarke. "An information criterion for likelihood selection." IEEE Transactions on Information Theory 45, no. 2 (March 1999): 562–71. http://dx.doi.org/10.1109/18.749003.

24

Avlogiaris, G., A. C. Micheas, and K. Zografos. "A Criterion for Local Model Selection." Sankhya A 81, no. 2 (March 29, 2018): 406–44. http://dx.doi.org/10.1007/s13171-018-0126-x.

25

Ding, Yuanyao. "Portfolio Selection under Maximum Minimum Criterion." Quality & Quantity 40, no. 3 (June 2006): 457–68. http://dx.doi.org/10.1007/s11135-005-1054-0.

26

Gehrlein, William V. "The Condorcet criterion and committee selection." Mathematical Social Sciences 10, no. 3 (December 1985): 199–209. http://dx.doi.org/10.1016/0165-4896(85)90043-5.

27

Nguefack-Tsague, Georges, and Ingo Bulla. "A Focused Bayesian Information Criterion." Advances in Statistics 2014 (October 14, 2014): 1–8. http://dx.doi.org/10.1155/2014/504325.

Abstract:
Myriads of model selection criteria (Bayesian and frequentist) have been proposed in the literature aiming at selecting a single model regardless of its intended use. An honorable exception in the frequentist perspective is the “focused information criterion” (FIC) aiming at selecting a model based on the parameter of interest (focus). This paper takes the same view in the Bayesian context; that is, a model may be good for one estimand but bad for another. The proposed method exploits the Bayesian model averaging (BMA) machinery to obtain a new criterion, the focused Bayesian model averaging (FoBMA), for which the best model is the one whose estimate is closest to the BMA estimate. In particular, for two models, this criterion reduces to the classical Bayesian model selection scheme of choosing the model with the highest posterior probability. The new method is applied in linear regression, logistic regression, and survival analysis. This criterion is especially important in epidemiological studies in which the objective is often to determine a risk factor (focus) for a disease, adjusting for potential confounding factors.
28

Liu, Shi Hua, Xian Gang Liu, and Zhi Jian Sun. "A Skywave OTHR Adaptive Frequency Selection Method Based on Preliminary Criterion and Weighted Criterion." Advanced Materials Research 1061-1062 (December 2014): 974–77. http://dx.doi.org/10.4028/www.scientific.net/amr.1061-1062.974.

Abstract:
A skywave radar adaptive frequency selection method based on a preliminary criterion and a weighted criterion is presented. In this method, according to the various operational tasks, the frequency selection criterion is divided into a preliminary criterion and a weighted criterion based on the characteristics of the targets. Adaptive frequency selection for the skywave radar is achieved by weighted computation of the frequency selection criteria. The method's feasibility and availability are demonstrated by an example.
29

Zhang, Wenqiang, Xiaorun Li, and Liaoying Zhao. "Discovering the Representative Subset with Low Redundancy for Hyperspectral Feature Selection." Remote Sensing 11, no. 11 (June 4, 2019): 1341. http://dx.doi.org/10.3390/rs11111341.

Abstract:
In this paper, a novel unsupervised band selection (BS) criterion based on maximizing representativeness and minimizing redundancy (MRMR) is proposed for selecting a set of informative bands to represent the whole hyperspectral image cube. The new selection criterion is denoted as the MRMR selection criterion and the associated BS method is denoted as the MRMR method. The MRMR selection criterion can evaluate the band subset’s representativeness and redundancy simultaneously. For one band subset, its representativeness is estimated by using orthogonal projection (OP) and its redundancy is measured by the average of the Pearson correlation coefficients among the bands in this subset. To find the satisfactory subset, an effective evolutionary algorithm, i.e., the immune clone selection (ICS) algorithm, is applied as the subset searching strategy. Moreover, we further introduce two effective tricks to simplify the computation of the representativeness metric, thus the computational complexity of the proposed method is reduced significantly. Experimental results on different real-world datasets demonstrate that the proposed method is very effective and its selected bands can obtain good classification performances in practice.
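The redundancy half of the MRMR criterion described above (the average of pairwise Pearson correlations within a band subset) is easy to illustrate. The sketch below is ours, not code from the paper: the "bands" are toy vectors, and the representativeness term and the immune clone selection search are omitted.

```python
import itertools
import math

def pearson(x, y):
    # Sample Pearson correlation coefficient of two equal-length sequences.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(var_x * var_y)

def subset_redundancy(bands):
    # Average absolute pairwise correlation over all band pairs in the subset.
    pairs = list(itertools.combinations(bands, 2))
    return sum(abs(pearson(a, b)) for a, b in pairs) / len(pairs)

# Toy "bands" (pixel vectors); the first two are nearly collinear, so a
# subset containing both scores as more redundant.
b1 = [1.0, 2.0, 3.0, 4.0]
b2 = [1.1, 2.0, 3.1, 4.2]
b3 = [4.0, 1.0, 3.0, 2.0]
print(subset_redundancy([b1, b2]) > subset_redundancy([b1, b3]))  # True
```

In the full method this redundancy score would be traded off against an orthogonal-projection representativeness score when ranking candidate subsets.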
30

Veerkamp, Wim J. J., and Martijn P. F. Berger. "Some New Item Selection Criteria for Adaptive Testing." Journal of Educational and Behavioral Statistics 22, no. 2 (June 1997): 203–26. http://dx.doi.org/10.3102/10769986022002203.

Abstract:
In this study some alternative item selection criteria for adaptive testing are proposed. These criteria take into account the uncertainty of the ability estimates. A general weighted information criterion of which the usual maximum information criterion and the proposed alternative criteria are special cases is suggested. A small simulation study was conducted to compare the different criteria. The results showed that the likelihood weighted information criterion is a good alternative to the maximum information criterion. Another good alternative is a maximum information criterion with the maximum likelihood estimator of ability replaced by the Bayesian expected a posteriori estimator.
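The maximum information criterion that the alternatives above are compared against can be sketched for the standard two-parameter logistic (2PL) model, where item information is a²P(1 − P). This is our illustration of the baseline criterion only, with a made-up item bank; the weighted variants from the study are not reproduced.

```python
import math

def p_2pl(theta, a, b):
    # Two-parameter logistic item response function.
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    # Fisher information of a 2PL item: a^2 * P * (1 - P); peaks at theta = b.
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

def pick_item(theta_hat, bank):
    # Maximum information criterion: administer the item that is most
    # informative at the current ability estimate.
    return max(bank, key=lambda item: item_information(theta_hat, *item))

bank = [(1.0, -1.0), (1.2, 0.0), (0.8, 1.5)]  # (discrimination a, difficulty b)
print(pick_item(0.1, bank))  # → (1.2, 0.0)
```

The likelihood-weighted criteria discussed in the abstract would replace the point evaluation at theta_hat with an average of this information over plausible ability values.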
31

Takezawa, Kunio. "Flexible Model Selection Criterion for Multiple Regression." Open Journal of Statistics 2, no. 4 (2012): 401–7. http://dx.doi.org/10.4236/ojs.2012.24048.

32

Trottini, Mario, and Fulvio Spezzaferri. "A generalized predictive criterion for model selection." Canadian Journal of Statistics 30, no. 1 (March 2002): 79–96. http://dx.doi.org/10.2307/3315866.

33

Zubareva, V. D., A. S. Sarkisov, V. Yu Khatkov, A. A. Skubriy, and A. Kh Ozdoeva. "Criterion for selection of oilfield development alternatives." Neftyanoe khozyaystvo - Oil Industry, no. 8 (2019): 76–79. http://dx.doi.org/10.24887/0028-2448-2019-8-76-79.

34

Ghosh, Malay, Victor Mergel, and Ruitao Liu. "A general divergence criterion for prior selection." Annals of the Institute of Statistical Mathematics 63, no. 1 (March 10, 2009): 43–58. http://dx.doi.org/10.1007/s10463-009-0226-4.

35

Rouzi, Abdulrahim A., and Peter F. McComb. "A new selection criterion for fimbriectomy reversal." Fertility and Sterility 64, no. 1 (July 1995): 185–86. http://dx.doi.org/10.1016/s0015-0282(16)57677-8.

36

Fang, Ning, and Ming-Chieh Cheng. "On Threshold Selection Using Fuzzy Risk Criterion." Japanese Journal of Applied Physics 31, Part 1, No. 5A (May 15, 1992): 1382–88. http://dx.doi.org/10.1143/jjap.31.1382.

37

van de Straete, H. J., P. Degezelle, J. De Schutter, and R. J. M. Belmans. "Servo motor selection criterion for mechatronic applications." IEEE/ASME Transactions on Mechatronics 3, no. 1 (March 1998): 43–50. http://dx.doi.org/10.1109/3516.662867.

38

Bab-Hadiashar, A., and N. Gheissari. "Range image segmentation using surface selection criterion." IEEE Transactions on Image Processing 15, no. 7 (July 2006): 2006–18. http://dx.doi.org/10.1109/tip.2006.877064.

39

Tran, Minh-Ngoc. "A Criterion for Optimal Predictive Model Selection." Communications in Statistics - Theory and Methods 40, no. 5 (February 8, 2011): 893–906. http://dx.doi.org/10.1080/03610920903486798.

40

Chung, Han-Yeong, Kee-Won Lee, and Ja-Yong Koo. "A note on bootstrap model selection criterion." Statistics & Probability Letters 26, no. 1 (January 1996): 35–41. http://dx.doi.org/10.1016/0167-7152(94)00249-5.

41

Takano, Yuichi, and Ryuhei Miyashiro. "Best subset selection via cross-validation criterion." TOP 28, no. 2 (February 14, 2020): 475–88. http://dx.doi.org/10.1007/s11750-020-00538-1.

42

Lan, Wei, Hansheng Wang, and Chih-Ling Tsai. "A Bayesian information criterion for portfolio selection." Computational Statistics & Data Analysis 56, no. 1 (January 2012): 88–99. http://dx.doi.org/10.1016/j.csda.2011.06.012.

43

Agus Ristono, Tri Wahyuningsih, and Eko Junianto. "Proposed Method for Supplier Selection." Technium Social Sciences Journal 13 (November 3, 2020): 376–94. http://dx.doi.org/10.47577/tssj.v13i1.1955.

Abstract:
The use of the Analytical Hierarchy Process (AHP) is frequent in supplier selection. AHP rests on pairwise comparisons between criteria; if the comparisons are inconsistent, the result is invalid, and the comparison process must be repeated until valid results are obtained. This process takes time and money, so it is considered inefficient. This research proposes applying a Hamiltonian chain to the pairwise comparison matrix. Each criterion is represented as a node, and each arc carries the pairwise comparison value between the two connected criteria. In the network model of the AHP method, every node is connected to all other nodes without exception, whereas in the proposed method each criterion, or node, is compared only once, so inconsistency can be avoided. The consistency ratio of the proposed method is found to be consistent.
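The consistency check that motivates this paper is Saaty's standard consistency ratio, CR = CI / RI with CI = (λmax − n)/(n − 1). A minimal sketch of that baseline check follows; the matrix and the column-averaging eigenvector approximation are our illustration, not the paper's Hamiltonian-chain method.

```python
def consistency_ratio(matrix):
    # Saaty's consistency ratio CR = CI / RI for a pairwise-comparison matrix;
    # CR below about 0.1 is conventionally treated as consistent.
    n = len(matrix)
    # Approximate the priority vector by averaging the normalised columns.
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    weights = [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n
               for i in range(n)]
    # Estimate the principal eigenvalue lambda_max from A @ w.
    aw = [sum(matrix[i][j] * weights[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / weights[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    random_index = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}  # Saaty's RI
    return ci / random_index[n]

# A perfectly consistent 3x3 matrix (criterion weights in ratio 4:2:1) gives CR = 0.
perfect = [[1.0, 2.0, 4.0], [0.5, 1.0, 2.0], [0.25, 0.5, 1.0]]
print(consistency_ratio(perfect))
```

A reciprocal matrix whose judgements do not multiply through (e.g. a12·a23 ≠ a13) yields a strictly positive CR, which is the repeated-revision cost the proposed method tries to avoid.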
44

Veres, S. M., and J. P. Norton. "Structure selection for bounded-parameter models: consistency conditions and selection criterion." IEEE Transactions on Automatic Control 36, no. 4 (April 1991): 474–81. http://dx.doi.org/10.1109/9.75105.

45

Slavyanov, Krasimir. "AN ALGORITHM OF FUZZY INFERENCE SYSTEM FOR HUMAN RESOURCES SELECTION TOOLS." SOCIETY. INTEGRATION. EDUCATION. Proceedings of the International Scientific Conference 5 (May 25, 2018): 445–54. http://dx.doi.org/10.17770/sie2018vol1.3311.

Abstract:
This article offers an original Human Resources selection procedure based on a Mamdani fuzzy inference system (FIS) dedicated to computing multiple results, each from a different type of analysis criterion. The modeling and information analysis of the FIS are developed to draw a general conclusion from several results, each produced by a basic Human Resources selection criterion. Simulation experiments are carried out in the MATLAB environment.
46

Chowdhury, Mohammad Ziaul Islam, and Tanvir C. Turin. "Variable selection strategies and its importance in clinical prediction modelling." Family Medicine and Community Health 8, no. 1 (February 2020): e000262. http://dx.doi.org/10.1136/fmch-2019-000262.

Abstract:
Clinical prediction models are used frequently in clinical practice to identify patients who are at risk of developing an adverse outcome so that preventive measures can be initiated. A prediction model can be developed in a number of ways; however, an appropriate variable selection strategy needs to be followed in all cases. Our purpose is to introduce readers to the concept of variable selection in prediction modelling, including the importance of variable selection and variable reduction strategies. We will discuss the various variable selection techniques that can be applied during prediction model building (backward elimination, forward selection, stepwise selection and all possible subset selection), and the stopping rule/selection criteria in variable selection (p values, Akaike information criterion, Bayesian information criterion and Mallows’ Cp statistic). This paper focuses on the importance of including appropriate variables, following the proper steps, and adopting the proper methods when selecting variables for prediction models.
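The AIC and BIC stopping rules named in the abstract above reduce to two one-line formulas. The sketch below is an informal illustration with made-up log-likelihoods, not code or data from the paper:

```python
import math

def aic(log_likelihood, k):
    # Akaike information criterion: -2 log L + 2k, k = number of fitted parameters.
    return -2.0 * log_likelihood + 2.0 * k

def bic(log_likelihood, k, n):
    # Bayesian information criterion: -2 log L + k ln(n), n = sample size.
    return -2.0 * log_likelihood + k * math.log(n)

# Toy stopping rule for forward selection: each step adds one variable and
# refits; the step with the smallest criterion value is retained.
steps = [(-120.0, 2), (-110.0, 3), (-108.9, 4)]  # (log-likelihood, parameters)
scores = [bic(ll, k, n=150) for ll, k in steps]
chosen = scores.index(min(scores))
print(chosen)  # → 1: the third variable's small fit gain does not pay its penalty
```

Because BIC's ln(n) penalty exceeds AIC's 2 once n > 7, BIC-guided selection tends to stop earlier and pick sparser prediction models.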
47

Ostrovskaya, N. V., and D. E. Bondarev. "Criteria for damping parameters optimization in seismic isolated structures." Вестник гражданских инженеров 17, no. 5 (2020): 94–100. http://dx.doi.org/10.23968/1999-5571-2020-17-5-94-100.

Abstract:
All kinds of seismic isolation systems are widely used to protect various structures and buildings, including unique ones, against earthquakes. The efficiency of such systems depends significantly on the competent selection of damping parameters. The article considers a general approach to selecting optimal damping parameters, both by the criterion of absolute accelerations and by the kinematic criterion. An optimization criterion for unique structures is also proposed.
48

Sen, Sedat, Allan S. Cohen, and Seock-Ho Kim. "Model Selection for Multilevel Mixture Rasch Models." Applied Psychological Measurement 43, no. 4 (June 7, 2018): 272–89. http://dx.doi.org/10.1177/0146621618779990.

Abstract:
Mixture item response theory (MixIRT) models can sometimes be used to model the heterogeneity among the individuals from different subpopulations, but these models do not account for the multilevel structure that is common in educational and psychological data. Multilevel extensions of the MixIRT models have been proposed to address this shortcoming. Successful applications of multilevel MixIRT models depend in part on detection of the best fitting model. In this study, performance of information indices, Akaike information criterion (AIC), Bayesian information criterion (BIC), consistent Akaike information criterion (CAIC), and sample-size adjusted Bayesian information criterion (SABIC), were compared for use in model selection with a two-level mixture Rasch model in the context of a real data example and a simulation study. Level 1 consisted of students and Level 2 consisted of schools. The performances of the model selection criteria under different sample sizes were investigated in a simulation study. Total sample size (number of students) and Level 2 sample size (number of schools) were studied for calculation of information criterion indices to examine the performance of these fit indices. Simulation study results indicated that CAIC and BIC performed better than the other indices at detection of the true (i.e., generating) model. Furthermore, information indices based on total sample size yielded more accurate detections than indices at Level 2.
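The four indices compared in the study above differ only in their penalty terms, so they can be written side by side. This sketch uses the standard formulas (SABIC with Sclove's (n + 2)/24 adjustment, CAIC adding 1 to the log-n penalty); the candidate models and numbers are hypothetical, not the study's simulation conditions:

```python
import math

def information_indices(log_likelihood, k, n):
    # AIC, BIC, CAIC, and sample-size-adjusted BIC (smaller is better).
    neg2ll = -2.0 * log_likelihood
    return {
        "AIC": neg2ll + 2.0 * k,
        "BIC": neg2ll + k * math.log(n),
        "CAIC": neg2ll + k * (math.log(n) + 1.0),
        "SABIC": neg2ll + k * math.log((n + 2.0) / 24.0),
    }

# Hypothetical mixture-Rasch candidates: (log-likelihood, parameter count).
candidates = {"2-class": (-1520.4, 9), "3-class": (-1501.7, 14)}
n_students = 600  # total sample size, as recommended over the Level 2 count
best = min(candidates,
           key=lambda m: information_indices(*candidates[m], n=n_students)["BIC"])
print(best)
```

The choice of n in a two-level model is exactly the issue the study examines: the same formulas scored with the number of schools instead of the number of students can rank the candidates differently.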
49

ABELLÁN, JOAQUÍN, and ANDRÉS R. MASEGOSA. "A FILTER-WRAPPER METHOD TO SELECT VARIABLES FOR THE NAIVE BAYES CLASSIFIER BASED ON CREDAL DECISION TREES." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 17, no. 06 (December 2009): 833–54. http://dx.doi.org/10.1142/s0218488509006297.

Abstract:
Variable selection methods play an important role in the field of attribute mining. In the last few years, several feature selection methods have appeared showing that the use of a set of decision trees learnt from a database can be a useful tool for selecting relevant and informative variables regarding a main class variable. With the Naive Bayes classifier as reference, in this article, our aims are twofold: (1) to study what split criterion has better performance when a complete decision tree is used to select variables; and (2) to present a filter-wrapper selection method using decision trees built with the best possible split criterion obtained in (1).
50

Helu, S. Langitoto, David B. Sampson, and Yanshui Yin. "Application of statistical model selection criteria to the Stock Synthesis assessment program." Canadian Journal of Fisheries and Aquatic Sciences 57, no. 9 (September 1, 2000): 1784–93. http://dx.doi.org/10.1139/f00-137.

Abstract:
Statistical modeling involves building sufficiently complex models to represent the system being investigated. Overly complex models lead to imprecise parameter estimates, increase the subjective role of the modeler, and can distort the perceived characteristics of the system under investigation. One approach for controlling the tendency to increased complexity and subjectivity is to use model selection criteria that account for these factors. The effectiveness of two selection criteria was tested in an application with the stock assessment program known as Stock Synthesis. This program, which is often used on the U.S. west coast to assess the status of exploited marine fish stocks, can handle multiple data sets and mimic highly complex population dynamics. The Akaike information criterion and Schwarz's Bayesian information criterion are criteria that satisfy the fundamental principles of model selection: goodness-of-fit, parsimony, and objectivity. Their ability to select the correct model form and produce accurate estimates was evaluated in Monte Carlo experiments with the Stock Synthesis program. In general, the Akaike information criterion and the Bayesian information criterion had similar performance in selecting the correct model, and they produced comparable levels of accuracy in their estimates of ending stock biomass.