Journal articles on the topic 'Variable sample size methods'

Consult the top 50 journal articles for your research on the topic 'Variable sample size methods.'

1

Krejić, Nataša, and Nataša Krklec Jerinkić. "Nonmonotone line search methods with variable sample size." Numerical Algorithms 68, no. 4 (2014): 711–39. http://dx.doi.org/10.1007/s11075-014-9869-1.

2

Krejić, Nataša, and Nataša Krklec. "Line search methods with variable sample size for unconstrained optimization." Journal of Computational and Applied Mathematics 245 (June 2013): 213–31. http://dx.doi.org/10.1016/j.cam.2012.12.020.

3

Kitikidou, K., and G. Chatzilazarou. "Estimating the sample size for fitting taper equations." Journal of Forest Science 54, No. 4 (2008): 176–82. http://dx.doi.org/10.17221/789-jfs.

Abstract:
Much work has been done fitting taper equations to describe tree bole shapes, but few researchers have investigated how large the sample size should be. In this paper, a method that requires two linearly correlated variables was applied to determine the sample size for fitting taper equations. Two cases of sample size estimation were tested with this method: in the first, the required sample size refers to the total number of diameters estimated in the sampled trees; in the second, it refers to the number of sampled trees. The analysis showed that both approaches are valid, but the first has the advantage of lower cost, since sampling an additional tree costs much more than making another diameter measurement on an already sampled tree.
4

Ali, Sabz, Amjad Ali, Sajjad Ahmad Khan, and Sundas Hussain. "Sufficient Sample Size and Power in Multilevel Ordinal Logistic Regression Models." Computational and Mathematical Methods in Medicine 2016 (2016): 1–8. http://dx.doi.org/10.1155/2016/7329158.

Abstract:
Biomedical researchers often deal with ordinal outcome variables in multilevel models in which patients are nested within doctors. A multilevel cumulative logit model can justifiably be applied when the outcome variable represents the mild, severe, and extremely severe intensity of diseases such as malaria and typhoid as ordered categories. Under our simulation conditions, the Maximum Likelihood (ML) method is better than the Penalized Quasi-likelihood (PQL) method for a three-category ordinal outcome variable; PQL, however, performs as well as ML when a five-category ordinal outcome variable is used. Further, achieving power above 0.80 requires at least 50 groups for both the ML and PQL methods of estimation. It may be noted that, for the five-category ordinal response model, the power of PQL is slightly higher than that of ML.
5

Vinogradov, A. G. "USING R FOR PSYCHOLOGICAL RESEARCH: A TUTORIAL OF BASIC METHODS." Ukrainian Psychological Journal, no. 2 (14) (2020): 28–63. http://dx.doi.org/10.17721/upj.2020.2(14).2.

Abstract:
The article belongs to a modern genre of scholarly publication, the tutorial: articles that present the latest methods of design, modeling, or analysis in an accessible format in order to disseminate best practices. It acquaints Ukrainian psychologists with the basics of using the R programming language for the analysis of empirical research data. The article discusses the current state of world psychology in connection with the crisis of confidence that arose from the low reproducibility of empirical research, a problem caused by the poor quality of psychological measurement tools, insufficient attention to adequate sample planning, typical statistical hypothesis-testing practices, and so-called “questionable research practices.” The tutorial demonstrates methods for determining sample size from the expected effect size and desired statistical power, and for performing basic variable transformations and statistical analysis of psychological research data using the R language and environment. It presents the minimal set of R functions required to carry out modern reliability analysis of measurement scales, sample size calculation, and point and interval estimation of effect size for the four designs most widespread in psychology for analyzing the interdependence of two variables. These typical problems include finding differences between means and variances in two or more samples, and correlations between continuous and categorical variables. Practical information is provided on data preparation, import, basic transformations, and the application of basic statistical methods in the cloud version of RStudio.
6

Zhao, Naifei, Qingsong Xu, Man-lai Tang, and Hong Wang. "Variable Screening for Near Infrared (NIR) Spectroscopy Data Based on Ridge Partial Least Squares Regression." Combinatorial Chemistry & High Throughput Screening 23, no. 8 (2020): 740–56. http://dx.doi.org/10.2174/1386207323666200428114823.

Abstract:
Aim and Objective: Near Infrared (NIR) spectroscopy data feature from a few dozen to many thousands of samples and highly correlated variables. Quantitative analysis of such data usually requires combining analytical methods with variable selection or screening methods. Commonly used variable screening methods fail to recover the true model when (i) some of the variables are highly correlated, and (ii) the sample size is less than the number of relevant variables. In these cases, Partial Least Squares (PLS) regression based approaches can be useful alternatives. Materials and Methods: In this research, a fast variable screening strategy, namely preconditioned screening for ridge partial least squares regression (PSRPLS), is proposed for modelling NIR spectroscopy data with high-dimensional and highly correlated covariates. Under rather mild assumptions, we prove that using the Puffer transformation, the proposed approach transforms the problem of variable screening with highly correlated predictor variables into one with weakly correlated covariates at little extra computational cost. Results: We show that the proposed method leads to theoretically consistent model selection results. Four simulation studies and two real examples are then analyzed to illustrate the effectiveness of the proposed approach. Conclusion: By introducing the Puffer transformation, the high-correlation problem can be mitigated by the PSRPLS procedure we construct. By employing RPLS regression, the approach becomes simpler and more computationally efficient in the situation where the model size is larger than the sample size, while maintaining high prediction precision.
7

Endrenyi, Laszlo, and Laszlo Tothfalusi. "Sample Sizes for Designing Bioequivalence Studies for Highly Variable Drugs." Journal of Pharmacy & Pharmaceutical Sciences 15, no. 1 (2011): 73. http://dx.doi.org/10.18433/j3z88f.

Abstract:
Purpose. To provide tables of the sample sizes required by the European Medicines Agency (EMA) and the U.S. Food and Drug Administration (FDA) for the design of bioequivalence (BE) studies involving highly variable drugs, and to elucidate the complicated relationship between sample size and within-subject variation. Methods. Three- and four-period studies were simulated with various sample sizes. They were evaluated, at various variations and various true ratios of the two geometric means (GMR), by the approaches of scaled average BE and of average BE with expanding limits. The sample sizes required to yield 80% and 90% statistical power were determined. Results. Because of the complicated regulatory expectations, the behaviour of the required sample sizes is also complicated. When the true GMR = 1.0, then without additional constraints the sample size is independent of the intrasubject variation. When the true GMR is increased or decreased from 1.0, the required sample sizes rise at variations above but close to 30%. An additional regulatory constraint on the point estimate of GMR and a cap on the use of expanding limits further increase the required sample size at high variations. Fewer subjects are required by the FDA than by the EMA procedures. Conclusions. The methods proposed by EMA and FDA lower the required sample sizes in comparison with unscaled average BE. However, each additional regulatory requirement (applying the mixed procedure, imposing a constraint on the point estimate of GMR, and using a cap on the application of expanding limits) raises the required number of subjects.
 
8

Van Delden, Arnout, Bart J. Du Chatinier, and Sander Scholtus. "Accuracy in the Application of Statistical Matching Methods for Continuous Variables Using Auxiliary Data." Journal of Survey Statistics and Methodology 8, no. 5 (2019): 990–1017. http://dx.doi.org/10.1093/jssam/smz032.

Abstract:
Statistical matching is a technique to combine variables in two or more nonoverlapping samples that are drawn from the same population. In the current study, the unobserved joint distribution between two target variables in nonoverlapping samples is estimated using a parametric model. A classical assumption to estimate this joint distribution is that the target variables are independent given the background variables observed in both samples. A problem with the use of this conditional independence assumption is that the estimated joint distribution may be severely biased when the assumption does not hold, which in general will be unacceptable for official statistics. Here, we explored to what extent the accuracy can be improved by the use of two types of auxiliary information: the use of a common administrative variable and the use of a small additional sample from a similar population. This additional sample is included by using the partial correlation of the target variables given the background variables or by using an EM algorithm. In total, four different approaches were compared to estimate the joint distribution of the target variables. Starting with empirical data, we show how the accuracy of the joint distribution is affected by the use of administrative data and by the size of the additional sample included via a partial correlation and through an EM algorithm. The study further shows how this accuracy depends on the strength of the relations among the target and auxiliary variables. We found that including a common administrative variable does not always improve the accuracy of the results. We further found that the EM algorithm nearly always yielded the most accurate results; this effect is largest when the explained variance of the separate target variables by the common background variables is not large.
9

Hussain, Sarfraz, Abdul Quddus, Pham Phat Tien, Muhammad Rafiq, and Drahomíra Pavelková. "The moderating role of firm size and interest rate in capital structure of the firms: selected sample from sugar sector of Pakistan." Investment Management and Financial Innovations 17, no. 4 (2020): 341–55. http://dx.doi.org/10.21511/imfi.17(4).2020.29.

Abstract:
The selection of financing is a top priority for businesses, particularly in short- and long-term investment decisions. Mixing debt and equity leads to decisions on the financial structure of businesses. This research analyzes the moderating role of company size and the interest rate in the capital structure of 29 listed Pakistani enterprises operating in the sugar market over six years (2013–2018). It employs static and dynamic panel analysis with linear and nonlinear regression methods. The dependent variable is the capital structure, measured as the debt-to-capital ratio (non-current plus current liabilities over capital). Independent variables are profitability, firm size, tangibility, Non-Debt Tax Shield, and liquidity; the macroeconomic variables are exchange rates and interest rates. The investigation found that profitability, firm size, and Non-Debt Tax Shield had significant negative effects, while tangibility and interest rates had significant positive effects on the debt-to-capital ratio. This means the sugar sector uses greater financial leverage to manage funding obligations for better firm performance. The outcomes also revealed that the moderators have an important influence on capital structure.
10

Xu, Zhou, Xiaojing Chen, Liuwei Meng, Mingen Yu, Limin Li, and Wen Shi. "Sample Consensus Model and Unsupervised Variable Consensus Model for Improving the Accuracy of a Calibration Model." Applied Spectroscopy 73, no. 7 (2019): 747–58. http://dx.doi.org/10.1177/0003702819852174.

Abstract:
In the quantitative analysis of spectral data, small sample size and high dimensionality of spectral variables often lead to poor accuracy of a calibration model. We proposed two methods, namely sample consensus and unsupervised variable consensus models, to solve this problem of poor accuracy. Three public near-infrared (NIR) or infrared (IR) spectroscopy data sets from corn, wine, and soil were used to build partial least squares regression (PLSR) models. Monte Carlo sampling and the unsupervised variable clustering of a self-organizing map were then coupled with the consensus modeling strategy to establish multiple sub-models. Finally, the sample consensus and unsupervised variable consensus models were obtained by assigning weights to each PLSR sub-model. The calculated results show that both consensus models significantly improve the accuracy of the calibration model compared to a single PLSR model. The effectiveness of these two methods suggests a new way to achieve more accurate results by taking full advantage of the sample information and the valid variable information.
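The consensus strategy described above, namely many sub-models fitted on Monte Carlo subsamples and combined with performance-based weights, can be sketched in miniature. This is our own toy version: simple least-squares sub-models stand in for the paper's PLSR, and the function names and inverse-MSE weighting rule are illustrative assumptions, not the authors' exact algorithm.

```python
import random
from statistics import fmean

def fit_line(xs, ys):
    """Least-squares intercept and slope for a single sub-model."""
    mx, my = fmean(xs), fmean(ys)
    b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    return my - b1 * mx, b1

def consensus_predict(xs, ys, x_new, n_models=20, seed=3):
    """Fit sub-models on Monte Carlo subsamples, weight each by the
    inverse of its out-of-subsample MSE, and average the predictions."""
    rng = random.Random(seed)
    n = len(xs)
    k = max(3, n // 2)                       # subsample size
    preds, weights = [], []
    for _ in range(n_models):
        sub = rng.sample(range(n), k)        # Monte Carlo subsample
        out = [i for i in range(n) if i not in sub]
        b0, b1 = fit_line([xs[i] for i in sub], [ys[i] for i in sub])
        mse = fmean((ys[i] - (b0 + b1 * xs[i])) ** 2 for i in out)
        preds.append(b0 + b1 * x_new)
        weights.append(1.0 / (mse + 1e-9))   # better sub-models count more
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, preds)) / total
```

The same weighting scheme carries over unchanged when the sub-models are PLSR fits on spectral subsamples rather than straight lines.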
11

Kim, Seongho, Elisabeth Heath, and Lance Heilbrun. "Sample size determination for logistic regression on a logit-normal distribution." Statistical Methods in Medical Research 26, no. 3 (2015): 1237–47. http://dx.doi.org/10.1177/0962280215572407.

Abstract:
Although the sample size for simple logistic regression can be readily determined using currently available methods, the sample size calculation for multiple logistic regression requires some additional information, such as the coefficient of determination (R2) of a covariate of interest with the other covariates, which is often unavailable in practice. The response variable of logistic regression follows a logit-normal distribution, which can be generated from a logistic transformation of a normal distribution. Using this property of logistic regression, we propose new methods of determining the sample size for simple and multiple logistic regressions using a normal transformation of outcome measures. Simulation studies and a motivating example show several advantages of the proposed methods over existing methods: (i) no need for R2 for multiple logistic regression, (ii) available interim or group-sequential designs, and (iii) much smaller required sample sizes.
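When closed-form sample size formulas are unavailable, required sample sizes for a simple logistic model can always be bracketed by Monte Carlo power simulation. The sketch below is a generic illustration of that idea, not the authors' logit-normal method: it fits the two-parameter model by Newton-Raphson and counts Wald-test rejections, and all function names and defaults are our own assumptions.

```python
import math
import random

def fit_simple_logistic(xs, ys, iters=25):
    """MLE of (b0, b1) for P(y=1|x) = 1/(1+exp(-(b0+b1*x))) by
    Newton-Raphson; returns b1 and its Wald standard error."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for x, y in zip(xs, ys):
            eta = max(-30.0, min(30.0, b0 + b1 * x))   # guard overflow
            p = 1.0 / (1.0 + math.exp(-eta))
            w = p * (1.0 - p)
            g0 += y - p                                 # score vector
            g1 += (y - p) * x
            h00 += w; h01 += w * x; h11 += w * x * x    # information matrix
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det               # Newton step
        b1 += (h00 * g1 - h01 * g0) / det
    return b1, math.sqrt(h00 / det)      # se(b1) from inverse information

def wald_power(n, beta1, sims=300, crit=1.96, seed=7):
    """Fraction of simulated datasets (x ~ N(0,1), true intercept 0)
    in which the Wald test rejects H0: beta1 = 0."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(sims):
        xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
        ys = [1 if rng.random() < 1.0 / (1.0 + math.exp(-beta1 * x)) else 0
              for x in xs]
        b1, se = fit_simple_logistic(xs, ys)
        hits += abs(b1 / se) > crit
    return hits / sims
```

Scanning `wald_power` over a grid of `n` values then gives the smallest sample size whose simulated power clears the target (e.g. 0.80), which is exactly the role the closed-form methods in the paper play analytically.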
12

Boef, Anna G. C., Olaf M. Dekkers, Jan P. Vandenbroucke, and Saskia le Cessie. "Sample size importantly limits the usefulness of instrumental variable methods, depending on instrument strength and level of confounding." Journal of Clinical Epidemiology 67, no. 11 (2014): 1258–64. http://dx.doi.org/10.1016/j.jclinepi.2014.05.019.

13

Inayati, Titik. "STRATEGI MANAJEMEN SDM, ORIENTASI PASAR, DAN KINERJA UKM." Jurnal Organisasi dan Manajemen 14, no. 2 (2018): 120–31. http://dx.doi.org/10.33830/jom.v14i2.157.2018.

Abstract:
This study analyzes Strategic Human Resource Management (SHRM) and market orientation in relation to business performance, with innovation as an intervening variable, among SMEs. It uses primary and secondary data with descriptive quantitative methods, and the analysis uses the ANOVA test. The population was all leather-shoe SMEs in Mojokerto, East Java Province, with a sample of 34 shoe-product SMEs selected on specific criteria. The results show that SHRM and market orientation have a significant positive effect on business performance with innovation as an intervening variable. Innovation is not the only mediator of the relationship of SHRM and market orientation with business performance; other mediating variables exist.
14

Ximénez, Carmen. "Effect of Variable and Subject Sampling on Recovery of Weak Factors In CFA." Methodology 3, no. 2 (2007): 67–80. http://dx.doi.org/10.1027/1614-2241.3.2.67.

Abstract:
Two general issues central to the design of a study are subject sampling and variable sampling. Previous research has examined their effects on factor pattern recovery in the context of exploratory factor analysis. The present paper focuses on recovery of weak factors and reports two simulation studies in the context of confirmatory factor analysis. Conditions investigated include the estimation method (ML vs. ULS), sample size (100, 300, and 500), number of variables per factor (3, 4, or 5), loading size in the weak factor (.25 or .35), and factor correlation (null vs. moderate). Results show that both subject and variable sample size affect the recovery of weak factors, particularly if factors are not correlated. A small but consistent pattern of differences between methods occurs, which favors the use of ULS. Additionally, the frequency of nonconvergent and improper solutions is also affected by the same variables.
15

Lambert, Tim. "Brief Observations on Research Methods in Clinical Trials." Australasian Psychiatry 5, no. 2 (1997): 66–69. http://dx.doi.org/10.3109/10398569709082094.

Abstract:
This paper will address some of the current issues concerning the conduct of phase III clinical trials. The issues to be discussed fall into two areas – trial design and mensuration of target variable change. The former includes the determination of sample size; selection bias in sample populations due to particular inclusion and exclusion criteria; ensuring trials are randomised, double blind controlled studies; and designing the trial so that the two arms are balanced in every way possible, with especial reference to matching treatment variables. The mensuration of target variable change includes the selection of appropriate rating instruments which are (i) sensitive to target symptoms, (ii) have been adequately profiled psychometrically in a similar population, and (iii) are able to detect change within the time frame of the study. Instruments should be cross-culturally robust if used in multi-country studies and should have validated local versions where possible. Adequate training in the use of the rating instruments should be clearly reported and a clear description of inter-rater reliability assessment and maintenance should be provided. Brief mention will be made of the Cochrane collaboration and the role of evidence-based medicine in clinical trials.
16

Guo, Beibei, and Ying Yuan. "A comparative review of methods for comparing means using partially paired data." Statistical Methods in Medical Research 26, no. 3 (2015): 1323–40. http://dx.doi.org/10.1177/0962280215577111.

Abstract:
In medical experiments with the objective of testing the equality of two means, data are often partially paired by design or because of missing data. The partially paired data represent a combination of paired and unpaired observations. In this article, we review and compare nine methods for analyzing partially paired data, including the two-sample t-test, paired t-test, corrected z-test, weighted t-test, pooled t-test, optimal pooled t-test, multiple imputation method, mixed model approach, and the test based on a modified maximum likelihood estimate. We compare the performance of these methods through extensive simulation studies that cover a wide range of scenarios with different effect sizes, sample sizes, and correlations between the paired variables, as well as true underlying distributions. The simulation results suggest that when the sample size is moderate, the test based on the modified maximum likelihood estimator is generally superior to the other approaches when the data is normally distributed and the optimal pooled t-test performs the best when the data is not normally distributed, with well-controlled type I error rates and high statistical power; when the sample size is small, the optimal pooled t-test is to be recommended when both variables have missing data and the paired t-test is to be recommended when only one variable has missing data.
17

Sulistyawati, Ardiani Ika, Iin Indri Lestari, and Aprih Santoso. "FAKTOR-FAKTOR YANG MEMPENGARUHI PEMILIHAN METODE PERSEDIAAN (Studi Empiris Perusahaan Manufaktur di Bursa Efek Indonesia)." Adbis: Jurnal Administrasi dan Bisnis 14, no. 1 (2020): 53. http://dx.doi.org/10.33795/j-adbis.v14i1.88.

Abstract:
The purpose of this research is to analyze the selection of inventory accounting methods and the factors that influence that choice. The research examines four independent variables: inventory variability, company size, current ratio, and inventory turnover ratio. The dependent variable is the choice between the FIFO and average methods. The population is 144 manufacturing companies listed on the Indonesia Stock Exchange in 2014-2017. Samples were selected by purposive sampling, i.e., on predetermined criteria, yielding 48 companies over the period 2014-2017. The study takes a quantitative approach; the analysis technique is logistic regression, using SPSS version 22.
The results indicate that inventory variability and company size do not significantly influence the selection of inventory accounting methods, while the current ratio and inventory turnover ratio do.
18

Yang, Ziheng. "Statistical Properties of a DNA Sample Under the Finite-Sites Model." Genetics 144, no. 4 (1996): 1941–50. http://dx.doi.org/10.1093/genetics/144.4.1941.

Abstract:
Statistical properties of a DNA sample from a random-mating population of constant size are studied under the finite-sites model. It is assumed that there is no migration and no recombination occurs within the locus. A Markov process model is used for nucleotide substitution, allowing for multiple substitutions at a single site. The evolutionary rates among sites are treated as either constant or variable. The general likelihood calculation using numerical integration involves intensive computation and is feasible for three or four sequences only; it may be used for validating approximate algorithms. Methods are developed to approximate the probability distribution of the number of segregating sites in a random sample of n sequences, with either constant or variable substitution rates across sites. Calculations using parameter estimates obtained for human D-loop mitochondrial DNAs show that among-site rate variation has a major effect on the distribution of the number of segregating sites; the distribution under the finite-sites model with variable rates among sites is quite different from that under the infinite-sites model.
19

Çankaya, Soner, and Samet Hasan Abacı. "A Comparative Study of Some Estimation Methods in Simple Linear Regression Model for Different Sample Sizes in Presence of Outliers." Turkish Journal of Agriculture - Food Science and Technology 3, no. 6 (2015): 380. http://dx.doi.org/10.24925/turjaf.v3i6.380-386.304.

Abstract:
The aim of this study was to compare estimation methods (LS, M, S, LTS, and MM) for the parameters of a simple linear regression model in the presence of outliers and for different sample sizes (10, 20, 30, 50, and 100). To compare the methods, the effect of chest girth on the body weight of Karayaka lambs at weaning was examined, with chest girth as the independent variable and weaning body weight as the dependent variable. The data sets contained 10-20% outliers at each sample size. Mean square error (MSE) and the coefficient of determination (R2) were used as criteria to evaluate estimator performance. The findings showed that the LTS estimator gave the best models, with minimum MSE and maximum R2, across sample sizes in the presence of outliers. The LTS method can therefore be proposed to researchers working on small ruminants as an alternative way to estimate regression parameters in the presence of outliers at different sample sizes, and to predict the best-fitting model for the relationship between chest girth and weaning body weight of Karayaka lambs.
20

Iles, Kim, and David Hugh Harrison Carter. "“Distance-variable” estimators for sampling and change measurement." Canadian Journal of Forest Research 37, no. 9 (2007): 1669–74. http://dx.doi.org/10.1139/x07-029.

Abstract:
The estimation procedure described is a simple technique that is applicable to virtually any plot-based sampling method and virtually any measured variable. It can be retrofitted to any existing fixed or variable plot over time by simply knowing the distance from the sampled object to the sample point. These estimators are illustrated for sampling over time as the plot size changes. An example is variable-plot sampling in forestry. Traditional estimates from sample plots can be geometrically viewed as a series of “disc shapes” where the same estimate is used for an object no matter how near the sample point is to that selected object. “Distance-variable” (DV) or “shaped” estimators have the same average value over the plot area, with some very important advantages. We believe that the DV estimate will be shown to reduce the variance of growth measurement compared with simple difference estimators. Traditional “disc” estimators are a special case of the more general DV estimators. There are no difficulties with the use of current edge-effect correction techniques, and the calculation of statistics is virtually identical to traditional methods.
21

Dalposso, Gustavo H., Miguel A. Uribe-Opazo, and Jerry A. Johann. "Soybean yield modeling using bootstrap methods for small samples." Spanish Journal of Agricultural Research 14, no. 3 (2016): e0207. http://dx.doi.org/10.5424/sjar/2016143-8635.

Abstract:
One of the problems that occurs when working with regression models concerns sample size: since the statistical methods used in inferential analyses are asymptotic, a small sample may compromise the analysis because the estimates will be biased. An alternative is the bootstrap methodology, which in its non-parametric version does not need to assume or know the probability distribution that generated the original sample. In this work we used a small set of soybean yield data together with physical and chemical soil properties to determine a multiple linear regression model. Bootstrap methods were used for variable selection, identification of influential points, and determination of confidence intervals for the model parameters. The results showed that the bootstrap methods enabled us to select the physical and chemical soil properties that were significant in constructing the soybean yield regression model, to construct confidence intervals for the parameters, and to identify the points with great influence on the estimated parameters.
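The core of the non-parametric (case-resampling) bootstrap relied on here can be sketched for a single slope parameter. This is a minimal sketch with an illustrative OLS helper; the paper applies the same resampling idea to multiple regression, variable selection, and influence diagnostics, and the function names and defaults below are our own.

```python
import random
from statistics import fmean

def ols_slope(xs, ys):
    """Least-squares slope of y on x."""
    mx, my = fmean(xs), fmean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

def bootstrap_slope_ci(xs, ys, reps=2000, alpha=0.05, seed=1):
    """Percentile confidence interval for the slope from a
    non-parametric case-resampling bootstrap."""
    rng = random.Random(seed)
    n = len(xs)
    slopes = []
    for _ in range(reps):
        idx = [rng.randrange(n) for _ in range(n)]   # resample cases
        slopes.append(ols_slope([xs[i] for i in idx],
                                [ys[i] for i in idx]))
    slopes.sort()
    return (slopes[int(reps * alpha / 2)],
            slopes[int(reps * (1 - alpha / 2)) - 1])
```

Because no distributional assumption is made about the residuals, the interval remains usable for the small samples the paper is concerned with, at the cost of the repeated refits.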
22

Dhumawati, Anak Agung Ayu Made, A. A. A. Erna Trisna Dewi, and Ida Bagus Made Putra Manuaba. "Pengaruh Profitabilitas, Firm Size, Likuiditas dan Leverage Terhadap Kebijakan Deviden." Jurnal Riset Akuntansi Warmadewa 2, no. 2 (2021): 98–103. http://dx.doi.org/10.22225/jraw.2.2.3363.98-103.

Abstract:
The purpose of this study is to analyze the influence of profitability, firm size, liquidity, and leverage on dividend policy. The research sample was drawn by purposive sampling: 8 companies with a four-year observation period, 2016-2019. The data are quantitative and, given the sources used, entirely secondary; they were collected by documentation methods. The analysis technique is a multiple linear regression model. The results show that profitability and firm size have a positive and significant effect on the dividend policy of LQ45 companies listed on the Indonesia Stock Exchange (IDX) over 2016-2019, while liquidity and leverage have a negative and significant effect. The independent variables explain 63.20% of dividend policy, while the remaining 36.80% is influenced by other factors not included in this study.
23

Grjibovski, A. M., M. A. Gorbatova, A. N. Narkevich, and K. A. Vinogradov. "REQUIRED SAMPLE SIZE FOR CORRELATION ANALYSIS." Marine Medicine 6, no. 1 (2020): 101–6. http://dx.doi.org/10.22328/2413-5747-2020-6-1-101-106.

Full text
Abstract:
Sample size calculation prior to data collection is still relatively rare in Russian research practice. This situation threatens the validity of the conclusions of many projects due to insufficient statistical power to estimate the parameters of interest with the desired precision or to detect the differences of interest. Moreover, in a substantial proportion of cases where sample size calculations are performed, simplified formulas assuming a normal distribution of the studied variables are used, even though this assumption does not hold for many research questions in biomedical research. Correlation analysis is still one of the most commonly used methods of statistical analysis in Russia. Pearson’s correlation coefficient, despite its well-known limitations, appears in a greater proportion of publications than non-parametric coefficients. We calculated minimal sample sizes for the parametric Pearson’s coefficient as well as its non-parametric alternatives, Spearman’s rho and Kendall’s tau-b, to give junior researchers a tool for planning data collection and analysis for several types of data, various expected strengths of association and research questions. The results are presented in ready-to-use tables with the required sample size for the three coefficients within the range from 0.10 through 0.90 in steps of 0.05, for statistical power of 0.8 and 0.9 and an alpha error of 5%, as well as for estimation of the same correlation coefficients with 95% confidence interval widths of 0.1 and 0.2.
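For the Pearson case, the required sample size for a power-based calculation is commonly approximated with Fisher's z-transform. The sketch below uses that standard approximation; it will not reproduce the authors' tables exactly, and the non-parametric coefficients require different adjustments:

```python
import math
from statistics import NormalDist

def n_for_correlation(r, alpha=0.05, power=0.80):
    """Approximate n needed to detect Pearson correlation r (two-sided test)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # critical value for alpha
    z_b = NormalDist().inv_cdf(power)           # quantile for desired power
    C = 0.5 * math.log((1 + r) / (1 - r))       # Fisher z-transform of r
    return math.ceil(((z_a + z_b) / C) ** 2 + 3)

for r in (0.10, 0.30, 0.50):
    print(r, n_for_correlation(r))
```

As expected, the required n falls sharply as the anticipated correlation strengthens, which is the pattern such ready-to-use tables display.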
APA, Harvard, Vancouver, ISO, and other styles
24

Arisanti, Ike, Isti Fadah, and Novi Puspitasari. "PREDIKSI PERINGKAT OBLIGASI SYARIAH DI INDONESIA." JURNAL ILMU MANAJEMEN 11, no. 3 (2014): 1–15. http://dx.doi.org/10.21831/jim.v11i3.11780.

Full text
Abstract:
This study aims to analyze the influence of financial and non-financial factors on the predicted rating of Islamic bonds in Indonesia. The independent variables are financial factors (growth, size, profit sharing/fee, liquidity) and non-financial factors (secure and maturity); the dependent variable is the rating of the Islamic bond. The study applied logistic regression analysis, with samples collected by purposive sampling. After applying the selection criteria, 25 Islamic bonds were chosen, giving 75 observations over the period 2010-2012. The results showed that the variables growth (X1), size (X2), profit sharing/fee (X3), liquidity (X4), secure (X5) and maturity (X6) simultaneously have a significant effect on the predicted rating of Islamic bonds in Indonesia. Partially, the variables growth (X1), size (X2) and profit sharing/fee (X3) did not significantly affect the predicted rating, while the variables liquidity (X4), secure (X5) and maturity (X6) did.
APA, Harvard, Vancouver, ISO, and other styles
25

Huang, Lingkang, Hao Helen Zhang, Zhao-Bang Zeng, and Pierre R. Bushel. "Improved Sparse Multi-Class SVM and Its Application for Gene Selection in Cancer Classification." Cancer Informatics 12 (January 2013): CIN.S10212. http://dx.doi.org/10.4137/cin.s10212.

Full text
Abstract:
Background: Microarray techniques provide promising tools for cancer diagnosis using gene expression profiles. However, molecular diagnosis based on high-throughput platforms presents great challenges due to the overwhelming number of variables versus the small sample size and the complex nature of multi-type tumors. Support vector machines (SVMs) have shown superior performance in cancer classification due to their ability to handle high-dimensional, low-sample-size data. The multi-class SVM algorithm of Crammer and Singer provides a natural framework for multi-class learning. Despite its effective performance, the procedure utilizes all variables without selection. In this paper, we propose to improve the procedure by imposing shrinkage penalties in learning to enforce solution sparsity. Results: The original multi-class SVM of Crammer and Singer is effective for multi-class classification but does not conduct variable selection. We improved the method by introducing soft-thresholding type penalties to incorporate variable selection into multi-class classification for high-dimensional data. The new methods were applied to simulated data and two cancer gene expression data sets. The results demonstrate that the new methods can select a small number of genes for building accurate multi-class classification rules. Furthermore, the important genes selected by the methods overlap significantly, suggesting general agreement among different variable selection schemes. Conclusions: High accuracy and sparsity make the new methods attractive for cancer diagnostics with gene expression data and defining targets of therapeutic intervention. Availability: The source MATLAB code is available from http://math.arizona.edu/∼hzhang/software.html.
APA, Harvard, Vancouver, ISO, and other styles
26

Sayekti, Yosefa. "PENGUJIAN ATAS DEBT/EQUITY HYPOTHESIS DAN SIZE HYPOTHESIS TERHADAP PEMILIHAN METODE PENYUSUTAN ASSET TETAP." JURNAL AKUNTANSI UNIVERSITAS JEMBER 11, no. 1 (2015): 1. http://dx.doi.org/10.19184/jauj.v11i1.1257.

Full text
Abstract:
This study examines the debt/equity hypothesis and the size hypothesis (political cost hypothesis), focusing on the fixed-asset depreciation method selected by companies. The debt/equity hypothesis states that the higher a company's debt/equity ratio, the greater the chance that the company will choose accounting methods that increase reported profit (Watts and Zimmerman, 1986). This study uses the debt-to-equity ratio and the interest coverage ratio as proxy variables to test the debt/equity hypothesis. The size hypothesis states that the larger the company, the more likely its managers are to choose accounting methods that reduce reported profit (Watts and Zimmerman, 1986). Logit regression is used to test the hypotheses, with total assets and net income as proxy variables for company size. The total sample consists of 108 companies' financial statements for the year ended December 31, 2004. The results show that the debt/equity hypothesis (with the interest coverage ratio as proxy variable) and the size hypothesis (with total assets as proxy variable) are supported. Overall, the test results are consistent with previous studies.
 
 Keywords: debt/equity hypothesis, size hypothesis, logit model
APA, Harvard, Vancouver, ISO, and other styles
27

Espinasse, Benjamin, Manali Joglekar, Giancarlo Valiente, and Gowthami M. Arepally. "Novel Techniques for Measurement of Variable Sized PF4/H Complexes." Blood 120, no. 21 (2012): 2204. http://dx.doi.org/10.1182/blood.v120.21.2204.2204.

Full text
Abstract:
Abstract 2204. Electrostatic interactions between platelet factor 4 (PF4), a cationic protein, and heparin, an anionic carbohydrate, result in the formation of ultra-large complexes (ULCs) that are immunogenic in mice (Suvarna, Blood 2007) and contribute to the immune pathogenesis of heparin-induced thrombocytopenia (HIT). Previous studies (Rauova, Blood 2005; Greinacher, Arterioscler Thromb Vasc Biol, 2006) have shown that the size of ULCs is determined by the concentration and the molar ratios of PF4:H (PHRs) of each compound. Size determination of PF4/H complexes has been problematic due to technical limitations of the two commonly employed methods for sizing complexes, photon correlation spectroscopy (PCS) and size exclusion chromatography (SEC). PCS, a technique for measuring particles in solution using laser illumination, is based on principles of Brownian motion; it performs optimally with monodisperse populations and is biased by the presence of large aggregates. SEC, a liquid chromatography method, is technically cumbersome, requires sample labeling and is not feasible for measuring large numbers of samples. To address these limitations, we examined two novel approaches for measuring a broad range of PF4/H complex sizes (100–3000 nm) in vitro: NanoSight and flow cytometry (FC). NanoSight (NanoSight Ltd, Wiltshire, United Kingdom) was employed for measuring small-sized complexes using physiologic concentrations of hPF4 (10 ug/mL). NanoSight uses proprietary software to track nanoparticles (range 10–1000 nm) in solution by laser illumination, with real-time tracking of the motion of individual particles by a camera. Analysis parameters provided by the software include: 1) particle size distributions displayed as histograms, 2) direct visualization of particles, 3) particle counting and sizing, and 4) particle scatter intensity vs. count and size.
For measuring intermediate- to large-sized particles, formed at high hPF4 concentrations (95 ug/mL), we used flow cytometry calibrated with sizing beads on the side scatter channel (SSC). FC was performed using a BD LSRII cell analyzer (Becton Dickinson, Franklin Lakes, NJ), a high-throughput flow analyzer, with the threshold channel for SSC set to 200 and a flow rate of 1 ul per second. The instrument was calibrated using sizing beads ranging from 0.3–6 μm in size (Figure A). For both techniques, PF4/H ULCs were formed by adding hPF4 (10 or 95 ug/mL) and various UFH concentrations in HBSS to yield the indicated PHRs. Complexes were incubated for 60 minutes and measured by NanoSight or FC. Results of experiments using NanoSight are shown in Table 1, giving size and particle counts for each PHR. Results of FC are shown in Figure B and Table 2 (median, 5% and 95% size in nm). Both studies showed reproducible measurements for a given concentration and showed changes in complex size as a function of PHR (Figure B). Both methodologies are technically simple and provide complementary approaches to PCS for PF4/H complex size determination. Disclosures: No relevant conflicts of interest to declare.
APA, Harvard, Vancouver, ISO, and other styles
28

Febriansyah, Erwin, Ade Tiara Yulinda, and Lina Rosalinda. "PENGARUH VARIABILITAS PERSEDIAAN, UKURAN PERUSAHAAN DAN INTENSITAS PERSEDIAAN TERHADAP PEMILIHAN METODE PENILAIAN PERSEDIAAN (Studi Empiris Perusahaan Manufaktur Di Bursa Efek Indonesia Tahun 2014-2017)." EKOMBIS REVIEW: Jurnal Ilmiah Ekonomi dan Bisnis 8, no. 1 (2020): 38–46. http://dx.doi.org/10.37676/ekombis.v8i1.930.

Full text
Abstract:
Erwin Febriansyah, Ade Tiara Yulinda, Lina Rosalinda; This study aims to determine the effect of inventory variability, inventory intensity and company size on the selection of the inventory valuation method. The independent variables are inventory variability, inventory intensity and company size; the dependent variable is the choice between the FIFO method and the average method. This is quantitative research using secondary data. The population comprised all manufacturing companies listed on the Indonesia Stock Exchange in 2014-2017. The sampling technique was purposive sampling; applying the criteria selected 41 companies as units of analysis, which multiplied by the number of observation periods yields 164 samples (41 companies x 4 years). Data were analyzed by logistic regression using SPSS 18.0. The research hypotheses concern the effect of inventory variability, inventory intensity and company size on the selection of inventory valuation methods. The results showed that inventory variability and company size do not affect the selection of the inventory valuation method. Meanwhile, inventory intensity affects the selection, and inventory variability, inventory intensity and company size together influence the selection of inventory valuation methods.
 Keywords: Inventory Variability, Company Size, Inventory Intensity, FIFO, Average
APA, Harvard, Vancouver, ISO, and other styles
29

Fitri, Lailatul, and Vera Noviana Erlita. "Keberagaman Dewan Direksi terhadap Struktur Modal." Jurnal Studi Manajemen dan Bisnis 5, no. 1 (2020): 31–37. http://dx.doi.org/10.21107/jsmb.v5i1.6505.

Full text
Abstract:
The characteristics of the board of directors affect a company's capital structure because of the monitoring function the board exercises over it. The purpose of this study was to determine the effect of board diversity, proxied by the variables female presence, foreign citizenship, board size, sales growth, profitability and firm size, on capital structure. This quantitative study uses multiple linear regression. The sample consists of manufacturing companies listed on the Indonesia Stock Exchange for the period 2016-2017, selected by purposive sampling. The data are secondary, taken from the companies' annual reports. The results of hypothesis testing using multiple linear regression showed that female presence and foreign citizenship had an effect on capital structure, while board size, sales growth, profitability and firm size did not.
APA, Harvard, Vancouver, ISO, and other styles
30

Mingxiao, Ding, Jiao Renjie, Liang Fengxia, and Zhai Zhonghe. "The variations of virus shape and size in different preparations." Proceedings, annual meeting, Electron Microscopy Society of America 48, no. 3 (1990): 580–81. http://dx.doi.org/10.1017/s0424820100160455.

Full text
Abstract:
The envelope is a very important structure for viral attachment and entry into the host cell, but it is also a morphologically variable portion of enveloped viruses. Studying the fine structure of enveloped viruses, we noticed that different sample preparations changed viral size and shape to some extent, which we believe was caused by variation of the viral envelope. Four typical enveloped viruses were investigated in our experiments: IBRV (Infectious Bovine Rhinotracheitis Virus), GPV (Goat Pox Virus), SbV (Sindbis Virus) and VSV (Vesicular Stomatitis Virus). Host cells infected with IBRV, GPV, SbV and VSV respectively were fixed with 1-5% glutaraldehyde in Hank's buffer when cytopathic effects appeared in 50-70% of the cells; the specimens were then treated with different conventional methods of EM sample preparation: 1) ultrathin sectioning, 2) negative staining, 3) freeze etching, 4) surface replica, and 5) whole mount or SEM observation. All samples were examined under a JEM-200CX TEM or JSM-35CF SEM.
APA, Harvard, Vancouver, ISO, and other styles
31

Apriyani, Hani Werdi. "PENGARUH CORPORATE GOVERNANCE DAN KARAKTERISTIK PERUSAHAAN TERHADAP LUAS PENGUNGKAPAN TRANSAKSI PIHAK BERELASI DI INDONESIA." Jurnal Akuntansi Indonesia 4, no. 1 (2016): 36. http://dx.doi.org/10.30659/jai.4.1.36-50.

Full text
Abstract:
The purpose of this study is to examine the influence of corporate governance and firm characteristics on the extent of disclosure of related party transactions. The independent variables in this study are the level of ownership concentration, independent commissioners, the level of corporate diversification, and profitability. The dependent variable is related party transaction disclosure. In analyzing the effect of the independent variables on the dependent variable, we included two control variables in our model: industry type and company size. The sample consists of annual reports of non-financial companies listed on the Indonesia Stock Exchange in 2008-2011, selected by purposive sampling. There are 25 companies that meet the criteria, giving 90 annual reports for the analysis. The statistical method is multiple linear regression analysis. The analysis showed that profitability has a significant effect on related party transaction disclosure, while the level of ownership concentration, the level of corporate diversification, and independent commissioners did not significantly influence it.
APA, Harvard, Vancouver, ISO, and other styles
32

Scagel, Rob, Y. A. El-Kassaby, and J. Emanuel. "Assessing sample size and variable number in multivariate data, with specific reference to cone morphology variation in a population of Picea sitchensis." Canadian Journal of Botany 63, no. 2 (1985): 232–41. http://dx.doi.org/10.1139/b85-027.

Full text
Abstract:
A multivariate extension of univariate sample size estimation is outlined that enables one to determine sample size for a multivariate study. The procedure is presented and illustrated by application to intraindividual and interindividual variation of cone morphology in a population of Picea sitchensis (Bong.) Carr. The method involves the stabilization of a scalar estimate of the structure of the correlation matrix (the determinant) among variables for a given sample size. The sample-specific dependency of previously described methods is avoided by random selection of several replicates in nonstructured and structured (nested) models. The procedure is best applied in pilot studies where it can aid in the characterization of multivariate data prior to analysis. Additionally, repeatability estimates for cone scale morphology are presented.
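The determinant-stabilization idea can be illustrated in a few lines: compute the determinant of the sample correlation matrix at increasing sample sizes and watch it settle toward a stable value. This is a simplified sketch on simulated data, without the replicate and nested structure the paper uses:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 5
# assumed correlation structure: all off-diagonal correlations 0.4
cov = 0.4 * np.ones((p, p)) + 0.6 * np.eye(p)
data = rng.multivariate_normal(np.zeros(p), cov, size=500)

# determinant of the sample correlation matrix for nested sample sizes
for n in (20, 50, 100, 200, 500):
    R = np.corrcoef(data[:n], rowvar=False)
    print(n, round(float(np.linalg.det(R)), 3))
```

The sample size at which successive determinants stop changing appreciably is taken as adequate for the multivariate analysis; in practice one would repeat this over several random subsamples, as the paper recommends.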
APA, Harvard, Vancouver, ISO, and other styles
33

Van Calster, Ben, Maarten van Smeden, Bavo De Cock, and Ewout W. Steyerberg. "Regression shrinkage methods for clinical prediction models do not guarantee improved performance: Simulation study." Statistical Methods in Medical Research 29, no. 11 (2020): 3166–78. http://dx.doi.org/10.1177/0962280220921415.

Full text
Abstract:
When developing risk prediction models on datasets with limited sample size, shrinkage methods are recommended. Earlier studies showed that shrinkage results in better predictive performance on average. This simulation study aimed to investigate the variability of regression shrinkage on predictive performance for a binary outcome. We compared standard maximum likelihood with the following shrinkage methods: uniform shrinkage (likelihood-based and bootstrap-based), penalized maximum likelihood (ridge) methods, LASSO logistic regression, adaptive LASSO, and Firth’s correction. In the simulation study, we varied the number of predictors and their strength, the correlation between predictors, the event rate of the outcome, and the events per variable. In terms of results, we focused on the calibration slope. The slope indicates whether risk predictions are too extreme (slope < 1) or not extreme enough (slope > 1). The results can be summarized into three main findings. First, shrinkage improved calibration slopes on average. Second, the between-sample variability of calibration slopes was often increased relative to maximum likelihood. In contrast to other shrinkage approaches, Firth’s correction had a small shrinkage effect but showed low variability. Third, the correlation between the estimated shrinkage and the optimal shrinkage to remove overfitting was typically negative, with Firth’s correction as the exception. We conclude that, despite improved performance on average, shrinkage often worked poorly in individual datasets, in particular when it was most needed. The results imply that shrinkage methods do not solve problems associated with small sample size or low number of events per variable.
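The calibration slope used as the main outcome here can be computed by regressing the observed binary outcomes on the model's linear predictor. A sketch on simulated data, with deliberately too-extreme predictions and a bare-bones Newton-Raphson logistic fit rather than any particular package:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_logistic(X, y, iters=30):
    """Unpenalized logistic regression via Newton-Raphson; returns [intercept, slopes...]."""
    X1 = np.column_stack([np.ones(len(y)), X])
    b = np.zeros(X1.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-(X1 @ b)))
        H = (X1 * (p * (1 - p))[:, None]).T @ X1    # observed information
        b += np.linalg.solve(H, X1.T @ (y - p))
    return b

n = 20000
x = rng.normal(size=n)
true_lp = 0.8 * x                                   # true log-odds
y = rng.binomial(1, 1 / (1 + np.exp(-true_lp)))

overfit_lp = 1.5 * true_lp                          # predictions that are too extreme
slope = fit_logistic(overfit_lp.reshape(-1, 1), y)[1]
print(f"calibration slope: {slope:.2f}")
```

Because the linear predictor has been inflated by a factor of 1.5, the fitted slope lands near 1/1.5, i.e. below 1, which is exactly the "too extreme" pattern the abstract describes; shrinkage methods aim to push this slope back toward 1.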
APA, Harvard, Vancouver, ISO, and other styles
34

Prastiwi, Septi Kurnia, Rabia Rabia, and Renanda Bagus. "PERAN PRODUCT QUALITY, INFORMATION QUALITY DENGAN MEDIASI TRUST TERHADAP REPURCHASE INTENTION PADA MITRA UMKM GO-FOOD DI SURAKARTA." Jurnal Manajemen Dayasaing 21, no. 1 (2019): 44–54. http://dx.doi.org/10.23917/dayasaing.v21i1.6009.

Full text
Abstract:
This study aims to examine the factors that may affect repurchase intention for the Go-Food product. The two variables suspected to affect repurchase intention are product quality and information quality, with trust as a mediator. The design of this study is a survey method with a population of Go-Jek users in Solo, a sample size of 100 respondents selected by purposive sampling, and a questionnaire with 12 indicator questions. Tests of validity, reliability and the classical assumptions supported continuing the research. Path analysis showed that product quality does not significantly affect trust or repurchase intention, but information quality has a positive and significant effect on trust and on the dependent variable repurchase intention. A Sobel test was used to examine whether trust mediates the effect of product quality on repurchase intention, with the result that trust is a significant mediator variable.
APA, Harvard, Vancouver, ISO, and other styles
35

Wang, Fan, Sach Mukherjee, Sylvia Richardson, and Steven M. Hill. "High-dimensional regression in practice: an empirical study of finite-sample prediction, variable selection and ranking." Statistics and Computing 30, no. 3 (2019): 697–719. http://dx.doi.org/10.1007/s11222-019-09914-9.

Full text
Abstract:
Penalized likelihood approaches are widely used for high-dimensional regression. Although many methods have been proposed and the associated theory is now well developed, the relative efficacy of different approaches in finite-sample settings, as encountered in practice, remains incompletely understood. There is therefore a need for empirical investigations in this area that can offer practical insight and guidance to users. In this paper, we present a large-scale comparison of penalized regression methods. We distinguish between three related goals: prediction, variable selection and variable ranking. Our results span more than 2300 data-generating scenarios, including both synthetic and semisynthetic data (real covariates and simulated responses), allowing us to systematically consider the influence of various factors (sample size, dimensionality, sparsity, signal strength and multicollinearity). We consider several widely used approaches (Lasso, Adaptive Lasso, Elastic Net, Ridge Regression, SCAD, the Dantzig Selector and Stability Selection). We find considerable variation in performance between methods. Our results support a “no panacea” view, with no unambiguous winner across all scenarios or goals, even in this restricted setting where all data align well with the assumptions underlying the methods. The study allows us to make some recommendations as to which approaches may be most (or least) suitable given the goal and some data characteristics. Our empirical results complement existing theory and provide a resource to compare methods across a range of scenarios and metrics.
APA, Harvard, Vancouver, ISO, and other styles
36

Hantono, Hantono. "The Effect of Liquidity and Profitability on the Capital Structure as the Moderator Variable of Subsector Retail Companies Listed on Indonesian Stock Exchange." Budapest International Research and Critics Institute (BIRCI-Journal): Humanities and Social Sciences 4, no. 2 (2021): 1747–57. http://dx.doi.org/10.33258/birci.v4i2.1848.

Full text
Abstract:
The purpose of this research is to prove and analyze the effect of liquidity and profitability on capital structure, with a moderator variable, for the subsector retail companies listed on the Indonesia Stock Exchange in the period 2011-2015. The population in this research is the 16 listed companies; using purposive sampling, 10 of the 16 were selected as the sample. The data are secondary, gathered from IDX sources such as the 2011-2015 financial reports. The methods used to analyze the correlation between the independent and dependent variables are multiple regression and assumption testing. In conclusion, the F-test shows that simultaneously the independent variables Cash Turnover and Company Size affect Return on Assets with the Debt to Equity Ratio as the moderator variable. The partial results using the t-test show that Cash Turnover and Company Size each affect Return on Assets with Debt to Equity as the moderator variable.
APA, Harvard, Vancouver, ISO, and other styles
37

Zhanfei, Zhu, Han Xinwen, Li Wensheng, Yang Shutao, and Wang Bingchuan. "Credibility Evaluation of Operational Test Simulation Data under Small Sample Circumstance." MATEC Web of Conferences 173 (2018): 03013. http://dx.doi.org/10.1051/matecconf/201817303013.

Full text
Abstract:
It is highly necessary to study how to analyze the reliability of simulation data under small-sample circumstances when the number of live operational tests is strictly limited. Based on an analysis of existing test ideas and methods, and combined with the characteristics of sequence statistics of the uniform distribution, a new method of consistency verification is proposed by constructing a variable-scale differential quotient sequence statistic. The research shows that this method is not limited by the sample size, and the credibility of the simulation data can be quickly judged using a MATLAB program.
APA, Harvard, Vancouver, ISO, and other styles
38

Grilli, Leonardo, and Carla Rampichini. "The Role of Sample Cluster Means in Multilevel Models." Methodology 7, no. 4 (2011): 121–33. http://dx.doi.org/10.1027/1614-2241/a000030.

Full text
Abstract:
The paper explores some issues related to endogeneity in multilevel models, focusing on the case where the random effects are correlated with a level 1 covariate in a linear random intercept model. We consider two basic specifications, without and with the sample cluster mean. It is generally acknowledged that the omission of the cluster mean may cause omitted-variable bias. However, it is often neglected that the inclusion of the sample cluster mean in place of the population cluster mean entails a measurement error that yields biased estimators for both the slopes and the variance components. In particular, the contextual effect is attenuated, while the level 2 variance is inflated. We derive explicit formulae for measurement error biases that allow us to implement simple post-estimation corrections based on the reliability of the covariate. In the first part of the paper, the issue is tackled in a standard framework where the population cluster mean is treated as a latent variable. Later we consider a different framework arising when sampling from clusters of finite size, where the latent variable methods may have a poor performance, and we show how to effectively modify the measurement error correction. The theoretical analysis is supplemented with a simulation study and a discussion of the implications for effectiveness evaluation.
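The attenuation and its reliability-based correction can be sketched numerically: with between-cluster variance tau^2 and within-cluster variance sigma^2 for the covariate, the sample mean of n observations has reliability lambda = tau^2 / (tau^2 + sigma^2 / n), and a naive between-minus-within contextual estimate shrinks by roughly that factor. A simulated illustration (a simplified moment-based version, not the authors' random-intercept estimator):

```python
import numpy as np

rng = np.random.default_rng(7)
J, n = 500, 5                        # number of clusters and cluster size
tau, sigma = 1.0, 1.0                # between- and within-cluster SDs of x

mu = rng.normal(0, tau, size=J)                       # population cluster means
x = mu[:, None] + rng.normal(0, sigma, size=(J, n))   # level-1 covariate
y = 1.0 * x + 2.0 * mu[:, None] + rng.normal(0, 0.5, size=(J, n))
# within effect = 1, contextual effect (on the population mean) = 2

xbar = x.mean(axis=1, keepdims=True)
ybar = y.mean(axis=1, keepdims=True)

# within estimator: pooled OLS on cluster-mean-centred data
bw = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
# between estimator: OLS of cluster means on the *sample* cluster means
bb = np.cov(x.mean(axis=1), y.mean(axis=1))[0, 1] / np.var(x.mean(axis=1), ddof=1)

lam = tau**2 / (tau**2 + sigma**2 / n)   # reliability of the sample mean
naive = bb - bw                          # attenuated contextual effect
corrected = naive / lam                  # post-estimation reliability correction
print(f"reliability={lam:.3f}  naive={naive:.2f}  corrected={corrected:.2f}")
```

Dividing the attenuated estimate by the reliability recovers a value close to the true contextual effect of 2, which is the spirit of the post-estimation corrections derived in the paper.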
APA, Harvard, Vancouver, ISO, and other styles
39

Wu, John, Urban Hägg, and A. Bakr M. Rabie. "Chinese Norms of McNamara's Cephalometric Analysis." Angle Orthodontist 77, no. 1 (2007): 12–20. http://dx.doi.org/10.2319/021606-62r.1.

Full text
Abstract:
Objective: To establish cephalometric norms of McNamara's analysis in young Chinese and compare them to those of a matched young Caucasian sample. Materials and Methods: The material comprised lateral cephalometric radiographs of a random sample of 200 male and 205 female 12-year-old southern Chinese children, and an additional sample of 43 male and 43 female 12-year-old British Caucasian children in Hong Kong. The radiographs were digitized twice with the CASSOS program. Results: The results showed that there were statistically significant gender differences for six out of the 11 cephalometric variables in the Chinese, but for only one variable in the Caucasians. The size of the statistically significant gender differences varied from −0.3 to 0.4 on SD scores. There were statistically significant ethnic differences for eight variables in males and seven variables in females. The size of the observed statistically significant ethnic differences varied from −1.8 to 1.6 on SD scores. Conclusion: The use of specific standards for Chinese, separate for gender, for McNamara's cephalometric analysis seems to be justified.
APA, Harvard, Vancouver, ISO, and other styles
40

Ajana, Soufiane, Niyazi Acar, Lionel Bretillon, et al. "Benefits of dimension reduction in penalized regression methods for high-dimensional grouped data: a case study in low sample size." Bioinformatics 35, no. 19 (2019): 3628–34. http://dx.doi.org/10.1093/bioinformatics/btz135.

Full text
Abstract:
Motivation: In some prediction analyses, predictors have a natural grouping structure, and selecting predictors accounting for this additional information could be more effective for predicting the outcome accurately. Moreover, in a high-dimension, low-sample-size framework, obtaining a good predictive model becomes very challenging. The objective of this work was to investigate the benefits of dimension reduction in penalized regression methods, in terms of prediction performance and variable selection consistency, in high-dimension low-sample-size data. Using two real datasets, we compared the performances of lasso, elastic net, group lasso, sparse group lasso, sparse partial least squares (PLS), group PLS and sparse group PLS. Results: Considering dimension reduction in penalized regression methods improved the prediction accuracy. The sparse group PLS reached the lowest prediction error while consistently selecting a few predictors from a single group. Availability and implementation: R codes for the prediction methods are freely available at https://github.com/SoufianeAjana/Blisar. Supplementary information: Supplementary data are available at Bioinformatics online.
APA, Harvard, Vancouver, ISO, and other styles
41

Gillespie, Andrew J. R., and Tiberius Cunia. "Linear regression models for biomass table construction, using cluster samples." Canadian Journal of Forest Research 19, no. 5 (1989): 664–73. http://dx.doi.org/10.1139/x89-103.

Full text
Abstract:
Biomass tables are often constructed from cluster samples by means of ordinary least squares regression estimation procedures. These procedures assume that sample observations are uncorrelated, which ignores the intracluster correlation of cluster samples and results in underestimates of the model error. We tested alternative estimation procedures by simulation under a variety of cluster sampling methods, to determine combinations of sampling and estimation procedures that yield accurate parameter estimates and reliable estimates of error. Modified, generalized, and jack-knife least squares procedures gave accurate parameter and error estimates when sample trees were selected with equal probability. Regression models that did not include height as a predictor variable yielded biased parameter estimates when sample trees were selected with probability proportional to tree size. Models that included height did not yield biased estimates. There was no discernible gain in precision associated with sampling with probability proportional to size. Random coefficient regressions generally gave biased point estimates with poor precision, regardless of sampling method.
APA, Harvard, Vancouver, ISO, and other styles
42

Jennings, B. R. "Size and Thickness Measurement of Polydisperse Clay Samples." Clay Minerals 28, no. 4 (1993): 485–94. http://dx.doi.org/10.1180/claymin.1993.028.4.01.

Full text
Abstract:
AbstractMeasurement of clay particle size invariably presents data in the form of equivalent spherical diameters. For asymmetric particles the equivalent spherical diameter varies with the method of measurement. Based upon an understanding of the theoretical concepts involved, a method has been proposed whereby comparison of data on a given sample from two different techniques can reveal information about the minor dimension of the particle. Theoretical expressions are given for the equivalent spherical diameter of cylindrically symmetric rods and discs from which it is shown that some of the existing measurement methods are more dependent upon size than the degree of non-sphericity whilst for others the reverse is true. It is shown how for rods and discs one can obtain information on both an average axial ratio and the distribution of this parameter for heterogeneous samples. Illustrated data are given for three kaolin samples. Far from showing inconsistency between the variable spherical diameters yielded by different instruments, the data produce compatible size and thickness parameters which match those observed in supplementary, unreported electron microscope experiments. A method of measuring particle major and minor parameter distributions is indicated.
APA, Harvard, Vancouver, ISO, and other styles
43

Morgan, Katy E., Sarah Cook, David A. Leon, and Chris Frost. "Reflection on modern methods: calculating a sample size for a repeatability sub-study to correct for measurement error in a single continuous exposure." International Journal of Epidemiology 48, no. 5 (2019): 1721–26. http://dx.doi.org/10.1093/ije/dyz055.

Full text
Abstract:
Using a continuous exposure variable that is measured with random error in a univariable linear regression model leads to regression dilution bias: the observed association between the exposure and outcome is smaller than it would be if the true value of the exposure could be used. A repeatability sub-study, where a sample of study participants have their data measured again, can be used to correct for this bias. It is important to perform a sample size calculation for such a sub-study, to ensure that correction factors can be estimated with sufficient precision. We describe how a previously published method can be used to calculate the sample size from the anticipated size of the correction factor and its desired precision, and demonstrate this approach using the example of the cross-sectional studies conducted as part of the International Project on Cardiovascular Disease in Russia study. We also provide correction factors calculated from repeat data from the UK Biobank study, which can be used to help plan future repeatability studies.
APA, Harvard, Vancouver, ISO, and other styles
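The regression-dilution correction described in the abstract above can be illustrated with a short simulation. The sketch below is a hypothetical illustration, not the authors' code: it generates an exposure measured twice with random error, estimates the reliability (correction factor) from the two replicates, and rescales the attenuated slope.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
true_x = rng.normal(0.0, 1.0, n)              # true (unobserved) exposure
noise_sd = 0.6                                # measurement-error SD (assumed)
x1 = true_x + rng.normal(0.0, noise_sd, n)    # baseline measurement
x2 = true_x + rng.normal(0.0, noise_sd, n)    # repeat measurement (sub-study)
y = 2.0 * true_x + rng.normal(0.0, 1.0, n)    # outcome depends on true exposure

# Attenuated slope from the error-prone baseline measurement
beta_obs = np.cov(x1, y)[0, 1] / np.var(x1, ddof=1)

# Reliability ratio lambda = var(true) / var(observed); with two replicates
# the within-person error variance is half the variance of the differences.
err_var = np.var(x1 - x2, ddof=1) / 2.0
lam = 1.0 - err_var / np.var(x1, ddof=1)

# Regression-dilution-corrected slope
beta_corrected = beta_obs / lam
```

With these settings the reliability is about 1/(1 + 0.36) ≈ 0.74, so the observed slope of roughly 1.5 is scaled back toward the true value of 2. The precision of `lam` is what drives the sub-study sample size calculation the paper addresses.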
44

Andawiyah, Ayu, Ahmad Subeki, and Arista Hakiki. "PENGARUH THIN CAPITALIZATION TERHADAP PENGHINDARAN PAJAK PERUSAHAAN INDEX SAHAM SYARIAH INDONESIA." AKUNTABILITAS 13, no. 1 (2019): 49–68. http://dx.doi.org/10.29259/ja.v13i1.9342.

Full text
Abstract:
The purpose of this study was to examine the effect of thin capitalization on cash effective tax rates as the proxy for tax avoidance. The control variables used in this research are key management compensation, firm size, and profitability. The method used is descriptive analytical. The sample consists of 20 companies from the manufacturing sector listed in the Indonesian Stock Shariah Index (ISSI) in 2011–2016, selected by purposive sampling. The analysis method is multiple regression analysis. The results show that thin capitalization has a significant influence on cash effective tax rates as the proxy for tax avoidance. Key management compensation as a control variable showed no demonstrable effect on tax avoidance, while firm size and profitability have a significant effect on cash effective tax rates as the proxy for tax avoidance.
APA, Harvard, Vancouver, ISO, and other styles
45

Lenters, Virissa, Roel Vermeulen, and Lützen Portengen. "Performance of variable selection methods for assessing the health effects of correlated exposures in case–control studies." Occupational and Environmental Medicine 75, no. 7 (2017): 522–29. http://dx.doi.org/10.1136/oemed-2016-104231.

Full text
Abstract:
Objectives: There is growing recognition that simultaneously assessing multiple exposures may reduce false positive discoveries and improve epidemiological effect estimates. We evaluated the performance of statistical methods for identifying exposure–outcome associations across various data structures typical of environmental and occupational epidemiology analyses. Methods: We simulated a case–control study, generating 100 data sets for each of 270 different simulation scenarios; varying the number of exposure variables, the correlation between exposures, sample size, the number of effective exposures and the magnitude of effect estimates. We compared conventional analytical approaches, that is, univariable (with and without multiplicity adjustment), multivariable and stepwise logistic regression, with variable selection methods: sparse partial least squares discriminant analysis, boosting, and frequentist and Bayesian penalised regression approaches. Results: The variable selection methods consistently yielded more precise effect estimates and generally improved selection accuracy compared with conventional logistic regression methods, especially for scenarios with higher correlation levels. Penalised lasso and elastic net regression both seemed to perform particularly well, specifically when statistical inference based on a balanced weighting of high sensitivity and a low proportion of false discoveries is sought. Conclusions: In this extensive simulation study with multicollinear data, we found that most variable selection methods consistently outperformed conventional approaches, and demonstrated how performance is influenced by the structure of the data and underlying model.
APA, Harvard, Vancouver, ISO, and other styles
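The lasso-type selection behaviour evaluated in the abstract above can be sketched with a minimal coordinate-descent implementation. This is a generic illustration of the technique on correlated predictors, not the paper's simulation design: two of twenty correlated exposures carry a true effect, and the L1 penalty zeroes out most of the null ones.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=100):
    """Lasso via cyclic coordinate descent with soft-thresholding.

    Minimizes (1/2n)||y - Xb||^2 + lam*||b||_1; assumes standardized columns.
    """
    n, p = X.shape
    b = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]        # partial residual
            rho = X[:, j] @ r / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_ss[j]
    return b

rng = np.random.default_rng(2)
n, p = 500, 20
base = rng.normal(size=(n, 1))                    # shared factor -> collinearity
X = 0.7 * base + np.sqrt(1 - 0.49) * rng.normal(size=(n, p))
X = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize columns
beta_true = np.zeros(p)
beta_true[:2] = 1.0                               # two effective exposures
y = X @ beta_true + rng.normal(size=n)

b = lasso_cd(X, y, lam=0.15)
selected = np.flatnonzero(np.abs(b) > 1e-8)       # exposures the lasso keeps
```

In contrast, univariable regression on this design would flag many of the null predictors, since all columns are correlated with the shared factor; this is the multicollinearity scenario in which the paper finds penalised methods advantageous.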
46

Putri, Arie Pratania, Ricky Utomo, Yosevin Yovenia, and Ayu Cindi Novika. "Ukuran Perusahaan, Komite Audit, Opini Audit, Ukuran KAP dan Audit Delay di Perusahaan Transportasi." E-Jurnal Akuntansi 31, no. 6 (2021): 1401. http://dx.doi.org/10.24843/eja.2021.v31.i06.p04.

Full text
Abstract:
The purpose of this study is to analyze the effect of company size, the audit committee, the audit opinion, and the size of the public accountant firm on audit delay in transportation companies in Indonesia. A quantitative method is used. Purposive sampling leaves 28 of the 46 population companies, for a total of 84 observations. As the dependent variable (audit delay) is a dummy variable, logistic regression is used. Partially, the only independent variable with an effect is the audit opinion; company size, the audit committee, and the size of the public accountant firm have no effect. Simultaneously, all independent variables affect the dependent variable, with a significance level of 0.000.
Keywords: Company Size; Audit Committee; Audit Opinion; KAP Size; Audit Delay.
APA, Harvard, Vancouver, ISO, and other styles
47

Boddhireddy, P., M. J. Kelly, S. Northcutt, K. C. Prayaga, J. Rumph, and S. DeNise. "Genomic predictions in Angus cattle: Comparisons of sample size, response variables, and clustering methods for cross-validation1." Journal of Animal Science 92, no. 2 (2014): 485–97. http://dx.doi.org/10.2527/jas.2013-6757.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Shin, Seon-Hwa, Seongyong Kim, and Hyuncheol Kang. "A Comparison and Case Study on Sample Size Determination Methods in Discriminant Analysis with Binary Response Variables." Korean Data Analysis Society 20, no. 5 (2018): 2285–96. http://dx.doi.org/10.37727/jkdas.2018.20.5.2285.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Bonett, Douglas G., and Robert M. Price. "Inferential Methods for the Tetrachoric Correlation Coefficient." Journal of Educational and Behavioral Statistics 30, no. 2 (2005): 213–25. http://dx.doi.org/10.3102/10769986030002213.

Full text
Abstract:
The tetrachoric correlation describes the linear relation between two continuous variables that have each been measured on a dichotomous scale. The treatment of the point estimate, standard error, interval estimate, and sample size requirement for the tetrachoric correlation is cursory and incomplete in modern psychometric and behavioral statistics texts. A new and simple method of accurately approximating the tetrachoric correlation is introduced. The tetrachoric approximation is then used to derive a simple standard error, confidence interval, and sample size planning formula. The new confidence interval is shown to perform far better than the confidence interval computed by SAS. A method to improve the SAS confidence interval is proposed. All of the new results are computationally simple and are ideally suited for textbook and classroom presentations.
APA, Harvard, Vancouver, ISO, and other styles
50

Altelbany, Shady. "Evaluation of Ridge, Elastic Net and Lasso Regression Methods in Precedence of Multicollinearity Problem: A Simulation Study." Journal of Applied Economics and Business Studies 5, no. 1 (2021): 131–42. http://dx.doi.org/10.34260/jaebs.517.

Full text
Abstract:
This study evaluates the performance of ridge, elastic net, and lasso regression methods in handling different degrees of multicollinearity in a multiple regression analysis, using simulated data. The researcher simulated data sets with sample sizes n = 200, 1000, 10000, 50000, and 100000 and p = 10 independent variables, and compared the performance of the three methods using mean squared error (MSE). The study found that the elastic net method outperforms the ridge and lasso methods in estimating the regression coefficients when the degree of multicollinearity is low, moderate, or high, for any sample size, while the lasso method is the most accurate estimator of the regression coefficients when the data contain severe multicollinearity and the sample size is less than 10000 observations.
APA, Harvard, Vancouver, ISO, and other styles
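The ridge component of the comparison in the abstract above has a closed form that is easy to sketch. The snippet below is a generic illustration of ridge shrinkage under multicollinearity, not the study's simulation: predictors share a common factor, and the penalty pulls the coefficient vector toward zero relative to ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 10

# Correlated predictors: each column loads on a shared factor (high collinearity)
base = rng.normal(size=(n, 1))
X = 0.95 * base + np.sqrt(1 - 0.95**2) * rng.normal(size=(n, p))
beta_true = rng.normal(size=p)
y = X @ beta_true + rng.normal(size=n)

def ridge(X, y, alpha):
    """Closed-form ridge estimate: (X'X + alpha*I)^{-1} X'y."""
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

beta_ols = ridge(X, y, 0.0)       # alpha = 0 reduces to OLS
beta_ridge = ridge(X, y, 10.0)    # penalized, shrunken coefficients
```

Comparing `np.mean((b - beta_true)**2)` for the two estimates over many replications, and adding lasso and elastic net fits, reproduces the kind of MSE comparison the study reports; the elastic net combines this L2 penalty with the lasso's L1 term.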
