Academic literature on the topic 'Sample-size Computation'


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Sample-size Computation.'


Journal articles on the topic "Sample-size Computation"

1. Jung, Sin-ho, Sun J. Kang, Linda M. McCall, and Brent Blumenstein. "Sample Size Computation for Two-Sample Noninferiority Log-Rank Test." Journal of Biopharmaceutical Statistics 15, no. 6 (2005): 969–79. http://dx.doi.org/10.1080/10543400500265736.
2. Hsu, Jason C. "Sample size computation for designing multiple comparison experiments." Computational Statistics & Data Analysis 7, no. 1 (1988): 79–91. http://dx.doi.org/10.1016/0167-9473(88)90017-5.
3. Lachenbruch, Peter A. "A note on sample size computation for testing interactions." Statistics in Medicine 7, no. 4 (1988): 467–69. http://dx.doi.org/10.1002/sim.4780070403.
4. Cunningham, Tina D., and Robert E. Johnson. "Design effects for sample size computation in three-level designs." Statistical Methods in Medical Research 25, no. 2 (2012): 505–19. http://dx.doi.org/10.1177/0962280212460443.
5. Kharrat, Najla, Imen Ayadi, and Ahmed Rebaï. "Sample size computation for association studies using case-parents design." Journal of Genetics 85, no. 3 (2006): 187–91. http://dx.doi.org/10.1007/bf02935329.
6. Chen, Jie, Jianfeng Luo, Kenneth Liu, and Devan V. Mehrotra. "On power and sample size computation for multiple testing procedures." Computational Statistics & Data Analysis 55, no. 1 (2011): 110–22. http://dx.doi.org/10.1016/j.csda.2010.05.024.
7. Chen, Xinjia. "Exact computation of minimum sample size for estimation of binomial parameters." Journal of Statistical Planning and Inference 141, no. 8 (2011): 2622–32. http://dx.doi.org/10.1016/j.jspi.2011.02.015.
8. Gudicha, Dereje W., Fetene B. Tekle, and Jeroen K. Vermunt. "Power and Sample Size Computation for Wald Tests in Latent Class Models." Journal of Classification 33, no. 1 (2016): 30–51. http://dx.doi.org/10.1007/s00357-016-9199-1.
9. Dharan, Bala G. "A priori sample size evaluation and information matrix computation for time series models." Journal of Statistical Computation and Simulation 21, no. 2 (1985): 171–77. http://dx.doi.org/10.1080/00949658508810811.
10. Chen, Huifen, and Tsu-Kuang Yang. "Computation of the sample size and coverage for guaranteed-coverage nonnormal tolerance intervals." Journal of Statistical Computation and Simulation 63, no. 4 (1999): 299–320. http://dx.doi.org/10.1080/00949659908811959.

Dissertations / Theses on the topic "Sample-size Computation"

1. Riou, Jérémie. "Multiplicité des tests, et calculs de taille d'échantillon en recherche clinique" [Multiple testing and sample size computation in clinical research]. Thesis, Bordeaux 2, 2013. http://www.theses.fr/2013BOR22066/document.

Abstract:
This work addresses the multiple testing problems that arise in clinical trials. Nowadays it is increasingly common for a clinical trial to target a multifactorial effect of a product and therefore to define multiple co-primary endpoints; the study is declared significant if and only if at least r of the m tested null hypotheses are rejected. In this context, statisticians must account for the multiplicity this practice induces. We first derive an exact multiple-testing correction for data analysis and sample size computation when r = 1. We then treat sample size computation for any value of r, when single-step or stepwise procedures are used. Finally, we address correcting the significance level induced by the search for an optimal coding of a continuous explanatory variable in a generalized linear model.
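The at-least-r-of-m co-primary-endpoint setting described in the abstract above is easy to explore numerically. Below is a minimal Monte Carlo power sketch, assuming m independent one-sided z-tests with a common standardized effect and a simple Bonferroni-adjusted level (illustrative assumptions and function name, not the exact corrections derived in the thesis):

```python
import numpy as np
from scipy.stats import norm

def power_at_least_r(m, r, effect, n, alpha=0.025, reps=100_000, seed=0):
    """Monte Carlo power of rejecting at least r of m independent one-sided
    z-tests, each run at a Bonferroni-adjusted level alpha/m (hypothetical
    design; each z-statistic has mean effect*sqrt(n) under the alternative)."""
    rng = np.random.default_rng(seed)
    z_crit = norm.ppf(1 - alpha / m)                  # per-test critical value
    z = rng.normal(effect * np.sqrt(n), 1.0, size=(reps, m))
    return ((z > z_crit).sum(axis=1) >= r).mean()

# Example: m = 3 co-primary endpoints, success requires r = 2 rejections
p = power_at_least_r(m=3, r=2, effect=0.3, n=100)
```

Swapping the Bonferroni rule for a stepwise procedure, as the thesis does, only changes the per-replicate rejection logic; the simulation loop stays the same.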
2. Schintler, Laurie A., and Manfred M. Fischer. "The Analysis of Big Data on Cities and Regions - Some Computational and Statistical Challenges." WU Vienna University of Economics and Business, 2018. http://epub.wu.ac.at/6637/1/2018%2D10%2D28_Big_Data_on_cities_and_regions_untrack_changes.pdf.

Abstract:
Big Data on cities and regions bring new opportunities and challenges to data analysts and city planners. On the one side, they hold great promise for combining increasingly detailed data on each citizen with critical infrastructures to plan, govern, and manage cities and regions, improve their sustainability, optimize processes, and maximize the provision of public and private services. On the other side, the massive sample size and high dimensionality of Big Data and their geo-temporal character introduce unique computational and statistical challenges. This chapter provides an overview of the salient characteristics of Big Data and of how these features drive a paradigm change in data management and analysis, and in the computing environment.
Series: Working Papers in Regional Science.
3. Maremba, Thanyani Alpheus. "Computation of estimates in a complex survey sample design." Thesis (M.Sc. (Statistics)), University of Limpopo, 2019. http://hdl.handle.net/10386/2920.

Abstract:
This research study has demonstrated the complexity involved in complex survey sample design (CSSD). Furthermore, the study has proposed methods to account for each step taken in sampling and at the estimation stage, using the theory of survey sampling, CSSD-based case studies, and practical implementation based on census attributes. CSSD methods are designed to improve statistical efficiency, reduce costs, and improve precision for sub-group analyses relative to a simple random sample (SRS). They are commonly used by statistical agencies as well as development and aid organisations. CSSDs provide one of the most challenging fields for applying statistical methodology. Researchers encounter a vast diversity of unique practical problems in the course of studying populations. These include, inter alia: non-sampling errors, specific population structures, contaminated distributions of study variables, unsatisfactory sample sizes, incorporation of auxiliary information available on many levels, simultaneous estimation of characteristics in various sub-populations, integration of data from many waves or phases of the survey, and incompletely specified sampling procedures accompanying published data. While the study has not exhausted all the available real-life scenarios, it has outlined potential problems illustrated using examples and suggested appropriate approaches at each stage. Dealing with the attributes of CSSDs mentioned above brings about the need to formulate sophisticated statistical procedures dedicated to the specific conditions of a sample survey. CSSD methodologies give birth to a wide variety of approaches, methodologies, and procedures that borrow strength from virtually all branches of statistics.
The application of various statistical methods from sample design to weighting and estimation ensures that optimal estimates of a population and various domains are obtained from the sample data. CSSDs are probability sampling methodologies from which inferences are drawn about the population. The methods used in the process of producing estimates include adjustment for unequal probability of selection (resulting from stratification, clustering, and probability proportional to size (PPS)), non-response adjustments, and benchmarking to auxiliary totals. When estimates of survey totals, means, and proportions are computed using various methods, the results do not differ; this holds when estimates are calculated for planned domains that are taken into account in sample design and benchmarking. In contrast, measures of precision such as standard errors and coefficients of variation yield different results depending on the extent to which the design information is incorporated during estimation. The literature has revealed that most statistical computer packages assume an SRS design when estimating variances. The replication method was used to calculate measures of precision which take into account all the sampling parameters and weighting adjustments computed in the CSSD process. The creation of replicate weights and the estimation of variances were done using WesVar, a statistical computer package capable of producing statistical inference from data collected through CSSD methods.
Keywords: Complex sampling, Survey design, Probability sampling, Probability proportional to size, Stratification, Area sampling, Cluster sampling.
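The replication approach mentioned in the abstract above (replicate weights, there built with WesVar) can be illustrated with a tiny delete-one-PSU jackknife. This is a sketch of the JK2 scheme for a stratified design with exactly two PSUs per stratum; the data, weights, and function name are hypothetical, not the thesis's actual computation:

```python
import numpy as np

def jk2_variance(y, w, stratum, psu):
    """JK2 jackknife variance of a weighted total for a stratified design with
    two PSUs per stratum: each replicate zeroes the weights of one PSU and
    doubles those of its pair; squared deviations of replicate totals from the
    full-sample total are summed and halved (the (n_h - 1)/n_h factor, n_h = 2)."""
    y, w = np.asarray(y, float), np.asarray(w, float)
    stratum, psu = np.asarray(stratum), np.asarray(psu)
    full_total = np.sum(w * y)
    var = 0.0
    for h in np.unique(stratum):
        in_h = stratum == h
        for p in np.unique(psu[in_h]):
            w_rep = w.copy()
            w_rep[in_h & (psu == p)] = 0.0       # drop this PSU ...
            w_rep[in_h & (psu != p)] *= 2.0      # ... and double its pair
            var += (np.sum(w_rep * y) - full_total) ** 2
    return var / 2.0

# Two strata, two PSUs each, unit base weights (toy data)
v = jk2_variance([1, 3, 2, 4], [1, 1, 1, 1], [0, 0, 1, 1], [1, 2, 1, 2])
```

Because the design information lives entirely in the replicate weights, the same machinery extends to ratios, means, and calibrated estimators by replacing the weighted total with any estimator of interest.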

Books on the topic "Sample-size Computation"

1. Wilson, Kenneth R. A computer program for sample size computations for banding studies. U.S. Dept. of the Interior, Fish and Wildlife Service, 1989.

Book chapters on the topic "Sample-size Computation"

1. Dalgaard, Peter. "Power and the computation of sample size." In Statistics and Computing. Springer New York, 2008. http://dx.doi.org/10.1007/978-0-387-79054-1_9.
2. Hothorn, L. "Sample Size Estimation for Several Trend Tests in the k-Sample Problem." In Computational Statistics. Physica-Verlag HD, 1992. http://dx.doi.org/10.1007/978-3-642-48678-4_50.
3. Li, Hui-Qiong, and Liu-Cang Wu. "Sample Size Determination via Non-unity Relative Risk for Stratified Matched-Pair Studies." In Computational Risk Management. Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-18387-4_54.
4. Vaclavik, Marek, Zuzana Sikorova, and Iva Cervenkova. "Analysis of Independences of Normality on Sample Size with Regards to Reliability." In Computational Statistics and Mathematical Modeling Methods in Intelligent Systems. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-31362-3_29.
5. Rajeshwari, I., and K. Shyamala. "Study on Performance of Classification Algorithms Based on the Sample Size for Crop Prediction." In Computational Vision and Bio-Inspired Computing. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-37218-7_110.
6. "Sample size computations." In Multiple Comparisons. Chapman and Hall/CRC, 1996. http://dx.doi.org/10.1201/b15074-15.
7. Datta, D. "Mathematics of Probabilistic Uncertainty Modeling." In Advances in Computational Intelligence and Robotics. IGI Global, 2014. http://dx.doi.org/10.4018/978-1-4666-4991-0.ch009.

Abstract:
This chapter presents uncertainty modeling using probabilistic methods. Probabilistic uncertainty analysis addresses randomness in the parameters of a model. Randomness is characterized by a specified probability distribution such as normal, lognormal, or exponential, and the corresponding samples are generated by various methods. Monte Carlo simulation is applied to explore probabilistic uncertainty modeling. Being a statistical process, Monte Carlo simulation is based on random number generation from the specified distributions of the uncertain parameters; its sample size is generally very large, as required to keep the computational error small. Latin hypercube sampling and importance sampling are explored in brief. The chapter also presents polynomial-chaos-based probabilistic uncertainty modeling: an efficient alternative to plain Monte Carlo simulation in the sense that the sample size is very small, dictated by the number of uncertain parameters and by the order of the polynomial selected to represent each uncertain parameter.
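The chapter's point that plain Monte Carlo needs large sample sizes can be seen in a few lines: the sampling error of a Monte Carlo mean decays only like 1/sqrt(N). A minimal sketch with a hypothetical lognormal model input, compared against the analytic mean:

```python
import numpy as np

rng = np.random.default_rng(42)

def mc_mean(n):
    """Plain Monte Carlo estimate of the mean of a lognormal(0, 0.5) input."""
    return rng.lognormal(mean=0.0, sigma=0.5, size=n).mean()

true_mean = np.exp(0.5 ** 2 / 2)   # analytic lognormal mean, exp(sigma^2 / 2)
for n in (100, 10_000, 1_000_000):
    # absolute error shrinks roughly as 1/sqrt(n)
    print(n, abs(mc_mean(n) - true_mean))
```

Latin hypercube sampling and polynomial chaos, also surveyed in the chapter, attack exactly this cost by making each sample (or basis term) carry more information.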
8. Zhang, David, Xiao-Yuan Jing, and Jian Yang. "Solutions of LDA for Small Sample Size Problems." In Computational Intelligence and its Applications. IGI Global, 2006. http://dx.doi.org/10.4018/978-1-59140-830-7.ch006.

Abstract:
This chapter shows solutions of LDA for small sample-size (SSS) problems. We first give an overview of the existing LDA regularization techniques. Then, a unified framework for LDA and a combined LDA algorithm for the SSS problem are described. Finally, we provide experimental results and some conclusions.
9. Katsis, Athanassios, and Hector E. Nistazakis. "Sample Size Criteria for Estimating the Prevalence of a Disease." In International Conference of Computational Methods in Sciences and Engineering 2004 (ICCMSE 2004). CRC Press, 2019. http://dx.doi.org/10.1201/9780429081385-63.
10. Haussler, David. "Generalizing the PAC Model: Sample Size Bounds From Metric Dimension-based Uniform Convergence Results." In Proceedings of the Second Annual Workshop on Computational Learning Theory. Elsevier, 1989. http://dx.doi.org/10.1016/b978-0-08-094829-4.50032-5.

Conference papers on the topic "Sample-size Computation"

1. Zhu, Shuguang, Fangzhou Zhu, Weibing Fan, et al. "Discussion on the Relation Between SVM Training Sample Size and Correct Forecast Ratio for Simulation Experiment Results." In 2010 International Conference on Intelligent Computation Technology and Automation (ICICTA). IEEE, 2010. http://dx.doi.org/10.1109/icicta.2010.301.
2. Pagalthivarthi, Krishnan V., John M. Furlan, and Robert J. Visintainer. "Effective Particle Size Representation for Erosion Wear in Centrifugal Pump Casings." In ASME 2017 Fluids Engineering Division Summer Meeting. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/fedsm2017-69240.

Abstract:
For the purpose of Computational Fluid Dynamic (CFD) simulations, the broad particle size distribution (PSD) encountered in industrial slurries is classified into a discrete number of size classes. Since mono-size simulations consume much less computational time, especially in 3D, it would be advantageous to determine an equivalent single particle size that yields the same wear distribution predictions as the multi-size simulations. This work extends a previous two-dimensional study [1], which determined an effective equivalent mono-size representation for a specific PSD slurry flow through three selected pumps. The current study covers two-dimensional simulations over a wide range of pump sizes (40 pumps), 2 inlet concentrations, and 4 particle size distributions. Multi-size wear predictions are compared with several candidate representative mono-size wear predictions. In addition, multi-size and mono-size results from three-dimensional simulations of a typical slurry pump are compared as a sample case, indicating that the conclusions drawn from two-dimensional simulations hold for three-dimensional simulations as well. It is observed that by using a mono-size equivalent, the computation time is 20–25% of that for a multi-size (6-particle) simulation.
3. Gercekovich, D. A., E. Yu Gorbachevskaya, and I. S. Shilnikova. "Identification of basic criteria of portfolio analysis based on the rolling verification principle." In 1st International Workshop on Advanced Information and Computation Technologies and Systems 2020. Crossref, 2021. http://dx.doi.org/10.47350/aicts.2020.06.

Abstract:
The article considers and tests, using real examples, the problem of synthesizing optimal training-sample sizes specific to each of the financial instruments under study. The sample size is selected according to a quality criterion based on the accuracy of the generated forecasts. The stated algorithm, which serves as the basis for synthesizing widely diversified portfolios, can significantly increase the efficiency of investment decisions by taking into account the characteristics of the markets under study.
4. Phan, John H., Richard A. Moffitt, Andrea B. Barrett, and May D. Wang. "Improving Microarray Sample Size Using Bootstrap Data Combination." In 2008 International Multi-symposiums on Computer and Computational Sciences (IMSCCS). IEEE, 2008. http://dx.doi.org/10.1109/imsccs.2008.36.
5. Shen, Zebang, Hui Qian, Tongzhou Mu, and Chao Zhang. "Accelerated Doubly Stochastic Gradient Algorithm for Large-scale Empirical Risk Minimization." In Twenty-Sixth International Joint Conference on Artificial Intelligence. International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/378.

Abstract:
Nowadays, algorithms with fast convergence, small memory footprints, and low per-iteration complexity are particularly favorable for artificial intelligence applications. In this paper, we propose a doubly stochastic algorithm with a novel accelerating multi-momentum technique to solve the large-scale empirical risk minimization problem for learning tasks. While enjoying a provably superior convergence rate, in each iteration such an algorithm only accesses a mini-batch of samples and meanwhile updates a small block of variable coordinates, which substantially reduces the amount of memory reference when both massive sample size and ultra-high dimensionality are involved. Specifically, to obtain an ε-accurate solution, our algorithm requires only O(log(1/ε)/sqrt(ε)) overall computation for the general convex case and O((n+sqrt(nκ))log(1/ε)) for the strongly convex case. Empirical studies on huge-scale datasets are conducted to illustrate the efficiency of our method in practice.
6. Dong, Guangling, Chi He, and Zhengguo Dai. "Optimal sample size allocation for integrated test scheme." In 2015 IEEE International Conference on Computational Intelligence and Virtual Environments for Measurement Systems and Applications (CIVEMSA). IEEE, 2015. http://dx.doi.org/10.1109/civemsa.2015.7158624.
7. Lai, Kevin, Wei Xu, and Xin Sun. "An Inverse Algorithm for Resonance Inspection." In ASME 2012 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/imece2012-85485.

Abstract:
Compared to other contemporarily used non-destructive evaluation (NDE) techniques, resonance inspection (RI), which employs the shift in the natural vibrational frequency spectrum induced by damage to detect defects, is advantageous in many respects, such as low cost, high testing speed, and broad applicability to complex structures. However, the inability to provide damage details, i.e. the location, dimension, or type of the flaws, severely hinders its widespread application and further development despite its early success in the automobile industry for quality inspection of safety-critical parts. In this study, an inverse RI algorithm using a maximum correlation function as the filtering function is proposed to quantify the location and size of flaws in a discrepant part. The algorithm and the numerical schemes are validated using a dog-bone shaped stainless steel sample, with the spectrum data for the original and flawed parts generated by a commercial FEM package. The results show that multiple flaws can be accurately identified using the proposed RI inversion method. The study further shows that the reliability of the inversion method is sensitive to the spectrum range included in the correlation function computation. It is demonstrated that the frequency range required for accurate predictions is inversely correlated with defect size: large defects can be detected using only lower-frequency spectrum data, while smaller defects require a higher frequency range.
8. Qu, Xueyong, and Raphael T. Haftka. "Design Under Uncertainty Using Monte Carlo Simulation and Probabilistic Sufficiency Factor." In ASME 2003 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2003. http://dx.doi.org/10.1115/detc2003/dac-48704.

Abstract:
Monte Carlo simulation is commonly employed to evaluate the system probability of failure for problems with multiple failure modes in design under uncertainty. The probability calculated from Monte Carlo simulation has random errors due to limited sample size, which create numerical noise in the dependence of the probability on design variables. This in turn may lead the design to a spurious optimum. A probabilistic sufficiency factor (PSF) approach is proposed that combines the safety factor and the probability of failure. The PSF represents a factor of safety relative to a target probability of failure, and it can be calculated from the results of Monte Carlo simulation (MCS) with little extra computation. The paper presents the use of the PSF with a design response surface (DRS), which fits it as a function of the design variables, filtering out the noise in the MCS results. It is shown that the DRS for the PSF is more accurate than a DRS for the probability of failure or for the safety index. The PSF also provides more information than the probability of failure or the safety index for the optimization procedure in regions of low failure probability, so the convergence of reliability-based optimization is accelerated. The PSF gives a measure of safety that designers can use more readily than the probability of failure or the safety index to estimate the weight increase required to reach a target safety level. To reduce the computational cost of reliability-based design optimization, a variable-fidelity technique and deterministic optimization were combined with the probabilistic sufficiency factor approach. Example problems are studied to demonstrate the methodology.
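The probabilistic sufficiency factor described in the abstract above can be read off Monte Carlo output as a low-order quantile of the safety factor (capacity over response). A minimal sketch under assumed normal capacity and response distributions; the numbers and the helper name are illustrative, not the paper's example:

```python
import numpy as np

def psf(capacity, response, p_target):
    """Probabilistic sufficiency factor from MCS samples: the p_target-quantile
    of the safety factor capacity/response. PSF >= 1 means the design meets the
    target failure probability; PSF < 1 quantifies the shortfall."""
    s = np.sort(np.asarray(capacity, float) / np.asarray(response, float))
    k = max(int(np.ceil(p_target * len(s))) - 1, 0)   # empirical quantile index
    return s[k]

rng = np.random.default_rng(1)
cap = rng.normal(10.0, 1.0, 100_000)    # hypothetical capacity samples
resp = rng.normal(7.0, 1.0, 100_000)    # hypothetical response samples
factor = psf(cap, resp, p_target=1e-3)
```

As the abstract notes, this reuses the same MCS samples that estimate the failure probability, so the extra computation beyond the simulation itself is a sort and an index lookup.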
9. Bi, Chengpeng, Mara Becker, and Steve Leeder. "Derivation of minimum best sample size from microarray data sets: A Monte Carlo approach." In 2011 IEEE Symposium on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB), part of the 2011 IEEE Symposium Series on Computational Intelligence (SSCI). IEEE, 2011. http://dx.doi.org/10.1109/cibcb.2011.5948461.
10. Wu, Fang-Xiang, W. J. Zhang, and Anthony J. Kusalik. "On Determination of Minimum Sample Size for Discovery of Temporal Gene Expression Patterns." In 2006 International Multi-Symposiums on Computer and Computational Sciences (IMSCCS). IEEE, 2006. http://dx.doi.org/10.1109/imsccs.2006.95.