Academic literature on the topic 'Large sample size'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Large sample size.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Large sample size"

1

Lantz, Björn. "The large sample size fallacy." Scandinavian Journal of Caring Sciences 27, no. 2 (2012): 487–92. http://dx.doi.org/10.1111/j.1471-6712.2012.01052.x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Choi, Jai Won, and Balgobin Nandram. "Large Sample Problems." International Journal of Statistics and Probability 10, no. 2 (2021): 81. http://dx.doi.org/10.5539/ijsp.v10n2p81.

Full text
Abstract:
Variance is very important in test statistics, as it measures the degree of reliability of estimates. It depends not only on the sample size but also on other factors such as the population size, the type of data and its distribution, and the method of sampling or experiment. Here, however, we assume that these other factors are fixed and that the test statistic depends only on the sample size.

When the sample size is larger, the variance is smaller. A smaller variance makes test statistics larger, that is, it gives more significant results in testing a hypothesis, whatever that hypothesis may be. Thus, the test result is often misleading because much of it merely reflects the sample size. We therefore discuss this large sample problem in performing traditional tests and show how to fix it.
APA, Harvard, Vancouver, ISO, and other styles
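The effect described in the abstract above is easy to reproduce numerically. The Python sketch below is illustrative only and not taken from the paper; the tiny true effect of 0.02 standard deviations is an assumed value. It runs one-sample t-tests at increasing sample sizes and shows the p-value collapsing even though the effect is practically negligible.

```python
# Minimal illustration (not the authors' code): a practically negligible
# effect becomes "statistically significant" once the sample is large enough.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_effect = 0.02  # hypothetical tiny shift, in standard-deviation units

for n in (100, 1_000, 10_000, 1_000_000):
    sample = rng.normal(loc=true_effect, scale=1.0, size=n)
    t_stat, p_value = stats.ttest_1samp(sample, popmean=0.0)
    print(f"n={n:>9,d}  t={t_stat:6.2f}  p={p_value:.2e}")
```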
3

Byrne, Enda M., Anjali K. Henders, Ian B. Hickie, Christel M. Middeldorp, and Naomi R. Wray. "Nick Martin and the Genetics of Depression: Sample Size, Sample Size, Sample Size." Twin Research and Human Genetics 23, no. 2 (2020): 109–11. http://dx.doi.org/10.1017/thg.2020.13.

Full text
Abstract:
Nick Martin is a pioneer in recognizing the need for large sample size to study the complex, heterogeneous and polygenic disorders of common mental disorders. In the predigital era, questionnaires were mailed to thousands of twin pairs around Australia. Always quick to adopt new technology, Nick’s studies progressed to phone interviews and then online. Moreover, Nick was early to recognize the value of collecting DNA samples. As genotyping technologies improved over the years, these twin and family cohorts were used for linkage, candidate gene and genome-wide association studies. These cohorts have underpinned many analyses to disentangle the complex web of genetic and lifestyle factors associated with mental health. With characteristic foresight, Nick is chief investigator of our Australian Genetics of Depression Study, which has recruited 16,000 people with self-reported depression (plus DNA samples) over a time frame of a few months — analyses are currently ongoing. The mantra of sample size, sample size, sample size has guided Nick’s research over the last 30 years and continues to do so.
APA, Harvard, Vancouver, ISO, and other styles
4

Armstrong, Richard A. "Is there a large sample size problem?" Ophthalmic and Physiological Optics 39, no. 3 (2019): 129–30. http://dx.doi.org/10.1111/opo.12618.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Kumar, A. "The Sample Size." Journal of Universal College of Medical Sciences 2, no. 1 (2014): 45–47. http://dx.doi.org/10.3126/jucms.v2i1.10493.

Full text
Abstract:
Finding an "appropriate sample size" is the most basic and foremost problem a research worker faces in all sampling-based analytical research. A very large sample results in unnecessary wastage of resources, while a very small sample may adversely affect the accuracy of sample estimates and thus undermine the very efficacy of the selected sampling plan. The present paper attempts to highlight the main determinant factors and the analytical approach towards estimation of the required sample size, along with a few illustrations.
APA, Harvard, Vancouver, ISO, and other styles
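As a concrete instance of the kind of calculation the abstract above refers to, the standard formula for estimating a population mean to within a margin of error E at confidence level 1 - alpha is n = (z * sigma / E)^2. The short sketch below evaluates it; the values of sigma and E are illustrative assumptions, not figures from the article.

```python
# Illustrative sketch of the textbook sample-size formula for estimating a
# mean: n = (z * sigma / E)^2.  The sigma and E values below are assumed.
import math
from scipy import stats

def sample_size_for_mean(sigma, margin_of_error, confidence=0.95):
    z = stats.norm.ppf(1 - (1 - confidence) / 2)  # two-sided critical value
    return math.ceil((z * sigma / margin_of_error) ** 2)

print(sample_size_for_mean(sigma=15.0, margin_of_error=2.0))  # -> 217
```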
6

Kitikidou, K., and G. Chatzilazarou. "Estimating the sample size for fitting taper equations." Journal of Forest Science 54, no. 4 (2008): 176–82. http://dx.doi.org/10.17221/789-jfs.

Full text
Abstract:
Much work has been done fitting taper equations to describe tree bole shapes, but few researchers have investigated how large the sample size should be. In this paper, a method that requires two variables that are linearly correlated was applied to determine the sample size for fitting taper equations. Two cases of sample size estimation were tested, based on the method mentioned above. In the first case, the sample size required is referred to the total number of diameters estimated in the sampled trees. In the second case, the sample size required is referred to the number of sampled trees. The analysis showed that both methods are efficient from a validity standpoint but the first method has the advantage of decreased cost, since it costs much more to incrementally sample another tree than it does to make another diameter measurement on an already sampled tree.
APA, Harvard, Vancouver, ISO, and other styles
7

Raudys, Šarūnas. "Trainable fusion rules. I. Large sample size case." Neural Networks 19, no. 10 (2006): 1506–16. http://dx.doi.org/10.1016/j.neunet.2006.01.018.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Chittenden, Mark E. "Given a significance test, How large a sample size is large enough?" Fisheries 27, no. 8 (2002): 25–29. http://dx.doi.org/10.1577/1548-8446(2002)027<0025:gasthl>2.0.co;2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Barreiro-Ures, Daniel, Ricardo Cao, and Mario Francisco-Fernández. "Bandwidth Selection in Nonparametric Regression with Large Sample Size." Proceedings 2, no. 18 (2018): 1166. http://dx.doi.org/10.3390/proceedings2181166.

Full text
Abstract:
In the context of nonparametric regression estimation, the behaviour of kernel methods such as the Nadaraya-Watson or local linear estimators is heavily influenced by the value of the bandwidth parameter, which determines the trade-off between bias and variance. This clearly implies that the selection of an optimal bandwidth, in the sense of minimizing some risk function (MSE, MISE, etc.), is a crucial issue. However, the task of estimating an optimal bandwidth using the whole sample can be very expensive in terms of computing time in the context of Big Data, due to the computational complexity of some of the most used algorithms for bandwidth selection (leave-one-out cross-validation, for example, has O(n²) complexity). To overcome this problem, we propose two methods that estimate the optimal bandwidth for several subsamples of our large dataset and then extrapolate the result to the original sample size making use of the asymptotic expression of the MISE bandwidth. Preliminary simulation studies show that the proposed methods lead to a drastic reduction in computing time, while the statistical precision is only slightly decreased.
APA, Harvard, Vancouver, ISO, and other styles
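A minimal sketch of the general idea in the abstract above: select a cross-validation bandwidth on small subsamples, then rescale to the full sample size using the asymptotic rate h_opt(n) ~ C * n^(-1/5) that holds for local-constant and local-linear regression. This is a simplified illustration under those assumptions, with a deliberately naive leave-one-out selector; it is not the authors' estimator.

```python
# Sketch (not the authors' method): pick a CV bandwidth on small subsamples,
# then extrapolate to the full sample size via h_opt(n) ~ C * n**(-1/5).
import numpy as np

def loo_cv_bandwidth(x, y, grid):
    """Naive leave-one-out CV for a Nadaraya-Watson estimator."""
    best_h, best_err = None, np.inf
    for h in grid:
        w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
        np.fill_diagonal(w, 0.0)                     # leave one out
        fit = (w @ y) / np.clip(w.sum(axis=1), 1e-12, None)
        err = np.mean((y - fit) ** 2)
        if err < best_err:
            best_h, best_err = h, err
    return best_h

rng = np.random.default_rng(1)
n, r, n_sub = 100_000, 1_000, 5          # full size, subsample size, number of subsamples
x = rng.uniform(0, 1, n)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, n)

grid = np.geomspace(0.005, 0.5, 30)
consts = []
for _ in range(n_sub):
    idx = rng.choice(n, size=r, replace=False)
    h_r = loo_cv_bandwidth(x[idx], y[idx], grid)
    consts.append(h_r * r ** 0.2)        # back out C from h_r = C * r**(-1/5)
h_full = np.mean(consts) * n ** (-0.2)   # extrapolated bandwidth for all n points
print(f"extrapolated bandwidth for n={n}: {h_full:.4f}")
```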
10

Lara, L., W. D. Cotton, L. Feretti, et al. "A new sample of large angular size radio galaxies." Astronomy & Astrophysics 370, no. 2 (2001): 409–25. http://dx.doi.org/10.1051/0004-6361:20010254.

Full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Large sample size"

1

Tʻang, Min. "Extention of evaluating the operating characteristics for dependent mixed variables-attributes sampling plans to large first sample size /." Online version of thesis, 1991. http://hdl.handle.net/1850/11208.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Chen, Mei-Kuang. "Who Are the Cigarette Smokers in Arizona." Thesis, The University of Arizona, 2007. http://hdl.handle.net/10150/193268.

Full text
Abstract:
The purpose of this study was to investigate the relationship between cigarette smoking and socio-demographic variables based on the empirical literature and the primitive theories in the field. Two regression approaches, logistic regression and linear multiple regression, were conducted on the two most recent Arizona Adult Tobacco Surveys to test the hypothesized models. The results showed that cigarette smokers in Arizona are mainly residents who have not completed a four-year college degree, who are unemployed, White, non-Hispanic, or young to middle-aged adults. Among the socio-demographic predictors of interest, education is the most important variable in identifying cigarette smokers, even though the predictive power of these socio-demographic variables is small. Practical and methodological implications of these findings are discussed.
APA, Harvard, Vancouver, ISO, and other styles
3

Hawkins, Patrick Lawrence. "Variation in the Modified First Metatarsal of a Large Sample of Tapirus polkensis and the Functional Implications for Ceratomorphs." Digital Commons @ East Tennessee State University, 2011. https://dc.etsu.edu/etd/1241.

Full text
Abstract:
The Mio-Pliocene age Gray Fossil Site of northeastern Tennessee has the largest collection of tapir postcranial skeletons in the world. Though representing a single species, a few localized structures show high variability. This paper deals with variation of the first metatarsal, which in tapirs was reduced as an early adaptation for running and then retrofitted to serve as a special origin for flexors and adductors of the proximal phalanges. The first metatarsal connects the medial ankle with a posterior process of the third metatarsal in tapiroids. In Tapirus indicus, T. webbi, and 6 out of 31 T. polkensis feet at Gray, it extends more laterally to articulate with the fourth metatarsal. This condition is too variable for species distinction but is correlated with a decrease in the metatarsophalangeal joint facet, suggesting a mobility reduction likely related to the increased range and feeding strategy seen in extant T. indicus.
APA, Harvard, Vancouver, ISO, and other styles
4

Shamaskin, Andrea. "Age differences in long-term adjustment and psychosocial outcomes in a large multi-site sample 5-10 years after heart transplant." VCU Scholars Compass, 2011. http://scholarscompass.vcu.edu/etd/2349.

Full text
Abstract:
Research on age differences in heart transplant patients has focused primarily on medical outcomes, with mixed findings regarding mortality and morbidity rates and limited research regarding age differences in psychosocial and quality of life outcomes. To gain a more complete understanding of psychosocial adjustment after heart transplant, this study examined age differences in: satisfaction with quality of life, satisfaction with social support, depressive symptoms, negative affect, symptom distress, stress related to heart transplant, overall health functioning, coping strategies, and aspects of adherence. Results indicate that older patients, compared to younger patients, report better adjustment and quality of life across numerous outcomes 5-10 years after heart transplant. These findings are consistent with previous literature examining age differences in developmental changes with emotion regulation and coping. This study hopes to contribute to the discussion of age and heart transplant, highlighting the importance of considering quality of life in addition to medical outcomes.
APA, Harvard, Vancouver, ISO, and other styles
5

Tunková, Martina. "Městské lázně" [Municipal baths]. Master's thesis, Vysoké učení technické v Brně, Fakulta architektury, 2010. http://www.nusl.cz/ntk/nusl-215713.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

謝孟芳. "Sample size determination when exploring large itemset in data mining." Thesis, 2002. http://ndltd.ncl.edu.tw/handle/80616395923148082315.

Full text
Abstract:
Master's thesis, National Taiwan University of Science and Technology, Department of Information Management, 2002. Data mining, which discovers useful knowledge from large collections of data, is a fast-developing field of computer science and technology. Current technology makes it fairly easy to collect a large amount of data, but conducting analysis on these fast-growing data becomes a formidable task. A complete census tends to be slow and expensive. A natural and simple alternative is to mine a sample instead of the whole database, which mitigates computational effort at the cost of sampling error. Discovery of association rules is a typical problem in data mining. Finding large itemsets is essential to forming the association rules. When using random sampling to find the large itemsets, we study the relationship between the computational effort (sample size) and the sampling performance, indicated by the probability of effectively finding the correct large itemsets.
APA, Harvard, Vancouver, ISO, and other styles
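The trade-off studied in the thesis above can be bounded with a standard concentration argument: if an itemset's true support is p, the support estimated from n randomly sampled transactions deviates from p by more than ε with probability at most 2·exp(−2nε²) (Hoeffding's inequality), so n ≥ ln(2/δ)/(2ε²) samples suffice for accuracy ε with probability at least 1 − δ. The sketch below evaluates that bound; it is an illustration of the general principle, not the thesis's own derivation.

```python
# Illustration (not the thesis's derivation): Hoeffding-bound sample size for
# estimating itemset support within +/- eps with probability >= 1 - delta.
import math

def support_sample_size(eps, delta):
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))

for eps in (0.01, 0.02, 0.05):
    print(f"eps={eps:.2f}  n >= {support_sample_size(eps, delta=0.01):,d}")
```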
7

Chiang, Chieh, and 姜杰. "A Study on Sample Size Determination for Evaluating Within-Device Precision, Heritability, and Bioequivalence Based on Modified Large-Sample Method." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/69389568825164889986.

Full text
Abstract:
Doctoral dissertation, National Taiwan University, Graduate Institute of Agronomy. Statistical criteria for evaluation of precision or variation often involve functions of the second moments of the normal distribution. Under the two-stage nested random-effects model, heritability is defined as the ratio of genetic variance to the total variance. Under replicated crossover designs, the criteria for individual bioequivalence (IBE) proposed by the guidance of the US Food and Drug Administration (FDA) contain the squared mean difference, the variance of the treatment-by-subject interaction, and the difference in within-subject variances between the generic and innovative products. On the other hand, the criterion for evaluation of the within-device precision of in-vitro diagnostic devices (IVD) is the sum of the variance components due to day, run, and replicates. The criterion for in-vitro population bioequivalence (PBE) proposed by the draft guidance of the US FDA consists of the squared mean difference and the sum of the differences in variance components due to batch, sample, and life-stage. These criteria can be reformulated as linear combinations of variance components under a logarithmic transformation. The one-sided confidence limits for the linearized criteria derived by the modified large-sample (MLS) method have been proposed as the test statistics for inference in these different applications. However, due to the complexity of the power function, the literature on sample size determination for inference based on second moments is scarce. We proved that the distribution of the one-sided confidence bound of the linearized criterion is asymptotically normal. Hence the asymptotic power can be derived for sample size determination in different applications to within-device precision, heritability, IBE, and in-vitro PBE. Simulation studies were conducted to investigate the impact of the magnitudes of mean differences and variance components on sample sizes. In addition, empirical powers obtained from simulation studies are compared with the asymptotic powers to examine whether the sample sizes determined by our proposed methods provide sufficient power. The proposed methods are illustrated with real data for practical applications. Discussion, final remarks, and future research are also presented.
APA, Harvard, Vancouver, ISO, and other styles
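For reference, the Graybill-Wang modified large-sample (MLS) upper confidence limit for a criterion of the form θ = Σ cᵢσᵢ², with positive coefficients and independent mean squares sᵢ² on dfᵢ degrees of freedom, is Σ cᵢsᵢ² + sqrt(Σ cᵢ²sᵢ⁴Hᵢ²), where Hᵢ = dfᵢ/χ²(α, dfᵢ) − 1. The sketch below implements only that textbook positive-coefficient case; it is not the dissertation's code, it omits the extensions needed for the mixed-sign criteria (IBE, in-vitro PBE) discussed above, and the example numbers are assumed.

```python
# Sketch of the Graybill-Wang MLS upper (1 - alpha) confidence limit for
# theta = sum_i c_i * sigma_i^2, assuming all c_i > 0 and independent mean
# squares s2_i with df_i degrees of freedom.  Not the dissertation's code.
import numpy as np
from scipy import stats

def mls_upper_limit(c, s2, df, alpha=0.05):
    c, s2, df = map(np.asarray, (c, s2, df))
    h = df / stats.chi2.ppf(alpha, df) - 1.0   # per-component inflation factors
    point = np.sum(c * s2)
    return point + np.sqrt(np.sum((c * s2 * h) ** 2))

# Example with assumed values: within-device variance = day + run + replicate.
print(mls_upper_limit(c=[1, 1, 1], s2=[0.04, 0.02, 0.01], df=[19, 19, 40]))
```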
8

"Continuous Monitoring As A Solution To The Large Sample Size Problem In Occupational Exposure Assessment." Tulane University, 2014.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Large sample size"

1

Hughes, William O. Statistical analysis of a large sample size pyroshock test data set. National Aeronautics and Space Administration, Lewis Research Center, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Hughes, William O. Statistical analysis of a large sample size pyroshock test data set. National Aeronautics and Space Administration, Lewis Research Center, 1998.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Hughes, William O. Statistical analysis of a large sample size pyroshock test data set. National Aeronautics and Space Administration, Lewis Research Center, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Rowland, Joshua L. LARE review: Sample exam : site design. Professional Publications, Inc., 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Fujikoshi, Yasunori. Asymptotic approximations for EPMC's of the linear and the quadratic discriminant functions when the sample sizes and the dimension are large. University of Toronto, Dept. of Statistics, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

STATISTICAL ANALYSIS OF A LARGE SAMPLE SIZE PRYOSHOCK TEST DATA SET... NASA/TM-1998-206621... JUN. 1, 1998. s.n., 1999.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Rowland, Joshua L. Lare Review, Section C Sample Exam: Site Design. Professional Publications (CA), 2007.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Hutchinson, G. O. Rhythmic Prose in Imperial Greek Literature. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198821717.003.0001.

Full text
Abstract:
The chapter looks at the division between poetry and prose in ancient and other literatures, and shows the importance of rhythmic patterning in ancient prose. The development of rhythmic prose in Greek and Latin is sketched, the system explained and illustrated (from Latin). It is firmly established, for the first time, which of the main Greek non-Christian authors 31 BC–AD 300 write rhythmically. The method takes a substantial sample of random sentence-endings (usually 400) from each of a large number of Imperial authors; it compares that sample with one sample of the same size (400) drawn randomly from a range of authors earlier than the invention of this rhythmic system. A particular sort of χ²-test is applied. Many Imperial authors, it emerges, write rhythmically; many do not. The genres most likely to offer rhythmic writing are, unexpectedly, narrative: historiography and the novel.
APA, Harvard, Vancouver, ISO, and other styles
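The comparison described in the abstract above is, in form, a contingency-table chi-squared test: counts of preferred rhythmic endings in an author's 400-ending sample against the 400-ending pre-rhythmic control. The sketch below shows that shape with invented counts, purely for illustration; it is not Hutchinson's data or his exact test variant.

```python
# Purely illustrative chi-squared test (invented counts, not Hutchinson's data):
# does an author's 400 sentence-endings show preferred rhythmic cadences more
# often than a 400-ending control sample predating the rhythmic system?
from scipy.stats import chi2_contingency

#                rhythmic endings, other endings
author_sample  = [260, 140]     # hypothetical counts out of 400
control_sample = [170, 230]     # hypothetical counts out of 400

chi2, p, dof, expected = chi2_contingency([author_sample, control_sample])
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")
```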
9

Powell, Roger A., Stephen Ellwood, Roland Kays, and Tiit Maran. Stink or swim: techniques to meet the challenges for the study and conservation of small critters that hide, swim, or climb, and may otherwise make themselves unpleasant. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198759805.003.0008.

Full text
Abstract:
The study of musteloids requires different perspectives and techniques than those needed for most mammals. Musteloids are generally small yet travel long distances and many live or forage underground or under water, limiting the use of telemetry and direct observation. Some are arboreal and nocturnal, facilitating telemetry but limiting observation, trapping, and many non-invasive techniques. Large sexual size dimorphism arguably doubles sample sizes for many research questions. Many musteloids defend themselves by expelling noxious chemicals. This obscure group does not attract funding, even when endangered, further reducing rate of knowledge gain. Nonetheless, passive and active radio frequency identification tags, magnetic-inductance tracking, accelerometers, mini-biologgers and some GPS tags are tiny enough for use with small musteloids. Environmental DNA can document presence of animals rarely seen. These technologies, coupled with creative research design that is well-grounded on the scientific method, form a multi-dimensional approach for advancing our understanding of these charismatic minifauna.
APA, Harvard, Vancouver, ISO, and other styles
10

Garberoglio, Carrie Lou. Secondary Analyses With Large-Scale Data in Deaf Education Research. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780190455651.003.0006.

Full text
Abstract:
This chapter discusses how secondary analyses conducted with large-scale federal data sets offer a way of capturing national samples of the diverse population of deaf students, as well as important features that need to be considered when generalizing findings to practice. The author’s work with federal large-scale data sets has largely focused on an exploration of individual and systemic factors that influence postsecondary outcomes for deaf individuals. Large-scale data sets offer unique opportunities to efficiently test hypotheses and empirically address long-standing assumptions in the field through the use of large sample sizes that bring researchers closer to true representations of the heterogeneity in the Deaf community. Specific examples are shared that highlight some successes and challenges in this approach, and how researchers can best utilize large-scale data sets to conduct secondary analyses in their own work with deaf populations.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Large sample size"

1

Chow, Shein-Chung, Jun Shao, Hansheng Wang, and Yuliya Lokhnygina. "Large Sample Tests for Proportions." In Sample Size Calculations in Clinical Research: Third Edition. Chapman and Hall/CRC, 2017. http://dx.doi.org/10.1201/9781315183084-4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Lara, L., W. D. Cotton, L. Feretti, et al. "A Sample of Large Angular Size Radio Galaxies." In Highlights of Spanish Astrophysics II. Springer Netherlands, 2001. http://dx.doi.org/10.1007/978-94-017-1776-2_108.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Stram, Daniel O. "Design of Large-Scale Genetic Association Studies, Sample Size, and Power." In Design, Analysis, and Interpretation of Genome-Wide Association Scans. Springer New York, 2013. http://dx.doi.org/10.1007/978-1-4614-9443-0_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Ruel, Erin. "How Do We Know if the Sample Size Is Large Enough?" In 100 Questions (and Answers) About Survey Research. SAGE Publications, Inc., 2019. http://dx.doi.org/10.4135/9781506348803.n26.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Li, Qianqiu, and Bill Pikounis. "Shiny Tools for Sample Size Calculation in Process Performance Qualification of Large Molecules." In Springer Proceedings in Mathematics & Statistics. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-319-67386-8_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Mischko, Jens, Stefan Einbock, and Rainer Wagener. "How to Predict the Product Reliability Confidently and Fast with a Minimum Number of Samples in the Wöhler Test." In Lecture Notes in Mechanical Engineering. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-77256-7_11.

Full text
Abstract:
To accurately estimate and predict the (product) lifetime, a large sample size is mandatory, especially for new and unknown materials. The realization of such a sample size is rarely feasible for reasons of cost and capacity. Prior knowledge must therefore be used systematically and consistently to be able to predict the lifetime accurately. Using the example of the Wöhler test, it is shown that lifetime prediction with a minimum number of specimens and minimal test time can be successful when prior knowledge is taken into account.
APA, Harvard, Vancouver, ISO, and other styles
7

Chadwick, D., and A. Marson. "Epilepsy: Basic Designs, Sample Sizes and Experience with Large Multicentre Trials." In Clinical Trials in Neurology. Springer London, 2001. http://dx.doi.org/10.1007/978-1-4471-3787-0_21.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Shen, Meiyu, Yi Tsong, and Richard Lostritto. "Counting Test and Parametric Two One-Sided Tolerance Interval Test for Content Uniformity Using Large Sample Sizes." In Springer Proceedings in Mathematics & Statistics. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-319-67386-8_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Klawonn, Frank, Junxi Wang, Ina Koch, Jörg Eberhard, and Mohamed Omar. "HAUCA Curves for the Evaluation of Biomarker Pilot Studies with Small Sample Sizes and Large Numbers of Features." In Lecture Notes in Computer Science. Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-46349-0_31.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Chernoff, Herman. "Conservative bounds on extreme P-values for testing the equality of two probabilities based on very large sample sizes." In Institute of Mathematical Statistics Lecture Notes - Monograph Series. Institute of Mathematical Statistics, 2004. http://dx.doi.org/10.1214/lnms/1196285395.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Large sample size"

1

Sallaberry, Cédric, and Robert Kurth. "Optimization of Deterministic Submodel to Reduce Large Sample Size Runs." In Proceedings of the 29th European Safety and Reliability Conference (ESREL). Research Publishing Services, 2020. http://dx.doi.org/10.3850/978-981-14-8593-0_4007-cd.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Razza, Muhammad Imran, Muhammad Khurram Khan, and Khaled Alghathbar. "Bio-inspired Hybrid Face Recognition System for Small Sample Size and Large Dataset." In 2010 Sixth International Conference on Intelligent Information Hiding and Multimedia Signal Processing (IIH-MSP). IEEE, 2010. http://dx.doi.org/10.1109/iihmsp.2010.99.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Liu, Bo, Ying Wei, Yu Zhang, and Qiang Yang. "Deep Neural Networks for High Dimension, Low Sample Size Data." In Twenty-Sixth International Joint Conference on Artificial Intelligence. International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/318.

Full text
Abstract:
Deep neural networks (DNN) have achieved breakthroughs in applications with large sample size. However, when facing high dimension, low sample size (HDLSS) data, such as the phenotype prediction problem using genetic data in bioinformatics, DNN suffers from overfitting and high-variance gradients. In this paper, we propose a DNN model tailored for the HDLSS data, named Deep Neural Pursuit (DNP). DNP selects a subset of high dimensional features for the alleviation of overfitting and takes the average over multiple dropouts to calculate gradients with low variance. As the first DNN method applied on the HDLSS data, DNP enjoys the advantages of the high nonlinearity, the robustness to high dimensionality, the capability of learning from a small number of samples, the stability in feature selection, and the end-to-end training. We demonstrate these advantages of DNP via empirical results on both synthetic and real-world biological datasets.
APA, Harvard, Vancouver, ISO, and other styles
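One ingredient highlighted in the abstract above, averaging gradients over several dropout masks to reduce gradient variance in the high-dimension, low-sample-size regime, can be sketched generically. The toy logistic-regression example below uses assumed data shapes and is not the DNP architecture; it only illustrates the mask-averaging step.

```python
# Toy sketch (not DNP itself): average the gradient of a logistic-regression
# loss over several random dropout masks on the input features, which lowers
# gradient variance when p >> n (high dimension, low sample size).
import numpy as np

rng = np.random.default_rng(0)
n, p, keep, n_masks = 40, 5_000, 0.5, 10      # HDLSS shapes (assumed)
X = rng.normal(size=(n, p))
y = rng.integers(0, 2, size=n).astype(float)
w = np.zeros(p)

def grad(w, X, y):
    prob = 1.0 / (1.0 + np.exp(-(X @ w)))     # sigmoid predictions
    return X.T @ (prob - y) / len(y)          # logistic-loss gradient

grads = []
for _ in range(n_masks):
    mask = rng.random(p) < keep               # random feature dropout mask
    grads.append(grad(w, X * mask / keep, y)) # inverted-dropout scaling
avg_grad = np.mean(grads, axis=0)             # lower-variance gradient estimate
w -= 0.1 * avg_grad                           # one gradient step
print(np.linalg.norm(avg_grad))
```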
4

Filippi, R. G., C. Christiansen, B. Li, et al. "Electromigration Results with Large Sample Size for Dual Damascene Structures in a Copper/CVD Low-k Dielectric Technology." In Proceedings of the IEEE 2006 International Interconnect Technology Conference. IEEE, 2006. http://dx.doi.org/10.1109/iitc.2006.1648657.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Abdurahman, Abdukayum, and Zhong Ma. "The Impact of Auditor Size and Abnormal Audit Fee upon Audit Quality: Large Sample Empirical Research Based on Regression Model." In 2020 2nd International Conference on Economic Management and Model Engineering (ICEMME). IEEE, 2020. http://dx.doi.org/10.1109/icemme51517.2020.00132.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Gao, Yuan. "Comparing the Permeability Calculation Between Different System Size of the Computational Gas Diffusion Layer Sample in PEMFC." In ASME 2014 12th International Conference on Fuel Cell Science, Engineering and Technology collocated with the ASME 2014 8th International Conference on Energy Sustainability. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/fuelcell2014-6323.

Full text
Abstract:
The gas diffusion layers (GDLs) are key components in proton exchange membrane fuel cells, and understanding fluid flow through them plays a significant role in improving fuel cell performance. We used a combination of the multiple-relaxation-time (MRT) lattice Boltzmann method (LBM) and X-ray micro-tomography imaging to compare how the calculated permeability depends on the system size of the computational gas diffusion layer sample. The micro-structures of the carbon paper (HP_1.76) and carbon cloth (HP_1.733) GDLs were digitized from 3D images acquired by X-ray computed micro-tomography at resolutions of 1.76 and 1.733 micrometers, respectively, and the fluid flow was simulated by applying a pressure gradient in the through-plane and in-plane directions. The lattice Boltzmann method for permeability calculation has already been tested in our previous work. In this work, we focus on how the permeability calculated for realistic gas diffusion layer samples depends on the sample size. The results show that the permeability increases, with fluctuations, as the porosity rises. Both the permeability and the porosity converge to the values of the large sample, which can be regarded as a representative volume element. Although the porosity and permeability of these porous samples differ significantly from each other, the anisotropic permeability is nearly the same for each one. Part of the sample can therefore be used to calculate these characteristics if the full sample is too large to compute. We systematically study the effect of system size and periodic boundary conditions and validate Darcy's law from the linear dependence of the flux on the body force exerted.
APA, Harvard, Vancouver, ISO, and other styles
7

Hoard, John, Tejas Chafekar, Mehdi Abarham, Riley Schwader, Steven Upplegger, and Dan Styles. "Large Particles in Modern Diesel Engine Exhaust." In ASME 2012 Internal Combustion Engine Division Spring Technical Conference. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/ices2012-81232.

Full text
Abstract:
During research on diesel engine EGR cooler fouling a test stand giving visual access to the building deposit layer has been developed. Initial experiments reveal the presence of large particles in the exhaust. While conventional wisdom is that diesel particulates typically have log-normal size distributions ranging approximately 10–200 nm, the tests reported here observe small numbers of particles with sizes on the order of tens of μm. Such particles are not generally reported in the literature because exhaust particle sizing instruments typically have inertial separators to remove particles larger than ∼1 μm in order to avoid fouling of the nanoparticle measurement system. The test stand provides exhaust or heated air flow over a cooled surface with Reynolds number, pressure, and surface temperature typical of an EGR cooler. A window allows observation using a digital microscope camera. Starting from a clean surface, a rapid build of a deposit layer is observed. A few large particles are observed. These may land on the surface and remain for long times, although occasionally a particle blows away. In order to study these particles further, an exhaust sample was passed over a fiberglass filter, and the resulting filtered particles were analyzed. Samples were taken at the engine EGR passage, and also in the test stand tubing just before the visualization fixture. The resulting images indicate that the particles are not artifacts of the test system, but rather are present in engine exhaust. MATLAB routines were developed to analyze the filter images taken on the microscope camera. Particles were identified, counted, and sized by the software. It is not possible to take isokinetic samples and give quantitative measurement of the number and size distribution of the particles because the particles are large enough that inertial and gravitational effects will cause them to at least partially settle out of the flows. Nonetheless, the presence of particles tens of μm is documented. Such particles are probably the result of in-cylinder and exhaust pipe deposits flaking. While these larger particles would be captured by the diesel particulate filter (DPF), they can affect intake and exhaust valve seating, EGR cooler fouling, EGR valve sealing, and other factors.
APA, Harvard, Vancouver, ISO, and other styles
8

Yan, Yan, Qingze Zou, and Chanmin Su. "An Integrated Approach to Piezo Actuators Positioning in High-Speed AFM Imaging." In ASME 2008 Dynamic Systems and Control Conference. ASMEDC, 2008. http://dx.doi.org/10.1115/dscc2008-2288.

Full text
Abstract:
In this paper, an integrated approach to achieve atomic force microscope (AFM) imaging of large-size samples at high-speed is proposed, which combines the enhanced inversion-based iterative control (EIIC) technique to drive the piezotube actuator control for lateral x–y axes positioning with the use of a newly-developed dual-stage piezo actuator for vertical z-axis positioning. High-speed, large-size AFM imaging is challenging because large positioning error of the AFM probe relative to the sample can be generated due to the adverse effects—the nonlinear hysteresis and the linear vibration dynamics of the piezotube actuator. In addition, vertical precision positioning of the AFM probe is even more challenging because the desired trajectory is unknown in general, and the probe positioning is also affected by and sensitive to the probe-sample interaction. The main contribution of this article is the development of an integrated approach that combines advanced control algorithm with an advanced hardware platform. The proposed approach is demonstrated in experiments by imaging a large-size (50 μm) calibration sample at high-speed (50 Hz scan rate).
APA, Harvard, Vancouver, ISO, and other styles
9

Nnanna, A. G. Agwu, Chenguang Sheng, Kimberly Conrad, and Greg Crowley. "Performance Assessment of Pre-Filtration Strainer of an Ultrafiltration Membrane System by Particle Size Analysis." In ASME 2015 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2015. http://dx.doi.org/10.1115/imece2015-53447.

Full text
Abstract:
One of the industrial applications of ultrafiltration membrane systems is water purification and wastewater treatment. Membranes act as physical barriers by eliminating particles such as pollen, yeast, bacteria, colloids, viruses, and macromolecules from feed water. The effectiveness of the membrane in separating particles is determined by its molecular weight cut-off and the feed water characteristics. Typically, pre-filtration strainers are installed upstream of an ultrafiltration membrane system to separate large particles from the flow stream. The criterion for selecting the strainer pore size is unclear and is often determined by the average particle size distribution of the feed water. This paper is motivated by the hydraulic loading failure of a 125 μm strainer at an average feed water particle size of 1.6 μm when the volumetric flow is at or greater than 40% of the rated design flow capacity. The objectives of this paper are to: a) determine whether the feed particle size distribution is a sufficient parameter for selection of a pre-filtration strainer, b) evaluate the effect of feed flow velocity on strainer performance, and c) enhance strainer performance using a vortex generator. In this experimental study, a Single Particle Optical Sensing instrument (Accusizer) was used to analyze the particle size distribution of five water samples collected at the strainer feed, strainer filtrate, and strainer backwash. All samples were analyzed using a lower detection limit of 0.5 μm. In order to capture more counts of the larger particles present in the sample, a second analysis was done for each sample at a higher detection limit: 5.09 μm for the feed sample and 2.15 μm for the rest of the samples. Particle size data based on the individual detection limits were statistically combined to generate comprehensive blended results of total number and total volume. The volume was determined on the assumption that each particle is spherically shaped. The particle size distribution measurement accuracy is ±0.035 μm. Results showed that the feed particle diameter and volume were insufficient to determine strainer size. The particle size distribution is needed at the feed, filtrate, and backwash to evaluate the strainer's particle separation efficiency. It was observed that the total particle count in the filtrate (4.4 × 10⁶) was an order of magnitude higher than in the feed (3.2 × 10⁵). Specifically, the total counts for particles with diameters less than 7.22 μm were higher in the filtrate, while particles ≥ 7.22 μm were more numerous in the feed stream. It appears that the large particles in the feed break down into smaller particles at the strainer interface, and the small particles (≤ 7.22 μm) pass through the pores into the filtrate. Particle breakdown, detachment of particles in the strainer pores into the filtrate, and particle-to-particle interactions are enhanced by an increase in flow velocity, which increases the hydrodynamic shear that acts on attached particles. A vortex generator inserted into the strainer reduced pore clogging and pressure drop.
APA, Harvard, Vancouver, ISO, and other styles
10

Nebozhyn, Michael, Razvan Cristescu, Yaping Liu, et al. "Abstract 4931: Biomarker discovery in a large panel of cell lines shows different sample size requirement for prediction of response across a set of compounds." In Proceedings: AACR 103rd Annual Meeting 2012‐‐ Mar 31‐Apr 4, 2012; Chicago, IL. American Association for Cancer Research, 2012. http://dx.doi.org/10.1158/1538-7445.am2012-4931.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Large sample size"

1

Malej, Matt, and Fengyan Shi. Suppressing the pressure-source instability in modeling deep-draft vessels with low under-keel clearance in FUNWAVE-TVD. Engineer Research and Development Center (U.S.), 2021. http://dx.doi.org/10.21079/11681/40639.

Full text
Abstract:
This Coastal and Hydraulics Engineering Technical Note (CHETN) documents the development through verification and validation of three instability-suppressing mechanisms in FUNWAVE-TVD, a Boussinesq-type numerical wave model, when modeling deep-draft vessels with a low under-keel clearance (UKC). Many large commercial ports and channels (e.g., Houston Ship Channel, Galveston, US Army Corps of Engineers [USACE]) are traveled and affected by tens of thousands of commercial vessel passages per year. In a series of recent projects undertaken for the Galveston District (USACE), it was discovered that when deep-draft vessels are modeled using pressure-source mechanisms, they can suffer from model instabilities when low UKC is employed (e.g., vessel draft of 12 m in a channel of 15 m or less of depth), rendering a simulation unstable and obsolete. As an increasingly large number of deep-draft vessels are put into service, this problem is becoming more severe. This presents an operational challenge when modeling large container-type vessels in busy shipping channels, as these often will come as close as 1 m to the bottom of the channel, or even touch the bottom. This behavior would subsequently exhibit a numerical discontinuity in a given model and could severely limit the sample size of modeled vessels. This CHETN outlines a robust approach to suppressing such instability without compromising the integrity of the far-field vessel wave/wake solution. The three methods developed in this study aim to suppress high-frequency spikes generated nearfield of a vessel. They are a shock-capturing method, a friction method, and a viscosity method, respectively. The tests show that the combined shock-capturing and friction method is the most effective method to suppress the local high-frequency noises, while not affecting the far-field solution. A strong test, in which the target draft is larger than the channel depth, shows that there are no high-frequency noises generated in the case of ship squat as long as the shock-capturing method is used.
APA, Harvard, Vancouver, ISO, and other styles
2

Ahumada, Hildegart, Eduardo A. Cavallo, Santos Espina-Mairal, and Fernando Navajas. Sectoral Productivity Growth, COVID-19 Shocks, and Infrastructure. Inter-American Development Bank, 2021. http://dx.doi.org/10.18235/0003411.

Full text
Abstract:
This paper examines sectoral productivity shocks of the COVID-19 pandemic, their aggregate impact, and the possible compensatory effects of improving productivity in infrastructure-related sectors. We employ the KLEMS annual dataset for a group of OECD and Latin America and the Caribbean countries, complemented with high-frequency data for 2020. First, we estimate a panel vector autoregression of growth rates in sector-level labor productivity to specify the nature and size of sectoral shocks using the historical data. We then run impulse-response simulations of one standard deviation shocks in the sectors that were most affected by COVID-19. We estimate that the pandemic cut economy-wide labor productivity by 4.9 percent in Latin America, and by 3.5 percent for the entire sample. Finally, by modeling the long-run relationship between productivity shocks in the sectors most affected by COVID-19, we find that large productivity improvements in infrastructure (equivalent to at least three times the historical rates of productivity gains) may be needed to fully compensate for the negative productivity losses traceable to COVID-19.
APA, Harvard, Vancouver, ISO, and other styles
3

Bhattarai, Rabin, Yufan Zhang, and Jacob Wood. Evaluation of Various Perimeter Barrier Products. Illinois Center for Transportation, 2021. http://dx.doi.org/10.36501/0197-9191/21-009.

Full text
Abstract:
Construction activities entail substantial disturbance of topsoil and vegetative cover. As a result, stormwater runoff and erosion rates are increased significantly. If the soil erosion and subsequently generated sediment are not contained within the site, they would have a negative off-site impact as well as a detrimental influence on the receiving water body. In this study, replicable large-scale tests were used to analyze the ability of products to prevent sediment from exiting the perimeter of a site via sheet flow. The goal of these tests was to compare products to examine how well they retain sediment and how much ponding occurs upstream, as well as other criteria of interest to the Illinois Department of Transportation. The products analyzed were silt fence, woven monofilament geotextile, Filtrexx Siltsoxx, ERTEC ProWattle, triangular silt dike, sediment log, coconut coir log, Siltworm, GeoRidge, straw wattles, and Terra-Tube. Joint tests and vegetated buffer strip tests were also conducted. The duration of each test was 30 minutes, and 116 pounds of clay-loam soil were mixed with water in a 300 gallon tank. The solution was continuously mixed throughout the test. The sediment-water slurry was uniformly discharged over an 8 ft by 20 ft impervious 3:1 slope. The bottom of the slope had a permeable zone (8 ft by 8 ft) constructed from the same soil used in the mixing. The product was installed near the center of this zone. Water samples were collected at 5 minute intervals upstream and downstream of the product. These samples were analyzed for total sediment concentration to determine the effectiveness of each product. The performance of each product was evaluated in terms of sediment removal, ponding, ease of installation, and sustainability.
APA, Harvard, Vancouver, ISO, and other styles
4

Jorgensen, Frieda, Andre Charlett, Craig Swift, Anais Painset, and Nicolae Corcionivoschi. A survey of the levels of Campylobacter spp. contamination and prevalence of selected antimicrobial resistance determinants in fresh whole UK-produced chilled chickens at retail sale (non-major retailers). Food Standards Agency, 2021. http://dx.doi.org/10.46756/sci.fsa.xls618.

Full text
Abstract:
Campylobacter spp. are the most common bacterial cause of foodborne illness in the UK, with chicken considered to be the most important vehicle for this organism. The UK Food Standards Agency (FSA) agreed with industry to reduce Campylobacter spp. contamination in raw chicken and issued a target to reduce the prevalence of the most contaminated chickens (those with more than 1000 cfu per g chicken neck skin) to below 10% at the end of the slaughter process, initially by 2016. To help monitor progress, a series of UK-wide surveys were undertaken to determine the levels of Campylobacter spp. on whole UK-produced, fresh chicken at retail sale in the UK. The data obtained for the first four years were reported in FSA projects FS241044 (2014/15) and FS102121 (2015 to 2018). The FSA has indicated that the retail proxy target for the percentage of highly contaminated raw whole retail chickens should be less than 7%, and while continued monitoring has demonstrated a sustained decline for chickens from major retailer stores, chickens on sale in other stores have yet to meet this target. This report presents results from testing chickens from non-major retailer stores (only) in a fifth survey year from 2018 to 2019. In line with previous practice, samples were collected from stores distributed throughout the UK (in proportion to the population size of each country). Testing was performed by two laboratories: a Public Health England (PHE) laboratory or the Agri-Food & Biosciences Institute (AFBI), Belfast. Enumeration of Campylobacter spp. was performed using the ISO 10272-2 standard enumeration method applied with a detection limit of 10 colony forming units (cfu) per gram (g) of neck skin. Antimicrobial resistance (AMR) to selected antimicrobials, in accordance with those advised in the EU harmonised monitoring protocol, was predicted from genome sequence data in Campylobacter jejuni and Campylobacter coli isolates. The percentage (10.8%) of fresh, whole chicken at retail sale in stores of smaller chains (for example, Iceland, McColl's, Budgens, Nisa, Costcutter, One Stop), independents and butchers (collectively referred to as non-major retailer stores in this report) in the UK that are highly contaminated (at more than 1000 cfu per g) with Campylobacter spp. has decreased since the previous survey year but is still higher than that found in samples from major retailers. Whole fresh raw chickens from non-major retailer stores were collected from August 2018 to July 2019 (n = 1009). Campylobacter spp. were detected in 55.8% of the chicken skin samples obtained from non-major retailer shops, and 10.8% of the samples had counts above 1000 cfu per g chicken skin. Comparison among production plant approval codes showed significant differences in the percentages of chicken samples with more than 1000 cfu per g, ranging from 0% to 28.1%. The percentage of samples with more than 1000 cfu of Campylobacter spp. per g was significantly higher in the period May, June and July than in the period November to April. The percentage of highly contaminated samples was significantly higher for samples taken from larger compared to smaller chickens.
There was no statistical difference in the percentage of highly contaminated samples between those obtained from chickens reared with access to range (for example, free-range and organic birds) and those reared under a standard regime (for example, no access to range), but the small sample size for organic, and to a lesser extent free-range, chickens may have limited the ability to detect important differences should they exist. The Campylobacter species was determined for isolates from 93.4% of the positive samples. C. jejuni was isolated from the majority (72.6%) of samples, while C. coli was identified in 22.1% of samples. A combination of both species was found in 5.3% of samples. C. coli was more frequently isolated from samples obtained from chickens reared with access to range in comparison to those reared as standard birds. C. jejuni was less prevalent during the summer months of June, July and August compared to the remaining months of the year. Resistance to ciprofloxacin (fluoroquinolone), erythromycin (macrolide), tetracycline (tetracyclines), gentamicin and streptomycin (aminoglycosides) was predicted from WGS data by the detection of known antimicrobial resistance determinants. Resistance to ciprofloxacin was detected in 185 (51.7%) isolates of C. jejuni and 49 (42.1%) isolates of C. coli, while 220 (61.1%) isolates of C. jejuni and 73 (62.9%) isolates of C. coli were resistant to tetracycline. Three C. coli (2.6%) but none of the C. jejuni isolates harboured 23S mutations predicting reduced susceptibility to erythromycin. Multidrug resistance (MDR), defined as harbouring genetic determinants for resistance to at least three unrelated antimicrobial classes, was found in 10 (8.6%) C. coli isolates but not in any C. jejuni isolates. Co-resistance to ciprofloxacin and erythromycin was predicted in 1.7% of C. coli isolates. Overall, the percentages of isolates with genetic AMR determinants found in this study were similar to those reported in the previous survey year (August 2016 to July 2017), where testing was based on phenotypic break-point testing. Multidrug resistance was similar to that found in the previous survey years. It is recommended that trends in AMR in Campylobacter spp. isolates from retail chickens continue to be monitored to recognise any increasing resistance of concern, particularly to erythromycin (macrolide). Considering that the percentage of fresh, whole chicken from non-major retailer stores in the UK that is highly contaminated (at more than 1000 cfu per g) with Campylobacter spp. continues to be above that in samples from major retailers, more action, including consideration of interventions such as improved biosecurity and slaughterhouse measures, is needed to achieve better control of Campylobacter spp. for this section of the industry. The FSA has indicated that the retail proxy target for the percentage of highly contaminated retail chickens should be less than 7%, and while continued monitoring has demonstrated a sustained decline for chickens from major retailer stores, chickens on sale in other stores have yet to meet this target.
APA, Harvard, Vancouver, ISO, and other styles
5

Dempsey, Terri L. Handling the Qualitative Side of Mixed Methods Research: A Multisite, Team-Based High School Education Evaluation Study. RTI Press, 2018. http://dx.doi.org/10.3768/rtipress.2018.mr.0039.1809.

Full text
Abstract:
Attention to mixed methods studies research has increased in recent years, particularly among funding agencies that increasingly require a mixed methods approach for program evaluation. At the same time, researchers operating within large-scale, rapid-turnaround research projects are faced with the reality that collection and analysis of large amounts of qualitative data typically require an intense amount of project resources and time. However, practical examples of efficiently collecting and handling high-quality qualitative data within these studies are limited. More examples are also needed of procedures for integrating the qualitative and quantitative strands of a study from design to interpretation in ways that can facilitate efficiencies. This paper provides a detailed description of the strategies used to collect and analyze qualitative data in what the research team believed to be an efficient, high-quality way within a team-based mixed methods evaluation study of science, technology, engineering, and math (STEM) high-school education. The research team employed an iterative approach to qualitative data analysis that combined matrix analyses with Microsoft Excel and the qualitative data analysis software program ATLAS.ti. This approach yielded a number of practical benefits. Selected preliminary results illustrate how this approach can simplify analysis and facilitate data integration.
APA, Harvard, Vancouver, ISO, and other styles
6

Fehey, Kristina, and Dustin Perkins. Invasive exotic plant monitoring in Capitol Reef National Park: 2019 field season, Scenic Drive and Cathedral Valley Road. Edited by Alice Wondrak Biel. National Park Service, 2021. http://dx.doi.org/10.36967/nrr-2286627.

Full text
Abstract:
Invasive exotic plant (IEP) species are a significant threat to natural ecosystem integrity and biodiversity, and controlling them is a high priority for the National Park Service. The Northern Colorado Plateau Network (NCPN) selected the early detection of IEPs as one of 11 monitoring protocols to be implemented as part of its long-term monitoring program. From May 30 to June 1, 2019, network staff conducted surveys for priority IEP species along the Scenic Drive and Cathedral Valley Road monitoring routes at Capitol Reef National Park. We detected 119 patches of six priority IEP species along 34 kilometers of the two monitoring routes. There were more patches of IEPs, and a higher percentage of large patches, than in previous years. This indicates that previously identified infestations have expanded and grown. The most common (47.1%) patch size among priority species was 1,000–2,000 m² (0.25–0.5 acre). The vast majority (93.2%) of priority patches ranked either low (58.8%) or very low (34.4%) on the patch management index scale. Tamarisk (Tamarix sp., 72 patches) was the most prevalent priority IEP species. African mustard (Malcolmia africana, 32 patches), field bindweed (Convolvulus arvensis, 9 patches), and Russian olive (Elaeagnus angustifolia, 3 patches) occurred less commonly. Together, these four species represented 97.5% of all patches recorded in 2019. Four IEP species were found on the monitored routes for the first time: Russian olive (Elaeagnus angustifolia), quackgrass (Elymus repens), Siberian elm (Ulmus pumila), and African mustard (Malcolmia africana, not on the priority species list before 2019). Cathedral Valley Road had more priority IEP patches per kilometer (5.68) than the Scenic Drive (2.05). IEP species were found on 37.9% (25 of 66) of monitored transects. Almost all these detections were Russian thistle (Salsola sp.). Russian thistle was widespread, present in 33.3% of transects, with an estimated cover of 0.2% across all transects sampled. Across routes monitored in all three rotations (2012, 2015, and 2019), Russian thistle has increased in frequency. However, its frequency remained about the same from 2015 to 2019, and percent cover remains low. Tamarisk and field bindweed have both increased in prevalence since monitoring began, with tamarisk showing a dramatic increase in the number and size of patches. Immediate control of tamarisk and these other species is recommended to reduce their numbers on these routes. The NCPN plans to return to Capitol Reef in 2020 to monitor Oak and Pleasant creeks, completing the third rotation of invasive plant monitoring.
APA, Harvard, Vancouver, ISO, and other styles
7

Perkins, Dustin. Invasive exotic plant monitoring at Colorado National Monument: 2019 field season. Edited by Alice Wondrak Biel. National Park Service, 2021. http://dx.doi.org/10.36967/nrr-2286650.

Full text
Abstract:
Invasive exotic plant (IEP) species are a significant threat to natural ecosystem integrity and biodiversity, and controlling them is a high priority for the National Park Service. The Northern Colorado Plateau Network (NCPN) selected the early detection of IEPs as one of 11 monitoring protocols to be implemented as part of its long-term monitoring program. This report represents work completed at Colorado National Monument during 2019. During monitoring conducted June 12–19, a total of 20 IEP species were detected on monitoring routes and transects. Of these, 12 were priority species that accounted for 791 separate IEP patches. IEPs were most prevalent along riparian areas. Yellow sweetclover (Melilotis officinale) and yellow salsify (Tragopogon dubius) were the most commonly detected priority IEPs along monitoring routes, representing 73% of all priority patches. Patches of less than 40 m² were typical of nearly all priority IEP species except yellow sweetclover. A patch management index (PMI) was created by combining patch size class and percent cover for each patch. In 2019, a large majority of priority IEP patches were assigned a PMI score of low (46%) or very low (50%), indicating small and/or sparse patches where control is generally still feasible. This is similar to the numbers for 2017, when 99% of patches scored low or very low in PMI. Seventy-eight percent of tree patches were classified as seedlings or saplings, which require less effort to control than mature trees. Cheatgrass (Anisantha tectorum) was the most common IEP recorded in transects, found in 30–77% of transects across the different routes. It was the only species found in transects on all monitoring routes. When treated and untreated extra areas near the West Entrance were compared, the treated area had comparable or higher levels of IEPs than the untreated area. When segments of monitoring routes conducted between 2003 and 2019 were compared, results were mixed, due to the different species monitored in different time periods. But in general, the number of IEPs per 100 meters is increasing or remaining constant over time. There were notable increases in IEP patches per 100 meters on several routes in 2019: field bindweed (Convolvulus arvensis) along East Glade Park Road; Siberian elm (Ulmus pumila) in Red Canyon; yellow salsify along East Glade Park Road, No Thoroughfare Canyon, No Thoroughfare Trail, and Red Canyon; and yellow sweetclover in No Thoroughfare Canyon and Red Canyon. Network staff will return to re-sample monitoring routes in 2021.
APA, Harvard, Vancouver, ISO, and other styles
8

Tulloch, Olivia, Tamara Roldan de Jong, and Kevin Bardosh. Data Synthesis: COVID-19 Vaccine Perceptions in Sub-Saharan Africa: Social and Behavioural Science Data, March 2020-April 2021. Institute of Development Studies (IDS), 2021. http://dx.doi.org/10.19088/sshap.2028.

Full text
Abstract:
Safe and effective vaccines against COVID-19 are seen as a critical path to ending the pandemic. This synthesis brings together data related to public perceptions about COVID-19 vaccines collected between March 2020 and March 2021 in 22 countries in Africa. It provides an overview of the data (primarily from cross-sectional perception surveys), identifies knowledge and research gaps, and presents some limitations of translating the available evidence to inform local operational decisions. The synthesis is intended for those designing and delivering vaccination programmes and COVID-19 risk communication and community engagement (RCCE). Five large-scale surveys are included, with over 12 million respondents in 22 central, eastern, western, and southern African countries (note: one major study accounts for more than 10 million participants), along with data from 14 peer-reviewed questionnaire surveys in 8 countries (n = 9,600 participants) and 15 social media monitoring, qualitative, and community feedback studies. Sample sizes are provided in the first reference for each study and in Table 13 at the end of this document. The data largely predates vaccination campaigns that generally started in the first quarter of 2021. Perceptions will change, and further syntheses that represent the whole continent, including North Africa, are planned. This review is part of the Social Science in Humanitarian Action Platform (SSHAP) series on COVID-19 vaccines. It was developed for SSHAP by Anthrologica, written by Kevin Bardosh (University of Washington), Tamara Roldan de Jong, and Olivia Tulloch (Anthrologica), and reviewed by colleagues from PERC, LSHTM, IRD, and UNICEF (see acknowledgments), with coordination support from the RCCE Collective Service. The synthesis is the responsibility of SSHAP.
APA, Harvard, Vancouver, ISO, and other styles
9

Tulloch, Olivia, Tamara Roldan de Jong, and Kevin Bardosh. Data Synthesis: COVID-19 Vaccine Perceptions in Africa: Social and Behavioural Science Data, March 2020-March 2021. Institute of Development Studies (IDS), 2021. http://dx.doi.org/10.19088/sshap.2021.030.

Full text
Abstract:
Safe and effective vaccines against COVID-19 are seen as a critical path to ending the pandemic. This synthesis brings together data related to public perceptions about COVID-19 vaccines collected between March 2020 and March 2021 in 22 countries in Africa. It provides an overview of the data (primarily from cross-sectional perception surveys), identifies knowledge and research gaps, and presents some limitations of translating the available evidence to inform local operational decisions. The synthesis is intended for those designing and delivering vaccination programmes and COVID-19 risk communication and community engagement (RCCE). Five large-scale surveys are included, with over 12 million respondents in 22 central, eastern, western, and southern African countries (note: one major study accounts for more than 10 million participants), along with data from 14 peer-reviewed questionnaire surveys in 8 countries (n = 9,600 participants) and 15 social media monitoring, qualitative, and community feedback studies. Sample sizes are provided in the first reference for each study and in Table 13 at the end of this document. The data largely predates vaccination campaigns that generally started in the first quarter of 2021. Perceptions will change, and further syntheses that represent the whole continent, including North Africa, are planned. This review is part of the Social Science in Humanitarian Action Platform (SSHAP) series on COVID-19 vaccines. It was developed for SSHAP by Anthrologica, written by Kevin Bardosh (University of Washington), Tamara Roldan de Jong, and Olivia Tulloch (Anthrologica), and reviewed by colleagues from PERC, LSHTM, IRD, and UNICEF (see acknowledgments), with coordination support from the RCCE Collective Service. The synthesis is the responsibility of SSHAP.
APA, Harvard, Vancouver, ISO, and other styles
10

Vargas-Herrera, Hernando, Juan Jose Ospina-Tejeiro, Carlos Alfonso Huertas-Campos, et al. Monetary Policy Report - April 2021. Banco de la República de Colombia, 2021. http://dx.doi.org/10.32468/inf-pol-mont-eng.tr2-2021.

Full text
Abstract:
1.1 Macroeconomic summary
Economic recovery has consistently outperformed the technical staff’s expectations following a steep decline in activity in the second quarter of 2020. At the same time, total and core inflation rates have fallen and remain at low levels, suggesting that a significant element of the reactivation of Colombia’s economy has been related to recovery in potential GDP. This would support the technical staff’s diagnosis of weak aggregate demand and ample excess capacity. The most recently available data on 2020 growth suggests a contraction in economic activity of 6.8%, lower than estimates from January’s Monetary Policy Report (-7.2%). High-frequency indicators suggest that economic performance was significantly more dynamic than expected in January, despite mobility restrictions and quarantine measures. This has also come amid declines in total and core inflation, the latter of which was below January projections if controlling for certain relative price changes. This suggests that the unexpected strength of recent growth contains elements of demand, and that excess capacity, while significant, could be lower than previously estimated. Nevertheless, uncertainty over the measurement of excess capacity continues to be unusually high and marked both by variations in the way different economic sectors and spending components have been affected by the pandemic, and by uneven price behavior. The size of excess capacity, and in particular the evolution of the pandemic in forthcoming quarters, constitute substantial risks to the macroeconomic forecast presented in this report. Despite the unexpected strength of the recovery, the technical staff continues to project ample excess capacity that is expected to remain on the forecast horizon, alongside core inflation that will likely remain below the target. Domestic demand remains below 2019 levels amid unusually significant uncertainty over the size of excess capacity in the economy. High national unemployment (14.6% for February 2021) reflects a loose labor market, while observed total and core inflation continue to be below 2%. Inflationary pressures from the exchange rate are expected to continue to be low, with relatively little pass-through on inflation. This would be compatible with a negative output gap. Excess productive capacity and the expectation of core inflation below the 3% target on the forecast horizon provide a basis for an expansive monetary policy posture. The technical staff’s assessment of certain shocks and their expected effects on the economy, as well as the presence of several sources of uncertainty and related assumptions about their potential macroeconomic impacts, remain a feature of this report. The coronavirus pandemic, in particular, continues to affect the public health environment, and the reopening of Colombia’s economy remains incomplete. The technical staff’s assessment is that the COVID-19 shock has affected both aggregate demand and supply, but that the impact on demand has been deeper and more persistent. Given this persistence, the central forecast accounts for a gradual tightening of the output gap in the absence of new waves of contagion, and as vaccination campaigns progress. The central forecast continues to include an expected increase of total and core inflation rates in the second quarter of 2021, alongside the lapse of the temporary price relief measures put in place in 2020.
Additional COVID-19 outbreaks (of uncertain duration and intensity) represent a significant risk factor that could affect these projections. Additionally, the forecast continues to include an upward trend in sovereign risk premiums, reflected by higher levels of public debt that in the wake of the pandemic are likely to persist on the forecast horizon, even in the context of a fiscal adjustment. At the same time, the projection accounts for the short-term effects on private domestic demand from a fiscal adjustment along the lines of the one currently being proposed by the national government. This would be compatible with a gradual recovery of private domestic demand in 2022. The size and characteristics of the fiscal adjustment that is ultimately implemented, as well as the corresponding market response, represent another source of forecast uncertainty. Newly available information offers evidence of the potential for significant changes to the macroeconomic scenario, though without altering the general diagnosis described above. The most recent data on inflation, growth, fiscal policy, and international financial conditions suggests a more dynamic economy than previously expected. However, a third wave of the pandemic has delayed the re-opening of Colombia’s economy and brought with it a deceleration in economic activity. Detailed descriptions of these considerations and subsequent changes to the macroeconomic forecast are presented below. The expected annual decline in GDP (-0.3%) in the first quarter of 2021 appears to have been less pronounced than projected in January (-4.8%). Partial closures in January to address a second wave of COVID-19 appear to have had a less significant negative impact on the economy than previously estimated. This is reflected in figures related to mobility, energy demand, industry and retail sales, foreign trade, commercial transactions from selected banks, and the national statistics agency’s (DANE) economic tracking indicator (ISE). Output is now expected to have declined annually in the first quarter by 0.3%. Private consumption likely continued to recover, registering levels somewhat above those from the previous year, while public consumption likely increased significantly. While a recovery in investment in both housing and in other buildings and structures is expected, overall investment levels in this case likely continued to be low, and gross fixed capital formation is expected to continue to show significant annual declines. Imports likely recovered to again outpace exports, though both are expected to register significant annual declines. Economic activity that outpaced projections, an increase in oil prices and other export products, and an expected increase in public spending this year account for the upward revision to the 2021 growth forecast (from 4.6% with a range between 2% and 6% in January, to 6.0% with a range between 3% and 7% in April). As a result, the output gap is expected to be smaller and to tighten more rapidly than projected in the previous report, though it is still expected to remain in negative territory on the forecast horizon. Wide forecast intervals reflect the fact that the future evolution of the COVID-19 pandemic remains a significant source of uncertainty on these projections. The delay in the recovery of economic activity as a result of the resurgence of COVID-19 in the first quarter appears to have been less significant than projected in the January report.
The central forecast scenario expects this improved performance to continue in 2021 alongside increased consumer and business confidence. Low real interest rates and an active credit supply would also support this dynamic, and the overall conditions would be expected to spur a recovery in consumption and investment. Increased growth in public spending and public works based on the national government’s spending plan (Plan Financiero del Gobierno) are other factors to consider. Additionally, an expected recovery in global demand and higher projected prices for oil and coffee would further contribute to improved external revenues and would favor investment, in particular in the oil sector. Given the above, the technical staff’s 2021 growth forecast has been revised upward from 4.6% in January (range from 2% to 6%) to 6.0% in April (range from 3% to 7%). These projections account for the potential for the third wave of COVID-19 to have a larger and more persistent effect on the economy than the previous wave, while also supposing that there will not be any additional significant waves of the pandemic and that mobility restrictions will be relaxed as a result. Economic growth in 2022 is expected to be 3%, with a range between 1% and 5%. This figure would be lower than projected in the January report (3.6% with a range between 2% and 6%), due to a higher base of comparison given the upward revision to expected GDP in 2021. This forecast also takes into account the likely effects on private demand of a fiscal adjustment of the size currently being proposed by the national government, and which would come into effect in 2022. Excess in productive capacity is now expected to be lower than estimated in January but continues to be significant and affected by high levels of uncertainty, as reflected in the wide forecast intervals. The possibility of new waves of the virus (of uncertain intensity and duration) represents a significant downward risk to projected GDP growth, and is signaled by the lower limits of the ranges provided in this report. Inflation (1.51%) and inflation excluding food and regulated items (0.94%) declined in March compared to December, continuing below the 3% target. The decline in inflation in this period was below projections, explained in large part by unanticipated increases in the costs of certain foods (3.92%) and regulated items (1.52%). An increase in international food and shipping prices, increased foreign demand for beef, and specific upward pressures on perishable food supplies appear to explain a lower-than-expected deceleration in the consumer price index (CPI) for foods. An unexpected increase in regulated items prices came amid unanticipated increases in international fuel prices, on some utilities rates, and for regulated education prices. The decline in annual inflation excluding food and regulated items between December and March was in line with projections from January, though this included downward pressure from a significant reduction in telecommunications rates due to the imminent entry of a new operator. When controlling for the effects of this relative price change, inflation excluding food and regulated items exceeds levels forecast in the previous report. 
Within this indicator of core inflation, the CPI for goods (1.05%) accelerated due to a reversion of the effects of the VAT-free day in November, which was largely accounted for in February, and possibly by the transmission of a recent depreciation of the peso on domestic prices for certain items (electric and household appliances). For their part, services prices decelerated and showed the lowest rate of annual growth (0.89%) among the large consumer baskets in the CPI. Within the services basket, the annual change in rental prices continued to decline, while those services that continue to experience the most significant restrictions on returning to normal operations (tourism, cinemas, nightlife, etc.) continued to register significant price declines. As previously mentioned, telephone rates also fell significantly due to increased competition in the market. Total inflation is expected to continue to be affected by ample excesses in productive capacity for the remainder of 2021 and 2022, though less so than projected in January. As a result, convergence to the inflation target is now expected to be somewhat faster than estimated in the previous report, assuming the absence of significant additional outbreaks of COVID-19. The technical staff’s year-end inflation projections for 2021 and 2022 have increased, suggesting figures around 3% due largely to variation in food and regulated items prices. The projection for inflation excluding food and regulated items also increased, but remains below 3%. Price relief measures on indirect taxes implemented in 2020 are expected to lapse in the second quarter of 2021, generating a one-off effect on prices and temporarily affecting inflation excluding food and regulated items. However, indexation to low levels of past inflation, weak demand, and ample excess productive capacity are expected to keep core inflation below the target, near 2.3% at the end of 2021 (previously 2.1%). The reversion in 2021 of the effects of some price relief measures on utility rates from 2020 should lead to an increase in the CPI for regulated items in the second half of this year. Annual price changes are now expected to be higher than estimated in the January report due to an increased expected path for fuel prices and unanticipated increases in regulated education prices. The projection for the CPI for foods has increased compared to the previous report, taking into account certain factors that were not anticipated in January (a less favorable agricultural cycle, increased pressure from international prices, and transport costs). Given the above, year-end annual inflation for 2021 and 2022 is now expected to be 3% and 2.8%, respectively, which would be above projections from January (2.3% and 2.7%). For its part, expected inflation based on analyst surveys suggests year-end inflation in 2021 and 2022 of 2.8% and 3.1%, respectively. There remains significant uncertainty surrounding the inflation forecasts included in this report due to several factors: 1) the evolution of the pandemic; 2) the difficulty in evaluating the size and persistence of excess productive capacity; 3) the timing and manner in which price relief measures will lapse; and 4) the future behavior of food prices. Projected 2021 growth in foreign demand (4.4% to 5.2%) and the supposed average oil price (USD 53 to USD 61 per Brent benchmark barrel) were both revised upward.
An increase in long-term international interest rates has been reflected in a depreciation of the peso and could result in relatively tighter external financial conditions for emerging market economies, including Colombia. Average growth among Colombia’s trade partners was greater than expected in the fourth quarter of 2020. This, together with a sizable fiscal stimulus approved in the United States and the onset of a massive global vaccination campaign, largely explains the projected increase in foreign demand growth in 2021. The resilience of the goods market in the face of global crisis and an expected normalization in international trade are additional factors. These considerations and the expected continuation of a gradual reduction of mobility restrictions abroad suggest that Colombia’s trade partners could grow on average by 5.2% in 2021 and around 3.4% in 2022. The improved prospects for global economic growth have led to an increase in current and expected oil prices. Production interruptions due to a heavy winter, reduced inventories, and increased supply restrictions instituted by producing countries have also contributed to the increase. Meanwhile, market forecasts and recent Federal Reserve pronouncements suggest that the benchmark interest rate in the U.S. will remain stable for the next two years. Nevertheless, a significant increase in public spending in the country has fostered expectations for greater growth and inflation, as well as increased uncertainty over the moment in which a normalization of monetary policy might begin. This has been reflected in an increase in long-term interest rates. In this context, emerging market economies in the region, including Colombia, have registered increases in sovereign risk premiums and long-term domestic interest rates, and a depreciation of local currencies against the dollar. Recent outbreaks of COVID-19 in several of these economies; limits on vaccine supply and the slow pace of immunization campaigns in some countries; a significant increase in public debt; and tensions between the United States and China, among other factors, all add to a high level of uncertainty surrounding interest rate spreads, external financing conditions, and the future performance of risk premiums. The impact that this environment could have on the exchange rate and on domestic financing conditions represent risks to the macroeconomic and monetary policy forecasts. Domestic financial conditions continue to favor recovery in economic activity. The transmission of reductions to the policy interest rate on credit rates has been significant. The banking portfolio continues to recover amid circumstances that have affected both the supply and demand for loans, and in which some credit risks have materialized. Preferential and ordinary commercial interest rates have fallen to a similar degree as the benchmark interest rate. As is generally the case, this transmission has come at a slower pace for consumer credit rates, and has been further delayed in the case of mortgage rates. Commercial credit levels stabilized above pre-pandemic levels in March, following an increase resulting from significant liquidity requirements for businesses in the second quarter of 2020. The consumer credit portfolio continued to recover and has now surpassed February 2020 levels, though overall growth in the portfolio remains low. At the same time, portfolio projections and default indicators have increased, and credit establishment earnings have come down. 
Despite this, credit disbursements continue to recover and solvency indicators remain well above regulatory minimums.
1.2 Monetary policy decision
In its meetings in March and April, the BDBR left the benchmark interest rate unchanged at 1.75%.
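The abstract repeatedly reports forecast revisions as a point estimate with an interval (for example, 2021 GDP growth revised from 4.6%, range 2%–6%, to 6.0%, range 3%–7%). The minimal Python sketch below shows one way such revisions can be recorded and summarized; the Forecast structure and describe_revision helper are illustrative assumptions and do not reflect the Banco de la República's forecasting tools.

from dataclasses import dataclass


@dataclass
class Forecast:
    point: float  # central forecast, in percent
    low: float    # lower bound of the forecast interval, in percent
    high: float   # upper bound of the forecast interval, in percent


def describe_revision(name: str, previous: Forecast, current: Forecast) -> str:
    """Summarize how a forecast changed between two reports, in percentage points."""
    delta = current.point - previous.point
    return (f"{name}: {previous.point:.1f}% ({previous.low:.0f}%-{previous.high:.0f}%) "
            f"-> {current.point:.1f}% ({current.low:.0f}%-{current.high:.0f}%), "
            f"revision {delta:+.1f} pp")


# Figures quoted in the abstract above (January vs. April 2021 reports).
print(describe_revision("2021 GDP growth", Forecast(4.6, 2, 6), Forecast(6.0, 3, 7)))
print(describe_revision("2022 GDP growth", Forecast(3.6, 2, 6), Forecast(3.0, 1, 5)))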
APA, Harvard, Vancouver, ISO, and other styles
