Academic literature on the topic 'Sampling efforts, sample size'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Sampling efforts, sample size.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Sampling efforts, sample size"

1. Stoeckel, James, Brian Helms, Mathew Catalano, Jonathan M. Miller, Kesley Gibson, and Paul M. Stewart. "Field and model-based evaluation of a low-cost sampling protocol for a coordinated, crayfish life-history sampling effort." Freshwater Crayfish 21, no. 1 (2015): 131–41. http://dx.doi.org/10.5869/fc.2015.v21-1.131.

Abstract:
Life-history studies have been published for only a small proportion of crayfish species native to the southeastern United States. The Southeastern Crayfish Biologist Working Group was formed to help meet this deficit by coordinating life-history research efforts. We used a combination of field sampling and computer modeling to evaluate a basic core sampling protocol and assess sample size issues. We focused on a locally abundant species, Procambarus versutus, and followed the protocol for 5 months at two sites in Alabama. Results showed a monthly sample size of 10 juvenile males plus 10 adult males could be met consistently and provided a ≥ 85% probability of detecting seasonal shifts in proportion of Form I males under a range of scenarios. Weaknesses of the effort included inconsistent determination of sex in juveniles and form of adult males among sampling crews. More importantly, female life-history information was rarely obtainable due to an inability to collect sufficient numbers of glaired or berried adults. We suggest several priorities to address before implementation of coordinated, large-scale sampling efforts. These include development of sampling techniques to increase captures of glaired and berried females, and assessment of non-lethal gamete extraction techniques to track reproductive state of males and females.

2. Gumpili, Sai Prashanti, and Anthony Vipin Das. "Sample size and its evolution in research." IHOPE Journal of Ophthalmology 1 (January 7, 2022): 9–13. http://dx.doi.org/10.25259/ihopejo_3_2021.

Abstract:
Objective: Sample size is one of the crucial and basic steps involved in planning any study. This article aims to study the evolution of sample size across the years from hundreds to thousands to millions and to a trillion in the near future (H-K-M-B-T). It also aims to understand the importance of sampling in the era of big data. Study Design - Primary Outcome measure, Methods, Results, and Interpretation: A sample size which is too small will not be a true representation of the population whereas a large sample size will involve putting more individuals at risk. An optimum sample size needs to be employed to identify statistically significant differences if they exist and obtain scientifically valid results. The design of the study, the primary outcome, sampling method used, dropout rate, effect size, power, level of significance, and standard deviation are some of the multiple factors which affect the sample size. All these factors need to be taken into account while calculating the sample size. Many sources are available for calculating sample size. Discretion needs to be used while choosing the right source. The large volumes of data and the corresponding number of data points being analyzed is redefining many industries including healthcare. The larger the sample size, the more insightful information, identification of rare side effects, lesser margin of error, higher confidence level, and models with more accuracy. Advances in the digital era have ensured that we do not face most of the obstacles faced traditionally with regards to statistical sampling, yet it has its own set of challenges. Hence, considerable efforts and time should be invested in selecting sampling techniques which are appropriate and reducing sampling bias and errors. This will ensure the reliability and reproducibility in the results obtained. Along with a large sample size, the focus should be on getting to know the data better, the sample frame and the context in which it was collected. We need to focus on creation of good quality data and structured systems to capture the sample. Good data quality management makes sure that the data are structured appropriately.
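
For the simplest designs, the factors listed above (effect size, standard deviation, significance level, power) combine in a closed-form expression. As an illustration only, not a method taken from this article, here is the textbook normal-approximation formula for comparing two means, sketched in Python (the function name and example numbers are our own):

```python
import math
from scipy.stats import norm

def two_sample_n(sd, delta, alpha=0.05, power=0.80):
    """Per-group sample size for comparing two means:
    n = 2 * (z_{1-alpha/2} + z_{1-beta})^2 * sd^2 / delta^2
    sd    : common standard deviation (assumed equal in both groups)
    delta : smallest difference between means worth detecting
    """
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided significance
    z_beta = norm.ppf(power)           # power = 1 - beta
    return math.ceil(2 * (z_alpha + z_beta) ** 2 * sd ** 2 / delta ** 2)

# Example: detect a 5-unit difference with SD = 10, 5% alpha, 80% power.
print(two_sample_n(sd=10, delta=5))  # -> 63 per group
```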

3. Davis, J. P., M. D. Jackson, J. M. Leek, and M. Samadpour. "Sample Preparation and Analytical Considerations for the US Aflatoxin Sampling Program for Shelled Peanuts." Peanut Science 45, no. 1 (2018): 19–31. http://dx.doi.org/10.3146/ps17-12.1.

Abstract:
The USDA aflatoxin sampling program for shelled peanuts is an important component of broader industry efforts to minimize aflatoxin occurrence in the edible market. In this program, official samples are milled with either a traditional hammer/automatic sub-sampling mill, commonly called the Dickens Mill (DM) or with a vertical cutter mill (VCM). Particle size reduction and sample homogenization are the primary objectives of sample preparation (milling) to generate subsamples which best represent the parent sample composition for downstream analysis. DM particle size reduction is limited by the 3.2 mm round hole screens internal to the mill which prevent pasting of the sample. VCM grinding converts the sample to a paste while simultaneously homogenizing the sample. Experiments demonstrate that when testing aflatoxin contaminated peanuts for equivalent sized subsamples prepared from the two mill types, made into water slurries per USDA specifications and subsequently extracted and tested for total aflatoxin per USDA specifications, VCM subsamples are more normally distributed around the sample aflatoxin mean, whereas DM subsamples are more positively skewed (median lower than mean) around the sample aflatoxin mean. Accordingly, milling official samples with a DM compared to VCM promotes more lot misclassifications. It is also demonstrated that for a given subsample after extraction and immunoaffinity column (IAC) purification, the total aflatoxin measured by either high performance liquid chromatography (HPLC) or fluorometry (both USDA approved) are practically equivalent from an accuracy perspective. There are costs (time and resources) associated with decreasing natural variation due to sampling, sample preparation and analytical testing in an aflatoxin sampling/testing program. Sample preparation is a greater source of variation compared to that of the analytical testing. Resources would be better spent replacing DM with VCM mills than converting the final analytical step from IAC-fluorometry to IAC-HPLC in an effort to best classify peanut lots for the edible market.

4. Chakraborty, Shayok, and Ankita Singh. "Active Sampling for Text Classification with Subinstance Level Queries." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 6 (2022): 6150–58. http://dx.doi.org/10.1609/aaai.v36i6.20563.

Abstract:
Active learning algorithms are effective in identifying the salient and exemplar samples from large amounts of unlabeled data. This tremendously reduces the human annotation effort in inducing a machine learning model as only a few samples, which are identified by the algorithm, need to be labeled manually. In problem domains like text mining and video classification, human oracles peruse the data instances incrementally to derive an opinion about their class labels (such as reading a movie review progressively to assess its sentiment). In such applications, it is not necessary for the human oracles to review an unlabeled sample end-to-end in order to provide a label; it may be more efficient to identify an optimal subinstance size (percentage of the sample from the start) for each unlabeled sample, and request the human annotator to label the sample by analyzing only the subinstance, instead of the whole data sample. In this paper, we propose a novel framework to address this challenging problem, in an effort to further reduce the labeling burden on the human oracles and utilize the available labeling budget more efficiently. We pose the sample and subinstance size selection as a constrained optimization problem and derive a linear programming relaxation to select a batch of exemplar samples, together with the optimal subinstance size of each, which can potentially augment maximal information to the underlying classification model. Our extensive empirical studies on six challenging datasets from the text mining domain corroborate the practical usefulness of our framework over competing baselines.
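
The paper's subinstance-size selection is posed as a linear program and is not reproduced here; the surrounding loop it plugs into, however, is standard pool-based active learning. A minimal uncertainty-sampling sketch on synthetic data (entirely illustrative, not the authors' framework):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy pool: 500 two-dimensional points from two classes; 10 start labeled.
X = rng.normal(size=(500, 2)) + np.repeat([[0, 0], [2, 2]], 250, axis=0)
y = np.repeat([0, 1], 250)
labeled = (list(rng.choice(250, size=5, replace=False)) +
           list(rng.choice(np.arange(250, 500), size=5, replace=False)))

for _ in range(25):  # label 25 more points, one query per round
    clf = LogisticRegression().fit(X[labeled], y[labeled])
    pool = np.setdiff1d(np.arange(500), labeled)
    p = clf.predict_proba(X[pool])
    entropy = -np.sum(p * np.log(p + 1e-12), axis=1)  # model uncertainty
    labeled.append(int(pool[np.argmax(entropy)]))     # query the most uncertain

print(f"{len(labeled)} labels used, pool accuracy {clf.score(X, y):.2f}")
```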

5. Watson, David M. "Sampling effort determination in bird surveys: do current norms meet best-practice recommendations?" Wildlife Research 44, no. 3 (2017): 183. http://dx.doi.org/10.1071/wr16226.

Abstract:
A critical design component of studies measuring diversity is sampling effort. Allocation of sampling effort dictates how many sites can be sampled within a particular time-frame or budget, as well as sample duration, frequency and intensity, thereby determining the resolution and reliability of emergent inferences. Conventional survey techniques use fixed-effort methods that assume invariant detectabilities among sites and species. Several approaches have been developed in the past decade that account for variable detectability by using alternative sampling methods or by adjusting standard counts before analysis, but it is unclear how widely adopted these techniques have been or how current bird surveying norms compare with best-practice recommendations. I conducted a systematic search of the primary literature to ascertain how sampling effort is determined, how much effort is devoted to sampling each site and how variation in detectability is dealt with. Of 225 empirical studies of bird diversity published between 2004 and 2016, five used results-based stopping rules (each derived independently), 54 used proportional sampling, and 159 (71%) used implicit effort-based stopping rules (fixed effort). Effort varied widely, but 61% of studies used samples of 10 min or less and 62% of studies expended total effort per datum of 2 h or less, with 78% providing no justification for sampling efforts used and just 15% explicitly accounting for estimated detectability. Given known variation in detectability, relying on short-duration fixed-effort approaches without validation or post hoc correction means that most bird diversity studies necessarily under-sample some sites and/or species. Having identified current bird surveying norms and highlighted their shortcomings, I provide five practical solutions to improve sampling effort determination, urging contributors and consumers of empirical ecological literature to consider survey data in terms of sample completeness.

6. Yun, Meiping, and Wenwen Qin. "Minimum Sampling Size of Floating Cars for Urban Link Travel Time Distribution Estimation." Transportation Research Record: Journal of the Transportation Research Board 2673, no. 3 (2019): 24–43. http://dx.doi.org/10.1177/0361198119834297.

Abstract:
Despite the wide application of floating car data (FCD) in urban link travel time estimation, limited efforts have been made to determine the minimum sample size of floating cars appropriate to the requirements for travel time distribution (TTD) estimation. This study develops a framework for seeking the required minimum number of travel time observations generated from FCD for urban link TTD estimation. The basic idea is to test how, with a decreasing number of observations, the similarities between the distributions of estimated travel time from observations and those from the ground-truth vary. These are measured by employing the Hellinger Distance (HD) and Kolmogorov-Smirnov (KS) tests. Finally, the minimum sample size is determined by the HD value, ensuring that the corresponding distribution passes the KS test. The proposed method is validated with the sources of FCD and Radio Frequency Identification Data (RFID) collected from an urban arterial in Nanjing, China. The results indicate that: (1) the average travel times derived from FCD give good estimation accuracy for real-time application; (2) the minimum required sample size range changes with the extent of time-varying fluctuations in traffic flows; (3) the minimum sample size determination is sensitive to whether observations are aggregated near each peak in the multistate distribution; (4) sparse and incomplete observations from FCD in most time periods cannot be used to achieve the minimum sample size. Moreover, this would produce a significant deviation from the ground-truth distributions. Finally, FCD is strongly recommended for better TTD estimation incorporating both historical trends and real-time observations.
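
The shrink-until-failure logic described above is easy to prototype. A toy sketch of the idea, with synthetic lognormal travel times standing in for the Nanjing data (the bin count and the 0.05 threshold are our assumptions):

```python
import numpy as np
from scipy.stats import ks_2samp

def hellinger(sample_a, sample_b, bins=30):
    """Hellinger distance between two empirical travel-time distributions."""
    lo = min(sample_a.min(), sample_b.min())
    hi = max(sample_a.max(), sample_b.max())
    p, _ = np.histogram(sample_a, bins=bins, range=(lo, hi))
    q, _ = np.histogram(sample_b, bins=bins, range=(lo, hi))
    p = p / p.sum()
    q = q / q.sum()
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

rng = np.random.default_rng(1)
ground_truth = rng.lognormal(mean=4.0, sigma=0.3, size=5000)  # stand-in 'RFID' times

# Shrink the floating-car subsample and watch when it stops passing the KS test.
for n in (2000, 1000, 500, 200, 100, 50, 20):
    fcd = rng.choice(ground_truth, size=n, replace=False)
    hd = hellinger(fcd, ground_truth)
    ks_pass = ks_2samp(fcd, ground_truth).pvalue > 0.05
    print(f"n={n:5d}  HD={hd:.3f}  KS pass: {ks_pass}")
```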

7. Hopkins, Tapani, Heikki Roininen, Simon van Noort, Gavin R. Broad, Kari Kaunisto, and Ilari E. Sääksjärvi. "Extensive sampling and thorough taxonomic assessment of Afrotropical Rhyssinae (Hymenoptera, Ichneumonidae) reveals two new species and demonstrates the limitations of previous sampling efforts." ZooKeys 878 (October 7, 2019): 33–71. http://dx.doi.org/10.3897/zookeys.878.37845.

Abstract:
Tropical forest invertebrates, such as the parasitoid wasp family Ichneumonidae, are poorly known. This work reports some of the first results of an extensive survey implemented in Kibale National Park, Uganda. A total of 456 individuals was caught of the subfamily Rhyssinae Morley, 1913, which in the Afrotropical region was previously known from only 30 specimens. Here, the six species found at the site are described and the Afrotropical Rhyssinae are reviewed. Two new species, Epirhyssa johanna Hopkins, sp. nov. and E. quagga sp. nov., are described and a key, diagnostic characters, and descriptions for all 13 known Afrotropical species are provided, including the first description of the male of Epirhyssa overlaeti Seyrig, 1937. Epirhyssa gavinbroadi Rousse & van Noort, 2014, syn. nov. is proposed to be a synonym of E. uelensis Benoit, 1951. Extensive sampling with Malaise traps gave an unprecedented sample size, and the method is recommended for other poorly known tropical areas.

8. Stefani, Lorenzo De, Erisa Terolli, and Eli Upfal. "Tiered Sampling." ACM Transactions on Knowledge Discovery from Data 15, no. 5 (2021): 1–52. http://dx.doi.org/10.1145/3441299.

Abstract:
We introduce Tiered Sampling, a novel technique for estimating the count of sparse motifs in massive graphs whose edges are observed in a stream. Our technique requires only a single pass on the data and uses a memory of fixed size M, which can be magnitudes smaller than the number of edges. Our methods address the challenging task of counting sparse motifs—sub-graph patterns—that have a low probability of appearing in a sample of M edges in the graph, which is the maximum amount of data available to the algorithms in each step. To obtain an unbiased and low variance estimate of the count, we partition the available memory into tiers (layers) of reservoir samples. While the base layer is a standard reservoir sample of edges, other layers are reservoir samples of sub-structures of the desired motif. By storing more frequent sub-structures of the motif, we increase the probability of detecting an occurrence of the sparse motif we are counting, thus decreasing the variance and error of the estimate. While we focus on the design and analysis of algorithms for counting 4-cliques, we present a method which allows generalizing Tiered Sampling to obtain high-quality estimates for the number of occurrences of any sub-graph of interest, while reducing the analysis effort due to specific properties of the pattern of interest. We present a complete analytical analysis and extensive experimental evaluation of our proposed method using both synthetic and real-world data. Our results demonstrate the advantage of our method in obtaining high-quality approximations for the number of 4- and 5-cliques for large graphs using a very limited amount of memory, significantly outperforming the single edge sample approach for counting sparse motifs in large scale graphs.
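
The base tier named in the abstract is a plain edge reservoir. For readers unfamiliar with that primitive, a standard Algorithm R sketch (our own illustration, not the paper's code):

```python
import random

def reservoir_sample(stream, m, seed=42):
    """Keep a uniform random sample of m items from a stream of unknown
    length (Algorithm R), the primitive behind the base tier of edge samples."""
    rng = random.Random(seed)
    reservoir = []
    for t, item in enumerate(stream):
        if t < m:
            reservoir.append(item)
        else:
            # Item t survives with probability m / (t + 1).
            j = rng.randrange(t + 1)
            if j < m:
                reservoir[j] = item
    return reservoir

# Example: a uniform sample of 5 'edges' from a stream of 10,000.
edges = ((u, u + 1) for u in range(10_000))
print(reservoir_sample(edges, m=5))
```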

9. Thorson, James T., Meaghan D. Bryan, Peter-John F. Hulson, Haikun Xu, and André E. Punt. "Simulation testing a new multi-stage process to measure the effect of increased sampling effort on effective sample size for age and length data." ICES Journal of Marine Science 77, no. 5 (2020): 1728–37. http://dx.doi.org/10.1093/icesjms/fsaa036.

Abstract:
Ocean management involves monitoring data that are used in biological models, where estimates inform policy choices. However, few science organizations publish results from a recurring, quantitative process to optimize effort spent measuring fish age. We propose that science organizations could predict the likely consequences of changing age-reading effort using four independent and species-specific analyses. Specifically we predict the impact of changing age collections on the variance of expanded age-composition data (“input sample size”, Analysis 1), likely changes in the variance of residuals relative to stock-assessment age-composition estimates (“effective sample size”, Analysis 2), subsequent changes in the variance of stock status estimates (Analysis 3), and likely impacts on management performance (Analysis 4). We propose a bootstrap estimator to conduct Analysis 1 and derive a novel analytic estimator for Analysis 2 when age-composition data are weighted using a Dirichlet-multinomial likelihood. We then provide two simulation studies to evaluate these proposed estimators and show that the bootstrap estimator for Analysis 1 underestimates the likely benefit of increased age reads while the analytic estimator for Analysis 2 is unbiased given a plausible mechanism for model misspecification. We conclude by proposing a formal process to evaluate changes in survey efforts for stock assessment.
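
For orientation, effective sample size for composition data is commonly computed with a McAllister-Ianelli style ratio. The sketch below shows that common estimator on simulated age compositions; it is illustrative only and is not the authors' bootstrap or Dirichlet-multinomial estimator:

```python
import numpy as np

def effective_sample_size(obs_prop, pred_prop):
    """McAllister-Ianelli style effective sample size: the multinomial sample
    size whose expected variance matches the observed residual variance."""
    obs = np.asarray(obs_prop, dtype=float)
    pred = np.asarray(pred_prop, dtype=float)
    return np.sum(pred * (1 - pred)) / np.sum((obs - pred) ** 2)

rng = np.random.default_rng(7)
true_p = np.array([0.35, 0.30, 0.20, 0.10, 0.05])  # 'true' age composition

# With multinomial sampling the estimate should track the number of fish read.
for n in (50, 200, 800):
    counts = rng.multinomial(n, true_p)
    print(f"n={n:4d}  n_eff ~ {effective_sample_size(counts / n, true_p):.0f}")
```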

10. Tamanna, Rownak Jahan, M. Iftakhar Alam, Ahmed Hossain, and Md Hasinur Rahaman Khan. "On sample size calculation in testing treatment efficacy in clinical trials." Biometrical Letters 58, no. 2 (2021): 133–47. http://dx.doi.org/10.2478/bile-2021-0010.

Abstract:
Sample size calculation is an integral part of any clinical trial design, and determining the optimal sample size for a study ensures adequate power to detect statistical significance. It is a critical step in designing a planned research protocol, since using too many participants in a study is expensive, exposing more subjects to the procedure. If a study is underpowered, it will be statistically inconclusive and may cause the whole protocol to fail. Amidst the attempt to maximize power and the underlying effort to minimize the budget, the optimization of both has become a significant issue in the determination of sample size for clinical trials in recent decades. Although it is hard to generalize a single method for sample size calculation, this study is an attempt to offer something that might be a basis for finding a permanent answer to the contradictions of sample size determination, by the use of simulation studies under simple random and cluster sampling schemes, with different sizes of power and type I error. The effective sample size is much higher when the design effect of the sampling method is smaller, particularly less than 1. Sample size increases for cluster sampling when the number of clusters increases.
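
The design-effect adjustment the abstract alludes to is straightforward to apply. A sketch using the standard single-stage cluster-sampling formula DEFF = 1 + (m - 1) * ICC (function name and example numbers are our own):

```python
import math

def cluster_adjusted_n(n_srs, cluster_size, icc):
    """Inflate a simple-random-sampling sample size by the design effect
    DEFF = 1 + (m - 1) * ICC for single-stage cluster sampling."""
    deff = 1 + (cluster_size - 1) * icc
    return math.ceil(n_srs * deff), deff

# Example: 400 subjects under SRS, clusters of 20, intracluster correlation 0.02.
n, deff = cluster_adjusted_n(400, 20, 0.02)
print(f"DEFF = {deff:.2f} -> recruit {n} subjects")  # DEFF = 1.38 -> 552
```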

Dissertations / Theses on the topic "Sampling efforts, sample size"

1. Dorji, Kinzang. "Utility of an existing biotic score method in assessing the stream health in Bhutan." Thesis, Queensland University of Technology, 2016. https://eprints.qut.edu.au/97993/1/Kinzang_Dorji_Thesis.pdf.

Abstract:
In many countries water quality is assessed by using indices derived from the presence of macroinvertebrate species. This study aimed to improve the application of one such index (Hindu-Kush Himalaya based index: HKHbios) to rivers and streams in Bhutan. Sampling in a number of different streams showed that there was a strong influence of the monsoon on stream macroinvertebrates, however the month to month and site to site HKHbios scores showed no consistent patterns. Dry season sampling and increased ecological information on a number of macroinvertebrate taxa were identified as areas where water quality assessment in Bhutanese streams could be improved.

2. Tse, Kwok Ho. "Sample size calculation: influence of confounding and interaction effects." View abstract or full-text, 2006. http://library.ust.hk/cgi/db/thesis.pl?MATH%202006%20TSE.

3. Suen, Wai-sing Alan. "Sample size planning for clinical trials with repeated measurements." Click to view the E-thesis via HKUTO, 2004. http://sunzi.lib.hku.hk/hkuto/record/B31972172.

4. Suen, Wai-sing Alan (孫偉盛). "Sample size planning for clinical trials with repeated measurements." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2004. http://hub.hku.hk/bib/B31972172.

5. McGrath, Neill. "Effective sample size in order statistics of correlated data." [Boise, Idaho]: Boise State University, 2009. http://scholarworks.boisestate.edu/td/32/.

6. Cheng, Dunlei Stamey James D. "Topics in Bayesian sample size determination and Bayesian model selection." Waco, Tex.: Baylor University, 2007. http://hdl.handle.net/2104/5039.

7. McIntosh, Matthew J. "Sample size when the alternative is ordered and other multivariate results." Free to MU campus, to others for purchase, 1998. http://wwwlib.umi.com/cr/mo/fullcit?p9924907.

8. Hathaway, John Ellis. "Determining the Optimum Number of Increments in Composite Sampling." BYU ScholarsArchive, 2005. https://scholarsarchive.byu.edu/etd/425.

Abstract:
Composite sampling can be more cost effective than simple random sampling. This paper considers how to determine the optimum number of increments to use in composite sampling. Composite sampling terminology and theory are outlined and a model is developed which accounts for different sources of variation in compositing and data analysis. This model is used to define and understand the process of determining the optimum number of increments that should be used in forming a composite. The blending variance is shown to have a smaller range of possible values than previously reported when estimating the number of increments in a composite sample. Accounting for differing levels of the blending variance significantly affects the estimated number of increments.

9. Tʻang, Min. "Extention of evaluating the operating characteristics for dependent mixed variables-attributes sampling plans to large first sample size." Online version of thesis, 1991. http://hdl.handle.net/1850/11208.

10. Wang, Jie Stamey James D. "Sample size determination for Emax model, equivalence / non-inferiority test and drug combination in fixed dose trials." Waco, Tex.: Baylor University, 2008. http://hdl.handle.net/2104/5182.


Books on the topic "Sampling efforts, sample size"

1. Desu, M. M. Sample size methodology. Academic Press, 1990.

2. Brush, Gary G. How to choose the proper sample size. American Society for Quality Control, 1988.

3. Reiser, B. Sample size choice for strength stress models. University of Toronto, Dept. of Statistics, 1988.

4. Lemeshow, Stanley, and World Health Organization, eds. Adequacy of sample size in health studies. Published on behalf of the World Health Organization by Wiley, 1990.

5. Chow, Shein-Chung, Jun Shao, and Hansheng Wang, eds. Sample size calculations in clinical research. Marcel Dekker, 2003.

6. Herrendörfer, Günter, ed. Experimental design: Sample size determination and block designs. D. Reidel Pub. Co., 1986.

7. Zarnoch, Stanley J. Determining sample size for tree utilization surveys. U.S. Dept. of Agriculture, Forest Service, Southern Research Station, 2004.

8. Handbook of sample size guidelines for clinical trials. CRC Press, 1989.

9. Lwanga, S. Kaggwa. Sample size determination in health studies: A practical manual. World Health Organization, 1991.

10. Chow, Shein-Chung, Jun Shao, and Hansheng Wang, eds. Sample size calculations in clinical research. 2nd ed. Taylor & Francis, 2008.


Book chapters on the topic "Sampling efforts, sample size"

1. Hale, Robert C., Meredith E. Seeley, Ashley E. King, and Lehuan H. Yu. "Analytical Chemistry of Plastic Debris: Sampling, Methods, and Instrumentation." In Microplastic in the Environment: Pattern and Process. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-78627-4_2.

Abstract:
Approaches for the collection and analysis of plastic debris in environmental matrices are rapidly evolving. Such plastics span a continuum of sizes, encompassing large (macro-), medium (micro-, typically defined as particles between 1 μm and 5 mm), and smaller (nano-) plastics. All are of environmental relevance. Particle sizes are dynamic. Large plastics may fragment over time, while smaller particles may agglomerate in the field. The diverse morphologies (fragment, fiber, sphere) and chemical compositions of microplastics further complicate their characterization. Fibers are of growing interest and present particular analytical challenges due to their narrow profiles. Compositional classes of emerging concern include tire wear, paint chips, semisynthetics (e.g., rayon), and bioplastics. Plastics commonly contain chemical additives and fillers, which may alter their toxicological potency, behavior (e.g., buoyancy), or detector response (e.g., yield fluorescence) during analysis. Field sampling methods often focus on >20 μm and even >300 μm sized particles and will thus not capture smaller microplastics (which may be most abundant and bioavailable). Analysis of a limited subgroup (selected polymer types, particle sizes, or shapes) of microplastics, while often operationally necessary, can result in an underestimation of actual sample content. These shortcomings complicate calls for toxicological studies of microplastics to be based on “environmentally relevant concentrations.” Sample matrices of interest include water (including wastewater, ice, snow), sediment (soil, dust, wastewater sludge), air, and biota. Properties of the environment, and of the particles themselves, may concentrate plastic debris in select zones (e.g., gyres, shorelines, polar ice, wastewater sludge). Sampling designs should consider such patchy distributions. Episodic releases due to weather and anthropogenic discharges should also be considered. While water grab samples and sieving are commonplace, novel techniques for microplastic isolation, such as continuous flow centrifugation, show promise. The abundance of nonplastic particulates (e.g., clay, detritus, biological material) in samples interferes with microplastic detection and characterization. Their removal is typically accomplished using a combination of gravity separation and oxidative digestion (including strong bases, peroxide, enzymes); unfortunately, aggressive treatments may damage more labile plastics. Microscope-based infrared or Raman detection is often applied to provide polymer chemistry and morphological data for individual microplastic particles. However, the sheer number of particles in many samples presents logistical hurdles. In response, instruments have been developed that employ detector arrays and rapid scanning lasers. The addition of dyes to stain particulates may facilitate spectroscopic detection of some polymer types. Most researchers provide microplastic data in the form of the abundances of polymer types within particle size, polymer, and morphology classes. Polymer mass data in samples remain rare but are essential to elucidating fate. Rather than characterizing individual particles in samples, solvent extraction (following initial sample prep, such as sediment size class sorting), combined with techniques such as thermoanalysis (e.g., pyrolysis), has been used to generate microplastic mass data. However, this may obviate the acquisition of individual particle morphology and compositional information. Alternatively, some techniques (e.g., electron and atomic force microscopy and matrix-assisted laser desorption mass spectrometry) are adept at providing highly detailed data on the size, morphology, composition, and surface chemistry of select particles. Ultimately, the analyst must select the approach best suited for their study goals. Robust quality control elements are also critical to evaluate the accuracy and precision of the sampling and analysis techniques. Further, improved efforts are required to assess and control possible sample contamination due to the ubiquitous distribution of microplastics, especially in indoor environments where samples are processed.

2. Indrayan, Abhaya. "Sampling and Sample Size." In Research Methods for Medical Graduates. CRC Press, 2019. http://dx.doi.org/10.1201/9780429435034-8.

3. Brus, Dick J. "Computing the required sample size." In Spatial Sampling with R. Chapman and Hall/CRC, 2022. http://dx.doi.org/10.1201/9781003258940-12.

4. Krafsur, E. S., R. D. Moon, R. Albajes, et al. "Fixed-Sample Size Sampling Plan." In Encyclopedia of Entomology. Springer Netherlands, 2008. http://dx.doi.org/10.1007/978-1-4020-6359-6_3817.

5. Gu, Baohua, Bing Liu, Feifang Hu, and Huan Liu. "Efficiently Determining the Starting Sample Size for Progressive Sampling." In Machine Learning: ECML 2001. Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-44795-4_17.

6. Hubbard, Rebecca A., Carolyn Lou, and Blanca E. Himes. "The Effective Sample Size of EHR-Derived Cohorts Under Biased Sampling." In Emerging Topics in Statistics and Biostatistics. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-72437-5_1.

7. Castillo-Santiago, Miguel Ángel, Edith Mondragón-Vázquez, and Roberto Domínguez-Vera. "Sample Data for Thematic Accuracy Assessment in QGIS." In Land Use Cover Datasets and Validation Tools. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-90998-7_6.

Abstract:
We present an approach that is widely used in the field of remote sensing for the validation of single LUC maps. Unlike other chapters in this book, where maps are validated by comparison with other maps with better resolution and/or quality, this approach requires a ground sample dataset, i.e. a set of sites where LUC can be observed in the field or interpreted from high-resolution imagery. Map error is assessed using techniques based on statistical sampling. In general terms, in this approach, the accuracy of single LUC maps is assessed by comparing the thematic map against the reference data and measuring the agreement between the two. When assessing thematic accuracy, three stages can be identified: the design of the sample, the design of the response, and the estimation and analysis protocols. Sample design refers to the protocols used to define the characteristics of the sampling sites, including sample size and distribution, which can be random or systematic. Response design involves establishing the characteristics of the reference data, such as the size of the spatial assessment units, the sources from which the reference data will be obtained, and the criteria for assigning labels to spatial units. Finally, the estimation and analysis protocols include the procedures applied to the reference data to calculate accuracy indices, such as user’s and producer’s accuracy, the estimated areas covered by each category and their respective confidence intervals. This chapter has two sections in which we present a couple of exercises relating to sampling and response design; the sample size will be calculated, the distribution of sampling sites will be obtained using a stratified random scheme, and finally, a set of reference data will be obtained by photointerpretation at the sampling sites (spatial units). The accuracy statistics will be calculated later in Sect. 5 in chapter “Metrics Based on a Cross-Tabulation Matrix to Validate Land Use Cover Maps” as part of the cross-tabulation exercises. The exercises in this chapter use fine-scale LUC maps obtained for the municipality of Marqués de Comillas in Chiapas, Mexico.
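
One widely used rule for the sample-size step is Cochran's binomial formula for overall accuracy. A sketch under assumed numbers of our own choosing (the chapter's QGIS workflow and stratified allocation may differ):

```python
import math
from scipy.stats import norm

def accuracy_sample_size(expected_accuracy, half_width, confidence=0.95):
    """Cochran's formula for the number of reference sites needed to estimate
    overall map accuracy within +/- half_width at the given confidence."""
    z = norm.ppf(1 - (1 - confidence) / 2)
    p = expected_accuracy
    return math.ceil(z ** 2 * p * (1 - p) / half_width ** 2)

# Example: expect ~85% accuracy, want a 95% CI no wider than +/- 5 points.
print(accuracy_sample_size(0.85, 0.05))  # -> 196 sites
```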

8. Gil, María Angeles, and Covadonga Caso. "The choice of sample size in estimating entropy according to a stratified sampling." In Uncertainty and Intelligent Systems. Springer Berlin Heidelberg, 1988. http://dx.doi.org/10.1007/3-540-19402-9_62.

9. Schmidt, Robin, Matthias Voigt, and Ronald Mailach. "Latin Hypercube Sampling-Based Monte Carlo Simulation: Extension of the Sample Size and Correlation Control." In Uncertainty Management for Robust Industrial Design in Aeronautics. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-77767-2_17.

10. Kesavan, Ram, and Oswald A. J. Mascarenhas. "Simultaneous Optimization of Questionnaire Length and Sample Size in Marketing Research: A Matrix Sampling Approach." In Proceedings of the 1986 Academy of Marketing Science (AMS) Annual Conference. Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-11101-8_83.


Conference papers on the topic "Sampling efforts, sample size"

1. Jiang, Chunheng, Jianxi Gao, and Malik Magdon-Ismail. "Inferring Degrees from Incomplete Networks and Nonlinear Dynamics." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/457.

Abstract:
Inferring topological characteristics of complex networks from observed data is critical to understand the dynamical behavior of networked systems, ranging from the Internet and the World Wide Web to biological networks and social networks. Prior studies usually focus on the structure-based estimation to infer network sizes, degree distributions, average degrees, and more. Little effort attempted to estimate the specific degree of each vertex from a sampled induced graph, which prevents us from measuring the lethality of nodes in protein networks and influencers in social networks. The current approaches dramatically fail for a tiny sampled induced graph and require a specific sampling method and a large sample size. These approaches neglect information of the vertex state, representing the dynamical behavior of the networked system, such as the biomass of species or expression of a gene, which is useful for degree estimation. We fill this gap by developing a framework to infer individual vertex degrees using both information of the sampled topology and vertex state. We combine the mean-field theory with combinatorial optimization to learn vertex degrees. Experimental results on real networks with a variety of dynamics demonstrate that our framework can produce reliable degree estimates and dramatically improve existing link prediction methods by replacing the sampled degrees with our estimated degrees.

2. Andrade, Gleyberson, Elder Cirilo, Vinicius Durelli, Bruno Cafeo, and Eiji Adachi. "Data-Flow Analysis Heuristic for Vulnerability Detection on Configurable Systems." In VIII Workshop on Software Visualization. Sociedade Brasileira de Computação - SBC, 2020. http://dx.doi.org/10.5753/vem.2020.14525.

Abstract:
Configurable software systems offer a variety of benefits such as supporting easy configuration of custom behaviours for distinctive needs. However, it is known that the presence of configuration options in source code complicates maintenance tasks and requires additional effort from developers when adding or editing code statements. They need to consider multiple configurations when executing tests or performing static analysis to detect vulnerabilities. Therefore, vulnerabilities have been widely reported in configurable software systems. Unfortunately, the effectiveness of vulnerability detection depends on how the multiple configurations (i.e., sample sets) are selected. In this paper, we tackle the challenge of generating more adequate system configuration samples by taking into account the intrinsic characteristics of security vulnerabilities. We propose a new sampling heuristic based on data-flow analysis for recommending the subset of configurations that should be analyzed individually. Our results show that we can achieve high vulnerability-detection effectiveness with a small sample size.

3. Hawco, Jessica, Elliott Burden, Edison Sripal, and Lesley James. "Evaluating the Prospect of Oil Production in Tight Winterhouse Formation Rocks in Western Newfoundland." In SPE Canadian Energy Technology Conference. SPE, 2022. http://dx.doi.org/10.2118/208908-ms.

Abstract:
The Winterhouse Formation (Port au Port Peninsula, western Newfoundland, Canada) is a lateral equivalent to the Utica and Macasty formations farther west. With hydrocarbon stains and odours as a guide towards a common and regional upper Ordovician hydrocarbon system, Winterhouse rocks may yet contain their own suite of source reservoir and seal strata, with coarser, sandier beds perhaps playing host to other varieties of conventional and unconventional hydrocarbon traps. Hence, addressing basic properties of fluid transmission is an important and unknown variable that needs to be addressed for this formation. In this pilot study, Mercury Intrusion Porosimetry (MIP) is applied to measure the petrophysical properties of a single tight (low porosity, low permeability) quartz-carbonate sandstone sample from a Winterhouse outcrop. As a tool, Mercury Intrusion Porosimetry is strongly dependent on conformity of sample size and shape as a determinant of pore accessibility. Hence two sample types (i) plugs and (ii) cuttings (both real and artificial) are analyzed to explore aspects of core and cuttings preparation and data reduction work flow measurements of storage and transport properties. For artificial "cuttings" a horizontal 2.5 cm core plug and rock fragments are crushed and sieved to replicate fine and coarse fractions. For porosimetry, a Micromeritics AutoPore IV porosimeter with a maximum pressure of 33,000 psi is used to determine the porosity, pore size distribution, surface area, and bulk density of all samples. Additionally, the FEI Quanta 650 Field Emission Gun (FEG) SEM is used to take images of the pore structure. Mineralogy is determined from the GXMAP measurement mode within FEI Mineral Liberation Analyzer™ software. A comprehensive analysis corroborating results from MIP and SEM indicates that for these tight rocks, and namely, outcrop plugs, artificial cuttings, and real drill cuttings from a nearby well, all show a similar spectrum of results, but smaller coarse fragments are recommended for reliability. In terms of the Winterhouse strata, it is clear that some of this rock is very tight and highly cemented, but that it also possesses fractures and high permeability values which may make it a good unconventional reservoir. These porosity-permeability results are simply a beginning in a search to understand the petrophysical properties of the strata on the western coast of Newfoundland. The western part of Newfoundland has seen extensive oil exploration efforts in the last few decades; these efforts have resulted in little success. A large degree of this is due to the complex geological history and overall lack of knowledge concerning the structure and diagenesis of these rocks (Cooper et al., 2001). This study will support the new sampling programs in the hope of gaining new insights into potential oil exploitation.

4. Araújo, Abner M. C., and Manuel M. Oliveira Neto. "Towards Reverse Engineering of Industrial Site Plants." In Concurso de Teses e Dissertações da SBC. Sociedade Brasileira de Computação - SBC, 2020. http://dx.doi.org/10.5753/ctd.2020.11366.

Abstract:
CAD models of industrial sites are extremely important, as they provide documentation and simplify inspection, planning, modification, as well as a variety of physical and logistics simulations of the corresponding installations. Despite these clear advantages, many industrial sites do not have CAD models, or have trouble keeping them up-to-date. This is often due to the amount of effort required to create and maintain CAD models updated. Hopefully, the recent popularization of 3D scanning devices is promoting the development of reverse engineering, allowing the creation of 3D representations of real environments from point clouds. Nevertheless, point clouds extracted from industrial sites are extremely complex due to occlusions, noise, non-uniform sampling, size of the dataset, lack of sample organization, among other factors. Thus, a successful reverse engineering solution should have several desirable properties, including speed, robustness to noise, accuracy, and be able to handle point clouds in general without requiring one to fine tune their parameters to each dataset in order to work well on it. This thesis presents some initial efforts towards obtaining a robust framework for reverse engineering of industrial sites. It introduces two fast and robust algorithms for detecting, respectively, planes and cylinders in noisy unorganized point clouds. Planes and cylinders are typically the most common and largest structures found in those environments, representing walls, floors, ceilings, pipes, and ducts. We demonstrate the effectiveness of the proposed approaches by comparing their performances against the state-of-the-art solutions for plane and cylinder detection in unorganized point clouds. In these experiments, our solutions achieved the best overall accuracy using the same set of (default) parameter values for all evaluated datasets. This is in contrast to the competing techniques, for which their parameter values were individually adjusted for each combination of technique and dataset to achieve their best results in each case, demonstrating the robustness of our algorithms, which do not require fine-tuning to perform well on arbitrary point clouds. Moreover, our technique also displayed competitive speed to other state-of-art techniques, being suitable for handling large-scale point clouds. The thesis also presents a graphical user interface which allows further refinement of the detected structures, providing the user the ability to remove, merge, and semi-automatically detect planes and cylinders in point clouds.
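
The thesis proposes its own plane and cylinder detectors, which are not reproduced here. As a baseline for the detection task it addresses, a generic RANSAC plane fit over an unorganized point cloud might look like this (all parameters and data are toy choices of ours):

```python
import numpy as np

def ransac_plane(points, n_iters=500, threshold=0.02, seed=0):
    """Generic RANSAC plane fit: repeatedly fit a plane to 3 random points and
    keep the plane with the most inliers within `threshold` distance."""
    rng = np.random.default_rng(seed)
    best_inliers, best_plane = np.array([], dtype=int), None
    for _ in range(n_iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm_len = np.linalg.norm(normal)
        if norm_len < 1e-12:          # degenerate (collinear) sample
            continue
        normal /= norm_len
        dist = np.abs((points - p0) @ normal)  # point-to-plane distances
        inliers = np.flatnonzero(dist < threshold)
        if len(inliers) > len(best_inliers):
            best_inliers, best_plane = inliers, (normal, p0)
    return best_plane, best_inliers

# Toy cloud: a noisy z=0 'floor' plus scattered outliers.
rng = np.random.default_rng(1)
floor = np.column_stack([rng.uniform(0, 5, 800), rng.uniform(0, 5, 800),
                         rng.normal(0, 0.005, 800)])
noise = rng.uniform(0, 5, (200, 3))
plane, inliers = ransac_plane(np.vstack([floor, noise]))
print(f"inliers found: {len(inliers)} (expected ~800)")
```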

5. Martino, L., V. Elvira, and F. Louzada. "Alternative effective sample size measures for importance sampling." In 2016 IEEE Statistical Signal Processing Workshop (SSP). IEEE, 2016. http://dx.doi.org/10.1109/ssp.2016.7551765.

6. Furghieri Bylaardt Caldas, Priscila, Jonathan Snatic, and Kurt Kronenberger. "Quality Isotope Analysis at the Wellsite: Two Case Studies that Validate GC-C-IRMS Mud Gas Isotope Logging for Deepwater Exploration and Development." In SPE Annual Technical Conference and Exhibition. SPE, 2022. http://dx.doi.org/10.2118/210384-ms.

Abstract:
Geochemical analysis of gases produced during the drilling process is a common study on oil and gas exploration and development wells. This process typically includes the use of gas sample containers or other vessels that allow for single point samples to be collected for shipment to an offsite laboratory. Laboratories use high precision devices to obtain valuable information for reservoir characterization including stable carbon isotope ratios. In recent years there have been efforts to provide similar analyses during the drilling process, using ruggedized equipment suitable for wellsite deployment. This paper demonstrates that a Gas Chromatograph-Combustion-Isotope Ratio Mass Spectrometer (GC-C-IRMS) analyzer, using similar technology to what is most widespread in offsite laboratories (Dashti et al., 2018), can be successfully deployed to the rig site. This type of advanced gas analysis, commonly known as Mud Gas Isotope Logging (MGIL), provides continuous sampling of stable carbon isotopes of methane (δ13C1), ethane (δ13C2), and propane (δ13C3). The service, performed with a GC-C-IRMS analyzer, was proven and validated for an operator through two case studies. The first case compares real time data with discrete gas sample tubes analyzed in an offsite laboratory. It shows how accurate results are possible, even with the presence of artificial gases generated by drill bit metamorphism (DBM) (Wenger et al., 2009). This example also demonstrates how the service enabled immediate analysis for operational decisions by indicating the presence of biodegraded thermogenic fluid. The second case study demonstrates how this wellsite service could corroborate the geological prognosis in a complex field influenced by salt tectonics. In this basin an upthrown reservoir changed the typical behavior observed in conventional wells of increased oil maturity with depth. Stable carbon isotope readings obtained in real time, integrated with cuttings analysis, indicated the presence of out of section lithology. This information allowed for estimating the thermogenic fluid maturity of reservoirs and diagnosis of geological formations that were out of sequence in terms of age (uplifted).

7. Panda, K. B. "On Systematic Sampling Strategies for a Varying Sample Size." In International Conference on Recent Advances in Mathematics, Statistics and Computer Science 2015. WORLD SCIENTIFIC, 2016. http://dx.doi.org/10.1142/9789814704830_0024.

8. Yang, Su-Fen, and Chih-Ching Yang. "Optimal variable sample size and sampling interval MSE chart." In 2011 8th International Conference on Service Systems and Service Management (ICSSSM 2011). IEEE, 2011. http://dx.doi.org/10.1109/icsssm.2011.5959315.

9. Ahmed, Ahmed Awad E., and Issa Traore. "Dynamic sample size detection in continuous authentication using sequential sampling." In the 27th Annual Computer Security Applications Conference. ACM Press, 2011. http://dx.doi.org/10.1145/2076732.2076756.

10. Vorechovský, M. "Extension of Sample Size in Latin Hypercube Sampling with Correlated Variables." In 4th International Workshop on Reliable Engineering Computing (REC 2010). Research Publishing Services, 2010. http://dx.doi.org/10.3850/978-981-08-5118-7_024.


Reports on the topic "Sampling efforts, sample size"

1. Sprague, Joshua, David Kushner, James Grunden, Jamie McClain, Benjamin Grime, and Cullen Molitor. Channel Islands National Park Kelp Forest Monitoring Program: Annual report 2014. National Park Service, 2022. http://dx.doi.org/10.36967/2293855.

Abstract:
Channel Islands National Park (CHIS) has conducted long-term ecological monitoring of the kelp forests around San Miguel, Santa Rosa, Santa Cruz, Anacapa and Santa Barbara Islands since 1982. The original permanent transects were established at 16 sites between 1981 and 1986 with the first sampling beginning in 1982, this being the 33rd year of monitoring. An additional site, Miracle Mile, was established at San Miguel Island in 2001 by a commercial fisherman with assistance from the park. Miracle Mile was partially monitored from 2002 to 2004, and then fully monitored (using all KFM protocols) since 2005. In 2005, 16 additional permanent sites were established to collect baseline data from inside and adjacent to four marine reserves that were established in 2003. Sampling results from all 33 sites mentioned above are included in this report. Funding for the Kelp Forest Monitoring Program (KFM) in 2014 was provided by the National Park Service (NPS). The 2014 monitoring efforts utilized 49 days of vessel time to conduct 1,040 dives for a total of 1,059 hours of bottom time. Population dynamics of a select list of 71 “indicator species” (consisting of taxa or categories of algae, fish, and invertebrates) were measured at the 33 permanent sites. In addition, population dynamics were measured for all additional species of fish observed at the sites during the roving diver fish count. Survey techniques follow the CHIS Kelp Forest Monitoring Protocol Handbook (Davis et al. 1997) and an update to the sampling protocol handbook currently being developed (Kushner and Sprague, in progress). The techniques utilize SCUBA and surface-supplied-air to conduct the following monitoring protocols: 1 m² quadrats, 5 m² quadrats, band transects, random point contacts, fish transects, roving diver fish counts, video transects, size frequency measurements, and artificial recruitment modules. Hourly temperature data were collected using remote temperature loggers at 32 sites, the exception being Miracle Mile where there is no temperature logger installed. This annual report contains a brief description of each site including any notable observations or anomalies, a summary of methods used, and monitoring results for 2014. All the data collected during 2014 can be found in the appendices and in an Excel workbook on the NPS Integrated Resource Management Applications (IRMA) portal. In the 2013 annual report (Sprague et al. 2020) several changes were made to the appendices. Previously, annual report density and percent cover data tables only included the current year’s data. Now, density and percent cover data are presented in graphical format and include all years of available monitoring data. Roving diver fish count (RDFC), fish size frequency, natural habitat size frequency, and Artificial Recruitment Module (ARM) size frequency data are now stored on IRMA at https://irma.nps.gov/DataStore/Reference/Profile/2259651. The temperature data graphs in Appendix L include the same graphs that were used in past reports, but include additional violin plot sections that compare monthly means from the current year to past years. In addition to the changes listed above, the layout of the discussion section was reordered by species instead of by site. The status of kelp forests differed among the five park islands. This is a result of a combination of factors including but not limited to, oceanography, biogeography and associated differences in species abundance and composition, as well as sport and commercial fishing pressure. All 33 permanent sites were established in areas that had or were historically known to have had kelp forests in the past. In 2014, 15 of the 33 sites monitored were characterized as developing kelp forest, kelp forest or mature kelp forest. In addition, three sites were in a state of transition. Two sites were part kelp forest and part dominated by Strongylocentrotus purpuratus...

2. Axelrod, M. Using Ancillary Information to Reduce Sample Size in Discovery Sampling and the Effects of Measurement Error. Office of Scientific and Technical Information (OSTI), 2005. http://dx.doi.org/10.2172/877925.

3. Irudayaraj, Joseph, Ze'ev Schmilovitch, Amos Mizrach, Giora Kritzman, and Chitrita DebRoy. Rapid detection of food borne pathogens and non-pathogens in fresh produce using FT-IRS and Raman spectroscopy. United States Department of Agriculture, 2004. http://dx.doi.org/10.32747/2004.7587221.bard.

Abstract:
Rapid detection of pathogens and hazardous elements in fresh fruits and vegetables after harvest requires advanced sensor technology at each step in the farm-to-consumer or farm-to-processing sequence. Fourier-transform infrared (FTIR) spectroscopy and the complementary Raman spectroscopy, an advanced optical technique based on light scattering, were investigated for rapid, on-site assessment of produce safety. Paving the way toward the development of this innovative methodology, the specific original objectives were to (1) identify and distinguish different serotypes of Escherichia coli, Listeria monocytogenes, Salmonella typhimurium, and Bacillus cereus by FTIR and Raman spectroscopy, (2) develop spectroscopic fingerprint patterns and detection methodology for fungi such as Aspergillus, Rhizopus, Fusarium, and Penicillium, and (3) validate a universal spectroscopic procedure to detect foodborne pathogens and non-pathogens in food systems. The original objectives were very ambitious, so modifications were necessary to fit the available funding. Elaborate sensitivity experiments were conducted; in addition, a wider range of pathogens than originally proposed was tested to demonstrate the robustness of the instruments. Most crucially, an algorithm for differentiating a specific organism of interest within mixed cultures was conceptualized and validated, and neural-network and chemometric models were tested on a variety of applications. The food systems tested were apple juice and buffer systems. Pathogens tested include Enterococcus faecium, Salmonella enteritidis, Salmonella typhimurium, Bacillus cereus, Yersinia enterocolitica, Shigella boydii, Staphylococcus aureus, Serratia marcescens, Pseudomonas vulgaris, Vibrio cholerae, Hafnia alvei, Enterobacter cloacae, Enterobacter aerogenes, E. coli (O103, O55, O121, O30, and O26), Aspergillus niger (NRRL 326), Fusarium verticillioides (NRRL 13586), Saccharomyces cerevisiae (ATCC 24859), Lactobacillus casei (ATCC 11443), Erwinia carotovora pv. carotovora, and Clavibacter michiganense. The sensitivity of FTIR detection was 10³ CFU/ml, and clear differentiation was obtained between the organisms at both the species and the strain level for the tested pathogens. A very crucial step toward analyzing mixed cultures was taken: the vector-based algorithm was able to identify a target pathogen of interest in a mixture of up to three organisms (a brief illustrative sketch follows this abstract). Efforts will be made to extend this to 10-12 key pathogens. The experience gained was very helpful in laying the foundations for extracting the true fingerprint of a specific pathogen irrespective of the background substrate, which is crucial when working with solid samples and complex food matrices. Spectroscopic techniques, especially FTIR and Raman methods, are being pursued by agencies such as DARPA and the Department of Defense for homeland-security applications. Through the BARD US-3296-02 feasibility grant, the foundations for detection, sample handling, and the needed algorithms and models were developed. Successive efforts will be made to transfer the methodology to fruit surfaces and to other complex food matrices, which can be accomplished with creative sampling methods and experimentation. Even marginal success in this direction will be a very significant breakthrough because FTIR and Raman methods, in spite of their limitations, remain among the most rapid and nondestructive methods available. Continued interest and effort in improving the components and refining the procedures are bound to result in a significant breakthrough in sensor technology for food safety and biosecurity.
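To make the vector-based mixture idea above concrete, here is a minimal sketch, assuming reference fingerprints are available as vectors on a common wavenumber grid, of identifying a target organism in a mixed spectrum by non-negative least squares. The spectra, organism labels, and detection threshold are all hypothetical illustrations, not the report's actual implementation:

```python
import numpy as np
from scipy.optimize import nnls  # non-negative least squares

# Hypothetical library of reference FTIR spectra: rows are wavenumber bins,
# columns are pure-culture fingerprints, normalized to unit area.
rng = np.random.default_rng(0)
library = np.abs(rng.normal(size=(500, 3)))   # e.g., E. coli, S. aureus, B. cereus
library /= library.sum(axis=0)

# Simulated mixture: 60% organism 0 plus 40% organism 2, with a little noise.
mixture = 0.6 * library[:, 0] + 0.4 * library[:, 2]
mixture += rng.normal(scale=1e-4, size=mixture.shape)

# Fit non-negative abundances of each library fingerprint to the mixture.
abundances, residual = nnls(library, mixture)
abundances /= abundances.sum()                # normalize to proportions

# Flag the target organism if its estimated proportion exceeds a threshold.
TARGET, THRESHOLD = 0, 0.10                   # hypothetical choices
print("estimated proportions:", np.round(abundances, 3))
print("target present:", abundances[TARGET] > THRESHOLD)
```

In practice, the fingerprint library would be built from measured FTIR or Raman spectra of pure cultures, and the non-negativity constraint reflects the fact that component abundances cannot be negative.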
APA, Harvard, Vancouver, ISO, and other styles
4

Wadman, Heidi, and Jesse McNinch. Spatial distribution and thickness of fine-grained sediment along the United States portion of the upper Niagara River, New York. Engineer Research and Development Center (U.S.), 2021. http://dx.doi.org/10.21079/11681/41666.

Full text
Abstract:
Over 220 linear miles of geophysical data, including sidescan sonar and chirp sub-bottom profiles, were collected in 2016 and 2017 by the US Army Corps of Engineers and the US Fish and Wildlife Service in the upper Niagara River. In addition, 36 sediment grab samples were collected to ground-truth the geophysical data. These data were used to map the spatial distribution of fine-grained sediment, including volume data in certain locations, along the shallow shorelines of the upper Niagara River. Overall, the most extensive deposits were spatially associated either with small tributaries or with man-made structures that modified the natural flow of the system. Extensive beds of submerged aquatic vegetation (SAV) were also mapped. Although always associated with a fine-grained matrix, the SAV beds were patchy in distribution, which might reflect subtle differences in the grain size of the sediment matrix or could simply be a function of variations in species or growth. The maps generated from this effort can be used to guide sampling plans for future studies of contamination in fine-grained sediment regions.
APA, Harvard, Vancouver, ISO, and other styles
5

Bridges, Todd, Sandra Newell, Alan Kennedy, et al. Long-term stability and efficacy of historic activated carbon (AC) deployments at diverse freshwater and marine remediation sites. Engineer Research and Development Center (U.S.), 2020. http://dx.doi.org/10.21079/11681/38781.

Full text
Abstract:
A number of sites around the United States have used activated carbon (AC) amendments to remedy contaminated sediments. Variation in site-specific characteristics likely influences the long-term fate and efficacy of AC treatment. The long-term effectiveness of an AC amendment to sediment is largely unknown, as field performance has not been monitored for more than three years. As a consequence, the focus of this research effort was to evaluate AC’s long-term (6–10 yr) performance. These assessments were performed at two pilot-scale demonstration sites, Grasse River (Massena, New York) and Canal Creek (Aberdeen Proving Ground [APG], Aberdeen, Maryland), representing two distinct physical environments. Sediment core samples were collected after 6 and 10 years of remedy implementation at APG and Grasse River, respectively, and sectioned to determine the current vertical distribution and persistence of AC in the field. The concentration profile of polychlorinated biphenyls (PCBs) in sediment pore water with depth was measured using passive sampling. Sediment samples from the untreated and AC-treated zones were also assessed for bioaccumulation in benthic organisms. The data collected enabled comparison of AC distribution, PCB concentrations, and bioaccumulation over the short and long term (months to years).
APA, Harvard, Vancouver, ISO, and other styles
6

Bobashev, Georgiy, R. Joey Morris, Elizabeth Costenbader, and Kyle Vincent. Assessing network structure with practical sampling methods. RTI Press, 2018. http://dx.doi.org/10.3768/rtipress.2018.op.0049.1805.

Full text
Abstract:
Using data from an enumerated network of worldwide flight connections between airports, we examine how sampling designs and sample size influence network metrics. Specifically, we apply three types of sampling designs: simple random sampling, nonrandom strategic sampling (i.e., selection of the largest airports), and a variation of snowball sampling. For the latter sampling method, we design what we refer to as a controlled snowball sampling design, which selects nodes in a manner analogous to a respondent-driven sampling design. For each design, we evaluate five commonly used measures of network structure and examine the percentage of total air traffic accounted for by each design. The empirical application shows that (1) the random and controlled snowball sampling designs give rise to more efficient estimates of the true underlying structure, and (2) the strategic sampling method can account for a greater proportion of the total number of passenger movements occurring in the network.
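As a rough illustration of the designs compared above, here is a minimal sketch, using the networkx library and a synthetic graph in place of the authors' flight network, of simple random node sampling versus a "largest airports" strategic selection versus a basic snowball sample. The graph, sample sizes, and summary metric are hypothetical, and the controlled, respondent-driven snowball variant is not reproduced here:

```python
import random
import networkx as nx

random.seed(1)
G = nx.barabasi_albert_graph(500, 3)   # stand-in for the flight network

def random_sample(g, n):
    # Simple random sampling of n nodes.
    return set(random.sample(list(g.nodes), n))

def strategic_sample(g, n):
    # Pick the n highest-degree nodes (analogous to the largest airports).
    return set(sorted(g.nodes, key=g.degree, reverse=True)[:n])

def snowball_sample(g, n, seeds=5):
    # Grow the sample outward from a few random seeds via neighbors.
    sampled, frontier = set(), random.sample(list(g.nodes), seeds)
    while frontier and len(sampled) < n:
        node = frontier.pop(0)
        if node in sampled:
            continue
        sampled.add(node)
        frontier.extend(g.neighbors(node))
    return sampled

for name, nodes in [("random", random_sample(G, 50)),
                    ("strategic", strategic_sample(G, 50)),
                    ("snowball", snowball_sample(G, 50))]:
    sub = G.subgraph(nodes)
    mean_deg = sum(d for _, d in sub.degree()) / max(len(sub), 1)
    print(name, "induced-subgraph mean degree:", round(mean_deg, 2))
```

Comparing such subgraph metrics against the full network's values is one simple way to see how strongly each design biases the estimated structure.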
APA, Harvard, Vancouver, ISO, and other styles
7

Tennant, David. Business Surveys on the Impact of COVID-19 on Jamaican Firms. Inter-American Development Bank, 2021. http://dx.doi.org/10.18235/0003251.

Full text
Abstract:
The datasets come from two surveys of Jamaican businesses conducted between May and June 2020. Two sets of self-administered surveys were conducted using Survey Monkey. A very small sample of financial institutions was surveyed to gain perspective on the challenges facing financiers as a result of the pandemic and their efforts to respond to such challenges. Nine financial institutions completed this survey, and the results were used to complement the information derived from the second and major survey. The second survey targeted non-financial businesses operating in Jamaica. The sample of firms was selected from a list of all registered Jamaican firms obtained from the Companies Office of Jamaica, using a stratified random design based on firm type, region, and sector. Some firms may also have participated in the study through contact made by their respective affiliations, which were approached to endorse the study and encourage their members to engage. A total of 390 firms completed the second survey, achieving a significant degree of representation across size, type and age of business, sector, and location of operation. Good gender representation was also achieved.
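As a simple illustration of the stratified design described above, here is a minimal sketch of drawing a stratified random sample with proportional allocation. The firm register, strata, and sample size are made up for illustration; this is not the authors' sampling frame or procedure:

```python
import random
from collections import defaultdict

random.seed(7)

# Hypothetical register of firms: (firm_id, sector, region)
sectors = ["tourism", "manufacturing", "retail"]
regions = ["Kingston", "Montego Bay"]
register = [(i, random.choice(sectors), random.choice(regions))
            for i in range(5000)]

def stratified_sample(frame, total_n):
    # Group the frame into strata (here: sector x region).
    strata = defaultdict(list)
    for firm in frame:
        strata[(firm[1], firm[2])].append(firm)
    sample = []
    for members in strata.values():
        # Allocate sample size proportionally to stratum size.
        n = round(total_n * len(members) / len(frame))
        sample.extend(random.sample(members, min(n, len(members))))
    return sample

sample = stratified_sample(register, 390)   # target size from the survey
n_strata = len(set((s, r) for _, s, r in sample))
print(len(sample), "firms sampled across", n_strata, "strata")
```

Proportional allocation keeps each stratum's share of the sample close to its share of the population, which is what protects representativeness across firm type, region, and sector.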
APA, Harvard, Vancouver, ISO, and other styles
8

Lafrancois, Toben, Mark Hove, and Jay Glase. Zebra mussel (Dreissena polymorpha) distribution in Apostle Islands National Lakeshore: SCUBA-based search and removal efforts: 2019–2020. National Park Service, 2022. http://dx.doi.org/10.36967/nrr-2293376.

Full text
Abstract:
Invasive zebra mussels (Dreissena polymorpha) were first observed in situ at Apostle Islands National Lakeshore (APIS) in 2015. This report builds on 2018 SCUBA surveys and Environmental Protection Agency (EPA) veliger sampling to: 1) determine whether shoals on APIS borders act as sentinel sites to corroborate veliger drift hypotheses about invasion pathways, 2) evaluate ongoing hand-removal of zebra mussels from easily identified structures, and 3) continue efforts to assess native unionid mussel populations, particularly where zebra mussels are also present. Standard catch-per-unit-effort survey methods by SCUBA teams were used to determine the distribution and relative abundance of zebra or quagga mussels (dreissenids) and native mussels (unionids). Zebra mussels were present at densities between 3 and 42 n/diver/hr (number of mussels per diver per hour), while native unionids were present at densities between 5 and 72 n/diver/hr. Shoal surveys (Eagle Island shoal, Sand Island shoal, York Island shoal, Bear Island shoal, Oak Island shoal, and Gull Island shoal) showed zebra mussels were more abundant on the west side of APIS and absent on the easternmost shoal (Gull Island), corroborating veliger work by the EPA that suggested drift from the Twin Ports of Duluth, Minnesota, and Superior, Wisconsin, is one pathway of invasion. Our results support the use of shallow shoals along the periphery of the park as sentinel sites for gauging zebra mussel immigration and population dynamics. Zebra mussel densities in the central islands showed no obvious spatial pattern, and this survey cannot determine whether currents or human transport (or both) are invasion vectors. Given the mussels’ continued presence at heavily used mooring areas and docks where there are no zebra mussels on nearby natural features (e.g., Rocky Island dock, Stockton Island mooring areas), our findings are consistent with multiple invasion pathways (drift from the Twin Ports and anthropogenic sources at mooring areas). SCUBA search and removal of zebra mussels from docks was confirmed to be an effective method for significantly lowering the risk of zebra mussels reproducing and dispersing from these locations. We caution that this work is being done on what appear to be initial invasions at low densities. Repeated removal of zebra mussels by divers reduced numbers to zero at some sites after one year (South Twin docks, Stockton Island NPS docks, and the Ottawa wreck) or decreased numbers by an order of magnitude (Rocky Island docks). Dreissenid densities were more persistent on the Sevona wreck, and longer-term work is required to evaluate removal versus recruitment (local and/or veliger drift). Given the size of the wreck, we have developed detailed survey maps to guide future efforts. Zebra mussels were again observed attached to native mussels near Stockton Island and South Twin Island. Their continued presence on sensitive native species is of concern. Native unionid mussels were more widely distributed in the park than previously known, with new beds found near Oak and Basswood Islands. The work reported here will form the basis for continued efforts to determine the optimal frequency of zebra mussel removal for effective control, as well as to evaluate impacts on native species.
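For readers unfamiliar with the unit used above, a catch-per-unit-effort density is simply a count scaled by search effort. A minimal sketch with hypothetical dive logs (the sites are named for flavor only; the counts are invented):

```python
# Hypothetical dive logs: (site, mussels_found, divers, hours_searched)
dives = [
    ("Rocky Island dock", 21, 2, 1.5),
    ("Gull Island shoal",  0, 3, 1.0),
    ("Sevona wreck",      63, 2, 0.75),
]

for site, count, divers, hours in dives:
    cpue = count / (divers * hours)   # mussels per diver per hour (n/diver/hr)
    print(f"{site}: {cpue:.1f} n/diver/hr")
```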
APA, Harvard, Vancouver, ISO, and other styles
9

Delwiche, Michael, Boaz Zion, Robert BonDurant, Judith Rishpon, Ephraim Maltz, and Miriam Rosenberg. Biosensors for On-Line Measurement of Reproductive Hormones and Milk Proteins to Improve Dairy Herd Management. United States Department of Agriculture, 2001. http://dx.doi.org/10.32747/2001.7573998.bard.

Full text
Abstract:
The original objectives of this research project were to: (1) develop immunoassays, photometric sensors, and electrochemical sensors for real-time measurement of progesterone and estradiol in milk, (2) develop biosensors for measurement of caseins in milk, and (3) integrate and adapt these sensor technologies to create an automated electronic sensing system for operation in dairy parlors during milking. The overall direction of research was not changed, although the work was expanded to include other milk components such as urea and lactose. A second-generation biosensor for on-line measurement of bovine progesterone was designed and tested. Anti-progesterone antibody was coated on small disks of nitrocellulose membrane, which were inserted in the reaction chamber prior to testing, and a real-time assay was developed. The biosensor was designed using micropumps and valves under computer control, and assayed fluid volumes on the order of 1 ml. An automated sampler was designed to draw a test volume of milk from the long milk tube using a 4-way pinch valve. The system could execute a measurement cycle in about 10 min. Progesterone could be measured at concentrations low enough to distinguish luteal-phase from follicular-phase cows. The potential of the sensor to detect actual ovulatory events was compared with standard methods of estrus detection, including human observation and an activity monitor. The biosensor correctly identified all ovulatory events during its test period, but the variability at low progesterone concentrations triggered some false positives. Direct on-line measurement and intelligent interpretation of reproductive hormone profiles offer the potential for substantial improvement in reproductive management. A simple potentiometric method for measurement of milk protein was developed and tested. The method was based on the fact that proteins bind iodine. When proteins are added to a solution of the redox couple iodine/iodide (I-I2), the concentration of free iodine changes and, as a consequence, the potential between two electrodes immersed in the solution changes. The method worked well with analytical casein solutions and accurately measured concentrations of analytical caseins added to fresh milk. When tested with actual milk samples, the correlation between the sensor readings and the reference lab results (for both total proteins and casein content) was inferior to that obtained with analytical casein. A number of different technologies were explored for the analysis of milk urea, and a manometric technique was selected for the final design. In the new sensor, urea in the sample was hydrolyzed to ammonium and carbonate by the enzyme urease, and subsequent shaking of the sample with citric acid in a sealed cell allowed urea to be estimated as a change in partial pressure of carbon dioxide. The pressure change in the cell was measured with a miniature piezoresistive pressure sensor, and effects of background dissolved gases and vapor pressures were corrected for by repeating the measurement of pressure developed in the sample without the addition of urease. Results were accurate in the physiological range of milk, the assay was faster than the typical milking period, and no toxic reagents were required. A sampling device was designed and built to passively draw milk from the long milk tube in the parlor.
An electrochemical sensor for lactose was developed, starting with a three-cascaded-enzyme sensor, evolving into a two-enzyme design with Co2[Fe(CN)6] as a mediator, and finally into a microflow injection system using poly-osmium-modified screen-printed electrodes. The sensor was designed to serve multiple milking positions, using a manifold valve, a sampling valve, and two pumps. Disposable screen-printed electrodes with enzymatic membranes were used. The sensor was optimized for electrode coating components, flow rate, pH, and sample size, and the results correlated well (r² = 0.967) with known lactose concentrations.
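The urea assay above estimates concentration from a blank-corrected pressure change. Here is a minimal sketch of that correction step; the readings and calibration slope are hypothetical, standing in for values that would come from calibrating the actual cell:

```python
# Hypothetical manometric urea assay readings (kPa).
p_with_urease = 2.84   # pressure rise after urease hydrolysis + acidification
p_blank       = 0.31   # same sample, same shaking, but no urease added

# Blank subtraction removes background dissolved gases and vapor pressure.
delta_p = p_with_urease - p_blank

K_CAL = 4.2            # hypothetical calibration slope, (mg/dL urea) per kPa
urea_mg_dl = K_CAL * delta_p
print(f"estimated milk urea: {urea_mg_dl:.1f} mg/dL")
```

The paired blank measurement is what makes the assay robust in milk, where dissolved CO2 and other gases would otherwise bias a single pressure reading.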
APA, Harvard, Vancouver, ISO, and other styles
10

Leis, Sherry. Vegetation community monitoring at Lincoln Boyhood National Memorial: 2011–2019. National Park Service, 2021. http://dx.doi.org/10.36967/nrr-2284711.

Full text
Abstract:
Lincoln Boyhood National Memorial celebrates the lives of the Lincoln family and includes the final resting place of Abraham’s mother, Nancy Hanks Lincoln. Lincoln’s childhood in Indiana was a formative time in the life of our 16th president. When the Lincoln family arrived in Indiana, the property was covered in the oak-hickory forest type. They cleared land to create their homestead and farm. Later, designers of the memorial felt that it was important to restore woodlands to the site. The woodlands would help visitors visualize the challenges the Lincoln family faced in establishing and maintaining their homestead. Some stands of woodland may have remained, but significant restoration efforts included extensive tree planting. The Heartland Inventory and Monitoring Network began monitoring the woodland in 2011 with repeat visits every four years. These monitoring efforts provide a window into the composition and structure of the woodlands. We measure both overstory trees and the ground flora within four permanently located plots. At these permanent plots, we record each species, foliar cover estimates of ground flora, diameter at breast height of midstory and overstory trees, and tree regeneration frequency (tree seedlings and saplings). The forest species composition was relatively consistent over the three monitoring events. Climatic conditions measured by the Palmer Drought Severity Index indicated mild to wet conditions over the monitoring record. Canopy closure continued to indicate a forest structure with a closed canopy. Large trees (>45 cm DBH) comprised the greatest amount of tree basal area. Sugar maple was observed to have the greatest basal area and density of the 23 tree species observed. The oaks characteristic of the early woodlands were present, but less dominant. Although one hickory species was present, it was in very low abundance. Of the 17 tree species recorded in the regeneration layer, three species were most abundant through time: sugar maple (Acer saccharum), redbud (Cercis canadensis), and ash (Fraxinus sp.). Ash recruitment seemed to increase over prior years, and maple saplings transitioned to larger size classes. Ground flora diversity was similar through time, but alpha and gamma diversity were slightly greater in 2019. Percent cover by plant guild varied through time, with native woody plants and forbs having the greatest abundance. Nonnative plants were also an important part of the ground flora composition. Common periwinkle (Vinca minor) and Japanese honeysuckle (Lonicera japonica) continued to be the most abundant nonnative species, but these two species were less abundant in 2019 than in 2011. Unvegetated ground cover was high (mean = 95%) and increased by 17% since 2011. Bare ground increased from less than 1% in 2011 to 9% in 2019, but other ground cover elements were similar to prior years. In 2019, we quantified observer error by double sampling two plots within three of the monitoring sites. We found total pseudoturnover to be about 29% (i.e., 29% of the species records differed between observers due to observer error). This 29% pseudoturnover rate was almost 50% greater than our goal of 20% pseudoturnover. The majority of the error was attributed to observers overlooking species. Plot frame relocation error likely contributed as well, but we were unable to separate it from overlooking error with our design.
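Pseudoturnover, as used above, is commonly computed as the share of species records unique to one of the two observers. A minimal sketch under that assumption, with hypothetical species lists (not data from the monitoring plots):

```python
# Hypothetical double-sampling records for one plot.
observer_a = {"Acer saccharum", "Cercis canadensis", "Vinca minor", "Fraxinus sp."}
observer_b = {"Acer saccharum", "Cercis canadensis", "Lonicera japonica"}

# Pseudoturnover: records seen by only one observer, relative to all records.
unique = len(observer_a ^ observer_b)   # symmetric difference
pseudoturnover = 100 * unique / (len(observer_a) + len(observer_b))
print(f"pseudoturnover: {pseudoturnover:.0f}%")   # -> 43% for these lists
```

Species overlooked by one observer inflate the symmetric difference, which is why overlooking error dominates the pseudoturnover reported in the study.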
APA, Harvard, Vancouver, ISO, and other styles