To see the other types of publications on this topic, follow the link: Distribution of the number of rejections.

Journal articles on the topic 'Distribution of the number of rejections'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Distribution of the number of rejections.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

Heidenreich, S., C. Dercken, C. August, H. G. Koch, and U. Nowak-Göttl. "High rate of acute rejections in renal allograft recipients with thrombophilic risk factors." Journal of the American Society of Nephrology 9, no. 7 (1998): 1309–13. http://dx.doi.org/10.1681/asn.v971309.

Full text
Abstract:
Inherited and acquired thrombophilic disorders predispose patients for thromboembolic and probably other occlusive vascular events that occur when additional risk factors play in concert. Because acute rejections in renal transplant recipients may reflect vascular events, and an impairment of the fibrinolytic system in immunosuppressed patients has been previously described, the implications of genetic or acquired risk factors of thrombophilia for the occurrence of early acute rejections after kidney transplantation were evaluated. The following risk factors of thrombophilia were determined in 97 patients after cadaveric kidney transplantation: factor V Leiden mutation, protein S, protein C, and antithrombin deficiency. In a retrospective analysis, the prevalence of acute rejections, the histologic classification when rejection episodes had been confirmed by biopsy, and other vascular complications were evaluated. In 21 of the 97 patients, an inherited or acquired risk factor of thrombophilia was detected. Prevalence of acute rejections was 71% in the first 6 mo after transplantation in patients with a thrombophilic disorder and significantly higher compared with patients without thrombophilia (41%; P = 0.017). The distribution of classic risk factors associated with acute rejections, such as number of human leukocyte antigen mismatches or percentage of panel-reactive antibodies, was similar in patients with and without thrombophilia. In the eight patients with thrombophilia and histologically proven acute rejection, four patients had an acute vascular rejection, and in two patients a vascular involvement was suspected. Furthermore, prevalence of cerebral or coronary vascular disease, or venous thromboembolic complications, was significantly higher in patients with a thrombophilic clotting defect (67%) compared with patients with normal hemostasis parameters (28%; P < 0.002). It is concluded that renal allograft recipients with thrombophilia are at risk of developing an acute rejection or other vascular event. Although the determination of thrombotic risk factors was performed at least 3 mo after an acute rejection episode, it can be presumed that acute rejection episodes are associated with subsequent coagulatory abnormalities with further consequences for transplant survival. Thus, pretransplant evaluation of genetic and acquired risk factors of thrombophilia is recommended.
2

Ahmed, Yasir A., Areej Khaled Al-Zahrani, Reem Abdallah Al-Qhamdi, et al. "Reduce Rejected Laboratory Samples and Enhance Specimen Acceptability." International Journal of Life Sciences Research 10, no. 3 (2022): 14–18. https://doi.org/10.5281/zenodo.6948538.

Full text
Abstract:
Background: Clinical chemistry specimen rejections delay the availability of findings, which might affect patient care. The study's objective is to assess sample rejection. Methods: The study measured specimen rejection rates and the contributions of different rejection reasons, and undertook an intervention to reduce specimen rejection during the 2019 intervention period. It compared rejection rates, the number of months with rejection rates above 1.2%, and the distribution of rejection reasons between the two year-long intervals. The study also determined the origin of specimens rejected for the most common rejection reason during one month in the second period. Results: The most common reasons for rejection in the hematology and biochemistry areas were clotted blood specimens, improperly labeled specimen containers and hemolyzed blood samples. Conclusions: Using a qualitative methodology helped to formulate efficient plans to target this issue and reduce the rate of rejected samples. Moreover, the model shed light on how crucial the pre-analytical phase is to the laboratory quality improvement process, its effect on cost reduction, and the importance of staff competency and utilization. Keywords: rejection; blood samples; pre-analytic error; quality indicators; specimen insufficient; specimen rejections.
3

Hunt, Daniel L., Cheng Cheng, and Stanley Pounds. "The beta-binomial distribution for estimating the number of false rejections in microarray gene expression studies." Computational Statistics & Data Analysis 53, no. 5 (2009): 1688–700. http://dx.doi.org/10.1016/j.csda.2008.01.013.

Full text
4

Raskin, Lev, Oksana Sira, Oleksii Palant, and Yevgeniy Vodovozov. "Development of a Model of the Service System of Batch Arrivals in the Passengers Flow of Public Transport." Eastern-European Journal of Enterprise Technologies 5, no. 3 (101) (2019): 51–56. https://doi.org/10.15587/1729-4061.2019.180562.

Full text
Abstract:
A mathematical model of the queuing system for the passenger flow of urban public transport is proposed. The resulting model differs from canonical models of queuing theory by taking into account the fundamental features of real systems. Firstly, the service process is divided into different successive service sessions. Secondly, arrivals and departures occur in batches. Thirdly, the arrival rates vary between service sessions. Fourthly, the laws of distribution of the number of jobs in batch arrivals differ between sessions. Fifthly, the laws of distribution of the numbers of batch arrivals and departures are also different. A criterion of the efficiency of the service system is developed. The criterion is based on the calculation of the probability distribution of the service system states at the input and the analogous distribution at the output. These distributions are determined independently for each service session into which the entire service cycle is divided. The numerical value of the criterion is given by the ratio of the average number of service rejections to the average number of jobs in the batch arrivals over the entire service cycle. It can be used to assess the efficiency of the service system over any selected time interval during the day, because the value of the proposed criterion depends on the length of the interval between sessions, determined by the number of vehicles on the route. The resulting models adequately reflect the functioning of the system, which makes it possible to examine many different situations and evaluate the consequences of proposed solutions. Thus, it becomes possible to predict the provision of the population with public transport and to determine quantitative values of the efficiency of the urban public transport system.
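The rejection-ratio criterion described above is easy to reproduce in a toy simulation. The sketch below is a minimal illustration under assumptions of my own (Poisson batch arrivals, binomial alighting, a single vehicle of fixed capacity), not the authors' model:

```python
import numpy as np

def rejection_criterion(capacity, n_stops, batch_mean, alight_prob, seed=0):
    """Toy batch-arrival service cycle: at each stop a Poisson batch wants to
    board a vehicle of fixed capacity; passengers who do not fit are counted
    as service rejections. Returns (total rejections) / (total arrivals),
    the analogue of the paper's efficiency criterion."""
    rng = np.random.default_rng(seed)
    on_board = total_arrived = total_rejected = 0
    for _ in range(n_stops):
        on_board -= rng.binomial(on_board, alight_prob)  # batch departure
        batch = rng.poisson(batch_mean)                  # batch arrival
        boarded = min(batch, capacity - on_board)
        on_board += boarded
        total_arrived += batch
        total_rejected += batch - boarded
    return total_rejected / total_arrived

print(rejection_criterion(capacity=100, n_stops=10_000,
                          batch_mean=12, alight_prob=0.15))
```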
5

Dahani, Said, Nourredine Bouchriti, and Oleya Elhariri. "Analysis of Notifications of Rapid Alert System concerning Parasites in Fishery Products." World's Veterinary Journal 11, no. 2 (2021): 215–27. http://dx.doi.org/10.54203/scil.2021.wvj28.

Full text
Abstract:
Fish and fishery products are among Morocco's most important export products. Fish parasitism is a natural worldwide phenomenon, and fish parasites have a very wide distribution, being found in both the northern and southern hemispheres. The present study aimed to assess parasitic infestation in fishery products by analyzing notifications available in the European Rapid Alert System for Food and Feed. The analysis involved 663 notifications registered from 2001 to 2019 on the grounds of parasitic infestation. For Morocco, 651 notifications concerning the different exported food products were analyzed. Among the 663 notifications for the presence of parasites, 161 (24.3%) were border rejections. A total of 20 countries were found to have parasites in their exported fish and fish products. The main fish species affected by this hazard were hake (26%), silver scabbardfish (10.5%) and angler (9.3%). In Morocco, 373 of the 651 notifications concerned seafood (57.2%). The number of border rejections of fishery products was 220, that is, 33.8% of overall notifications. The fish and fish products category was the most affected, with 170 rejections (26.1%), of which 64 notifications were due to the presence of parasites (37.6%). The silver scabbardfish was the species most affected by parasite infestations (23.5%), followed by European anchovy (12.5%) and swordfish (10.9%). In conclusion, the nematode Anisakis is the most common parasite in fish infestation, while the plerocercoid larva of the cestode Gymnorhynchus gigas seems to have a predilection for infesting the Atlantic pomfret (Brama brama).
6

Zibari, G. B., K. N. Boykin, J. P. Thomas, et al. "OKT3 induction therapy: influence of duration on rejections and infections." Clinical Transplantation 10, no. 6pt2 (1996): 614–16. http://dx.doi.org/10.1111/j.1399-0012.1996.tb00756.x.

Full text
Abstract:
Anti-lymphocyte induction therapy in renal transplants remains controversial with respect to efficacy and cost benefit. It has been suggested that shortening the duration of induction therapy from 14 to 7 d would provide adequate efficacy at less cost. Our objective was to compare the efficacy and complications of short (7 d or less, group A) versus standard (14 d or more, group B) duration of OKT3 induction therapy in renal allograft recipients. We performed a retrospective review of all renal allografts performed between July 1989 and September 1994. Two groups were identified based on the duration of OKT3 induction therapy. There were no significant differences between groups A and B in the distribution of age, sex, race, degree of HLA matching, or etiology of renal failure. Patients in group B experienced fewer rejections at 3 and 12 months (p=0.0236 and p=0.0065, respectively) as well as fewer viral infections during the first year of follow-up (p=0.0435). No difference in the mean number of bacterial or fungal infections existed between the two groups. There were no statistically significant differences in patient or graft survival, although patients in group B had a tendency towards increased 1-yr graft survival.
7

Aprilia, Nita, Yusman Syaukat, and Faroby Falatehan. "Analisis Dampak Kebijakan Non-Tarif Measures Terhadap Kinerja Ekspor Udang Beku Indonesia di Pasar Tujuan Utama" [Analysis of the impact of non-tariff measures on the export performance of Indonesian frozen shrimp in main destination markets]. Jurnal Agribisnis Indonesia 11, no. 2 (2023): 311–25. http://dx.doi.org/10.29244/jai.2023.11.2.311-325.

Full text
Abstract:
The shrimp commodity contributes significantly to the total export value of Indonesia's fisheries sub-sector. In 2021, Indonesia became the fourth-largest exporter of frozen shrimp on the world market. However, the value of Indonesia's frozen shrimp exports fluctuates every year. Indonesia's frozen shrimp exports also face various challenges from non-tariff measures (NTM), especially the sanitary and phytosanitary (SPS) measures and technical barriers to trade (TBT) imposed by importing countries. This study aims to analyze the performance of Indonesia's frozen shrimp trade in destination markets and the impact of NTM policies and other factors on Indonesian shrimp exports in those markets. The research uses panel data regression. The results show that fishery products from Indonesia, including frozen shrimp, are still rejected in importing countries due to excess chemical content, but the number of rejections is less than one percent of the total accepted. This is in line with the panel data regression results, in which the SPS variable is not significant and the TBT variable has a significant and positive effect. The non-significant result indicates that exporters in Indonesia can adjust to the policies imposed by importing countries relating to product certification criteria, sampling procedures, packaging requirements, distribution requirements and labeling requirements. To address rejections, the Indonesian government needs to ensure accurate and thorough sample testing of Indonesian frozen shrimp before export, as well as to increase assistance on threshold limits for chemical use for shrimp cultivators at the upstream level.
8

Hurley, Catherine B., and Hosam M. Mahmoud. "Analysis of an Algorithm for Random Sampling." Probability in the Engineering and Informational Sciences 8, no. 2 (1994): 153–68. http://dx.doi.org/10.1017/s0269964800003302.

Full text
Abstract:
We analyze a standard algorithm for sampling m items without replacement from a computer file of n records. The algorithm repeatedly selects a record at random from the file, rejecting records that have previously been selected, until m records are obtained. The running time of the algorithm has two components: a rejection component and a search component. We show that the probability distribution of the rejection component undergoes an infinite series of phase transitions, depending on the order of magnitude of m relative to n. We identify an infinite number of ranges of m, each with a different behavior. The rejection component is distributed as a linear combination of Poisson-like random variables. The search component is customarily done using a hash table with separate chaining. The analysis of the hashing scheme in this problem differs from previous hashing analyses, as the number of lookups in the hash table for each insertion has a geometric distribution. We show that the average overall cost of searching is asymptotically linear with only two phase transitions in the coefficient of linearity.
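A minimal sketch of the algorithm analysed in this paper may help; here the hash-table search component is replaced by a plain Python set, and the returned rejection count is one realisation of the random variable whose distribution the authors study:

```python
import random

def sample_by_rejection(n, m, seed=0):
    """Select m of n records by repeated uniform draws, rejecting records
    that have already been selected. Returns the sample and the number of
    rejections incurred along the way."""
    rng = random.Random(seed)
    chosen, rejections = set(), 0
    while len(chosen) < m:
        r = rng.randrange(n)
        if r in chosen:
            rejections += 1          # duplicate draw: reject and retry
        else:
            chosen.add(r)
    return chosen, rejections

_, rej = sample_by_rejection(n=100_000, m=50_000)
print(rej)   # the behaviour depends on the magnitude of m relative to n
```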
9

Haridoss, V., and V. Sasikala. "Constructing Optimal Quick Switching System with Hurdle Poisson Distribution." Indian Journal of Science and Technology 17, no. 22 (2024): 2296–304. http://dx.doi.org/10.17485/ijst/v17i22.581.

Full text
Abstract:
Objectives: Optimizing the sum of the risks involved in the selection of acceptance sampling plans plays a vital role. This paper uses the Hurdle Poisson distribution to design an optimal quick switching system attribute plan for a given acceptable quality level (AQL) and limiting quality level (LQL) involving a minimum sum of risks. Methods: The sum of the producer's and consumer's risks has been met for the specified AQL and LQL. The sum of these risks, as well as the acceptance and rejection numbers, has been calculated using the Hurdle Poisson distribution. The operating characteristic function for the quick switching system attribute plan has also been derived using the Hurdle Poisson distribution. Findings: The producer and the consumer both represent the same party in the final inspection. As a result, the sum of these two risks should be minimized. In this paper, the sums of risks for various operating ratios are tabulated using the Hurdle Poisson distribution. These tabulated values are less than the sums of risks calculated using the Weighted Poisson distribution. Novelty: Reducing the sum of risks is the ultimate aim of the work. To attain the minimum sum of risks, the authors apply the quick switching system sampling plan when the number of defectives in the submitted lots is very small, that is, when the probability of a defective is very low. This reflects the quality of the lot selected for inspection and ensures protection for the consumer. The plan is also designed so that the producer is not harmed by the consumer rejecting a good lot. This is the requirement for minimizing the risks. Keywords: Acceptable Quality Level (AQL), Limiting Quality Level (LQL), Minimum Risk Plan, Quick Switching System Sampling Attribute Plan, Operating Characteristic (OC) Function, Hurdle Poisson Distribution
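For readers unfamiliar with the hurdle Poisson model, the sketch below shows one common parameterisation (a point mass at zero plus a zero-truncated Poisson for the positive counts) and how a lot-acceptance probability would be accumulated from it; the paper's exact parameterisation and plan parameters may differ:

```python
import math

def hurdle_poisson_pmf(k, pi0, lam):
    """P(X = 0) = pi0; for k >= 1, the zero-truncated Poisson(lam) pmf
    weighted by (1 - pi0). One common hurdle Poisson parameterisation."""
    if k == 0:
        return pi0
    poisson_k = math.exp(-lam) * lam**k / math.factorial(k)
    return (1.0 - pi0) * poisson_k / (1.0 - math.exp(-lam))

def acceptance_probability(c, pi0, lam):
    """Probability that the defect count does not exceed the acceptance
    number c, i.e. that the lot is accepted under a single-sampling rule."""
    return sum(hurdle_poisson_pmf(k, pi0, lam) for k in range(c + 1))

print(acceptance_probability(c=2, pi0=0.6, lam=1.5))
```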
10

Pigłowski, Marcin. "Pathogenic and Non-Pathogenic Microorganisms in the Rapid Alert System for Food and Feed." International Journal of Environmental Research and Public Health 16, no. 3 (2019): 477. http://dx.doi.org/10.3390/ijerph16030477.

Full text
Abstract:
The most frequently notified pathogenic microorganisms in the RASFF in 1980–2017 were Salmonella sp., Listeria, Escherichia and Vibrio, whereas the most frequently notified non-pathogenic microorganisms were unspecified microorganisms, Enterobacteriaceae, Salmonella sp. and coliforms. Microorganisms were reported mainly in poultry meat, meat, fish, molluscs, crustaceans, fruits, vegetables, herbs, spices, nuts, milk and cereals (in food) and in feed materials and pet food (in feed). The number of notifications decreased at the turn of 2005 and 2006, but has steadily increased since then. The notifications were based on official controls, border controls and companies' own checks. Products were notified mainly by Italy, France, the United Kingdom, Germany and the Netherlands. The reported products originated from Brazil, European Union countries, India, Thailand and Vietnam. The notification types were alerts, information and border rejections. The distribution status was often not specified, or distribution on the market was possible. The risk decision was usually not made. Products were re-dispatched, import was not authorised, or products were withdrawn from the market, destroyed and recalled from the market. Proper cooperation within the framework of the RASFF can contribute to shaping public health law and reducing outbreaks associated with microorganisms.
11

Zhou, Carl. "Expressing the randomity of events – An analysis of random number generation with given distributions." University of Ottawa Science Undergraduate Research Journal 1 (August 23, 2018): 47–54. http://dx.doi.org/10.18192/osurj.v1i1.3702.

Full text
Abstract:
In cases where it is necessary to generate random numbers that obey specific distributions, some of those distributions can be expressed as mathematical functions while others cannot. This is especially the case in epidemiological, medical and pharmaceutical investigations, where more accurate methods that use the actual distribution (from survey and experimental data) to generate random numbers may be required. In this study, three methods are analyzed with simple computational examples: the inverse transform, acceptance-rejection, and Monte Carlo simulation. Their applications are explored from a data-analysis point of view. Additionally, this article discusses a flexible and practical approach to the optimization of statistical measures, which approximates the solution by fitting the statistical measures.
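Of the three methods surveyed, acceptance-rejection is the easiest to mis-state, so a minimal sketch may help. It assumes a target density f bounded by c times a proposal density g; the Beta(2, 2) target, uniform proposal and envelope constant c = 1.5 (the maximum of the density) are illustrative choices of mine:

```python
import random

def acceptance_rejection(target_pdf, draw_proposal, proposal_pdf, c, rng):
    """Sample from target_pdf given target_pdf(x) <= c * proposal_pdf(x):
    draw x from the proposal, accept with probability
    target_pdf(x) / (c * proposal_pdf(x)), otherwise reject and retry."""
    while True:
        x = draw_proposal(rng)
        if rng.random() * c * proposal_pdf(x) <= target_pdf(x):
            return x

rng = random.Random(42)
beta22 = lambda x: 6.0 * x * (1.0 - x)        # Beta(2, 2) density on [0, 1]
xs = [acceptance_rejection(beta22, lambda r: r.random(),
                           lambda x: 1.0, 1.5, rng) for _ in range(10_000)]
print(sum(xs) / len(xs))   # close to the Beta(2, 2) mean of 0.5
```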
12

Bergstrand, Marie, Sara-Sofia Asp, and Göran Lindström. "Nationwide hydrological statistics for Sweden with high resolution using the hydrological model S-HYPE." Hydrology Research 45, no. 3 (2013): 349–56. http://dx.doi.org/10.2166/nh.2013.010.

Full text
Abstract:
A first version of nationwide hydrological statistics for Sweden, based on the S-HYPE hydrological model for the period 1961–2010, is described. A key feature of the proposed method is that observed data are used as input wherever available, and the model is used for interpolation between stations. Short observation records are automatically extended by use of the model. High-flow statistics typically differed by about ±10% from observations; the corresponding figure for low flows was about ±30%. High-flow peaks were usually simulated slightly too low, whereas low flows were too high. In a relative sense, low flows were more uncertain than high flows, and the mean flow was relatively certain. The annual maximum values were fitted to a Gumbel distribution, by the method of moments, for each subbasin. Flood statistics were then calculated up to a return period of 50 years. According to a Kolmogorov–Smirnov test, less than 1% of the fitted distributions were rejected. Most rejections occurred in regulated systems, due to difficulties in simulating regulation strategies, but also due to uncertainties in the precipitation input in the mountainous region. Results at small scales are very uncertain. The proposed method is a cost-effective way of calculating hydrological statistics with high spatial resolution.
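The Gumbel fit and rejection test described above are straightforward to reproduce. The sketch below, on synthetic annual maxima, uses the method of moments (scale = s·√6/π, location = mean − γ·scale, with γ the Euler–Mascheroni constant) and a Kolmogorov–Smirnov test; note that applying the KS test with parameters estimated from the same data is only approximate:

```python
import numpy as np
from scipy import stats

def fit_gumbel_moments(maxima):
    """Method-of-moments fit of the Gumbel (EV1) distribution."""
    x = np.asarray(maxima, dtype=float)
    scale = x.std(ddof=1) * np.sqrt(6.0) / np.pi
    loc = x.mean() - np.euler_gamma * scale
    return loc, scale

annual_max = stats.gumbel_r.rvs(loc=100.0, scale=25.0, size=50,
                                random_state=np.random.default_rng(1))
loc, scale = fit_gumbel_moments(annual_max)
ks = stats.kstest(annual_max, "gumbel_r", args=(loc, scale))
print(loc, scale, ks.pvalue)          # reject the fit if the p-value is tiny
# 50-year flood level: the (1 - 1/50) quantile of the fitted distribution
print(stats.gumbel_r.ppf(1 - 1 / 50, loc=loc, scale=scale))
```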
13

Karpov, D. A., and V. I. Struchenkov. "Dynamic programming in applied tasks which are allowing to reduce the options selection." Russian Technological Journal 8, no. 4 (2020): 96–111. http://dx.doi.org/10.32362/2500-316x-2020-8-4-96-111.

Full text
Abstract:
The article discusses the dynamic programming algorithm developed by R. Bellman, based on the search for the optimal trajectory connecting the nodes of a predefined regular grid of states. Possibilities are analyzed for a sharp increase in the effectiveness of dynamic programming in solving applied problems with specific features, which allow one to dispense with a regular grid of states and to implement an algorithm that finds the optimal trajectory by rejecting not only unpromising variants of the paths leading to each state, together with all their continuations, as in R. Bellman's algorithm, but also genuinely hopeless states and all variants of the paths emanating from them. The conditions under which the rejection of hopeless states is possible are formulated and justified, and it is established that many applied problems satisfy these conditions. To solve such problems, a new dynamic programming algorithm described in the article is proposed and implemented. Concrete examples of such applied problems are given: the optimal distribution of a homogeneous resource among several consumers, the optimal loading of vehicles, and the optimal distribution of finances when choosing investment projects. To solve these problems, dynamic programming algorithms that reject unpromising paths, but not states, had been proposed previously. The number of hopeless states that appear at various stages of dynamic programming, and hence the effectiveness of the new algorithm, depends on the specific numerical values of the source data. For the two-parameter problem of optimal loading of vehicles with weight and volume constraints, the results of comparative calculations by R. Bellman's algorithm and the new dynamic programming algorithm are presented, with pseudorandom numbers used as the source data for a series of calculations. The analysis shows that the comparative efficiency of the algorithm with rejection of states increases with the dimension of the problem. Thus, in the problem of the optimal choice of items for loading a vehicle of a given carrying capacity with 150 items, the number of memorized states and the computation time are reduced by factors of 50 and 57, respectively, when the new algorithm is used instead of R. Bellman's classical algorithm; for 15 items, the corresponding factors are 13 and 4.
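To make the idea of rejecting states (not just paths) concrete, here is a minimal sketch for the one-constraint loading (0/1 knapsack) problem: after each item, any state that is heavier yet no more valuable than another is hopeless and is discarded, so only Pareto-optimal (weight, value) states survive. This illustrates the principle only; the authors' algorithm and experiments are more general:

```python
def knapsack_rejecting_states(items, capacity):
    """0/1 knapsack by dynamic programming over (weight, value) states,
    rejecting dominated states after every item."""
    states = [(0, 0)]                                   # (weight, value)
    for weight, value in items:
        merged = states + [(w + weight, v + value)
                           for w, v in states if w + weight <= capacity]
        merged.sort(key=lambda s: (s[0], -s[1]))        # light first, rich first
        states, best = [], -1
        for w, v in merged:
            if v > best:                                # undominated state
                states.append((w, v))
                best = v
    return max(v for _, v in states)

items = [(12, 4), (2, 2), (1, 2), (1, 1), (4, 10)]      # (weight, value)
print(knapsack_rejecting_states(items, capacity=15))    # -> 15
```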
14

Gadde, Srinivasa Rao, Arnold K. Fulment, and Josephat K. Peter. "Design of Multiple Dependent State Sampling Plan Application for COVID-19 Data Using Exponentiated Weibull Distribution." Complexity 2021 (October 30, 2021): 1–10. http://dx.doi.org/10.1155/2021/2795078.

Full text
Abstract:
The sampling plan proposed in this article is a multiple dependent state (MDS) sampling plan, in which a lot is accepted or rejected based on properties of the current and preceding sampled lots. The median life of the product under the proposed plan is assured based on a time-truncated life test, when the lifetime of the product follows the exponentiated Weibull distribution (EWD). For the proposed plan, optimal parameters such as the number of preceding lots required for deciding whether to accept or reject the current lot, the sample size, and the rejection and acceptance numbers are obtained by the approach of two points on the operating characteristic curve (OC curve). Tables are constructed for various combinations of consumer's and producer's risks and for various shape parameters. The proposed MDS sampling plan for the EWD is demonstrated using data from the coronavirus (COVID-19) outbreak in China. The performance of the proposed plan is compared with the existing single-sampling plan (SSP) when the quality of the product follows the EWD.
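As background for the time-truncated test, the sketch below shows one common parameterisation of the exponentiated Weibull cdf, its closed-form median, and the failure probability at a truncation time; the paper's notation and the MDS decision rule itself are not reproduced here, and the parameter values are illustrative:

```python
import math

def ew_cdf(t, alpha, beta, lam):
    """Exponentiated Weibull cdf F(t) = (1 - exp(-(t/lam)**beta))**alpha,
    with shape parameters alpha, beta > 0 and scale lam > 0."""
    return (1.0 - math.exp(-((t / lam) ** beta))) ** alpha

def ew_median(alpha, beta, lam):
    """Solve F(t) = 1/2: t = lam * (-log(1 - 0.5**(1/alpha)))**(1/beta)."""
    return lam * (-math.log(1.0 - 0.5 ** (1.0 / alpha))) ** (1.0 / beta)

# The failure probability by the truncation time t0 drives the binomial law
# of the number of failures observed among n tested units.
alpha, beta, lam = 2.0, 1.5, 1500.0
print(ew_median(alpha, beta, lam), ew_cdf(1000.0, alpha, beta, lam))
```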
15

de Figueiredo, Leandro Passos, Dario Grana, Mauro Roisenberg, and Bruno B. Rodrigues. "Gaussian mixture Markov chain Monte Carlo method for linear seismic inversion." GEOPHYSICS 84, no. 3 (2019): R463–R476. http://dx.doi.org/10.1190/geo2018-0529.1.

Full text
Abstract:
We have developed a Markov chain Monte Carlo (MCMC) method for joint inversion of seismic data for the prediction of facies and elastic properties. The solution of the inverse problem is defined by the Bayesian posterior distribution of the properties of interest. The prior distribution is a Gaussian mixture model, and each component is associated to a potential configuration of the facies sequence along the seismic trace. The low frequency is incorporated by using facies-dependent depositional trend models for the prior means of the elastic properties in each facies. The posterior distribution is also a Gaussian mixture, for which the Gaussian component can be analytically computed. However, due to the high number of components of the mixture, i.e., the large number of facies configurations, the computation of the full posterior distribution is impractical. Our Gaussian mixture MCMC method allows for the calculation of the full posterior distribution by sampling the facies configurations according to the acceptance/rejection probability. The novelty of the method is the use of an MCMC framework with multimodal distributions for the description of the model properties and the facies along the entire seismic trace. Our method is tested on synthetic seismic data, applied to real seismic data, and validated using a well test.
16

Balan, Ramkumar, and Sajana Kunjunni. "Truncated Life Test Plans for Economic Reliability Based on Four-Parametric Burr Distribution." Journal of Industrial Mathematics 2013 (November 28, 2013): 1–6. http://dx.doi.org/10.1155/2013/489285.

Full text
Abstract:
The Burr distribution is considered as a probability model for the lifetime of products. Reliability test plans are sampling plans in which items from a lot are put on test to draw conclusions about the estimated life, and hence to accept or reject the submitted lot. A test plan specifies the termination time of the experiment and the termination number for a given sample size and producer's risk. Tables and graphs are provided for certain specific design values, and they are useful for verifying the optimum reliability test plan realized by Burr distributions.
17

Haridoss, V., and V. Sasikala. "Constructing Optimal Quick Switching System with Hurdle Poisson Distribution." Indian Journal of Science and Technology 17, no. 22 (2024): 2296–304. https://doi.org/10.17485/IJST/v17i22.581.

Full text
Abstract:
Objectives: Optimizing the sum of the risks involved in the selection of acceptance sampling plans plays a vital role. This paper uses the Hurdle Poisson distribution to design an optimal quick switching system attribute plan for a given acceptable quality level (AQL) and limiting quality level (LQL) involving a minimum sum of risks. Methods: The sum of the producer's and consumer's risks has been met for the specified AQL and LQL. The sum of these risks, as well as the acceptance and rejection numbers, has been calculated using the Hurdle Poisson distribution. The operating characteristic function for the quick switching system attribute plan has also been derived using the Hurdle Poisson distribution. Findings: The producer and the consumer both represent the same party in the final inspection. As a result, the sum of these two risks should be minimized. In this paper, the sums of risks for various operating ratios are tabulated using the Hurdle Poisson distribution. These tabulated values are less than the sums of risks calculated using the Weighted Poisson distribution. Novelty: Reducing the sum of risks is the ultimate aim of the work. To attain the minimum sum of risks, the authors apply the quick switching system sampling plan when the number of defectives in the submitted lots is very small, that is, when the probability of a defective is very low. This reflects the quality of the lot selected for inspection and ensures protection for the consumer. The plan is also designed so that the producer is not harmed by the consumer rejecting a good lot. This is the requirement for minimizing the risks. Keywords: Acceptable Quality Level (AQL), Limiting Quality Level (LQL), Minimum Risk Plan, Quick Switching System Sampling Attribute Plan, Operating Characteristic (OC) Function, Hurdle Poisson Distribution
18

Zakharov, I. M., V. A. Smirnov, D. V. Sushnikov, et al. "Modification of tundish design to improve quality of slabs, casted at CCM No. 4 of EVRAZ NTMK." Ferrous Metallurgy. Bulletin of Scientific, Technical and Economic Information 76, no. 6 (2020): 559–63. http://dx.doi.org/10.32339/0135-5910-2020-6-559-563.

Full text
Abstract:
The technology of continuous casting has a large effect on steel contamination; in particular, proper organization of the metal flows in the tundish and mold is very important. After completion of a series of castings through a tundish, as the metal level in it drops, slag from the surface layers of the tundish can entrap metal. An analysis of the results of ultrasonic inspection of finished strip showed that most of the revealed defects were found in strip rolled from the last slabs of the last heat in a series for a tundish. Metallographic studies determined that the defects were located in the axial zone of the slab and were filled with macro-inclusions of complex composition. To determine the actual distribution of metal flows, water modelling was carried out for the existing design of the EVRAZ NTMK tundish. It was found that, as the metal level in the tundish drops, slag is captured from the metal reservoir and transferred into the main bath of the tundish by vortex flows. As the metal flow rate by weight increases, the zone over which slag inclusions are distributed enlarges. In addition, the metal mirror in the metal reservoir becomes denuded, resulting in secondary oxidation of the metal. Based on the results of the modelling, it was proposed to modify the design of the "turbostop" installed in the metal reservoir of the tundish and to remove the dividers installed in the tundish. It was noted that the proposed steps ensure a minimal level of rejections.
19

Deng, Jiang Ming, Te Fang Chen, and Shu Cheng. "MVB-Based Dynamic Supervision of Docks in Traffic Memory." Advanced Materials Research 403-408 (November 2011): 2728–31. http://dx.doi.org/10.4028/www.scientific.net/amr.403-408.2728.

Full text
Abstract:
High reliability of the Traffic Memory (TM) working condition is of vital importance to information transfer in the MVB. In order to avoid bus traffic overflows, the minimum possible time intervals, or loading ratios, of the TM for a given number of ports were calculated. The longer the supervision period, the larger the number of docks that can be supervised. The number of docks supervised during supervision intervals follows a normal distribution N(μ, σ²). From the probability distribution function, the maximum probable number of docks in operation can be found. From the disturbance rejection test of the fixed factor σ, a reasonable setting of the sink-time supervision interval can be made to guarantee high reliability of the TM working condition.
20

Aston, John A. D., and Donald E. K. Martin. "Waiting time distributions of competing patterns in higher-order Markovian sequences." Journal of Applied Probability 42, no. 4 (2005): 977–88. http://dx.doi.org/10.1239/jap/1134587810.

Full text
Abstract:
Competing patterns are compound patterns that compete to be the first to occur pattern-specific numbers of times. They represent a generalisation of the sooner waiting time problem and of start-up demonstration tests with both acceptance and rejection criteria. Through the use of finite Markov chain imbedding, the waiting time distribution of competing patterns in multistate trials that are Markovian of a general order is derived. Also obtained are probabilities that each particular competing pattern will be the first to occur its respective prescribed number of times, both in finite time and in the limit.
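A small simulation makes the notion of competing patterns concrete. The sketch below uses i.i.d. coin tosses (the simplest case; the paper handles trials that are Markovian of general order via finite Markov chain imbedding) and counts overlapping occurrences; the first pattern to reach its prescribed count wins:

```python
import random

def competing_patterns(patterns, needed, p_heads, rng):
    """Toss H/T until one pattern has occurred its prescribed number of
    times. Returns (index of the winning pattern, waiting time)."""
    window = max(len(p) for p in patterns)
    seq, hits, t = "", [0] * len(patterns), 0
    while True:
        t += 1
        seq = (seq + ("H" if rng.random() < p_heads else "T"))[-window:]
        for i, pat in enumerate(patterns):
            if seq.endswith(pat):
                hits[i] += 1
                if hits[i] == needed[i]:
                    return i, t

rng = random.Random(7)
wins, total = [0, 0], 0
for _ in range(20_000):
    i, t = competing_patterns(["HH", "HT"], (2, 2), 0.5, rng)
    wins[i] += 1
    total += t
print(wins[0] / 20_000, total / 20_000)   # win probability, mean waiting time
```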
21

Aston, John A. D., and Donald E. K. Martin. "Waiting time distributions of competing patterns in higher-order Markovian sequences." Journal of Applied Probability 42, no. 04 (2005): 977–88. http://dx.doi.org/10.1017/s0021900200001042.

Full text
Abstract:
Competing patterns are compound patterns that compete to be the first to occur pattern-specific numbers of times. They represent a generalisation of the sooner waiting time problem and of start-up demonstration tests with both acceptance and rejection criteria. Through the use of finite Markov chain imbedding, the waiting time distribution of competing patterns in multistate trials that are Markovian of a general order is derived. Also obtained are probabilities that each particular competing pattern will be the first to occur its respective prescribed number of times, both in finite time and in the limit.
22

Yahlinskyi, V. P., S. S. Hutyria, and V. V. Vovk. "Model for Calculating the Reliability of Samples of Kinematic Pairs of Machine-Robots." Zbìrnik naukovih pracʹ Odesʹkoï deržavnoï akademìï tehnìčnogo regulûvannâ ta âkostì, no. 1(20) (2022): 37–43. http://dx.doi.org/10.32684/2412-5288-2022-1-20-37-43.

Full text
Abstract:
The article addresses reliability issues of machine-tool mechanisms that arise at the stages of pre-project studies and experimental tests. With the introduction of highly effective technological processes, automated design and production systems, and complex systems providing a high level of labor productivity, the requirements for reliability, durability and workability are constantly growing. Failures of mechanisms within robotic machines, according to their physical nature, can be associated with the destruction of assemblies and parts of the mechanisms and their drives, jamming of individual elements, and other causes that render the technological equipment unable to perform its functions. Rice's formula was used to determine the average parameter of the failure flow. To estimate this parameter, it is necessary to know the joint density of the state parameter and the rate of its change over time. In many cases, such processes are described by a stationary random function with a normal law of distribution of both the state parameter and the rate of its change. Parametric failures precede functional failures and can also cause them. Sudden failures are considered in a quasi-static formulation as a random sequence of parameters exceeding permissible limits. It is proposed to carry out the reliability study in three main stages: construction, on the basis of theoretical or experimental studies, of the dependence of the initial parameter on load indicators; a thorough statistical analysis of operating conditions with a probabilistic description of the load indicators; and construction of the distribution of the initial parameter. An algorithm for calculating system reliability indicators is presented, in which the initial parameter is represented as a random sequence simulating the operation of a certain number of samples of the machine-robot kinematic pair under real operating conditions up to their complete wear. The initial basic size and the criterion parameter for each sample are determined by the Monte Carlo method.
23

Amalia, Suci, H. Djumardin, and Aris Munandar. "Juridical Analysis of Rejection of Cancellation of the Deed of Sharing Collective Rights Based on Ruling Number 803K/AG/2017." RESEARCH REVIEW International Journal of Multidisciplinary 9, no. 1 (2024): 88–96. http://dx.doi.org/10.31305/rrijm.2024.v09.n01.012.

Full text
Abstract:
The aim of this research is to analyze the concept and regulation of the sharing of joint rights in positive law, and to analyze the judge's considerations in refusing to cancel the deed of sharing of joint rights based on Decision Number 803K/AG/2017. This is normative research. The problems are how the distribution of joint rights is conceived and regulated in positive law, and how the judge weighed the rejection of the cancellation of the deed of distribution of joint rights based on Decision Number 803K/AG/2017. The results of this research are as follows. Sharing of joint rights is a legal action carried out by holders of joint rights to land so that each holder obtains an individual right based on the Deed of Sharing of Joint Rights. Arrangements for the distribution of joint rights are contained in various regulations, namely Government Regulation Number 24 of 1997 concerning Land Registration (Article 31 paragraphs 4 and 5, and Article 51 paragraph 1); the Regulation of the Minister of State for Agrarian Affairs/Head of the National Land Agency Number 3 of 1997 concerning Provisions for Implementing Government Regulation Number 24 of 1997 concerning Land Registration (Article 94 paragraph 2, Articles 95, 96 and 105, Article 111 paragraph 4, and Article 136 paragraphs 1 and 2); and the Explanation of the Regulation of the Government of the Republic of Indonesia Number 18 of 2021 concerning Management Rights, Land Rights, Flats and Land Registration (Article 67 paragraph 1). It can be seen from the judge's various considerations that the ground for filing this cassation was the cancellation of the deed of sharing of joint rights, which had been made by the defendants legally before the PPAT and in accordance with the existing provisions. The other reasons in this cassation cannot be justified because they are matters of fact, and no errors were found, so they cannot be considered in this cassation decision, in accordance with the provisions of Article 30 of Law Number 14 of 1985 concerning the Supreme Court.
24

Dudoit, Sandrine, Mark J. van der Laan, and Katherine S. Pollard. "Multiple Testing. Part I. Single-Step Procedures for Control of General Type I Error Rates." Statistical Applications in Genetics and Molecular Biology 3, no. 1 (2004): 1–69. http://dx.doi.org/10.2202/1544-6115.1040.

Full text
Abstract:
The present article proposes general single-step multiple testing procedures for controlling Type I error rates defined as arbitrary parameters of the distribution of the number of Type I errors, such as the generalized family-wise error rate. A key feature of our approach is the test statistics null distribution (rather than data generating null distribution) used to derive cut-offs (i.e., rejection regions) for these test statistics and the resulting adjusted p-values. For general null hypotheses, corresponding to submodels for the data generating distribution, we identify an asymptotic domination condition for a null distribution under which single-step common-quantile and common-cut-off procedures asymptotically control the Type I error rate, for arbitrary data generating distributions, without the need for conditions such as subset pivotality. Inspired by this general characterization of a null distribution, we then propose as an explicit null distribution the asymptotic distribution of the vector of null value shifted and scaled test statistics. In the special case of family-wise error rate (FWER) control, our method yields the single-step minP and maxT procedures, based on minima of unadjusted p-values and maxima of test statistics, respectively, with the important distinction in the choice of null distribution. Single-step procedures based on consistent estimators of the null distribution are shown to also provide asymptotic control of the Type I error rate. A general bootstrap algorithm is supplied to conveniently obtain consistent estimators of the null distribution. The special cases of t- and F-statistics are discussed in detail. The companion articles focus on step-down multiple testing procedures for control of the FWER (van der Laan et al., 2004b) and on augmentations of FWER-controlling methods to control error rates such as tail probabilities for the number of false positives and for the proportion of false positives among the rejected hypotheses (van der Laan et al., 2004a). The proposed bootstrap multiple testing procedures are evaluated by a simulation study and applied to genomic data in the fourth article of the series (Pollard et al., 2004).
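For orientation, the sketch below implements the classical permutation form of the single-step maxT procedure (Westfall-Young style). The article's construction differs in using a bootstrap, null-value-shifted test-statistics null distribution rather than permutation, but the single-step adjustment logic is the same:

```python
import numpy as np

def single_step_maxT(x, groups, n_perm=1000, seed=0):
    """Adjusted p-values p_j = P(max_k |T_k| >= |t_j|) under a permutation
    null distribution. x: (m tests, n samples); groups: boolean length-n."""
    rng = np.random.default_rng(seed)
    groups = np.asarray(groups, dtype=bool)

    def t_stats(g):
        a, b = x[:, g], x[:, ~g]
        se = np.sqrt(a.var(axis=1, ddof=1) / a.shape[1]
                     + b.var(axis=1, ddof=1) / b.shape[1])
        return (a.mean(axis=1) - b.mean(axis=1)) / se

    observed = np.abs(t_stats(groups))
    max_null = np.array([np.abs(t_stats(rng.permutation(groups))).max()
                         for _ in range(n_perm)])
    return np.array([(max_null >= t).mean() for t in observed])

x = np.random.default_rng(3).normal(size=(200, 20))
x[:5, :10] += 2.0                       # five truly shifted hypotheses
print(single_step_maxT(x, [True] * 10 + [False] * 10)[:8])
```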
25

Ravikumar, M. S., A. Naga Durgamamba, and R. R. L. Kantam. "Economic Reliability Test Plan for Burr Type X Distribution." International Journal of Advanced Engineering Research and Applications 5, no. 03 (2019): 56–63. http://dx.doi.org/10.46593/ijaera.2019.v05i03.001.

Full text
Abstract:
The Burr Type X distribution is considered as a probability model for the lifetime of the product. Sampling plans in which items are put on test to record their lifetimes, in order to decide whether to accept or reject a submitted lot, are called reliability test plans. A test plan determining the termination time of the experiment for a given sample size, producer's risk and termination number is constructed. The preferability of the present test plan over similar plans in the literature is established with respect to the duration of the experiment. Results are illustrated by an example.
26

Zoramawa, A. B., and A. S. Charanchi. "A Study on Sequential Probability Sampling for Monitoring a Resubmitted Lots under Burr-Type XII Distribution." Continental J. Applied Sciences 16, no. 2 (2021): 16–26. https://doi.org/10.5281/zenodo.5540382.

Full text
Abstract:
In this research, sequential probability sampling analysis was used to treat the sample size obtained from either single or double sampling plans. Specifically, the research considered and compared the minimum samples obtained from the Burr Type XII distribution. Estimates of the minimum sample and of the acceptance and rejection numbers were analyzed and presented to explain the usefulness of sequential plans in relation to single and double sampling plans. The Average Sample Number (ASN) obtained indicated that the hypothesis was accepted at the various risk levels, showing that there is a time limit at which to terminate the sampling. Sequential probability sampling (SPS) plays a vital role in any sampling plan obtained using the Burr Type XII distribution and saves inspection time, which is among the major concerns of both producers and consumers in the manufacturing industries.
27

Charilogis, Vasileios, Ioannis G. Tsoulos, and V. N. Stavrou. "An Intelligent Technique for Initial Distribution of Genetic Algorithms." Axioms 12, no. 10 (2023): 980. http://dx.doi.org/10.3390/axioms12100980.

Full text
Abstract:
The need to find the global minimum in multivariable functions is a critical problem in many fields of science and technology. Effectively solving this problem requires the creation of initial solution estimates, which are subsequently used by the optimization algorithm to search for the best solution in the solution space. In the context of this article, a novel approach to generating the initial solution distribution is presented, which is applied to a genetic optimization algorithm. Using the k-means clustering algorithm, a distribution based on data similarity is created. This helps in generating initial estimates that may be more tailored to the problem. Additionally, the proposed method employs a rejection sampling algorithm to discard samples that do not yield better solution estimates in the optimization process. This allows the algorithm to focus on potentially optimal solutions, thus improving its performance. Finally, the article presents experimental results from the application of this approach to various optimization problems, providing the scientific community with a new method for addressing this significant problem.
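A rough sketch of the two ingredients described above (k-means over an initial pool, then rejection of samples that do not improve on a threshold) is given below for a minimisation problem. The clustering call, the perturbation scale, and the median threshold are illustrative assumptions of mine, not the authors' exact scheme:

```python
import numpy as np
from sklearn.cluster import KMeans

def seeded_population(objective, bounds, pool=2000, k=20, pop=50, seed=0):
    """Initial GA population: cluster a uniform pool with k-means, then
    sample near cluster centres, rejecting any candidate whose objective
    is worse than the median objective value over the centres."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    X = rng.uniform(lo, hi, size=(pool, lo.size))
    km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(X)
    centres = km.cluster_centers_
    threshold = np.median([objective(c) for c in centres])
    population = []
    while len(population) < pop:
        cand = centres[rng.integers(k)] + rng.normal(0.0, 0.1 * (hi - lo))
        cand = np.clip(cand, lo, hi)
        if objective(cand) <= threshold:          # rejection sampling step
            population.append(cand)
    return np.asarray(population)

sphere = lambda x: float(np.sum(x * x))           # toy objective to minimise
print(seeded_population(sphere, [(-5.0, 5.0)] * 2).shape)   # (50, 2)
```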
28

Okamoto, A., S. Higuchi, K. Sato, et al. "Laser Thomson scattering system for anisotropic electron temperature measurement in NUMBER." Journal of Instrumentation 18, no. 10 (2023): C10013. http://dx.doi.org/10.1088/1748-0221/18/10/c10013.

Full text
Abstract:
A laser Thomson scattering system has been developed to obtain the anisotropic electron temperature in a pulse-operated electron cyclotron resonance plasma device. Injecting a laser at an oblique angle to the external magnetic field and detecting the scattered photons along a line of sight at another oblique angle enables the parallel and perpendicular components of the electron temperature to be obtained. The backward (165°) scattering spectrum corresponds to the quasi-perpendicular velocity distribution, while the forward (15°) spectrum corresponds to the quasi-parallel one. Collecting lenses are set in vacuum to maximize the solid angle in the limited space of the port, where the laser path and the collecting optics share an ICF152 flange. The Rayleigh scattering intensity was measured as a function of argon gas pressure; an initial result shows the residual stray light intensity to be equivalent to the argon Rayleigh scattering intensity at a fill pressure of ≤ 1 kPa. In order to obtain Thomson scattering spectra, notch-filter-type stray light rejection optics, which utilize a reflective volume holographic grating, were evaluated. The notch width (0.2 nm) and the flat transmittance (≥ 90%) outside the notch measured for a prototype show that they are effective optics.
29

Zheng, Zhongxiang, Anyu Wang, and Lingyue Qin. "Rejection Sampling Revisit: How to Choose Parameters in Lattice-Based Signature." Mathematical Problems in Engineering 2021 (June 7, 2021): 1–12. http://dx.doi.org/10.1155/2021/9948618.

Full text
Abstract:
Rejection sampling is a core tool in the design of lattice-based signatures with the 'Fiat–Shamir with aborts' structure, and it affects signing efficiency and signature size as well as security. In the rejection sampling theorem proposed by Lyubashevsky, the masking vector used in rejection sampling is chosen from a discrete Gaussian distribution. In practical designs, however, the masking vector is more often chosen from a bounded uniform distribution, owing to better efficiency and simpler implementation. Moreover, as one of the third-round candidate signatures in the NIST post-quantum cryptography standardization process, the third-round version of CRYSTALS-Dilithium proposed a new method to decrease the rejection probability, achieving better efficiency and smaller signature size by decreasing the number of nonzero coefficients of the challenge polynomial according to the security level. However, the small entropies in this new method may lead to a higher risk of forgery attack compared with the schemes proposed in the second-round version. In this paper, we first analyze the complexity of the forgery attack for small entropies and then introduce a new method to decrease the rejection probability without loss of security, including security against the forgery attack. This is achieved by introducing a new rejection sampling theorem with a tighter bound, utilizing the Rényi divergence, where the masking vector follows a uniform distribution. Observing large gaps between the security claim and the actual security bound in CRYSTALS-Dilithium, we propose two series of adapted parameters for CRYSTALS-Dilithium. The first set improves the efficiency of the signing process by factors of 61.7% and 41.7%, depending on the security level, while ensuring security against known attacks, including the forgery attack. The second set reduces the signature size by 14.09%, with small improvements in efficiency at the same security level.
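The uniform-masking rejection step discussed above can be sketched in a few lines. This is a schematic of the 'Fiat–Shamir with aborts' idea only (coefficients are plain integers, and sc stands in for the secret-dependent vector s·c bounded by β); real schemes such as CRYSTALS-Dilithium work over polynomial rings and apply further checks:

```python
import random

def sign_attempt(sc, n, B, beta, rng):
    """One rejection-sampling round: y uniform on [-B, B]^n, z = y + s*c;
    release z only if every coefficient lies in [-(B-beta), B-beta].
    Conditioned on acceptance, z is uniform and independent of s*c."""
    y = [rng.randint(-B, B) for _ in range(n)]
    z = [yi + si for yi, si in zip(y, sc)]
    return z if all(abs(zi) <= B - beta for zi in z) else None

rng = random.Random(0)
n, B, beta = 256, 1 << 17, 60
sc = [rng.randint(-beta, beta) for _ in range(n)]   # stand-in for s*c
attempts, z = 0, None
while z is None:
    attempts += 1
    z = sign_attempt(sc, n, B, beta, rng)
# Expected attempts ~ ((2B + 1) / (2(B - beta) + 1))**n, about 1.12 here.
print(attempts)
```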
30

Connolly, Patrick, and Ben Dutton. "Optimized rejection sampling for estimating facies probabilities from seismic data." Leading Edge 43, no. 6 (2024): 368–81. http://dx.doi.org/10.1190/tle43060368.1.

Full text
Abstract:
Seismic inversion for facies has nonunique solutions. There are invariably many vertical facies arrays that are consistent with both a data trace and the prior information. Stochastic sampling algorithms set within a Bayesian framework can provide an estimate of the posterior probability distribution of facies arrays by finding the arrays with relatively high posterior probabilities for each data trace. Sample-by-sample facies probabilities can be estimated by measuring the proportions of each facies type at each sample location from the set of posterior facies arrays. To enable the estimation of probabilities of facies mixtures and to obtain high-quality images of facies probability curves, facies must be modeled at high resolution. The facies arrays, or vectors, on which the sampling algorithm operates must also be long enough to allow for vertical coupling caused by the wavelet. This results in very large sample spaces. The posterior probability distribution is highly nonconvex, which, combined with the large sample space, severely challenges conventional stochastic sampling methods in obtaining convergence of the estimated posterior distribution. The posterior sets of vectors from conventional methods tend either to be correlated or to have low predictabilities, resulting in biased or noisy facies probability estimates, respectively. However, accurate estimates of facies probabilities can be obtained from a relatively small number of posterior facies vectors (about 100), provided that they are uncorrelated and have high predictabilities. Full convergence of the posterior distribution is not required. A hybrid algorithm, optimized rejection sampling, can be designed specifically for the seismic facies probability inversion problem by combining independent sampling of the prior, which ensures that posterior vectors are uncorrelated, with an optimization step to obtain high predictabilities. Tests on both real and synthetic data demonstrate better results than conventional rejection sampling and Markov chain Monte Carlo methods.
31

Wilks, D. S. "On “Field Significance” and the False Discovery Rate." Journal of Applied Meteorology and Climatology 45, no. 9 (2006): 1181–89. http://dx.doi.org/10.1175/jam2404.1.

Full text
Abstract:
The conventional approach to evaluating the joint statistical significance of multiple hypothesis tests (i.e., "field," or "global," significance) in meteorology and climatology is to count the number of individual (or "local") tests yielding nominally significant results and then to judge the unusualness of this integer value in the context of the distribution of such counts that would occur if all local null hypotheses were true. The sensitivity (i.e., statistical power) of this approach is potentially compromised both by the discrete nature of the test statistic and by the fact that the approach ignores the confidence with which locally significant tests reject their null hypotheses. An alternative global test statistic that has neither of these problems is the minimum p value among all of the local tests. Evaluation of field significance using the minimum local p value as the global test statistic, which is also known as the Walker test, has strong connections to the joint evaluation of multiple tests in a way that controls the "false discovery rate" (FDR, or the expected fraction of local null hypothesis rejections that are incorrect). In particular, using the minimum local p value to evaluate field significance at a level α_global is nearly equivalent to the slightly more powerful global test based on the FDR criterion. An additional advantage shared by Walker's test and the FDR approach is that both are robust to spatial dependence within the field of tests. The FDR method not only provides a more broadly applicable and generally more powerful field significance test than the conventional counting procedure but also allows better identification of locations with significant differences, because fewer than α_global × 100% (on average) of apparently significant local tests will have resulted from local null hypotheses that are true.
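The two global tests compared in the paper each take only a few lines of code. The sketch below implements Walker's test (field significance from the minimum local p value) and the Benjamini-Hochberg step-up rule that controls the FDR; the synthetic p values are illustrative:

```python
import numpy as np

def walker_test(pvals, alpha_global=0.05):
    """Reject the global null if min p < 1 - (1 - alpha)**(1/K)."""
    p = np.asarray(pvals)
    return p.min() < 1.0 - (1.0 - alpha_global) ** (1.0 / p.size)

def benjamini_hochberg(pvals, q=0.05):
    """Boolean mask of local rejections controlling the FDR at level q."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    below = p[order] <= q * np.arange(1, p.size + 1) / p.size
    k = below.nonzero()[0].max() + 1 if below.any() else 0
    reject = np.zeros(p.size, dtype=bool)
    reject[order[:k]] = True
    return reject

rng = np.random.default_rng(0)
p = np.concatenate([rng.uniform(size=95), rng.uniform(0, 1e-4, size=5)])
print(walker_test(p), benjamini_hochberg(p).sum())
```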
32

Ozer, H., H. I. Oktay Basegmez, T. B. Whitaker, A. B. Slate, and F. G. Giesbrecht. "Sampling dried figs for aflatoxin - Part II: effect of sampling plan design on reducing the risk of misclassifying lots." World Mycotoxin Journal 10, no. 2 (2017): 99–109. http://dx.doi.org/10.3920/wmj2016.2127.

Full text
Abstract:
Because aflatoxin limits vary widely among regulating countries, the Codex Committee on Contaminants in Foods (CCCF) began work in 2006 to harmonise maximum levels (MLs) and sampling plans for aflatoxin in dried figs. Studies were developed to measure the variability and distribution among replicated sample aflatoxin test results taken from the same contaminated lot of dried figs, so that a model could be developed to evaluate the risk of misclassifying lots of dried figs under a given aflatoxin sampling plan design. The model was then used by the CCCF electronic working group (eWG) to recommend MLs and aflatoxin sampling plan designs to the full CCCF membership for lots traded in the export market. Sixteen 10 kg samples were taken from each of 20 dried fig lots with varying levels of contamination. The observed aflatoxin distribution among the 16 sample test results was compared to the normal, lognormal, compound gamma, and negative binomial distributions. The negative binomial distribution was selected to model the aflatoxin distribution among sample test results because it gave acceptable fits to the observed distributions among sample test results taken from the same contaminated lot. Using the negative binomial distribution, a computer model was developed to show the effect of the number and size of samples and the accept/reject limits on the chances of rejecting good lots (seller's risk) and accepting bad lots (buyer's risk). The information was shared with the CCCF eWG, and in March 2012 the 6th session of the CCCF adopted, at step 5/8, an aflatoxin sampling plan in which three 10 kg samples must all test less than an ML of 10 µg/kg total aflatoxins for a dried fig lot to be accepted. The 35th Session of the Codex Alimentarius Commission met in July 2012 and adopted the CCCF recommendations for the ML and the sampling plan as an official Codex standard.
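The accept/reject logic of the adopted plan is simple to evaluate once a distribution for sample test results is assumed. The sketch below is illustrative only: it models a 10 kg sample test result (µg/kg, on an integer scale) as negative binomial with the lot mean, uses an arbitrary dispersion value in place of the one the authors fitted to the fig data, and computes the probability of accepting a lot under the "three samples, all below 10 µg/kg" rule:

```python
from scipy import stats

def accept_probability(lot_mean, ml=10, n_samples=3, dispersion=2.0):
    """P(accept) = P(single sample result < ml) ** n_samples, with sample
    results negative binomial: variance = mean + mean**2 / dispersion."""
    k = dispersion
    p = k / (k + lot_mean)                 # scipy's (n, p) parameterisation
    p_single = stats.nbinom.cdf(ml - 1, k, p)
    return p_single ** n_samples

for lot_mean in (1, 5, 10, 20):            # seller's vs buyer's risk trade-off
    print(lot_mean, round(float(accept_probability(lot_mean)), 3))
```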
APA, Harvard, Vancouver, ISO, and other styles
33

Dolgov, Sergey, Karim Anaya-Izquierdo, Colin Fox, and Robert Scheichl. "Approximation and sampling of multivariate probability distributions in the tensor train decomposition." Statistics and Computing 30, no. 3 (2019): 603–25. http://dx.doi.org/10.1007/s11222-019-09910-z.

Full text
Abstract:
General multivariate distributions are notoriously expensive to sample from, particularly the high-dimensional posterior distributions in PDE-constrained inverse problems. This paper develops a sampler for arbitrary continuous multivariate distributions that is based on low-rank surrogates in the tensor train format, a methodology that has been exploited for many years for scalable, high-dimensional density function approximation in quantum physics and chemistry. We build upon recent developments of the cross approximation algorithms in linear algebra to construct a tensor train approximation to the target probability density function using a small number of function evaluations. For sufficiently smooth distributions, the storage required for accurate tensor train approximations is moderate, scaling linearly with dimension. In turn, the structure of the tensor train surrogate allows sampling by an efficient conditional distribution method since marginal distributions are computable with linear complexity in dimension. Expected values of non-smooth quantities of interest, with respect to the surrogate distribution, can be estimated using transformed independent uniformly random seeds that provide Monte Carlo quadrature, or transformed points from a quasi-Monte Carlo lattice to give more efficient quasi-Monte Carlo quadrature. Unbiased estimates may be calculated by correcting the transformed random seeds using a Metropolis–Hastings accept/reject step, while the quasi-Monte Carlo quadrature may be corrected either by a control-variate strategy or by importance weighting. We show that the error in the tensor train approximation propagates linearly into the Metropolis–Hastings rejection rate and the integrated autocorrelation time of the resulting Markov chain; thus, the integrated autocorrelation time may be made arbitrarily close to 1, implying that, asymptotically in sample size, the cost per effectively independent sample is one target density evaluation plus the cheap tensor train surrogate proposal that has linear cost with dimension. These methods are demonstrated in three computed examples: fitting failure time of shock absorbers; a PDE-constrained inverse diffusion problem; and sampling from the Rosenbrock distribution. The delayed rejection adaptive Metropolis (DRAM) algorithm is used as a benchmark. In all computed examples, the importance weight-corrected quasi-Monte Carlo quadrature performs best and is more efficient than DRAM by orders of magnitude across a wide range of approximation accuracies and sample sizes. Indeed, all the methods developed here significantly outperform DRAM in all computed examples.
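The tensor train construction itself is beyond a short example, but the accept/reject correction the abstract mentions is a generic independence Metropolis–Hastings step with the surrogate as proposal. A minimal one-dimensional sketch follows, with a Gaussian standing in for the tensor train surrogate; the target density and all parameters are arbitrary illustrations.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
target = lambda x: 0.7 * norm.pdf(x, -1, 0.6) + 0.3 * norm.pdf(x, 2, 0.8)
surrogate = norm(0.0, 2.0)   # stand-in for the cheap surrogate density

x, accepts, chain = 0.0, 0, []
for _ in range(20000):
    y = surrogate.rvs(random_state=rng)
    # independence MH ratio: pi(y) q(x) / (pi(x) q(y))
    a = (target(y) * surrogate.pdf(x)) / (target(x) * surrogate.pdf(y))
    if rng.uniform() < a:
        x, accepts = y, accepts + 1
    chain.append(x)
print("acceptance rate:", accepts / len(chain))
```

The closer the surrogate tracks the target, the closer the acceptance rate is to one, mirroring the linear error-propagation result quoted in the abstract.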
APA, Harvard, Vancouver, ISO, and other styles
34

Chepelianskii, Alexei D., Satya N. Majumdar, Hendrik Schawe, and Emmanuel Trizac. "Metropolis Monte Carlo sampling: convergence, localization transition and optimality." Journal of Statistical Mechanics: Theory and Experiment 2023, no. 12 (2023): 123205. http://dx.doi.org/10.1088/1742-5468/ad002d.

Full text
Abstract:
Among random sampling methods, Markov chain Monte Carlo (MC) algorithms are foremost. Using a combination of analytical and numerical approaches, we study their convergence properties toward the steady state, within a random walk Metropolis scheme. Analyzing the relaxation properties of some model algorithms sufficiently simple to enable analytic progress, we show that the deviations from the target steady-state distribution can feature a localization transition as a function of the characteristic length of the attempted jumps defining the random walk. While the iteration of the MC algorithm converges to equilibrium for all choices of jump parameters, the localization transition changes drastically the asymptotic shape of the difference between the probability distribution reached after a finite number of steps of the algorithm and the target equilibrium distribution. We argue that the relaxation before and after the localization transition is respectively limited by diffusion and rejection rates.
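A minimal random walk Metropolis sketch showing the role of the characteristic jump length studied in the paper; the Gaussian target and the specific jump lengths are illustrative choices, not the paper's model algorithms.

```python
import numpy as np

def rw_metropolis(log_pi, x0, delta, n_steps, rng):
    """Random walk Metropolis with uniform attempted jumps on [-delta, delta]."""
    x, rejections = x0, 0
    out = np.empty(n_steps)
    for i in range(n_steps):
        y = x + rng.uniform(-delta, delta)
        if np.log(rng.uniform()) < log_pi(y) - log_pi(x):
            x = y
        else:
            rejections += 1
        out[i] = x
    return out, rejections / n_steps

rng = np.random.default_rng(2)
log_pi = lambda x: -0.5 * x**2          # standard Gaussian target
for delta in (0.1, 1.0, 10.0):
    _, rej = rw_metropolis(log_pi, 0.0, delta, 50000, rng)
    print(f"jump length {delta:>5}: rejection rate {rej:.2f}")
```

Small jumps relax diffusively (almost no rejections but slow exploration); large jumps are mostly rejected, the two regimes the abstract identifies as limiting relaxation on either side of the transition.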
APA, Harvard, Vancouver, ISO, and other styles
35

Zhang, Zhenwei, Yutong Luo, Guisheng Yang, Shaozheng Zhang, and Zhengwei Wang. "Numerical Investigation of Symmetrical and Asymmetrical Characteristics of a Preloading Spiral Case and Concrete during Load Rejection." Symmetry 16, no. 10 (2024): 1277. http://dx.doi.org/10.3390/sym16101277.

Full text
Abstract:
During the transient process of load rejection, the hydraulic pressure applied to the pump-turbine and plant concrete changes dramatically and induces high dynamic stress on the spiral case. The preloading spiral case has been widely used in large-scale pumped-storage power stations due to its excellent load-bearing capacity. However, studies on the impact of preloading pressure on the structural response during load rejection are still few in number. In this paper, 3D flow domain and structural models of a prototype pump-turbine are designed to analyze the hydraulic characteristics and flow-induced dynamic behavior of the preloading steel spiral case under different preloading pressures during load rejection. The results show that the asymmetric design of the logarithmic spiral lines ensures an axially symmetric potential flow within the spiral case domain with uniform pressure distribution. Higher preloading pressure provides larger preloading clearance, leading to greater flow-induced deformation and stress, with their maximum values located at the mandoor and the inner edge, respectively. The combined effect of the asymmetrical shape, internal hydraulic pressure and unbalanced hydraulic force leads to an asymmetrical preloading clearance distribution, resulting in a structural response that is distributed asymmetrically along the axial direction but symmetrically near the waistline. Stress variations at and between sections share similar characteristics during load rejection: the stress follows the same trend as the hydraulic pressure under lower preloading pressures, whereas there is a delayed stress peak, due to the delayed contact phenomenon, when the preloading pressure reaches the maximum static head. These conclusions provide scientific guidance for optimizing the preloading pressure selection and the structural design for stable operation of the units.
APA, Harvard, Vancouver, ISO, and other styles
36

Hudson, Richard R. "THE SAMPLING DISTRIBUTION OF LINKAGE DISEQUILIBRIUM UNDER AN INFINITE ALLELE MODEL WITHOUT SELECTION." Genetics 109, no. 3 (1985): 611–31. http://dx.doi.org/10.1093/genetics/109.3.611.

Full text
Abstract:
The sampling distributions of several statistics that measure the association of alleles on gametes (linkage disequilibrium) are estimated under a two-locus neutral infinite allele model using an efficient Monte Carlo method. An often used approximation for the mean squared linkage disequilibrium is shown to be inaccurate unless the proper statistical conditioning is used. The joint distribution of linkage disequilibrium and the allele frequencies in the sample is studied. This estimated joint distribution is sufficient for obtaining an approximate maximum likelihood estimate of C = 4Nc, where N is the population size and c is the recombination rate. It has been suggested that observations of high linkage disequilibrium might be a good basis for rejecting a neutral model in favor of a model in which natural selection maintains genetic variation. It is found that a single sample of chromosomes, examined at two loci, cannot provide sufficient information for such a test if C < 10, because with C this small, very high levels of linkage disequilibrium are not unexpected under the neutral model. In samples of size 50, it is found that, even when C is as large as 50, the distribution of linkage disequilibrium conditional on the allele frequencies is substantially different from the distribution when there is no linkage between the loci. When conditioned on the number of alleles at each locus in the sample, all of the sample statistics examined are nearly independent of λ = 4Nμ, where μ is the neutral mutation rate.
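For orientation, the basic two-locus statistic: with haplotype frequency p_AB and allele frequencies p_A, p_B, linkage disequilibrium is D = p_AB − p_A p_B, often reported as r² = D² / (p_A(1−p_A) p_B(1−p_B)). A minimal two-allele sketch from sample haplotype counts (illustrative numbers; the paper's model is infinite-allele, so this is a simplification):

```python
def ld_stats(n_AB, n_Ab, n_aB, n_ab):
    """Linkage disequilibrium D and r^2 from two-locus haplotype counts,
    assuming two alleles per locus for simplicity."""
    n = n_AB + n_Ab + n_aB + n_ab
    p_AB = n_AB / n
    p_A = (n_AB + n_Ab) / n          # frequency of allele A at locus 1
    p_B = (n_AB + n_aB) / n          # frequency of allele B at locus 2
    D = p_AB - p_A * p_B
    r2 = D**2 / (p_A * (1 - p_A) * p_B * (1 - p_B))
    return D, r2

print(ld_stats(30, 10, 10, 50))      # illustrative counts from 100 gametes
```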
APA, Harvard, Vancouver, ISO, and other styles
37

Lilhore, Umesh Kumar, Sarita Simaiya, Kalpna Guleria, and Devendra Prasad. "An Efficient Load Balancing Method by Using Machine Learning-Based VM Distribution and Dynamic Resource Mapping." Journal of Computational and Theoretical Nanoscience 17, no. 6 (2020): 2545–51. http://dx.doi.org/10.1166/jctn.2020.8928.

Full text
Abstract:
In cloud computing, balancing the load among VMs and resources is a major research area that still needs attention. The primary aim of this research is to develop an effective cloud load-balancing approach that improves the response time, reduces the waiting time, makes optimal use of resources, and reduces the job rejection time. The proposed MLBL method is based on SVM classification and K-means clustering. In the MLBL technique, the SVM classification method is used to create task groups based on task size; the K-means clustering technique is then used to create groups of virtual machines based on their CPU and primary memory (RAM) usage. The MLBL method thus combines machine-learning-based VM distribution with dynamic resource mapping. In conventional cloud scheduling, VMs are assigned to hosts according to a single utilization figure (e.g., a host with higher memory availability) without considering overall utilization. The proposed technique instead divides the resources, as well as the VMs, into groups and then applies dynamic resource mapping. The MLBL technique creates VM clusters to execute similar task groups, which improves QoS, promotes optimal utilization of resources, and reduces the job rejection time.
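A sketch of the clustering half of the described pipeline, grouping VMs by CPU and RAM usage with K-means. The utilisation data is synthetic, the cluster count and features are illustrative, and the SVM task-classification step is omitted.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
# synthetic (cpu %, ram %) utilisation for 40 VMs
vm_usage = np.column_stack([rng.uniform(0, 100, 40),
                            rng.uniform(0, 100, 40)])

# group VMs by resource usage, as the MLBL description suggests
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(vm_usage)
for c in range(3):
    members = np.flatnonzero(km.labels_ == c)
    print(f"VM cluster {c}: {len(members)} VMs, "
          f"mean cpu {vm_usage[members, 0].mean():.0f}%, "
          f"mean ram {vm_usage[members, 1].mean():.0f}%")
```

Task groups produced by the classification step would then be mapped to the VM cluster whose resource profile best matches the group's size.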
APA, Harvard, Vancouver, ISO, and other styles
38

Moat, Stuart J., James R. Bonham, Christine Cavanagh, et al. "Consistency in the Assessment of Dried Blood Spot Specimen Size and Quality in U.K. Newborn Screening Laboratories." International Journal of Neonatal Screening 10, no. 3 (2024): 60. http://dx.doi.org/10.3390/ijns10030060.

Full text
Abstract:
In 2015, U.K. newborn screening (NBS) laboratory guidelines were introduced to standardize dried blood spot (DBS) specimen quality acceptance and specify a minimum acceptable DBS diameter of ≥7 mm. The UK ‘acceptable’ avoidable repeat rate (AVRR) is ≤2%. To assess inter-laboratory variability in specimen acceptance/rejection, two sets of colored scanned images (n = 40/set) of both good and poor-quality DBS specimens were distributed to all 16 U.K. NBS laboratories for evaluation as part of an external quality assurance (EQA) assessment. The mean (range) number of specimens rejected in the first EQA distribution was 7 (1–16) and in the second EQA distribution was 7 (0–16), demonstrating that adherence to the 2015 guidelines was highly variable. A new minimum standard for DBS size of ≥8 mm (to enable a minimum of six sub-punches from two DBS) was discussed. NBS laboratories undertook a prospective audit and demonstrated that using ≥8 mm as the minimum acceptable DBS diameter would increase the AVRR from 2.1% (range 0.55% to 5.5%) to 7.8% (range 0.55% to 22.7%). A significant inverse association between the number of specimens rejected in the DBS EQA distributions and the predicted AVRR (using the ≥8 mm minimum standard) was observed (r = −0.734, p = 0.003). Before implementing more stringent standards, the impact of a standard operating procedure (SOP) designed to enable a standardized approach to visual assessment, using the existing ≥7 mm diameter (to enable a minimum of four sub-punches from two DBS) as the minimum standard, was assessed in a retrospective audit. Implementation of the SOP using the ≥7 mm DBS diameter would increase the AVRR from 2.3% (range 0.63% to 5.3%) to 6.5% (range 4.3% to 20.9%). The results demonstrate that there is inconsistency in applying the acceptance/rejection criteria, and that a low AVRR is not an indication of good-quality specimens being received into laboratories. Further work is underway to introduce and maintain standards without increasing the AVRR to unacceptable levels.
APA, Harvard, Vancouver, ISO, and other styles
39

Wibowo, Gatot Murti, Dwi Rochmayanti, and M. Irwan Katili. "Analysis of Digital Image Rejection (RFA) inDiagnosticRadiology Services after the Application of Computed Radiography (CR) to the Hospitals in Area Semarang City." Jurnal Riset Kesehatan 2, no. 1 (2015): 219–30. https://doi.org/10.31983/jrk.v2i1.151.

Full text
Abstract:
The purpose of this study was to describe the rejection analysis procedure for digital CR softcopy images, hereinafter referred to as RFA, and to profile and characterize the proportion of rejected images (reject rates) under the operational conditions of CR. A quantitative, observational study was conducted. Data were collected by random sampling from 1,181 digital images of the type 1 CR system at hospital A (CR-1137) and from the total number of type 2 CR digital images at hospital B, at the post-image-processing stage. The reject rate was 9.15% for the type 1 CR system and 4.57% for the type 2 CR system. At both facilities, the largest share of rejected CR images was of chest radiographs: 69.44% (A) and 48.08% (B). The finding of a 15.87% reject rate attributable to poor x-ray equipment performance, together with the identified distribution of reject rates across individual radiographers at hospital A, deserves special attention, although the individual radiographers' reject rates at hospital A were not correlated with workload (p-value of 0.67).
APA, Harvard, Vancouver, ISO, and other styles
40

Amirunnas, Amirunnas, Nikitha Chairunnisa, Firdus Firdus, Alia Rizki, and Muhammad Nasir. "Efek Polutan Logam Berat Pada Sungai di Indonesia Terhadap Biota Aquatik." Jurnal Akuakultur Sungai dan Danau 10, no. 1 (2025): 124. https://doi.org/10.33087/akuakultur.v10i1.255.

Full text
Abstract:
Heavy metals are pollutants that cause harmful poisoning in a wide range of aquatic biota. This article compiles the results of various studies into a review of their sources, distribution, rejection and impact on aquatic biota. Over time, the detected levels of organic and inorganic arsenic, Pb, Cd, and Hg in shellfish have increased significantly in most of the rivers covered in this article. From these results, the affected areas and their river biota, the small number of related health surveys conducted, and possible treatments are presented, and the utility of remediation and of preventive measures through regulatory change is discussed.
APA, Harvard, Vancouver, ISO, and other styles
41

Aryani, Diah Chandra, Yusra Egayanti, and Apriyanto Dwi Nugroho. "Strategy to minimize the risk of rejection due to mycotoxin contamination: Case for Indonesian nutmeg." BIO Web of Conferences 169 (2025): 02004. https://doi.org/10.1051/bioconf/202516902004.

Full text
Abstract:
Nutmeg seed is one of the Indonesian export products to the European Union (EU). However, due to a series of non-compliances since 2016, in 2022 the EU imposed a 30% regulatory check on nutmeg consignments originating from Indonesia. This stricter control resulted in a higher number of non-compliant batches/lots in the period 2022-2023; it is therefore important to formulate a strategy to decrease the non-compliance. A desk study and interviews were conducted to obtain information on border rejections, health certificate applications, and the underlying factors causing border rejection. Analyses of sampling plan performance and the probability distribution were used to recommend a suitable, fit-for-purpose strategy. To decrease the number of non-compliances, and thus border rejections, the competent authority needs to reduce the prevalence of high contamination levels by implementing sufficient measures. Of two possible measures, controlling the raw material can reduce non-compliance by 80%. To control the raw material, exporters are required to source good-quality raw material and to implement Hazard Analysis Critical Control Points (HACCP). These selected measures shall be made mandatory and should be implemented gradually within 5 years, considering the readiness of exporters and the whole nutmeg production support system.
APA, Harvard, Vancouver, ISO, and other styles
42

Kirillov, Alexey K. "The working style of the business tax distribution office and the tax inspector's personality in Tomsk Governorate in the early 20th century: The factor of an official in the epoch of income taxes formation." Vestnik Tomskogo gosudarstvennogo universiteta, no. 482 (2023): 125–34. http://dx.doi.org/10.17223/15617793/482/13.

Full text
Abstract:
Business tax distribution offices in Russia in the early 20th century were part of the system of relations between the state and the taxpayer characteristic of the era in which income taxation was taking shape in the Western world. Working in the interests of filling the state budget, the offices consisted mainly of representatives of the merchants paying the tax. One of the unexplored issues in the work of the Russian tax offices is the dependence of their practices on the personality of the tax inspector chairing the meetings. Rare material for the study of this issue is provided by the archival fund of the 2nd Tomsk Business Tax Distribution Office. The author's sources include complete annual collections of minutes of the office's meetings held under two different tax inspectors. The progress and results of the consideration of merchants' complaints, as recorded in the minutes, reveal significant differences in the working styles of the two inspectors. Under inspector D. Fedorchenko, the office would delve into the state of affairs of each petitioner as deeply as possible and explain the reasons for its decisions in detail. The number of rejections exceeded the number of satisfied applications, and the inspector vigilantly guarded the interests of the treasury. Still, decisions in favor of the payers were made not only on the basis of legally indisputable documents, but also on the basis of taxpayers' “unfounded” (according to the instructions) explanations. The office sought to determine the taxable base for each payer as accurately as possible, so as to avoid understating or overstating the amount. Under inspector K. Gruzinov, in most cases, refusals were issued on formal grounds, without an attempt to consider the case on its merits. Even when merchants presented legally weighty arguments (bookkeeping records), these were in most cases rejected on formal grounds. From time to time, however, K. Gruzinov made concessions to taxpayers that he could well have avoided without breaking the law. The facts presented in the article support the conclusion that the working styles of the two inspectors differed significantly in their readiness to accept other people's arguments and in the transparency of their decisions. Fedorchenko's methods served the interests of his office, while Gruzinov's style better served the personal interest of the inspector, since it did not require selfless overtime work. The business tax system was tolerant of an inspector's taking the path of least resistance, which favored the predominance of Gruzinov's methods throughout the country. Under such conditions, the only protection against arbitrariness lay not in relying on the conscientiousness of officials, but in limiting the rights of tax offices in favor of taxpayers' rights. A limitation of this kind was realized when the income tax law of 1916 was created.
APA, Harvard, Vancouver, ISO, and other styles
43

MALHAM, SIMON J. A., and ANKE WIESE. "CHI-SQUARE SIMULATION OF THE CIR PROCESS AND THE HESTON MODEL." International Journal of Theoretical and Applied Finance 16, no. 03 (2013): 1350014. http://dx.doi.org/10.1142/s0219024913500143.

Full text
Abstract:
The transition probability of a Cox–Ingersoll–Ross process can be represented by a non-central chi-square density. First, we establish a new representation for the central chi-square density based on sums of powers of generalized Gaussian random variables. Second, we show that Marsaglia's polar method extends to this distribution, providing a simple, exact, robust and efficient acceptance–rejection method for generalized Gaussian sampling and thus central chi-square sampling. Third, we derive a simple, high-accuracy, robust and efficient direct inversion method for generalized Gaussian sampling based on the Beasley–Springer–Moro method. Indeed the accuracy of the approximation to the inverse cumulative distribution function is to the tenth decimal place. We then apply our methods to non-central chi-square variance sampling in the Heston model. We focus on the case when the number of degrees of freedom is small and the zero boundary is attracting and attainable, typical in foreign exchange markets. Using the additivity property of the chi-square distribution, our methods apply in all parameter regimes.
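The paper's polar-method and inversion algorithms are not reproduced here; the sketch below only checks the standard mixture representation that such samplers build on: a non-central χ²(k, λ) variate is a central χ² with k + 2N degrees of freedom, where N ~ Poisson(λ/2). Parameters are illustrative, with small k as in the attracting-boundary regime the paper focuses on.

```python
import numpy as np
from scipy.stats import ncx2

rng = np.random.default_rng(4)
k, lam, n = 0.3, 2.0, 200_000   # small dof, as when the zero boundary attracts

N = rng.poisson(lam / 2.0, size=n)
samples = rng.chisquare(k + 2 * N)       # central chi-square with random dof
print("mixture mean:", samples.mean(), " exact:", k + lam)
print("mixture var :", samples.var(),  " exact:", 2 * k + 4 * lam)
print("scipy ncx2  :", ncx2.mean(k, lam), ncx2.var(k, lam))
```

The central χ² draws are where the paper's generalized Gaussian representation and its acceptance–rejection and inversion samplers come in.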
APA, Harvard, Vancouver, ISO, and other styles
44

Toirov, Otabek, and Nodirjon Tursunov. "Efficiency of using heat-insulating mixtures to reduce defects of critical parts." E3S Web of Conferences 401 (2023): 05018. http://dx.doi.org/10.1051/e3sconf/202340105018.

Full text
Abstract:
A study was conducted to reduce defects in especially critical cast large-sized parts of freight car bogies. The side frame was used as the part under study. The main factors influencing the formation of hot cracks in large steel castings are presented. To avoid the rejection of steel castings, the use of heat-insulating mixtures to reduce hot cracks has been investigated. The temperature distribution of the melt along the height of the ladle is modeled. An analysis of the influence of the order of pouring molds and pouring temperature on the number of hot cracks in the side frames is presented.
APA, Harvard, Vancouver, ISO, and other styles
45

Lemeshko, Boris, and Stanislav Lemeshko. "Problems of nonparametric goodness-of-fit test application in tasks of measurement results processing." Analysis and data processing systems, no. 2 (June 18, 2021): 47–66. http://dx.doi.org/10.17212/2782-2001-2021-2-47-66.

Full text
Abstract:
It is argued that in most cases two reasons underlie the incorrect application of nonparametric goodness-of-fit tests in various applications. The first reason is that, when testing composite hypotheses and estimating the parameters of the law from the analyzed sample, classical results associated with testing simple hypotheses are used. When testing composite hypotheses, the distributions of goodness-of-fit statistics are influenced by the form of the observed law F(x, θ) corresponding to the hypothesis being tested, by the type and number of estimated parameters, by the estimation method, and in some cases by the value of the shape parameter. The paper shows the influence of all of these factors on the distributions of the test statistics. It is emphasized that, when testing composite hypotheses, neglecting the fact that the test has lost the property of being distribution-free leads to an increase in the probability of errors of the 2nd kind. It is shown that the distribution of the test statistic needed to draw a conclusion about the result of testing a composite hypothesis can be found by simulation, interactively, directly in the process of testing. The second reason is associated with the presence of round-off errors, which can significantly change the distributions of the test statistics. The paper shows that asymptotic results for testing simple and composite hypotheses can be used when the round-off error Δ is much smaller than the standard deviation σ of the distribution law of the measurement errors and the sample size n does not exceed some maximum value. For sample sizes larger than this maximum, the real distributions of the test statistics deviate from the asymptotic ones towards larger values of the statistics. In such situations, using the asymptotic distributions to draw conclusions about the test results leads to an increase in the probability of errors of the 1st kind (to the rejection of a valid hypothesis). It is shown that when the round-off error is commensurable with σ, the distributions of the test statistics deviate from the asymptotic distributions even for small n, and as n grows the situation only gets worse. The paper demonstrates the changes in the distributions of the statistics under the influence of rounding both for simple and for composite hypotheses. It is shown that the only way to ensure the correctness of conclusions from the applied tests under such non-standard conditions is to use the real distributions of the statistics. This task can be solved interactively (in the process of testing), relying on computer research technologies and the apparatus of mathematical statistics.
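A minimal sketch of the simulation-based approach the authors advocate for composite hypotheses: estimate the parameters, then obtain the statistic's null distribution by parametric bootstrap rather than from classical tables (the normal law and all settings below are illustrative).

```python
import numpy as np
from scipy import stats

def composite_ks_pvalue(sample, n_sim=2000, rng=None):
    """KS test of normality with parameters estimated from the sample;
    the statistic's null distribution is simulated, not taken from tables."""
    rng = rng or np.random.default_rng(5)
    mu, sigma = np.mean(sample), np.std(sample, ddof=1)
    d_obs = stats.kstest(sample, "norm", args=(mu, sigma)).statistic
    d_sim = np.empty(n_sim)
    for i in range(n_sim):
        x = rng.normal(mu, sigma, size=len(sample))
        m, s = np.mean(x), np.std(x, ddof=1)   # re-estimate on each replicate
        d_sim[i] = stats.kstest(x, "norm", args=(m, s)).statistic
    return (d_sim >= d_obs).mean()

sample = np.random.default_rng(6).normal(10.0, 2.0, size=100)
print("simulated p-value:", composite_ks_pvalue(sample))
```

Using the classical (simple-hypothesis) KS tables on the same data would systematically overstate the p-value, exactly the loss of the distribution-free property the abstract describes.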
APA, Harvard, Vancouver, ISO, and other styles
46

Krivolapov, S. Ya. "REGRESSION MODEL FOR FITTING THE DISTRIBUTION LAW LOGARITHMIC STOCK RETURNS." SOFT MEASUREMENTS AND COMPUTING 4, no. 53 (2022): 16–26. http://dx.doi.org/10.36871/2618-9976.2022.04.002.

Full text
Abstract:
Data on the daily stock quotations of 216 Russian companies for the period from 2004 to 2021 are considered. The subject of the study is the distribution law of logarithmic stock returns. Out of the total number of companies, a subset is set aside for subsequent testing of the model. For each of the remaining companies, using Python tooling, a law is selected (out of 40 available candidates) that approximates the sample distribution "best" in the sense of the Kullback–Leibler distance. One of the laws appearing most frequently as the "best" is the generalized normal law (gennorm). This law has heavier tails than the normal one and is specified by three parameters (shape, position and scale). For 178 samples (remaining after the removal of outliers), a regression model is constructed for the dependence of the parameters of the generalized normal distribution law on the first four initial moments estimated from the sample. Graphical tools show a good agreement between the empirical and hypothetical return distributions. A goodness-of-fit test rejected the hypothesis of agreement between the laws at the 5% level for two companies out of nine; for the other seven companies, the test showed no grounds for rejecting the hypothesis of agreement at the 20% level.
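SciPy exposes the generalized normal law directly as scipy.stats.gennorm; here is a sketch of the fitting step on synthetic heavy-tailed "returns" (real quote data replaced by Student-t noise). Note that, as the previous entry warns, a goodness-of-fit p-value computed with parameters estimated from the same sample is optimistic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# stand-in for log returns: Student-t noise, heavier-tailed than normal
returns = stats.t.rvs(df=4, scale=0.02, size=2000, random_state=rng)

beta, loc, scale = stats.gennorm.fit(returns)   # shape, position, scale
print(f"fitted gennorm: shape={beta:.2f}, loc={loc:.4f}, scale={scale:.4f}")
# shape < 2 indicates heavier tails than the normal law (shape == 2)

ks = stats.kstest(returns, "gennorm", args=(beta, loc, scale))
print(f"KS statistic {ks.statistic:.3f}, nominal p-value {ks.pvalue:.2f}")
```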
APA, Harvard, Vancouver, ISO, and other styles
47

Buvry, Annick, Monique Garbarg, Violetta Dimitriadou, et al. "Phenotypic and Quantitative Changes in Mast Cells after Syngeneic Unilateral Lung Transplantation in the Rat." Clinical Science 91, no. 3 (1996): 319–27. http://dx.doi.org/10.1042/cs0910319.

Full text
Abstract:
1. Lung transplantation causes a total interruption of the innervation and vascularization within the transplanted organ, followed by repair processes. This is frequently associated with bronchial hyper-responsiveness. A common feature of tissue repair is an increase in the number of mast cells. Three phenotypically distinct mast cell subsets, with respect to their protease content, have been identified in rat lung, and it is probable that mast cells of differing protease phenotype fulfil different functions. 2. We have compared the number, protease phenotype and distribution of mast cells in left lung from transplanted and control Lewis rats 1 month after syngeneic unilateral left lung transplantation, without interference from inflammation, graft rejection or any treatment. Connective and mucosal-type mast cell phenotypes were characterized using antibodies directed against their specific rat mast cell proteases, RMCPI and RMCPII, respectively. 3. After transplantation, RMCPI and RMCPII tissue concentrations increased by 172% and 239%, respectively, compared with controls (13.1 ± 1.2 and 5.6 ± 1.0 μg/g). 4. Localization of mast cell phenotypes was studied by immunohistochemistry after double immunostaining. The number of mast cells increased after transplantation: the increase in the number of RMCPI-immunoreactive mast cells (RMCPI+) was significant around bronchioles and arterioles, around large vessels and in the pleura. The number of RMCPII+ mast cells also significantly increased around bronchioles and arterioles, as well as in the smooth muscle layer of large airways. Some mast cells stained for the presence of both RMCPI and RMCPII, supporting the existence of a co-expressing phenotype in rat lung. The number of mast cells of the RMCPI+/H+ phenotype significantly increased around bronchioles and arterioles and in the pleura. Moreover, the distribution of the mast cell phenotypes was modified in the different areas after transplantation. 5. This indicates a local differentiation/maturation of mast cells after transplantation.
APA, Harvard, Vancouver, ISO, and other styles
48

Iguchi, Ryo, Vitaliy I. Vasyuchka, Burkard Hillebrands, and Ken-ichi Uchida. "Lock-in thermographic study of spin-wave propagation in magnonic crystals." Journal of Applied Physics 132, no. 23 (2022): 233901. http://dx.doi.org/10.1063/5.0128870.

Full text
Abstract:
We have investigated the spin-wave dynamics in a one-dimensional magnonic crystal (MC) with respect to the heat radiation due to the damping of the spin waves. The spin waves were excited by applying microwaves via a wire antenna. The heat induced by the excitation and propagation of spin waves was measured using the lock-in thermography (LIT) technique by periodically modulating the microwave power. The LIT measurements resolved the heat source distributions inside the MC, which is made of an yttrium iron garnet film and periodic grooves with a sub-mm interval, in the backward volume wave geometry. The temperature distribution induced by the spin-wave excitation notably depends on the frequency or wave number of the spin waves, as a result of the formation of rejection bands in the MC. The observed temperature modulation profiles are complicated, but their behavior is consistent with a calculation based on the microwave transmission line approximation of the MCs, demonstrating the applicability of the LIT measurements to the investigation of the spin-wave dynamics in sub-mm scale MCs.
APA, Harvard, Vancouver, ISO, and other styles
49

Lundholt, Betina Kerstin, Kurt M. Scudder, and Len Pagliaro. "A Simple Technique for Reducing Edge Effect in Cell-Based Assays." Journal of Biomolecular Screening 8, no. 5 (2003): 566–70. http://dx.doi.org/10.1177/1087057103256465.

Full text
Abstract:
Several factors are known to increase the noise and variability of cell-based assays used for high-throughput screening. In particular, edge effects can result in an unacceptably high plate rejection rate in screening runs. In an effort to minimize these variations, the authors analyzed a number of factors that could contribute to edge effects in cell-based assays. They found that pre-incubation of newly seeded plates in ambient conditions (air at room temperature) resulted in even distribution of the cells in each well. In contrast, when newly seeded plates were placed directly in the CO2 incubator, an uneven distribution of cells occurred in wells around the plate periphery, resulting in increased edge effect. Here, the authors show that the simple, inexpensive approach of incubating newly seeded plates at room temperature before placing them in a 37°C CO2 incubator yields a significant reduction in edge effect. (Journal of Biomolecular Screening 2003:566-570)
APA, Harvard, Vancouver, ISO, and other styles
50

Chiang, Jyun-You, Y. L. Lio, and Tzong-Ru Tsai. "Empirical Bayesian Strategy for Sampling Plans with Warranty Under Truncated Censoring." International Journal of Reliability, Quality and Safety Engineering 23, no. 05 (2016): 1650021. http://dx.doi.org/10.1142/s0218539316500212.

Full text
Abstract:
To reach an optimal acceptance sampling decision for products whose lifetimes follow the Burr type XII distribution, sampling plans are developed with a rebate warranty policy based on truncated censored data. The smallest sample size and acceptance number are determined to minimize the expected total cost, which consists of the test cost, the experimental time cost, the cost of lot acceptance or rejection, and the warranty cost. A new method, which combines a simple empirical Bayesian method and the genetic algorithm (GA), named the EB-GA method, is proposed to estimate the unknown distribution parameter and hyper-parameters. The parameters of the GA are determined using an optimal Taguchi design procedure to reduce the subjectivity of parameter determination. An algorithm is presented to implement the EB-GA method, and its application is illustrated by an example. Monte Carlo simulation results show that the EB-GA method works well for parameter estimation in terms of small bias and mean square error.
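A sketch of the operating characteristic underlying such a plan, under the simplifying assumption of a single-stage truncated life test: each unit fails before the censoring time t0 with probability F(t0) from the Burr XII law, and the lot is accepted if at most c of n tested units fail. All parameters are illustrative; the paper's EB-GA estimation and cost model are not reproduced.

```python
from scipy.stats import burr12, binom

def p_accept(n, c, t0, shape_c, shape_d, scale=1.0):
    """Lot acceptance probability for a truncated life test:
    accept if at most c of n units fail before time t0.
    Lifetimes follow Burr XII with shape parameters shape_c, shape_d
    (scipy's burr12 parameterisation)."""
    p_fail = burr12.cdf(t0, shape_c, shape_d, scale=scale)
    return binom.cdf(c, n, p_fail)

# illustrative plan: n = 20 units, accept if at most c = 2 fail by t0 = 0.5
for d in (0.5, 1.0, 2.0):   # larger d -> shorter lifetimes here
    print(f"d={d}: P(fail by t0)={burr12.cdf(0.5, 2.0, d):.3f}, "
          f"P(accept)={p_accept(20, 2, 0.5, 2.0, d):.3f}")
```

In the paper's setting, n and c would be chosen to minimize the expected total cost rather than fixed in advance, with the distribution parameters supplied by the EB-GA estimates.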
APA, Harvard, Vancouver, ISO, and other styles