Journal articles on the topic 'SIDS; binomial distribution'

Consult the top 48 journal articles for your research on the topic 'SIDS; binomial distribution.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Li, Ming, and Jin Ye. "Design and Implementation of Demand Side Response Based on Binomial Distribution." Energies 15, no. 22 (2022): 8431. http://dx.doi.org/10.3390/en15228431.

Abstract:
The application of microgrids (MG) is becoming more and more extensive, so it is important to improve microgrid system management methods. The intended costs can be further minimized when the energy management system is unified with demand side response (DSR) strategies. In this work, we propose a generic method for modeling the equipment in a microgrid, including multiple stochastic loads. The microgrid model can be generated on a computer by converting the energy circuit diagram into a signal flow diagram. A demand side response method based on the binomial distribution is then introduced, in which loads are assigned different probabilities according to their importance. By applying the probability of loads and changing the return coefficient of loads, the problem of individual differences in demand side responses is solved, so as to improve consumer satisfaction. The proposed model is formulated as a mixed-integer linear program (MILP). Case studies demonstrate the feasibility of the proposed modeling method: the demand side response achieves the expected goal, and the system management method reduces the operating cost of the microgrid energy system.
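As a rough illustration of the binomial demand-side-response idea summarized above — each load participates in a curtailment event with a probability tied to its importance — the following Monte Carlo sketch draws per-load Bernoulli responses. The load names, power values, and probabilities are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical loads: response probability decreases with importance.
loads = {            # name: (power_kW, response_probability)
    "lighting":      (5.0, 0.9),
    "HVAC":          (12.0, 0.6),
    "refrigeration": (8.0, 0.2),   # important load, rarely curtailed
}

n_trials = 10_000
curtailed = np.zeros(n_trials)
for name, (power, prob) in loads.items():
    # In each trial the load joins the DSR event with probability prob (Bernoulli draw).
    responds = rng.binomial(1, prob, size=n_trials)
    curtailed += responds * power

print(f"expected curtailment: {curtailed.mean():.2f} kW "
      f"(analytic: {sum(pw * pr for pw, pr in loads.values()):.2f} kW)")
```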
2

Le, Hoa Thanh, Uyen Hoang Pham, An Thị Đỗ Nguyễn, and Bao The Pham. "The similarity about the probability distributions of variables in the Bayesian regression model and application." Science & Technology Development Journal - Economics - Law and Management 5, no. 1 (2021): 1325–39. http://dx.doi.org/10.32508/stdjelm.v5i1.701.

Abstract:
The linear regression model, as well as the time series model, is applied in many fields in which the mean of the dependent variable is a function of the means of the independent variables. However, when the regression model is considered within Classical (Frequentist) Statistics, the parameters are treated as constants, and in many situations the model does not describe the fluctuations of either the dependent variable or the independent variables. We therefore need to treat the parameters as random variables rather than constants, as in Bayesian regression. On the other hand, when the parameters are treated as random variables, computations in the regression model become very complex, because we need to compute products of probability distributions. We must therefore consider a variety of probability distributions for the variables: not only the normal distribution, but also the Student's t distribution, the Poisson distribution, the binomial distribution, and so on. In this paper, we estimate the form of the dependent variable's probability distribution through a simple Bayesian regression model in cases where the independent variable may take many different probability distribution forms. In addition, we apply the results to real stock price data, demonstrating that the most appropriate probability distribution for the data is a mixture of probability distributions, not a single normal distribution.
3

Caicedo R., Luis Sigifredo, Edgar Herney Varón D., and Helena Luisa Brochero. "Binomial sampling of Paraleyrodes Quaintance pos. bondari (Hemiptera: Aleyrodidae) in Persea americana Mill." Agronomía Colombiana 34, no. 2 (2016): 209–16. http://dx.doi.org/10.15446/agron.colomb.v34n2.54084.

Abstract:
Fresno (Tolima), in Colombia, is a notable avocado producer, with 36% of the national production. In this paper, two sampling methods are presented to assess natural populations of Paraleyrodes Quaintance pos. bondari attacking avocado trees of the Hass and Lorena cultivars under field conditions. The presence/absence of whitefly nymph colonies on 30 leaves located at the high, medium and low strata per host plant from both cultivars was evaluated. Visual estimations were performed to count the number of whitefly nymphs on 1.25 cm2 of five leaves/bud in the low and medium strata per tree to evaluate the spatial distribution of the whitefly population in accordance with the Poisson distribution, the negative binomial distribution and the b parameter of Taylor's power law. Significant differences in percentages of infestation (P≤0.03) from leaves that belonged to the low avocado tree strata were found between the Lorena (31.88±1.2%) and Hass (15.64±1.8%) cultivars. Natural populations of P. pos. bondari were located on the abaxial leaf side, showing an aggregated distribution in avocado trees from orchards located at different altitudes. Our findings recommend entomological surveillance for Paraleyrodes sp. pos. bondari in Fresno (Tolima), sampling four branches from the medium and low avocado tree strata through inspection of five buds/branches/tree throughout each branch with the presence/absence method to count whitefly nymph colonies on the abaxial side of pre-basal leaves. In total, the sampling involved five leaves/branch (20 leaves/strata or 40 leaves/tree) on 13 avocado trees per hectare.
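A minimal sketch of the dispersion analysis this abstract refers to — comparing Poisson and negative binomial fits to per-leaf counts and estimating the b parameter of Taylor's power law — is shown below on simulated counts; the data and the grouping into sampling units are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated whitefly-nymph counts per leaf (aggregated, hence overdispersed).
counts = rng.negative_binomial(n=2, p=0.3, size=200)

mean, var = counts.mean(), counts.var(ddof=1)
print(f"mean={mean:.2f}, variance={var:.2f} (variance > mean suggests aggregation)")

# Poisson fit: single parameter lambda = sample mean.
lam = mean
# Negative binomial fit by the method of moments: var = mu + mu^2 / k.
k = mean**2 / (var - mean)
p_nb = k / (k + mean)

# Compare goodness of fit via log-likelihoods.
ll_pois = stats.poisson.logpmf(counts, lam).sum()
ll_nb = stats.nbinom.logpmf(counts, k, p_nb).sum()
print(f"log-likelihood  Poisson={ll_pois:.1f}  negative binomial={ll_nb:.1f}")

# Taylor's power law: log(variance) = log(a) + b * log(mean) across sampling units.
# Here the leaves are grouped into hypothetical sampling dates to get (mean, var) pairs.
groups = counts.reshape(20, 10)
m = groups.mean(axis=1)
v = groups.var(axis=1, ddof=1)
b, log_a, r, *_ = stats.linregress(np.log(m), np.log(v))
print(f"Taylor's b = {b:.2f} (b > 1 indicates an aggregated spatial distribution)")
```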
4

Alyami, Salem A., Mohammed Elgarhy, Ibrahim Elbatal, Ehab M. Almetwally, Naif Alotaibi, and Ahmed R. El-Saeed. "Fréchet Binomial Distribution: Statistical Properties, Acceptance Sampling Plan, Statistical Inference and Applications to Lifetime Data." Axioms 11, no. 8 (2022): 389. http://dx.doi.org/10.3390/axioms11080389.

Abstract:
A new class of distribution called the Fréchet binomial (FB) distribution is proposed. The new suggested model is very flexible because its probability density function can be unimodal, decreasing and skewed to the right. Furthermore, the hazard rate function can be increasing, decreasing, upside-down or reversed-J shaped. Important mixture representations of the probability density function (pdf) and cumulative distribution function (cdf) are computed. Numerous sub-models of the FB distribution are explored. Numerous statistical and mathematical features of the FB distribution, such as the quantile function (QUNF); moments (MO); incomplete MO (IMO); conditional MO (CMO); MO generating function (MOGF); probability weighted MO (PWMO); order statistics; and entropy, are computed. Acceptance sampling (ACS) plans are produced for the proposed FB distribution when the life test is truncated at a certain time. The truncation time is supposed to be the median lifetime of the FB distribution multiplied by a set of parameters. The smallest sample size required to ensure that the specified life test is obtained at a particular consumer's risk is determined. Numerical results for a particular consumer's risk, the FB distribution parameters and the truncation time are generated. We discuss the method of maximum likelihood for estimating the model parameters. A simulation study was performed to assess the behavior of the estimates. Three real datasets are used to illustrate the importance and flexibility of the proposed model.
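The acceptance-sampling step described above — finding the smallest sample size whose probability of passing a sub-standard lot stays below the consumer's risk — reduces to a binomial tail computation. The sketch below is generic; the failure probability, acceptance number, and risk are placeholder values, not the paper's parameters.

```python
from scipy.stats import binom

def min_sample_size(p_fail: float, c: int, beta: float) -> int:
    """Smallest n such that P(at most c failures | each item fails before the
    truncation time with probability p_fail) does not exceed the consumer's risk beta."""
    n = c + 1
    while binom.cdf(c, n, p_fail) > beta:
        n += 1
    return n

# Illustrative numbers only: items fail before the truncated test time with
# probability 0.25 under the unacceptable quality level, at most c = 2 failures
# are tolerated, and the consumer's risk is 10%.
print(min_sample_size(p_fail=0.25, c=2, beta=0.10))
```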
5

Njoku, K. C. N., and O. C. Okoli. "On the Total Score for a Negative-Binomially Roll of a Truncated Turn-Up Side of (v-u+1) - Sided Die." African Journal of Mathematics and Statistics Studies 7, no. 3 (2024): 233–48. http://dx.doi.org/10.52589/ajmss-uhowajok.

Abstract:
Let u,v∈N (u<v) and {T(x,y)=p^x q^y : x,y=u,u+1,u+2,u+3,…,v} be a sequence of success-failure events constituting Bernoulli trials, with success probability p and failure probability q. Several probability distributions have their roots in sequences of this form. However, our purpose is to introduce new probability distribution functions that unify some of the existing ones generated by sets of this form mentioned in the literature, and then to give some of the statistics associated with them.
6

Fellman, Johan, and Aldur W. Eriksson. "Sex Ratio in Sibships With Twins." Twin Research and Human Genetics 11, no. 2 (2008): 204–14. http://dx.doi.org/10.1375/twin.11.2.204.

Abstract:
In national birth registers of Caucasians, the secondary sex ratio, that is, the number of boys per 100 girls at birth, is almost constant at 106. Variations other than random variation have been noted, and attention is being paid to identifying presumptive influential factors. Studies of the influence of different factors have, however, yielded meagre results. An effective means of identifying discrepancies is to investigate birth data compiled into sibships of different sizes. Assuming no inter- or intra-maternal variations, the distributions of the sex composition are binomial. Varying parental tendencies for a specific sex result in discrepancies from the binomial distribution. Over a century ago, the German scientists Geissler and Lommatzsch analyzed the vital statistics of Saxony, including twin maternities, for the last quarter of the 19th century. They considered sibships ending with twin sets. Their hypothesis was that in sibships ending with male–male twin pairs, the sex ratio among previous births is higher than normal, while in sibships with female–female twin pairs, the sex ratio is lower than normal. If the sibship ended with a male–female pair, then the sex ratio is almost normal. Consequently, a same-sex twin set indicated, in general, deviations in the sex ratio among the sibs within the sibship. Our analyses of their data yielded statistically significant results that support their statements.
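The binomial benchmark used in this line of work — for sibships of size n, the number of boys should follow Binomial(n, p) with p ≈ 106/206 — can be checked with a standard goodness-of-fit test; the sibship counts in the sketch below are invented for illustration.

```python
import numpy as np
from scipy.stats import binom, chisquare

p_boy = 106 / 206          # secondary sex ratio of ~106 boys per 100 girls
n = 4                      # sibship size considered
# Hypothetical observed counts of sibships containing 0..4 boys.
observed = np.array([52, 230, 370, 250, 68])

# Expected counts if the number of boys per sibship were Binomial(n, p_boy).
expected = observed.sum() * binom.pmf(np.arange(n + 1), n, p_boy)
stat, pval = chisquare(observed, f_exp=expected)   # p_boy is fixed a priori
print(f"chi-square = {stat:.2f}, p = {pval:.3f}")
```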
7

Peng, Eric Chun-Yu, and Markus G. Kuhn. "Adaptive Template Attacks on the Kyber Binomial Sampler." IACR Transactions on Cryptographic Hardware and Embedded Systems 2025, no. 3 (2025): 470–92. https://doi.org/10.46586/tches.v2025.i3.470-492.

Abstract:
Template attacks build a Gaussian multivariate model of the side-channel leakage signal generated by each value of a targeted intermediate variable. Combined with additional steps, such as dimensionality reduction, such models can help to infer a value with nearly 100% accuracy from just a single attack trace. We demonstrate this here by reconstructing the output of the binomial sampler of a Cortex-M4 implementation of the Kyber768 post-quantum key-encapsulation mechanism. However, this performance is usually significantly diminished if the device, or even just the address space, used for profiling differs from the attacked one. Here we introduce a new technique for adapting templates generated from profiling devices in order to attack another device where we are also able to record many traces, but without knowledge of the random value held by the targeted variable. We interpret the model from the profiling devices as a Gaussian mixture and use the Expectation–Maximization (EM) algorithm to adapt its means and covariances to better match the unlabelled leakage distribution observed from the attacked setting. The Kyber binomial sampler turned out to be a particularly suitable target, for two reasons. Firstly, it generates a long sequence of values drawn from a small set, limiting the number of Gaussian components that need to be adjusted. Secondly, the length of this sequence requires particularly well-adapted templates to achieve a high key-recovery success rate from a single trace. We also introduce an extended point-of-interest selection method to improve linear discriminant analysis (LDA).
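A rough sketch of the adaptation idea (not the authors' implementation): treat the profiled templates as the initial components of a Gaussian mixture and let EM re-fit means and covariances on unlabelled traces from the attacked device. All arrays below are synthetic placeholders, and scikit-learn's GaussianMixture stands in for a purpose-built EM routine.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
n_classes, n_dims = 5, 10          # e.g. 5 sampler output values, 10 reduced dimensions

# Templates from the profiling device: per-class means and covariances.
template_means = rng.normal(size=(n_classes, n_dims))
template_covs = np.stack([np.eye(n_dims)] * n_classes)

# Unlabelled traces from the attacked device (same mixture, shifted and noisy).
labels = rng.integers(0, n_classes, size=2000)
traces = template_means[labels] + 0.4 + rng.normal(scale=1.0, size=(2000, n_dims))

gm = GaussianMixture(
    n_components=n_classes,
    covariance_type="full",
    weights_init=np.full(n_classes, 1 / n_classes),
    means_init=template_means,                 # start EM from the profiled templates
    precisions_init=np.linalg.inv(template_covs),
    max_iter=50,
)
gm.fit(traces)                                 # EM adapts means/covariances to the new device
print("average learned mean shift:", np.round((gm.means_ - template_means).mean(), 2))
```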
8

Orsi, Carlo. "New Developments on the Non-Central Chi-Squared and Beta Distributions." Austrian Journal of Statistics 51, no. 1 (2022): 35–51. http://dx.doi.org/10.17713/ajs.v51i1.1106.

Abstract:
New formulas for the moments about zero of the Non-central Chi-Squared and the Non-central Beta distributions are achieved by means of novel approaches. The mixture representation of the former model and a new expansion of the ascending factorial of a binomial are the main ingredients of the first approach, whereas the second one hinges on an interesting relationship of conditional independence and a simple conditional density of the latter model. Then, a simulation study is carried out in order to pursue a twofold purpose: providing numerical validations of the derived moment formulas on one side and discussing the advantages of the new formulas over the existing ones on the other.
9

Wang, Yuxuan, Yi Huang, and Feng Chen. "Optimisation of Spare Parts Quality Inspection Cost Based on Simulated Annealing and Genetic Algorithm." Frontiers in Computing and Intelligent Systems 10, no. 1 (2024): 48–53. http://dx.doi.org/10.54097/abjt5f75.

Abstract:
This paper provides an in-depth study of the impact of spare parts quality inspection on production costs in the electronics industry. By establishing a model based on the binomial distribution, the central limit theorem and right-tailed hypothesis testing, the costs, profits and losses of enterprises under different testing strategies are analysed. First, for the inspection cost problem of accessory products, the minimum number of inspections is determined by modelling the number of defective products in the sample with a binomial distribution and approximating the relevant probability with the De Moivre–Laplace theorem. Second, a simulated annealing algorithm is combined with an analysis of control variables to derive 16 decision alternatives, and the best decision with the lowest total cost is finally determined. Finally, for the case of an increasing number of spare parts and the appearance of semi-finished products, a mathematical model with cost minimisation as the objective function is developed and solved using a genetic algorithm to evaluate the total cost under different inspection strategies. The study shows that the model proposed in this paper can reasonably solve the cost optimisation problem in electronic product quality inspection with high efficiency and practicality.
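A generic sketch of the kind of sample-size calculation the abstract alludes to, using the De Moivre–Laplace normal approximation to the binomial and checking it against the exact distribution; the defect rates, confidence level, and power below are assumed values, not the paper's.

```python
import math
from scipy.stats import norm, binom

# Illustrative setting (not the paper's numbers): the supplier claims a defect
# rate of at most p0; we want to reject that claim with 95% confidence when the
# true rate is p1, using a right-tailed test on the sample defect count.
p0, p1, alpha, power = 0.10, 0.15, 0.05, 0.90

# De Moivre-Laplace (normal) approximation to the binomial gives a closed form.
za, zb = norm.ppf(1 - alpha), norm.ppf(power)
n_approx = math.ceil(((za * math.sqrt(p0 * (1 - p0)) +
                       zb * math.sqrt(p1 * (1 - p1))) / (p1 - p0)) ** 2)

# Exact check with the binomial distribution itself.
def exact_power(n):
    crit = binom.ppf(1 - alpha, n, p0)     # reject H0 when the defect count exceeds crit
    return binom.sf(crit, n, p1)

print(f"normal-approximation sample size: {n_approx}, "
      f"exact power at that n: {exact_power(n_approx):.3f}")
```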
10

Zhao, Dexian, Zhenkai Sun, Cheng Wang, et al. "Using Count Data Models to Predict Epiphytic Bryophyte Recruitment in Schima superba Gardn. et Champ. Plantations in Urban Forests." Forests 11, no. 2 (2020): 174. http://dx.doi.org/10.3390/f11020174.

Abstract:
Epiphytic bryophytes are known to perform essential ecosystem functions, but their sensitivity to environmental quality and change makes their survival and development vulnerable to global changes, especially habitat loss in urban environments. Fortunately, extensive urban tree planting programs worldwide have had a positive effect on the colonization and development of epiphytic bryophytes. However, how epiphytic bryophytes occur and grow on planted trees remains poorly known, especially in urban environments. In the present study, we surveyed the distribution of epiphytic bryophytes on tree trunks in a Schima superba Gardn. et Champ. urban plantation and then developed count data models, including tree characteristics, stand characteristics, human disturbance, terrain factors, and microclimate, to identify the drivers of epiphytic bryophyte recruitment. Different count models (Poisson, negative binomial, zero-inflated Poisson, zero-inflated negative binomial, hurdle Poisson, hurdle negative binomial) were compared in the data analysis to account for the zero-inflated data structure. Our results show that (i) the shaded side and base of tree trunks were the preferred locations for bryophytes to colonize in urban plantations, (ii) both hurdle models performed well in modeling epiphytic bryophyte recruitment, and (iii) both hurdle models showed that the tree height, diameter at breast height (DBH), leaf area index (LAI), and altitude (ALT) promoted the occurrence of epiphytic bryophytes, but the height under branch and the interference intensity of human activities opposed the occurrence of epiphytic bryophytes. Specifically, DBH and LAI had positive effects on the species richness recruitment count; similarly, DBH and ALT had positive effects on the abundance recruitment count, but slope had a negative effect. To promote the occurrence and growth of epiphytic bryophytes in urban tree planting programs, we suggest that managers regulate suitable habitats by cultivating and protecting large trees, promoting canopy closure, and controlling human disturbance.
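A minimal comparison of count models in the spirit of the study, fitted to simulated overdispersed counts with statsmodels; the predictors and data are placeholders, and the zero-inflated and hurdle variants used in the paper can be compared by AIC in the same way.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 300
# Hypothetical predictors standing in for DBH and leaf area index (LAI).
X = sm.add_constant(np.column_stack([rng.normal(size=n), rng.normal(size=n)]))
mu = np.exp(0.3 + 0.5 * X[:, 1] + 0.4 * X[:, 2])
# Overdispersed, zero-heavy counts such as bryophyte species richness per trunk.
y = rng.negative_binomial(n=2, p=2 / (2 + mu))

poisson_fit = sm.Poisson(y, X).fit(disp=0)
negbin_fit = sm.NegativeBinomial(y, X).fit(disp=0)

# Lower AIC indicates the better-fitting count model.
print(f"AIC  Poisson: {poisson_fit.aic:.1f}   NegBin: {negbin_fit.aic:.1f}")
```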
11

Mcharo, Mwamburi, Don Labonte, Chris Clark, and Mary Hoy. "Molecular Marker Variability for Southern Root-Knot Nematode Resistance in Sweetpotato." HortScience 39, no. 4 (2004): 869A—869. http://dx.doi.org/10.21273/hortsci.39.4.869a.

Abstract:
Using two sweetpotato (Ipomoea batatas (L.) Lam) F1 populations from diverse environments we investigated the AFLP marker profiles of the genotypes for association studies between the molecular markers and southern root-knot nematode (Meloidogyne incognita) resistance expression. Population one consisted of 51 half-sib genotypes developed at the Louisiana State Univ. AgCenter. The second population consisted of 51 full-sibs developed by the East African and International Potato Center sweetpotato breeding programs. Results for nematode resistance expression indicate a binomial distribution among the genotypes. Using analysis of molecular variance, logistic regression and discriminant analysis, AFLP markers that are most influential with respect to the phenotypic trait expression were selected for both populations. A comparative analysis of the power of models from the two statistical models for southern root-knot nematode resistance class prediction was also done. The diversity and possible universal similarity of influential markers between the two populations and the expected impact in sweetpotato breeding programs will be discussed.
12

Norun, Abdul Malek, Omran Khalifa Othman, Zainal Abidin Zuhairiah, Yasmin Mohamad Sarah, and Aqilah Abdul Rahman Nur. "Beam Steering Using the Active Element Pattern of Antenna Array." TELKOMNIKA Telecommunication, Computing, Electronics and Control 16, no. 4 (2018): 1542–50. https://doi.org/10.12928/TELKOMNIKA.v16i4.9040.

Abstract:
An antenna array is a combination of two or more antennas used to achieve improved performance over a single antenna. This paper investigates a beam steering technique using the active element pattern of a dipole antenna array. The radiation pattern of the array can be obtained by multiplying the active element pattern by the array factor. The active element pattern is crucial because the mutual coupling effect is considered, and it leads to an accurate radiation pattern, especially in determining the direction of arrival (DoA) of a signal. A conventional method such as the pattern multiplication method ignores the coupling effect, which is essential especially for closely spaced antenna arrays. The two techniques were compared to assess their relative performance. It is observed that the active element pattern influences the radiation pattern of antenna arrays, especially at the side lobe level. Then, the beam of the 3x3 dipole antenna array is steered to an angle of 60° using three amplitude distributions: uniform, Chebyshev and binomial. All of this is accomplished using CST and Matlab software.
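A simplified sketch of how a binomial amplitude distribution steers and shapes an array pattern, using a linear array of ideal isotropic elements rather than the paper's 3x3 dipole array with active element patterns; the element count and spacing are assumptions.

```python
import numpy as np
from scipy.special import comb

def array_factor(weights, d_over_lambda, theta_deg, steer_deg):
    """Array factor magnitude of a uniform linear array with amplitude weights,
    element spacing d/lambda and a progressive phase that steers the main beam."""
    theta = np.radians(theta_deg)
    k = np.arange(len(weights))[:, None]
    psi = 2 * np.pi * d_over_lambda * (np.cos(theta) - np.cos(np.radians(steer_deg)))
    return np.abs(np.sum(weights[:, None] * np.exp(1j * k * psi), axis=0))

n_elem, d = 8, 0.5                      # 8 elements, half-wavelength spacing (assumed)
theta = np.linspace(0.1, 179.9, 1800)
tapers = {
    "uniform":  np.ones(n_elem),
    "binomial": comb(n_elem - 1, np.arange(n_elem)),  # Pascal's-triangle weights
}
for name, w in tapers.items():
    af = array_factor(w / w.max(), d, theta, steer_deg=60)
    print(f"{name:8s}: main beam at {theta[af.argmax()]:.1f} deg")
# The binomial taper trades a wider main lobe for strongly suppressed side lobes,
# which is why it is compared against the uniform and Chebyshev distributions.
```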
13

Yoo, Wonchul, and Tae-wan Kim. "Statistical trajectory-distance metric for nautical route clustering analysis using cross-track distance." Journal of Computational Design and Engineering 9, no. 2 (2022): 731–54. http://dx.doi.org/10.1093/jcde/qwac024.

Abstract:
This study presents a novel statistical trajectory-distance metric specialized for nautical route clustering analysis. Based on the dynamic time warping (DTW) metric, one of the most widely used trajectory-distance metrics, the statistical trajectory-distance metric was defined by replacing the distance term in DTW with a linear combination of the Jensen–Shannon divergence and the Wasserstein distance. Each waypoint of a nautical route was modelled as a discrete and asymmetric binomial normal distribution defined by the cross-track distance (XTD) of the waypoint. The model was then used to compute the statistical distance between waypoints. Nautical route clustering was performed using density-based spatial clustering of applications with noise and the statistical trajectory-distance metric. The nautical routes for the clustering analysis, including the XTD information, were extracted from automatic identification system data from the southern sea of the Korean Peninsula. The clustering results were evaluated by comparing them with the results of other popular trajectory-distance metrics. The proposed method was more effective than the other trajectory-distance metrics when the trajectories pass on both sides of a small island, which is a frequent case in coastal route clustering.
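The replacement distance term described above — a linear combination of the Jensen–Shannon divergence and the Wasserstein distance between two waypoints' XTD distributions — can be sketched as follows; the distributions, grid, and mixing weight are illustrative assumptions (in practice the two terms would also be scaled to comparable ranges before mixing).

```python
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import wasserstein_distance

def waypoint_distance(p, q, support, alpha=0.5):
    """Linear combination of the Jensen-Shannon distance and the Wasserstein
    distance between two discrete cross-track-distance (XTD) distributions
    defined on a common support."""
    js = jensenshannon(p, q)                        # dimensionless, bounded term
    wd = wasserstein_distance(support, support, p, q)  # in the units of the support (m)
    return alpha * js + (1 - alpha) * wd

# Hypothetical XTD distributions of two waypoints, discretised on the same grid (metres).
support = np.linspace(-300, 300, 61)
p = np.exp(-0.5 * ((support - 20) / 80) ** 2);  p /= p.sum()
q = np.exp(-0.5 * ((support + 60) / 50) ** 2);  q /= q.sum()
print(f"combined statistical distance: {waypoint_distance(p, q, support):.3f}")
```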
14

Humphrey, M. Njuki, M. Kiano Elvis, and C. Rono Lucy. "Internalization of External Cost in the Thermal Power Generation on Social Welfare Maximization." Journal of Economics, Finance And Management Studies 07, no. 04 (2024): 2150–60. https://doi.org/10.5281/zenodo.11065848.

Abstract:
For decades, Kenya has incorporated thermal power technology into its grid to generate electrical energy using fossil fuels such as petroleum, natural gas, and coal. The burning of fossil fuels has become a major source of air pollutants and is associated with several undesirable side effects on the environment and human health. However, the impact of pollutants on environmental sustainability and public welfare has yet to be evaluated. Therefore, the purpose of this study is to evaluate the external cost of electricity generated by thermal power plants in Kenya. Both survey data and secondary data were used. The analysis was conducted using externality valuation and welfare maximization approaches, and the research hypotheses were tested using a negative binomial regression model. The annual external cost ($/2022) was determined to be $ 1,333,904,970.76, with the following distribution: environmental at $ 993,488,336.26, Public health at $ 86,760,038.01, and socio-economic at $ 253,656,596.49. Equally, the thermal power generation marginal social cost ($/2022) was determined to be 1.22 $cents/kWh with the following distribution: Marginal Private Cost (MPC) at 0.01 $cents/kWh and Marginal External Cost (MEC) at 1.21 $cents/kWh. The established marginal social cost (MSC) (i.e. Σ MPC+MEC) was 1.22 ($cents/kWh). Thus, MSC is significantly greater than the established social marginal benefit (SMB) of 0.089 ($ cents/kWh); hence, we conclude that the burden of social welfare loss is highly significant, making thermal power a non-sustainable and economic energy source.
15

Hutsalenko, Olga O., Ivan P. Katerenchuk, Tetyana I. Yarmola, et al. "THE ASSESSMENT OF ACUPRESSURE EFFECTIVENESS AND SAFETY IN THE COMPREHENSIVE TREATMENT AND REHABILITATION OF PATIENTS WITH PEPTIC ULCERS." Acta Balneologica 65, no. 4 (2023): 227–32. http://dx.doi.org/10.36740/abal202304105.

Abstract:
Aim: The paper evaluates the effectiveness and safety of acupressure (AP) in the comprehensive treatment and rehabilitation of patients with uncomplicated peptic ulcer disease (PUD). Materials and Methods: The study retrospectively assessed the effectiveness of treating 24 PUD patients who received AP sessions following the author's protocol based on the recommendations of F.M. Houston. The statistical analysis of the results employed the algorithm for qualitative data analysis using the MedCalc 2023 software package. We analyzed the frequency of clinical syndrome manifestations before and after treatment, calculating the interval estimate of this measure (95% confidence interval (CI)). The study formulated null and alternative statistical hypotheses, applying the McNemar test to check the null hypothesis for paired samples. When evaluating the risk of AP side effects, we determined the 95% CI for the proportion, considering the binomial distribution of the feature ("presence/absence of complications"). Results: After completing the treatment course and observing the absence of endoscopic signs of peptic gastropathy, we detected statistically significant changes in the frequency of all clinical syndrome manifestations (p-value < 0.0001, based on the McNemar test). AP rapidly and effectively alleviated the main clinical manifestations in patients with PUD. The tolerability of acupressure was good, without side effects. The study determined, with 95% probability, that the risk of adverse effects did not exceed 15%. Conclusions: AP is an easy-to-use, non-invasive adjunctive therapy and alternative medical practice during the rehabilitation stage for PUD patients. It proves to be an effective, safe, and inexpensive non-pharmacological method of treatment and rehabilitation, aligning with the alternative statistical hypothesis.
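The binomial interval estimate behind the "risk does not exceed 15%" statement can be reproduced with an exact (Clopper–Pearson) confidence interval for zero observed events among 24 patients, for example:

```python
from scipy.stats import binomtest

# 0 adverse effects observed among 24 patients treated with acupressure.
result = binomtest(k=0, n=24)
low, high = result.proportion_ci(confidence_level=0.95, method="exact")
print(f"95% CI for the risk of side effects: {low:.3f} to {high:.3f}")
# The Clopper-Pearson upper bound (~0.14) is consistent with the paper's
# statement that the risk of adverse effects does not exceed 15%.
```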
16

Hutsalenko, Olga O., Ivan P. Katerenchuk, Ulia А. Kostrikova, et al. "Acupressure as a Method of Rehabilitation and Treatment of Patients with Gastroduodenal Pathology." Acta Balneologica 64, no. 4 (2022): 342–47. http://dx.doi.org/10.36740/abal202204112.

Abstract:
Aim: To assess the effectiveness and safety of acupressure in the complex rehabilitation and treatment of patients with gastroduodenal pathology. Materials and Methods: A retrospective analysis of the results of examination and treatment of 40 patients with gastroduodenal pathology was carried out. For treatment, the author's protocol of acupressure (AP) was used, based on the recommendations of F.M. Houston. Statistical processing of the results was carried out according to the qualitative data analysis algorithm using the MedCalc 2022 software package. The frequency of occurrence of qualitative binary variables was analyzed with the calculation of 95% CI. To determine the effect of AP on the change in the frequency of clinical syndromes, the McNemar test for related groups was used. The risk of side effects of AP was assessed by determining the 95% CI for the proportion, taking into account the binomial distribution of the characteristic. Null and alternative statistical hypotheses were formulated. Results: After two weeks of treatment with AP, against the background of the disappearance of endoscopic signs of gastroduodenal pathology, statistically significant changes in the frequency of manifestations of all clinical syndromes were detected. AP not only relieves pain, but also shortens the duration of the illness, eliminates functional disorders of the motility of the upper parts of the digestive tract, allows clinical and endoscopic recovery to be achieved, and provides stable and long-term remission. Tolerability of AP was good. No side effects were registered. It has been proven with a probability of 95% that the risk of a side effect does not exceed 9%. Conclusions: The use of AP is effective and safe, which corresponds to the alternative statistical hypothesis.
17

Madden, L. V., and M. A. Boudreau. "Effect of Strawberry Density on the Spread of Anthracnose Caused by Colletotrichum acutatum." Phytopathology® 87, no. 8 (1997): 828–38. http://dx.doi.org/10.1094/phyto.1997.87.8.828.

Abstract:
Spread of strawberry anthracnose, resulting from the rain splash dispersal of Colletotrichum acutatum conidia, was determined in field plots by assessing fruit disease incidence at a range of distances from an introduced point source of infected fruit with sporulating lesions. Four within-row plant densities were established in replicated plots in each of 2 years. A generalized linear model with a logit link function and binomial distribution for incidence was used to quantify the effects of distance and side of the row relative to the inoculum source, plant density treatment, and their interactions on disease incidence. At all assessment times, there was a significant (P ≤ 0.05) decline in incidence with increasing distance from the spore source. Moreover, row side had a significant effect, with the near side having higher incidence than the far side. Plant density treatment had a significant, but nonlinear, effect on incidence, with incidence generally declining with increasing density. Side of the row relative to the inoculum source and density treatment could affect the steepness of the disease gradient (slope) as well as the overall level of disease incidence, depending on the assessment time and year. The combined effects of plant density and row side on the height and steepness of the disease gradients could be measured using the predicted distance in which incidence equals 10% (d10). Estimated d10 generally increased in a nonlinear manner with decreasing plant density. Also, plant density had a significant negative effect on the proportion of incident rain that penetrated the canopy. In a separate study, plant density did not consistently affect infection of fruit that had been placed within the canopy immediately after being inoculated in the laboratory with a controlled inoculum density, indicating that conditions favoring infection were similar for the four densities. Thus, differences in mean disease incidence and disease gradients among the treatments were mostly due to differences in dispersal and not to other components of the disease cycle. As previously reported for controlled studies using a rain simulator, however, the effects of plant density on dispersal were complex, and increasing density did not universally lead to decreasing disease incidence.
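A minimal sketch of the analysis framework named here — a generalized linear model with a logit link and binomial distribution for grouped incidence data as a function of distance and row side — fitted to simulated data with statsmodels (all numbers are placeholders, not the study's data):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n_plots = 120
distance = rng.uniform(0.5, 8.0, n_plots)      # m from the inoculum source
near_side = rng.integers(0, 2, n_plots)        # 1 = near side of the row
fruit_per_plot = 40

# Simulated incidence declining with distance and higher on the near side.
logit_p = 0.8 - 0.45 * distance + 0.7 * near_side
p = 1 / (1 + np.exp(-logit_p))
diseased = rng.binomial(fruit_per_plot, p)

# Grouped-binomial GLM with a logit link: endog is (successes, failures).
endog = np.column_stack([diseased, fruit_per_plot - diseased])
exog = sm.add_constant(np.column_stack([distance, near_side]))
fit = sm.GLM(endog, exog, family=sm.families.Binomial()).fit()
print(fit.params)      # intercept, distance slope (gradient), row-side effect
```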
18

Escobar-Garcia, Hector Alonso, and Francisco Ferragut. "Damage and spatiotemporal dynamics of the Ngaio flat mite, Brevipalpus ferraguti (Trombidiformes: Tenuipalpidae), with observations on the development of the female insemination system." Experimental and Applied Acarology 86, no. 1 (2021): 73–90. http://dx.doi.org/10.1007/s10493-021-00670-y.

Abstract:
We studied the Ngaio flat mite, Brevipalpus ferraguti Ochoa & Beard, on Myoporum laetum (Scrophulariaceae), a common introduced plant used as hedgerows in gardens and green areas of the Mediterranean, where the mite causes considerable damage. We first describe the damage, and then the patterns of mite seasonal abundance and spatial distribution. Finally, we address the development of the female insemination system at the population level. Damage occurs on both sides of the leaves, starting with a uniform stippling and bronzing and ending in the leaves drying out and extensive defoliation that coincides with summer. Mite population peaked between June and August, maintained moderate levels in autumn and winter and reached its lowest density in early spring. Active motile immatures and eggs were present throughout the year. Females and motile immature forms were more abundant on the abaxial (lower) leaf surface, but eggs were deposited on both surfaces indistinctly, suggesting that females actively move to the adaxial (upper) surface in summer to oviposit. All the developmental stages were aggregated on the leaves throughout the year regardless of their population density. Our study suggests that a binomial or presence-absence sampling, examining only the number of females on the abaxial surface, can accurately estimate the total mite density levels. Only 23.5% of females possessed a fully developed spermatheca, whereas in 76.5% of the cases the seminal receptacle was not present or not developed. Females with a complete spermatheca were less abundant in summer. Average temperatures and host plant species affected the occurrence of this reproductive structure.
19

Ojha, Rajdeep, Abhinav Singh, Jacob George, and Bobeena Rachel Chandy. "Neuromodulation of spinal reflex pathway for the treatment of detrusor overactivity by medial plantar nerve stimulation at surface of sole of foot in patients with spinal cord injury." Journal of Neurosciences in Rural Practice 14 (August 16, 2023): 495–500. http://dx.doi.org/10.25259/jnrp_27_2023.

Abstract:
Objectives: Suprasacral spinal cord lesions are prone to neurogenic detrusor overactivity leading to urinary incontinence. Current medical management has known side-effects, and surgical management is often irreversible. Electrical stimulation to modulate the spinal reflex pathway sharing the same nerve root as the urinary bladder is reported in the literature. This study aimed to reduce detrusor overactivity in patients with spinal cord injury (SCI) using surface electrical stimulation of the medial plantar nerve at the sole of the foot. Materials and Methods: Twenty adults with SCI who had episodes of at least one leak/day due to detrusor overactivity diagnosed by cystometrogram (CMG), were on clean intermittent catheterization, and had a present ankle jerk consented to the study. Participants were asked to maintain a bladder diary a week before and during the 2 weeks of treatment. CMG was done on day 0 and day 14. The cmcUroModul@tor®, an in-house developed electrical stimulator, was used for ½ h daily for a period of 2 weeks. A patient satisfaction feedback questionnaire was administered on completion of treatment. CMG data were analyzed using the Wilcoxon signed-rank test, while the bladder diary was analyzed using the binomial distribution. P < 0.05 was considered statistically significant. The Institutional Review Board (IRB) and ethics committee of Christian Medical College, Vellore, approved the study (CMC/IRB/11061). Results: Statistically significant improvement in maximum detrusor pressure (P = 0.03) and cystometric capacity (P = 0.04) was observed. Of 20 subjects, 18 showed improvement in the bladder diary. Conclusion: Neuromodulation of the medial plantar nerve at the sole of the foot by surface electrical stimulation is a non-invasive, cost-effective, and simple alternative treatment modality for urinary incontinence due to detrusor overactivity.
20

Felbar, Daniel, Vilijam Zdravkovic, and Bernhard Jost. "Demographic changes and surgery caseloads for hip arthroplasty over the last 50 years: a retrospective study." Swiss Medical Weekly 153, no. 3 (2023): 40047. http://dx.doi.org/10.57187/smw.2023.40047.

Abstract:
AIMS OF THE STUDY: Data on the demographic changes over time for hip arthroplasty are rare in Switzerland. The aim of the study was to evaluate the influence of these changes on the distribution of age, gender, operated hip side, place of residence and caseload per surgeon over the last 50 years of hip arthroplasty at the Kantonsspital St. Gallen.
 METHODS: For this retrospective explorative study, data were collected from the operating theatre journals of hip replacements performed at Kantonsspital St. Gallen from 1969 to 2019. Every fifth year was included, which resulted in a 5-year sampling interval over the observation period. The journals were handwritten until 1999 and digital from 2004 to 2019. The following data were obtained: age, sex, type of hip arthroplasty, side of operated hip, place of residence and name of main surgeon. Apart from overall descriptive statistics, we applied the Mann-Whitney U-test to test for differences in age and the binomial test for categorical variables. A linear regression model was applied to investigate the relationship between patients’ ages and historical data on life expectancy of the Swiss population.
 RESULTS: We included 2,963 patients, of whom 1,318 were men (median age = 67 yr., p25 = 59 yr., p75 = 74 yr.) and 1,608 women (median age = 72 yr., p25 = 63 yr., p75 = 79 yr.). Overall, women were significantly older than men, irrespective of whether they received primary total hip arthroplasty (median age = 70 yr. vs 66 yr., p25 = 61 yr. vs 58 yr., p75 = 77 yr. vs 73 yr., p <0.001), hemiarthroplasty (median age = 84 yr. vs 78.5 yr., p25 = 78 yr. vs 71 yr., p75 = 89 yr. vs 85 yr., p <0.001) or total revision arthroplasty (median age = 71 yr. vs 70 yr., p25 = 64.25 yr. vs 63 yr., p75 = 78 yr. vs 75 yr., p = 0.036). A trend toward rising median age is evident looking at the whole period observed from 1969 to 2019. Except for women in the total revision arthroplasty group (r = 0.226), a high correlation between increasing median age of patients undergoing hip replacement and life expectancy was found (r ≥0.663). Significantly more primary total hip prostheses (p = 0.003) and hip hemiprostheses (p <0.001) were implanted in women than in men between 1969 and 2019. Overall, no significant difference in side distribution was seen in the primary total hip arthroplasty (p = 0.061), total revision arthroplasty (p = 1.000) and hemiarthroplasty (p = 0.365) group. In contrast to earlier years, most patients in recent years are operated on by high-volume surgeons (>50 operations per surgeon per year).
 CONCLUSIONS: Demographic changes of patients undergoing total hip replacement reflect the overall demographic changes in the Swiss population. Over the last 50 years the indication for prosthetic hip replacements has not been extended to younger ages. The caseload in hip arthroplasty has changed over the last 50 years towards high-volume surgeons.
21

Price, O. F., and R. A. Bradstock. "The spatial domain of wildfire risk and response in the Wildland Urban Interface in Sydney, Australia." Natural Hazards and Earth System Sciences Discussions 1, no. 5 (2013): 4539–64. http://dx.doi.org/10.5194/nhessd-1-4539-2013.

Abstract:
Abstract. In order to quantify the risks from fire at the Wildland Urban Interface (WUI), it is important to understand where fires occur and their likelihood of spreading to the WUI. For each of 999 fires in the Sydney region we calculated the distance between the ignition and the WUI, the fire weather and wind direction and whether it spread to the WUI. The likelihood of burning the WUI was analysed using binomial regression. Weather and distance interacted such that under mild weather conditions, the model predicted only a 5% chance that a fire starting more than 2.5 km from the interface would reach it, whereas when the conditions are extreme the predicted chance remained above 30% even at distances further than 10 km. Fires were more likely to spread to the WUI if the wind was from the west and in the western side of the region. We examined whether the management responses to wildfires are commensurate with risk by comparing the distribution of distance to the WUI of wildfires with roads and prescribed fires. Prescribed fires and roads were concentrated nearer to the WUI than wildfires as a whole, but further away than wildfires that burnt the WUI under extreme weather conditions (high risk fires). 79% of these high risk fires started within 2 km of the WUI, so there is some argument for concentrating more management effort near the WUI. By substituting climate change scenario weather into the statistical model, we predicted a small increase in the risk of fires spreading to the WUI, but the increase will be greater under extreme weather. This approach has a variety of uses, including mapping fire risk and improving the ability to match fire management responses to the threat from each fire. They also provide a baseline from which a cost-benefit analysis of complementary fire management strategies can be conducted.
22

Price, O. F., and R. A. Bradstock. "The spatial domain of wildfire risk and response in the wildland urban interface in Sydney, Australia." Natural Hazards and Earth System Sciences 13, no. 12 (2013): 3385–93. http://dx.doi.org/10.5194/nhess-13-3385-2013.

Abstract:
Abstract. In order to quantify the risks from fire at the wildland urban interface (WUI), it is important to understand where fires occur and their likelihood of spreading to the WUI. For each of the 999 fires in the Sydney region we calculated the distance between the ignition and the WUI, the fire's weather and wind direction and whether it spread to the WUI. The likelihood of burning the WUI was analysed using binomial regression. Weather and distance interacted such that under mild weather conditions, the model predicted only a 5% chance that a fire starting >2.5 km from the interface would reach it, whereas when the conditions are extreme the predicted chance remained above 30% even at distances >10 km. Fires were more likely to spread to the WUI if the wind was from the west and in the western side of the region. We examined whether the management responses to wildfires are commensurate with risk by comparing the distribution of distance to the WUI of wildfires with roads and prescribed fires. Prescribed fires and roads were concentrated nearer to the WUI than wildfires as a whole, but further away than wildfires that burnt the WUI under extreme weather conditions (high risk fires). Overall, 79% of these high risk fires started within 2 km of the WUI, so there is some argument for concentrating more management effort near the WUI. By substituting climate change scenario weather into the statistical model, we predicted a small increase in the risk of fires spreading to the WUI, but the increase will be greater under extreme weather. This approach has a variety of uses, including mapping fire risk and improving the ability to match fire management responses to the threat from each fire. They also provide a baseline from which a cost-benefit analysis of complementary fire management strategies can be conducted.
23

Regmi, Sanjaya, Elizabeth Sowell, Chenoa D. Allen, Benjamin E. Jones, Nan M. Gaylord, and Victoria Niederhauser. "Parental Barriers and Sociodemographic Disparities in Childhood Vaccination Post-COVID-19 in Tennessee." Vaccines 13, no. 5 (2025): 452. https://doi.org/10.3390/vaccines13050452.

Abstract:
Introduction: The COVID-19 pandemic disrupted routine childhood vaccination schedules, posing significant challenges for underserved communities. Understanding how different sociodemographic groups in Tennessee perceive and navigate childhood vaccination barriers is critical for developing strategies to improve vaccination rates and reduce vulnerability to vaccine-preventable diseases. Methods: A cross-sectional survey was conducted to explore barriers to vaccination across diverse sociodemographic groups in Tennessee. Data were collected from caregivers/parents of children aged 18 years and younger across all 95 counties in Tennessee at community events and through partnerships with schools and other local organizations. Parental responses were analyzed to identify barriers in the access, concern, and importance domains. The distribution of barriers across different sociodemographic groups such as race, income, education level, and insurance status was identified. Descriptive statistics, non-parametric tests, and log-binomial regressions were used to address the research objectives. Results: This study found that the most prominent barriers to childhood vaccination were concerns regarding vaccine safety and side effects. Significant differences in vaccine barriers were observed across racial and ethnic groups for access barriers (p < 0.001), concern barriers (p = 0.006), and importance barriers (p < 0.001). Parents with lower education levels, children without health insurance, and lower-income families faced disproportionate challenges across two of the three barrier domains studied (access and perceived importance of vaccines). Additionally, concern barriers (aPR = 0.998, p < 0.001) and importance barriers (aPR = 0.997, p < 0.001) were strongly associated with the parent-reported prevalence of up-to-date vaccination status. Conclusions: Addressing parental vaccination barriers related to concern, access, and perceived importance is crucial, particularly for underserved populations including low-income families, uninsured parents, racial/ethnic minorities, and those with limited education. A sustained, equity-focused approach integrating scientific communication, community engagement, and policy interventions is essential for increasing vaccine uptake and ensuring equitable vaccination access.
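Log-binomial regression, the model reported here for adjusted prevalence ratios (aPR), is a binomial GLM with a log link; a small illustrative sketch on simulated data follows (log-binomial fits can be numerically fragile, and a modified Poisson approach is a common fallback when they fail to converge).

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 500
barrier_score = rng.uniform(0, 10, n)                 # hypothetical concern-barrier score
# Simulated up-to-date vaccination status with prevalence decreasing in the score.
p = np.clip(0.9 * np.exp(-0.03 * barrier_score), 0, 1)
up_to_date = rng.binomial(1, p)

# Log-binomial model: binomial family with a log link, so exp(coefficient) is an
# adjusted prevalence ratio (aPR) rather than an odds ratio.
X = sm.add_constant(barrier_score)
fit = sm.GLM(
    up_to_date, X,
    family=sm.families.Binomial(link=sm.families.links.Log()),
).fit(start_params=[np.log(up_to_date.mean()), 0.0])  # sensible starting values help convergence
print("aPR per unit of barrier score:", np.exp(fit.params[1]).round(3))
```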
24

Divecha, Hiren M., Terence W. O'Neill, Mark Lunt, and Tim N. Board. "The effect of cemented acetabular component geometry on the risk of revision for instability or loosening." Bone & Joint Journal 103-B, no. 11 (2021): 1669–77. http://dx.doi.org/10.1302/0301-620x.103b11.bjj-2021-0061.r1.

Abstract:
Aims To determine if primary cemented acetabular component geometry (long posterior wall (LPW), hooded, or offset reorientating) influences the risk of revision total hip arthroplasty (THA) for instability or loosening. Methods The National Joint Registry (NJR) dataset was analyzed for primary THAs performed between 2003 and 2017. A cohort of 224,874 cemented acetabular components were included. The effect of acetabular component geometry on the risk of revision for instability or for loosening was investigated using log-binomial regression adjusting for age, sex, American Society of Anesthesiologists grade, indication, side, institution type, operating surgeon grade, surgical approach, polyethylene crosslinking, and prosthetic head size. A competing risk survival analysis was performed with the competing risks being revision for other indications or death. Results The distribution of acetabular component geometries was: LPW 81.2%; hooded 18.7%; and offset reorientating 0.1%. There were 3,313 (1.5%) revision THAs performed, of which 815 (0.4%) were for instability and 838 (0.4%) were for loosening. Compared to the LPW group, the adjusted subhazard ratio of revision for instability in the hooded group was 2.31 (p < 0.001) and 4.12 (p = 0.047) in the offset reorientating group. Likewise, the subhazard ratio of revision for loosening was 2.65 (p < 0.001) in the hooded group and 13.61 (p < 0.001) in the offset reorientating group. A time-varying subhazard ratio of revision for instability (hooded vs LPW) was found, being greatest within the first three months. Conclusion This registry-based study confirms a significantly higher risk of revision after cemented THA for instability and for loosening when a hooded or offset reorientating acetabular component is used, compared to a LPW component. Further research is required to clarify if certain patients benefit from the use of hooded or offset reorientating components, but we recommend caution when using such components in routine clinical practice. Cite this article: Bone Joint J 2021;103-B(11):1669–1677.
25

Guimarães, Patrik, Ana Letycia Basso Garcia, Carla Costa Garcia, et al. "Genetic parameters for early growth and disease resistance in a cloned F2 hybrid progeny of Eucalyptus urophylla × grandis." Agrociencia Uruguay 27, NE2 (2023): e1255. http://dx.doi.org/10.31285/agro.27.1255.

Abstract:
The VERACEL breeding program includes several advanced generation urograndis hybrids (F2) from crosses of selected F1. To understand the performance and genetics of these F2, we established F2-cloned progeny trials with 1,350 clones from 35 families, each family with 8 to 35 cloned progenies. These families stem from crosses between known F1 urograndis female parents and pollen mixtures of other selected parents. The experimental design comprised 15 trials with 95 clones each, arranged in randomized blocks and linked by 7 common commercial hybrid clones. These experiments were performed in two sites contrasting for physiological disorder (PD) incidence. Due to incomplete pedigree of the families, the model fitted assumed the additive genetic relationship between sibs as half-sibs. The non-additive genetic component was estimated from clone effects within half-sib families. A multisite individual tree genetic model was fitted for diameter, height, PD, Calonectria, and rust incidence for trees up to 1 year old: y_ijklmn = µ + g_i + g_ij + s_m + c_n + tb_kl + sg_im + sg_ijm + sc_mn + e_ijklmn, where the random effects are g_ij (additive genetic effects), g_i (all non-additive genetic effects of clones), and e_ijklmn (residuals); and the fixed effects are s_m (site with higher or lower productivity and PD incidence), c_n (F1 or F2), and tb_kl (trial-block interaction). The same model was used for diseases and PD, but assumed a binomial distribution with a logit link function. PD was analysed only for the higher PD incidence site, thus excluding interactions with the other site. Results indicate that at the high PD incidence site, height and diameter growth was greater (~40%), but it also had higher mortality (40% vs. 22%) and higher PD incidence (51% vs. 23%). Compared with F1 controls, Calonectria incidence was higher in F2 (~10%) but similar at both sites, as was rust, although with lower incidence (~5%). Multisite analysis revealed low additive genetic variance and moderate total genetic variance (A+NA) for growth, diseases, and PD. For growth, narrow- and broad-sense heritability were h² = 0.11 and H² = 0.40. The additive genetic correlation between the two sites was close to 1 (r_A ≈ 0.9), whereas the non-additive genetic correlation was lower (r_NA ≈ 0.5). The narrow- and broad-sense heritability for diseases and PD was low (between 0.10 and 0.15). The low additive genetic variance for growth, diseases, and PD constrains substantial gains from parental selection within F1. However, clonal selection would still be effective due to considerable non-additive effects. Further studies using a large set of various populations are needed to validate these findings.
26

Aokage, Keiju, Yoshihisa Shimada, Kiyotaka Yoh, et al. "Pembrolizumab and ramucirumab neoadjuvant therapy for PD-L1-positive stage IB-IIIA lung cancer (EAST ENERGY)." Journal of Clinical Oncology 41, no. 16_suppl (2023): 8509. http://dx.doi.org/10.1200/jco.2023.41.16_suppl.8509.

Abstract:
8509 Background: Neoadjuvant treatments for resectable non-small cell lung cancer (NSCLC) using novel combination therapies are being developed. Angiogenesis inhibitors have been reported to modify tumor immunity, and the efficacy and safety of treatments added to immune checkpoint inhibitors (ICIs) have been investigated in advanced NSCLC. In this multi-institutional phase II study, we evaluated the efficacy and feasibility of neoadjuvant therapy with pembrolizumab and ramucirumab, a direct vascular endothelial growth factor (VEGF) receptor-2 antagonist, followed by surgery, in patients with PD-L1-positive, clinical stage IB-IIIA NSCLC. Methods: Patients (aged ≥20) with pathologically proven NSCLC harboring PD-L1 expression ≥1% (22C3), resectable clinical stage IB-IIIA NSCLC, and performance status of 0 to 1 were eligible. Patients received two cycles of pembrolizumab (200 mg/body) and ramucirumab (10 mg/kg) every three weeks. Surgery was scheduled 4-8 weeks after the last dose. The primary endpoint was to determine the major pathologic response (MPR) rate. The sample size was calculated based on the exact binomial distribution, considering a threshold MPR rate of 20%, an expected MPR rate of 45%, a one-sided alpha of 5%, and a desired power of 80%. Results: A total of 24 eligible patients, with a median age of 75 years (range 50-78), were enrolled between July 2019 and April 2022; 18 patients were male. The histological subtype was adenocarcinoma in 12 patients, and the clinical stage was IB, IIA, IIB, and IIIA in 1, 4, 9, and 10 patients, respectively. PD-L1 was ≥50% in nine patients (37.5%). The MPR rate by the blinded independent central review of three pathologists was 50.0% (90% confidence interval, 31.9-68.1%); therefore, the primary endpoint was met. Six of the 12 patients who achieved MPR showed pathological complete response. One patient developed pneumonia before neoadjuvant treatment and one patient showed disease progression after neoadjuvant treatment. Grade 3 adverse events (AEs) occurred in nine of 24 patients (37.5%) during the protocol treatment. Postoperative complications of grade 3 AEs, including postoperative hematoma, pulmonary fistula, and intraoperative arterial injury, were observed in three patients. Immune-related AEs related to protocol treatment were thyroid dysfunction, acute tubulointerstitial nephritis, and hepatic dysfunction in three, two, and two patients, respectively; however, no grade 3 or higher AEs were observed. Twenty-one patients achieved R0 resection and one patient underwent R1 resection. There were no wound-healing adverse events of concern due to the anti-VEGF action of ramucirumab. Conclusions: The results of this study demonstrated that this new neoadjuvant combination of an ICI and an anti-VEGF agent (pembrolizumab and ramucirumab) is feasible and showed encouraging results. Clinical trial information: NCT04040361.
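The exact binomial sample-size calculation described for the primary endpoint (threshold MPR 20%, expected 45%, one-sided alpha 5%, power 80%) can be sketched as a single-stage exact design search; the minimal n returned is only the statistical minimum and may differ from the 24 patients actually enrolled.

```python
from scipy.stats import binom

def exact_single_stage(p0, p1, alpha, power, n_max=100):
    """Smallest (n, r) such that declaring success when at least r of n patients
    respond keeps the type-I error below alpha under p0 (threshold rate) and
    achieves the desired power under p1 (expected rate)."""
    for n in range(1, n_max + 1):
        for r in range(n + 1):
            type1 = binom.sf(r - 1, n, p0)       # P(X >= r | p0)
            if type1 <= alpha and binom.sf(r - 1, n, p1) >= power:
                return n, r
    raise ValueError("no design found up to n_max")

n, r = exact_single_stage(p0=0.20, p1=0.45, alpha=0.05, power=0.80)
print(f"enrol {n} patients; conclude efficacy if at least {r} achieve MPR")
```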
27

Lienkov, Serhii, Volodymyr Dzhuliy, Oleksandr Yavorskyi, and Kostyantyn Zatsepin. "Information security model of functioning software." Smart technologies: Industrial and Civil Engineering 2, no. 15 (2024): 31–45. https://doi.org/10.32347/st.2024.2.1202.

Abstract:
The paper systematizes models of reliable and safe functioning of software. As a result of the research, three types of models were identified: analytical, statistical, and empirical. A number of the most frequently used models are considered, and their disadvantages and advantages are highlighted from the point of view of describing the safe functioning of a software product and recognizing malicious software. According to the results of the research, the considered models have the advantage of being simple to implement in practice, but the following disadvantages are highlighted: some of the considered models require a large amount of computing resources when implemented, for security analysis and the accumulation of archival data; statistical and probabilistic models assume that the intensity of attacks/failures or the number of errors in software follows a known distribution (binomial, normal or Poisson), which is not always true for real processes and systems; there is no division into software failures and failures due to cyber attacks, and zero-day vulnerabilities are also not taken into account; memory accesses of the investigated software are not analyzed, although they could provide important information about its legitimacy or the presence of malicious functions; and none of the considered models provides a comprehensive representation of the process of software functioning, including analysis from the information security side. The task of recognizing malicious software becomes more relevant and more difficult every year in connection with the digitalization of human activities and the use of software to execute business logic and technical processes in complex systems. The larger the volume of software in a system, the more errors it potentially contains, and because modern systems are connected to the Internet, software is often distributed over the network, which allows attackers to create new vectors of cyber attacks on systems. The proposed model of safe functioning of the software product eliminates the shortcomings of the considered models because it takes into account the characteristic features of the manifestation of malicious software on devices, namely the impact of malicious software on the computing resources of the system and its work with RAM. This allows the developed model to take into account both the reliability of software operation and its security. Within the model, criteria for the safe functioning of the software are formulated, and it is concluded that, for the most effective practical implementation of such a model, a hypervisor should be used.
APA, Harvard, Vancouver, ISO, and other styles
28

Schwartzberg, Lee S., Eric Roeland, Pablo C. Okhuysen, et al. "Characterizing unplanned resource utilization associated with cancer-related diarrhea." Journal of Clinical Oncology 39, no. 15_suppl (2021): e18625-e18625. http://dx.doi.org/10.1200/jco.2021.39.15_suppl.e18625.

Full text
Abstract:
e18625 Background: In clinical oncology practice, diarrhea is a very common and severe side effect of cancer treatments, including radiotherapy, chemotherapy, and targeted therapies. Cancer-related diarrhea (CRD) leads to increased healthcare resource consumption due to unscheduled outpatient visits and increased hospital stays requiring intensive supportive care measures. We evaluated emergency department (ED) visits, physician office visits, hospitalizations, and length of stay (LOS) among CRD patients receiving chemotherapy, targeted therapy, or both, compared to a matched cohort of non-CRD patients. Methods: We performed a longitudinal study among adult patients (>18 yrs) with CRD identified by diagnosis codes or pharmacy claims compared to matched non-CRD patients using claims data derived from the IQVIA PharMetrics Plus database. Index date was the first cancer claim date and patients were re-indexed based on their CRD claim. Each patient had a 6-month pre-index period, a minimum 3-month post-index period and had ≥12 months of continuous enrollment following the CRD index date. To adjust for selection bias and baseline differences, we matched CRD patients to non-CRD patients (1:1) by age, gender, geography and payer type. Patients were stratified by cancer therapy type (chemotherapy, targeted therapy or both treatments). We reported the proportion of patients with hospitalizations, average LOS, and ED visits. A generalized estimating equation model with log link and binomial distribution, adjusted for type of cancer, therapy, and Charlson Comorbidity Index (CCI), was built to estimate the difference in occurrence of hospitalization between CRD and non-CRD cohorts. Results: We evaluated a total of 104,135 matched pairs of CRD and non-CRD adult patients with solid or hematologic cancer with 12-month continuous enrollment. The proportions of patients with ED visits (36.2% vs 18.9%, p < 0.0001) and hospitalizations (29.6% vs 12.8%, p < 0.0001) were significantly higher in the CRD versus the non-CRD cohort. When compared to non-CRD patients, CRD patients were more likely to be hospitalized (adjusted OR 2.28, 95% CI 2.23-2.33). Mean CRD-specific office/hospital visits were significantly higher in the CRD cohort compared to the non-CRD cohort over the 12-month post-index period, and patients had more CRD-specific visits to the ED (7.5% vs 1.8%); physician's offices (14.7% vs 3.8%); laboratory testing (11.6% vs 3.2%) and outpatient ancillary services (10.9% vs 2.6%) (all p < 0.0001). Mean hospital LOS among patients with CRD was higher than among non-CRD patients (6.6±8.9 vs 5.8±10.5 days, p < 0.0001). Conclusions: Patients with CRD used significantly more resources, including outpatient services, ED visits, and hospitalizations. Effective prevention of CRD remains an unmet need for reducing the overall cost of cancer care.
APA, Harvard, Vancouver, ISO, and other styles
29

Ferland, Yaïves, and Mir Abolfazl Mostafavi. "From Spatial to Platial Information Systems: For a Better Representation of the Sense of Place." Abstracts of the ICA 1 (July 15, 2019): 1–2. http://dx.doi.org/10.5194/ica-abs-1-76-2019.

Full text
Abstract:
In a traditional Geographical Information System (GIS), one usually characterizes spatial entities by their individual placename (or toponym) and a set of defined attributes. These entities and their attributes are structured in table frames with functional relations, plus coordinates of their geometric primitives for location in a quantified space. In such a case, one considers the geographical names of these entities as unique identifiers under the standardized ‘generic-specific’ binomial, in one-to-one relationships with features of interest on the ground. This technical perspective suits quite well the administrators of land-based public mandates needing univocal references to particular places for planning as well as daily applications (e.g., municipal, postal, emergency). But, from a linguistic point of view, the exact location and dimensions of places are merely further attributes of the toponym itself, since the study of placenames looks mainly to its meaning, etymology, and evolution in particular language(s) through the epochs of naming practices, forming a kind of toponymic “strata”. That approach is investigatory, documented, critical, often anecdotal, sometimes even searching for identity, but it devotes little consideration to the geographical place per se, its landscape, or its limits.

On the other hand, the geographical perspective on placenames is more concerned with a real or identifiable place within its vicinity and with the dynamics of situations occurring in that location at multiple scales in the same period, whatever the names covering part or all of the area. Thus, looking for a place-based information system to structure perceived or unofficial places of specified interest requires the design and development of tools and functionalities able to support analysis of such information for particular purposes (personal, commercial, military, transit, indigenous, participatory, tourism, etc.). In such a context, the definition of their substantial components requires other technical means than their descriptive attributes and geometric primitives. Over the rational spatial databases as a base map, the user of a platial information system would design place-entities in an autonomous fashion, more subjective and colloquial, while referring to landmarks for identity or distinction with respect to different people, be they inhabitants, local workers or just visitors. The place-entity comes to the mind like a shapeless mass with a core and some peripheries, where it shows an oriented front or façade, or a force line as a trend, and more fuzzy and moveable limits on the other sides. For instance, downtown and the central business district may not correspond to each other as platial synonyms.

For this purpose, the necessary data structures remain to be developed. To do so, Voronoi tessellation, with its flexible spatial proximity definition and its topological (instead of geometrical) properties, represents an interesting alternative model for further research. In short, a Voronoi diagram partitions the space into regions such that any location is associated with its nearest Voronoi generator (a centroid representing a place). This allows an adaptive discretization of the space, and provides a simple and intuitive basis for the definition of adjacency relations between the generating points. Depending on the distribution of generating points, Voronoi cells can flexibly approximate place extensions close to human perception of those places and still keep the fundamental qualitative relations between places, thanks to the topological properties of this model. The irregularity of such a Voronoi model also has the significant advantage of better approximating places and their variable distribution in geographical space.

Here in this paper, we also consider scale as an important factor for the representation and analysis of the places approximated with Voronoi cells. Indeed, at a higher level, a place may be constituted either by the aggregation of several places at finer scales or by their parts. A hierarchical Voronoi diagram can also be considered to model such relations between places and their vertical relations, for instance where the same particular placename identifies different features or entities that overlap, evolve, or have various limits or meanings. In such a model, moving between different scales or data “levels” is done without worrying about the exact aggregation of lower geometries into a high-level place representation. In order to structure Voronoi hierarchies for a set of points representing centroids of places, one must start by constructing Voronoi cells for the finest level and then create pointers that relate higher-level places to the lower ones.

Based on its unique properties, Voronoi tessellation, among strong solutions for platial information systems, can provide a firm base for representing the complexity of places, both as entities (or features) and as the toponyms that identify or refer to them. Paralleling relational GIS data frames, it would make it possible to adapt to unusual or fuzzy places, to represent their geographical evolution, to preserve their name and spatial extension, and to take good note of local or ancient variants and even of exonyms applied to such places from abroad. Thus, it presents an opportunity to map the sense, if not the spirit, of place.
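As a concrete illustration of the nearest-generator idea described above, the short sketch below builds a Voronoi partition from a handful of hypothetical place centroids and assigns arbitrary query locations to their nearest generator. It is a minimal sketch assuming SciPy; the coordinates and place names are invented for illustration and are not taken from the paper.

# Minimal sketch: assign locations to "places" via their nearest Voronoi generator.
# Hypothetical coordinates and place names; not data from the paper.
import numpy as np
from scipy.spatial import Voronoi, cKDTree

place_names = ["downtown", "harbour", "old town", "campus", "market"]
generators = np.array([[0.0, 0.0], [2.0, 0.5], [0.5, 2.0], [3.0, 3.0], [1.5, 1.5]])

vor = Voronoi(generators)        # the Voronoi cells play the role of "place extents"
tree = cKDTree(generators)       # nearest-generator lookup for arbitrary points
print(len(vor.regions), "Voronoi regions (including unbounded ones)")

queries = np.array([[0.2, 0.1], [2.6, 2.4], [1.4, 1.6]])
_, idx = tree.query(queries)     # index of the nearest generator for each query point
for q, i in zip(queries, idx):
    print(f"location {q} falls in the cell of '{place_names[i]}'")

# A hierarchical variant could keep a coarser set of generators plus a pointer
# from each fine-level generator to its parent, as sketched in the abstract.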
APA, Harvard, Vancouver, ISO, and other styles
30

Pillai, K. Sadasivan. "Classifying chemicals into toxicity categories based on LC50 values- Pros and cons." Journal of Environmental Biology 44, no. 5 (2023): i—iii. http://dx.doi.org/10.22438/jeb/44/5/editorial.

Full text
Abstract:
It is a usual practice to determine the LC50 value in acute toxicity studies conducted in aquatic organisms as an initial step to assess the toxicity of chemicals. In regulatory toxicity studies, normally conducted in GLP (Good Laboratory Practice)-certified facilities, the acute toxicity of chemicals is evaluated in fish, crustacea, and/or algae following the methods given in OECD Guidelines. The chemicals are classified into different toxicity categories based on the LC50/EC50 determined from the acute toxicity studies. For calculating LC50 in acute toxicity tests, the methods given in the OECD (2019) Guidelines are Probit or Logit Analysis (Litchfield & Wilcoxon method and Probit Analysis), the Spearman-Karber method, the binomial method, the moving average method, and the graphical method. LC50 is the concentration of a substance that causes 50% mortality in a batch of test organisms (e.g., fish). In acute toxicity studies with laboratory animals like rats, mice, rabbits, etc., instead of LC50, the terminology LD50 is used. The procedure for the calculation of both LC50 and LD50 is the same. In this article, LC50 and LD50 are written interchangeably. This means that if 100 fish are exposed to the LC50, theoretically 50 fish would die. In fact, the inventor of LC50 (Trevan, 1927) defined LC50 as the median lethal concentration. Like any other median value, the LC50 is not affected by extreme values on either side. Unfortunately, Trevan was ruthlessly misquoted by the animal ethicists, as they believed that he was responsible for killing millions of animals for determining the median lethal concentration. According to Rowan (1983), the median lethal concentration in animals varies considerably among the species and is affected by environmental factors. Trevan proposed the median lethal concentration (LD50) in frogs and rodents for biological standardization of digitalis extract, insulin, and diphtheria toxin when he was working at Wellcome Research Labs, Beckenham (Pillai et al., 2021a). Trevan never promoted sacrificing more animals to determine median lethal concentration. He was aware of the fact that the determination of median lethal concentration is affected by several factors. The 'characteristic' of a dose-response curve proposed by Trevan is species- and test substance-specific. However, after Trevan, LD50s were determined in acute toxicity studies to evaluate the effect of a substance, not for the biological standardization of drugs. His intention was to establish a numerical quality control standard to assess batch-to-batch variation, if any, of the therapeutic products of the Wellcome Research Labs. Based on the LC50/EC50 values determined in aquatic toxicity studies, the chemicals are classified into a hazard category. For example, according to the United Nations Globally Harmonized System (GHS), if the 96-h LC50 of a chemical to fish is ≤ 1 mg l⁻¹, this chemical is classified into hazard category I (GHS, 2019). Though several methods are prescribed in the OECD (2019) Guidelines, if the mortality data are adequate, the Probit Analysis of Finney (1978) and the Litchfield and Wilcoxon (1949) method may be preferred for determining LC50, as these methods provide additional valuable information on the concentration-mortality relationship.
If the lowest mortality obtained is close to 16% and the highest mortality is close to 84%, most of the above-mentioned methods would result in a more or less similar LC50 value (Pillai et al., 2021a). Calculation of LC50 manually by the Litchfield and Wilcoxon method is somewhat easier, but Probit Analysis is a bit cumbersome. Commercial statistical software is available for the calculation of LC50 by both the above methods. However, using the software without understanding the underlying concepts of the statistical methods has certain disadvantages. Researchers also present the toxicity of a substance in terms of LC10, LC90, etc. Since the variability of these estimates is large, their biological relevance is limited. The concentration-mortality curve in the 16-84% mortality range is linear; hence, the LC50 determined from this portion of the curve is reliable. The method of Litchfield and Wilcoxon (1949) uses the 16-84% mortality range for calculating LC50. This method does not consider mortality below 16% and above 84% for the LC50 calculation. But Probit Analysis by Finney (1978) considers all mortality values (excluding 0 and 100% mortality) for the calculation of LC50. Researchers in academic institutions use LC50 values to compare the toxicity of the test substances - the lower the LC50, the more toxic the substance, and vice versa (Islam et al., 2021). Toxicity grading of substances solely based on LC50 is inappropriate. Recently, the appropriate use of LC50 values for the GHS classification of chemicals has been questioned (Pillai et al., 2021a). LC50s vary widely from one species to another (Geyer et al., 1993) and are often irreproducible within the same species (Peres and Pihan, 1991), as the physico-chemical parameters of dilution water play a crucial role in LC50 experiments. Hrovat et al. (2009) reported significant variability of fish LC50 test results for 44 compounds. A consistent LC50 could not be obtained in more than 750 tests conducted on fathead minnows with 644 chemicals (McCarty, 2012). It is a statutory requirement under the United Nations GHS that environmental hazards be mentioned on the labels of chemicals intended for distribution. The European Chemicals Agency (ECHA, 2017) uses fish LC50 for the environmental classification of a chemical according to the GHS of Classification, Labelling and Packaging of Chemicals (Paparella et al., 2021). The major disadvantage of such labelling is that the LC50 value alone does not provide information on the toxicity profile of chemicals. Showing a similar LC50 does not mean that the toxicity profile of the chemicals is the same. It is important to consider the slopes of the concentration-mortality curve when comparing the LC50s of the chemicals. The slope, which reflects the concentration-mortality relationship, provides a better understanding of the causality between a toxicant and response (Tsatsakis et al., 2018). In Probit Analysis, parallel regression lines of mortality probits on log concentrations indicate that the mode of action of chemicals on test organisms is similar (Finney, 1978). If the regression lines are not parallel, it is a clear indication that the chemicals possess different modes of action on that particular organism. Also, it is important to present LC50 with 95% confidence limits. If the 95% confidence limits of the LC50s of the chemicals are distinctly separate, the LC50s can be considered different from each other.
The LC50s cannot be considered different from each other if the 95% confidence limits of the LC50s overlap. Chemicals with similar LC50 values may manifest toxicity differently. Similarly, chemicals with different LC50 values may manifest similar toxicity effects; hence, the classification of chemicals into various groups based on LC50 values may not have much relevance (Pillai et al., 2021b). Ethical conduct of fish toxicity studies and euthanizing of exposed fish are emphasized in the OECD (2019) and CCSEA (Committee for Control and Supervision of Experiments on Animals) Guidelines (CPCSEA, 2021). Earlier, fish toxicity studies were conducted with 10 fish exposed to each test concentration, but the revised OECD (2019) Guideline recommends a minimum number of 7 fish for each test concentration. The probable mortality data that can be obtained in an acute test where seven fish are exposed to each test concentration are (number of fish that died/total number of fish exposed) 0/7, 1/7, 2/7, 3/7, 4/7, 5/7, 6/7, or 7/7. For calculating LC50 values by the methods of Litchfield and Wilcoxon (1949) and Finney (1978), 0 and 100% mortality are not used, since no probit values can be assigned for 0 and 100% mortality. The remaining six mortality values are adequate for calculating a reliable LC50 value, if the mortality data spread over all phases of the concentration-mortality curve, particularly covering the 16-84% mortality region. If the mortality data do not spread over all the phases of the concentration-mortality curve in a concentration-dependent manner, the confidence limits of the LC50 could become extremely wide (Pillai et al., 2021b). Estimation of LD50 in rodents by the methods of Litchfield and Wilcoxon (1949) and Finney (1978) is discouraged by the US Consumer Product Safety Commission, the US Environmental Protection Agency, the US Food and Drug Administration, the National Toxicology Program, and the OECD, due to ethical reasons and poor reproducibility of LD50 values. However, classical methods are still used to determine LC50 values in environmental toxicity studies, especially with aquatic organisms. It is more biologically relevant to interpret the LC50 in terms of the slope of the concentration-mortality curve and the confidence interval of the LC50. My association with Dr. R.C. Dalela and the Journal of Environmental Biology began in the early 1980s when he was working at D.A.V. College, Muzaffarnagar. His research work and enthusiasm for bringing the Journal of Environmental Biology up to an international standard fascinated me. I realized from his research work that he was a committed environmentalist. I had the opportunity to play a major role in organizing two national conferences of the Academy of Environmental Biology. He always occupied the front row in the conferences, listening to all scientific presentations keenly. He endured many hardships to bring the journal to this sustainable level, with a WOS Impact Factor of 0.70. I remember, as if it had happened yesterday, my meetings with him at D.A.V. College, Muzaffarnagar; at JRF, Vapi; at Marathwada Ambedkar University, Aurangabad; and in Chennai. He was an excellent teacher, a great scientist, a mentor to several researchers, and self-disciplined.
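To make the probit idea discussed above concrete, the sketch below fits a probit concentration-mortality model by maximum likelihood and reads off the LC50 as the concentration at which the fitted mortality equals 50%. It is a minimal illustration with made-up data, not the Finney (1978) or Litchfield and Wilcoxon (1949) procedure as implemented in any particular software.

# Minimal probit LC50 sketch with invented data (not a validated regulatory implementation).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

conc = np.array([1.0, 2.0, 4.0, 8.0, 16.0])   # hypothetical test concentrations (mg/l)
n    = np.array([7, 7, 7, 7, 7])              # fish exposed per concentration
dead = np.array([1, 2, 3, 5, 6])              # fish dead per concentration

x = np.log10(conc)

def neg_log_lik(theta):
    a, b = theta
    p = np.clip(norm.cdf(a + b * x), 1e-9, 1 - 1e-9)   # probit mortality model
    return -np.sum(dead * np.log(p) + (n - dead) * np.log(1 - p))

fit = minimize(neg_log_lik, x0=[0.0, 1.0], method="Nelder-Mead")
a, b = fit.x
lc50 = 10 ** (-a / b)          # mortality = 50% where a + b*log10(c) = 0
print(f"estimated LC50 ~ {lc50:.2f} mg/l, slope b = {b:.2f}")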
APA, Harvard, Vancouver, ISO, and other styles
31

Mage, David T., and E. Maria Donner. "The Sudden Infant Death Syndrome is a Probability Process." December 8, 2012. https://doi.org/10.5281/zenodo.7886.

Full text
Abstract:
The Sudden Infant Death Syndrome (SIDS) is sudden, unexpected death of an infant that remains unexplained after thorough forensic autopsy, death scene investigation and review of the infant's medical history. As the results of a few spins of a roulette wheel cannot establish whether the wheel is honest (uniform value distribution), medical investigations of a few SIDS cases have not been able to uncover the mechanistic cause of death. We propose that this is because statistical analyses of large numbers of independent observations may be required to unmask the apparent probability processes that govern these quite different phenomena. The SIDS male fraction ~0.60 appears as a binomial probability sample characteristic of a condition caused by an X-linked gene. The unique SIDS age distribution (minimum at birth, mode ~63 days, median ~94 days, falling exponentially to zero at ~41.2 months) appears as a probability sample from an underlying Johnson SB (4-parameter lognormal) distribution of ages. The presence of this lognormal distribution is prima facie evidence that a probability process is involved. Matching binomial and SB distribution equations to these physiological phenomena, we propose: the SIDS binomial gender distribution arises from an X-linked recessive allele (q ≈ 2/3) non-protective against acute anoxic encephalopathy; and the SIDS Johnson SB age distribution arises from such genetically susceptible infants having three independent risk factors: neurological prematurity, (m + 0.31)^(-1), decreasing with age in months m; risk of respiratory infection, (41.2 - m)^(-1), increasing with age; and risk of physiological anemia, (2πσ²)^(-1/2) exp(-0.5[(y - μ)/σ]²), rising and falling with age, where y = log[(m + 0.31)/(41.2 - m)] = μ + σz, μ is the median of y, σ is the standard deviation of y, and z is a standard normal deviate. We show that infant Respiratory Distress Syndrome and Suffocation by Inhalation of Food or Foreign Object have approximately the same male fractions as SIDS, supporting the hypothesis that the same allele of an X-linked gene is responsible for death in all these cases.
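The three multiplicative risk terms quoted above follow from a standard change of variables on the Johnson SB model. The short derivation below is our own restatement, using natural logarithms (a base-10 logarithm only rescales μ and σ); it shows how the (m + 0.31)^(-1), (41.2 - m)^(-1) and Gaussian factors combine into a single age density.

% If z = (y - \mu)/\sigma is standard normal with y = \ln[(m + 0.31)/(41.2 - m)],
% then the change of variables f(m) = \phi(z)\,|dz/dm| gives, for 0 < m < 41.2:
\[
  \frac{dy}{dm} = \frac{1}{m + 0.31} + \frac{1}{41.2 - m}
               = \frac{41.51}{(m + 0.31)(41.2 - m)},
\]
\[
  f(m) = \frac{41.51}{\sigma\sqrt{2\pi}}\;
         \frac{1}{m + 0.31}\;\frac{1}{41.2 - m}\;
         \exp\!\left(-\frac{(y - \mu)^2}{2\sigma^2}\right).
\]
% The age density is therefore exactly the product of a factor decreasing as
% (m + 0.31)^{-1}, a factor increasing as (41.2 - m)^{-1}, and a Gaussian in y,
% matching the three risk factors named in the abstract (41.51 = 41.2 + 0.31).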
APA, Harvard, Vancouver, ISO, and other styles
32

"Binomial Parameter Estimation: Asymptotic analysis of Normality and Confidence Regions Using Moments Estimators." AlQalam Journal of Medical and Applied Sciences, February 15, 2025, 315–18. https://doi.org/10.54361/ajmas.258145.

Full text
Abstract:
Confidence regions for binomial distributions are widely used in practical research. Their extensive application has led many statisticians to develop new methods for constructing these regions in asymptotic and exact ways. This research primarily aims to construct confidence ellipses for the parameters p and m of the binomial distribution Bin(m,p), using a sample of fixed size n and the method of moments estimators. Although the conventional method of moments estimators for p and m are well established and straightforward to derive, their denominators can reach zero with positive probability. This study introduces modified estimators to address this issue effectively. Using the asymptotic normality of the joint distribution of both the conventional and modified estimators, we examine quadratic forms of these estimators that follow a chi-square distribution, enabling the construction of simultaneous confidence regions (ellipses) for the parameters p and m. The method of moments estimators for the binomial parameters p and m are found to be approximately normal, which allows confidence regions to be constructed based on the chi-square distribution. Presenting modified estimators to handle the zero-denominator issue ensures robust and reliable inferences. The theoretical results show that confidence regions for the binomial parameters p and m based on the modified estimators are effective, overcoming the limitations of conventional approaches. This method provides a powerful framework for statistical inference, relying on asymptotic normality and the chi-square distribution.
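For reference, the conventional method of moments estimators mentioned above equate the sample mean and variance to mp and mp(1-p). The sketch below is our own illustration of those conventional estimators (the paper's modified estimators are not reproduced); it also shows why the denominator can hit zero or go negative in small samples.

# Conventional method-of-moments estimators for X ~ Bin(m, p):
#   E[X] = m*p and Var[X] = m*p*(1-p)  =>  p_hat = 1 - s2/xbar,  m_hat = xbar/p_hat.
# Illustrative sketch only; the paper's modified estimators are not shown here.
import numpy as np

rng = np.random.default_rng(0)
x = rng.binomial(n=10, p=0.3, size=25)          # sample from Bin(m=10, p=0.3)

xbar, s2 = x.mean(), x.var(ddof=1)
p_hat = 1 - s2 / xbar                           # can be <= 0 whenever s2 >= xbar
m_hat = xbar / p_hat if p_hat > 0 else np.nan   # the zero-denominator problem the paper addresses
print(p_hat, m_hat)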
APA, Harvard, Vancouver, ISO, and other styles
33

Ashit, B. Chakraborty, and Khurshid Anwer. "ON POWER OF THE CONTROL CHART FOR ZERO TRUNCATED BINOMIAL DISTRIBUTION UNDER MISCLASSIFICATION ERROR." Investigación Operacional 40, no. 3 (2019). https://doi.org/10.5281/zenodo.3721687.

Full text
Abstract:
In this paper, a mathematical investigation is carried out on the effect of misclassification due to measurement error on the control chart for the zero-truncated binomial distribution (ZTBD). Analytical formulas are obtained for calculating the probabilities of misclassification due to measurement error. The connection between the apparent fraction defective (AFD) and the true fraction defective (TFD) is used to study the power of the control chart. Expressions for the average run length (ARL) and the OC curve are also obtained.
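As context for the AFD/TFD connection mentioned in the abstract, one standard way to write the relationship under misclassification is shown below; the notation (e1, e2) is ours and may differ from the authors' parameterization.

% Let p be the true fraction defective (TFD), e1 the probability that a defective
% item is classified as conforming, and e2 the probability that a conforming item
% is classified as defective. The apparent fraction defective (AFD) is then
\[
  p_a = p\,(1 - e_1) + (1 - p)\,e_2 ,
\]
% and a zero-truncated binomial count of observed defectives has pmf
\[
  P(X = k) = \frac{\binom{n}{k} p_a^{\,k} (1 - p_a)^{\,n-k}}{1 - (1 - p_a)^{n}},
  \qquad k = 1, \dots, n .
\]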
APA, Harvard, Vancouver, ISO, and other styles
34

Vallamkonda Sai Kiran. "Performance Analysis of Slotted Waveguide Antenna for High Power Microwave Applications." International Journal For Multidisciplinary Research 5, no. 4 (2023). http://dx.doi.org/10.36948/ijfmr.2023.v05i04.4536.

Full text
Abstract:
Slotted-waveguide antenna arrays find wide application in communication and radar systems that require narrow-beam or shaped-beam radiation patterns, especially when high power, light weight, and limited scan areas are priorities. The placement of slots in the waveguide plays a crucial role in shaping the side lobes to desired levels. An iterative procedure for calculating the slot positions of a slotted-waveguide array consisting of an arbitrary number of slots is presented in this paper. The offsets of the slots' positions with respect to the waveguide centerline, which determine the side lobe level, are then obtained from well-known mathematical distributions such as Chebyshev, Taylor, and Binomial. The results are simulated in HFSS software.
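For the binomial option mentioned above, the excitation amplitudes of an N-element array are conventionally taken proportional to the binomial coefficients C(N-1, k), which (for half-wavelength spacing) suppresses side lobes at the cost of a wider main beam. The sketch below only computes these normalized weights; it does not reproduce the paper's iterative slot-placement procedure, and the function name is ours.

# Normalized binomial amplitude weights for an N-element array (illustrative only;
# mapping the weights to physical slot offsets is a separate design step).
from math import comb

def binomial_weights(n_elements):
    w = [comb(n_elements - 1, k) for k in range(n_elements)]
    peak = max(w)
    return [v / peak for v in w]

print(binomial_weights(8))   # for N = 8: [1/35, 7/35, 21/35, 1, 1, 21/35, 7/35, 1/35]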
APA, Harvard, Vancouver, ISO, and other styles
35

Li, Xiaohong, Dongfeng Wu, Nigel G. F. Cooper, and Shesh N. Rai. "Sample size calculations for the differential expression analysis of RNA-seq data using a negative binomial regression model." Statistical Applications in Genetics and Molecular Biology 18, no. 1 (2019). http://dx.doi.org/10.1515/sagmb-2018-0021.

Full text
Abstract:
Abstract High throughput RNA sequencing (RNA-seq) technology is increasingly used in disease-related biomarker studies. The negative binomial distribution has become the popular choice for modeling gene read counts in RNA-seq data because the read counts are over-dispersed. In this study, we propose two explicit sample size calculation methods for RNA-seq data using a negative binomial regression model. To derive these new sample size formulas, the common dispersion parameter and the size factor as an offset via a natural logarithm link function are incorporated. A two-sided Wald test statistic derived from the coefficient parameter is used for testing a single gene at a nominal significance level of 0.05 and multiple genes at a false discovery rate of 0.05. The variance for the Wald test is computed from the variance-covariance matrix with the parameters estimated from the maximum likelihood estimates under the unrestricted and constrained scenarios. The performance of our new formulas, and a side-by-side comparison with three existing methods based on a Wald test, a likelihood ratio test or an exact test, are evaluated via simulation studies. Since the other methods are much more computationally intensive, we recommend our M1 method for quick and direct estimation of sample sizes in an experimental design. Finally, we illustrate sample size estimation using an existing breast cancer RNA-seq dataset.
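The flavor of such a calculation can be conveyed with a simple delta-method approximation for a two-group comparison. This is a generic Wald-type formula in the same spirit as, but not identical to, the authors' M1 method; the function name nb_sample_size and the example numbers (mean count, dispersion, fold change) are our own.

# Generic Wald-type sample size approximation for a two-group negative binomial
# comparison (delta method). Not the paper's M1 formula; illustrative only.
from math import log
from scipy.stats import norm

def nb_sample_size(mu0, fold_change, dispersion, alpha=0.05, power=0.80):
    mu1 = mu0 * fold_change
    # Var(log of the group mean) per sample is roughly 1/mu + dispersion for an NB(mu, phi) count.
    var_beta = (1.0 / mu0 + dispersion) + (1.0 / mu1 + dispersion)
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return (z ** 2) * var_beta / log(fold_change) ** 2   # samples per group

print(nb_sample_size(mu0=20, fold_change=2.0, dispersion=0.1))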
APA, Harvard, Vancouver, ISO, and other styles
36

Drăgoi, Vlad-Florin, Brice Colombier, Pierre-Louis Cayrel, and Vincent Grosso. "Integer syndrome decoding in the presence of noise." Cryptography and Communications, May 10, 2024. http://dx.doi.org/10.1007/s12095-024-00712-3.

Full text
Abstract:
Abstract Code-based cryptography received attention after the NIST started the post-quantum cryptography standardization process in 2016. A central NP-hard problem is the binary syndrome decoding problem, on which the security of many code-based cryptosystems lies. The best known methods to solve this problem all stem from the information-set decoding strategy, first introduced by Prange in 1962. A recent line of work considers augmented versions of this strategy, with hints typically provided by side-channel information. In this work, we consider the integer syndrome decoding problem, where the integer syndrome is available but might be noisy. We study how the performance of the decoder is affected by the noise. First, we identify the noise model as being close to a binomial distribution centered at zero. Second, we model the probability of success of the ISD-score decoder in the presence of binomial noise. Third, we demonstrate that with high probability our algorithm finds the solution as long as the noise parameter d is linear in t (the Hamming weight of the solution) and t is sub-linear in the code length. We provide experimental results on cryptographic parameters for the BIKE and Classic McEliece cryptosystems, which are both candidates for the fourth round of the NIST standardization process.
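As a toy illustration of the setting described above (and nothing more: the parameters, the noise construction and the code below are our own simplification, not the paper's ISD-score decoder), one can form a syndrome over the integers and perturb it with binomial noise centered at zero.

# Toy setup: integer (un-reduced) syndrome of a low-weight error vector, perturbed
# by zero-centered binomial noise. Illustrative only; not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(1)
n, r, t, d = 64, 32, 4, 6          # code length, syndrome length, error weight, noise parameter

H = rng.integers(0, 2, size=(r, n))            # random binary parity-check-like matrix
e = np.zeros(n, dtype=int)
e[rng.choice(n, size=t, replace=False)] = 1    # error vector of Hamming weight t

s_int = H @ e                                   # integer syndrome (no reduction mod 2)
noise = rng.binomial(d, 0.5, size=r) - rng.binomial(d, 0.5, size=r)  # centered at zero
s_noisy = s_int + noise                         # what a noisy side channel might provide

print(s_int[:8], s_noisy[:8])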
APA, Harvard, Vancouver, ISO, and other styles
37

Shirozhan, Masoumeh, Naushad A. Mamode Khan, and Célestin C. Kokonendji. "The balanced discrete triplet Lindley model and its INAR(1) extension: properties and COVID-19 applications." International Journal of Biostatistics, November 24, 2022. http://dx.doi.org/10.1515/ijb-2022-0001.

Full text
Abstract:
Abstract This paper proposes a new flexible discrete triplet Lindley model that is constructed from the balanced discretization principle of the extended Lindley distribution. This model has several appealing statistical properties in terms of providing exact and closed-form moment expressions and handling all forms of dispersion. Because of these properties, this paper further explores the use of the discrete triplet Lindley model as an innovation distribution in the simple integer-valued autoregressive process (INAR(1)). This subsequently allows for the modeling of count time series observations. In this context, a novel INAR(1) process is developed under mixed binomial and Pegram thinning operators. The model parameters of the INAR(1) process are estimated using the conditional maximum likelihood and Yule-Walker approaches. Some Monte Carlo simulation experiments are carried out to assess the consistency of the estimators under the two estimation approaches. Interestingly, the proposed INAR(1) process is applied to analyze the COVID-19 cases and death series of different countries, where it yields reliable parameter estimates and suitable forecasts via the modified Sieve bootstrap technique. On the other hand, the new INAR(1) with discrete triplet Lindley innovations competes comfortably with other established INAR(1)s in the literature.
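To fix ideas about the INAR(1) construction referred to above, the sketch below simulates a plain binomial-thinning INAR(1) process. For simplicity it uses Poisson innovations as a stand-in; the paper's discrete triplet Lindley innovations and the mixed binomial/Pegram thinning are not reproduced here.

# Plain binomial-thinning INAR(1): X_t = alpha o X_{t-1} + eps_t,
# where alpha o X ~ Binomial(X, alpha). Poisson innovations are a stand-in for
# the paper's discrete triplet Lindley innovations.
import numpy as np

def simulate_inar1(T, alpha=0.5, lam=2.0, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros(T, dtype=int)
    for t in range(1, T):
        survivors = rng.binomial(x[t - 1], alpha)   # binomial thinning of the previous count
        x[t] = survivors + rng.poisson(lam)         # add new arrivals (innovation term)
    return x

series = simulate_inar1(200)
print(series[:20], series.mean())   # stationary mean should be near lam / (1 - alpha) = 4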
APA, Harvard, Vancouver, ISO, and other styles
38

"Biological studies in the vicinity of a shallow-sea tidal mixing front VI. A general statistical study." Philosophical Transactions of the Royal Society of London. B, Biological Sciences 310, no. 1146 (1985): 521–54. http://dx.doi.org/10.1098/rstb.1985.0130.

Full text
Abstract:
Studies of the distributional properties, the interrelationships and comparisons in time and space of 21 biological, chemical and physical variables that characterize the activities in a shallow sea tidal mixing front in the Western Irish sea are presented. They represent an attempt at describing and interpreting biologically this complex ecosystem as a whole and particularly to assess and compare the intensity of biological and biochemical activities and differences in distribution of organisms between the upper and lower stratified and mixed water columns on the two sides of the front at different times of the year. The analyses used were mainly parametric methods, but non-parametric analyses were found to be appropriate in a few cases. The log-normal distribution tended to fit better than the normal for most variables for each water mass within each cruise. Also different discrete distributions were fitted by the method of maximum likelihood to the bacterioplankton and zooplankton data and the best fits in both cases and for each cruise, where adequately large data was available, turned out to be the negative binomial distribution. Some of the associations between the variables for separate water masses in each cruise, described by the non-parametric Spearman rank correlations, had meaningful biological interpretations while others did not. Also structural simplification through reducing the dimensionality (15 variables) of the system produced, by using principal component analysis on logarithmically transformed data, a few components that persisted throughout the cruises in the upper stratified water and could be interpreted in ecological terms; notably components showing the effect of physical stratification on biological activity, the depletion of nitrogenous compounds by plankton and the possible effect of protozooplankton grazing. Comparisons of the levels of biological and biochemical activities determined by parametric, with data logarithmically transformed, and non-parametric one-way analyses of variance (ANOVA) showed significant differences between these levels particularly glucose and urea uptake rates, consistently in the three water masses and in all cruises. The dominant feature was that the upper stratified water was different from the other two water bodies and that the two methods of analysis produced similar results. The relative importance of the biological variables to differentiate the water masses was assessed by using stepwise discriminant analysis, on data transformed logarithmically, and this confirmed, to a large extent, the results obtained from the analysis of variance comparisons. Differences between the three lifestages of zooplankton numbers were ascertained by using a randomized block ANOVA on logarithmically transformed data which indicated that these differences were significant in all but one of the eight cruises where data was available. Significantly greater abundance of zooplankton haul numbers were found at the front and the stratified side compared with the mixed side. The diurnal variation of zooplankton numbers modelled by multiple regression analysis with data again transformed logarithmically showed that the numbers depended on depth in the stratified water column but on time in the mixed water column. 
The analyses overall showed that the upper stratified water is an area of intense biological activity especially in the vicinity of the front and on the whole has many different characteristics from the rest of the water body and that most variables are closely linked, particularly during stable stratification in the summer.
APA, Harvard, Vancouver, ISO, and other styles
39

Njuki, Humphrey M., Elvis M. Kiano, and Lucy C. Rono. "Internalization of External Cost in the Thermal Power Generation on Social Welfare Maximization." Journal of Economics, Finance And Management Studies 07, no. 04 (2024). http://dx.doi.org/10.47191/jefms/v7-i4-38.

Full text
Abstract:
For decades, Kenya has incorporated thermal power technology into its grid to generate electrical energy using fossil fuels such as petroleum, natural gas, and coal. The burning of fossil fuels has become a major source of air pollutants and is associated with several undesirable side effects on the environment and human health. However, the impact of pollutants on environmental sustainability and public welfare has yet to be evaluated. Therefore, the purpose of this study is to evaluate the external cost of electricity generated by thermal power plants in Kenya. Both survey data and secondary data were used. The analysis was conducted using externality valuation and welfare maximization approaches, and the research hypotheses were tested using a negative binomial regression model. The annual external cost (2022 US$) was determined to be $1,333,904,970.76, with the following distribution: environmental at $993,488,336.26, public health at $86,760,038.01, and socio-economic at $253,656,596.49. Similarly, the marginal social cost of thermal power generation (2022 US cents) was determined to be 1.22 cents/kWh, with the following distribution: marginal private cost (MPC) of 0.01 cents/kWh and marginal external cost (MEC) of 1.21 cents/kWh. The established marginal social cost (MSC = MPC + MEC) was thus 1.22 cents/kWh. The MSC is therefore significantly greater than the established social marginal benefit (SMB) of 0.089 cents/kWh; hence, we conclude that the burden of social welfare loss is highly significant, making thermal power a non-sustainable and uneconomic energy source.
APA, Harvard, Vancouver, ISO, and other styles
40

Aarø, Leif Edvard, Lamprini Veneti, Øystein Vedaa, Otto R. F. Smith, Birgitte Freiesleben De Blasio, and Bjarne Robberstad. "Visiting crowded places during the COVID-19 pandemic. A panel study among adult Norwegians." Frontiers in Public Health 10 (December 15, 2022). http://dx.doi.org/10.3389/fpubh.2022.1076090.

Full text
Abstract:
Non-pharmaceutical interventions, including promotion of social distancing, have been applied extensively in managing the COVID-19 pandemic. Understanding cognitive and psychological factors regulating precautionary behavior is important for future management. The present study examines the importance of selected factors as predictors of having visited or intended to visit crowded places. Six online questionnaire-based waves of data collection were conducted in April–October 2020 in a Norwegian panel (≥18 years). Sample size at Wave 1 was 1,400. In the present study, “Visited or intended to visit crowded places” for different types of locations were the dependent variables. Predictors included the following categories of items: Perceived response effectiveness, Self-efficacy, Vulnerability, Facilitating factors and Barriers. Data were analyzed with frequency and percentage distributions, descriptives, correlations, principal components analysis, negative binomial-, binary logistic-, and multiple linear regression, and cross-lagged panel models. Analyses of dimensionality revealed that a distinction had to be made between Grocery stores, a location visited by most, and locations visited by few (e.g., “Pub,” “Restaurants,” “Sports event”). We merged the latter set of variables into a countscore denoted as “Crowded places.” On the predictor side, 25 items were reduced to eight meanscores. Analyses of data from Wave 1 revealed a rather strong prediction of “Crowded places” and weaker associations with “Supermarket or other store for food.” Across waves, in multiple negative binomial regression models, three meanscore predictors turned out to be consistently associated with “Crowded places.” These include “Response effectiveness of individual action,” “Self-efficacy with regard to avoiding people,” and “Barriers.” In a prospective cross-lagged model, a combined Response effectiveness and Self-efficacy score (Cognition) predicted behavior (“Visited or intended to visit crowded places”) prospectively and vice versa. The results of this study suggest some potential to reduce people's visits to crowded locations during the pandemic through health education and behavior change approaches that focus on strengthening individuals' perceived response effectiveness and self-efficacy.
APA, Harvard, Vancouver, ISO, and other styles
41

Wetsch, Wolfgang A., Hannes M. Ecker, Alexander Scheu, Rebecca Roth, Bernd W. Böttiger, and Christopher Plata. "Video-assisted cardiopulmonary resuscitation: Does the camera perspective matter? A randomized, controlled simulation trial." Journal of Telemedicine and Telecare, June 25, 2021, 1357633X2110284. http://dx.doi.org/10.1177/1357633x211028490.

Full text
Abstract:
Background Dispatcher assistance can help to save lives during layperson cardiopulmonary resuscitation for cardiac arrest. The aim of this study was to investigate the influence of different camera positions on the evaluation of cardiopulmonary resuscitation performance during video-assisted cardiopulmonary resuscitation. Methods For this randomized, controlled simulation trial, seven video sequences of cardiopulmonary resuscitation performance were recorded from three different camera positions: side, foot and head position. Video sequences showed either correct cardiopulmonary resuscitation performance or one of six typical errors: low or high compression rate, superficial or increased compression depth, wrong hand position, or incomplete release. Video sequences with different cardiopulmonary resuscitation performances and camera positions were randomly combined such that each evaluator was presented with seven individual combinations of cardiopulmonary resuscitation and camera position and evaluated each cardiopulmonary resuscitation performance once. A total of 46 paramedics and 47 emergency physicians evaluated seven video sequences of cardiopulmonary resuscitation performance from different camera positions. The primary hypothesis was that there are differences in the accuracy of correct assessment/error recognition depending on camera perspective. Generalized linear multi-level analyses assuming a binomial distribution and a logit link were employed to account for the dependency between each evaluator's seven ratings. Results Of 651 video sequences, cardiopulmonary resuscitation performance was evaluable in 96.8% and correctly evaluated in 74.5% over all camera positions. Cardiopulmonary resuscitation performance was classified correctly from a side perspective in 81.3%, from a foot perspective in 68.8% and from a head perspective in 73.6%, revealing a significant difference in error recognition depending on the camera perspective (p = .01). Correct cardiopulmonary resuscitation was mistakenly evaluated to be false in 46.2% over all perspectives. Conclusions Participants were able to recognize significantly more mistakes when the camera was located on the side opposite the cardiopulmonary resuscitation provider. Foot position should be avoided in order to give the dispatcher the best possible view for evaluating cardiopulmonary resuscitation quality.
APA, Harvard, Vancouver, ISO, and other styles
42

Hegde, Sharika, Hoseb Abkarian, and Hani S. Mahmassani. "Characterizing Ride-Hailing Driver Attrition and Supply in the City of Chicago Through the COVID-19 Pandemic." Transportation Research Record: Journal of the Transportation Research Board, August 24, 2022, 036119812211175. http://dx.doi.org/10.1177/03611981221117542.

Full text
Abstract:
The flexible nature of on-demand ride services provided by transportation network companies (TNC) has resulted in unique supply-side challenges as the industry deals with the COVID-19 pandemic. Early during the pandemic, there was a 70% decrease in the number of drivers accepting trips on TNC platforms, as individual drivers chose to reduce their risk of viral infection and abide by social distancing recommendations. Given the two-sided market nature of TNCs, the decrease was also the effect of reduced rider demand creating a less desirable driver experience. This paper characterizes and quantifies this change in supply as it relates to driver residency, tenure, attrition, and the number of trips provided. The distribution of drivers accepting trips shifted slightly toward the lower income and higher minority areas of Chicago. Using survival analysis methods, we find that retention among drivers who started in the early months of the pandemic was significantly lower than in reference years, after six months of driving. The results of the negative binomial regression show that drivers on a single TNC platform provided 20% less trips than drivers on multiple platforms. This difference increases to 30% during the pandemic. Additionally, new drivers joined multiple apps during COVID-19, likely to serve more trips and secure higher income. The results of this paper can be used to understand and target driver retention to accelerate the recovery of the TNC industry.
APA, Harvard, Vancouver, ISO, and other styles
43

Smyth, Bobby P., Aoife Davey, and Eamon Keenan. "Deterrence effect of penalties upon adolescent cannabis use." Irish Journal of Psychological Medicine, February 16, 2023, 1–6. http://dx.doi.org/10.1017/ipm.2023.5.

Full text
Abstract:
Abstract Objective: Penalties are used in an effort to curtail drug use by citizens in most societies. There are growing calls for a reduction or elimination of such penalties. Deterrence theory suggests that use should increase if penalties reduce and vice versa. We sought to examine the relationship between changes to penalties for drug possession and adolescent cannabis use. Method: Ten instances of penalty change occurred in Europe between 2000 and 2014, seven of which involved penalty reduction and three involved penalty increase. We conducted a secondary analysis of a series of cross-sectional surveys of 15–16-year-old school children, the ESPAD surveys, which are conducted every four years. We focused on past month cannabis use. We anticipated that an eight-year time span before and after each penalty change would yield two data points on either side of the change. A simple trend line was fitted to the data points for each country. Results: In eight cases, the trend slope in past month cannabis use was in the direction predicted by deterrence theory, the two exceptions being the UK policy changes. Using the principles of the binomial distribution, the likelihood of this happening by chance is 56/1024 ≈ 0.05. The median change in the baseline prevalence rate was by 21%. Conclusions: The science seems far from settled on this issue. There remains a distinct possibility that reducing penalties could contribute to small increases in adolescent cannabis use and consequently increase cannabis-related harms. This possibility should be considered in any political decision-making influencing drug policy changes.
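The 56/1024 figure quoted above is the upper tail of a Binomial(10, 1/2) distribution, i.e. the probability of seeing at least 8 of 10 trend slopes in the predicted direction if each direction were a coin flip:

\[
  P(X \ge 8 \mid n = 10,\; p = \tfrac{1}{2})
  = \frac{\binom{10}{8} + \binom{10}{9} + \binom{10}{10}}{2^{10}}
  = \frac{45 + 10 + 1}{1024}
  = \frac{56}{1024} \approx 0.055 .
\]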
APA, Harvard, Vancouver, ISO, and other styles
44

Schweizer, Marin, A. M. Racila, Anitha Vijayan, et al. "P-315. Safety and Effectiveness of Nasal Povidone-Iodine Decolonization Among Patients on Hemodialysis: A Multicenter Stepped-Wedge Cluster Randomized Trial." Open Forum Infectious Diseases 12, Supplement_1 (2025). https://doi.org/10.1093/ofid/ofae631.518.

Full text
Abstract:
Abstract Background The SHEA/IDSA/APIC Strategies to Prevent MRSA Transmission and Infection Practice Recommendations advised that facilities consider decolonizing patients on hemodialysis. We implemented a nasal decolonization intervention in which patients self-administered povidone-iodine (PVI) at each dialysis session. We aimed to assess intervention safety and effectiveness. Methods We performed a stepped wedge cluster randomized trial at 16 outpatient hemodialysis units affiliated with 5 academic medical centers between 2020 and 2023. Adverse events were self-reported at 1 and 6 months. While the analysis was at the hemodialysis unit level, patients were required to give verbal informed consent for PVI use. Outcomes included National Healthcare Safety Network reportable dialysis events aggregated at the hemodialysis unit level, including bloodstream infections (BSI), access-related BSI, and central venous catheter (CVC) BSI for all pathogens and for Staphylococcus aureus (SA). The primary outcome was SA BSI. A generalized linear mixed model with a negative binomial distribution, log link function, an offset for person-months, and a random intercept for each hemodialysis unit was performed. Results Overall, 1,351 patients received hemodialysis at these centers each month. Of those, 362 patients verbally consented to use PVI. Among these, 3.9% reported side effects: nasal drip, congestion or burning/stinging, unpleasant smell, headache, or minor nose bleed. A reported side effect, ‘yellow tears’, was assessed via chart review and resolved by discontinuing PVI. There were no statistically significant associations between unit-level randomization to the PVI intervention and infections. However, there was a non-statistically significant trend toward a protective association between unit-level randomization to PVI and SA infections, particularly SA CVC BSI (Table). Conclusion Long-term nasal decolonization with PVI was safe with few adverse events. Unit-level randomization to the PVI intervention did not significantly decrease unit-level infections. Given low patient enrollment and added infection prevention interventions due to COVID-19, the study could not determine if PVI decolonization could decrease BSI rates in the hemodialysis setting. Disclosures Marin Schweizer, PhD, 3M: Grant/Research Support Anitha Vijayan, MD, Baxter: Honoraria|NxStage: Advisor/Consultant|Qanta: Honoraria David A. Pegues, MD, DaVita/Total Renal Care: Advisor/Consultant Daniel Diekema, MD, Affinity Biosensors: Grant/Research Support|bioMerieux, Inc: Grant/Research Support Loreen Herwaldt, MD, 3M: Grant/Research Support|PDI: Grant/Research Support
APA, Harvard, Vancouver, ISO, and other styles
45

Sabei, Leandro, Thiago Bernardino, Marisol Parada Sarmiento, et al. "Life experiences of boars can shape the survival, aggression, and nociception responses of their offspring." Frontiers in Animal Science 4 (April 11, 2023). http://dx.doi.org/10.3389/fanim.2023.1142628.

Full text
Abstract:
Introduction: Boars are often housed in stressful environments on commercial farms, experiencing poor welfare. These conditions may cause epigenetic changes in the boars' gametes, which could potentially be transmitted to their offspring. We aimed to investigate the effect of three different boars housing environments on the survival, aggression, and nociceptive responses of their offspring. Methods: For four weeks, 18 boars were housed in three different systems: crates (C; n=6), pens (P; n=6), and enriched pens (E; n=6). The environmental enrichment was provided twice daily (brushing, shower, and hay). Thirteen gilts were housed in outdoor paddocks and inseminated with pooled semen from the boars kept in the three treatments. We evaluated the number of live-born, stillborn, and weaned piglets, sex, and mortality rate. Weaning was performed at 29 days of age. For each piglet, six body photographs were taken for five days postweaning to measure skin lesions (n=138). On Day 34, the nociceptive pressure threshold was assessed using an analgesimeter (n=138). DNA paternity tests were carried out at the end of the study (n=181). A generalized linear model with a negative binomial distribution was used to compare the number of live-born/weaned piglets and skin lesions among the treatment groups. We used a Kruskal‒Wallis test to analyze nociceptive data. Results: More live-born and weaned piglets were fathered from boars kept in the E group than the P group (p=0.002; p=0.001, respectively). A trend was observed in the number of skin lesions on the left side of piglets (P<C; p=0.053). For nociceptive assessments, offspring from P boars showed less right leg withdrawal than piglets from E and C boars (p=0.008); the P group had a higher average nociceptive value than the C group (p=0.002). All treatments differed in the region adjacent to the tail for nociceptive pressure threshold (P>E>C; p<0.001). Discussion and conclusion: Our results suggest that providing an enriched environment for boars can increase the number of live-born and weaned piglets. Moreover, the boars housing conditions can influence nociceptive threshold in their offspring. Further research must be performed to understand the underlying mechanism associated with these changes using epigenetics protocols and measuring physiological indicators and other molecular markers in semen and/or sperm cell samples.
APA, Harvard, Vancouver, ISO, and other styles
46

Sahin, Ibrahim Halil, Ronan Wenhan Hsieh, Vikram Gorantla, et al. "Combining low-dose regorafenib with pembrolizumab for patients with MSI-H colorectal cancer: REGPEM-CRC-01." Journal of Clinical Oncology 43, no. 4_suppl (2025). https://doi.org/10.1200/jco.2025.43.4_suppl.tps313.

Full text
Abstract:
TPS313 Background: Currently, pembrolizumab is one of the front-line therapies for patients with MSI-H CRC. However, approximately 40% of patients who received pembrolizumab experienced disease progression early in the course of disease (KEYNOTE 177). Therefore, there is still an unmet need to enhance the efficacy of checkpoint inhibitors in MSI-H CRC. MSI-H CRC has a higher level of VEGF expression in blood than its MSS counterpart (Hansen et al. Colorectal Dis. 2011). Consistently, exploratory analyses of CALGB-80405 and the PARADIGM trial showed that patients with MSI-H CRC were more likely to benefit from anti-VEGF therapy than anti-EGFR therapy regardless of the side of the tumor. NSABP C-08 also suggested that anti-VEGF therapy may have biological activity even as adjuvant therapy for patients with MSI-H colon cancer. Regorafenib is a potent VEGF inhibitor, with preclinical evidence showing its immune-modulatory effect in the tumor microenvironment. In this trial, we hypothesize that adding low-dose regorafenib to pembrolizumab may induce synergistic activity beyond their independent clinical efficacy and create deep and durable responses for patients with MSI-H CRC. Methods: In the lead arm of this prospective randomized study, 22 patients will be enrolled through the Hoosier Cancer Research Network (HCRN-GI23-643). Patients with treatment-naïve MSI-H CRC will be enrolled in this front-line trial. One cycle of pembrolizumab and up to 3 cycles of chemotherapy prior to determination of MMR-D/MSI-H are allowed. Patients will receive regorafenib 60 mg daily in combination with pembrolizumab 200 mg IV in cycle 1, followed by regorafenib 90 mg in subsequent cycles to improve treatment tolerance. The primary outcome is ORR, defined as the percentage of partial or complete responses to the treatment within 12 months. ORR will be measured using RECIST 1.1 criteria. A formal one-sided hypothesis test will be conducted for futility, assuming that we will reject the null hypothesis of a target ORR only if we have strong evidence. In this study, we assume a null hypothesis that ORR is 0.60, which would reflect significant clinical improvement over the current standard of ORR = 0.43 from KEYNOTE 177. The alternative hypothesis is that ORR is less than 0.60. For the lead-in phase of the study, the emphasis is on controlling the Type I error to be small, approximately 0.05. The test statistic will be the number of responses in the 22 patients, which we assume to follow a binomial distribution. Clinical trial information: NCT06006923.
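For the futility test sketched above, the corresponding exact binomial boundary can be computed directly: under H0: ORR = 0.60 with 22 evaluable patients, one finds the largest number of responses k such that P(X ≤ k) ≤ 0.05. The sketch below is our own illustration of that arithmetic, not the trial's statistical analysis plan.

# Exact binomial futility boundary for n = 22, H0: ORR = 0.60, one-sided alpha ~0.05.
# Illustrative only; not the trial's actual analysis code.
from scipy.stats import binom

n, p0, alpha = 22, 0.60, 0.05
for k in range(n, -1, -1):
    tail = binom.cdf(k, n, p0)          # P(X <= k responses) under H0
    if tail <= alpha:
        print(f"declare futility if <= {k} responses (one-sided tail {tail:.3f})")
        break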
APA, Harvard, Vancouver, ISO, and other styles
47

"Desarrollo sostenible: Desde la mirada de preservación del medio ambiente colombiano/ Sustainable development: From the perspective of preservation of the Colombian environment." Revista de Ciencias Sociales, 2020, 293–307. http://dx.doi.org/10.31876/rcs.v26i4.34664.

Full text
Abstract:
The present disquisition is oriented under a positivist approach, with a documentary and descriptive methodological perspective. The objective of the research is to offer theoretical reflections on the environment and on economic growth in relation to sustainable development worldwide, and specifically in the Colombian State. Sustainability maintains before society the position of safeguarding a better quality of life for individuals, one that contributes to economic, political and social growth, to a real distribution of wealth, and to the expansion of industrialization in developed and developing countries alike. For Colombia, in conclusion, the environmental problem has been of great interest; it shows a certain level of inconsistency with economic development and has revealed a lack of sensitivity on the part of companies in reconciling their activity with sustainable development, even though for decades several countries of the world have tried to adjust to this positive undertaking with nature. Since 2015, with the 2030 Agenda oriented toward the 17 Sustainable Development Goals, member states have been encouraged to pay more attention to the situation of the environment as an urgent and inseparable counterpart of economic growth. The motto is to protect the environment, for the benefit of living beings.
48

Francis, Kate L., Christopher J. D. McKinlay, C. Omar F. Kamlin, et al. "Intratracheal budesonide mixed with surfactant to increase survival free of bronchopulmonary dysplasia in extremely preterm infants: statistical analysis plan for the international, multicenter, randomized PLUSS trial." Trials 24, no. 1 (2023). http://dx.doi.org/10.1186/s13063-023-07650-0.

Abstract:
Background: Bronchopulmonary dysplasia (BPD), an inflammatory-mediated chronic lung disease, is common in extremely preterm infants born before 28 weeks’ gestation and is associated with an increased risk of adverse neurodevelopmental and respiratory outcomes in childhood. Effective and safe prophylactic therapies for BPD are urgently required. Systemic corticosteroids reduce rates of BPD in the short term but are associated with poorer neurodevelopmental outcomes if given to ventilated infants in the first week after birth. Intratracheal administration of corticosteroid admixed with exogenous surfactant could overcome these concerns by minimizing systemic sequelae. Several small, randomized trials have found intratracheal budesonide in a surfactant vehicle to be a promising therapy to increase survival free of BPD. The primary objective of the PLUSS trial is to determine whether intratracheal budesonide mixed with surfactant increases survival free of BPD at 36 weeks’ postmenstrual age (PMA) in extremely preterm infants born before 28 weeks’ gestation.

Methods: An international, multicenter, double-blinded, randomized trial of intratracheal budesonide (a corticosteroid) mixed with surfactant for extremely preterm infants to increase survival free of BPD at 36 weeks’ PMA (primary outcome). Extremely preterm infants aged < 48 h after birth are eligible if (1) they are mechanically ventilated, or (2) they are receiving non-invasive respiratory support and there is a clinical decision to treat with surfactant. The intervention is budesonide (0.25 mg/kg) mixed with poractant alfa (200 mg/kg first intervention, 100 mg/kg if second intervention), administered intratracheally via an endotracheal tube or thin catheter. The comparator is poractant alfa alone (at the same doses). Secondary outcomes include the components of the primary outcome (death, BPD prior to or at 36 weeks’ PMA) and potential systemic side effects of corticosteroids. Longer-term outcomes will be published separately, and include cost-effectiveness, early childhood health until 2 years of age, and neurodevelopmental outcomes at 2 years of age (corrected for prematurity).

Statistical analysis plan: A sample size of 1038 infants (519 in each group) is required to provide 90% power to detect a relative increase in survival free of BPD of 20% (an absolute increase of 10%), from the anticipated event rate of 50% in the control arm to 60% in the intervention (budesonide) arm, alpha error 0.05. To allow for up to 2% of study withdrawals or losses to follow-up, PLUSS aimed to enroll a total of 1060 infants (530 in each arm). The binary primary outcome will be reported as the number and percentage of infants who were alive without BPD at 36 weeks’ PMA for each randomization group. To estimate the difference in risk (with 95% CI) between the treatment and control arms, binary regression (a generalized linear multivariable model with an identity link function and binomial distribution) will be used. Along with the primary outcome, the individual components of the primary outcome (death, and physiological BPD at 36 weeks’ PMA) will be reported by randomization group and, again, binary regression will be used to estimate the risk difference between the two treatment groups for survival and physiological BPD at 36 weeks’ PMA.
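As a rough cross-check of the sample size quoted above, the sketch below applies the standard pooled-variance normal approximation for comparing two independent proportions (one common choice; the abstract does not state which formula the statistical analysis plan actually uses) with a control rate of 0.50, an intervention rate of 0.60, two-sided alpha 0.05, and 90% power. With these inputs the formula yields approximately 519 infants per arm, i.e. 1038 in total, consistent with the figures reported in the plan.

```python
# Hypothetical cross-check of the quoted sample size (519 per arm, 1038 total).
# Assumption: the pooled-variance normal-approximation formula for two proportions;
# the trial's SAP may have used a different method (e.g. arcsine-based or exact).
from math import ceil, sqrt

from scipy.stats import norm


def n_per_group(p_ctrl: float, p_trt: float, alpha: float = 0.05, power: float = 0.90) -> int:
    """Per-arm sample size to detect p_ctrl vs p_trt (two-sided alpha, given power)."""
    z_a = norm.ppf(1 - alpha / 2)   # two-sided critical value
    z_b = norm.ppf(power)           # quantile corresponding to the desired power
    p_bar = (p_ctrl + p_trt) / 2    # pooled proportion under the null
    numerator = (
        z_a * sqrt(2 * p_bar * (1 - p_bar))
        + z_b * sqrt(p_ctrl * (1 - p_ctrl) + p_trt * (1 - p_trt))
    ) ** 2
    return ceil(numerator / (p_trt - p_ctrl) ** 2)


if __name__ == "__main__":
    n = n_per_group(0.50, 0.60)
    print(f"{n} infants per arm, {2 * n} in total")  # approximately 519 and 1038
```

The allowance for withdrawals and losses to follow-up (enrolling 1060 rather than 1038 infants) is a separate inflation applied on top of this calculation.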