Academic literature on the topic 'Percent less-than-threshold'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Percent less-than-threshold.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Percent less-than-threshold"

1

Galster, George C. "Nonlinear and Threshold Effects Related to Neighborhood: Implications for Planning and Policy." Journal of Planning Literature 33, no. 4 (August 14, 2018): 492–508. http://dx.doi.org/10.1177/0885412218793693.

Abstract:
Nonlinear and threshold relationships are commonly manifested in neighborhoods, relating both to the effects of neighborhoods on residents and to the causes of neighborhood change arising from individual mobility and housing investment decisions. These relationships are generated by an amalgam of often reinforcing processes related to socialization, gaming, tolerance, and contagion. The existence of nonlinear and threshold effects holds powerful implications for planners. Scarce public investment resources must be spatially concentrated so that they exceed property owners’ reinvestment thresholds. Poverty deconcentration strategies must seek to replace neighborhoods exceeding 40 percent poverty rates with those that have low (less than 15 percent) poverty rates.
2

Weltman, A., J. Y. Weltman, R. Schurrer, W. S. Evans, J. D. Veldhuis, and A. D. Rogol. "Endurance training amplifies the pulsatile release of growth hormone: effects of training intensity." Journal of Applied Physiology 72, no. 6 (June 1, 1992): 2188–96. http://dx.doi.org/10.1152/jappl.1992.72.6.2188.

Abstract:
The effects of the intensity of run training on the pulsatile release of growth hormone (GH) were investigated in 21 eumenorrheic untrained women. The O2 consumption (VO2) at the lactate threshold (LT); fixed blood lactate concentrations (FBLC) of 2.0, 2.5, and 4.0 mM; peak VO2; maximal VO2; body composition; and pulsatile release of GH were measured. Subjects in both the at-lactate-threshold (at LT, n = 9) and above-lactate-threshold (above LT, n = 7) training groups increased VO2 at LT and at FBLC of 2.0, 2.5, and 4.0 mM, as well as VO2max, after 1 yr of run training. However, the increase observed in the above-LT group was greater than that in the at-LT group (P < 0.05). No change was observed for the control group (n = 5). No among- or within-group differences were observed for body weight, although trends toward reductions in percent body fat (P < 0.06) and fat weight (P < 0.15) were observed in the above-LT group, and both training groups significantly increased fat-free weight (P < 0.05). (Abstract truncated at 250 words.)
3

Ali, M. Shaukat. "Poverty Assessment: Pakistan's Case." Pakistan Development Review 34, no. 1 (March 1, 1995): 43–54. http://dx.doi.org/10.30541/v34i1pp.43-54.

Abstract:
This study is an attempt to determine the poverty line and the incidence of poverty in Pakistan by using data from the latest Household Integrated Economic Survey, 1990-91. The study uses a different approach and methodology from earlier studies on the subject. The approach is that of the "Basic Needs", which defines the poverty line in terms of minimum expenditure on all needs, food as well as non-food. The methodology used in estimating the minimum expenditure on various needs is based on the "Extended Linear Expenditure System (ELES)". For the year under review, the total poverty line was estimated at Rs 374 per capita per month, with the food poverty line at Rs 191. A comparison with the income levels reported in the Survey revealed that roughly 47 percent of the population had an income less than this threshold expenditure on all needs, the shortfall or gap being almost 25 percent. The proportion of the population with an income less than the threshold expenditure on food alone was found to be 10 percent. In certain respects, the results were quite different, quantitatively as well as qualitatively, from those of the earlier studies.
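For readers who want to see the arithmetic behind headcount figures like the 47 percent reported above, here is a minimal Python sketch that computes the share of a population falling below a poverty line and the average shortfall among the poor. The income sample and distribution are hypothetical; only the Rs 374 threshold is taken from the abstract, and the gap measure shown is a generic one, not necessarily the study's definition.

```python
# Illustrative sketch (not the study's ELES estimation): given per capita monthly
# incomes and a poverty line, compute the headcount ratio (percent below the
# threshold) and the average shortfall (poverty gap) among the poor.
import numpy as np

def poverty_measures(incomes, poverty_line):
    incomes = np.asarray(incomes, dtype=float)
    poor = incomes < poverty_line
    headcount_pct = 100.0 * poor.mean()          # percent less than threshold
    if poor.any():
        gap_pct = 100.0 * ((poverty_line - incomes[poor]) / poverty_line).mean()
    else:
        gap_pct = 0.0
    return headcount_pct, gap_pct

# Hypothetical income sample; Rs 374 is the total poverty line reported in the abstract.
rng = np.random.default_rng(0)
sample = rng.lognormal(mean=6.0, sigma=0.5, size=10_000)
print(poverty_measures(sample, poverty_line=374.0))
```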
4

Howard, Kirsten, Sarah J. Lord, Anthony Speer, Robert N. Gibson, Robert Padbury, and Brendon Kearney. "Value of magnetic resonance cholangiopancreatography in the diagnosis of biliary abnormalities in postcholecystectomy patients: A probabilistic cost-effectiveness analysis of diagnostic strategies." International Journal of Technology Assessment in Health Care 22, no. 1 (January 2006): 109–18. http://dx.doi.org/10.1017/s0266462306050902.

Abstract:
Background: Endoscopic retrograde cholangiopancreatography (ERCP) is considered the gold standard for imaging of the biliary tract but is associated with complications. Less invasive imaging techniques, such as magnetic resonance cholangiopancreatography (MRCP), have a much lower complication rate. The accuracy of MRCP is comparable to that of ERCP, and MRCP may be more effective and cost-effective, particularly in cases for which the suspected prevalence of disease is low and further intervention can be avoided. A model was constructed to compare the effectiveness and cost-effectiveness of MRCP and ERCP in patients with a previous history of cholecystectomy presenting with abdominal pain and/or abnormal liver function tests. Methods: Diagnostic accuracy estimates came from a systematic review of MRCP. A decision analytic model was constructed to represent the diagnostic and treatment pathway of this patient group. The model compared the following two diagnostic strategies: (i) MRCP followed by ERCP if positive, with management then based on ERCP; and (ii) ERCP only. Deterministic and probabilistic analyses were used to assess the likelihood of MRCP being cost-effective. Sensitivity analyses examined the impact of prior probabilities of common bile duct stones (CBDS) and test performance characteristics. The outcomes considered were costs, quality-adjusted life years (QALYs), and cost per additional QALY. Results: The deterministic analysis indicated that MRCP was dominant over ERCP. At prior probabilities of CBDS less than 60 percent, MRCP was the less costly initial diagnostic test; above this threshold, ERCP was less costly. Similarly, at probabilities of CBDS less than 68 percent, MRCP was also the more effective strategy (generated more QALYs); above this threshold, ERCP became the more effective strategy. Probabilistic sensitivity analyses indicated that, in this patient group, for which there is a low to moderate probability of CBDS, there was a 59 percent likelihood that MRCP was cost-saving, an 83 percent chance that MRCP was more effective with a higher quality-adjusted survival, and an 83 percent chance that MRCP had a cost-effectiveness ratio more favorable than $50,000 per QALY gained. Conclusions: Costs and cost-effectiveness are dependent upon the prior probability of CBDS. However, probabilistic analysis indicated that, with a high degree of certainty, MRCP was the more effective and cost-effective initial test in postcholecystectomy patients with a low to moderate probability of CBDS.
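The 60 percent threshold described above arises from comparing the expected cost of each strategy as the prior probability of CBDS varies. The sketch below illustrates that comparison with hypothetical test costs and MRCP accuracy values; it is not the study's model, and the crossover it prints depends entirely on those placeholder inputs.

```python
# Minimal sketch of the cost-threshold idea: compare the expected cost of
# "MRCP first, ERCP only if positive" against "ERCP for everyone" as the prior
# probability of common bile duct stones (CBDS) varies. All costs and test
# accuracies are hypothetical placeholders, not the study's inputs.
import numpy as np

C_MRCP, C_ERCP = 400.0, 1200.0          # hypothetical per-test costs
SENS, SPEC = 0.92, 0.94                 # hypothetical MRCP sensitivity/specificity

def expected_cost_mrcp_first(p_cbds):
    p_positive = p_cbds * SENS + (1 - p_cbds) * (1 - SPEC)
    return C_MRCP + p_positive * C_ERCP  # ERCP only follows a positive MRCP

def expected_cost_ercp_only(p_cbds):
    return np.full_like(p_cbds, C_ERCP)  # everyone gets ERCP up front

priors = np.linspace(0.0, 1.0, 1001)
diff = expected_cost_mrcp_first(priors) - expected_cost_ercp_only(priors)
crossover = priors[np.argmax(diff > 0)]  # first prior at which MRCP-first costs more
print(f"MRCP-first is the cheaper strategy below a prior of roughly {crossover:.2f}")
```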
5

Xie, Xuanqian, Lindsey Falk, James M. Brophy, Hong Anh Tu, Jennifer Guo, Olga Gajic-Veljanoski, Nancy Sikich, Irfan A. Dhalla, and Vivian Ng. "A Non-inferiority Framework for Cost-Effectiveness Analysis." International Journal of Technology Assessment in Health Care 35, no. 4 (2019): 291–97. http://dx.doi.org/10.1017/s0266462319000576.

Abstract:
Background: Traditional decision rules have limitations when a new technology is less effective and less costly than a comparator. We propose a new probabilistic decision framework to examine non-inferiority in effectiveness and net monetary benefit (NMB) simultaneously. We illustrate this framework using the example of repetitive transcranial magnetic stimulation (rTMS) and electroconvulsive therapy (ECT) for treatment-resistant depression. Methods: We modeled the quality-adjusted life-years (QALYs) associated with the new intervention (rTMS), an active control (ECT), and a placebo control, and we estimated the fraction of effectiveness preserved by the new intervention through probabilistic sensitivity analysis (PSA). We then assessed the probability of cost-effectiveness using a traditional cost-effectiveness acceptability curve (CEAC) and our new decision-making framework. In our new framework, we considered the new intervention cost-effective in each simulation of the PSA if it preserved at least 75 percent of the effectiveness of the active control (thus demonstrating non-inferiority) and had a positive NMB at a given willingness-to-pay (WTP) threshold. Results: rTMS was less effective (i.e., associated with fewer QALYs) and less costly than ECT. The traditional CEAC approach showed that the probabilities of rTMS being cost-effective were 100 percent, 39 percent, and 14 percent at WTPs of $0, $50,000, and $100,000 per QALY gained, respectively. In the new decision framework, the probabilities of rTMS being cost-effective were reduced to 23 percent, 21 percent, and 13 percent at WTPs of $0, $50,000, and $100,000 per QALY, respectively. Conclusions: This new framework provides a different perspective for decision making with considerations of both non-inferiority and WTP thresholds.
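The decision rule in this abstract is easy to express directly: in each PSA draw, require both non-inferiority (at least 75 percent of the active control's effect over placebo preserved) and a positive incremental net monetary benefit at the chosen WTP. The sketch below applies the rule to simulated placeholder draws, not to the rTMS/ECT model outputs.

```python
# A minimal sketch of the combined non-inferiority + NMB decision rule, under
# assumptions: the simulated QALY and cost distributions below are placeholders.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000
q_placebo = rng.normal(0.60, 0.02, n)   # QALYs, hypothetical
q_active  = rng.normal(0.72, 0.03, n)   # ECT-like comparator
q_new     = rng.normal(0.70, 0.03, n)   # rTMS-like intervention
c_active  = rng.normal(25_000, 2_000, n)
c_new     = rng.normal(15_000, 1_500, n)

def prob_cost_effective(wtp, fraction_preserved=0.75):
    # (a) preserve >= 75% of the active control's effect over placebo
    non_inferior = (q_new - q_placebo) >= fraction_preserved * (q_active - q_placebo)
    # (b) positive incremental NMB versus the active control at this WTP
    nmb_positive = wtp * (q_new - q_active) - (c_new - c_active) > 0
    return float(np.mean(non_inferior & nmb_positive))

for wtp in (0, 50_000, 100_000):
    print(wtp, round(prob_cost_effective(wtp), 3))
```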
6

Niedermayer, Oskar. "Die brandenburgische Landtagswahl vom 1. September 2019: Die SPD schlägt die AfD auf den letzten Metern." Zeitschrift für Parlamentsfragen 51, no. 2 (2020): 285–303. http://dx.doi.org/10.5771/0340-1758-2020-2-285.

Abstract:
After a slow beginning, Brandenburg’s election campaign became dominated by the SPD’s and AfD’s struggle to come in first. This contributed considerably to an increase in turnout to 61.3 percent. Although the SPD won the election with 26.2 percent, it sustained substantial losses because its front-runner Dietmar Woidke was less popular than in 2014 and voters attributed less competence to the party in all relevant policy areas. The CDU could not benefit from this weakness, lost considerably, and dropped back to third place with 15.6 percent. The AfD, which attracted ideologically convinced voters as well as economically, culturally, or socio-politically deprived protest voters, moved up to second place with 23.5 percent. The Greens won 10.8 percent, the Left Party 10.7 percent. The BVB/Freie Wähler remained at exactly 5.0 percent, whereas the FDP failed to overcome the five percent threshold. The exploratory talks to form a new three-party coalition were at first overshadowed by an internal rebellion in the CDU but ended with a coalition of SPD, CDU, and the Greens.
7

Petrou, Stavros, Angela Boland, Kamran Khan, Colin Powell, Ruwanthi Kolamunnage-Dona, John Lowe, Iolo Doull, Kerry Hood, and Paula Williamson. "Economic evaluation of nebulized magnesium sulphate in acute severe asthma in children." International Journal of Technology Assessment in Health Care 30, no. 4 (October 2014): 354–60. http://dx.doi.org/10.1017/s0266462314000440.

Abstract:
Objectives: The aim of this study was to estimate the cost-effectiveness of nebulized magnesium sulphate (MgSO4) in acute asthma in children from the perspective of the UK National Health Service and personal social services. Methods: An economic evaluation was conducted based on evidence from a randomized placebo controlled multi-center trial of nebulized MgSO4 in severe acute asthma in children. Participants comprised 508 children aged 2–16 years presenting to an emergency department or a children's assessment unit with severe acute asthma across thirty hospitals in the United Kingdom. Children were randomly allocated to receive nebulized salbutamol and ipratropium bromide mixed with either 2.5 ml of isotonic MgSO4 or 2.5 ml of isotonic saline on three occasions at 20-min intervals. Cost-effectiveness outcomes were constructed around the Yung Asthma Severity Score (ASS) after 60 min of treatment; whilst cost-utility outcomes were constructed around the quality-adjusted life-year (QALY) metric. The nonparametric bootstrap method was used to present cost-effectiveness acceptability curves at alternative cost-effectiveness thresholds for either: (i) a unit reduction in ASS; or (ii) an additional QALY. Results: MgSO4 had a 75.1 percent probability of being cost-effective at a GBP 1,000 (EUR 1,148) per unit decrement in ASS threshold, an 88.0 percent probability of being more effective (in terms of reducing the ASS) and a 36.6 percent probability of being less costly. MgSO4 also had a 67.6 percent probability of being cost-effective at a GBP 20,000 (EUR 22,957) per QALY gained threshold, an 8.5 percent probability of being more effective (in terms of generating increased QALYs) and a 69.1 percent probability of being less costly. Sensitivity analyses showed that the results of the economic evaluation were particularly sensitive to the methods used for QALY estimation. Conclusions: The probability of cost-effectiveness of nebulized isotonic MgSO4, given as an adjuvant to standard treatment of severe acute asthma in children, is less than 70 percent across accepted cost-effectiveness thresholds for an additional QALY.
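A cost-effectiveness acceptability curve of the kind used above can be built with a nonparametric bootstrap: resample patients within each arm, compute the incremental net monetary benefit at each threshold, and record how often it is positive. The sketch below uses simulated per-patient costs and effects as stand-ins for the trial data and is only an illustration of the general technique.

```python
# Rough sketch of a nonparametric bootstrap cost-effectiveness acceptability
# curve (CEAC). Patient-level costs and effects are simulated placeholders.
import numpy as np

rng = np.random.default_rng(7)
# hypothetical per-patient (cost, effect) for treatment and control arms
cost_t, eff_t = rng.gamma(2.0, 400.0, 250), rng.normal(0.030, 0.010, 250)
cost_c, eff_c = rng.gamma(2.0, 380.0, 250), rng.normal(0.025, 0.010, 250)

def ceac(thresholds, n_boot=2000):
    probs = []
    for wtp in thresholds:
        count = 0
        for _ in range(n_boot):
            it = rng.integers(0, len(cost_t), len(cost_t))   # resample treatment arm
            ic = rng.integers(0, len(cost_c), len(cost_c))   # resample control arm
            d_cost = cost_t[it].mean() - cost_c[ic].mean()
            d_eff = eff_t[it].mean() - eff_c[ic].mean()
            count += (wtp * d_eff - d_cost) > 0              # incremental NMB > 0
        probs.append(count / n_boot)
    return probs

print(ceac([0, 10_000, 20_000]))
```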
8

Deale, O. C., R. C. Wesley, D. Morgan, and B. B. Lerman. "Nature of defibrillation: determinism versus probabilism." American Journal of Physiology-Heart and Circulatory Physiology 259, no. 5 (November 1, 1990): H1544–H1550. http://dx.doi.org/10.1152/ajpheart.1990.259.5.h1544.

Abstract:
The gradual transitions that are found between unsuccessful and successful shock strengths in percent success or dose-response curves suggest that defibrillation is a probabilistic phenomenon. This concept appears to be reinforced by the fact that a frequency distribution is observed in defibrillation threshold data and that a dose-response relationship is also obtained by integration of the frequency distribution. The purpose of this study was to investigate whether a deterministic threshold model (based on experimental results) could produce 1) gradual transitions in dose-response curves, and 2) a threshold frequency distribution for individual subjects. In the experimental phase of the study, a linear deterministic relationship was found between transthoracic threshold current and defibrillation episode number (other variables held constant) in pentobarbital-anesthetized dogs. The correlation coefficient for each dog was between 0.77 and 0.98 (P < 0.01), and both positive and negative slopes were found. Based on these results, threshold current was modeled for computer simulation as a linear function of episode number. The model was thus purely deterministic with no random variability. For each simulated experiment, several parameters were varied: order of shocks (increment, decrement, random order), slope of threshold function, and percent error of the initial threshold. Several hundred computer simulations were performed to determine the effect of varying these parameters. In all cases, threshold-frequency distributions and sigmoidal dose-response curves with gradual transitions were produced. The results of this investigation demonstrate that the apparent probabilistic behavior of defibrillation can be produced by a deterministic relationship.
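The paper's central point, that a deterministic threshold drifting with episode number can still yield a gradual dose-response curve, can be demonstrated in a few lines. The simulation below uses arbitrary parameter values; it mirrors the idea of the computer simulations described in the abstract rather than their exact design.

```python
# Illustrative sketch: a purely deterministic threshold that drifts linearly with
# episode number produces a gradual, sigmoid-like percent-success (dose-response)
# curve when shocks of various strengths are delivered in random order.
import numpy as np

rng = np.random.default_rng(1)
n_episodes = 200
threshold0, slope = 20.0, 0.02                 # amps; threshold drifts with episode number
thresholds = threshold0 + slope * np.arange(n_episodes)

shock_strengths = rng.uniform(16.0, 28.0, n_episodes)  # random order of shock strengths
success = shock_strengths >= thresholds        # deterministic rule: no randomness here

# Percent success as a function of shock strength (binned) = the dose-response curve
bins = np.arange(16.0, 29.0, 2.0)
for lo, hi in zip(bins[:-1], bins[1:]):
    mask = (shock_strengths >= lo) & (shock_strengths < hi)
    if mask.any():
        print(f"{lo:4.0f}-{hi:4.0f} A: {100 * success[mask].mean():5.1f}% success")
```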
9

Niebur, G. L., J. C. Yuen, A. C. Hsia, and T. M. Keaveny. "Convergence Behavior of High-Resolution Finite Element Models of Trabecular Bone." Journal of Biomechanical Engineering 121, no. 6 (December 1, 1999): 629–35. http://dx.doi.org/10.1115/1.2800865.

Abstract:
The convergence behavior of finite element models depends on the size of elements used, the element polynomial order, and on the complexity of the applied loads. For high-resolution models of trabecular bone, changes in architecture and density may also be important. The goal of this study was to investigate the influence of these factors on the convergence behavior of high-resolution models of trabecular bone. Two human vertebral and two bovine tibial trabecular bone specimens were modeled at four resolutions ranging from 20 to 80 μm and subjected to both compressive and shear loading. Results indicated that convergence behavior depended on both loading mode (axial versus shear) and volume fraction of the specimen. Compared to the 20 μm resolution, the differences in apparent Young’s modulus at 40 μm resolution were less than 5 percent for all specimens, and for apparent shear modulus were less than 7 percent. By contrast, differences at 80 μm resolution in apparent modulus were up to 41 percent, depending on the specimen tested and loading mode. Overall, differences in apparent properties were always less than 10 percent when the ratio of mean trabecular thickness to element size was greater than four. Use of higher order elements did not improve the results. Tissue level parameters such as maximum principal strain did not converge. Tissue level strains converged when considered relative to a threshold value, but only if the strains were evaluated at Gauss points rather than element centroids. These findings indicate that good convergence can be obtained with this modeling technique, although element size should be chosen based on factors such as loading mode, mean trabecular thickness, and the particular output parameter of interest.
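A practical reading of the convergence result above is a simple check: compare each mesh's apparent modulus against the finest mesh and confirm that the ratio of mean trabecular thickness to element size stays at or above four. The sketch below applies that check to made-up modulus and thickness values, purely as an illustration.

```python
# Small sketch of a convergence check suggested by the abstract above: percent
# difference versus the finest mesh, plus the trabecular-thickness / element-size
# ratio. All numerical values are hypothetical.
def convergence_report(moduli_by_element_size, mean_trab_thickness_um, ref_size_um):
    e_ref = moduli_by_element_size[ref_size_um]
    for size, e in sorted(moduli_by_element_size.items()):
        pct_diff = 100.0 * abs(e - e_ref) / e_ref
        ratio = mean_trab_thickness_um / size
        converged = pct_diff < 10.0 and ratio >= 4.0
        print(f"{size:3d} um: {pct_diff:5.1f}% vs {ref_size_um} um, "
              f"Tb.Th/size = {ratio:.1f}, converged: {converged}")

# Hypothetical apparent Young's moduli (MPa) keyed by element size (um)
convergence_report({20: 410.0, 40: 395.0, 80: 300.0},
                   mean_trab_thickness_um=170.0, ref_size_um=20)
```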
10

Palmer, Katharina, Harold L. Drake, and Marcus A. Horn. "Genome-Derived Criteria for Assigning Environmental narG and nosZ Sequences to Operational Taxonomic Units of Nitrate Reducers." Applied and Environmental Microbiology 75, no. 15 (June 5, 2009): 5170–74. http://dx.doi.org/10.1128/aem.00254-09.

Abstract:
Ninety percent of cultured bacterial nitrate reducers with a 16S rRNA gene similarity of ≥97% had a narG or nosZ similarity of ≥67% or ≥80%, respectively, suggesting that 67% and 80% could be used as standardized, conservative threshold similarity values for narG and nosZ, respectively (i.e., any two sequences that are less similar than the threshold similarity value have a very high probability of belonging to different species), for estimating species-level operational taxonomic units. Genus-level tree topologies of narG and nosZ were generally similar to those of the corresponding 16S rRNA genes. Although some genomes contained multiple copies of narG, recent horizontal gene transfer of narG was not apparent.
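The species-level cutoffs proposed above (67 percent for narG, 80 percent for nosZ) translate into a simple clustering rule: sequences less similar than the threshold go into different operational taxonomic units. The sketch below shows a naive, greedy version of that rule on toy aligned fragments; real analyses would rely on proper alignment and clustering software.

```python
# Minimal sketch of threshold-based OTU assignment: pairwise similarity below the
# gene-specific cutoff places two sequences in different species-level OTUs.
# Uses a naive identity measure over pre-aligned, equal-length toy sequences.
def percent_identity(a, b):
    matches = sum(x == y for x, y in zip(a, b))
    return 100.0 * matches / max(len(a), len(b))

def assign_otus(seqs, threshold_pct):
    centroids, otus = [], []            # greedy, first-fit clustering
    for s in seqs:
        for i, c in enumerate(centroids):
            if percent_identity(s, c) >= threshold_pct:
                otus.append(i)
                break
        else:
            centroids.append(s)
            otus.append(len(centroids) - 1)
    return otus

# Toy aligned fragments (hypothetical), clustered at the narG cutoff of 67%
fragments = ["ATGGCTTACGGA", "ATGGCTTACGGT", "TTCCAGGATCTA", "ATGGATTACGGA"]
print(assign_otus(fragments, threshold_pct=67.0))
```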

Dissertations / Theses on the topic "Percent less-than-threshold"

1

Hokanson, William H. "Identifying Complex Fluvial Sandstone Reservoirs Using Core, Well Log, and 3D Seismic Data: Cretaceous Cedar Mountain and Dakota Formations, Southern Uinta Basin, Utah." BYU ScholarsArchive, 2011. https://scholarsarchive.byu.edu/etd/2597.

Abstract:
The Cedar Mountain and Dakota Formations are significant gas producers in the southern Uinta Basin of Utah. To date, however, predicting the stratigraphic distribution and lateral extent of potential gas-bearing channel sandstone reservoirs in these fluvial units has proven difficult due to their complex architecture, and the limited spacing of wells in the region. A new strategy to correlate the Cedar Mountain and Dakota Formations has been developed using core, well-log, and 3D seismic data. The detailed stratigraphy and sedimentology of the interval were interpreted using descriptions of a near continuous core of the Dakota Formation from the study area. The gamma-ray and density-porosity log signatures of interpreted mud-dominated overbank, coal-bearing overbank, and channel sandstone intervals from the cored well were used to identify the same lithologies in nearby wells and correlate similar stratal packages across the study area. Data from three 3D seismic surveys covering approximately 140 mi² (225 km²) of the study area were utilized to generate spectral decomposition, waveform classification, and percent less-than-threshold attributes of the Dakota-Cedar Mountain interval. These individual attributes were combined to create a composite attribute that was merged with interpreted lithological data from the well-log correlations. The overall process resulted in a high-resolution correlation of the Dakota-Cedar Mountain interval that permitted the identification and mapping of fluvial-channel reservoir fairways and channel belts throughout the study area. In the future, the strategy employed in this study may result in improved well-success rates in the southern Uinta Basin and assist in more detailed reconstructions of the Cedar Mountain and Dakota Formation depositional systems.
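The 'percent less-than-threshold' attribute named in this abstract, and in the topic of this page, is conceptually simple: within an interpreted interval on each trace, report the percentage of samples whose value falls below a chosen cutoff. The sketch below illustrates the concept on a small synthetic volume; it does not reproduce the thesis workflow or its parameters.

```python
# Generic sketch of a "percent less-than-threshold" style windowed attribute:
# for each trace, count the fraction of samples between two horizon picks whose
# value falls below a chosen cutoff. Purely illustrative.
import numpy as np

def percent_less_than_threshold(volume, top, base, threshold):
    """volume: (n_inline, n_xline, n_samples); top/base: per-trace sample indices."""
    n_il, n_xl, _ = volume.shape
    attr = np.zeros((n_il, n_xl))
    for i in range(n_il):
        for j in range(n_xl):
            window = volume[i, j, top[i, j]:base[i, j]]
            attr[i, j] = 100.0 * np.mean(window < threshold) if window.size else np.nan
    return attr

# Tiny synthetic example with flat horizons
rng = np.random.default_rng(3)
vol = rng.normal(0.0, 1.0, (4, 5, 100))
top = np.full((4, 5), 30)
base = np.full((4, 5), 70)
print(percent_less_than_threshold(vol, top, base, threshold=-0.5))
```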

Book chapters on the topic "Percent less-than-threshold"

1

"elimination of cultural difference. McNamara assured an Australian journalist: “The show is Australian through and through” (quoted by Gill 1993: 2). At the time of writing, neither American nor Australian responses are known. However, the summer release in the US – like that of Neighbours – is significant. This is the holiday season, the season when stations introduce material in which they place less market faith. Neighbours’s failure in the American market begs questions about the differential circulation there of Australian televisual and filmic texts. Jon Stratton and Ien Ang have argued the centrality of television to the modern nation-state’s basic reliance on . . . the nuclear family as the basis for social order, as the site of morality and for the organization of desire . . . . Through (modern) television, the nation could be forged into an encompassing imagined community in a way which was both more extensive and intimate than the newspaper – Benedict Anderson’s exemplary medium endowed with this role – was able to achieve. (Stratton and Ang 1994) Television’s homogenizing rhetorical space appears to be particularly resistant in the American case to incursions from outside its boundaries. Film differs somewhat. While both film and television production in the US are safely dominant in their local market, film eludes the familiar and familial domestic space of television. Crocodile Dundee succeeded strikingly in lowering the threshold of recognition of Australian media product in America. Yet, despite the film’s massive success in Australian terms (US$174 million US gross box-office, far above Crocodile Dundee 2, second at US $109 million, and Mad Max Beyond Thunderdome, third at US$36 million), it has made less great waves in US market terms. Among Variety’s “All-Time Champs of the 1980s,” it ranked only twenty-third, sandwiched between Honey, I Shrunk the Kids and Fatal Attraction, and earned only 31 percent of the takings of the top film, E.T. (Variety 1993: 10). Neighbours’s failure in the US television market should be measured not only in terms of the fact that US television is more strenuously resistant to foreign imports than is US film distribution–exhibition, but also in terms of the relative lack of success, by American standards, of even Australia’s greatest film export success. France: “Viewers have been bluffed by vandals” Neighbours play a particular role in Australia. In that country of infinite spaces, the sparse population must practise solidarity and good neighbourliness to survive. In an urban environment, however [sic], caring quickly descends to malevolent snooping. Faced with this soap, it is difficult to observe the evangelical precept of loving one’s neighbours as one loves oneself. (A.W. 1989: 7)." In To Be Continued..., 124. Routledge, 2002. http://dx.doi.org/10.4324/9780203131855-26.


Conference papers on the topic "Percent less-than-threshold"

1

Mihell, James, J. P. Lemieux, and Samah Hasan. "Probability-Based Sentencing Criteria for Volumetric In-Line Inspection Data." In 2016 11th International Pipeline Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/ipc2016-64448.

Abstract:
ASME B31.8S, Figure 7.2.1-1 (referred to as Figure 4 in earlier editions of the Standard) is used by many operators of natural gas transmission pipelines to schedule the remediation of corrosion features found via in-line inspection (ILI). The underlying philosophy of this approach is that wall loss features should be repaired before the calculated failure pressure falls below 110% of the maximum allowable operating pressure (MAOP). ASME B31.8S Figure 7.2.1-1 provides a basis for establishing maximum response times as a function of pipeline operating stress level, based in part on assumed corrosion growth rates. The corrosion rates assumed in the derivation of ASME B31.8S Figure 7.2.1-1 depend on the wall thickness of the pipe and the operating stress level as a percent of SMYS. As documented in PHMSA’s March 17, 2016 Notice of Proposed Rulemaking, the 1.1×MAOP repair criterion that forms the basis of Figure 7.2.2-1 has a demonstrated successful history of use in response management for wall loss ILI data. Despite this successful record, some potential exists for the underlying corrosion growth rate assumptions that are incorporated within that criterion to be non-conservative. Under some circumstances, the underlying corrosion growth rate assumption that is incorporated in Figure 7.2.1-1 can be significantly less than the guidance provided in NACE SP0502 (referenced in Appendix B of ASME B31.8S). Therefore, operators should ideally take measures to verify that the growth rate assumptions incorporated within Figure 7.2.1-1 are appropriate for their circumstances before adopting the scheduled response criteria from that Figure. On the other hand, for the majority of circumstances, it could be demonstrated that the Figure 7.2.1-1 criteria may represent overly-conservative response times, particularly where feature-specific information related to corrosion rates is available, and/or can be inferred from ILI data. A desirable solution would be to employ a response time threshold that utilizes the 1.1×MAOP repair criterion that has been demonstrated to be successful through industry’s widespread adoption of the Figure 7.2.2-1 criteria, along with some basis for incorporating feature-specific corrosion growth rates (from ILI data), and additionally, some basis for accounting for tool measurement error. Techniques for estimating the relative probability of failure (Pf) exist that employ ILI data and account for tool measurement error, model error, and tolerances in pipe dimensions and material properties. The problem to date is that probability targets have not been available for use in conjunction with a Pf analysis. Building on previous work done by Kiefner and Kolovich, this paper derives an approach for expressing Pf targets in terms of the 1.1×MAOP repair criterion adopted by ASME B31.8S, Figure 7.2.1-1. The Pf targets are derived using stochastic modeling, and incorporate probability density functions on tool error for feature depth and length, wall thickness, yield strength, and model error. Using a wide range of pipeline material and design parameters, a relationship for establishing lower-bound Pf targets is developed for broad application.
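The probability-of-failure calculation referenced in this abstract can be approximated with a simple Monte Carlo experiment: sample tool sizing error, wall thickness, yield strength, and a model-error factor, predict a burst pressure for each draw, and count the fraction falling below 1.1 × MAOP. The sketch below uses a simplified modified-B31G-style burst model and entirely illustrative distributions; it is not the paper's derivation of probability targets.

```python
# Hedged sketch (not the paper's method): Monte Carlo estimate of the probability
# that a corrosion feature's predicted burst pressure falls below 1.1 x MAOP,
# sampling ILI sizing error, wall thickness, yield strength, and model error.
# Burst model: simplified modified-B31G-style estimate; all values are illustrative.
import numpy as np

rng = np.random.default_rng(2016)
n = 100_000

D, MAOP = 610.0, 9.3                  # pipe diameter (mm) and MAOP (MPa), hypothetical
t    = rng.normal(9.5, 0.3, n)        # wall thickness, mm
smys = rng.normal(415.0, 20.0, n)     # yield strength, MPa
d    = rng.normal(0.40, 0.08, n) * t  # depth: 40% wt reported, +/- ILI tool error
L    = rng.normal(80.0, 15.0, n)      # feature length, mm
model_err = rng.normal(1.0, 0.10, n)  # multiplicative model error

z = L**2 / (D * t)
M = np.where(z <= 50.0,
             np.sqrt(1.0 + 0.6275 * z - 0.003375 * z**2),
             0.032 * z + 3.3)
s_flow = smys + 68.95                                   # flow stress, MPa
s_fail = s_flow * (1.0 - 0.85 * d / t) / (1.0 - 0.85 * d / t / M)
p_fail = model_err * 2.0 * t * s_fail / D               # predicted burst pressure, MPa

prob_below_criterion = np.mean(p_fail < 1.1 * MAOP)
print(f"P(failure pressure < 1.1 x MAOP) ~ {prob_below_criterion:.4f}")
```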
