Academic literature on the topic 'Optimal scoring'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Optimal scoring.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Optimal scoring"

1

Ramsay, James, Marie Wiberg, and Juan Li. "Full Information Optimal Scoring." Journal of Educational and Behavioral Statistics 45, no. 3 (2019): 297–315. http://dx.doi.org/10.3102/1076998619885636.

Abstract:
Ramsay and Wiberg used a new version of item response theory that represents test performance over nonnegative closed intervals such as [0, 100] or [0, n] and demonstrated that optimal scoring of binary test data yielded substantial improvements in point-wise root-mean-squared error and bias over number right or sum scoring. We extend these results by showing that optimal scoring of the full information in option choices produces about as much further improvement in these measures of score performance as was achieved by going from sum scoring to optimal binary scoring.
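As a point of reference for the comparison made in this abstract, the sketch below contrasts number-right (sum) scoring with a weighted score of the same binary responses. It is illustrative only: the responses and item weights are invented, whereas optimal scoring derives its weights from the fitted test-performance model.

    # Sum scoring vs. weighted scoring of binary item responses.
    # Responses and weights are hypothetical, for illustration only.
    responses = [1, 0, 1, 1, 0]             # right/wrong answers for one examinee
    weights = [1.2, 0.6, 2.0, 0.9, 1.5]     # made-up item weights

    sum_score = sum(responses)                                       # number-right score: 3
    weighted_score = sum(w * u for w, u in zip(weights, responses))  # 1.2 + 2.0 + 0.9 = 4.1
    print(sum_score, weighted_score)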
2

Konigsberg, Lyle W., Susan R. Frankenberg, and Helen M. Liversidge. "Optimal trait scoring for age estimation." American Journal of Physical Anthropology 159, no. 4 (2015): 557–76. http://dx.doi.org/10.1002/ajpa.22914.

3

Hastie, Trevor, Robert Tibshirani, and Andreas Buja. "Flexible Discriminant Analysis by Optimal Scoring." Journal of the American Statistical Association 89, no. 428 (1994): 1255–70. http://dx.doi.org/10.1080/01621459.1994.10476866.

4

Beling, P., Z. Covaliu, and R. M. Oliver. "Optimal scoring cutoff policies and efficient frontiers." Journal of the Operational Research Society 56, no. 9 (2005): 1016–29. http://dx.doi.org/10.1057/palgrave.jors.2602021.

5

Wang, Tao, and Lixing Zhu. "Sparse sufficient dimension reduction using optimal scoring." Computational Statistics & Data Analysis 57, no. 1 (2013): 223–32. http://dx.doi.org/10.1016/j.csda.2012.06.015.

6

Clémençon, Stéphan, and Nicolas Vayatis. "Overlaying Classifiers: A Practical Approach to Optimal Scoring." Constructive Approximation 32, no. 3 (2010): 619–48. http://dx.doi.org/10.1007/s00365-010-9084-9.

7

Feldbacher-Escamilla, Christian J., and Gerhard Schurz. "Optimal probability aggregation based on generalized brier scoring." Annals of Mathematics and Artificial Intelligence 88, no. 7 (2019): 717–34. http://dx.doi.org/10.1007/s10472-019-09648-4.

8

Nishimura, Takeshi. "Optimal design of scoring auctions with multidimensional quality." Review of Economic Design 19, no. 2 (2015): 117–43. http://dx.doi.org/10.1007/s10058-015-0169-6.

9

Camilli, Gregory. "IRT Scoring and Test Blueprint Fidelity." Applied Psychological Measurement 42, no. 5 (2018): 393–400. http://dx.doi.org/10.1177/0146621618754897.

Abstract:
This article focuses on the topic of how item response theory (IRT) scoring models reflect the intended content allocation in a set of test specifications or test blueprint. Although either an adaptive or linear assessment can be built to reflect a set of design specifications, the method of scoring is also a critical step. Standard IRT models employ a set of optimal scoring weights, and these weights depend on item parameters in the two-parameter logistic (2PL) and three-parameter logistic (3PL) models. The current article is an investigation of whether the scoring models reflect an intended set of weights defined as the proportion of items falling into each cell of the test blueprint. The 3PL model is of special interest because the optimal scoring weights depend on ability. Thus, the concern arises that for examinees of low ability, the intended weights are implicitly altered.
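For context, the 'optimal scoring weights' referred to here are the classical locally best (maximum-information) weights. A standard statement of the result, in general form rather than quoted from the article, is

    w_i(\theta) = \frac{P_i'(\theta)}{P_i(\theta)\,Q_i(\theta)}, \qquad
    \text{2PL: } w_i = a_i, \qquad
    \text{3PL: } w_i(\theta) = \frac{a_i\,[P_i(\theta) - c_i]}{P_i(\theta)\,[1 - c_i]},

where P_i is the item response function, Q_i = 1 - P_i, a_i the discrimination, and c_i the lower asymptote. The 3PL weight varies with ability θ, which is exactly why the effective blueprint weights can shift for low-ability examinees.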
10

Tiemensma, Jitske, Sarah Depaoli, and John M. Felt. "Using subscales when scoring the Cushing's quality of life questionnaire." European Journal of Endocrinology 174, no. 1 (2016): 33–40. http://dx.doi.org/10.1530/eje-15-0640.

Abstract:
Context: Patients in long-term remission of Cushing's syndrome (CS) commonly report impaired quality of life (QoL). The CushingQoL questionnaire is a disease-specific QoL questionnaire for patients diagnosed with CS. The developers of the CushingQoL recommend using a global (total) score to assess QoL. However, the global score does not capture all aspects of QoL as outlined by the World Health Organization (WHO). Objective: The aim of the study was to compare the performance of different scoring options to determine the optimal method for the CushingQoL. Design and patients: Patients in remission from CS (n=341) were recruited from the Cushing's Syndrome Research Foundation's email listserv and Facebook page, and asked to complete the CushingQoL and a short demographics survey. Results: Using an exploratory analysis, adequate model fit was obtained for the global score, as well as a 2-subscale (psychosocial issues and physical problems) scoring solution. Confirmatory methods were performed to identify the optimal scoring solution. Both the global score and the 2-subscale scoring solution showed adequate model fit. However, a χ2 difference test indicated that the 2-subscale scoring solution was a significantly better fit than the global score (P<0.05). Conclusion: If doctors or researchers would like to tease apart physical and psychosocial issues, the 2-subscale scoring solution is recommended, since this solution was shown to be optimal in scoring the CushingQoL. Regardless of the scoring solution used, the CushingQoL has proven to be a valuable resource for assessing health-related QoL in patients with CS.
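The model comparison reported in the Results is a standard nested-model chi-square difference test; stated generically (this is the usual textbook form, not a detail taken from the study):

    \Delta\chi^2 = \chi^2_{\text{global}} - \chi^2_{\text{2-subscale}}, \qquad
    \Delta df = df_{\text{global}} - df_{\text{2-subscale}},

and the more restrictive global (single-score) model is rejected in favour of the 2-subscale model when Δχ² exceeds the critical chi-square value on Δdf degrees of freedom at the chosen significance level (here P<0.05).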

Dissertations / Theses on the topic "Optimal scoring"

1

Li, Yibei. "Dynamic Optimization for Agent-Based Systems and Inverse Optimal Control." Licentiate thesis, KTH, Optimeringslära och systemteori, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-252438.

Abstract:
This dissertation is concerned with three problems within the field of optimization for agent--based systems. Firstly, the inverse optimal control problem is investigated for the single-agent system. Given a dynamic process, the goal is to recover the quadratic cost function from the observation of optimal control sequences. Such estimation could then help us develop a better understanding of the physical system and reproduce a similar optimal controller in other applications. Next, problems of optimization over networked systems are considered. A novel differential game approach is proposed for the optimal intrinsic formation control of multi-agent systems. As for the credit scoring problem, an optimal filtering framework is utilized to recursively improve the scoring accuracy based on dynamic network information. In paper A, the problem of finite horizon inverse optimal control problem is investigated, where the linear quadratic (LQ) cost function is required to be estimated from the optimal feedback controller. Although the infinite-horizon inverse LQ problem is well-studied with numerous results, the finite-horizon case is still an open problem. To the best of our knowledge, we propose the first complete result of the necessary and sufficient condition for the existence of corresponding LQ cost functions. Under feasible cases, the analytic expression of the whole solution space is derived and the equivalence of weighting matrices is discussed. For infeasible problems, an infinite dimensional convex problem is formulated to obtain a best-fit approximate solution with minimal control residual, where the optimality condition is solved under a static quadratic programming framework to facilitate the computation. In paper B, the optimal formation control problem of a multi-agent system is studied. The foraging behavior of N agents is modeled as a finite-horizon non-cooperative differential game under local information, and its Nash equilibrium is studied. The collaborative swarming behaviour derived from non-cooperative individual actions also sheds new light on understanding such phenomenon in the nature. The proposed framework has a tutorial meaning since a systematic approach for formation control is proposed, where the desired formation can be obtained by only intrinsically adjusting individual costs and network topology. In contrast to most of the existing methodologies based on regulating formation errors to the pre-defined pattern, the proposed method does not need to involve any information of the desired pattern beforehand. We refer to this type of formation control as intrinsic formation control. Patterns of regular polygons, antipodal formations and Platonic solids can be achieved as Nash equilibria of the game while inter-agent collisions are naturally avoided. Paper C considers the credit scoring problem by incorporating dynamic network information, where the advantages of such incorporation are investigated in two scenarios. Firstly, when the scoring publishment is merely individual--dependent, an optimal Bayesian filter is designed for risk prediction, where network observations are utilized to provide a reference for the bank on future financial decisions. Furthermore, a recursive Bayes estimator is proposed to improve the accuracy of score publishment by incorporating the dynamic network topology as well. 
It is shown that under the proposed evolution framework, the designed estimator has a higher precision than all the efficient estimators, and the mean square errors are strictly smaller than the Cramér-Rao lower bound for clients within a certain range of scores.
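To make the inverse problem of Paper A concrete, a generic finite-horizon LQ statement in standard notation (the textbook setup, not a transcription of the thesis):

    \min_{u_0,\dots,u_{N-1}}\; x_N^\top Q_N x_N + \sum_{t=0}^{N-1} \left( x_t^\top Q\, x_t + u_t^\top R\, u_t \right)
    \quad \text{s.t.} \quad x_{t+1} = A x_t + B u_t .

The forward problem yields time-varying optimal feedback u_t = -K_t x_t; the inverse problem asks for which weights (Q, R), if any, an observed gain sequence {K_t} is optimal, and, when no exact solution exists, for a best-fit approximation with minimal control residual.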
2

Ustun, Berk (Tevfik Berk). "Simple linear classifiers via discrete optimization : learning certifiably optimal scoring systems for decision-making and risk assessment." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/113987.

Abstract:
Thesis: Ph.D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2017. Cataloged from the PDF version of the thesis; includes bibliographical references (pages 203-221). Scoring systems are linear classification models that let users make quick predictions by adding, subtracting, and multiplying a few small numbers. These models are widely used in applications where humans have traditionally made decisions because they are easy to understand and validate. In spite of extensive deployment, many scoring systems are still built using ad hoc approaches that combine statistical techniques, heuristics, and expert judgement. Such approaches impose steep trade-offs with performance, making it difficult for practitioners to build scoring systems that will be used and accepted. In this dissertation, we present two new machine learning methods to learn scoring systems from data: Supersparse Linear Integer Models (SLIM) for decision-making applications; and Risk-calibrated Supersparse Linear Integer Models (RiskSLIM) for risk assessment applications. Both SLIM and RiskSLIM solve discrete optimization problems to learn scoring systems that are fully optimized for feature selection, small integer coefficients, and operational constraints. We formulate these problems as integer programming problems and develop specialized algorithms to recover certifiably optimal solutions with an integer programming solver. We illustrate the benefits of this approach by building scoring systems for real-world problems such as recidivism prediction, sleep apnea screening, ICU seizure prediction, and adult ADHD diagnosis. Our results show that a discrete optimization approach can learn simple models that perform well in comparison to the state-of-the-art, but that are far easier to customize, understand, and validate.
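To picture the model class being optimized, here is a minimal sketch of a scoring system at prediction time. The feature names, point values, and intercept are invented for illustration; SLIM/RiskSLIM obtain such integer coefficients by solving an integer program, which is not reproduced here.

    import math

    # Hypothetical integer point values for three binary features.
    points = {"age_ge_60": 2, "prior_event": 3, "on_treatment": -1}
    intercept = -4

    def score(patient):
        """Total score: intercept plus the points of the features that are present."""
        return intercept + sum(points[f] for f, present in patient.items() if present)

    def risk(patient):
        """RiskSLIM-style risk estimate: logistic transform of the integer score."""
        return 1.0 / (1.0 + math.exp(-score(patient)))

    print(risk({"age_ge_60": True, "prior_event": True, "on_treatment": False}))  # ~0.73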
3

Sanders, Teresa H. "Multimodal assessment of Parkinson's disease using electrophysiology and automated motor scoring." Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/51970.

Abstract:
A suite of signal processing algorithms designed for extracting information from brain electrophysiology and movement signals, along with new insights gained by applying these tools to understanding parkinsonism, were presented in this dissertation. The approach taken does not assume any particular stimulus, underlying activity, or synchronizing event, nor does it assume any particular encoding scheme. Instead, novel signal processing applications of complex continuous wavelet transforms, cross-frequency-coupling, feature selection, and canonical correlation were developed to discover the most significant electrophysiologic changes in the basal ganglia and cortex of parkinsonian rhesus monkeys and how these changes are related to the motor signs of parkinsonism. The resulting algorithms effectively characterize the severity of parkinsonism and, when combined with motor signal decoding algorithms, allow technology-assisted multi-modal grading of the primary pathological signs. Based on these results, parallel data collection algorithms were implemented in real-time embedded software and off-the-shelf hardware to develop a new system to facilitate monitoring of the severity of Parkinson's disease signs and symptoms in human patients. Off-line analysis of data collected with the system was subsequently shown to allow discrimination between normal and simulated parkinsonian conditions. The main contributions of the work were in three areas: 1) Evidence of the importance of optimally selecting multiple, non-redundant features for understanding neural information, 2) Discovery of significant correlations between certain pathological motor signs and brain electrophysiology in different brain regions, and 3) Implementation and human subject testing of multi-modal monitoring technology.
4

Farooqi, Owais Ehtisham. "An Assessment and Modeling of Copper Plumbing pipe Failures due to Pinhole Leaks." Thesis, Virginia Tech, 2006. http://hdl.handle.net/10919/33918.

Abstract:
Pinhole leaks in copper plumbing pipes are a big concern for homeowners. The problem is spread across the nation and remains a threat to plumbing systems of all ages. Due to the absence of a single acceptable mechanistic theory, no preventive measure is available to date. Most of the present mechanistic theories are based on analysis of failed pipe samples; however, an objective comparison with other pipes that did not fail is seldom made. The variability in hydraulic and water quality parameters has made the problem complex and unquantifiable in terms of plumbing susceptibility to pinhole leaks. The present work determines the spatial and temporal spread of pinhole leaks across the United States. The hotspot communities are identified based on repair histories and surveys. An assessment of variability in water quality is presented based on nationwide water quality data. A synthesis of causal factors is presented and a scoring system for copper pitting is developed using goal programming. A probabilistic model is presented to evaluate optimal replacement time for plumbing systems. Methodologies for mechanistic modeling based on corrosion thermodynamics and kinetics are presented.
5

Sanchez, Merchante Luis Francisco. "Learning algorithms for sparse classification." Phd thesis, Université de Technologie de Compiègne, 2013. http://tel.archives-ouvertes.fr/tel-00868847.

Abstract:
This thesis deals with the development of estimation algorithms with embedded feature selection in the context of high-dimensional data, in the supervised and unsupervised frameworks. The contributions of this work are materialized by two algorithms, GLOSS for the supervised domain and Mix-GLOSS for the unsupervised counterpart. Both algorithms are based on the resolution of an optimal scoring regression regularized with a quadratic formulation of the group-Lasso penalty, which encourages the removal of uninformative features. The theoretical foundations that prove that a group-Lasso penalized optimal scoring regression can be used to solve a linear discriminant analysis have first been developed in this work. The theory that adapts this technique to the unsupervised domain by means of the EM algorithm is not new, but it has never been clearly exposed for a sparsity-inducing penalty. This thesis solidly demonstrates that the utilization of group-Lasso penalized optimal scoring regression inside an EM algorithm is possible. Our algorithms have been tested with real and artificial high-dimensional databases with impressive results from the point of view of parsimony without compromising prediction performance.
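For orientation, the core problem behind GLOSS can be written in the usual sparse optimal scoring notation (a generic sketch; the thesis works with a quadratic variational form of the same group penalty):

    \min_{\theta,\,\beta}\; \frac{1}{n}\,\| Y\theta - X\beta \|_2^2 + \lambda \sum_{g} \| \beta_g \|_2
    \quad \text{s.t.} \quad \frac{1}{n}\,\theta^\top Y^\top Y \theta = 1,

where Y is the n x K indicator matrix of class labels, θ assigns a score to each class, β is the discriminant direction, and the group-Lasso term sums the Euclidean norms of predefined coefficient groups β_g, so that entire groups of uninformative features are removed.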
6

Vogelgesang, Ulrike. "Essays on access to financial institutions, inequality, and redistribution." [S.l. : s.n.], 2002. http://www.bsz-bw.de/cgi-bin/xvms.cgi?SWB10605024.

7

Oldham, Kevin M. "Table tennis event detection and classification." Thesis, Loughborough University, 2015. https://dspace.lboro.ac.uk/2134/19626.

Abstract:
It is well understood that multiple video cameras and computer vision (CV) technology can be used in sport for match officiating, statistics and player performance analysis. A review of the literature reveals a number of existing solutions, both commercial and theoretical, within this domain. However, these solutions are expensive and often complex in their installation. The hypothesis for this research states that by considering only changes in ball motion, automatic event classification is achievable with low-cost monocular video recording devices, without the need for 3-dimensional (3D) positional ball data and representation. The focus of this research is a rigorous empirical study of low cost single consumer-grade video camera solutions applied to table tennis, confirming that monocular CV based detected ball location data contains sufficient information to enable key match-play events to be recognised and measured. In total a library of 276 event-based video sequences, using a range of recording hardware, were produced for this research. The research has four key considerations: i) an investigation into an effective recording environment with minimum configuration and calibration, ii) the selection and optimisation of a CV algorithm to detect the ball from the resulting single source video data, iii) validation of the accuracy of the 2-dimensional (2D) CV data for motion change detection, and iv) the data requirements and processing techniques necessary to automatically detect changes in ball motion and match those to match-play events. Throughout the thesis, table tennis has been chosen as the example sport for observational and experimental analysis since it offers a number of specific CV challenges due to the relatively high ball speed (in excess of 100kph) and small ball size (40mm in diameter). Furthermore, the inherent rules of table tennis show potential for a monocular based event classification vision system. As the initial stage, a proposed optimum location and configuration of the single camera is defined. Next, the selection of a CV algorithm is critical in obtaining usable ball motion data. It is shown in this research that segmentation processes vary in their ball detection capabilities and location out-puts, which ultimately affects the ability of automated event detection and decision making solutions. Therefore, a comparison of CV algorithms is necessary to establish confidence in the accuracy of the derived location of the ball. As part of the research, a CV software environment has been developed to allow robust, repeatable and direct comparisons between different CV algorithms. An event based method of evaluating the success of a CV algorithm is proposed. Comparison of CV algorithms is made against the novel Efficacy Metric Set (EMS), producing a measurable Relative Efficacy Index (REI). Within the context of this low cost, single camera ball trajectory and event investigation, experimental results provided show that the Horn-Schunck Optical Flow algorithm, with a REI of 163.5 is the most successful method when compared to a discrete selection of CV detection and extraction techniques gathered from the literature review. Furthermore, evidence based data from the REI also suggests switching to the Canny edge detector (a REI of 186.4) for segmentation of the ball when in close proximity to the net. 
In addition to and in support of the data generated from the CV software environment, a novel method is presented for producing simultaneous data from 3D marker based recordings, reduced to 2D and compared directly to the CV output to establish comparative time-resolved data for the ball location. It is proposed here that a continuous scale factor, based on the known dimensions of the ball, is incorporated at every frame. Using this method, comparison results show a mean accuracy of 3.01mm when applied to a selection of nineteen video sequences and events. This tolerance is within 10% of the diameter of the ball and accountable by the limits of image resolution. Further experimental results demonstrate the ability to identify a number of match-play events from a monocular image sequence using a combination of the suggested optimum algorithm and ball motion analysis methods. The results show a promising application of 2D based CV processing to match-play event classification with an overall success rate of 95.9%. The majority of failures occur when the ball, during returns and services, is partially occluded by either the player or racket, due to the inherent problem of using a monocular recording device. Finally, the thesis proposes further research and extensions for developing and implementing monocular based CV processing of motion based event analysis and classification in a wider range of applications.
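The per-frame scale factor mentioned above follows directly from the known 40 mm ball diameter; a minimal sketch (variable names are illustrative, not taken from the thesis):

    BALL_DIAMETER_MM = 40.0

    def mm_per_pixel(ball_diameter_px):
        """Continuous scale factor: physical ball diameter over its apparent size in pixels."""
        return BALL_DIAMETER_MM / ball_diameter_px

    # A ball detected as 25 px wide gives 1.6 mm per pixel for that frame,
    # so a 5 px displacement corresponds to about 8 mm.
    print(mm_per_pixel(25.0) * 5)  # 8.0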
8

Su, Min-I., and 蘇珉一. "Integer linear programming for Optimal Clinical Classification Schemes: New Scoring system for Predicting Target Lesion Revascularization after Paclitaxel-Coated Balloon." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/yk789s.

Abstract:
Master's thesis, National Taitung University, Department of Information Management, 106. Background: Medical scoring systems are linear classification models widely used for clinical decision-making. Linear programming is a method to achieve the best outcome in a linear mathematical model. We demonstrate a new model using integer linear programming (ILP) to create optimal data-driven scoring systems. The clinical efficacy of the paclitaxel-coated balloon (PCB) has been well proven in the treatment of in-stent restenosis (ISR), but failure prediction models for PCB have not been developed. The aim of this study was to use ILP to create a new target lesion revascularization (TLR) prediction model for PCB. Methods: We used ILP in medical scoring systems, and the AUC of the ROC curve was utilized for evaluating the optimal solution. Results: Variables such as DES-ISR and statin use were superior to other variables, and the D2-S score, formed by assigning 2 points for the presence of DES-ISR and -1 point for statin use, had the optimal predictive performance. The area under the receiver operating characteristic curve of the new model (D2-S score) is 0.75. Conclusion: We developed a new scoring system for predicting TLR of PCB, which we refer to as the D2-S score. Moreover, ILP is a new method for creating optimal data-driven medical scoring systems, and it can be utilized with data that is routinely available in electronic medical records.
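The resulting D2-S rule is simple enough to restate directly from the abstract (the decision threshold used to flag high-risk patients is not given there and is left out):

    def d2s_score(des_isr, statin_use):
        """D2-S score: +2 points for DES-ISR, -1 point for statin use (range -1 to 2)."""
        return (2 if des_isr else 0) + (-1 if statin_use else 0)

    print(d2s_score(des_isr=True, statin_use=False))   # 2  -> highest predicted TLR risk
    print(d2s_score(des_isr=False, statin_use=True))   # -1 -> lowest predicted TLR risk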

Books on the topic "Optimal scoring"

1

Skipworth, James R. A., and Stephen P. Pereira. Pathophysiology, diagnosis, and assessment of acute pancreatitis. Oxford University Press, 2016. http://dx.doi.org/10.1093/med/9780199600830.003.0190.

Abstract:
The incidence of acute pancreatitis continues to increase, but the attendant mortality has not decreased for >30 years. The pathogenesis remains poorly understood, but the initial mechanism appears to be intracellular activation of pancreatic enzymes, with micro- and macrovascular dysfunction, in conjunction with a systemic inflammatory response acting as a key propagating factor and determinant of severity. A multitude of causes or initiators exist, but there is a common pathophysiological pathway. The use of conventional scoring systems, combined with repeated clinical and laboratory assessment, remains the optimal method of predicting early severity and organ dysfunction. Death occurs in a biphasic pattern with early mortality (<2 weeks) secondary to SIRS and MODS; and late deaths (>2 weeks) due to superinfection of pancreatic necrosis. Assessment of severity should reflect this, with early severity being diagnosed in the presence of organ failure for >48 hours, and late severity defined by the presence of pancreatic and peri-pancreatic complications on CT or other appropriate imaging modalities.
2

Kinsella, Sinead, and John Holian. The effect of chronic renal failure on critical illness. Oxford University Press, 2016. http://dx.doi.org/10.1093/med/9780199600830.003.0218.

Abstract:
The incidence of chronic kidney disease (CKD) and end-stage kidney disease (ESKD) is increasing, reflecting an increase in the incidence and prevalence of hypertension and type 2 diabetes. Patients with CKD and ESKD frequently experience episodes of critical illness and require treatment in an intensive care unit (ICU) setting. Management requires specific consideration of their renal disease status together with their acute illness. Mortality in critically-ill patients with ESKD is frequently related to their co-morbid conditions, rather than their ESKD status. Illness severity scoring systems allocate high points for renal variables and tend to overestimate actual mortality. Patients with ESKD and CKD requiring ICU admission have better ICU and in-hospital survival than patients with de novo acute kidney injury requiring renal replacement therapy. Appropriately selected patients benefit from ICU admission and full consideration for ICU care should be given to these patients if required, despite their renal disease status. Cardiovascular disease and sepsis account for the majority of ICU admissions in this population and the aetiology of these conditions differs from that in patients without kidney disease. Optimal critical care management of patients with ESKD and CKD requires that these differences are recognized.
3

Nixdorff, Uwe, Stephan Achenbach, Frank Bengel, et al. Imaging in cardiovascular prevention. Oxford University Press, 2015. http://dx.doi.org/10.1093/med/9780199656653.003.0006.

Abstract:
Imaging tools in preventive cardiology can be divided into imaging modalities to assess pre-clinical and clinical atherosclerosis and functional assessments of vascular function or vascular inflammation. To calculate the likelihood of pre-clinical atherosclerosis intima-media thickness as well as coronary calcium scoring are most frequently used. However, beyond these two there are other parameters derived by ultrasound and multi-detector computed tomography as well as magnetic resonance imaging and nuclear/molecular imaging which are discussed in the chapter. Functional tests include flow-mediated dilatation, pulse wave analysis, and the ankle-brachial index. In clinical research other invasive measurements such as intravascular ultrasound/virtual histology/elastography, optical coherence tomography as well as thermography are being used. However, their value in clinical prevention still needs to be established.
4

Mushambi, Mary C., and Rajesh Pandey. Management of the difficult airway. Oxford University Press, 2016. http://dx.doi.org/10.1093/med/9780198713333.003.0026.

Abstract:
Failed or difficult intubation is still a major cause of maternal morbidity and mortality. The management of the airway in the pregnant patient requires careful consideration of anatomical and physiological changes, training issues, and situational factors. Despite significant improvements in monitoring and airway equipment, and a reduction in anaesthetic-related maternal mortality, the incidence of failed intubation in the pregnant woman in many units has remained between 1/250 and 1/300. This may result from many factors such as the reduction of the number of caesarean deliveries performed under general anaesthesia which has resulted in limited opportunities to teach airway skills in obstetrics, the increased incidence of obesity, and the rise in maternal age and associated co-morbidities. Improved training and careful planning and performance of a general anaesthetic (i.e. reducing the risk of aspiration; optimum pre-oxygenation, patient positioning, and application of cricoid pressure; and availability of appropriate airway equipment) have the potential to reduce airway-related morbidity and mortality in the pregnant woman. Simple bedside tests such as Mallampati scoring, thyromental distance, neck movement, and ability to protrude the mandible may help to predict a potential difficult airway, particularly when used in combination. Management of a predicted difficult airway requires early referral to the anaesthetists, formulation of an airway management strategy, and involvement of the multidisciplinary team in decision-making. Fibreoptic equipment and skills should be readily available when required. Management of the unpredicted difficult airway should make maintenance of maternal and fetal oxygenation the primary goal. Decision-making during a failed intubation on whether to proceed or wake the patient should involve the obstetrician and ideally be planned in advance. The periods during extubation and recovery are high risk and require preparation and planning in advance.

Book chapters on the topic "Optimal scoring"

1

Thi, Hoai An Le, and Duy Nhat Phan. "A DC Programming Approach for Sparse Optimal Scoring." In Advances in Knowledge Discovery and Data Mining. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-18032-8_34.

2

Clarke, Stephen R. "Dynamic Programming in One-Day Cricket—Optimal Scoring Rates." In Operational Research Applied to Sports. Palgrave Macmillan UK, 2015. http://dx.doi.org/10.1057/9781137534675_5.

3

Li, Juan, James O. Ramsay, and Marie Wiberg. "TestGardener: A Program for Optimal Scoring and Graphical Analysis." In Springer Proceedings in Mathematics & Statistics. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-01310-3_8.

4

Kok, Jelle, Remco de Boer, Nikos Vlassis, and Frans C. A. Groen. "Towards an Optimal Scoring Policy for Simulated Soccer Agents." In RoboCup 2002: Robot Soccer World Cup VI. Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-45135-8_24.

5

Al Sarkhi, Awaad K., and John R. Talburt. "Model for Estimating the Optimal Parameter Values of the Scoring Matrix in the Entity Resolution of Unstandardized References." In Advances in Intelligent Systems and Computing. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-73103-8_2.

6

von Skerst, Bernhard. "Scoring-Modelle: Intelligentes Informationsmanagement für Kosten-Nutzen-optimale Kundenbindung." In Führungskräfte-Handbuch. Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/978-3-642-56401-7_26.

7

Lappas, Pantelis Z., and Athanasios N. Yannacopoulos. "Credit Scoring." In Advances in Marketing, Customer Relationship Management, and E-Services. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-5077-9.ch028.

Abstract:
The main objective of this chapter is to propose a hybrid evolutionary feature selection approach for solving credit scoring problems subject to constraints. A hybrid scheme combining filter and wrapper-based approaches is proposed to develop an accurate credit scoring model with a high predictive performance. Initially, the minimum redundancy maximum relevance algorithm is applied to find an optimal set of features that is mutually and maximally dissimilar and can represent the response variable effectively, allowing for an ordering of features by their importance. Subsequently, an iterative procedure, where supervised machine learning algorithms such as the logistic regression and the linear-discriminant analysis are combined with an evolutionary optimization algorithm like the genetic algorithm, is applied to choose the feature subset that maximizes an appropriate classification measure according to the predefined features and subject to the predefined constraints. The performance of the proposed method is illustrated using standard credit scoring datasets.
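A compact sketch of the wrapper stage described above: a toy genetic search over feature-subset masks, scored by cross-validated AUC of a logistic regression. This is a generic illustration built on scikit-learn and synthetic data, not the chapter's implementation, and it omits the mRMR filter stage and the problem-specific constraints.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=300, n_features=20, n_informative=5, random_state=0)

    def fitness(mask):
        """Cross-validated AUC of logistic regression on the selected features."""
        if not mask.any():
            return 0.0
        clf = LogisticRegression(max_iter=1000)
        return cross_val_score(clf, X[:, mask], y, cv=5, scoring="roc_auc").mean()

    pop = rng.random((20, X.shape[1])) < 0.5           # random feature-subset masks
    for _ in range(15):
        scores = np.array([fitness(m) for m in pop])
        parents = pop[np.argsort(scores)[-10:]]         # keep the 10 fittest subsets
        children = parents[rng.integers(0, 10, 10)].copy()
        flip = rng.random(children.shape) < 0.05        # mutate ~5% of the bits
        children[flip] = ~children[flip]
        pop = np.vstack([parents, children])

    best = pop[np.argmax([fitness(m) for m in pop])]
    print("selected features:", np.flatnonzero(best))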
8

"Information Functions and Optimal Scoring Weights." In Applications of Item Response Theory To Practical Testing Problems. Routledge, 2012. http://dx.doi.org/10.4324/9780203056615-9.

9

Keramati, Abbas, Niloofar Yousefi, and Amin Omidvar. "Default Probability Prediction of Credit Applicants Using a New Fuzzy KNN Method With Optimal Weights." In Intelligent Systems. IGI Global, 2018. http://dx.doi.org/10.4018/978-1-5225-5643-5.ch082.

Abstract:
Credit scoring has become a very important issue due to the recent growth of the credit industry. As the first objective, this chapter provides an academic database of literature between and proposes a classification scheme to classify the articles. The second objective of this chapter is to suggest the employing of the Optimally Weighted Fuzzy K-Nearest Neighbor (OWFKNN) algorithm for credit scoring. To show the performance of this method, two real world datasets from UCI database are used. In classification task, the empirical results demonstrate that the OWFKNN outperforms the conventional KNN and fuzzy KNN methods and also other methods. In the predictive accuracy of probability of default, the OWFKNN also show the best performance among the other methods. The results in this chapter suggest that the OWFKNN approach is mostly effective in estimating default probabilities and is a promising method to the fields of classification.
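For background, the fuzzy k-nearest-neighbour rule that OWFKNN extends assigns class memberships by distance-weighted voting over the K nearest training points; the classical form (with fuzzifier m > 1) is

    u_c(x) = \frac{\sum_{j=1}^{K} u_{cj}\, \| x - x_j \|^{-2/(m-1)}}{\sum_{j=1}^{K} \| x - x_j \|^{-2/(m-1)}},

where u_cj is the membership of neighbour x_j in class c and x is assigned to the class with the largest u_c(x). The chapter's optimally weighted variant adjusts this weighting; the details are not given in the abstract and are not reproduced here.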
10

Keramati, Abbas, Niloofar Yousefi, and Amin Omidvar. "Default Probability Prediction of Credit Applicants Using a New Fuzzy KNN Method with Optimal Weights." In Advances in Business Information Systems and Analytics. IGI Global, 2015. http://dx.doi.org/10.4018/978-1-4666-7272-7.ch024.

Abstract:
Credit scoring has become a very important issue due to the recent growth of the credit industry. As the first objective, this chapter provides an academic database of literature between and proposes a classification scheme to classify the articles. The second objective of this chapter is to suggest the employing of the Optimally Weighted Fuzzy K-Nearest Neighbor (OWFKNN) algorithm for credit scoring. To show the performance of this method, two real world datasets from UCI database are used. In classification task, the empirical results demonstrate that the OWFKNN outperforms the conventional KNN and fuzzy KNN methods and also other methods. In the predictive accuracy of probability of default, the OWFKNN also show the best performance among the other methods. The results in this chapter suggest that the OWFKNN approach is mostly effective in estimating default probabilities and is a promising method to the fields of classification.

Conference papers on the topic "Optimal scoring"

1

Zengrong, Zhao. "How to Determine the Optimal Scoring Policy." In 2011 International Conference on Intelligent Computation Technology and Automation (ICICTA). IEEE, 2011. http://dx.doi.org/10.1109/icicta.2011.269.

2

Wang, Shuo, Qiang Wu, and Ju Liu. "Tensor optimal scoring for alzheimer's disease detection." In 2017 13th International Conference on Natural Computation, Fuzzy Systems and Knowledge Discovery (ICNC-FSKD). IEEE, 2017. http://dx.doi.org/10.1109/fskd.2017.8393006.

3

Bieńkowska, Jadwiga R., Robert G. Rogers, and Temple F. Smith. "A method for optimal design of a threading scoring function." In the third annual international conference. ACM Press, 1999. http://dx.doi.org/10.1145/299432.299446.

4

Haghpanahi, Masoumeh, Reza Sameni, and David A. Borkholder. "Scoring consensus of multiple ECG annotators by optimal sequence alignment." In 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). IEEE, 2014. http://dx.doi.org/10.1109/embc.2014.6943971.

5

Sefair, Jorge A., Daniel Castro-Lacouture, and Andres L. Medaglia. "Material Selection in Building Construction Using Optimal Scoring Method (OSM)." In Construction Research Congress 2009. American Society of Civil Engineers, 2009. http://dx.doi.org/10.1061/41020(339)109.

6

Neyman, Eric, and Tim Roughgarden. "From Proper Scoring Rules to Max-Min Optimal Forecast Aggregation." In EC '21: The 22nd ACM Conference on Economics and Computation. ACM, 2021. http://dx.doi.org/10.1145/3465456.3467599.

7

Lenka, Sudhansu R., Sukant Kishoro Bisoy, Rojalina Priyadarshini, Jhalak Hota, and Rabindra K. Barik. "An Effective Credit Scoring Model Implementation by Optimal Feature Selection Scheme." In 2021 International Conference on Emerging Smart Computing and Informatics (ESCI). IEEE, 2021. http://dx.doi.org/10.1109/esci50559.2021.9396911.

8

Dey, Palash, Neeldhara Misra, Swaprava Nath, and Garima Shakya. "A Parameterized Perspective on Protecting Elections." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/34.

Abstract:
We study the parameterized complexity of the optimal defense and optimal attack problems in voting. In both the problems, the input is a set of voter groups (every voter group is a set of votes) and two integers k_a and k_d corresponding to respectively the number of voter groups the attacker can attack and the number of voter groups the defender can defend. A voter group gets removed from the election if it is attacked but not defended. In the optimal defense problem, we want to know if it is possible for the defender to commit to a strategy of defending at most k_d voter groups such that, no matter which k_a voter groups the attacker attacks, the outcome of the election does not change. In the optimal attack problem, we want to know if it is possible for the attacker to commit to a strategy of attacking k_a voter groups such that, no matter which k_d voter groups the defender defends, the outcome of the election is always different from the original (without any attack) one. We show that both the optimal defense problem and the optimal attack problem are computationally intractable for every scoring rule and the Condorcet voting rule even when we have only 3 candidates. We also show that the optimal defense problem for every scoring rule and the Condorcet voting rule is W[2]-hard for both the parameters k_a and k_d, while it admits a fixed parameter tractable algorithm parameterized by the combined parameter (k_a, k_d). The optimal attack problem for every scoring rule and the Condorcet voting rule turns out to be much harder – it is W[1]-hard even for the combined parameter (k_a, k_d). We propose two greedy algorithms for the OPTIMAL DEFENSE problem and empirically show that they perform effectively on reasonable voting profiles.
9

Pitman, Mark W., and Anthony D. Lucey. "Optimal Swimming Modes of a Homo-Sapien Performing Butterfly-Stroke Kick." In ASME 2006 Pressure Vessels and Piping/ICPVT-11 Conference. ASMEDC, 2006. http://dx.doi.org/10.1115/pvp2006-icpvt-11-93938.

Abstract:
This paper outlines the development and application of a computational method that finds the most efficient two-dimensional swimming mode of a human performing fully submerged butterfly-stroke kick at high Reynolds number. The optimal solution of this non-linear problem is found using a Genetic Algorithm (GA) search method where possible solutions compete in a 'survival of the fittest' scheme to 'breed' the optimal solution. The swimming is modelled using Discrete Vortex Method (DVM) and Boundary Element Method (BEM) computational techniques. The BEM solves for the inviscid flow field around the two-dimensional body while the shedding of vortices from joints where the curvature is high (i.e. knee, waist, and ankle joints) generates the vortex structures necessary for propulsion. The motion of the limbs is characterised by a displacement function which includes the possibility for simple harmonic or non-harmonic motion with a 'rest' period in the kick. The finite number of joints means that a finite-length parameter set can be developed which characterises the motion of the swimming body. This parameter set is fed into the GA to perform the optimisation based on a scoring function. In this case, the scoring function is simply the distance that the body swims in a set amount of time. The objective of the GA is to maximise this score for a set kicking frequency. This method opens a wider possibility for optimisation of a variety of systems that involve fluid-structure interactions, particularly the possibility of optimisation in the non-linear regime of prescribed motion coupled with compliant surfaces (such as rubbery flippers) that could further increase efficiency.
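The optimisation loop described here follows the standard GA pattern: encode the joint-motion parameters as a chromosome, simulate the swim, and use the distance travelled as the fitness score. A schematic sketch only; the simulator call is a placeholder (the DVM/BEM flow solver is not reproduced) and the parameter count is invented.

    import random

    N_PARAMS = 8  # hypothetical: amplitude/phase/rest parameters of the joint motions

    def distance_swum(params):
        """Placeholder for the DVM/BEM simulation; returns distance covered in the
        fixed time window. A dummy surface stands in for the real solver here."""
        return -sum((p - 0.5) ** 2 for p in params)

    def evolve(pop_size=30, generations=50, sigma=0.1):
        pop = [[random.random() for _ in range(N_PARAMS)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=distance_swum, reverse=True)       # score = distance swum
            survivors = pop[: pop_size // 2]
            children = []
            for _ in range(pop_size - len(survivors)):
                a, b = random.sample(survivors, 2)
                child = [random.choice(pair) for pair in zip(a, b)]                      # crossover
                child = [min(1.0, max(0.0, g + random.gauss(0, sigma))) for g in child]  # mutation
                children.append(child)
            pop = survivors + children
        return max(pop, key=distance_swum)

    print(evolve())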
10

Hulse, Daniel, Christopher Hoyle, Kai Goebel, and Irem Y. Tumer. "Optimizing Function-Based Fault Propagation Model Resilience Using Expected Cost Scoring." In ASME 2018 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/detc2018-85318.

Abstract:
Complex engineered systems are often associated with risk due to high failure consequences, high complexity, and large investments. As a result, it is desirable for complex engineered systems to be resilient such that they can avoid or quickly recover from faults. Ideally, this should be done at the early design stage where designers are most able to explore a large space of concepts. Previous work has shown that functional models can be used to predict fault propagation behavior and motivate design work. However, little has been done to formally optimize a design based on these predictions, partially because the effects of these models have not been quantified into an objective function to optimize. This work introduces a scoring function which integrates with a fault scenario-based simulation to enable the risk-neutral optimization of functional model resilience. This scoring function accomplishes this by resolving the tradeoffs between the design costs, operating costs, and modeled fault response of a given design in a way that may be parameterized in terms of designer-specified resilient features. This scoring function is adapted and applied to the optimization of controlling functions which recover flows in a monopropellant orbiter. In this case study, an evolutionary algorithm is found to find the optimal logic for these functions, showing an improvement over a typical a-priori guess by exploring a large range of solutions, demonstrating the value of the approach.
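Schematically, the risk-neutral scoring function described above combines design cost, operating cost, and the expected cost of the simulated fault scenarios (symbols chosen here for illustration rather than taken from the paper):

    \text{score}(d) = C_{\text{design}}(d) + C_{\text{op}}(d) + \sum_{s \in S} p_s\, C_{\text{fault}}(s, d),

where d is a candidate design, S the set of modelled fault scenarios, p_s the rate or probability of scenario s, and C_fault(s, d) the cost of the modelled fault response of design d in scenario s; the evolutionary algorithm then searches the designer-specified resilient features to minimize this score.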