Academic literature on the topic 'Weighted Entropy'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Weighted Entropy.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Weighted Entropy"

1

Stosic, Darko, Dusan Stosic, Tatijana Stosic, and Borko Stosic. "Generalized weighted permutation entropy." Chaos: An Interdisciplinary Journal of Nonlinear Science 32, no. 10 (2022): 103105. http://dx.doi.org/10.1063/5.0107427.

Full text
Abstract:
A novel heuristic approach is proposed here for time-series data analysis, dubbed Generalized weighted permutation entropy, which amalgamates and generalizes beyond their original scope two well-established data analysis methods: Permutation entropy and Weighted permutation entropy. The method introduces a scaling parameter to discern the disorder and complexity of ordinal patterns with small and large fluctuations. Using this scaling parameter, the complexity-entropy causality plane is generalized to the complexity-entropy-scale causality box. Simulations conducted on synthetic series generated by stochastic, chaotic, and random processes, as well as on real-world data, are shown to produce unique signatures in this three-dimensional representation.
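For readers who want a concrete starting point, below is a minimal Python sketch of ordinary (non-generalized) weighted permutation entropy, in which each ordinal pattern is weighted by the variance of its window; the function name, the variance-based weighting, and the normalization by log(m!) are illustrative assumptions and do not reproduce the generalized method or the scaling parameter of the cited paper.

```python
import math
import random
from collections import defaultdict

def weighted_permutation_entropy(x, m=3, tau=1):
    """Weighted permutation entropy of a 1-D sequence x.

    Each length-m window contributes its ordinal pattern, weighted by the
    window's variance, so patterns arising from large fluctuations count
    more than those arising from small ones. The result is normalized to
    [0, 1] by dividing by log(m!).
    """
    weights = defaultdict(float)
    total = 0.0
    for i in range(len(x) - (m - 1) * tau):
        window = x[i:i + m * tau:tau]
        pattern = tuple(sorted(range(m), key=lambda k: window[k]))
        mean = sum(window) / m
        var = sum((v - mean) ** 2 for v in window) / m  # window variance used as weight
        weights[pattern] += var
        total += var
    if total == 0:
        return 0.0
    probs = [w / total for w in weights.values() if w > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(m))

# A noisy sine: ordered structure plus noise gives an intermediate value
series = [math.sin(0.1 * t) + 0.2 * random.random() for t in range(1000)]
print(round(weighted_permutation_entropy(series, m=3), 3))
```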
APA, Harvard, Vancouver, ISO, and other styles
2

Guiaşu, Radu Cornel, and Silviu Guiaşu. "Conditional and Weighted Measures of Ecological Diversity." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 11, no. 03 (2003): 283–300. http://dx.doi.org/10.1142/s0218488503002089.

Full text
Abstract:
Shannon's entropy and Simpson's index are the most used measures of species diversity. As the Simpson index proves to be just an approximation of the Shannon entropy, conditional Simpson indices of diversity and a global measure of interdependence among species are introduced, similar to those used in the corresponding entropic formalism from information theory. Also, since both the Shannon entropy and the Simpson index depend only on the number and relative abundance of the respective species in a given ecosystem, the paper generalizes these indices of diversity to the case when a numerical weight is attached to each species. Such a weight could reflect supplementary information about the absolute abundance, the economic significance, or the conservation value of the species.
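As an illustration of the quantities involved, the following Python sketch computes the weighted (Belis–Guiaşu) entropy H_w = -Σ w_i p_i log p_i together with one plausible weighted analogue of the Simpson index; the exact weighted generalizations introduced in the cited paper may differ, so this should be read as a sketch under those assumptions.

```python
import math

def weighted_shannon_entropy(p, w):
    """Belis-Guiasu weighted entropy: H_w = -sum_i w_i * p_i * log(p_i).

    p: relative abundances (summing to 1); w: nonnegative weights, e.g.
    conservation values or economic significance attached to each species.
    """
    return -sum(wi * pi * math.log(pi) for pi, wi in zip(p, w) if pi > 0)

def weighted_simpson_index(p, w):
    """One possible weighted analogue of the Simpson diversity index:
    sum_i w_i * p_i * (1 - p_i). Illustrative form only."""
    return sum(wi * pi * (1.0 - pi) for pi, wi in zip(p, w))

# Three species, the rarest carrying the highest conservation weight
abundances = [0.6, 0.3, 0.1]
weights = [1.0, 1.0, 3.0]
print(round(weighted_shannon_entropy(abundances, weights), 4))
print(round(weighted_simpson_index(abundances, weights), 4))
```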
APA, Harvard, Vancouver, ISO, and other styles
3

Kelbert, Mark, Izabella Stuhl, and Yuri Suhov. "Weighted entropy: basic inequalities." Modern Stochastics: Theory and Applications 4, no. 3 (2017): 233–52. http://dx.doi.org/10.15559/17-vmsta85.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Das, Suchismita. "On weighted generalized entropy." Communications in Statistics - Theory and Methods 46, no. 12 (2016): 5707–27. http://dx.doi.org/10.1080/03610926.2014.960583.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Misagh, F., and G. H. Yari. "On weighted interval entropy." Statistics & Probability Letters 81, no. 2 (2011): 188–94. http://dx.doi.org/10.1016/j.spl.2010.11.006.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Chen, Yuan, and Aijing Lin. "Weighted link entropy and multiscale weighted link entropy for complex time series." Nonlinear Dynamics 105, no. 1 (2021): 541–54. http://dx.doi.org/10.1007/s11071-021-06599-6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Kvålseth, Tarald O. "On Weighted Exponential Entropies." Perceptual and Motor Skills 92, no. 1 (2001): 3–7. http://dx.doi.org/10.2466/pms.2001.92.1.3.

Full text
Abstract:
As an alternative to Shannon's classical entropy measure of information, an exponential entropy function was proposed by Pal and Pal in 1989 and 1991. To generalize Pal's entropy further, this author introduced two different families of exponential entropies that are one-parameter generalizations of Pal's entropy. The purpose of the present paper is to define weighted entropies corresponding to those one-parameter generalizations. Some properties and examples of such weighted exponential entropies are discussed.
APA, Harvard, Vancouver, ISO, and other styles
8

Puneet, Kumar. "Weighted Average Charge for Heterogeneous Questionnaires and Weighted Entropy." Business Research 1, no. 1 (2003): 131–42. https://doi.org/10.5281/zenodo.3888051.

Full text
Abstract:
In the present paper, we introduce the idea of weighted average charge for heterogeneous questionnaires, extending the average charge studied by Duncan G. T. (1974). Some coding theorems and comparison results are presented through weighted entropy. It is also shown that the weighted average charge for heterogeneous questionnaires is bounded below by the weighted entropy.
APA, Harvard, Vancouver, ISO, and other styles
9

Yu, Shiwei, and Ting-Zhu Huang. "Exponential weighted entropy and exponential weighted mutual information." Neurocomputing 249 (August 2017): 86–94. http://dx.doi.org/10.1016/j.neucom.2017.03.075.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Verma, Rohit Kumar. "On Generating Measures of Fuzzy Cross-Entropy by Employing Measures of Fuzzy Weight Entropy." Asian Journal of Probability and Statistics 22, no. 3 (2023): 37–44. http://dx.doi.org/10.9734/ajpas/2023/v22i3486.

Full text
Abstract:
By utilizing the ideas of fuzzy-weighted entropy and fuzzy-weighted directed divergence and appropriately selecting the weights, several new measures of fuzzy cross-entropy have been generated. Concurrently, some new measures of fuzzy weighted entropy have been suggested based on the new measures of fuzzy cross-entropy generated.
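For orientation only, here is a minimal Python sketch of a fuzzy weighted entropy of De Luca–Termini type, where each element's fuzziness term is multiplied by an importance weight; the specific measures generated in the cited paper are not reproduced, and the form below is an assumption for illustration.

```python
import math

def fuzzy_weighted_entropy(mu, w):
    """Weighted De Luca-Termini-style fuzzy entropy.

    mu: membership degrees in [0, 1]; w: nonnegative importance weights.
    Each element contributes w_i * (-mu_i*log(mu_i) - (1-mu_i)*log(1-mu_i)),
    which is largest when mu_i = 0.5 (maximal fuzziness).
    """
    total = 0.0
    for m, wi in zip(mu, w):
        for v in (m, 1.0 - m):
            if v > 0:
                total -= wi * v * math.log(v)
    return total

# Three elements; the middle one is maximally fuzzy and doubly weighted
print(round(fuzzy_weighted_entropy([0.1, 0.5, 0.9], [1.0, 2.0, 1.0]), 4))
```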
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Weighted Entropy"

1

Mieth, Therese [Verfasser], Dorothee [Gutachter] Haroske, Hans [Gutachter] Triebel, and David E. [Gutachter] Edmunds. "Entropy and approximation numbers of weighted Sobolev embeddings : a bracketing technique / Therese Mieth ; Gutachter: Dorothee Haroske, Hans Triebel, David E. Edmunds." Jena : Friedrich-Schiller-Universität Jena, 2016. http://d-nb.info/117761152X/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Юхименко, Микола Петрович, Николай Петрович Юхименко та Mykola Petrovych Yukhymenko. "Ентропійні методи опису технологічних процесів в апаратах завислого шару". Thesis, Вид-во СумДУ, 2010. http://essuir.sumdu.edu.ua/handle/123456789/5796.

Full text
Abstract:
One way to increase the intensity of technological processes is to carry them out in an active aerodynamic regime, the suspended (fluidized) layer. The development and introduction of suspended-layer apparatus in many industries is held back by the absence of reliable calculation relationships. These can be obtained by applying entropy methods to the description of the complex processes that occur when particles of granular material are suspended and transported by a turbulent gas flow. When citing this document, use the link http://essuir.sumdu.edu.ua/handle/123456789/5796
APA, Harvard, Vancouver, ISO, and other styles
3

Fernández-Valdés, Villa Bruno. "La variabilidad de movimiento en el entrenamiento de fuerza en los deportes de equipo / Movement variability in resistance training in team sports." Doctoral thesis, Universitat de Barcelona, 2020. http://hdl.handle.net/10803/670682.

Full text
Abstract:
Over the last decades, resistance training in team sports has evolved towards a more integrative approach, adapting to its own characteristics and distancing itself from the more traditional training derived from individual sports. Based on human movement, Julio Tous (Seirul·lo Vargas, 2017, Chapter Tous Fajardo, Julio) proposed a paradigm shift in which movements act as the backbone of exercise selection instead of the muscle groups, which in turn become mere executors. Structured training is a resistance training methodology that allows us to adjust to this paradigm shift through different levels of sports approach. Seirul·lo (1993) established four levels of specificity ranging from general to competitive. Subsequently, Moras (2000) developed six levels of sports approach, ranging from zero to five with different groupings. Analysis of human movement has evolved to allow the assessment of the variability of a measure by targeting the detection of changes in fluctuations and spatiotemporal characteristics of the outcomes. Within the past 20 years, entropy analysis has become relatively popular as a measure of system complexity. Thus, the aim of this thesis was to analyse the role of movement variability, measured through entropy, in resistance training in team sports with different levels of sports approach. First, a pilot test was carried out to compare mean acceleration and entropy values in short actions (i.e. collisions) when registered with two devices with different accelerometer sampling frequencies (1000 Hz versus 100 Hz). Differences were observed for mean acceleration and entropy when measured with different sampling frequencies. Therefore, 1000 Hz was selected as the sampling frequency for the rest of the experiments in this thesis. The first study described the variability in acceleration during a resistance training task performed on horizontal inertial flywheels without (NOBALL) or with (BALL) the constraint of catching and throwing a rugby ball. Mean changes (%; ±90% CL) of 4.64; ±3.1 g for mean acceleration and 39.48; ±36.63 a.u. for sample entropy indicated likely and very likely increases in the BALL condition. Multiscale entropy also showed higher unpredictability of acceleration under the BALL condition, especially at higher time scales. Thus, the application of match-specific constraints in resistance training for rugby players elicits a different amount of variability of body acceleration across multiple physiological time scales. The second study aimed to identify between-position (forwards vs. backs) differences in movement variability during cumulative tackle-event training in both attacking and defensive roles. Participants performed four blocks of six tackling (i.e. tackling an opponent) and six tackled (i.e. being tackled by an opponent while carrying a ball) events (i.e. 48 total tackles) while wearing a micro-technological inertial measurement unit. Sample entropy (SampEn) was used to analyse the movement variability. Significant between-block differences were observed in backs (block 1 vs 3 and block 1 vs 4) but not in forwards. Movement variability showed a progressive reduction with cumulative tackle events, especially in backs and in the defensive role. Forwards presented lower movement variability values in all blocks, particularly in the first block, in both the attacking and defensive roles. Last, the third study aimed to identify the changes in movement variability and movement velocity during a six-week training period, using a horizontal forward-backward resistance task without (NOBALL) or with (BALL) the constraint of catching and throwing a rugby ball in the forward phase. SampEn showed no significant decrease for NOBALL (ES -0.64 ± 1.02) and a significant decrease for BALL (ES -1.71 ± 1.16; p<0.007). Additionally, movement velocity showed a significant increase for NOBALL (ES 1.02 ± 1.05; p<0.047) and for BALL (ES 1.25 ± 1.08; p<0.025) between weeks 1 and 6. The complexity index showed higher levels of complexity in the BALL condition, specifically in the first three weeks. Movement velocity and complex dynamics adapted to the constraints of the task after a four-week training period. Entropy measures seem a promising signal-processing technique for identifying when these exercise tasks should be changed. In conclusion, analysing movement variability through entropy will have a fundamental role in the control and programming of resistance training in team sports, especially when the training has a coordinative predominance based on sports movement.
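Since sample entropy (SampEn) is the central measure in these studies, a minimal Python sketch is given below for orientation; the tolerance r = 0.2·SD, the embedding dimension m = 2, and the template-counting convention are common defaults rather than the thesis's exact settings, and the function name is illustrative.

```python
import math
import random

def sample_entropy(x, m=2, r=None):
    """Sample entropy (SampEn) of a 1-D series, as used to quantify movement variability.

    Counts pairs of templates of length m (B) and m + 1 (A) that match within
    tolerance r (by default 0.2 * standard deviation, a common choice),
    excluding self-matches, and returns -ln(A / B).
    """
    n = len(x)
    if r is None:
        mean = sum(x) / n
        r = 0.2 * math.sqrt(sum((v - mean) ** 2 for v in x) / n)

    def count_matches(length):
        templates = [x[i:i + length] for i in range(n - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# Example: an acceleration-like trace; more noise pushes SampEn higher
signal = [math.sin(0.2 * t) + 0.1 * random.gauss(0, 1) for t in range(300)]
print(round(sample_entropy(signal, m=2), 3))
```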
APA, Harvard, Vancouver, ISO, and other styles
4

Gondhalekar, Akash Avinash. "Design and Development of Light Weight High Entropy Alloys." Thesis, Tekniska Högskolan, Högskolan i Jönköping, JTH, Material och tillverkning, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-45551.

Full text
Abstract:
The main aim of this thesis was to design and develop new aluminium-based compositionally complex alloys (CCAs) using the high entropy alloy (HEA) concept, to understand the evolution of their microstructures during casting and after the secondary process of heat treatment, and finally to evaluate their resulting mechanical properties. Prior to the development of the alloys, the computational tool Thermo-Calc was used to help understand phase formation in the candidate compositions. Thermodynamic physical parameters for predicting the stability of single-phase fields were used to assess their validity in predicting the compositional regions of the alloys developed. The first alloy developed was Al73.6Mg18Ni1.5Ti1.9Zr1Zn4 in at% (NiTiZrZn) CCA. The microstructure consists of FCC as the primary phase with ~49% volume fraction, along with β-AlMg and intermetallic (IM) phases including Al3Ni, Al3Ti, and Al3Zr. After casting, the microstructure showed some presence of eutectic structures. The Al3Ti and Al3Zr IM phases seemed to precipitate early, which led to less homogenization of Ti and Zr and caused a deviation in the amount of these elements in the matrix. Further, the CCA was heat treated at 375 °C for 24 h and 48 h, and the evolution of the microstructure, together with its hardness and phase-transformation characterization, was investigated. The second alloy developed was the quaternary Al65.65Mg21.39Ag10.02Ni2.94 in at% (AgNi) CCA. In the as-cast state, the main phase (matrix) was FCC with ~64% volume fraction, along with BCC, β-AlMg, and Al3Ni IM phases. There was a good level of homogenization of all elements in the alloy. The samples were further heat treated at 400 °C for 24 h and 48 h and studied for changes in microstructure, hardness, and thermal stability. This CCA had the highest hardness value of all the developed CCAs. Lastly, in order to check how Ni affects the microstructure and properties of the (AgNi) CCA, a ternary Al67.2Mg22.09Ag10.7 in at% (Ag) CCA was developed; the composition was chosen to match exactly 97% of the quaternary alloy by excluding the Ni. During the development of this alloy, the cast was cooled in two ways: first by normal cooling, as for the other CCAs, and second by a fast cooling method. Both of these alloys consist of the FCC phase as the primary phase with 72% volume fraction, along with BCC and β-AlMg. Both were also heat treated at 400 °C for 24 h and 48 h to evaluate changes in microstructure and to assess hardness and thermal stability.
APA, Harvard, Vancouver, ISO, and other styles
5

Khabou, Mohamed Ali. "Improving shared weight neural networks generalization using regularization theory and entropy maximization /." free to MU campus, to others for purchase, 1999. http://wwwlib.umi.com/cr/mo/fullcit?p9953870.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Stamatopoulou, Anastasia. "AGGREGATION IN MULTIAGENT AND MULTICRITERIA DECISION MODELS: INTERACTION, DYNAMICS, AND MAXIMUM ENTROPY WEIGHTS IN THE FRAMEWORK OF CHOQUET INTEGRATION." Doctoral thesis, Università degli studi di Trento, 2021. http://hdl.handle.net/11572/311987.

Full text
Abstract:
In the context of MCDM, the Choquet integral constitutes an interesting aggregation model which generalizes both the classical and the ordered weighted means. In the Choquet integration framework, an additive capacity generates a classical weighted mean, whereas a symmetric (non-additive) capacity generates an ordered weighted mean. Moreover, a general (non-additive) Choquet capacity induces a natural weighted mean, the Shapley mean, whose weights correspond to the so-called Shapley power indices. In the first part of the thesis, we examine a negotiation model which combines the Choquet integration framework with the classical Morris H. DeGroot 1974 model of consensus linear dynamics, in interactive multicriteria and multiagent networks. We consider a set $N=\{1,\ldots,n\}$ of interacting criteria (or agents) whose single evaluations (or individual opinions) are expressed in some domain $\mathbb{D}\subseteq\mathbb{R}$. The interaction among the criteria (or agents) is expressed by a symmetric interaction matrix with null diagonal and off-diagonal coefficients in the open unit interval. The interaction network structure is thus that of a complete graph with edge values in $(0,1)$. In the Choquet integration framework, the interacting network structure is the basis for the construction of a capacity $\mu$, whose Shapley indices are proportional to the average degree of interaction between criterion (or agent) $i$ and the remaining criteria (or agents). In relation to this interactive multicriteria (or multiagent) network model, we discuss three types of linear consensus dynamics, each of which represents a progressive aggregation process towards a consensual evaluation (or opinion) of the single criteria (or agents), corresponding to some form of mean of the original evaluations (or opinions). In the first type, the progressive aggregation converges simply to the plain mean of the original evaluations (or opinions) of the single criteria (or agents), while the second type converges to the Shapley mean of the original evaluations (or opinions). The third type, instead, converges to an emphasized form of Shapley mean, which we call the superShapley mean. The interesting relation between Shapley and superShapley aggregation is investigated. In the second part of the thesis, we focus on entropy-constrained optimization in the context of ordered weighted means, both in the classical Shannon entropy case and in the more general Tsallis entropy case. The maximum entropy method is based on the solution of a nonlinear constrained optimization problem in which the OWA weights are obtained by maximizing the entropy, given a specified degree of orness. In the Shannon entropy case, we begin by reviewing the analytic solution of the maximum entropy method proposed by Filev and Yager in 1995, and later by Fuller and Majlender in 2001, and we consider the maximum entropy method in the binomial decomposition framework. Then, we present the optimization of the parametric Tsallis entropy function associated with Ordered Weighted Averaging. We examine the meaning of the entropic parameter $\gamma$ in the context of OWA functions and how it affects the behavior of the associated entropy function. We introduce the nonlinear constrained optimization problem of Tsallis entropy for parameter values $\gamma \in (0,1)$ and obtain the solution for the optimal weights in terms of the two Lagrange multipliers. In both the Shannon and Tsallis entropy cases, for parameter $\gamma \in (0,1)$, the optimal weights for orness values in the open unit interval are positive (except for the extreme orness values $0,1$) and monotonic (increasing or decreasing) over the whole orness range $\Omega \in [0,1]$.
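To make the Shannon-entropy case concrete: maximizing the weight entropy subject to a fixed orness yields OWA weights of exponential (geometric) form, and the Lagrange multiplier can be found numerically. The Python sketch below illustrates this under those assumptions (bisection on the multiplier); it is not the analytic Filev–Yager / Fuller–Majlender solution reviewed in the thesis, and the function name is illustrative.

```python
import math

def max_entropy_owa_weights(n, orness, tol=1e-10):
    """Maximum-entropy OWA weights for a given orness (Shannon case).

    Maximizing -sum(w_i log w_i) subject to sum(w_i) = 1 and
    sum(((n-i)/(n-1)) * w_i) = orness gives weights of exponential form
    w_i proportional to exp(lam * (n-i)/(n-1)); lam is found by bisection,
    using the fact that orness is monotone increasing in lam.
    """
    f = [(n - i) / (n - 1) for i in range(1, n + 1)]  # orness coefficients

    def weights(lam):
        e = [math.exp(lam * fi) for fi in f]
        z = sum(e)
        return [ei / z for ei in e]

    lo, hi = -60.0, 60.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        w = weights(mid)
        if sum(fi * wi for fi, wi in zip(f, w)) < orness:
            lo = mid
        else:
            hi = mid
    return weights((lo + hi) / 2)

# Five weights with orness 0.7: decreasing, geometric-looking, summing to 1
w = max_entropy_owa_weights(5, 0.7)
print([round(wi, 4) for wi in w], round(sum(w), 6))
```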
APA, Harvard, Vancouver, ISO, and other styles
7

Edmonds, Christopher, and Jacqueline Brody. "Effect of a Network Wide Computer Entry System and Weight Based Dosing on Heparin Alert Rates." The University of Arizona, 2015. http://hdl.handle.net/10150/614107.

Full text
Abstract:
Class of 2015 Abstract. Objectives: The purpose of the study was to analyze the effect of a network-wide computer entry system and the implementation of weight-based dosing on heparin alert rates. Our central hypothesis was that the implementation of a network-wide computer system would decrease alert rates for heparin infusions on smart-pump infusion systems. Our rationale for this study was to evaluate methods to improve patient safety for high-alert medications such as heparin. Methods: This was a before-after study design evaluating the effect of the intervention using data obtained from a smart-pump infusion system. Heparin infusions administered in the adult ICU or Med/Surg unit at the university campus between July and September 2013 and between January and March 2014 were analyzed for the effect of the network-wide computer system. Pump data from before the implementation of the network-wide computer system were compared to pump data obtained after its implementation. Results: After the implementation of a hospital-wide computerized physician order entry system, there was a statistically significant increase in heparin alert rates, from 15.7 alerts per 100 heparin infusions to 20.2 alerts per 100 heparin infusions (P=0.001). Conclusions: The implementation of a network-wide computerized physician order entry system was associated with an increase in the alert generation rate on smart-pump infusion systems. Further studies are needed to elucidate this unexpected increase in alerts.
APA, Harvard, Vancouver, ISO, and other styles
8

Avapak, Sukunta. "Failure mode analysis on concentrated solar power (CSP) plants : a case study on solar tower power plant." Thesis, Queensland University of Technology, 2016. https://eprints.qut.edu.au/102375/1/Sukunta_Avapak_Thesis.pdf.

Full text
Abstract:
This thesis is an investigation of the critical failure modes of the solar tower power system in concentrated solar power (CSP) technology. The thesis evaluates the causes and impacts of failure of the major components and applies Failure Mode and Effect Analysis (FMEA) to the CSP solar tower system. This research proposes an alternative method to overcome the limitations of the Risk Priority Number (RPN) from traditional FMEA. A case study applies the proposed approach to a CSP solar tower system for a better prioritization of failure modes in order to reduce the risk of failures.
APA, Harvard, Vancouver, ISO, and other styles
9

Delgado, Villanueva Kiko Alexi. "Methodological proposal for social impact assessment and environmental conflict analysis." Doctoral thesis, Universitat Politècnica de València, 2016. http://hdl.handle.net/10251/64063.

Full text
Abstract:
Social impact assessment (SIA) is a part of environmental impact assessment (EIA) and is characterized by a high level of uncertainty and by the subjective aspects present in the methods used to conduct it. In addition, environmental conflict analysis (ECA) has become a key factor for the viability of projects and the welfare of affected populations. In this thesis, an integrated method for SIA and ECA is proposed, combining the grey clustering method and the entropy-weight method. SIA was performed using the grey clustering method, which enables qualitative information coming from a stakeholder group to be quantified. In turn, ECA was performed using the entropy-weight method, which identifies the criteria on which there is the greatest divergence between stakeholder groups, thus enabling measures to be established to prevent potential environmental conflicts. Two case studies were then conducted to apply and test the proposed integrated method. The first case study was a mining project in northern Peru, in which three stakeholder groups and seven criteria were identified. The results revealed that for the urban population group and the rural population group the project would have a positive and a negative social impact, respectively, while for the group of specialists the project would have a normal social impact. The criteria most likely to generate environmental conflicts were, in order of importance: access to drinking water, poverty, GDP per capita, and employment. The second case study was a hydrocarbon exploration project located in the Gulf of Valencia, Spain, in which four stakeholder groups and four criteria were identified. The results revealed that for the group of specialists the project would have a negative social impact, and contrary perceptions were found between the group of those directly affected by the project and the group of citizens in favour. The criteria most likely to generate environmental conflict were the percentage of unemployment and GDP per capita. The integrated method proposed in this thesis showed great potential in the cases studied and could be applied to other contexts and other projects, such as water resources management, industrial projects, and construction projects, and to measure social impact and prevent conflicts during the implementation of government policies and programs.
Delgado Villanueva, K. A. (2016). Methodological proposal for social impact assessment and environmental conflict analysis [Doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/64063
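The entropy-weight step described here follows a standard recipe: normalize each column of the decision matrix to proportions, compute its Shannon entropy, and give larger weights to criteria with lower entropy (greater divergence among the ratings). The Python sketch below is a minimal illustration under those standard assumptions and is not the exact formulation used in the thesis; the data are invented.

```python
import math

def entropy_weights(matrix):
    """Entropy-weight method for criteria weighting in MCDM.

    matrix[i][j] is the (nonnegative) score of alternative i on criterion j.
    Each column is normalized to proportions, its Shannon entropy is computed,
    and the weight of criterion j is proportional to 1 - entropy_j, so that
    criteria on which the scores diverge most receive the largest weights.
    """
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)  # normalizes each column entropy into [0, 1]
    raw = []
    for j in range(n):
        col = [matrix[i][j] for i in range(m)]
        total = sum(col)
        p = [v / total for v in col]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        raw.append(1.0 - e)
    s = sum(raw)
    return [w / s for w in raw]

# Example: 4 alternatives rated on 3 criteria (illustrative data only)
scores = [[0.2, 30, 5],
          [0.4, 28, 9],
          [0.3, 31, 4],
          [0.1, 29, 8]]
print([round(w, 3) for w in entropy_weights(scores)])
```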
APA, Harvard, Vancouver, ISO, and other styles
More sources

Books on the topic "Weighted Entropy"

1

Ji, Jiangming. Wo guo cheng shi gong gong fu wu gong zhong man yi du shang quan TOPSIS zhi shu ji ying xiang yin su yan jiu: Research on entropy weighted TOPSIS index of urban public services satisfaction and its influential factors in China. Zhongguo she hui ke xue chu ban she, 2016.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Melchert, H. Craig. Indo-Europeans. Edited by Gregory McMahon and Sharon Steadman. Oxford University Press, 2012. http://dx.doi.org/10.1093/oxfordhb/9780195376142.013.0031.

Full text
Abstract:
This article presents an overview of the arrival and florescence of the Indo-European languages in Anatolia, the most famous of which is Hittite. The weight of current linguistic evidence supports the traditional view that Indo-European speakers are intrusive to Asia Minor, coming from somewhere in eastern Europe. There is a growing consensus that the differentiation among the Indo-European Anatolian languages begins at least by the mid-second millennium BCE and possibly earlier. It is likely, but not strictly provable, that this differentiation correlates with the entry of the Indo-European speakers into Anatolia and their subsequent dispersal. Nothing definitive can be said about the route by which the Indo-European entry took place.
APA, Harvard, Vancouver, ISO, and other styles
3

Blood Pressure Log Book: A Blood Pressure Record Log Book That Also Includes Weight Log Entry. Independently Published, 2021.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Myers, Richard L. The 100 Most Important Chemical Compounds. Greenwood Publishing Group, Inc., 2007. http://dx.doi.org/10.5040/9798400605284.

Full text
Abstract:
What is a chemical compound? Compounds are substances that are two or more elements combined together chemically in a standard proportion by weight. Compounds are all around us - they include familiar things, such as water, and more esoteric substances, such as triuranium octaoxide, the most commonly occurring natural source for uranium. This reference guide gives us a tour of 100 of the most important, common, unusual, and intriguing compounds known to science. Each entry gives an extensive explanation of the composition, molecular formula, and chemical properties of the compound. In addition, each entry reviews the relevant chemistry, history, and uses of the compound, with discussions of the origin of the compound's name, the discovery or first synthesis of the compound, production statistics, and uses of the compound.
APA, Harvard, Vancouver, ISO, and other styles
5

Dixit, Avinash. Relation-Based Governance and Competition. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198812555.003.0015.

Full text
Abstract:
If formal institutions of contract governance are absent or ineffective, traders try to substitute relational governance based on norms and sanctions. However, these alternatives need good information and communication concerning members’ actions; that works well only in relatively small communities. If there are fixed costs, the market has too few firms for perfect competition. The optimum must be a second best, balancing the effectiveness of contract governance and dead-weight loss of monopoly. This chapter explores this idea using a spatial model with monopolistic competition. It is found that relational governance constrains the size of firms and can cause inefficiently excessive entry, beyond the excess that already occurs in a spatial model without governance problems. Effects of alternative methods of improving governance to ameliorate this inefficiency are explored.
APA, Harvard, Vancouver, ISO, and other styles
6

Yaffe, Gideon. The Age of Culpability. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198803324.001.0001.

Full text
Abstract:
Reflection on the grounds for leniency towards children who commit crimes is the entry point into the development, in this book, of a theory of the nature of criminal responsibility and desert of punishment for crime. The book argues that child criminals are owed lesser punishments than adults thanks not to their psychological, behavioral, or neural immaturity but, instead, because they are denied the vote. This conclusion is reached through the development of theories of the nature of criminal culpability, desert for wrongdoing, strength of legal reasons, and what it is to have a say over the law, theories that produce a bridge between limited participation in government and criminal culpability. The cornerstone of this discussion is the proposed theory of criminal culpability. To be criminally culpable is for one’s criminal act to manifest a failure to grant sufficient weight to the legal reasons to refrain. The stronger the legal reasons, then, the greater the criminal culpability. Those who lack a say over the law, it is argued, have weaker legal reasons to refrain from crime than those who have a say. They are therefore reduced in criminal culpability and deserve lesser punishment for their crimes. Children are owed leniency, then, because of the political meaning of age rather than because of its psychological meaning. This position has implications for criminal justice policy, with respect to, among other things, the interrogation of children suspected of crimes and the enfranchisement of adult felons.
APA, Harvard, Vancouver, ISO, and other styles
7

Gillingham, Paul. Unrevolutionary Mexico. Yale University Press, 2021. http://dx.doi.org/10.12987/yale/9780300253122.001.0001.

Full text
Abstract:
Unrevolutionary Mexico addresses how the Mexican Revolution (1910-1940) turned into a capitalist dictatorship of exceptional resilience. While soldiers seized power across the rest of Latin America, in modern Mexico the civilians of a single party moved punctiliously in and out of office for seventy-one years. The book uses the histories of the states of Guerrero and Veracruz as entry points to explore the origins and consolidation of this unique authoritarian state on both provincial and national levels. An empirically rich reconstruction of over sixty years of modernization and revolution (1880-1945) revises prevailing ideas of a pacified Mexico and establishes the 1940s as a decade of faltering governments and enduring violence. The book then assesses the pivotal changes of the mid-twentieth century, when a new generation of lawyers, bureaucrats and businessmen joined with surviving revolutionaries to form the Partido Revolucionario Institucional, which held uninterrupted power until 2000. Thematic chapters analyse elections, development, corruption and high and low culture in the period. The central role of military and private violence is explored in two further chapters that measure the weight of hidden coercion in keeping the party in power. In conclusion, the combination of provincial and national histories reveals Mexico as a place where soldiers prevented coups, a single party lost its own rigged elections, corruption fostered legitimacy, violence was concealed but decisive, and ambitious cultural control co-existed with a critical press and a disbelieving public. In conclusion, the book demonstrates how this strange dictatorship thrived not despite but because of its contradictions.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Weighted Entropy"

1

Al-Zoubaidi, Majd, and Amjad D. Al-Nasser. "Entropy-Based Weighted Exponential Regression." In Springer Proceedings in Mathematics & Statistics. Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-4876-1_8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Endo, Tomomi, and Mineichi Kudo. "Weighted Naïve Bayes Classifiers by Renyi Entropy." In Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications. Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-41822-8_19.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Sruthi, S. L., and Rinku Jacob. "Analyzing Electrocardiogram Signal Complexity with Weighted Entropy." In Springer Proceedings in Physics. Springer Nature Switzerland, 2024. https://doi.org/10.1007/978-3-031-69146-1_13.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Sečkárová, Vladimíra. "Weighted Probabilistic Opinion Pooling Based on Cross-Entropy." In Neural Information Processing. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-26535-3_71.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Zhang, Hui, Kaihu Hou, and Zhou Zhou. "A Weighted KNN Algorithm Based on Entropy Method." In Intelligent Computing and Internet of Things. Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-13-2384-3_41.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Sunil Kumar, B. S., A. S. Manjunath, and S. Christopher. "Improvisation in HEVC Performance by Weighted Entropy Encoding Technique." In Advances in Intelligent Systems and Computing. Springer Singapore, 2017. http://dx.doi.org/10.1007/978-981-10-3223-3_16.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Liu, Zekun, Jianyong Yu, Linlin Gu, and Xue Han. "Dynamic Information Diffusion Model Based on Weighted Information Entropy." In Computer Supported Cooperative Work and Social Computing. Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-4549-6_39.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Zheng, Lijuan, Linhao Zhang, Meng Cui, Jianyou Chen, Shaobo Yang, and Zhaoxuan Li. "Medical Information Access Control Method Based on Weighted Information Entropy." In Cloud Computing and Security. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-00012-7_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Zhang, Zundong, Zhaoran Zhang, Weixin Ma, and Huijuan Zhou. "Research on Shortest Paths-Based Entropy of Weighted Complex Networks." In Lecture Notes in Electrical Engineering. Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-10-7986-3_79.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Kelbert, Mark, Izabella Stuhl, and Yuri Suhov. "Weighted Entropy and its Use in Computer Science and Beyond." In Analytical and Computational Methods in Probability Theory. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-71504-9_25.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Weighted Entropy"

1

Zhou, Yingying, Kun Yang, Jiahong Xu, Dejia Cai, and Xiaowei Zhou. "HIFU ablation monitoring with weighted ultrasound entropy imaging." In 2024 IEEE Ultrasonics, Ferroelectrics, and Frequency Control Joint Symposium (UFFC-JS). IEEE, 2024. https://doi.org/10.1109/uffc-js60046.2024.10793626.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Indiravathi, G., and D. Rajesh. "Music Genre Classification Using Entropy Weighted – Gated Recurrent Unit Model." In 2025 3rd International Conference on Integrated Circuits and Communication Systems (ICICACS). IEEE, 2025. https://doi.org/10.1109/icicacs65178.2025.10967708.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Peng, Fengjiao, and Shuisheng Zhou. "Sample-Weighted Discrete Multiple Kernel K-Means Method with Entropy Regularization." In 2024 10th International Conference on Systems and Informatics (ICSAI). IEEE, 2024. https://doi.org/10.1109/icsai65059.2024.10893793.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Misagh, F., Y. Panahi, G. H. Yari, and R. Shahi. "Weighted cumulative entropy and its estimation." In 2011 IEEE International Conference on Quality and Reliability (ICQR). IEEE, 2011. http://dx.doi.org/10.1109/icqr.2011.6031765.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Perez, Claudio A., Leonardo A. Cament, and Luis E. Castillo. "Local matching Gabor entropy weighted face recognition." In Gesture Recognition (FG 2011). IEEE, 2011. http://dx.doi.org/10.1109/fg.2011.5771394.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Dang, Bozhan, Jin Zhou, Yingxu Wang, et al. "Transfer Learning for Entropy-Weighted Fuzzy Clustering." In 2018 International Conference on Security, Pattern Analysis, and Cybernetics (SPAC). IEEE, 2018. http://dx.doi.org/10.1109/spac46244.2018.8965458.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Deng, Jian, M. D. Pandey, and W. C. Xie. "Maximum entropy principle and partial probability weighted moments." In BAYESIAN INFERENCE AND MAXIMUM ENTROPY METHODS IN SCIENCE AND ENGINEERING: 31st International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering. AIP, 2012. http://dx.doi.org/10.1063/1.3703635.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

He, Suyan, and Yuxi Jiang. "A weighted relative entropy method for forecasting demand." In 2011 International Conference on Electric Information and Control Engineering (ICEICE). IEEE, 2011. http://dx.doi.org/10.1109/iceice.2011.5777198.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Bhuiyan, Sharif, Jesmin Khan, and Gregory Murphy. "Weighted Entropy for Data Compression in Smart Grid." In 2018 IEEE Industry Applications Society Annual Meeting (IAS2018). IEEE, 2018. http://dx.doi.org/10.1109/ias.2018.8544486.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Park, Eunhyeok, Junwhan Ahn, and Sungjoo Yoo. "Weighted-Entropy-Based Quantization for Deep Neural Networks." In 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2017. http://dx.doi.org/10.1109/cvpr.2017.761.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Weighted Entropy"

1

Cordwell, William. Entropy Approximation Formula for Hamming Weights. Office of Scientific and Technical Information (OSTI), 2016. http://dx.doi.org/10.2172/1646969.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Van Duren, Jeroen K., Carl Koch, Alan Luo, Vivek Sample, and Anil Sachdev. High-Throughput Combinatorial Development of High-Entropy Alloys For Light-Weight Structural Applications. Office of Scientific and Technical Information (OSTI), 2017. http://dx.doi.org/10.2172/1413702.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Данильчук, Г. Б., О. А. Засядько та В. М. Соловйов. Застосування методів теорії складних систем при оцінці економічної безпеки підприємства. Видавець Вовчок О.Ю., 2017. http://dx.doi.org/10.31812/0564/1260.

Full text
Abstract:
The paper estimates the financial stability of the enterprise «Motor Sich» using network measures and permutation entropy. The weights are analysed and compared with an integrated measure of financial security. Conclusions are drawn about the possibility of using methods of the theory of complex systems in assessing the economic security of an enterprise.
APA, Harvard, Vancouver, ISO, and other styles
4

Estill, J. Electronic weight- and dimensional-data entry in a computer database. Office of Scientific and Technical Information (OSTI), 1996. http://dx.doi.org/10.2172/2765.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Alwan, Iktimal, Dennis D. Spencer, and Rafeed Alkawadri. Comparison of Machine Learning Algorithms in Sensorimotor Functional Mapping. Progress in Neurobiology, 2023. http://dx.doi.org/10.60124/j.pneuro.2023.30.03.

Full text
Abstract:
Objective: To compare the performance of popular machine learning (ML) algorithms in mapping the sensorimotor (SM) cortex and identifying the anterior lip of the central sulcus (CS). Methods: We evaluated support vector machines (SVMs), random forest (RF), decision trees (DT), single-layer perceptron (SLP), and multilayer perceptron (MLP) against standard logistic regression (LR) in identifying the SM cortex, employing validated features from six minutes of NREM-sleep icEEG data and applying standard common hyperparameters and 10-fold cross-validation. Each algorithm was tested using vetted features based on the statistical significance of classical univariate analysis (p<0.05) and extended () 17 features representing power/coherence of different frequency bands, entropy, and interelectrode-based distance. The analysis was performed before and after weight adjustment for imbalanced data (w). Results: 7 subjects and 376 contacts were included. Before optimization, ML algorithms performed comparably employing conventional features (median CS accuracy: 0.89, IQR [0.88-0.9]). After optimization, neural networks outperformed the others in terms of accuracy (MLP: 0.86), area under the curve (AUC) (SLPw, MLPw, MLP: 0.91), recall (SLPw: 0.82, MLPw: 0.81), precision (SLPw: 0.84), and F1-scores (SLPw: 0.82). SVM achieved the best specificity performance. Extending the number of features and adjusting the weights improved recall, precision, and F1-scores by 48.27%, 27.15%, and 39.15%, respectively, with gains or no significant losses in specificity and AUC across CS and Function (correlation r=0.71 between the two clinical scenarios in all performance metrics, p<0.001). Interpretation: Computational passive sensorimotor mapping is feasible and reliable. Feature extension and weight adjustments improve the performance and counterbalance the accuracy paradox. Optimized neural networks outperform other ML algorithms even in binary classification tasks. The best-performing models and the MATLAB® routine employed in signal processing are available to the public at (Link 1).
APA, Harvard, Vancouver, ISO, and other styles
6

Treadwell, Jonathan R., James T. Reston, Benjamin Rouse, Joann Fontanarosa, Neha Patel, and Nikhil K. Mull. Automated-Entry Patient-Generated Health Data for Chronic Conditions: The Evidence on Health Outcomes. Agency for Healthcare Research and Quality (AHRQ), 2021. http://dx.doi.org/10.23970/ahrqepctb38.

Full text
Abstract:
Background. Automated-entry consumer devices that collect and transmit patient-generated health data (PGHD) are being evaluated as potential tools to aid in the management of chronic diseases. The need exists to evaluate the evidence regarding consumer PGHD technologies, particularly for devices that have not gone through Food and Drug Administration evaluation. Purpose. To summarize the research related to automated-entry consumer health technologies that provide PGHD for the prevention or management of 11 chronic diseases. Methods. The project scope was determined through discussions with Key Informants. We searched MEDLINE and EMBASE (via EMBASE.com), In-Process MEDLINE and PubMed unique content (via PubMed.gov), and the Cochrane Database of Systematic Reviews for systematic reviews or controlled trials. We also searched ClinicalTrials.gov for ongoing studies. We assessed risk of bias and extracted data on health outcomes, surrogate outcomes, usability, sustainability, cost-effectiveness outcomes (quantifying the tradeoffs between health effects and cost), process outcomes, and other characteristics related to PGHD technologies. For isolated effects on health outcomes, we classified the results in one of four categories: (1) likely no effect, (2) unclear, (3) possible positive effect, or (4) likely positive effect. When we categorized the data as “unclear” based solely on health outcomes, we then examined and classified surrogate outcomes for that particular clinical condition. Findings. We identified 114 unique studies that met inclusion criteria. The largest number of studies addressed patients with hypertension (51 studies) and obesity (43 studies). Eighty-four trials used a single PGHD device, 23 used 2 PGHD devices, and the other 7 used 3 or more PGHD devices. Pedometers, blood pressure (BP) monitors, and scales were commonly used in the same studies. Overall, we found a “possible positive effect” of PGHD interventions on health outcomes for coronary artery disease, heart failure, and asthma. For obesity, we rated the health outcomes as unclear, and the surrogate outcomes (body mass index/weight) as likely no effect. For hypertension, we rated the health outcomes as unclear, and the surrogate outcomes (systolic BP/diastolic BP) as possible positive effect. For cardiac arrhythmias or conduction abnormalities, we rated the health outcomes as unclear and the surrogate outcome (time to arrhythmia detection) as likely positive effect. The findings were “unclear” regarding PGHD interventions for diabetes prevention, sleep apnea, stroke, Parkinson’s disease, and chronic obstructive pulmonary disease. Most studies did not report harms related to PGHD interventions; the relatively few harms reported were minor and transient, with event rates usually comparable to harms in the control groups. Few studies reported cost-effectiveness analyses, and only for PGHD interventions for hypertension, coronary artery disease, and chronic obstructive pulmonary disease; the findings were variable across different chronic conditions and devices. Patient adherence to PGHD interventions was highly variable across studies, but patient acceptance/satisfaction and usability were generally fair to good. However, device engineers independently evaluated consumer wearable and handheld BP monitors and considered the user experience to be poor, while their assessment of smartphone-based electrocardiogram monitors found the user experience to be good.
Student volunteers involved in device usability testing of the Weight Watchers Online app found it well-designed and relatively easy to use. Implications. Multiple randomized controlled trials (RCTs) have evaluated some PGHD technologies (e.g., pedometers, scales, BP monitors), particularly for obesity and hypertension, but health outcomes were generally underreported. We found evidence suggesting a possible positive effect of PGHD interventions on health outcomes for four chronic conditions. Lack of reporting of health outcomes and insufficient statistical power to assess these outcomes were the main reasons for “unclear” ratings. The majority of studies on PGHD technologies still focus on non-health-related outcomes. Future RCTs should focus on measurement of health outcomes. Furthermore, future RCTs should be designed to isolate the effect of the PGHD intervention from other components in a multicomponent intervention.
APA, Harvard, Vancouver, ISO, and other styles
7

Hertel, Thomas, David Hummels, Maros Ivanic, and Roman Keeney. How Confident Can We Be in CGE-Based Assessments of Free Trade Agreements? GTAP Working Paper, 2003. http://dx.doi.org/10.21642/gtap.wp26.

Full text
Abstract:
With the proliferation of Free Trade Agreements (FTAs) over the past decade, demand for quantitative analysis of their likely impacts has surged. The main quantitative tool for performing such analysis is Computable General Equilibrium (CGE) modeling. Yet these models have been widely criticized for performing poorly (Kehoe, 2002) and having weak econometric foundations (McKitrick, 1998; Jorgenson, 1984). FTA results have been shown to be particularly sensitive to the trade elasticities, with small trade elasticities generating large terms of trade effects and relatively modest efficiency gains, whereas large trade elasticities lead to the opposite result. Critics are understandably wary of results being determined largely by the authors’ choice of trade elasticities. Where do these trade elasticities come from? CGE modelers typically draw these elasticities from econometric work that uses time series price variation to identify an elasticity of substitution between domestic goods and composite imports (Alaouze, 1977; Alaouze, et al., 1977; Stern et al., 1976; Gallaway, McDaniel and Rivera, 2003). This approach has three problems: the use of point estimates as “truth”, the magnitude of the point estimates, and estimating the relevant elasticity. First, modelers take point estimates drawn from the econometric literature, while ignoring the precision of these estimates. As we will make clear below, the confidence one has in various CGE conclusions depends critically on the size of the confidence interval around parameter estimates. Standard “robustness checks” such as systematically raising or lowering the substitution parameters do not properly address this problem because they ignore information about which parameters we know with some precision and which we do not. A second problem with most existing studies derives from the use of import price series to identify home vs. foreign substitution; this approach tends to systematically understate the true elasticity. This is because these estimates take price variation as exogenous when estimating the import demand functions, and ignore quality variation. When quality is high, import demand and prices will be jointly high. This biases estimated elasticities toward zero. A related point is that the fixed-weight import price series used by most authors are theoretically inappropriate for estimating the elasticities of interest. CGE modelers generally examine a nested utility structure, with domestic production substituting for a CES composite import bundle. The appropriate price series is then the corresponding CES price index among foreign varieties. Constructing such an index requires knowledge of the elasticity of substitution among foreign varieties (see below). By using a fixed-weight import price series, previous estimates place too much weight on high foreign prices, and too small a weight on low foreign prices. In other words, they overstate the degree of price variation that exists, relative to a CES price index. Reconciling small trade volume movements with large import price series movements requires a small elasticity of substitution. This problem, and that of unmeasured quality variation, helps explain why typical estimated elasticities are very small. The third problem with the existing literature is that estimates taken from other researchers’ studies typically employ different levels of aggregation, and exploit different sources of price variation, from what policy modelers have in mind.
Employment of elasticities in experiments ill-matched to their original estimation can be problematic. For example, estimates may be calculated at a higher or lower level of aggregation than the level of analysis the modeler wants to examine. Estimating substitutability across sources for paddy rice gives one a quite different answer than estimates that look at agriculture as a whole. When analyzing Free Trade Agreements, the principal policy experiment is a change in relative prices among foreign suppliers caused by lowering tariffs within the FTA. Understanding the substitution this will induce across those suppliers is critical to gauging the FTA’s real effects. Using home v. foreign elasticities rather than elasticities of substitution among imports supplied from different countries may be quite misleading. Moreover, these “sourcing” elasticities are critical for constructing composite import price series to appropriately estimate home v. foreign substitutability. In summary, the history of estimating the substitution elasticities governing trade flows in CGE models has been checkered at best. Clearly there is a need for improved econometric estimation of these trade elasticities that is well-integrated into the CGE modeling framework. This paper provides such estimation and integration, and has several significant merits. First, we choose our experiment carefully. Our CGE analysis focuses on the prospective Free Trade Agreement of the Americas (FTAA) currently under negotiation. This is one of the most important FTAs currently “in play” in international negotiations. It also fits nicely with the source data used to estimate the trade elasticities, which is largely based on imports into North and South America. Our assessment is done in a perfectly competitive, comparative static setting in order to emphasize the role of the trade elasticities in determining the conventional gains/losses from such an FTA. This type of model is still widely used by government agencies for the evaluation of such agreements. Extensions to incorporate imperfect competition are straightforward, but involve the introduction of additional parameters (markups, extent of unexploited scale economies) as well as structural assumptions (entry/no-entry, nature of inter-firm rivalry) that introduce further uncertainty. Since our focus is on the effects of a PTA, we estimate elasticities of substitution across multiple foreign supply sources. We do not use cross-exporter variation in prices or tariffs alone. Exporter price series exhibit a high degree of multicollinearity and, in any case, would be subject to unmeasured quality variation as described previously. Similarly, tariff variation by itself is typically unhelpful because, by their very nature, Most Favored Nation (MFN) tariffs are non-discriminatory, affecting all suppliers in the same way. Tariff preferences, where they exist, are often difficult to measure – sometimes being confounded by quantitative barriers, restrictive rules of origin, and other restrictions. Instead we employ a unique methodology and data set drawing on not only tariffs, but also bilateral transportation costs for goods traded internationally (Hummels, 1999). Transportation costs vary much more widely than do tariffs, allowing much more precise estimation of the trade elasticities that are central to CGE analysis of FTAs.
We have highly disaggregated commodity trade flow data, and are therefore able to provide estimates that precisely match the commodity aggregation scheme employed in the subsequent CGE model. We follow the GTAP Version 5.0 aggregation scheme, which includes 42 merchandise trade commodities covering food products, natural resources and manufactured goods. With the exception of two primary commodities that are not traded, we are able to estimate, for all merchandise commodities, trade elasticities that are significantly different from zero at the 95% confidence level. Rather than producing point estimates of the resulting welfare, export and employment effects, we report confidence intervals instead. These are based on repeated solution of the model, drawing from a distribution of trade elasticity estimates constructed based on the econometrically estimated standard errors. There is now a long history of CGE studies based on systematic sensitivity analysis (SSA) (Harrison and Vinod, 1992; Wigle, 1991; Pagan and Shannon, 1987) Ho
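For readers unfamiliar with the index discussed in this abstract, a standard textbook form of the exact CES price index over foreign varieties is sketched below (the notation is an assumption of this summary, not reproduced from the paper): with variety prices p_i, CES distribution weights \alpha_i, and elasticity of substitution \sigma among foreign varieties,

P = \left( \sum_{i} \alpha_i \, p_i^{\,1-\sigma} \right)^{\frac{1}{1-\sigma}}, \qquad \sigma > 1.

A fixed-weight index, by contrast, averages the p_i with constant weights, so expensive varieties keep their full weight; this is one way to see why such a series overstates price variation relative to the CES aggregate.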
APA, Harvard, Vancouver, ISO, and other styles