Academic literature on the topic 'Minimum likelihood ratio'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Minimum likelihood ratio.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Minimum likelihood ratio"

1

Li, Chen, and Xiaohu Li. "Likelihood ratio order of sample minimum from heterogeneous Weibull random variables." Statistics & Probability Letters 97 (February 2015): 46–53. http://dx.doi.org/10.1016/j.spl.2014.10.019.

2

Fu, Y. X., and W. H. Li. "Maximum likelihood estimation of population parameters." Genetics 134, no. 4 (August 1, 1993): 1261–70. http://dx.doi.org/10.1093/genetics/134.4.1261.

Abstract:
One of the most important parameters in population genetics is θ = 4Nₑμ, where Nₑ is the effective population size and μ is the rate of mutation per gene per generation. We study two related problems, using the maximum likelihood method and the theory of coalescence. One problem is the potential improvement of accuracy in estimating the parameter θ over existing methods and the other is the estimation of the parameter λ, which is the ratio of two θ's. The minimum variances of estimates of the parameter θ are derived under two idealized situations. These minimum variances serve as the lower bounds of the variances of all possible estimates of θ in practice. We then show that Watterson's estimate of θ based on the number of segregating sites is asymptotically an optimal estimate of θ. However, for a finite sample of sequences, substantial improvement over Watterson's estimate is possible when θ is large. The maximum likelihood estimate of λ = θ₁/θ₂ is obtained and the properties of the estimate are discussed.
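For readers who want to see the central quantity of this abstract in concrete form, here is a minimal sketch (not the authors' code) of Watterson's estimator θ_W = S / a_n and a naive plug-in estimate of λ = θ₁/θ₂; the sample sizes and segregating-site counts below are invented.

```python
# Minimal sketch, not the authors' code: Watterson's estimator and a naive
# plug-in ratio estimate. Sample sizes and segregating-site counts are invented.

def watterson_theta(num_segregating_sites: int, sample_size: int) -> float:
    """theta_W = S / a_n, with a_n = sum_{i=1}^{n-1} 1/i."""
    a_n = sum(1.0 / i for i in range(1, sample_size))
    return num_segregating_sites / a_n

theta1 = watterson_theta(num_segregating_sites=42, sample_size=30)  # hypothetical locus 1
theta2 = watterson_theta(num_segregating_sites=17, sample_size=30)  # hypothetical locus 2
lambda_hat = theta1 / theta2  # naive plug-in estimate of lambda = theta1 / theta2

print(f"theta1 = {theta1:.2f}, theta2 = {theta2:.2f}, lambda = {lambda_hat:.2f}")
```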
3

Pusadan, Mohammad Yazdi, Joko Lianto Buliali, and Raden Venantius Hari Ginardi. "Anomaly detection on flight route using similarity and grouping approach based-on automatic dependent surveillance-broadcast." International Journal of Advances in Intelligent Informatics 5, no. 3 (November 30, 2019): 285. http://dx.doi.org/10.26555/ijain.v5i3.232.

Abstract:
Flight anomaly detection is used to identify abnormal states in flight route data. This study focused on two groups: general aviation habits (C1) and anomalies (C2). Groups C1 and C2 are obtained through a similarity test with references. The methods used are: 1) normalizing the training data form, 2) forming the training segments, 3) calculating the log-likelihood value and determining the maximum log-likelihood (C1) and minimum log-likelihood (C2) values, 4) determining the percentage of data based on criteria C1 and C2 by grouping with SVM, KNN, and K-means, and 5) testing with the log-likelihood ratio. The results achieved in each segment are: the log-likelihood value for C1 latitude is −15.97 and for C1 longitude is −16.97; the log-likelihood values for C2 latitude are −19.3 (maximum) and −20.3 (minimum), and for C2 longitude are −21.2 (maximum) and −24.8 (minimum). The largest percentage value in C1 is 96%, while the largest in C2 is 10%. Thus, the highest potential anomaly rate is 10%, and the smallest is 3%. Performance is also evaluated with the F-measure to obtain accuracy and precision.
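As a rough illustration of the log-likelihood ratio step described in the abstract (not the authors' pipeline), the sketch below scores a hypothetical track segment against a Gaussian model of normal habits (C1) and a Gaussian model of anomalies (C2); all data are simulated.

```python
import numpy as np

rng = np.random.default_rng(0)
reference = rng.normal(loc=0.0, scale=0.1, size=500)  # hypothetical C1 deviations from a route
anomalous = rng.normal(loc=0.5, scale=0.3, size=500)  # hypothetical C2 deviations

def gaussian_loglik(x, data):
    """Log-likelihood of samples x under a Gaussian fitted to `data`."""
    mu, sigma = data.mean(), data.std(ddof=1)
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2))

track = rng.normal(loc=0.45, scale=0.3, size=50)      # new segment to classify (simulated)
llr = gaussian_loglik(track, reference) - gaussian_loglik(track, anomalous)
print("log-likelihood ratio (C1 vs C2):", round(float(llr), 2))
print("classified as:", "normal habit (C1)" if llr > 0 else "potential anomaly (C2)")
```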
4

Buszkiewicz, James H., Heather D. Hill, and Jennifer J. Otten. "Association of State Minimum Wage Rates and Health in Working-Age Adults Using the National Health Interview Survey." American Journal of Epidemiology 190, no. 1 (February 10, 2020): 21–30. http://dx.doi.org/10.1093/aje/kwaa018.

Abstract:
States adopt minimum wages to improve workers’ economic circumstances and well-being. Many studies, but not all, find evidence of health benefits from higher minimum wages. This study used a rigorous “triple difference” strategy to identify the associations between state minimum wages and adult obesity, body mass index (weight (kg)/height (m)²), hypertension, diabetes, fair or poor health, and serious psychological distress. National Health Interview Survey data (United States, 2008–2015) on adults aged 25–64 years (n = 131,430) were linked to state policies to estimate the prevalence odds ratio or mean difference in these outcomes associated with a $1 increase in current and 2-year lagged minimum wage among less-educated adults overall and by sex, race/ethnicity, and age. In contrast to prior studies, there was no association between current minimum wage and health; however, 2-year lagged minimum wage was positively associated with the likelihood of obesity (prevalence odds ratio = 1.08, 95% confidence interval: 1.00, 1.16) and with elevated body mass index (mean difference = 0.27, 95% confidence interval: 0.04, 0.49). In subgroup models, current and 2-year lagged minimum wage were associated with a higher likelihood of obesity among male and non-White or Hispanic adults. The associations with hypertension also varied by sex and the timing of the exposure.
5

Rosinsky, Philip J., Jeffery W. Chen, Mitchell J. Yelton, Ajay C. Lall, David R. Maldonado, Mitchell B. Meghpara, Jacob Shapira, and Benjamin G. Domb. "Does failure to meet threshold scores for mHHS and iHOT-12 correlate to secondary operations following hip arthroscopy?" Journal of Hip Preservation Surgery 7, no. 2 (April 14, 2020): 272–80. http://dx.doi.org/10.1093/jhps/hnaa015.

Abstract:
The purpose of this study was to determine (i) if failing to achieve a patient-reported outcome (PRO) threshold at 1 year was associated with secondary operations at minimum 2-year follow-up and (ii) what outcome measure and threshold has the highest association with future surgeries. Inclusion criteria for this study were cases of primary hip arthroscopy between July 2014 and April 2017. Included patients had recorded pre-operative and 1-year post-operative modified Harris Hip Score (mHHS) and 12-item international Hip Outcome Tool (iHOT-12) scores. Patients were classified based on their ability to achieve the minimal clinically important difference (MCID), substantial clinical benefit (SCB) and patient acceptable symptom state (PASS) for each PRO and the status of secondary operations at minimum 2-year follow-up. The sensitivity, specificity, accuracy, positive likelihood ratio and negative likelihood ratio for these thresholds were calculated. Of 425 eligible cases, 369 (86.8%) had minimum 2-year follow-up. Of the included patients, 28 underwent secondary operations (7.59%), with 14 undergoing secondary arthroscopies (3.79%) and 14 converting to total hip arthroplasty (3.79%). For mHHS, 267 (72.4%), 173 (46.9%) and 277 (75.1%) hips met MCID, SCB and PASS, respectively. For iHOT-12, 234 (63.4%), 218 (59.1%) and 280 (75.9%) hips met the respective thresholds. The highest specificity, sensitivity and accuracy were identified for iHOT-12 MCID (0.79), iHOT-12 PASS (0.79) and iHOT-12 MCID (0.77), respectively. Patients not attaining MCID and PASS for mHHS and iHOT-12 at 1 year post-operatively are at increased risk of secondary operation. The most accurate threshold associated with secondary operation (0.77) is not achieving iHOT-12 MCID. Level of evidence: retrospective case series, level IV.
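The threshold statistics reported here follow directly from a 2×2 table of "failed to reach a PRO threshold at 1 year" versus "secondary operation by 2 years". The sketch below shows the standard formulas; the cell counts are invented, not taken from the study.

```python
# Invented 2x2 counts: rows = failed threshold (yes/no), columns = secondary operation (yes/no).
def threshold_stats(tp, fp, fn, tn):
    sens = tp / (tp + fn)                  # of reoperated patients, fraction who had failed the threshold
    spec = tn / (tn + fp)                  # of non-reoperated patients, fraction who had met it
    acc = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sens / (1 - spec)             # positive likelihood ratio
    lr_neg = (1 - sens) / spec             # negative likelihood ratio
    return sens, spec, acc, lr_pos, lr_neg

sens, spec, acc, lr_pos, lr_neg = threshold_stats(tp=20, fp=75, fn=8, tn=266)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} accuracy={acc:.2f} "
      f"LR+={lr_pos:.2f} LR-={lr_neg:.2f}")
```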
6

Huang, Kou‐Yuan, and King‐sun Fu. "Syntactic pattern recognition for the classification of Ricker wavelets." GEOPHYSICS 50, no. 10 (October 1985): 1548–55. http://dx.doi.org/10.1190/1.1441845.

Abstract:
Syntactic pattern recognition techniques are applied to the analysis of one‐dimensional seismic traces for classification of Ricker wavelets. The system for one‐dimensional seismic analysis includes a likelihood ratio test, optimal amplitude‐dependent encoding, probability of detecting the signal involved in the global and local detection, plus minimum‐distance and nearest‐neighbor classification rules. The relation between error probability and Levenshtein distance is proposed.
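A minimal sketch of the string-matching step such a syntactic recognizer relies on: Levenshtein distance between an observed symbol-encoded trace and class prototypes, followed by a minimum-distance decision. The encodings below are invented placeholders, not the paper's codebook.

```python
# Illustrative sketch (not the paper's system): encode a trace as a symbol
# string and classify it by minimum Levenshtein distance to class prototypes.
def levenshtein(a: str, b: str) -> int:
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        cur = [i]
        for j, cb in enumerate(b, start=1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

prototypes = {"ricker_A": "abccba", "ricker_B": "aabbaa"}   # hypothetical wavelet encodings
observed = "abcbba"                                          # hypothetical noisy encoding
label = min(prototypes, key=lambda k: levenshtein(observed, prototypes[k]))
print("nearest prototype:", label)
```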
7

Turovsky, A. L., and O. V. Drobik. "PROCEDURE FOR EVALUATION OF THE SUPPORTING FREQUENCY SIGNAL OF THE SATELLITE COMMUNICATION SYSTEM IN CONTINUOUS MODE." Radio Electronics, Computer Science, Control, no. 2 (June 26, 2021): 28–38. http://dx.doi.org/10.15588/1607-3274-2021-2-3.

Abstract:
Context. One of the features of satellite communication systems is the advantageous use, when receiving a signal in continuous mode, of phase modulation for the signals intended to carry useful information. The use of this type of modulation requires solving the problem of estimating the carrier frequency of the signal, and the estimation itself reduces to estimating the frequency of the maximum in the spectrum of a fragment of a sinusoidal signal against the background of additive Gaussian noise. The article considers the process of estimating the carrier frequency of a signal received by a satellite communication system in continuous mode according to the maximum likelihood rule. Objective. Development of a procedure for estimating the carrier frequency of a signal received by a satellite communication system in continuous mode according to the maximum likelihood rule. Method. The procedure proposed in the work, and the algorithm developed on its basis, make it possible to estimate the carrier frequency according to the maximum likelihood rule while taking into account uncertainty in all parameters of the signal received by the satellite communication system in continuous mode. Results. For practical introduction of this algorithm into operating satellite communication schemes, schemes for its hardware implementation are proposed. To illustrate the relationship between the bounds on the minimum limiting variance of the carrier frequency estimate, the paper presents dependencies that allow comparison of the minimum limiting variance defined by the Cramér-Rao lower bound and the minimum limiting variance determined taking into account uncertainty in all signal parameters. Conclusions. Analysis of these dependencies showed that, in real conditions, the minimum variance of the maximum likelihood carrier frequency estimate obtained by a satellite communication system in continuous mode under uncertainty of all signal parameters may differ significantly from the minimum variance given by the Cramér-Rao lower bound. Future research, development and creation of algorithms and techniques for estimating the carrier frequency at the minimum limiting variance under uncertainty of all parameters of the received signal should aim to bring the minimum limiting variance of the estimated carrier frequency as close as possible to the Cramér-Rao lower bound that applies when the other signal parameters are known.
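For orientation, the sketch below shows the textbook version of the problem described here: maximum likelihood estimation of a tone's frequency via the periodogram peak, compared against the classic Rife-Boorstyn Cramér-Rao lower bound. All parameter values are invented, and the simplifications (single tone, known noise statistics) go beyond what the paper assumes.

```python
import numpy as np

fs, n, f0, snr_db = 1.0e6, 1024, 123_456.0, 10.0   # sample rate, samples, true carrier, SNR (invented)
snr = 10 ** (snr_db / 10)                           # A^2 / (2 sigma^2) with amplitude A = 1
sigma = 1.0 / np.sqrt(2 * snr)
t = np.arange(n) / fs
rng = np.random.default_rng(1)
x = np.cos(2 * np.pi * f0 * t + 0.7) + sigma * rng.standard_normal(n)

# ML estimate for unknown amplitude/phase: locate the periodogram peak (coarse grid via zero-padding).
nfft = 16 * n
spec = np.abs(np.fft.rfft(x, nfft)) ** 2
f_hat = np.fft.rfftfreq(nfft, d=1 / fs)[np.argmax(spec)]

# Rife-Boorstyn Cramér-Rao lower bound on the frequency-estimate variance (Hz^2).
crlb = 12 * fs**2 / ((2 * np.pi) ** 2 * snr * n * (n**2 - 1))
print(f"f_hat = {f_hat:.1f} Hz, CRLB standard deviation ≈ {np.sqrt(crlb):.2f} Hz")
```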
8

Tam, Vincent H., Amy N. Schilling, Shadi Neshat, Keith Poole, David A. Melnick, and Elizabeth A. Coyle. "Optimization of Meropenem Minimum Concentration/MIC Ratio To Suppress In Vitro Resistance of Pseudomonas aeruginosa." Antimicrobial Agents and Chemotherapy 49, no. 12 (December 2005): 4920–27. http://dx.doi.org/10.1128/aac.49.12.4920-4927.2005.

Abstract:
Suppression of resistance in a dense Pseudomonas aeruginosa population has previously been shown with optimized quinolone exposures. However, the relevance to β-lactams is unknown. We investigated the bactericidal activity of meropenem and its propensity to suppress P. aeruginosa resistance in an in vitro hollow-fiber infection model (HFIM). Two isogenic strains of P. aeruginosa (wild type and an AmpC stably derepressed mutant [MIC = 1 mg/liter]) were used. An HFIM inoculated with approximately 1 × 10⁸ CFU/ml of bacteria was subjected to various meropenem exposures. Maintenance doses were given every 8 h to simulate the maximum concentration achieved after a 1-g dose in all regimens, but escalating unbound minimum concentrations (Cmins) were simulated with different clearances. Serial samples were obtained over 5 days to quantify the meropenem concentrations, the total bacterial population, and subpopulations with reduced susceptibilities to meropenem (>3× the MIC). For both strains, a significant bacterial burden reduction was seen with all regimens at 24 h. Regrowth was apparent after 3 days, with the Cmin/MIC ratio being ≤1.7 (time above the MIC, 100%). Selective amplification of subpopulations with reduced susceptibilities to meropenem was suppressed with a Cmin/MIC of ≥6.2 or by adding tobramycin to meropenem (Cmin/MIC = 1.7). Investigations that were longer than 24 h and that used high inocula may be necessary to fully evaluate the relationship between drug exposures and the likelihood of resistance suppression. These results suggest that the Cmin/MIC of meropenem can be optimized to suppress the emergence of non-plasmid-mediated P. aeruginosa resistance. Our in vitro data support the use of an extended duration of meropenem infusion for the treatment of severe nosocomial infections in combination with an aminoglycoside.
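As a back-of-the-envelope illustration of the exposure index discussed in this abstract, the sketch below computes a trough concentration from a simple one-compartment model and forms the Cmin/MIC ratio. The pharmacokinetic parameters are invented and this is not the study's dosing model.

```python
import math

# Hedged sketch: one-compartment, first-order elimination, so the unbound trough
# after a dosing interval tau is Cmin = Cmax * exp(-k * tau). All values invented.
cmax_unbound = 50.0      # mg/L shortly after the 1-g dose (hypothetical)
half_life_h = 3.0        # hypothetical elimination half-life
tau_h = 8.0              # dosing interval from the abstract (every 8 h)
mic = 1.0                # mg/L, MIC of the derepressed mutant in the abstract

k = math.log(2) / half_life_h
cmin = cmax_unbound * math.exp(-k * tau_h)
print(f"Cmin ≈ {cmin:.2f} mg/L, Cmin/MIC ≈ {cmin / mic:.1f} "
      f"(abstract's suppression target: Cmin/MIC ≥ 6.2)")
```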
9

Juskowiak, Jochen, and Bernd Bertsche. "Application and Simulation Study of Stress-Dependent Weibull Lifetime Models." International Journal of Reliability, Quality and Safety Engineering 23, no. 02 (April 2016): 1650008. http://dx.doi.org/10.1142/s021853931650008x.

Abstract:
Different Weibull lifetime models are presented whose scale, shape and minimum lifetime parameters are stress-dependent. This allows describing and predicting the lifetime of products with a Weibull distribution more accurately wherever stress-dependence applies to the failure mechanism. For instance, this is the case for failures due to fatigue, on which this paper focusses. The proposed procedure encompasses a two-step maximum likelihood estimation and a Fisher matrix (FM) confidence bounds calculation, followed by a model evaluation. This model evaluation is conducted by means of a general plausibility check (PC), likelihood ratio test (LRT) and Bayesian information criterion (BIC). Their applicability to accelerated life test data is discussed and validated using test data. Finally, a simulation study confirms a wide range of applicability.
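A minimal sketch of the model-evaluation tools named in the abstract (maximum likelihood fitting, likelihood ratio test, BIC), applied to a much simpler question than the paper's stress-dependent models: whether a three-parameter Weibull with a minimum-lifetime (location) parameter is preferred over a two-parameter Weibull on simulated data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulated lifetimes with a true minimum lifetime of 200 (all values invented).
lifetimes = 200.0 + stats.weibull_min.rvs(c=2.0, scale=1000.0, size=300, random_state=rng)

# Reduced model: 2-parameter Weibull (location fixed at zero).
c0, loc0, scale0 = stats.weibull_min.fit(lifetimes, floc=0.0)
ll0 = np.sum(stats.weibull_min.logpdf(lifetimes, c0, loc0, scale0))

# Full model: 3-parameter Weibull, minimum lifetime (location) estimated as well.
c1, loc1, scale1 = stats.weibull_min.fit(lifetimes)
ll1 = np.sum(stats.weibull_min.logpdf(lifetimes, c1, loc1, scale1))

lrt = 2.0 * (ll1 - ll0)                    # likelihood ratio test statistic
p_value = stats.chi2.sf(lrt, df=1)         # one additional parameter
n = len(lifetimes)
bic_reduced = 2 * np.log(n) - 2 * ll0
bic_full = 3 * np.log(n) - 2 * ll1
print(f"LRT = {lrt:.1f}, p = {p_value:.3g}, "
      f"BIC reduced = {bic_reduced:.1f}, BIC full = {bic_full:.1f}")
```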
10

S, Erfiyani, and Amira Permatasari Tarigan. "KETEPATAN PEMERIKSAAN RADIOLOGI DAN BTA APUSAN LANGSUNG DENGAN KULTUR DALAM DIAGNOSIS TUBERKULOSIS PARU DI MEDAN" [Accuracy of radiological and direct AFB smear examination compared with culture in the diagnosis of pulmonary tuberculosis in Medan]. Jurnal Ilmiah PANNMED (Pharmacist, Analyst, Nurse, Nutrition, Midwivery, Environment, Dentist) 9, no. 3 (January 28, 2019): 238–44. http://dx.doi.org/10.36911/pannmed.v9i3.214.

Abstract:
Pulmonary tuberculosis is a transmissible disease caused by Mycobacterium tuberculosis spread through droplet nuclei. It is usually diagnosed by examination of direct sputum smears, chest X-ray, and culture. This was a descriptive study using a diagnostic test design; the samples consisted of sputum and chest X-rays. Sputum was collected on the spot, in the morning, and on the spot again, and was examined using the Ziehl-Neelsen method and culture. The objective of the research was to determine the accuracy of direct BTA (acid-fast bacilli) smear and radiological examination compared with culture. Samples were taken from private practices of tuberculosis specialists in Medan and from BP4 Medan; patients who fulfilled the inclusion criteria underwent radiological examination and three sputum collections for direct BTA smear and culture, after which the radiology and direct-smear diagnostic tests were compared with culture. For the 60 samples, the direct smear method showed a sensitivity of 59.38%, specificity of 92.86%, positive predictive value of 90.48%, negative predictive value of 66.7%, positive likelihood ratio of 8.31, and negative likelihood ratio of 0.44. The radiology method showed a sensitivity of 62.63%, specificity of 82.14%, positive predictive value of 80.77%, negative predictive value of 67.65%, positive likelihood ratio of 3.67, and negative likelihood ratio of 0.42. Of the 21 BTA-positive samples, 4 (6.7%) showed minimal lung damage, 9 (15.0%) moderate damage, and 5 (8.3%) extensive damage. The results showed that radiological examination had a higher sensitivity than the direct smear method when compared with culture as the gold standard for BTA examination. The clinical benefit was greater for the direct smear method, as seen from its higher positive predictive value compared with the radiology method, so it is recommended that both examination techniques be used to diagnose pulmonary tuberculosis.
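The positive and negative likelihood ratios quoted above follow from the standard definitions LR+ = sensitivity / (1 − specificity) and LR− = (1 − sensitivity) / specificity, as the quick check below shows for the reported values.

```python
# LR+ = sensitivity / (1 - specificity); LR- = (1 - sensitivity) / specificity.
for name, sens, spec in [("direct smear", 0.5938, 0.9286), ("radiology", 0.6263, 0.8214)]:
    print(f"{name}: LR+ = {sens / (1 - spec):.2f}, LR- = {(1 - sens) / spec:.2f}")
# The direct-smear values come out very close to the reported 8.31 and 0.44; the
# radiology figures differ somewhat from the reported 3.67 and 0.42, presumably
# because of rounding in the published percentages or the exact counts used.
```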

Dissertations / Theses on the topic "Minimum likelihood ratio"

1

Hedell, Ronny. "Rarities of genotype profiles in a normal Swedish population." Thesis, Linköpings universitet, Matematiska institutionen, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-59708.

Abstract:
Investigation of stains from crime scenes is commonly used in the search for criminals. At The National Laboratory of Forensic Science, where these stains are examined, a number of questions of theoretical and practical interest regarding the databases of DNA profiles and the strength of DNA evidence against a suspect in a trial are not fully investigated. The first part of this thesis deals with how a sample of DNA profiles from a population is used in the process of estimating the strength of DNA evidence in a trial, taking population genetic factors into account. We then consider how to combine hypotheses regarding the relationship between a suspect and other possible donors of the stain from the crime scene by two applications of Bayes’ theorem. After that we assess the DNA profiles that minimize the strength of DNA evidence against a suspect, and investigate how the strength is affected by sampling error using the bootstrap method and a Bayesian method. In the last part of the thesis we examine discrepancies between different databases of DNA profiles by both descriptive and inferential statistics, including likelihood ratio tests and Bayes factor tests. Little evidence of major differences is found.
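For context, the sketch below shows the textbook single-source likelihood ratio that such work builds on, LR = 1 / (genotype frequency) under Hardy-Weinberg, multiplied across independent loci. It omits the subpopulation (θ) corrections and sampling-uncertainty analysis the thesis actually addresses, and the allele frequencies are invented.

```python
import math

# Textbook sketch of the likelihood ratio for single-source DNA evidence:
# LR = P(E | suspect is the donor) / P(E | unknown donor) = 1 / genotype frequency,
# with 2*p*q for heterozygotes and p^2 for homozygotes under Hardy-Weinberg.
# Allele frequencies below are invented.
profile = [("het", 0.12, 0.08), ("hom", 0.21, None), ("het", 0.05, 0.30)]

log10_lr = 0.0
for kind, p, q in profile:
    geno_freq = 2 * p * q if kind == "het" else p ** 2
    log10_lr += -math.log10(geno_freq)     # per-locus LR = 1 / genotype frequency

print(f"combined LR ≈ 10^{log10_lr:.1f}")
```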
2

Eliasson, Björn. "Voice Activity Detection and Noise Estimation for Teleconference Phones." Thesis, Umeå universitet, Institutionen för matematik och matematisk statistik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-108395.

Abstract:
When communicating via a teleconference phone, the desired transmitted signal (speech) needs to be crystal clear so that all participants can communicate well. However, many environmental conditions contaminate the signal with background noise, i.e. sounds not of interest for communication purposes, which impede the ability to communicate. Noise can be removed from the signal if it is known, so this work evaluated different ways of estimating the characteristics of the background noise. Focus was put on using speech detection to define the noise, i.e. the non-speech part of the signal, but other methods not solely reliant on speech detection, and instead based on characteristics of the noisy speech signal, were included. The implemented techniques were compared with the current solution used by the teleconference phone in two ways: firstly for their speech detection ability and secondly for their ability to correctly estimate the noise characteristics. The evaluation was based on simulations of the methods' performance in various noise conditions, ranging from harsh to mild environments. It was shown that the proposed method improved on the existing solution, as implemented in this study, in terms of speech detection ability, and for the noise estimate it showed improvement in certain conditions. It was also concluded that using the proposed method would enable two sources of noise estimation compared to the current single estimation source, and it was suggested to investigate how utilizing two noise estimators could affect performance.
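A simplified sketch of the kind of likelihood-ratio voice activity detector with recursive noise estimation evaluated in such work (not the thesis's exact method): each frame is tested as zero-mean Gaussian noise versus noise plus speech, and the noise variance is updated only on frames declared noise. All signals and parameters are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
frame_len = 160
noise_var_est, speech_var_guess, alpha, threshold = 0.01, 0.5, 0.95, 0.0

def frame_llr(x, var_noise, var_speech_plus_noise):
    """Log-likelihood ratio of 'speech present' vs 'noise only' for one frame."""
    ll0 = np.sum(-0.5 * np.log(2 * np.pi * var_noise) - x**2 / (2 * var_noise))
    ll1 = np.sum(-0.5 * np.log(2 * np.pi * var_speech_plus_noise) - x**2 / (2 * var_speech_plus_noise))
    return ll1 - ll0

for i in range(20):                                  # synthetic stream: a speech burst in frames 8-11
    x = rng.normal(0.0, 0.1, frame_len)              # background noise
    if 8 <= i < 12:
        x += rng.normal(0.0, 0.7, frame_len)         # add "speech"
    llr = frame_llr(x, noise_var_est, noise_var_est + speech_var_guess)
    is_speech = llr > threshold
    if not is_speech:                                # update the noise estimate on noise-only frames
        noise_var_est = alpha * noise_var_est + (1 - alpha) * float(np.mean(x**2))
    print(f"frame {i:2d}: LLR = {llr:9.1f} -> {'speech' if is_speech else 'noise'}")
```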
3

Babakeshizadeh, Vahid. "Biologically-inspired Motion Control for Kinematic Redundancy Resolution and Self-sensing Exploitation for Energy Conservation in Electromagnetic Devices." Thesis, 2014. http://hdl.handle.net/10012/8123.

Abstract:
This thesis investigates particular topics in advanced motion control of two distinct mechanical systems: human-like motion control of redundant robot manipulators, and advanced sensing and control for energy-efficient operation of electromagnetic devices. Control of robot manipulators for human-like motions has been one of the challenging topics in robot control for over half a century. The first part of this thesis considers methods that exploit robot manipulators' degrees of freedom for such purposes. The Jacobian transpose control law is investigated as one of the well-known controllers, and sufficient conditions for its universal convergence are derived by using the concepts of "stability on a manifold" and "transferability to a sub-manifold". Firstly, a modification of this method is proposed to enhance the rectilinear trajectory of the robot end-effector. Secondly, an abridged Jacobian controller is proposed that exploits passive control of joints to reduce the attended degrees of freedom of the system. Finally, the application of a minimally-attended controller for human-like motion is introduced. Electromagnetic (EM) access control systems are a growing class of electronic systems used in applications where conventional mechanical locks may not guarantee the expected safety of the peripheral doors of buildings. In the second part of this thesis, an intelligent EM unit is introduced which recruits the self-sensing capability of the original EM block for detection purposes. The proposed EM device optimizes its energy consumption through a control strategy which regulates the supply to the system upon detection of any imminent disturbance. Therefore, it draws a very small current when full power is not needed. The performance of the proposed control strategy was evaluated based on a standard safety requirement for EM locking mechanisms. For a particular EM model, the proposed method is verified to realize a 75% reduction in power consumption.
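For the robotics part, the sketch below shows the Jacobian-transpose control law discussed in the abstract, q_dot = J(q)^T K (x_des − x), applied to a planar three-link arm (a redundant manipulator for a two-dimensional task). Link lengths, gains and the target are invented.

```python
import numpy as np

# Minimal sketch of Jacobian-transpose kinematic control for a planar 3-link arm.
L = np.array([1.0, 0.8, 0.6])                # hypothetical link lengths

def fk(q):                                   # end-effector position
    angles = np.cumsum(q)
    return np.array([np.sum(L * np.cos(angles)), np.sum(L * np.sin(angles))])

def jacobian(q):
    angles = np.cumsum(q)
    J = np.zeros((2, 3))
    for i in range(3):
        J[0, i] = -np.sum(L[i:] * np.sin(angles[i:]))
        J[1, i] = np.sum(L[i:] * np.cos(angles[i:]))
    return J

q = np.array([0.3, -0.2, 0.4])               # initial joint angles (invented)
x_des, K, dt = np.array([1.2, 1.1]), 2.0, 0.01
for _ in range(2000):                        # simple kinematic simulation
    e = x_des - fk(q)
    q = q + dt * (jacobian(q).T @ (K * e))   # Jacobian-transpose update
print("final position:", fk(q).round(3), "target:", x_des)
```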

Book chapters on the topic "Minimum likelihood ratio"

1

"Paddlefish Management, Propagation, and Conservation in the 21st Century." In Paddlefish Management, Propagation, and Conservation in the 21st Century, edited by JANICE A. KERNS, PHILLIP W. BETTOLI, and GEORGE D. SCHOLTEN. American Fisheries Society, 2009. http://dx.doi.org/10.47886/9781934874127.ch20.

Abstract:
We present information on delayed mortality of commercially exploited paddlefish Polyodon spathula released as bycatch in Kentucky Lake, Tennessee–Kentucky, an impoundment on the lower Tennessee River. Minimum size limits enacted in 2002 (864 mm eye-to-fork length [EFL]) and 2005 (914 mm EFL) sought to protect paddlefish from overfishing. In 2005, bycatch of sublegal paddlefish represented 75% of the total catch, and releasing undersized fish will not reduce fishing mortality unless those fish survive. Paddlefish caught and released by commercial fishers in 2005 and 2006 were externally tagged with radio transmitters and tracked a minimum of 2 weeks to estimate delayed mortality. Four of the 104 tagged paddlefish died following release, 94 survived, and 6 were censored because their fate could not be determined. Paddlefish that survived moved rapidly from release locations. Net movements of the 94 fish that survived averaged 12.0 km (SE = 5.3) upriver and ranged from 91.5 km downriver to 390.0 km upriver. Fish that died could not be distinguished from fish that lived on the basis of mean water temperature, fish length, net-soak time, or handling time. Given the low delayed mortality of discarded paddlefish, imposing minimum size limits is a reasonable approach to reduce fishing mortality of juveniles and reduce the likelihood of overfishing. Efforts to reduce fishing mortality should focus on avoiding fishing gear and seasons (e.g., early fall and late spring) that cause high initial bycatch mortality.
2

Baten, Azizul. "Kuala Lumpur Stock Exchange Traded Bank Performance with Stochastic Frontiers." In Handbook of Research on Strategic Developments and Regulatory Practice in Global Finance, 16–33. IGI Global, 2015. http://dx.doi.org/10.4018/978-1-4666-7288-8.ch002.

Abstract:
Banks are designed to be efficient, as they play a vital role in economic development; otherwise, banks may create obstacles in the development process of any country. This chapter employs an appropriate stochastic frontier model to investigate the performance of banks traded in the Kuala Lumpur Stock Exchange (KLSE) market. Based on the likelihood ratio test, the Cobb-Douglas stochastic frontier model is found to be preferable to the Translog stochastic frontier model for this study. Market data are used as the input and output variables. Banks traded on the KLSE exhibited a commendable overall efficiency level of 99.52% during 2005-2009, suggesting minimal input waste of 0.48%. Among the banks, Rashid Hussain bank is found to be the most efficient, with a score of 0.9973, while BIMB (BIMB Holdings) bank has the lowest efficiency, with a score of 0.9917. The results also show that the technical efficiency effect increased over time.
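The model choice described here rests on a standard likelihood ratio test between the nested Cobb-Douglas and Translog frontiers. The sketch below shows the mechanics of that test; the log-likelihood values and the number of restrictions are invented, not taken from the chapter.

```python
from scipy import stats

# Sketch of the likelihood ratio test between nested frontier specifications.
# In the chapter these log-likelihoods come from the fitted stochastic frontier models;
# the numbers below are invented for illustration.
loglik_cobb_douglas = -152.4      # restricted model (hypothetical)
loglik_translog = -149.8          # unrestricted model (hypothetical)
n_restrictions = 3                # squared/cross terms dropped in Cobb-Douglas (hypothetical)

lr = -2.0 * (loglik_cobb_douglas - loglik_translog)
p_value = stats.chi2.sf(lr, df=n_restrictions)
print(f"LR = {lr:.2f}, p = {p_value:.3f}")
print("Cobb-Douglas is adequate" if p_value > 0.05 else "prefer Translog")
```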

Conference papers on the topic "Minimum likelihood ratio"

1

Liu, Gang, Yu-Jing Cui, Hong-Gang Zhang, and Jun Guo. "Sample Selection Based on Minimum Likelihood Ratio." In 2007 International Conference on Machine Learning and Cybernetics. IEEE, 2007. http://dx.doi.org/10.1109/icmlc.2007.4370105.

2

Chen, Yi-Chen. "Minimax and generalized likelihood ratio test for noncoherent FH/BFSK in band multitone jamming." In ICASSP 2012 - 2012 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2012. http://dx.doi.org/10.1109/icassp.2012.6288554.

3

Balasubramanian, Karthik, Mark G. Turner, and Kiran Siddappaji. "Novel Curvature-Based Airfoil Parameterization for Wind Turbine Application and Optimization." In ASME Turbo Expo 2017: Turbomachinery Technical Conference and Exposition. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/gt2017-65153.

Abstract:
The direct proportionality of streamline curvature to the pressure gradient normal to it causes the dependence of surface pressure loading on geometry curvature. This allows the use of geometry curvature as a direct and aerodynamically meaningful interface to modify and improve the performance of wind turbine sections. A novel blade parameterization technique driven by specification of the meanline second derivative and a thickness distribution is presented. This technique is implemented in T-Blade3, an existing in-house open executable. The second derivative, which is indicative of curvature, enables exploration of a large design space with a minimal number of parameters thanks to the use of B-spline control points, which can produce smooth curves with only a few points. New thickness and curvature control capabilities have been added to T-Blade3 for isolated and wind turbine airfoils. The parameterization ensures curvature and slope-of-curvature continuity on the airfoil surface, which are critical to a smooth surface pressure distribution. Consequently, losses due to unintentional pressure spikes are minimized and the likelihood of separation is reduced. As a demonstration of the parameterization capability, multi-objective optimization is carried out to maximize wind turbine efficiency. This is achieved through an optimization tool-chain that minimizes a weighted sum of the drag-to-lift ratios over a range of angles of attack and sectional Reynolds numbers using a genetic algorithm. This allows for radial Reynolds number variation and ensures efficiency of the wind turbine blade with twist incorporated. The tool-chain uses XFOIL to evaluate drag polars. It is implemented in MATLAB and Python, in serial and in parallel, with the US Department of Energy optimization system, DAKOTA. The Python and DAKOTA versions of the code are fully open-source. The 21%-thick NREL S809 horizontal-axis wind turbine laminar-flow airfoil has been used as a benchmark for comparison; hence, the optimization is carried out with the same thickness-to-chord ratio. Drag coefficient improvements ranging from 17% to 55% for Cl between 0.3 and 1 were achieved.
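A minimal sketch of the core idea in this parameterization: prescribe the meanline's second derivative (a curvature-like quantity) and integrate twice to obtain a smooth camber line. A real T-Blade3 run uses B-spline control points; here a simple analytic second-derivative distribution stands in for the spline, and all values are invented.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 201)                      # chordwise coordinate
d2y = -6.0 * np.sin(np.pi * x) * (1.0 - 0.6 * x)    # hypothetical second-derivative specification

def cumtrapz(y, x):                                 # cumulative trapezoid rule, starting at 0
    return np.concatenate(([0.0], np.cumsum(0.5 * (y[1:] + y[:-1]) * np.diff(x))))

dy = cumtrapz(d2y, x)                               # slope of the meanline
y = cumtrapz(dy, x)                                 # meanline before closing the trailing edge
y -= x * y[-1]                                      # shear so the trailing edge returns to y = 0
print("max camber ≈", round(float(y.max()), 4), "at x ≈", round(float(x[y.argmax()]), 2))
```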