Academic literature on the topic 'Log-normal distribution model'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Log-normal distribution model.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Log-normal distribution model"

1

Monteiro, Michael J. "Fitting molecular weight distributions using a log-normal distribution model." European Polymer Journal 65 (April 2015): 197–201. http://dx.doi.org/10.1016/j.eurpolymj.2015.01.009.

2

Ozel, Gamze, Emrah Altun, Morad Alizadeh, and Mahdieh Mozafari. "The Odd Log-Logistic Log-Normal Distribution with Theory and Applications." Advances in Data Science and Adaptive Analysis 10, no. 04 (October 2018): 1850009. http://dx.doi.org/10.1142/s2424922x18500092.

Abstract:
In this paper, a new heavy-tailed distribution is used to model data with a strong right tail, as often occurring in practical situations. The proposed distribution is derived from the log-normal distribution by using the odd log-logistic distribution. Statistical properties of this distribution, including the hazard function, moments, quantile function, and asymptotics, are derived. The unknown parameters are estimated by the maximum likelihood estimation procedure. For different parameter settings and sample sizes, a simulation study is performed and the performance of the new distribution is compared to the beta log-normal. The new lifetime model can be very useful, and its superiority is illustrated by means of two real data sets.
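The odd log-logistic construction referred to in this abstract has a simple closed form: a baseline CDF F is mapped to G(x) = F(x)^a / (F(x)^a + (1 - F(x))^a). Below is a minimal Python sketch of this family with a log-normal baseline, including a direct maximum-likelihood fit; the function names, starting values and bounds are my own assumptions, not the authors' code.

```python
import numpy as np
from scipy import stats, optimize

def oll_lognormal_cdf(x, a, mu, sigma):
    # Odd log-logistic transform of a log-normal baseline:
    # G(x) = F^a / (F^a + (1 - F)^a), with F the log-normal CDF.
    F = stats.lognorm.cdf(x, s=sigma, scale=np.exp(mu))
    return F**a / (F**a + (1.0 - F)**a)

def oll_lognormal_pdf(x, a, mu, sigma):
    # Density from differentiating G: a * f * (F(1-F))^(a-1) / (F^a + (1-F)^a)^2.
    F = stats.lognorm.cdf(x, s=sigma, scale=np.exp(mu))
    f = stats.lognorm.pdf(x, s=sigma, scale=np.exp(mu))
    return a * f * (F * (1.0 - F))**(a - 1.0) / (F**a + (1.0 - F)**a)**2

def fit_oll_lognormal(x):
    # Maximum-likelihood estimation of (a, mu, sigma) by direct
    # minimisation of the negative log-likelihood.
    nll = lambda p: -np.sum(np.log(oll_lognormal_pdf(x, *p) + 1e-300))
    start = [1.0, np.mean(np.log(x)), np.std(np.log(x))]
    res = optimize.minimize(nll, start, method="L-BFGS-B",
                            bounds=[(0.05, 20.0), (-10.0, 10.0), (1e-3, 10.0)])
    return res.x
```

With a = 1 the family reduces exactly to the baseline log-normal, which gives a quick correctness check.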
3

Monteiro, Michael J., and Mikhail Gavrilov. "Characterization of hetero-block copolymers by the log-normal distribution model." Polymer Chemistry 7, no. 17 (2016): 2992–3002. http://dx.doi.org/10.1039/c6py00345a.

4

JIMÉNEZ, J. A., V. ARUNACHALAM, and G. M. SERNA. "OPTION PRICING BASED ON A LOG–SKEW–NORMAL MIXTURE." International Journal of Theoretical and Applied Finance 18, no. 08 (December 2015): 1550051. http://dx.doi.org/10.1142/s021902491550051x.

Abstract:
This paper presents a method for approximating the underlying stock’s distribution by using a Log–Skew–Normal mixture distribution. The basic properties of a mixture of Skew–Normal distributions are reviewed in this paper. We provide a formula for the European option price by assuming that the log price follows a Skew–Normal mixture distribution. We also calculate the “Greeks”, such as delta, gamma and vega. We compare the proposed model with other existing models and consider an example of calibration to real market option data.
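As a rough illustration of this pricing setup (a Monte Carlo sketch, not the paper's closed-form formula), a European call can be priced once log(S_T) is modelled as a skew-normal mixture; the parameter names below are assumptions made for the sketch.

```python
import numpy as np
from scipy import stats

def mc_call_price(K, r, T, weights, xi, omega, alpha, n=400_000, seed=1):
    # Draw log(S_T) from a mixture of skew-normal components
    # (location xi, scale omega, shape alpha), then average the
    # discounted payoff of the European call.
    rng = np.random.default_rng(seed)
    counts = rng.multinomial(n, weights)
    log_s = np.concatenate([
        stats.skewnorm.rvs(a, loc=loc, scale=sc, size=c, random_state=rng)
        for c, loc, sc, a in zip(counts, xi, omega, alpha)])
    payoff = np.maximum(np.exp(log_s) - K, 0.0)
    return np.exp(-r * T) * payoff.mean()
```

With a single component and alpha = 0, log(S_T) is normal and the result should approach the Black-Scholes price of the corresponding log-normal model.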
5

Ashiq, Muhammad, John C. Doering, and Takashi Hosoda. "Bed-load transport model based on fractional size distribution." Canadian Journal of Civil Engineering 33, no. 1 (January 1, 2006): 69–80. http://dx.doi.org/10.1139/l05-086.

Abstract:
Two models based on the fractional size distribution approach, used in conjunction with the excess discharge theory, have been developed by using bed-load data collected from the Roaring River (Rocky Mountain National Park, Colorado) during the summer of 1995. The first model is based on the critical discharge value of individual fractional (IF) sizes, the IF model (for log-normal and non-log-normal size distribution modes), while the other is based on the critical discharge value for total (combined) sizes, the total fractional (TF) sizes model (for log-normal and non-log-normal size distribution modes). The performance of the log-normal size distribution based models was tested with data from the Roaring River, Rich Creek, and Fourmile Creek (three Colorado streams), whereas the performance of the non-log-normal size distribution based models was tested using Pitzbach River data. The performance of the models was also tested by comparing their results with the Inpasihardjo fractional size distribution based model. For all tests, the TF model performed better for both the log-normal and non-log-normal grain size distributions. Key words: fractional size, critical discharge, IF model, TF model, discharge theory, Roaring River.
6

Prataviera, Fábio, Gauss M. Cordeiro, Edwin M. M. Ortega, and Adriano K. Suzuki. "The Odd Log-Logistic Geometric Normal Regression Model with Applications." Advances in Data Science and Adaptive Analysis 11, no. 01n02 (April 2019): 1950003. http://dx.doi.org/10.1142/s2424922x19500037.

Abstract:
In several applications, the distribution of the data is frequently unimodal, asymmetric or bimodal. The regression models commonly used for applications to data with real support are the normal, skew normal, beta normal and gamma normal, among others. We define a new regression model based on the odd log-logistic geometric normal distribution for modeling asymmetric or bimodal data with support in [Formula: see text], which generalizes some known regression models including the widely known heteroscedastic linear regression. We adopt the maximum likelihood method for estimating the model parameters and define diagnostic measures to detect influential observations. For some parameter settings, sample sizes and different systematic structures, various simulations are performed to verify the adequacy of the estimators of the model parameters. The empirical distribution of the quantile residuals is investigated and compared with the standard normal distribution. We prove empirically the usefulness of the proposed models by means of three applications to real data.
7

Gilmour, AR, and KD Atkins. "Modelling the FFDA fibre diameter histogram of fleece wool as a mixture distribution." Australian Journal of Agricultural Research 43, no. 8 (1992): 1777. http://dx.doi.org/10.1071/ar9921777.

Abstract:
The histogram of wool fibre diameters obtained by processing fleece samples through the Fibre Fineness Distribution Analyser (FFDA) machine is modelled as a mixture of two normal distributions fitted on the log scale (model iv). The paper compares this model with a single normal distribution on the natural scale (model i), a single normal distribution on the log scale (model ii) and a mixture of two normal distributions on the natural scale (model iii). When fitted to 2544 fibre diameter histograms from Merino hoggets, these models gave average lack-of-fits, distributed as χ² with 25 degrees of freedom, of 549.6, 190.1, 93.5 and 39.5 for models i to iv respectively. Model iv is proposed as the basis for describing the FFDA fibre diameter histogram in sheep breeding.
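Model iv above is a two-component normal mixture fitted on the log scale. A hedged sketch of how such a mixture can be fitted with the EM algorithm (not the authors' fitting code; the initialisation and iteration count are arbitrary choices):

```python
import numpy as np

def em_two_normal(x, iters=300):
    # EM for a two-component univariate normal mixture. Applied to
    # log(diameter) data this corresponds to "model iv" above.
    mu = np.quantile(x, [0.25, 0.75])          # crude initialisation
    sd = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: component responsibilities for every observation.
        dens = np.stack([
            w[k] / (sd[k] * np.sqrt(2.0 * np.pi))
            * np.exp(-0.5 * ((x - mu[k]) / sd[k]) ** 2)
            for k in range(2)])
        resp = dens / dens.sum(axis=0)
        # M-step: responsibility-weighted weights, means and variances.
        nk = resp.sum(axis=1)
        w = nk / x.size
        mu = (resp * x).sum(axis=1) / nk
        sd = np.sqrt((resp * (x - mu[:, None]) ** 2).sum(axis=1) / nk)
    return w, mu, sd
```

On well-separated log-scale modes the fit recovers the component means, spreads and mixing weights.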
8

Peyton Jones, James C., Saeed Shayestehmanesh, and Jesse Frey. "Parametric modelling of knock intensity data using a dual log-normal model." International Journal of Engine Research 21, no. 6 (September 5, 2018): 1026–36. http://dx.doi.org/10.1177/1468087418796335.

Abstract:
The Pearson test is used to confirm that knock intensity data closely approximate a cyclically independent random process which is therefore fully characterized by its probability density function or cumulative distribution function. Although these distributions are often assumed to be log-normal, other results have shown that the data do not conform to a log-normal distribution at the 5% significance level. A new dual log-normal model is therefore proposed based on the assumption that the data comprise a mixture of two distributions, one knocking and one non-knocking. Methods for estimating the parameters of this model, and for assessing the quality of fit, are presented. The results show a significantly improved model fit.
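The paper's starting point, that a single log-normal is rejected while a two-mode ("dual") log-normal mixture fits, can be illustrated on synthetic data; all parameters below are hypothetical, chosen only to produce a non-knocking and a knocking mode.

```python
import numpy as np
from scipy import stats

# Hypothetical "knock intensity" data: a mixture of a non-knocking and a
# knocking log-normal mode (parameters invented for illustration).
rng = np.random.default_rng(7)
ki = np.concatenate([rng.lognormal(0.0, 0.3, 1500),   # non-knocking mode
                     rng.lognormal(1.8, 0.5, 500)])   # knocking mode

# Fit a single log-normal (location fixed at zero) and test the fit.
shape, loc, scale = stats.lognorm.fit(ki, floc=0)
stat, pvalue = stats.kstest(ki, 'lognorm', args=(shape, loc, scale))
# With two separated modes the single log-normal is clearly rejected,
# which is the situation motivating the dual log-normal model.
```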
9

Kim, J., and H. C. NO. "Model development for fragment-size distribution based on upper-limit log-normal distribution." Nuclear Engineering and Design 349 (August 2019): 86–91. http://dx.doi.org/10.1016/j.nucengdes.2019.04.029.

10

Brook, B. S., C. M. Murphy, D. Breen, A. W. Miles, D. G. Tilley, and A. J. Wilson. "Theoretical Models for the Quantification of Lung Injury Using Ventilation and Perfusion Distributions." Computational and Mathematical Methods in Medicine 10, no. 2 (2009): 139–54. http://dx.doi.org/10.1080/17486700802201592.

Abstract:
This paper describes two approaches to modelling lung disease: one based on a multi-compartment statistical model with a log normal distribution of ventilation perfusion ratio (V˙/Q˙) values; and the other on a bifurcating tree which emulates the anatomical structure of the lung. In the statistical model, the distribution becomes bimodal when the V˙/Q˙ values of a randomly selected number of compartments are reduced by 85% to simulate lung disease. For the bifurcating tree model a difference in flow to the left and right branches coupled with a small random variation in flow ratio between generations results in a log normal distribution of flows in the terminal branches. Restricting flow through branches within the tree to simulate lung disease transforms this log normal distribution to a bimodal one. These results are compatible with those obtained from experiments using the multiple inert gas elimination technique, where log normal distributions of V˙/Q˙ ratio become bimodal in the presence of lung disease.
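The tree mechanism described above (a terminal flow is a product of many random branching fractions, hence approximately log-normal by the central limit theorem applied to the logs) is easy to reproduce. This is a sketch of the mechanism only, with arbitrary branching parameters, not the authors' anatomical model.

```python
import numpy as np

def terminal_flows(generations=16, mean_ratio=0.45, jitter=0.02, seed=0):
    # At each branch point a fraction r of the flow goes to one child and
    # 1 - r to the other, with r varying randomly between branches and
    # generations. Each terminal flow is then a product of `generations`
    # random fractions, hence approximately log-normal.
    rng = np.random.default_rng(seed)
    flows = np.array([1.0])
    for _ in range(generations):
        r = np.clip(rng.normal(mean_ratio, jitter, size=flows.size), 0.01, 0.99)
        flows = np.concatenate([flows * r, flows * (1.0 - r)])
    return flows

flows = terminal_flows()
```

Multiplying a subset of intermediate flows by a small factor (to mimic restricted branches) turns this log-normal into a bimodal distribution, mirroring the disease simulation above.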

Dissertations / Theses on the topic "Log-normal distribution model"

1

Braga, Altemir da Silva. "Extensions of the normal distribution using the odd log-logistic family: theory and applications." Universidade de São Paulo, 2017. http://www.teses.usp.br/teses/disponiveis/11/11134/tde-02102017-092313/.

Abstract:
This study proposes three new distributions and a fourth study with longitudinal data. The first is the odd log-logistic normal distribution: theory and applications in the analysis of experiments; the second is the odd log-logistic Student's t distribution: theory and applications; the third is the odd log-logistic skew normal: a new skew-bimodal distribution with applications in the analysis of experiments; and the fourth is a regression model with random effects for the odd log-logistic skew normal distribution, applied to longitudinal data. Several properties are derived, such as symmetry, the quantile function, some expansions, ordinary incomplete moments, the mean deviation and the moment generating function. The model parameters are estimated by the method of maximum likelihood. In the applications, regression models are fitted to data from completely randomized designs (CRD) or randomized block designs (DBC). The models can therefore be used in practice for completely randomized or randomized block experiments, especially with evidence of asymmetry, kurtosis and bimodality.
The normal distribution is one of the most important in statistics. However, it is not suitable for fitting data that exhibit asymmetry or bimodality, since it has only two nonzero moments, the mean and the standard deviation. Many studies therefore aim to create new families of distributions that can model the asymmetry, kurtosis or bimodality of data. In this respect, it is important that these new distributions have good mathematical properties and also include the normal distribution as a submodel. Few classes of distributions, however, include the normal distribution as a nested model; among those proposed are the skew-normal, beta-normal, Kumaraswamy-normal and gamma-normal. In 2013 the odd log-logistic-G family of distributions was proposed with the aim of creating new probability distributions. Thus, using the normal and skew-normal distributions as baseline functions, three new distributions were proposed, together with a fourth study on longitudinal data. The first was the odd log-logistic normal distribution: theory and applications to experimental data; the second was the odd log-logistic Student's t distribution: theory and applications; the third was the odd log-logistic skew-bimodal distribution with applications to experimental data; and the fourth study was a random-effects regression model for the odd log-logistic skew-bimodal distribution, with an application to longitudinal data. These distributions exhibit good properties such as asymmetry, kurtosis and bimodality. Several properties were derived, such as symmetry, the quantile function, some expansions, ordinary incomplete moments, mean deviations and the moment generating function.
The flexibility of the new distributions was compared with the skew-normal, beta-normal, Kumaraswamy-normal and gamma-normal models. The model parameters were estimated by the method of maximum likelihood. In the applications, regression models were fitted to data from completely randomized designs (DIC) or randomized block designs (DBC). In addition, simulation studies were carried out for the new models to verify the asymptotic properties of the parameter estimates. Quantile residuals and a sensitivity analysis were proposed to check for extreme values and goodness of fit. The new models are therefore grounded in mathematical properties, computational simulation studies and applications to data from designed experiments. They can be used in completely randomized or randomized block experiments, particularly with data showing evidence of asymmetry, kurtosis and bimodality.
2

Saaidia, Noureddine. "Sur les familles des lois de fonction de hasard unimodale : applications en fiabilité et analyse de survie." Thesis, Bordeaux 1, 2013. http://www.theses.fr/2013BOR14794/document.

Abstract:
In reliability and survival analysis, distributions that have a unimodal or ∩-shaped hazard rate function are not numerous; they include the inverse Gaussian, log-normal, log-logistic, Birnbaum-Saunders, exponential Weibull and power generalized Weibull distributions. In this thesis, we develop the modified chi-squared tests for these distributions, give a comparative study between the inverse Gaussian distribution and the other distributions, and then carry out simulations. We also construct the AFT model based on the inverse Gaussian distribution and redundant systems based on distributions having a unimodal hazard rate function.
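The defining feature of this family, a hazard rate that rises to an interior maximum and then decays, can be checked numerically from h(t) = f(t)/S(t). A small sketch for two of the listed distributions, with arbitrary parameter values:

```python
import numpy as np
from scipy import stats

def hazard(dist, t):
    # Hazard rate h(t) = f(t) / S(t), with S the survival function.
    return dist.pdf(t) / dist.sf(t)

t = np.linspace(0.01, 12.0, 2000)
h_lognormal = hazard(stats.lognorm(s=0.8), t)
h_invgauss = hazard(stats.invgauss(mu=1.0), t)
# Both hazards rise to an interior maximum and then decay: the unimodal
# shape that defines the family of distributions studied here.
```

By contrast, the exponential hazard is constant and the Weibull hazard is monotone, which is what the modified chi-squared tests help discriminate.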
3

Trönnberg, Filip. "Empirical evaluation of a Markovian model in a limit order market." Thesis, Uppsala universitet, Matematiska institutionen, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-176726.

Abstract:
A stochastic model for the dynamics of a limit order book is evaluated and tested on empirical data. Arrivals of limit, market and cancellation orders are described in terms of a Markovian queuing system with exponentially distributed occurrences. In this model, several key quantities can be analytically calculated, such as the distribution of times between price moves, price volatility and the probability of an upward price move, all conditional on the state of the order book. We show that the exponential distribution poorly fits the occurrences of order book events and further show that little resemblance exists between the analytical formulas in this model and the empirical data. The log-normal and Weibull distributions are suggested as replacements, as they appear to fit the empirical data better.
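The thesis's conclusion, that exponential waiting times fit event gaps poorly while a Weibull fits better, can be sketched as a likelihood comparison; the data below are synthetic stand-ins, not order-book data.

```python
import numpy as np
from scipy import stats

def aic(loglik, n_params):
    # Akaike information criterion: lower is better.
    return 2 * n_params - 2 * loglik

# Synthetic inter-event durations with a heavy cluster of short gaps
# (a hypothetical stand-in for order-book event gaps).
rng = np.random.default_rng(3)
gaps = rng.weibull(0.5, size=5000)

# Exponential fit: the MLE of the rate is 1 / sample mean.
lam = 1.0 / gaps.mean()
ll_exp = np.sum(np.log(lam) - lam * gaps)

# Weibull fit (location fixed at zero).
c, loc, scale = stats.weibull_min.fit(gaps, floc=0)
ll_wei = np.sum(stats.weibull_min.logpdf(gaps, c, loc, scale))

aic_exp, aic_wei = aic(ll_exp, 1), aic(ll_wei, 2)
```

On clustered (shape < 1) data the Weibull wins the AIC comparison by a wide margin, mirroring the empirical finding above.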
4

Mvondo, Bernardin Gael. "Numerical techniques for optimal investment consumption models." University of the Western Cape, 2014. http://hdl.handle.net/11394/4352.

Abstract:
Magister Scientiae - MSc
The problem of optimal investment has been extensively studied by numerous researchers in order to generalize the original framework. Those generalizations have been made in different directions and using different techniques. For example, Perera [Optimal consumption, investment and insurance with insurable risk for an investor in a Levy market, Insurance: Mathematics and Economics, 46 (3) (2010) 479-484] applied the martingale approach to obtain a closed form solution for the optimal investment, consumption and insurance strategies of an individual in the presence of an insurable risk when the insurable risk and risky asset returns are described by Levy processes and the utility is a constant absolute risk aversion. In another work, Sattinger [The Markov consumption problem, Journal of Mathematical Economics, 47 (4-5) (2011) 409-416] gave a model of consumption behavior under uncertainty as the solution to a continuous-time dynamic control problem in which an individual moves between employment and unemployment according to a Markov process. In this thesis, we will review the consumption models in the above framework and will simulate some of them using an infinite series expansion method, a key focus of this research. Several numerical results obtained by using MATLAB are presented with detailed explanations.
5

Golder, Jacques. "Modélisation d'un phénomène pluvieux local et analyse de son transfert vers la nappe phréatique." Phd thesis, Université d'Avignon, 2013. http://tel.archives-ouvertes.fr/tel-01057725.

Abstract:
In research on water resource quality, the study of mass transfer from the soil to the water table is essential for understanding groundwater pollution. Soluble pollutants at the surface (products of human activities such as fertilizers and pesticides) can migrate to the water table through the porous medium of the soil. This pollution-transfer scenario rests on two phenomena: the rain that generates the mass of water at the surface, and the dispersion of that water through the porous medium. Mass dispersion in a natural porous medium such as soil is a broad and difficult research topic, both experimentally and theoretically. Its modelling is a concern of the EMMAH laboratory, in particular within the Sol Virtuel project, in which a transfer model (the PASTIS model) has been developed. Coupling this transfer model with an input model describing the random dynamics of rainfall is one of the objectives of this thesis. The thesis addresses this objective by drawing on experimental observations on the one hand, and on modelling inspired by the analysis of the observational data on the other. The first part of the work is devoted to building a stochastic rainfall model. The choice and nature of the model are based on characteristics obtained from the analysis of rainfall-depth data collected over 40 years (1968-2008) at the INRA research centre in Avignon. The cumulative representation of precipitation is treated as a random walk in which the jumps and the waiting times between jumps are, respectively, the random amplitudes and the random durations between two occurrences of rain events.
The probability law of the jumps (log-normal) and that of the waiting times between jumps (alpha-stable) are obtained by analysing the probability laws of the amplitudes and occurrences of rain events. We then show that this random-walk model tends to a time-subordinated geometric Brownian motion (when the space and time steps of the walk tend simultaneously to zero while keeping a constant ratio) whose probability density is governed by a fractional Fokker-Planck equation (FFPE). Two approaches are then used to implement the model. The first is stochastic and relies on the link between the stochastic process given by the Itô stochastic differential equation and the FFPE. The second uses a direct numerical solution by discretization of the FFPE. In line with the main objective of the thesis, the second part of the work analyses the contribution of rainfall to water-table fluctuations. This analysis is based on two simultaneous records of rainfall depth and water-table level over 14 months (February 2005-March 2006). A statistical study of the links between the rainfall signal and water-table fluctuations proceeds as follows: the water-table level data are analysed and processed to isolate the fluctuations coherent with rain events. Furthermore, to account for mass dispersion in the soil, the transport of rainwater in the soil is modelled with a transfer code (the PASTIS model), to which the measured rainfall depths are applied as input. Among other things, the model results allow the soil water status to be estimated at a given depth (here fixed at 1.6 m).
A study of the correlation between this water status and the water-table fluctuations is then carried out, in addition to the study described above, to illustrate the possibility of modelling the impact of rain on water-table fluctuations.
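The rainfall model at the heart of this thesis, log-normal jump amplitudes separated by heavy-tailed waiting times, can be sketched as a continuous-time random walk. The thesis uses an alpha-stable waiting-time law; the one-sided Lévy (alpha = 1/2) distribution serves as a concrete stand-in here, and all parameter values are illustrative.

```python
import numpy as np
from scipy import stats

def cumulative_rain(n_events=500, mu=-1.0, sigma=0.8, seed=0):
    # Cumulative precipitation as a random walk: log-normal jump
    # amplitudes separated by heavy-tailed positive waiting times.
    rng = np.random.default_rng(seed)
    waits = stats.levy.rvs(size=n_events, random_state=rng)  # one-sided, heavy tail
    jumps = rng.lognormal(mean=mu, sigma=sigma, size=n_events)
    return np.cumsum(waits), np.cumsum(jumps)

t, rain = cumulative_rain()
```

The resulting path is a nondecreasing step process: long dry spells (heavy-tailed waits) punctuated by log-normal rainfall increments, the discrete precursor of the subordinated geometric Brownian motion described above.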
6

Cai, Changjie. "Development of a portable aerosol collector and spectrometer (PACS)." Diss., University of Iowa, 2018. https://ir.uiowa.edu/etd/6067.

Abstract:
The overall goal of this doctoral dissertation is to develop a prototype instrument, a Portable Aerosol Collector and Spectrometer (PACS), that can continuously measure aerosol size distributions by number, surface area and mass concentrations over a wide size range (from 10 nm to 10 µm) while also collecting particles with impactor and diffusion stages for post-sampling chemical analyses. To achieve the goal, in the first study, we designed, built and tested the PACS hardware. The PACS consists of a six-stage particle size selector, a valve system, a water condensation particle counter to measure number concentrations and a photometer to measure mass concentrations. The valve system diverts airflow to pass sequentially through upstream stages of the selector to the detectors. The stages of the selector include three impactor and two diffusion stages, which resolve particles by size and collect particles for chemical analysis. Particle penetration by size was measured through each stage to determine actual performance and account for particle losses. The measured d50 of each stage (aerodynamic diameter for impactor stages and geometric diameter for diffusion stages) was similar to the design. The pressure drop of each stage was sufficiently low to permit its operation with portable air pumps. In the second study, we developed a multi-modal log-normal (MMLN) fitting algorithm to leverage the multi-metric, low-resolution data from one sequence of PACS measurements to estimate aerosol size distributions of number, surface area, and mass concentration in near-real-time. The algorithm uses a grid-search process and a constrained linear least-square (CLLS) solver to find a tri-mode (ultrafine, fine, and coarse), log-normal distribution that best fits the input data. We refined the algorithm to obtain accurate and precise size distributions for four aerosols typical of diverse environments: clean background, urban and freeway, coal power plant, and marine surface. 
Sensitivity studies were conducted to explore the influence of unknown particle density and shape factor on algorithm output. An adaptive process that refined the ranges and step sizes of the grid search reduced the computation time to fit a single size distribution in near-real-time. Assuming standard density spheres, the aerosol size distributions fit well, with a normalized mean bias (NMB) of -4.9% to 3.5%, a normalized mean error (NME) of 3.3% to 27.6%, and R2 values of 0.90 to 1.00. The fitted number and mass concentration biases were within ± 10% regardless of uncertainties in density and shape. With this algorithm, the PACS is able to estimate aerosol size distributions by number, surface area, and mass concentrations from 10 nm to 10 µm in near-real-time. In the third study, we developed a new algorithm, the mass distribution by composition and size (MDCS) algorithm, to estimate the mass size distribution of various particle compositions. We then compared the PACS for measuring multi-mode aerosols to three reference instruments: a scanning mobility particle sizer (SMPS), an aerodynamic particle sizer (APS) and a nano micro-orifice uniform deposit impactor (nanoMOUDI). We used inductively coupled plasma mass spectrometry to measure the mass of collected particles on PACS and nanoMOUDI stages by element. For the three-mode aerosol, the aerosol size distributions in three metrics measured with the PACS agreed well with those measured with the SMPS/APS: number concentration, bias = 9.4% and R2 = 0.96; surface area, bias = 17.8%, R2 = 0.77; mass, bias = -2.2%, R2 = 0.94. Agreement was considerably poorer for the two-mode aerosol, especially for surface area and mass concentrations. Compared to the nanoMOUDI, for the three-mode aerosol, the PACS estimated the mass median diameters (MMDs) of the coarse mode well but overestimated the MMDs for the ultrafine and fine modes.
The PACS overestimated the mass concentrations of ultrafine and fine mode, but underestimated the coarse mode. This work provides insight into a novel way to simultaneously assess airborne aerosol size, composition, and concentration by number, surface area and mass using cost-effective handheld technologies.
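The constrained linear least-squares step inside the MMLN algorithm can be sketched as a nonnegative least-squares fit of mode weights, given mode locations and widths proposed by the outer grid search. The diameter grid, mode parameters and weights below are hypothetical, chosen only to illustrate the inner step.

```python
import numpy as np
from scipy import stats, optimize

def lognormal_mode(d, cmd, gsd):
    # Log-normal mode over a diameter grid, parameterised by the
    # count median diameter (cmd) and geometric standard deviation (gsd).
    return stats.lognorm.pdf(d, s=np.log(gsd), scale=cmd)

# Diameter grid spanning 10 nm to 10 um, as in the PACS (values in um).
d = np.logspace(-2, 1, 120)

# Hypothetical tri-modal "truth": ultrafine, fine and coarse modes.
cmds = [0.03, 0.2, 2.5]   # um
gsds = [1.6, 1.8, 1.7]
true_w = np.array([5.0, 2.0, 0.5])

A = np.column_stack([lognormal_mode(d, c, g) for c, g in zip(cmds, gsds)])
y = A @ true_w  # noiseless synthetic measurement

# Inner CLLS step: nonnegative least squares for the mode weights, given
# mode locations/widths proposed by the outer grid search.
w_hat, _ = optimize.nnls(A, y)
```

In the full algorithm the outer grid search varies the candidate (cmd, gsd) pairs and keeps the combination whose constrained fit best matches the multi-metric PACS data.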
7

Polizzi, Stefano. "Emergence of log-normal distributions in avalanche processes, validation of 1D stochastic and random network models, with an application to the characterization of cancer cells plasticity." Thesis, Bordeaux, 2020. http://www.theses.fr/2020BORD0220.

Abstract:
Many glassy and amorphous materials, like martensites, show characteristic behaviours during constraint-induced fractures. These fractures are avalanche processes whose statistics are known in most cases to follow a power-law distribution, reminiscent of collective behaviour and self-organised criticality. Avalanches of fractures are observed as well in living systems which, if we do not consider active remodelling, can be seen as a glassy network with a frozen structure. The actin cytoskeleton (CSK) forms microfilaments organised into higher-order structures by a dynamic assembly-disassembly mechanism with cross-linkers. Experiments revealed that cells respond to external constraints by a cascade of random and abrupt ruptures of their CSK, suggesting that they behave as a quasi-rigid random network of intertwined filaments. We analyse experimental data on CD34+ cells isolated from healthy and leukemic bone marrows; these behaviours have, however, been reproduced on other cells. Surprisingly, the distributions of the strength, the size and the energy of these rupture events do not follow the power-law statistics typical of critical phenomena and of avalanche size distributions in amorphous materials. In fact, the avalanche size turns out to be log-normal, suggesting that the mechanics of living systems in catastrophic events would not fit into self-organised critical systems (power laws). In order to interpret this peculiar behaviour, we first propose a minimal (1D) stochastic model. This model interprets the energy released along the rupture events as the sum (energy being additive) of a multiplicative cascade process relaxing with time. We distinguish two types of rupture events: brittle failures, likely corresponding to irreversible ruptures in a stiff and highly cross-linked CSK, and ductile failures, resulting from dynamic cross-linker unbindings during plastic deformation without loss of CSK integrity.
Our model provides some mathematical and mechanistic understanding of the robustness of the log-normal statistics observed in both brittle and ductile situations. We also show that brittle failures are relatively more prominent in leukemic than in healthy cells, suggesting their greater fragility and their different CSK architecture, stiffer and more reticulated. This minimal model motivates the more general question of what are the resulting distributions of a sum of correlated random variables coming from a multiplicative process. We therefore analyse the distribution of the sum of a generalised branching process evolving with a continuous random reproduction (growth) rate. The process depends on only two parameters: the first two central moments of the reproduction rate distribution. We then build a phase diagram showing three different regions: 1) a region where the final distribution has all central moments finite and is approximately log-normal; 2) a region where the asymptotic distribution is a power law, with a decay exponent in the interval [1, 3] whose value is uniquely determined by the model parameters; 3) finally, a region with an exact log-normal, non-stationary, size distribution. In all cases correlations are fundamental. Increasing the level of complexity of the avalanche modelling, we then propose a random Erdös-Rényi network to model a cell CSK, identifying the network nodes with the actin filaments and its links with actin cross-linkers. On this structure we simulate avalanches of ruptures. Our simulations show that we can reproduce the log-normal statistics with two simple ingredients: a random network without characteristic length scale, and a breaking rule capturing the observed visco-elasticity of living cells. This work paves the way for future applications to many phenomena in living systems that include large populations of individual, non-linear elements (brain, heart, epidemics), where similar log-normal statistics have also been observed.
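The cascade mechanism summarised in this abstract (released energy as the additive sum of a multiplicative relaxation process) can be sketched numerically. The snippet below is an illustrative toy, not the authors' model; the drift and volatility of the per-step factor are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def avalanche_energies(n_avalanches=50_000, n_steps=20):
    """Toy cascade: each avalanche multiplies a random factor at every step
    (multiplicative process); the released energy is the additive sum."""
    # Arbitrary illustrative parameters for the per-step random factor
    factors = rng.lognormal(mean=-0.1, sigma=0.3, size=(n_avalanches, n_steps))
    releases = np.cumprod(factors, axis=1)   # the multiplicative cascade
    return releases.sum(axis=1)              # energy is additive over the cascade

energies = avalanche_energies()
log_e = np.log(energies)
# If the sum is roughly log-normal, its log is roughly normal (small skewness)
z = (log_e - log_e.mean()) / log_e.std()
skewness = float(np.mean(z**3))
```

Under these parameters the log of the summed energy is close to normal, which is the log-normal signature the thesis reports.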
APA, Harvard, Vancouver, ISO, and other styles
8

Cruz, José Nilton da. "A nova família de distribuições odd log-logística: teoria e aplicações." Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/11/11134/tde-03052016-183138/.

Full text
Abstract:
In this study, a new family of distributions was proposed, which makes it possible to model survival data when the hazard function has unimodal and bathtub (U) shapes. Modifications of the Weibull, Fréchet, generalized half-normal, log-logistic and lognormal distributions were considered. Taking censored and non-censored data, we consider the maximum likelihood estimators for the proposed model in order to check the flexibility of the new family. A location-scale regression model was also considered to verify the influence of covariates on survival times. Additionally, a residual analysis was conducted based on modified deviance residuals. For different fixed parameters, percentages of censoring and sample sizes, several simulation studies were performed with the objective of verifying the empirical distribution of the martingale-type and modified deviance residuals. To detect influential observations, measures of local influence were used, which are diagnostic measures based on small perturbations in the data or in the proposed model. Situations can occur in which the assumption of independence between the failure and censoring times is not valid. Thus, another objective of this work is to consider the informative censoring mechanism based on the marginal likelihood, considering the log-odd log-logistic Weibull distribution in the modelling. Finally, the methodologies described are applied to real data sets.
APA, Harvard, Vancouver, ISO, and other styles
9

SILVA, Débora Karollyne Xavier. "Análise de diagnóstico para o modelo de regressão Log-Birnbaum-Saunders generalizado." Universidade Federal de Campina Grande, 2013. http://dspace.sti.ufcg.edu.br:8080/jspui/handle/riufcg/1391.

Full text
Abstract:
The Birnbaum-Saunders distribution emerged in 1969, motivated by problems in engineering. However, its field of application has been extended beyond the original context of material fatigue and reliability analysis. In the literature, it has played an important role in survival analysis, and many generalizations of it have been considered. In this work we present one of these generalizations, which was formulated by Mentainis in 2010. First, we provide a brief explanation of the classical Birnbaum-Saunders distribution and of the generalization proposed by Mentainis (2010), which we call the generalized Birnbaum-Saunders distribution. Thereafter, we discuss the sinh-normal distribution, which has an important relationship with the Birnbaum-Saunders distribution. In a further part of this work, we present some diagnostic methods for generalized log-Birnbaum-Saunders regression models and investigate tests of homogeneity for the corresponding shape and scale parameters. Finally, an application with real data is presented.
APA, Harvard, Vancouver, ISO, and other styles
10

Zhao, Hui. "Variational Bayesian Learning and its Applications." Thesis, 2013. http://hdl.handle.net/10012/8120.

Full text
Abstract:
This dissertation is devoted to studying a fast and analytic approximation method, called the variational Bayesian (VB) method, and aims to give insight into its general applicability and usefulness and to explore its applications to various real-world problems. This work has three main foci: 1) general applicability and properties; 2) diagnostics for VB approximations; 3) variational applications. Generally, variational inference has been developed in the context of the exponential family, which is open to further development. First, it usually considers cases in the context of the conjugate exponential family. Second, variational inferences are developed only with respect to natural parameters, which are often not the parameters of immediate interest. Moreover, full factorization, which assumes all terms to be independent of one another, is the most commonly used scheme in most variational applications. We show that VB inference can be extended to a more general situation. We propose a special parameterization for a parametric family, and also propose a factorization scheme with a more general dependency structure than is traditional in VB. Based on these new frameworks, we develop a variational formalism in which VB has a fast implementation and is not limited to the conjugate exponential setting. We also investigate its local convergence property, the effects of choosing different priors, and the effects of choosing different factorization schemes. The essence of the VB method relies on making simplifying assumptions about the posterior dependence of a problem. By definition, the general posterior dependence structure is distorted. In addition, in the various applications, we observe that the posterior variances are often underestimated. We aim to develop diagnostic tests to assess VB approximations; these methods are expected to be quick and easy to use, and to require no sophisticated tuning expertise.
We propose three methods to compute the actual posterior covariance matrix using only the knowledge obtained from VB approximations: 1) look at the joint posterior distribution and attempt to find an optimal affine transformation that links the VB and true posteriors; 2) based on a marginal posterior density approximation, work in specific low-dimensional directions to estimate true posterior variances and correlations; 3) based on a stepwise conditional approach, construct and solve a set of systems of equations that lead to estimates of the true posterior variances and correlations. A key computation in the above methods is to calculate a univariate marginal or conditional variance. We propose a novel way, called the VB Adjusted Independent Metropolis-Hastings (VBAIMH) method, to compute these quantities. It uses an independent Metropolis-Hastings (IMH) algorithm with proposal distributions configured by VB approximations. The variance of the target distribution is obtained by monitoring the acceptance rate of the generated chain. One major question associated with the VB method is how well the approximations can work. We particularly study the mean structure approximations and show how it is possible, using VB approximations, to approach model selection tasks such as determining the dimensionality of a model or variable selection. We also consider the variational application in Bayesian nonparametric modeling, especially for the Dirichlet process (DP). Posterior inference for the DP has been extensively studied in the context of MCMC methods. This work presents a full variational solution for the DP with non-conjugate settings. Our solution uses a truncated stick-breaking representation. We propose an empirical method to determine the number of distinct components in a finite-dimensional DP. The posterior predictive distribution for the DP is often not available in closed form. We show how to use variational techniques to approximate this quantity.
As a concrete application study, we work through the VB method on regime-switching lognormal models and present solutions to quantify both the uncertainty in the parameters and the model specification. Through a series of numerical comparison studies with likelihood-based methods and MCMC methods on simulated and real data sets, we show that the VB method can recover the model structure exactly, gives reasonable point estimates, and is very computationally efficient.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Log-normal distribution model"

1

The new Weibull handbook: Reliability & statistical analysis for predicting life, safety, risk, support costs, failures, and forecasting warranty claims, substantiation and accelerated testing, using Weibull, Log normal, Crow-AMSAA, probit, and Kaplan-Meier models. 5th ed. North Palm Beach, Fla: R.B. Abernethy, 2006.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Cheng, Russell. The Skew Normal Distribution. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198505044.003.0012.

Full text
Abstract:
This chapter considers the univariate skew-normal distribution, a generalization of the normal that includes the normal as a special case. The most natural parametrization is non-standard. This is because the Fisher information matrix is then singular at the true parameter value when the true model is the normal special case. The log-likelihood is then particularly flat in a certain coordinate direction. Standard theory cannot then be used to calculate the asymptotic distribution of all the parameter estimates. This problem can be handled using an alternative parametrization. There is another special case: the half/folded normal distribution. This occurs in the usual parametrization when the shape parameter is infinite. This is not a problem computationally and is easily handled. There are many generalizations to skew-t distributions and to tractable multivariate forms and regression versions. A very brief review of these is included.
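As a quick numerical aside (not part of the chapter), SciPy's `skewnorm` illustrates two points from this abstract: the shape parameter `a = 0` recovers the normal special case, and maximum likelihood picks up a clearly nonzero shape from skewed data. A hedged sketch with arbitrary parameter values:

```python
import numpy as np
from scipy import stats

# Shape a = 0 reduces the skew-normal to the ordinary normal
x = np.linspace(-3.0, 3.0, 7)
pdf_skew0 = stats.skewnorm.pdf(x, a=0)
pdf_norm = stats.norm.pdf(x)

# With clearly skewed data the MLE recovers a nonzero shape parameter;
# near a = 0 the Fisher information is singular and fitting is delicate
sample = stats.skewnorm.rvs(a=5, size=5000,
                            random_state=np.random.default_rng(1))
a_hat, loc_hat, scale_hat = stats.skewnorm.fit(sample)
```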
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Log-normal distribution model"

1

Fujimoto, Shouji, Masashi Tomoyose, and Atushi Ishikawa. "A Stochastic Model for Pareto’s Law and the Log-Normal Distribution under the Detailed Balance and Extended-Gibrat’s Law." In Studies in Computational Intelligence, 605–14. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-00909-9_58.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Sakamoto, Naoshi. "Simple Models Characterizing the Cell Dwell Time with a Log-Normal Distribution." In Studies in Computational Intelligence, 115–30. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-23509-7_9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

"Normal, Log-Normal Distribution, and Option Pricing Model." In Security Analysis, Portfolio Management, and Financial Derivatives, 739–76. WORLD SCIENTIFIC, 2012. http://dx.doi.org/10.1142/9789814343589_0019.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Datta, D. "Mathematics of Probabilistic Uncertainty Modeling." In Advances in Computational Intelligence and Robotics, 173–204. IGI Global, 2014. http://dx.doi.org/10.4018/978-1-4666-4991-0.ch009.

Full text
Abstract:
This chapter presents uncertainty modeling using probabilistic methods. The probabilistic method of uncertainty analysis addresses randomness of the parameters of a model. Randomness of parameters is characterized by a specified probability distribution, such as normal, log-normal, or exponential, and the corresponding samples are generated by various methods. Monte Carlo simulation is applied to explore probabilistic uncertainty modeling. Monte Carlo simulation, being a statistical process, is based on random number generation from the specified distribution of the uncertain random parameters. The sample size in Monte Carlo simulation is generally very large, which is required to keep computational errors small. Latin hypercube sampling and importance sampling are explored in brief. This chapter also presents polynomial chaos theory based probabilistic uncertainty modeling. Polynomial chaos theory is an efficient form of Monte Carlo simulation in the sense that the sample size is very small, dictated by the number of uncertain parameters and by the choice of the order of the polynomial selected to represent the uncertain parameter.
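The contrast drawn above between plain Monte Carlo and Latin hypercube sampling can be sketched in one dimension; the exponential parameter below is an arbitrary illustrative choice, not from the chapter:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
lam = 0.5  # arbitrary rate for an exponentially distributed uncertain parameter

# Plain Monte Carlo: independent uniforms mapped through the inverse CDF
u_mc = rng.random(n)
x_mc = -np.log(1.0 - u_mc) / lam

# Latin hypercube sampling (1D): exactly one draw per equal-probability bin
u_lhs = (np.arange(n) + rng.random(n)) / n
rng.shuffle(u_lhs)
x_lhs = -np.log(1.0 - u_lhs) / lam

# Stratification makes the sample mean far more stable (true mean is 1/lam)
```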
APA, Harvard, Vancouver, ISO, and other styles
5

Lee, Jinhyung. "Factors Affecting Health Information Technology Expenditure in California Hospitals." In Technology Adoption and Social Issues, 1437–49. IGI Global, 2018. http://dx.doi.org/10.4018/978-1-5225-5201-7.ch066.

Full text
Abstract:
This paper investigates the factors affecting health information technology (IT) investment. Unlike previous studies, health IT was measured as the dollar amount of hardware, software, and labor related to health IT. This study employed hospital- and patient-level data from the Office of Statewide Health Planning and Development (OSHPD) from 2000 to 2006. A generalized linear model (GLM) with a log link and normal distribution was employed, controlling for clustering error. This study found that not-for-profit and government status, teaching status, competition, and the health IT expenditure of neighborhood hospitals were positively associated with health IT expenditure, whereas rural hospitals were negatively associated with health IT expenditure. Moreover, this study found a significant increase in health IT investment over the seven years, resulting from increased clinical IT adoption.
APA, Harvard, Vancouver, ISO, and other styles
6

Lee, Jinhyung, and Hansil Choi. "Health Information Technology Spending on the Rise." In Advances in Healthcare Information Systems and Administration, 1–14. IGI Global, 2018. http://dx.doi.org/10.4018/978-1-5225-5460-8.ch001.

Full text
Abstract:
In this chapter, the authors track health information technology by examining the factors affecting health information technology (IT) expenditure. The authors employed hospital- and patient-level data from the Office of Statewide Health Planning and Development (OSHPD) from 2000 to 2006. A generalized linear model (GLM) with a log link and normal distribution was employed, controlling for clustering error. The authors found that not-for-profit and government hospitals, teaching hospitals, competition, and the health IT expenditure of neighborhood hospitals were positively associated with health IT expenditure, whereas rural hospitals were negatively associated with health IT expenditure. Moreover, the authors found that mean annual health IT expenditure was approximately $7.4 million from 2000-2006; it then jumped 204% to $15.1 million from 2008-2014.
APA, Harvard, Vancouver, ISO, and other styles
7

Pawlowsky-Glahn, Vera, and Richardo A. Olea. "Spatial covariance structure." In Geostatistical Analysis of Compositional Data. Oxford University Press, 2004. http://dx.doi.org/10.1093/oso/9780195171662.003.0009.

Full text
Abstract:
For any component in time series analysis (Natke 1983), the concept of covariance between components of a spatially distributed random vector Z(u) leads to: direct covariances, Cov[Zi(u), Zj(u)]; shifted covariances or spatial covariances, Cov[Zi(u), Zj(u + h)], also known as cross-covariance functions; and autocovariance functions, Cov[Zi(u), Zi(u + h)]. The direct covariances may be thought of as a special case of the cross-covariance functions (for h = 0), and the same holds for the autocovariance functions (for i = j), so there is no need for a separate discussion. To simplify the exposition, hereafter the term function is dropped, and only the terms cross-covariance and autocovariance are used. Pawlowsky (1984) stated that if the vector random function constitutes an r-composition, then the problem of spurious spatial correlations appears. This is evident from the fact that at each point of the domain W, as in the nonregionalized case, the natural sample space of an r-composition is the D-simplex. This aspect will be discussed in Section 3.1.1. Aitchison (1986) discussed the problematic nature of the covariance analysis of nonregionalized compositions. He circumvents the problem of spurious correlations by using the fact that the ratio of two arbitrary components of a basis is identical to the ratio of the corresponding components of the associated composition. To avoid working with ratios, which is always difficult, Aitchison takes logarithms of the ratios. Then dependencies among variables of a composition can be examined in real space by analyzing the covariance structure of the log-quotients. The advantages of using this approach are not only numerical or related to the facility of subsequent mathematical operations. Essentially they relate to the fact that the approach consists of a projection of the original sample space, the simplex S^D, onto a new sample space, namely real space IR^(D-1).
Thus the door is open to many available methods and models based on the multivariate normal distribution. Recall that the multivariate normal distribution requires the sample space to be precisely the multidimensional, unconstrained real space. For this kind of model, strictly speaking, this is equivalent to saying that you need unconstrained components of the random vector to be analyzed.
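Aitchison's log-ratio route out of the simplex can be sketched with the centred log-ratio (clr) transform; an illustrative snippet, not taken from the book:

```python
import numpy as np

def clr(x):
    """Centred log-ratio: maps a composition (positive parts summing to 1)
    into unconstrained real space; the transformed coordinates sum to zero."""
    x = np.asarray(x, dtype=float)
    gm = np.exp(np.log(x).mean())        # geometric mean of the parts
    return np.log(x / gm)

comp = np.array([0.2, 0.3, 0.5])         # a 3-part composition in the simplex
y = clr(comp)
# Differences of clr coordinates recover log-ratios of the original parts
```

Once in real space, the usual multivariate-normal machinery (covariances, variograms) applies without the spurious-correlation problem of raw proportions.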
APA, Harvard, Vancouver, ISO, and other styles
8

Guhathakurta, Kousik, Basabi Bhattacharya, and A. Roy Chowdhury. "Comparative Analysis of Asset Pricing Models Based on Log-Normal Distribution and Tsallis Distribution using Recurrence Plot in an Emerging Market." In Research in Finance, 35–73. Emerald Group Publishing Limited, 2016. http://dx.doi.org/10.1108/s0196-382120160000032003.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

"model based on multivariate conditional normal distribution assumptions, finding that Copula model based on extreme value theory is dominant. 3 EMPIRICAL RESEARCH. 3.1 Selection and pre-processing of sample data. This paper selects the daily closing price of Shanghai Composite Index (SH) and S&P500 as study sample. The time period is from January 4, 2000 to May 28, 2008, and there are a total of 2023 and 2111 […] distribution is greater than the tail of a normal distribution, indicating their yield series present 'spikes and fat tail'. Meanwhile, JB statistics are greater than the critical value of 5.9915 at 5% significant level, refusing the null hypothesis that the yield series obey normal distribution, that is, the two index yield series are not normally distributed, and we cannot use traditional mean-variance model to analog. 3.3 Stationary test." In Network Security and Communication Engineering, 412. CRC Press, 2015. http://dx.doi.org/10.1201/b18660-107.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Log-normal distribution model"

1

Xu, Shuzhen, and Enrique Susemihl. "Reliability Metrics of ReduNdant Systems With Log-Normal Repair Times." In ASME 2006 International Mechanical Engineering Congress and Exposition. ASMEDC, 2006. http://dx.doi.org/10.1115/imece2006-13169.

Full text
Abstract:
The data usually available on equipment failure and repair are the MTTF (mean time to failure) and MTTR (mean time to repair), with the underlying distributions assumed exponential. The analysis of repairable redundant systems can then proceed by means of integration of Markov transition equations. It has been found that although the exponential distribution may adequately represent the failure probability, in some cases the log-normal distribution better represents the repair probability, but this distribution leads to non-Markovian systems. The reliability of redundant systems is affected by the distribution used to model the repair time. This paper presents a comparison of the reliability metrics for redundant systems with an exponential failure distribution and two cases of repair time distributions, i.e., exponential and log-normal. The exponential repair distribution is analyzed by integration of the Markov transition equations, and the log-normal distribution by Monte Carlo simulation. The results obtained show that using an exponential repair distribution when the data follow a log-normal distribution overestimates the reliability of the system. Nevertheless, for values of the MTTR relatively small compared to the MTTF, 10% or lower, the results obtained using exponential repair distributions are acceptable for engineering applications.
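The paper's central point, that an exponential repair model understates long outages when repairs are really log-normal, can be illustrated by matching the MTTR of the two models and comparing their tails. The parameter values below are arbitrary, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(4)
mttr = 10.0                      # same mean repair time for both models
n = 200_000

t_exp = rng.exponential(mttr, n)

# Log-normal with the same mean: mean = exp(mu + s**2 / 2)
s = 1.0                          # arbitrary log-scale spread
mu = np.log(mttr) - s**2 / 2
t_ln = rng.lognormal(mu, s, n)

# The log-normal puts more mass on very long repairs, which is what hurts
# a redundant system: a second failure during a long outage takes it down
p_long_exp = np.mean(t_exp > 5 * mttr)
p_long_ln = np.mean(t_ln > 5 * mttr)
```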
APA, Harvard, Vancouver, ISO, and other styles
2

Bin, Qin, Xu Yan, Lei Xiao Shuang, and Yang Hui. "A Reliability Model of Intelligent Station Secondary Device Based on Log-Normal Distribution Model and Its Discriminating Method." In 2019 IEEE 2nd International Conference on Electronics Technology (ICET). IEEE, 2019. http://dx.doi.org/10.1109/eltech.2019.8839412.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Cao, Peng, Jiangping Wu, Zhiyuan Liu, Jingjing Guo, Jun Yang, and Longxing Shi. "A Statistical Current and Delay Model Based on Log-Skew-Normal Distribution for Low Voltage Region." In GLSVLSI '19: Great Lakes Symposium on VLSI 2019. New York, NY, USA: ACM, 2019. http://dx.doi.org/10.1145/3299874.3318028.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Bhonsle, Suryaji R., and Paul Thompson. "A Statistical Adaptable Distribution Function Model for Low Probabilities of Failure." In ASME 1993 Design Technical Conferences. American Society of Mechanical Engineers, 1993. http://dx.doi.org/10.1115/detc1993-0063.

Full text
Abstract:
Weibull, log-normal, and some other distribution function models (DFMs) have a tendency to deviate from experimental results. This deviation, whether exceedingly conservative or nonconservative, is amplified at low probabilities of failure. To remedy such problems, a new DFM is derived. It is then used to predict low probabilities of failure. The predictions are consistent with experimental data and are neither too conservative nor too nonconservative.
APA, Harvard, Vancouver, ISO, and other styles
5

Yao, Chen, and Yang Jun. "Generalized confidence intervals for process capability indices of log-normal distribution in the one-way random model." In 2016 Prognostics and System Health Management Conference (PHM-Chengdu). IEEE, 2016. http://dx.doi.org/10.1109/phm.2016.7819855.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Dabirian, Ramin, Shihao Cui, Ilias Gavrielatos, Ram Mohan, and Ovadia Shoham. "Evaluation of Models for Droplet Shear Effect of Centrifugal Pump." In ASME 2018 5th Joint US-European Fluids Engineering Division Summer Meeting. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/fedsm2018-83318.

Full text
Abstract:
During petroleum production and transportation, equipment such as pumps and chokes impart shear that breaks dispersed droplets into smaller sizes. The smaller droplets significantly influence the separation process, and the droplet size distribution has become a critical criterion for separator design. In order to better understand separation efficiency, estimation of the dispersed-phase droplet size distribution is very important. The objective of this paper is to qualitatively and quantitatively investigate the shear imparted on oil-water flow by a centrifugal pump. This paper presents available published models for the calculation of the droplet size distribution caused by different production equipment. Detailed experimental data for the droplet size distribution downstream of a centrifugal pump are also presented. Rosin-Rammler and log-normal distributions, utilizing the Pereyra (2011) dmax model as well as the Kouba (2003) dmin model, are used in order to evaluate the best-fit distribution function for the cumulative droplet size distribution. The results confirm that, with the Pereyra (2011) dmax model, the Rosin-Rammler distribution is much closer to the experimental data for low-shear conditions, while the log-normal distribution performs better at higher shear rates. Furthermore, the modified Kouba (2003) dmin model predicts the droplet distribution in the centrifugal pump well, and even better predictions over various ranges of experiments are achieved by manipulating the cumulative percentage at the minimum droplet diameter F(Dmin).
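Fitting the two candidate cumulative distributions named above can be sketched with SciPy; the diameters and parameters below are made up for illustration, not the paper's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def rosin_rammler_cdf(d, d63, n):
    # Cumulative fraction finer than diameter d under the Rosin-Rammler model
    return 1.0 - np.exp(-((d / d63) ** n))

def lognormal_cdf(d, mu, sigma):
    return norm.cdf((np.log(d) - mu) / sigma)

# Hypothetical cumulative droplet-size data (microns), generated to be
# exactly Rosin-Rammler so the comparison of the two fits is clear
d = np.array([10, 20, 40, 60, 90, 130, 180, 250], float)
f = rosin_rammler_cdf(d, d63=80.0, n=1.8)

(rr_d63, rr_n), _ = curve_fit(rosin_rammler_cdf, d, f, p0=[50, 1])
(ln_mu, ln_sg), _ = curve_fit(lognormal_cdf, d, f, p0=[4, 1])

rr_err = np.max(np.abs(rosin_rammler_cdf(d, rr_d63, rr_n) - f))
ln_err = np.max(np.abs(lognormal_cdf(d, ln_mu, ln_sg) - f))
```

With real data the winner depends on the shear regime, which is exactly the comparison the paper carries out.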
APA, Harvard, Vancouver, ISO, and other styles
7

Zhu, Yongzhong, Jian Zhang, and Ming Hu. "Random Model of Water Hammer Pressure and Probability Analysis in Waterpower Station." In ASME/JSME 2007 5th Joint Fluids Engineering Conference. ASMEDC, 2007. http://dx.doi.org/10.1115/fedsm2007-37426.

Full text
Abstract:
Because such factors as the original turbine load, the closure time of the wicket gate, and the reservoir water level are changeable in the practical operation of a waterpower station, they should be regarded as random variables in water hammer analysis. Accordingly, a random model of water hammer with random boundary conditions and random initial states is introduced in this paper, and the probability distribution law of each random variable is derived. For a single hydraulic system, random computational formulas for the increasing pressures and maximum pressures of water hammer with random variables are first deduced. Theoretical formulas for sensitivity analysis are then provided and used to carry out a sensitivity analysis for water hammer. In addition, probability density formulas for the increasing pressures and maximum pressures of water hammer are given, and the method of distribution-type fitting is introduced to determine the distribution of water hammer pressures. The conclusions are that the distribution of increasing water hammer pressures in single systems of a waterpower station is log-normal, while that of maximum pressures is normal. Finally, all the methods involved are verified through computational examples.
APA, Harvard, Vancouver, ISO, and other styles
8

Mansur, Tanius Rodrigues, Joa˜o Ma´rio Andrade Pinto, Wellington Antonio Soares, Ernani Sales Palma, and Enrico A. Colosimo. "Determination of the Fatigue Limit: Comparison Between Experimental Tests and Statistical Simulations." In ASME 2002 Pressure Vessels and Piping Conference. ASMEDC, 2002. http://dx.doi.org/10.1115/pvp2002-1210.

Full text
Abstract:
Fatigue limits of steel specimens were determined using experimental tests and numerical simulations. The simulation was based on life distribution parameters, taking into account a log-normal model. The experimental results obtained are quite close to those obtained by simulation.
APA, Harvard, Vancouver, ISO, and other styles
9

Jones, Simon. "Predicting Wave Propagation Through Inhomogeneous Soils Using a Finite-Element Model Incorporating Perfectly-Matched Layers." In ASME 2015 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2015. http://dx.doi.org/10.1115/imece2015-50136.

Full text
Abstract:
A 2D, plane-strain finite element model, with perfectly-matched layer elements acting as absorbing boundaries, is used to investigate the effect of soil inhomogeneity on resultant surface vibration. The stiffness and mass matrices for the perfectly-matched layer element are derived and included for reference. Stochastic variability of the soil's shear wave velocity is introduced using a K-L expansion; the shear wave velocity is assumed to have a log-normal distribution and a modified exponential covariance kernel. Results suggest that local soil inhomogeneity can significantly affect surface velocity predictions; 90% confidence intervals showing 7 dB averages and peak values up to 11 dB are computed. This is a significant source of uncertainty and should be considered when using predictions from models assuming homogeneous soil properties.
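Sampling a log-normal shear-wave-velocity profile from a truncated K-L expansion of an exponential covariance can be sketched as follows; the grid, correlation length, and variance are illustrative assumptions, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(5)

x = np.linspace(0.0, 50.0, 101)            # 1D profile, metres
corr_len, sigma_ln, mean_ln = 5.0, 0.15, np.log(200.0)

# Exponential covariance of the *log* of shear wave velocity
C = sigma_ln**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# Truncated Karhunen-Loeve expansion: keep the 20 leading eigenmodes
w, v = np.linalg.eigh(C)
idx = np.argsort(w)[::-1][:20]
w, v = w[idx], v[:, idx]

xi = rng.standard_normal(20)               # independent standard-normal KL coords
log_vs = mean_ln + v @ (np.sqrt(w) * xi)
vs = np.exp(log_vs)                        # one log-normal velocity realisation
```

Exponentiating the Gaussian field is what enforces the log-normal marginal and keeps every sampled velocity positive.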
APA, Harvard, Vancouver, ISO, and other styles
10

Moriarty, Patrick J., William E. Holley, and Sandy Butterfield. "Effect of Turbulence Variation on Extreme Loads Prediction for Wind Turbines." In ASME 2002 Wind Energy Symposium. ASMEDC, 2002. http://dx.doi.org/10.1115/wind2002-50.

Full text
Abstract:
The effect of varying turbulence levels on long-term loads extrapolation techniques was examined using a joint probability density function of both mean wind speed and turbulence level for loads calculations. The turbulence level has a dramatic effect on the statistics of moment maxima extracted from aeroelastic simulations. Maxima from simulations at lower turbulence levels are more deterministic and become dominated by the stochastic component as turbulence level increases. Short-term probability distributions were calculated using four different moment-based fitting methods. Several hundred of these distributions were used to calculate a long-term probability function. From the long-term probability, 1- and 50-year extreme loads were estimated. As an alternative, using a normal distribution of turbulence level produced a long-term load comparable to that of a log-normal distribution and may be more straightforward to implement. A parametric model of the moments was also used to estimate the extreme loads. The parametric model predicted nearly identical loads to the empirical model and required less data. An input extrapolation technique was also examined. Extrapolating the turbulence level prior to input into the aeroelastic code simplifies the loads extrapolation procedure but, in this case, produces loads lower than the empirical model and may be non-conservative in general.
APA, Harvard, Vancouver, ISO, and other styles