Academic literature on the topic 'Generalized uncertainty models'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Generalized uncertainty models.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Generalized uncertainty models"

1

Buckles, Billy P., and Frederick E. Petry. "Uncertainty models in information and database systems." Journal of Information Science 11, no. 2 (August 1985): 77–87. http://dx.doi.org/10.1177/016555158501100204.

Full text
Abstract:
Information systems have evolved to the point where it is desirable to capture the vagueness and uncertainty of data that occurs in actuality. Approaches have been taken using various fuzzy set concepts such as degree of membership, similarity relations and possibility distributions. This leads to the concept of generalized information systems, which are typically characterized by heterogeneous data representations, weakly typed data domains and the requirement for semantic knowledge during query interpretation. A generalized information system is more likely to have a direct representation for larger classes of information, at the cost of more complex data management and query processing. The paper surveys the various fuzzy database approaches that have been developed and characterizes them with respect to the concept of a generalized information system.
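To make the membership-degree idea concrete, here is a minimal sketch of a fuzzy relation and an alpha-cut query (the tuple schema and threshold semantics are illustrative assumptions, not taken from the paper):

# Each tuple carries a membership degree in [0, 1]; a query returns the
# alpha-cut, i.e. the crisp set of tuples whose degree meets a threshold.
records = [
    {"name": "sensor_a", "reading": "high", "membership": 0.9},
    {"name": "sensor_b", "reading": "high", "membership": 0.4},
    {"name": "sensor_c", "reading": "low",  "membership": 1.0},
]

def alpha_cut(relation, alpha):
    """Return the crisp relation of tuples with membership >= alpha."""
    return [t for t in relation if t["membership"] >= alpha]

print(alpha_cut(records, 0.5))  # sensor_a and sensor_c qualify; sensor_b does not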
APA, Harvard, Vancouver, ISO, and other styles
2

Koyak, Robert A., T. J. Hastie, and R. J. Tibshirani. "Generalized Additive Models." Journal of the American Statistical Association 86, no. 416 (December 1991): 1140. http://dx.doi.org/10.2307/2290538.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Thompson, W. A., P. McCullagh, J. A. Nelder, and Annette J. Dobson. "Generalized Linear Models." Journal of the American Statistical Association 80, no. 392 (December 1985): 1066. http://dx.doi.org/10.2307/2288581.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Moon, Graham, T. J. Hastie, and R. J. Tibshirani. "Generalized Additive Models." Applied Statistics 41, no. 1 (1992): 219. http://dx.doi.org/10.2307/2347636.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Pukelsheim, F. "Generalized linear models." Metrika 33, no. 1 (December 1986): 290. http://dx.doi.org/10.1007/bf01894758.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Belfaqih, I. H., H. Maulana, and A. Sulaksono. "White dwarfs and generalized uncertainty principle." International Journal of Modern Physics D 30, no. 09 (May 24, 2021): 2150064. http://dx.doi.org/10.1142/s0218271821500644.

Full text
Abstract:
This work is motivated by the sign problem in a logarithmic parameter of black hole entropy and by the existence of white dwarfs more massive than the Chandrasekhar mass limit. We examine the quadratic, linear, and linear–quadratic generalized uncertainty principle (GUP) models in light of recent masses and radii of white dwarfs. We consider the modification generated by introducing the minimal length into the degenerate Fermi gas equation of state (EoS) and into the hydrostatic equation. For the latter, we applied Verlinde's proposal regarding entropic gravity to derive the quantum-corrected Newtonian gravity, which is responsible for modifying the hydrostatic equation. Through a chi-square analysis of the models, we have found that the observational data favor the quadratic over the linear GUP model, without a mass limit. However, for the linear–quadratic GUP model, we can obtain a positive value of the free parameter [Formula: see text] as well as a mass limit larger than the Chandrasekhar limit. In the linear–quadratic GUP model, the formation of stable white dwarfs more massive than the Chandrasekhar limit is possible only if the two parameters are not equal.
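For orientation, the GUP models compared here generalize the Heisenberg relation. One common parameterization is shown below; sign conventions and prefactors vary across the literature, so this is indicative rather than the exact form used by the authors:

\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^2\right) \qquad \text{(quadratic GUP)}

\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}\left(1 - \alpha\,\Delta p + \beta\,(\Delta p)^2\right) \qquad \text{(linear--quadratic GUP)}

The linear model keeps only the \alpha term, and the standard uncertainty relation is recovered as \alpha, \beta \to 0.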
APA, Harvard, Vancouver, ISO, and other styles
7

Hilbe, Joseph M. "Generalized Linear Models." American Statistician 48, no. 3 (August 1994): 255. http://dx.doi.org/10.2307/2684732.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Hu, X. Joan. "Generalized Linear Models." American Statistician 57, no. 1 (February 2003): 67–68. http://dx.doi.org/10.1198/tas.2003.s212.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Müller, Hans-Georg, and Ulrich Stadtmüller. "Generalized functional linear models." Annals of Statistics 33, no. 2 (April 2005): 774–805. http://dx.doi.org/10.1214/009053604000001156.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Burridge, Jim, P. McCullagh, and J. A. Nelder. "Generalized Linear Models." Journal of the Royal Statistical Society. Series A (Statistics in Society) 154, no. 2 (1991): 361. http://dx.doi.org/10.2307/2983054.

Full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Generalized uncertainty models"

1

Sima, Adam. "Accounting for Model Uncertainty in Linear Mixed-Effects Models." VCU Scholars Compass, 2013. http://scholarscompass.vcu.edu/etd/2950.

Full text
Abstract:
Standard statistical decision-making tools, such as inference, confidence intervals and forecasting, are contingent on the assumption that the statistical model used in the analysis is the true model. In linear mixed-effect models, ignoring model uncertainty results in an underestimation of the residual variance, contributing to hypothesis tests that demonstrate larger than nominal Type-I errors and confidence intervals with smaller than nominal coverage probabilities. A novel utilization of the generalized degrees of freedom developed by Zhang et al. (2012) is used to adjust the estimate of the residual variance for model uncertainty. Additionally, the general global linear approximation is extended to linear mixed-effect models to adjust the standard errors of the parameter estimates for model uncertainty. Both of these methods use a perturbation method for estimation, where random noise is added to the response variable and, conditional on the observed responses, the corresponding estimate is calculated. A simulation study demonstrates that when the proposed methodologies are utilized, both the variance and standard errors are inflated for model uncertainty. However, when a data-driven strategy is employed, the proposed methodologies show limited usefulness. These methods are evaluated with a trial assessing the performance of cervical traction in the treatment of cervical radiculopathy.
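A minimal sketch of the perturbation estimator (a generic Ye-style generalized degrees of freedom for a fit that may include model selection; the function names and the OLS example are illustrative assumptions, not the dissertation's code):

import numpy as np

def generalized_dof(y, fit, tau=0.5, n_rep=200, seed=0):
    # Perturb the response with N(0, tau^2) noise, refit, and regress each fitted
    # value on its own perturbation; the summed sensitivities estimate the
    # generalized degrees of freedom of the whole selection-plus-estimation procedure.
    rng = np.random.default_rng(seed)
    n = len(y)
    deltas = rng.normal(0.0, tau, size=(n_rep, n))
    fits = np.array([fit(y + d) for d in deltas])
    sens = [np.cov(deltas[:, i], fits[:, i])[0, 1] / tau**2 for i in range(n)]
    return float(np.sum(sens))

# Example: for plain OLS the estimate recovers the number of columns of X.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=50)
ols = lambda yy: X @ np.linalg.lstsq(X, yy, rcond=None)[0]
print(generalized_dof(y, ols))   # approximately 3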
APA, Harvard, Vancouver, ISO, and other styles
2

Abid, Fatma. "Contribution à la robustesse et à l'optimisation fiabiliste des structures Uncertainty of shape memory alloy micro-actuator using generalized polynomial chaos method Numerical modeling of shape memory alloy problem in presence of perturbation : application to Cu-Al-Zn-Mn specimen An approach for the reliability-based design optimization of shape memory alloy structure Surrogate models for uncertainty analysis of micro-actuator." Thesis, Normandie, 2019. http://www.theses.fr/2019NORMIR24.

Full text
Abstract:
The design of economical structures has driven many advances in modeling and optimization, allowing the analysis of increasingly complex structures. However, designs optimized without considering parameter uncertainties may fail to meet certain reliability criteria. To ensure the proper functioning of the structure, it is important to take uncertainty into account from the design phase onward. Several theories exist in the literature for treating uncertainties. Structural reliability theory defines the failure probability of a structure as the probability that its serviceability conditions are not met; this study is called reliability analysis. Integrating reliability analysis into optimization problems constitutes a new discipline that introduces reliability criteria into the search for the optimal configuration of structures: the field of reliability-based design optimization (RBDO). The RBDO methodology considers the propagation of uncertainties into the mechanical performance by relying on a probabilistic model of the fluctuations of the input parameters. In this context, this thesis focuses on robust analysis and reliability-based optimization of complex mechanical problems. It is important to account for the uncertain parameters of the system to ensure a robust design. The objective of the RBDO method is to design a structure that establishes a good compromise between cost and assurance of reliability. Several methods, such as the hybrid method and the optimum safety factor method, have been developed to achieve this goal. To address the complexity of mechanical problems with uncertain parameters, dedicated methodologies, such as meta-modeling, have been developed to build a mechanical surrogate model that satisfies both the efficiency and the accuracy requirements of the model.
APA, Harvard, Vancouver, ISO, and other styles
3

Blumer, Joel David. "Cross-scale model validation with aleatory and epistemic uncertainty." Thesis, Georgia Institute of Technology, 2015. http://hdl.handle.net/1853/53571.

Full text
Abstract:
Nearly every decision must be made with a degree of uncertainty regarding the outcome. Decision making based on modeling and simulation predictions needs to incorporate and aggregate uncertain evidence. To validate multiscale simulation models, it may be necessary to consider evidence collected at a length scale that is different from the one at which a model predicts. In addition, traditional methods of uncertainty analysis do not distinguish between two types of uncertainty: uncertainty due to inherently random inputs, and uncertainty due to lack of information about the inputs. This thesis examines and applies a Bayesian approach for model parameter validation that uses generalized interval probability to separate these two types of uncertainty. A generalized interval Bayes’ rule (GIBR) is used to combine the evidence and update belief in the validity of parameters. The sensitivity of completeness and soundness for interval range estimation in GIBR is investigated. Several approaches to represent complete ignorance of probabilities’ values are tested. The result from the GIBR method is verified using Monte Carlo simulations. The method is first applied to validate the parameter set for a molecular dynamics simulation of defect formation due to radiation. Evidence is supplied by the comparison with physical experiments. Because the simulation includes variables whose effects are not directly observable, an expanded form of GIBR is implemented to incorporate the uncertainty associated with measurement in belief update. In a second example, the proposed method is applied to combining the evidence from two models of crystal plasticity at different length scales.
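To convey the flavor of interval-probability updating, here is a naive bounding sketch for competing hypotheses (endpoint analysis only; the thesis's generalized interval Bayes' rule uses Kaucher interval arithmetic, which this simplification does not reproduce, and all numbers are made up):

# Hypotheses with point priors and interval-valued likelihoods for the observed
# evidence. posterior(H_i) = L_i * P_i / sum_j L_j * P_j is increasing in L_i and
# decreasing in every other L_j, so its bounds follow from endpoint combinations.
def posterior_bounds(prior, like_lo, like_hi):
    lo, hi = [], []
    for i in range(len(prior)):
        num_lo = like_lo[i] * prior[i]
        den_lo = num_lo + sum(like_hi[j] * prior[j] for j in range(len(prior)) if j != i)
        num_hi = like_hi[i] * prior[i]
        den_hi = num_hi + sum(like_lo[j] * prior[j] for j in range(len(prior)) if j != i)
        lo.append(num_lo / den_lo)
        hi.append(num_hi / den_hi)
    return lo, hi

lo, hi = posterior_bounds([0.5, 0.5], like_lo=[0.6, 0.1], like_hi=[0.8, 0.3])
print(lo, hi)   # posterior probability interval for each hypothesis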
APA, Harvard, Vancouver, ISO, and other styles
4

Sickert, Jan-Uwe, Wolfgang Graf, and Stephan Pannier. "Entwurf von Textilbetonverstärkungen – computerorientierte Methoden mit verallgemeinerten Unschärfemodellen." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2009. http://nbn-resolving.de/urn:nbn:de:bsz:14-ds-1244047293129-54264.

Full text
Abstract:
This paper presents three methods for the design and dimensioning of textile-reinforced concrete strengthening. For preliminary design, variant studies are applied, e.g. to determine the required number of textile layers. For fixing realizations of several continuous design variables while accounting for different design objectives and design constraints, fuzzy optimization and the direct solution of the design problem are outlined. Fuzzy optimization determines compromise solutions for the multi-criteria design problem. The direct solution is based on exploratory data analysis of point sets obtained as the result of a fuzzy structural analysis, and yields regions, so-called partial design spaces, as the basis for selecting the design.
APA, Harvard, Vancouver, ISO, and other styles
5

Steinigen, Frank, Wolfgang Graf, Andreas Hoffmann, and Michael Kaliske. "Nachträglich textilverstärkte Stahlbetontragwerke — Strukturanalyse mit unscharfen Daten." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2009. http://nbn-resolving.de/urn:nbn:de:bsz:14-ds-1244047124333-78222.

Full text
Abstract:
With the fuzzy stochastic finite element method (FSFEM), the demonstrated stochastic and non-stochastic data uncertainty of the steel-reinforced old concrete and of the textile-reinforced concrete can be taken into account in structural analysis. The finite MRM elements developed for the deterministic analysis of textile-strengthened structures on the basis of the multi-reference-plane model (MRM) were extended to FSMRM elements. The material model of the fine-grained concrete reinforced with AR glass was extended to textile fabrics made of carbon. The models and algorithms developed are applied to the fuzzy stochastic structural analysis of textile-strengthened structures.
APA, Harvard, Vancouver, ISO, and other styles
6

Silva, Humberto Araújo da. "Controlador preditivo generalizado multi-modelo aplicado ao controle de pressão arterial." Universidade Federal do Rio Grande do Norte, 2010. http://repositorio.ufrn.br:8080/jspui/handle/123456789/15324.

Full text
Abstract:
Postsurgical hypertension may occur in cardiac patients. To decrease the chance of complications, it is necessary to reduce elevated blood pressure as soon as possible. Continuous infusion of vasodilator drugs, such as sodium nitroprusside (Nipride), quickly lowers the blood pressure in most patients. However, each patient has a different sensitivity to the infusion of Nipride: the parameters and time delays of the system are initially unknown, and the parameters of the transfer function associated with a particular patient are time-varying. The objective of this study is to develop a procedure for blood pressure control in the presence of parameter uncertainty and considerable time delays. To this end, a multi-model methodology was developed in which a Predictive Controller is tuned a priori for each model, and an adaptive mechanism decides which controller should be dominant for a given plant.
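The supervisory multi-model idea can be sketched as follows (an illustrative model bank and a simple proportional move standing in for the per-model predictive controllers; none of the model numbers come from the dissertation):

import numpy as np

# Bank of candidate first-order patient models y[k+1] = a*y[k] + b*u[k], each
# paired with a pre-tuned controller gain; the supervisor picks the model whose
# one-step predictions best match the measured pressure drop over a window.
models = [
    {"a": 0.95, "b": -0.8, "gain": 0.10},  # low drug sensitivity
    {"a": 0.90, "b": -2.0, "gain": 0.04},  # medium sensitivity
    {"a": 0.85, "b": -4.0, "gain": 0.02},  # high sensitivity
]

def select_controller(y_hist, u_hist):
    """Return the model with the smallest recent squared prediction error."""
    errors = []
    for m in models:
        pred = [m["a"] * y_hist[k - 1] + m["b"] * u_hist[k - 1]
                for k in range(1, len(y_hist))]
        errors.append(np.sum((np.array(y_hist[1:]) - np.array(pred)) ** 2))
    return models[int(np.argmin(errors))]

best = select_controller(y_hist=[30.0, 27.0, 25.5], u_hist=[1.0, 1.0, 1.0])
u_next = best["gain"] * (25.5 - 20.0)   # drive the pressure drop toward its target
print(best, u_next)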
APA, Harvard, Vancouver, ISO, and other styles
7

Dixon, William J. "Uncertainty in Aquatic Toxicological Exposure-Effect Models: the Toxicity of 2,4-Dichlorophenoxyacetic Acid and 4-Chlorophenol to Daphnia carinata." RMIT University. Biotechnology and Environmental Biology, 2005. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20070119.163720.

Full text
Abstract:
Uncertainty is pervasive in risk assessment. In ecotoxicological risk assessments, it arises from such sources as a lack of data, the simplification and abstraction of complex situations, and ambiguities in assessment endpoints (Burgman 2005; Suter 1993). When evaluating and managing risks, uncertainty needs to be explicitly considered in order to avoid erroneous decisions and to be able to make statements about the confidence that we can place in risk estimates. Although informative, previous approaches to dealing with uncertainty in ecotoxicological modelling have been found to be limited, inconsistent and often based on assumptions that may be false (Ferson & Ginzburg 1996; Suter 1998; Suter et al. 2002; van der Hoeven 2004; van Straalen 2002a; Verdonck et al. 2003a). In this thesis a Generalised Linear Modelling approach is proposed as an alternative, congruous framework for the analysis and prediction of a wide range of ecotoxicological effects. This approach was used to investigate the results of toxicity experiments on the effect of 2,4-Dichlorophenoxyacetic Acid (2,4-D) formulations and 4-Chlorophenol (4-CP, an associated breakdown product) on Daphnia carinata. Differences between frequentist Maximum Likelihood (ML) and Bayesian Markov-Chain Monte-Carlo (MCMC) approaches to statistical reasoning and model estimation were also investigated. These approaches are inferentially disparate and place different emphasis on aleatory and epistemic uncertainty (O'Hagan 2004). Bayesian MCMC and Probability Bounds Analysis methods for propagating uncertainty in risk models are also compared for the first time. For simple models, Bayesian and frequentist approaches to Generalised Linear Model (GLM) estimation were found to produce very similar results when non-informative prior distributions were used for the Bayesian models. Potency estimates and regression parameters were found to be similar for identical models, signifying that Bayesian MCMC techniques are at least a suitable and objective replacement for frequentist ML for the analysis of exposure-response data. Applications of these techniques demonstrated that Amicide formulations of 2,4-D are more toxic to Daphnia than their unformulated, Technical Acid parent. Different results were obtained from Bayesian MCMC and ML methods when more complex models and data structures were considered. In the analysis of 4-CP toxicity, the treatment of 2 different factors as fixed or random in standard and Mixed-Effect models was found to affect variance estimates to the degree that different conclusions would be drawn from the same model, fit to the same data. Associated discrepancies in the treatment of overdispersion between ML and Bayesian MCMC analyses were also found to affect results. Bayesian MCMC techniques were found to be superior to the ML ones employed for the analysis of complex models because they enabled the correct formulation of hierarchical (nested) data structures within a binomial logistic GLM. Application of these techniques to the analysis of results from 4-CP toxicity testing on two strains of Daphnia carinata found that between-experiment variability was greater than that within experiments or between strains. Perhaps surprisingly, this indicated that long-term laboratory culture had not significantly affected the sensitivity of one strain when compared to cultures of another strain that had recently been established from field populations.
The results from this analysis highlighted the need for repetition of experiments, proper model formulation in complex analyses and careful consideration of the effects of pooling data on characterising variability and uncertainty. The GLM framework was used to develop three-dimensional surface models of the effects of different length pulse exposures, and subsequent delayed toxicity, of 4-CP on Daphnia. These models described the relationship between exposure duration and intensity (concentration) on toxicity, and were constructed for both pulse and delayed effects. Statistical analysis of these models found that significant delayed effects occurred following the full range of pulse exposure durations, and that both exposure duration and intensity interacted significantly and concurrently with the delayed effect. These results indicated that failure to consider delayed toxicity could lead to significant underestimation of the effects of pulse exposure, and therefore increase uncertainty in risk assessments. A number of new approaches to modelling ecotoxicological risk and to propagating uncertainty were also developed and applied in this thesis. In the first of these, a method for describing and propagating uncertainty in conventional Species Sensitivity Distribution (SSD) models was described. This utilised Probability Bounds Analysis to construct a nonparametric 'probability box' on an SSD based on EC05 estimates and their confidence intervals. Predictions from this uncertain SSD and the confidence interval extrapolation methods described by Aldenberg and colleagues (2000; 2002a) were compared. It was found that the extrapolation techniques underestimated the width of uncertainty (confidence) intervals by 63% and the upper bound by 65%, when compared to the Probability Bounds (P-Bounds) approach, which was based on actual confidence estimates derived from the original data. An alternative approach to formulating ecotoxicological risk modelling was also proposed, based on a Binomial GLM. In this formulation, the model is first fit to the available data in order to derive mean and uncertainty estimates for the parameters. This 'uncertain' GLM model is then used to predict the risk of effect from possible or observed exposure distributions. This risk is described as a whole distribution, with a central tendency and uncertainty bounds derived from the original data and the exposure distribution (if this is also 'uncertain'). Bayesian and P-Bounds approaches to propagating uncertainty in this model were compared using an example of the risk of exposure to a hypothetical (uncertain) distribution of 4-CP for the two Daphnia strains studied. This comparison found that the Bayesian and P-Bounds approaches produced very similar mean and uncertainty estimates, with the P-Bounds intervals always being wider than the Bayesian ones. This difference is due to the different methods for dealing with dependencies between model parameters by the two approaches, and is confirmation that the P-Bounds approach is better suited to situations where data and knowledge are scarce. The advantages of the Bayesian risk assessment and uncertainty propagation method developed are that it allows calculation of the likelihood of any effect occurring, not just the (probability) bounds, and that the same software (WinBUGS) and model construction may be used to fit regression models and predict risks simultaneously.
The GLM risk modelling approaches developed here are able to explain a wide range of response shapes (including hormesis) and underlying (non-normal) distributions, and do not involve expression of the exposure-response as a probability distribution, hence solving a number of problems found with previous formulations of ecotoxicological risk. The approaches developed can also be easily extended to describe communities, include modifying factors, mixed-effects, population growth, carrying capacity and a range of other variables of interest in ecotoxicological risk assessments. While the lack of data on the toxicological effects of chemicals is the most significant source of uncertainty in ecotoxicological risk assessments today, methods such as those described here can assist by quantifying that uncertainty so that it can be communicated to stakeholders and decision makers. As new information becomes available, these techniques can be used to develop more complex models that will help to bridge the gap between the bioassay and the ecosystem.
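As a pointer to the kind of model underlying this framework, here is a minimal binomial GLM dose-response fit on synthetic data (the statsmodels calls are standard; the hierarchical and mixed-effect structures discussed above go well beyond this sketch):

import numpy as np
import statsmodels.api as sm

# Synthetic exposure-response data: n organisms per concentration, k affected.
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
n = np.repeat(10, 5)
k = np.array([0, 2, 4, 8, 10])

X = sm.add_constant(np.log(conc))        # logit(p) = b0 + b1 * log(concentration)
resp = np.column_stack([k, n - k])       # successes/failures for a binomial GLM
fit = sm.GLM(resp, X, family=sm.families.Binomial()).fit()
print(fit.params)                        # b0, b1; e.g. log(EC50) = -b0 / b1
print(fit.conf_int())                    # uncertainty bounds on the parameters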
APA, Harvard, Vancouver, ISO, and other styles
8

Mohamed, Ibrahim Daoud Ahmed. "Automatic history matching in Bayesian framework for field-scale applications." Texas A&M University, 2004. http://hdl.handle.net/1969.1/3170.

Full text
Abstract:
Conditioning geologic models to production data and assessment of uncertainty is generally done in a Bayesian framework. The current Bayesian approach suffers from three major limitations that make it impractical for field-scale applications. These are: first, the CPU time of the Bayesian inverse problem using the modified Gauss-Newton algorithm with full covariance as regularization scales quadratically with increasing model size; second, the sensitivity calculation using finite difference as the forward model depends upon the number of model parameters or the number of data points; and third, the high CPU time and memory required for covariance matrix calculation. Attempts have been made to alleviate the third limitation by using an analytically-derived stencil, but these are limited to exponential covariance models only. We propose a fast and robust adaptation of the Bayesian formulation for inverse modeling that overcomes many of the current limitations. First, we use a commercial finite difference simulator, ECLIPSE, as a forward model, which is general and can account for complex physical behavior that dominates most field applications. Second, the production data misfit is represented by a single generalized travel time misfit per well, thus effectively reducing the number of data points into one per well and ensuring the matching of the entire production history. Third, we use both the adjoint method and streamline-based sensitivity method for sensitivity calculations. The adjoint method depends on the number of wells integrated, and generally is of an order of magnitude less than the number of data points or the model parameters. The streamline method is more efficient and faster as it requires only one simulation run per iteration regardless of the number of model parameters or the data points. Fourth, for solving the inverse problem, we utilize an iterative sparse matrix solver, LSQR, along with an approximation of the square root of the inverse of the covariance calculated using a numerically-derived stencil, which is broadly applicable to a wide class of covariance models. Our proposed approach is computationally efficient and, more importantly, the CPU time scales linearly with respect to model size. This makes automatic history matching and uncertainty assessment using a Bayesian framework more feasible for large-scale applications. We demonstrate the power and utility of our approach using synthetic cases and a field example. The field example is from Goldsmith San Andres Unit in West Texas, where we matched 20 years of production history and generated multiple realizations using the Randomized Maximum Likelihood method for uncertainty assessment. Both the adjoint method and the streamline-based sensitivity method are used to illustrate the broad applicability of our approach.
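The inversion step can be sketched generically (SciPy's LSQR on a random stand-in for the streamline-derived sensitivity matrix; the damping term is a crude placeholder for the square-root-of-inverse-covariance regularization named in the abstract):

import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
G = rng.normal(size=(40, 200))   # sensitivities: 40 per-well travel-time misfits, 200 parameters
d = rng.normal(size=40)          # generalized travel-time misfit, one entry per well

# Damped least squares: min ||G dm - d||^2 + damp^2 ||dm||^2, solved iteratively;
# per-iteration cost grows roughly linearly with model size, as the abstract emphasizes.
dm, istop, itn = lsqr(G, d, damp=1.0)[:3]
print(dm.shape, istop, itn)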
APA, Harvard, Vancouver, ISO, and other styles
9

Zelelew, Mulugeta. "Improving Runoff Estimation at Ungauged Catchments." Doctoral thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for vann- og miljøteknikk, 2012. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-19675.

Full text
Abstract:
Water infrastructures have been implemented to support the vital activities of human society. At the same time, these infrastructure developments have altered the natural catchment response characteristics, challenging society to implement effective water resources planning and management strategies. The Telemark area in southern Norway has seen a large number of water infrastructure developments, particularly hydropower, over more than a century. Recent developments in decision support tools for flood control and reservoir operation have raised the need to compute inflows from local catchments, most of which are regulated or have no observed data. This motivated the present PhD thesis work, which aims at improving runoff estimation at ungauged catchments; the research results are presented in four manuscript scientific papers.  The inverse distance weighting, inverse distance squared weighting, ordinary kriging, universal kriging and kriging with external drift methods were applied to analyse precipitation variability and estimate daily precipitation in the study area. The geostatistically based univariate and multivariate map-correlation concepts were applied to analyse and physically understand regional hydrological response patterns. The Sobol variance-based sensitivity analysis (VBSA) method was used to investigate the significance of the HBV hydrological model parameterization for the model response variations and to evaluate the model's reliability as a prediction tool. The transferability of the HBV hydrological model space into ungauged catchments was also studied.  The analysis results showed that the inverse distance weighting variants are the preferred spatial data interpolation methods in areas where a relatively dense precipitation station network can be found.  In mountainous areas and in areas where the precipitation station network is relatively sparse, the kriging variants are the preferred methods. The regional hydrological response correlation analyses suggested that geographic proximity alone cannot explain the entire hydrological response correlation in the study area. Moreover, when the multivariate map-correlation analysis was applied, two distinct regional hydrological response patterns, the radial and elliptical types, were identified. The presence of these hydrological response patterns influenced the location of the best-correlated reference streamgauges for the ungauged catchments. As a result, the nearest streamgauge was found to be the best-correlated in areas where the radial-type hydrological response pattern is dominant. In areas where the elliptical-type hydrological response pattern is dominant, the nearest reference streamgauge was not necessarily the best-correlated. The VBSA verified that varying up to a minimum of four to six influential HBV model parameters can sufficiently simulate the catchments' response characteristics when emphasis is given to fitting the high flows. Varying up to a minimum of six influential model parameters is necessary to sufficiently simulate the catchments' responses and maintain the model performance when emphasis is given to fitting the low flows. However, varying more than nine of the fifteen HBV model parameters will not make any significant change in the model performance.  The hydrological model space transfer study indicated that estimation of representative runoff at ungauged catchments cannot be guaranteed by transferring model parameter sets from a single donor catchment.
On the other hand, applying the ensemble-based model space transfer approach and utilizing model parameter sets from multiple donor catchments improved the model performance at the ungauged catchments. The results also suggested that high model performance can be achieved by integrating model parameter sets from two to six donor catchments. Objectively minimizing the HBV model parametric dimensionality and sampling only the sensitive model parameters maintained the model performance and limited the model prediction uncertainty.
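For reference, the inverse distance weighting interpolator evaluated in the thesis reduces to a few lines (a generic sketch; station handling and the search neighbourhood in the actual study are more involved, and the coordinates below are made up):

import numpy as np

def idw(xy_stations, values, xy_target, power=2.0):
    """Inverse distance weighting: weights w_i = 1 / d_i**power, normalized to sum
    to 1. power=2 gives the 'inverse distance squared' variant from the thesis."""
    d = np.linalg.norm(xy_stations - xy_target, axis=1)
    if np.any(d == 0):
        return float(values[np.argmin(d)])   # target coincides with a station
    w = 1.0 / d ** power
    return float(np.sum(w * values) / np.sum(w))

stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
precip = np.array([5.0, 8.0, 2.0])           # daily precipitation at each station
print(idw(stations, precip, np.array([3.0, 4.0])))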
APA, Harvard, Vancouver, ISO, and other styles
10

Öztekin, Ahme. "A generalized hybrid fuzzy-Bayesian methodology for modeling complex uncertainty." 2009. http://hdl.rutgers.edu/1782.2/rucore10001600001.ETD.000051885.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Generalized uncertainty models"

1

Routledge, Bryan R. Generalized disappointment aversion and asset prices. Cambridge, Mass: National Bureau of Economic Research, 2003.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Huyse, Luc. Random field solutions including boundary condition uncertainty for steady-state generalized Burgers equation. Hampton, VA: Institute for Computer Applications in Science and Engineering, NASA Langley Research Center, 2001.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Krause, Timothy A. Pricing of Futures Contracts. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190656010.003.0015.

Full text
Abstract:
This chapter examines the relation of futures prices to the spot price of the underlying asset. Basic futures pricing is characterized by the convergence of futures and spot prices during the delivery period just before contract expiration. However, "no arbitrage" arguments that dictate the fair value of futures contracts largely determine pricing relations before expiration. Although the cost of carry model in its various forms largely determines futures prices before expiration, the chapter presents alternative explanations. Related commodity futures complexes exhibit mean-reverting behavior, as seen in commodity spread markets and other interrelated commodities. Energy commodity futures prices can be somewhat accurately modeled as a generalized autoregressive conditional heteroskedastic (GARCH) process, although whether these models provide economically significant excess returns is uncertain.
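To illustrate the model class mentioned in the closing sentence, here is a minimal GARCH(1,1) simulation (the textbook parameterization; the parameter values are arbitrary, not estimates for any commodity):

import numpy as np

# GARCH(1,1): r_t = sigma_t * z_t,  sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2
omega, alpha, beta = 0.05, 0.08, 0.90     # alpha + beta < 1 ensures a finite unconditional variance
rng = np.random.default_rng(1)
T = 1000
r = np.zeros(T)
sigma2 = np.full(T, omega / (1 - alpha - beta))   # start at the unconditional variance
for t in range(1, T):
    sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

print(r.std(), np.sqrt(omega / (1 - alpha - beta)))  # sample vs unconditional volatility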
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Generalized uncertainty models"

1

Pencheva, Tania, Olympia Roeva, and Anthony Shannon. "Generalized Net Models of Basic Genetic Algorithm Operators." In Imprecision and Uncertainty in Information Representation and Processing, 305–25. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-26302-1_19.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Bureva, Veselina, Stanislav Popov, Evdokia Sotirova, and Krassimir T. Atanassov. "Generalized Net of MapReduce Computational Model." In Uncertainty and Imprecision in Decision Making and Decision Support: Cross-Fertilization, New Models and Applications, 305–15. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-65545-1_28.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Leyk, Stefan, and Niklaus E. Zimmermann. "A Predictive Uncertainty Model for Field-Based Survey Maps Using Generalized Linear Models." In Geographic Information Science, 191–205. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-30231-5_13.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Erbakanov, Lenko, Krassimir T. Atanassov, and Sotir Sotirov. "Generalized Net Model of Synchronous Binary Counter." In Uncertainty and Imprecision in Decision Making and Decision Support: Cross-Fertilization, New Models and Applications, 325–32. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-65545-1_30.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Simeonov, Stanislav, Vassia Atanassova, Evdokia Sotirova, Neli Simeonova, and Todor Kostadinov. "Generalized Net of a Centralized Embedded System." In Uncertainty and Imprecision in Decision Making and Decision Support: Cross-Fertilization, New Models and Applications, 299–304. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-65545-1_27.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Ribagin, Simeon, Krassimir T. Atanassov, Olympia Roeva, and Tania Pencheva. "Generalized Net Model of Adolescent Idiopathic Scoliosis Diagnosing." In Uncertainty and Imprecision in Decision Making and Decision Support: Cross-Fertilization, New Models and Applications, 333–48. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-65545-1_31.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Petkov, Todor, Sotir Sotirov, and Stanislav Popov. "Generalized Net Model of Optimization of the Self-Organizing Map Learning Algorithm." In Uncertainty and Imprecision in Decision Making and Decision Support: Cross-Fertilization, New Models and Applications, 316–24. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-65545-1_29.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Bentkowska, Urszula, and Barbara Pȩkala. "Generalized Reciprocity Property for Interval-Valued Fuzzy Setting in Some Aspect of Social Network." In Uncertainty and Imprecision in Decision Making and Decision Support: Cross-Fertilization, New Models and Applications, 286–96. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-65545-1_26.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Steeb, Willi-Hans. "Uncertainty Relation." In Hilbert Spaces, Wavelets, Generalised Functions and Modern Quantum Mechanics, 117–22. Dordrecht: Springer Netherlands, 1998. http://dx.doi.org/10.1007/978-94-011-5332-4_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Weinmann, Alexander. "Generalized Nyquist Stability of Perturbed Systems." In Uncertain Models and Robust Control, 407–32. Vienna: Springer Vienna, 1991. http://dx.doi.org/10.1007/978-3-7091-6711-3_25.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Generalized uncertainty models"

1

Graf, W., J. U. Sickert, and F. Steinigen. "Numerical simulation of structures using generalized models for data uncertainty." In CMEM 2009. Southampton, UK: WIT Press, 2009. http://dx.doi.org/10.2495/cmem090461.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Wang, Yeqing, Getachew K. Befekadu, and Crystal L. Pasiliao. "Uncertainty Quantification for Laser Ablation of Aluminum." In ASME 2017 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/imece2017-70625.

Full text
Abstract:
In recent years, a number of research efforts have been devoted to understanding the mechanisms and developing accurate simulation models for laser ablation of solid materials. However, uncertainty quantification (UQ) for laser ablation of solid materials, when the sources of uncertainty are inherently stochastic (e.g., material and optical properties of target materials at elevated temperatures), is not sufficiently understood or addressed, despite having critical impact on guiding experimental efforts and advanced manufacturing. In this paper, we consider the problem of UQ for pulsed laser ablation of aluminum. In particular, a generalized polynomial chaos (PC) method is used to incorporate constitutive parameter uncertainties within the representation of laser heat conduction phenomena, where the parameter uncertainties are presumed to arise from the mathematical modeling approximation for the laser heat conduction model and/or from the laser source. Moreover, numerical simulation studies of laser ablation of aluminum with a nanosecond Nd:YAG 266 nm pulsed laser, which demonstrate the proposed generalized PC predictions, are also presented. Finally, a sensitivity study is used to identify those parameters that contribute the most variance to the thermal and ablation response.
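For readers unfamiliar with the method, a one-dimensional Hermite-chaos sketch follows (a toy exponential response with a single standard-normal input; the actual study couples the expansion to a laser heat conduction model):

import numpy as np
from math import factorial, sqrt, pi

# Expand Y = f(xi), xi ~ N(0,1), in probabilists' Hermite polynomials He_n:
# Y ~ sum_n c_n He_n(xi), with c_n = E[f(xi) He_n(xi)] / n!
f = lambda x: np.exp(0.3 * x)                             # stand-in for the uncertain response
nodes, weights = np.polynomial.hermite_e.hermegauss(20)   # Gauss-Hermite_e quadrature

def coeff(n):
    He_n = np.polynomial.hermite_e.hermeval(nodes, [0] * n + [1])
    return np.sum(weights * f(nodes) * He_n) / (sqrt(2 * pi) * factorial(n))

c = [coeff(n) for n in range(6)]
mean = c[0]                                               # E[Y] = c_0
var = sum(c[n] ** 2 * factorial(n) for n in range(1, 6))  # Var[Y] = sum_{n>=1} c_n^2 * n!
print(mean, var)   # compare: exact mean exp(0.045), exact variance exp(0.09)*(exp(0.09)-1)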
APA, Harvard, Vancouver, ISO, and other styles
3

Tallman, Aaron E., Joel D. Blumer, Yan Wang, and David L. McDowell. "Multiscale Model Validation Based on Generalized Interval Bayes’ Rule and its Application in Molecular Dynamics Simulation." In ASME 2014 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/detc2014-35126.

Full text
Abstract:
Reliable simulation protocols supporting integrated computational materials engineering requires uncertainty to be quantified. In general, two types of uncertainties are recognized. Aleatory uncertainty is inherent randomness, whereas epistemic uncertainty is due to lack of knowledge. Aleatory and epistemic uncertainties need to be differentiated in validating multiscale models, where measurement data for unconventionally very small or large systems are scarce, or vary greatly in forms and quality (i.e. sources of epistemic uncertainty). In this paper, a recently proposed generalized hidden Markov model is used for cross-scale and cross-domain information fusion under the two types of uncertainties. The dependency relationships among the observable and hidden state variables at multiple scales and physical domains are captured using generalized interval probability. The update of imprecise credence and model validation are based on a generalized interval Bayes’ rule. Its application in molecular dynamics simulation for irradiation of Fe is demonstrated.
APA, Harvard, Vancouver, ISO, and other styles
4

Yedavalli, R. K. "A Generalized Lyapunov Theory for Robust Root Clustering of Linear State Space Models with Real Parameter Uncertainty." In 1992 American Control Conference. IEEE, 1992. http://dx.doi.org/10.23919/acc.1992.4792028.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Park, Jinkyoo, Kincho H. Law, Raunak Bhinge, Nishant Biswas, Amrita Srinivasan, David A. Dornfeld, Moneer Helu, and Sudarsan Rachuri. "A Generalized Data-Driven Energy Prediction Model With Uncertainty for a Milling Machine Tool Using Gaussian Process." In ASME 2015 International Manufacturing Science and Engineering Conference. American Society of Mechanical Engineers, 2015. http://dx.doi.org/10.1115/msec2015-9354.

Full text
Abstract:
Using a machine learning approach, this study investigates the effects of machining parameters on the energy consumption of a milling machine tool, which would allow selection of optimal operational strategies to machine a part with minimum energy. Data-driven prediction models, built upon a nonlinear regression approach, can be used to gain an understanding of the effects of machining parameters on energy consumption. In this study, we use the Gaussian Process to construct the energy prediction model for a computer numerical control (CNC) milling machine tool. Energy prediction models for different machining operations are constructed based on collected data. With the collected data sets, optimum input features for model selection are identified. We demonstrate how the energy prediction models can be used to compare the energy consumption for the different operations and to estimate the total energy usage for machining a generic part. We also present an uncertainty analysis to develop confidence bounds for the prediction model and to provide insight into the vast parameter space and training required to improve the accuracy of the model. Generic parts are machined to test and validate the prediction model constructed using the Gaussian Process and we consistently achieve an accuracy of over 95 % on the total predicted energy.
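A minimal sketch of the regression step (scikit-learn's GP regressor with an anisotropic RBF-plus-noise kernel; the feature set, units, and numbers are placeholders, not the milling dataset):

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Features: [feed rate, spindle speed, depth of cut]; target: energy per operation (J).
X = np.array([[0.1, 1000, 0.5], [0.2, 1500, 1.0], [0.3, 2000, 1.5], [0.2, 1000, 1.0]])
y = np.array([120.0, 180.0, 260.0, 170.0])

kernel = RBF(length_scale=[0.1, 500, 0.5]) + WhiteKernel(noise_level=1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

mu, sd = gp.predict(np.array([[0.15, 1200, 0.8]]), return_std=True)
print(mu[0], sd[0])   # predicted energy with a confidence band (roughly mu +/- 2*sd)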
APA, Harvard, Vancouver, ISO, and other styles
6

Choi, Hae-Jin, and Janet K. Allen. "Empirical Models for Non-Deterministic Simulation-Based Robust Design." In ASME 2007 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2007. http://dx.doi.org/10.1115/detc2007-35606.

Full text
Abstract:
We propose a method for metamodeling non-deterministic, computationally intensive simulations for use in robust design. Generalized linear models for mean responses and heteroscedastic response variances are iteratively estimated in an integrated manner. Estimators that may be used for predicting the mean and variance models are introduced and metamodels of variance are developed. The usefulness of this metamodeling approach in efficient uncertainty analyses of non-deterministic, computationally intensive simulation models for robust design methods is illustrated with the example of the design of a linear cellular alloy heat exchanger with randomly distributed cracks in the cell walls.
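One generic way to realize the alternating mean/variance estimation described above (weighted least squares for the mean, a log-linear model for the variance; the paper's estimators differ in detail, and the data here are synthetic):

import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 200)
y = 1.0 + 2.0 * x + rng.normal(scale=0.2 + 0.8 * x)   # noise variance grows with x

X = np.column_stack([np.ones_like(x), x])
w = np.ones_like(x)
for _ in range(5):
    # 1) weighted least squares for the mean model, using current variance weights
    beta = np.linalg.lstsq(X * w[:, None] ** 0.5, y * w ** 0.5, rcond=None)[0]
    resid2 = (y - X @ beta) ** 2
    # 2) log-linear model for the variance: log sigma^2(x) = g0 + g1 * x
    #    (the log of squared residuals is biased by a known constant; fine for a sketch)
    gamma = np.linalg.lstsq(X, np.log(resid2 + 1e-8), rcond=None)[0]
    w = 1.0 / np.exp(X @ gamma)          # new weights = inverse estimated variance

print(beta, gamma)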
APA, Harvard, Vancouver, ISO, and other styles
7

Xia, Z., and J. Tang. "Characterization of Structural Dynamics With Uncertainty by Using Gaussian Processes." In ASME 2011 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2011. http://dx.doi.org/10.1115/detc2011-48804.

Full text
Abstract:
An efficient way to capture the dynamic characteristics of structural systems with uncertainties has been an important and challenging subject. While such characterization is valuable for structural response predictions, it could be impractical in many application situations where a sufficiently large sample is expensive or unavailable. In this paper, Gaussian process regression models are employed to capture structural dynamical responses, especially responses with uncertainties. When Gaussian processes are used to make predictions for responses with uncertainties, the sampling costs can be significantly reduced because only a relatively small set of data points is needed. With no loss of generality, applications of Gaussian process regression models are introduced in conjunction with Monte Carlo sampling. This approach can be easily generalized to situations where data points are obtained by other sampling techniques.
APA, Harvard, Vancouver, ISO, and other styles
8

Lloyd, George M., and K. J. Kim. "Power/Efficiency Optimization of a Sorption Cooler Under Quantified Design Uncertainty." In ASME 2007 International Mechanical Engineering Congress and Exposition. ASMEDC, 2007. http://dx.doi.org/10.1115/imece2007-43742.

Full text
Abstract:
While the design paradigm in engineering of searching for the optimum system has proven fruitful (and given a good model relatively straightforward, in principle), the desired end result of engineering development is rarely a model (even the optimum one), but a system. In this regard it has frequently been observed (generally with some disappointment) that what one can specify is not always what one gets. It is frequently the case that realized systems, no matter how carefully constructed according to specifications derived from verified and validated models, frequently depart from the designed-for behaviour, due to parametric incertitude. Given this not uncommon circumstance, a somewhat more useful question one might seek to answer during an optimization process is “what is the best system under the constraints which I can reasonably hope to build?” Design optimization under incertitude approaches based on intrusive modifications to the deterministic model, such as stochastic finite elements and chaos expansions, are tedious to apply, computationally expensive, and fraught with convergence issues. The simplest nonintrusive approach—direct Monte Carlo sampling— is far too slow to efficiently sample the joint response distribution of complex thermophysics transient models. The purpose of this paper is to address this topic by incorporating design uncertainty itself as a constraint during the optimization of a sorption cooler. In our method a Markov Chain Monte Carlo sampler is used as the means to develop a suitable ensemble from a practical set of computational results which circumscribe the power/efficiency characteristics of a cooler as a function of several dimensionless stochastic optimization parameters. The ensemble is used to estimate the covariance structure of the design uncertainty, which is then projected into the best low rank subspace where tests of hypothesis under the dominant generalized parameters can be formulated; growth in fluctuations of the generalized parameters along optimization trajectories becomes clearly evident and quantifiable. The method results in a classical power/efficiency diagram, with the addition of quantified design uncertainty. The utility of these diagrams is that they enable rapid-prototyping efforts to target the best cooler design that is most likely to function as expected.
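The covariance-and-subspace step can be sketched in isolation (a generic eigendecomposition of an ensemble covariance; the Markov Chain Monte Carlo sampler that generates the ensemble and the hypothesis tests on the generalized parameters are omitted, and the data are random stand-ins):

import numpy as np

rng = np.random.default_rng(3)
ensemble = rng.normal(size=(500, 6))      # stand-in for sampled design outputs (6 quantities)
ensemble[:, 0] += 0.9 * ensemble[:, 1]    # induce some correlation structure

C = np.cov(ensemble, rowvar=False)        # covariance of the design uncertainty
evals, evecs = np.linalg.eigh(C)          # eigenvalues in ascending order
order = np.argsort(evals)[::-1]
k = 2                                     # rank of the dominant subspace
P = evecs[:, order[:k]]                   # best rank-k projection basis
scores = ensemble @ P                     # "generalized parameters" along a design trajectory
print(evals[order][:k], scores.std(axis=0))   # fluctuation growth is tracked via these scores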
APA, Harvard, Vancouver, ISO, and other styles
9

Chen, Wei, Ruichen Jin, and Agus Sudjianto. "Analytical Variance-Based Global Sensitivity Analysis in Simulation-Based Design Under Uncertainty." In ASME 2004 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2004. http://dx.doi.org/10.1115/detc2004-57484.

Full text
Abstract:
The importance of sensitivity analysis in engineering design cannot be over-emphasized. In design under uncertainty, sensitivity analysis is performed with respect to the probabilistic characteristics. Global sensitivity analysis (GSA), in particular, is used to study the impact of variations in input variables on the variation of a model output. One of the most challenging issues for GSA is the intensive computational demand for assessing the impact of probabilistic variations. Existing variance-based GSA methods are developed for general functional relationships but require a large number of samples. In this work, we develop an efficient and accurate approach to GSA that employs analytic formulations derived from metamodels of engineering simulation models. We examine the types of GSA needed for design under uncertainty and derive generalized analytical formulations of GSA based on a variety of metamodels commonly used in engineering applications. The benefits of our proposed techniques are demonstrated and verified through both illustrative mathematical examples and the robust design for improving vehicle handling performance.
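For contrast with the analytical formulations derived in the paper, here is the Monte Carlo estimator of first-order Sobol indices that they replace (a Saltelli-style estimator on a toy additive function):

import numpy as np

def sobol_first_order(f, d, n=100_000, seed=0):
    """Saltelli/Jansen-type estimator of first-order Sobol indices for f on U(0,1)^d."""
    rng = np.random.default_rng(seed)
    A, B = rng.random((n, d)), rng.random((n, d))
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]))
    S = []
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # replace column i of A with that of B
        S.append(np.mean(fB * (f(ABi) - fA)) / var)
    return np.array(S)

f = lambda X: X[:, 0] + 2.0 * X[:, 1] + 0.1 * X[:, 2]   # toy additive model
print(sobol_first_order(f, 3))   # variance shares, roughly [0.20, 0.80, 0.002]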
APA, Harvard, Vancouver, ISO, and other styles
10

Shahane, Shantanu, Soham Mujumdar, Namjung Kim, Pikee Priya, Narayana Aluru, Placid Ferreira, Shiv G. Kapoor, and Surya Vanka. "Virtually-Guided Certification With Uncertainty Quantification Applied to Die Casting." In ASME 2018 Verification and Validation Symposium. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/vvs2018-9323.

Full text
Abstract:
Die casting is a type of metal casting in which liquid metal is solidified in a reusable die. In such a complex process, measuring and controlling the process parameters is difficult. Conventional deterministic simulations are insufficient to completely estimate the effect of stochastic variation in the process parameters on product quality. In this research, a framework to simulate the effect of stochastic variation together with verification, validation, and uncertainty quantification is proposed. This framework includes high-speed numerical simulations of solidification, micro-structure and mechanical properties prediction models along with experimental inputs for calibration and validation. Both experimental data and stochastic variation in process parameters with numerical modeling are employed thus enhancing the utility of traditional numerical simulations used in die casting to have a better prediction of product quality. Although the framework is being developed and applied to die casting, it can be generalized to any manufacturing process or other engineering problems as well.
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Generalized uncertainty models"

1

Czado, Claudia, and Adrian E. Raftery. Choosing the Link Function and Accounting for Link Uncertainty in Generalized Linear Models using Bayes Factors. Fort Belvoir, VA: Defense Technical Information Center, October 2001. http://dx.doi.org/10.21236/ada459482.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Garvey, Paul R. The Effect of Software Size Uncertainty on Effort Estimates Generalized by Alpha Times I to the Beta Power Software Resource Models. Fort Belvoir, VA: Defense Technical Information Center, November 1987. http://dx.doi.org/10.21236/ada189209.

Full text
APA, Harvard, Vancouver, ISO, and other styles