
Journal articles on the topic "Reliable quantification of uncertainty"



Consult the top 50 journal articles for your research on the topic "Reliable quantification of uncertainty".

Next to each source in the reference list there is an "Add to bibliography" button. Click this button, and we will automatically generate the bibliographic reference for the selected work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Xue, Yujia, Shiyi Cheng, Yunzhe Li and Lei Tian. "Reliable deep-learning-based phase imaging with uncertainty quantification". Optica 6, no. 5 (May 7, 2019): 618. http://dx.doi.org/10.1364/optica.6.000618.

Full text
Citation styles: APA, Harvard, Vancouver, ISO, etc.
2

Russi, Trent, Andy Packard and Michael Frenklach. "Uncertainty quantification: Making predictions of complex reaction systems reliable". Chemical Physics Letters 499, no. 1-3 (October 2010): 1–8. http://dx.doi.org/10.1016/j.cplett.2010.09.009.

3

Xue, Yujia, Shiyi Cheng, Yunzhe Li and Lei Tian. "Reliable deep-learning-based phase imaging with uncertainty quantification: erratum". Optica 7, no. 4 (April 9, 2020): 332. http://dx.doi.org/10.1364/optica.392632.

4

Alrashed, Mosab, Theoklis Nikolaidis, Pericles Pilidis and Soheil Jafari. "Turboelectric Uncertainty Quantification and Error Estimation in Numerical Modelling". Applied Sciences 10, no. 5 (March 6, 2020): 1805. http://dx.doi.org/10.3390/app10051805.

Abstract
Turboelectric systems are complex systems subject to errors and uncertainty, so uncertainty quantification and error estimation can be useful in achieving accurate system parameters. These processes, however, entail several stages that progressively improve the results. Since accurate approximation and power optimisation are crucial, it is essential to aim for higher accuracy levels, and integrating computational models with reliable algorithms into the computation leads to higher accuracy. Some current methods, such as Monte Carlo and Latin hypercube sampling, are reliable. This paper focuses on uncertainty quantification and error estimation in turboelectric numerical modelling, integrating current evidence with scholarly sources so that the most reliable evidence informs the conclusions. Studies on this subject began long ago, and there is sufficient scholarly evidence for analysis. The case study used to obtain this evidence is NASA N3-X, with three aircraft conditions: rolling to take off, cruising and taking off. The results show that the electrical elements in turboelectric systems can perform well in statistical analysis. Moreover, the risk of having overloaded branches is up to 2% of the total aircraft operation lifecycle, and enhancing the turboelectric system through electrical power optimisation management could lead to higher performance.
5

Scheidt, C., I. Zabalza-Mezghani, M. Feraille and D. Collombier. "Toward a Reliable Quantification of Uncertainty on Production Forecasts: Adaptive Experimental Designs". Oil & Gas Science and Technology - Revue de l'IFP 62, no. 2 (March 2007): 207–24. http://dx.doi.org/10.2516/ogst:2007018.

6

Tran, Anh V. and Yan Wang. "Reliable Molecular Dynamics: Uncertainty quantification using interval analysis in molecular dynamics simulation". Computational Materials Science 127 (February 2017): 141–60. http://dx.doi.org/10.1016/j.commatsci.2016.10.021.

7

Liu, Xuejun, Hailong Tang, Xin Zhang and Min Chen. "Gaussian Process Model-Based Performance Uncertainty Quantification of a Typical Turboshaft Engine". Applied Sciences 11, no. 18 (September 8, 2021): 8333. http://dx.doi.org/10.3390/app11188333.

Abstract
The gas turbine engine is a widely used thermodynamic system for aircraft. The demand for quantifying the uncertainty of engine performance is increasing due to the expectation of reliable engine performance design. In this paper, a fast, accurate, and robust uncertainty quantification method is proposed to investigate the impact of component performance uncertainty on the performance of a classical turboshaft engine. The Gaussian process model is firstly utilized to accurately approximate the relationships between inputs and outputs of the engine performance simulation model. Latin hypercube sampling is subsequently employed to perform uncertainty analysis of the engine performance. The accuracy, robustness, and convergence rate of the proposed method are validated by comparing with the Monte Carlo sampling method. Two main scenarios are investigated, where uncertain parameters are considered to be mutually independent and partially correlated, respectively. Finally, the variance-based sensitivity analysis is used to determine the main contributors to the engine performance uncertainty. Both approximation and sampling errors are explained in the uncertainty quantification to give more accurate results. The final results yield new insights about the engine performance uncertainty and the important component performance parameters.
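The workflow this abstract describes, a sampling plan pushed through a cheap surrogate of the engine model, can be sketched in pure Python. This is a generic illustration under stated assumptions: the two-input surrogate function and all constants are invented stand-ins, not the paper's Gaussian process model.

```python
import random
import statistics

def latin_hypercube(n_samples, n_dims, rng):
    """Stratified sampling: one point per equal-probability bin in each dimension."""
    columns = []
    for _ in range(n_dims):
        # one uniform draw inside each of n_samples strata, then shuffled
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)
        columns.append(col)
    return list(zip(*columns))   # points in [0, 1)^n_dims

rng = random.Random(0)
# hypothetical cheap surrogate standing in for the engine performance model
surrogate = lambda x: 1.0 + 0.8 * x[0] + 0.3 * x[1] ** 2

points = latin_hypercube(1000, 2, rng)
outputs = [surrogate(p) for p in points]
mean = statistics.fmean(outputs)
std = statistics.stdev(outputs)
```

Because each marginal is stratified, far fewer samples are needed than with plain Monte Carlo to stabilise the output statistics, which is why the abstract compares the two.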
8

Ryu, Seongok, Yongchan Kwon and Woo Youn Kim. "A Bayesian graph convolutional network for reliable prediction of molecular properties with uncertainty quantification". Chemical Science 10, no. 36 (2019): 8438–46. http://dx.doi.org/10.1039/c9sc01992h.

9

Ba, Huanhuan, Shenglian Guo, Yixuan Zhong, Shaokun He and Xushu Wu. "Quantification of the forecast uncertainty using conditional probability and updating models". Hydrology Research 50, no. 6 (September 27, 2019): 1751–71. http://dx.doi.org/10.2166/nh.2019.094.

Abstract
Quantifying forecast uncertainty is of great importance for reservoir operation and flood control. However, deterministic hydrological forecasts do not consider forecast uncertainty. This study develops a conditional probability model based on copulas to quantify forecast uncertainty. Three updating models, namely the auto-regressive (AR) model, the AR exogenous input model, and the adaptive neuro-fuzzy inference system model, are applied to update raw deterministic inflow forecasts of the Three Gorges Reservoir on the Yangtze River, China, with lead times of 1 d, 2 d, and 3 d. Results show that the conditional probability model provides a reasonable and reliable forecast interval. The updating models both enhance the forecast accuracy and improve the reliability of probabilistic forecasts. The conditional probability model based on copula functions is a useful tool to describe and quantify forecast uncertainty, and using an updating model is an effective measure to improve the accuracy and reliability of probabilistic forecasts.
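The AR-type updating the abstract mentions, correcting a raw forecast with the predicted persistence of its most recent error, can be sketched as follows. The inflow and forecast series are synthetic illustrations, and `fit_ar1` is a minimal zero-intercept least-squares fit, not the paper's calibration procedure.

```python
# AR(1) error-updating sketch: correct a raw deterministic forecast using the
# previous time step's forecast error (synthetic data, illustrative only).

def fit_ar1(errors):
    """Least-squares slope of e[t] on e[t-1], intercept fixed at zero."""
    num = sum(errors[t] * errors[t - 1] for t in range(1, len(errors)))
    den = sum(e * e for e in errors[:-1])
    return num / den

def update(raw_forecast, last_error, phi):
    """Updated forecast = raw forecast + predicted persistence of the error."""
    return raw_forecast + phi * last_error

# synthetic history: observed inflow and raw (biased) forecasts
observed = [100.0, 104.0, 103.0, 108.0, 110.0, 109.0, 113.0]
raw      = [ 95.0, 100.0,  98.0, 103.0, 106.0, 104.0, 108.0]
errors   = [o - f for o, f in zip(observed, raw)]

phi = fit_ar1(errors)                     # persistence of the forecast error
updated = [update(raw[t], errors[t - 1], phi) for t in range(1, len(raw))]
```

Because the raw forecast here is persistently biased low, the error is strongly autocorrelated and the one-step correction removes most of it.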
10

Zhou, Shuang, Jianguo Zhang, Lingfei You and Qingyuan Zhang. "Uncertainty propagation in structural reliability with implicit limit state functions under aleatory and epistemic uncertainties". Eksploatacja i Niezawodnosc - Maintenance and Reliability 23, no. 2 (February 4, 2021): 231–41. http://dx.doi.org/10.17531/ein.2021.2.3.

Abstract
Uncertainty propagation plays a pivotal role in structural reliability assessment. This paper introduces a novel uncertainty propagation method for structural reliability under different knowledge stages based on probability theory, uncertainty theory and chance theory. Firstly, a surrogate model combining the uniform design and least-squares method is presented to simulate the implicit limit state function with random and uncertain variables. Then, a novel quantification method based on chance theory is derived to calculate the structural reliability under mixed aleatory and epistemic uncertainties. The concepts of chance reliability and chance reliability index (CRI) are defined to express the degree of structural reliability. Besides, the selection principles of uncertainty propagation types and the corresponding reliability estimation methods are given according to the different knowledge stages. The proposed methods are finally applied to a practical structural reliability problem, which illustrates the effectiveness and advantages of the techniques presented in this work.
11

Choubert, Jean-Marc, Samuel Martin Ruel, Cécile Miege and Marina Coquery. "Rethinking micropollutant removal assessment methods for wastewater treatment plants – how to get more robust data?" Water Science and Technology 75, no. 12 (March 29, 2017): 2964–72. http://dx.doi.org/10.2166/wst.2017.181.

Abstract
This paper covers the pitfalls, recommendations and a new methodology for assessing micropollutant removal efficiencies in wastewater treatment plants. The proposed calculation rules take into account the limit of quantification and the analytical and sampling uncertainty of measured concentrations. We identified six cases for which a removal efficiency value is reliable and four other cases where the result is highly variable (uncertain) due to very low or unquantified concentrations in the effluent or when the influent–effluent concentration differential is below the measurement uncertainty. The influence of the proposed calculation rules on removal efficiency values was scrutinized using actual results from a research project. The paper arrives at detailed recommendations for limiting the impact of other sources of uncertainty during sampling (sampling strategy, cleaning and field blanks), chemical analyses (suspended solids and sludge) and data processing according to the targeted objectives.
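A calculation rule of the kind the abstract describes, flagging a removal efficiency as unreliable when the effluent is unquantified or the influent-effluent differential sits within measurement uncertainty, might look like this in code. The function name, the 30% relative uncertainty, and the LOQ handling are illustrative assumptions, not the paper's exact rules.

```python
# Sketch: removal efficiency with a reliability flag. The effluent below the
# limit of quantification (LOQ) and the differential-within-uncertainty cases
# are both marked unreliable; the 30% relative uncertainty is illustrative.

def removal_efficiency(c_in, c_out, loq, rel_uncertainty=0.3):
    """Return (efficiency in percent, reliability flag)."""
    if c_out < loq:
        # effluent unquantified: report a lower bound using the LOQ, mark unreliable
        return (1.0 - loq / c_in) * 100.0, False
    if abs(c_in - c_out) <= rel_uncertainty * c_in:
        # differential within measurement uncertainty: value is not meaningful
        return (1.0 - c_out / c_in) * 100.0, False
    return (1.0 - c_out / c_in) * 100.0, True

eff, ok = removal_efficiency(c_in=2.0, c_out=0.4, loq=0.1)            # reliable case
eff_low, ok_low = removal_efficiency(c_in=2.0, c_out=0.05, loq=0.1)   # below LOQ
```

Reporting the flag alongside the number is the point: a high apparent efficiency computed from an unquantified effluent is a bound, not a measurement.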
12

Gandhi, Margi and Rajashree Mashru. "A Highly Specific Colorimetric Method for On-Spot Determination of Lidocaine Using Color Kit and Application of Uncertainty Principles". Journal of Drug Delivery and Therapeutics 10, no. 2 (March 15, 2020): 86–96. http://dx.doi.org/10.22270/jddt.v10i2.3970.

Abstract
A simple, accurate, precise, selective and sensitive colorimetric method was developed for the estimation of Lidocaine in five different pharmaceutical formulations. The method is an azo C-coupling reaction in which Lidocaine undergoes a series of reactions and finally couples with resorcinol to form a yellow azo compound. The colored complex was measured at 430 nm. Beer's law was obeyed in the concentration range of 0.05–0.8 µg/ml. The method was validated and found to be accurate, precise and robust, with limits of detection and quantification of 0.014 and 0.045 µg/ml. A color kit was also developed for on-spot detection of Lidocaine. Measurement uncertainty principles were adopted to obtain reliable results: the process started from specifying the measurand, then identifying uncertainty sources with a cause-effect diagram, quantifying these sources, and finally calculating the combined standard uncertainty and the expanded uncertainty. In the present experiment, the concentration of sample and the mass of sample were the major contributors to uncertainty for all five formulations. Of the five formulations, the combined standard uncertainty of the transdermal patch was highest, followed by aerosol, ointment, gel and injection. Keywords: Lidocaine, Uncertainty, Transdermal patch, Aerosol, Injection, Ointment, Gel
13

Feng, Dongyu, Paola Passalacqua and Ben R. Hodges. "Innovative Approaches for Geometric Uncertainty Quantification in an Operational Oil Spill Modeling System". Journal of Marine Science and Engineering 7, no. 8 (August 8, 2019): 259. http://dx.doi.org/10.3390/jmse7080259.

Abstract
Reliable and rapid real-time prediction of likely oil transport paths is critical for decision-making by emergency response managers and for timely clean-up after a spill. As high-resolution hydrodynamic models are slow, operational oil spill systems generally rely on relatively coarse-grid models to provide quick estimates of the near-future surface-water velocities and oil transport paths. However, the coarse grid resolution introduces model structural errors, which have been called “geometric uncertainty”. Presently, emergency response managers do not have readily available methods for estimating how geometric uncertainty might affect predictions. This research develops new methods to quantify geometric uncertainty using fine- and coarse-grid models within a lagoonal estuary along the coast of the northern Gulf of Mexico. Using measures of geometric uncertainty, we propose and test a new data-driven uncertainty model along with a multi-model integration approach to quantify this uncertainty in an operational context. The data-driven uncertainty model is developed from a machine learning algorithm that provides an a priori assessment of the prediction’s degree of confidence. The multi-model integration generates ensemble predictions through comparison with limited fine-grid predictions. The two approaches provide explicit information on the expected scale of modeling errors induced by geometric uncertainty in a manner suitable for operational modeling.
14

Stoean, Catalin, Ruxandra Stoean, Miguel Atencia, Moloud Abdar, Luis Velázquez-Pérez, Abbas Khosravi, Saeid Nahavandi, U. Rajendra Acharya and Gonzalo Joya. "Automated Detection of Presymptomatic Conditions in Spinocerebellar Ataxia Type 2 Using Monte Carlo Dropout and Deep Neural Network Techniques with Electrooculogram Signals". Sensors 20, no. 11 (May 27, 2020): 3032. http://dx.doi.org/10.3390/s20113032.

Abstract
Application of deep learning (DL) to the field of healthcare is aiding clinicians to make an accurate diagnosis. DL provides reliable results for image processing and sensor interpretation problems most of the time. However, model uncertainty should also be thoroughly quantified. This paper therefore addresses the employment of Monte Carlo dropout within the DL structure to automatically discriminate presymptomatic signs of spinocerebellar ataxia type 2 in saccadic samples obtained from electrooculograms. The current work goes beyond the common incorporation of this special type of dropout into deep neural networks and uses the uncertainty derived from the validation samples to construct a decision tree at the register level of the patients. The decision tree built from the uncertainty estimates obtained a classification accuracy of 81.18% in automatically discriminating control, presymptomatic and sick classes. This paper proposes a novel method to address both uncertainty quantification and explainability to develop reliable healthcare support systems.
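The Monte Carlo dropout idea, keeping dropout active at inference and reading predictive uncertainty from the spread of repeated stochastic passes, can be shown with a toy one-layer model. The weights, input, and dropout rate are arbitrary assumptions; a real application, as in the paper, runs the passes through a deep network.

```python
import random
import statistics

# Monte Carlo dropout in miniature: keep dropout active at inference and run
# many stochastic forward passes; the spread of the outputs is an uncertainty
# estimate. The one-layer "network" and its weights are made up for illustration.

WEIGHTS = [0.4, -0.2, 0.7, 0.1]

def forward(x, rng, p_drop=0.5):
    """One stochastic pass: each weight is dropped with probability p_drop."""
    kept = [w if rng.random() > p_drop else 0.0 for w in WEIGHTS]
    scale = 1.0 / (1.0 - p_drop)          # inverted-dropout rescaling
    return scale * sum(w + 0.0 if False else w * xi for w, xi in zip(kept, x))

def mc_dropout_predict(x, n_passes=500, seed=0):
    rng = random.Random(seed)
    outs = [forward(x, rng) for _ in range(n_passes)]
    return statistics.fmean(outs), statistics.stdev(outs)

mean, std = mc_dropout_predict([1.0, 1.0, 1.0, 1.0])
```

In the paper the per-sample uncertainties then feed a decision tree at the patient level; here `std` simply quantifies how unsure the toy model is about a single input.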
15

Proppe, Jonny, Tamara Husch, Gregor N. Simm and Markus Reiher. "Uncertainty quantification for quantum chemical models of complex reaction networks". Faraday Discussions 195 (2016): 497–520. http://dx.doi.org/10.1039/c6fd00144k.

Abstract
For the quantitative understanding of complex chemical reaction mechanisms, it is, in general, necessary to accurately determine the corresponding free energy surface and to solve the resulting continuous-time reaction rate equations for a continuous state space. For a general (complex) reaction network, it is computationally hard to fulfill these two requirements. However, it is possible to approximately address these challenges in a physically consistent way. On the one hand, it may be sufficient to consider approximate free energies if a reliable uncertainty measure can be provided. On the other hand, a highly resolved time evolution may not be necessary to still determine quantitative fluxes in a reaction network if one is interested in specific time scales. In this paper, we present discrete-time kinetic simulations in discrete state space taking free energy uncertainties into account. The method builds upon thermo-chemical data obtained from electronic structure calculations in a condensed-phase model. Our kinetic approach supports the analysis of general reaction networks spanning multiple time scales, which is here demonstrated for the example of the formose reaction. An important application of our approach is the detection of regions in a reaction network which require further investigation, given the uncertainties introduced by both approximate electronic structure methods and kinetic models. Such cases can then be studied in greater detail with more sophisticated first-principles calculations and kinetic simulations.
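The core loop of kinetics with free-energy uncertainty, sampling rate parameters within their error bars and re-running the time evolution, can be illustrated with a single A → B step. The barrier, its uncertainty, and the prefactor are illustrative numbers, not values from the paper.

```python
import math
import random

# Toy version of kinetics with free-energy uncertainty: a single A -> B step
# whose barrier carries a +/- uncertainty, sampled and pushed through an
# explicit discrete-time integration. All numbers are illustrative.

def rate(barrier_kj_mol, prefactor=6.2e12, rt=2.479):
    """Eyring-style rate constant: prefactor * exp(-barrier / RT), RT at 298 K."""
    return prefactor * math.exp(-barrier_kj_mol / rt)

def simulate(k, a0=1.0, dt=1e-4, steps=5000):
    """Forward-Euler integration of dA/dt = -k*A over 0.5 s."""
    a = a0
    for _ in range(steps):
        a -= k * a * dt
    return a

rng = random.Random(1)
barrier_mean, barrier_sigma = 70.0, 2.0        # kJ/mol, illustrative uncertainty
finals = [simulate(rate(rng.gauss(barrier_mean, barrier_sigma)))
          for _ in range(200)]
spread = max(finals) - min(finals)             # spread induced by barrier uncertainty
```

A 2 kJ/mol barrier uncertainty is a factor of about e^(2/2.479) ≈ 2.2 in the rate constant per standard deviation, which is why the abstract stresses detecting network regions whose fluxes are not yet decided by the available accuracy.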
16

Zhu, Hong-Yu, Gang Wang, Yi Liu and Ze-Kun Zhou. "Numerical investigation of transonic buffet on supercritical airfoil considering uncertainties in wind tunnel testing". International Journal of Modern Physics B 34, no. 14n16 (April 20, 2020): 2040083. http://dx.doi.org/10.1142/s0217979220400834.

Abstract
To improve the predictive ability of computational fluid dynamics (CFD) for the transonic buffet phenomenon, the NASA SC(2)-0714 supercritical airfoil is numerically investigated with a non-intrusive probabilistic collocation method for uncertainty quantification. Distributions of uncertain parameters are established according to the NASA wind tunnel report. The effects of the uncertainties on lift, drag, mean pressure and root-mean-square pressure are discussed. To represent the stochastic solution, the mean and standard deviation of the variation of flow quantities such as lift and drag coefficients are computed. Furthermore, the mean pressure distribution and root-mean-square pressure distribution on the upper surface are displayed with uncertainty bounds containing 95% of all possible values. It is shown that the part of the flow most sensitive to the uncertain parameters is near the shock wave motion region. Comparing uncertainty bounds with experimental data, the numerical results reliably predict the reduced frequency and mean pressure distribution. However, for the root-mean-square pressure distribution, numerical results are higher than the experimental data in the trailing edge region.
17

Esbensen, Kim H. and Costas Velis. "Transition to circular economy requires reliable statistical quantification and control of uncertainty and variability in waste". Waste Management & Research 34, no. 12 (November 28, 2016): 1197–200. http://dx.doi.org/10.1177/0734242x16680911.

18

Davis, Gary A. and Christopher Cheong. "Pedestrian Injury Severity vs. Vehicle Impact Speed: Uncertainty Quantification and Calibration to Local Conditions". Transportation Research Record: Journal of the Transportation Research Board 2673, no. 11 (June 16, 2019): 583–92. http://dx.doi.org/10.1177/0361198119851747.

Abstract
This paper describes a method for fitting predictive models that relate vehicle impact speeds to pedestrian injuries, in which results from a national sample are calibrated to reflect local injury statistics. Three methodological issues identified in the literature (outcome-based sampling, uncertainty regarding estimated impact speeds, and uncertainty quantification) are addressed by (i) implementing Bayesian inference using Markov chain Monte Carlo sampling and (ii) applying multiple imputation to conditional maximum likelihood estimation. The methods are illustrated using crash data from the NHTSA Pedestrian Crash Data Study coupled with an exogenous sample of pedestrian crashes from Minnesota’s Twin Cities. The two approaches produced similar results and, given a reliable characterization of impact speed uncertainty, either approach can be applied in a jurisdiction having an exogenous sample of pedestrian crash severities.
19

Xiao, Yijun and William Yang Wang. "Quantifying Uncertainties in Natural Language Processing Tasks". Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 7322–29. http://dx.doi.org/10.1609/aaai.v33i01.33017322.

Abstract
Reliable uncertainty quantification is a first step towards building explainable, transparent, and accountable artificial intelligent systems. Recent progress in Bayesian deep learning has made such quantification realizable. In this paper, we propose novel methods to study the benefits of characterizing model and data uncertainties for natural language processing (NLP) tasks. With empirical experiments on sentiment analysis, named entity recognition, and language modeling using convolutional and recurrent neural network models, we show that explicitly modeling uncertainties is not only necessary to measure output confidence levels, but also useful for enhancing model performance in various NLP tasks.
20

Ritter, Karen, James Keating, Terri Shires and Miriam Lev-On. "A decade of sectoral initiative to promote consistent and reliable quantification of greenhouse gas emissions". APPEA Journal 50, no. 2 (2010): 696. http://dx.doi.org/10.1071/aj09060.

Abstract
With the increased focus on greenhouse gas emissions (GHG) and their role in the implementation of policy measures for their mitigation, there continues to be a need for accurate, reliable and transparent characterisation of these emissions. A myriad of mandatory reporting regulations and voluntary initiatives with diverse protocols and methodologies are emerging globally. This poses a particular challenge to multinational companies, such as in the oil and natural gas industry sector, which operate globally and in joint ventures. The American Petroleum Institute (API) and its member companies recognised these challenges over a decade ago and launched a multi-year initiative to map out and provide tools for the quantification of GHG emissions from oil and natural gas industry operations and similar industrial sources. During this time span, the industry developed several key guidance documents to promote the consistent and accurate quantification and reporting of GHG emissions. This paper will focus on two recent publications: the 2009 Edition of API’s Compendium of GHG Emissions Estimation Methodologies for the Oil and Gas Industry (3rd revision); and, a new document addressing technical considerations and statistical calculation methods for assessing the uncertainty of GHG emission estimates. The paper will discuss case studies pertinent to oil and natural gas exploration and production activities and will put these in context with emerging US mandatory GHG emissions reporting. It will also discuss the broad applicability of these estimation methods, and uncertainty considerations, to most industry sectors that rely on fossil fuels for their energy sources.
21

Liu, Xingpo, Chengfei Xia, Yifan Tang, Jiayang Tu and Huimin Wang. "Parameter optimization and uncertainty assessment for rainfall frequency modeling using an adaptive Metropolis–Hastings algorithm". Water Science and Technology 83, no. 5 (January 27, 2021): 1085–102. http://dx.doi.org/10.2166/wst.2021.032.

Abstract
A new parameter optimization and uncertainty assessment procedure using Bayesian inference with an adaptive Metropolis–Hastings (AM-H) algorithm is presented for extreme rainfall frequency modeling. An efficient Markov chain Monte Carlo sampler is adopted to explore the posterior distribution of parameters and calculate their uncertainty intervals associated with the magnitude of estimated rainfall depth quantiles. The efficiency of AM-H and conventional maximum likelihood estimation (MLE) in parameter estimation and uncertainty quantification is also compared, and the procedure is demonstrated for the case of Chaohu city, China. Results of our work reveal that: (i) the adaptive Bayesian method, especially for return levels associated with large return periods, gives better estimates than MLE; it should be noted that MLE often produces overly optimistic results in the case of Chaohu city; (ii) the AM-H algorithm is more reliable than MLE in terms of uncertainty quantification, and yields relatively narrow credible intervals for the quantile estimates, which is instrumental in risk assessment for urban storm drainage planning.
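A stripped-down random-walk Metropolis sampler with a crude step-size adaptation during burn-in conveys the flavour of the AM-H scheme the abstract evaluates; the genuine adaptive Metropolis algorithm adapts the full proposal covariance from the chain history, which is not reproduced here. The standard-normal target and all tuning constants are assumptions chosen so the output is checkable.

```python
import math
import random

# Random-walk Metropolis with step-size adaptation during burn-in only (the
# adaptation is then frozen, so the post-burn-in chain is a valid MCMC run).
# Target: standard normal. A rainfall application would use a log-posterior
# of an extreme-value distribution instead.

def log_target(x):
    return -0.5 * x * x                     # log N(0, 1), up to a constant

def sample(n_iter=20000, burn_in=5000, seed=42):
    rng = random.Random(seed)
    x, step, accepts = 0.0, 1.0, 0
    chain = []
    for i in range(n_iter):
        proposal = x + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x, accepts = proposal, accepts + 1
        if i < burn_in and (i + 1) % 100 == 0:
            # push the step size toward roughly 40% acceptance, then freeze it
            step *= 1.1 if accepts / (i + 1) > 0.4 else 0.9
        if i >= burn_in:
            chain.append(x)
    return chain

chain = sample()
post_mean = sum(chain) / len(chain)
post_var = sum((v - post_mean) ** 2 for v in chain) / len(chain)
```

Quantile credible intervals of the kind the abstract reports are then read directly from the sorted chain rather than from an asymptotic formula, which is where the Bayesian approach gains over MLE.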
22

Scotto di Perta, Ester, Nunzio Fiorentino, Marco Carozzi, Elena Cervelli and Stefania Pindozzi. "A Review of Chamber and Micrometeorological Methods to Quantify NH3 Emissions from Fertilisers Field Application". International Journal of Agronomy 2020 (August 1, 2020): 1–16. http://dx.doi.org/10.1155/2020/8909784.

Abstract
Agriculture is mainly responsible for ammonia (NH3) volatilisation. A common effort to produce reliable quantifications, national emission inventories, and policies is needed to reduce the health and environmental issues related to this emission. Sources of NH3 are locally distributed and mainly depend on farm building characteristics, management of excreta, and the field application of mineral fertilisers. To date, appropriate measurements related to the application of fertilisers to the field are still scarce in the literature. Proper quantification of NH3 must consider the nature of the fertiliser, the environmental variables that influence the dynamics of the emission, and a reliable measurement method. This paper presents the state of the art of the most commonly used direct methods to measure NH3 volatilisation following field application of fertilisers, mainly focusing on the chamber method. The characteristics and the associated measurement uncertainty of the most widespread chamber types are discussed and compared to the micrometeorological methods.
23

Hemmings, J. C. P., P. G. Challenor and A. Yool. "Mechanistic site-based emulation of a global ocean biogeochemical model for parametric analysis and calibration". Geoscientific Model Development Discussions 7, no. 5 (September 25, 2014): 6327–411. http://dx.doi.org/10.5194/gmdd-7-6327-2014.

Abstract
Biogeochemical ocean circulation models used to investigate the role of plankton ecosystems in global change rely on adjustable parameters to compensate for missing biological complexity. In principle, optimal parameter values can be estimated by fitting models to observational data, including satellite ocean colour products such as chlorophyll that achieve good spatial and temporal coverage of the surface ocean. However, comprehensive parametric analyses require large ensemble experiments that are computationally infeasible with global 3-D simulations. Site-based simulations provide an efficient alternative but can only be used to make reliable inferences about global model performance if robust quantitative descriptions of their relationships with the corresponding 3-D simulations can be established. The feasibility of establishing such a relationship is investigated for an intermediate complexity biogeochemistry model (MEDUSA) coupled with a widely-used global ocean model (NEMO). A site-based mechanistic emulator is constructed for surface chlorophyll output from this target model as a function of model parameters. The emulator comprises an array of 1-D simulators and a statistical quantification of the uncertainty in their predictions. The unknown parameter-dependent biogeochemical environment, in terms of initial tracer concentrations and lateral flux information required by the simulators, is a significant source of uncertainty. It is approximated by a mean environment derived from a small ensemble of 3-D simulations representing variability of the target model behaviour over the parameter space of interest. The performance of two alternative uncertainty quantification schemes is examined: a direct method based on comparisons between simulator output and a sample of known target model "truths" and an indirect method that is only partially reliant on knowledge of target model output.
In general, chlorophyll records at a representative array of oceanic sites are well reproduced. The use of lateral flux information reduces the 1-D simulator error considerably, consistent with a major influence of advection at some sites. Emulator robustness is assessed by comparing actual error distributions with those predicted. With the direct uncertainty quantification scheme, the emulator is reasonably robust over all sites. The indirect uncertainty quantification scheme is less reliable at some sites but scope for improving its performance is identified. The results demonstrate the strong potential of the emulation approach to improve the effectiveness of site-based methods. This represents important progress towards establishing a robust site-based capability that will allow comprehensive parametric analyses to be achieved for improving global models and quantifying uncertainty in their predictions.
24

Hwang, Sungkun, Recep M. Gorguluarslan, Hae-Jin Choi and Seung-Kyum Choi. "Integration of Dimension Reduction and Uncertainty Quantification in Designing Stretchable Strain Gauge Sensor". Applied Sciences 10, no. 2 (January 16, 2020): 643. http://dx.doi.org/10.3390/app10020643.

Abstract
Interest in strain gauge sensors employing stretchable patch antennas has escalated in the area of structural health monitoring, because the malleable sensor is sensitive enough to capture strain variation in structures of any shape. However, owing to the narrow frequency bandwidth of the patch antenna, the operating quality of the strain sensor is not always assured under structural deformation, which creates unpredictable frequency shifts. Geometric properties of the stretchable antenna also strongly govern the performance of the sensor. In particular, the rugged substrate created by the printing procedure and manual fabrication gives rise to multivariate design variables. Such design variables intensify the computational burden and the uncertainties that impede reliable analysis of the strain sensor. In this research, therefore, a framework is proposed not only to comprehensively capture the sensor’s geometric design variables, but also to effectively reduce the multivariate dimensions. The geometric uncertainties are characterized based on measurements from real specimens, and a Gaussian copula is used to represent them with their correlations. A dimension reduction process with a clear decision criterion based on an entropy-based correlation coefficient reduces the uncertainties that inhibit precise system reliability assessment. After handling the uncertainties, an artificial neural network-based surrogate model predicts the system responses, and a probabilistic neural network derives a precise estimation of the variability of complicated system behavior. To elicit better performance of the stretchable antenna-based strain sensor, a shape optimization process is then executed by developing an optimal design of the strain sensor, which resolves the issue of the frequency shift in the narrow bandwidth.
Compared with the conventional rigid antenna-based strain sensors, the proposed design brings flexible shape adjustment that enables the resonance frequency to be maintained in reliable frequency bandwidth and antenna performance to be maximized under deformation. Hence, the efficacy of the proposed design framework that employs uncertainty characterization, dimension reduction, and machine learning-based behavior prediction is epitomized by the stretchable antenna-based strain sensor.
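The Gaussian-copula step described above can be sketched in a few lines. The correlation value and the lognormal/normal marginals below are illustrative stand-ins, not the measured specimen data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Target correlation between two geometric design variables (illustrative).
corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])

# 1) Correlated standard normals -- the Gaussian copula core.
z = rng.standard_normal((10_000, 2)) @ np.linalg.cholesky(corr).T

# 2) Map to uniforms through the standard normal CDF.
u = stats.norm.cdf(z)

# 3) Apply the marginal inverse CDFs: a lognormal substrate thickness and a
#    normal patch width (hypothetical marginals, chosen for the sketch).
thickness = stats.lognorm(s=0.1, scale=0.5).ppf(u[:, 0])  # mm
width = stats.norm(loc=20.0, scale=0.3).ppf(u[:, 1])      # mm

# The samples should reproduce the target dependence structure.
rank_corr = stats.spearmanr(thickness, width)[0]
```

The copula separates the dependence structure (step 1–2) from the marginals (step 3), which is what makes it convenient for characterizing correlated geometric uncertainties from measured data.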
25

Braun, Mathias, Olivier Piller, Jochen Deuerlein, Iraj Mortazavi and Angelo Iollo. "Uncertainty quantification of water age in water supply systems by use of spectral propagation". Journal of Hydroinformatics 22, no. 1 (June 3, 2019): 111–20. http://dx.doi.org/10.2166/hydro.2019.017.

Full text
Abstract
Abstract. Water distribution networks are critical infrastructures that should ensure the reliable supply of high-quality potable water to their users. Numerical models of these networks are generally governed by many parameters whose exact values are not known. This may be due to a lack of precise knowledge, as for consumer demand, or a lack of accessibility, as for pipe roughness. For network managers, the effect of these uncertainties on the network state is important information that supports them in the decision-making process. This effect is generally evaluated by propagating the uncertainties through the mathematical model. In the past, perturbation, fuzzy and stochastic collocation methods have been used for uncertainty propagation. However, these methods are limited either in the accuracy of the results or in the computational effort of the necessary calculations. This paper uses an alternative spectral approach based on the polynomial chaos expansion, which has the potential to give results of accuracy comparable to Monte Carlo sampling through the definition of a stochastic model. This approach is applied to the hydraulic models of two real networks in order to evaluate the influence of uncertain demands on the water age.
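The spectral idea can be illustrated with a one-dimensional polynomial chaos expansion in a Hermite basis. The `water_age` function below is a hypothetical stand-in for the network solver, and the demand distribution is invented for the sketch:

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as H

# Toy "hydraulic" response: water age rises sharply as demand d drops
# (a hypothetical stand-in for the hydraulic model, not the paper's solver).
def water_age(d):
    return 10.0 / d + 0.5 * d

mu, sigma = 2.0, 0.2               # uncertain demand, d ~ N(mu, sigma^2)

# Gauss-Hermite (probabilists') quadrature in the standard normal xi.
nodes, weights = H.hermegauss(20)
weights = weights / weights.sum()  # normalise into probability weights
f_vals = water_age(mu + sigma * nodes)

# Spectral projection onto Hermite polynomials He_k: c_k = E[f He_k] / k!
order = 4
coeffs = [np.sum(weights * f_vals * H.hermeval(nodes, [0] * k + [1]))
          / math.factorial(k) for k in range(order + 1)]

# Mean and variance come directly from the expansion coefficients.
pce_mean = coeffs[0]
pce_var = sum(math.factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))

# Monte Carlo reference for comparison.
rng = np.random.default_rng(1)
mc = water_age(mu + sigma * rng.standard_normal(200_000))
```

A handful of model evaluations at quadrature nodes replaces the many thousands of runs a Monte Carlo estimate needs, which is the computational argument the abstract makes.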
26

Hemmings, J. C. P., P. G. Challenor and A. Yool. "Mechanistic site-based emulation of a global ocean biogeochemical model (MEDUSA 1.0) for parametric analysis and calibration: an application of the Marine Model Optimization Testbed (MarMOT 1.1)". Geoscientific Model Development 8, no. 3 (March 23, 2015): 697–731. http://dx.doi.org/10.5194/gmd-8-697-2015.

Full text
Abstract
Abstract. Biogeochemical ocean circulation models used to investigate the role of plankton ecosystems in global change rely on adjustable parameters to capture the dominant biogeochemical dynamics of a complex biological system. In principle, optimal parameter values can be estimated by fitting models to observational data, including satellite ocean colour products such as chlorophyll that achieve good spatial and temporal coverage of the surface ocean. However, comprehensive parametric analyses require large ensemble experiments that are computationally infeasible with global 3-D simulations. Site-based simulations provide an efficient alternative but can only be used to make reliable inferences about global model performance if robust quantitative descriptions of their relationships with the corresponding 3-D simulations can be established. The feasibility of establishing such a relationship is investigated for an intermediate complexity biogeochemistry model (MEDUSA) coupled with a widely used global ocean model (NEMO). A site-based mechanistic emulator is constructed for surface chlorophyll output from this target model as a function of model parameters. The emulator comprises an array of 1-D simulators and a statistical quantification of the uncertainty in their predictions. The unknown parameter-dependent biogeochemical environment, in terms of initial tracer concentrations and lateral flux information required by the simulators, is a significant source of uncertainty. It is approximated by a mean environment derived from a small ensemble of 3-D simulations representing variability of the target model behaviour over the parameter space of interest. The performance of two alternative uncertainty quantification schemes is examined: a direct method based on comparisons between simulator output and a sample of known target model "truths" and an indirect method that is only partially reliant on knowledge of the target model output. 
In general, chlorophyll records at a representative array of oceanic sites are well reproduced. The use of lateral flux information reduces the 1-D simulator error considerably, consistent with a major influence of advection at some sites. Emulator robustness is assessed by comparing actual error distributions with those predicted. With the direct uncertainty quantification scheme, the emulator is reasonably robust over all sites. The indirect uncertainty quantification scheme is less reliable at some sites but scope for improving its performance is identified. The results demonstrate the strong potential of the emulation approach to improve the effectiveness of site-based methods. This represents important progress towards establishing a robust site-based capability that will allow comprehensive parametric analyses to be achieved for improving global models and quantifying uncertainty in their predictions.
27

Al-Dahidi, Sameer, Piero Baraldi, Enrico Zio and Montelatici Lorenzo. "Bootstrapped Ensemble of Artificial Neural Networks Technique for Quantifying Uncertainty in Prediction of Wind Energy Production". Sustainability 13, no. 11 (June 4, 2021): 6417. http://dx.doi.org/10.3390/su13116417.

Full text
Abstract
The accurate prediction of wind energy production is crucial for an affordable and reliable power supply to consumers. Prediction models are used as decision-aid tools for electric grid operators to dynamically balance the energy production provided by a pool of diverse sources in the energy mix. However, different sources of uncertainty affect the predictions, providing the decision-makers with inaccurate and possibly misleading information for grid operation. In this regard, this work aims to quantify the possible sources of uncertainty that affect the predictions of wind energy production provided by an ensemble of Artificial Neural Network (ANN) models. The proposed Bootstrap (BS) technique for uncertainty quantification relies on estimating Prediction Intervals (PIs) for a predefined confidence level. The capability of the proposed BS technique is verified, considering a 34 MW wind plant located in Italy. The obtained results show that the BS technique provides a more satisfactory quantification of the uncertainty of wind energy predictions than both a technique adopted by the wind plant owner and the Mean-Variance Estimation (MVE) technique from the literature. The PIs obtained by the BS technique are also analyzed in terms of different weather conditions experienced by the wind plant and time horizons of prediction.
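A minimal version of bootstrap-based prediction intervals can be sketched with a polynomial regressor standing in for the ANN ensemble; all data below are synthetic, not the 34 MW plant measurements:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic wind data: power as a noisy quadratic in wind speed
# (illustrative, not the plant data from the paper).
speed = rng.uniform(3, 15, 300)
power = 0.4 * speed**2 - 2.0 + rng.normal(0.0, 2.0, 300)

# Bootstrap ensemble: refit a simple polynomial "model" on resampled data.
# The paper uses ANNs; a polynomial keeps the sketch self-contained.
B = 200
grid = np.linspace(3, 15, 50)
preds = np.empty((B, grid.size))
for b in range(B):
    idx = rng.integers(0, speed.size, speed.size)  # resample with replacement
    coef = np.polyfit(speed[idx], power[idx], 2)
    preds[b] = np.polyval(coef, grid)

# Epistemic (model) variance from the ensemble spread, plus the noise
# variance from residuals, combined into a ~95% prediction interval.
mean_pred = preds.mean(axis=0)
model_var = preds.var(axis=0)
base_fit = np.polyfit(speed, power, 2)
noise_var = np.mean((power - np.polyval(base_fit, speed))**2)
half = 1.96 * np.sqrt(model_var + noise_var)
pi_lower, pi_upper = mean_pred - half, mean_pred + half

# Empirical coverage of the interval on the data (should be near 95%).
hw = 1.96 * np.sqrt(np.interp(speed, grid, model_var) + noise_var)
fitted = np.polyval(base_fit, speed)
coverage = np.mean((power >= fitted - hw) & (power <= fitted + hw))
```

The ensemble spread captures model uncertainty and the residual variance captures noise; their sum is what widens the PIs under unusual weather conditions.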
28

Gisler, Batoul M. "Uncertainty Quantification for a Hydraulic Fracture Geometry: Application to Woodford Shale Data". Geofluids 2021 (August 21, 2021): 1–14. http://dx.doi.org/10.1155/2021/2138115.

Full text
Abstract
Hydraulic fracturing enhances hydrocarbon production from low permeability reservoirs. Laboratory tests and direct field measurements do a decent job of predicting the response of the system but are expensive and not easily accessible, thus increasing the need for robust deterministic and numerical solutions. The reliability of these mathematical models hinges on the uncertainties in the input parameters because uncertainty propagates to the output solution resulting in incorrect interpretations. Here, I build a framework for uncertainty quantification for a 1D fracture geometry using Woodford shale data. The proposed framework uses Monte-Carlo-based statistical methods and is comprised of two parts: sensitivity analysis and the probability density functions. Results reveal the transient nature of the sensitivity analysis, showing that Young’s modulus controls the initial pore pressure, which after 1 hour depends on the hydraulic conductivity. Results also show that the leak-off is most sensitive to permeability and thermal expansion coefficient of the rock and that temperature evolution primarily depends on thermal conductivity and the overall heat capacity. Furthermore, the model shows that Young’s modulus controls the initial fracture width, which after 1 hour of injection depends on the thermal expansion coefficient. Finally, the probability density curve of the transient fracture width displays the range of possible fracture aperture and adequate proppant size. The good agreement between the statistical model and field observations shows that the probability density curve can provide a reliable insight into the optimal proppant size.
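The Monte-Carlo-based part of such a framework can be sketched as follows; the Sneddon-type aperture relation and all input distributions are illustrative assumptions, not the paper's calibrated Woodford values:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# Uncertain inputs (illustrative distributions, not Woodford-calibrated):
E = rng.normal(30e9, 4.5e9, n)        # Young's modulus, Pa
nu = rng.uniform(0.20, 0.30, n)       # Poisson's ratio
p = rng.normal(5e6, 0.5e6, n)         # net fracture pressure, Pa
half_len = 50.0                       # fracture half-length, m (fixed)

# Plane-strain aperture of a pressurised crack (Sneddon-type relation),
# standing in for the paper's coupled 1-D model.
width = 4.0 * (1.0 - nu**2) * p * half_len / E

# Monte Carlo "probability density", summarised here by quantiles.
q05, q50, q95 = np.quantile(width, [0.05, 0.50, 0.95])

# Crude sensitivity ranking via rank correlation of each input with the output.
def rank_corr(x, y):
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

sens = {name: abs(rank_corr(v, width))
        for name, v in {"E": E, "nu": nu, "p": p}.items()}
most_sensitive = max(sens, key=sens.get)
```

The quantile band plays the role of the probability density curve for the fracture aperture, and the rank-correlation ranking is a cheap surrogate for a full sensitivity analysis.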
29

Bevia, Vicente José, Clara Burgos Simón, Juan Carlos Cortés and Rafael J. Villanueva Micó. "Uncertainty Quantification of Random Microbial Growth in a Competitive Environment via Probability Density Functions". Fractal and Fractional 5, no. 2 (March 24, 2021): 26. http://dx.doi.org/10.3390/fractalfract5020026.

Full text
Abstract
The Baranyi–Roberts model describes the dynamics of the volumetric densities of two interacting cell populations. We randomize this model by considering that the initial conditions are random variables whose distributions are determined by using sample data and the principle of maximum entropy. Subsequently, we obtain the Liouville–Gibbs partial differential equation for the probability density function of the two-dimensional solution stochastic process. Because the exact solution of this equation is unaffordable, we use a finite volume scheme to numerically approximate the aforementioned probability density function. From this key information, we design an optimization procedure in order to determine the best growth rates of the Baranyi–Roberts model, so that the expectation of the numerical solution is as close as possible to the sample data. The results show a good fit that allows reliable predictions.
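A one-dimensional analogue of the Liouville equation approach can be sketched with an upwind finite-volume scheme for logistic growth (a stand-in for the two-dimensional Baranyi–Roberts system; all parameters are illustrative):

```python
import numpy as np

# Logistic growth x' = r x (1 - x/K) with a random initial condition.
r, K = 1.0, 1.0
def f(x):
    return r * x * (1.0 - x / K)

# Finite-volume grid for the density of x.
nx = 400
edges = np.linspace(0.0, 1.2, nx + 1)
centers = 0.5 * (edges[:-1] + edges[1:])
dx = edges[1] - edges[0]

# Initial density: narrow Gaussian around x0 = 0.2, normalised on the grid.
p = np.exp(-0.5 * ((centers - 0.2) / 0.03)**2)
p /= p.sum() * dx

# Upwind finite-volume scheme for the Liouville equation p_t + (f p)_x = 0,
# with no-flux outer boundaries (total probability is conserved).
dt = 0.4 * dx / np.abs(f(centers)).max()
t, t_end = 0.0, 2.0
while t < t_end:
    v = f(edges[1:-1])                              # interior face velocities
    flux = np.where(v > 0, v * p[:-1], v * p[1:])   # upwind interface flux
    pn = p.copy()
    pn[:-1] -= dt / dx * flux
    pn[1:] += dt / dx * flux
    p = pn
    t += dt

fv_mean = np.sum(centers * p) * dx

# Monte Carlo reference: integrate the ODE for sampled initial conditions.
rng = np.random.default_rng(3)
x = np.clip(rng.normal(0.2, 0.03, 20_000), 1e-6, None)
steps, h = 2000, 2.0 / 2000
for _ in range(steps):
    x = x + h * f(x)                                # forward Euler
mc_mean = x.mean()
```

The finite-volume density carries the full distributional information in one deterministic solve, which is what the optimization procedure in the paper exploits.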
30

Tran, Vinh Ngoc and Jongho Kim. "Toward an Efficient Uncertainty Quantification of Streamflow Predictions Using Sparse Polynomial Chaos Expansion". Water 13, no. 2 (January 15, 2021): 203. http://dx.doi.org/10.3390/w13020203.

Full text
Abstract
Reliable hydrologic models are essential for the planning, design, and management of water resources. However, predictions by hydrological models are prone to errors due to a variety of sources of uncertainty. More accurate quantification of these uncertainties using a large number of ensembles and model runs is hampered by the high computational burden. In this study, we developed a highly efficient surrogate model constructed by sparse polynomial chaos expansion (SPCE) coupled with the least angle regression method, which enables efficient uncertainty quantifications. Polynomial chaos expansion was employed to surrogate a storage function-based hydrological model (SFM) for nine streamflow events in the Hongcheon watershed of South Korea. The efficiency of SPCE is investigated by comparing it with another surrogate model, full polynomial chaos expansion (FPCE) built by a well-known, ordinary least square regression (OLS) method. This study confirms that (1) the performance of SPCE is superior to that of FPCE because SPCE can build a more accurate surrogate model (i.e., smaller leave-one-out cross-validation error) with one-quarter the size (i.e., 500 versus 2000). (2) SPCE can sufficiently capture the uncertainty of the streamflow, which is comparable to that of SFM. (3) Sensitivity analysis attained through visual inspection and mathematical computation of the Sobol’ index has been of great success for SPCE to capture the parameter sensitivity of SFM, identifying four parameters, α, Kbas, Pbas, and Pchn, that are most sensitive to the likelihood function, Nash-Sutcliffe efficiency. (4) SPCE runs about 200 times faster than SFM and about four times faster than FPCE. The SPCE approach builds a surrogate model quickly and robustly with a more compact experimental design compared to FPCE. Ultimately, it will benefit ensemble streamflow forecasting studies, which must provide information and alerts in real time.
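The leave-one-out error used to compare SPCE and FPCE has a closed form for least-squares PCE fits. The sketch below takes the OLS (FPCE-style) route on a toy response surface, not the LARS-based sparse selection or the SFM model:

```python
import numpy as np
from numpy.polynomial import legendre as Leg

rng = np.random.default_rng(5)

# Toy response surface standing in for the hydrological model
# (two uniform inputs on [-1, 1]; purely illustrative).
def model(x1, x2):
    return np.exp(0.3 * x1) + 0.5 * x1 * x2 + 0.1 * x2**2

# Experimental design and model evaluations.
n = 120
X = rng.uniform(-1, 1, (n, 2))
y = model(X[:, 0], X[:, 1])

# Total-degree-3 Legendre basis in two variables (10 terms).
degs = [(i, j) for i in range(4) for j in range(4) if i + j <= 3]
def basis(X):
    cols = [Leg.legval(X[:, 0], [0]*i + [1]) * Leg.legval(X[:, 1], [0]*j + [1])
            for i, j in degs]
    return np.column_stack(cols)

A = basis(X)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Leave-one-out error via the hat matrix -- no refitting needed for OLS:
#   e_loo_i = r_i / (1 - h_ii).
Hat = A @ np.linalg.pinv(A)
resid = y - A @ coef
e_loo = np.mean((resid / (1.0 - np.diag(Hat)))**2) / np.var(y)

# Validation error on fresh points, normalised the same way.
Xv = rng.uniform(-1, 1, (500, 2))
err = np.mean((basis(Xv) @ coef - model(Xv[:, 0], Xv[:, 1]))**2) / np.var(y)
```

The hat-matrix identity is why LOO error is cheap enough to steer basis selection in sparse PCE.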
31

Steiner, A. K., D. Hunt, S. P. Ho, G. Kirchengast, A. J. Mannucci, B. Scherllin-Pirscher, H. Gleisner et al. "Quantification of structural uncertainty in climate data records from GPS radio occultation". Atmospheric Chemistry and Physics Discussions 12, no. 10 (October 12, 2012): 26963–94. http://dx.doi.org/10.5194/acpd-12-26963-2012.

Full text
Abstract
Abstract. Global Positioning System (GPS) radio occultation (RO) has provided continuous observations of the Earth's atmosphere since 2001 with global coverage, all-weather capability, and high accuracy and vertical resolution in the upper troposphere and lower stratosphere (UTLS). Precise time measurements enable long-term stability but careful processing is needed. Here we provide climate-oriented atmospheric scientists with multicenter-based results on the long-term stability of RO climatological fields for trend studies. We quantify the structural uncertainty of atmospheric trends estimated from the RO record, which arises from current processing schemes of six international RO processing centers, DMI Copenhagen, EUM Darmstadt, GFZ Potsdam, JPL Pasadena, UCAR Boulder, and WEGC Graz. Monthly-mean zonal-mean fields of bending angle, refractivity, dry pressure, dry geopotential height, and dry temperature from the CHAMP mission are compared for September 2001 to September 2008. We find that structural uncertainty is lowest in the tropics and mid-latitudes (50° S to 50° N) from 8 km to 25 km for all inspected RO variables. In this region, the structural uncertainty in trends over 7 yr is <0.03% for bending angle, refractivity, and pressure, <3 m for geopotential height of pressure levels, and <0.06 K for temperature; low enough for detecting a climate change signal within about a decade. Larger structural uncertainty above about 25 km and at high latitudes is attributable to differences in the processing schemes, which undergo continuous improvements. Though current use of RO for reliable climate trend assessment is bound to 50° S to 50° N, our results show that quality, consistency, and reproducibility are favorable in the UTLS for the establishment of a climate benchmark record.
32

Steiner, A. K., D. Hunt, S. P. Ho, G. Kirchengast, A. J. Mannucci, B. Scherllin-Pirscher, H. Gleisner et al. "Quantification of structural uncertainty in climate data records from GPS radio occultation". Atmospheric Chemistry and Physics 13, no. 3 (February 6, 2013): 1469–84. http://dx.doi.org/10.5194/acp-13-1469-2013.

Full text
Abstract
Abstract. Global Positioning System (GPS) radio occultation (RO) has provided continuous observations of the Earth's atmosphere since 2001 with global coverage, all-weather capability, and high accuracy and vertical resolution in the upper troposphere and lower stratosphere (UTLS). Precise time measurements enable long-term stability but careful processing is needed. Here we provide climate-oriented atmospheric scientists with multicenter-based results on the long-term stability of RO climatological fields for trend studies. We quantify the structural uncertainty of atmospheric trends estimated from the RO record, which arises from current processing schemes of six international RO processing centers, DMI Copenhagen, EUM Darmstadt, GFZ Potsdam, JPL Pasadena, UCAR Boulder, and WEGC Graz. Monthly-mean zonal-mean fields of bending angle, refractivity, dry pressure, dry geopotential height, and dry temperature from the CHAMP mission are compared for September 2001 to September 2008. We find that structural uncertainty is lowest in the tropics and mid-latitudes (50° S to 50° N) from 8 km to 25 km for all inspected RO variables. In this region, the structural uncertainty in trends over 7 yr is <0.03% for bending angle, refractivity, and pressure, <3 m for geopotential height of pressure levels, and <0.06 K for temperature; low enough for detecting a climate change signal within about a decade. Larger structural uncertainty above about 25 km and at high latitudes is attributable to differences in the processing schemes, which undergo continuous improvements. Though current use of RO for reliable climate trend assessment is bound to 50° S to 50° N, our results show that quality, consistency, and reproducibility are favorable in the UTLS for the establishment of a climate benchmark record.
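The notion of structural uncertainty — the spread of trend estimates across independent processing centers — can be illustrated with synthetic anomaly series (the four "centers", the trend, and the noise level below are invented for the sketch):

```python
import numpy as np

rng = np.random.default_rng(11)
months = np.arange(84)                       # an 84-month (7-year) record

# Synthetic monthly anomalies from four "processing centers": the same
# underlying signal plus center-specific processing noise (illustrative).
true_trend = 0.02 / 12                       # K per month
signal = true_trend * months + 0.1 * np.sin(2 * np.pi * months / 12)
centers = {c: signal + rng.normal(0, 0.05, months.size)
           for c in ["A", "B", "C", "D"]}

# Per-center trend over the record via least squares.
trends = {}
for name, series in centers.items():
    slope = np.polyfit(months, series, 1)[0]
    trends[name] = slope * months.size       # K per 7-year record

# Structural uncertainty: spread of the trend estimates across centers.
values = np.array(list(trends.values()))
structural_unc = values.max() - values.min()
```

Because all centers start from the same raw observations, the spread isolates the contribution of the processing schemes themselves, which is exactly the quantity the paper reports per variable and latitude band.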
33

Nikishova, A., L. Veen, P. Zun and A. G. Hoekstra. "Semi-intrusive multiscale metamodelling uncertainty quantification with application to a model of in-stent restenosis". Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 377, no. 2142 (February 18, 2019): 20180154. http://dx.doi.org/10.1098/rsta.2018.0154.

Full text
Abstract
We explore the efficiency of a semi-intrusive uncertainty quantification (UQ) method for multiscale models as proposed by us in an earlier publication. We applied the multiscale metamodelling UQ method to a two-dimensional multiscale model for the wound healing response in a coronary artery after stenting (in-stent restenosis). The results obtained by the semi-intrusive method show a good match to those obtained by a black-box quasi-Monte Carlo method. Moreover, we significantly reduce the computational cost of the UQ. We conclude that the semi-intrusive metamodelling method is reliable and efficient, and can be applied to such complex models as the in-stent restenosis ISR2D model. This article is part of the theme issue ‘Multiscale modelling, simulation and computing: from the desktop to the exascale’.
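The semi-intrusive idea — replacing an expensive micro-scale model with a cheap metamodel inside the macro loop — can be sketched in miniature; the toy growth model and all values are illustrative, not the ISR2D model:

```python
import numpy as np

# "Expensive" micro-scale response (a stand-in, chosen for illustration).
def micro(stress):
    return 0.1 * np.tanh(2.0 * stress)

# Metamodel: tabulate the micro model once, then interpolate.
grid = np.linspace(0.0, 2.0, 201)
table = micro(grid)
def metamodel(stress):
    return np.interp(stress, grid, table)

# Macro-scale loop: the micro model is called at every step, so replacing
# it with the metamodel is where the UQ cost saving comes from.
def macro(growth_rate, micro_model, steps=100):
    area = 0.2                          # macro state (e.g. neointimal area)
    for _ in range(steps):
        stress = 2.0 * (1.0 - area)     # macro field driving the micro model
        area += growth_rate * micro_model(stress)
    return area

# Propagate an uncertain growth rate through both model versions.
rng = np.random.default_rng(8)
rates = rng.uniform(0.05, 0.15, 500)
full = np.array([macro(x, micro) for x in rates])
surrogate = np.array([macro(x, metamodel) for x in rates])
mean_err = abs(full.mean() - surrogate.mean())
```

The UQ is "semi-intrusive" because only the single-scale micro component is replaced; the macro solver and the coupling are left untouched.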
34

Schulze, Moritz and René Schenkendorf. "Robust Model Selection: Flatness-Based Optimal Experimental Design for a Biocatalytic Reaction". Processes 8, no. 2 (February 5, 2020): 190. http://dx.doi.org/10.3390/pr8020190.

Full text
Abstract
Considering the competitive and strongly regulated pharmaceutical industry, mathematical modeling and process systems engineering might be useful tools for implementing quality by design (QbD) and quality by control (QbC) strategies for low-cost but high-quality drugs. However, a crucial task in modeling (bio)pharmaceutical manufacturing processes is the reliable identification of model candidates from a set of various model hypotheses. To identify the best experimental design suitable for a reliable model selection and system identification is challenging for nonlinear (bio)pharmaceutical process models in general. This paper is the first to exploit differential flatness for model selection problems under uncertainty, and thus translates the model selection problem into advanced concepts of systems theory and controllability. Here, the optimal controls for improved model selection trajectories are expressed analytically with low computational costs. We further demonstrate the impact of parameter uncertainties on the differential flatness-based method and provide an effective robustification strategy with the point estimate method for uncertainty quantification. In a simulation study, we consider a biocatalytic reaction step simulating the carboligation of aldehydes, where we successfully derive optimal controls for improved model selection trajectories under uncertainty.
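The point estimate method used for robustification admits a compact sketch. For Gaussian inputs, the 2n+1 scheme evaluates the model at the mean and at ±√3-sigma shifts along each input axis; the reaction-yield function below is a hypothetical surrogate, not the carboligation model:

```python
import numpy as np

# Toy reaction-yield surrogate (hypothetical, not the paper's model).
def yield_model(k1, k2):
    return 100.0 * k1 / (k1 + k2) - 5.0 * k1 * k2

means = np.array([0.8, 0.3])
sds = np.array([0.05, 0.03])
n = means.size

# 2n+1 point estimate method for Gaussian inputs: central point plus
# +/- sqrt(3)-sigma shifts along each axis, with weights 1 - n/3 and 1/6
# (matching dimension-wise 3-point Gauss-Hermite quadrature).
y0 = yield_model(*means)
m1 = (1.0 - n / 3.0) * y0          # running estimate of E[y]
m2 = (1.0 - n / 3.0) * y0**2       # running estimate of E[y^2]
for k in range(n):
    for sign in (1.0, -1.0):
        x = means.copy()
        x[k] += sign * np.sqrt(3.0) * sds[k]
        yk = yield_model(*x)
        m1 += yk / 6.0
        m2 += yk**2 / 6.0
pem_mean, pem_var = m1, m2 - m1**2

# Monte Carlo reference: 5 model runs (PEM) versus 200,000 (MC).
rng = np.random.default_rng(4)
K = means + sds * rng.standard_normal((200_000, 2))
ymc = yield_model(K[:, 0], K[:, 1])
```

Five model runs instead of hundreds of thousands is what makes the robustification tractable inside an optimal experimental design loop.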
35

Gąsior, Robert and Mariusz P. Pietras. "Validation of a Method for Determining Cholesterol in Egg Yolks/ Walidacja metody oznaczania cholesterolu w żółtkach jaj". Annals of Animal Science 13, no. 1 (January 1, 2013): 143–53. http://dx.doi.org/10.2478/v10220-012-0066-7.

Full text
Abstract
Abstract. The aim of the study was to validate a gas chromatographic method for determining cholesterol in egg yolks according to the EN ISO/IEC 17025 standard. Of the two methods, with and without internal standard, the former was characterized by lower uncertainty, with a repeatability of 4% and within-laboratory reproducibility of 6%. The method’s uncertainty (n = 2, P≤0.05), which included sample preparation errors and chromatographic measurement errors, was 10.6%. Mean recovery was 99.9% and limit of quantification was 0.16 mg/g. The coefficient of variation for repeatability, which is calculated during routine analyses, should not exceed the 8% limit of repeatability. The method is reliable, as confirmed by the results of validation, and the procedure is relatively rapid and simple.
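The repeatability statistics referred to above follow a standard calculation; the replicate values below are illustrative, not the study's data:

```python
import math

# Repeatability from replicate determinations: the repeatability limit
# r = 2.8 * s_r is the maximum expected absolute difference between two
# results obtained under repeatability conditions at ~95% confidence.
replicates = [12.1, 12.4, 11.9, 12.3, 12.0, 12.2]   # mg/g cholesterol
mean = sum(replicates) / len(replicates)
s_r = math.sqrt(sum((x - mean)**2 for x in replicates)
                / (len(replicates) - 1))            # repeatability SD
cv_r = 100.0 * s_r / mean                           # repeatability CV, %
r_limit = 2.8 * s_r                                 # repeatability limit, mg/g
```

During routine analysis, a duplicate pair differing by more than `r_limit` (or a CV above the validated limit) flags a result for re-measurement.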
36

Zimmer, Christoph, Sequoia I. Leuba, Ted Cohen and Reza Yaesoubi. "Accurate quantification of uncertainty in epidemic parameter estimates and predictions using stochastic compartmental models". Statistical Methods in Medical Research 28, no. 12 (November 14, 2018): 3591–608. http://dx.doi.org/10.1177/0962280218805780.

Full text
Abstract
Stochastic transmission dynamic models are needed to quantify the uncertainty in estimates and predictions during outbreaks of infectious diseases. We previously developed a calibration method for stochastic epidemic compartmental models, called Multiple Shooting for Stochastic Systems (MSS), and demonstrated its competitive performance against a number of existing state-of-the-art calibration methods. The existing MSS method, however, lacks a mechanism against filter degeneracy, a phenomenon that results in parameter posterior distributions that are weighted heavily around a single value. As such, when filter degeneracy occurs, the posterior distributions of parameter estimates will not yield reliable credible or prediction intervals for parameter estimates and predictions. In this work, we extend the MSS method by evaluating and incorporating two resampling techniques to detect and resolve filter degeneracy. Using simulation experiments, we demonstrate that an extended MSS method produces credible and prediction intervals with desired coverage in estimating key epidemic parameters (e.g. mean duration of infectiousness and R0) and short- and long-term predictions (e.g. one- and three-week forecasts, timing and number of cases at the epidemic peak, and final epidemic size). Applying the extended MSS approach to a humidity-based stochastic compartmental influenza model, we were able to accurately predict influenza-like illness activity reported by the U.S. Centers for Disease Control and Prevention for 10 regions as well as city-level influenza activity using real-time, city-specific Google search query data from 119 U.S. cities between 2003 and 2014.
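The stochastic compartmental models being calibrated can be simulated exactly with Gillespie's algorithm; the SIR rates and population below are illustrative, not the influenza application:

```python
import numpy as np

rng = np.random.default_rng(2024)

# Gillespie (exact stochastic) simulation of an SIR model -- the kind of
# stochastic compartmental model the MSS method calibrates.
def sir_gillespie(beta, gamma, S, I, R, t_max):
    N = S + I + R
    t = 0.0
    peak_I = I
    while I > 0 and t < t_max:
        rate_inf = beta * S * I / N      # infection event rate
        rate_rec = gamma * I             # recovery event rate
        total = rate_inf + rate_rec
        t += rng.exponential(1.0 / total)
        if rng.random() < rate_inf / total:
            S -= 1; I += 1
        else:
            I -= 1; R += 1
        peak_I = max(peak_I, I)
    return R, peak_I                     # final size and epidemic peak

beta, gamma = 0.5, 0.2                   # R0 = beta / gamma = 2.5
finals, peaks = [], []
for _ in range(200):
    final_size, peak = sir_gillespie(beta, gamma, S=990, I=10, R=0, t_max=365)
    finals.append(final_size)
    peaks.append(peak)

# Interval for the final epidemic size from the stochastic ensemble.
lo, hi = np.percentile(finals, [2.5, 97.5])
```

The run-to-run spread of `finals` and `peaks` is the intrinsic stochastic variability that credible and prediction intervals must cover; deterministic ODE models cannot produce it.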
37

Gao, Guohua, Jeroen C. Vink and Faruk O. Alpak. "Integrated Field-Scale Production and Economic Evaluation Under Subsurface Uncertainty for the Pattern-Driven Development of Unconventional Resources With Analytical Superposition". SPE Reservoir Evaluation & Engineering 19, no. 01 (December 31, 2015): 118–29. http://dx.doi.org/10.2118/173247-pa.

Full text
Abstract
Summary. The in-situ upgrading process (IUP) is a thermal-recovery technique that relies on a pattern-based development process, a complicated physical process that involves thermal and mass transfer in porous media, which renders full field-scale reservoir simulations impractical. Although it is feasible to quantify the impact of subsurface uncertainties on recovery for small-scale sector models with experimental design (ED), it is still a very challenging problem to quantify their impact on field-scale quantities. Straightforward upscaling to field scale does not work because such conventional superposition-based methods do not capture the effects of spatial variability in rock and fluid properties and the time delay in sequential pattern development. In this paper, we show that, under certain mild assumptions, an analytical superposition formulation can be developed that propagates the uncertainties of production forecasts and economic evaluations generated from a sector model to full field-scale quantities. One can simplify this formulation further so that the variance of a field-scale quantity is analytically expressed as the variance of the same single-pattern quantity multiplied by a (computable) scaleup factor. This makes it possible to implement a practical uncertainty quantification work flow in which single-pattern results are upscaled to accurate full field results with reliable uncertainty ranges, without the need for full field-scale simulations. We apply the proposed novel superposition and uncertainty-propagation method to a multipattern IUP development, and demonstrate that this work flow produces reliable results for field-scale production and economics as well as realistic uncertainty ranges. 
Moreover, these results indicate that the scaleup factor for single-pattern results can accurately capture the impact of spatial correlations of subsurface uncertainties, the size of the field-scale model, the time-delay in pattern development, and the discount rate. Uncertainty quantification of field-scale production and economics is a key factor for the successful development of unconventional resources such as extraheavy oil and oil shale with significant rewards in terms of risk management and project profitability. With minor modifications, the proposed method can also be applied to other pattern-driven processes such as the in-situ conversion process (ICP) and steam-assisted gravity drainage.
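A simplified form of such an analytical superposition can be checked numerically: for n per-pattern quantities with common variance and average pairwise correlation, the field variance follows from a closed-form scale-up factor. The equicorrelated model and values below are illustrative, not the paper's formulation:

```python
import numpy as np

# If the field quantity is the sum of n per-pattern quantities with common
# variance s2 and average pairwise correlation rho, then
#   Var(field) = s2 * n * (1 + (n - 1) * rho)
# i.e. a scale-up factor n * (1 + (n - 1) * rho) on the single-pattern
# variance (a simplified, hypothetical form of the paper's result).
n, s2, rho = 25, 4.0, 0.3
scaleup = n * (1 + (n - 1) * rho)
var_analytic = s2 * scaleup

# Monte Carlo check with equicorrelated Gaussian per-pattern quantities.
rng = np.random.default_rng(9)
cov = s2 * (rho * np.ones((n, n)) + (1 - rho) * np.eye(n))
samples = rng.multivariate_normal(np.zeros(n), cov, size=100_000)
var_mc = samples.sum(axis=1).var()
```

Note how the spatial correlation dominates: at rho = 0 the factor is just n, while strong correlation pushes it toward n², which is why capturing correlations (and time delays) matters for realistic field-scale uncertainty ranges.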
38

Proietto Galeano, Michele, Monica Scordino, Leonardo Sabatino, Valentina Pantò, Giovanni Morabito, Elena Chiappara, Pasqualino Traulo and Giacomo Gagliano. "UHPLC/MS-MS Analysis of Six Neonicotinoids in Honey by Modified QuEChERS: Method Development, Validation, and Uncertainty Measurement". International Journal of Food Science 2013 (2013): 1–7. http://dx.doi.org/10.1155/2013/863904.

Full text
Abstract
Rapid and reliable multiresidue analytical methods were developed and validated for the determination of 6 neonicotinoids pesticides (acetamiprid, clothianidin, imidacloprid, nitenpyram, thiacloprid, and thiamethoxam) in honey. A modified QuEChERS method has allowed a very rapid and efficient single-step extraction, while the detection was performed by UHPLC/MS-MS. The recovery studies were carried out by spiking the samples at two concentration levels (10 and 40 μg/kg). The methods were subjected to a thorough validation procedure. The mean recovery was in the range of 75 to 114% with repeatability below 20%. The limits of detection were below 2.5 μg/kg, while the limits of quantification did not exceed 4.0 μg/kg. The total uncertainty was evaluated taking the main independent uncertainty sources under consideration. The expanded uncertainty did not exceed 49% for the 10 μg/kg concentration level and was in the range of 16–19% for the 40 μg/kg fortification level.
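The expanded-uncertainty evaluation follows the usual root-sum-of-squares budget with a coverage factor; the component values below are illustrative, not the study's validation data:

```python
import math

# Relative standard uncertainties of the main independent sources
# (hypothetical budget for a pesticide concentration measurement).
components = {
    "recovery": 0.06,
    "repeatability": 0.08,
    "calibration": 0.04,
    "volume_weighing": 0.02,
}

# Combined standard uncertainty: root sum of squares of the components.
u_combined = math.sqrt(sum(u**2 for u in components.values()))

# Expanded uncertainty with coverage factor k = 2 (~95% confidence).
k = 2
U_expanded = k * u_combined

# Reported interval for a measured 10 ug/kg result.
value = 10.0
interval = (value * (1 - U_expanded), value * (1 + U_expanded))
```

The quadrature sum assumes the sources are independent; correlated contributions would need covariance terms, which is why the budget lists only the main independent sources.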
39

Jin, Jonghoon, Yuzhang Che, Jiafeng Zheng and Feng Xiao. "Uncertainty Quantification of a Coupled Model for Wind Prediction at a Wind Farm in Japan". Energies 12, no. 8 (April 21, 2019): 1505. http://dx.doi.org/10.3390/en12081505.

Full text
Abstract
Reliable and accurate short-term prediction of wind speed at hub height is very important to optimize the integration of wind energy into existing electrical systems. To this end, a coupled model based on the Weather Research and Forecasting (WRF) model and Open Source Field Operation and Manipulation (OpenFOAM) Computational Fluid Dynamics (CFD) model is proposed to improve the forecast of the wind fields over complex terrain regions. The proposed model has been validated with the quality-controlled observations of 15 turbine sites in a target wind farm in Japan. The numerical results show that the coupled model provides more precise forecasts compared to the WRF alone forecasts, with the overall improvements of 26%, 22% and 4% in mean error (ME), root mean square error (RMSE) and correlation coefficient (CC), respectively. As the first step to explore further improvement of the coupled system, the polynomial chaos expansion (PCE) approach is adopted to quantitatively evaluate the effects of several parameters in the coupled model. The statistics from the uncertainty quantification results show that the uncertainty in the inflow boundary conditions to the CFD model affects more dominantly the hub-height wind prediction in comparison with other parameters in the turbulence model, which suggests an effective approach to parameterize and assimilate the coupling interface of the model.
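First-order Sobol' indices of the kind used in such a parameter study can also be estimated with a pick-freeze Monte Carlo scheme; the log-law response and input ranges below are hypothetical stand-ins for the coupled-model parameters:

```python
import numpy as np

rng = np.random.default_rng(17)
n = 100_000

# Hypothetical hub-height (80 m) response: a log-law inflow term plus a
# small turbulence contribution (not the actual WRF/OpenFOAM chain).
def response(u_in, z0, tke):
    return u_in * np.log(80.0 / z0) + 0.2 * tke

def sample():
    return (rng.uniform(5, 10, n),      # inflow speed at the boundary
            rng.uniform(0.01, 0.5, n),  # roughness length
            rng.uniform(0.5, 2.0, n))   # turbulence intensity proxy

# Pick-freeze (Saltelli-type) estimator of first-order Sobol' indices:
#   S_i = E[yA * (yAB_i - yB)] / Var(y),
# where yAB_i uses sample B with input i taken from sample A.
A, B = sample(), sample()
yA, yB = response(*A), response(*B)
var_y = yA.var()

S = {}
for i, name in enumerate(["inflow", "roughness", "tke"]):
    mixed = list(B)
    mixed[i] = A[i]
    S[name] = np.mean(yA * (response(*mixed) - yB)) / var_y

dominant = max(S, key=S.get)
```

With this toy response the inflow term dominates the variance, mirroring the paper's finding that inflow boundary conditions matter more than turbulence-model parameters.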
40

Auer, Ekaterina, Julia Kersten and Andreas Rauh. "Preface". Acta Cybernetica 24, no. 3 (March 16, 2020): 265–66. http://dx.doi.org/10.14232/actacyb.24.3.2020.1.

Full text
Abstract
The Summer Workshop on Interval Methods (SWIM) is an annual meeting initiated in 2008 by the French MEA working group on Set Computation and Interval Techniques of the French research group on Automatic Control. A special focus of the MEA group is on promoting interval analysis techniques and applications to a broader community of researchers, facilitated by such multidisciplinary workshops. Since 2008, SWIM has become a keystone event for researchers dealing with various aspects of interval and set-based methods. In 2018, the 11th edition in this workshop series was held at the University of Rostock, Germany, with a focus on research topics in the fields of engineering, computer science, and mathematics. A total of 31 talks were given during this workshop, covering the following areas: verified solution of initial value problems for ordinary differential equations, differential-algebraic system models, and partial differential equations, scientific computing with guaranteed error bounds, design of robust and fault-tolerant control systems, modeling and quantification of errors in engineering tasks, implementation of software libraries, and usage of the aforementioned approaches for system models in control engineering, data analysis, signal and image processing. After a peer-review process, 15 high-quality articles were selected for publication in this special issue. They are roughly divided into two thematic groups: Uncertainty Modeling, Software, Verified Computing and Optimization as well as Interval Methods in Control and Robotics. The first part, Uncertainty Modeling, Software, Verified Computing and Optimization, contains methodological aspects concerning reliable modeling of dynamic systems as well as visualization and quantification of uncertainty in the fields of measurement and simulation. Moreover, existence proofs for solutions of partial differential equations and their reliable optimal control synthesis are considered. 
A paper making use of quantifier elimination for robust linear output feedback control by means of eigenvalue placement concludes this section. The second part of this special issue, Interval Methods in Control and Robotics, is focused on the design as well as numerical and experimental validation of robust state observation and control procedures along with reliable parameter and state estimation approaches in the fields of control for thermal systems, robotics, localization of drones and global positioning systems.
41

Salis, Christos, Nikolaos Kantartzis y Theodoros Zygiridis. "Stochastic LOD-FDTD method for two-dimensional electromagnetic uncertainty problems". COMPEL - The international journal for computation and mathematics in electrical and electronic engineering 36, n.º 5 (4 de septiembre de 2017): 1442–56. http://dx.doi.org/10.1108/compel-02-2017-0087.

Full text
Abstract
Purpose Random media uncertainties exhibit a significant impact on the properties of electromagnetic fields that deterministic models usually tend to neglect. As a result, these models fail to quantify the variation in the calculated electromagnetic fields, leading to inaccurate outcomes. This paper aims to introduce an unconditionally stable finite-difference time-domain (FDTD) method for assessing two-dimensional random media uncertainties in one simulation. Design/methodology/approach The proposed technique is an extension of the stochastic FDTD (S-FDTD) scheme, which approximates the variance of a given field component using the Delta method. Specifically in this paper, the Delta method is applied to the locally one-dimensional (LOD) FDTD scheme (hence named S-LOD-FDTD), to achieve unconditional stability. The validity of this algorithm is tested by solving two-dimensional random media problems and comparing the results with other methods, such as the Monte-Carlo (MC) and the S-FDTD techniques. Findings This paper provides numerical results that prove the unconditional stability of the S-LOD-FDTD technique. Also, the comparison with the MC and the S-FDTD methods shows that reliable outcomes can be extracted even with larger time steps, thus making this technique more efficient than the other two aforementioned schemes. Research limitations/implications The S-LOD-FDTD method requires the proper quantification of various correlation coefficients between the calculated fields and the electrical parameters, to achieve reliable results. These cannot be known beforehand, and the only known way to calculate them is to run a fraction of MC simulations. Originality/value This paper introduces a new unconditionally stable technique for measuring material uncertainties in one realization.
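The Delta method that the S-FDTD family builds on propagates input variances through a first-order Taylor expansion. A generic sketch (not the actual S-LOD-FDTD update equations) of the variance of f(x1, …, xn) given gradients, standard deviations, and correlation coefficients:

```python
def delta_method_variance(grad, sigmas, corr):
    """First-order (Delta method) variance of f(X1, ..., Xn):
    Var[f] ≈ sum_i sum_j (df/dxi)(df/dxj) * rho_ij * sigma_i * sigma_j,
    where grad holds the partial derivatives at the mean point,
    sigmas the input standard deviations, and corr the correlation matrix."""
    n = len(grad)
    var = 0.0
    for i in range(n):
        for j in range(n):
            var += grad[i] * grad[j] * corr[i][j] * sigmas[i] * sigmas[j]
    return var
```

With uncorrelated inputs the cross terms vanish and the familiar quadrature sum remains; the correlation coefficients are exactly the quantities the abstract notes must be estimated (e.g. from a fraction of MC runs).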
42

Olivieri, Alejandro C., Nicolaas M. Faber, Joan Ferré, Ricard Boqué, John H. Kalivas y Howard Mark. "Uncertainty estimation and figures of merit for multivariate calibration (IUPAC Technical Report)". Pure and Applied Chemistry 78, n.º 3 (1 de enero de 2006): 633–61. http://dx.doi.org/10.1351/pac200678030633.

Full text
Abstract
This paper gives an introduction to multivariate calibration from a chemometrics perspective and reviews the various proposals to generalize the well-established univariate methodology to the multivariate domain. Univariate calibration leads to relatively simple models with a sound statistical underpinning. The associated uncertainty estimation and figures of merit are thoroughly covered in several official documents. However, univariate model predictions for unknown samples are only reliable if the signal is sufficiently selective for the analyte of interest. By contrast, multivariate calibration methods may produce valid predictions also from highly unselective data. A case in point is quantification from near-infrared (NIR) spectra. With the ever-increasing sophistication of analytical instruments inevitably comes a suite of multivariate calibration methods, each with its own underlying assumptions and statistical properties. As a result, uncertainty estimation and figures of merit for multivariate calibration methods have become a subject of active research, especially in the field of chemometrics.
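For the univariate case the report takes as its baseline, the standard figures of merit fall out of an ordinary least-squares calibration line. A minimal sketch (with an IUPAC-style 3.3·s/slope detection-limit estimate; the report's multivariate generalizations are beyond this example):

```python
import numpy as np

def univariate_fom(conc, signal):
    """Least-squares calibration line and basic figures of merit:
    sensitivity (slope), residual standard error, and a 3.3*s/slope
    estimate of the limit of detection."""
    conc = np.asarray(conc, dtype=float)
    signal = np.asarray(signal, dtype=float)
    slope, intercept = np.polyfit(conc, signal, 1)
    residuals = signal - (slope * conc + intercept)
    s_res = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))
    lod = 3.3 * s_res / slope
    return {"sensitivity": slope, "intercept": intercept,
            "s_res": s_res, "lod": lod}
```

In the multivariate setting each of these quantities must be redefined (e.g. sensitivity via the net analyte signal), which is precisely the generalization the report surveys.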
43

Portilha-Cunha, M. Francisca, Teresa I. A. Gouveia, Alicia L. Garcia-Costa, Arminda Alves y Mónica S. F. Santos. "Multi-Matrix Approach for the Analysis of Bicalutamide Residues in Oncology Centers by HPLC–FLD". Molecules 26, n.º 18 (13 de septiembre de 2021): 5561. http://dx.doi.org/10.3390/molecules26185561.

Full text
Abstract
Cytostatics are toxic pharmaceuticals, whose presence on surfaces puts healthcare workers at risk. These drugs might also end up in hospital effluents (HWW), potentially damaging aquatic ecosystems. Bicalutamide is a cytostatic extensively consumed worldwide, but few analytical methods exist for its quantification and most of them require advanced techniques, such as liquid chromatography mass spectrometry (LC-MS), which are very complex and expensive for large monitoring studies. Therefore, a simple but reliable multi-matrix high performance liquid chromatographic method, with fluorescence detection, was developed and validated to rapidly screen abnormal concentrations of bicalutamide in HWW and relevant contamination levels of bicalutamide on indoor surfaces (>100 pg/cm²), prior to confirmation by LC-MS. The method presents good linearity and relatively low method detection limits (HWW: 0.14 ng/mL; surfaces: 0.28 pg/cm²). Global uncertainty was below 20% for concentrations higher than 25 ng/mL (HWW) and 50 pg/cm² (surfaces); global uncertainty was little affected by the matrix. Therefore, a multi-matrix assessment could be achieved with this method, thus contributing to a holistic quantification of bicalutamide along the cytostatic circuit. Bicalutamide was not detected in any of the grab samples from a Portuguese hospital, but an enlarged sampling is required to draw conclusions about its occurrence and exposure risks.
44

Grana, Dario, Leonardo Azevedo y Mingliang Liu. "A comparison of deep machine learning and Monte Carlo methods for facies classification from seismic data". GEOPHYSICS 85, n.º 4 (16 de enero de 2020): WA41—WA52. http://dx.doi.org/10.1190/geo2019-0405.1.

Full text
Abstract
Among the large variety of mathematical and computational methods for estimating reservoir properties such as facies and petrophysical variables from geophysical data, deep machine-learning algorithms have gained significant popularity for their ability to obtain accurate solutions for geophysical inverse problems in which the physical models are partially unknown. Solutions of classification and inversion problems are generally not unique, and uncertainty quantification studies are required to quantify the uncertainty in the model predictions and determine the precision of the results. Probabilistic methods, such as Monte Carlo approaches, provide a reliable approach for capturing the variability of the set of possible models that match the measured data. Here, we focused on the classification of facies from seismic data and benchmarked the performance of three different algorithms: recurrent neural network, Monte Carlo acceptance/rejection sampling, and Markov chain Monte Carlo. We tested and validated these approaches at the well locations by comparing classification predictions to the reference facies profile. The accuracy of the classification results is defined as the mismatch between the predictions and the log facies profile. Our study found that when the training data set of the neural network is large enough and the prior information about the transition probabilities of the facies in the Monte Carlo approach is not informative, machine-learning methods lead to more accurate solutions; however, the uncertainty of the solution might be underestimated. When some prior knowledge of the facies model is available, for example, from nearby wells, Monte Carlo methods provide solutions with similar accuracy to the neural network and allow a more robust quantification of the uncertainty of the solution.
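Of the three benchmarked algorithms, acceptance/rejection sampling is the simplest to sketch: draw a candidate class from the prior and accept it with probability proportional to its data likelihood, so the accepted draws approximate the posterior class probabilities. A toy single-sample version (the paper classifies whole facies profiles from seismic traces; the class names and likelihood below are hypothetical):

```python
import random

def rejection_sample_class(prior_probs, likelihood, observed,
                           n_draws=5000, seed=0):
    """Monte Carlo acceptance/rejection for classification: candidates
    drawn from prior_probs (dict class -> probability) are accepted with
    probability likelihood(c, observed) / max_likelihood; the accepted
    frequencies approximate the posterior over classes."""
    rng = random.Random(seed)
    classes = list(prior_probs)
    max_lik = max(likelihood(c, observed) for c in classes)
    counts = {c: 0 for c in classes}
    accepted = 0
    while accepted < n_draws:
        c = rng.choices(classes, weights=[prior_probs[k] for k in classes])[0]
        if rng.random() < likelihood(c, observed) / max_lik:
            counts[c] += 1
            accepted += 1
    return {c: counts[c] / n_draws for c in classes}
```

Because the accepted set is an explicit sample of the posterior, the spread across draws directly quantifies the classification uncertainty, which is the robustness advantage the abstract attributes to Monte Carlo methods.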
45

Pochet, Maxime, Hervé Jeanmart y Francesco Contino. "Uncertainty quantification from raw measurements to post-processed data: A general methodology and its application to a homogeneous-charge compression–ignition engine". International Journal of Engine Research 21, n.º 9 (24 de diciembre de 2019): 1709–37. http://dx.doi.org/10.1177/1468087419892697.

Full text
Abstract
Internal combustion engines have been improved for many decades. Yet, complex phenomena are now resorted to, for which any optimum might be unstable: noise, low-temperature heat release timing, stratification, pollutant sweet spots, and so on. In order to make reliable statements on an improvement, one must specify the uncertainty related to it. Still, uncertainty quantification is generally missing in the piston engine experimental literature. Therefore, we detailed a mathematical methodology to obtain any engine parameter uncertainty and then used it to derive the uncertainty expressions of the physical quantities of the most generic homogeneous-charge compression–ignition research engine (mass-flow-induced mixture with [Formula: see text] fuel). We then applied those expressions on an existing hydrogen homogeneous-charge compression–ignition test bench. This includes the uncertainty propagation chain from sensor specifications, user calibrations, intake control, in-cylinder processes, and post-processing techniques. Directly measured physical quantities have uncertainties of around 1%, depending on the sensor quality (e.g. pressure, volume), but indirectly measured quantities relying on modelled parameters have uncertainties higher than 5% (e.g. wall heat losses, in-cylinder temperature, gross heat release, pressure rise rate). Other findings that such an analysis can bring relate, for example, to the physical quantities driving the uncertainty and to the ones that can be neglected. In the case of the homogeneous-charge compression–ignition engine considered, the effects of blow-by, bottle purity and air moisture content were found negligible; the post-processing for effective compression ratio, effective in-cylinder temperature, and top dead centre offset were found essential; and the pressure and volume uncertainties were found to be the main drivers to a large extent. 
The obtained numeric values serve the general purpose of alerting the experimenter to uncertainty orders of magnitude. The developed methodology shall be used and adapted by the experimenter willing to study the uncertainty propagation in their setup or willing to assess the adequacy of a sensor's performance.
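The propagation rule behind many of the percentages quoted above is the standard quadrature sum: for a quantity formed as a pure product or quotient of independent inputs, relative standard uncertainties combine as a root sum of squares. A sketch, with a hypothetical sensor budget for an in-cylinder temperature derived via the ideal-gas law T = pV/(mR):

```python
import math

def product_rel_uncertainty(rel_uncerts):
    """Relative standard uncertainty of y = x1 * x2 * ... / (xk * ...)
    for independent inputs: u_y / y = sqrt(sum_i (u_i / x_i)^2)."""
    return math.sqrt(sum(u * u for u in rel_uncerts))

# Hypothetical budget: pressure at 1 %, volume at 1 %, mass at 2 %
# (the gas constant taken as exact); the derived temperature then
# carries about 2.4 % relative uncertainty.
u_T = product_rel_uncertainty([0.01, 0.01, 0.02])
```

This matches the pattern reported in the abstract: directly measured quantities sit near 1 %, while derived quantities accumulate the uncertainties of every input along the propagation chain.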
46

Chen, Xiaoyun, Yi Ji, Kai Li, Xiaofu Wang, Cheng Peng, Xiaoli Xu, Xinwu Pei, Junfeng Xu y Liang Li. "Development of a Duck Genomic Reference Material by Digital PCR Platforms for the Detection of Meat Adulteration". Foods 10, n.º 8 (15 de agosto de 2021): 1890. http://dx.doi.org/10.3390/foods10081890.

Full text
Abstract
Low-cost meat, such as duck, is frequently used to adulterate more expensive foods like lamb or beef in many countries. However, the lack of DNA-based reference materials has limited the quality control and detection of adulterants. Here, we report the development and validation of duck genomic DNA certified reference materials (CRMs) through the detection of the duck interleukin 2 (IL2) gene by digital PCR (dPCR) for the identification of duck meat in food products. The certified value of IL2 in the CRMs was (5.78 ± 0.51) × 10³ copies/μL with expanded uncertainty (coverage factor k = 2), based on IL2 quantification by eight independent collaborating laboratories. Quantification of the mitochondrial gene cytb revealed a concentration of 2.0 × 10⁶ copies/μL, as an information value. The CRMs were also used to determine the limit of detection (LOD) for six commercial testing kits, which confirmed that these kits meet or exceed their claimed sensitivity and are reliable for duck detection.
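The characterization scheme behind such a certified value can be sketched in simplified form: the certified value is the mean of the p independent laboratory means, and the expanded uncertainty is U = k·s/√p with coverage factor k = 2 (a real CRM budget also folds in homogeneity and stability contributions, which this sketch omits):

```python
import math
import statistics

def certified_value(lab_means, k=2):
    """Simplified CRM characterization from an interlaboratory study:
    certified value = mean of the p laboratory means; expanded
    uncertainty U = k * s / sqrt(p), between-lab spread only."""
    p = len(lab_means)
    mean = statistics.fmean(lab_means)
    u = statistics.stdev(lab_means) / math.sqrt(p)
    return mean, k * u
```

With k = 2 the expanded uncertainty corresponds to roughly 95 % coverage under normality, which is the convention behind the "coverage factor k = 2" stated in the abstract.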
47

Shi, Guolin, Bing Xu, Xin Wang, Zhong Xue, Xinyuan Shi y Yanjiang Qiao. "Real-Time Release Testing of Herbal Extract Powder by Near-Infrared Spectroscopy considering the Uncertainty around Specification Limits". Journal of Spectroscopy 2019 (3 de marzo de 2019): 1–10. http://dx.doi.org/10.1155/2019/4139762.

Full text
Abstract
The concept of real-time release testing (RTRT) has recently been adopted in the production of pharmaceuticals in order to provide a high-level guarantee of product quality. Process analytical technology (PAT) is an attractive and efficient way of realizing RTRT. In this paper, near-infrared (NIR) determination of cryptotanshinone and tanshinone IIA content in tanshinone extract powders was taken as the research object. The aim of the NIR analysis is to reliably declare the extract product as compliant with its specification limits or not. First, the NIR quantification method was developed and the parameters of the multivariate calibration model were optimized. The reliable concentration ranges covering the specification limits of the two APIs were successfully verified by the accuracy profile (AP) methodology. Then, with the designed validation data from the AP, the unreliability graph was built as the decision tool. Innovatively, the β-content, γ-confidence tolerance intervals (β-CTIs) around the specification limits were estimated. During routine use, the boundaries of the β-CTIs help decide whether the NIR prediction results are acceptable. The proposed method quantifies the analysis risk near the specification limits and confirms that the unreliable region is useful for releasing product quality in real time. Such a release strategy could be extended to other PAT applications to improve the reliability of results.
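A β-content, γ-confidence tolerance interval for normally distributed results takes the form mean ± k·s, where the factor k is commonly computed with Howe's approximation. A generic normal-theory sketch (this is not the accuracy-profile machinery of the paper, which works from designed validation data):

```python
import math
from scipy import stats

def tolerance_factor(n, content=0.95, confidence=0.95):
    """Howe's approximation to the two-sided normal tolerance factor k:
    the interval mean +/- k*s contains at least `content` of the
    population with probability `confidence`, given n observations."""
    nu = n - 1
    z = stats.norm.ppf((1 + content) / 2)
    chi2 = stats.chi2.ppf(1 - confidence, nu)
    return z * math.sqrt(nu * (1 + 1 / n) / chi2)
```

For small n the factor is much larger than the naive 1.96 (about 3.4 at n = 10 for 95 %/95 %), which is exactly why tolerance intervals, rather than confidence intervals on the mean, are the appropriate tool for release decisions near a specification limit.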
48

Lutz, Julia, Lars Grinde y Anita Verpe Dyrrdal. "Estimating Rainfall Design Values for the City of Oslo, Norway—Comparison of Methods and Quantification of Uncertainty". Water 12, n.º 6 (17 de junio de 2020): 1735. http://dx.doi.org/10.3390/w12061735.

Full text
Abstract
Due to its location, its old sewage system, and the channelling of rivers, Oslo is highly exposed to urban flooding. Thus, it is crucial to provide relevant and reliable information on extreme precipitation in the planning and design of infrastructure. Intensity-Duration-Frequency (IDF) curves are a frequently used tool for that purpose. However, the computational method for IDF curves in Norway was established over 45 years ago, and has not been further developed since. In our study, we show that the current method of fitting a Gumbel distribution to the highest precipitation events is not able to reflect the return values for the long return periods. Instead, we introduce the fitting of a Generalised Extreme Value (GEV) distribution for annual maximum precipitation in two different ways, using (a) a modified Maximum Likelihood estimation and (b) Bayesian inference. The comparison of the two methods for 14 stations in and around Oslo reveals that the estimated median return values are very similar, but the Bayesian method provides upper credible interval boundaries that are considerably higher. Two different goodness-of-fit tests favour the Bayesian method; thus, we suggest using the Bayesian inference for estimating IDF curves for the Oslo area.
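The estimation step can be sketched with a plain maximum-likelihood GEV fit: the T-year return level is the (1 − 1/T) quantile of the distribution fitted to the annual maxima (the paper's modified ML estimation and Bayesian inference, including the credible intervals it compares, are beyond this sketch):

```python
import numpy as np
from scipy.stats import genextreme

def idf_return_level(annual_maxima, return_period_years):
    """Fit a GEV distribution to annual maximum precipitation by
    maximum likelihood and return the T-year return level, i.e. the
    value exceeded on average once every T years."""
    shape, loc, scale = genextreme.fit(annual_maxima)
    return genextreme.ppf(1 - 1 / return_period_years,
                          shape, loc=loc, scale=scale)
```

Note that SciPy's `genextreme` uses the opposite sign convention for the shape parameter from much of the hydrology literature, a common source of confusion when comparing fitted parameters across tools.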
49

Petrescu, Ana Maria Roxana, Glen P. Peters, Greet Janssens-Maenhout, Philippe Ciais, Francesco N. Tubiello, Giacomo Grassi, Gert-Jan Nabuurs et al. "European anthropogenic AFOLU greenhouse gas emissions: a review and benchmark data". Earth System Science Data 12, n.º 2 (1 de mayo de 2020): 961–1001. http://dx.doi.org/10.5194/essd-12-961-2020.

Full text
Abstract
Abstract. Emissions of greenhouse gases (GHGs) and removals from land, including both anthropogenic and natural fluxes, require reliable quantification, including estimates of uncertainties, to support credible mitigation action under the Paris Agreement. This study provides a state-of-the-art scientific overview of bottom-up anthropogenic emissions data from agriculture, forestry and other land use (AFOLU) in the European Union (EU28). The data integrate recent AFOLU emission inventories with ecosystem data and land carbon models and summarize GHG emissions and removals over the period 1990–2016. This compilation of bottom-up estimates of the AFOLU GHG emissions of European national greenhouse gas inventories (NGHGIs) with those of land carbon models and observation-based estimates of large-scale GHG fluxes aims at improving the overall estimates of the GHG balance in Europe with respect to land GHG emissions and removals. Whenever available, we present uncertainties, their propagation, and their role in the comparison of different estimates. While NGHGI data for the EU28 provide a consistent quantification of uncertainty following the established IPCC Guidelines, uncertainty in the estimates produced with other methods needs to account for both within-model uncertainty and the spread of different model results. The largest inconsistencies between EU28 estimates are mainly due to different sources of data related to human activity, referred to here as activity data (AD), and the methodologies (tiers) used for calculating emissions and removals from AFOLU sectors. The referenced datasets related to figures are visualized at https://doi.org/10.5281/zenodo.3662371 (Petrescu et al., 2020).
50

de Louw, P. G. B., Y. van der Velde y S. E. A. T. M. van der Zee. "Quantifying water and salt fluxes in a lowland polder catchment dominated by boil seepage: a probabilistic end-member mixing approach". Hydrology and Earth System Sciences 15, n.º 7 (7 de julio de 2011): 2101–17. http://dx.doi.org/10.5194/hess-15-2101-2011.

Full text
Abstract
Abstract. Upward saline groundwater seepage is leading to surface water salinization of deep lying polders in the Netherlands. Identifying measures to reduce the salt content requires a thorough understanding and quantification of the dominant sources of water and salt on a daily basis. However, as in most balance studies, there are large uncertainties in the contribution from groundwater seepage. Taking these into account, we applied a probabilistic (GLUE) end-member mixing approach to simulate two years of daily to weekly observations of discharge, salt loads and salt concentration of water pumped out of an artificially drained polder catchment area. We were then able to assess the contribution from the different sources to the water and salt balance of the polder and uncertainties in their quantification. Our modelling approach demonstrates the need to distinguish preferential from diffuse seepage. Preferential seepage via boils contributes, on average, 66 % to the total salt load and only about 15 % to the total water flux into the polder and therefore forms the main salinization pathway. With the model we were able to calculate the effect of future changes on surface water salinity and to assess the uncertainty in our predictions. Furthermore, we analyzed the parameter sensitivity and uncertainty to determine for which parameter the quality of field measurements should be improved to reduce model input and output uncertainty. High frequency measurements of polder water discharge and weighted concentration at the outlet of the catchment area appear to be essential for obtaining reliable simulations of water and salt fluxes and for allotting these to the different sources.
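The core of an end-member mixing model is a linear balance: each measured concentration is a flux-weighted combination of fixed source signatures. For two end members this inverts in closed form (the concentrations below are hypothetical; the paper solves a multi-source daily water and salt balance within a probabilistic GLUE framework):

```python
def end_member_fractions(c_mix, c_members):
    """Two end-member mixing: solve f1*c1 + f2*c2 = c_mix with
    f1 + f2 = 1 for the source fractions, e.g. saline boil seepage
    versus fresher diffuse seepage in a polder outflow."""
    c1, c2 = c_members
    f1 = (c_mix - c2) / (c1 - c2)
    return f1, 1.0 - f1

# Hypothetical salinities (mg Cl/L): boil seepage 10000, diffuse
# seepage 200; a mixed outflow at 1670 implies a 15 % boil fraction.
f_boil, f_diffuse = end_member_fractions(1670.0, (10000.0, 200.0))
```

The same structure explains the paper's headline result: a source with an extreme signature (saline boils) can dominate the salt load while contributing only a small share of the water flux.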