Academic literature on the topic 'Latin hypercube sampling technique'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Latin hypercube sampling technique.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Latin hypercube sampling technique"

1

Todorov, Venelin. "Latin Hypercube and Importance Sampling Algorithms for Multidimensional Integrals." Journal Scientific and Applied Research 10, no. 1 (2016): 17–23. http://dx.doi.org/10.46687/jsar.v10i1.201.

Abstract:
The Monte Carlo method is the only viable method for high-dimensional problems, since its convergence is independent of the dimension. In this paper we implement and analyze the computational complexity of the Latin hypercube sampling algorithm. We compare the results with the importance sampling algorithm, which is the most widely used variance-reduction Monte Carlo method, and show that Latin hypercube sampling has some advantages over the importance sampling technique.
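The stratified construction at the heart of LHS is compact enough to sketch. The following minimal NumPy example, illustrative and not taken from the paper, estimates a four-dimensional integral with plain Monte Carlo and with an LHS design built by jittering a random permutation within each of n strata per axis:

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """n points in [0, 1]^d: each margin gets exactly one point per stratum of width 1/n."""
    # argsort of random keys gives an independent random permutation per column;
    # the added uniform jitter places each point inside its stratum.
    return (np.argsort(rng.random((n, d)), axis=0) + rng.random((n, d))) / n

rng = np.random.default_rng(0)
f = lambda x: np.exp(x.sum(axis=1))         # test integrand on [0, 1]^4
n, d = 1000, 4
est_mc = f(rng.random((n, d))).mean()       # plain Monte Carlo estimate
est_lhs = f(latin_hypercube(n, d, rng)).mean()
print(est_mc, est_lhs, (np.e - 1.0) ** d)   # exact value is (e - 1)^4
```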
2

Lei, Gang, and Linqing Liao. "A Stochastic Finite Element Analysis Based on Uniform Sampling Method." Journal of Advanced Manufacturing Systems 07, no. 01 (2008): 69–73. http://dx.doi.org/10.1142/s0219686708001097.

Abstract:
The stochastic finite element method is the main analysis method for complicated stochastic structures, and it commonly relies on Monte Carlo simulation, whose sampling is typically done by the Latin Hypercube sampling method. For stochastic finite element analysis, however, if the number of sampling levels is large, the required number of samples becomes impractical for most engineering purposes. In this paper, a uniform sampling method is applied instead. Example calculations with different factors and levels are compared with calculations using the Latin Hypercube sampling method. The examples show that even when the number of samples drawn by the uniform sampling technique is markedly smaller than the number drawn by Latin Hypercube sampling, the mean values and standard deviations of nodal displacements and stresses obtained by the two methods are almost identical. Uniform sampling for stochastic finite element analysis thus offers good computational efficiency and has promising applications.
3

Erten, Oktay, Fábio P. L. Pereira, and Clayton V. Deutsch. "Projection Pursuit Multivariate Sampling of Parameter Uncertainty." Applied Sciences 12, no. 19 (2022): 9668. http://dx.doi.org/10.3390/app12199668.

Abstract:
The efficiency of sampling is a critical concern in Monte Carlo analysis, which is frequently used to assess the effect of the uncertainty of the input variables on the uncertainty of the model outputs. The projection pursuit multivariate transform is proposed as an easily applicable tool for improving the efficiency and quality of a sampling design in Monte Carlo analysis. The superiority of the projection pursuit multivariate transform, as a sampling technique, is demonstrated in two synthetic case studies, where the random variables are considered to be uncorrelated and correlated in low (bivariate) and high (five-variate) dimensional sampling spaces. Five sampling techniques including Monte Carlo simulation, classic Latin hypercube sampling, maximin Latin hypercube sampling, Latin hypercube sampling with multidimensional uniformity, and projection pursuit multivariate transform are employed in the simulation studies, considering cases where the sample sizes (n) are small (i.e., 10 ≤ n ≤ 100), medium (i.e., 100 < n ≤ 1000), and large (i.e., 1000 < n ≤ 10,000). The results of the case studies show that the projection pursuit multivariate transform appears to yield the fewest sampling errors and the best sampling space coverage (or multidimensional uniformity), and that a significant amount of computer effort could be saved by using this technique.
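Of the designs compared above, maximin LHS is the easiest to sketch: generate many random LHS designs and keep the one whose minimum pairwise point distance is largest. The best-of-many search below is a crude illustration, not one of the specific algorithms benchmarked in the paper:

```python
import numpy as np
from scipy.spatial.distance import pdist

def lhs(n, d, rng):
    """Plain random LHS design on [0, 1]^d."""
    return (np.argsort(rng.random((n, d)), axis=0) + rng.random((n, d))) / n

rng = np.random.default_rng(1)
# Crude maximin LHS: best of 200 random designs by minimum pairwise distance.
designs = [lhs(10, 2, rng) for _ in range(200)]
best = max(designs, key=lambda x: pdist(x).min())
print(pdist(best).min())  # larger than a typical random LHS design achieves
```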
4

Phromphan, Pakin, Jirachot Suvisuthikasame, Metas Kaewmongkol, Woravech Chanpichitwanich, and Suwin Sleesongsom. "A New Latin Hypercube Sampling with Maximum Diversity Factor for Reliability-Based Design Optimization of HLM." Symmetry 16, no. 7 (2024): 901. http://dx.doi.org/10.3390/sym16070901.

Abstract:
This research paper presents a new Latin hypercube sampling method aimed at enhancing performance in quantifying uncertainty and reducing computation time. The new Latin hypercube sampling (LHS) method serves as a tool in reliability-based design optimization (RBDO). The quantification technique is termed LHSMDF (LHS with maximum diversity factor). The quantification techniques Latin hypercube sampling (LHS), optimum Latin hypercube sampling (OLHS), and Latin hypercube sampling with maximum diversity factor (LHSMDF) are tested on mechanical components, including a circular shaft housing, a connecting rod, and a cantilever beam, to evaluate their comparative performance. Subsequently, the new method is employed as the basis of RBDO in the synthesis of a six-bar high-lift mechanism (HLM) example to enhance the reliability of the resulting mechanism compared to Monte Carlo simulation (MCS). The design problem of this mechanism is classified as a motion generation problem, incorporating the angle and position of the flap in the objective function. The six-bar linkage is first adapted to be a high-lift mechanism (HLM), which is a symmetrical device of the aircraft. A deterministic design, without consideration of uncertainty, may lead to unacceptable performance during the manufacturing step due to link-length tolerances. The techniques are combined with an efficient metaheuristic known as teaching–learning-based optimization with a diversity archive (ATLBO-DA) to identify a reliable HLM. Performance testing of the new LHSMDF reveals that it outperforms the original LHS and OLHS. The HLM problem test results demonstrate that achieving an optimum HLM with high reliability necessitates precision without sacrificing accuracy in the manufacturing process. Moreover, it is suggested that the six-bar HLM could emerge as a viable option for developing new high-lift devices in aircraft mechanisms in the future.
5

Karatzetzou, Anna. "Uncertainty and Latin Hypercube Sampling in Geotechnical Earthquake Engineering." Geotechnics 4, no. 4 (2024): 1007–25. http://dx.doi.org/10.3390/geotechnics4040051.

Abstract:
A soil–foundation–structure system (SFSS) often exhibits different responses compared to a fixed-base structure when subjected to earthquake ground motion. Both kinematic and inertial soil–foundation–structure interactions can significantly influence the structural performance of buildings. Numerous parameters within an SFSS affect its overall response, introducing inherent uncertainty into the solution. Performing time history analyses, even for a linear elastic coupled SFSS, requires considerable computational effort. To reduce the computational cost without compromising accuracy, the use of the Latin Hypercube Sampling (LHS) technique is proposed herein. Sampling techniques are rarely employed in soil–foundation–structure interaction analyses, yet they are highly beneficial. These methodologies allow analyses determined by sampling to be conducted using commercial codes designed for deterministic analyses, without requiring any modifications. The advantage is that the number of analyses determined by the sampling size is significantly reduced as compared to considering all combinations of input parameters. After identifying the important samples, one can evaluate the seismic demand of selected soil–foundation–bridge pier systems using finite element numerical software. This paper indicates that LHS reduces computational effort by 60%, whereas structural response components (translation, rocking) show distinct trends for different systems.
6

Zhao, Wei, YangYang Chen, and Jike Liu. "Reliability sensitivity analysis using axis orthogonal importance Latin hypercube sampling method." Advances in Mechanical Engineering 11, no. 3 (2019): 168781401982641. http://dx.doi.org/10.1177/1687814019826414.

Abstract:
In this article, the combined use of Latin hypercube sampling and axis orthogonal importance sampling is explored as an efficient and applicable tool for reliability analysis with a limited number of samples. It is applied to estimating the sensitivity of the failure probability with respect to the distribution parameters of the basic random variables, a problem solved equivalently by reliability sensitivity analysis of a series of hyperplanes through each sampling point, parallel to the tangent hyperplane of the limit state surface around the design point. The analytical expressions of these hyperplanes are given, and formulas for the reliability sensitivity estimators and their variances are derived from first-order reliability theory and the difference method, for the cases with and without non-normal random variables, respectively. A procedure is established for reliability sensitivity analysis in two versions: (1) axis orthogonal Latin hypercube importance sampling and (2) axis orthogonal quasi-random importance sampling with the Halton sequence. Four numerical examples are presented. The results demonstrate that the proposed procedure is more efficient than procedures based on Latin hypercube sampling and the direct Monte Carlo technique, with acceptable accuracy in the sensitivity estimates.
7

Kaewsuwan, Kantapit, Chumpol Yuangyai, Udom Janjarassuk, and Kanokporn Rienkhemaniyom. "A Comparison of Latin Hypercube Sampling Techniques for a Supply Chain Network Design Problem." MATEC Web of Conferences 192 (2018): 01023. http://dx.doi.org/10.1051/matecconf/201819201023.

Abstract:
Supply chain network design is becoming more complex. In designing a supply chain network to withstand changing events, it is necessary to consider the uncertainties and risks that cause network disruptions from unexpected events. Current research on the design problem considers network disruptions using Monte Carlo Sampling (MCS) or Latin Hypercube Sampling (LHS) techniques. Both have the disadvantage that sample points, or disruption locations, are not scattered across the entire sample space, leading to high variation in objective function values. The purpose of this study is to apply a modified LHS, the Improved Distributed Hypercube Sampling (IHS) technique, to reduce this variation. The results show that the IHS technique yields a smaller standard deviation than the LHS technique. In addition, IHS reduces not only the sample size but also the computational time.
8

Nikas, Ioannis A., Vasileios P. Georgopoulos, and Vasileios C. Loukopoulos. "Selective Multistart Optimization Based on Adaptive Latin Hypercube Sampling and Interval Enclosures." Mathematics 13, no. 11 (2025): 1733. https://doi.org/10.3390/math13111733.

Abstract:
Solving global optimization problems is a significant challenge, particularly in high-dimensional spaces. This paper proposes a selective multistart optimization framework that employs a modified Latin Hypercube Sampling (LHS) technique to maintain a constant search space coverage rate, alongside Interval Arithmetic (IA) to prioritize sampling points. The proposed methodology addresses key limitations of conventional multistart methods, such as the exponential decline in space coverage with increasing dimensionality. It prioritizes sampling points by leveraging the hypercubes generated through LHS and their corresponding interval enclosures, guiding the optimization process toward regions more likely to contain the global minimum. Unlike conventional multistart methods, which assume uniform sampling without quantifying spatial coverage, the proposed approach constructs interval enclosures around each sample point, enabling explicit estimation and control of the explored search space. Numerical experiments on well-known benchmark functions demonstrate improvements in space coverage efficiency and enhanced local/global minimum identification. The proposed framework offers a promising approach for large-scale optimization problems frequently encountered in machine learning, artificial intelligence, and data-intensive domains.
9

Packham, N. "Combining Latin Hypercube Sampling With Other Variance Reduction Techniques." Wilmott 2015, no. 76 (2015): 60–69. http://dx.doi.org/10.1002/wilm.10410.

10

Bin, Li, Rashana Abbas, Muhammad Shahzad, and Nouman Safdar. "Probabilistic Load Flow Analysis Using Nonparametric Distribution." Sustainability 16, no. 1 (2023): 240. http://dx.doi.org/10.3390/su16010240.

Abstract:
In the pursuit of sustainable energy solutions, this research addresses the critical need for accurate probabilistic load flow (PLF) analysis in power systems. PLF analysis is an essential tool for estimating the statistical behavior of power systems under uncertainty, and it plays a vital part in power system planning, operation, and dependability studies. For accurate PLF analysis, this article proposes kernel density estimation (KDE) with adaptive bandwidth for estimating the probability density functions (PDFs) of power injections from sustainable energy sources such as solar and wind, reducing PDF estimation errors. To reduce the computational burden, a Latin hypercube sampling approach is incorporated: input random variables are modeled using KDE in conjunction with LHS for the PLF analysis. The proposed techniques are tested on the IEEE 14- and IEEE 118-bus systems. Two benchmark techniques, Monte Carlo simulation (MCS) and Hamiltonian Monte Carlo (HMC), are used to validate the results. The results illustrate that the adaptive-bandwidth kernel density estimation with Latin hypercube sampling (AKDE-LHS) method provides better performance in terms of precision and computational efficiency. They also show that the suggested technique is more effective at reducing errors, uncertainties, and computational time while depicting arbitrary distributions of photovoltaic and wind farms for probabilistic load flow analysis, and that it can be a potential solution to the challenges posed by sustainable energy sources in power systems.
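The KDE-plus-LHS pairing described above can be sketched in a few lines: fit a density to observed injections, tabulate its CDF numerically, and push one stratified probability per stratum through the inverse. This is an illustrative sketch only; the Weibull 'wind' data are synthetic stand-ins, and SciPy's fixed-bandwidth gaussian_kde is used in place of the paper's adaptive-bandwidth estimator.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
wind = rng.weibull(2.0, 1000)                  # synthetic stand-in for wind injections
kde = gaussian_kde(wind)                       # fixed bandwidth (Scott's rule)

# Tabulate the KDE's CDF on a grid and invert it by interpolation.
grid = np.linspace(wind.min() - 1.0, wind.max() + 1.0, 2048)
cdf = np.cumsum(kde(grid))
cdf /= cdf[-1]

n = 100
u = (rng.permutation(n) + rng.random(n)) / n   # one stratified uniform per stratum
samples = np.interp(u, cdf, grid)              # LHS draws from the fitted density
```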

Dissertations / Theses on the topic "Latin hypercube sampling technique"

1

Ottonello, Andrea. "Application of Uncertainty Quantification techniques to CFD simulation of twin entry radial turbines." Doctoral thesis, Università degli studi di Genova, 2021. http://hdl.handle.net/11567/1046507.

Abstract:
The main topic of the thesis is the application of uncertainty quantification (UQ) techniques to the numerical simulation (CFD) of twin entry radial turbines used in automotive turbocharging. The detailed study of this type of turbomachinery is addressed in chapter 3, aimed at understanding the main parameters which characterize and influence the fluid dynamic performance of twin scroll turbines. Chapter 4 deals with the development of an in-house platform for UQ analysis based on the 'Dakota' open-source toolset. The platform was first tested on a case of industrial interest, a supersonic de Laval nozzle (chapter 5); the analysis highlighted the practical use of uncertainty quantification techniques in predicting the performance of a nozzle affected by off-design conditions, with fluid dynamic complexity due to strong non-linearity. The experience gained with the UQ approach facilitated the identification of suitable methods for applying uncertainty propagation to the CFD simulation of twin entry radial turbines (chapter 6). Here, different uncertainty quantification techniques were investigated and put into practice in order to acquire in-depth experience of the current state of the art. The comparison of the results from the different approaches and the discussion of the pros and cons of each technique led to interesting conclusions, which are proposed as guidelines for future uncertainty quantification applications to the CFD simulation of radial turbines. The integration of UQ models and methodologies, today used only by some academic research centers, with well-established commercial CFD solvers made it possible to achieve the final goal of the doctoral thesis: to demonstrate to industry the high potential of UQ techniques for improving, through probability distributions, the prediction of the performance of a component subject to different sources of uncertainty. The purpose of the research activity is therefore to provide designers with performance data associated with margins of uncertainty that allow simulation and real application to be better correlated. Due to confidentiality agreements, the geometric parameters of the studied twin entry radial turbine are provided dimensionless, confidential data on graph axes are omitted, and contour legends as well as any dimensional references have been removed from the figures.
2

Green, Robert C. II. "Novel Computational Methods for the Reliability Evaluation of Composite Power Systems using Computational Intelligence and High Performance Computing Techniques." University of Toledo / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1338894641.

3

Batiston, Evandro Lino. "Simulação Sequencial Gaussiana usando Latin Hypercube Sampling: estudo de caso minério de ferro Carajás." Biblioteca Digital de Teses e Dissertações da UFRGS, 2010. http://hdl.handle.net/10183/28838.

Abstract:
Assessing geological uncertainty is of paramount importance in mining industry risk analysis. Sequential Gaussian simulation (SGS) is widely used for building such models, especially when mapping grade uncertainty. SGS is commonly used for mapping the uncertainty space of a random variable (RV) Z(u), and the number of realizations L needed to adequately characterize this space can be large. Two algorithms were herein implemented in combination with SGS for random drawing from the local conditional cumulative distribution function (ccdf): the classical algorithm, based on simple Monte Carlo drawing and known as Simple Random Sampling (SRS), and the alternative method, Latin Hypercube Sampling (LHS). The present dissertation compares the efficiency of these two algorithms in characterizing the uncertainty space of some transfer functions employed in the mineral industry. Through a case study, the number of realizations necessary to adequately characterize the variability of these response functions was checked as a mechanism for comparison. The dataset comes from an iron ore mine in the Carajás Mineral Province. It was observed that the LHS method is more efficient in characterizing the uncertainty space of the RV Z(u), by stratifying the ccdf according to each realization; this characteristic of LHS requires fewer realizations to properly build the uncertainty model. These benefits facilitate the implementation of SGS in planning routines, since the uncertainty models become smaller and easier to manipulate.
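The contrast between the two drawing schemes is easy to state in code. The sketch below assumes, for illustration only, a standard normal in place of the local conditional distribution: SRS draws L independent probabilities, while LHS places exactly one probability in each of the L strata before inverting the distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
L = 20  # number of realizations

# Simple Random Sampling: L independent uniform probabilities.
p_srs = rng.random(L)

# Latin Hypercube Sampling: one probability per stratum [k/L, (k+1)/L),
# visited in random order so the realizations stay exchangeable.
p_lhs = (rng.permutation(L) + rng.random(L)) / L

z_srs = stats.norm.ppf(p_srs)  # SRS can leave whole probability classes empty
z_lhs = stats.norm.ppf(p_lhs)  # LHS covers all L classes by construction
```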
4

Arnesson, Jennifer. "Numerical Modeling of Radioactive Release Implemented for Loss of Coolant Accident, Evaluated using Latin Hypercube Sampling." Thesis, KTH, Reaktorfysik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-169622.

5

Lim, Nay Kim. "Optimization of TIG weld geometry using a Kriging surrogate model and Latin Hypercube sampling for data generation." Thesis, California State University, Long Beach, 2014. http://pqdtopen.proquest.com/#viewpdf?dispub=1527976.

Abstract:
The purpose of this study was to use a systematic approach to model the welding parameters for tungsten inert gas (TIG) welding of two thin pieces of titanium sheet metal. The experiment was carried out using mechanized welding equipment to perform each trial consistently. The Latin Hypercube sampling method for data generation was applied to four input variables, and the Kriging interpolation technique was used to develop surrogate models that map the four input variables to the three output parameters of interest. A total of fifty data points were generated. To utilize the minimal number of weld samples, Kriging models were created using five sets of data at a time, and each surrogate model was tested against an actual data output. Once the models had been generated and verified, an attempt was made to optimize the output variables individually and as a multi-objective problem using genetic algorithm (GA) optimization.
6

Packham, Natalie. "Credit dynamics in a first-passage time model with jumps and Latin hypercube sampling with dependence." Frankfurt am Main: Frankfurt School of Finance & Management gGmbH, 2008. http://d-nb.info/1011759780/34.

7

Brungard, Colby W. "Alternative Sampling and Analysis Methods for Digital Soil Mapping in Southwestern Utah." DigitalCommons@USU, 2009. http://digitalcommons.usu.edu/etd/472.

Abstract:
Digital soil mapping (DSM) relies on quantitative relationships between easily measured environmental covariates and field and laboratory data. We applied innovative sampling and inference techniques to predict the distribution of soil attributes, taxonomic classes, and dominant vegetation across a 30,000-ha complex Great Basin landscape in southwestern Utah. This arid rangeland was characterized by rugged topography, diverse vegetation, and intricate geology. Environmental covariates calculated from digital elevation models (DEM) and spectral satellite data were used to represent factors controlling soil development and distribution. We investigated optimal sample size and sampled the environmental covariates using conditioned Latin Hypercube Sampling (cLHS). We demonstrated that cLHS, a type of stratified random sampling, closely approximated the full range of variability of environmental covariates in feature and geographic space with small sample sizes. Site and soil data were collected at 300 locations identified by cLHS. Random forests was used to generate spatial predictions and associated probabilities of site and soil characteristics. Balanced random forests and balanced and weighted random forests were investigated for their use in producing an overall soil map. Overall and class errors (referred to as out-of-bag [OOB] error) were within acceptable levels. Quantitative covariate importance was useful in determining what factors were important for soil distribution. Random forest spatial predictions were evaluated based on the conceptual framework developed during field sampling.
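Conditioned LHS differs from ordinary LHS in that it selects from existing candidate locations (pixels with known covariate values) rather than from free points. The toy version below uses random search where real implementations use simulated annealing; the covariate matrix is a synthetic stand-in for DEM-derived and spectral layers.

```python
import numpy as np

def clhs_score(subset, data, n):
    """Sum over covariates of the deviation from the ideal
    one-sample-per-quantile-stratum occupancy."""
    score = 0.0
    for j in range(data.shape[1]):
        edges = np.quantile(data[:, j], np.linspace(0.0, 1.0, n + 1))
        counts, _ = np.histogram(subset[:, j], bins=edges)
        score += np.abs(counts - 1).sum()
    return score

def clhs(data, n, iters, rng):
    """Toy conditioned LHS by random search: pick n rows of `data` whose
    marginals best fill the n quantile strata of every covariate."""
    best_idx, best = None, np.inf
    for _ in range(iters):
        idx = rng.choice(len(data), size=n, replace=False)
        s = clhs_score(data[idx], data, n)
        if s < best:
            best_idx, best = idx, s
    return best_idx

rng = np.random.default_rng(6)
covariates = rng.random((2000, 3))      # stand-in for covariate layers
sites = clhs(covariates, 20, 500, rng)  # indices of 20 field sampling locations
```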
8

Huh, Jungwon, Quang Tran, Achintya Haldar, Innjoon Park, and Jin-Hee Ahn. "Seismic Vulnerability Assessment of a Shallow Two-Story Underground RC Box Structure." MDPI AG, 2017. http://hdl.handle.net/10150/625742.

Abstract:
Tunnels, culverts, and subway stations are the main parts of an integrated infrastructure system. Most of them are constructed by the cut-and-cover method at shallow depths (mainly less than 30 m) of soil deposits, where large-scale seismic ground deformation can occur owing to the lower stiffness and strength of the soil. Therefore, the transverse racking deformation (one of the major forms of seismic ground deformation) due to soil shear deformations should be included in the seismic design of underground structures using cost- and time-efficient methods that can achieve robustness of design and are easily understood by engineers. This paper aims to develop a simplified but comprehensive approach to vulnerability assessment, in the form of fragility curves, for a shallow two-story reinforced concrete underground box structure constructed in a highly weathered soil. In addition, a comparison of the results of earthquakes per peak ground acceleration (PGA) is conducted to determine the effective and appropriate number for cost- and time-benefit analysis. The ground response acceleration method for buried structures (GRAMBS) is used to analyze the behavior of the structure subjected to transverse seismic loading under quasi-static conditions. Furthermore, the damage states that indicate the exceedance level of the structural strength capacity are described by the results of nonlinear static analyses (so-called pushover analyses). The Latin hypercube sampling technique is employed to consider the uncertainties associated with the material properties and concrete cover owing to the variation in construction conditions. Finally, a large number of artificial ground shakings satisfying the design spectrum are generated in order to develop the seismic fragility curves based on the defined damage states. It is worth noting that a number of ground motions per PGA equal to or larger than 20 is a reasonable value for performing a structural analysis that produces satisfactory fragility curves.
9

Taheri, Mehdi. "Machine Learning from Computer Simulations with Applications in Rail Vehicle Dynamics and System Identification." Diss., Virginia Tech, 2016. http://hdl.handle.net/10919/81417.

Abstract:
The application of stochastic modeling for learning the behavior of multibody dynamics models is investigated. The stochastic modeling technique is also known as Kriging or the random function approach. Post-processing data from a simulation run are used to train the stochastic model that estimates the relationship between model inputs, such as the suspension relative displacement and velocity, and the output, for example, the sum of suspension forces. The computational efficiency of multibody dynamics (MBD) models can be improved by replacing their computationally intensive subsystems with stochastic predictions. The stochastic modeling technique is able to learn the behavior of a physical system and integrate its behavior into MBD models, resulting in improved real-time simulations and reduced computational effort in models with repeated substructures (for example, modeling a train with a large number of rail vehicles). Since the sampling plan greatly influences the overall accuracy and efficiency of the stochastic predictions, various sampling plans are investigated, and a space-filling Latin hypercube sampling plan based on the traveling salesman problem (TSP) is suggested for efficiently representing the entire parameter space. The simulation results confirm the expected increase in modeling efficiency, although further research is needed to improve the accuracy of the predictions. The prediction accuracy is expected to improve through a sampling strategy that considers the discrete nature of the training data and uses infill criteria that consider the shape of the output function and detect sample spaces with high prediction errors. It is recommended that future efforts consider quantifying the computational efficiency of the proposed learning behavior by overcoming the inefficiencies associated with transferring data between multiple software packages, which proved to be a limiting factor in this study. These limitations can be overcome by using the user subroutine functionality of SIMPACK and adding the stochastic modeling technique to its force library.
10

Grabaskas, David. "Efficient Approaches to the Treatment of Uncertainty in Satisfying Regulatory Limits." The Ohio State University, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=osu1345464067.


Book chapters on the topic "Latin hypercube sampling technique"

1

Dandekar, Ramesh A., Michael Cohen, and Nancy Kirkendall. "Sensitive Micro Data Protection Using Latin Hypercube Sampling Technique." In Inference Control in Statistical Databases. Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/3-540-47804-3_9.

2

Stamp, Mark. "Conditioned Latin hypercube sampling." In Introduction to Machine Learning with Applications in Information Security, 2nd ed. Chapman and Hall/CRC, 2022. http://dx.doi.org/10.1201/9781003264873-19.

3

Brus, Dick J. "Conditioned Latin hypercube sampling." In Spatial Sampling with R. Chapman and Hall/CRC, 2022. http://dx.doi.org/10.1201/9781003258940-19.

4

Kyriakidis, Phaedon C. "Sequential Spatial Simulation using Latin Hypercube Sampling." In Geostatistics Banff 2004. Springer Netherlands, 2005. http://dx.doi.org/10.1007/978-1-4020-3610-1_7.

5

Haddad, Rami El, Rana Fakhereddine, Christian Lécot, and Gopalakrishnan Venkiteswaran. "Extended Latin Hypercube Sampling for Integration and Simulation." In Monte Carlo and Quasi-Monte Carlo Methods 2012. Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-41095-6_13.

6

Liu, Zhizhao, Ming Yang, and Wei Li. "A Sequential Latin Hypercube Sampling Method for Metamodeling." In Theory, Methodology, Tools and Applications for Modeling and Simulation of Complex Systems. Springer Singapore, 2016. http://dx.doi.org/10.1007/978-981-10-2663-8_19.

7

Karanki, D. R., and S. Rahman. "Alternative Sampling Approaches for Integrated Safety Analysis: Latin Hypercube Versus Deterministic Sampling." In Risk, Reliability and Safety Engineering. Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-99-8258-5_17.

8

Krumscheid, Sebastian, and Per Pettersson. "Sequential Estimation Using Hierarchically Stratified Domains with Latin Hypercube Sampling." In Springer Proceedings in Mathematics & Statistics. Springer International Publishing, 2024. http://dx.doi.org/10.1007/978-3-031-59762-6_19.

9

Minasny, B., and A. B. McBratney. "Conditioned Latin Hypercube Sampling for Calibrating Soil Sensor Data to Soil Properties." In Proximal Soil Sensing. Springer Netherlands, 2010. http://dx.doi.org/10.1007/978-90-481-8859-8_9.

10

Pilger, Gustavo G., Joao Felipe C. L. Costa, and Jair C. Koppe. "Improving the Efficiency of the Sequential Simulation Algorithm using Latin Hypercube Sampling." In Geostatistics Banff 2004. Springer Netherlands, 2005. http://dx.doi.org/10.1007/978-1-4020-3610-1_103.


Conference papers on the topic "Latin hypercube sampling technique"

1

Zhu, Zheng, and QiJun Zhao. "Numerical Optimization for Rotor Blade-tip Planform with Low HSI Noise Characteristics in Forward Flight." In Vertical Flight Society 71st Annual Forum & Technology Display. The Vertical Flight Society, 2015. http://dx.doi.org/10.4050/f-0071-2015-10070.

Abstract:
Based on CFD/FW-H_pds methods and a hybrid optimization technique, an optimization design procedure for rotor planforms with low HSI noise characteristics is established. In this solver, based on a moving embedded grid methodology, a CFD simulation method for the aerodynamic characteristics of the rotor is developed by solving the compressible Reynolds-averaged Navier-Stokes (RANS) equations. During the optimization process, high-quality blade grids are generated by an efficient parameterized method. Additionally, the high-speed impulsive (HSI) noise generated by a transonic helicopter rotor is analyzed through a robust prediction method based on the FW-H_pds equations (Ffowcs Williams-Hawkings equations with a penetrable data surface). Aiming at minimizing the noise level in forward flight, optimization analyses based on a rotor blade with a double-swept and tapered tip have been accomplished with the aerodynamic performance as a constraint. A genetic algorithm and a surrogate model based on Latin Hypercube Sampling (LHS) design and Radial Basis Functions (RBF) are combined as a hybrid optimization technique. Compared with rectangular blades, the noise level of the rotor with the optimized blade-tip shape is decreased markedly at the present calculation condition. For the rotor with the optimized blade tip, the HSI noise level can be reduced effectively due to its weaker transonic "delocalization" phenomenon in the blade-tip region.
2

Singh, Mahesh Pal, and Nidul Sinha. "Data Generation using Latin Hypercube Sampling Approach for Power System Dynamic Security Assessment." In 2024 IEEE International Conference on Smart Power Control and Renewable Energy (ICSPCRE). IEEE, 2024. http://dx.doi.org/10.1109/icspcre62303.2024.10675156.

3

Wang, Kun, and Fangfang Han. "Safety Assessment Method for Wind-Solar Systems Based on Improved Latin Hypercube Sampling." In 2024 4th International Conference on Intelligent Power and Systems (ICIPS). IEEE, 2024. https://doi.org/10.1109/icips64173.2024.10900181.

4

Radha, P., and K. Rajagopalan. "Reliability of Stiffened Cylindrical Shells Using Random Polar Sampling Technique." In ASME 2004 23rd International Conference on Offshore Mechanics and Arctic Engineering. ASMEDC, 2004. http://dx.doi.org/10.1115/omae2004-51059.

Abstract:
Rational design of stiffened cylindrical shell structures calls for the calculation of reliability. Random variables occur in modelling loads and strengths. Reliability can be evaluated with Monte Carlo Simulation (MCS), which consists of obtaining cumulative distribution functions for each random variable and simulating the ultimate strength of stiffened shells for combinations of random variable values. However, for MCS to be successful, the sample size must be very large, so methods have been proposed to reduce the sample size without sacrificing accuracy in the reliability estimate. The Point Estimation Method (PEM), Response Surface Technique (RST), Importance Sampling Procedure Using Design points (ISPUD), and Latin Hypercube Sampling (LHS) are some of these methods. In this paper, a method based on a Random Polar Sampling Technique (RPST) is proposed, in which combinations of variates are obtained using a polar sampling of Latin Hypercube sampled values. A typical example has been worked out using this method.
5

Manteufel, Randall D., and Jason B. Pleming. "Assessing Hypercube Sampling Techniques for Risk Assessment." In ASME 2005 International Mechanical Engineering Congress and Exposition. ASMEDC, 2005. http://dx.doi.org/10.1115/imece2005-81656.

Abstract:
Two variations of hypercube sampling techniques are introduced and computationally tested using benchmark problems. The methods are variations of Latin hypercube sampling (LHS) and an incremental-fractional LHS scheme. Both can be described as stratified sampling with one sample per stratum. Because they ensure uniform marginals, they are observed to have computational advantages for linear problems where weighted response statistics are sought. Advantages are less pronounced for nonlinear responses and sorted statistics, which is often the case for risk analysis. The complementary cumulative distribution is identified as being helpful in assessing a method's performance. Both methods are applied to an application problem having multiple responses of interest and 48 uncertain inputs. The hypercube methods are noted to produce estimates of the mean with orders-of-magnitude lower variance than that of simple random sampling.
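The closing claim about linear problems is simple to reproduce in a toy replication study (a sketch unrelated to the paper's 48-input application): for an additive response, the variance of the LHS estimate of the mean collapses relative to simple random sampling.

```python
import numpy as np

def lhs(n, d, rng):
    """Plain random LHS design on [0, 1]^d."""
    return (np.argsort(rng.random((n, d)), axis=0) + rng.random((n, d))) / n

rng = np.random.default_rng(4)
f = lambda x: x.sum(axis=1)      # additive response: the favorable case for LHS
n, d, reps = 50, 8, 500
var_srs = np.var([f(rng.random((n, d))).mean() for _ in range(reps)])
var_lhs = np.var([f(lhs(n, d, rng)).mean() for _ in range(reps)])
print(var_srs, var_lhs)          # LHS variance is orders of magnitude smaller
```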
6

Johnston, Joel, and Aditi Chattopadhyay. "Stochastic Multiscale Modeling and Damage Progression for Composite Materials." In ASME 2013 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/imece2013-66566.

Abstract:
Modeling and characterization of complex composite structures is challenging due to uncertainties inherent in these materials. Uncertainty is present at each length scale in composites and must be quantified in order to accurately model the mechanical response and damage progression of this material. The ability to pass information between length scales permits multiscale models to transport uncertainties from one scale to the next. Limitations in the physics and errors in numerical methods pose additional challenges for composite models. By replacing deterministic inputs with random inputs, stochastic methods can be implemented within these multiscale models, making them more robust. This work focuses on understanding the sensitivity of multiscale models and damage progression variations to stochastic input parameters, as well as quantifying these uncertainties within a modeling framework. A multiscale sectional model is used due to its efficiency and capacity to incorporate stochastic methods with little difficulty. The sectional micromechanics in this model are similar to those of the Generalized Method of Cells, the differences being the discretization techniques of the unit cell and the continuity conditions. A Latin hypercube sampling technique is used due to its reported computational savings over other methods such as a fully random Monte Carlo simulation. Specifically, in the sectional model the Latin hypercube sampling method provides an approximately 35% reduction in computations compared to the fully random Monte Carlo method. Latin hypercube sampling is a stratified technique that discretizes the distribution function and randomizes the input parameters within those discretized fields. Within this multiscale modeling framework, a progressive failure theory is implemented using these stochastic methods and a modified Hashin failure theory. With the combined stochastic method and progressive failure theory, this multiscale model is capable of modeling the uncertainty and material property variations for composite materials.
7

Dong, Guang, Zheng-Dong Ma, Gregory Hulbert, et al. "Variable Screening Using Restricted Maximum Likelihood Kriging Method With Application to Gunner Joint Stiffness Variables." In ASME 2011 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2011. http://dx.doi.org/10.1115/detc2011-47582.

Abstract:
Complex and computationally intensive modeling and simulation of real-world engineering systems can include a large number of design variables in the optimization of such systems. Consequently, it is desirable to conduct variable screening to identify significant or active variables so that a simpler, more efficient, yet accurate optimization process can be achieved. This paper proposes a variable screening method based on a Kriging model with the Restricted Maximum Likelihood criterion. The Kriging metamodeling method is more reliable for highly nonlinear systems than the traditional response surface method, and the Restricted Maximum Likelihood criterion makes the variable screening process more efficient. In this work, three different sampling methods, namely Latin Hypercube, Improved Distributed Hypercube, and D-optimally selected Latin Hypercube sampling, are compared when used with the proposed variable screening method. The new variable screening method is evaluated using a standard benchmark nonlinear numerical example that employs 20 factors. The variable screening method is then applied to a gunner restraint system design problem. Starting with a set of eight of the most important gunner's joint variables in a wide-open property space, the Improved Hypercube space-filling sampling technique is used to screen the joint variables, and the variable screening method based on the Kriging method with the Restricted Maximum Likelihood criterion is used to determine the most important joint properties for the gunner's performance.
8

Frey, Daniel D., Geoff Reber, and Yiben Lin. "A Quadrature-Based Sampling Technique for Robust Design With Computer Models." In ASME 2005 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2005. http://dx.doi.org/10.1115/detc2005-85490.

Abstract:
Several methods have been proposed for estimating transmitted variance to enable robust parameter design using computer models. This paper presents an alternative technique based on Gaussian quadrature which requires only 2n+1 or 4n+1 samples (depending on the accuracy desired) where n is the number of randomly varying inputs. The quadrature-based technique is assessed using a hierarchical probability model. The 4n+1 quadrature-based technique can estimate transmitted standard deviation within 5% in over 95% of systems which is much better than the accuracy of Hammersley Sequence Sampling, Latin Hypercube Sampling, and the Quadrature Factorial Method under similar resource constraints. If the most accurate existing method, Hammersley Sequence Sampling, is afforded ten times the number of samples, it provides approximately the same degree of accuracy as the quadrature-based method. Two case studies on robust design confirmed the main conclusions and also suggest the quadrature-based method becomes more accurate as robustness improvements are made.
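A generic flavor of such a 2n+1-sample quadrature estimate, not the paper's exact scheme, uses three-point Gauss-Hermite nodes per input: the nominal point plus two symmetric perturbations at ±√3σ with weight 1/6 each, which is exact for responses that are linear in independent Gaussian inputs.

```python
import numpy as np

def transmitted_std(f, mu, sigma):
    """2n+1-point quadrature estimate of output std for independent Gaussian
    inputs, using 3-point Gauss-Hermite nodes (0, +/-sqrt(3)) per input."""
    base = f(mu)
    var = 0.0
    for i in range(len(mu)):
        hi, lo = mu.copy(), mu.copy()
        hi[i] += np.sqrt(3.0) * sigma[i]
        lo[i] -= np.sqrt(3.0) * sigma[i]
        # outer nodes carry weight 1/6 each; the center node contributes zero
        var += ((f(hi) - base) ** 2 + (f(lo) - base) ** 2) / 6.0
    return np.sqrt(var)

f = lambda x: 3.0 * x[0] + 2.0 * x[1] ** 2   # toy response function
print(transmitted_std(f, np.array([1.0, 1.0]), np.array([0.1, 0.2])))
```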
9

Xu, Sheng, C. Guedes Soares, and Ângelo P. Teixeira. "Reliability Analysis of Short Term Mooring Tension of a Semi-Submersible System." In ASME 2018 37th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/omae2018-78751.

Abstract:
A detailed procedure to study mooring line strength reliability is presented. A fully coupled analysis is carried out to obtain the mooring tensions of a deep-water semi-submersible floating system operated in the 100-year wave condition in the South China Sea. The ACER method is applied to predict the 3 h extreme mooring tension, and the results are validated by the global maximum method. The hydrodynamic sampling points are generated by the Latin hypercube sampling technique. The 3 h extreme mooring tension is calculated by the ACER method with a 10-minute fully coupled dynamic simulation for each sampling point. A Kriging metamodel is trained to predict the 3 h extreme mooring tension under the effects of random hydrodynamic drag coefficients. A reliability analysis is carried out by implementing Monte Carlo simulation with the random hydrodynamic drag coefficients and mooring breaking strength considered.
10

Yang, R. J., N. Wang, C. H. Tho, J. P. Bobineau, and B. P. Wang. "Metamodeling Development for Vehicle Frontal Impact Simulation." In ASME 2001 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2001. http://dx.doi.org/10.1115/detc2001/dac-21012.

Abstract:
Response surface methods or metamodels are commonly used to approximate large engineering systems. This paper presents a new metric for evaluating a response surface method or a metamodeling technique. Five response surface methods are studied: Stepwise Regression, Moving Least Square, Kriging, Multiquadratic, and the Adaptive and Interactive Modeling System. A real-world frontal impact design problem is used as an example, which is a complex, highly nonlinear, transient, dynamic, large-deformation finite element model. The optimal Latin Hypercube Sampling method is used to distribute the sampling points uniformly over the entire design space. The Root Mean Square Error is used as the error indicator to study the accuracy and convergence rate of the metamodels for this vehicle impact analysis. A hybrid approach/strategy for selecting the best metamodels of impact responses is proposed.

Reports on the topic "Latin hypercube sampling technique"

1

Gwo, J. P., L. E. Toran, M. D. Morris, and G. V. Wilson. Subsurface stormflow modeling with sensitivity analysis using a Latin-hypercube sampling technique. Office of Scientific and Technical Information (OSTI), 1994. http://dx.doi.org/10.2172/10194326.

2

van der Mensbrugghe, Dominique. The META 21 Integrated Assessment Model in GAMS and LHS Sampling. GTAP Working Paper, 2023. http://dx.doi.org/10.21642/gtap.wp95.

Abstract:
The META 21 Integrated Assessment Model (Dietz et al., 2021) represents a fairly comprehensive climate change simulation model incorporating a number of features: (a) a recent simple climate model; (b) down-scaling of temperature change to the country level; (c) integration of a number of bio-physical tipping points linked to rising temperatures; and (d) economic damages linked to rising sea levels and temperature using recent country-level estimates from the literature. The original implementation of META 21 was in Excel and linked to the @Risk Excel add-in for performing Monte Carlo-type analysis. This paper reflects a translation of META 21 into GAMS. One of the key purposes is to allow for ready incorporation of many of the features of META 21 into other models, notably CGE-based IAMs. To replace the features provided by the @Risk Excel add-in, this paper also introduces a software package that produces random deviates and, like @Risk, uses the Latin Hypercube Sampling (LHS) approach, a stratified sampling technique that covers the entire sampling space efficiently and, in addition, orders the sample to reflect a desired correlation matrix for the sampled random variables.
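The "ordering to reflect a desired correlation matrix" is the rank-reordering idea of Iman and Conover. A bare-bones sketch, written independently of the utility's actual code, reorders each LHS column to follow the ranks of correlated Gaussian reference scores:

```python
import numpy as np
from scipy.stats import rankdata

def lhs_with_rank_corr(n, target_corr, rng):
    """LHS on [0, 1]^d whose columns are reordered toward a target
    rank-correlation matrix (Iman-Conover-style sketch)."""
    d = target_corr.shape[0]
    u = (np.argsort(rng.random((n, d)), axis=0) + rng.random((n, d))) / n
    # Gaussian reference scores carrying the desired correlation.
    z = rng.standard_normal((n, d)) @ np.linalg.cholesky(target_corr).T
    for j in range(d):
        ranks = rankdata(z[:, j], method="ordinal").astype(int) - 1
        u[:, j] = np.sort(u[:, j])[ranks]  # match each column's ranks to z's
    return u

rng = np.random.default_rng(5)
u = lhs_with_rank_corr(500, np.array([[1.0, 0.7], [0.7, 1.0]]), rng)
print(np.corrcoef(u.T))  # off-diagonal near the requested 0.7
```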
3

Collins, Joseph C., III. Latin Hypercube Sampling in Sensitivity Analysis. Defense Technical Information Center, 1994. http://dx.doi.org/10.21236/ada285867.

4

van der Mensbrugghe, Dominique. A Summary Guide to the Latin Hypercube Sampling (LHS) Utility. GTAP Working Paper, 2023. http://dx.doi.org/10.21642/gtap.wp94.

Abstract:
Latin Hypercube Sampling (LHS) is one method of Monte Carlo-type sampling, which is useful for limiting sample size yet maximizing the range of sampling of the underlying distributions. The LHS utility, whose usage this document describes, also allows for user-specified correlations between two or more of the sampled distributions. The LHS utility described herein is a full re-coding in C/C++ of the original LHS utility developed at Sandia National Labs (Swiler and Wyss (2004)), which was written in FORTRAN and is freely available. The re-coding hews close to the original FORTRAN code but allows for significantly more flexibility; for example, dynamic memory allocation is used for all internal variables, so there are no pre-determined dimensions. The new utility has additional features compared to the original FORTRAN code: (1) it includes 10 new statistical distributions; (2) it has four additional output formats; and (3) it has an alternative random number generator. This guide provides a summary of the full features of the LHS utility. For a complete reference, with the exception of the new features, as well as a description of the intuition behind the LHS algorithm, users are referred to Swiler and Wyss (2004).
5

Wyss, G. D., and K. H. Jorgensen. A user's guide to LHS: Sandia's Latin Hypercube Sampling Software. Office of Scientific and Technical Information (OSTI), 1998. http://dx.doi.org/10.2172/573301.

6

Helton, Jon Craig, and Freddie J. Davis. Latin Hypercube Sampling and the Propagation of Uncertainty in Analyses of Complex Systems. Office of Scientific and Technical Information (OSTI), 2002. http://dx.doi.org/10.2172/806696.

7

Swiler, Laura Painton, and Gregory Dane Wyss. A user's guide to Sandia's latin hypercube sampling software : LHS UNIX library/standalone version. Office of Scientific and Technical Information (OSTI), 2004. http://dx.doi.org/10.2172/919175.

8

Hart, Carl R., D. Keith Wilson, Chris L. Pettit, and Edward T. Nykaza. Machine-Learning of Long-Range Sound Propagation Through Simulated Atmospheric Turbulence. U.S. Army Engineer Research and Development Center, 2021. http://dx.doi.org/10.21079/11681/41182.

Abstract:
Conventional numerical methods can capture the inherent variability of long-range outdoor sound propagation, but their computational memory and time requirements are high. In contrast, machine-learning models provide very fast predictions by learning from experimental observations or surrogate data. Yet it is unknown what type of surrogate data is most suitable for machine learning. This study used a Crank-Nicolson parabolic equation (CNPE) for generating the surrogate data. The CNPE input data were sampled by the Latin hypercube technique. Two separate datasets comprised 5000 samples of model input. The first dataset consisted of transmission loss (TL) fields for single realizations of turbulence; the second consisted of average TL fields over 64 realizations of turbulence. Three machine-learning algorithms were applied to each dataset, namely ensemble decision trees, neural networks, and cluster-weighted models. Observational data come from a long-range (out to 8 km) sound propagation experiment. In comparison to the experimental observations, the regression predictions have median absolute errors of 5–7 dB. Surrogate data quality depends on an accurate characterization of refractive and scattering conditions. Predictions obtained through a single realization of turbulence agree better with the experimental observations.