
Journal articles on the topic 'Latin hypercube sampling technique'


Consult the top 50 journal articles for your research on the topic 'Latin hypercube sampling technique.'


1

Todorov, Venelin. "LATIN HYPERCUBE AND IMPORTANCE SAMPLING ALGORITHMS FOR MULTIDIMENSIONAL INTEGRALS." Journal Scientific and Applied Research 10, no. 1 (2016): 17–23. http://dx.doi.org/10.46687/jsar.v10i1.201.

Abstract:
The Monte Carlo method is often the only viable method for high-dimensional problems, since its convergence rate is independent of the dimension. In this paper we implement and analyze the computational complexity of the Latin hypercube sampling algorithm. We compare the results with the importance sampling algorithm, which is the most widely used variance-reduction Monte Carlo method, and show that Latin hypercube sampling has certain advantages over the importance sampling technique.
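As a sketch of the comparison described above (not code from the paper), the snippet below estimates a multidimensional integral over the unit cube with a plain Latin hypercube design next to crude Monte Carlo; the integrand and sample sizes are arbitrary illustrative choices.

```python
# Illustrative sketch (not from the paper): Latin hypercube sampling vs
# crude Monte Carlo for a d-dimensional integral over the unit cube.
import math
import random

def latin_hypercube(n, d, rng):
    """One LHS design: n points in [0,1]^d with one point per stratum per axis."""
    cols = []
    for _ in range(d):
        col = [(k + rng.random()) / n for k in range(n)]  # one draw per stratum
        rng.shuffle(col)                                  # random pairing across axes
        cols.append(col)
    return [tuple(c[i] for c in cols) for i in range(n)]

def estimate(f, pts):
    """Sample-mean estimate of the integral of f over the unit cube."""
    return sum(f(p) for p in pts) / len(pts)

# toy integrand with known integral 1 over [0,1]^4
f = lambda x: math.prod(2.0 * xi for xi in x)
rng = random.Random(0)
est_lhs = estimate(f, latin_hypercube(1000, 4, rng))
est_mc = estimate(f, [tuple(rng.random() for _ in range(4)) for _ in range(1000)])
# both estimates should land close to the exact value 1.0
```

The stratification guarantees that each axis is covered evenly, which is where the variance reduction relative to crude Monte Carlo comes from.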
2

LEI, GANG, and LINQING LIAO. "A STOCHASTIC FINITE ELEMENT ANALYSIS BASED ON UNIFORM SAMPLING METHOD." Journal of Advanced Manufacturing Systems 07, no. 01 (2008): 69–73. http://dx.doi.org/10.1142/s0219686708001097.

Abstract:
The stochastic finite element method is the main analysis method for complicated stochastic structures, and it is usually implemented through Monte Carlo simulation, commonly with Latin hypercube sampling. For stochastic finite element analysis, however, when the number of sampling levels is large, the required number of samples becomes impractical for most engineering purposes. In this paper, a uniform sampling method is applied instead. Example calculations with different factors and levels are compared with corresponding examples using Latin hypercube sampling. They show that even when the uniform sampling technique uses markedly fewer samples than Latin hypercube sampling, the two methods yield almost identical mean values and standard deviations of nodal displacements and stresses. Uniform sampling for stochastic finite element analysis therefore offers good computational efficiency and has promising applications.
3

Erten, Oktay, Fábio P. L. Pereira, and Clayton V. Deutsch. "Projection Pursuit Multivariate Sampling of Parameter Uncertainty." Applied Sciences 12, no. 19 (2022): 9668. http://dx.doi.org/10.3390/app12199668.

Abstract:
The efficiency of sampling is a critical concern in Monte Carlo analysis, which is frequently used to assess the effect of the uncertainty of the input variables on the uncertainty of the model outputs. The projection pursuit multivariate transform is proposed as an easily applicable tool for improving the efficiency and quality of a sampling design in Monte Carlo analysis. Its superiority as a sampling technique is demonstrated in two synthetic case studies, where the random variables are considered to be uncorrelated and correlated in low-dimensional (bivariate) and high-dimensional (five-variate) sampling spaces. Five sampling techniques, including Monte Carlo simulation, classic Latin hypercube sampling, maximin Latin hypercube sampling, Latin hypercube sampling with multidimensional uniformity, and the projection pursuit multivariate transform, are employed in the simulation studies, considering cases where the sample sizes (n) are small (10 ≤ n ≤ 100), medium (100 < n ≤ 1000), and large (1000 < n ≤ 10,000). The results of the case studies show that the projection pursuit multivariate transform yields the fewest sampling errors and the best sampling-space coverage (or multidimensional uniformity), and that a significant amount of computational effort could be saved by using this technique.
4

Phromphan, Pakin, Jirachot Suvisuthikasame, Metas Kaewmongkol, Woravech Chanpichitwanich, and Suwin Sleesongsom. "A New Latin Hypercube Sampling with Maximum Diversity Factor for Reliability-Based Design Optimization of HLM." Symmetry 16, no. 7 (2024): 901. http://dx.doi.org/10.3390/sym16070901.

Abstract:
This research paper presents a new Latin hypercube sampling method aimed at improving uncertainty quantification while reducing computation time. The new method, termed LHSMDF (Latin hypercube sampling with maximum diversity factor), serves as a tool in reliability-based design optimization (RBDO). LHSMDF is tested against Latin hypercube sampling (LHS) and optimum Latin hypercube sampling (OLHS) on mechanical components, including a circular shaft housing, a connecting rod, and a cantilever beam, to evaluate their comparative performance. The new method is then employed as the basis of RBDO in the synthesis of a six-bar high-lift mechanism (HLM) example, to enhance the reliability of the resulting mechanism compared to Monte Carlo simulation (MCS). The design problem of this mechanism is classified as a motion generation problem, incorporating the angle and position of the flap in the objective function. The six-bar linkage is first adapted to be a high-lift mechanism, a symmetrical device of the aircraft. A deterministic design that ignores uncertainty may lead to unacceptable performance during manufacturing due to link-length tolerances. The techniques are combined with an efficient metaheuristic, teaching–learning-based optimization with a diversity archive (ATLBO-DA), to identify a reliable HLM. Performance testing reveals that the new LHSMDF outperforms the original LHS and OLHS. The HLM test results demonstrate that achieving an optimum HLM with high reliability requires precision in the manufacturing process without sacrificing accuracy. Moreover, the six-bar HLM could emerge as a viable option for developing new high-lift devices in future aircraft mechanisms.
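The paper's maximum diversity factor is not reproduced here, but its simpler classical relative, the maximin criterion, can be sketched: generate several random Latin hypercube designs and keep the one whose smallest pairwise point distance is largest. All sizes below are illustrative.

```python
# Sketch of maximin selection among random Latin hypercube designs
# (a classical, simpler relative of the paper's diversity factor).
import math
import random

def lhs(n, d, rng):
    """Random LHS design: n points in [0,1]^d, one per stratum per axis."""
    cols = []
    for _ in range(d):
        col = [(k + rng.random()) / n for k in range(n)]
        rng.shuffle(col)
        cols.append(col)
    return [tuple(c[i] for c in cols) for i in range(n)]

def min_pairwise_dist(pts):
    """Smallest Euclidean distance between any two design points."""
    return min(math.dist(p, q)
               for i, p in enumerate(pts) for q in pts[i + 1:])

def maximin_lhs(n, d, rng, tries=50):
    """Keep the candidate design whose worst point spacing is best."""
    return max((lhs(n, d, rng) for _ in range(tries)), key=min_pairwise_dist)
```

Every candidate keeps the one-point-per-stratum LHS property; the selection step only improves how well the points spread out jointly.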
5

Karatzetzou, Anna. "Uncertainty and Latin Hypercube Sampling in Geotechnical Earthquake Engineering." Geotechnics 4, no. 4 (2024): 1007–25. http://dx.doi.org/10.3390/geotechnics4040051.

Abstract:
A soil–foundation–structure system (SFSS) often exhibits different responses compared to a fixed-base structure when subjected to earthquake ground motion. Both kinematic and inertial soil–foundation–structure interactions can significantly influence the structural performance of buildings. Numerous parameters within an SFSS affect its overall response, introducing inherent uncertainty into the solution. Performing time history analyses, even for a linear elastic coupled SFSS, requires considerable computational effort. To reduce the computational cost without compromising accuracy, the use of the Latin Hypercube Sampling (LHS) technique is proposed herein. Sampling techniques are rarely employed in soil–foundation–structure interaction analyses, yet they are highly beneficial. These methodologies allow analyses determined by sampling to be conducted using commercial codes designed for deterministic analyses, without requiring any modifications. The advantage is that the number of analyses determined by the sampling size is significantly reduced as compared to considering all combinations of input parameters. After identifying the important samples, one can evaluate the seismic demand of selected soil–foundation–bridge pier systems using finite element numerical software. This paper indicates that LHS reduces computational effort by 60%, whereas structural response components (translation, rocking) show distinct trends for different systems.
6

Zhao, Wei, YangYang Chen, and Jike Liu. "Reliability sensitivity analysis using axis orthogonal importance Latin hypercube sampling method." Advances in Mechanical Engineering 11, no. 3 (2019): 168781401982641. http://dx.doi.org/10.1177/1687814019826414.

Abstract:
In this article, a combination of Latin hypercube sampling and axis orthogonal importance sampling is explored as an efficient and applicable tool for reliability analysis with a limited number of samples. It is used to estimate the sensitivity of the failure probability with respect to the distribution parameters of the basic random variables. The problem is solved equivalently through reliability sensitivity analysis of a series of hyperplanes through each sampling point, parallel to the tangent hyperplane of the limit-state surface at the design point. The analytical expressions of these hyperplanes are given, and the formulas for the reliability sensitivity estimators and their variances are derived from first-order reliability theory and the difference method, for the cases without and with non-normal random variables, respectively. A procedure is established for the reliability sensitivity analysis in two versions: (1) axis orthogonal Latin hypercube importance sampling and (2) axis orthogonal quasi-random importance sampling with the Halton sequence. Four numerical examples are presented. The results demonstrate that the proposed procedure is more efficient than procedures based on Latin hypercube sampling and the direct Monte Carlo technique, with acceptable accuracy in the sensitivity estimation of the failure probability.
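The article's axis-orthogonal scheme is not reproduced here, but the basic ingredient it builds on can be sketched: an importance-sampling estimate of a failure probability with the sampling density recentred at an assumed design point. The limit state and design point below are invented for illustration.

```python
# Minimal importance-sampling estimate of P(g(u) < 0) for standard-normal
# inputs, recentring the proposal density at a design point (a sketch of
# the ingredient behind the article, not its axis-orthogonal method).
import math
import random

def is_failure_prob(g, design_point, n, rng):
    """Draw u from N(design_point, I) and reweight each failure by the
    likelihood ratio phi(u) / phi(u - design_point), computed in log space."""
    total = 0.0
    for _ in range(n):
        u = [rng.gauss(mu, 1.0) for mu in design_point]
        if g(u) < 0:
            log_w = sum(-0.5 * ui ** 2 + 0.5 * (ui - mu) ** 2
                        for ui, mu in zip(u, design_point))
            total += math.exp(log_w)
    return total / n

# hypothetical limit state g(u) = 3 - u1: failure when u1 > 3,
# so the exact probability is Phi(-3), about 1.35e-3
pf = is_failure_prob(lambda u: 3.0 - u[0], [3.0], 20_000, random.Random(0))
```

Centring the proposal on the design point makes failures frequent in the sample, which is why far fewer draws are needed than with direct Monte Carlo.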
7

Kaewsuwan, Kantapit, Chumpol Yuangyai, Udom Janjarassuk, and Kanokporn Rienkhemaniyom. "A comparison of latin hypercube sampling techniques for a supply chain network design problem." MATEC Web of Conferences 192 (2018): 01023. http://dx.doi.org/10.1051/matecconf/201819201023.

Abstract:
Supply chain network design has become increasingly complex. To design a supply chain network that can withstand changing events, it is necessary to consider the uncertainties and risks that cause network disruptions from unexpected events. Current research on this design problem models network disruptions using Monte Carlo sampling (MCS) or Latin hypercube sampling (LHS). Both share a disadvantage: sample points, or disruption locations, are not scattered across the entire sample space, leading to high variation in the objective function values. The purpose of this study is to apply a modified LHS, the Improved Distributed Hypercube Sampling (IHS) technique, to reduce this variation. The results show that IHS yields a smaller standard deviation than LHS. In addition, IHS reduces not only the required sample size but also the computational time.
8

Nikas, Ioannis A., Vasileios P. Georgopoulos, and Vasileios C. Loukopoulos. "Selective Multistart Optimization Based on Adaptive Latin Hypercube Sampling and Interval Enclosures." Mathematics 13, no. 11 (2025): 1733. https://doi.org/10.3390/math13111733.

Abstract:
Solving global optimization problems is a significant challenge, particularly in high-dimensional spaces. This paper proposes a selective multistart optimization framework that employs a modified Latin Hypercube Sampling (LHS) technique to maintain a constant search space coverage rate, alongside Interval Arithmetic (IA) to prioritize sampling points. The proposed methodology addresses key limitations of conventional multistart methods, such as the exponential decline in space coverage with increasing dimensionality. It prioritizes sampling points by leveraging the hypercubes generated through LHS and their corresponding interval enclosures, guiding the optimization process toward regions more likely to contain the global minimum. Unlike conventional multistart methods, which assume uniform sampling without quantifying spatial coverage, the proposed approach constructs interval enclosures around each sample point, enabling explicit estimation and control of the explored search space. Numerical experiments on well-known benchmark functions demonstrate improvements in space coverage efficiency and enhanced local/global minimum identification. The proposed framework offers a promising approach for large-scale optimization problems frequently encountered in machine learning, artificial intelligence, and data-intensive domains.
9

Packham, N. "Combining Latin Hypercube Sampling With Other Variance Reduction Techniques." Wilmott 2015, no. 76 (2015): 60–69. http://dx.doi.org/10.1002/wilm.10410.

10

Bin, Li, Rashana Abbas, Muhammad Shahzad, and Nouman Safdar. "Probabilistic Load Flow Analysis Using Nonparametric Distribution." Sustainability 16, no. 1 (2023): 240. http://dx.doi.org/10.3390/su16010240.

Abstract:
In the pursuit of sustainable energy solutions, this research addresses the critical need for accurate probabilistic load flow (PLF) analysis in power systems. PLF analysis is an essential tool for estimating the statistical behavior of power systems under uncertainty, and it plays a vital part in power system planning, operation, and dependability studies. For accurate PLF analysis, this article proposes kernel density estimation with adaptive bandwidth for estimating the probability density function (PDF) of power injections from sustainable energy sources such as solar and wind, reducing errors in PDF estimation. To reduce the computational burden, a Latin hypercube sampling approach is incorporated: input random variables are modeled using kernel density estimation (KDE) in conjunction with Latin hypercube sampling (LHS) for the PLF analysis. The proposed techniques are tested on the IEEE 14- and IEEE 118-bus systems. Two benchmark techniques, Monte Carlo simulation (MCS) and Hamiltonian Monte Carlo (HMC), are used to validate the results. The results illustrate that the adaptive-bandwidth kernel density estimation with Latin hypercube sampling (AKDE-LHS) method provides better performance in terms of precision and computational efficiency. They also show that the proposed technique reduces errors, uncertainties, and computational time while representing arbitrary distributions of photovoltaic and wind farms, making it a potential solution to the challenges posed by sustainable energy sources in power systems.
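A one-dimensional sketch of the core KDE-plus-LHS idea (stratified uniforms pushed through a kernel-density CDF) could look as follows. Note two deliberate simplifications: the paper uses an adaptive bandwidth, while this sketch uses a fixed Silverman bandwidth, and the "wind speed" data here are synthetic.

```python
# Sketch: sample a Gaussian-KDE model of 1-D data through Latin hypercube
# strata, by inverting the KDE CDF with bisection. Fixed bandwidth only;
# the paper's adaptive-bandwidth scheme is not reproduced here.
import random
from statistics import NormalDist

def kde_inv_cdf(data, h, p, lo, hi, iters=40):
    """Invert the Gaussian-KDE CDF (mean of per-point normal CDFs) by bisection."""
    std = NormalDist()
    cdf = lambda x: sum(std.cdf((x - xi) / h) for xi in data) / len(data)
    for _ in range(iters):
        mid = (lo + hi) / 2
        if cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def kde_lhs_sample(data, n, rng):
    """n draws from the KDE of `data`, one per Latin hypercube stratum."""
    mean = sum(data) / len(data)
    sd = (sum((x - mean) ** 2 for x in data) / len(data)) ** 0.5
    h = 1.06 * sd * len(data) ** -0.2        # Silverman's rule of thumb
    lo, hi = min(data) - 6 * h, max(data) + 6 * h
    u = [(k + rng.random()) / n for k in range(n)]   # stratified uniforms
    rng.shuffle(u)
    return [kde_inv_cdf(data, h, min(max(p, 1e-6), 1 - 1e-6), lo, hi) for p in u]
```

The stratification spreads the probability levels evenly over (0, 1), so even modest sample counts reproduce the fitted distribution's bulk statistics well.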
11

Gwo, J. P., L. E. Toran, M. D. Morris, and G. V. Wilson. "Subsurface Stormflow Modeling with Sensitivity Analysis Using a Latin-Hypercube Sampling Technique." Ground Water 34, no. 5 (1996): 811–18. http://dx.doi.org/10.1111/j.1745-6584.1996.tb02075.x.

12

Yin, Shengwen, Haogang Qin, and Qiang Gao. "An Efficient Orthogonal Polynomial Method for Auxetic Structure Analysis with Epistemic Uncertainties." Mathematical and Computational Applications 27, no. 3 (2022): 49. http://dx.doi.org/10.3390/mca27030049.

Abstract:
Traditional approaches to analyzing the mechanical properties of auxetic structures are commonly deterministic, neglecting the effects of uncertainties. However, uncertainty is widely present in auxetic structures and may greatly affect their mechanical properties. Evidence theory has a strong ability to deal with uncertainties; thus, it is introduced for modelling epistemic uncertainties in auxetic structures. For the response analysis of a typical double-V negative Poisson's ratio (NPR) structure with epistemic uncertainty, a new sequence-sampling-based arbitrary orthogonal polynomial (SS-AOP) expansion is proposed by introducing arbitrary orthogonal polynomial theory and a sequential sampling strategy. In SS-AOP, a sampling technique is developed to calculate the coefficients of the AOP expansion. In particular, candidate sampling points are generated using the Gauss points associated with the optimal Gauss weight function for each evidence variable, and a sequential sampling technique is introduced to select the sampling points from the candidates. Using SS-AOP, the number of sampling points needed to establish the AOP expansion can be effectively reduced; thus, the efficiency of the AOP expansion method can be improved without sacrificing accuracy. The proposed SS-AOP is thoroughly investigated through comparison with the Gaussian-quadrature-based AOP method, the Latin-hypercube-sampling-based AOP (LHS-AOP) method, and the optimal Latin-hypercube-sampling-based AOP (OLHS-AOP) method.
13

Todorov, Venelin. "MONTE CARLO SAMPLING TECHNIQUES FOR COMPUTATION OF MULTIDIMENSIONAL INTEGRALS RELATED TO MIGRATION." Journal Scientific and Applied Research 16, no. 1 (2019): 23–31. http://dx.doi.org/10.46687/jsar.v16i1.260.

Abstract:
We study multidimensional integrals with applications to international migration forecasting. A comprehensive experimental study based on Latin hypercube sampling, importance sampling, and a Fibonacci-based lattice rule has been carried out; this is the first time such a comparison has been made. The numerical tests show that the stochastic algorithms under consideration are efficient tools for computing multidimensional integrals. This is important for obtaining a more accurate and reliable interpretation of the results, which is a foundation of international migration forecasting.
14

Mohammed, Maha A., N. F. M. Noor, A. I. N. Ibrahim, and Z. Siri. "A non-conventional hybrid numerical approach with multi-dimensional random sampling for cocaine abuse in Spain." International Journal of Biomathematics 11, no. 08 (2018): 1850110. http://dx.doi.org/10.1142/s1793524518501103.

Abstract:
This paper introduces a non-conventional approach with multi-dimensional random sampling to solve a cocaine abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time via hybrid integration of the classical numerical finite difference (FD) formula with the Latin hypercube sampling (LHS) technique, to create a random distribution for the model parameters which are dependent on time [Formula: see text]. The LHS technique allows the MLHFD method to produce fast variation of the parameter values via a number of multidimensional simulations (100, 1000 and 5000). The generated Latin hypercube sample, which is random or non-deterministic in nature, is further integrated with the FD method to complete one cycle of LHS-FD simulation iteration. This process is repeated until [Formula: see text] final iterations of LHS-FD are obtained. The means of these [Formula: see text] final solutions (the MLHFD solutions) are tabulated, graphed, and analyzed. The numerical simulation results of MLHFD for the SEIR model are presented side by side with deterministic solutions obtained from the classical FD scheme and the homotopy analysis method with Padé approximation (HAM-Padé). The present MLHFD results are also compared with previous non-deterministic statistical estimations from 1995 to 2015; good agreement between the two is observed, with small errors. The MLHFD method can be used to predict the future behavior, range, and prediction interval of the epidemic model solutions. The expected profiles of the cocaine abuse subpopulations are projected until the year 2045. Both the statistical estimations and the deterministic results of FD and HAM-Padé are found to be within the MLHFD prediction intervals for all the years and all the subpopulations considered.
15

Ibrahim, Mohamed, Saad Al-Sobhi, Rajib Mukherjee, and Ahmed AlNouss. "Impact of Sampling Technique on the Performance of Surrogate Models Generated with Artificial Neural Network (ANN): A Case Study for a Natural Gas Stabilization Unit." Energies 12, no. 10 (2019): 1906. http://dx.doi.org/10.3390/en12101906.

Abstract:
Data-driven models are essential tools for developing surrogate models that can be used for the design, operation, and optimization of industrial processes. One approach to developing surrogate models is through input–output data obtained from a process simulator. To enhance model robustness, proper sampling techniques are required to cover the entire domain of the process variables uniformly. In the present work, Monte Carlo samples with pseudo-random points, Latin hypercube samples, and quasi-Monte Carlo samples with Hammersley sequence sampling (HSS) are generated. The sampled data obtained from the process simulator are fitted to neural networks to generate a surrogate model. An illustrative case study is solved to predict the performance of a natural gas stabilization unit. From the developed surrogate models, it can be concluded that Latin hypercube sampling and HSS perform better than the pseudo-random sampling method for designing the surrogate model. This conclusion is based on the maximum absolute value, standard deviation, and confidence interval of the relative average error obtained with the different sampling techniques.
16

Tripathi, Anshuman, Gursharanjit Kaur, Abhirup Datta, and Suman Majumdar. "Comparing sampling techniques to chart parameter space of 21 cm global signal with Artificial Neural Networks." Journal of Cosmology and Astroparticle Physics 2024, no. 10 (2024): 041. http://dx.doi.org/10.1088/1475-7516/2024/10/041.

Abstract:
Understanding the first billion years of the universe requires studying two critical epochs: the Epoch of Reionization (EoR) and Cosmic Dawn (CD). However, due to limited data, the properties of the intergalactic medium (IGM) during these periods remain poorly understood, leading to a vast parameter space for the global 21 cm signal. Training an artificial neural network (ANN) with a narrowly defined parameter space can result in biased inferences. To mitigate this, the training dataset must be drawn uniformly from the entire parameter space to cover all possible signal realizations. However, drawing all possible realizations is computationally challenging, necessitating the sampling of a representative subset of this space. This study aims to identify optimal sampling techniques for the extensive dimensionality and volume of the 21 cm signal parameter space. The optimally sampled training set will be used to train the ANN to infer from the global signal experiment. We investigate three sampling techniques: random, Latin hypercube (stratified), and Hammersley sequence (quasi-Monte Carlo) sampling, and compare their outcomes. Our findings reveal that sufficient samples must be drawn for robust and accurate ANN model training, regardless of the sampling technique employed. The required sample size depends primarily on two factors: the complexity of the data and the number of free parameters; more free parameters necessitate drawing more realizations. Among the sampling techniques utilized, we find that ANN models trained with Hammersley sequence sampling are more robust than those trained with Latin hypercube and random sampling.
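For readers unfamiliar with the third technique above, a Hammersley point set can be sketched in a few lines: the first coordinate is the equispaced sequence i/n and the remaining coordinates are radical-inverse (van der Corput) digit reversals in successive prime bases. The dimension cap below is an illustrative choice.

```python
# Sketch of a Hammersley point set in [0,1]^d (quasi-Monte Carlo),
# built from radical inverses in successive prime bases.
def radical_inverse(i, base):
    """Reflect the base-`base` digits of i about the radix point."""
    inv, f = 0.0, 1.0 / base
    while i > 0:
        inv += (i % base) * f
        i //= base
        f /= base
    return inv

def hammersley(n, d, primes=(2, 3, 5, 7, 11, 13)):
    """n points in [0,1)^d: first axis is i/n, the rest are radical inverses
    (supports d - 1 up to the number of listed prime bases)."""
    return [
        tuple([i / n] + [radical_inverse(i, primes[j]) for j in range(d - 1)])
        for i in range(n)
    ]
```

Unlike random or LHS designs, the point set is deterministic, which is part of why quasi-Monte Carlo training sets tend to be reproducible and evenly spread.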
17

Isaev, Alexander, Tatiana Dobroserdova, Alexander Danilov, and Sergey Simakov. "Physically Informed Deep Learning Technique for Estimating Blood Flow Parameters in Four-Vessel Junction after the Fontan Procedure." Computation 12, no. 3 (2024): 41. http://dx.doi.org/10.3390/computation12030041.

Abstract:
This study introduces an innovative approach leveraging physics-informed neural networks (PINNs) for the efficient computation of blood flows at the boundaries of a four-vessel junction formed by a Fontan procedure. The methodology incorporates a 3D mesh generation technique based on the parameterization of the junction’s geometry, coupled with an advanced physically regularized neural network architecture. Synthetic datasets are generated through stationary 3D Navier–Stokes simulations within immobile boundaries, offering a precise alternative to resource-intensive computations. A comparative analysis of standard grid sampling and Latin hypercube sampling data generation methods is conducted, resulting in datasets comprising 1.1 × 10⁴ and 5 × 10³ samples, respectively. The following two families of feed-forward neural networks (FFNNs) are then compared: the conventional “black-box” approach using mean squared error (MSE) and a physically informed FFNN employing a physically regularized loss function (PRLF), incorporating mass conservation law. The study demonstrates that combining PRLF with Latin hypercube sampling enables the rapid minimization of relative error (RE) when using a smaller dataset, achieving a relative error value of 6% on the test set. This approach offers a viable alternative to resource-intensive simulations, showcasing potential applications in patient-specific 1D network models of hemodynamics.
18

Pholdee, Nantiwat, and Sujin Bureerat. "An efficient optimum Latin hypercube sampling technique based on sequencing optimisation using simulated annealing." International Journal of Systems Science 46, no. 10 (2013): 1780–89. http://dx.doi.org/10.1080/00207721.2013.835003.

19

Nguyễn, Duy Duẩn, Xuân Thục Phan, and Trọng Hà Nguyễn. "Reliability assessment of steel beams on an elastic foundation under moving loads using Monte Carlo simulation and Latin hypercube sampling" [in Vietnamese]. Vietnam Institute for Building Science and Technology 2023, vi.vol3 (2023): 12–19. http://dx.doi.org/10.59382/j-ibst.2023.vi.vol3-2.

Abstract:
Steel beams resting on an elastic foundation are used to minimize the impact of thermal expansion and vibrations on civil engineering structures such as bridges and buildings, owing to their flexibility and ability to withstand large loads. In practice, the performance of steel beams on elastic foundations is affected by random factors such as exceeded load-bearing capacity, undesirable deformations, loss of soil resilience, and environmental corrosion. This study proposes a technique to evaluate the reliability of steel beams resting on an elastic foundation subjected to moving loads, using discrete random input parameters, through Latin hypercube sampling and Monte Carlo simulation. The reliability of the steel beams obtained with the proposed method is then compared to that of traditional Monte Carlo simulation. The results reveal that the proposed technique provides high accuracy and saves computational resources compared to the traditional Monte Carlo method. Finally, the influence of the random input parameters is assessed using the Sobol' sensitivity index.
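A minimal sketch of the kind of reliability estimate this paper builds on is a failure probability P(g < 0) computed from Latin-hypercube-sampled random inputs. The limit-state function (capacity minus load effect) and the distributions below are invented for illustration, not the paper's beam model.

```python
# Sketch: LHS-based failure-probability estimate P(g(x) < 0), pushing each
# input's stratified uniforms through its inverse CDF. Hypothetical inputs.
import random
from statistics import NormalDist

def lhs_column(n, rng):
    """n stratified uniforms in (0,1): one draw per equal-width stratum."""
    u = [(k + rng.random()) / n for k in range(n)]
    rng.shuffle(u)
    return u

def failure_probability(g, dists, n, rng):
    """Fraction of LHS samples in the failure domain g(x) < 0."""
    cols = [[d.inv_cdf(max(u, 1e-12)) for u in lhs_column(n, rng)]
            for d in dists]
    return sum(g(x) < 0 for x in zip(*cols)) / n

# hypothetical resistance R ~ N(10, 1) vs load effect S ~ N(5, 1.5):
# the exact Pf is Phi(-5 / sqrt(1 + 1.5**2)), roughly 2.8e-3
pf = failure_probability(lambda x: x[0] - x[1],
                         [NormalDist(10.0, 1.0), NormalDist(5.0, 1.5)],
                         10_000, random.Random(0))
```

Shuffling each column independently gives the random inter-variable pairing that standard LHS prescribes.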
20

Alzubaidi, Mohammed, Kazi N. Hasan, Lasantha Meegahapola, and Mir Toufikur Rahman. "Identification of Efficient Sampling Techniques for Probabilistic Voltage Stability Analysis of Renewable-Rich Power Systems." Energies 14, no. 8 (2021): 2328. http://dx.doi.org/10.3390/en14082328.

Abstract:
This paper presents a comparative analysis of six sampling techniques to identify an efficient and accurate sampling technique to be applied to probabilistic voltage stability assessment in large-scale power systems. In this study, six different sampling techniques are investigated and compared to each other in terms of their accuracy and efficiency, including Monte Carlo (MC), three versions of Quasi-Monte Carlo (QMC), i.e., Sobol, Halton, and Latin Hypercube, Markov Chain MC (MCMC), and importance sampling (IS) technique, to evaluate their suitability for application with probabilistic voltage stability analysis in large-scale uncertain power systems. The coefficient of determination (R²) and root mean square error (RMSE) are calculated to measure the accuracy and the efficiency of the sampling techniques compared to each other. All the six sampling techniques provide more than 99% accuracy by producing a large number of wind speed random samples (8760 samples). In terms of efficiency, on the other hand, the three versions of QMC are the most efficient sampling techniques, providing more than 96% accuracy with only a small number of generated samples (150 samples) compared to other techniques.
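The accuracy-per-sample comparison described above can be sketched in miniature: estimate the same expectation repeatedly under crude Monte Carlo and under Latin hypercube sampling, then compare the RMSEs at equal sample size. The response function here is a toy stand-in, not a power-system model.

```python
# Sketch: RMSE of a sample-mean estimator under crude MC vs LHS,
# over repeated small studies at the same sample size.
import math
import random

def lhs_column(n, rng):
    """n stratified uniforms in [0,1): one draw per equal-width stratum."""
    u = [(k + rng.random()) / n for k in range(n)]
    rng.shuffle(u)
    return u

def mc_column(n, rng):
    """n plain pseudo-random uniforms."""
    return [rng.random() for _ in range(n)]

def rmse_of_mean(sampler, f, true_mean, repeats, n, rng):
    """RMSE of the sample-mean estimate of E[f(U)] across repeated studies."""
    se = [(sum(map(f, sampler(n, rng))) / n - true_mean) ** 2
          for _ in range(repeats)]
    return math.sqrt(sum(se) / repeats)

f = lambda u: u * u                     # toy response; E[f(U)] = 1/3
rng = random.Random(0)
rmse_mc = rmse_of_mean(mc_column, f, 1 / 3, 200, 150, rng)
rmse_lhs = rmse_of_mean(lhs_column, f, 1 / 3, 200, 150, rng)
# stratification should cut the RMSE sharply at the same sample size
```

For smooth responses the LHS estimator's error shrinks much faster than the crude MC rate, which mirrors the efficiency gap the paper reports for QMC-style samplers.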
21

Sun, Gao Rong, and Fen Fen Xiong. "Efficient Sampling Approaches for Stochastic Response Surface Method." Advanced Materials Research 538-541 (June 2012): 2481–87. http://dx.doi.org/10.4028/www.scientific.net/amr.538-541.2481.

Abstract:
Stochastic response surface methods (SRSM) based on polynomial chaos expansion (PCE) have been widely used for uncertainty propagation (UP). An efficient sampling technique is needed to estimate the PCE coefficients in SRSM. In this paper, three advanced sampling approaches, namely Gaussian quadrature points (GQ), the monomial cubature rule (MCR), and Latin hypercube design (LHD), are introduced and investigated, and their performance is tested on several examples. The UP results for the three sampling approaches agree closely with those of Monte Carlo simulation. Specifically, GQ yields the most accurate UP results, followed by MCR and LHD, while MCR shows the best efficiency for lower PCE orders.
22

Srivastava, Ankur, and Andrew J. Meade. "Use of Active Learning to Design Wind Tunnel Runs for Unsteady Cavity Pressure Measurements." International Journal of Aerospace Engineering 2014 (2014): 1–11. http://dx.doi.org/10.1155/2014/218710.

Abstract:
Wind tunnel tests to measure unsteady cavity flow pressure measurements can be expensive, lengthy, and tedious. In this work, the feasibility of an active machine learning technique to design wind tunnel runs using proxy data is tested. The proposed active learning scheme used scattered data approximation in conjunction with uncertainty sampling (US). We applied the proposed intelligent sampling strategy in characterizing cavity flow classes at subsonic and transonic speeds and demonstrated that the scheme has better classification accuracies, using fewer training points, than a passive Latin Hypercube Sampling (LHS) strategy.
23

Hlobilová, Adéla, and Matěj Lepš. "Parameter Study of Asymptotic Sampling for Truss Structures Reliability." Advanced Materials Research 969 (June 2014): 288–93. http://dx.doi.org/10.4028/www.scientific.net/amr.969.288.

Abstract:
Sampling methods for estimating a reliability index, such as Monte Carlo or Latin hypercube sampling, are very time-consuming, so advanced simulation techniques are frequently used. Asymptotic sampling is one of the newer techniques, based on asymptotic results from reliability theory supported by a simple regression. Since high sensitivity of the asymptotic sampling method to its control parameters has been reported, this contribution focuses on a study of the optimal parameter setting. A well-known truss structure is introduced, and results with different parameter settings are presented.
24

Zhen Shu and P. Jirutitijaroen. "Latin Hypercube Sampling Techniques for Power Systems Reliability Analysis With Renewable Energy Sources." IEEE Transactions on Power Systems 26, no. 4 (2011): 2066–73. http://dx.doi.org/10.1109/tpwrs.2011.2113380.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Sen-chun, Miao, Shi Zhi-xiao, Wang Xiao-hui, Shi Feng-xia, and Shi Guang-tai. "Impeller meridional plane optimization of pump as turbine." Science Progress 103, no. 1 (2019): 003685041987654. http://dx.doi.org/10.1177/0036850419876542.

Full text
Abstract:
How to improve efficiency is still a very active research topic for pumps used as turbines. This article proposes a method for the optimal design of the pump-as-turbine impeller meridional plane. It combines parameterized control of the impeller meridional plane, the computational fluid dynamics (CFD) technique, an optimized Latin hypercube sampling experimental design, a back propagation neural network optimized by a genetic algorithm, and a genetic algorithm. Concretely, the impeller meridional plane was parameterized with the Pro/E software, optimized Latin hypercube sampling was used to obtain the test sample points for the back propagation neural network optimized by a genetic algorithm, and the model corresponding to each sample point was analyzed with CFD to obtain its performance values. Then, neural network learning and training were carried out by combining the sample points and the corresponding model performance values. Finally, the back propagation neural network optimized by a genetic algorithm was combined with a genetic algorithm to solve the optimization problem of the impeller meridional plane. Following this optimization design method, the impeller meridional plane of the pump as turbine was optimized. The results show that the energy-conversion efficiency of the optimized pump as turbine improved by 2.28% at the optimum operating condition while meeting the pressure head constraint, namely that the head difference between the initial and optimized models stays under a set value. This demonstrates that the optimization method proposed in this article for the impeller meridional plane is practicable.
APA, Harvard, Vancouver, ISO, and other styles
26

Zhao, Wei, Na Zhou, and Yi Min Zhang. "Global Sensitivity of Random Parameters in Vibration Transfer Path Systems." Applied Mechanics and Materials 635-637 (September 2014): 274–80. http://dx.doi.org/10.4028/www.scientific.net/amm.635-637.274.

Full text
Abstract:
Based on the Sobol’ theory and the Homma and Saltelli indicator system, the path parameters were divided into different subsets for vibration transfer path systems with random paths in this paper. The objective function was constructed and decomposed, and thus a variance-based global sensitivity measure was formed. The first-order and total sensitivities of the various parameter subsets were calculated using the Monte Carlo simulation method combined with the Latin hypercube sampling technique. The analysis example shows that the approach is feasible for analyzing the interaction of paths or path parameters in vibration transfer path systems with uncertainty.
APA, Harvard, Vancouver, ISO, and other styles
27

Chowdhary, K., M. Salloum, B. Debusschere, and V. E. Larson. "Quadrature Methods for the Calculation of Subgrid Microphysics Moments." Monthly Weather Review 143, no. 7 (2015): 2955–72. http://dx.doi.org/10.1175/mwr-d-14-00168.1.

Full text
Abstract:
Abstract Many cloud microphysical processes occur on a much smaller scale than a typical numerical grid box can resolve. In such cases, a probability density function (PDF) can act as a proxy for subgrid variability in these microphysical processes. This method is known as the assumed PDF method. By placing a density on the microphysical fields, one can use samples from this density to estimate microphysics averages. In the assumed PDF method, the calculation of such microphysical averages has primarily been done using classical Monte Carlo methods and Latin hypercube sampling. Although these techniques are fairly easy to implement and ubiquitous in the literature, they suffer from slow convergence rates as a function of the number of samples. This paper proposes using deterministic quadrature methods instead of traditional random sampling approaches to compute the microphysics statistical moments for the assumed PDF method. For smooth functions, the quadrature-based methods can achieve much greater accuracy with fewer samples by choosing tailored quadrature points and weights instead of random samples. Moreover, these techniques are fairly easy to implement and conceptually similar to Monte Carlo–type methods. As a prototypical microphysical formula, Khairoutdinov and Kogan’s autoconversion and accretion formulas are used to illustrate the benefit of using quadrature instead of Monte Carlo or Latin hypercube sampling.
APA, Harvard, Vancouver, ISO, and other styles
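The contrast the abstract above draws between random sampling and deterministic quadrature can be made concrete with a small sketch (an illustration only, not the paper's implementation): a 3-point Gauss-Hermite rule in probabilists' form evaluates expectations of polynomials up to degree 5 against a standard normal exactly, where Monte Carlo would need many samples to approximate them.

```python
import math

# Nodes and weights of the 3-point Gauss-Hermite rule (probabilists' form),
# exact for polynomial integrands up to degree 5 against the N(0, 1) density.
NODES = (-math.sqrt(3.0), 0.0, math.sqrt(3.0))
WEIGHTS = (1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0)

def normal_expectation(f):
    """Approximate E[f(X)] for X ~ N(0, 1) with three deterministic samples."""
    return sum(w * f(x) for w, x in zip(WEIGHTS, NODES))
```

For example, the rule reproduces E[X^2] = 1 and E[X^4] = 3 with only three function evaluations, while a Monte Carlo estimate of the same moments converges at the usual O(1/sqrt(N)) rate.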
28

Castaneda-Lopez, Mario, Thomas Lenoir, Jean-Pierre Sanfratello, and Luc Thorel. "Data-Driven Prediction of Cement-Stabilized Soils Tensile Properties." Infrastructures 8, no. 10 (2023): 146. http://dx.doi.org/10.3390/infrastructures8100146.

Full text
Abstract:
The indirect tensile strengths of two geomaterials treated with variable cement contents, degrees of compaction, and water contents were tested after several curing times. A statistical review through an analysis of variance allows the significant variables to be identified and prediction models to be generated. The distribution of the associated uncertainties was measured. Based on these probabilistic results, numerical models were constructed using Latin Hypercube Sampling as the space-filling technique. Predictions from the numerical sampling were in accordance with the experimental results. The numerical results suggest that the net gain in accuracy was not affected by the soil type. In addition, it increases rapidly as a function of the sampling size. The proposed approach is broad: it can help to highlight the physical mechanisms involved in the behavior of multi-component materials.
APA, Harvard, Vancouver, ISO, and other styles
29

Liu, Bojun, Jun Xia, Libin Yang, Changyong Cui, Linwei Wang, and Tiansheng Li. "Improved dynamic simulation technique for hydrodynamics and water quality of river-connected lakes." Water Supply 20, no. 8 (2020): 3752–67. http://dx.doi.org/10.2166/ws.2020.125.

Full text
Abstract:
Abstract In this study, a two-dimensional hydrodynamic water-quality model is proposed for river-connected lakes in an effort to improve calibration accuracy and reduce computational burden. To achieve this, the sensitivity of parameters involved in the hydrodynamic model is analyzed using stepwise rank regression and Latin hypercube sampling (LHS), and the roughness coefficient, wind drag coefficient and wind resistance coefficient are identified as the most important parameters affecting the hydrodynamics of the Hongze Lake. Then, the ensemble Kalman filter (EnKF) is used to assimilate observations to the proposed hydrodynamic and water quality model. It is found that assimilation of both state variables and model parameters results in a significant improvement of the simulation of the water level, flow velocity and pollutant concentration in the Hongze Lake.
APA, Harvard, Vancouver, ISO, and other styles
30

Kanyakam, Siwadol, and Sujin Bureerat. "Optimal Design of a Pin-Fin Heat Sink Using a Surrogate-Assisted Multiobjective Evolutionary Algorithm." Advanced Materials Research 308-310 (August 2011): 1122–28. http://dx.doi.org/10.4028/www.scientific.net/amr.308-310.1122.

Full text
Abstract:
In this work, performance enhancement of a multiobjective evolutionary algorithm (MOEA) by integrating a surrogate model to the design process is presented. The MOEA used in this work is multiobjective population-based incremental learning (PBIL). The bi-objective design problem of a pin-fin heat sink (PFHS) is posed to minimize junction temperature and fan pumping power while meeting design constraints. A Kriging (KRG) model is used for improving the performance of PBIL. The training points for constructing a surrogate KRG model are sampled by means of a Latin hypercube sampling (LHS) technique. It is shown that hybridization of PBIL and KRG can enhance the search performance of PBIL.
APA, Harvard, Vancouver, ISO, and other styles
31

Park, Sang-Jun, Jin-Bin Im, Hye-Soon Yoon, and Ju-Hyung Kim. "Data Augmentation Approaches for Estimating Curtain Wall Construction Duration in High-Rise Buildings." Buildings 15, no. 4 (2025): 583. https://doi.org/10.3390/buildings15040583.

Full text
Abstract:
Reliable project management during the planning stages of a building project is a meticulous process typically requiring a sufficient body of precedent projects. Typical construction duration estimation is based on previous cases of similar projects used to validate construction duration proposals from contractors, plan overall project duration, and set a standard for project success or failure. In cases of high-rise buildings exceeding 200 m, insufficient data commonly arise from the rarity of such projects, leading to a rough estimation of construction duration. Therefore, in this study, oversampling and data augmentation techniques derived from engineering principles, such as parametric optimization and data imbalance problems, are explored for curtain wall construction in high-rise buildings. The study was conducted in two phases. First, oversampling and data augmentation techniques, including Latin Hypercube, optimal Latin Hypercube, simple Monte Carlo, descriptive Monte Carlo, Sobol Monte Carlo, the synthetic minority oversampling technique (SMOTE), and SMOTE-Tomek, were applied to 15 raw datasets collected from previous projects. The dataset was split 8:2 into training and testing data, and the mentioned techniques were applied to generate 500 virtual samples from the training data. Second, support vector regression was applied to forecast construction duration, with statistical performance criteria applied for evaluation. The results showed that SMOTE and SMOTE-Tomek best represented the original dataset based on box plot analysis of the data distribution. Moreover, according to the statistical performance criteria, the oversampling techniques improved the prediction performance: the Pearson correlations for the linear, polynomial, and RBF kernels increased by 0.611%, 4.232%, and 0.594%, respectively, for the best-performing sampling method. Finally, for the prediction models, probabilistic oversampling methods outperformed the other methods according to the statistical performance criteria.
APA, Harvard, Vancouver, ISO, and other styles
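The SMOTE family of oversamplers mentioned above generates synthetic minority samples by interpolating between existing ones. A simplified pure-Python sketch of that idea (interpolating between random minority pairs rather than k-nearest neighbours as the original SMOTE does; all names are illustrative):

```python
import random

def interpolation_oversample(minority, k_new, seed=0):
    """Create k_new synthetic points, each on the line segment between a
    random pair of minority-class samples (a simplified SMOTE-style step)."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(k_new):
        a, b = rng.sample(minority, 2)  # two distinct minority samples
        t = rng.random()                # interpolation factor in [0, 1)
        synthetic.append(tuple(ai + t * (bi - ai) for ai, bi in zip(a, b)))
    return synthetic
```

By convexity, every synthetic point stays inside the bounding box of the original minority samples, which is why such oversamplers densify rather than extrapolate the minority region.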
32

Wei, Xin, Yi Zhong Wu, and Li Ping Chen. "Global Optimization Method Based on Incremental Radial Basis Functions." Applied Mechanics and Materials 121-126 (October 2011): 3950–54. http://dx.doi.org/10.4028/www.scientific.net/amm.121-126.3950.

Full text
Abstract:
Global optimization techniques have been used extensively due to their capability in handling complex engineering problems. Metamodels have become an effective means of enhancing global optimization. In this paper, we propose a new global optimization method based on an incremental metamodel. At each sampling step, we adopt an inherited Latin Hypercube design to add sample points step by step, and propose a new incremental metamodel that updates the coefficient matrix gradually. Experiments show that the global optimization method is highly efficient and can find the global minimum quickly.
APA, Harvard, Vancouver, ISO, and other styles
33

Zeng, Wei, Xian Chao Wang, and Ying Sheng Wang. "Surrogating for High Dimensional Computationally Expensive Multi-Modal Functions with Elliptical Basis Function Models." Applied Mechanics and Materials 733 (February 2015): 880–84. http://dx.doi.org/10.4028/www.scientific.net/amm.733.880.

Full text
Abstract:
In the engineering design process, approximation techniques can guarantee fitting precision, speed up the design process, and reduce design costs. To a certain extent, surrogate models can gradually replace time-consuming, high-accuracy computational fluid dynamics analyses. In this paper, we adopt an Optimal Latin Hypercube Sampling experimental design strategy to determine the sample space and the error-analysis test samples, apply an infilling criterion based on the maximum error to improve the accuracy of the surrogate model, test unimodal and multimodal expensive functions of 10, 20, and 30 dimensions, and study the performance and scope of the EBF-NN surrogate model based on the infilling criterion by comparison with the RBF-NN surrogate model.
APA, Harvard, Vancouver, ISO, and other styles
34

Mišurović, Filip, and Saša Mujović. "Numerical Probabilistic Load Flow Analysis in Modern Power Systems with Intermittent Energy Sources." Energies 15, no. 6 (2022): 2038. http://dx.doi.org/10.3390/en15062038.

Full text
Abstract:
Renewable resources integration through distributed generation (DG) affects conventional consideration of power system performance and confronts deterministic load flow (DLF) analysis with serious challenges. The DLF gives a snapshot of the system state, neglecting all of the uncertainties arising from intermittent DG driven by variable weather conditions or volatile consumption. Therefore, with the aim of finer tracking and presentation of system variables, a probabilistic load flow (PLF) approach should be adopted. First, this article gives a literature overview of different PLF techniques. It focuses on numerical techniques, examining them for simple random and Latin Hypercube sampling, widely applied in previous works, and proposes a method combining Monte Carlo simulations with Halton quasi-random numbers. Stochastic modelling is performed for solar and wind power output. For method comparison and confirmation of the applicability of the suggested PLF method with Halton sequences, different IEEE test cases were used, all modified by attaching DGs. A more profound method assessment is conducted by discussing different renewables penetration levels and processing times. The overall simulation outcomes show that the results of the Halton method are of similar precision to those of the generally used Latin Hypercube method, indicating the relevance of the proposed method and its potential for application in contemporary system analysis.
APA, Harvard, Vancouver, ISO, and other styles
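The Halton quasi-random numbers proposed in the abstract above are built by pairing van der Corput radical-inverse sequences in coprime bases, one base per dimension. A minimal sketch (an illustration, not the authors' code):

```python
def radical_inverse(i, base):
    """i-th element (1-indexed) of the van der Corput sequence in `base`:
    reflect the base-`base` digits of i about the radix point."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def halton_point(i, bases=(2, 3)):
    """i-th point of the Halton sequence; the bases must be pairwise coprime."""
    return tuple(radical_inverse(i, b) for b in bases)
```

The first few base-2 elements are 1/2, 1/4, 3/4, 1/8, ..., which fill the unit interval far more evenly than pseudo-random draws; this low-discrepancy property is what the quasi-Monte Carlo PLF method exploits.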
35

Hou, Zenghao, and Joyoung Lee. "Multi-Thread Optimization for the Calibration of Microscopic Traffic Simulation Model." Transportation Research Record: Journal of the Transportation Research Board 2672, no. 20 (2018): 98–109. http://dx.doi.org/10.1177/0361198118796395.

Full text
Abstract:
This paper proposes an innovative multi-thread stochastic optimization approach for the calibration of microscopic traffic simulation models. Combining Quasi-Monte Carlo (QMC) sampling and the Particle Swarm Optimization (PSO) algorithm, the proposed approach, namely the Quasi-Monte Carlo Particle Swarm (QPS) calibration method, is designed to boost the searching process without prejudice to the calibration accuracy. Given the search space constructed by the combinations of simulation parameters, the QMC sampling technique filters the search space, followed by multi-thread optimization through the PSO algorithm. A systematic framework for the implementation of the QPS method is developed and applied to a case study dealing with a large-scale simulation model covering a 6-mile stretch of Interstate Highway 66 (I-66) in Fairfax, Virginia. The case study results show that the proposed QPS method outperforms other methods utilizing the Genetic Algorithm and Latin Hypercube Sampling in achieving faster convergence to an optimal calibration parameter set.
APA, Harvard, Vancouver, ISO, and other styles
36

Kumkam, Noppawit, and Suwin Sleesongsom. "Reliability-Based Topology Optimization with a Proportional Topology for Reliability." Aerospace 11, no. 6 (2024): 435. http://dx.doi.org/10.3390/aerospace11060435.

Full text
Abstract:
This research proposes an efficient technique for reliability-based topology optimization (RBTO), which deals with uncertainty and employs proportional topology optimization (PTO) to achieve the optimal reliability structure. The recent technique, called proportional topology optimization for reliability (PTOr), uses Latin hypercube sampling (LHS) for uncertainty quantification (UQ). The difficulty of the double-loop nested problem in UQ with LHS can be alleviated by the power of PTO, enabling RBTO to be performed easily. A rigorous advantage of PTOr is its ability to accomplish topology optimization (TO) without gradient information, making it faster than TO with evolutionary algorithms. In particular, for reliability-based topology design, evolutionary techniques often fail to achieve satisfactory results compared with gradient-based techniques; before the recent PTOr advancement, which enhances RBTO performance, this was unattainable. Test problems, including an aircraft pylon, demonstrate its performance. Furthermore, the proposed efficient framework facilitates easy integration with other uncertainty quantification techniques, increasing its performance in uncertainty quantification. Lastly, this research provides computer programs for newcomers studying cutting-edge knowledge in engineering design, including UQ, TO, and RBTO, in a simple manner.
APA, Harvard, Vancouver, ISO, and other styles
37

Balasubramanian, Aswin, Floran Martin, Md Masum Billah, Osaruyi Osemwinyen, and Anouar Belahcen. "Application of Surrogate Optimization Routine with Clustering Technique for Optimal Design of an Induction Motor." Energies 14, no. 16 (2021): 5042. http://dx.doi.org/10.3390/en14165042.

Full text
Abstract:
This paper proposes a new surrogate optimization routine for optimal design of a direct on line (DOL) squirrel cage induction motor. The geometry of the motor is optimized to maximize its electromagnetic efficiency while respecting the constraints, such as output power and power factor. The routine uses the methodologies of Latin-hypercube sampling, a clustering technique and a Box–Behnken design for improving the accuracy of the surrogate model while efficiently utilizing the computational resources. The global search-based particle swarm optimization (PSO) algorithm is used for optimizing the surrogate model and the pattern search algorithm is used for fine-tuning the surrogate optimal solution. The proposed surrogate optimization routine achieved an optimal design with an electromagnetic efficiency of 93.90%, for a 7.5 kW motor. To benchmark the performance of the surrogate optimization routine, a comparative analysis was carried out with a direct optimization routine that uses a finite element method (FEM)-based machine model as a cost function.
APA, Harvard, Vancouver, ISO, and other styles
38

Rajendra, M., and K. Shankar. "Damage Identification of Multimember Structure using Improved Neural Networks." International Journal of Manufacturing, Materials, and Mechanical Engineering 3, no. 3 (2013): 57–75. http://dx.doi.org/10.4018/ijmmme.2013070104.

Full text
Abstract:
A novel two stage Improved Radial Basis Function (IRBF) neural network for the damage identification of a multimember structure in the frequency domain is presented. The improvement of the proposed IRBF network is carried out in two stages. Conventional RBF network is used in the first stage for preliminary damage prediction and in the second stage reduced search space moving technique is used to minimize the prediction error. The network is trained with fractional frequency change ratios (FFCs) and damage signature indices (DSIs) as effective input patterns and the corresponding damage severity values as output patterns. The patterns are searched at different damage levels by Latin hypercube sampling (LHS) technique. The performance of the novel IRBF method is compared with the conventional RBF and Genetic algorithm (GA) methods and it is found to be a good multiple member damage identification strategy in terms of accuracy and precision with less computational effort.
APA, Harvard, Vancouver, ISO, and other styles
39

Hlobilová, Adéla, and Matěj Lepš. "Parameter Study on Subset Simulation for Reliability Assessment." Advanced Materials Research 1144 (March 2017): 128–35. http://dx.doi.org/10.4028/www.scientific.net/amr.1144.128.

Full text
Abstract:
A small probability of failure characterizes a good structural design. Predicting such structural safety is time consuming when sampling methods such as the Monte Carlo method or Latin Hypercube sampling are used; therefore, more specialized methods have been developed. Subset simulation is one of the newer techniques, based on modifying the failure event into an intersection of nested intermediate events that are easier to solve. This paper deals with a parameter study of Subset simulation with the modified Metropolis algorithm for Markov chain Monte Carlo using distinct proposal distributions. The different settings are then compared on reliability assessment benchmarks, namely on two mathematical functions with different failure probabilities and on a 23-bar planar truss bridge.
APA, Harvard, Vancouver, ISO, and other styles
40

Ao, Liang Bo, Lei Li, Yuan Sheng Li, Zhi Xun Wen, and Zhu Feng Yue. "Multi-Objective Design Optimization of Cooling Turbine Blade Based on Kriging Model." Applied Mechanics and Materials 184-185 (June 2012): 316–19. http://dx.doi.org/10.4028/www.scientific.net/amm.184-185.316.

Full text
Abstract:
The multi-objective design optimization of a cooling turbine blade is studied using a Kriging model. The optimization model is created with the diameter of the pin fin at the trailing edge of the cooling turbine blade and the location, width, and height of the rib as design variables, and the blade body temperature, flow resistance loss, and aerodynamic efficiency as optimization objectives. The sample points are selected using the Latin hypercube sampling technique, the approximate model is created using the Kriging method, and the set of Pareto-optimal solutions of the optimization objectives is obtained from the multi-objective optimization model using the elitist non-dominated sorting genetic algorithm (NSGA-II) based on the approximate model. The result shows that the conflict among the optimization objectives is resolved effectively, demonstrating the feasibility of the optimization method.
APA, Harvard, Vancouver, ISO, and other styles
41

Le, Thai Son, Jung Won Huh, Jin Hee Ahn, and Achintya Haldar. "Damage State Identification and Seismic Fragility Evaluation of the Underground Tunnels." Applied Mechanics and Materials 775 (July 2015): 274–78. http://dx.doi.org/10.4028/www.scientific.net/amm.775.274.

Full text
Abstract:
An efficient seismic fragility assessment method is proposed for underground tunnel structures in this paper. The ground response acceleration method for buried structures (GRAMBS), an efficient quasi-static method considering the soil-structure interaction (SSI) effect, is used in the proposed approach to estimate the dynamic response behavior of underground tunnels. In addition, pushover analyses are conducted to identify the damage states of the tunnels, and the Latin Hypercube sampling technique is used to consider uncertainties in the design variables. A large set of artificial ground motions satisfying a design spectrum for a specific earthquake intensity is generated, and fragility curves are developed. The seismic fragility curves are represented by a two-parameter lognormal distribution function, and its two parameters, namely the median and log-standard deviation, are estimated using the maximum likelihood estimation method.
APA, Harvard, Vancouver, ISO, and other styles
42

Yepes-Bellver, Lorena, Alejandro Brun-Izquierdo, Julián Alcalá, and Víctor Yepes. "Embodied Energy Optimization of Prestressed Concrete Road Flyovers by a Two-Phase Kriging Surrogate Model." Materials 16, no. 20 (2023): 6767. http://dx.doi.org/10.3390/ma16206767.

Full text
Abstract:
This study aims to establish a methodology for optimizing embodied energy while constructing lightened road flyovers. A cross-sectional analysis is conducted to determine design parameters through an exhaustive literature review. Based on this analysis, key design variables that can enhance the energy efficiency of the slab are identified. The methodology is divided into two phases: a statistical technique known as Latin Hypercube Sampling is initially employed to sample deck variables and create a response surface; subsequently, the response surface is fine-tuned through a Kriging-based optimization model. Consequently, a methodology has been developed that reduces the energy cost of constructing lightened slab bridge decks. Recommendations to improve energy efficiency include employing high slenderness ratios (approximately 1/28), minimizing concrete and active reinforcement usage, and increasing the amount of passive reinforcement.
APA, Harvard, Vancouver, ISO, and other styles
43

Štefaňák, Jan, and Lumír Miča. "DESIGN PARAMETERS ASSESSMENT FOR GROUND ANCHORS IN TERTIERY FLUVIAL CLAYS." Engineering Structures and Technologies 6, no. 3 (2014): 106–13. http://dx.doi.org/10.3846/2029882x.2014.987948.

Full text
Abstract:
The design of the bond length of a postgrouted prestressed anchor is based on the presumption of bond shear stress at the contact of the bond skin and the surrounding soil. A statistical analysis of measured test data was performed, including the Latin Hypercube Sampling technique, to specify the shear stress parameter for the geotechnical conditions of Neogene fluvial clays of stiff to very stiff consistency. Information about anchors installed in clays was abstracted from acceptance test records provided by TOPGEO, a Czech special foundations contractor. All tested anchors were used as supports of temporary retaining structures on a construction site in Olomouc, Czech Republic. The results of the analysis are expressed as a confidence interval estimate of the mean of the bond shear stress parameter. This parameter is usable for the future design of anchors installed in analogous geotechnical conditions. The influence of technological aspects of the construction process on the results is also briefly discussed.
APA, Harvard, Vancouver, ISO, and other styles
44

Seo, Jong-Beom, Hosaeng Lee, and Sang-Jo Han. "A Design Optimization of Organic Rankine Cycle Turbine Blades with Radial Basis Neural Network." Energies 17, no. 1 (2023): 26. http://dx.doi.org/10.3390/en17010026.

Full text
Abstract:
In the present study, a 100 kW organic Rankine cycle is suggested to recover heat energy from commercial ships. A radial-type turbine is employed with R1233zd(E) and back-to-back layout. To improve the performance of an organic Rankine power system, the efficiency of the turbine is significant. With the conventional approach, the optimization of a turbine requires a considerable amount of time and involves substantial costs. By combining design of experiments, an artificial neural network, and Latin hypercube sampling, it becomes possible to reduce costs and achieve rapid optimization. A radial basis neural network with machine learning technique, known for its advantages of being fast and easily applicable, has been implemented. Using such an approach, an increase in efficiency greater than 1% was achieved with minimal design changes at the first and second turbines.
APA, Harvard, Vancouver, ISO, and other styles
45

Wang, Hongbin, Nurulafiqah Nadzirah Binti Mansor, and Hazlie Bin Mokhlis. "Novel Hybrid Optimization Technique for Solar Photovoltaic Output Prediction Using Improved Hippopotamus Algorithm." Applied Sciences 14, no. 17 (2024): 7803. http://dx.doi.org/10.3390/app14177803.

Full text
Abstract:
This paper introduces a novel hybrid optimization technique aimed at improving the prediction accuracy of solar photovoltaic (PV) outputs using an Improved Hippopotamus Optimization Algorithm (IHO). The IHO enhances the traditional Hippopotamus Optimization (HO) algorithm by addressing its limitations in search efficiency, convergence speed, and global exploration. The IHO algorithm used Latin hypercube sampling (LHS) for population initialization, significantly enhancing the diversity and global search potential of the optimization process. The integration of the Jaya algorithm further refines solution quality and accelerates convergence. Additionally, a combination of unordered dimensional sampling, random crossover, and sequential mutation is employed to enhance the optimization process. The effectiveness of the proposed IHO is demonstrated through the optimization of weights and neuron thresholds in the extreme learning machine (ELM), a model known for its rapid learning capabilities but often affected by the randomness of initial parameters. The IHO-optimized ELM (IHO-ELM) is tested against benchmark algorithms, including BP, the traditional ELM, the HO-ELM, LCN, and LSTM, showing significant improvements in prediction accuracy and stability. Moreover, the IHO-ELM model is validated in a different region to assess its generalization ability for solar PV output prediction. The results confirm that the proposed hybrid approach not only improves prediction accuracy but also demonstrates robust generalization capabilities, making it a promising tool for predictive modeling in solar energy systems.
APA, Harvard, Vancouver, ISO, and other styles
46

Xu, Zhenfa, Fanyu Kong, Kun Zhang, Han Zhu, Jiaqiong Wang, and Ning Qiu. "Optimization of the impeller for hydraulic performance improvement of a high-speed magnetic drive pump." Advances in Mechanical Engineering 14, no. 6 (2022): 168781322211045. http://dx.doi.org/10.1177/16878132221104576.

Full text
Abstract:
Magnetic drive centrifugal pumps have compact structure and lower efficiency than ordinary centrifugal pumps. The surrogate-based optimization technique was applied to improve the performance of a high-speed magnetic drive pump with the help of numerical simulations. Eight geometrical parameters of the impeller were considered as the design variable. About 290 samples of impeller were generated by optimal Latin hypercube sampling (OLHS) method, and the corresponding efficiencies of all the impeller samplings were obtained from numerical simulation. The performance test of the prototype pump was carried out, and the experimental results were in good agreement with the numerical simulation results. The hydraulic efficiency at 1.2 Qd of the magnetic drive pump was set as the optimization objective. Using response surface methodology (RSM), surrogate models were established for the objective functions based on the numerical results. The multi-island genetic algorithm (MIGA) was used to optimize the impeller. The hydraulic efficiency of the optimal impeller at rated flow rate was 72.89%, which was 6.23% higher than the prototype impeller.
APA, Harvard, Vancouver, ISO, and other styles
47

Xu, Zhenfa, Fanyu Kong, Kun Zhang, Han Zhu, Jiaqiong Wang, and Ning Qiu. "Optimization of the impeller for hydraulic performance improvement of a high-speed magnetic drive pump." Advances in Mechanical Engineering 14, no. 6 (2022): 168781322211045. http://dx.doi.org/10.1177/16878132221104576.

Full text
Abstract:
Magnetic drive centrifugal pumps have compact structure and lower efficiency than ordinary centrifugal pumps. The surrogate-based optimization technique was applied to improve the performance of a high-speed magnetic drive pump with the help of numerical simulations. Eight geometrical parameters of the impeller were considered as the design variable. About 290 samples of impeller were generated by optimal Latin hypercube sampling (OLHS) method, and the corresponding efficiencies of all the impeller samplings were obtained from numerical simulation. The performance test of the prototype pump was carried out, and the experimental results were in good agreement with the numerical simulation results. The hydraulic efficiency at 1.2 Qd of the magnetic drive pump was set as the optimization objective. Using response surface methodology (RSM), surrogate models were established for the objective functions based on the numerical results. The multi-island genetic algorithm (MIGA) was used to optimize the impeller. The hydraulic efficiency of the optimal impeller at rated flow rate was 72.89%, which was 6.23% higher than the prototype impeller.
APA, Harvard, Vancouver, ISO, and other styles
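The OLHS design in Xu et al. stratifies each impeller parameter so that every sample occupies a distinct interval in every dimension. A minimal plain-LHS sketch in Python (NumPy only) is shown below; the "optimal" variant would additionally optimize a space-filling criterion such as maximin distance, which is omitted here, and the 290-by-8 sizes merely echo the abstract:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=None):
    """Plain LHS on the unit hypercube: each of the n_samples equal-width
    strata along every dimension contains exactly one point."""
    rng = np.random.default_rng(seed)
    u = rng.random((n_samples, n_dims))                  # offset within stratum
    pts = (np.arange(n_samples)[:, None] + u) / n_samples
    for d in range(n_dims):                              # decouple the dimensions
        pts[:, d] = rng.permutation(pts[:, d])
    return pts

def scale_to_box(pts, lower, upper):
    """Map unit-cube samples onto the design box [lower, upper]."""
    return np.asarray(lower) + pts * (np.asarray(upper) - np.asarray(lower))

# e.g. 290 designs over 8 normalized impeller parameters, as in the abstract
X = latin_hypercube(290, 8, seed=1)
```

Each of the 8 columns of `X` contains exactly one value per stratum of width 1/290, which is the stratification property that distinguishes LHS from plain Monte Carlo sampling.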
48

Li, Xiao Yan, Ran Zhang, Xin Zhao, and Hai Nian Wang. "Sensitivity Analysis of Flexible Pavement Parameters by Mechanistic-Empirical Design Guide." Applied Mechanics and Materials 590 (June 2014): 539–45. http://dx.doi.org/10.4028/www.scientific.net/amm.590.539.

Full text
Abstract:
Sensitivity analysis of flexible-pavement performance can be used to study the influence of each parameter, which helps with optimization design and performance evaluation of the pavement structure. In this research, three-asphalt-layer and four-asphalt-layer structures were selected to study the sensitivity of traffic and material parameters based on the MEPDG method. Computer-aided engineering simulation techniques, such as the Latin hypercube sampling (LHS) technique and multiple regression analysis, were used as auxiliary tools. The single-factor sensitivity analysis method was used to calculate the sensitivity of traffic and material parameters and to validate the result of the sensitivity analysis in the multiple regression process based on MEPDG. The results show that, among the traffic parameters, two-way AADT, dual tire spacing, and vehicle operational speed are more significant than the others. The largest effects on rutting are due to Air Void--2, Air Void--1, Poisson's Ratio--1, and Poisson's Ratio--2. The picture differs for cracking, where the largest effects are due to Air Void--1, Poisson's Ratio--1, and Effective Binder Content--1.
APA, Harvard, Vancouver, ISO, and other styles
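The LHS-plus-multiple-regression workflow in Li et al. can be illustrated on a toy response. Everything below (the response function, the sample sizes, the coefficients) is a hypothetical stand-in for the MEPDG distress models; the point is only the mechanics of ranking parameter sensitivity by standardized regression coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 4   # LHS runs and number of input parameters (both made up)

# LHS design: a random permutation of strata per column, jittered within strata
X = (np.argsort(rng.random((n, k)), axis=0) + rng.random((n, k))) / n

# Hypothetical response standing in for a pavement distress model:
# parameter 0 dominates, parameter 2 matters, parameters 1 and 3 are inert
y = 3.0 * X[:, 0] - 1.5 * X[:, 2] + 0.2 * rng.standard_normal(n)

# Multiple regression on standardized variables; |coefficient| ranks sensitivity
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()
beta, *_ = np.linalg.lstsq(np.c_[np.ones(n), Xs], ys, rcond=None)
ranking = np.argsort(-np.abs(beta[1:]))   # most influential parameter first
```

With this construction the regression recovers parameter 0 as most influential and parameter 2 as second, mirroring how the paper validates its single-factor results against the regression-based ranking.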
49

Wang, Rui, Yu Guang Xie, Kai Xie, and Ya Qiao Luo. "Thermal Unit Commitment Problem with Wind Power and Energy Storage System." Applied Mechanics and Materials 347-350 (August 2013): 1455–61. http://dx.doi.org/10.4028/www.scientific.net/amm.347-350.1455.

Full text
Abstract:
This paper presents a methodology for solving the unit commitment (UC) problem for thermal units integrated with wind power and a generalized energy storage system (ESS). The ESS is introduced to achieve peak-load shaving and reduce the operating cost. The volatility of wind power is simulated by multiple scenarios generated by Latin hypercube sampling. Meanwhile, a scenario reduction technique based on a probability metric is introduced to reduce the number of scenarios so that the computational burden is alleviated. The thermal UC problem with volatile wind power and ESS is transformed into a deterministic optimization, formulated as a mixed-integer convex program and solved by a branch-and-bound interior-point method. During the branch-and-bound process, best-first search and depth-first search are combined to expedite the computation. The effectiveness of the proposed algorithm is demonstrated on a ten-unit UC problem.
APA, Harvard, Vancouver, ISO, and other styles
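The scenario workflow in Wang et al., stratified scenario generation followed by probability-metric reduction, can be sketched as a greedy fast-forward selection. The wind-profile model, the sizes, and the Euclidean distance choice below are illustrative assumptions, not the paper's exact formulation (a real implementation would push the LHS quantiles through the forecast-error distribution):

```python
import numpy as np

rng = np.random.default_rng(3)
n_scen, horizon, n_keep = 50, 24, 5        # made-up sizes

# Stratified (LHS) wind scenarios around a made-up 24-hour forecast, +/-30%
forecast = 60 + 20 * np.sin(np.linspace(0, np.pi, horizon))   # MW
u = (np.argsort(rng.random((n_scen, horizon)), axis=0)
     + rng.random((n_scen, horizon))) / n_scen
scenarios = forecast * (0.7 + 0.6 * u)
prob = np.full(n_scen, 1.0 / n_scen)

# Fast-forward reduction: greedily keep the scenario that minimizes the
# probability-weighted distance of all scenarios to the kept set
dist = np.linalg.norm(scenarios[:, None, :] - scenarios[None, :, :], axis=2)
kept = []
for _ in range(n_keep):
    best, best_cost = None, np.inf
    for c in range(n_scen):
        if c in kept:
            continue
        cost = float(prob @ dist[:, kept + [c]].min(axis=1))
        if cost < best_cost:
            best, best_cost = c, cost
    kept.append(best)

# Discarded probability mass is redistributed to the nearest kept scenario
nearest = dist[:, kept].argmin(axis=1)
new_prob = np.zeros(n_keep)
for s in range(n_scen):
    new_prob[nearest[s]] += prob[s]
```

The reduced set `scenarios[kept]` with weights `new_prob` then replaces the full scenario fan in the deterministic-equivalent UC problem, which is where the computational saving comes from.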
50

Pholdee, Nantiwat, and Sujin Bureerat. "Surrogate-Assisted Evolutionary Optimizers for Multiobjective Design of a Torque Arm Structure." Applied Mechanics and Materials 101-102 (September 2011): 324–28. http://dx.doi.org/10.4028/www.scientific.net/amm.101-102.324.

Full text
Abstract:
This paper presents two surrogate-assisted optimization strategies for structural constrained multiobjective optimization. The strategies are based on hybridizing multiobjective population-based incremental learning (MOPBIL) with radial basis function (RBF) interpolation. The first strategy uses MOPBIL to generate training points, while the second uses a Latin hypercube sampling (LHS) technique. The design case study is the shape and sizing design of a torque arm structure. The design problem is set to minimize structural mass and displacement, while the constraints include stresses due to three different load cases. Structural analysis is carried out by means of a finite element approach. The design problem is then tackled by the proposed surrogate-assisted design strategies. Numerical results show that using MOPBIL to generate training points is superior to using LHS, as measured by a hypervolume indicator and root-mean-square error (RMSE).
APA, Harvard, Vancouver, ISO, and other styles
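The second strategy in Pholdee and Bureerat, an RBF surrogate trained on LHS points, can be sketched with a Gaussian RBF interpolant. The toy objective, the sample sizes, and the shape parameter `eps` below are assumptions standing in for the finite element mass and stress responses:

```python
import numpy as np

def rbf_fit(X, y, eps=3.0):
    """Gaussian RBF interpolant through the training points (X, y)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    w = np.linalg.solve(np.exp(-(eps * d) ** 2), y)
    def predict(Xq):
        dq = np.linalg.norm(Xq[:, None, :] - X[None, :, :], axis=2)
        return np.exp(-(eps * dq) ** 2) @ w
    return predict

rng = np.random.default_rng(7)
n, k = 40, 2
# LHS training points on the unit square, as in the paper's second strategy
X = (np.argsort(rng.random((n, k)), axis=0) + rng.random((n, k))) / n
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2   # toy stand-in for an FE response
surrogate = rbf_fit(X, y, eps=3.0)
```

Because the interpolant solves the kernel system exactly, `surrogate(X)` reproduces the training values `y`; the quality of the surrogate elsewhere in the design space is what the paper's RMSE comparison measures.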