Dissertations / Theses on the topic 'Geotechnical Site Investigation'

Consult the top 26 dissertations / theses for your research on the topic 'Geotechnical Site Investigation.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Wint, Joanne. "Geotechnical site investigation of vegetated slopes." Thesis, Nottingham Trent University, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.429261.

Full text
2

Ryan, Christopher R. "Geotechnical investigation of Montrose wetland site." Morgantown, W. Va. : [West Virginia University Libraries], 2004. https://etd.wvu.edu/etd/controller.jsp?moduleName=documentdata&jsp%5FetdId=3723.

Full text
Abstract:
Thesis (M.S.)--West Virginia University, 2004.
Title from document title page. Document formatted into pages; contains xii, 191 p. : ill. (some col.), maps (some col.). Vita. Includes abstract. Includes bibliographical references (p. 117-119).
3

Ibrahim, Jwan Abdul Razzak. "The application of knowledge based technology to geotechnical site investigation." Thesis, Heriot-Watt University, 1993. http://hdl.handle.net/10399/1420.

Full text
4

Oliver, Andy. "A knowledge based system for the interpretation of site investigation information." Thesis, Durham University, 1994. http://etheses.dur.ac.uk/969/.

Full text
5

Kim, Jong Hee. "Improvement of geotechnical site investigations via statistical analyses and simulation." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/41218.

Full text
Abstract:
The purpose of this study is to improve site investigation in geotechnical engineering via the evaluation and development of statistical approaches for characterizing the spatial variability of soil properties and the development of site investigation simulation software for educational use. This study consists of four components: statistical characteristics, data measurement, simulation, and educational training. Statistical measures of spatial variability of soil properties were examined for three different geographical areas where soil formation processes differ to assess the influence on the spatial variability of soils. Statistical measures of spatial variability were also calculated for a case history where blasting was used as a method of soil improvement to evaluate the effects of man-made changes to soil structure. The concept of spatial aliasing was employed to estimate the maximum allowable sampling interval for field data as a function of the spatial correlation properties. Once a maximum statistically allowable sampling interval is determined for a specific soil property, the minimum statistically required number of soundings / borings is calculated to perform an economical site investigation at a specific site. A simple and efficient simulation technique was proposed to generate correlated, multi-dimensional simulations of soil properties. Based on limited data, the proposed simulation technique generated accurate and correlated simulations of soil properties that are consistent with the observed or proposed correlation structures of soil properties. Lastly, a geotechnical site investigation simulation program with a wide variety of in situ and laboratory tests was developed to allow students to plan and perform a comprehensive site investigation program. The simulation generates an input file based partly on the statistical characteristics of the spatial variability of soil properties analyzed in this study and partly on traditional values. Spatial variability in soil properties is modeled via correlated random fields, interpolation, and a decomposition method to yield realistic geotechnical data. Via the simulation, students are able to obtain experience and judgment in an essential component of geotechnical engineering practice. The four components of this research (statistical characteristics, data measurement, simulation, and educational training) focus on the improvement of site investigation performance in geotechnical engineering, thereby improving reliability analysis in geotechnical practice.
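To make the sampling-interval step concrete, here is a minimal sketch of how a scale of fluctuation can be estimated from regularly spaced cone-resistance data and turned into a maximum sampling interval. It assumes an exponential autocorrelation model and a simple "sample at least twice per scale of fluctuation" rule; the thesis's actual spatial-aliasing criterion and data are not reproduced here.

```python
import numpy as np

def sample_autocorrelation(values, max_lag):
    """Sample autocorrelation of a detrended, regularly spaced soil-property profile."""
    x = np.asarray(values, dtype=float)
    x = x - x.mean()
    var = np.mean(x * x)
    return np.array([np.mean(x[:len(x) - k] * x[k:]) / var for k in range(max_lag + 1)])

def scale_of_fluctuation(rho, dz):
    """Fit an exponential model rho(tau) = exp(-2*tau/theta) by least squares on positive lags."""
    lags = np.arange(1, len(rho)) * dz
    pos = rho[1:] > 0.05                          # ignore the noisy tail of the correlogram
    slope = np.polyfit(lags[pos], np.log(rho[1:][pos]), 1)[0]
    return -2.0 / slope                           # theta, the scale of fluctuation

# Example: synthetic cone-resistance residuals sampled every 0.5 m (illustrative data only)
rng = np.random.default_rng(0)
dz, n, theta_true = 0.5, 200, 4.0
cov = np.exp(-2.0 * np.abs(np.subtract.outer(np.arange(n), np.arange(n))) * dz / theta_true)
qc_residual = np.linalg.cholesky(cov + 1e-10 * np.eye(n)) @ rng.standard_normal(n)

rho = sample_autocorrelation(qc_residual, max_lag=40)
theta = scale_of_fluctuation(rho, dz)
max_spacing = theta / 2.0    # assumed anti-aliasing rule: at least two samples per scale of fluctuation
print(f"estimated theta = {theta:.2f} m, suggested max sampling interval = {max_spacing:.2f} m")
```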
6

Gillon, Rosemary Jayne Browning. "The role of the ROV within integrated geotechnical and hydrographic site investigation." Thesis, University of Plymouth, 2002. http://hdl.handle.net/10026.1/1116.

Full text
Abstract:
The acquisition of marine survey data is traditionally undertaken from surface vessels, including boats and temporary rigs. Translation of these techniques to the nearshore zone is a complex task and requires equipment adaptation and, often, the sacrifice of data coverage. The remotely operated vehicle (ROV) offers the potential for overcoming some of the standard nearshore survey concerns, providing remote intervention and data acquisition in areas of restricted access. In situ testing is the most efficient and reliable method of acquiring data with minimal sediment disturbance effects. Research has been undertaken into the viability of nearshore cone penetration testing (CPT), which has shown the T-bar flow-round penetrometer to be a possible solution. Data could be acquired in sediments with undrained shear strengths of up to 300 kPa from a bottom-crawling ROV weighing 260 kgf and measuring 1 m in length by 0.6 m in width. The collection of sediment cores may be necessary in areas requiring ground truthing for geophysical or in situ investigations. A pneumatic piston corer has been designed and manufactured and is capable of collecting sediment cores up to 400 mm in length and 38 mm in diameter in sediment with an undrained shear strength of 17 kPa. To ascertain additional sediment characteristics in situ, a resistivity sub-bottom profiling system has also been designed and tested, allowing discrimination between sediment types ranging in size from gravel to silt. The integration of equipment and testing procedures can be further developed through the use of integrated data management approaches such as geographical information systems (GIS). An off-the-shelf GIS, ArcInfo 8, was used to create a GIS containing typical nearshore data using the Dart estuary as a case study location.
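As a hedged illustration of how T-bar penetration data of this kind are commonly reduced to strength, the short sketch below applies the standard bearing-factor relation su = q_net / N_T-bar. The factor of 10.5 and the example resistance are assumptions for illustration, not values taken from this thesis.

```python
def tbar_undrained_strength(q_net_kpa, n_tbar=10.5):
    """Undrained shear strength from net T-bar penetration resistance: su = q_net / N_T-bar."""
    return q_net_kpa / n_tbar

# e.g. a measured net resistance of 520 kPa implies su of roughly 50 kPa;
# N_T-bar should always be calibrated for the deployment conditions.
print(f"su = {tbar_undrained_strength(520.0):.1f} kPa")
```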
7

Ozyurt, Gokhan. "Cataloging And Statistical Evaluation Of Common Mistakes In Geotechnical Investigation Reports For Buildings On Shallow Foundations." Master's thesis, METU, 2012. http://etd.lib.metu.edu.tr/upload/12615084/index.pdf.

Full text
Abstract:
Information presented in site investigation reports has a strong influence on design, project costs and safety. For this reason, both the quality and the reliability of site investigation reports are important. However, in our country geotechnical engineering is relegated to second place, and site investigation studies, especially parcel-based ground investigation works, do not receive the attention they deserve. In this study, site investigation reports that are required for the licensing of design projects are examined, and the missing or incorrect site investigations, laboratory tests, geotechnical evaluations and geotechnical suggestions that occur in the reports are catalogued. The frequency of each mistake is also statistically examined. For geotechnical engineers, recommendations and solutions are presented to help them avoid frequent problems.
8

Albatal, Ali Hefdhallah Ali. "Advancement of Using Portable Free Fall Penetrometers for Geotechnical Site Characterization of Energetic Sandy Nearshore Areas." Diss., Virginia Tech, 2018. http://hdl.handle.net/10919/94608.

Full text
Abstract:
Portable Free Fall Penetrometers (PFFPs) are lightweight tools used for rapid and economic characterization of surficial subaqueous sediments. PFFPs vary in weight, shape and size with options for using add-on units. The different configurations enable deployments in various environments and water depths, including the nearshore zone where conventional methods are challenged by energetic hydrodynamics and limited navigable depth. Moreover, PFFPs offer an opportunity to reduce the high site investigation costs associated with conventional offshore geotechnical site investigation methods. These costs are often a major obstacle for small projects serving remote communities or testing novel renewable energy harvesting machines. However, PFFPs still face issues regarding data analysis and interpretation, particularly in energetic sandy nearshore areas. This includes a lack of data and accepted analysis methods for such environments. Therefore, the goal of this research was to advance data interpretation and sediments characterization methods using PFFPs with emphasis on deployments in energetic nearshore environments. PFFP tests were conducted in the nearshore areas of: Yakutat Bay, AK; Cannon Beach, AK; and the U.S. Army Corps of Engineers' Field Research Facility's beach, Duck, NC. From the measurements, the research goal was addressed by: (1) introducing a methodology to create a regional sediment classification scheme utilizing the PFFP deceleration and pore pressure measurements, sediment traces on the probe upon retrieval, and previous literature; (2) investigating the effect of wave forcing on the sediments' behavior through correlating variations in sediment strength to wave climate, sandbar migration, and depth of closure, as well as identifying areas of significant sediment mobilization processes; and (3) estimating the relative density and friction angle of sand in energetic nearshore areas from PFFP measurements. For the latter, the field data was supported by vacuum triaxial tests and PFFP deployments under controlled laboratory conditions on sand samples prepared at different relative densities. The research outcomes address gaps in knowledge with regard to the limited studies available that investigate the sand geotechnical properties in energetic nearshore areas. More specifically, the research contributes to the understanding of surficial sediment geotechnical properties in energetic nearshore areas and the enhancement of sediment characterization and interpretation methods.
Ph.D.
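A common step in interpreting free fall penetrometer records, referred to in the abstract above, is converting the measured deceleration into an equivalent quasi-static penetration resistance with a logarithmic strain-rate correction (after Dayal and Allen, 1975). The sketch below is a hedged illustration of that generic step; the correction factor, reference velocity and probe properties are assumptions, not values from this dissertation.

```python
import numpy as np

def quasi_static_resistance(decel_g, velocity, mass, area, k=1.5, v_ref=0.02):
    """
    Convert a PFFP deceleration (in g) to an equivalent quasi-static bearing resistance:
    q_qs = (m * a / A) / (1 + K * log10(v / v_ref)), with v_ref a CPT-like reference rate (m/s).
    """
    g = 9.81
    dynamic = mass * decel_g * g / area                         # dynamic resistance, Pa
    rate_factor = 1.0 + k * np.log10(np.maximum(velocity, v_ref) / v_ref)
    return dynamic / rate_factor

# Example: 8 g deceleration at 4 m/s penetration velocity, 7.7 kg probe, 10 cm^2 projected area
q = quasi_static_resistance(decel_g=8.0, velocity=4.0, mass=7.7, area=10e-4)
print(f"equivalent quasi-static resistance ~ {q / 1e6:.2f} MPa")
```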
9

Martin, John Charles. "The development of a knowledge-based system for the preliminary investigation of contaminated land." Thesis, Durham University, 2001. http://etheses.dur.ac.uk/1234/.

Full text
Abstract:
Large areas of the UK have witnessed intense industrialisation since the industrial revolution in the latter part of the 18th century. Increased environmental awareness and pressure to redevelop brownfield sites have resulted in the majority of civil engineering projects undertaken within the UK encountering some form of contamination. In order to collect the vast amount of information required to assess a potentially contaminated site, a multi-stage site investigation (preliminary investigation, exploratory and detailed investigation) is usually undertaken. The information collected during the investigation allows the three components of the risk assessment process to be identified. These components are the source of contamination, possible pathways for the movement of contaminants, and vulnerable targets on and off site. A prototype knowledge-based system (ATTIC: Assessment Tool for The Investigation of Contaminated land) has been developed to demonstrate that knowledge-based technology can be applied to the preliminary stage of the investigation of contaminated land. ATTIC assesses information collected during the preliminary stage of an investigation (past use, geological maps, hydrological maps, etc.) and assists with the risk assessment process, with the prediction of potential contaminants, hazards and risk to neighbouring areas. The system has been developed using CLIPS software. It consists of four knowledge-bases (source, pathway, target, and health and safety), containing 1600 rules. The knowledge within the knowledge-bases was obtained from two main sources. The initial and main source was the technical literature. Obtaining knowledge from technical literature involved reviewing published material, extracting relevant information and converting it into rules suitable for the knowledge-based system. The second source of knowledge was domain experts, via a knowledge elicitation exercise. The exercise took the form of a questionnaire relating to the rules and parameters within the system. A Visual Basic interface was also developed in conjunction with the knowledge-based system to allow data entry. The interface uses a series of forms relating to different components within the risk assessment process. On completion of compiling the prototype, the system was validated against a number of case studies. The system predicted the likely contaminants with a reasonable match to those observed, even though the input data for the case studies were limited. The assessment of risks to neighbouring target areas was generally in agreement with the case study reports, matching similar risk values and directions. In addition to the development of the prototype system, a database modelled on the Association of Geotechnical Specialists electronic format for the transfer of ground investigation data was also developed to store preliminary investigation information. The data structures were implemented using Microsoft Access relational database management system software. This allowed the database to be developed within a Microsoft Windows environment.
10

Hausmann, Jörg [Verfasser], and Peter [Akademischer Betreuer] Grathwohl. "Parameterisation of the near surface by combined geophysical and direct push techniques in the frame of geotechnical site investigation / Jörg Hausmann ; Betreuer: Peter Grathwohl." Tübingen : Universitätsbibliothek Tübingen, 2014. http://d-nb.info/1196801762/34.

Full text
11

Arque, Armengol Anna. "Comparison between preinvestigations and detailed geotechnical site characterization of City Link, Stockholm." Thesis, KTH, Mark- och vattenteknik (flyttat 20130630), 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-171800.

Full text
Abstract:
A statistical comparison between the pre-investigations and the detailed site characterization performed while tunnelling was carried out in three areas of different rock quality in the Södermalm Tunnel. An overview of the site investigations performed prior to the construction works and their consequences for the tunnelling method is also presented in this study. The statistical analyses in this study showed low correlation between the results obtained from the geotechnical investigations performed prior to and during tunnelling. The correlation diminishes as the rock mass quality decreases; however, in areas where the rock mass quality is high, the correlation is not as high as expected. The low association between those results may be due to diverse factors: the concentration of the pre-investigations in mostly three areas along the trace of the tunnel and the extrapolation of those results to the rest of the tunnel; the inappropriate utilization of the investigation techniques; and the lack of geotechnical data in the regional areas of Stockholm. The inaccurate geological characterization given by the pre-investigations led to great challenges in the most fractured and altered areas of the tunnel. A collapse occurred where glaciofluvial sediments were in contact with the rock. The excavation had to be stopped and additional rock reinforcement had to be applied. Therefore, an increase in time and cost was the major consequence of the inaccurate predictions.
12

Glynn, Mary Eileen 1960. "Geotechnical investigations of two potential sites for the proposed Arizona superconducting super collider." Thesis, The University of Arizona, 1987. http://hdl.handle.net/10150/276641.

Full text
Abstract:
Two sites, around the Maricopa and Sierrita Mountains respectively, were investigated to provide supporting data for the State of Arizona's proposal to the Department of Energy to construct a Superconducting Super Collider (SSC) facility. The main feature of the facility is a 53-mile racetrack-shaped tunnel. The proposed Maricopa SSC tunnel passes through three main types of rock: approximately 35 miles of indurated fanglomerates, 10 miles of granodiorites, and 8 miles of volcanic and sedimentary rocks. The proposed Sierrita SSC tunnel also passes through three main rock types: approximately 19 miles of indurated fanglomerates, 18 miles of granodiorites and granites, and 16 miles of volcanic and associated rocks. Data were obtained from three sources: existing data; field investigations, including drill logs and geophysics; and laboratory testing. Empirical design approaches were compared with rock classifications (RQD, RMR, Q) at the tunnel horizon. Results indicate mostly routine tunneling at both sites. Recommendations are made for further logging and testing of existing core, further field mapping, additional boreholes in rock and alluvium, and in situ testing of alluvium.
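For readers unfamiliar with the first of the rock classifications compared above, the sketch below computes the Rock Quality Designation (RQD) from logged core-piece lengths using its standard definition (intact pieces of 100 mm or more divided by the core-run length). It is a generic illustration with made-up piece lengths, not code or data from the thesis.

```python
def rqd(piece_lengths_mm, run_length_mm):
    """Rock Quality Designation: percentage of a core run made up of intact pieces >= 100 mm."""
    sound = sum(p for p in piece_lengths_mm if p >= 100.0)
    return 100.0 * sound / run_length_mm

# Example core run of 1.5 m recovered as pieces (mm):
pieces = [250, 90, 310, 60, 120, 40, 180, 75]
print(f"RQD = {rqd(pieces, 1500):.0f}%")   # ~57%, 'fair' rock in Deere's scheme
```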
13

Örn, Henrik. "Accuracy and precision of bedrock surface prediction using geophysics and geostatistics." Thesis, KTH, Mark- och vattenteknik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-171859.

Full text
Abstract:
In underground construction and foundation engineering, dealing with uncertainties associated with subsurface properties is inevitable. Site investigations are expensive to perform, but a limited understanding of the subsurface may result in major problems, which often lead to an unexpected increase in the overall cost of the construction project. This study aims to optimize the pre-investigation program to get as much correct information as possible out of a limited input of resources, thus making it as cost effective as possible. To optimize site investigation using soil-rock sounding, three different sampling techniques, a varying number of sample points and two different interpolation methods (inverse distance weighting and point kriging) were tested on four modeled reference surfaces. The accuracy of rock surface predictions was evaluated using a 3D gridding and modeling computer software (Surfer 8.02®). Samples with continuously distributed data, resembling profile lines from geophysical surveys, were used to evaluate how this could improve the accuracy of the prediction compared to adding additional sampling points. The study explains the correlation between the number of sampling points and the accuracy of the prediction obtained using different interpolators. Most importantly, it shows how continuous data significantly improve the accuracy of the rock surface predictions and therefore concludes that geophysical measurements should be used in combination with traditional soil-rock sounding to optimize the pre-investigation program.
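As a concrete illustration of one of the two interpolators compared (inverse distance weighting), the sketch below predicts bedrock level at unsampled points from scattered sounding data. It is a generic implementation under assumed parameters (power p = 2, made-up coordinates), not the study's Surfer configuration.

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    """Inverse distance weighting: z(x0) = sum(w_i * z_i) / sum(w_i), with w_i = 1 / d_i^p."""
    xy_known, z_known, xy_query = map(np.asarray, (xy_known, z_known, xy_query))
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, eps) ** power
    w[d < eps] = 1e12                      # snap to an exactly coincident sample
    return (w * z_known).sum(axis=1) / w.sum(axis=1)

# Soil-rock soundings: (x, y) in metres, z = bedrock level (m)
xy = [(0, 0), (30, 5), (12, 25), (40, 40)]
z = [-3.2, -5.1, -4.0, -6.8]
grid = [(10, 10), (25, 30)]
print(idw(xy, z, grid))
```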
14

Vosolo, David A. "Investigation on geotechnical engineering properties of coal mine spoil subjected to slaking." Thesis, Queensland University of Technology, 2017. https://eprints.qut.edu.au/109789/1/David_Vosolo_Thesis.pdf.

Full text
Abstract:
This research project has led to the development of a new approach to assess the saturated and unsaturated properties of soil material subjected to slaking. Constant saturation, along with overburden pressure, resulted in a larger vertical deformation of the material in the slaking chambers, which is indicative of slaking; these results indicate that material slaking was occurring due to saturation and overburden pressure. This will be of substantial benefit to mining operations, with particular interest in the safety of mine spoil slopes and in limiting failures to protect workers, equipment, and operational costs.
15

Boller, Ronald C. "Geotechnical investigations at three sites in the South Carolina Coastal Plain that did not liquefy during the 1886 Charleston earthquake." Connect to this title online, 2008. http://etd.lib.clemson.edu/documents/1211385017/.

Full text
16

Dong, Yuepeng. "Advanced finite element analysis of deep excavation case histories." Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:fda7c27d-a132-4975-a73d-e8e009ca38bb.

Full text
Abstract:
Deep excavations have been used worldwide for underground construction, but they also alter the ground conditions and induce ground movements which might cause risks to adjacent infrastructure. Field measurements are normally carried out during excavations to ensure their safety, and also provide valuable data to calibrate the results from the numerical analysis which is an effective way to investigate the performance of deep excavations. This thesis is concerned with evaluating the capability of advanced finite element analysis in reproducing various aspects of observed deep excavation behaviour in the field through back analysis of case histories. The finite element model developed considers both geotechnical and structural aspects such as (i) detailed geometry of the excavation and retaining structures, (ii) realistic material models for the soil, structures and the soil-structure interface, and (iii) correct construction sequences. Parametric studies are conducted first based on a simplified square excavation to understand the effect of several important aspects, e.g. (i) the merit of shell or solid elements to model the retaining wall, (ii) the effect of construction joints in the retaining wall, (iii) the effect of the operational stiffness of concrete structural components due to cracks, (iv) the thermal effect of concrete beams and floor slabs during curing process and due to variation of ambient temperature, (v) the effect of soil-structure interface behaviour, and (vi) the effect of stiffness and strength properties of the soil. Two more complex case histories are then investigated through fully 3D analyses to explore the influence of various factors such as (i) neglecting the small-strain stiffness nonlinearity in the soil model, (ii) the selected K_0 value to represent the initial stress state in the ground, (iii) the appropriate anisotropic wall properties to consider the joints in the diaphragm wall, (iv) the parameters governing the settlements of adjacent buildings and buried pipelines, (v) the effectiveness of ground improvement on reducing the building settlement, (vi) the variation of construction sequences, (vii) the effectiveness of earth berms, and (viii) ignoring the openings in the floor slabs. This research has strong practical implications, but cautions should also be taken in applications, e.g. element types and parameter selection.
17

Samui, Pijush. "Geotechnical Site Characterization And Liquefaction Evaluation Using Intelligent Models." Thesis, 2009. https://etd.iisc.ac.in/handle/2005/628.

Full text
Abstract:
Site characterization is an important task in geotechnical engineering. In situ tests based on the standard penetration test (SPT), cone penetration test (CPT) and shear wave velocity survey are popular among geotechnical engineers. Site characterization using any of these properties based on a finite number of in situ test data is an imperative task in probabilistic site characterization. These methods have been used to design future soil sampling programs for the site and to specify the soil stratification. It is never possible to know the geotechnical properties at every location beneath an actual site because, in order to do so, one would need to sample and/or test the entire subsurface profile. Therefore, the main objective of site characterization models is to predict the subsurface soil properties with minimum in situ test data. The prediction of soil properties is a difficult task due to uncertainties. Spatial variability, measurement 'noise', measurement and model bias, and statistical error due to limited measurements are the sources of uncertainty. Liquefaction in soil is one of the other major problems in geotechnical earthquake engineering. It is defined as the transformation of a granular material from a solid to a liquefied state as a consequence of increased pore-water pressure and reduced effective stress. The generation of excess pore pressure under undrained loading conditions is a hallmark of all liquefaction phenomena. This phenomenon was brought to the attention of engineers more so after the Niigata (1964) and Alaska (1964) earthquakes. Liquefaction will cause building settlement or tipping, sand boils, ground cracks, landslides, dam instability, highway embankment failures, or other hazards. Such damages are generally of great concern to public safety and are of economic significance. Site-specific evaluation of the liquefaction susceptibility of sandy and silty soils is a first step in liquefaction hazard assessment. Many methods (intelligent models and simple methods as suggested by Seed and Idriss, 1971) have been suggested to evaluate liquefaction susceptibility based on the large data sets from sites where soil has or has not liquefied. The rapid advance in information processing systems in recent decades directed engineering research towards the development of intelligent models that can model natural phenomena automatically. In an intelligent model, a process of training is used to build up a model of the particular system, from which it is hoped to deduce responses of the system for situations that have yet to be observed. Intelligent models learn the input-output relationship from the data itself. The quantity and quality of the data govern the performance of an intelligent model. The objective of this study is to develop intelligent models [geostatistics, artificial neural network (ANN) and support vector machine (SVM)] to estimate the corrected standard penetration test (SPT) value, Nc, in the three-dimensional (3D) subsurface of Bangalore. The database consists of 766 boreholes spread over a 220 sq km area, with several SPT N values (uncorrected blow counts) in each of them. There are a total of 3015 N values in the 3D subsurface of Bangalore. To get the corrected blow counts, Nc, various corrections such as for overburden stress, size of borehole, type of sampler, hammer energy and length of connecting rod have been applied to the raw N values.
Using a large database of Nc values in the 3D subsurface of Bangalore, three geostatistical models (simple kriging, ordinary kriging and disjunctive kriging) have been developed. Simple and ordinary kriging produce linear estimators, whereas disjunctive kriging produces a nonlinear estimator. The knowledge of the semivariogram of the Nc data is used in the kriging theory to estimate the values at points in the subsurface of Bangalore where field measurements are not available. The capability of disjunctive kriging to be a nonlinear estimator and an estimator of the conditional probability is explored. A cross-validation (Q1 and Q2) analysis is also done for the developed simple, ordinary and disjunctive kriging models. The results indicate that the performance of the disjunctive kriging model is better than that of the simple as well as the ordinary kriging model. This study also describes two ANN modelling techniques applied to predict Nc data at any point in the 3D subsurface of Bangalore. The first technique uses a four-layered feed-forward backpropagation (BP) model to approximate the function Nc = f(x, y, z), where x, y, z are the coordinates of the 3D subsurface of Bangalore. The second technique uses a generalized regression neural network (GRNN) that is trained with suitable spread(s) to approximate the function Nc = f(x, y, z). In this BP model, the transfer functions used in the first and second hidden layers are tansig and logsig respectively. The logsig transfer function is used in the output layer. The maximum number of epochs has been set to 30000. A Levenberg-Marquardt algorithm has been used for the BP model. The performance of the models obtained using both techniques is assessed in terms of prediction accuracy. The BP ANN model outperforms the GRNN model and all kriging models. An SVM model, which is firmly based on statistical learning theory and uses a regression technique with an ε-insensitive loss function, has also been adopted to predict Nc data at any point in the 3D subsurface of Bangalore. The SVM implements the structural risk minimization principle (SRMP), which has been shown to be superior to the more traditional empirical risk minimization principle (ERMP) employed by many other modelling techniques. The present study also highlights the capability of SVM over the developed geostatistical models (simple kriging, ordinary kriging and disjunctive kriging) and ANN models. Further in this thesis, liquefaction susceptibility is evaluated from SPT, CPT and Vs data using BP-ANN and SVM. Intelligent models (based on ANN and SVM) are developed for the prediction of liquefaction susceptibility using SPT data from the 1999 Chi-Chi earthquake, Taiwan. Two models (MODEL I and MODEL II) are developed. The SPT data from the work of Hwang and Yang (2001) has been used for this purpose. In MODEL I, the cyclic stress ratio (CSR) and corrected SPT values (N1)60 have been used for the prediction of liquefaction susceptibility. In MODEL II, only the peak ground acceleration (PGA) and (N1)60 have been used for the prediction of liquefaction susceptibility. Further, the generalization capability of MODEL II has been examined using different case histories available globally (global SPT data) from the work of Goh (1994). This study also examines the capabilities of ANN and SVM to predict the liquefaction susceptibility of soils from CPT data obtained from the 1999 Chi-Chi earthquake, Taiwan. For the determination of liquefaction susceptibility, both ANN and SVM use the classification technique.
The CPT data has been taken from the work of Ku et al. (2004). In MODEL I, the cone tip resistance (qc) and CSR values have been used for the prediction of liquefaction susceptibility (using both ANN and SVM). In MODEL II, only PGA and qc have been used for the prediction of liquefaction susceptibility. Further, the developed MODEL II has also been applied to different case histories available globally (global CPT data) from the work of Goh (1996). Intelligent models (ANN and SVM) have also been adopted for liquefaction susceptibility prediction based on shear wave velocity (Vs). The Vs data has been collected from the work of Andrus and Stokoe (1997). The same procedures (as in SPT and CPT) have been applied for Vs as well. SVM outperforms the ANN model for all three models based on SPT, CPT and Vs data. The CPT method gives better results than SPT and Vs for both ANN and SVM models. For CPT and SPT, two input parameters {PGA and qc or (N1)60} are sufficient to determine the liquefaction susceptibility using the SVM model. In this study, an attempt has also been made to evaluate geotechnical site characterization by carrying out in situ tests using different in situ techniques such as CPT, SPT and multichannel analysis of surface waves (MASW). For this purpose a typical site was selected where a man-made homogeneous embankment as well as natural ground was encountered. For this typical site, in situ tests (SPT, CPT and MASW) have been carried out in different ground conditions and the obtained test results are compared. Three continuous CPT profiles, fifty-four SPT tests and nine MASW profiles with depth have been carried out for the selected site, covering both the homogeneous embankment and natural ground. Relationships have been developed between the Vs, (N1)60 and qc values for this specific site. From the limited test results, it was found that there is a good correlation between qc and Vs. Liquefaction susceptibility is evaluated using the in situ test data from (N1)60, qc and Vs using the ANN and SVM models. It has been shown to compare well with the "Idriss and Boulanger, 2004" approach based on SPT test data. An SVM model has also been adopted to determine the overconsolidation ratio (OCR) based on piezocone data. A sensitivity analysis has been performed to investigate the relative importance of each of the input parameters. The SVM model outperforms all the available methods for OCR prediction.
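A minimal sketch of the SVM-regression idea described above (ε-insensitive loss with an RBF kernel) applied to predicting a corrected SPT value Nc from borehole coordinates is shown below. It uses scikit-learn with synthetic data and illustrative hyperparameters; the thesis's actual Bangalore dataset, kernel choice and tuning are not reproduced.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)

# Synthetic stand-in for (x, y, depth) -> corrected SPT blow count Nc
X = rng.uniform([0, 0, 0], [1000, 1000, 30], size=(300, 3))        # metres
Nc = 5 + 0.8 * X[:, 2] + 0.002 * X[:, 0] + rng.normal(0, 2, 300)   # toy trend plus noise

# Epsilon-insensitive SVM regression, Nc = f(x, y, z)
model = make_pipeline(StandardScaler(),
                      SVR(kernel="rbf", C=10.0, epsilon=0.5, gamma="scale"))
model.fit(X, Nc)

query = np.array([[250.0, 600.0, 12.0]])    # predict Nc at an untested location and depth
print(f"predicted Nc ~ {model.predict(query)[0]:.1f}")
```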
19

Evangelista, Lorenza. "A Critical Review of the MASW Technique for Site Investigation in Geotechnical Engineering." Doctoral thesis, 2009. http://www.fedoa.unina.it/3883/1/evangelista.pdf.

Full text
Abstract:
The dispersion curves of surface waves have been successfully used for the characterization of the shallow subsurface for decades. Three steps are involved in utilizing dispersion curves of surface waves for imaging geological profiles: 1. implement the experimental procedure; 2. create efficient and accurate algorithms, organized in a basic data processing sequence, designed to extract surface wave dispersion curves from accelerometric records; and 3. develop stable and efficient multimodal inversion algorithms to obtain shear wave velocity profiles. This dissertation focuses on the third step, the inversion of the dispersion curves of surface waves, with the aim of searching for the best procedure to obtain a more accurate and reliable estimate of the geological material properties. The inversion actually comprises two sub-steps: 3a) estimate a model employing the theory of surface wave propagation and mathematical optimization; 3b) appraise the model for its accuracy, either deterministically or statistically. One of the major goals of this study is to find the shallow S-wave velocity structure that explains the observed dispersion curves of surface waves. This is achieved by a multimodal inversion that involves the minimization of a cost/objective function that characterizes the differences between observed and calculated dispersion data. Due to the discrete nature of inversion problems, the model obtained from the inversion of the data is not necessarily equal to the true model that one seeks. This implies that for realistic problems inversion really consists of model estimation followed by model appraisal. Generally speaking, there are three categories of inversion techniques based on the internal physical principles of the geotechnical problems: linear inversions, non-linear inversions, and trial-and-error methods. It is common to present the inverted Vs profile as a unique profile without showing a range of possible solutions or some type of error bars, such as the standard deviations of the Vs values of each layer. Additionally, the person performing the inversion usually assumes the prior information required to constrain the problem based on his or her own judgment. Implementing an inversion method that includes estimates of the standard deviations of the Vs profile and finding tools to choose the prior information objectively were the main purposes of this research. To perform SASW inversion, one global and one local search procedure were presented and employed with synthetic data, the global one being a pure Monte Carlo method. The synthetic data was produced with the forward algorithm used during the inversion. This implies that all uncertainties are caused by the nature of the MASW inversion problem alone, since there are no uncertainties added by experimental errors in data collection, analysis of the data to create the dispersion curve, the layered model used to represent a real 3-D soil stratification, or wave propagation theory. The pure Monte Carlo method was chosen to study the non-uniqueness of the problem by looking at a range of acceptable solutions (i.e., Vs profiles) obtained with as few constraints as possible. It is important to note that this method requires large amounts of time to obtain Vs profiles with low rms error.
Based on the variety of shapes found for Vs profiles with satisfactory rms, the non-uniqueness of SASW inversion was evident, concluding that the dispersion curve does not constrain the solution sufficiently to determine a unique Vs profile or to resolve specific velocity contrasts between layers. The reasons for this include characteristics related to the experimental dispersion curve (the number and distribution of data points describing it, and the uncertainties of the experimental dispersion data) and characteristics related to the initial shear wave velocity profile (the depths and thicknesses of the layers, the depth to the half-space, and the initial shear wave velocities). The points that represent the dispersion characteristics of a site need to be selected carefully to have: (i) sufficient data to include all important features of the dispersion curve, and (ii) a good balance of information content, so that the Vs of the layers is resolved from similar amounts of information and a fairly weighted rms error gives a good measure of the fit between theoretical and experimental data. Therefore, using multimodal inversion algorithms can reduce the non-uniqueness of the SASW inversion, allowing all the information contained in the experimental dispersion curve to be used. To improve the interpretation step of the MASW experimental procedure, attention is focused on the dispersion behaviour of Rayleigh waves in complex stratigraphic conditions. The influence of the subsoil structure on the experimental data, and the error introduced in the inversion step by assuming a one-dimensional soil stratification, are analysed. Two-dimensional numerical models are developed to investigate the behaviour of Rayleigh waves in the presence of lateral anomalies such as sloping interfaces between the layers. Different geometrical configurations are analysed to also take into account the influence of the source position relative to the dip of the layers. The results show a clear influence of the subsurface structure, which can lead to an underestimation of the correct layer thickness of the soil profile investigated. To overcome this limitation, a new approach is proposed to correct the dispersion curve with adequate factors before the inversion.
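To make the pure Monte Carlo inversion step concrete, the sketch below randomly samples layered Vs models and keeps the one whose theoretical dispersion curve best matches the observed one in an rms sense. The forward model here is a deliberately crude wavelength-averaging stand-in (an assumption made purely for illustration); in practice it would be replaced by a proper multimodal Rayleigh-wave dispersion solver, and this is not the dissertation's actual code.

```python
import numpy as np

rng = np.random.default_rng(2)

def crude_forward(thicknesses, vs, wavelengths, depth_factor=3.0, vr_ratio=0.93):
    """Toy forward model: phase velocity ~ 0.93 * thickness-weighted Vs over the top lambda/3.
    Stand-in only -- a real inversion would call a Rayleigh-wave dispersion solver."""
    tops = np.concatenate(([0.0], np.cumsum(thicknesses)))
    bottoms = np.append(tops[1:], np.inf)
    c = []
    for lam in wavelengths:
        zmax = lam / depth_factor
        w = np.clip(np.minimum(bottoms, zmax) - tops, 0.0, None)   # layer contributions above zmax
        c.append(vr_ratio * np.sum(w * vs) / zmax)
    return np.array(c)

def monte_carlo_invert(wavelengths, c_obs, n_trials=5000):
    best_rms, best_model = np.inf, None
    for _ in range(n_trials):
        h = rng.uniform(1.0, 10.0, size=2)                  # two layers over a half-space
        vs = np.sort(rng.uniform(100.0, 800.0, size=3))     # generally increasing Vs
        rms = np.sqrt(np.mean((crude_forward(h, vs, wavelengths) - c_obs) ** 2))
        if rms < best_rms:
            best_rms, best_model = rms, (h, vs)
    return best_model, best_rms

wavelengths = np.linspace(5.0, 60.0, 12)
c_obs = crude_forward([3.0, 8.0], np.array([150.0, 300.0, 600.0]), wavelengths)  # synthetic "observations"
(model, rms) = monte_carlo_invert(wavelengths, c_obs)
print("best thicknesses:", np.round(model[0], 1), "best Vs:", np.round(model[1]), "rms:", round(rms, 2))
```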
20

Chen, Shuo-Ting, and 陳碩霆. "The Study of Localized Site Investigation Guideline and Geotechnical Parameters for Offshore Wind Farm." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/5h86va.

Full text
Abstract:
Master's thesis
National Taiwan Ocean University
Department of Harbor and River Engineering
105 (2016, ROC calendar)
Taiwan's offshore wind farm industry is starting to launch, but Taiwan lacks marine site investigation experience and localized guidelines. Therefore, how to establish the framework of an offshore wind farm preparatory system and compose site investigation guidelines suitable for Taiwan's maritime environment and offshore wind farms has become an important issue. This study collected and discussed the international codes and guidelines for offshore wind farm site investigation, geotechnical design and EIA standards, including ABS, BV, BSH, DNV-GL, Eurocode, IEC, NORSOK and SUT, as well as the preparatory systems of Denmark, the UK, the USA, Germany, China, Hong Kong, etc. This study also integrated the domestic legislation and considered local conditions to propose a localized site investigation guideline for offshore wind farms and to list recommendations for the preparatory system. Based on the related international codes and guidelines, this study provides a suitable localized site investigation guideline for offshore wind farms. The guideline not only explores the requirements for establishing offshore wind farms, but also explains the details of site investigations such as geotechnical and geophysical surveys. Specifically, it combines those laws and standards and makes them effective for Taiwan's offshore wind industry, which could improve the capacity of Taiwan's offshore wind power development. Taking the important geotechnical parameters of offshore investigation suggested by the SUT and NORSOK standards as a basis, this study suggests several essential geotechnical parameters and data that are necessary in the design phase. The research areas are situated in four offshore wind farm areas: Chang-Bin, Fuhai phase II, the submarine cable route, and the Si-Dao area. Site investigation information, such as geophysical, geotechnical and drilling data, was collected, and the correlations between soil parameters and cone penetration test data were discussed to establish geotechnical parameters and propose suggestions for localized parameters. Using empirical formulas, this study shows that most of the soil layers in the site area are composed of sandy soils, with some soft soil distribution as well. Besides, due to some deviations among the drilling data, laboratory tests are necessary to obtain accurate soil parameters, and the risk of soft shallow soils should be considered in the marine engineering of offshore wind farms. Keywords: offshore wind farm, preparatory system, site investigation guideline, geotechnical parameters, cone penetration test (CPT)
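One of the standard CPT reductions that a localized-parameter study of this kind relies on is the normalized soil behaviour type index. The sketch below implements the common Robertson-style Ic calculation with a simplified stress normalization (n = 1, an assumption here, whereas full procedures iterate on the stress exponent); the stresses and cone readings are illustrative, not data from the thesis.

```python
import numpy as np

def soil_behaviour_index(qt_kpa, fs_kpa, sigma_v_kpa, sigma_v_eff_kpa):
    """Normalized CPT soil behaviour type index (Robertson-style):
    Ic = sqrt((3.47 - log10(Qt))^2 + (log10(Fr) + 1.22)^2)."""
    qt_net = qt_kpa - sigma_v_kpa
    Qt = qt_net / sigma_v_eff_kpa                 # normalized cone resistance (stress exponent n = 1 assumed)
    Fr = 100.0 * fs_kpa / qt_net                  # normalized friction ratio, %
    return np.sqrt((3.47 - np.log10(Qt)) ** 2 + (np.log10(Fr) + 1.22) ** 2)

# Example at ~10 m below seabed: qt = 8 MPa, fs = 40 kPa,
# total and effective vertical stresses of about 180 kPa and 90 kPa (assumed values)
ic = soil_behaviour_index(8000.0, 40.0, 180.0, 90.0)
print(f"Ic = {ic:.2f}  ->  sand-like behaviour is typically indicated for Ic < 2.05")
```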
21

Calheiros, António Barreto. "Condições geológicas e geotécnicas da fundação para um edifício na Av. Fernão de Magalhães, em Coimbra." Master's thesis, 2017. http://hdl.handle.net/10316/82187.

Full text
Abstract:
Master's dissertation in Geosciences presented to the Faculdade de Ciências e Tecnologia
This work seeks to provide an analysis of the geological and geotechnical conditions of the site of the future Fernão de Magalhães Health Centre. The work is divided into six chapters: the first is an introduction, in which the objectives of the work are described, together with a brief historical introduction and a geographical and geological framework; in the second chapter the alluvial fill of the Mondego river is described, taking into account the glaciations, erosion and human occupation of the study area; the third chapter is dedicated to the description of the prospecting techniques and their importance; in the fourth chapter the site investigation works for the Fernão de Magalhães Health Centre are described, taking into account the boreholes carried out, the geotechnical zoning attributed and the archaeology; the fifth chapter presents several constraints on the execution of the project, namely the presence of archaeological heritage and the lack of suitable geological and geotechnical conditions; the last chapter concerns the conclusions of the work. By combining the geological and geotechnical characteristics of the ground, it was possible to distinguish four geotechnical units. The geotechnical zoning attributed was: unit UG1 – Fill, UG2 – Alluvium, UG3 – Clayey Soil and UG4 – Marls and Limestones. From this zoning it is concluded that some of the geotechnical units present characteristics that are not suitable for the construction of buildings. According to the information provided by the archaeologist Pedro Roquinho, if the foundation is built above the archaeological level, archaeological monitoring will be required during execution of the foundations. If deeper foundations are chosen, archaeological test excavations must be carried out beforehand. The binding decisions will be the responsibility of the Direção Regional de Cultura do Centro.
22

Crisp, Michael Perry. "The Optimization of Geotechnical Site Investigations for Pile Design in Multiple Layer Soil Profiles Using a Risk-Based Approach." Thesis, 2020. http://hdl.handle.net/2440/129182.

Full text
Abstract:
The testing of subsurface material properties, i.e. a geotechnical site investigation, is a crucial part of projects that are located on or within the ground. The process consists of testing samples at a variety of locations, in order to model the performance of an engineering system for design processes. Should these models be inaccurate or unconservative due to an improper investigation, there is considerable risk of consequences such as structural collapse, construction delays, litigation, and over-design. However, despite these risks, there are relatively few quantitative guidelines or research items on informing an explicit, optimal investigation for a given foundation and soil profile. This is detrimental, as testing scope is often minimised in an attempt to reduce expenditure, thereby increasing the aforementioned risks. This research recommends optimal site investigations for multi-storey buildings supported by pile foundations, for a variety of structural configurations and soil profiles. The recommendations include that of the optimal test type, number of tests, testing locations, and interpretation of test data. The framework consists of a risk-based approach, where an investigation is considered optimal if it results in the lowest total project cost, incorporating both the cost of testing, and that associated with any expected negative consequences. The analysis is statistical in nature, employing Monte Carlo simulation and the use of randomly generated virtual soils through random field theory, as well as finite element analysis for pile assessment. A number of innovations have been developed to assist the novel nature of the work. For example, a new method of producing randomly generated multiple-layer soils has been devised. This work is the first instance of site investigations being optimised in multiple-layer soils, which are considerably more complex than the single-layer soils examined previously. Furthermore, both the framework and the numerical tools have been themselves extensively optimised for speed. Efficiency innovations include modifying the analysis to produce re-usable pile settlement curves, as opposed to designing and assessing the piles directly. This both reduces the amount of analysis required and allows for flexible post-processing for different conditions. Other optimizations include the elimination of computationally expensive finite element analysis from within the Monte Carlo simulations, and additional minor improvements. Practicing engineers can optimise their site investigations through three outcomes of this research. Firstly, optimal site investigation scopes are known for the numerous specific cases examined throughout this document, and the resulting inferred recommendations. Secondly, a rule-of-thumb guideline has been produced, suggesting the optimal number of tests for buildings of all sizes in a single soil case of intermediate variability. Thirdly, a highly efficient and versatile software tool, SIOPS, has been produced, allowing engineers to run a simplified version of the analysis for custom soils and buildings. The tool can do almost all the analysis shown throughout the thesis, including the use of a genetic algorithm to optimise testing locations. However, it is approximately 10 million times faster than analysis using the original framework, running on a single-core computer within minutes.
Thesis (Ph.D.) -- University of Adelaide, School of Civil, Environmental and Mining Engineering, 2020
APA, Harvard, Vancouver, ISO, and other styles
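As a purely editorial illustration of the risk-based criterion described in the abstract above, the sketch below selects the number of tests that minimises an expected total cost, defined as testing cost plus the probability of an unconservative design times the cost of its consequences, estimated by Monte Carlo over crude one-dimensional virtual soils. The soil model, the unconservative-design rule, and all costs are invented assumptions; this is not the thesis framework and not the SIOPS tool.

```python
# Toy illustration of the risk-based criterion: expected total cost
# = cost of testing + (probability of an unconservative design) x (cost of
# the consequences), estimated by Monte Carlo over simple virtual soils.
# The soil model, the "unconservative" rule and all costs are invented.
import numpy as np

rng = np.random.default_rng(0)

N_CELLS = 200                  # cells in a discretised soil profile
N_REALISATIONS = 2000          # Monte Carlo repetitions per candidate scope
COST_PER_TEST = 2_000.0        # hypothetical cost of one test ($)
COST_OF_FAILURE = 5_000_000.0  # hypothetical cost of an inadequate design ($)

def virtual_soil():
    """One spatially correlated, lognormal stiffness profile (toy random field)."""
    white = rng.standard_normal(N_CELLS)
    correlated = np.convolve(white, np.ones(10) / 10.0, mode="same")
    return np.exp(3.0 + 0.8 * correlated)

def expected_total_cost(n_tests):
    unconservative = 0
    for _ in range(N_REALISATIONS):
        soil = virtual_soil()
        true_value = soil.mean()                              # complete knowledge
        sampled = soil[rng.choice(N_CELLS, n_tests, replace=False)]
        if sampled.mean() > 1.2 * true_value:                 # over-estimates the site
            unconservative += 1
    p_bad = unconservative / N_REALISATIONS
    return n_tests * COST_PER_TEST + p_bad * COST_OF_FAILURE

costs = {n: expected_total_cost(n) for n in (1, 2, 4, 8, 16, 32)}
for n, c in costs.items():
    print(f"{n:2d} tests -> expected total cost ~ ${c:,.0f}")
print("optimal scope (toy):", min(costs, key=costs.get), "tests")
```

The competing trends, falling risk but rising testing cost as the scope grows, are what make an interior optimum possible; more realistic soil, pile, and cost models change the numbers but not the shape of the trade-off.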
23

Yang, Rui. "Optimal geotechnical site investigations for slope design." Thesis, 2020. http://hdl.handle.net/1959.13/1427611.

Full text
Abstract:
Research Doctorate - Doctor of Philosophy (PhD)
Almost all natural soil and rock deposits are highly variable in their properties. Soil properties can vary by orders of magnitude from site to site, and even within a single site. As a result, soil profiles cannot be identified with certainty, even if an extensive subsurface exploration program is executed. In most cases, measurements are obtained from only a limited number of site investigation tests at scattered locations across a construction site, constrained by operational and economic considerations. Insufficient or inappropriate site investigation tests can lead to a range of undesirable consequences: numerous geotechnical projects have experienced cost overruns, unexpected delays, and occasional failure due to unforeseen variability in the subsurface. It is therefore of great significance to develop a design based on an effective site investigation that performs satisfactorily while providing an appropriate level of safety and minimising the use of financial and human resources. This research proposes a framework that quantifies the benefits of undertaking site investigations with different sampling locations, increased scope, and better testing methods. To assess the effectiveness of a site investigation, a simulated slope whose properties are known exactly at every location is used as a benchmark. The slope design based on complete knowledge of the soil properties reflects the true state of the slope, which is only possible because the soil properties are simulated. The site investigation is then carried out numerically at discrete locations within the simulated slope, and the slope stability analysis based on the obtained measurements is performed by the finite element method. Such a design may be unreliable, because the decision about the stability of the slope is made on the basis of a limited set of samples. Comparisons between the designs based on complete and limited information indicate that two types of decision errors can be made as a result of inaccurate or inadequate site investigations. This comparison is repeated many times within a Monte Carlo framework to incorporate the uncertainties in the slope design process; uncertainties due to inherent soil variability, measurement errors, and limited measurements have been included. The Monte Carlo simulations also provide the means to estimate the probabilities of decision errors, by counting how many times unreliable designs are made on the basis of limited site investigation measurements. Furthermore, the costs associated with the site investigations and with making wrong decisions are assigned. It is then possible to assess site investigation effectiveness by quantifying the relationship between various aspects of a site investigation and the corresponding cost and risk of the slope design. The proposed framework enables the direct comparison of different sampling locations, numbers of site investigation tests, and testing methods, allowing identification of the optimal site investigation that provides a design with the lowest risk. Results indicate that there is an optimal sampling location that gives the most information while keeping the probability of making wrong decisions to a minimum. There also appears to be an optimal site investigation scope, beyond which the cost of additional sampling does not justify the cost savings from reduced slope failure risk. However, if the cost of slope failure is high, increasing the scope of a site investigation will lead to a lower risk, because the expected savings in terms of risk are significant compared to the increased investigation cost. The proposed framework and these results should assist engineers in designing a more efficient and accurate geotechnical site investigation.
APA, Harvard, Vancouver, ISO, and other styles
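To make the two decision errors mentioned in the abstract above concrete, the minimal sketch below repeats a comparison between a benchmark decision (complete knowledge of a virtual soil) and an investigation-based decision (a few samples) inside a Monte Carlo loop, and counts how often each kind of wrong decision occurs. The strength statistics, the acceptance threshold, and the use of a mean-strength proxy instead of a finite element stability analysis are simplifying assumptions for illustration only.

```python
# Toy sketch of the two decision errors discussed in the abstract above:
# in each Monte Carlo repetition, the decision based on complete knowledge
# of a virtual soil is compared with the decision based on a few samples.
# The strength statistics, the threshold and the mean-strength proxy for a
# slope stability analysis are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(1)
N_SIM = 5_000          # Monte Carlo repetitions
N_CELLS = 20           # cells in the virtual slope
N_SAMPLES = 3          # site investigation tests per repetition
REQUIRED_MEAN = 32.0   # hypothetical required mean shear strength (kPa)

over_design = unsafe_accepted = 0
for _ in range(N_SIM):
    strength = rng.lognormal(mean=np.log(30.0), sigma=0.4, size=N_CELLS)  # kPa
    truly_safe = strength.mean() >= REQUIRED_MEAN                # benchmark decision
    sampled = strength[rng.choice(N_CELLS, N_SAMPLES, replace=False)]
    judged_safe = sampled.mean() >= REQUIRED_MEAN                # investigation-based decision
    if truly_safe and not judged_safe:
        over_design += 1         # safe slope rejected: unnecessary conservatism and cost
    elif judged_safe and not truly_safe:
        unsafe_accepted += 1     # unsafe slope accepted: risk of failure

print(f"P(over-design error)   ~ {over_design / N_SIM:.3f}")
print(f"P(unsafe-design error) ~ {unsafe_accepted / N_SIM:.3f}")
```

Attaching costs to each error type and to the tests themselves turns these frequencies directly into the cost-versus-risk comparison the abstract describes.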
24

Goldsworthy, Jason Scott. "Quantifying the risk of geotechnical site investigations." Thesis, 2006. http://hdl.handle.net/2440/47462.

Full text
Abstract:
The site investigation phase plays a vital role in any foundation design, where inadequate characterisation of the subsurface conditions may lead either to a significantly over-designed foundation that is not cost-effective, or to an under-designed foundation, which may result in foundation failure. As such, the scope of an investigation should depend on the conditions at the site and the importance of the structure. However, it is common for the expense dedicated to the site investigation to be a fraction of the total cost of the project, and it is typically determined by budget and time constraints and by the experience and judgement of the geotechnical engineer. Additional site investigation expenditure or sampling is expected to reduce the financial risk of the design by reducing the uncertainties in the geotechnical system and protecting against possible foundation failures. This research has quantified the relative benefits of undertaking site investigations of increased and differing scope. This has been achieved by simulating the design process to yield a foundation design based on the results of a site investigation. Such a design has been compared to an optimal design that utilises complete knowledge of the soil, which has only been possible due to the use of simulated soils. Comparisons between these two design types indicate how accurately and adequately the site investigation characterises the site conditions. Furthermore, the design based on the results of the site investigation has been analysed using the complete knowledge of the soil. This yields a probability of failure and has therefore been included in a risk analysis in which the costs associated with the site investigation are measured against the financial risk of the design. Potential savings in financial risk for increased site investigation expenditure have subsequently been identified. A Monte Carlo analysis has been used in this research to incorporate the uncertainties in the foundation design process, including uncertainties due to soil variability; sampling errors; measurement and transformation model errors; and errors related to the use of a simplified foundation response prediction method. The Monte Carlo analysis has also provided the means to obtain results in a probabilistic framework to enable reliability and risk analyses. Computer code has been specifically developed to: generate a simulated soil that conforms to the variability of soil properties; simulate a site investigation to estimate data for a foundation design; simulate the design of a foundation; and conduct a reliability and risk analysis of such a design. Results indicate that there are significant benefits to be derived from increasing the scope of a site investigation in terms of the risk and reliability of the foundation design. However, it also appears that an optimal site investigation scope or expenditure exists, beyond which additional expenditure leads to a design with a higher financial risk due to the increased cost of the site investigation. The expected savings in terms of financial risk are significant when compared to the increased investigation cost. These results will assist geotechnical engineers in planning a site investigation in a more rational manner, with knowledge of the associated risks.
Thesis (Ph.D.) -- School of Civil and Environmental Engineering, 2006
APA, Harvard, Vancouver, ISO, and other styles
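A central ingredient of the abstract above is the simulated soil whose properties are known everywhere. The sketch below shows one generic way to generate such a field, a spatially correlated lognormal property produced by Cholesky factorisation of an exponential covariance matrix; it is offered as a stand-in only and is not necessarily the generation method used in the thesis. The mean, coefficient of variation, and correlation length are illustrative values.

```python
# Minimal sketch of generating a spatially correlated, lognormally distributed
# soil property field via Cholesky factorisation of an exponential covariance
# matrix. This is a generic stand-in, not necessarily the method used in the
# thesis; it only illustrates the idea of a "simulated soil" with a controlled
# mean, coefficient of variation (COV) and correlation length.
import numpy as np

def correlated_lognormal_field(n, dx, mean, cov, theta, seed=0):
    """1-D lognormal random field: n cells of size dx, target mean and
    coefficient of variation `cov`, exponential correlation length `theta`."""
    rng = np.random.default_rng(seed)
    x = np.arange(n) * dx
    # Markovian (exponential) correlation of the underlying Gaussian field
    rho = np.exp(-2.0 * np.abs(x[:, None] - x[None, :]) / theta)
    L = np.linalg.cholesky(rho + 1e-10 * np.eye(n))   # jitter for numerical stability
    g = L @ rng.standard_normal(n)                    # correlated N(0, 1) values
    sigma_ln = np.sqrt(np.log(1.0 + cov**2))          # lognormal parameters from mean/COV
    mu_ln = np.log(mean) - 0.5 * sigma_ln**2
    return np.exp(mu_ln + sigma_ln * g)

# e.g. a 30 m deep profile of undrained shear strength (values are illustrative)
field = correlated_lognormal_field(n=60, dx=0.5, mean=50.0, cov=0.5, theta=2.0)
print(np.round(field[:5], 1))
```

Sampling this field at a few depths then mimics a site investigation, while statistics of the full field play the role of the complete knowledge used as the benchmark design.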
25

Teixeira, Ana. "Reliability and cost models of axial pile foundations." Doctoral thesis, 2013. http://hdl.handle.net/1822/24556.

Full text
Abstract:
Doctoral thesis in Civil Engineering / Geotechnics
Pile foundations are often used for important structures, and reliability evaluation is therefore an important aspect of their design. Unlike the approach to reliability evaluation used in structural engineering, the traditional procedure used in geotechnical design addresses uncertainties through high global or partial safety factors, mostly based on past experience. However, this approach does not provide a rational basis for understanding the influence of uncertainties on the design. For this reason, and because of regulatory codes and social concerns (such as sustainability), geotechnical engineers need to improve their ability to deal with uncertainties and probabilities to support decision-making. Reliability methods have become increasingly important as decision-support tools. The main benefit of reliability analysis is that it provides quantitative information about the parameters (uncertainties) that most significantly influence the behaviour under study. This makes it possible to control risk and to determine the potential causes of adverse effects on the structure. In particular, the design of pile foundations still involves many limitations and uncertainties, especially when there is not enough investment in soil characterisation and/or pile load tests. In addition to the uncertainties associated with soil characterisation, physical, statistical, spatial and human uncertainties exist. Hence, because it is technically and economically impossible to design pile foundations for the most unfavourable of cases, it is the engineer’s goal to minimise the risk and limit it to an acceptable level in the most economical manner possible. To this end, reliability theory needs to be adapted to the needs and objectives of geotechnical engineering. In this context, the primary purpose of this dissertation is to demonstrate the application of reliability methods to geotechnical design, and more particularly to two distinct case studies of single vertical pile foundations under axial loading. This dissertation also presents a simple and practical approach to performing reliability-based design and obtaining valuable information from it. For that purpose, sensitivity and cost analyses were conducted to study the influence of each uncertainty type. Two well-known reliability methods, the first-order reliability method (FORM) and Monte Carlo simulation (MCS), were applied to the case studies for comparison. In addition, reliability-based safety factors were evaluated and discussed. Another purpose of this dissertation is to demonstrate the advantages of employing reliability tools in the decision-making process for pile foundation design. Decision-making related to the economic and research investments required for gathering the information necessary to characterise the uncertainties associated with important random variables, in both pile design and its reliability, is facilitated by the type of balanced analyses presented in this dissertation. It is concluded that, even though the extent to which this can be accomplished depends on the engineer’s knowledge and the project’s budget for investigation, geotechnical engineering definitely benefits from the consideration of reliability in design. Finally, this dissertation is intended to provide knowledge and tools for harmonising structural and geotechnical design codes, and to encourage the development of such approaches in geotechnical practice, international standards, and conformity assessment systems.
APA, Harvard, Vancouver, ISO, and other styles
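As a minimal sketch of the kind of reliability calculation the abstract above refers to, the code below estimates the probability of failure and the corresponding reliability index for a single axially loaded pile with limit state g = R - S, using crude Monte Carlo simulation. The lognormal resistance and normal load, and their parameters, are invented assumptions rather than values from the dissertation's case studies; FORM would instead approximate the same limit state at the design point.

```python
# Minimal sketch: probability of failure and reliability index for a single
# axially loaded pile with limit state g = R - S (resistance minus load),
# estimated by crude Monte Carlo simulation. Distributions and parameters
# are illustrative assumptions, not values from the dissertation.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
N = 1_000_000

# Hypothetical axial pile resistance R (lognormal) and applied load S (normal)
R_mean, R_cov = 2_500.0, 0.30     # kN
S_mean, S_cov = 1_000.0, 0.15     # kN

sigma_lnR = np.sqrt(np.log(1.0 + R_cov**2))     # lognormal parameters from mean/COV
mu_lnR = np.log(R_mean) - 0.5 * sigma_lnR**2
R = rng.lognormal(mu_lnR, sigma_lnR, N)
S = rng.normal(S_mean, S_cov * S_mean, N)

pf = np.mean(R - S < 0.0)          # probability of failure, P(g < 0)
beta = -norm.ppf(pf)               # corresponding reliability index
print(f"P_f ~ {pf:.2e}, beta ~ {beta:.2f}")
```

Comparing the resulting reliability index with a target value, or repeating the calculation for different amounts of site information, is the kind of decision support the dissertation argues for.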
