
Dissertations / Theses on the topic 'Seismic Hazard Analysis'



Consult the top 50 dissertations / theses for your research on the topic 'Seismic Hazard Analysis.'


You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Mak, Sum. "Seismic analysis of the South China Region." Click to view the E-thesis via HKUTO, 2005. http://sunzi.lib.hku.hk/hkuto/record/B30588893.

2

Mapuranga, Victor Philip. "Probabilistic seismic hazard analysis for Zimbabwe." Diss., University of Pretoria, 2014. http://hdl.handle.net/2263/43166.

Abstract:
In this study, the seismic hazards of Zimbabwe are presented as maps of probabilistic peak ground acceleration (PGA). The maps show the ground acceleration with a 10% chance of being exceeded over a 50-year period, and were prepared from a homogenized 101-year catalogue compiled in terms of seismic moment magnitude. Two approaches to probabilistic seismic hazard assessment were applied. The first was the widely used "deductive" approach (Cornell, 1968), which integrates geological and geophysical information together with seismic event catalogues in the assessment of seismic hazards. Application of the procedure involves several steps. As a first step, it requires the delineation of potential seismic zones, which is strongly influenced by historic patterns and based on independent geologic evidence or tectonic features such as faults (Atkinson, 2004; Kijko and Graham, 1998). The second method was the "parametric-historic" approach of Kijko and Graham (1998, 1999), which was developed for regions with incomplete catalogues and does not require the subjective delineation of active seismic zones. It combines the best features of the deductive Cornell-McGuire procedure and the historic method of Veneziano et al. (1984). Four ground motion prediction equations suitable for hard rock conditions in the region were applied in the assessment of seismic hazards. The highest levels of hazard in Zimbabwe are along the south-eastern border with Mozambique, in the Lake Kariba area, and in the mid-Zambezi basin in the vicinity of the Save-Limpopo mobile belt. The results show that the parametric-historic procedure to a large extent mirrors the seismicity pattern, whereas the classic Cornell-McGuire procedure reflects the delineated pattern of seismic zones; the two methods are best used to complement each other, depending on the available input data.
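The "10% chance of being exceeded in 50 years" hazard level used in this and several later entries corresponds to a fixed return period under the Poisson occurrence assumption standard in PSHA. A minimal sketch of the conversion (the formula is standard; it is not code from the thesis):

```python
import math

def return_period(p_exceed: float, exposure_years: float) -> float:
    """Return period T implied by a probability of exceedance p over an
    exposure time t, under Poisson occurrences: p = 1 - exp(-t / T)."""
    return -exposure_years / math.log(1.0 - p_exceed)

# The hazard level used for the Zimbabwe maps:
T = return_period(0.10, 50.0)   # ~475 years
```

The same arithmetic gives roughly 2475 years for the 2%-in-50-years level that appears in some of the building-code-oriented entries below.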
Dissertation (MSc)--University of Pretoria, 2014.
Physics
MSc
Unrestricted
3

Kocair, Celebi. "A Grid-based Seismic Hazard Analysis Application." Master's thesis, METU, 2010. http://etd.lib.metu.edu.tr/upload/12612540/index.pdf.

Abstract:
The results of seismic hazard analysis (SHA) play a crucial role in assessing seismic risks and mitigating seismic hazards. SHA calculations generally involve magnitude and distance distribution models, and ground motion prediction models as components. Many alternatives have been proposed for these component models. SHA calculations may be demanding in terms of processing power depending on the models and analysis parameters involved, and especially the size of the site for which the analysis is to be performed. In this thesis, we develop a grid-based SHA application which provides the necessary computational power and enables the investigation of the effects of applying different models. Our application not only includes various already implemented component models but also allows integration of newly developed ones.
4

Mak, Sum, and 麥琛. "Seismic analysis of the South China Region." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2005. http://hub.hku.hk/bib/B30588893.

5

Tsang, Hing-ho. "Probabilistic seismic hazard assessment: direct amplitude-based approach." Click to view the E-thesis via HKUTO, 2006. http://sunzi.lib.hku.hk/hkuto/record/B36783456.

6

Tsang, Hing-ho, and 曾慶豪. "Probabilistic seismic hazard assessment: direct amplitude-based approach." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2006. http://hub.hku.hk/bib/B36783456.

Abstract:
Awarded the Li Ka Shing Prize for the best PhD thesis in the Faculties of Dentistry, Engineering, Medicine and Science, The University of Hong Kong, 2005-2006.
Civil Engineering
Doctoral
Doctor of Philosophy
7

Wetie, Ngongang Ariane. "Seismic and Volcanic Hazard Analysis for Mount Cameroon Volcano." Diss., University of Pretoria, 2016. http://hdl.handle.net/2263/60871.

Abstract:
Mount Cameroon is considered the only active volcano along a 1600 km long chain of volcanic complexes called the Cameroon Volcanic Line (CVL). It has erupted seven times during the last 100 years, most recently in May 2000. The approximately 500,000 inhabitants who live and work on its fertile flanks are exposed to threats from volcanic eruptions and earthquakes. In this thesis, a hazard assessment study involving both statistical modelling of seismic hazard parameters and the evaluation of future volcanic risk was undertaken for Mount Cameroon. The Gutenberg-Richter magnitude-frequency relations, the annual activity rate, the maximum magnitude, the rate of volcanic eruptions, and risk assessment were examined. The seismic hazard parameters were estimated using the maximum likelihood method, on the basis of a procedure that combines seismic data containing incomplete files of large historical events with complete files from short periods of observation. A homogeneous Poisson distribution model was applied to previously recorded volcanic eruptions of Mount Cameroon to determine the frequency of eruptions and to assess the probability of a future eruption. Frequency-magnitude plots indicated that Gutenberg-Richter b-values are partially dependent on the maximum regional magnitude and on the method used in their calculation. The b-values showed temporal and spatial variation, with an average value of 1.53 ± 0.02. The intrusion of a magma body generating relatively small earthquakes, as observed in the instrumental catalogue, could be responsible for this anomalously high b-value. An epicentre map of locally recorded earthquakes revealed that the south-eastern zone is the most seismically active part of the volcano. The annual mean activity rate of the seismicity depends strongly on the time span of the seismic catalogue, and the results showed that, on average, one earthquake event occurs every 10 days.
The maximum regional magnitude values determined from the various approaches overlap when their standard deviations are taken into account. However, the magnitude distribution of the Mt. Cameroon earthquakes might not follow the form of the Gutenberg-Richter frequency-magnitude relationship. The dates of the last eruptive events on the Mt. Cameroon volcanic complex are presented. No specific pattern was observed in the frequency of eruptions, which means that a homogeneous Poisson distribution provides a suitable model for estimating the rate of occurrence of volcanic eruptions and evaluating the risk of a future eruption. Two different approaches were used to estimate the mean eruption rate (λ), and both yielded a value of 0.074. The results showed that eruptions take place on average once every 13 years; with the last eruption having occurred over 15 years ago, it is considered that there is at present a high risk of an eruption.
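The homogeneous Poisson eruption model summarised in this abstract is straightforward to reproduce. A sketch using the annual rate reported above (λ = 0.074); the 30-year horizon is an illustrative choice, not from the thesis:

```python
import math

def prob_at_least_one(rate_per_year: float, horizon_years: float) -> float:
    """Homogeneous Poisson model: P(>=1 event in t years) = 1 - exp(-lambda*t)."""
    return 1.0 - math.exp(-rate_per_year * horizon_years)

rate = 0.074                 # mean eruption rate from the abstract
mean_repose = 1.0 / rate     # ~13.5 years, matching "once every 13 years"
p_next_30 = prob_at_least_one(rate, 30.0)   # ~0.89
```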
Dissertation (MSc)--University of Pretoria, 2016.
Geology
MSc
Unrestricted
8

Wilding, Andrew J. "Development of a GIS-based seismic hazard screening tool." Diss., Rolla, Mo. : Missouri University of Science and Technology, 2008. http://scholarsmine.mst.edu/thesis/pdf/Wilding_Thesis_FINAL_09007dcc804eb333.pdf.

Abstract:
Thesis (M.S.)--Missouri University of Science and Technology, 2008.
Vita. The print version of this thesis includes an accompanying CD-ROM. "Included with this Thesis is a CD-ROM, which contain the VISUAL BASIC CODE for the S4 application...The included code is divided into three files: a) VISUAL BASIC Module Code, b) VISUAL BASIC Form Code, and c) VISUAL BASIC FFT Code."--leaf 158. The entire thesis text is included in file. Title from title screen of thesis/dissertation PDF file (viewed April 25, 2008) Includes bibliographical references (p. 160-172).
9

Free, Matthew William. "The attenuation of earthquake strong-motion in intraplate regions." Thesis, Imperial College London, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.321810.

10

Dawood, Haitham Mohamed Mahmoud Mousad. "Partitioning Uncertainty for Non-Ergodic Probabilistic Seismic Hazard Analyses." Diss., Virginia Tech, 2014. http://hdl.handle.net/10919/70757.

Abstract:
Properly accounting for the uncertainties in predicting ground motion parameters is critical for Probabilistic Seismic Hazard Analyses (PSHA). This is particularly important for critical facilities that are designed for long-return-period motions. Non-ergodic PSHA is a framework that allows for this proper accounting of uncertainties, which in turn allows for more informed decisions by designers, owners and regulating agencies. The ergodic assumption implies that the standard deviation applicable to a specific source-path-site combination is equal to the standard deviation estimated using a database with multiple source-path-site combinations. Removing the ergodic assumption requires dense instrumental networks operating in seismically active zones, so that a sufficient number of recordings are made. Only recently, with the advent of networks such as the Japanese KiK-net, has this become possible. This study contributes to the state of the art in earthquake engineering and engineering seismology in general, and in non-ergodic seismic hazard analysis in particular. The study is divided into four parts. First, an automated protocol was developed and implemented to process a large database of strong ground motions for GMPE development. A comparison of the records common to the database processed in this study and those of other studies showed the viability of the automated algorithm for processing strong ground motions. On the other hand, the automated algorithm resulted in narrower usable frequency bandwidths because of the strict criteria adopted for processing the data. Second, an approach to include path-specific attenuation rates in GMPEs was proposed and applied to a subset of the KiK-net database. Attenuation rates across regions containing volcanoes were found to be higher than in other regions, in line with the observations of other researchers.
Moreover, accounting for the path-specific attenuation rates reduced the aleatoric variability associated with predicting pseudo-spectral accelerations. Third, two GMPEs were developed for active crustal earthquakes in Japan, following the ergodic and site-specific formulations, respectively. Finally, a comprehensive residual analysis was conducted to find potential biases in the residuals and to propose models that predict some components of variability as a function of input parameters.
Ph. D.
11

Balal, Onur. "Probabilistic Seismic Hazard Assessment For Earthquake Induced Landslides." Master's thesis, METU, 2013. http://etd.lib.metu.edu.tr/upload/12615453/index.pdf.

Abstract:
Earthquake-induced slope instability is one of the major sources of earthquake hazard in near-fault regions. Simplified tools, such as Newmark's Sliding Block (NSB) analysis, are widely used to represent the stability of a slope under earthquake shaking. The outcome of this analogy is the slope displacement, where larger displacement values indicate higher seismic slope instability risk. Recent studies in the literature propose empirical models between the slope displacement and single or multiple ground motion intensity measures such as peak ground acceleration or Arias intensity. These correlations are based on the analysis of large datasets from a global ground motion recording database (PEER NGA-W1). Ground motions from earthquakes that occurred in Turkey are poorly represented in the NGA-W1 database, since corrected and processed data from Turkey were not available until recently. The objective of this study is to evaluate the compatibility of available NSB displacement prediction models for Probabilistic Seismic Hazard Assessment (PSHA) applications in Turkey, using a comprehensive dataset of ground motions recorded during earthquakes in Turkey. The application of the selected NSB displacement prediction model in a vector-valued PSHA framework is then demonstrated, with explanations of the seismic source characterization, ground motion prediction models, and ground motion intensity measure correlation coefficients. The results of the study are presented in terms of hazard curves, and a comparison is made with a case history in the Asarsuyu region, where seismically induced landslides (the Bakacak landslides) took place during the 1999 Düzce earthquake.
12

李德坤 and Dekun Li. "Seismic hazard analysis for bridge design in the Hong Kong region." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2001. http://hub.hku.hk/bib/B31226358.

13

Ma, Xu. "Passive Seismic Tomography and Seismicity Hazard Analysis in Deep Underground Mines." Diss., Virginia Tech, 2015. http://hdl.handle.net/10919/51266.

Abstract:
Seismic tomography is a promising tool for understanding and evaluating the stability of a rock mass in mining excavations. Laboratory measurements give evidence that seismic wave velocities increase in highly stressed areas of rock samples: it is well known that the closing of cracks under compressive pressure tends to increase the effective elastic moduli of rocks. Tomography can map stress transfer and redistribution, and can further forecast rock burst potential and other seismic hazards influenced by mining. Arrival times of seismic waves and locations of seismic events, recorded by seismic networks in multiple underground mines, are used as input for the tomographic imaging surveys. An initial velocity model is established according to the properties of the rock mass, and the velocity structure is then reconstructed by inversion to reflect anomalies in the rock mass. Mining-induced seismicity and double-difference tomographic images of the rock mass in mining areas are coupled to show how stress changes with microseismic activity. In particular, velocity structures from different periods (before and after a rock burst) are compared to analyse the effects of the rock burst on the stress distribution. Tomographic results show that high-velocity anomalies form in the vicinity of a rock burst before its occurrence, and that velocity subsequently drops significantly after the rock burst. In addition, regression analysis of travel time against distance indicates that the average velocity over the monitored region appears to increase before rock bursts and decrease after them. A reasonable explanation is that rock bursts tend to be triggered in highly stressed rock masses; after the energy release of a rock burst, stress relief is expected within the rock mass.
Average velocity decreases significantly because of the lower stresses and as a result of fractures in the rock mass generated by shaking-induced damage from nearby rock burst zones. The mining-induced microseismic rate is positively correlated with stress level. The fact that highly concentrated seismicity tends to be located at the margins between high-velocity and low-velocity regions indicates that high seismic rates accompany high stress in rock masses. Statistical analyses were performed on the aftershock sequence to develop an aftershock decay model for detecting potential hazards and evaluating the stability of aftershock activity.
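The travel-time regression mentioned in this abstract amounts to fitting distance = v · time through the origin, so the slope is an average medium velocity; a fall in the fitted v after an event would suggest stress relief. The readings below are invented for illustration only:

```python
def average_velocity(distances_m, times_s):
    """Least-squares slope through the origin: v = sum(d*t) / sum(t*t)."""
    num = sum(d * t for d, t in zip(distances_m, times_s))
    den = sum(t * t for t in times_s)
    return num / den

# Hypothetical source-receiver distances (m) and P-wave travel times (s):
d = [600.0, 1200.0, 1800.0, 2400.0]
t = [0.10, 0.20, 0.31, 0.40]
v = average_velocity(d, t)   # ~5.9 km/s, a hard-rock-like average
```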
Ph. D.
14

Li, Dekun. "Seismic hazard analysis for bridge design in the Hong Kong region /." Hong Kong : University of Hong Kong, 2001. http://sunzi.lib.hku.hk/hkuto/record.jsp?B23436049.

15

Karadeniz, Deniz. "Pilot program to assess seismic hazards of the Granite City, Monks Mound, and Columbia Bottom quadrangles, St. Louis Metropolitan area, Missouri and Illinois." Diss., Rolla, Mo. : University of Missouri-Rolla, 2007. http://scholarsmine.mst.edu/thesis/pdf/Karadeniz_09007dcc8042c729.pdf.

Abstract:
Thesis (Ph. D.)--University of Missouri--Rolla, 2007.
Vita. The entire thesis text is included in file. Accompanying "this dissertation is a CD-ROM, which contains site amplification and seismic hazard results for each grid point (1974 points) considered in the study. The results have prepared as .txt files. The CD-ROM also contains the maps generated from these estimated results. The maps are prepared as .png files." Title from title screen of thesis/dissertation PDF file (viewed January 28, 2008) Includes bibliographical references (p. 249-269).
16

Yilmaz, Ozturk Nazan. "Probabilistic Seismic Hazard Analysis: A Sensitivity Study With Respect To Different Models." PhD thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/3/12609403/index.pdf.

Abstract:
Due to the randomness inherent in the occurrence of earthquakes with respect to time, space and magnitude, as well as various other sources of uncertainty, seismic hazard assessment should be carried out in a probabilistic manner. The basic steps of probabilistic seismic hazard analysis are the delineation of seismic sources, assessment of the earthquake occurrence characteristics of each seismic source, selection of an appropriate ground motion attenuation relationship, and identification of the site characteristics. Seismic sources can be modeled as area and line sources. Seismic activity that cannot be related to any major seismic source can be treated as a background source, in which the seismicity is assumed to be uniform or spatially smoothed. Exponentially distributed magnitude and characteristic earthquake models are often used to describe the magnitude recurrence relationship, while Poisson and renewal models are used to model the occurrence of earthquakes in the time domain. In this study, the sensitivity of seismic hazard results to the models associated with these different assumptions is investigated. The effects of the different sources of uncertainty involved in the probabilistic seismic hazard analysis methodology on the results are investigated for a number of sites at different distances from a single fault. Two case studies are carried out to examine the influence of different assumptions on the final results based on real data, and to illustrate the implementation of the probabilistic seismic hazard analysis methodology for a large region (e.g. a country) and a smaller region (e.g. a province).
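The exponentially distributed magnitude model referred to in this abstract corresponds to the Gutenberg-Richter recurrence law. A sketch with illustrative a- and b-values (they are not taken from the thesis):

```python
def annual_rate(m: float, a: float, b: float) -> float:
    """Gutenberg-Richter recurrence: log10 N(m) = a - b*m, where N(m) is
    the annual number of events of magnitude >= m."""
    return 10.0 ** (a - b * m)

a_val, b_val = 4.0, 1.0                     # illustrative values only
n_m5 = annual_rate(5.0, a_val, b_val)       # 0.1 events per year
recurrence_m5 = 1.0 / n_m5                  # ~10-year recurrence of M >= 5
```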
17

Kurata, Masahiro. "Strategies for rapid seismic hazard mitigation in sustainable infrastructure systems." Diss., Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/31770.

Abstract:
Thesis (Ph.D)--Civil and Environmental Engineering, Georgia Institute of Technology, 2010.
Committee Co-Chair: DesRoches, Reginald; Committee Co-Chair: Leon, Roberto T.; Committee Member: Craig, James I.; Committee Member: Goodno, Barry; Committee Member: White, Donald W. Part of the SMARTech Electronic Thesis and Dissertation Collection.
18

Lopes da Silva, Valencio Arthur. "An information-theoretical approach to identify seismic precursors and earthquake-causing variables." Thesis, University of Aberdeen, 2018. http://digitool.abdn.ac.uk:80/webclient/DeliveryManager?pid=237105.

Abstract:
Several seismic precursors and earthquake-causing variables have been proposed over recent decades on the basis of physical considerations and case observations; however, none has been confirmed on long datasets using linear analysis. This work adopts an information-theoretical approach to investigate the occurrence of causal flow between these precursors or causing variables and seismicity. It starts by introducing the key concepts in seismology and presenting the current main precursor candidates. Four variables are considered as possible precursors or anomalies leading to earthquakes: large tidal amplitudes, temporal fluctuations in the Gutenberg-Richter b-value, surface gravity changes, and preceding anomalous seismicity patterns. To perform the causality test between these variables and their effects, a method is developed that allows the fast calculation of Transfer Entropy for any two time series, detecting the direction of the flow of information between the variables of interest. The method is tested on coupled logistic maps and networks with different topologies before being applied to geophysical events. The analysis shows mutual information relating to coupling strength, and also allows inference of the causal direction from data using Transfer Entropy, both in bivariate systems and in networks. The method was then applied to earthquake analysis over an interval of 4018 days in an area comprising the Japan Trench. Within a conservative margin of confidence, the results could not at this point confirm any of the four precursor options considered, but future work may clarify initial suggestions regarding the link between tidal amplitudes and seismicity, pre-seismic gravity changes, and cumulative daily magnitude anomalies. The Matlab/Octave codes for the method are open-source and available at https://github.com/artvalencio/causality-toolbox. We hope the method can support the quest for other precursor candidates and assist other fields of knowledge.
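A plug-in estimate of transfer entropy for short discrete series can be written in a few lines. This sketch (history length 1, base-2 logarithms, invented toy data) illustrates the quantity the thesis computes; it is not the author's actual toolbox code:

```python
from collections import Counter
import math

def transfer_entropy(x, y):
    """Plug-in transfer entropy T(X -> Y) in bits for discrete series,
    with history length 1:
    sum over (y1, y0, x0) of p(y1,y0,x0) * log2[ p(y1|y0,x0) / p(y1|y0) ]."""
    n = len(y) - 1
    triples = Counter((y[t + 1], y[t], x[t]) for t in range(n))
    pairs_yx = Counter((y[t], x[t]) for t in range(n))
    pairs_yy = Counter((y[t + 1], y[t]) for t in range(n))
    singles = Counter(y[t] for t in range(n))
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]       # p(y1 | y0, x0)
        p_cond_self = pairs_yy[(y1, y0)] / singles[y0]  # p(y1 | y0)
        te += p_joint * math.log2(p_cond_full / p_cond_self)
    return te

# Toy example: Y copies X with one step of lag, so X's past fully
# determines Y's next value and the transfer entropy is large.
x = [0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0]
y = [0] + x[:-1]
te = transfer_entropy(x, y)   # close to 1 bit for this driven pair
```

Being a conditional mutual information, the plug-in estimate is non-negative; for real catalogues the thesis additionally assesses significance against surrogates.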
19

Vermeulen, Petrus Johannes. "Problems in parameter estimation in probabilistic seismic hazard analysis and some solutions." Thesis, University of Pretoria, 2020. http://hdl.handle.net/2263/77895.

Abstract:
Probabilistic Seismic Hazard Analysis (PSHA) is not a new study field; indeed, it dates from the late 1960s. However, the original, introductory study paid scant attention to a crucial aspect, namely the estimation of the model parameters. Consequently, over the ensuing five decades, Parameter Estimation in Probabilistic Seismic Hazard Analysis (PE-PSHA) has not gained due recognition as an independent field of study. A review of the relevant body of literature indicates that PE-PSHA is not yet regarded as an entity, a coherent body of literature, or a study field. This study aims to introduce PE-PSHA as a distinct field of study. In 1968, Cornell introduced what is known today as PSHA. Although it was a landmark study, it is peculiar, even astonishing, that Cornell (1968) simply ignored the crucial aspect of the parameter estimation of models. This aspect, and the implications of ignoring the importance of parameter estimation, are discussed in detail in this thesis. Seismicity modelling in general, and the classic Cornell–McGuire procedure in particular, are introduced, providing the platform for introducing the parameters typically associated with the procedure, usually referred to as seismicity parameters. Each parameter is then discussed in detail, clarifying the development of estimation techniques as well as the problem areas that could be identified. In some instances, solutions are put forward, either as the author's own research or gleaned from the literature. A discussion is presented on the magnitude of completeness (𝑚𝑐) of seismic catalogues, along with a critical analysis of the estimation techniques currently employed. Concerns about some of these methods are discussed comprehensively and clarified by detailed argument. The two principal model parameters are then discussed, namely the Gutenberg–Richter 𝑏-value and the rate of seismicity (RoS).
A review of the estimation techniques for these parameters is presented, together with the problems encountered; this review also serves as an overview of the historical development of the estimation of the two parameters. Various solutions have been put forward for some of the problems encountered; however, these solutions are not being employed. Several estimators of the 𝑏-value for incomplete catalogues are then compared. The estimation of the maximum possible earthquake magnitude for a given area (𝑚𝑚𝑎𝑥) from seismic catalogue data is discussed. A few procedures (or estimators) have been proposed, although only by a few researchers; these estimators are discussed in some detail and analysed critically, among them methods newly investigated by the author. The concept of seismic zones is discussed since, although seismic zones are not parameters, the delineation of seismic sources is a modelling procedure that requires estimation from catalogue data in much the same way as parameter estimation (it can be regarded as a generalised part of parameter estimation). The practice of seismic zoning based largely on expert opinion is analysed critically, and a number of alternatives are discussed. In the conclusion to the study, the need for PE-PSHA to be regarded as an entity, or a separate field of study, is highlighted. In addition, the problems and solutions discussed are reviewed, and recommendations are made. Finally, possible future research areas are pointed out.
Thesis (PhD)--University of Pretoria, 2020.
Geology
PhD
Unrestricted
20

Pavlenko, Vasily. "Ground motion variability and its effect on the probabilistic seismic hazard analysis." Thesis, University of Pretoria, 2016. http://hdl.handle.net/2263/60850.

Abstract:
The majority of injuries and casualties during earthquakes occur as a result of partial or complete collapse of buildings. The assessment of possible seismic ground motions for the purposes of earthquake-resistant design can be performed by following either the deterministic or the probabilistic methodology. Chapter 1 presents an overview of current practice in seismic hazard analysis, with emphasis on probabilistic seismic hazard analysis (PSHA). At present, the Cornell-McGuire method prevails in PSHA studies. Despite significant development and modification, this method has several controversial aspects. The absence of an upper bound on the seismic hazard curve is one of the most disputed aspects of the method, as it leads to unrealistic ground motion estimates for very low probabilities of exceedance. This problem stems from the use of the unbounded log-normal distribution in modelling the ground motion variability. The main objective of the study was to investigate this variability and to suggest a more realistic probability distribution that accounts for the finiteness of the ground motion induced by an earthquake. Chapter 2 introduces a procedure suitable for studying the ground motion variability: given a data sample, it allows the most plausible probability distribution to be selected from a set of candidate models. Chapter 3 demonstrates the application of this procedure to PGA data recorded in Japan. The analysis demonstrated the superiority of the generalized extreme value distribution (GEVD) in the vast majority of the examples considered. Estimates of the shape parameter of the GEVD were negative in every example considered, indicating the presence of a finite upper bound on PGA. The GEVD therefore provides a more realistic model for the scatter of the logarithm of PGA, and the application of this model leads to a bounded seismic hazard curve.
In connection with a revival of interest in seismic intensity as an analogue of physical ground motion parameters, the problem of accounting for anisotropy in the attenuation of Modified Mercalli Intensity (MMI) is considered in Chapter 4. A set of four equations that account for this anisotropy is proposed, and their applicability is demonstrated by modelling the isoseismal maps of two well-recorded seismic events that occurred in South Africa. The results demonstrate that the new equations are in general superior to the isotropic attenuation equation, especially where the anisotropy is pronounced. As several different PSHA methods exist, it is important to know how the results of these methods correspond to one another. Chapter 5 presents a comparative study of three major PSHA methods, namely the Cornell-McGuire method, the Parametric-Historic method, and the method based on Monte Carlo simulations. Two regions in Russia were selected for the comparison, and PGA estimates were compared for return periods of 475 and 2475 years. The results indicate that the choice of a particular method has relatively little effect on the hazard estimates when the same seismic source model is used in the calculations. The PSHA methods considered provide closely related results for areas of moderate seismic activity; however, the differences among the results would apparently increase with increasing seismic activity.
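The bounded hazard curve in this abstract follows from the negative shape parameter of the GEVD, whose support then has a finite upper endpoint. A sketch of that calculation; the parameter values are illustrative, not estimates from the thesis:

```python
import math

def gev_cdf(z: float, mu: float, sigma: float, xi: float) -> float:
    """CDF of the generalised extreme value distribution (xi != 0):
    F(z) = exp(-s^(-1/xi)) with s = 1 + xi*(z - mu)/sigma on s > 0."""
    s = 1.0 + xi * (z - mu) / sigma
    if s <= 0.0:
        # Outside the support: above the endpoint for xi < 0, below it for xi > 0.
        return 1.0 if xi < 0.0 else 0.0
    return math.exp(-s ** (-1.0 / xi))

def upper_bound(mu: float, sigma: float, xi: float) -> float:
    """For xi < 0 the support is bounded above at mu - sigma/xi,
    which is what yields a bounded hazard curve."""
    assert xi < 0.0
    return mu - sigma / xi

# Illustrative parameters for the scatter of log-PGA:
mu, sigma, xi = 0.0, 1.0, -0.25
zmax = upper_bound(mu, sigma, xi)   # 4.0: no probability mass beyond this
```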
Thesis (PhD)--University of Pretoria, 2016.
Physics
PhD
Unrestricted
21

Pancha, Aasha. "Seismic hazards in the Basin and Range province, U.S.A." abstract and full text PDF (free order & download UNR users only), 2007. http://0-gateway.proquest.com.innopac.library.unr.edu/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3258844.

22

Stergiou, Evangelos, and Anne S. Kiremidjian. "Treatment of uncertainties in seismic-risk analysis of transportation systems." Berkeley, Calif.: Pacific Earthquake Engineering Research Center, 2008. http://peer.berkeley.edu/publications/peer_reports.html.

23

Itamochi, Mami. "Effective planning for seismic risk: case of Kobe, Japan." Huntington, WV: [Marshall University Libraries], 2004. http://www.marshall.edu/etd/descript.asp?ref=411.

Abstract:
Thesis (M.A.)--Marshall University, 2004.
Title from document title page. Abstract included. Document formatted into pages; contains vi, 48 p. Includes abstract. Includes bibliographical references (p. 46-48).
24

Elkady, Ahmed. "Variability and uncertainty in probabilistic seismic hazard analysis for the island of Montreal." Thesis, McGill University, 2011. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=103774.

Full text
Abstract:
The current seismic design process for structures in Montréal is based on the 2005 edition of the National Building Code of Canada (NBCC 2005), which adopts a hazard level corresponding to a probability of exceedance of 2% in 50 years. The code is based on the Uniform Hazard Spectrum (UHS) and deaggregation values obtained by the Geological Survey of Canada (GSC) with a modified version of the F-RISK software, through a process that did not formally consider epistemic uncertainty. Epistemic uncertainty is related to uncertainty in model formulation. A seismological model consists of seismic sources (source geometry, source location, recurrence rate, magnitude distribution, and maximum magnitude) and a Ground-Motion Prediction Equation (GMPE). In general, and for Montréal in particular, GMPEs are the main source of epistemic uncertainty relative to the other variables of the seismological model. The objective of this thesis is to use the CRISIS software to investigate the effect of epistemic uncertainty on probabilistic seismic hazard analysis (PSHA) products such as the UHS and deaggregation values by incorporating several new GMPEs. The epsilon (ε) parameter, which represents the departure of the target ground motion from that predicted by the GMPE, is also discussed, as it is not well documented for Eastern Canada. A method is proposed to calculate epsilon values for Montréal relative to a given GMPE and to calculate robust weighted modal epsilon values when epistemic uncertainty is considered. Epsilon values are commonly used in seismic performance evaluations for identifying design events and for selecting ground motion records for vulnerability and liquefaction studies. A brief overview of record epsilon, which accounts for the spectral shape of the ground motion time history, is also presented.
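The epsilon parameter discussed in this abstract has a standard closed form: the number of logarithmic standard deviations separating a target ground motion from the GMPE median. A minimal sketch (the median and sigma values below are illustrative assumptions, not values from the thesis):

```python
import math

def epsilon(sa_target_g, ln_median, ln_sigma):
    """Number of logarithmic standard deviations separating a target
    spectral acceleration from the GMPE median prediction."""
    return (math.log(sa_target_g) - ln_median) / ln_sigma

# Illustrative values (not from the thesis): a GMPE median Sa of 0.2 g
# with sigma = 0.6 in natural-log units. A target exactly one sigma
# above the median yields epsilon = 1.
ln_median = math.log(0.2)
ln_sigma = 0.6
eps = epsilon(0.2 * math.exp(0.6), ln_median, ln_sigma)
```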
APA, Harvard, Vancouver, ISO, and other styles
25

Weatherill, Graeme. "A Monte Carlo approach to probabilistic seismic hazard analysis in the Aegean region." Thesis, University of East Anglia, 2009. https://ueaeprints.uea.ac.uk/10630/.

Full text
Abstract:
The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. The three fundamental components of PSHA are considered: seismic source model, magnitude recurrence model and ground motion attenuation model. Initial analysis of the earthquake catalogue indicates that a doubly-truncated Gutenberg-Richter recurrence relation is an appropriate recurrence model for the Aegean. A novel seismic source model is presented, developed by interpretation of Aegean seismotectonics. The K-means cluster analysis algorithm is introduced as a new and objective means of partitioning seismicity and seismogenic faults to achieve source zone delineation. Partitions of the seismicity containing 20 to 30 earthquake clusters emerge as the most appropriate for modelling seismicity in the Aegean. The 27 and 29 cluster K-means source models are integrated into the seismic hazard analysis alongside existing source models. Attenuation models are reviewed (including European, Greek and global Next Generation Attenuation models) and their suitability for the Aegean region is qualitatively and quantitatively assessed. Seismic hazard maps are produced and site-specific seismic hazard analyses undertaken for 8 selected cities across the Aegean. Epistemic uncertainty is qualitatively assessed by consideration of different source and attenuation models, before being integrated into the PSHA via the Monte Carlo technique. Further extensions to this method (fault and site characterization and aftershock simulation) are presented and their impact on the PSHA assessed. 
Fault and site characterization appear to have a significant impact on the outcome of the seismic hazard analysis.
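The Monte Carlo approach to PSHA summarized above — simulating many synthetic years of seismicity and counting ground-motion exceedances directly — can be sketched as follows. This is a minimal illustration, not the thesis implementation: the doubly-truncated Gutenberg-Richter sampler is standard, but `toy_ln_pga`, the rates, and the magnitude bounds are placeholder assumptions.

```python
import math
import random

def sample_gr_magnitude(rng, b_value, m_min, m_max):
    """Inverse-CDF sample from a doubly-truncated Gutenberg-Richter law."""
    beta = b_value * math.log(10.0)
    c = 1.0 - math.exp(-beta * (m_max - m_min))
    return m_min - math.log(1.0 - rng.random() * c) / beta

def poisson_sample(rng, lam):
    """Knuth's method; adequate for the small annual rates used here."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def toy_ln_pga(magnitude, r_km):
    """Placeholder attenuation relation (NOT a published GMPE): ln PGA in g."""
    return -4.0 + 1.0 * magnitude - 1.3 * math.log(r_km + 10.0)

def mc_annual_exceedance(n_years, rate, b_value, m_min, m_max, r_km,
                         pga_g, sigma=0.6, seed=42):
    """Fraction of simulated years with at least one exceedance of pga_g."""
    rng = random.Random(seed)
    hit_years = 0
    for _ in range(n_years):
        year_hit = False
        for _ in range(poisson_sample(rng, rate)):
            m = sample_gr_magnitude(rng, b_value, m_min, m_max)
            pga = math.exp(toy_ln_pga(m, r_km) + rng.gauss(0.0, sigma))
            year_hit = year_hit or pga >= pga_g
        hit_years += year_hit
    return hit_years / n_years

# Same seed, two thresholds: the lower threshold is exceeded at least as often.
p_low = mc_annual_exceedance(1000, 0.5, 1.0, 4.5, 7.5, 30.0, 0.05)
p_high = mc_annual_exceedance(1000, 0.5, 1.0, 4.5, 7.5, 30.0, 0.30)
```

Because the same seed consumes an identical random stream for both thresholds, the exceedance frequency is monotone in the threshold by construction.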
APA, Harvard, Vancouver, ISO, and other styles
26

Astorga, Mejia Marlem Lucia. "Simplified Performance-Based Analysis for Seismic Slope Displacements." BYU ScholarsArchive, 2016. https://scholarsarchive.byu.edu/etd/5963.

Full text
Abstract:
Millions of lives have been lost over the years as a result of the effects of earthquakes. One of these devastating effects is slope failure, more commonly known as landslide. Over the years, seismologists and engineers have teamed up to better record data during an earthquake. As technology has advanced, the data obtained have become more refined, allowing engineers to use the data in their efforts to estimate earthquakes where they have not yet occurred. Several methods have been proposed over time to utilize the earthquake data and estimate slope displacements. A pioneer in the development of methods to estimate slope displacements, Nathan Newmark, proposed what is now called the Newmark sliding block method. This method explained in very simple ways how a mass, in this case a rigid block, would slide over an incline given that the acceleration of the block surpassed the frictional resistance created between the bottom of the block and the surface of the incline. Because many of the assumptions from this method were criticized by scientists over time, modified Newmark sliding block methods were proposed. As the original and modified Newmark sliding block methods were introduced, the need to account for the uncertainty in the way soil would behave under earthquake loading became a big challenge. Deterministic and probabilistic methods have been used to incorporate parameters that would account for some of the uncertainty in the analysis. In an attempt to use a probabilistic approach in understanding how slopes might fail, the Pacific Earthquake Engineering Research Center proposed a performance-based earthquake engineering framework that would allow decision-makers to use probabilistically generated information to make decisions based on acceptable risk. 
Previous researchers applied this framework to simplified Newmark sliding block models, but the approach is difficult for engineers to implement in practice because of the numerous probability calculations that are required. The work presented in this thesis addresses the implementation of the performance-based approach by providing a simplified procedure for the performance-based determination of seismic slope displacements using the Rathje and Saygili (2009) and Bray and Travasarou (2007) simplified Newmark sliding block models. This document also includes hazard parameter maps, which are an important part of the simplified procedure, for five states in the United States. A validation of the method is provided, as well as a comparison of the simplified method against other commonly used approaches such as the deterministic and pseudo-probabilistic approaches.
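The original Newmark rigid sliding block idea summarized above can be sketched in a few lines: the block accumulates sliding velocity whenever the input acceleration exceeds the yield acceleration, and decelerates at the yield level once sliding. A minimal one-directional sketch of the classic rigid-block calculation, not the Rathje-Saygili or Bray-Travasarou models:

```python
def newmark_displacement(accel_g, dt, ky_g, g=9.81):
    """Permanent sliding displacement (m) of a rigid block on an incline.

    accel_g : input acceleration time history in units of g
    dt      : time step (s)
    ky_g    : yield acceleration in units of g
    """
    v = 0.0  # relative sliding velocity (m/s)
    d = 0.0  # accumulated sliding displacement (m)
    for a in accel_g:
        if a > ky_g or v > 0.0:
            # The block accelerates relative to the slope when a > ky; once
            # sliding, it decelerates at the yield level until v returns to 0.
            v = max(0.0, v + (a - ky_g) * g * dt)
        d += v * dt
    return d

# A pulse above the yield acceleration produces permanent displacement;
# a record that never exceeds it produces none.
d_pulse = newmark_displacement([0.3] * 100 + [0.0] * 300, 0.01, 0.1)
d_none = newmark_displacement([0.05] * 400, 0.01, 0.1)
```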
APA, Harvard, Vancouver, ISO, and other styles
27

Scott, James B. "Seismic noise in the shallow subsurface methods for using it in earthquake hazard assessment /." abstract and full text PDF (free order & download UNR users only), 2007. http://0-gateway.proquest.com.innopac.library.unr.edu/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3258847.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Faraji, Mahdi. "SEISMIC PERFORMANCE AND DISASTER MANAGEMENT OF INTERDEPENDENT CRITICAL INFRASTRUCTURES." 京都大学 (Kyoto University), 2012. http://hdl.handle.net/2433/160998.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Fernandez, Leon J. Alfredo. "Numerical Simulation of Earthquake Ground Motions in the Upper Mississippi Embayment." Diss., Georgia Institute of Technology, 2007. http://hdl.handle.net/1853/19825.

Full text
Abstract:
Earthquake ground motions are needed to evaluate the seismic performance of new and existing structures and facilities. In seismically active regions, the database of strong ground motion recordings is usually large enough to physically constrain ground motion estimates for seismic risk assessment. However, in areas with low rates of seismicity, particularly in the Central and Eastern United States, estimating strong ground motions for a specified magnitude, distance, and site condition is a significant challenge. The only available approach for ground motion estimation in this region is numerical simulation. In this study, earthquake ground motions have been generated for the Upper Mississippi Embayment using a numerical wave propagation formulation. The effects of epistemic and aleatory uncertainties in the earthquake source, path, and site processes, the effect of non-linear soil behavior, and the effects of the geometry of the Embayment have been incorporated. The ground motions are intended to better characterize the seismic hazard in the Upper Mississippi Embayment by representing the amplitude and variability that might be observed in real earthquakes and to provide resources to evaluate the seismic risk in the region.
APA, Harvard, Vancouver, ISO, and other styles
30

Liang, Jonathan Zhongyuan. "Seismic risk analysis of Perth metropolitan area." University of Western Australia. School of Civil and Resource Engineering, 2009. http://theses.library.uwa.edu.au/adt-WU2009.0142.

Full text
Abstract:
[Truncated abstract] Perth is the capital city of Western Australia (WA) and the home of more than three quarters of the population in the state. It is located in southwest WA (SWWA), a low to moderate seismic region but the most seismically active region in Australia. The 1968 ML6.9 Meckering earthquake, which was about 130 km from the Perth Metropolitan Area (PMA), caused only minor to moderate damage in PMA. With the rapid increase in population in PMA, compared to 1968, many new structures including some high-rise buildings have been constructed in PMA. Moreover, increased seismic activity and a few strong ground motions have been recorded in the SWWA. Therefore it is necessary to evaluate the seismic risk of PMA under the current conditions. This thesis presents results from a comprehensive study of seismic risk of PMA. This includes development of ground motion attenuation relations, ground motion time history simulation, site characterization and response analysis, and structural response analysis. As only a very limited number of earthquake strong ground motion records are available in SWWA, it is difficult to derive a reliable and unbiased strong ground motion attenuation model based on these data. To overcome this, in this study a combined approach is used to simulate ground motions. First, the stochastic approach is used to simulate ground motion time histories at various epicentral distances from small earthquake events. Then, the Green's function method, with the stochastically simulated time histories as input, is used to generate large event ground motion time histories. Comparing the Fourier spectra of the simulated motions with the recorded motions of a ML6.2 event in Cadoux in June 1979 and a ML5.5 event in Meckering in January 1990 provides good evidence in support of this method. This approach is then used to simulate a series of ground motion time histories from earthquakes of varying magnitudes and distances. ... 
The responses of three typical Perth structures, namely a masonry house, a middle-rise reinforced concrete frame structure, and a high-rise building of reinforced concrete frame with core wall, on various soil sites subjected to the predicted earthquake ground motions of different return periods are calculated. Numerical results indicate that the one-storey unreinforced masonry wall (UMW) building is unlikely to be damaged when subjected to the 475-year return period earthquake ground motion. However, it will suffer slight damage during the 2475-year return period earthquake ground motion at some sites. The six-storey RC frame with masonry infill wall is also safe under the 475-year return period ground motion. However, the infill masonry wall will suffer severe damage under the 2475-year return period earthquake ground motion at some sites. The 34-storey RC frame with core wall will not experience any damage under the 475-year return period ground motion. The building will, however, suffer light to moderate damage during the 2475-year return period ground motion, but it might not be life threatening.
APA, Harvard, Vancouver, ISO, and other styles
31

Martin, David N. "Evaluation and comparison of a non-seismic design and seismic design for a low rise office building." Master's thesis, This resource online, 1993. http://scholar.lib.vt.edu/theses/available/etd-03172010-020113/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Cabas, Mijares Ashly Margot. "Improvements to the Assessment of Site-Specific Seismic Hazards." Diss., Virginia Tech, 2016. http://hdl.handle.net/10919/82352.

Full text
Abstract:
The understanding of the impact of site effects on ground motions is crucial for improving the assessment of seismic hazards. Site response analyses (SRA) can numerically accommodate the mechanics behind the wave propagation phenomena near the surface as well as the variability associated with the input motion and soil properties. As a result, SRA constitute a key component of the assessment of site-specific seismic hazards within the probabilistic seismic hazard analysis framework. This work focuses on limitations in SRA, namely, the definition of the elastic half-space (EHS) boundary condition, the selection of input ground motions so that they are compatible with the assumed EHS properties, and the proper consideration of near-surface attenuation effects. Input motions are commonly selected based on similarities between the shear wave velocity (Vs) at the recording station and the materials below the reference depth at the study site (among other aspects such as the intensity of the expected ground motion, distance to rupture, type of source, etc.). This traditional approach disregards the influence of the attenuation in the shallow crust and the degree to which it can alter the estimates of site response. A Vs-κ correction framework for input motions is proposed to render them compatible with the properties of the assumed EHS at the site. An ideal EHS must satisfy the conditions of linearity and homogeneity. It is usually defined at a horizon where no strong impedance contrast will be found below that depth (typically the top of bedrock). However, engineers face challenges when dealing with sites where this strong impedance contrast takes place far beyond the depth of typical Vs measurements. Case studies are presented to illustrate potential issues associated with the selection of the EHS boundary in SRA. 
Additionally, the relationship between damping values as considered in geotechnical laboratory-based models and as implied by seismological attenuation parameters measured from ground motions recorded in the field is investigated, in order to propose alternative damping models that more closely match the attenuation of seismic waves in the field.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
33

Galybin, Konstantin A. "P-wave velocity model for the southwest of the Yilgarn Craton, Western Australia and its relation to the local geology and seismicity /." Connect to this title, 2006. http://theses.library.uwa.edu.au/adt-WU2007.0167.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Unsal, Oral Sevinc. "An Integrated Seismic Hazard Framework For Liquefaction Triggering Assessment Of Earthfill Dams'." Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/3/12610362/index.pdf.

Full text
Abstract:
Within the confines of this study, the seismic soil liquefaction triggering potential of a dam foundation is assessed within an integrated probabilistic seismic hazard assessment framework. More specifically, the scheme presented here directly integrates effective stress-based seismic soil liquefaction triggering assessment with the seismic hazard analysis framework, supported by an illustrative case. The proposed methodology successively i) processes the discrete stages of the probabilistic seismic hazard workflow upon seismic source characterization, ii) numerically develops the target elastic acceleration response spectra for typical rock sites, covering all the earthquake scenarios that are re-grouped with respect to earthquake magnitude and distance, iii) matches the strong ground motion records selected from a database with the target response spectra for every defined scenario, and iv) performs 2-D equivalent linear seismic response analyses of a 56 m high earth fill dam founded on 24 m thick alluvial deposits. Results of seismic response analyses are presented in the form of annual probabilities of excess pore pressure ratios and seismically-induced lateral deformations exceeding various threshold values. For the purpose of assessing the safety of the dam slopes, phi-c reduction based slope stability analyses were also performed representing post-liquefaction conditions. After integrating these phi-c reduction analysis results into the probabilistic hazard framework, annual probabilities of the factor of safety of slopes exceeding various threshold values were estimated. As a concluding remark, probabilities of liquefaction triggering, induced deformations, and factors of safety are presented for a service life of 100 years. 
It is believed that the proposed probabilistic seismic performance assessment methodology, which incorporates both phi-c reduction based failure probabilities and seismic soil liquefaction-induced deformation potentials, provides dam engineers with a robust methodology to rationally quantify the level of confidence in their decisions regarding whether costly mitigation of dam foundation soils against seismic soil liquefaction triggering hazard and induced risks is necessary.
APA, Harvard, Vancouver, ISO, and other styles
35

Taroni, Matteo <1984&gt. "Earthquake forecasting and seismic hazard analysis: some insights on the testing phase and the modeling." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amsdottorato.unibo.it/6422/1/Taroni_Matteo_Tesi.pdf.

Full text
Abstract:
This thesis is divided in three chapters. In the first chapter we analyse the results of the world forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure to evaluate earthquake forecasting models. We first present the models and the target earthquakes to be forecast. Then we explain the consistency and comparison tests that are used in CSEP experiments to evaluate the performance of the models. Introducing a methodology to create ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of PSHA: the declustering of the seismicity rates. We first introduce the Cornell-McGuire method for PSHA and we present the different motivations that stand behind the need of declustering seismic catalogs. Using a theorem of modern probability theory (Le Cam's theorem) we show that declustering is not necessary to obtain the Poissonian behaviour of exceedances that is usually considered fundamental for transforming exceedance rates into exceedance probabilities in the PSHA framework. We present a method to correct PSHA for declustering, building a more realistic PSHA. In the last chapter we explore the methods that are commonly used to take into account the epistemic uncertainty in PSHA. The most widely used method is the logic tree, which underlies the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree, and then we show that this structure is not adequate to describe the epistemic uncertainty. We then propose a new probabilistic framework based on ensemble modelling that properly accounts for epistemic uncertainties in PSHA.
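The transformation of exceedance rates into exceedance probabilities mentioned above rests on the Poisson assumption; a minimal sketch of the standard conversion (for example, 10% in 50 years corresponds to a roughly 475-year return period, and 2% in 50 years to roughly 2475 years):

```python
import math

def exceedance_probability(annual_rate, t_years):
    """Poisson assumption: probability of at least one exceedance in t years."""
    return 1.0 - math.exp(-annual_rate * t_years)

def return_period(prob, t_years):
    """Return period implied by a given exceedance probability over t years."""
    return -t_years / math.log(1.0 - prob)

# The two design levels quoted in these abstracts:
rp_475 = return_period(0.10, 50.0)   # ~475 years
rp_2475 = return_period(0.02, 50.0)  # ~2475 years
```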
APA, Harvard, Vancouver, ISO, and other styles
36

Taroni, Matteo <1984&gt. "Earthquake forecasting and seismic hazard analysis: some insights on the testing phase and the modeling." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amsdottorato.unibo.it/6422/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Shafieezadeh, Abdollah. "Seismic vulnerability assessment of wharf structures." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/41203.

Full text
Abstract:
Serving as critical gateways for international trade, seaports are pivotal elements in transportation networks. Any disruption in the activities of port infrastructures may lead to significant losses from secondary economic effects, and can hamper the response and recovery efforts following a natural disaster. Particularly poignant examples which revealed the significance of port operations were the 1995 Kobe earthquake and 2010 Haiti earthquake in which liquefaction and lateral spreading of embankments imposed severe damage to both structural and non-structural components of ports. Since container wharf structures are responsible for loading and unloading of cargo, it is essential to understand the performance of these structures during earthquakes. Although previous studies have provided insight into some aspects of the seismic response of wharves, limitations in the modeling of wharf structures and the surrounding soil media have constrained the understanding of various features of the wharf response. This research provides new insights into the seismic behavior of wharves by using new and advanced structure and soil modeling procedures to carry out two and three-dimensional seismic analyses of a pile-supported marginal wharf structure in liquefiable soils. Furthermore, this research investigates the interaction between cranes and wharves and closely assesses the role of wharf-crane interaction on the response of each of these systems. For this purpose, the specific effect of wharf-crane interaction is studied by incorporating advanced models of the crane with sliding/uplift base conditions. To reduce the computational time required for three-dimensional nonlinear dynamic analysis of the wharf in order to be applicable for probabilistic seismic demand analysis, a simplified wharf model and an analysis technique are introduced and verified. 
In the next step, probabilistic seismic demand models (PSDMs) are generated by subjecting the wharf models to a suite of embankment ground deformations and pore water pressures generated for this study through free-field analysis. Convolving the PSDMs with the limit states, a set of fragility curves is developed for critical wharf components whose damage disrupts the normal operation of ports. The developed fragility curves provide decision makers with essential tools for prioritizing investment in wharf retrofit and fill a major gap in the seismic risk assessment of seaports, enabling assessment of the regional impact of damage to wharves during a natural hazard event.
APA, Harvard, Vancouver, ISO, and other styles
38

Singh, Bina Aruna. "GIS based assessment of seismic risk for the Christchurch CBD and Mount Pleasant, New Zealand." Thesis, University of Canterbury. Geography, 2006. http://hdl.handle.net/10092/1302.

Full text
Abstract:
This research employs a deterministic seismic risk assessment methodology to assess the potential damage and loss at meshblock level in the Christchurch CBD and Mount Pleasant, primarily due to building damage caused by earthquake ground shaking. Expected losses in terms of dollar value and casualties are calculated for two earthquake scenarios. Findings are based on: (1) data describing the earthquake ground shaking and microzonation effects; (2) an inventory of buildings by value, floor area, replacement value, occupancy and age; (3) damage ratios defining the performance of buildings as a function of earthquake intensity; (4) daytime and night-time population distribution data and (5) casualty functions defining casualty risk as a function of building damage. A GIS serves as a platform for collecting, storing and analyzing the original and the derived data. It also allows for easy display of input and output data, providing a critical functionality for communication of outcomes. The results of this study suggest that economic losses due to building damage in the Christchurch CBD and Mount Pleasant will possibly be on the order of $5.6 million and $35.3 million in a magnitude 8.0 Alpine fault earthquake and a magnitude 7.0 Ashley fault earthquake, respectively. Damage to non-residential buildings constitutes the vast majority of the economic loss. Casualty numbers are expected to be between 0 and 10.
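The loss calculation described above — damage ratios as a function of shaking intensity, applied to an inventory of building replacement values per meshblock — can be sketched as follows. The damage ratios and values below are hypothetical illustrations, not figures from the study:

```python
# Hypothetical mean damage ratios by MMI for one building class
# (illustrative values only, not from the study).
DAMAGE_RATIO_BY_MMI = {6: 0.01, 7: 0.05, 8: 0.15, 9: 0.35}

def building_loss(replacement_value, damage_ratio):
    """Direct economic loss: replacement value scaled by a damage ratio."""
    return replacement_value * damage_ratio

def meshblock_loss(replacement_values, mmi):
    """Aggregate loss for one meshblock at a given shaking intensity."""
    dr = DAMAGE_RATIO_BY_MMI.get(mmi, 0.0)
    return sum(building_loss(v, dr) for v in replacement_values)

# Two buildings worth $1.0M and $0.5M at MMI 8 (15% mean damage ratio).
loss = meshblock_loss([1_000_000, 500_000], 8)
```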
APA, Harvard, Vancouver, ISO, and other styles
39

Kiuchi, Ryota. "New Ground Motion Prediction Equations for Saudi Arabia and their Application to Probabilistic Seismic Hazard Analysis." Kyoto University, 2020. http://hdl.handle.net/2433/253095.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Händel, Annabel [author], Frank Scherbaum [academic supervisor], and Frank Krüger [academic supervisor]. "Ground-motion model selection and adjustment for seismic hazard analysis / Annabel Händel ; Frank Scherbaum, Frank Krüger." Potsdam : Universität Potsdam, 2018. http://d-nb.info/121840406X/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Brannon, Brittany Ann. "Faulty Measurements and Shaky Tools: An Exploration into Hazus and the Seismic Vulnerabilities of Portland, OR." PDXScholar, 2013. https://pdxscholar.library.pdx.edu/open_access_etds/1410.

Full text
Abstract:
Events or forces of nature with catastrophic consequences, or "natural disasters," have increased in both frequency and force due to climate change and increased urbanization in climate-sensitive areas. To create capacity to face these dangers, an entity must first quantify the threat and translate scientific knowledge on nature into comprehensible estimates of cost and loss. These estimates equip those at risk with knowledge to enact policy, formulate mitigation plans, raise awareness, and promote preparedness in light of potential destruction. Hazards-United States, or Hazus, is one such tool created by the federal government to estimate loss from a variety of threats, including earthquakes, hurricanes, and floods. Private and governmental agencies use Hazus to provide information and support to enact mitigation measures, craft plans, and create insurance assessments; hence the results of Hazus can have lasting and irreversible effects once the hazard in question occurs. This thesis addresses this problem and sheds light on the obvious and deterministic failings of Hazus in the context of the probable earthquake in Portland, OR; stripping away the tool's black box and exposing the grim vulnerabilities it fails to account for. The purpose of this thesis is twofold. First, this thesis aims to examine the critical flaws within Hazus and the omitted vulnerabilities particular to the Portland region and likely relevant in other areas of study. Second and more nationally applicable, this thesis intends to examine the influence Hazus outputs can have in the framing of seismic risk by the non-expert public. Combining the problem of inadequate understanding of risk in Portland with the questionable faith in Hazus alludes to a larger, socio-technical situation in need of attention by the academic and hazard mitigation community. 
This thesis addresses those issues in scope and adds to the growing body of literature on defining risk, hazard mitigation, and the consequences of natural disasters to urban environments.
APA, Harvard, Vancouver, ISO, and other styles
42

Orton, Alice M. "SCIENCE AND PUBLIC POLICY OF EARTHQUAKE HAZARD MITIGATION IN THE NEW MADRID SEISMIC ZONE." UKnowledge, 2014. http://uknowledge.uky.edu/ees_etds/19.

Full text
Abstract:
In the central United States, undefined earthquake sources, long earthquake recurrence intervals and uncertain ground motion attenuation models have contributed to an overstatement of regional seismic hazard for the New Madrid Seismic Zone on the National Seismic Hazard Maps. This study examined concerns regarding scientific uncertainties, overly stringent seismic mitigation policies, and a depressed local economy in western Kentucky through a series of informal interviews with local businessmen, public officials, and other professionals in occupations associated with seismic mitigation. Scientific and relative economic analyses were then performed using scenario earthquake models developed with FEMA's Hazus-MH software. Effects of the 2008 Wenchuan earthquake in central China and the seismic mitigation policies in use there were considered for potential parallels and learning opportunities. Finally, suggestions for continued scientific research, additional educational opportunities for laymen and engineering professionals, and changes in the application of current earthquake science to public policy in the central United States were outlined with the goal of easing western Kentucky's economic issues while maintaining acceptable public safety conditions.
APA, Harvard, Vancouver, ISO, and other styles
43

Nilsson, Emily Michelle. "Seismic risk assessment of the transportation network of Charleston, SC." Thesis, Atlanta, Ga. : Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/22554.

Full text
Abstract:
Thesis (M. S.)--Civil and Environmental Engineering, Georgia Institute of Technology, 2008.
Committee Chair: Dr. Reginald DesRoches; Committee Member: Dr. Barry Goodno; Committee Member: Dr. Laurence Jacobs; Committee Member: Dr. Mulalo Doyoyo.
APA, Harvard, Vancouver, ISO, and other styles
44

Sullins, Eric James. "Analysis of radio communication towers subjected to wind, ice and seismic loadings." Diss., Columbia, Mo. : University of Missouri-Columbia, 2006. http://hdl.handle.net/10355/4561.

Full text
Abstract:
Thesis (M.S.)--University of Missouri-Columbia, 2006.
The entire dissertation/thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file (which also appears in the research.pdf); a non-technical general description, or public abstract, appears in the public.pdf file. Title from title screen of research.pdf file, viewed on February 23, 2006. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
45

Bora, Sanjay Singh [Verfasser], and Frank [Akademischer Betreuer] Scherbaum. "Regionally adaptable ground-motion Prediction Equations (GMPEs) for seismic hazard analysis / Sanjay Singh Bora ; Betreuer: Frank Scherbaum." Potsdam : Universität Potsdam, 2015. http://d-nb.info/1219513946/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

VALAGUSSA, ANDREA. "Relationships between landslides size distribution and earthquake source area in a perspective of seismic hazard zoning." Doctoral thesis, Università degli Studi di Milano-Bicocca, 2015. http://hdl.handle.net/10281/68458.

Full text
Abstract:
Earthquakes have been recognized as a major cause of landsliding (Keefer, 1984), and landslides triggered by earthquakes have been documented since the IV century (Seed, 1968). The spatial distribution of earthquake-induced landslides around the seismogenic source has been analysed to better understand the triggering of landslides in seismic areas and to forecast the maximum distance at which an earthquake of a given magnitude can trigger landslides. However, when applying such approaches to old earthquakes one should be concerned about the undersampling of smaller landslides, which can be erased by erosion and landscape evolution. For this reason, it is important to characterize carefully the size distribution of landslides as a function of distance from the earthquake source. I analysed six earthquakes around the world that triggered a significant number of landslides (Finisterre 1993, Northridge 1994, Niigata 2004, Wenchuan 2008, Iwate 2008 and Tohoku 2011) to better understand the relation between the spatial distribution of the landslides, the peak ground acceleration (PGA), the distance from the source, the relief and the lithologies of the area. I observed a strong relationship between landslide size and PGA, while the relationship between the distance from the source and the landslide size distribution is not clear, due to the interaction of different factors such as relief and lithology. I also developed magnitude-frequency curves (MFC) for different distances from the source area using different methods: the maximum likelihood estimator of the cumulative power-law distribution (Clauset et al., 2009); the maximum likelihood estimator of the non-cumulative power-law function; the least-squares regression of the non-cumulative log power-law function; and the maximum likelihood estimator of the Double Pareto distribution. I observed a decrease of the spatial density of landslides with distance, with a small effect of the size of these landslides.
I also identify the Double Pareto function as the best tool for fitting the data (Valagussa et al., 2014a). In order to define the hazard due to earthquake-induced landslides, I developed a methodology for quantitative probabilistic hazard zonation for rockfalls (Valagussa et al., 2014b). I applied and demonstrated the method in the area of Friuli (Eastern Italian Alps) that was affected by the 1976 Mw 6.5 earthquake. Four rockfall datasets have been prepared from both historical data and field surveys. The methodology relies on a three-dimensional hazard vector (RHVmod), whose components include the rockfall kinetic energy, the fly height, and the annual frequency. The values of the first two components are calculated for each location along the slope using the 3D rockfall runout simulator Hy-STONE. The rockfall annual frequency is assessed by multiplying the annual onset frequency by the simulated transit frequency. The annual onset frequency is calculated through a procedure that combines the extent of unstable areas, calculated for 10 different seismic-hazard scenarios with different annual frequencies of occurrence, and the magnitude relative-frequency relationship of blocks as derived from the collected field data. For each annual frequency of occurrence, the unstable area is calculated as a function of morphometric and earthquake characteristics. A series of discriminant-analysis models, using the rockfall datasets and DEMs of different resolution (1 and 10 m), identified the controlling variables and verified the model robustness. In contrast with previously published research, I show that the slope curvature plays a relevant role in the computation of the unstable area. To verify the validity of the peak ground acceleration used as a seismic parameter in the discriminant function, I also define a map of PGA based on the precariously balanced rocks surveyed in the field.
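The annual-frequency combination this abstract describes (annual onset frequency multiplied by the simulated transit frequency at each slope cell) can be sketched in a few lines. This is a minimal illustration of the arithmetic only; the function name and the numbers are assumptions for illustration, not values from the thesis or from Hy-STONE.

```python
# Minimal sketch of the annual-frequency step described in the abstract:
# annual frequency at a slope cell = annual onset frequency multiplied by
# the simulated transit frequency (transits through the cell divided by
# the total number of simulated blocks). Names and values are illustrative.

def cell_annual_frequency(onset_freq_per_year, transits_in_cell, total_simulated_blocks):
    """Annual rockfall frequency at one slope cell (events/year)."""
    transit_frequency = transits_in_cell / total_simulated_blocks
    return onset_freq_per_year * transit_frequency

# e.g. 0.05 onsets/year, with 20 of 1000 simulated blocks crossing this cell
f = cell_annual_frequency(0.05, 20, 1000)  # -> 0.001 events/year
```

In the methodology itself this per-cell frequency is then combined with the simulated kinetic energy and fly height to form the three components of the RHVmod hazard vector.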
APA, Harvard, Vancouver, ISO, and other styles
47

Scheingraber, Christoph [Verfasser], and Heiner [Akademischer Betreuer] Igel. "Efficient treatment and quantification of uncertainty in probabilistic seismic hazard and risk analysis / Christoph Scheingraber ; Betreuer: Heiner Igel." München : Universitätsbibliothek der Ludwig-Maximilians-Universität, 2019. http://d-nb.info/1179075986/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Joshi, Varun Anil. "Near-Fault Forward-Directivity Aspects of Strong Ground Motions in the 2010-11 Canterbury Earthquakes." Thesis, University of Canterbury. Department of Civil and Natural Resources Engineering, 2013. http://hdl.handle.net/10092/8987.

Full text
Abstract:
The purpose of this thesis is to conduct a detailed examination of the forward-directivity characteristics of near-fault ground motions produced in the 2010-11 Canterbury earthquakes, including evaluating the efficacy of several existing empirical models which form the basis of frameworks for considering directivity in seismic hazard assessment. A wavelet-based pulse classification algorithm developed by Baker (2007) is firstly used to identify and characterise ground motions which demonstrate evidence of forward-directivity effects from significant events in the Canterbury earthquake sequence. The algorithm fails to classify a large number of ground motions which clearly exhibit an early-arriving directivity pulse due to: (i) incorrect pulse extraction resulting from the presence of pulse-like features caused by other physical phenomena; and (ii) inadequacy of the pulse indicator score used to carry out binary pulse-like/non-pulse-like classification. An alternative 'manual' approach is proposed to ensure 'correct' pulse extraction, and the classification process is also guided by examination of the horizontal velocity trajectory plots and source-to-site geometry. Based on the above analysis, 59 pulse-like ground motions are identified from the Canterbury earthquakes which, in the author's opinion, are caused by forward-directivity effects. The pulses are also characterised in terms of their period and amplitude. A revised version of the B07 algorithm developed by Shahi (2013) is also subsequently utilised, but without any notable improvement in the pulse classification results. A series of three chapters are dedicated to assessing the predictive capabilities of empirical models for: (i) the probability of pulse occurrence; (ii) the response spectrum amplification caused by the directivity pulse; and (iii) the period and amplitude (peak ground velocity, PGV) of the directivity pulse, using observations from four significant events in the Canterbury earthquakes.
Based on the results of logistic regression analysis, it is found that the pulse probability model of Shahi (2013) provides the most improved predictions in comparison to its predecessors. Pulse probability contour maps are developed to scrutinise observations of pulses/non-pulses with predicted probabilities. A direct comparison of the observed and predicted directivity amplification of acceleration response spectra reveals the inadequacy of broadband directivity models, which form the basis of the near-fault factor in the New Zealand loadings standard, NZS1170.5:2004. In contrast, a recently developed narrowband model by Shahi & Baker (2011) provides significantly improved predictions by amplifying the response spectra within a small range of periods. The significant positive bias demonstrated by the residuals associated with all models at longer vibration periods (in the Mw7.1 Darfield and Mw6.2 Christchurch earthquakes) is likely due to the influence of basin-induced surface waves and non-linear soil response. Empirical models for the pulse period notably under-predict observations from the Darfield and Christchurch earthquakes, inferred as being a result of both the effect of nonlinear site response and influence of the Canterbury basin. In contrast, observed pulse periods from the smaller magnitude June (Mw6.0) and December (Mw5.9) 2011 earthquakes are in good agreement with predictions. Models for the pulse amplitude generally provide accurate estimates of the observations at source-to-site distances between 1 km and 10 km. At longer distances, observed PGVs are significantly under-predicted due to their slower apparent attenuation. Mixed-effects regression is employed to develop revised models for both parameters using the latest NGA-West2 pulse-like ground motion database. A pulse period relationship which accounts for the effect of faulting mechanism using rake angle as a continuous predictor variable is developed. 
The use of a larger database in model development, however, does not result in improved predictions of pulse period for the Darfield and Christchurch earthquakes. In contrast, the revised model for PGV provides a more appropriate attenuation of the pulse amplitude with distance, and does not exhibit the bias associated with previous models. Finally, the effects of near-fault directivity are explicitly included in NZ-specific probabilistic seismic hazard analysis (PSHA) using the narrowband directivity model of Shahi & Baker (2011). Seismic hazard analyses are conducted with and without considering directivity for typical sites in Christchurch and Otira. The inadequacy of the near-fault factor in NZS1170.5:2004 is apparent based on a comparison with the directivity amplification obtained from PSHA.
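The pulse-probability models evaluated in this thesis are logistic regressions on source-to-site geometry. A generic sketch of that functional form is shown below; the coefficients and the two geometry parameters are placeholders chosen for illustration, not the fitted values of Shahi (2013) or its predecessors.

```python
import math

# Generic logistic form used by near-fault pulse-probability models:
# P(pulse) = 1 / (1 + exp(-(a + b*r + c*s))), where r is a source-to-site
# distance and s is the rupture length traversed toward the site.
# The coefficients below are illustrative placeholders, not fitted values.

def pulse_probability(r_km, s_km, a=0.9, b=-0.19, c=0.07):
    """Probability that a record at this geometry contains a directivity pulse."""
    return 1.0 / (1.0 + math.exp(-(a + b * r_km + c * s_km)))

# Probability decays with distance from the rupture, as expected
p_near = pulse_probability(r_km=3.0, s_km=10.0)
p_far = pulse_probability(r_km=30.0, s_km=10.0)
```

Contour maps of this probability over a region, as in the thesis, amount to evaluating such a function on a grid of site locations around the rupture.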
APA, Harvard, Vancouver, ISO, and other styles
49

Burden, Lindsay Ivey. "Forecasting earthquake losses in port systems." Diss., Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/43615.

Full text
Abstract:
Ports play a critical role in transportation infrastructure, but are vulnerable to seismic hazards. Downtime and reduced throughput from seismic damage in ports result in significant business interruption losses for port stakeholders. Current risk management practices focus only on the effect of seismic hazards on individual port structures. However, damage and downtime of these structures have a significant impact on the overall port system's ship handling operations and on the regional, national, and even international economic impacts that result from extended earthquake-induced disruption of a major container port. Managing risks from system-wide disruptions resulting from earthquake damage has been studied as a central element of a Grand Challenge project sponsored by the National Science Foundation Network for Earthquake Engineering Simulation (NEES) program. The following thesis presents the concepts and methods developed for the seismic risk management of a port-wide system of berths. In particular, the thesis discusses the framework used to calculate port losses: the use of spatially correlated ground motion intensity measures to estimate damage to pile-supported marginal wharves and container cranes of various configurations via fragility relationships developed by project team members, repair costs and downtimes subsequently determined via repair models for both types of structures, and the impact on cargo handling operations calculated via logistical models of the port system. Results are expressed in the form of loss exceedance curves that include both repair/replacement costs and business interruption losses. The thesis also discusses how the results from such an analysis might be used by port decision makers to make more informed decisions about design, retrofit, operational, and other seismic risk management options.
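The loss exceedance curves mentioned in this abstract can be built empirically from a set of simulated scenario losses. The sketch below shows one common way to do this; the sample values are invented for illustration and do not come from the thesis.

```python
# Empirical loss exceedance curve from simulated scenario losses.
# For each simulated loss level l, estimate P(L > l) as the fraction of
# simulations whose loss exceeds l. The values below are illustrative.

def loss_exceedance_curve(losses):
    """Return sorted (loss, exceedance probability) pairs."""
    n = len(losses)
    return [(l, sum(1 for x in losses if x > l) / n) for l in sorted(losses)]

simulated = [2.0, 5.0, 1.0, 8.0, 3.0]  # e.g. total loss per scenario, $M
curve = loss_exceedance_curve(simulated)
# curve[0] == (1.0, 0.8): 80% of scenarios exceed the smallest simulated loss
```

In a full risk analysis each scenario would typically be weighted by its annual rate of occurrence rather than equally, turning the vertical axis into an annual exceedance rate.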
APA, Harvard, Vancouver, ISO, and other styles
50

Liao, Tianfei. "Post processing of cone penetration data for assessing seismic ground hazards, with application to the New Madrid seismic zone." Diss., Available online, Georgia Institute of Technology, 2005, 2005. http://etd.gatech.edu/theses/available/etd-05042005-133640/.

Full text
Abstract:
Thesis (Ph. D.)--Civil and Environmental Engineering, Georgia Institute of Technology, 2006.
Mayne, Paul W., Committee Chair ; Goldsman, David, Committee Member ; Lai, James, Committee Member ; Rix, Glenn J., Committee Member ; Santamarina, J. Carlos, Committee Member.
APA, Harvard, Vancouver, ISO, and other styles