Academic literature on the topic 'Historical Simulation Approach'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Historical Simulation Approach.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Historical Simulation Approach"

1

Zhou, Rui, Johnny Siu-Hang Li, and Jeffrey Pai. "Pricing temperature derivatives with a filtered historical simulation approach." European Journal of Finance 25, no. 15 (April 5, 2019): 1462–84. http://dx.doi.org/10.1080/1351847x.2019.1602068.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Ahmad Baharul Ulum, Zatul Karamah Binti. "Market Risk Quantifications: Historical Simulation Approach on the Malaysian Stock Exchange." International Journal of Management Excellence 2, no. 1 (November 11, 2013): 122. http://dx.doi.org/10.17722/ijme.v2i1.25.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

David Cabedo, J., and Ismael Moya. "Estimating oil price ‘Value at Risk’ using the historical simulation approach." Energy Economics 25, no. 3 (May 2003): 239–53. http://dx.doi.org/10.1016/s0140-9883(02)00111-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Semenov, Andrei. "Historical simulation approach to the estimation of stochastic discount factor models." Quantitative Finance 8, no. 4 (June 2008): 391–404. http://dx.doi.org/10.1080/14697680701561365.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Dulaney, Ronald A. "An Historical Simulation Approach to the Present Valuation of Future Earnings." Journal of Forensic Economics 1, no. 1 (August 1, 1987): 37–48. http://dx.doi.org/10.5085/0898-5510-1.1.37.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Ramadan, Helmi, Prana Ugiana Gio, and Elly Rosmaini. "Monte Carlo Simulation Approach to Determine the Optimal Solution of Probabilistic Supply Cost." Journal of Research in Mathematics Trends and Technology 2, no. 1 (February 24, 2020): 1–6. http://dx.doi.org/10.32734/jormtt.v2i1.3752.

Full text
Abstract:
Monte Carlo simulation is a probabilistic simulation in which the solution of a problem is obtained from a random process. The random process draws on a probability distribution fitted to variables collected from historical data. The model used is the probabilistic Economic Order Quantity (EOQ) model, which is then combined with Monte Carlo simulation to obtain the optimal total supply cost in the future. Based on the data processing, the probabilistic EOQ result is $486,128.19. After Monte Carlo simulation in which the demand data follow a normal distribution, the total supply cost obtained for the following 23 months is $46,116.05, whereas with a Weibull distribution for the demand data the total supply cost obtained is $482,301.76. Monte Carlo simulation can therefore estimate the optimal total supply cost in the future from historical demand data.
APA, Harvard, Vancouver, ISO, and other styles
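
The Monte Carlo step summarized in the abstract above can be sketched in a few lines of code. This is a minimal illustration of sampling monthly demand from a fitted distribution and re-evaluating an EOQ-style cost, assuming hypothetical ordering, holding, and unit costs; it is not the authors' dataset or exact cost model.

```python
import numpy as np

# Hypothetical cost parameters (placeholders, not taken from the cited study)
ORDER_COST = 150.0    # cost per order placed
HOLDING_COST = 2.5    # holding cost per unit per month
UNIT_COST = 12.0      # purchase cost per unit

def eoq_total_cost(monthly_demand):
    """Classic EOQ cost for one month: purchase + ordering + holding."""
    d = max(monthly_demand, 0.0)
    if d == 0.0:
        return 0.0
    q = np.sqrt(2.0 * d * ORDER_COST / HOLDING_COST)   # optimal order quantity
    return d * UNIT_COST + (d / q) * ORDER_COST + (q / 2.0) * HOLDING_COST

def simulate_total_cost(months=23, mean_demand=400.0, sd_demand=80.0,
                        n_runs=10_000, seed=1):
    """Monte Carlo estimate of total supply cost with normally distributed demand."""
    rng = np.random.default_rng(seed)
    totals = np.empty(n_runs)
    for r in range(n_runs):
        demands = rng.normal(mean_demand, sd_demand, months)
        totals[r] = sum(eoq_total_cost(d) for d in demands)
    return totals.mean()

if __name__ == "__main__":
    print(f"Expected total supply cost over 23 months: ${simulate_total_cost():,.2f}")
```
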
7

Susanti, Dwi, Sukono Sukono, and Maria Jatu Verrany. "Value-at-Risk Estimation Method Based on Normal Distribution, Logistics Distribution and Historical Simulation." Operations Research: International Conference Series 1, no. 1 (February 5, 2020): 13–18. http://dx.doi.org/10.47194/orics.v1i1.19.

Full text
Abstract:
This paper discusses the risk analysis of single-stock and portfolio returns. The stock data analyzed are BNI shares, BRI shares, and a portfolio of the two. After obtaining the stock returns, value at risk (VaR) is estimated using the normal distribution approach, the logistic distribution approach, and historical simulation. A backtest is then conducted on the VaR results to test the validity of each model. For BNI and the portfolio, the backtest produces a smaller QPS for the historical simulation method than for the normal distribution and logistic distribution approaches, which shows that the BNI and portfolio VaR obtained with historical simulation are more consistent than those from the other methods. For BRI, the backtest produces the smallest QPS for the normal distribution approach compared to the logistic distribution and historical simulation approaches, which shows that the BRI VaR obtained with the normal distribution approach is more consistent than those from the other methods.
APA, Harvard, Vancouver, ISO, and other styles
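
The plain historical simulation method compared in the entry above can be illustrated with a short, self-contained sketch: one-day VaR read off the empirical return distribution, followed by a simple rolling exceedance backtest. The return series, window length, and confidence level below are placeholders rather than the authors' data, and the backtest counts exceedances only (it does not compute the QPS used in the paper).

```python
import numpy as np

def historical_var(returns, alpha=0.99):
    """One-day historical simulation VaR at confidence level alpha.

    Reported as a positive loss: the (1 - alpha) quantile of the
    empirical return distribution, sign-flipped.
    """
    return -np.quantile(returns, 1.0 - alpha)

def backtest_exceedances(returns, window=250, alpha=0.99):
    """Count how often the next day's loss exceeds the rolling historical VaR."""
    exceedances, tests = 0, 0
    for t in range(window, len(returns)):
        var_t = historical_var(returns[t - window:t], alpha)
        if returns[t] < -var_t:      # realized loss larger than predicted VaR
            exceedances += 1
        tests += 1
    return exceedances, tests

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    daily_returns = rng.normal(0.0, 0.01, 1500)   # placeholder return series
    print("99% one-day VaR:", round(historical_var(daily_returns[-250:]), 4))
    x, n = backtest_exceedances(daily_returns)
    print(f"{x} exceedances in {n} backtest days (about {0.01 * n:.0f} expected)")
```
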
8

Gawin, Bartlomiej, and Bartosz Marcinkowski. "How Close to Reality is the „as-is” Business Process Simulation Model?" Organizacija 48, no. 3 (August 1, 2015): 155–75. http://dx.doi.org/10.1515/orga-2015-0013.

Full text
Abstract:
Background and Purpose: A business process simulation (BPS) model is based on real-life data from sources such as databases, observations, and interviews. It acts as an "as-is" business scenario that can be used for reengineering. The main challenge is to gather relevant data and to develop the simulation model. The research aims to elaborate a BPS model and to systematically assess how close to reality it is. Design/Methodology/Approach: The research was performed in a Polish telecommunications company. The authors investigate the technical process of expanding a cellular network. After elaborating the "as-is" model, the authors use the ADONIS simulation tool to run a series of simulations and confront the simulation results with actual historical events. They then assess whether the computer simulation model can precisely map the real-life business process, and consequently act as a credible basis for process improvement. Results: The simulation model was constructed with data from the WfMS database, observations, and staff knowledge and experience. A fully equipped simulation model is found to allow reconstructing the historical execution of business activity with a low margin of error. Some limitations were identified and discussed. Conclusion: BPS is not yet a popular approach for process reengineering and improvement. Data collection issues for BPS, which require adopting process mining techniques and additional information sources, are among the reasons for that. In our study, computer simulation outputs are compatible with historical events. Hence, the model reflects the business reality and can be taken as a reference model while redesigning the process.
APA, Harvard, Vancouver, ISO, and other styles
9

Lu, Baohong, Huanghe Gu, Ziyin Xie, Jiufu Liu, Lejun Ma, and Weixian Lu. "Stochastic simulation for determining the design flood of cascade reservoir systems." Hydrology Research 43, no. 1-2 (February 1, 2012): 54–63. http://dx.doi.org/10.2166/nh.2011.002.

Full text
Abstract:
Stochastic simulation is widely applied for estimating the design flood of various hydrosystems. The design flood at a reservoir site should consider the impact of upstream reservoirs, along with any development of hydropower. This paper investigates and applies a stochastic simulation approach for determining the design flood of a complex cascade of reservoirs in the Longtan watershed, southern China. The magnitude of the design flood when the impact of the upstream reservoirs is considered is less than that without considering them. In particular, the stochastic simulation model takes into account both systematic and historical flood records. As the reliability of the frequency analysis increases with more representative samples, it is desirable to incorporate historical flood records, if available, into the stochastic simulation model. This study shows that the design values from the stochastic simulation method with historical flood records are higher than those without historical flood records. The paper demonstrates the advantages of adopting a stochastic flow simulation approach to address design-flood-related issues for a complex cascade reservoir system.
APA, Harvard, Vancouver, ISO, and other styles
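
As a toy illustration of why the historical flood records mentioned above matter, the sketch below fits a Gumbel distribution to an annual-maximum flow series with and without a few assumed pre-gauging flood peaks and compares the resulting design quantile. The data, the simple pooling of historical values, and the return period are placeholders; the cited study uses a full stochastic simulation of the cascade reservoir system rather than this direct fit.

```python
import numpy as np
from scipy import stats

def design_flood(annual_maxima, return_period=1000.0):
    """Design flood taken as the Gumbel quantile for the given return period."""
    loc, scale = stats.gumbel_r.fit(annual_maxima)
    p = 1.0 - 1.0 / return_period
    return stats.gumbel_r.ppf(p, loc=loc, scale=scale)

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    # 50 years of systematic (gauged) annual maxima, in m^3/s (synthetic)
    systematic = stats.gumbel_r.rvs(loc=5000, scale=1200, size=50, random_state=rng)
    # Two assumed historical flood peaks reconstructed from archives
    historical = np.array([11000.0, 12500.0])

    q_without = design_flood(systematic)
    q_with = design_flood(np.concatenate([systematic, historical]))
    print(f"1000-year design flood, systematic record only: {q_without:,.0f} m^3/s")
    print(f"1000-year design flood, with historical floods: {q_with:,.0f} m^3/s")
```
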
10

Changchien, Chang-Cheng, Chu-Hsiung Lin, and Wei-Shun Kao. "Capturing value-at-risk in futures markets: a revised filtered historical simulation approach." Journal of Risk Model Validation 6, no. 4 (December 2012): 67–93. http://dx.doi.org/10.21314/jrmv.2012.095.

Full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Historical Simulation Approach"

1

Cucurnia, Renato, and Ali Khadar. "Value at Risk : En kvantitativ studie av Historical Simulation Approach och Simple Moving Average Approach." Thesis, Umeå University, Umeå School of Business, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-34300.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Gunnarsson, Fredrik. "Filtered Historical Simulation Value at Risk for Options : A Dimension Reduction Approach to Model the Volatility Surface Shifts." Thesis, Umeå universitet, Institutionen för fysik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-160344.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Antonenko, Zhanna. "Rizika použití VAR modelů při řízení portfolia." Master's thesis, Vysoká škola ekonomická v Praze, 2014. http://www.nusl.cz/ntk/nusl-202134.

Full text
Abstract:
The diploma thesis Risks of using VaR models for portfolio management focuses on estimating portfolio VaR using basic and modified methods. The goal of the thesis is to point out some weaknesses of the basic methods and to demonstrate VaR estimation using improved methods that overcome these problems. The analysis is performed both theoretically and in practice. Only market risk is the subject of the study. Several simulation and parametric methods are introduced.
APA, Harvard, Vancouver, ISO, and other styles
4

Hsueh, Po-Hsiu (薛伯修). "A Study on Estimating Value-at-Risk Model for US Dollars against NT Dollars Exchange Rate by Historical Simulation Approach." Thesis, 2002. http://ndltd.ncl.edu.tw/handle/02118615695157490490.

Full text
Abstract:
Master's thesis
National Kaohsiung First University of Science and Technology
Graduate Institute of Financial Management
Academic year 90 of the Republic of China (2001–2002)
We collected US dollar against New Taiwan dollar interbank exchange rates from the Taiwan foreign exchange market, covering August 25, 1992 to January 2, 2002, to analyse and compare the accuracy of Value at Risk (VaR) estimates under the original Historical Simulation Approach and improved Historical Simulation Approaches (including the Simply Weighted Moving Average Approach, the Exponentially Weighted Moving Average Approach, and the Robust Exponentially Weighted Moving Average Approach). The empirical results are as follows: 1. The improved Historical Simulation Approaches do raise the accuracy of VaR estimation. 2. In general, the Simply Weighted Moving Average Approach is better than the Exponentially Weighted Moving Average Approach and the Robust Exponentially Weighted Moving Average Approach in terms of VaR accuracy.
APA, Harvard, Vancouver, ISO, and other styles
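
The 'improved' approaches named in the abstract weight past observations rather than treating them equally. The sketch below shows one common exponentially weighted historical simulation scheme as an assumed stand-in for that idea; the thesis's exact Simply Weighted and Robust Exponentially Weighted variants are not reproduced, and the return series is a placeholder.

```python
import numpy as np

def exponentially_weighted_hs_var(returns, alpha=0.99, lam=0.98):
    """VaR from a historical sample in which recent returns carry more weight.

    Observation i (i = 0 oldest) gets weight proportional to lam**(n - 1 - i);
    VaR is read off the weighted empirical distribution of returns.
    """
    returns = np.asarray(returns, dtype=float)
    n = len(returns)
    weights = lam ** np.arange(n - 1, -1, -1)
    weights /= weights.sum()

    order = np.argsort(returns)                       # worst losses first
    cum_weights = np.cumsum(weights[order])
    idx = np.searchsorted(cum_weights, 1.0 - alpha)   # first index in the tail
    return -returns[order][min(idx, n - 1)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fx_returns = rng.normal(0.0, 0.004, 500)   # placeholder daily FX returns
    print("Equally weighted HS VaR:      ", round(-np.quantile(fx_returns, 0.01), 5))
    print("Exponentially weighted HS VaR:", round(exponentially_weighted_hs_var(fx_returns), 5))
```
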

Books on the topic "Historical Simulation Approach"

1

Simulating Security Returns: A Filtered Historical Simulation Approach. New York, NY: Palgrave Pivot, 2014.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Primiero, Giuseppe. On the Foundations of Computing. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198835646.001.0001.

Full text
Abstract:
This book is a technical, historical, and conceptual investigation into the three main methodological approaches to the computational sciences: mathematical, engineering, and experimental. Part I explores the background behind the formal understanding of computing, originating at the end of the nineteenth century, and it investigates the formal origins and conceptual development of the notions of computation, algorithm, and program. Part II overviews the construction of physical devices to perform automated tasks and considers associated technical and conceptual issues. It starts with the design and construction of the first generation of computing machines, explores their evolution and progress in engineering (for both hardware and software), and investigates their theoretical and conceptual problems. Part III analyses the methods and principles of experimental sciences founded on computational methods. It studies the use of machines to perform scientific tasks, with particular reference to computer models and simulations. Each part aims at defining a notion of computational validity according to the corresponding methodological approach.
APA, Harvard, Vancouver, ISO, and other styles
3

Mesinger, Fedor, Miodrag Rančić, and R. James Purser. Numerical Methods in Atmospheric Models. Oxford University Press, 2018. http://dx.doi.org/10.1093/acrefore/9780190228620.013.617.

Full text
Abstract:
The astonishing development of computer technology since the mid-20th century has been accompanied by a corresponding proliferation in the numerical methods that have been developed to improve the simulation of atmospheric flows. This article reviews some of the numerical developments that concern the ongoing improvement of weather forecasting and climate simulation models. Early computers were single-processor machines with severely limited memory capacity and computational speed, requiring simplified representations of the atmospheric equations and low resolution. As the hardware evolved and memory and speed increased, it became feasible to accommodate more complete representations of the dynamic and physical atmospheric processes. These more faithful representations of the so-called primitive equations included dynamic modes that are not necessarily of meteorological significance, which in turn led to additional computational challenges. Understanding which problems required attention and how they should be addressed was not a straightforward and unique process, and it resulted in the variety of approaches that are summarized in this article. At about the turn of the century, the most dramatic developments in hardware were the inauguration of the era of massively parallel computers, together with the vast increase in the amount of rapidly accessible memory that the new architectures provided. These advances and opportunities have demanded a thorough reassessment of the numerical methods that are most successfully adapted to this new computational environment. This article combines a survey of the important historical landmarks together with a somewhat speculative review of methods that, at the time of writing, seem to hold out the promise of further advancing the art and science of atmospheric numerical modeling.
APA, Harvard, Vancouver, ISO, and other styles
4

Voyatzaki, Maria, ed. Architectural Materialisms. Edinburgh University Press, 2018. http://dx.doi.org/10.3366/edinburgh/9781474420570.001.0001.

Full text
Abstract:
This book gathers 14 voices from a diverse group of architects, designers, performing artists, film makers, media theorists, philosophers, mathematicians and programmers. By transversally crossing disciplinary boundaries, new and profound insights into contemporary thinking and creating architecture emerge. The book is at the forefront of the current contemplation on matter and its significance for and within architecture. The premise is that matter in posthuman times has to be rethought in the rich and multifaceted context of contemporary computational architecture, and in the systemic and ecological context of pervasive computer simulations. Combining the dynamism of materiality and the capacities of nonhuman machines towards prototyping spatiotemporal designs and constructs, leads to alternative conceptions of the human, of ethics, aesthetics and politics in this world yet-to-come. The reader, through the various approaches presented by the authors’ perspectives, will appreciate that creativity can come from allowing matter to take the lead in the feedback loop of the creative process towards a relevant outcome evaluated as such by a matter of concern actualised within the ecological milieu of design. The focus is on the authors’ speculative dimension in their multifaceted role of discussing materiality by recognising that a transdisciplinary mode is first and foremost a speculative praxis in our effort to trace materiality and its affects in creativity. The book is not interested in discussing technicalities and unidirectional approaches to materiality, and retreats from a historical linear timeline of enquiry whilst establishing a sectional mapping of materiality’s importance in the emergent future of architecture.
APA, Harvard, Vancouver, ISO, and other styles
5

Busuioc, Aristita, and Alexandru Dumitrescu. Empirical-Statistical Downscaling: Nonlinear Statistical Downscaling. Oxford University Press, 2018. http://dx.doi.org/10.1093/acrefore/9780190228620.013.770.

Full text
Abstract:
This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Climate Science. Please check back later for the full article. The concept of statistical downscaling or empirical-statistical downscaling became a distinct and important scientific approach in climate science in recent decades, when the climate change issue and assessment of climate change impact on various social and natural systems have become international challenges. Global climate models are the best tools for estimating future climate conditions. Even if improvements can be made in state-of-the-art global climate models, in terms of spatial resolution and their performance in simulation of climate characteristics, they are still skillful only in reproducing large-scale features of climate variability, such as global mean temperature or various circulation patterns (e.g., the North Atlantic Oscillation). However, these models are not able to provide reliable information on local climate characteristics (mean temperature, total precipitation), especially on extreme weather and climate events. The main reason for this failure is the influence of local geographical features on the local climate, as well as other factors related to surrounding large-scale conditions, the influence of which cannot be correctly taken into consideration by the current dynamical global models. Impact models, such as hydrological and crop models, need high-resolution information on various climate parameters on the scale of a river basin or a farm, scales that are not available from the usual global climate models. Downscaling techniques produce regional climate information on a finer scale, from global climate change scenarios, based on the assumption that there is a systematic link between the large-scale and local climate. Two types of downscaling approaches are known: a) dynamical downscaling is based on regional climate models nested in a global climate model; and b) statistical downscaling is based on developing statistical relationships between large-scale atmospheric variables (predictors), available from global climate models, and observed local-scale variables of interest (predictands). Various types of empirical-statistical downscaling approaches can be placed approximately in linear and nonlinear groupings. The empirical-statistical downscaling techniques focus more on details related to the nonlinear models—their validation, strengths, and weaknesses—in comparison to linear models or the mixed models combining the linear and nonlinear approaches. Stochastic models can be applied to daily and sub-daily precipitation in Romania, with a comparison to dynamical downscaling. Conditional stochastic models are generally specific for daily or sub-daily precipitation as predictand. A complex validation of the nonlinear statistical downscaling models, selection of the large-scale predictors, model ability to reproduce historical trends, extreme events, and the uncertainty related to future downscaled changes are important issues. A better estimation of the uncertainty related to downscaled climate change projections can be achieved by using ensembles of several global climate models as drivers, including their ability to simulate the input in downscaling models.
Comparison between future statistical downscaled climate signals and those derived from dynamical downscaling driven by the same global model, including a complex validation of the regional climate models, gives a measure of the reliability of downscaled regional climate changes.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Historical Simulation Approach"

1

Aloud, Monira, Edward Tsang, and Richard Olsen. "Modeling the FX Market Traders’ Behavior." In Simulation in Computational Finance and Economics, 303–38. IGI Global, 2013. http://dx.doi.org/10.4018/978-1-4666-2011-7.ch015.

Full text
Abstract:
In this chapter, the authors use an Agent-Based Modeling (ABM) approach to model trading behavior in the Foreign Exchange (FX) market. They establish statistical properties (stylized facts) of the traders’ trading behavior in the FX market using a high-frequency dataset of anonymised OANDA individual traders’ historical transactions on an account level spanning 2.25 years. Using the identified stylized facts of real FX market traders’ behavior, the authors evaluate how well the collective behavior of the trading agents resembles the collective behavior of the FX market traders. The study identifies the conditions under which the stylized facts of trading agents’ collective behaviors resemble those for the real FX market traders’ collective behavior. The authors perform an exploration of the market’s features in order to identify the conditions under which the stylized facts emerge.
APA, Harvard, Vancouver, ISO, and other styles
2

Mohammed, Chekour. "Teaching Electricity Between Pedagogy and Technology." In Personalization and Collaboration in Adaptive E-Learning, 304–14. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-1492-4.ch015.

Full text
Abstract:
Moroccan high school students find major difficulties in understanding the concepts related to electricity. The problem lies not only in the knowledge itself but also in the teaching practices. Even worse, in the Moroccan educational system, the lack of experimental activities and the low rate of integration of educational technologies hinder effective teaching of these concepts. Indeed, the lack of these experimental activities is the main cause of the introduction of erroneous conceptions. This lack can be remedied through simulation, which offers a solution for phenomena invisible to the human eye. In this chapter, the authors review the literature on the main pedagogical approaches used to facilitate the acquisition of electricity concepts, especially the historical investigation approach based on collaborative learning. They also highlight the added value of combining investigation with simulation of electricity phenomena.
APA, Harvard, Vancouver, ISO, and other styles
3

Papadopoulos, Homer, and Antonis Korakis. "Predicting Medical Resources Required to be Dispatched After Earthquake and Flood, Using Historical Data and Machine Learning Techniques." In Improving the Safety and Efficiency of Emergency Services, 38–66. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-2535-7.ch003.

Full text
Abstract:
This article presents a method to predict the medical resources required to be dispatched after large-scale disasters to satisfy the demand. Historical data from past incidents (earthquakes, floods) on the number of victims who requested emergency medical services and hospitalisation, simulation tools, web services, and machine learning techniques have been combined. The authors adopted a twofold approach: a) use of web services and simulation tools to predict the potential number of victims and b) use of historical data and self-trained algorithms to “learn” from these data and provide the corresponding predictions. Comparing the actual and predicted numbers of victims needing hospitalisation showed that the proposed models can predict the medical resources required to be dispatched with acceptable errors. The results support the use of electronic platforms able to coordinate an emergency medical response, since these platforms can collect the big, heterogeneous datasets necessary to optimise the performance of the suggested algorithms.
APA, Harvard, Vancouver, ISO, and other styles
4

Rossmann, Juergen, Martin Hoppen, and Arno Buecken. "GML-Based nD Data Management With a Big Geo Data Semantic World Modeling Approach." In Contemporary Strategies and Approaches in 3-D Information Modeling, 191–223. IGI Global, 2018. http://dx.doi.org/10.4018/978-1-5225-5625-1.ch008.

Full text
Abstract:
3D simulation applications benefit from realistic and exact forest models. They range from training simulators like flight or harvester simulators to economic and ecological simulations for tree growth or succession. The nD forest simulation and information system integrates the necessary methods for data extraction, modeling, and management of highly realistic models. Using semantic world modeling, tree data can efficiently be extracted from remote sensing data – even for very large areas. Data is modeled using a GML-based modeling language and a flexible data management approach is integrated to provide caching, persistence, a central communication hub, and a versioning mechanism. Combining various simulation techniques and data versioning, the nD forest simulation and information system can provide applications with historic 3D data in multiple time dimensions (hence nD) as well as with predicted data based on simulations.
APA, Harvard, Vancouver, ISO, and other styles
5

Tossell, John A., and David J. Vaughan. "Theoretical Methods." In Theoretical Geochemistry. Oxford University Press, 1992. http://dx.doi.org/10.1093/oso/9780195044034.003.0005.

Full text
Abstract:
In this chapter, the most important quantum-mechanical methods that can be applied to geological materials are described briefly. The approach used follows that of modern quantum-chemistry textbooks rather than being a historical account of the development of quantum theory and the derivation of the Schrödinger equation from the classical wave equation. The latter approach may serve as a better introduction to the field for those readers with a more limited theoretical background and has recently been well presented in a chapter by McMillan and Hess (1988), which such readers are advised to study initially. Computational aspects of quantum chemistry are also well treated by Hinchliffe (1988). In the section that follows this introduction, the fundamentals of the quantum mechanics of molecules are presented first; that is, the “localized” side of Fig. 1.1 is examined, basing the discussion on that of Levine (1983), a standard quantum-chemistry text. Details of the calculation of molecular wave functions using the standard Hartree-Fock methods are then discussed, drawing upon Schaefer (1972), Szabo and Ostlund (1989), and Hehre et al. (1986), particularly in the discussion of the agreement between calculated versus experimental properties as a function of the size of the expansion basis set. Improvements on the Hartree-Fock wave function using configuration-interaction (CI) or many-body perturbation theory (MBPT), evaluation of properties from Hartree-Fock wave functions, and approximate Hartree-Fock methods are then discussed. The focus then shifts to the “delocalized” side of Fig. 1.1, first discussing Hartree-Fock band-structure studies, that is, calculations in which the full translational symmetry of a solid is exploited rather than the point-group symmetry of a molecule. A good general reference for such studies is Ashcroft and Mermin (1976). Density-functional theory is then discussed, based on a review by von Barth (1986), and including both the multiple-scattering self-consistent-field Xα method (MS-SCF-Xα) and more accurate basis-function-density-functional approaches. We then describe the success of these methods in calculations on molecules and molecular clusters. Advances in density-functional band theory are then considered, with a presentation based on Srivastava and Weaire (1987). A discussion of the purely theoretical modified electron-gas ionic models is followed by discussion of empirical simulation, and we conclude by mentioning a recent approach incorporating density-functional theory and molecular dynamics (Car and Parrinello, 1985).
APA, Harvard, Vancouver, ISO, and other styles
6

David, Roman, and Ian Holliday. "Transitional Justice." In Liberalism and Democracy in Myanmar, 145–78. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198809609.003.0005.

Full text
Abstract:
Myanmar’s half-century of authoritarianism from 1962 to 2011 left a bitter legacy of gross human rights abuse and other historical injustice. One issue widely held by researchers to be a contributing factor to establishing a human rights culture and promoting democratization is dealing with the past. In this context, the chapter explores the demand for transitional justice in Myanmar, drawing on interviews with former political prisoners, surveys, survey experiments, and secondary sources. It reviews factors commonly associated with demand for transitional justice, examines the historical and political determinants of transitional justice in Myanmar, probes the authors’ surveys to investigate popular demand for transitional justice, uses interviews with former political prisoners to assess victims’ needs and their conception of justice, and connects a victim-centred approach with popular demand by examining support for transitional justice in light of experimental evidence simulating real-life resolution of historical injustice.
APA, Harvard, Vancouver, ISO, and other styles
7

Albertini, Marcelo Keese, and Rodrigo Fernandes de Mello. "A Self-Organizing Neural Network to Approach Novelty Detection." In Machine Learning, 262–82. IGI Global, 2012. http://dx.doi.org/10.4018/978-1-60960-818-7.ch211.

Full text
Abstract:
Machine learning is a field of artificial intelligence which aims at developing techniques to automatically transfer human knowledge into analytical models. Recently, those techniques have been applied to time series with unknown dynamics and fluctuations in the established behavior patterns, such as human-computer interaction, inspection robotics and climate change. In order to detect novelties in those time series, techniques are required to learn and update knowledge structures, adapting themselves to data tendencies. The learning and updating process should integrate and accommodate novelty events into the normal behavior model, possibly incurring the revaluation of long-term memories. This sort of application has been addressed by the proposal of incremental techniques based on unsupervised neural networks and regression techniques. Such proposals have introduced two new concepts in time-series novelty detection. The first defines the temporal novelty, which indicates the occurrence of unexpected series of events. The second measures how novel a single event is, based on the historical knowledge. However, current studies do not fully consider both concepts of detecting and quantifying temporal novelties. This motivated the proposal of the self-organizing novelty detection neural network architecture (SONDE) which incrementally learns patterns in order to represent unknown dynamics and fluctuation of established behavior. The knowledge accumulated by SONDE is employed to estimate Markov chains which model causal relationships. This architecture is applied to detect and measure temporal and nontemporal novelties. The evaluation of the proposed technique is carried out through simulations and experiments, which have presented promising results.
APA, Harvard, Vancouver, ISO, and other styles
8

Vondrackova, Petra, and David Šmahel. "Internet Addiction in Context." In Advanced Methodologies and Technologies in Artificial Intelligence, Computer Simulation, and Human-Computer Interaction, 551–62. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-7368-5.ch041.

Full text
Abstract:
Internet addiction can be defined as the overuse of the internet such that it leads to the impairment of an individual's psychological state (both mental and emotional), as well as their scholastic, occupational, and social interactions. In this chapter, the authors first present a short historical overview of the internet addiction phenomenon and its place in the context of mental health. They then introduce the contributions of major researchers who focused on defining its core components, designing measurement scales and diagnostic criteria. Furthermore, they focus on the main areas of research in this field: the major surveys regarding prevalence rates and the correlates of internet addiction. In the last section, they introduce basic approaches to the treatment of internet addiction.
APA, Harvard, Vancouver, ISO, and other styles
9

Zambujal-Oliveira, João, and César Serradas. "Valuation of Technology-Based Companies." In Advances in Business Information Systems and Analytics, 108–22. IGI Global, 2014. http://dx.doi.org/10.4018/978-1-4666-4983-5.ch007.

Full text
Abstract:
The cash flows of technology-based companies show high degrees of uncertainty. As traditional valuation methods can hardly capture these characteristics, they are insufficient for valuing these kinds of companies. On the contrary, real options theory can quantify the value associated with management flexibility, growth opportunities, and synergies. This chapter assesses the corporate value of a technology-based company. By gathering information from historical cash flows and using Monte Carlo simulations, the chapter generates future return paths and primarily uses them for valuation by discounted cash flow methods. The resulting volatility is subsequently used in the valuation carried out with real options theory. The value obtained under the real options binomial approach is about 40% higher than the one obtained by the discounted cash flow method. This difference can be attributed to the value associated with uncertainty and flexibility.
APA, Harvard, Vancouver, ISO, and other styles
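
The valuation route summarized above (historical cash flows, simulated paths, a volatility estimate, then a binomial real-option value) can be illustrated with a simplified Cox-Ross-Rubinstein lattice for an option to defer an investment. The cash flow figures, project value, investment cost, and discount rate below are assumptions for illustration only, not the company data of the chapter.

```python
import numpy as np

def volatility_from_cash_flows(cash_flows):
    """Annualised volatility of log-changes in positive historical cash flows."""
    cf = np.asarray(cash_flows, dtype=float)
    return np.diff(np.log(cf)).std(ddof=1)

def deferral_option_value(project_value, investment, sigma, rate, years, steps=100):
    """CRR binomial value of an American-style option to invest within `years`."""
    dt = years / steps
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    disc = np.exp(-rate * dt)
    p = (np.exp(rate * dt) - d) / (u - d)        # risk-neutral up probability

    j = np.arange(steps + 1)
    values = project_value * u**j * d**(steps - j)
    payoff = np.maximum(values - investment, 0.0)
    for step in range(steps, 0, -1):
        payoff = disc * (p * payoff[1:] + (1.0 - p) * payoff[:-1])
        j = np.arange(step)
        values = project_value * u**j * d**(step - 1 - j)
        payoff = np.maximum(payoff, values - investment)   # allow early exercise
    return payoff[0]

if __name__ == "__main__":
    historical_cf = [210.0, 250.0, 240.0, 310.0, 330.0, 420.0]  # assumed cash flows
    sigma = volatility_from_cash_flows(historical_cf)
    static_npv = 1000.0 - 900.0                                  # plain DCF view
    option = deferral_option_value(1000.0, 900.0, sigma, rate=0.05, years=3.0)
    print(f"Estimated volatility: {sigma:.1%}")
    print(f"Static NPV: {static_npv:.1f}   Value with deferral flexibility: {option:.1f}")
```
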

Conference papers on the topic "Historical Simulation Approach"

1

Elrafie, Emad, Mohammed Agil, Tariq Abbas, Boy Idroos, and François-Michel Colomar. "Innovative Simulation History Matching Approach Enabling Better Historical Performance Match and Embracing Uncertainty in Predictive Forecasting." In SPE/EAGE Reservoir Characterization & Simulation Conference. European Association of Geoscientists & Engineers, 2009. http://dx.doi.org/10.3997/2214-4609-pdb.170.spe120958.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Elrafie, Emad Ahmed, Mohammed A. Agil, Raja Tariq Abbas, Boy Esvano Idroos, and Francois-Michel Colomar. "Innovated Simulation History Matching Approach Enabling Better Historical Performance Match and Embracing Uncertainty in Predictive Forecasting." In EUROPEC/EAGE Conference and Exhibition. Society of Petroleum Engineers, 2009. http://dx.doi.org/10.2118/120958-ms.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Wei, Zhigang, Fulun Yang, Dmitri Konson, and Kamran Nikbin. "A Design Approach Based on Historical Test Data and Bayesian Statistics." In ASME 2013 Pressure Vessels and Piping Conference. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/pvp2013-97627.

Full text
Abstract:
Testing is still the final verification for a design, even though a substantial number of analytical and simulation methods are available, and it is expected to remain indispensable for the foreseeable future. Numerous test data have been generated in many testing institutions over the years, and it is clear that new tests will be conducted in the future. Historical data obtained under similar design and operating conditions can shed light on current and future designs, since they share common features when the changes are not dramatic. To effectively utilize the historical data for future design, two steps are necessary: (1) finding an approach to consistently correlate test data obtained under various conditions; and (2) using Bayesian statistics, which provide a rational mathematical tool for extracting useful information from the historical data. In this paper, the basic Bayesian statistical procedure based on the historical data is outlined. With this information, reducing the sample size, or improving accuracy and confidence at the same sample size, becomes possible. Examples of utilizing the historical data are also presented, and the benefits of using Bayesian statistics are highlighted.
APA, Harvard, Vancouver, ISO, and other styles
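
As a minimal illustration of the idea in the paper above, the sketch below treats historical pass/fail test results as a conjugate Beta prior that a small new test campaign updates. The counts and the Beta-Binomial choice are assumptions for illustration, not the paper's actual procedure.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BetaBelief:
    """Beta(a, b) belief about a pass probability, updated with test counts."""
    a: float
    b: float

    def update(self, passes: int, failures: int) -> "BetaBelief":
        return BetaBelief(self.a + passes, self.b + failures)

    @property
    def mean(self) -> float:
        return self.a / (self.a + self.b)

if __name__ == "__main__":
    # Historical tests on a similar design: 45 passes, 5 failures
    prior = BetaBelief(1.0, 1.0).update(passes=45, failures=5)
    # New, smaller test campaign on the current design: 8 passes, 1 failure
    posterior = prior.update(passes=8, failures=1)
    print(f"Prior mean pass probability:     {prior.mean:.3f}")
    print(f"Posterior mean pass probability: {posterior.mean:.3f}")
```
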
4

Tayeb, Yousef, Faisal Naseef, and Amell Ghamdi. "Automatic Simulation Data Update: A New Innovative Way to Expedite Historical Data Extension for Models." In International Petroleum Technology Conference. IPTC, 2021. http://dx.doi.org/10.2523/iptc-21187-ms.

Full text
Abstract:
Historical data in reservoir simulation models become outdated with time, and the gap between the latest history matched model and the current situation increases. Time is essential, especially with escalating numbers of simulation models. Therefore, updating models frequently requires a new approach to maximize human resources and deliver timely answers. Engineers must also ensure that the existing quality-checked input is secured and that data integrity is maintained during updates. In the conventional method of updating simulation model data, engineers go through several steps to acquire new data from corporate databases and fill the extended missing periods. This procedure involves interacting with many applications, which is very time-consuming. To expedite this process, a new methodology is applied that automates all of the individual interactions into a one-step procedure. A smart in-house tool connected to the corporate database extracts all necessary data and generates consistent inputs, including rates, pressures and well completions as data is developed. The tool then incorporates the new data into the existing simulation model's data input. The smart automation tool shows the positive impact of the one-step methodology in the updating process. As an example, the new tool was tested on a mega-field, where a vast number of wells were perfectly updated and extended to the current simulation date. The updating time was drastically reduced using the introduced process and engineers were able to keep their existing files. In addition, the smart automation tool was able to connect to the corresponding geological model seamlessly to generate the specified perforation data after incorporating all trajectory and completion data automatically. This paper illustrates that the automation of the updating process is essential to increasing workflow efficiency. The opportunity for fast and frequent simulation model updates leads to a better understanding of reservoir behavior and improves forecasting.
APA, Harvard, Vancouver, ISO, and other styles
5

Moridis, Nefeli, John Lee, Duc Lam, Christie Schultz, and Wade Wardlow. "Hydraulic Fracture Characterization Using Rapid Simulation." In SPE Hydraulic Fracturing Technology Conference and Exhibition. SPE, 2021. http://dx.doi.org/10.2118/204150-ms.

Full text
Abstract:
The purpose of this paper is to present a technique to estimate hydraulic fracture (HF) length, fracture conductivity, and fracture efficiency using simple and rapid but rigorous reservoir simulation matching of historical production and, where available, pressure. The methodology is particularly appropriate for analysis of horizontal wells with multiple fractures in tight unconventional or unconventional resource plays. In our discussion, we also analyze the differences between the results from the decline curve analysis (DCA) approach and the Science Based Forecasting (SBF) results that this work proposes. When we characterize fracture properties with SBF, we can do a better job of forecasting than if we randomly combine fracture properties and reservoir permeability together in a decline-curve trend. The forecasts are significantly different with SBF; therefore, fracture characterization plays an important role, and SBF uses this characterization to produce different (and better) forecasts.
APA, Harvard, Vancouver, ISO, and other styles
6

Caddemi, Salvatore, Ivo Caliò, Francesco Cannizzaro, Domenico D'Urso, Bartolomeo Pantò, Davide Rapicavoli, and Giuseppe Occhipinti. "3D Discrete Macro-Modelling Approach for Masonry Arch Bridges." In IABSE Symposium, Guimarães 2019: Towards a Resilient Built Environment Risk and Asset Management. Zurich, Switzerland: International Association for Bridge and Structural Engineering (IABSE), 2019. http://dx.doi.org/10.2749/guimaraes.2019.1825.

Full text
Abstract:
Masonry multi-span arch bridges are historical structures still playing a key role in many transportation networks of numerous countries. Most of these bridges are several decades old and have been subjected to continuous dynamic loadings, due to the vehicular traffic, and in many cases their maintenance required structural modifications. The currently adopted health monitoring strategies are based on in situ inspections as well as structural assessments based on numerical models characterised by different levels of reliability according to the required purpose. Simplified approaches are generally adopted for fast structural evaluation; on the other hand, more rigorous approaches are fundamental for a reliable structural assessment of these particular structures, often characterized by very complex geometrical layouts and structural alterations not always sufficiently documented. This paper presents an original Discrete Macro-Element Method (DMEM) that allows a reliable simulation of the linear and nonlinear response of masonry structures and masonry bridges with a lower computational burden compared to classical nonlinear FEM analyses, while maintaining good accuracy. The method is applied to a real masonry bridge and the results are compared with those obtained from a more sophisticated three-dimensional nonlinear FEM model in both linear and nonlinear contexts.
APA, Harvard, Vancouver, ISO, and other styles
7

Ghosh, Supriyo, Jing Yu Koh, and Patrick Jaillet. "Improving Customer Satisfaction in Bike Sharing Systems through Dynamic Repositioning." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/813.

Full text
Abstract:
In bike sharing systems (BSSs), the uncoordinated movements of customers using bikes lead to empty or congested stations, which causes a significant loss in customer demand. In order to reduce the lost demand, a wide variety of existing research has employed a fixed set of historical demand patterns to design efficient bike repositioning solutions. However, the progress remains slow in understanding the underlying uncertainties in demand and designing proactive robust bike repositioning solutions. To bridge this gap, we propose a dynamic bike repositioning approach based on a probabilistic satisficing method which uses the uncertain demand parameters that are learnt from historical data. We develop a novel and computationally efficient mixed integer linear program for maximizing the probability of satisfying the uncertain demand so as to improve the overall customer satisfaction and efficiency of the system. Extensive experimental results from a simulation model built on a real-world bike sharing data set demonstrate that our approach is not only robust to uncertainties in customer demand, but also outperforms the existing state-of-the-art repositioning approaches in terms of reducing the expected lost demand.
APA, Harvard, Vancouver, ISO, and other styles
8

Leong, Darrell, Ying Min Low, and Youngkook Kim. "An Efficient System Reliability Approach Against Mooring Overload Failures." In ASME 2019 38th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/omae2019-95048.

Full text
Abstract:
As exploration for hydrocarbon resources ventures into deeper waters, offshore floating structures are increasingly required to be stationed at sites of highly uncertain environmental loading conditions. Driven by high historical mooring failure rates with severe consequences, the need for effective long-term structural reliability methods arises for mooring lines. However, system nonlinearities, high problem dimensionality, and the diversity of conceivable failure causalities with their extremely low probabilities complicate the analysis. Variations on the Monte Carlo approach are robust in addressing these challenges, but at the expense of high computational costs. In this study, distributions of environmental parameters and their correlations are modelled into a joint probabilistic description. By classifying conceivable sea states across the domain, an efficient uniform sampling scheme is presented as a means of assessing long-term reliability against extreme events. The proposed method was applied to a floating production unit case study situated in the hurricane-prone Gulf of Mexico, exposed to irregular wave loads. The analysis was found to provide probability estimates with negligible bias when validated against subset simulation, with significant variance reduction of mean estimators by eliminating the need to over-simulate non-critical environmental conditions. The resulting sampling density has an added advantage of being non-failure specific, enabling system reliability assessments across multiple modes and locations of failure without the need for re-analysis.
APA, Harvard, Vancouver, ISO, and other styles
9

Syafwan, M. "Fit-For-Purpose History Matching Approach for A Low-Quality Reservoir Under Waterflood: Integration of Uncertain Production Allocation." In Digital Technical Conference. Indonesian Petroleum Association, 2020. http://dx.doi.org/10.29118/ipa20-e-311.

Full text
Abstract:
This paper presents a fit-for-purpose approach to mitigate zonal production data allocation uncertainty during history matching of a reservoir simulation model due to limited production logging data. To avoid propagating perforation/production zone allocation uncertainty at commingled wells into the history matched reservoir model, only well-level production data from historical periods when production was from a single zone were used to calibrate reservoir properties that determine initial volumetric. Then, during periods of the history with commingled production, average reservoir pressure measurements were integrated into the model to allocate fluid production to the target reservoir. Last, the periods constrained by dedicated well-level fluid production and average reservoir pressure were merged over the forty-eight-year history to construct a single history matched reservoir model in preparation for waterflood performance forecasting. This innovative history matching approach, which mitigates the impacts of production allocation uncertainty by using different intervals of the historical data to calibrate model saturations and model pressures, has provided a new interpretation of OOIP and current recovery factor, as well as drive mechanisms including aquifer strength and capillary pressure. Fluid allocation from the target reservoir in the history matched model is 85% lower than previously estimated. The history matched model was used as a quantitative forecasting and optimization tool to expand the recent waterflood with improved production forecast reliability. The remaining mobile oil saturation map and streamline-based waterflood diagnostics have improved understanding of injector-producer connectivity and swept pore volumes, e.g., current swept volumes are minor and well-centric with limited indication of breakthrough at adjacent producers resulting in high remaining mobile oil saturation. Accordingly, the history matched model provides a foundation to select new injection points, determine dedicated producer locations and support optimized injection strategies to improve recovery.
APA, Harvard, Vancouver, ISO, and other styles
10

Muhabie, Yohannes T., Jean-David Caprace, Cristian Petcu, and Philippe Rigo. "Improving the Installation of Offshore Wind Farms by the use of Discrete Event Simulation." In SNAME 5th World Maritime Technology Conference. SNAME, 2015. http://dx.doi.org/10.5957/wmtc-2015-183.

Full text
Abstract:
Offshore wind energy development is highly affected by weather conditions at sea. Hence, it demands well-organized planning of the overall process, starting from the producers’ sites and ending at the offshore site where the turbines will finally be installed. The planning phase can be supported with the help of Discrete Event Simulation (DES), where weather restrictions, the distance matrix, vessel characteristics and assembly scenarios are taken into account. The purpose of this paper is to simulate the overall transport, assembly and installation of the wind turbine components at sea. The analysis is carried out through DES considering both real historical weather data (wind speed and wave height) and a probabilistic approach. Results of the study, applied to a real Offshore Wind Farm (OWF) configuration, show good agreement between the two proposed models. The results point out that the probabilistic approach is highly affected by the semi-random numbers used to model the stochastic behavior of the input variables, so that several iterations (200 to 400) are required to reach convergence of the simulation outputs. We note that the seasonality of the outputs of both models is preserved, i.e. the variation of the results follows the variation of the weather over the year. These findings provide a new framework to address risks and uncertainties in OWF installations.
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Historical Simulation Approach"

1

Melby, Jeffrey, Thomas Massey, Fatima Diop, Himangshu Das, Norberto Nadal-Caraballo, Victor Gonzalez, Mary Bryant, et al. Coastal Texas Protection and Restoration Feasibility Study : Coastal Texas flood risk assessment : hydrodynamic response and beach morphology. Engineer Research and Development Center (U.S.), July 2021. http://dx.doi.org/10.21079/11681/41051.

Full text
Abstract:
The US Army Corps of Engineers, Galveston District, is executing the Coastal Texas Protection and Restoration Feasibility Study coastal storm risk management (CSRM) project for the region. The project is currently in the feasibility phase. The primary goal is to develop CSRM measures that maximize national net economic development benefits. This report documents the coastal storm water level and wave hazard, including sea level rise, for a variety of flood risk management alternatives. Four beach restoration alternatives for Galveston Island and Bolivar peninsula were evaluated. Suites of synthetic tropical and historical non-tropical storms were developed and modeled. The CSTORM coupled surge-and-wave modeling system was used to accurately characterize storm circulation, water level, and wave hazards using new model meshes developed from high-resolution land and sub-aqueous surveys for with- and without-project scenarios. Beach morphology stochastic response was modeled with a Monte Carlo life-cycle simulation approach using the CSHORE morphological evolution numerical model embedded in the StormSim stochastic modeling system. Morphological and hydrodynamic response were primarily characterized with probability distributions of the number of rehabilitations and overflow.
APA, Harvard, Vancouver, ISO, and other styles
2

Hamill, Daniel D., Jeremy J. Giovando, Chandler S. Engel, Travis A. Dahl, and Michael D. Bartles. Application of a Radiation-Derived Temperature Index Model to the Willow Creek Watershed in Idaho, USA. U.S. Army Engineer Research and Development Center, August 2021. http://dx.doi.org/10.21079/11681/41360.

Full text
Abstract:
The ability to simulate snow accumulation and melting processes is fundamental to developing real-time hydrological models in watersheds with a snowmelt-dominated flow regime. A primary source of uncertainty with this model development approach is the subjectivity related to which historical periods to use and how to combine parameters from multiple calibration events. The Hydrologic Engineering Center's Hydrologic Modeling System (HEC-HMS) has recently implemented a hybrid temperature index (TI) snow module that has not been extensively tested. This study evaluates a radiation-derived temperature index (RTI) model's performance relative to the traditional air temperature index (TI) model. The TI model for Willow Creek performed reasonably well in both the calibration and validation years. The results of the RTI calibration and validation simulations raised additional questions about how best to parameterize this snow model. An RTI parameter sensitivity analysis indicates that the choice of calibration years will have a substantial impact on the parameters and thus the streamflow results. Based on the analysis completed in this study, further refinement and verification of the RTI model calculations are required before an objective comparison with the TI model can be completed.
APA, Harvard, Vancouver, ISO, and other styles
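
For readers unfamiliar with the temperature index (TI) formulation that the report compares against its radiation-derived variant, a bare-bones degree-day snow routine is sketched below. The melt factor, base temperature, rain/snow threshold, and forcing series are placeholders; the hybrid module in HEC-HMS discussed in the report is considerably richer.

```python
def degree_day_snow(precip_mm, temp_c, melt_factor=3.0, base_temp=0.0,
                    rain_snow_temp=1.0):
    """Track snow water equivalent (SWE) and melt with a degree-day rule.

    Daily melt = melt_factor * max(T - base_temp, 0), capped by available SWE.
    Returns daily SWE and melt series in millimetres.
    """
    swe = 0.0
    swe_series, melt_series = [], []
    for p, t in zip(precip_mm, temp_c):
        if t <= rain_snow_temp:        # precipitation falls as snow
            swe += p
        melt = min(swe, melt_factor * max(t - base_temp, 0.0))
        swe -= melt
        swe_series.append(swe)
        melt_series.append(melt)
    return swe_series, melt_series

if __name__ == "__main__":
    precip = [10, 0, 5, 0, 0, 8, 0, 0]        # mm/day, placeholder forcing
    temps = [-4, -2, -1, 1, 3, 0, 5, 7]       # deg C, placeholder forcing
    swe, melt = degree_day_snow(precip, temps)
    print("SWE: ", [round(s, 1) for s in swe])
    print("Melt:", [round(m, 1) for m in melt])
```
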
3

Petrie, John, Yan Qi, Mark Cornwell, Md Al Adib Sarker, Pranesh Biswas, Sen Du, and Xianming Shi. Design of Living Barriers to Reduce the Impacts of Snowdrifts on Illinois Freeways. Illinois Center for Transportation, November 2020. http://dx.doi.org/10.36501/0197-9191/20-019.

Full text
Abstract:
Blowing snow accounts for a large part of the Illinois Department of Transportation’s total winter maintenance expenditures. This project aims to develop recommendations on the design and placement of living snow fences (LSFs) to minimize snowdrift on Illinois highways. The research team examined historical IDOT data for resource expenditures, conducted a literature review and survey of northern agencies, developed and validated a numerical model, field tested selected LSFs, and used a model to assist LSF design. Field testing revealed that the proper snow fence setback distance should consider the local prevailing winter weather conditions, and snow fences within the right-of-way could still be beneficial to agencies. A series of numerical simulations of flow around porous fences was performed using Flow-3D, a computational fluid dynamics (CFD) software package. The results of the simulations of the validated model were employed to develop design guidelines for siting LSFs on flat terrain and for those with mild slopes (< 15° from horizontal). Guidance is provided for determining fence setback, wind characteristics, fence orientation, as well as fence height and porosity. Fences comprised of multiple rows are also addressed. For sites with embankments with steeper slopes, guidelines are provided that include a fence at the base and one or more fences on the embankment. The design procedure can use the available right-of-way at a site to determine the appropriate fence characteristics (e.g., height and porosity) to prevent snow deposition on the road. The procedure developed in this work provides an alternative that uses available setback to design the fence. This approach does not consider snow transport over an entire season and may be less effective in years with several large snowfall events, very large single events, or a sequence of small events with little snowmelt in between. However, this procedure is expected to be effective for more frequent snowfall events such as those that occurred over the field-monitoring period. Recommendations were made to facilitate the implementation of research results by IDOT. The recommendations include a proposed process flow for establishing LSFs for Illinois highways, LSF siting and design guidelines (along with a list of suitable plant species for LSFs), as well as other implementation considerations and identified research needs.
APA, Harvard, Vancouver, ISO, and other styles