
Journal articles on the topic 'Historical Simulation Approach'

Consult the top 50 journal articles for your research on the topic 'Historical Simulation Approach.'


1

Zhou, Rui, Johnny Siu-Hang Li, and Jeffrey Pai. "Pricing temperature derivatives with a filtered historical simulation approach." European Journal of Finance 25, no. 15 (April 5, 2019): 1462–84. http://dx.doi.org/10.1080/1351847x.2019.1602068.

2

Ahmad Baharul Ulum, Zatul Karamah Binti. "Market Risk Quantifications: Historical Simulation Approach on the Malaysian Stock Exchange." International Journal of Management Excellence 2, no. 1 (November 11, 2013): 122. http://dx.doi.org/10.17722/ijme.v2i1.25.

3

David Cabedo, J., and Ismael Moya. "Estimating oil price ‘Value at Risk’ using the historical simulation approach." Energy Economics 25, no. 3 (May 2003): 239–53. http://dx.doi.org/10.1016/s0140-9883(02)00111-1.

4

Semenov, Andrei. "Historical simulation approach to the estimation of stochastic discount factor models." Quantitative Finance 8, no. 4 (June 2008): 391–404. http://dx.doi.org/10.1080/14697680701561365.

5

Dulaney, Ronald A. "An Historical Simulation Approach to the Present Valuation of Future Earnings." Journal of Forensic Economics 1, no. 1 (August 1, 1987): 37–48. http://dx.doi.org/10.5085/0898-5510-1.1.37.

6

Ramadan, Helmi, Prana Ugiana Gio, and Elly Rosmaini. "Monte Carlo Simulation Approach to Determine the Optimal Solution of Probabilistic Supply Cost." Journal of Research in Mathematics Trends and Technology 2, no. 1 (February 24, 2020): 1–6. http://dx.doi.org/10.32734/jormtt.v2i1.3752.

Abstract:
Monte Carlo simulation is a probabilistic simulation in which the solution of a problem is obtained through a random process. The random process involves a probability distribution fitted to variables collected from historical data. The model used is the probabilistic Economic Order Quantity (EOQ) model, which is then combined with Monte Carlo simulation to obtain the optimal total supply cost in the future. Based on the data processing, the probabilistic EOQ result is $486128,19. After Monte Carlo simulation in which the demand data follow a normal distribution, the total supply cost obtained is $46116,05 for the following 23 months, whereas with a Weibull distribution for the demand data the total supply cost obtained is $482301,76. Thus, Monte Carlo simulation can calculate the optimal total supply in the future based on historical demand data.
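The procedure this abstract describes (draw demand from a distribution fitted to historical data, then cost the draws) can be sketched generically. All parameters below are hypothetical placeholders, not the paper's data; a minimal sketch in Python with NumPy:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical parameters, standing in for values fitted to historical demand.
mean_demand, sd_demand = 500.0, 60.0   # units per month
unit_cost = 4.0                        # supply cost per unit
months, trials = 23, 10_000

# Each trial draws 23 months of normally distributed demand (truncated at 0)
# and accumulates the resulting supply cost.
demand = rng.normal(mean_demand, sd_demand, size=(trials, months)).clip(min=0)
total_cost = (demand * unit_cost).sum(axis=1)

print(f"expected 23-month supply cost: {total_cost.mean():,.0f}")
```

Swapping the normal draw for `rng.weibull(...)` scaled appropriately reproduces the abstract's second scenario.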
7

Susanti, Dwi, Sukono Sukono, and Maria Jatu Verrany. "Value-at-Risk Estimation Method Based on Normal Distribution, Logistics Distribution and Historical Simulation." Operations Research: International Conference Series 1, no. 1 (February 5, 2020): 13–18. http://dx.doi.org/10.47194/orics.v1i1.19.

Abstract:
This paper discusses the risk analysis of single-stock and portfolio returns. The stock data analyzed are BNI shares, BRI shares, and a portfolio of the two. After obtaining the stock returns, value at risk (VaR) is estimated using the normal distribution approach, the logistic distribution approach, and historical simulation. A backtest is then conducted on the VaR results to test the validity of each model. For BNI and the portfolio, the backtest produces a smaller QPS for the historical simulation method than for the normal and logistic distribution approaches, showing that the BNI and portfolio VaR obtained with historical simulation are more consistent than those from the other methods. For BRI, the backtest produces the smallest QPS for the normal distribution approach, showing that the BRI VaR obtained with the normal distribution approach is more consistent than the other methods.
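The historical simulation method recurring throughout these entries is, at bottom, an empirical quantile of past returns. A minimal sketch in Python (synthetic returns as a stand-in; this is not the BNI/BRI data used in the paper):

```python
import numpy as np

def historical_var(returns, confidence=0.95):
    """One-day VaR by historical simulation: the empirical
    (1 - confidence) quantile of returns, reported as a positive loss."""
    return -np.quantile(returns, 1.0 - confidence)

# Synthetic daily returns standing in for real stock data.
rng = np.random.default_rng(0)
returns = rng.normal(0.0005, 0.02, size=1000)

var_95 = historical_var(returns)
print(f"95% one-day VaR: {var_95:.4f}")
```

By construction, the loss threshold is exceeded on roughly 5% of the historical days, which is exactly the property a QPS-style backtest checks.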
8

Gawin, Bartlomiej, and Bartosz Marcinkowski. "How Close to Reality is the „as-is” Business Process Simulation Model?" Organizacija 48, no. 3 (August 1, 2015): 155–75. http://dx.doi.org/10.1515/orga-2015-0013.

Abstract:
Background and Purpose: A business process simulation (BPS) model is based on real-life data from sources such as databases, observations, and interviews. It acts as an "as-is" business scenario that can be used for reengineering. The main challenge is to gather relevant data and to develop the simulation model. The research aims to elaborate a BPS model and to systematically assess how close to reality it is. Design/Methodology/Approach: The research was performed in a Polish telecommunications company. The authors investigate the technical process of expanding a cellular network. After elaborating the "as-is" model, the authors use the ADONIS simulation tool to run a series of simulations and confront the simulation results with actual historical events. An assessment is then made of whether the computer simulation model can precisely map the real-life business process, and consequently act as a credible basis for process improvement. Results: The simulation model was constructed with data from the WfMS database, observations, and staff knowledge and experience. The fully equipped simulation model is found to reconstruct the historical execution of business activity with a low margin of error. Some limitations were identified and discussed. Conclusion: BPS is not yet a popular approach for process reengineering and improvement. Data collection issues for BPS, which require adopting process mining techniques and additional information sources, are among the reasons for that. In our study, the computer simulation outputs are compatible with historical events. Hence the model reflects the business reality and can be taken as a reference model while redesigning the process.
9

Lu, Baohong, Huanghe Gu, Ziyin Xie, Jiufu Liu, Lejun Ma, and Weixian Lu. "Stochastic simulation for determining the design flood of cascade reservoir systems." Hydrology Research 43, no. 1-2 (February 1, 2012): 54–63. http://dx.doi.org/10.2166/nh.2011.002.

Abstract:
Stochastic simulation is widely applied for estimating the design flood of various hydrosystems. The design flood at a reservoir site should consider the impact of upstream reservoirs, along with any development of hydropower. This paper investigates and applies a stochastic simulation approach for determining the design flood of a complex cascade of reservoirs in the Longtan watershed, southern China. The magnitude of the design flood when the impact of the upstream reservoirs is considered is less than that without considering them. In particular, the stochastic simulation model takes into account both systematic and historical flood records. As the reliability of the frequency analysis increases with more representative samples, it is desirable to incorporate historical flood records, if available, into the stochastic simulation model. This study shows that the design values from the stochastic simulation method with historical flood records are higher than those without historical flood records. The paper demonstrates the advantages of adopting a stochastic flow simulation approach to address design-flood-related issues for a complex cascade reservoir system.
10

Changchien, Chang-Cheng, Chu-Hsiung Lin, and Wei-Shun Kao. "Capturing value-at-risk in futures markets: a revised filtered historical simulation approach." Journal of Risk Model Validation 6, no. 4 (December 2012): 67–93. http://dx.doi.org/10.21314/jrmv.2012.095.

11

Catumba, Jorge, Rafael Rentería, Johan Manuel Redondo, Leonar Aguiar, and José Octaviano Barrera. "A first approach to a hybrid algorithm for mobile emergency resources allocation." Memorias, no. 1 (November 2, 2018): 73–79. http://dx.doi.org/10.22490/25904779.3069.

Abstract:
We present a hybrid algorithm based on genetic algorithms and discrete event simulation that computes the algorithmically optimal location of emergency resources. Parameters for the algorithm were obtained from historical statistics of the Bogotá Emergency Medical Services. The considerations taken into account are: (1) no more than a single resource is sent to an incident, (2) resources are selected according to incident priorities, (3) the distance from the resource base to the incident location is also considered for resource assignment, and (4) all resources must be used equally. For every simulation a different set of random incidents is generated, so it is possible to use the algorithm with an updated set of historical incidents. We found that the genetic algorithm converges, so we can consider its solution optimal. With the algorithmically optimal solution we found that arrival times are shorter than the historical ones. It is also possible to compute the number of resources required to reduce the arrival times even further. Since every discrete event simulation takes a considerable amount of time, the whole algorithm is expensive for long simulation periods and for many individuals per generation in the genetic algorithm, so an optimization approach is the next step in our research. Less restrictive considerations must also be taken into account in future developments on this topic.
12

Walters, Carl J., Steven J. D. Martell, and Josh Korman. "A stochastic approach to stock reduction analysis." Canadian Journal of Fisheries and Aquatic Sciences 63, no. 1 (January 1, 2006): 212–23. http://dx.doi.org/10.1139/f05-213.

Abstract:
Stock reduction analysis (SRA) can complement more detailed assessment methods by using long-term historical catches to estimate recruitment rates needed to have produced those catches, yet still end up with stock sizes near those estimated by the detailed methods. A longer historical perspective can also add information to the estimation of reference points such as unfished biomass (B0) or target biomass (BMSY). Deterministic SRA models provide a single stock size trajectory that is vanishingly unlikely to have actually occurred, while stochastic SRA attempts to provide probability distributions for stock size over time under alternative hypotheses about unfished recruitment rates and about variability around assumed stock–recruitment relationships. These distributions can be generated with age-structured population models by doing large numbers of Monte Carlo simulation trials and retaining those sample trials for which the stock would not have been driven to extinction by historical catches. By resampling from these trials using likelihood weights (sampling – importance resampling method), it is possible to move into fully Bayesian, state–space assessment modeling through a series of straightforward steps and to provide understandable visualization of how much the data help to reduce uncertainty about historical fishing impacts and stock status.
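The sampling–importance resampling step this abstract mentions can be illustrated generically: draw candidate parameter values from a prior, weight each by its likelihood, and resample in proportion to the weights. The prior, likelihood, and numbers below are toy stand-ins, not the authors' fisheries model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Prior draws for a single scalar parameter (toy uniform prior on [0, 10]).
candidates = rng.uniform(0.0, 10.0, size=50_000)

# Toy Gaussian log-likelihood: the "data" favour values near 6
# (a stand-in for trials consistent with the historical catches).
log_lik = -0.5 * ((candidates - 6.0) / 0.8) ** 2
weights = np.exp(log_lik - log_lik.max())
weights /= weights.sum()

# Importance resampling: an approximate posterior sample drawn
# in proportion to the likelihood weights.
posterior = rng.choice(candidates, size=5_000, replace=True, p=weights)
print(f"posterior mean ~ {posterior.mean():.2f}")
```

In the SRA setting, each candidate would be a full parameter vector, and trials driven to extinction by the historical catches would simply receive zero weight.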
13

Elwakil, Emad, and Tarek Zayed. "Construction knowledge discovery system using fuzzy approach." Canadian Journal of Civil Engineering 42, no. 1 (January 2015): 22–32. http://dx.doi.org/10.1139/cjce-2014-0153.

Abstract:
Most research on simulating construction operations has predominantly focused on modeling and has neglected the data preparation that is paramount for simulation. To prepare data for the simulation process, a knowledge discovery system (KDS) is indispensable for extracting hidden knowledge from construction data sets. This knowledge is typically hard to obtain by traditional means such as statistical analysis. The presented research develops, using a fuzzy approach, a KDS to prepare, utilize, analyze, and extract the hidden patterns from construction data in order to predict work task durations. The KDS depends mainly on finding the relation between the quantitative and qualitative variables that affect the duration of construction operations and work tasks, as well as preparing data for simulation modeling. It consists of two stages: data processing and data mining. Data processing consists of cleaning, integrating, transforming, and selecting the appropriate knowledge. Data mining consists of selecting the factors that affect a construction operation, generating their fuzzy sets, defining fuzzy rules and models, developing a fuzzy knowledge base, and testing the effectiveness of this knowledge base in predicting work task durations. The developed KDS has been tested using a construction case study in which the results were found satisfactory, with an average validity of 92%. The developed system assists researchers and practitioners in utilizing historical construction data to extract knowledge that could not be obtained by traditional techniques and in precisely predicting work task durations.
14

Rosso, R., and M. C. Rulli. "An integrated simulation method for flash-flood risk assessment: 2. Effects of changes in land-use under a historical perspective." Hydrology and Earth System Sciences 6, no. 2 (April 30, 2002): 285–94. http://dx.doi.org/10.5194/hess-6-285-2002.

Abstract:
The influence of land use changes on flood occurrence and severity in the Bisagno River (Thyrrenian Liguria, N.W. Italy) is investigated using a Monte Carlo simulation approach (Rulli and Rosso, 2002). High-resolution land-use maps for the area were reconstructed and scenario simulations were made for a pre-industrial (1878), an intermediate (1930) and a current (1980) year. Land-use effects were explored to assess the consequences of distributed changes in land use due to agricultural practice and urbanisation. Hydraulic conveyance effects were considered to assess the consequences of channel modifications associated with engineering works in the lower Bisagno River network. Flood frequency analyses of the annual flood series retrieved from the simulations were used to examine the effect of land-use change and river conveyance on the flood regime. The impact of these effects proved to be negligible in the upper Bisagno River, moderate in the downstream river and severe in the small tributaries in the lower Bisagno valley that drain densely populated urban areas. The simulation approach is shown to be capable of incorporating historical data on landscape and river patterns into quantitative methods for risk assessment. Keywords: flood, simulation, distributed model, land-use changes, channel modifications, historical data
15

Blankenship, Kori, Leonardo Frid, and James L. Smith. "A state-and-transition simulation modeling approach for estimating the historical range of variability." AIMS Environmental Science 2, no. 2 (2015): 253–68. http://dx.doi.org/10.3934/environsci.2015.2.253.

16

Keane, Robert E., Russell A. Parsons, and Paul F. Hessburg. "Estimating historical range and variation of landscape patch dynamics: limitations of the simulation approach." Ecological Modelling 151, no. 1 (May 2002): 29–49. http://dx.doi.org/10.1016/s0304-3800(01)00470-7.

17

Fan, Ying, and Jian Ling Jiao. "An improved historical simulation approach for estimating 'value at risk' of crude oil price." International Journal of Global Energy Issues 25, no. 1/2 (2006): 83. http://dx.doi.org/10.1504/ijgei.2006.008385.

18

Zheng, Wanhong, Minqi Li, Michael Nickasch, Feng Yang, and Aida Rabiee Gohar. "A new approach to over-bedding in American healthcare: Statistical analysis and computer simulation of patient admission in a West Virginia state psychiatric hospital." Journal of Hospital Administration 5, no. 1 (October 20, 2015): 18. http://dx.doi.org/10.5430/jha.v5n1p18.

Abstract:
Objective: To identify the factors that have a substantial impact on a West Virginia state psychiatric hospital's bed occupancy by investigating historical admission data, and to develop a computer simulation system that gives insight into the modifiable variables that reduce admission numbers, thereby providing solutions to the over-bedding problem. Methods: A quantitative review of hospital admission data from January 2007 to November 2013 allowed for the construction of a simulation model to estimate inpatient flow. The system's performance was evaluated after alteration of selected parameters and variables. Results: The study revealed significant regional differences in admission numbers. Civil commitments and psychiatric hospitalizations do not directly correlate with county coverage populations; some counties sent disproportionately more patients. Patients' length of stay also varied among geographical areas, and re-admission was not uncommon. Using the percentage of diversion as the outcome measurement, the computer simulation model reconstructed the admission scenario multiple times, predicting that the diversion rate can be significantly reduced if certain variables (hospital capacity, patient arrivals from the top referring counties, and patient length of stay) are changed. Conclusions: Involuntary admissions were unevenly distributed according to geography and population in the studied American state psychiatric hospital. Using historical data, computer simulations can model hospital admission systems to evaluate performance and predict needs for change.
19

Wan, Yong Tao, Zhi Gang Zhang, and Lu Tao Zhao. "An Improved Historical Simulation Method to Estimate the Amount of Refined Oil Retail Value at Risk VaR." Advanced Materials Research 734-737 (August 2013): 1711–18. http://dx.doi.org/10.4028/www.scientific.net/amr.734-737.1711.

Abstract:
The international crude oil market is complicated in itself, and with the rapid development of China in recent years, the dramatic changes in the international crude oil market have brought risk to the security of China's oil market and to China's economic development. Value at risk (VaR), an effective measurement of financial risk, can be used to assess the risk of refined oil retail sales as well. However, VaR as a model applicable to complicated nonlinear data has not yet been widely researched. Therefore, an improved historical simulation approach, HSGA-SVMF, which uses a genetic algorithm for parameter selection in a support vector machine, is proposed in this paper; it builds on HSAF, the historical simulation approach with ARMA forecasts. By comparing the HSAF and HSGA-SVMF approaches, this paper gives evidence that HSGA-SVMF has more effective forecasting power for the amount of refined oil.
20

Ghanbartehrani, Saeed, Anahita Sanandaji, Zahra Mokhtari, and Kimia Tajik. "A Novel Ramp Metering Approach Based on Machine Learning and Historical Data." Machine Learning and Knowledge Extraction 2, no. 4 (September 23, 2020): 379–96. http://dx.doi.org/10.3390/make2040021.

Abstract:
The random nature of traffic conditions on freeways can cause excessive congestion and irregularities in the traffic flow. Ramp metering is a proven effective method to maintain freeway efficiency under various traffic conditions. Creating a reliable and practical ramp metering algorithm that considers both critical traffic measures and historical data is still a challenging problem. In this study we use simple machine learning approaches to develop a novel real-time ramp metering algorithm. The proposed algorithm is computationally simple and has minimal data requirements, which makes it practical for real-world applications. We conduct a simulation study to evaluate and compare the proposed approach with an existing traffic-responsive ramp metering algorithm.
21

Gigliarelli, Elena, Filippo Calcerano, and Luciano Cessari. "Heritage Bim, Numerical Simulation and Decision Support Systems: an Integrated Approach for Historical Buildings Retrofit." Energy Procedia 133 (October 2017): 135–44. http://dx.doi.org/10.1016/j.egypro.2017.09.379.

22

Bøe, Espen Johnsen. "Remediation of Historical Photographs in Mobile Augmented Reality." Journal of Media Innovations 7, no. 1 (May 11, 2021): 29–40. http://dx.doi.org/10.5617/jomi.8793.

Abstract:
Being present at a location of historical significance often demands imagination to understand the full scope of the area. One approach to sparking the imagination is to present a mediated simulation of a historic location in situ. As an application example, we used the sitsim AR platform to develop a simulation that conveys the history of fishermen in the historic fishing village of Storvågan in Lofoten, Norway. The study presents a rendition of the sitsim AR platform's functionality for engaging presentations of historical photographs. This functionality is enhanced from solely representing buildings in a historical photograph to also representing animated human characters. In Storvågan, a museum (Lofotmuseet) occupies historically significant buildings amid the historic surroundings. This museum exhibits a historical photograph of fishermen that also shows how the area once looked. This photograph is remediated into a 3D animation, presented as a real-time generated simulation at the location where the photograph was originally taken. The study documents a design experiment including the modelling and animation of a 3D representation depicting the photograph. The functionality is evaluated based on user feedback from a case study of a beta version on location in Lofoten. Users reported that the animated fishermen contribute to an engaging experience and a feeling of being "part of the history." The majority of users perceived the 3D representation as credible. An analysis of the modelled characters concludes that the 3D models lack perceptual validity; hence, the case study's positive results were somewhat unexpected. Three theories are presented as conceivable explanations for the unexpected result. Ultimately, the study provides a method for the modelling and animation of people from a historical photograph, and showcases how the animation of human characters in a sitsim may be applied to convey cultural heritage in an engaging way.
23

Zhang, Zhiwen, Chenghao Shi, Pengming Zhu, Zhiwen Zeng, and Hui Zhang. "Autonomous Exploration of Mobile Robots via Deep Reinforcement Learning Based on Spatiotemporal Information on Graph." Applied Sciences 11, no. 18 (September 7, 2021): 8299. http://dx.doi.org/10.3390/app11188299.

Abstract:
In this paper, we address the problem of autonomous exploration in unknown environments for ground mobile robots with deep reinforcement learning (DRL). To effectively explore unknown environments, we construct an exploration graph considering historical trajectories, frontier waypoints, landmarks, and obstacles. Meanwhile, to take full advantage of the spatiotemporal features and historical information in the autonomous exploration task, we propose a novel network called Spatiotemporal Neural Network on Graph (Graph-STNN). Specifically, the proposed Graph-STNN extracts the spatial feature using a graph convolutional network (GCN) and the temporal feature using a temporal convolutional network (TCN). Then, a gated recurrent unit (GRU) synthesizes the spatial feature, the temporal feature, and the historical state information into the current state feature. Combined with DRL, our Graph-STNN helps estimate the optimal target point through the extracted hybrid features. The simulation experiment shows that our approach is more effective than the GCN-based approach and the information-entropy-based approach. Moreover, Graph-STNN also shows better generalization ability than the GCN-based, information-entropy-based, and random methods. Finally, we validate our approach on the simulation platform Stage with an actual robot model.
24

Kusumaningtyas, Alfi Reny, and Abdul Aziz. "METODE HISTORIS UNTUK PERHITUNGAN VALUE AT RISK PADA MODEL GENERALIZED AUTOREGRESSIVE CONDITIONAL HETEROSCEDACITY IN MEAN." Jurnal Matematika "MANTIK" 2, no. 1 (October 30, 2016): 1. http://dx.doi.org/10.15642/mantik.2016.2.1.1-6.

Abstract:
Investment is a commitment of placing funds in one or more investment objects with the expectation of gains in the future. The main motive of investment is to seek a certain amount of gain or profit, but alongside this good side there is a side that can cause harm: risk. This calls for a measurement of risk, for which the value at risk (VaR) method is very popular and widely used by the financial industry worldwide. The three main methods for calculating VaR are the historical method, the parametric method, and the Monte Carlo method. Here, VaR is calculated for a GARCH-M model with the historical simulation method on the closing stock prices of Bank Mandiri Tbk for 2005-2010. This research aims to show the calculation of VaR for the GARCH-M model through the historical method and the implementation of the GARCH-M model in the computation of VaR via simulation on the closing stock prices of Bank Mandiri Tbk. The historical method is a VaR calculation determined by past (historical) values, or returns, generated by simulation (repetition) of the data used. The steps taken explain the historical simulation method for VaR in GARCH-M models estimated with a normal distribution, and then apply GARCH-M to the loss incurred by investors after investing, with the help of the Minitab, E-views, and Matlab software.
25

Karunakaran, Sreedhar. "Innovative application of LSS in aircraft maintenance environment." International Journal of Lean Six Sigma 7, no. 1 (March 7, 2016): 85–108. http://dx.doi.org/10.1108/ijlss-01-2015-0001.

Abstract:
Purpose – The purpose of this paper is to eliminate the wastes and inefficient procedures in the maintenance organization of aircraft so as to reduce downtime and increase mission availability. Design/methodology/approach – Customized lean Six Sigma (LSS) was applied at the task level and at the servicing cycle level to reduce the task content, cycle length and resources in servicing. The loading of the servicing facility was simulated through a simulation program developed from a statistical analysis of historical data for validating, simulating and determining the optimum loading of the servicing facility with refined tasks, reduced cycle length and reduced resources. In the simulation, the optimum combination of manpower, resources and infrastructure at the facility level was determined through sensitivity analysis and design of experiments (DoE). Findings – Optimization at the task level and its re-organization at the servicing cycle level reduced the cycle length by 55-68 per cent and manpower resources by 26 per cent. This further reduced facility-level manpower by 25-40 per cent, capacity requirements by more than 33 per cent and annual aircraft downtime by 78 per cent. The approach reduced the average number of aircraft undergoing servicing at each airbase at any time from 2.35 to just 0.76 and increased mission availability by 20 per cent. Originality/value – The hallmark of the paper is the design of an LSS approach from structured historical data and its validation through innovative simulation. The multi-pronged bottom-up approach practically bundles all wastes resident in the maintenance organization. The paper provides a cursory approach for lean practitioners to the elimination of wastes in the maintenance of capital assets like aircraft.
26

Mishra, Nandkumar, and Santosh B. Rane. "Prediction and improvement of iron casting quality through analytics and Six Sigma approach." International Journal of Lean Six Sigma 10, no. 1 (March 4, 2019): 189–210. http://dx.doi.org/10.1108/ijlss-11-2017-0122.

Abstract:
Purpose – The purpose of this technical paper is to explore the application of analytics and Six Sigma in the manufacturing processes of iron foundries. This study aims to establish a causal relationship between chemical composition and the quality of the iron casting to achieve the global benchmark quality level. Design/methodology/approach – A case study-based exploratory research design is used in this study. The problem discovery is done through a literature survey and Delphi method-based expert opinions. The prediction model is built and deployed in 11 cases to validate the research hypothesis, and the analytics helps in achieving statistically significant business goals. The design includes the Six Sigma DMAIC (Define – Measure – Analyze – Improve – Control) approach, benchmarking, historical data analysis, a literature survey and experiments for the data collection. The data analysis is done through stratification and process capability analysis, and logistic regression-based analytics helps in prediction model building and simulations. Findings – The application of the prediction model helped in quick root cause analysis and a reduction of rejections by over 99 per cent, saving over INR6.6m per year. It also enhanced the reliability of the production line and supply chain, with on-time delivery of 99.78 per cent, up from 80 per cent earlier. The analytics with the Six Sigma DMAIC approach can quickly and easily be applied in the manufacturing domain as well. Research limitations/implications – The limitation of the present analytics model is that it provides point estimates. The model can be further enhanced by incorporating range estimates through Monte Carlo simulation. Practical implications – The increasing use of prediction models in the near future is likely to enhance the predictability and efficiency of various manufacturing processes with sensors and the Internet of Things. Originality/value – Previous researchers have used design of experiments, artificial neural networks and technical simulations to optimise either chemical composition, mould properties or melt shop parameters. This work, however, is based on comprehensive historical data-based analytics. It considers multiple human and temporal factors, sand and mould properties and melt shop parameters along with their relative weights, which is unique. The prediction model is useful to practitioners for parameter simulation and quality enhancements. Researchers can use similar analytics models with the structured Six Sigma DMAIC approach in other manufacturing processes for simulation and optimisation.
27

Chapman, Melissa A., and Miguel Gomez. "Primary sources, simulations and peanut butter and jelly." Social Studies Research and Practice 15, no. 3 (October 29, 2020): 311–19. http://dx.doi.org/10.1108/ssrp-09-2020-0043.

Full text
Abstract:
Purpose: This paper seeks to provide instructional methods for using simulations to teach primary and secondary sources within a social studies classroom. Classroom simulations provide students with authentic opportunities to engage in meaningful learning experiences that are both hands-on in nature and promote the use of critical thinking.
Design/methodology/approach: This paper describes an approach to teaching students about primary and secondary sources through a classroom simulation. Step-by-step instruction is provided in an included table so that readers can recreate the lesson in their own classrooms.
Findings: This paper offers insights about how simulations can be used to provide students an authentic experience with primary and secondary sources. These experiences include opportunities to think critically about the benefits and limitations that both primary and secondary sources offer students while engaging in historical inquiry.
Practical implications: This paper is designed for teachers to utilize and replicate in their own social studies classrooms.
Originality/value: This paper recognizes the important role that primary and secondary sources have in the social studies classroom. Through an original approach, using simulations, the authors present a unique perspective on how to teach about primary and secondary sources in a manner that supports historical inquiry.
APA, Harvard, Vancouver, ISO, and other styles
28

Petrovic, Jasna, and Jovan Despotovic. "Historical rainfall for urban storm drainage design." Water Science and Technology 37, no. 11 (June 1, 1998): 105–11. http://dx.doi.org/10.2166/wst.1998.0446.

Full text
Abstract:
The traditional design method for urban drainage systems is based on design storms; its major drawback is that frequencies of peak flows in the system are considered equal to the frequencies of the design storms. An alternative is to use historical storms with rainfall-runoff models to produce a series of possible flows in the system and their frequencies. The latter approach involves more computation and can be laborious for larger catchments. This paper considers ways to reduce the set of historical storms involved in the design procedure while still leading to realistic flow frequencies. Frequencies obtained by rainfall-runoff simulation at an experimental catchment are compared with frequencies of observed peak flows in the system.
APA, Harvard, Vancouver, ISO, and other styles
29

Nonaka, Etsuko, Thomas A. Spies, Michael C. Wimberly, and Janet L. Ohmann. "Historical range of variability in live and dead wood biomass: a regional-scale simulation study." Canadian Journal of Forest Research 37, no. 11 (November 2007): 2349–64. http://dx.doi.org/10.1139/x07-064.

Full text
Abstract:
The historical range of variability (HRV) in landscape structure and composition created by natural disturbance can serve as a general guide for evaluating ecological conditions of managed landscapes. HRV approaches to evaluating landscapes have been based on age-classes or developmental stages, which may obscure variation in live and dead stand structure. Developing the HRV of stand structural characteristics would improve the ecological resolution of this coarse-filter approach to ecosystem assessment. We investigated HRV in live and dead wood biomass in the regional landscape of the Oregon Coast Range by integrating stand-level biomass models and a spatially explicit fire simulation model. We simulated historical landscapes of the region for 1000 years under pre-Euro-American settlement fire regimes and calculated biomass as a function of disturbance history. The simulation showed that live and dead wood biomass historically varied widely in time and space. The majority of the forests historically contained 500–700 Mg·ha–1 (50–70 kg·m–2) of live wood and 50–200 Mg·ha–1 (5–20 kg·m–2) of dead wood. The current distributions are concentrated in much smaller amounts for both biomass types. Although restoring the HRV of forest structure is not necessarily a management goal for most landowners and managing agencies, departure from the reference condition can provide a relative measure with which to evaluate habitat conditions for managers seeking to use forest structure as a means to maintain or restore ecosystem and species diversity.
APA, Harvard, Vancouver, ISO, and other styles
30

Bagnato, Marco. "Design of an algorithm for an adaptive Value at Risk measurement through the implementation of robust methods in relation to asset cross-correlation." Risk Management Magazine 16, no. 1 (April 30, 2021): 43–57. http://dx.doi.org/10.47473/2020rmm0083.

Full text
Abstract:
This study proposes an algorithmic approach for selecting among different Value at Risk (VaR) estimation methods. The proposed metaheuristic, denominated the “Commitment Machine” (CM), has a strong focus on asset cross-correlation and measures VaR adaptively, dynamically evaluating which method performs best through the minimization of a loss function. The CM algorithm compares five different VaR estimation techniques: the traditional historical simulation method, the filtered historical simulation (FHS) method, the Monte Carlo method with correlated assets, the Monte Carlo method with correlated assets which uses a GARCH model to simulate asset volatility, and a Bayesian vector autoregressive model. The heterogeneity of the compared methodologies and the proposed dynamic selection criteria give confidence in the quality of the estimated risk measure. The CM approach is able to account, across the different models, for the correlations between portfolio assets and the non-stationarity of the analysed time series. The paper describes the techniques adopted by the CM and the logic behind model selection, and it provides a market application of the proposed metaheuristic by simulating an equally weighted multi-asset portfolio.
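The idea of choosing among competing VaR estimators by minimizing a loss function can be sketched as follows. This is a generic illustration, not the paper's Commitment Machine: the function names are ours, and we use the quantile (pinball) loss as the selection criterion, which is one natural choice for comparing quantile forecasts.

```python
import numpy as np

def pinball_loss(realised, var_forecast, alpha):
    """Quantile (pinball) loss for VaR forecasts; lower is better.
    VaR is reported as a positive loss number, so the forecast
    quantile of the return distribution is -var_forecast."""
    q = -np.asarray(var_forecast, dtype=float)
    u = np.asarray(realised, dtype=float) - q
    return np.mean(np.where(u >= 0, alpha * u, (alpha - 1) * u))

def select_var_method(returns, methods, alpha=0.01, window=250):
    """Pick the method whose one-day VaR forecasts achieved the lowest
    pinball loss over the evaluation sample. `methods` maps names to
    functions: past_returns -> VaR (positive number)."""
    r = np.asarray(returns, dtype=float)
    # rolling one-day-ahead forecasts for each candidate method
    forecasts = {
        name: np.array([f(r[:t]) for t in range(window, len(r))])
        for name, f in methods.items()
    }
    realised = r[window:]
    losses = {name: pinball_loss(realised, v, alpha)
              for name, v in forecasts.items()}
    return min(losses, key=losses.get), losses
```

In use, `methods` could contain a plain historical simulation, an FHS variant, and parametric alternatives; re-running the selection on a moving evaluation window makes the choice adaptive over time, in the spirit of the dynamic selection the abstract describes.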
APA, Harvard, Vancouver, ISO, and other styles
31

Naraghi, Tahereh, Mehdi F. Najib, Ali S. Nobari, and Kamran Nikbin. "Fitness-for-Service Assessment Approach for Ageing Pipeline Section Based on Sparse Historical Data." Journal of Multiscale Modelling 12, no. 01 (March 2021): 2150001. http://dx.doi.org/10.1142/s1756973721500013.

Full text
Abstract:
Fitness-for-service (FFS) assessment is a common evaluation methodology in the oil and gas industries to assess the integrity of in-service structures that may contain flaws, metal thinning and pitting damage. However, given the level of unknowns and missing information in the industry, deterministic predictions are unacceptable, and even lower-bound values can be substantially conservative. The aim of this work is to develop a generic process to ensure, within a level of confidence, the operational safety and integrity of ageing gas or oil pipeline sections based on available data. An FFS procedure according to “API 579-1/ASME FFS-1” is performed using local metal loss and micro-cracking to predict a range of safe life for an ageing pipeline that has been operated for around 40 years. The mean-value predictions of the present assessment indicate that the flaws away from the weld are within an acceptable boundary, which implies the pipes are fit to continue in operation and have at least 10 years of remaining life, whilst the flaws near the weld need to be repaired as soon as possible. This is based on the best average values for the NDE and material property results available. However, adopting extreme caution in the analysis would render the pipes obsolete and ready for replacement. Understanding the risks to be taken in such situations becomes an expert decision based not just on the FFS analysis but on quantitative historical data, loading history, material degradation due to environment, corrosion rates and metallurgical analysis, in addition to qualitative experience collected from other databases and pipe failure data. Beyond such a procedure, the safe option would be full burst-pressure testing of the length of pipeline in question to identify possible leaks in old pipes.
APA, Harvard, Vancouver, ISO, and other styles
32

Terzić, Ivica, and Marko Milojević. "EVALUATING MEASURES OF MARKET RISK IN CIRCUMSTANCES OF GLOBAL FINANCIAL CRISIS – EMPIRICAL EVIDENCE FROM FIVE COUNTRIES." CBU International Conference Proceedings 1 (June 30, 2013): 75–81. http://dx.doi.org/10.12955/cbup.v1.17.

Full text
Abstract:
The purpose of this paper is to evaluate the performance of value-at-risk (VaR) estimates produced by two risk models: historical simulation and RiskMetrics. We perform three backtests: unconditional coverage, independence and conditional coverage. We present results for both 1% VaR and 5% VaR on a one-day horizon for the following indices: S&P 500, DAX, SAX, PX and Belex 15. Our results show that the historical simulation approach with a 500-day rolling window satisfies unconditional coverage for all tested indices, while RiskMetrics has many rejection cases. On the other hand, the RiskMetrics model satisfies the independence backtest for three indices, while historical simulation is rejected more often. Based on our strict criterion of accepting a VaR model only if both the unconditional coverage and independence properties are satisfied, the results indicate that during the crisis period all tested VaR models underestimate the true level of market risk exposure.
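The rolling-window historical simulation VaR that such backtests evaluate can be sketched as follows. This is a minimal illustration with our own function names, not the paper's code; the violation rate computed at the end is the raw ingredient of Kupiec's unconditional coverage test.

```python
import numpy as np

def historical_var(returns, alpha=0.01, window=500):
    """One-day-ahead historical simulation VaR forecasts.

    For each day t >= window, VaR is the empirical alpha-quantile of the
    previous `window` daily returns, reported as a positive loss number.
    """
    r = np.asarray(returns, dtype=float)
    return np.array([
        -np.quantile(r[t - window:t], alpha)
        for t in range(window, len(r))
    ])  # aligned with returns[window:]

def violation_rate(returns, var, window=500):
    """Fraction of out-of-sample days on which the realised loss
    exceeded the VaR forecast; should be close to alpha if the
    model has correct unconditional coverage."""
    realised = np.asarray(returns, dtype=float)[window:]
    return np.mean(realised < -var)

# illustration on synthetic i.i.d. normal returns
rng = np.random.default_rng(0)
r = rng.normal(0.0, 0.01, 3000)
v = historical_var(r, alpha=0.05, window=500)
print(violation_rate(r, v, window=500))  # close to 0.05 for i.i.d. data
```

On real crisis-period data, volatility clustering typically pushes the violation rate above alpha, which is the underestimation of risk the paper reports.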
APA, Harvard, Vancouver, ISO, and other styles
33

Roca, Pere, Miguel Cervera, Luca Pelá, Roberto Clemente, and Michele Chiumenti. "Viscoelasticity and Damage Model for Creep Behavior of Historical Masonry Structures." Open Civil Engineering Journal 6, no. 1 (November 16, 2012): 188–99. http://dx.doi.org/10.2174/1874149501206010188.

Full text
Abstract:
This paper presents a continuum model for the simulation of the viscous effects and the long-term damage accumulation in masonry structures. The rheological model is based on a generalized Maxwell chain representation with a constitutive law utilizing a limited number of internal variables. Thanks to its computational efficiency, this approach is suitable for the analysis of large and complex structures. In the paper, the viscous and damage models are presented and their coupling is discussed. The FE simulation of the construction process of the representative bay of Mallorca Cathedral is presented, together with the analysis of the long-term effects. The parameters of the model are tentatively calibrated on the basis of the time-dependent viscous deformations detected during the cathedral monitoring.
APA, Harvard, Vancouver, ISO, and other styles
34

Poppick, Andrew, and Karen A. McKinnon. "Observation-Based Simulations of Humidity and Temperature Using Quantile Regression." Journal of Climate 33, no. 24 (December 15, 2020): 10691–706. http://dx.doi.org/10.1175/jcli-d-20-0403.1.

Full text
Abstract:
The human impacts of changes in heat events depend on changes in the joint behavior of temperature and humidity. Little is currently known about these complex joint changes, either in observations or projections from general circulation models (GCMs). Further, GCMs do not fully reproduce the observed joint distribution, implying a need for simulation methods that combine information from GCMs with observations for use in impact studies. We present an observation-based, conditional quantile mapping approach for the simulation of future temperature and humidity. A temperature simulation is first produced by transforming historical temperature observations to include projected changes in the mean and temporal covariance structure from a GCM. Next, a humidity simulation is produced by transforming humidity observations to account for projected changes in the conditional humidity distribution given temperature, using a quantile regression model. We use the Community Earth System Model Large Ensemble (CESM1-LE) to estimate future changes in summertime (June–August) temperature and humidity over the continental United States (CONUS), and then use the proposed method to create future simulations of temperature and humidity at stations in the Global Summary of the Day dataset. We find that CESM1-LE projects decreases in summertime humidity across CONUS for a given deviation in temperature from the forced trend, but increases in the risk of high dewpoint on historically hot days. In comparison with raw CESM1-LE output, our observation-based simulation largely projects smaller changes in the future risk of either high or low humidity on days with historically warm temperatures.
APA, Harvard, Vancouver, ISO, and other styles
35

Lin, Chu-Hsiung, Chang-Cheng Chang Chien, and Sunwu Winfred Chen. "Incorporating the Time-Varying Tail-Fatness into the Historical Simulation Method for Portfolio Value-at-Risk." Review of Pacific Basin Financial Markets and Policies 09, no. 02 (June 2006): 257–74. http://dx.doi.org/10.1142/s0219091506000720.

Full text
Abstract:
This study extends the method of Guermat and Harris (2002), the Power EWMA (exponentially weighted moving average) method, in conjunction with historical simulation to estimate portfolio Value-at-Risk (VaR). Using historical daily return data on three hypothetical portfolios formed from international stock indices, we test the performance of this modified approach to see if it can improve the forecasting accuracy of historical simulation. We highlight that the extended Power EWMA offers the flexibility to capture time-varying tail-fatness and volatilities of financial returns, and may therefore improve the quality of extreme risk management. Our empirical results, derived from the Kupiec (1995) tests and failure ratios, show that our proposed method indeed offers substantial improvements in capturing dynamic return distributions, and can significantly enhance the estimation accuracy of portfolio VaR.
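The general idea of combining an EWMA volatility filter with historical simulation can be sketched as follows. This is a hedged illustration of the related volatility-weighted (Hull-White-style) historical simulation with a RiskMetrics EWMA recursion, not the authors' exact Power EWMA estimator; all function names are ours.

```python
import numpy as np

def ewma_vol(returns, lam=0.94):
    """EWMA volatility estimates via the RiskMetrics-style recursion
    var[t] = lam * var[t-1] + (1 - lam) * r[t-1]^2."""
    r = np.asarray(returns, dtype=float)
    var = np.empty_like(r)
    var[0] = r[0] ** 2 if r[0] != 0 else np.var(r)  # crude initialisation
    for t in range(1, len(r)):
        var[t] = lam * var[t - 1] + (1 - lam) * r[t - 1] ** 2
    return np.sqrt(var)

def vol_weighted_hs_var(returns, alpha=0.01, lam=0.94):
    """Volatility-weighted historical simulation: rescale each past
    return by sigma_today / sigma_then, then take the empirical
    alpha-quantile of the rescaled sample (VaR as a positive number)."""
    r = np.asarray(returns, dtype=float)
    sigma = ewma_vol(r, lam)
    scaled = r * (sigma[-1] / sigma)
    return -np.quantile(scaled, alpha)
```

When current volatility is high relative to the sample history, the rescaling inflates past returns and the resulting VaR rises, which is the time-varying responsiveness that plain historical simulation lacks.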
APA, Harvard, Vancouver, ISO, and other styles
36

Yang, Yujuin, Donald H. Burn, and Barbara J. Lence. "Development of a framework for the selection of a reservoir operating policy." Canadian Journal of Civil Engineering 19, no. 5 (October 1, 1992): 865–74. http://dx.doi.org/10.1139/l92-098.

Full text
Abstract:
A methodology is developed for the selection of a preferred operating policy for a multipurpose reservoir. The methodology consists of policy identification through optimization, policy evaluation through simulation, and policy selection with a multiobjective optimization approach. The approach developed is demonstrated through an application to the Shellmouth Reservoir, a multipurpose reservoir located on the Assiniboine River in Manitoba. The operating policy identified for the Shellmouth Reservoir resulted in a satisfactory performance for the historical period of record. Key words: operating rules, reservoirs, optimization, simulation.
APA, Harvard, Vancouver, ISO, and other styles
37

Mas, Erick, Bruno Adriano, Nelson Pulido, Cesar Jimenez, and Shunichi Koshimura. "Simulation of Tsunami Inundation in Central Peru from Future Megathrust Earthquake Scenarios." Journal of Disaster Research 9, no. 6 (December 1, 2014): 961–67. http://dx.doi.org/10.20965/jdr.2014.p0961.

Full text
Abstract:
We estimated, from twelve scenarios of potential megathrust earthquakes, the tsunami impact on the Lima-Callao region in Central Peru. In addition, we conducted hazard mapping using the local envelope of the maximum inundation simulated in these scenarios. The deterministic approach is supported by decades of geodetic measurements in this area that characterize the interseismic strain build-up since historical megathrust earthquakes. The earthquake scenarios for simulation proposed in [1] introduce spatially correlated short-wavelength slip heterogeneities to a first slip model in [2] calculated from the interseismic coupling (ISC) distribution in Central Peru. The ISC was derived from GPS monitoring data as well as from historical earthquake information. The strong ground motion simulations in [1] reported that the slip scenario with the deepest average peak values along the strike (Mw = 8.86) generates the largest PGA in the Lima-Callao area. In this study, we found from tsunami simulation results that the slip model with the largest peak slip at a shallow depth (Mw = 8.87) yielded the highest tsunami inundation. Such differences in the maximum scenarios for peak ground acceleration and tsunami height reveal the importance of a comprehensive assessment of earthquake and tsunami hazards in order to provide plausible worst-case scenarios for disaster risk management and education.
APA, Harvard, Vancouver, ISO, and other styles
38

Axelrod, Robert. "An Evolutionary Approach to Norms." American Political Science Review 80, no. 4 (December 1986): 1095–111. http://dx.doi.org/10.1017/s0003055400185016.

Full text
Abstract:
Norms provide a powerful mechanism for regulating conflict in groups, even when there are more than two people and no central authority. This paper investigates the emergence and stability of behavioral norms in the context of a game played by people of limited rationality. The dynamics of this new norms game are analyzed with a computer simulation based upon the evolutionary principle that strategies shown to be relatively effective will be used more in the future than less effective strategies. The results show the conditions under which norms can evolve and prove stable. One interesting possibility is the employment of metanorms, the willingness to punish someone who did not enforce a norm. Many historical examples of domestic and international norms are used to illustrate the wide variety of mechanisms that can support norms, including metanorms, dominance, internalization, deterrence, social proof, membership in groups, law, and reputation.
APA, Harvard, Vancouver, ISO, and other styles
39

Tidwell, Matt. "Priming the pump: Does providing information before a crisis communications simulation provide a better learning experience?" Volume 2 2, no. 2019 (March 2019): 31–34. http://dx.doi.org/10.30658/icrcc.2019.9.

Full text
Abstract:
Many crisis communications educators use simulations as a means for students to test their learning in a controlled environment meant to simulate a real-life crisis using an (often hypothetical) organization. This project explores whether providing background and historical information about the organization days or weeks in advance of the simulation can enhance learning. Survey results of students exposed to this method as well as a traditional scenario approach (where all information is provided at once) showed that students preferred the advanced exposure method. The learning experience was judged to be superior overall. In addition, the recognition of understanding risks as well as improvements in teamwork were also noted.
APA, Harvard, Vancouver, ISO, and other styles
40

Masud, Badrul, Quan Cui, Mohamed E. Ammar, Barrie R. Bonsal, Zahidul Islam, and Monireh Faramarzi. "Means and Extremes: Evaluation of a CMIP6 Multi-Model Ensemble in Reproducing Historical Climate Characteristics across Alberta, Canada." Water 13, no. 5 (March 9, 2021): 737. http://dx.doi.org/10.3390/w13050737.

Full text
Abstract:
This study evaluates General Circulation Models (GCMs) participating in the Coupled Model Intercomparison Project Phase 6 (CMIP6) for their ability in simulating historical means and extremes of daily precipitation (P), and daily maximum (Tmax), and minimum temperature (Tmin). Models are evaluated against hybrid observations at 2255 sub-basins across Alberta, Canada using established statistical metrics for the 1983–2014 period. Three extreme indices including consecutive wet days (CWD), summer days (SD), and warm nights (WN) are defined based on the peak over the threshold approach and characterized by duration and frequency. The tail behaviour of extremes is evaluated using the Generalized Pareto Distribution. Regional evaluations are also conducted for four climate sub-regions across the study area. For both mean annual precipitation and mean annual daily temperature, most GCMs more accurately reproduce the observations in northern Alberta and follow a gradient toward the south having the poorest representation in the western mountainous area. Model simulations show statistically better performance in reproducing mean annual daily Tmax than Tmin, and in reproducing annual mean duration compared to the frequency of extreme indices across the province. The Kernel density curves of duration and frequency as simulated by GCMs show closer agreement to that of observations in the case of CWD. However, it is slightly (completely) overestimated (underestimated) by GCMs for warm nights (summer days). The tail behaviour of extremes indicates that GCMs may not incorporate some local processes such as the convective parameterization scheme in the simulation of daily precipitation. Model performances in each of the four sub-regions are quite similar to their performances at the provincial scale. 
Bias-corrected and downscaled GCM simulations using a hybrid approach show that the downscaled GCM simulations better represent the means and extremes of P characteristics compared to Tmax and Tmin. There is no clear indication of an improved tail behaviour of GPD based on downscaled simulations.
APA, Harvard, Vancouver, ISO, and other styles
41

Guido, Vitale, Astarita, and Giofrè. "Comparison Analysis between Real Accident Locations and Simulated Risk Areas in An Urban Road Network." Safety 5, no. 3 (August 27, 2019): 60. http://dx.doi.org/10.3390/safety5030060.

Full text
Abstract:
Recently, many researchers have employed microsimulation techniques to study the chain of interactions among vehicles that, in some circumstances, generates an accident. This approach to studying road safety is named the traffic conflict technique. The aim of this paper is to assess how microscopic simulation can serve as a useful tool to identify potentially unsafe vehicle interactions, and how similar the high-risk locations identified by the microsimulation technique are to those identified using historical accident data. Results show that the high-risk locations identified by the simulation framework are superimposable on those identified using the historical accident database. In particular, the statistical analysis based on Pearson's correlation demonstrates a significant correspondence between the risk rate defined through simulation and the accident rate determined from the observed accident dataset.
APA, Harvard, Vancouver, ISO, and other styles
42

Bonazzi, A., A. L. Dobbin, J. K. Turner, P. S. Wilson, C. Mitas, and E. Bellone. "A Simulation Approach for Estimating Hurricane Risk over a 5-yr Horizon." Weather, Climate, and Society 6, no. 1 (January 1, 2014): 77–90. http://dx.doi.org/10.1175/wcas-d-13-00025.1.

Full text
Abstract:
We develop a stochastic North Atlantic hurricane track model whose climate inputs are Atlantic main development region (MDR) and Indo-Pacific (IP) sea surface temperatures, and we produce extremely long model simulations for 58 different climates, each one conditioned on 5 yr of observed SSTs from 1950 to 2011, hereafter referred to as medium-term (MT) views. Stringent tests are then performed to prove that MT simulations are better predictors of hurricane landfalls than a long-term view conditioned on the entire SST time series from 1950 to 2011. In this analysis, the authors extrapolate beyond the historical record, but not in terms of a forecast of future conditions. Rather, the aim is to define, within the limitations of the modeling approach, the magnitude of extreme events that could have materialized in the past at fixed probability thresholds, and the likelihood of observed landfalls given such estimates. Finally, a loss proxy is built and the value of the analysis results is shown from a simplified property and casualty insurance perspective. Medium-term simulations of hurricane activity are used to set the strategy of reinsurance coverage purchased by a hypothetical primary insurer, leading to improved solvency margins.
APA, Harvard, Vancouver, ISO, and other styles
43

Piselli, Cristina, Jessica Romanelli, Matteo Di Grazia, Augusto Gavagni, Elisa Moretti, Andrea Nicolini, Franco Cotana, Francesco Strangis, Henk J. L. Witte, and Anna Laura Pisello. "An Integrated HBIM Simulation Approach for Energy Retrofit of Historical Buildings Implemented in a Case Study of a Medieval Fortress in Italy." Energies 13, no. 10 (May 20, 2020): 2601. http://dx.doi.org/10.3390/en13102601.

Full text
Abstract:
The Italian building stock consists of buildings mainly constructed until the mid-20th century using pre-industrial construction techniques. These buildings require energy refurbishment that takes into account the preservation of their architectural heritage. With this in view, this work studies an innovative integrated modelling and simulation framework that implements Historical Building Information Modeling (HBIM) for the energy retrofit of historical buildings with a renewable geothermal HVAC system. To this aim, the field case study is part of a medieval complex in Central Italy (Perugia), representative of ancient rural offshore architecture in the European countryside. The system consists of a ground-source heat pump, a water tank for thermal energy storage connected to a low-temperature radiant system, and an air-handling unit. The building heating energy performance, typically influenced by thermal inertia in historical buildings, when coupled to the novel HVAC system, is comparatively assessed against a traditional scenario implementing a natural-gas boiler, and made inter-operable within the ad hoc HBIM platform. Results show that the innovative renewable energy system provides relevant benefits in terms of energy saving, CO2 emissions offset and operating costs compared to the traditional existing system, while having only a minor visual and architectural impact within the historical complex. The integrated HBIM approach may effectively drive the path toward the regeneration and re-functioning of heritage in Europe.
APA, Harvard, Vancouver, ISO, and other styles
44

Vuojärvi, Hanna, and Saana Korva. "An ethnographic study on leadership-as-practice in trauma simulation training." Leadership in Health Services 33, no. 2 (March 9, 2020): 185–200. http://dx.doi.org/10.1108/lhs-06-2019-0031.

Full text
Abstract:
Purpose This study aims to discover how leadership emerges in a hospital’s trauma team in a simulated trauma care situation. Instead of investigating leadership from a leader-centric perspective, or using a metrics-based approach to reach generalizable results, the study aims to draw from post-heroic theories by applying leadership-as-practice and sociomaterial perspectives that emphasize the cultural-historical context and emergent nature of leadership. Design/methodology/approach The study was conducted in a Finnish central hospital through ethnographic observations of 14 in situ trauma simulation trainings over a period of 13 months. The data consist of vignettes developed and written from field notes. The analysis was informed by the cultural-historical activity theory. Findings Leadership in a trauma team during an in situ simulation training emerges from a complex system of agencies taking place simultaneously. Contextual elements contributed to the goal. Clarity of roles and task division, strong execution of leadership at critical points, active communication and maintenance of disciplined communication helped to overcome difficulties. The team developed coordination of the process in conjunction with the care. Originality/value The study considers trauma leadership to be a practical phenomenon emerging from the trauma team’s sociomaterial context. The results can be used to develop non-technical skills training within the field of simulation-based medical training.
APA, Harvard, Vancouver, ISO, and other styles
45

McCown, R. L., and K. A. Parton. "Learning from the historical failure of farm management models to aid management practice. Part 2. Three systems approaches." Australian Journal of Agricultural Research 57, no. 2 (2006): 157. http://dx.doi.org/10.1071/ar05052.

Full text
Abstract:
Part 1 analysed the difficulties experienced in the field of academic Farm Management in making complex theoretical models relevant to farming. This paper highlights the important connections developed between the field of Farm Management economics and 3 key ‘systems’ ideas and tools in agricultural science in response to difficulties and opportunities. The first systems approach reviewed is the 20-year experiment by agricultural economists in using crop and animal production simulation models in management analyses. The second systems approach reviewed is Farming Systems Research (FSR), an approach characterised by on-farm experimentation with a management orientation. Many pioneers of FSR were Farm Management economists disenchanted by the inapplicability of economic theory to farm management. The FSR that emerged is interpreted as a scion of the early era of Farm Management prior to the coup by economics theorists in the 1940s. A third systems approach reviewed is a ‘soft’ intervention to facilitate farmer learning. Although evolving from FSR, this approach has surprising similarities to the ‘goal adjusting’ consulting performed by the legendary Australian Farm Management consultant cum academic, Jack Makeham. The paper concludes with discussion of a recent innovation that combines these 3 approaches. It uses a soft intervention approach that features farmers shaping their goals and expectations by ‘experimenting’ in a local, but virtual, environment provided by simulation of the production system using ‘hard’ models.
APA, Harvard, Vancouver, ISO, and other styles
46

Al-Azri, Nasser, and Saleh Al-Saadi. "Variant Developments of Typical Meteorological Years (TMYs) for Seeb, Oman and their Impact on Energy Simulation of Residential Buildings." Journal of Engineering Research [TJER] 15, no. 2 (November 7, 2018): 129. http://dx.doi.org/10.24200/tjer.vol15iss2pp129-141.

Full text
Abstract:
Typical meteorological years (TMYs) are widely used for the analysis and simulation of energy-intensive systems. The reliability of a developed typical year depends on the accuracy of the historical record of weather data as well as the fitness of the developed approach to the application. In this work, a TMY for Seeb area in the Muscat Governorate, Oman was developed using different approaches. The developed TMYs are compared to the current commonly used TMY which is based on 1985-2001 records that have many gaps and anomalies and hence have intensive interpolation treatment. The different TMYs were compared by simulating energy consumption of a typical residential building and also by studying applicability of passive cooling strategies. The findings showed that the variation in energy consumption is minimal for the different TMY development approaches for the same set of historical records but the difference is very significant when the comparison is based on the two sets from the two periods of records.
APA, Harvard, Vancouver, ISO, and other styles
47

Al-Azri, Nasser, and Saleh Al-Saadi. "Variant Developments of Typical Meteorological Years (TMYs) for Seeb, Oman and their Impact on Energy Simulation of Residential Buildings." Journal of Engineering Research [TJER] 15, no. 2 (November 7, 2018): 29. http://dx.doi.org/10.24200/tjer.vol15iss2pp29-41.

Full text
Abstract:
Typical meteorological years (TMYs) are widely used for the analysis and simulation of energy-intensive systems. The reliability of a developed typical year depends on the accuracy of the historical record of weather data as well as the fitness of the developed approach to the application. In this work, a TMY for Seeb area in the Muscat Governorate, Oman was developed using different approaches. The developed TMYs are compared to the current commonly used TMY which is based on 1985-2001 records that have many gaps and anomalies and hence have intensive interpolation treatment. The different TMYs were compared by simulating energy consumption of a typical residential building and also by studying applicability of passive cooling strategies. The findings showed that the variation in energy consumption is minimal for the different TMY development approaches for the same set of historical records but the difference is very significant when the comparison is based on the two sets from the two periods of records.
APA, Harvard, Vancouver, ISO, and other styles
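TMY construction commonly ranks candidate months with the Finkelstein-Schafer (FS) statistic, the mean absolute difference between a candidate month's empirical CDF of a weather variable and the long-term CDF of that variable. The abstract does not spell out which selection approaches were compared, so the following is only an illustrative sketch of the FS criterion on synthetic temperature data, not the authors' exact procedure:

```python
import numpy as np

def fs_statistic(month_data, longterm_data):
    """Finkelstein-Schafer statistic: mean absolute difference between the
    empirical CDF of one candidate month and the long-term empirical CDF,
    evaluated at the candidate month's own data points."""
    month_sorted = np.sort(month_data)
    cdf_month = np.arange(1, len(month_sorted) + 1) / len(month_sorted)
    lt_sorted = np.sort(longterm_data)
    cdf_lt = np.searchsorted(lt_sorted, month_sorted, side="right") / len(lt_sorted)
    return np.mean(np.abs(cdf_month - cdf_lt))

# Synthetic example: 17 years (1985-2001) of daily July temperatures
rng = np.random.default_rng(0)
candidates = {yr: rng.normal(30 + 0.1 * (yr % 3), 5, size=31)
              for yr in range(1985, 2002)}
longterm = np.concatenate(list(candidates.values()))

# The candidate July closest to the long-term distribution wins
best_year = min(candidates, key=lambda yr: fs_statistic(candidates[yr], longterm))
```

In a full TMY procedure, a weighted sum of FS statistics over several variables (dry-bulb temperature, humidity, solar radiation, wind) selects one representative month per calendar month; the weights are one point where published approaches differ.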
48

Mandinach, Ellen B. "Model-Building and the Use of Computer Simulation of Dynamic Systems." Journal of Educational Computing Research 5, no. 2 (May 1989): 221–43. http://dx.doi.org/10.2190/7w4f-xy0h-l6fh-39r8.

Full text
Abstract:
The Systems Thinking and Curriculum Innovation (STACI) Project is a multi-year research effort that examines the cognitive impact of learning from a systems thinking approach to instruction and from using simulation-modeling software. Systems thinking is an analytic problem-solving tool that can be integrated into courses to supplement and enhance instruction in a variety of content areas. The purpose of the study is to test the potential and effects of using the technology-based approach in secondary school curricula to teach content-specific knowledge as well as general problem-solving skills. The research focuses on the effects of introducing a software environment that enables students to learn from, and construct concrete multiple representations of, scientific, mathematical, and historical phenomena.
APA, Harvard, Vancouver, ISO, and other styles
49

Witzany, Jiří. "A Bayesian Approach to Measurement of Backtest Overfitting." Risks 9, no. 1 (January 8, 2021): 18. http://dx.doi.org/10.3390/risks9010018.

Full text
Abstract:
Quantitative investment strategies are often selected from a broad class of candidate models estimated and tested on historical data. Standard statistical techniques to prevent model overfitting, such as out-of-sample backtesting, turn out to be unreliable when the selection is based on the results of too many models tested on the holdout sample. There is an ongoing discussion of how to estimate the probability of backtest overfitting and how to adjust expected performance indicators such as the Sharpe ratio to properly reflect the effect of multiple testing. We propose a consistent Bayesian approach that yields the desired robust estimates on the basis of a Markov chain Monte Carlo (MCMC) simulation. The approach is tested on a class of technical trading strategies where a seemingly profitable strategy can be selected under the naïve approach.
APA, Harvard, Vancouver, ISO, and other styles
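The multiple-testing effect described in the abstract can be demonstrated with a short simulation. This is only an illustration of why naïve selection overstates performance, not the paper's Bayesian MCMC estimator: among many zero-skill strategies, the one with the best in-sample Sharpe ratio looks profitable by chance, and its advantage vanishes out of sample.

```python
import numpy as np

rng = np.random.default_rng(42)
n_strategies, n_days = 200, 252

# Zero-skill strategies: daily returns are pure noise, so the true Sharpe
# ratio of every strategy is exactly zero.
returns = rng.normal(0.0, 0.01, size=(n_strategies, n_days))

def annualized_sharpe(r):
    return np.sqrt(252) * r.mean() / r.std(ddof=1)

# Split each track record into an in-sample and a holdout half
in_sample, out_sample = returns[:, :126], returns[:, 126:]

is_sharpes = np.array([annualized_sharpe(r) for r in in_sample])
best = is_sharpes.argmax()

# The naively selected strategy looks profitable in sample...
print(f"best in-sample Sharpe:    {is_sharpes[best]:.2f}")
# ...but its out-of-sample Sharpe collapses back toward zero
print(f"its out-of-sample Sharpe: {annualized_sharpe(out_sample[best]):.2f}")
```

Repeating the experiment many times gives the distribution of the selection bias; the paper's contribution is to estimate and correct this bias in a Bayesian framework rather than by brute-force resampling.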