Academic literature on the topic 'Clouds Stochastic processes'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Clouds Stochastic processes.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Clouds Stochastic processes"

1

Mueller, Eli A., and Samuel N. Stechmann. "Shallow-cloud impact on climate and uncertainty: A simple stochastic model." Mathematics of Climate and Weather Forecasting 6, no. 1 (March 20, 2020): 16–37. http://dx.doi.org/10.1515/mcwf-2020-0002.

Full text
Abstract:
Shallow clouds are a major source of uncertainty in climate predictions. Several different sources of the uncertainty are possible—e.g., from different models of shallow cloud behavior, which could produce differing predictions and ensemble spread within an ensemble of models, or from inherent, natural variability of shallow clouds. Here, the latter (inherent variability) is investigated, using a simple model of radiative statistical equilibrium, with oceanic and atmospheric boundary layer temperatures, To and Ta, and with moisture q and basic cloud processes. Stochastic variability is used to generate a statistical equilibrium with climate variability. The results show that the intrinsic variability of the climate is enhanced due to the presence of shallow clouds. In particular, the on-and-off switching of cloud formation and decay is a source of additional climate variability and uncertainty, beyond the variability of a cloud-free climate. Furthermore, a sharp transition in the mean climate occurs as environmental parameters are changed, and the sharp transition in the mean is also accompanied by a substantial enhancement of climate sensitivity and uncertainty. Two viewpoints of this behavior are described, based on bifurcations and phase transitions/statistical physics. The sharp regime transitions are associated with changes in several parameters, including cloud albedo and longwave absorptivity/carbon dioxide concentration, and the climate state transitions between a partially cloudy state and a state of full cloud cover like closed-cell stratocumulus clouds. Ideas of statistical physics can provide a conceptual perspective to link the climate state transitions, increased climate uncertainty, and other related behavior.
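For readers who want to experiment with this kind of model, the following is a minimal sketch of a stochastic energy-balance toy with an on/off cloud indicator, integrated with the Euler-Maruyama method. All parameter values and the moisture-threshold switching rule are illustrative assumptions, not the authors' model.

```python
# Illustrative sketch only: a toy stochastic energy-balance model with an
# on/off cloud indicator, loosely inspired by the paper's setup. Parameters
# and the switching rule are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
dt, n_steps = 0.01, 200_000
T, q = 0.0, 0.0          # temperature and moisture anomalies
cloud = 0                # 0 = clear, 1 = cloudy
tau_T, tau_q = 1.0, 0.5  # relaxation time scales (assumed)
albedo_forcing = -0.8    # cooling applied while the cloud is "on" (assumed)
sigma_T, sigma_q = 0.3, 0.4

T_hist = np.empty(n_steps)
for k in range(n_steps):
    # cloud switches on when moisture exceeds a threshold, off below a lower one
    cloud = 1 if q > 0.5 else 0 if q < 0.3 else cloud
    dW_T, dW_q = rng.normal(0.0, np.sqrt(dt), 2)
    T += (-T / tau_T + cloud * albedo_forcing) * dt + sigma_T * dW_T
    q += (-q / tau_q) * dt + sigma_q * dW_q
    T_hist[k] = T

# variance of the statistical equilibrium; cloud switching adds variability
print("equilibrium temperature variance:", T_hist.var())
```

Comparing the equilibrium variance with and without the cloud term (set albedo_forcing = 0) reproduces, in toy form, the abstract's point that on-and-off cloud switching is a source of additional climate variability.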
2

Tacchella, Sandro, John C. Forbes, and Neven Caplar. "Stochastic modelling of star-formation histories II: star-formation variability from molecular clouds and gas inflow." Monthly Notices of the Royal Astronomical Society 497, no. 1 (June 26, 2020): 698–725. http://dx.doi.org/10.1093/mnras/staa1838.

Full text
Abstract:
A key uncertainty in galaxy evolution is the physics regulating star formation, ranging from small-scale processes related to the life-cycle of molecular clouds within galaxies to large-scale processes such as gas accretion on to galaxies. We study the imprint of such processes on the time-variability of star formation with an analytical approach tracking the gas mass of galaxies (‘regulator model’). Specifically, we quantify the strength of the fluctuation in the star-formation rate (SFR) on different time-scales, i.e. the power spectral density (PSD) of the star-formation history, and connect it to gas inflow and the life-cycle of molecular clouds. We show that in the general case the PSD of the SFR has three breaks, corresponding to the correlation time of the inflow rate, the equilibrium time-scale of the gas reservoir of the galaxy, and the average lifetime of individual molecular clouds. On long and intermediate time-scales (relative to the dynamical time-scale of the galaxy), the PSD is typically set by the variability of the inflow rate and the interplay between outflows and gas depletion. On short time-scales, the PSD shows an additional component related to the life-cycle of molecular clouds, which can be described by a damped random walk with a power-law slope of β ≈ 2 at high frequencies with a break near the average cloud lifetime. We discuss star-formation ‘burstiness’ in a wide range of galaxy regimes, study the evolution of galaxies about the main sequence ridgeline, and explore the applicability of our method for understanding the star-formation process on cloud-scale from galaxy-integrated measurements.
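The damped-random-walk component mentioned here corresponds to an Ornstein-Uhlenbeck process, whose power spectral density is a Lorentzian; the sketch below is that textbook result, shown for orientation rather than taken from the paper:

```latex
% PSD of a damped random walk (Ornstein--Uhlenbeck process) with
% equilibrium variance \sigma^2 and correlation time \tau:
S(f) = \frac{2\,\sigma^{2}\tau}{1 + (2\pi f \tau)^{2}}
\;\approx\;
\begin{cases}
2\sigma^{2}\tau, & f \ll (2\pi\tau)^{-1} \quad \text{(flat)},\\[4pt]
\sigma^{2} / \bigl(2\pi^{2}\tau f^{2}\bigr), & f \gg (2\pi\tau)^{-1} \quad (\beta = 2),
\end{cases}
```

so a break appears near f ≈ (2πτ)⁻¹; identifying τ with the average cloud lifetime gives exactly the high-frequency behaviour described in the abstract.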
3

Feitzinger, J. V., E. Harfst, and J. Spicker. "Stochastic Star Formation and Magnetic Fields." Symposium - International Astronomical Union 140 (1990): 257–58. http://dx.doi.org/10.1017/s0074180900190175.

Full text
Abstract:
The model of self-propagating star formation uses local processes (200 pc cell size) in the interstellar medium to simulate the large-scale cooperative behaviour of spiral structure in galaxies. The dynamics of the model galaxies are taken into account via the mass distribution and the resulting rotation curve; flat rotation curves are used. The interstellar medium is treated as a multiphase medium with appropriate cooling times and density history. The phases are: molecular gas, cool HI gas, warm intercloud and HII gas, and hot coronal fountain gas. A detailed gas reshuffling between the star-forming cells in the plane and outside the galactic plane controls the cell content. Two processes working stochastically are incorporated: the building and the decay of molecular clouds and the star-forming events in the molecular clouds.
4

Shutts, Glenn, Thomas Allen, and Judith Berner. "Stochastic parametrization of multiscale processes using a dual-grid approach." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 366, no. 1875 (April 29, 2008): 2623–39. http://dx.doi.org/10.1098/rsta.2008.0035.

Full text
Abstract:
Some speculative proposals are made for extending current stochastic sub-gridscale parametrization methods using the techniques adopted from the field of computer graphics and flow visualization. The idea is to emulate sub-filter-scale physical process organization and time evolution on a fine grid and couple the implied coarse-grained tendencies with a forecast model. A two-way interaction is envisaged so that fine-grid physics (e.g. deep convective clouds) responds to forecast model fields. The fine-grid model may be as simple as a two-dimensional cellular automaton or as computationally demanding as a cloud-resolving model similar to the coupling strategy envisaged in ‘super-parametrization’. Computer codes used in computer games and visualization software illustrate the potential for cheap but realistic simulation where emphasis is placed on algorithmic stability and visual realism rather than pointwise accuracy in a predictive sense. In an ensemble prediction context, a computationally cheap technique would be essential and some possibilities are outlined. An idealized proof-of-concept simulation is described, which highlights technical problems such as the nature of the coupling.
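As a flavour of how cheap such a fine-grid emulator can be, here is a minimal stochastic 2D cellular automaton whose coarse-grained activity could be coupled to a forecast model's convective tendencies. The update rule, the probabilities, and the 8x8 coarse grid are illustrative assumptions, not the scheme discussed in the paper.

```python
# Minimal sketch of a stochastic 2D cellular automaton used as a cheap
# fine-grid "physics emulator". Rules and coupling are assumptions.
import numpy as np

rng = np.random.default_rng(1)
N, p_birth, p_survive = 128, 0.02, 0.8
cells = (rng.random((N, N)) < 0.05).astype(int)  # 1 = "convective" cell

def step(c):
    # number of active neighbours (periodic boundaries)
    nbrs = sum(np.roll(np.roll(c, i, 0), j, 1)
               for i in (-1, 0, 1) for j in (-1, 0, 1) if (i, j) != (0, 0))
    birth = (c == 0) & (rng.random(c.shape) < p_birth * nbrs)
    survive = (c == 1) & (rng.random(c.shape) < p_survive)
    return (birth | survive).astype(int)

for _ in range(100):
    cells = step(cells)

# coarse-grain to a hypothetical 8x8 forecast-model grid: the mean CA
# activity per coarse box would perturb the model's convective tendencies
coarse = cells.reshape(8, N // 8, 8, N // 8).mean(axis=(1, 3))
print(coarse.round(2))
```

The two-way coupling envisaged in the abstract would then feed forecast-model fields back into p_birth and p_survive, so that the automaton responds to the resolved flow.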
5

Arakawa, Akio, and Chien-Ming Wu. "A Unified Representation of Deep Moist Convection in Numerical Modeling of the Atmosphere. Part I." Journal of the Atmospheric Sciences 70, no. 7 (July 1, 2013): 1977–92. http://dx.doi.org/10.1175/jas-d-12-0330.1.

Full text
Abstract:
A generalized framework for cumulus parameterization applicable to any horizontal resolution between those typically used in general circulation and cloud-resolving models is presented. It is pointed out that the key parameter in the generalization is σ, which is the fractional area covered by convective updrafts in the grid cell. Practically all conventional cumulus parameterizations assume σ ≪ 1, at least implicitly, using the gridpoint values of the thermodynamic variables to define the thermal structure of the cloud environment. The proposed framework, called “unified parameterization,” eliminates this assumption from the beginning, allowing a smooth transition to an explicit simulation of cloud-scale processes as the resolution increases. If clouds and the environment are horizontally homogeneous with a top-hat profile, as is widely assumed in the conventional parameterizations, it is shown that the σ dependence of the eddy transport is through a simple quadratic function. Together with a properly chosen closure, the unified parameterization determines σ for each realization of grid-scale processes. The parameterization can also provide a framework for including stochastic parameterization. The remaining issues include parameterization of the in-cloud eddy transport because of the inhomogeneous structure of clouds.
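The quadratic σ dependence can be written out explicitly for a top-hat profile. In generic notation (updraft values w_c, ψ_c over fraction σ; environment values w_e, ψ_e over the remaining 1 − σ), the grid-mean eddy flux of a scalar ψ is

```latex
\overline{w'\psi'} \;=\; \sigma\,(1-\sigma)\,(w_c - w_e)\,(\psi_c - \psi_e),
```

which reduces to the conventional σ ≪ 1 limit (flux proportional to σ) and vanishes as σ → 1, when the updraft fills the grid cell and is resolved explicitly.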
6

Palmer, T. N., and P. D. Williams. "Introduction. Stochastic physics and climate modelling." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 366, no. 1875 (April 29, 2008): 2419–25. http://dx.doi.org/10.1098/rsta.2008.0059.

Full text
Abstract:
Finite computing resources limit the spatial resolution of state-of-the-art global climate simulations to hundreds of kilometres. In neither the atmosphere nor the ocean are small-scale processes such as convection, clouds and ocean eddies properly represented. Climate simulations are known to depend, sometimes quite strongly, on the resulting bulk-formula representation of unresolved processes. Stochastic physics schemes within weather and climate models have the potential to represent the dynamical effects of unresolved scales in ways which conventional bulk-formula representations are incapable of so doing. The application of stochastic physics to climate modelling is a rapidly advancing, important and innovative topic. The latest research findings are gathered together in the Theme Issue for which this paper serves as the introduction.
7

Prat, O. P., and A. P. Barros. "Exploring the use of a column model for the characterization of microphysical processes in warm rain: results from a homogeneous rainshaft model." Advances in Geosciences 10 (April 26, 2007): 145–52. http://dx.doi.org/10.5194/adgeo-10-145-2007.

Full text
Abstract:
A study of the evolution of raindrop spectra (raindrop size distribution, DSD) between cloud base and the ground surface was conducted using a column model of stochastic coalescence-breakup dynamics. Numerical results show that, under steady-state boundary conditions (i.e. constant rainfall rate and DSD at the top of the rainshaft), the equilibrium DSD is achieved only for high rain rates produced by midlevel or higher clouds and after long simulation times (~30 min or greater). Because these conditions are not typical of most rainfall, the results suggest that the theoretical equilibrium DSD might not be attainable for the duration of individual rain events, and thus DSD observations from field experiments should be analyzed conditional on the specific storm environment under which they were obtained.
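The coalescence part of the kinetic collection equation solved in such rainshaft models is the classical Smoluchowski form; breakup contributes analogous source and sink terms weighted by a fragment distribution:

```latex
% n(m,t): drop number density per unit mass; K: collision--coalescence kernel
\frac{\partial n(m,t)}{\partial t}
  = \frac{1}{2}\int_{0}^{m} K(m-m',\,m')\,n(m-m',t)\,n(m',t)\,dm'
  \;-\; n(m,t)\int_{0}^{\infty} K(m,\,m')\,n(m',t)\,dm'
```

The first term creates drops of mass m by coalescence of smaller pairs; the second removes drops of mass m that collide with any other drop.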
8

Mohamadi, Sara, and David Lattanzi. "Life-Cycle Modeling of Structural Defects via Computational Geometry and Time-Series Forecasting." Sensors 19, no. 20 (October 21, 2019): 4571. http://dx.doi.org/10.3390/s19204571.

Full text
Abstract:
The evaluation of geometric defects is necessary in order to maintain the integrity of structures over time. These assessments are designed to detect damages of structures and ideally help inspectors to estimate the remaining life of structures. Current methodologies for monitoring structural systems, while providing useful information about the current state of a structure, are limited in the monitoring of defects over time and in linking them to predictive simulation. This paper presents a new approach to the predictive modeling of geometric defects. A combination of segments from point clouds is parameterized using the convex hull algorithm to extract features from detected defects, and a stochastic dynamic model is then adapted to these features to model the evolution of the hull over time. Describing a defect in terms of its parameterized hull enables consistent temporal tracking for predictive purposes, while implicitly reducing data dimensionality and complexity as well. In this study, two-dimensional (2D) point clouds analogous to information derived from point clouds were first generated over simulated life cycles. The evolutions of point cloud hull parameterizations were modeled as stochastic dynamical processes via autoregressive integrated moving average (ARIMA) and vector autoregression (VAR) and compared against ground truth. The results indicate that this convex hull approach provides consistent and accurate representations of defect evolution across a range of defect topologies and is reasonably robust to noisy measurements; however, assumptions regarding the underlying dynamical process play a significant role in predictive accuracy. The results were then validated on experimental data from fatigue testing with high accuracy. Longer term, the results of this work will support finite element model updating for predictive analysis of structural capacity.
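A compact sketch of this pipeline, convex-hull feature extraction followed by ARIMA forecasting, is given below. The synthetic growing-defect data and the ARIMA(1,1,1) order are illustrative assumptions; the paper's actual parameterization and models are richer.

```python
# Sketch: parameterize a 2D defect point cloud by its convex hull, track
# scalar hull features over simulated inspection epochs, and forecast them
# with ARIMA. Data and model order are illustrative assumptions.
import numpy as np
from scipy.spatial import ConvexHull
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)

def hull_features(points):
    hull = ConvexHull(points)
    # for 2D input, ConvexHull.volume is the enclosed area, .area the perimeter
    return hull.volume, hull.area

# simulate a slowly growing defect observed at 40 epochs (synthetic stand-in
# for point clouds derived from laser scans)
areas = []
for t in range(40):
    cloud = rng.normal(scale=1.0 + 0.02 * t, size=(200, 2))
    area, _ = hull_features(cloud)
    areas.append(area)

# fit an ARIMA(1,1,1) to the hull-area series and forecast five epochs ahead
fit = ARIMA(np.asarray(areas), order=(1, 1, 1)).fit()
print(fit.forecast(steps=5))
```

Swapping the ARIMA for a VAR over several hull features (area, perimeter, centroid drift) follows the same pattern, which is the comparison the abstract describes.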
9

Jameson, A. R., and A. J. Heymsfield. "A Bayesian Approach to Upscaling and Downscaling of Aircraft Measurements of Ice Particle Counts and Size Distributions." Journal of Applied Meteorology and Climatology 52, no. 9 (September 2013): 2075–88. http://dx.doi.org/10.1175/jamc-d-12-0301.1.

Full text
Abstract:
This study addresses the issue of how to upscale cloud-sized in situ measurements of ice to yield realistic simulations of ice clouds for a variety of modeling studies. Aircraft measurements of ice particle counts along a 79-km zigzag path were collected in a Costa Rican cloud formed in the upper-level outflow from convection. These are then used to explore the applicability of Bayesian statistics to the problems of upscaling and downscaling. Using the 10-m particle counts, the analyses using Bayesian statistics provide estimates of the probability distribution function of all possible mean values corresponding to these counts. The statistical method of copulas is used to produce an extensive ensemble of estimates of these mean values, which are then combined to derive the probability density function (pdf) of mean values at 1-km resolution. These are found to compare very well to the observed 1-km particle counts when spatial correlation is included. The profiles of the observed and simulated mean counts along the flight path show similar features and have very similar statistical characteristics. However, because the observed and the simulated counts are both the results of stochastic processes, there is no way to upscale exactly to the observed profile. Each simulation is a unique realization of the stochastic processes, as are the observations themselves. These different realizations over all the different sizes can then be used to upscale particle size distributions over large areas.
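The conjugate Bayesian ingredient behind this kind of count upscaling is simple to state in closed form. In generic notation chosen here for illustration (not the paper's), a Poisson count k with a Gamma prior on its mean μ yields a Gamma posterior:

```latex
k \sim \mathrm{Poisson}(\mu), \qquad \mu \sim \mathrm{Gamma}(\alpha,\beta)
\;\;\Longrightarrow\;\;
\mu \mid k \;\sim\; \mathrm{Gamma}(\alpha + k,\; \beta + 1).
```

Sampling many such posteriors, with spatial correlation imposed (e.g., via a copula, as the abstract describes), produces the ensembles of mean values that are then aggregated to coarser resolution.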
10

Cirisan, A., B. P. Luo, I. Engel, F. G. Wienhold, U. K. Krieger, U. Weers, G. Romanens, et al. "Balloon-borne match measurements of mid-latitude cirrus clouds." Atmospheric Chemistry and Physics Discussions 13, no. 10 (October 2, 2013): 25417–79. http://dx.doi.org/10.5194/acpd-13-25417-2013.

Full text
Abstract:
Observations of persistent high supersaturations with respect to ice inside cirrus clouds are challenging our understanding of cloud microphysics and of climate feedback processes in the upper troposphere. Single measurements of a cloudy air mass provide only a snapshot from which the persistence of ice supersaturation cannot be judged. We introduce here the "cirrus match technique" to obtain information on the evolution of clouds and their saturation ratio. The aim of these coordinated balloon soundings is to analyze the same air mass twice. To this end the standard radiosonde equipment is complemented by a frost point hygrometer "SnowWhite" and a particle backscatter detector "COBALD" (Compact Optical Backscatter Aerosol Detector). Extensive trajectory calculations based on regional weather model COSMO forecasts are performed for flight planning, and COSMO analyses are used as the basis for comprehensive microphysical box modeling (with grid scales of 2 km and 7 km, respectively). Here we present the results of matching a cirrus cloud to within 2–15 km, realized on 8 June 2010 over Payerne, Switzerland, and a location 120 km downstream close to Zurich. A thick cirrus was detected over both measurement sites. We show that in order to quantitatively reproduce the measured particle backscatter ratios, the small-scale temperature fluctuations not resolved by COSMO must be superimposed on the trajectories. The stochastic nature of the fluctuations is captured by ensemble calculations. Possibilities for further improvements in the agreement with the measured backscatter data are investigated by assuming a very slow mass accommodation of water on ice, the presence of heterogeneous ice nuclei, or a wide span of (spheroidal) particle shapes. However, the resulting improvements from microphysical refinements are moderate and comparable in magnitude with changes caused by assuming different regimes of temperature fluctuations for clear sky or cloudy sky conditions, highlighting the importance of a proper treatment of subscale fluctuations. The model yields good agreement with the measured backscatter over both sites and reproduces the measured saturation ratios with respect to ice over Payerne. Conversely, the 30% in-cloud supersaturation measured in a massive, 4-km thick cloud layer over Zurich cannot be reproduced, irrespective of the choice of meteorological or microphysical model parameters. The measured supersaturation can only be explained by either resorting to an unknown physical process, which prevents the ice particles from consuming the excess humidity, or – much more likely – by a measurement error, such as a contamination of the sensor housing of the SnowWhite hygrometer by a precipitation drop from a mixed phase cloud just below the cirrus layer or from some very slight rain in the boundary layer. This uncertainty calls for in-flight checks or calibrations of hygrometers under the extreme humidity conditions in the upper troposphere.
More sources

Dissertations / Theses on the topic "Clouds Stochastic processes"

1

Xue, Yan. "Effects of air turbulence and stochastic coalescence on the size distribution of cloud droplets." Access to citation, abstract and download form provided by ProQuest Information and Learning Company; downloadable PDF file, 214 p, 2006. http://proquest.umi.com/pqdweb?did=1172112611&sid=2&Fmt=2&clientId=8331&RQT=309&VName=PQD.

Full text
2

Leach, Ryan N. "Bulk meteorological parameters for diagnosing cloudiness in the stochastic cloud forecast model." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2006. http://library.nps.navy.mil/uhtbin/hyperion/06Mar%5FLeach.pdf.

Full text
3

Ekholm, Harald, and Daniel Englund. "Cost optimization in the cloud : An analysis on how to apply an optimization framework to the procurement of cloud contracts at Spotify." Thesis, Linköpings universitet, Produktionsekonomi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-168441.

Full text
Abstract:
In the modern era of IT, cloud computing is becoming the new standard. Companies have gone from owning their own data centers to procuring virtualized computational resources as a service. This technology enables elasticity and cost savings. Computational resources have gone from being a capital expenditure to an operational expenditure. Vendors, such as Google, Amazon, and Microsoft, offer these services globally with different provisioning alternatives. In this thesis, we focus on providing a cost optimization algorithm for Spotify on the Google Cloud Platform. To achieve this we construct an algorithm that breaks the problem up into four parts. Firstly, we generate trajectories of monthly active users. Secondly, we split these trajectories up by region and redistribute monthly active users to better describe the actual Google Cloud Platform footprint. Thirdly, we calculate usage-per-monthly-active-user quotas from a representative week of usage and use these to translate the redistributed trajectories into usage. Lastly, we apply an optimization algorithm to these trajectories and obtain an objective value. These results are then evaluated using statistical methods to determine their reliability. The final model solves the problem to optimality and provides statistically reliable results. As a consequence, we can give recommendations to Spotify on how to minimize their cloud cost while considering the uncertainty in demand.
4

Ghosh, Rahul. "Scalable Stochastic Models for Cloud Services." Diss., 2012. http://hdl.handle.net/10161/6110.

Full text
Abstract:

Cloud computing appears to be a paradigm shift in service oriented computing. Massively scalable Cloud architectures are spawned by new business and social applications as well as Internet driven economics. Besides being inherently large scale and highly distributed, Cloud systems are almost always virtualized and operate in automated shared environments. The deployed Cloud services are still in their infancy and a variety of research challenges need to be addressed to predict their long-term behavior. Performance and dependability of Cloud services are in general stochastic in nature and they are affected by a large number of factors, e.g., nature of workload and faultload, infrastructure characteristics and management policies. As a result, developing scalable and predictive analytics for Cloud becomes difficult and non-trivial. This dissertation presents the research framework needed to develop high fidelity stochastic models for large scale enterprise systems using Cloud computing as an example. Throughout the dissertation, we show how the developed models are used for: (i) performance and availability analysis, (ii) understanding of power-performance trade-offs, (iii) resiliency quantification, (iv) cost analysis and capacity planning, and (v) risk analysis of Cloud services. In general, the models and approaches presented in this thesis can be useful to a Cloud service provider for planning, forecasting, bottleneck detection, what-if analysis or overall optimization during design, development, testing and operational phases of a Cloud.


5

De La Chevrotière, Michèle. "Stochastic and Numerical Models for Tropical Convection and Hadley–Monsoon Dynamics." Thesis, 2015. http://hdl.handle.net/1828/6621.

Full text
Abstract:
The poor representation of cloud processes in general circulation models (GCMs) has been recognized for decades as one of the major sources of uncertainty in weather and climate predictions. Because of the coarse spatial resolution of GCMs, subgrid-scale cloud and convection processes are modelled by parameterization schemes that provide a statistical representation of the subgrid-scale processes in terms of the large-scale, gridbox fields. This thesis focuses on the stochastic multicloud parameterization of Khouider et al. (2010), which is based on the three cloud types (congestus, deep, and stratiform) that are most observed in tropical convective systems. A rigorous parameter estimation model based on the Bayesian paradigm is developed to infer from data a set of seven convective timescales that determine the transition rates from one cloud type to another in the multicloud framework. The Bayesian posterior is given in terms of a costly model likelihood function that must be approximated numerically using high-performance linear algebra routines for parallel distributed computing. The Bayesian procedure is applied to the Giga-LES dataset of Khairoutdinov et al. (2009), a large-eddy simulation of tropical deep convection that covers a physical domain comparable to that of a typical horizontal grid cell in a GCM. The stochastic multicloud model and its deterministic version are then coupled to a zonally symmetric atmospheric model to study the meridional Hadley circulation and monsoon dynamics. The main model is based on the hydrostatic Boussinesq equations on a rotating sphere, and is composed of a deep convective troposphere and a dynamical planetary boundary layer to sustain shallow convection. The resulting equations form a system of nonconservative partial differential equations, which is solved numerically using high order non-oscillatory finite volume methods. Results from deterministic and stochastic simulations reveal a mean local Hadley cell structure with some features of organized convection. In the stochastic case, the Giga-LES parameter regime best captures the Hadley-type circulation and monsoon trough features, compared to a parameter regime used in a different study.
6

Collins, David. "A stochastic bulk model for turbulent collision and coalescence of cloud droplets." Thesis, 2016. http://hdl.handle.net/1828/7413.

Full text
Abstract:
We propose a mathematical procedure to derive a stochastic parameterization for the bulk warm cloud microphysical properties of collision and coalescence. Unlike previous bulk parameterizations, the stochastic parameterization does not assume any particular droplet size distribution, all parameters have physical meanings which are recoverable from data, all equations are independently derived making conservation of mass intrinsic, the autoconversion parameter is finely controllable, and the resultant parameterization has the flexibility to utilize a variety of collision kernels. This new approach to modelling the kinetic collection equation (KCE) decouples the choice of a droplet size distribution and a collision kernel from the cloud microphysical parameterization employed by the governing climate model. In essence, a climate model utilizing this new parameterization of cloud microphysics could have different distributions and different kernels in different climate model cells, yet employ a single parameterization scheme. This stochastic bulk model is validated theoretically and empirically against an existing bulk model that contains a simple enough (toy) collision kernel that the KCE can be solved analytically. Theoretically, the stochastic model reproduces all the terms of each equation in the existing model and precisely reproduces the power law dependence for all of the evolving cloud properties. Empirically, values of stochastic parameters can be chosen graphically which precisely reproduce the coefficients of the existing model, save for some ad hoc non-dimensional time functions. The values selected for the stochastic parameters affect the conversion rate of cloud mass to rain. This conversion rate is compared against (i) an existing bulk model, and (ii) a detailed solution that is used as a benchmark. The utility of the stochastic bulk model is extended to include hydrodynamic and turbulent collision kernels for both clean and polluted clouds. The validation and extension compare the time required to convert 50% of cloud mass to rain mass, compare the mean rain radius at that time, and use detailed simulations as benchmarks. Stochastic parameters can be chosen graphically to replicate the 50% conversion time in all cases. The curves showing the evolution of mass conversion that are generated by the stochastic model with realistic kernels do not match the corresponding benchmark curves at all times during the evolution for constant parameter values. The degree to which the benchmark curves represent ground truth, i.e. atmospheric observations, is unknown. Finally, among alternate methods of acquiring parameter values, getting a set of sequential values for a single parameter has a stronger physical foundation than getting one value per parameter, and a stochastic simulation is preferable to a higher order detailed method due to the presence of bias in the latter.

Book chapters on the topic "Clouds Stochastic processes"

1

Jeyarani, R., N. Nagaveni, Satish Kumar Sadasivam, and Vasanth Ram Rajarathinam. "Power Aware Meta Scheduler for Adaptive VM Provisioning in IaaS Cloud." In Cloud Computing Advancements in Design, Implementation, and Technologies, 190–204. IGI Global, 2013. http://dx.doi.org/10.4018/978-1-4666-1879-4.ch014.

Full text
Abstract:
Cloud Computing provides on-demand access to a shared pool of configurable computing resources. The major issue lies in managing extremely large agile data centers which are generally over-provisioned to handle unexpected workload surges. This paper focuses on green computing by introducing a Power-Aware Meta Scheduler, which provides a right-fit infrastructure for launching virtual machines onto hosts. The major challenge for the scheduler is to make a wise decision in transitioning the states of the processor cores by exploiting the various power saving states inherent in recent microprocessor technology. This is done by dynamically predicting the utilization of the cloud data center. The authors have extended the existing CloudSim toolkit to model power aware resource provisioning, which includes generation of dynamic workload patterns, workload prediction and adaptive provisioning, dynamic lifecycle management of random workload, and implementation of power aware allocation policies and a chip aware VM scheduler. The experimental results show that the appropriate usage of different power saving states guarantees significant energy conservation in handling the stochastic nature of the workload without compromising performance, both when the data center is at low and at moderate utilization.
2

Jeyarani, R., N. Nagaveni, Satish Kumar Sadasivam, and Vasanth Ram Rajarathinam. "Power Aware Meta Scheduler for Adaptive VM Provisioning in IaaS Cloud." In Grid and Cloud Computing, 717–32. IGI Global, 2012. http://dx.doi.org/10.4018/978-1-4666-0879-5.ch310.

Full text
Abstract:
Cloud Computing provides on-demand access to a shared pool of configurable computing resources. The major issue lies in managing extremely large agile data centers which are generally over-provisioned to handle unexpected workload surges. This paper focuses on green computing by introducing a Power-Aware Meta Scheduler, which provides a right-fit infrastructure for launching virtual machines onto hosts. The major challenge for the scheduler is to make a wise decision in transitioning the states of the processor cores by exploiting the various power saving states inherent in recent microprocessor technology. This is done by dynamically predicting the utilization of the cloud data center. The authors have extended the existing CloudSim toolkit to model power aware resource provisioning, which includes generation of dynamic workload patterns, workload prediction and adaptive provisioning, dynamic lifecycle management of random workload, and implementation of power aware allocation policies and a chip aware VM scheduler. The experimental results show that the appropriate usage of different power saving states guarantees significant energy conservation in handling the stochastic nature of the workload without compromising performance, both when the data center is at low and at moderate utilization.
3

Tamura, Yoshinobu, and Shigeru Yamada. "Reliability Modeling and Assessment for Open Source Cloud Software." In Advances in Systems Analysis, Software Engineering, and High Performance Computing, 718–42. IGI Global, 2014. http://dx.doi.org/10.4018/978-1-4666-6178-3.ch029.

Full text
Abstract:
Software development based on the Open Source Software (OSS) model is being increasingly accepted to stand up servers and applications. In particular, Cloud OSS is now attracting attention as the next generation of software products due to cost efficiencies and quick delivery. This chapter focuses on software reliability modeling and assessment for Cloud computing infrastructure software, especially open source software such as OpenStack and Eucalyptus. In this chapter, the authors introduce a new approach based on a jump diffusion process described by stochastic differential equations, in order to consider the numbers of components and users in the reliability model. In addition, the authors consider the network traffic of the Cloud in the reliability modeling and integrate the reliability model with a threshold-based neural network approach that estimates network traffic. Actual software fault-count data are analyzed in order to show numerical examples of software reliability assessment. This chapter also illustrates how the proposed method of reliability analysis can assist in quality improvement in Cloud computing software.
4

Tamura, Yoshinobu, and Shigeru Yamada. "Reliability Modeling and Assessment for Open Source Cloud Software." In Open Source Technology, 1069–90. IGI Global, 2015. http://dx.doi.org/10.4018/978-1-4666-7230-7.ch052.

Full text
Abstract:
Software development based on the Open Source Software (OSS) model is being increasingly accepted to stand up servers and applications. In particular, Cloud OSS is now attracting attention as the next generation of software products due to cost efficiencies and quick delivery. This chapter focuses on software reliability modeling and assessment for Cloud computing infrastructure software, especially open source software such as OpenStack and Eucalyptus. In this chapter, the authors introduce a new approach based on a jump diffusion process described by stochastic differential equations, in order to consider the numbers of components and users in the reliability model. In addition, the authors consider the network traffic of the Cloud in the reliability modeling and integrate the reliability model with a threshold-based neural network approach that estimates network traffic. Actual software fault-count data are analyzed in order to show numerical examples of software reliability assessment. This chapter also illustrates how the proposed method of reliability analysis can assist in quality improvement in Cloud computing software.

Conference papers on the topic "Clouds Stochastic processes"

1

Krishnadas, N. "Cloud Computing: Analysis using Stochastic Process." In Annual International Conference on Computer Games, Multimedia and Allied Technology. Global Science & Technology Forum (GSTF), 2013. http://dx.doi.org/10.5176/2251-1679_cgat13.33.

Full text
2

Nie, Junhong, Hanlin Tang, and Jingxuan Wei. "Analysis on Convergence of Stochastic Processes in Cloud Computing Models." In 2018 14th International Conference on Computational Intelligence and Security (CIS). IEEE, 2018. http://dx.doi.org/10.1109/cis2018.2018.00024.

Full text
3

Wu, Dazhong, David W. Rosen, and Dirk Schaefer. "Modeling and Analyzing the Material Flow of Crowdsourcing Processes in Cloud-Based Manufacturing Systems Using Stochastic Petri Nets." In ASME 2014 International Manufacturing Science and Engineering Conference collocated with the JSME 2014 International Conference on Materials and Processing and the 42nd North American Manufacturing Research Conference. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/msec2014-3907.

Full text
Abstract:
Cloud-based manufacturing (CBM), also referred to as cloud manufacturing, has the potential to allow manufacturing enterprises to be rapidly scaled up and down by crowdsourcing manufacturing tasks or sub-tasks. To improve the efficiency of the crowdsourcing process, the material flow of CBM systems needs to be managed so that several manufacturing processes can be executed simultaneously. Further, the scalability of manufacturing capacity in CBM needs to be designed, analyzed, and planned in response to rapidly changing market demands. The objective of this paper is to introduce a stochastic Petri net (SPN)-based approach for modeling and analyzing the concurrency and synchronization of the material flow in CBM systems. The proposed approach is validated through a case study of a car suspension module. Our results have shown that the SPN-based approach helps analyze the structural and behavioral properties of a CBM system and verify manufacturing performance.
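To make the SPN machinery concrete, below is a minimal stochastic Petri net simulator with exponentially timed transitions fired by Gillespie sampling. The two-transition toy net (machining followed by assembly) is a made-up example, not the paper's car-suspension model.

```python
# Minimal stochastic Petri net: places hold tokens, transitions fire after
# exponentially distributed delays. The net is an illustrative toy.
import numpy as np

rng = np.random.default_rng(3)
# places: 0 = orders waiting, 1 = parts machined, 2 = modules assembled
marking = np.array([5, 0, 0])
# transitions: (input tokens, output tokens, firing rate)
transitions = [
    (np.array([1, 0, 0]), np.array([0, 1, 0]), 2.0),  # machine a part
    (np.array([0, 1, 0]), np.array([0, 0, 1]), 1.0),  # assemble a module
]

t = 0.0
while True:
    # a transition is enabled if every input place holds enough tokens
    rates = [r if np.all(marking >= inp) else 0.0
             for inp, out, r in transitions]
    total = sum(rates)
    if total == 0.0:
        break                                  # dead marking: stop
    t += rng.exponential(1.0 / total)          # time to the next firing
    i = rng.choice(len(transitions), p=np.array(rates) / total)
    inp, out, _ = transitions[i]
    marking = marking - inp + out              # fire: move tokens
print(f"all orders assembled at t = {t:.2f}, final marking = {marking}")
```

Concurrency and synchronization of material flow are expressed by transitions that share input places; running many such simulations yields throughput and bottleneck statistics of the kind the paper analyzes.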
4

Khasawneh, Firas A., and Elizabeth Munch. "Exploring Equilibria in Stochastic Delay Differential Equations Using Persistent Homology." In ASME 2014 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/detc2014-35655.

Full text
Abstract:
This paper explores the possibility of using techniques from topological data analysis for studying datasets generated from dynamical systems described by stochastic delay equations. The dataset is generated using Euler-Maruyama simulation for two first order systems with stochastic parameters drawn from a normal distribution. The first system contains additive noise whereas the second one contains parametric or multiplicative noise. Using Takens' embedding, the dataset is converted into a point cloud in a high-dimensional space. Persistent homology is then employed to analyze the structure of the point cloud in order to study equilibria and periodic solutions of the underlying system. Our results show that persistent homology successfully differentiates between different types of equilibria. Therefore, we believe this approach will prove useful for automatic data analysis of vibration measurements. For example, our approach can be used in machining processes for chatter detection and prevention.
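A sketch of the data-generation half of this approach is given below: Euler-Maruyama integration of a first-order stochastic delay differential equation with additive noise, followed by a Takens delay embedding into a point cloud. The coefficients are illustrative assumptions; persistent homology (e.g., via the ripser package) would then be computed on the resulting cloud.

```python
# Euler-Maruyama for dx = (a x(t) + b x(t - tau)) dt + sigma dW, then a
# Takens delay embedding. Coefficients are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(4)
dt, tau, n_steps = 0.01, 1.0, 20_000
lag = int(tau / dt)                      # delay expressed in time steps
a, b, sigma = -1.0, -1.5, 0.05           # stable for this delay (assumed)

x = np.zeros(n_steps + lag)
x[:lag] = 0.1                            # constant history function
for k in range(lag, n_steps + lag - 1):
    drift = a * x[k] + b * x[k - lag]
    x[k + 1] = x[k] + drift * dt + sigma * np.sqrt(dt) * rng.normal()

# Takens embedding: map the scalar series into R^3 with delay d
d, dim = lag // 2, 3
series = x[lag:]
m = len(series) - (dim - 1) * d
point_cloud = np.column_stack([series[i * d : i * d + m] for i in range(dim)])
print(point_cloud.shape)  # (m, 3) point cloud ready for persistent homology
```

A noisy equilibrium produces a blob-like cloud with short-lived topological features, while a periodic solution traces a loop with a persistent 1-cycle, which is the distinction the paper exploits.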
5

Lupuleac, Sergey, Nadezhda Zaitseva, Maria Stefanova, Sergey Berezin, Julia Shinder, Margarita Petukhova, and Elodie Bonhomme. "Simulation and Optimization of Airframe Assembly Process." In ASME 2018 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/imece2018-87058.

Full text
Abstract:
An approach for simulating the assembly process where compliant airframe parts are being joined by riveting is presented. The foundation of this approach is the mathematical model based on the reduction of the corresponding contact problem to a Quadratic Programming (QP) problem. The use of efficient QP algorithms enables mass contact problem solving on refined grids, which is needed for variation analysis and simulation as well as for the consequent assembly process optimization. To perform variation simulation, the initial gap between the parts is assumed to be stochastic and a cloud of such gaps is generated based on statistical analysis of the available measurements. The developed approach is illustrated with two examples, simulation of A350-900 wing-to-fuselage joining and optimization of A320 wing box assembly. New contact quality measures are discussed.
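In generic notation (not the paper's), the reduced contact problem takes the standard QP form, with the stochastic initial gap entering through the constraint right-hand side:

```latex
% K: reduced stiffness matrix of the compliant parts; f: fastener loads;
% A u <= g: linearized non-penetration constraints with sampled initial gap g
\min_{u}\;\; \tfrac{1}{2}\, u^{\mathsf{T}} K u \;-\; f^{\mathsf{T}} u
\qquad \text{subject to} \qquad A\,u \;\le\; g .
```

Each variation-simulation run draws a new gap g from the measurement-based cloud and re-solves the QP, which is why efficient QP algorithms on refined grids are central to the approach.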
6

Huang, Yunbao, and Xiaoping Qian. "A Stochastic Approach to Surface Reconstruction." In ASME 2006 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2006. http://dx.doi.org/10.1115/detc2006-99597.

Full text
Abstract:
Shape reconstruction, a process to compute a non-discrete mathematical shape description from discrete points, has been widely used in a variety of applications such as reverse engineering, quality inspection, and topography modeling. However, current shape reconstruction approaches, often based on deterministic techniques, face two fundamental challenges: 1) noise handling, i.e. how to properly handle the data noise variance and outliers in order to reconstruct a robust surface; and 2) model selection, i.e. how to automatically select a surface model that adapts to the data cloud and to local shape change in order to avoid under-fitting and over-fitting. This paper aims to address these two issues by developing a novel stochastic surface reconstruction approach: the multilevel Kalman filter. The core idea of this approach is to use a state-space model to relate data noise with the surface model, to adopt a multilevel surface representation to address the under-fit and over-fit issue, and to use the Kalman filter to produce the optimal estimates and surface uncertainty. Experimental results from the prototype implementation demonstrate that the multilevel Kalman filter produces better quality surfaces than the traditional least-squares method, is robust against noisy data, adapts well to shape changes in complex parts, and can handle incomplete data.
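For reference, the update step of the standard Kalman filter recursion on which such an approach builds, in generic notation (x holds the surface coefficients, z a batch of measured points, H the observation matrix, R the measurement-noise covariance, and P⁻ the prior covariance):

```latex
\begin{aligned}
K_k &= P_k^{-} H_k^{\mathsf{T}}\bigl(H_k P_k^{-} H_k^{\mathsf{T}} + R_k\bigr)^{-1}
      && \text{(gain)}\\
\hat{x}_k &= \hat{x}_k^{-} + K_k\bigl(z_k - H_k \hat{x}_k^{-}\bigr)
      && \text{(coefficient update)}\\
P_k &= (I - K_k H_k)\,P_k^{-}
      && \text{(uncertainty update)}
\end{aligned}
```

The posterior covariance P is what supplies the "surface uncertainty" the abstract mentions, something a plain least-squares fit does not provide directly.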
7

Shawky, Doaa M. "Performance evaluation of dynamic resource allocation in cloud computing platforms using Stochastic Process Algebra." In 2013 8th International Conference on Computer Engineering & Systems (ICCES). IEEE, 2013. http://dx.doi.org/10.1109/icces.2013.6707168.

Full text