Dissertations / Theses on the topic 'Probability constraints'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'Probability constraints.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.
Chernyy, Vladimir. "On portfolio optimisation under drawdown and floor type constraints." Thesis, University of Oxford, 2012. http://ora.ox.ac.uk/objects/uuid:19dee50e-466b-46b5-83ae-5816d3b27c62.
Al-jasser, Faisal M. A. "Phonotactic probability and phonotactic constraints : processing and lexical segmentation by Arabic learners of English as a foreign language." Thesis, University of Newcastle Upon Tyne, 2008. http://hdl.handle.net/10443/537.
Hu, Yanling. "SOME CONTRIBUTIONS TO THE CENSORED EMPIRICAL LIKELIHOOD WITH HAZARD-TYPE CONSTRAINTS." UKnowledge, 2011. http://uknowledge.uky.edu/gradschool_diss/150.
Brahmantio, Bayu Beta. "Efficient Sampling of Gaussian Processes under Linear Inequality Constraints." Thesis, Linköpings universitet, Statistik och maskininlärning, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-176246.
Kokrda, Lukáš. "Optimalizace stavebních konstrukcí s pravděpodobnostními omezeními." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2015. http://www.nusl.cz/ntk/nusl-232181.
Henry, Tyler R. "Constrained short selling and the probability of informed trade /." Thesis, Connect to this title online; UW restricted, 2005. http://hdl.handle.net/1773/8716.
Full textPfeiffer, Laurent. "Sensitivity analysis for optimal control problems. Stochastic optimal control with a probability constraint." Palaiseau, Ecole polytechnique, 2013. https://pastel.hal.science/docs/00/88/11/19/PDF/thesePfeiffer.pdf.
This thesis is divided into two parts. In the first part, we study constrained deterministic optimal control problems and sensitivity analysis issues, from the point of view of abstract optimization. Second-order necessary and sufficient optimality conditions, which play an important role in sensitivity analysis, are also investigated. In this thesis, we are interested in strong solutions. We use this generic term for controls that are, roughly speaking, locally optimal with respect to the L1-norm. We use two essential tools: a relaxation technique, which consists in using simultaneously several controls, and a decomposition principle, which is a particular second-order Taylor expansion of the Lagrangian. Chapters 2 and 3 deal with second-order necessary and sufficient optimality conditions for strong solutions of problems with pure, mixed, and final-state constraints. In Chapter 4, we perform a sensitivity analysis for strong solutions of relaxed problems with final-state constraints. In Chapter 5, we perform a sensitivity analysis for a problem of nuclear energy production. In the second part of the thesis, we study stochastic optimal control problems with a probability constraint. We study an approach by dynamic programming, in which the level of probability is a supplementary state variable. In this framework, we show that the sensitivity of the value function with respect to the probability level is constant along optimal trajectories. We use this analysis to design numerical schemes for continuous-time problems. These results are presented in Chapter 6, in which we also study an application to asset-liability management.
Prezioso, Luca. "Financial risk sources and optimal strategies in jump-diffusion frameworks." Doctoral thesis, Università degli studi di Trento, 2020. http://hdl.handle.net/11572/254880.
Et, Tabii Mohamed. "Contributions aux descriptions optimales par modèles statistiques exponentiels." Rouen, 1997. http://www.theses.fr/1997ROUES028.
Full textHansson, Patrik. "Overconfidence and Format Dependence in Subjective Probability Intervals: Naive Estimation and Constrained Sampling." Licentiate thesis, Umeå University, Department of Psychology, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-14737.
A particular field in research on judgment and decision making (JDM) is concerned with the realism of confidence in one's knowledge. An interesting finding is the so-called format dependence effect, which implies that assessment of the same probability distribution generates different conclusions about over- or underconfidence bias depending on the assessment format. In particular, expressing a belief about some unknown quantity in the form of a confidence interval is severely prone to overconfidence as compared to expressing the belief as an assessment of a probability. This thesis gives a tentative account of this finding in terms of a Naïve Sampling Model (NSM; Juslin, Winman, & Hansson, 2004), which assumes that people accurately describe their available information stored in memory but are naïve in the sense that they treat sample properties as proper estimators of population properties. The NSM predicts that it should be possible to reduce the overconfidence in interval production by changing the response format into interval evaluation and to manipulate the degree of format dependence between interval production and interval evaluation. These predictions are verified in empirical experiments which contain both general knowledge tasks (Study 1) and laboratory learning tasks (Study 2). A bold hypothesis, that working memory is a constraining factor for sample size in judgment, which suggests that experience per se does not eliminate overconfidence, is investigated and verified. The NSM predicts that the absolute error of the placement of the interval is a constant fraction of interval size, a prediction that is verified (Study 2). This thesis suggests that no cognitive processing bias (Tversky & Kahneman, 1974) over and above naivety is needed to understand and explain the overconfidence bias in interval production and hence the format dependence effect.
Elbahloul, Salem A. "Modelling of turbulent flames with transported probability density function and rate-controlled constrained equilibrium methods." Thesis, Imperial College London, 2013. http://hdl.handle.net/10044/1/30826.
Prakash, Sunil. "Modeling the Constraint Effects on Fracture Toughness of Materials." University of Akron / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=akron1259271280.
Full textPicard, Vincent. "Réseaux de réactions : de l’analyse probabiliste à la réfutation." Thesis, Rennes 1, 2015. http://www.theses.fr/2015REN1S093/document.
A major goal in systems biology is to investigate the dynamical behavior of reaction networks. There exist two main dynamical frameworks: the first is deterministic, where the dynamics is described using ordinary differential equations; the second is probabilistic and relies on Markov chains. In both cases, one major issue is to determine the kinetic laws of the system together with its kinetic parameters. As a consequence, the direct study of large biological reaction networks is impossible. To deal with this issue, stationarity assumptions have been used. A widely used method is flux balance analysis, where systems of constraints are derived from information on the average slopes of the system trajectories. In this thesis, we construct a probabilistic analog of this stationary analysis. The results are divided into three parts. First, we introduce a stationary analysis of the probabilistic dynamics which relies on a Bernoulli approximation. Second, this approximated dynamics allows us to derive systems of constraints from information about the means, variances and co-variances of the system trajectories. Third, we present several applications of these systems of constraints, such as the possibility of rejecting reaction networks using information from experimental variances and co-variances, and the formal verification of logical properties concerning the stationary regime of the system.
Baek, Yeongcheon. "An interior point approach to constrained nonparametric mixture models /." Thesis, Connect to this title online; UW restricted, 2006. http://hdl.handle.net/1773/5753.
Sohul, Munawwar Mahmud. "PERFORMANCE OF LINEAR DECISION COMBINER FOR PRIMARY USER DETECTION IN COGNITIVE RADIO." OpenSIUC, 2011. https://opensiuc.lib.siu.edu/theses/705.
Full textPeng, Shen. "Optimisation stochastique avec contraintes en probabilités et applications." Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLS153/document.
Chance constrained optimization is a natural and widely used approach to provide profitable and reliable decisions under uncertainty, and the topics around the theory and applications of chance constrained problems are interesting and attractive. However, there are still some important issues requiring non-trivial efforts to solve. In view of this, we systematically investigate chance constrained problems from the following perspectives. As the basis for chance constrained problems, we first review some main research results about chance constraints from three perspectives: convexity of chance constraints, reformulations and approximations for chance constraints, and distributionally robust chance constraints. For stochastic geometric programs, we consider a joint rectangular geometric chance constrained program. Assuming elliptically distributed and pairwise independent stochastic parameters, we derive a reformulation of the joint rectangular geometric chance constrained program. As the reformulation is not convex, we propose new convex approximations based on a variable transformation together with piecewise linear approximation methods. Our numerical results show that our approximations are asymptotically tight. When the probability distributions are not known in advance or the reformulation for chance constraints is hard to obtain, bounds on chance constraints can be very useful. Therefore, we develop four upper bounds for individual and joint chance constraints with independent matrix row vectors. Based on the one-sided Chebyshev inequality, the Chernoff inequality, the Bernstein inequality and the Hoeffding inequality, we propose deterministic approximations for chance constraints. In addition, various sufficient conditions under which the aforementioned approximations are convex and tractable are derived. To further reduce computational complexity, we reformulate the approximations as tractable convex optimization problems based on piecewise linear and tangent approximations. Finally, based on randomly generated data, numerical experiments are discussed in order to identify the tight deterministic approximations. In some complex systems, the distribution of the random parameters is only known partially. To deal with the complex uncertainties in terms of the distribution and sample data, we propose a data-driven mixture distribution based uncertainty set. The data-driven mixture distribution based uncertainty set is constructed from the perspective of simultaneously estimating higher order moments. Then, with the mixture distribution based uncertainty set, we derive a reformulation of the data-driven robust chance constrained problem. As the reformulation is not a convex program, we propose new and tight convex approximations based on the piecewise linear approximation method under certain conditions. For the general case, we propose a DC approximation to derive an upper bound and a relaxed convex approximation to derive a lower bound for the optimal value of the original problem, respectively. We also establish the theoretical foundation for these approximations. Finally, simulation experiments are carried out to show that the proposed approximations are practical and efficient. We consider a stochastic n-player non-cooperative game. When the strategy set of each player contains a set of stochastic linear constraints, we model the stochastic linear constraints of each player as a joint chance constraint.
For each player, we assume that the row vectors of the matrix defining the stochastic constraints are pairwise independent. Then, we formulate the chance constraints from the viewpoints of the normal distribution, elliptical distributions and distributional robustness, respectively. Under certain conditions, we show the existence of a Nash equilibrium for the stochastic game.
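As a rough illustration of the kind of moment-based deterministic approximation surveyed in this abstract, the sketch below (our own toy example, not taken from the thesis; all names and numbers are invented) checks an individual chance constraint P(aᵀx ≤ b) ≥ 1 − ε using only the mean and covariance of the random row a, via the one-sided Chebyshev (Cantelli) inequality.

```python
import numpy as np

def cantelli_feasible(x, mu_a, Sigma_a, b, eps):
    """Sufficient check for P(a^T x <= b) >= 1 - eps when only the mean mu_a
    and covariance Sigma_a of the random row a are known.  By the one-sided
    Chebyshev (Cantelli) inequality it is enough that
        mu_a^T x + sqrt((1 - eps) / eps) * sqrt(x^T Sigma_a x) <= b."""
    mean = mu_a @ x
    std = np.sqrt(x @ Sigma_a @ x)
    return mean + np.sqrt((1.0 - eps) / eps) * std <= b

# Toy data (invented): a two-dimensional decision x checked at risk level 5 %.
mu_a = np.array([1.0, 2.0])
Sigma_a = np.array([[0.20, 0.05],
                    [0.05, 0.10]])
x = np.array([0.5, 0.3])
print(cantelli_feasible(x, mu_a, Sigma_a, b=2.5, eps=0.05))
```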
Val, Petran. "BINOCULAR DEPTH PERCEPTION, PROBABILITY, FUZZY LOGIC, AND CONTINUOUS QUANTIFICATION OF UNIQUENESS." Case Western Reserve University School of Graduate Studies / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=case1504749439893027.
Wang, Xun. "Essays on Down Payment Constraint, House Price and Young People's Homeownership Behavior." The Ohio State University, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=osu1275015646.
Looper, Jason K. "Semiparametric Estimation of Unimodal Distributions." [Tampa, Fla.] : University of South Florida, 2003. http://purl.fcla.edu/fcla/etd/SFE0000132.
Full textVan, Ackooij Wim. "Chance Constrained Programming : with applications in Energy Management." Thesis, Châtenay-Malabry, Ecole centrale de Paris, 2013. http://www.theses.fr/2013ECAP0071/document.
In optimization problems involving uncertainty, probabilistic constraints are an important tool for defining safety of decisions. In energy management, many optimization problems have some underlying uncertainty. In particular this is the case of unit commitment problems. In this thesis, we investigate probabilistic constraints from a theoretical, algorithmic and applied point of view. We provide new insights on differentiability of probabilistic constraints and on convexity results for feasible sets. New variants of bundle methods, both of proximal and level type, specially tailored for convex optimization under probabilistic constraints, are given and convergence is shown. Both methods explicitly deal with evaluation errors in both the gradient and the value of the probabilistic constraint. We also look at two applications from energy management: cascaded reservoir management with uncertainty on inflows, and unit commitment with uncertainty on customer load. In both applications uncertainty is dealt with through the use of probabilistic constraints. The presented numerical results seem to indicate the feasibility of solving an optimization problem with a joint probabilistic constraint on a system having up to 200 constraints. This is roughly the order of magnitude needed in the applications. The differentiability results involve probabilistic constraints on uncertain linear and nonlinear inequality systems. In the latter case a convexity structure in the underlying uncertainty vector is required. The uncertainty vector is assumed to have a multivariate Gaussian or Student law. The provided gradient formulae allow for efficient numerical sampling schemes. For probabilistic constraints that can be rewritten through the use of copulae, we provide new insights on convexity of the feasible set. These results require a generalized concavity structure of the copulae, the marginal distribution functions of the underlying random vector and the underlying inequality system. These generalized concavity properties may hold only on specific sets.
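To make the central object concrete, here is a minimal sketch (our own illustration, not the thesis' gradient formulae or bundle methods) of evaluating a joint probabilistic constraint φ(x) = P(Ax ≥ ξ) for a Gaussian ξ, once through the multivariate normal CDF and once by plain Monte Carlo as a cross-check; the matrices and values are made up.

```python
import numpy as np
from scipy.stats import multivariate_normal

def prob_constraint(x, A, mean, cov):
    """phi(x) = P(A x >= xi) for a Gaussian xi ~ N(mean, cov), i.e. the joint
    probability that the whole uncertain linear inequality system holds."""
    return multivariate_normal(mean=mean, cov=cov).cdf(A @ x)

def prob_constraint_mc(x, A, mean, cov, n=200_000, seed=0):
    """Plain Monte Carlo cross-check of the same probability."""
    xi = np.random.default_rng(seed).multivariate_normal(mean, cov, size=n)
    return np.mean(np.all(xi <= A @ x, axis=1))

A = np.array([[1.0, 0.5],
              [0.2, 1.0]])
mean = np.zeros(2)
cov = np.array([[1.0, 0.3],
                [0.3, 1.0]])
x = np.array([1.0, 1.0])
print(prob_constraint(x, A, mean, cov), prob_constraint_mc(x, A, mean, cov))
```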
Mahmood, Khalid. "Constrained linear and non-linear adaptive equalization techniques for MIMO-CDMA systems." Thesis, De Montfort University, 2013. http://hdl.handle.net/2086/10203.
Ferdjoukh, Adel. "Une approche déclarative pour la génération de modèles." Thesis, Montpellier, 2016. http://www.theses.fr/2016MONTT325/document.
Owning data is useful in many different fields. Data can be used to test and to validate approaches, algorithms and concepts. Unfortunately, data is rarely available, is costly to obtain, or is not adapted to most cases due to a lack of quality. An automated data generator is a good way to quickly and easily generate data that are valid, of different sizes, likely and diverse. In this thesis, we propose a novel and complete model-driven approach, based on constraint programming, for automated data generation.
Björnemo, Erik. "Energy Constrained Wireless Sensor Networks : Communication Principles and Sensing Aspects." Doctoral thesis, Uppsala universitet, Institutionen för teknikvetenskaper, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-9519.
Zhao, Jianmin. "Optimal Clustering: Genetic Constrained K-Means and Linear Programming Algorithms." VCU Scholars Compass, 2006. http://hdl.handle.net/10156/1583.
Moran, Michael. "On Comparative Algorithmic Pathfinding in Complex Networks for Resource-Constrained Software Agents." ScholarWorks, 2017. https://scholarworks.waldenu.edu/dissertations/3951.
Excoffier, Mathilde. "Chance-Constrained Programming Approaches for Staffing and Shift-Scheduling Problems with Uncertain Forecasts : application to Call Centers." Thesis, Paris 11, 2015. http://www.theses.fr/2015PA112244/document.
The staffing and shift-scheduling problems in call centers consist in deciding how many agents handling the calls should be assigned to work during a given period in order to reach the required Quality of Service and minimize the costs. These problems are subject to a growing interest, both for their interesting theoretical formulation and their practical impact. This thesis aims at proposing chance-constrained approaches considering uncertainty on demand forecasts. First, this thesis proposes a model solving the problems in one step through a joint chance-constrained stochastic program, providing a cost-reducing solution. A continuous-based approach leading to an easily tractable optimization program is formulated with random variables following continuous distributions, a new continuous relation between arrival rates and theoretical real agent numbers, and constraint linearizations. The global risk level is dynamically shared among the periods during the optimization process, providing a reduced-cost solution. The resulting solutions respect the targeted risk level while reducing the cost compared to other approaches. Moreover, this model is extended so that it provides a better representation of real situations. First, the queuing system model is improved to consider the limited patience of customers. Second, another formulation of uncertainty is proposed so that the correlation between periods is considered. Finally, another uncertainty representation is proposed. The distributionally robust approach provides a formulation while assuming that the correct probability distribution is unknown and belongs to a set of possible distributions defined by given mean and variance. The problem is formulated with a joint chance constraint. The risk at each period is a decision variable to be optimized. A deterministic equivalent problem is proposed. An easily tractable mixed-integer linear formulation is obtained through piecewise linearizations.
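A toy sketch of the risk-sharing idea described above (our own simplification, assuming independent periods and Gaussian workload forecasts; all figures are invented): a joint chance constraint at global level 1 − ε can be met by giving each period an individual risk εₜ with ∏(1 − εₜ) ≥ 1 − ε, and different splits of the risk budget lead to different total staffing costs.

```python
import numpy as np
from scipy.stats import norm

def staffing(mu, sigma, eps_t):
    """Agents needed per period: mean forecast plus a Gaussian safety margin
    so that each period is covered with probability 1 - eps_t."""
    return mu + norm.ppf(1.0 - eps_t) * sigma

mu = np.array([20.0, 35.0, 50.0, 30.0])     # forecast workload per period
sigma = np.array([3.0, 8.0, 5.0, 2.0])      # forecast standard deviations
eps = 0.05                                  # global risk level of the joint constraint

# Equal sharing of the joint risk: prod(1 - eps_t) = 1 - eps exactly.
eps_equal = 1.0 - (1.0 - eps) ** (1.0 / len(mu))
agents_equal = staffing(mu, sigma, np.full(len(mu), eps_equal))

# An uneven sharing that puts more of the risk on the most uncertain period.
eps_uneven = np.array([0.005, 0.03, 0.01, 0.005])
assert np.prod(1.0 - eps_uneven) >= 1.0 - eps
agents_uneven = staffing(mu, sigma, eps_uneven)

print(agents_equal.sum(), agents_uneven.sum())
```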
Liu, Hai. "Semiparametric regression analysis of zero-inflated data." Diss., University of Iowa, 2009. https://ir.uiowa.edu/etd/308.
Sassi, Achille. "Numerical methods for hybrid control and chance-constrained optimization problems." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLY005/document.
This thesis is devoted to the analysis of numerical methods in the field of optimal control, and it is composed of two parts. The first part is dedicated to new results on the subject of numerical methods for the optimal control of hybrid systems, controlled simultaneously by measurable functions and discontinuous jumps in the state variable. The second part focuses on a particular application of trajectory optimization problems for space launchers. Here we use some nonlinear optimization methods combined with non-parametric statistics techniques. This kind of problem belongs to the family of stochastic optimization problems and it features the minimization of a cost function in the presence of a constraint which needs to be satisfied within a desired probability threshold.
Marêché, Laure. "Kinetically constrained models : relaxation to equilibrium and universality results." Thesis, Université de Paris (2019-....), 2019. http://www.theses.fr/2019UNIP7125.
This thesis studies the class of interacting particle systems called kinetically constrained models (KCMs). It considers first the question of universality: can the infinity of possible models be sorted into a finite number of classes according to their properties? Such a result was recently proven in a related class of models, bootstrap percolation, where models can be divided into supercritical, critical and subcritical. This classification can also be applied to KCMs, but it is not precise enough: supercritical KCMs have to be divided into rooted and unrooted, and critical KCMs according to whether or not they have an infinity of stable directions. This thesis shows the relevance of this classification of KCMs and completes the proof of their universality in the supercritical and critical cases, by proving a lower bound for two characteristic scales, the relaxation time and the first time at which a site is at 0, in the supercritical rooted case (work with F. Martinelli and C. Toninelli, relying on a combinatorial result shown without collaboration) and in the case of critical models with an infinity of stable directions (work with I. Hartarsky and C. Toninelli). It also establishes a more precise lower bound in the particular case of the Duarte model (work with F. Martinelli and C. Toninelli). Secondly, this thesis shows results of exponential convergence to equilibrium, for all supercritical KCMs under certain conditions and in the particular case of the d-dimensional East model without restrictions.
Ekberg, Marie. "Sensitivity analysis of optimization : Examining sensitivity of bottleneck optimization to input data models." Thesis, Högskolan i Skövde, Institutionen för ingenjörsvetenskap, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-12624.
Serra, Romain. "Opérations de proximité en orbite : évaluation du risque de collision et calcul de manoeuvres optimales pour l'évitement et le rendez-vous." Thesis, Toulouse, INSA, 2015. http://www.theses.fr/2015ISAT0035/document.
This thesis is about collision avoidance for a pair of spherical orbiting objects. The primary object - the operational satellite - is active in the sense that it can use its thrusters to change its trajectory, while the secondary object is a space debris that cannot be controlled in any way. On-ground radars or other means make it possible to foresee a conjunction involving an operational spacecraft, leading to the production of a collision alert. The latter contains statistical data on the position and velocity of the two objects, enabling the construction of a probabilistic collision model. The work is divided in two parts: the computation of collision probabilities and the design of maneuvers to lower the collision risk. In the first part, two kinds of probabilities - which can be written as integrals of a Gaussian distribution over a Euclidean ball in 2 and 3 dimensions - are expanded in convergent power series with positive terms. This is done using the theories of the Laplace transform and definite functions. In the second part, the question of collision avoidance is formulated as a chance-constrained optimization problem. Depending on the collision model, namely short or long-term encounters, it is respectively tackled via the scenario approach or relaxed using polyhedral collision sets. For the latter, two methods are proposed. The first one directly tackles the joint chance constraints while the second uses another relaxation called risk selection to obtain a mixed-integer program. Additionally, the solution to the problem of fixed-time fuel-minimizing out-of-plane proximity maneuvers is derived. This optimal control problem is solved via the primer vector theory.
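For the short-term encounter probability mentioned above - an integral of a Gaussian distribution over a 2-D disk - a brute-force Monte Carlo estimate gives a quick point of comparison with the power-series expansions developed in the thesis. The sketch below is our own toy (invented covariance, miss distance and hard-body radius), not the thesis' algorithm.

```python
import numpy as np

def collision_probability_mc(mean, cov, radius, n=1_000_000, seed=1):
    """Monte Carlo estimate of the probability that the Gaussian relative
    position in the encounter plane falls inside the combined hard-body disk
    of radius `radius` (short-term encounter collision probability)."""
    rng = np.random.default_rng(seed)
    samples = rng.multivariate_normal(mean, cov, size=n)
    return np.mean(np.sum(samples**2, axis=1) <= radius**2)

mean = np.array([10.0, 0.0])                   # projected miss distance (m)
cov = np.array([[25.0, 0.0],
                [0.0, 16.0]])                  # position covariance (m^2)
print(collision_probability_mc(mean, cov, radius=5.0))
```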
Benhida, Soufia. "De l'optimisation pour l'aide à la décision : applications au problème du voyageur de commerce probabiliste et à l'approximation de données." Thesis, Normandie, 2018. http://www.theses.fr/2018NORMIR27.
The first part of this work deals with route optimization in the form of the Traveling Salesman Problem (TSP). In this part we give a detailed presentation of the Traveling Salesman Problem and its variants, then we propose a constraint-generation strategy for solving the TSP. We then treat its stochastic version: the Probabilistic Traveling Salesman Problem (PTSP). We propose a mathematical formulation of the PTSP and we present numerical results obtained by exact resolution for a series of small instances. In the second part, we propose a general approximation method for different types of data: first we treat the approximation of a wind signal (simple 1D case), then the approximation of a vector field taking the topography into account, which is the main contribution of this part.
Merlinge, Nicolas. "State estimation and trajectory planning using box particle kernels." Thesis, Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLS425/document.
State estimation and trajectory planning are two crucial functions for autonomous systems, and in particular for aerospace vehicles. Particle filters and sample-based trajectory planning have been widely considered to tackle non-linearities and non-Gaussian uncertainties. However, these approaches may produce erratic results due to the sampled approximation of the state density. In addition, they have a high computational cost which limits their practical interest. This thesis investigates the use of box kernel mixtures to describe multimodal probability density functions. A box kernel mixture is a weighted sum of basic functions (e.g., uniform kernels) that integrate to unity and whose supports are bounded by boxes, i.e., vectors of intervals. This modelling yields a more extensive description of the state density while requiring a lower computational load. New algorithms are developed, based on a derivation of the Box Particle Filter (BPF) for state estimation, and of a particle-based chance-constrained optimisation (Particle Control) for trajectory planning under uncertainty. In order to tackle ambiguous state estimation problems, a Box Regularised Particle Filter (BRPF) is introduced. The BRPF consists of an improved BPF with a guaranteed resampling step and a smoothing strategy based on kernel regularisation. The proposed strategy is theoretically proved to outperform the original BPF in terms of Mean Integrated Square Error (MISE), and empirically shown to reduce the Root Mean Square Error (RMSE) of estimation. BRPF reduces the computation load in a significant way and is robust to measurement ambiguity. BRPF is also integrated into federated and distributed architectures to demonstrate its efficiency in multi-sensor and multi-agent systems. In order to tackle constrained trajectory planning under non-Gaussian uncertainty, a Box Particle Control (BPC) is introduced. BPC relies on an interval bounded kernel mixture state density description, and consists of propagating the state density along a state trajectory at a given horizon. It yields a more accurate description of the state uncertainty than previous particle-based algorithms. A chance-constrained optimisation is performed, which consists of finding the sequence of future control inputs that minimises a cost function while ensuring that the probability of constraint violation (failure probability) remains below a given threshold. For similar performance, BPC yields a significant computation load reduction with respect to previous approaches.
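A minimal sketch of the box kernel mixture idea described above (our own illustration with invented boxes and weights, not the BRPF/BPC algorithms themselves): each kernel is uniform on an axis-aligned box and integrates to one, so the weighted mixture is itself a probability density.

```python
import numpy as np

def box_mixture_pdf(x, lowers, uppers, weights):
    """Density of a weighted mixture of uniform 'box' kernels: each kernel is
    supported on the axis-aligned box [lowers[i], uppers[i]] and integrates to
    one, so the mixture integrates to one as long as the weights do."""
    x = np.asarray(x, dtype=float)
    value = 0.0
    for lo, up, w in zip(lowers, uppers, weights):
        if np.all((x >= lo) & (x <= up)):
            value += w / np.prod(up - lo)      # uniform density on that box
    return value

# Two overlapping 2-D boxes with weights summing to one (invented numbers).
lowers = [np.array([0.0, 0.0]), np.array([1.0, 0.5])]
uppers = [np.array([2.0, 1.0]), np.array([3.0, 2.0])]
weights = [0.6, 0.4]
print(box_mixture_pdf([1.5, 0.8], lowers, uppers, weights))
```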
Gabriš, Ondrej. "Software Projects Risk Management Support Tool." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2011. http://www.nusl.cz/ntk/nusl-412827.
Alais, Jean-Christophe. "Risque et optimisation pour le management d'énergies : application à l'hydraulique." Thesis, Paris Est, 2013. http://www.theses.fr/2013PEST1071/document.
Hydropower is the main renewable energy produced in France. It brings both an energy reserve and a flexibility, of great interest in a context of penetration of intermittent sources in the production of electricity. Its management raises difficulties stemming from the number of dams, from uncertainties in water inflows and prices, and from multiple uses of water. This PhD thesis has been realized in partnership with Electricité de France and addresses two hydropower management issues, modeled as stochastic dynamic optimization problems. The manuscript is divided in two parts. In the first part, we consider the management of a hydroelectric dam subject to a so-called tourist constraint. This constraint ensures that a given minimum dam stock level is respected during the summer months with a prescribed probability level. We propose different original modelings and we provide corresponding numerical algorithms. We present numerical results that highlight the problem from various angles useful for dam managers. In the second part, we focus on the management of a cascade of dams. We present the approximate decomposition-coordination algorithm called Dual Approximate Dynamic Programming (DADP). We show how to decompose an original (large-scale) problem into smaller subproblems by dualizing the spatial coupling constraints. On a three-dam instance, we are able to compare the results of DADP with the exact solution (obtained by dynamic programming); we obtain approximate gains that are only a few percent from the optimum, with interesting running times. The conclusions we arrived at offer encouraging perspectives for the stochastic optimization of large-scale problems.
Prigent, Sylvain. "Approche novatrice pour la conception et l’exploitation d’avions écologiques." Thesis, Toulouse, ISAE, 2015. http://www.theses.fr/2015ESAE0014/document.
The objective of this PhD work is to pose, investigate, and solve the highly multidisciplinary and multiobjective problem of environmentally efficient aircraft design and operation. For this purpose, the three main drivers for optimizing the environmental performance of an aircraft are the airframe, the engine, and the mission profiles. The figures of merit which will be considered for optimization are fuel burn, local emissions, global emissions, and climate impact (noise excluded). The study will be focused on finding efficient compromise strategies and identifying the most powerful design architectures and design driver combinations for the improvement of environmental performance. The modeling uncertainty will be considered thanks to rigorously selected methods. A hybrid aircraft configuration is proposed to reach the climate impact reduction objective.
Hee, Sonke. "Computational Bayesian techniques applied to cosmology." Thesis, University of Cambridge, 2018. https://www.repository.cam.ac.uk/handle/1810/273346.
Mirroshandel, Seyedabolghasem. "Towards less supervision in dependency parsing." Thesis, Aix-Marseille, 2015. http://www.theses.fr/2015AIXM4096/document.
Probabilistic parsing is one of the most attractive research areas in natural language processing. Current successful probabilistic parsers require large treebanks which are difficult, time-consuming, and expensive to produce. Therefore, we focused our attention on less-supervised approaches. We suggested two categories of solution: active learning and semi-supervised algorithms. Active learning strategies allow one to select the most informative samples for annotation. Most existing active learning strategies for parsing rely on selecting uncertain sentences for annotation. We show in our research, on four different languages (French, English, Persian, and Arabic), that selecting full sentences is not an optimal solution and propose a way to select only subparts of sentences. As our experiments have shown, some parts of the sentences do not contain any useful information for training a parser, and focusing on uncertain subparts of the sentences is a more effective solution in active learning.
Safadi, El Abed El. "Contribution à l'évaluation des risques liés au TMD (transport de matières dangereuses) en prenant en compte les incertitudes." Thesis, Université Grenoble Alpes (ComUE), 2015. http://www.theses.fr/2015GREAT059/document.
When an accidental event occurs, the process of technological risk assessment, in particular the one related to Dangerous Goods Transportation (DGT), allows assessing the level of potential risk of impacted areas in order to quickly take prevention and protection actions (containment, evacuation, ...). The objective is to reduce and control its effects on people and the environment. The first issue of this work is to evaluate the risk level for areas subjected to dangerous goods transportation. The quantification of the intensity of the occurring events needed for this evaluation is based on effect models (analytical or computer code). Regarding the problem of dispersion of toxic products, these models mainly contain inputs linked to different databases, like the exposure data and meteorological data. The second issue is related to the uncertainties affecting some model inputs. To determine the geographical danger zone where the estimated risk level is not acceptable, it is necessary to identify and take into consideration the uncertainties on the inputs in order to propagate them through the effect model and thus obtain a reliable evaluation of the risk level. The first phase of this work is to evaluate and propagate the uncertainty on the gas concentration induced by uncertain model inputs during its evaluation by dispersion models. Two approaches are used to model and propagate the uncertainties. The first one is the set-membership approach based on interval calculus for analytical models. The second one is the probabilistic approach (Monte Carlo), which is more classical and used more frequently when the dispersion model is described by an analytic expression or is defined by a computer code. The objective is to compare the two approaches to define their advantages and disadvantages in terms of precision and computation time for the proposed problem. To determine the danger zones, two dispersion models (Gaussian and SLAB) are used to evaluate the risk intensity in the contaminated area. The risk mapping is achieved by using two methods: a probabilistic method (Monte Carlo), which consists in solving an inverse problem on the effect model, and a generic set-membership method that defines the problem as a constraint satisfaction problem (CSP) and solves it with a set-membership inversion method. The second phase consists in establishing a general methodology to realize the risk mapping and to improve performance in terms of computation time and precision. This methodology is based on three steps: firstly, the analysis of the effect model used; secondly, the proposal of a new method for the uncertainty propagation based on a mix between the probabilistic and set-membership approaches that takes advantage of both and is suited to any type of spatial and static effect model; finally, the realization of risk mapping by inverting the effect models. The sensitivity analysis present in the first step is typically applied to probabilistic models. The validity of using Sobol indices for interval models is discussed and a new interval sensitivity index is proposed.
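The two propagation approaches compared in this abstract can be illustrated on a deliberately simplified dispersion model (our own stand-in; the plume formula, parameter values and the uniform assumption on the wind speed are all invented for the example, not taken from the thesis).

```python
import numpy as np

def gaussian_plume(q, u, sigma_y=20.0, sigma_z=10.0, y=0.0, z=1.5, h=2.0):
    """Very simplified ground-level Gaussian plume concentration, used here
    only as a stand-in for the dispersion (effect) models of the thesis."""
    return (q / (2.0 * np.pi * u * sigma_y * sigma_z)
            * np.exp(-y**2 / (2.0 * sigma_y**2))
            * np.exp(-(z - h)**2 / (2.0 * sigma_z**2)))

q = 1.0                                # source term (kg/s)
u_lo, u_hi = 2.0, 6.0                  # uncertain wind speed given as an interval (m/s)

# Set-membership propagation: the model is monotone decreasing in u, so the
# two interval endpoints bracket every reachable concentration.
c_interval = (gaussian_plume(q, u_hi), gaussian_plume(q, u_lo))

# Probabilistic propagation: Monte Carlo with a uniform assumption on u.
u_samples = np.random.default_rng(0).uniform(u_lo, u_hi, size=100_000)
c_samples = gaussian_plume(q, u_samples)
print(c_interval, np.percentile(c_samples, [5, 95]))
```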
Gouyou, Doriane. "Introduction de pièces déformables dans l’analyse de tolérances géométriques de mécanismes hyperstatiques." Thesis, Bordeaux, 2018. http://www.theses.fr/2018BORD0343/document.
Over-constrained mechanisms are often used in industries to ensure a good mechanical strength and a good robustness to manufacturing deviations of parts. The tolerance analysis of such assemblies is difficult to implement. Indeed, depending on the geometrical deviations of parts, over-constrained mechanisms can have assembly interferences. In this work, we used the polytope method to check whether the assembly has interferences or not. For each assembly, the resulting polytope of the mechanism is computed. If it is non-empty, the assembly can be performed without interference. If not, there are interferences in the assembly. According to the result, two different methods can be implemented. For an assembly without interference, the resulting polytope enables its compliance to be checked directly. For an assembly with interferences, a study taking into account the stiffness of the parts is undertaken. This approach uses a model reduction with super elements. It enables the assembly with deformation to be computed quickly. Then, an assembly load is computed to conclude on its feasibility. Finally, the spreading of deformation through the parts is calculated to check the compliance of the mechanism. The short computational time enables stochastic tolerance analyses to be performed in order to provide the rates of compliant assemblies.
Basei, Matteo. "Topics in stochastic control and differential game theory, with application to mathematical finance." Doctoral thesis, Università degli studi di Padova, 2016. http://hdl.handle.net/11577/3424239.
In this thesis we consider three problems related to the theory of stochastic control and differential games; these problems are linked to concrete situations in mathematical finance and, more precisely, in energy markets. First, we address the problem of the optimal exercise of swing options in the energy market. The main result consists in characterizing the value function as the unique viscosity solution of a suitable Hamilton-Jacobi-Bellman equation. The case of contracts with penalties can be treated in a standard way. On the contrary, the case of contracts with strict constraints leads to stochastic control problems with a non-standard constraint on the controls: the above characterization is then obtained by considering a suitable sequence of unconstrained problems. This approximation is proved for a general class of problems with an integral constraint on the controls. Next, we consider an energy supplier who has to decide when and how to intervene to change the price charged to its customers, in order to maximize its profit. The intervention costs can be fixed or depend on the supplier's market share. In the first case, we obtain a standard impulsive stochastic control problem, in which we characterize the value function and the optimal price-management policy. In the second case, classical theory cannot be applied because of the singularities in the function defining the penalties. We therefore outline an approximation procedure and finally consider stronger conditions on the controls, so as to characterize the optimal control in this case as well. Finally, we study a general class of nonzero-sum differential games with impulse controls. After rigorously defining such problems, we provide the proof of a verification theorem: if a pair of functions is sufficiently regular and satisfies a suitable system of quasi-variational inequalities, it coincides with the value functions of the problem and the Nash equilibria can be characterized. We conclude with a detailed example: we investigate the existence of equilibria in the case where two countries, with different objectives, can influence the exchange rate between their respective currencies.
Xu, Chuan. "Power-Aware Protocols for Wireless Sensor Networks." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLS498/document.
In this thesis, we propose a formal energy model which allows an analytical study of energy consumption, for the first time in the context of population protocols. Population protocols model a special kind of sensor network where anonymous sensors with uniformly bounded memory move unpredictably and communicate in pairs. To illustrate the power and the usefulness of the proposed energy model, we present formal analyses on time and energy, for the worst and the average cases, for accomplishing the fundamental task of data collection. Two power-aware population protocols, (deterministic) EB-TTFM and (randomized) lazy-TTF, are proposed and studied for two different fairness conditions, respectively. Moreover, to obtain the best parameters in lazy-TTF, we adopt optimization techniques and evaluate the resulting performance by experiments. Then, we continue the study on optimization for the power-aware data collection problem in wireless body area networks. A minmax multi-commodity network flow formulation is proposed to optimally route data packets by minimizing the worst-case power consumption. Then, a variable neighborhood search approach is developed and the numerical results show its efficiency. Finally, a stochastic optimization model, namely chance constrained semidefinite programming, is considered for realistic decision-making problems with random parameters. A novel simulation-based algorithm is proposed with experiments on a real control theory problem. We show that our method allows a less conservative solution than other approaches within reasonable time.
Cheng, Jianqiang. "Stochastic Combinatorial Optimization." Thesis, Paris 11, 2013. http://www.theses.fr/2013PA112261.
In this thesis, we studied three types of stochastic problems: chance constrained problems, distributionally robust problems, as well as simple recourse problems. For stochastic programming problems, there are two main difficulties. One is that the feasible sets of stochastic problems are not convex in general. The other main challenge arises from the need to calculate conditional expectations or probabilities, both of which involve multi-dimensional integration. Due to these two major difficulties, we solved all three studied problems with approximation approaches. We first study two types of chance constrained problems: the linear program with joint chance constraints problem (LPPC) as well as the maximum probability problem (MPP). For both problems, we assume that the random matrix is normally distributed and its row vectors are independent. We first dealt with LPPC, which is generally not convex. We approximate it with two second-order cone programming (SOCP) problems. Furthermore, under mild conditions, the optimal values of the two SOCP problems are a lower and an upper bound of the original problem, respectively. For the second problem, we studied a variant of the stochastic resource constrained shortest path problem (called SRCSP for short), which is to maximize the probability of satisfying the resource constraints. To solve the problem, we proposed to use a branch-and-bound framework to come up with the optimal solution. As its corresponding linear relaxation is generally not convex, we give a convex approximation. Finally, numerical tests on random instances were conducted for both problems. With respect to LPPC, the numerical results showed that the approach we proposed outperforms the Bonferroni and Jagannathan approximations. For the MPP, the numerical results on generated instances substantiated that the convex approximation outperforms the individual approximation method. Then we study a distributionally robust stochastic quadratic knapsack problem, where we only know part of the information about the random variables, such as their first and second moments. We proved that the single knapsack problem (SKP) is a semidefinite program (SDP) after applying the SDP relaxation scheme to the binary constraints. Despite the fact that this is not the case for the multidimensional knapsack problem (MKP), two good approximations of the relaxed version of the problem are provided which obtain upper and lower bounds that appear numerically close to each other for a range of problem instances. Our numerical experiments also indicated that our proposed lower bounding approximation outperforms the approximations that are based on Bonferroni's inequality and the work by Zymler et al. In addition, an extensive set of experiments was conducted to illustrate how the conservativeness of the robust solutions does pay off in terms of ensuring the chance constraint is satisfied (or nearly satisfied) under a wide range of distribution fluctuations. Moreover, our approach can be applied to a large number of stochastic optimization problems with binary variables. Finally, a stochastic version of the shortest path problem is studied. We proved that in some cases the stochastic shortest path problem can be greatly simplified by reformulating it as the classic shortest path problem, which can be solved in polynomial time. To solve the general problem, we proposed to use a branch-and-bound framework to search the set of feasible paths.
Lower bounds are obtained by solving the corresponding linear relaxation, which in turn is done using a Stochastic Projected Gradient algorithm involving an active set method. Meanwhile, numerical examples were conducted to illustrate the effectiveness of the obtained algorithm. Concerning the resolution of the continuous relaxation, our Stochastic Projected Gradient algorithm clearly outperforms the Matlab optimization toolbox on large graphs.
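The SOCP reformulations mentioned in this abstract can be illustrated with a small sketch (our own Bonferroni-style surrogate for a joint chance constraint with independent Gaussian rows, not the thesis' specific lower/upper bounding SOCPs; cvxpy is assumed to be available and all numbers are invented).

```python
import cvxpy as cp
import numpy as np
from scipy.stats import norm

# Rows a_i ~ N(mu_i, Sigma_i), independent.  Enforcing every row at level
# 1 - eps_i with sum(eps_i) <= eps gives a conservative (Bonferroni-style)
# SOCP surrogate of the joint chance constraint P(A x <= b) >= 1 - eps.
c = np.array([1.0, 2.0])
mu = [np.array([1.0, 1.0]), np.array([0.5, 2.0])]
Sigma = [0.1 * np.eye(2), np.array([[0.20, 0.05], [0.05, 0.10]])]
b = np.array([3.0, 4.0])
eps = 0.05
z = norm.ppf(1.0 - eps / len(b))            # equal split of the risk budget

x = cp.Variable(2, nonneg=True)
constraints = [
    mu[i] @ x + z * cp.norm(np.linalg.cholesky(Sigma[i]).T @ x, 2) <= b[i]
    for i in range(len(b))
]
problem = cp.Problem(cp.Maximize(c @ x), constraints)
problem.solve()
print(x.value, problem.value)
```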
Nenna, Luca. "Numerical Methods for Multi-Marginal Optimal Transportation." Thesis, Paris Sciences et Lettres (ComUE), 2016. http://www.theses.fr/2016PSLED017/document.
In this thesis we aim at giving a general numerical framework to approximate solutions to optimal transport (OT) problems. The general idea is to introduce an entropic regularization of the initial problems. The regularized problem corresponds to the minimization of a relative entropy with respect to a given reference measure. Indeed, this is equivalent to finding the projection of the joint coupling with respect to the Kullback-Leibler divergence. This allows us to make use of the Bregman/Dykstra algorithm and solve several variational problems related to OT. We are especially interested in solving multi-marginal optimal transport problems (MMOT) arising in Physics such as in Fluid Dynamics (e.g. incompressible Euler equations à la Brenier) and in Quantum Physics (e.g. Density Functional Theory). In these cases we show that the entropic regularization plays a more important role than a simple numerical stabilization. Moreover, we also give some important results concerning existence and characterization of optimal transport maps (e.g. fractal maps) for MMOT.
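In the two-marginal discrete case, the entropic regularization and Bregman projections mentioned above reduce to the classical Sinkhorn iterations. The sketch below is a minimal, generic version of that scheme on an invented 1-D example, not the multi-marginal solvers developed in the thesis.

```python
import numpy as np

def sinkhorn(mu, nu, C, reg, n_iter=500):
    """Entropic regularization of discrete optimal transport solved by
    Sinkhorn / iterative Bregman projections: alternately rescale the Gibbs
    kernel so that the coupling matches the two prescribed marginals."""
    K = np.exp(-C / reg)                     # Gibbs kernel
    u = np.ones_like(mu)
    for _ in range(n_iter):
        v = nu / (K.T @ u)                   # projection on the second marginal
        u = mu / (K @ v)                     # projection on the first marginal
    return u[:, None] * K * v[None, :]       # regularized transport plan

# Toy 1-D example with a quadratic cost (all numbers invented).
xs = np.linspace(0.0, 1.0, 50)
mu = np.full(50, 1.0 / 50)
nu = np.exp(-(xs - 0.7) ** 2 / 0.01)
nu /= nu.sum()
C = (xs[:, None] - xs[None, :]) ** 2
P = sinkhorn(mu, nu, C, reg=1e-2)
print(P.sum(), (C * P).sum())                # total mass ~1, approximate OT cost
```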
CHUANG, HUI-TING, and 莊惠婷. "Optimization of GPRS Time Slot Allocation Considering Call Blocking Probability Constraints." Thesis, 2002. http://ndltd.ncl.edu.tw/handle/54202451602068361393.
國立臺灣大學
資訊管理研究所
90
GPRS is a good interim solution for mobile data transfer before the arrival of 3G networks. By using a packet-switched technique, one to eight time-slot channels can be dynamically assigned to GPRS users on demand. Because GSM and GPRS users share the same channels and resources, admission control of the different types of traffic is needed to optimize the data channel allocation. The methodology of dynamic slot allocation plays an important role in both maximizing the system revenue and satisfying users' QoS requirements. We try to find the best methodology to obtain the maximum system revenue under the call blocking probability constraints. We propose two mathematical models to deal with the slot allocation problem in this thesis. The goal of our models is to find a slot allocation policy that maximizes the system revenue subject to the capacity and the call blocking probability constraints. The main difference between the two models is the treatment of time: the first model is a discrete-time case, and the other is a continuous-time case. A Markovian decision process can be applied to solve the problem of maximizing system revenue without the call blocking probability constraints. In order to take the call blocking probability constraints into account, we propose three approaches: linear programming in the Markovian decision process, Lagrangian relaxation with the Markovian decision process, and an expansion of the Markovian decision process. The most significant contribution of this thesis is that we combine Lagrangian relaxation with the Markovian decision process and use it to successfully solve the Markovian decision process with additional constraints. The computational results in our experiments are good. We can find a slot allocation policy that maximizes the system revenue under the call blocking probability constraints. Compared to the policy that vendors often use, the policy we found greatly improves the system revenue. Thus, our model could provide much better decisions for system vendors and network planners.
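A generic sketch of the "Lagrangian relaxation with a Markovian decision process" idea described above (our own toy with invented transition matrices, rewards and costs, not the thesis' slot-allocation model): the constrained MDP is replaced by an unconstrained one with a penalized reward, and an outer loop on the multiplier would enforce the blocking-probability budget.

```python
import numpy as np

def value_iteration(P, r, gamma=0.95, tol=1e-8):
    """Standard value iteration for a finite MDP: P[a] is the transition
    matrix of action a, r[a] the reward vector of action a."""
    V = np.zeros(P[0].shape[0])
    while True:
        Q = np.array([r[a] + gamma * P[a] @ V for a in range(len(P))])
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=0)
        V = V_new

def lagrangian_policy(P, reward, cost, lam, gamma=0.95):
    """Inner step of 'Lagrangian relaxation + Markovian decision process':
    solve the unconstrained MDP with penalized reward r - lam * cost.  An
    outer loop would adjust lam until the long-run cost (e.g. the blocking
    probability) meets its budget."""
    penalized = [reward[a] - lam * cost[a] for a in range(len(P))]
    return value_iteration(P, penalized, gamma)

# Tiny two-state, two-action toy problem (all numbers invented).
P = [np.array([[0.9, 0.1], [0.2, 0.8]]),
     np.array([[0.5, 0.5], [0.6, 0.4]])]
reward = [np.array([1.0, 0.0]), np.array([2.0, 0.5])]
cost = [np.array([0.0, 0.1]), np.array([0.3, 0.4])]   # e.g. blocking indicators
V, policy = lagrangian_policy(P, reward, cost, lam=5.0)
print(V, policy)
```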
Wu, Chia-Lin, and 吳家麟. "Downlink Interference Management in Underlay Femtocell Networks with Outage Probability Constraints." Thesis, 2010. http://ndltd.ncl.edu.tw/handle/43839186532879003281.
國立交通大學
電信工程研究所
98
The femtocell technology is an attractive alternative to solve the capacity and coverage problems of the existing macrocellular networks. To fully exploit and realize the promised potentials, it has toovercome the interference issue when femtocells are to be deployed over a macrocell system sharing the same spectrum. In this thesis, we adopt a resource allocation based interference control approach. We consider downlink transmissions of an orthogonal frequency division multiple access (OFDMA) based femtocell network. The interference constraint is manifested in the form of (average) outage probability requirement. In other words, a femto base station (BS) is allowed to use a certain spectrum only if such an use does not cause the outage of a macrocellular user within the coverage neighborhood. The advantage of using the outage probability constraint is that the knowledge of the locations of the primary (macrocellular) users is not needed. To satisfy the average outage probability constraint, a femtocell BS has to allocates the downlink resource (subcarriers and power) properly. We divide the resource allocation problem into two subproblems. The semidefinete relaxation (SDR) is used to convert the non-convex subcarrier assignment problem into a convex semidefinete programming (SDP) which is then solved by the primal-dual interior-point method. Given the subcarrier assignment, we suggest an iterative water-filling method by solving the KKT conditions for the power allocation problem.
Szabados, Viktor. "Nové trendy ve stochastickém programování." Master's thesis, 2017. http://www.nusl.cz/ntk/nusl-367895.
Lapšanská, Alica. "Úlohy vícestupňového stochastického programování - dekompozice." Master's thesis, 2015. http://www.nusl.cz/ntk/nusl-336705.
Uhliar, Miroslav. "Ekonomické růstové modely ve stochastickém prostředí." Master's thesis, 2017. http://www.nusl.cz/ntk/nusl-367898.
Full textChen, Jiunn-Yann, and 陳俊諺. "Low-Complexity Symmetry-Constrained Maximum-A-Posteriori Probability Algorithm for Adaptive Blind Beamforming." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/09482189312002402759.
國立中央大學
通訊工程學系
101
In this thesis, we propose a real-valued SC-MAP (RSC-MAP) algorithm for a concurrent adaptive filter (CAF) applied to beamforming. We first derive a closed-form optimal weight expression for the blind MAP algorithm. A conjugate symmetric property associated with the optimal blind MAP weights is further obtained. Then, we use the conjugate symmetric constraint to guide the proposed RSC-MAP algorithm to follow the optimal blind MAP expression form during the adaptation procedure. In the simulations, we show that the proposed RSC-MAP algorithms have better performance than the classic ones. Compared with SC-MAP, RSC-MAP achieves the same bit-error-rate performance with less computational complexity.