Academic literature on the topic 'Probability constraints'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Probability constraints.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Probability constraints"

1

Pandit, Tanmoy, Alaina M. Green, C. Huerta Alderete, Norbert M. Linke, and Raam Uzdin. "Bounds on the recurrence probability in periodically-driven quantum systems." Quantum 6 (April 6, 2022): 682. http://dx.doi.org/10.22331/q-2022-04-06-682.

Full text
Abstract:
Periodically-driven systems are ubiquitous in science and technology. In quantum dynamics, even a small number of periodically-driven spins leads to complicated dynamics. Hence, it is of interest to understand what constraints such dynamics must satisfy. We derive a set of constraints for each number of cycles. For pure initial states, the observable being constrained is the recurrence probability. We use our constraints for detecting undesired coupling to unaccounted environments and drifts in the driving parameters. To illustrate the relevance of these results for modern quantum systems we demonstrate our findings experimentally on a trapped-ion quantum computer, and on various IBM quantum computers. Specifically, we provide two experimental examples where these constraints surpass fundamental bounds associated with known one-cycle constraints. This scheme can potentially be used to detect the effect of the environment in quantum circuits that cannot be classically simulated. Finally, we show that, in practice, testing an n-cycle constraint requires executing only O(n) cycles, which makes the evaluation of constraints associated with hundreds of cycles realistic.
APA, Harvard, Vancouver, ISO, and other styles
2

van de Laar, Thijs, İsmail Şenöz, Ayça Özçelikkale, and Henk Wymeersch. "Chance-Constrained Active Inference." Neural Computation 33, no. 10 (September 16, 2021): 2710–35. http://dx.doi.org/10.1162/neco_a_01427.

Full text
Abstract:
Active inference (ActInf) is an emerging theory that explains perception and action in biological agents in terms of minimizing a free energy bound on Bayesian surprise. Goal-directed behavior is elicited by introducing prior beliefs on the underlying generative model. In contrast to prior beliefs, which constrain all realizations of a random variable, we propose an alternative approach through chance constraints, which allow for a (typically small) probability of constraint violation, and demonstrate how such constraints can be used as intrinsic drivers for goal-directed behavior in ActInf. We illustrate how chance-constrained ActInf weights all imposed (prior) constraints on the generative model, allowing, for example, for a trade-off between robust control and empirical chance constraint violation. Second, we interpret the proposed solution within a message passing framework. Interestingly, the message passing interpretation is not only relevant to the context of ActInf, but also provides a general-purpose approach that can account for chance constraints on graphical models. The chance constraint message updates can then be readily combined with other prederived message update rules without the need for custom derivations. The proposed chance-constrained message passing framework thus accelerates the search for workable models in general and can be used to complement message-passing formulations on generative neural models.
APA, Harvard, Vancouver, ISO, and other styles
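The chance-constraint idea in the abstract above, requiring P(g(x) <= 0) >= 1 - epsilon for a small violation probability epsilon, can be sketched with a naive Monte Carlo check. This is purely illustrative and not code from the paper; the function name and the example distribution are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

def chance_constraint_satisfied(sample_fn, g, epsilon, n=10_000):
    """Monte Carlo check that P(g(x) <= 0) >= 1 - epsilon for x ~ sample_fn."""
    xs = sample_fn(n)
    violation_rate = np.mean(g(xs) > 0.0)  # empirical P(constraint violated)
    return violation_rate <= epsilon

# Hypothetical example: x ~ N(1, 1), constraint x <= 3, epsilon = 0.05.
# P(x > 3) is roughly 0.023, so the chance constraint holds.
ok = chance_constraint_satisfied(
    lambda n: rng.normal(1.0, 1.0, size=n),
    lambda x: x - 3.0,  # g(x) <= 0 is equivalent to x <= 3
    epsilon=0.05,
)
```

Unlike a hard constraint, which would reject any realization above 3, the chance constraint tolerates rare violations up to the budget epsilon.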
3

Flaminiano, John Paul, and Jamil Paolo Francisco. "Firm characteristics and credit constraints among SMEs in the Philippines." Small Business International Review 5, no. 1 (May 31, 2021): e332. http://dx.doi.org/10.26784/sbir.v5i1.332.

Full text
Abstract:
Access to finance is critical to support the growth of small and medium-sized enterprises (SMEs). However, lack of access to adequate financing is one of the biggest obstacles that SMEs face. This paper analyzed the relationship between firm characteristics and credit constraints among SMEs in the Philippines. We determined which firm characteristics are correlated to the predicted probability of being credit-constrained or “quasi-constrained” — i.e., able to borrow from informal sources. Estimates of marginal effects at the means (MEMs) from logistic regressions provide some suggestive evidence that increased firm size, previous purchase of fixed assets, and increased use of digital technologies for accounting and financial management are associated with a lower predicted probability of being credit-constrained. The use of digital technologies in accounting and financial management is also associated with a lower probability of credit constraint in informal financial markets.
APA, Harvard, Vancouver, ISO, and other styles
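The marginal-effects-at-the-means (MEM) computation mentioned in the abstract can be sketched for a logit model. The coefficients and regressor means below are hypothetical, not the paper's estimates; for a logit, the marginal effect of regressor j evaluated at the means is beta_j * p * (1 - p):

```python
import numpy as np

# Hypothetical logit coefficients for the log-odds of being credit-constrained
# on [intercept, firm size, digital technology use]; NOT the paper's estimates.
beta = np.array([0.5, -0.3, -0.8])

# Hypothetical sample means of the regressors (leading 1 for the intercept).
x_mean = np.array([1.0, 2.1, 0.4])

# Predicted probability of being credit-constrained at the means.
p_hat = 1.0 / (1.0 + np.exp(-(beta @ x_mean)))

# Marginal effect at the means of regressor j: dP/dx_j = beta_j * p * (1 - p).
mem = beta[1:] * p_hat * (1.0 - p_hat)
```

Negative entries in `mem` correspond to the paper's finding that larger size and digital technology use are associated with a lower predicted probability of being credit-constrained.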
4

Burgstaller, Bernhard. "Annihilating probability measures under constraints." Quaestiones Mathematicae 29, no. 4 (December 2006): 395–405. http://dx.doi.org/10.2989/16073600609486172.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Rotea, Mario, and Carlos Lana. "State estimation with probability constraints." International Journal of Control 81, no. 6 (June 2008): 920–30. http://dx.doi.org/10.1080/00207170701531149.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Gaivoronski, Alexei A., Abdel Lisser, Rafael Lopez, and Hu Xu. "Knapsack problem with probability constraints." Journal of Global Optimization 49, no. 3 (July 1, 2010): 397–413. http://dx.doi.org/10.1007/s10898-010-9566-0.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Shapira, Assaf. "Kinetically constrained models with random constraints." Annals of Applied Probability 30, no. 2 (April 2020): 987–1006. http://dx.doi.org/10.1214/19-aap1527.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Kulkarni, Anand J., and K. Tai. "A Probability Collectives Approach with a Feasibility-Based Rule for Constrained Optimization." Applied Computational Intelligence and Soft Computing 2011 (2011): 1–19. http://dx.doi.org/10.1155/2011/980216.

Full text
Abstract:
This paper demonstrates an attempt to incorporate a simple and generic constraint handling technique into the Probability Collectives (PC) approach for solving constrained optimization problems. The approach of PC optimizes any complex system by decomposing it into smaller subsystems and further treats them in a distributed and decentralized way. These subsystems can be viewed as a Multi-Agent System with rational and self-interested agents optimizing their local goals. However, as there is no inherent constraint handling capability in the PC approach, a real challenge is to take into account constraints and at the same time make the agents work collectively, avoiding the tragedy of the commons, to optimize the global/system objective. At the core of the PC optimization methodology are the concepts of Deterministic Annealing in Statistical Physics, Game Theory and Nash Equilibrium. Moreover, a rule-based procedure is incorporated to handle solutions based on the number of constraints violated and drive the convergence towards feasibility. Two specially developed cases of the Circle Packing Problem with known solutions are solved and the true optimum results are obtained at reasonable computational costs. The proposed algorithm is shown to be sufficiently robust, and strengths and weaknesses of the methodology are also discussed.
APA, Harvard, Vancouver, ISO, and other styles
9

Bhaduri, Saumitra N. "Investment and Capital Market Imperfections: Some Evidence from a Developing Economy, India." Review of Pacific Basin Financial Markets and Policies 11, no. 03 (September 2008): 411–28. http://dx.doi.org/10.1142/s0219091508001416.

Full text
Abstract:
This paper presents a switching regression model of investment decision where the probability of a firm facing financial constraint is endogenously determined. The approach, therefore, obviates the use of a priori criteria to exogenously identify the financially constrained firms, and thereby addresses the potential misclassification problem faced in the existing literature. A sample of 576 Indian manufacturing firms, collected across 15 broad industries is used for this study. The study establishes that financially constrained firms exhibit a much higher investment-cash flow sensitivity than those identified to be unconstrained. It also probes into the possible determinants of financial constraints, and finds empirical support for its hypothesis that young, liquidity constrained and low dividend payout firms are more likely to face financial constraints, when compared to their respective counterparts. This paper also provides some insight into the impact of the ongoing liberalization program on the financial constraints faced by the Indian firms.
APA, Harvard, Vancouver, ISO, and other styles
10

Ayton, Benjamin, Brian Williams, and Richard Camilli. "Measurement Maximizing Adaptive Sampling with Risk Bounding Functions." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 7511–19. http://dx.doi.org/10.1609/aaai.v33i01.33017511.

Full text
Abstract:
In autonomous exploration a mobile agent must adapt to new measurements to seek high reward, but disturbances cause a probability of collision that must be traded off against expected reward. This paper considers an autonomous agent tasked with maximizing measurements from a Gaussian Process while subject to unbounded disturbances. We seek an adaptive policy in which the maximum allowed probability of failure is constrained as a function of the expected reward. The policy is found using an extension to Monte Carlo Tree Search (MCTS) which bounds probability of failure. We apply MCTS to a sequence of approximating problems, which allows constraint satisfying actions to be found in an anytime manner. Our innovation lies in defining the approximating problems and replanning strategy such that the probability of failure constraint is guaranteed to be satisfied over the true policy. The approach does not need to plan for all measurements explicitly or constrain planning based only on the measurements that were observed. To the best of our knowledge, our approach is the first to enforce probability of failure constraints in adaptive sampling. Through experiments on real bathymetric data and simulated measurements, we show our algorithm allows an agent to take dangerous actions only when the reward justifies the risk. We then verify through Monte Carlo simulations that failure bounds are satisfied.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Probability constraints"

1

Chernyy, Vladimir. "On portfolio optimisation under drawdown and floor type constraints." Thesis, University of Oxford, 2012. http://ora.ox.ac.uk/objects/uuid:19dee50e-466b-46b5-83ae-5816d3b27c62.

Full text
Abstract:
This work is devoted to the portfolio optimisation problem arising in the context of constrained optimisation. In contrast to the classical convex constraints imposed on the proportion of wealth invested in the stock, this work deals with pathwise constraints. The drawdown constraint requires an investor's wealth process to dominate a given function of its up-to-date maximum. Typically, fund managers are required to post information about their maximum portfolio drawdowns as a part of the risk management procedure. One of the results of this work connects the drawdown constrained and the unconstrained asymptotic portfolio optimisation problems in an explicit manner. The main tools for achieving the connection are Azema-Yor processes which by their nature satisfy the drawdown condition. The other result deals with the constraint given as a floor process which the wealth process is required to dominate. The motivation arises from the financial market where this class of products serves as a protection from a downfall, e.g. out of the money put options. The main result provides the wealth process which dominates any fraction of a given floor and preserves the optimality. In the second part of this work we consider the problem of maximising lifetime utility of consumption subject to a drawdown constraint. One contribution to the existing literature consists of extending the results to incorporate a general drawdown constraint for the case of a zero interest rate market. The second result provides the first heuristic results for the problem in the presence of interest rates, which differs qualitatively from the zero interest rate case. The last chapter concludes with a conjecture for the general case of the problem.
APA, Harvard, Vancouver, ISO, and other styles
2

Al-jasser, Faisal M. A. "Phonotactic probability and phonotactic constraints : processing and lexical segmentation by Arabic learners of English as a foreign language." Thesis, University of Newcastle Upon Tyne, 2008. http://hdl.handle.net/10443/537.

Full text
Abstract:
A fundamental skill in listening comprehension is the ability to recognize words. The ability to accurately locate word boundaries (i.e., to lexically segment) is an important contributor to this skill. Research has shown that English native speakers use various cues in the signal in lexical segmentation. One such cue is phonotactic constraints; more specifically, the presence of illegal English consonant sequences such as AV and MY signals word boundaries. It has also been shown that phonotactic probability (i.e., the frequency of segments and sequences of segments in words) affects native speakers' processing of English. However, the role that phonotactic probability and phonotactic constraints play in the EFL classroom has hardly been studied, while much attention has been devoted to teaching listening comprehension in EFL. This thesis reports on an intervention study which investigated the effect of teaching English phonotactics upon Arabic speakers' lexical segmentation of running speech in English. The study involved a native English group (N=12), a non-native speaking control group (N=20), and a non-native speaking experimental group (N=20). Each of the groups took three tests, namely Non-word Rating, Lexical Decision and Word Spotting. These tests probed how sensitive the subjects were to English phonotactic probability and to the presence of illegal sequences of phonemes in English and investigated whether they used these sequences in the lexical segmentation of English. The non-native groups were post-tested with the same tasks after only the experimental group had been given a treatment which consisted of explicit teaching of relevant English phonotactic constraints and related activities for 8 weeks. The gains made by the experimental group are discussed, with implications for teaching both pronunciation and listening comprehension in an EFL setting.
APA, Harvard, Vancouver, ISO, and other styles
3

Hu, Yanling. "SOME CONTRIBUTIONS TO THE CENSORED EMPIRICAL LIKELIHOOD WITH HAZARD-TYPE CONSTRAINTS." UKnowledge, 2011. http://uknowledge.uky.edu/gradschool_diss/150.

Full text
Abstract:
Empirical likelihood (EL) is a recently developed nonparametric method of statistical inference. Owen’s 2001 book contains many important results for EL with uncensored data. However, fewer results are available for EL with right-censored data. In this dissertation, we first investigate a right-censored-data extension of Qin and Lawless (1994). They studied EL with uncensored data when the number of estimating equations is larger than the number of parameters (over-determined case). We obtain results similar to theirs for the maximum EL estimator and the EL ratio test, for the over-determined case, with right-censored data. We employ hazard-type constraints which are better able to handle right-censored data. Then we investigate EL with right-censored data and a k-sample mixed hazard-type constraint. We show that the EL ratio test statistic has a limiting chi-square distribution when k = 2. We also study the relationship between the constrained Kaplan-Meier estimator and the corresponding Nelson-Aalen estimator. We try to prove that they are asymptotically equivalent under certain conditions. Finally we present simulation studies and examples showing how to apply our theory and methodology with real data.
APA, Harvard, Vancouver, ISO, and other styles
4

Brahmantio, Bayu Beta. "Efficient Sampling of Gaussian Processes under Linear Inequality Constraints." Thesis, Linköpings universitet, Statistik och maskininlärning, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-176246.

Full text
Abstract:
In this thesis, newer Markov Chain Monte Carlo (MCMC) algorithms are implemented and compared in terms of their efficiency in the context of sampling from Gaussian processes under linear inequality constraints. Extending a Gaussian process framework that uses the Gibbs sampler, two MCMC algorithms, Exact Hamiltonian Monte Carlo (HMC) and Analytic Elliptical Slice Sampling (ESS), are used to sample values of truncated multivariate Gaussian distributions that are used for Gaussian process regression models with linear inequality constraints. In terms of generating samples from Gaussian processes under linear inequality constraints, the proposed methods generally produce samples that are less correlated than samples from the Gibbs sampler. Time-wise, Analytic ESS proves to be the faster choice, while Exact HMC produces the least correlated samples.
APA, Harvard, Vancouver, ISO, and other styles
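A minimal baseline for the sampling problem described above is naive rejection sampling from a one-dimensional truncated Gaussian; the thesis compares far more efficient samplers (Gibbs, Exact HMC, Analytic ESS) for the multivariate case. This sketch is illustrative only, with arbitrary parameter values:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_truncated_gaussian(mu, sigma, low, high, n):
    """Naive rejection sampler for a 1-D Gaussian truncated to [low, high]:
    draw from the unconstrained Gaussian and keep draws inside the bounds."""
    out = np.empty(n)
    filled = 0
    while filled < n:
        draw = rng.normal(mu, sigma, size=n)
        keep = draw[(draw >= low) & (draw <= high)]
        take = min(keep.size, n - filled)
        out[filled:filled + take] = keep[:take]
        filled += take
    return out

samples = sample_truncated_gaussian(0.0, 1.0, low=0.5, high=2.0, n=5_000)
```

Rejection sampling becomes hopeless when the constrained region has small probability mass or high dimension, which is precisely why specialized samplers such as those compared in the thesis are needed.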
5

Kokrda, Lukáš. "Optimalizace stavebních konstrukcí s pravděpodobnostními omezeními." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2015. http://www.nusl.cz/ntk/nusl-232181.

Full text
Abstract:
The diploma thesis deals with a penalty approach to stochastic optimization with chance constraints, applied to structural mechanics. The problem of optimal design of beam dimensions is modeled and solved. Uncertainty enters in the form of a random load. The corresponding mathematical model contains a condition in the form of an ordinary differential equation that is solved by the finite element method. The probability condition is approximated by several types of penalty functions. The results are obtained by computations in the MATLAB software.
APA, Harvard, Vancouver, ISO, and other styles
6

Henry, Tyler R. "Constrained short selling and the probability of informed trade /." Thesis, Connect to this title online; UW restricted, 2005. http://hdl.handle.net/1773/8716.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Pfeiffer, Laurent. "Sensitivity analysis for optimal control problems. Stochastic optimal control with a probability constraint." Palaiseau, Ecole polytechnique, 2013. https://pastel.hal.science/docs/00/88/11/19/PDF/thesePfeiffer.pdf.

Full text
Abstract:
This thesis is divided into two parts. In the first part, we study constrained deterministic optimal control problems and sensitivity analysis issues, from the point of view of abstract optimization. Second-order necessary and sufficient optimality conditions, which play an important role in sensitivity analysis, are also investigated. In this thesis, we are interested in strong solutions. We use this generic term for locally optimal controls for the L1-norm, roughly speaking. We use two essential tools: a relaxation technique, which consists in using simultaneously several controls, and a decomposition principle, which is a particular second-order Taylor expansion of the Lagrangian. Chapters 2 and 3 deal with second-order necessary and sufficient optimality conditions for strong solutions of problems with pure, mixed, and final-state constraints. In Chapter 4, we perform a sensitivity analysis for strong solutions of relaxed problems with final-state constraints. In Chapter 5, we perform a sensitivity analysis for a problem of nuclear energy production. In the second part of the thesis, we study stochastic optimal control problems with a probability constraint. We study an approach by dynamic programming, in which the level of probability is a supplementary state variable. In this framework, we show that the sensitivity of the value function with respect to the probability level is constant along optimal trajectories. We use this analysis to design numerical schemes for continuous-time problems. These results are presented in Chapter 6, in which we also study an application to asset-liability management.
APA, Harvard, Vancouver, ISO, and other styles
8

Prezioso, Luca. "Financial risk sources and optimal strategies in jump-diffusion frameworks." Doctoral thesis, Università degli studi di Trento, 2020. http://hdl.handle.net/11572/254880.

Full text
Abstract:
An optimal dividend problem with investment opportunities is considered, taking into account a source of strategic risk as well as the effect of market frictions on the decision process of financial entities. It concerns the problem of determining an optimal control of the dividend under debt constraints and investment opportunities in an economy with business cycles. It is assumed that the company is allowed to accept or reject investment opportunities arriving at random times with random sizes, by changing its outstanding indebtedness, which would impact its capital structure and risk profile. This work mainly focuses on the strategic risk faced by companies; in particular, on the manager's problem of setting appropriate priorities to deploy the limited resources available. This component is taken into account by introducing frictions in the capital structure modification process. The problem is formulated as a bi-dimensional singular control problem under regime switching in the presence of jumps. An explicit condition is obtained to ensure that the value function is finite. A viscosity solution approach is used to obtain qualitative descriptions of the solution. Moreover, a lending scheme for a system of interconnected banks with probabilistic constraints of failure is considered. The problem arises from the fact that financial institutions cannot possibly carry enough capital to withstand counterparty failures or systemic risk. In such situations, the central bank or the government effectively becomes the risk manager of last resort or, in extreme cases, the lender of last resort. If, on the one hand, the health of the whole financial system depends on government intervention, on the other hand, guaranteeing a high probability of salvage may increase the moral hazard of the banks in the financial network.
A closed-form solution for an optimal control problem related to interbank lending schemes is derived, subject to terminal probability constraints on the failure of banks which are interconnected through a financial network. The derived solution applies to real bank networks, and a general solution is obtained when the aforementioned probability constraints are assumed for all the banks. We also present a direct method to compute the systemic relevance parameter for each bank within the network. Finally, a possible computation technique for the Default Risk Charge under regulatory risk measurement processes is considered. We focus on the Default Risk Charge measure as an effective alternative to the Incremental Risk Charge, proposing its implementation by a quasi-exhaustive heuristic algorithm to determine the minimum capital required of a bank facing the market risk associated with portfolios based on assets issued by several financial agents. While most banks use the Monte Carlo simulation approach and the empirical quantile to estimate this risk measure, we provide new computational approaches, exhaustive or heuristic, that are now becoming feasible thanks to both new regulation and the high-speed, low-cost technology available nowadays.
APA, Harvard, Vancouver, ISO, and other styles
9

Et, Tabii Mohamed. "Contributions aux descriptions optimales par modèles statistiques exponentiels." Rouen, 1997. http://www.theses.fr/1997ROUES028.

Full text
Abstract:
This thesis is devoted to improving the approximation of an unknown law by the method of I-projection under constraints. The improvement consists in minimizing a Kullback information in order to obtain the I-projection. The I-projection is defined by means of a reference probability and of functions which constitute the constraints. The role of these data in improving the approximation is made explicit. We develop a methodology for characterizing the best constraints, taking into account plausible properties of the density of the unknown law. It further consists in constructing, for the I-projection method, a reference probability better than the initial one. Numerical applications show the relevance of the method.
APA, Harvard, Vancouver, ISO, and other styles
10

Hansson, Patrik. "Overconfidence and Format Dependence in Subjective Probability Intervals: Naive Estimation and Constrained Sampling." Licentiate thesis, Umeå University, Department of Psychology, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-14737.

Full text
Abstract:

A particular field in research on judgment and decision making (JDM) is concerned with realism of confidence in one's knowledge. An interesting finding is the so-called format dependence effect, which implies that assessment of the same probability distribution generates different conclusions about over- or underconfidence bias depending on the assessment format. In particular, expressing a belief about some unknown quantity in the form of a confidence interval is severely prone to overconfidence as compared to expressing the belief as an assessment of a probability. This thesis gives a tentative account of this finding in terms of a Naïve Sampling Model (NSM; Juslin, Winman, & Hansson, 2004), which assumes that people accurately describe their available information stored in memory but are naive in the sense that they treat sample properties as proper estimators of population properties. The NSM predicts that it should be possible to reduce the overconfidence in interval production by changing the response format into interval evaluation and to manipulate the degree of format dependence between interval production and interval evaluation. These predictions are verified in empirical experiments which contain both general knowledge tasks (Study 1) and laboratory learning tasks (Study 2). A bold hypothesis, that working memory is a constraining factor for sample size in judgment, which suggests that experience per se does not eliminate overconfidence, is investigated and verified. The NSM predicts that the absolute error of the placement of the interval is a constant fraction of interval size, a prediction that is verified (Study 2). This thesis suggests that no cognitive processing bias (Tversky & Kahneman, 1974) over and above naivety is needed to understand and explain the overconfidence bias in interval production and hence the format dependence effect.

APA, Harvard, Vancouver, ISO, and other styles
More sources

Books on the topic "Probability constraints"

1

Piunovskiy, A. B. Optimal Control of Random Sequences in Problems with Constraints. Dordrecht: Springer Netherlands, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Ryan, M. J. Economic choice constrained games and the nature of probability. Hull: University of Hull, Department of Economics, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Lorca, Xavier. Tree-Based Graph Partitioning Constraint. London: ISTE, 2011.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Yang, Xiaoqi, ed. Lagrange-type functions in constrained non-convex optimization. Boston: Kluwer Academic Publishers, 2003.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Constrained principal component analysis and related techniques. Boca Raton: CRC, Taylor & Francis Group, 2014.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Kotzen, Matthew. Probability in Epistemology. Edited by Alan Hájek and Christopher Hitchcock. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199607617.013.32.

Full text
Abstract:
In recent years, probabilistic approaches to epistemological questions have become increasingly influential. This chapter surveys a number of the most significant ways in which probability is relevant to contemporary epistemology. Topics surveyed include: the debate surrounding the connection between full and partial beliefs; synchronic rational constraints on credences including probabilism, regularity, reflection, and the principal principle; diachronic rational constraints on credences including conditionalization and de se updating; the application of the requirement of total evidence; evidential probability, focusing on the theories of Henry Kyburg and Timothy Williamson; sharp and fuzzy credences; likelihood arguments, including the fine-tuning argument for Design; dogmatism and its critics; and transmission failure, focusing on the work of Crispin Wright.
APA, Harvard, Vancouver, ISO, and other styles
7

McGrew, Timothy. The Spirit of Cromwell’s Rule. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198746904.003.0015.

Full text
Abstract:
One of the central complaints about Bayesian probability is that it places no constraints on individual subjectivity in one’s initial probability assignments. Those sympathetic to Bayesian methods have responded by adding restrictions motivated by broader epistemic concerns about the possibility of changing one’s mind. This chapter explores some cases where, intuitively, a straightforward Bayesian model yields unreasonable results. Problems arise in these cases not because there is something wrong with the Bayesian formalism per se but because standard textbook illustrations teach us to represent our inferences in simplified ways that break down in extreme cases. It also explores some interesting limitations on the extent to which successive items of evidence ought to induce us to change our minds when certain screening conditions obtain.
APA, Harvard, Vancouver, ISO, and other styles
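The dogmatism that Cromwell's rule warns against can be seen directly in the Bayesian update rule: a prior of exactly 0 (or 1) is immune to any evidence. A minimal illustration (ours, not taken from the chapter):

```python
def bayes_update(prior, likelihood_h, likelihood_not_h):
    """Posterior P(H | E) from prior P(H) and the likelihoods P(E | H), P(E | not-H)."""
    numerator = prior * likelihood_h
    denominator = numerator + (1 - prior) * likelihood_not_h
    return numerator / denominator if denominator > 0 else 0.0

# Evidence 100x more likely under H than under not-H ...
evidence = dict(likelihood_h=0.99, likelihood_not_h=0.0099)

# ... cannot move a dogmatic prior of exactly 0:
p = 0.0
for _ in range(10):
    p = bayes_update(p, **evidence)
print(p)  # 0.0

# ... but a tiny nonzero prior converges to near-certainty:
q = 1e-6
for _ in range(10):
    q = bayes_update(q, **evidence)
print(q > 0.999)  # True
```

Each update multiplies the prior odds by the likelihood ratio (here 100), so any strictly positive prior eventually recovers; a zero prior never does.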
8

Coolen, A. C. C., A. Annibale, and E. S. Roberts. Random graph ensembles. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198709893.003.0003.

Full text
Abstract:
This chapter presents some theoretical tools for defining random graph ensembles systematically via soft or hard topological constraints, and works through some properties of the Erdős–Rényi random graph ensemble, the simplest non-trivial random graph ensemble, in which links appear between two nodes with a fixed probability p. The chapter sets out the central representation of graph generation as the result of a discrete-time Markovian stochastic process. This unites the two flavours of graph generation approaches, since they can be viewed as simply moving forwards or backwards through this representation. It is possible to define a random graph by an algorithm and then calculate the associated stationary probability. The alternative approach is to specify sampling weights and then to construct an algorithm that will have these weights as the stationary probabilities upon convergence.
APA, Harvard, Vancouver, ISO, and other styles
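The Erdős–Rényi ensemble described in the abstract is simple enough to sample directly; a short sketch (illustrative, not from the book):

```python
import random

def erdos_renyi(n, p, seed=None):
    """Sample one graph from the Erdos-Renyi ensemble G(n, p):
    each of the n(n-1)/2 possible links appears independently
    with fixed probability p."""
    rng = random.Random(seed)
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p}

# The ensemble assigns a graph with L links the weight
# p**L * (1 - p)**(n*(n-1)//2 - L) -- the stationary probability
# that a Markov-chain graph generator must reproduce on convergence.
g = erdos_renyi(20, 0.3, seed=1)
print(0 <= len(g) <= 20 * 19 // 2)  # True
```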
9

Lorca, Xavier. Tree-Based Graph Partitioning Constraint. John Wiley & Sons, 2013.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Probability constraints"

1

Perez, Guillaume, and Jean-Charles Régin. "MDDs: Sampling and Probability Constraints." In Lecture Notes in Computer Science, 226–42. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-66158-2_15.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Baki, Bassam, and Maroua Bouzid. "Scheduling with Probability and Temporal Constraints." In Lecture Notes in Computer Science, 148–59. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11558590_15.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Roe, Byron P. "Fitting Data with Correlations and Constraints." In Probability and Statistics in Experimental Physics, 180–88. New York, NY: Springer New York, 1992. http://dx.doi.org/10.1007/978-1-4757-2186-7_16.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Roe, Byron P. "Fitting Data with Correlations and Constraints." In Probability and Statistics in Experimental Physics, 221–30. New York, NY: Springer New York, 2001. http://dx.doi.org/10.1007/978-1-4684-9296-5_17.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Foss, Sergey, and Alexander Sakhanenko. "Structural Properties of Conditioned Random Walks on Integer Lattices with Random Local Constraints." In Progress in Probability, 407–38. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-60754-8_19.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Roe, Byron P. "Fitting Data with Correlations and Constraints." In Probability and Statistics in the Physical Sciences, 201–13. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-53694-7_16.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Stute, W., and J. Reitze. "Statistical Analysis of Doubly Censored Data Under Support Constraints." In Asymptotics in Statistics and Probability, edited by Madan L. Puri, 347–66. Berlin, Boston: De Gruyter, 2000. http://dx.doi.org/10.1515/9783110942002-024.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Laha, Vivek, Vinay Singh, Yogendra Pandey, and S. K. Mishra. "Nonsmooth Mathematical Programs with Vanishing Constraints in Banach Spaces." In High-Dimensional Optimization and Probability, 395–417. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-00832-0_13.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Caron, Virgile. "Importance Sampling for Multi-Constraints Rare Event Probability." In Springer Proceedings in Mathematics & Statistics, 119–28. New York, NY: Springer New York, 2014. http://dx.doi.org/10.1007/978-1-4939-2104-1_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Nambakhsh, Cyrus M. S., Terry M. Peters, Ali Islam, and Ismail Ben Ayed. "Right Ventricle Segmentation with Probability Product Kernel Constraints." In Advanced Information Systems Engineering, 509–17. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-40811-3_64.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Probability constraints"

1

Loubenets, Elena R. "Quantum states satisfying classical probability constraints." In Quantum Probability. Warsaw: Institute of Mathematics Polish Academy of Sciences, 2006. http://dx.doi.org/10.4064/bc73-0-25.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Fang, Cheng. "Efficient Algorithms And Representations For Chance-constrained Mixed Constraint Programming." In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/749.

Full text
Abstract:
Resistance to adoption of autonomous systems comes in part from the perceived unreliability of the systems. Concerns can be addressed by approaches that guarantee the probability of success. This is achieved in chance-constrained constraint programming (CC-CP) by imposing the constraints required for success and providing upper bounds on the probability of violating constraints. This extended abstract reports on novel uncertainty representations that address problems prevalent in current methods.
APA, Harvard, Vancouver, ISO, and other styles
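The upper bound on violation probability at the heart of CC-CP can be illustrated by checking a candidate solution with Monte Carlo sampling (a hedged sketch with made-up numbers, not the paper's method):

```python
import random

# Illustrative only: an uncertain task duration ~ Uniform(0, 1); the chance
# constraint requires P(duration exceeds the scheduled slack) <= delta.
def violation_probability(slack, n_samples=100_000, seed=0):
    rng = random.Random(seed)
    violations = sum(rng.random() > slack for _ in range(n_samples))
    return violations / n_samples

delta = 0.1                                # allowed chance of failure
p_hat = violation_probability(slack=0.95)  # estimated P(violation), about 0.05
print(p_hat <= delta)  # True: the candidate satisfies the chance constraint
```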
3

Calafiore, Giuseppe C., and Laurent El Ghaoui. "Linear Programming with Probability Constraints - Part 1." In 2007 American Control Conference. IEEE, 2007. http://dx.doi.org/10.1109/acc.2007.4282190.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Calafiore, Giuseppe C., and Laurent El Ghaoui. "Linear Programming with Probability Constraints - Part 2." In 2007 American Control Conference. IEEE, 2007. http://dx.doi.org/10.1109/acc.2007.4282191.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Graborov, Sergey. "The Optimization Tasks with the Probability Constraints." In Theory and Practice of Institutional Reforms in Russia [Text]: Collection of Scientific Works. CEMI RAS, 2019. http://dx.doi.org/10.33276/978-5-8211-0781-7-53-61.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Zhu, Guolei, Yingmin Wang, and Qi Wang. "Matched Field Processing under Posterior Probability Constraints." In OCEANS 2015 - MTS/IEEE Washington. IEEE, 2015. http://dx.doi.org/10.23919/oceans.2015.7404444.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

HasanzadeZonuzy, Aria, Dileep Kalathil, and Srinivas Shakkottai. "Model-Based Reinforcement Learning for Infinite-Horizon Discounted Constrained Markov Decision Processes." In Thirtieth International Joint Conference on Artificial Intelligence {IJCAI-21}. California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/347.

Full text
Abstract:
In many real-world reinforcement learning (RL) problems, in addition to maximizing the objective, the learning agent has to maintain some necessary safety constraints. We formulate the problem of learning a safe policy as an infinite-horizon discounted Constrained Markov Decision Process (CMDP) with an unknown transition probability matrix, where the safety requirements are modeled as constraints on expected cumulative costs. We propose two model-based constrained reinforcement learning (CRL) algorithms for learning a safe policy, namely, (i) the GM-CRL algorithm, where the algorithm has access to a generative model, and (ii) the UC-CRL algorithm, where the algorithm learns the model using an upper-confidence-style online exploration method. We characterize the sample complexity of these algorithms, i.e., the number of samples needed to ensure a desired level of accuracy with high probability, both with respect to objective maximization and constraint satisfaction.
APA, Harvard, Vancouver, ISO, and other styles
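A step both of the paper's algorithms share is estimating the unknown transition probabilities from sampled transitions; a toy sketch under assumed two-state dynamics (not the GM-CRL/UC-CRL algorithms themselves):

```python
import random
from collections import Counter

TRUE_P = {0: [0.7, 0.3], 1: [0.4, 0.6]}  # assumed ground truth, unknown to the learner

def generative_model(s, rng):
    """Draw a next state for state s, as a generative model would."""
    return 0 if rng.random() < TRUE_P[s][0] else 1

def estimate_transitions(n_samples, seed=0):
    """Empirical transition probabilities from n_samples draws per state."""
    rng = random.Random(seed)
    est = {}
    for s in TRUE_P:
        counts = Counter(generative_model(s, rng) for _ in range(n_samples))
        est[s] = [counts[0] / n_samples, counts[1] / n_samples]
    return est

# Sample complexity: more draws per state tighten the estimate, giving the
# high-probability accuracy guarantee the paper quantifies.
p_hat = estimate_transitions(50_000)
print(abs(p_hat[0][0] - TRUE_P[0][0]) < 0.02)
```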
8

Sheng, Wei, Wai-Yip Chan, and Steven D. Blostein. "Rateless code based multimedia multicasting with outage probability constraints." In 2010 25th Biennial Symposium on Communications. IEEE, 2010. http://dx.doi.org/10.1109/bsc.2010.5472984.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

"A SCHEDULING TECHNIQUE OF PLANS WITH PROBABILITY AND TEMPORAL CONSTRAINTS." In 2nd International Conference on Informatics in Control, Automation and Robotics. SciTePress - Science and and Technology Publications, 2005. http://dx.doi.org/10.5220/0001169300770084.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Scott, R. T., and G. A. Gabriele. "Computer Aided Tolerance Analysis of Parts and Assemblies." In ASME 1989 Design Technical Conferences. American Society of Mechanical Engineers, 1989. http://dx.doi.org/10.1115/detc1989-0017.

Full text
Abstract:
An exact constraint scheme based on the physical contacting constraints of real part mating features is used to represent the process of assembling the parts. To provide useful probability information about how assembly dimensions are distributed when the parts are assembled as intended, the real-world constraints that would prevent interference are ignored. This work addresses some limitations in the area of three-dimensional assembly tolerance analysis. As a result of this work, the following were demonstrated: 1. assembly of parts whose mating features are subject to variation; 2. assembly of parts using a real-world set of exact constraints; 3. provision of probability distributions of assembly dimensions.
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Probability constraints"

1

Vargas-Riaño, Carmiña Ofelia, and Julian Parra-Polania. Relevance of the collateral constraint form in the analysis of financial crisis interventions. Banco de la República, January 2022. http://dx.doi.org/10.32468/be.1190.

Full text
Abstract:
We combine two modifications to the standard (current and total income) collateral constraint that has been commonly used in models that analyze financial crisis interventions. Specifically, we consider an alternative constraint stated in terms of future and disposable income. We find that in this case a state-contingent debt tax (effective during crisis only, as opposed to a macroprudential tax) increases debt capacity and lowers the probability of crisis. This offers one more instance calling the attention of academics and policymakers to the fact that the specific form of the borrowing constraint is crucial in determining the appropriate crisis intervention.
APA, Harvard, Vancouver, ISO, and other styles
2

Rodgers, Yana van der Meulen, and Joseph E. Zveglich, Jr. Gender Differences in Access to Health Care Among the Elderly: Evidence from Southeast Asia. Asian Development Bank, February 2021. http://dx.doi.org/10.22617/wps210047-2.

Full text
Abstract:
This paper examines gender, among other factors that may constrain older persons in Southeast Asia from meeting their health-care needs when sick, based on data from Cambodia, the Philippines, and Viet Nam. It finds that while women in Cambodia and the Philippines are more likely to seek treatment than men, the gender difference is reversed in Viet Nam, where stigma associated with some diseases may deter women more strongly than men. Household survey data from these countries show that the probability of seeking treatment rises with age more sharply for women than for men. Yet, for the subsample of elders, the gender difference is not significant.
APA, Harvard, Vancouver, ISO, and other styles
3

Weller, Joel I., Ignacy Misztal, and Micha Ron. Optimization of methodology for genomic selection of moderate and large dairy cattle populations. United States Department of Agriculture, March 2015. http://dx.doi.org/10.32747/2015.7594404.bard.

Full text
Abstract:
The main objectives of this research were to detect the specific polymorphisms responsible for observed quantitative trait loci and develop optimal strategies for genomic evaluations and selection for moderate (Israel) and large (US) dairy cattle populations. A joint evaluation using all phenotypic, pedigree, and genomic data is the optimal strategy. The specific objectives were: 1) to apply strategies for determination of the causative polymorphisms based on the “a posteriori granddaughter design” (APGD), 2) to develop methods to derive unbiased estimates of gene effects derived from SNP chips analyses, 3) to derive optimal single-stage methods to estimate breeding values of animals based on marker, phenotypic and pedigree data, 4) to extend these methods to multi-trait genetic evaluations and 5) to evaluate the results of long-term genomic selection, as compared to traditional selection. Nearly all of these objectives were met. The major achievements were: The APGD and the modified granddaughter designs were applied to the US Holstein population, and regions harboring segregating quantitative trait loci (QTL) were identified for all economic traits of interest. The APGD was able to find segregating QTL for all the economic traits analyzed, and confidence intervals for QTL location ranged from ~5 to 35 million base pairs. Genomic estimated breeding values (GEBV) for milk production traits in the Israeli Holstein population were computed by the single-step method and compared to results for the two-step method. The single-step method was extended to derive GEBV for multi-parity evaluation. Long-term analysis of genomic selection demonstrated that inclusion of pedigree data from previous generations may result in less accurate GEBV.
Major conclusions are: Predictions using single-step genomic best linear unbiased prediction (GBLUP) were the least biased, and that method appears to be the best tool for genomic evaluation of a small population, as it automatically accounts for parental index and allows for inclusion of female genomic information without additional steps. None of the methods applied to the Israeli Holstein population were able to derive GEBV for young bulls that were significantly better than parent averages. Thus we confirm previous studies that the main limiting factor for the accuracy of GEBV is the number of bulls with genotypes and progeny tests. Although 36 of the grandsires included in the APGD were genotyped for the BovineHDBeadChip, which includes 777,000 SNPs, we were not able to determine the causative polymorphism for any of the detected QTL. The number of valid unique markers on the BovineHDBeadChip is not sufficient for a reasonable probability of finding the causative polymorphisms. Complete resequencing of the genome of approximately 50 bulls will be required, but this could not be accomplished within the framework of the current project due to funding constraints. Inclusion of pedigree data from older generations in the derivation of GEBV may result in less accurate evaluations.
APA, Harvard, Vancouver, ISO, and other styles
4

Cabrera, Wilmar, Santiago Gamba, Camilo Gómez, and Mauricio Villamizar-Villegas. Examining Macroprudential Policy through a Microprudential Lens. Banco de la República, October 2022. http://dx.doi.org/10.32468/be.1212.

Full text
Abstract:
In this paper, we examine the financial and real effects of macroprudential policies with a new identifying strategy that exploits borrower-specific provisioning levels for each bank. Locally, we compare similar firms just below and above regulatory thresholds established in Colombia during 2008–2018 for the corporate credit portfolio. Our results indicate that the scheme induces banks to increase the provisioning cost of downgraded loans. This implies that, for loans with similar risk but with a discontinuously lower rating, banks offer a lower amount of credit, demand higher quality guarantees, and impose a higher level of provision coverage through the loan-loss given default. To illustrate, a 1 percentage point (pp) increase in the provision-to-credit ratio leads to a reduction in credit growth of up to 15pp and lowers the probability of receiving new credit by up to 11pp. When mapping our results to the real sector, we find that downgraded firms are constrained in their investment decisions and experience a contraction in liabilities, equity, and total assets.
APA, Harvard, Vancouver, ISO, and other styles
5

Nolan, Brian, Brenda Gannon, Richard Layte, Dorothy Watson, Christopher T. Whelan, and James Williams. Monitoring Poverty Trends in Ireland: Results from the 2000 Living in Ireland survey. ESRI, July 2002. http://dx.doi.org/10.26504/prs45.

Full text
Abstract:
This study is the latest in a series monitoring the evolution of poverty, based on data gathered by The ESRI in the Living in Ireland Surveys since 1994. These have allowed progress towards achieving the targets set out in the National Anti Poverty Strategy since 1997 to be assessed. The present study provides an updated picture using results from the 2000 round of the Living in Ireland survey. The numbers interviewed in the 2000 Living in Ireland survey were enhanced substantially, to compensate for attrition in the panel survey since it commenced in 1994. Individual interviews were conducted with 8,056 respondents. Relative income poverty lines do not on their own provide a satisfactory measure of exclusion due to lack of resources, but do nonetheless produce important key indicators of medium to long-term background trends. The numbers falling below relative income poverty lines were most often higher in 2000 than in 1997 or 1994. The income gap for those falling below these thresholds also increased. By contrast, the percentage of persons falling below income lines indexed only to prices (rather than average income) since 1994 or 1997 fell sharply, reflecting the pronounced real income growth throughout the distribution between then and 2000. This contrast points to the fundamental factors at work over this highly unusual period: unemployment fell very sharply and substantial real income growth was seen throughout the distribution, including social welfare payments, but these lagged behind income from work and property so social welfare recipients were more likely to fall below thresholds linked to average income. The study shows an increasing probability of falling below key relative income thresholds for single person households, those affected by illness or disability, and for those who are aged 65 or over - many of whom rely on social welfare support. 
Those in households where the reference person is unemployed still face a relatively high risk of falling below the income thresholds but continue to decline as a proportion of all those below the lines. Women face a higher risk of falling below those lines than men, but this gap was marked among the elderly. The study shows a marked decline in deprivation levels across different household types. As a result consistent poverty, that is the numbers both below relative income poverty lines and experiencing basic deprivation, also declined sharply. Those living in households comprising one adult with children continue to face a particularly high risk of consistent poverty, followed by those in families with two adults and four or more children. The percentage of adults in households below 70 per cent of median income and experiencing basic deprivation was seen to have fallen from 9 per cent in 1997 to about 4 per cent, while the percentage of children in such households fell from 15 per cent to 8 per cent. Women aged 65 or over faced a significantly higher risk of consistent poverty than men of that age. Up to 2000, the set of eight basic deprivation items included in the measure of consistent poverty were unchanged, so it was important to assess whether they were still capturing what would be widely seen as generalised deprivation. Factor analysis suggested that the structuring of deprivation items into the different dimensions has remained remarkably stable over time. Combining low income with the original set of basic deprivation indicators did still appear to identify a set of households experiencing generalised deprivation as a result of prolonged constraints in terms of command over resources, and distinguished from those experiencing other types of deprivation. However, on its own this does not tell the whole story - like purely relative income measures - nor does it necessarily remain the most appropriate set of indicators looking forward. 
Finally, it is argued that it would now be appropriate to expand the range of monitoring tools to include alternative poverty measures incorporating income and deprivation. Levels of deprivation for some of the items included in the original basic set were so low by 2000 that further progress will be difficult to capture empirically. This represents a remarkable achievement in a short space of time, but poverty is invariably reconstituted in terms of new and emerging social needs in a context of higher societal living standards and expectations. An alternative set of basic deprivation indicators and measure of consistent poverty is presented, which would be more likely to capture key trends over the next number of years. This has implications for the approach adopted in monitoring the National Anti-Poverty Strategy. Monitoring over the period to 2007 should take a broader focus than the consistent poverty measure as constructed to date, with attention also paid to both relative income and to consistent poverty with the amended set of indicators identified here.
APA, Harvard, Vancouver, ISO, and other styles