Selected scholarly literature on the topic "Probabilistic number theory"

Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the list of current articles, books, theses, conference proceedings, and other scholarly sources relevant to the topic "Probabilistic number theory".

Next to each source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf and read the abstract online, when one is available in the metadata.

Journal articles on the topic "Probabilistic number theory"

1

Stakėnas, Vilius. "The probabilistic number theory and continuum." Lietuvos matematikos rinkinys, no. III (December 17, 1999): 93–99. http://dx.doi.org/10.15388/lmr.1999.35485.

Abstract:
Let bn be a sequence of real numbers increasing unboundedly, α > 0, and B(n, α) = [bn α]. Conditions on bn are considered which imply the regularity of distribution of B(n, α) in arithmetic progressions for almost all α > 0. This makes it possible to develop a piece of probabilistic number theory on the sequences B(n, α) for almost all α > 0.
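The regularity described in the abstract can be illustrated numerically. The sketch below is our illustration only: it assumes the simplest unbounded sequence b_n = n and a fixed quadratic irrational α = √2 (the abstract's result concerns almost all α), and checks that the residues of ⌊b_n α⌋ modulo q spread out evenly:

```python
import math

def residue_frequencies(alpha, q, N):
    """Frequency of each residue class of floor(b_n * alpha) mod q,
    with the illustrative choice b_n = n (a Beatty sequence)."""
    counts = [0] * q
    for n in range(1, N + 1):
        counts[math.floor(n * alpha) % q] += 1
    return [c / N for c in counts]

freqs = residue_frequencies(math.sqrt(2), 5, 100_000)
# For an irrational alpha such as sqrt(2), every residue class mod 5
# should receive a proportion close to 1/5.
```

Running the same experiment with rational α shows the equidistribution fail, which is why the "almost all α" restriction matters.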
2

Indlekofer, K. H. "Number theory—probabilistic, heuristic, and computational approaches." Computers & Mathematics with Applications 43, no. 8-9 (April 2002): 1035–61. http://dx.doi.org/10.1016/s0898-1221(02)80012-8.

3

Stakėnas, V. "On some inequalities of probabilistic number theory." Lithuanian Mathematical Journal 46, no. 2 (April 2006): 208–16. http://dx.doi.org/10.1007/s10986-006-0022-2.

4

Erdős, P. "Recent problems in probabilistic number theory and combinatorics." Advances in Applied Probability 24, no. 4 (December 1992): 766–67. http://dx.doi.org/10.1017/s0001867800024654.

5

Stakėnas, Vilius. "Jonas Kubilius and genesis of probabilistic number theory." Lithuanian Mathematical Journal 55, no. 1 (January 2015): 25–47. http://dx.doi.org/10.1007/s10986-015-9263-2.

6

Zhang, Wen-Bin. "Probabilistic number theory in additive arithmetic semigroups II." Mathematische Zeitschrift 235, no. 4 (December 1, 2000): 747–816. http://dx.doi.org/10.1007/s002090000165.

7

Kátai, Imre, Bui Minh Phong, and Le Manh Thanh. "Some results and problems in probabilistic number theory." Annales Universitatis Scientiarum Budapestinensis de Rolando Eötvös Nominatae. Sectio computatorica, no. 43 (2014): 253–65. https://doi.org/10.71352/ac.43.253.

8

Elliott, P. D. T. A. "Jonas Kubilius and Probabilistic Number Theory: Some Personal Reflections." Lithuanian Mathematical Journal 55, no. 1 (January 2015): 2–24. http://dx.doi.org/10.1007/s10986-015-9262-3.

9

Daili, Noureddine. "Densities and Natural Integrability. Applications in Probabilistic Number Theory." JP Journal of Algebra, Number Theory and Applications 47, no. 1 (July 1, 2020): 51–65. http://dx.doi.org/10.17654/nt047010051.

10

Lokutsievskiy, Lev V. "Optimal probabilistic search." Sbornik: Mathematics 202, no. 5 (May 31, 2011): 697–719. http://dx.doi.org/10.1070/sm2011v202n05abeh004162.

More sources

Theses / dissertations on the topic "Probabilistic number theory"

1

Harper, Adam James. "Some topics in analytic and probabilistic number theory." Thesis, University of Cambridge, 2012. https://www.repository.cam.ac.uk/handle/1810/265539.

Abstract:
This dissertation studies four problems in analytic and probabilistic number theory. Two of the problems are about a certain random number-theoretic object, namely a random multiplicative function. The other two problems are about smooth numbers (i.e. numbers having only small prime factors), both in their own right and in their application to finding solutions to S-unit equations over the integers. Thus all four problems are concerned, in different ways, with understanding the multiplicative structure of the integers. More precisely, we will establish that certain sums of a random multiplicative function satisfy a normal approximation (i.e. a central limit theorem), but that the complete sum over all integers less than x does not satisfy such an approximation. This reflects certain facts about the number and size of the prime factors of a typical integer. Our proofs use martingale methods, as well as a conditioning argument special to this problem. Next, we will prove an almost sure omega result for the sum of a random multiplicative function, substantially improving the existing result of Halász. We will do this using a connection between sums of a random multiplicative function and a certain random trigonometric sum process, so that the heart of our work is proving precise results about the suprema of a class of Gaussian random processes. Switching to the study of smooth numbers, we will establish an equidistribution result for the y-smooth numbers less than x among arithmetic progressions to modulus q, asymptotically as (log x)/(log q) → ∞, subject to a certain condition on the relative sizes of y and q. The main point of this work is that it does not require any restrictions on the relative sizes of x and y. Our proofs use a simple majorant principle for trigonometric sums, together with general tools such as a smoothed explicit formula. Finally, we will prove lower bounds for the possible number of solutions of some S-unit equations over the integers. For example, we will show that there exist arbitrarily large sets S of prime numbers such that the equation a + 1 = c has at least exp{(#S)^{1/16−ε}} solutions (a, c) with all their prime factors from S. We will do this by using discrete forms of the circle method, and the multiplicative large sieve, to count the solutions of certain auxiliary linear equations.
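A random multiplicative function of the kind studied in this thesis can be simulated directly. The sketch below is our illustration (the function name and the choices x = 1000, 20 samples are ours, not the thesis's code): it draws independent Rademacher signs f(p) = ±1 on the primes, extends f completely multiplicatively, and computes the partial sums M(x) = Σ_{n≤x} f(n):

```python
import random

def random_multiplicative_sum(x, rng):
    """Sample f(p) = +/-1 independently on primes, extend f completely
    multiplicatively via a smallest-prime-factor sieve, and return
    M(x) = sum_{n <= x} f(n)."""
    spf = list(range(x + 1))              # smallest prime factor of each n
    for p in range(2, int(x ** 0.5) + 1):
        if spf[p] == p:                   # p is prime
            for m in range(p * p, x + 1, p):
                if spf[m] == m:
                    spf[m] = p
    f = [0, 1] + [0] * (x - 1)            # f(1) = 1
    sign = {}
    for n in range(2, x + 1):
        p = spf[n]
        if p not in sign:
            sign[p] = rng.choice((-1, 1))
        f[n] = sign[p] * f[n // p]        # complete multiplicativity
    return sum(f[1:])

sums = [random_multiplicative_sum(1000, random.Random(seed)) for seed in range(20)]
# The partial sums fluctuate on a scale roughly comparable to sqrt(1000).
```

Repeating this for larger x and many samples is how one would probe, empirically, the normal-approximation and almost-sure-growth phenomena described in the abstract.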
2

Hughes, Garry. "Distribution of additive functions in algebraic number fields." Title page, contents and summary only, 1987. http://web4.library.adelaide.edu.au/theses/09SM/09smh893.pdf.

3

Zhao, Wenzhong. "Probabilistic databases and their application." Lexington, Ky. : [University of Kentucky Libraries], 2004. http://lib.uky.edu/ETD/ukycosc2004d00183/wzhao0.pdf.

Abstract:
Thesis (Ph. D.)--University of Kentucky, 2004. Title from document title page (viewed Jan. 7, 2005). Document formatted into pages; contains x, 180 p. : ill. Includes abstract and vita. Includes bibliographical references (p. 173-178).
4

Lloyd, James Robert. "Representation, learning, description and criticism of probabilistic models with applications to networks, functions and relational data." Thesis, University of Cambridge, 2015. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.709264.

5

Li, Xiang (李想). "Managing query quality in probabilistic databases." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2011. http://hub.hku.hk/bib/B47753134.

Abstract:
In many emerging applications, such as sensor networks, location-based services, and data integration, the database is inherently uncertain. To handle a large amount of uncertain data, probabilistic databases have been recently proposed, where probabilistic queries are enabled to provide answers with statistical guarantees. In this thesis, we study the important issues of managing the quality of a probabilistic database. We first address the problem of measuring the ambiguity, or quality, of a probabilistic query. This is accomplished by computing the PWS-quality score, a recently proposed measure for quantifying the ambiguity of query answers under the possible world semantics. We study the computation of the PWS-quality for the top-k query. This problem is not trivial, since directly computing the top-k query score is computationally expensive. To tackle this challenge, we propose efficient approximate algorithms for deriving the quality score of a top-k query. We have performed experiments on both synthetic and real data to validate their performance and accuracy. Our second contribution is to study how to use the PWS-quality score to coordinate the process of cleaning uncertain data. Removing ambiguous data from a probabilistic database can often give us a higher-quality query result. However, this operation requires some external knowledge (e.g., an updated value from a sensor source), and is thus not without cost. It is important to choose the correct object to clean, in order to (1) achieve a high quality gain, and (2) incur a low cleaning cost. In this thesis, we examine different cleaning methods for a probabilistic top-k query. We also study an interesting problem where different query users have their own budgets available for cleaning. We demonstrate how an optimal solution, in terms of the lowest cleaning costs, can be achieved, for probabilistic range and maximum queries. 
An extensive evaluation reveals that these solutions are highly efficient and accurate.
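The PWS-quality score itself is defined in the cited work; as a loose, hedged stand-in (our toy construction, not the paper's formula), the ambiguity of a top-1 query over a tuple-independent probabilistic table can be measured by the Shannon entropy of the answer distribution across possible worlds:

```python
import itertools
import math

def top1_answer_entropy(tuples):
    """tuples: list of (name, value, existence probability).  Enumerate
    every possible world of a tuple-independent table and return the
    Shannon entropy (bits) of 'which tuple is the maximum' -- a toy
    ambiguity measure under possible-world semantics."""
    answer_prob = {}
    for present in itertools.product([False, True], repeat=len(tuples)):
        p_world = 1.0
        best = None
        for (name, value, prob), here in zip(tuples, present):
            p_world *= prob if here else 1 - prob
            if here and (best is None or value > best[1]):
                best = (name, value)
        key = best[0] if best else None   # None: empty world, no answer
        answer_prob[key] = answer_prob.get(key, 0.0) + p_world
    return -sum(p * math.log2(p) for p in answer_prob.values() if p > 0)

# A certain table has zero ambiguity; competitive uncertain tuples raise it.
certain = top1_answer_entropy([("a", 5, 1.0), ("b", 3, 1.0)])
uncertain = top1_answer_entropy([("a", 5, 0.5), ("b", 3, 0.9)])
```

Cleaning an uncertain tuple (setting its probability to 0 or 1) and recomputing the entropy mimics, at toy scale, the quality-gain-versus-cost trade-off the thesis studies.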
6

Rotondo, Pablo. "Probabilistic studies in number theory and word combinatorics : instances of dynamical analysis." Thesis, Sorbonne Paris Cité, 2018. http://www.theses.fr/2018USPCC213/document.

Abstract:
Dynamical analysis incorporates tools from dynamical systems, namely the transfer operator, into the framework of analytic combinatorics, permitting the analysis of numerous algorithms and objects naturally associated with an underlying dynamical system. This dissertation presents, in the integrated framework of dynamical analysis, the probabilistic analysis of seemingly distinct problems in a unified way: the probabilistic study of the recurrence function of Sturmian words, and the probabilistic study of the continued logarithm algorithm.

Sturmian words are a fundamental family of words in word combinatorics. They are, in a precise sense, the simplest infinite words that are not eventually periodic. Sturmian words have been well studied over the years, notably by Morse and Hedlund (1940), who demonstrated that they admit a notable number-theoretical characterization as discrete codings of lines with irrational slope, relating them naturally to dynamical systems, in particular the Euclidean dynamical system. These words had never been studied from a probabilistic perspective. Here, we quantify the recurrence properties of a "random" Sturmian word, which are dictated by the so-called "recurrence function"; we perform a complete asymptotic probabilistic study of this function, quantifying its mean and describing its distribution under two different probabilistic models, which present different virtues: one is a natural choice from an algorithmic point of view (but is innovative from the point of view of dynamical analysis), while the other allows a natural quantification of the worst-case growth of the recurrence function. We discuss the relation between these two distinct models and their respective techniques, explaining also how the two seemingly different techniques employed could be linked through the use of the Mellin transform. We also discuss our ongoing work regarding two special families of Sturmian words: those associated with a quadratic irrational slope, and those with a rational slope (not properly Sturmian). Our work seems to show the possibility of a unified study.

The continued logarithm algorithm, introduced by Gosper in HAKMEM (1978) as a mutation of classical continued fractions, computes the greatest common divisor of two natural numbers by performing division-like steps involving only binary shifts and subtractions. Its worst-case performance was studied recently by Shallit (2016), who showed a precise upper bound for the number of steps and gave a family of inputs attaining this bound. In this dissertation we employ dynamical analysis to study the average running time of the algorithm, giving precise mathematical constants for the asymptotics, as well as other parameters of interest. The underlying dynamical system is akin to the Euclidean one, and was first studied by Chan (around 2005) from an ergodic perspective, but the presence of powers of 2 in the quotients ingrains a dyadic flavour into the central parameters that cannot be grasped solely by studying this system. We thus introduce a dyadic component and deal with a two-component system, one component real and the other dyadic. With this new mixed system at hand, we provide a complete average-case analysis of the algorithm by dynamical analysis.
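The division-like steps of the continued logarithm algorithm can be sketched as follows. This is our reading of Gosper's recurrence (the function names and the power-of-2 normalization remark are ours, not the thesis's code): each step replaces the pair (a, b), a ≥ b > 0, by (2^k b, a − 2^k b) for the largest k with 2^k b ≤ a, using only shifts and subtractions; the odd part of the final value agrees with the odd part of gcd(a, b), reflecting the dyadic flavour mentioned in the abstract:

```python
def continued_log_gcd(a, b):
    """Iterate (a, b) -> (2^k b, a - 2^k b) with the largest k such that
    (b << k) <= a, using only shifts and subtractions.  Returns the final
    value and the number of steps; the final value equals gcd(a, b) up to
    a power of 2."""
    steps = 0
    while b:
        if a < b:
            a, b = b, a
        k = 0
        while (b << (k + 1)) <= a:
            k += 1
        a, b = b << k, a - (b << k)
        steps += 1
    return a, steps

def odd_part(n):
    """Remove all factors of 2 (the dyadic part the thesis analyses)."""
    while n and n % 2 == 0:
        n //= 2
    return n

value, steps = continued_log_gcd(1001, 91)
# odd_part(value) coincides with odd_part(gcd(1001, 91)) = 91.
```

Odd primes dividing both components are preserved by each step, which is why the odd part of the result matches the odd part of the gcd even though powers of 2 may accumulate.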
7

Pariente, Cesar Alberto Bravo. "Um método probabilístico em combinatória." Universidade de São Paulo, 1996. http://www.teses.usp.br/teses/disponiveis/45/45132/tde-07052010-163719/.

Abstract:
The following work is an effort to present, in survey form, a collection of results that illustrate the application of a certain probabilistic method in combinatorics. We do not present new results in the area; however, we believe that the systematic presentation of these results can help those who use probabilistic methods comprehend this useful technique. The results we refer to have appeared over the last decade in the research literature and were used in the investigation of problems which have resisted other, more classical, approaches. Instead of theorizing about the method, we adopt the strategy of presenting three problems, using them as practical examples of the application of the method in question. Surprisingly, despite the difficulty of their solutions, these problems share the characteristic of being formulable very intuitively, as we will see in Chapter One. We should warn the reader that although the problems which drive our discussion belong to fields as different as number theory, geometry, and combinatorics, our goal is to emphasize what their solutions have in common and not the subsequent implications that these problems have in their respective fields. Occasionally, we comment on other potential applications of the tools used to solve these problems. The problems we discuss are characterized by the decades-long wait for their solution: the first, from number theory, arose from the research in Fourier series conducted by Sidon at the beginning of the century and was proposed by him to Erdős in 1932. Since 1950 there have been various advances on this problem, but the result we discuss dates from 1981. The second problem, from geometry, is a conjecture formulated in 1951 by Heilbronn and finally refuted in 1982. The last problem, from combinatorics, is a conjecture of Erdős and Hanani from 1963 that was treated in several particular cases but only solved in full generality in 1985.
8

Schimit, Pedro Henrique Triguis. "Modelagem e controle de propagação de epidemias usando autômatos celulares e teoria de jogos." Universidade de São Paulo, 2010. http://www.teses.usp.br/teses/disponiveis/3/3139/tde-05122011-153541/.

Abstract:
The spreading of contagious diseases is studied by using susceptible-infected-recovered (SIR) models represented by ordinary differential equations (ODEs) and by probabilistic cellular automata (PCA) connected by random networks. Each individual (cell) of the PCA lattice experiences the influence of others, where the probability of interacting with the nearest ones is higher. Simulations are performed to investigate how disease propagation is affected by the coupling topology of the population. The numerical results obtained with the model based on randomly connected PCA are compared to the results obtained with the model described by ODEs. It is concluded that taking the topological structure of the population into account can make it harder to characterize the disease from the observed time evolution of the number of infected individuals. It is also concluded that isolating a few infected subjects can have the same effect as isolating many susceptible individuals. Furthermore, a vaccination strategy based on game theory is analyzed. In this game, the government tries to minimize the expense of controlling the epidemic. As a consequence, the government implements quasi-periodic vaccination campaigns.
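A minimal probabilistic-cellular-automaton SIR step can be sketched as follows. The ring topology, the two-cell neighbourhood, and the parameters β (per-contact infection probability) and γ (recovery probability) are illustrative assumptions of ours; the thesis couples cells through random networks with distance-weighted contacts:

```python
import random

def step(grid, beta, gamma, rng):
    """One synchronous PCA update: 0 = susceptible, 1 = infected,
    2 = recovered, on a ring with nearest-neighbour contacts."""
    n = len(grid)
    new = grid[:]
    for i, state in enumerate(grid):
        if state == 0:  # susceptible: infected with prob 1-(1-beta)^k
            k = sum(1 for j in (i - 1, i + 1) if grid[j % n] == 1)
            if rng.random() < 1 - (1 - beta) ** k:
                new[i] = 1
        elif state == 1:  # infected: recovers with prob gamma
            if rng.random() < gamma:
                new[i] = 2
    return new

rng = random.Random(0)
grid = [0] * 200
grid[100] = 1           # seed a single infected individual
history = []
for _ in range(300):
    history.append(grid.count(1))
    grid = step(grid, beta=0.6, gamma=0.1, rng=rng)
```

Replacing the ring neighbourhood with random long-range links is the kind of topology change whose effect on `history` the thesis compares against the ODE model.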
9

Silva, Everton Juliano da. "Uma demonstração analítica do teorema de Erdös-Kac." Universidade de São Paulo, 2014. http://www.teses.usp.br/teses/disponiveis/45/45131/tde-24032015-132813/.

Abstract:
In number theory, the Erdős–Kac theorem, also known as the fundamental theorem of probabilistic number theory, states that if ω(n) denotes the number of distinct prime factors of n, then the sequence of distribution functions F_N defined by F_N(x) = (1/N) #{n ≤ N : (ω(n) − log log N)/(log log N)^(1/2) ≤ x} converges uniformly on R to the standard normal distribution. In this work we develop all the theorems needed for an analytic proof, which allows us to find the order of the error in the above convergence.
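The statement can be checked empirically. This sketch is our illustration (N = 10^5 is an arbitrary choice, and the log log N convergence is very slow): it sieves ω(n), standardizes it as in the theorem, and evaluates the empirical distribution function F_N at x = 0, which should drift toward Φ(0) = 0.5 as N grows:

```python
import math

def omega_counts(N):
    """omega(n) = number of distinct prime factors of n, via a sieve."""
    w = [0] * (N + 1)
    for p in range(2, N + 1):
        if w[p] == 0:                 # p is prime
            for m in range(p, N + 1, p):
                w[m] += 1
    return w

N = 10 ** 5
w = omega_counts(N)
mu = math.log(math.log(N))            # centering: log log N
sd = math.sqrt(mu)                    # scaling: (log log N)^(1/2)
# Empirical distribution function F_N evaluated at x = 0:
F0 = sum(1 for n in range(1, N + 1) if (w[n] - mu) / sd <= 0) / N
# Erdos-Kac predicts F_N(0) -> 0.5, though the approach is slow in N.
```

Evaluating F_N at several x and comparing against the normal CDF gives a crude picture of the error term that the thesis quantifies analytically.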
10

Shi, Lingsheng. "Numbers and topologies." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät II, 2003. http://dx.doi.org/10.18452/14871.

Abstract:
In graph Ramsey theory, Burr and Erdős in the 1970s posed two conjectures which may be considered initial steps toward the problem of characterizing the set of graphs whose Ramsey numbers grow linearly in their orders. One conjecture is that Ramsey numbers grow linearly for all degenerate graphs, and the other is that Ramsey numbers grow linearly for cubes. Though unable to settle these two conjectures, we have contributed many weaker versions that support the likely truth of the first conjecture, and we have obtained a polynomial upper bound for the Ramsey numbers of cubes that considerably improves all previous bounds and comes close to the linear bound in the second conjecture. In topological Ramsey theory, Kojman recently observed a topological converse of Hindman's theorem and then introduced the so-called Hindman space and van der Waerden space (both of which are stronger than sequentially compact spaces), corresponding respectively to Hindman's theorem and van der Waerden's theorem. In this thesis, we strengthen the topological converse of Hindman's theorem by using the canonical Ramsey theorem, and we introduce differential compactness, which arises naturally in this context, studying its relations to other spaces as well. Also, using compact dynamical systems, we extend a classical Ramsey-type theorem of Brown and of Hindman et al. on piecewise syndetic sets from natural numbers and discrete semigroups to locally connected semigroups.
More sources

Books on the topic "Probabilistic number theory"

1

Kubilius, Jonas. Analiziniai ir tikimybiniai metodai skaičių teorijoje: Trečiosios tarptautines konferencijos J. Kubiliaus garbei darbų rinkinys / redaktoriai, A. Dubickas, A. Laurinčikas, E. Manstavičius = Analytic and probabilistic methods in number theory : proceedings of the third international conference in honour of J. Kubilius, Palanga, Lithuania, 24-28 September 2001. Vilnius: TEV, 2002.

2

Kubilius, Jonas, Antanas Laurinčikas, E. Manstavičius, and V. Stakėnas, eds. Analytic and probabilistic methods in number theory: Proceedings of the second international conference in honour of J. Kubilius, Palanga, Lithuania, 23-27 September 1996. Vilnius, Lithuania: TEV, 1997.

3

Tenenbaum, Gérald. Introduction à la théorie analytique et probabiliste des nombres. 2nd ed. Paris: Société Mathématique de France, 1995.

4

Laurinčikas, E., E. Manstavičius, and V. Stakėnas, eds. Analytic and Probabilistic Methods in Number Theory. Berlin, Boston: De Gruyter, 1997. http://dx.doi.org/10.1515/9783110944648.

5

Zhang, Wen-Bin (1940–), ed. Number theory arising from finite fields: Analytic and probabilistic theory. New York: Marcel Dekker, 2001.

6

Boultbee, R. Rounded numbers. [Toronto, Ont.?: s.n.], 1990.

7

International Conference "Functions in Number Theory and Their Probabilistic Aspects" (2010, Kyoto, Japan). Functions in Number Theory and Their Probabilistic Aspects, December 13-17, 2010. Kyoto, Japan: Research Institute for Mathematical Sciences, Kyoto University, 2012.

8

Luca, Florian, ed. Analytic number theory: Exploring the anatomy of integers. Providence, R.I.: American Mathematical Society, 2012.

9

Applebaum, David (1956–), Michael Schürmann (1955–), and Uwe Franz, eds. Quantum independent increment processes. Berlin: Springer, 2005.

10

Barndorff-Nielsen, O. E., Michael Schürmann, and Uwe Franz, eds. Quantum independent increment processes. Berlin: Springer, 2006.

More sources

Book chapters on the topic "Probabilistic number theory"

1

Murty, M. Ram, and V. Kumar Murty. "Probabilistic Number Theory." In The Mathematical Legacy of Srinivasa Ramanujan, 149–53. India: Springer India, 2012. http://dx.doi.org/10.1007/978-81-322-0770-2_11.

2

Boston, Nigel. "A probabilistic generalization of the Riemann zeta function." In Analytic Number Theory, 155–62. Boston, MA: Birkhäuser Boston, 1996. http://dx.doi.org/10.1007/978-1-4612-4086-0_8.

3

Elliott, P. D. T. A. "Progress in Probabilistic Number Theory." In Grundlehren der mathematischen Wissenschaften, 423–48. New York, NY: Springer New York, 1985. http://dx.doi.org/10.1007/978-1-4613-8548-6_25.

4

Kubilius, Jonas. "Recent Progress in Probabilistic Number Theory." In Asymptotic Methods in Probability and Statistics with Applications, 507–19. Boston, MA: Birkhäuser Boston, 2001. http://dx.doi.org/10.1007/978-1-4612-0209-7_36.

5

Indlekofer, Karl-Heinz. "A New Method in Probabilistic Number Theory." In Probability Theory and Applications, 299–308. Dordrecht: Springer Netherlands, 1992. http://dx.doi.org/10.1007/978-94-011-2817-9_19.

6

Kubilius, J. "On some inequalities in the probabilistic number theory." In Lecture Notes in Mathematics, 214–20. Berlin, Heidelberg: Springer Berlin Heidelberg, 1988. http://dx.doi.org/10.1007/bfb0078476.

7

Nora, Pedro, Jurriaan Rot, Lutz Schröder, and Paul Wild. "Relational Connectors and Heterogeneous Simulations." In Lecture Notes in Computer Science, 111–32. Cham: Springer Nature Switzerland, 2025. https://doi.org/10.1007/978-3-031-90897-2_6.

Abstract: While behavioural equivalences among systems of the same type, such as Park/Milner bisimilarity of labelled transition systems, are an established notion, a systematic treatment of relationships between systems of different types is currently missing. We provide such a treatment in the framework of universal coalgebra, in which the type of a system (nondeterministic, probabilistic, weighted, game-based etc.) is abstracted as a set functor: We introduce relational connectors among set functors, which induce notions of heterogeneous (bi)simulation among coalgebras of the respective types. We give a number of constructions on relational connectors. In particular, we identify composition and converse operations on relational connectors; we construct corresponding identity relational connectors, showing that the latter generalize the standard Barr extension of weak-pullback-preserving functors; and we introduce a Kantorovich construction in which relational connectors are induced from relations between modalities. For Kantorovich relational connectors, one has a notion of dual-purpose modal logic interpreted over both system types, and we prove a corresponding Hennessy-Milner-type theorem stating that generalized (bi)similarity coincides with theory inclusion on finitely-branching systems. We apply these results to a number of example scenarios involving labelled transition systems with different label alphabets, probabilistic systems, and input/output conformances.
8

Guan, Ji, and Nengkun Yu. "A Probabilistic Logic for Verifying Continuous-time Markov Chains." In Tools and Algorithms for the Construction and Analysis of Systems, 3–21. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-99527-0_1.

Abstract: A continuous-time Markov chain (CTMC) execution is a continuous class of probability distributions over states. This paper proposes a probabilistic linear-time temporal logic, namely continuous-time linear logic (CLL), to reason about the probability distribution execution of CTMCs. We define the syntax of CLL on the space of probability distributions. The syntax of CLL includes multiphase timed until formulas, and the semantics of CLL allows time reset to study relatively temporal properties. We derive a corresponding model-checking algorithm for CLL formulas. The correctness of the model-checking algorithm depends on Schanuel’s conjecture, a central open problem in transcendental number theory. Furthermore, we provide a running example of CTMCs to illustrate our method.
9

Devroye, Luc, László Györfi, and Gábor Lugosi. "Uniform Laws of Large Numbers." In A Probabilistic Theory of Pattern Recognition, 489–506. New York, NY: Springer New York, 1996. http://dx.doi.org/10.1007/978-1-4612-0711-5_29.

10

Fullér, Robert, István Á. Harmati, and Péter Várlaki. "On Probabilistic Correlation Coefficients for Fuzzy Numbers." In Aspects of Computational Intelligence: Theory and Applications, 249–63. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-30668-6_17.


Conference papers on the topic "Probabilistic number theory"

1

Raimondo, Eleonora, Andrea Grimaldi, Anna Giordano, Massimo Chiappini, Mario Carpentieri, and Giovanni Finocchio. "Random Number Generation Driven by Voltage-Controlled Magnetic Anisotropy and Their use in Probabilistic Computing." In 2024 IEEE 24th International Conference on Nanotechnology (NANO), 326–30. IEEE, 2024. http://dx.doi.org/10.1109/nano61778.2024.10628878.

2

Brenna, Andrea, Luciano Lazzari, and Marco Ormellese. "Probabilistic Model Based on Markov Chain for the Assessment of Localized Corrosion of Stainless Steels." In CORROSION 2014, 1–15. NACE International, 2014. https://doi.org/10.5006/c2014-4091.

Abstract: Pitting, crevice and stress corrosion cracking are the most damaging corrosion forms of stainless steels in industrial applications. Generally, pitting and crevice susceptibility depends on a variety of factors related to the metal (chemical composition, differences in the metallurgical structure, inclusions), the environment (chloride content, pH, temperature, differential aeration) and the geometry of the system. Due to their unpredictable occurrence, localized corrosion events cannot be explained without using a proper statistical method. In this work a probabilistic approach based on Markov chains for the assessment of pitting and crevice corrosion initiation is proposed. A Markov chain is a stochastic process that undergoes transitions from one state to another through a finite number of possible states, until a so-called "absorbing state" from which the system has no tendency to evolve is attained. Formally, a Markov chain is characterized by a set of states and a transition probability matrix. The model calculates the probability to have pitting (and, vice versa, to maintain a stable passive condition) involving a large number of operating parameters related to both metal and environment. Experimental tests were carried out to validate the model, which requires more accurate investigations.
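The absorbing-chain idea described in this abstract can be sketched in a few lines: iterate a transition matrix until most of the probability mass lands in the absorbing "pitting" state. The states and transition probabilities below are invented for illustration and are not the calibrated model from the paper.

```python
# Minimal absorbing Markov chain sketch (illustrative states and
# probabilities only; not the model fitted in the paper).
STATES = ["passive", "metastable", "pitting"]  # "pitting" is absorbing

# Row i gives transition probabilities out of state i; each row sums to 1.
P = [
    [0.90, 0.08, 0.02],  # passive -> passive / metastable / pitting
    [0.30, 0.50, 0.20],  # metastable may repassivate or pit
    [0.00, 0.00, 1.00],  # pitting: absorbing, no tendency to evolve
]

def step(dist, P):
    """One transition: new_j = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0, 0.0]  # start fully in the passive state
for _ in range(50):     # evolve for 50 time steps
    dist = step(dist, P)

pitting_probability = dist[2]  # mass absorbed into "pitting" so far
```

As the number of steps grows, all probability mass eventually drains into the absorbing state, which is what makes the time-to-absorption a natural model for pit initiation.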
3

Krynicki, J. W., K. E. Bagnoli, and J. E. McLaughlin. "Probabilistic Risk Based Approach for Performing an Onstream High Temperature Hydrogen Attack Inspection." In CORROSION 2006, 1–10. NACE International, 2006. https://doi.org/10.5006/c2006-06580.

Abstract: A risk based methodology is proposed for the assessment of HTHA (high temperature hydrogen attack). This approach permits the ranking of probability in a semi-quantitative fashion for establishing inspection requirements. The assessment methodology relates the Pv parameter to the operating limits/curves in API 941, where Pv correlates temperature and hydrogen partial pressure to the time for incipient attack. This, in turn, is correlated to the probability of failure as described on a risk matrix. The results are then used to establish inspection frequency, extent, and priority. The approach outlined here was applied to a C-0.5Mo pressure vessel operating above the carbon steel API 941 curve. Due to the variable performance of C-0.5Mo steel in hydrogen service there have been a number of unfavorable service experiences with this material, resulting in the removal of the C-0.5Mo curve from API 941. Risk analysis of the pressure vessel evaluated in this study indicated the need for inspection. The inspection for HTHA is traditionally performed when equipment is out of service. However, an on-stream inspection has the advantage of identifying damage before the equipment is shut down, thereby providing adequate lead-time to prepare for replacement or repair. An on-stream inspection program was developed, qualified, and successfully field deployed for the aforementioned vessel, which operates up to 750°F (400°C). The inspection approach relied on the ultrasonic backscatter spectrum analysis method for the base metal inspection and time of flight diffraction – ultrasonic testing (TOFD UT) for weld inspection. This paper will describe the approach and results from the on-stream vessel inspection and a subsequent off-stream inspection, which validated the on-stream results.
4

Jansen, H. J. M., and G. J. J. van der Schot. "Life Extension of Degraded Main Oil Line Pipeline Sections Through Improved Risk Based Integrity Management." In CORROSION 2005, 1–15. NACE International, 2005. https://doi.org/10.5006/c2005-05143.

Abstract: Intelligent pig inspections between 2000 and 2003 revealed severe corrosion damage in a number of Main Oil Line pipeline sections, to such a level that continued operation was jeopardized. Replacement of 3 Main Oil Line sections was planned for 2004 because of integrity problems. In order to overcome the direct integrity threat, specialized defect assessment was introduced in 2002 to reduce the conservatism in the defect assessment with respect to defect interaction rules. Continued operation could be justified by this special analysis. In 2003, a special hydraulic analysis applying the "Head versus Distance" methodology was introduced to review the pressure safeguarding of the system, which allowed derating of one of the Main Oil Line sections, thus increasing the corrosion tolerance of the section. In addition, a new risk based "Inspect-Repair" strategy was developed to manage the integrity of the corroding pipelines longer-term. This strategy was incorporated in the Company's Pipe-RBA (Risk Based Assessment) methodology and, although it is not based on a probabilistic assessment, it does take into account the probabilistic nature of corrosion growth. Robust permanent repair strategies were developed for both internal and external defects, including the quality control aspects of applying the repairs. As a result of all this work, the life of 2 lines is extended by at least 10 years but possibly indefinitely, and the life of 1 line is extended by at least 7 years. Total replacement cost of the 3 lines is $71 million. This paper outlines the risk based methodologies used and their application to two line sections.
5

Flodin, Larkin, and Arya Mazumdar. "Probabilistic Group Testing with a Linear Number of Tests." In 2021 IEEE International Symposium on Information Theory (ISIT). IEEE, 2021. http://dx.doi.org/10.1109/isit45174.2021.9517841.

6

Pigott, Ronald. "Advanced Probabilistic Design of Multi-Degree of Freedom Systems Subjected to a Number of Discreet Excitation Frequencies." In ASME 1997 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 1997. http://dx.doi.org/10.1115/imece1997-0031.

Abstract: From the beginning, engineers have focused on the special case of determinism in the design process, and an enormous methodology has been developed to support this approach. Today, however, customers are demanding greater reliability and are imposing greater penalties for failure. In order to achieve higher reliability, and in order to assess the risk of failure, probabilistic approaches will almost certainly have to be employed. While designers have always used probability in their work, it has usually been done with risk represented in a single factor of safety. This paper focuses on the application of probability theory to the design of multi-degree of freedom systems that are subjected to a number of discreet excitation frequencies. Problems of this nature are often encountered in rotating machines where the excitation frequencies are multiples of the speed of rotation. In a previous paper (Pigott, 1996), only variability in the system natural frequencies was considered, and the probability of exceeding a certain vibratory stress was determined. It was shown that, at the reliability levels of interest, it is necessary to consider multiple mode contributions to the total vibratory stress. In this paper, the previous approach is extended to include variability in the magnitude of the excitation force, system damping, and system fatigue strength, leading to the determination of a total probability of failure due to high cycle fatigue. The results of applying the deterministic analysis, the first level of probabilistic analysis (Pigott, 1996), and the full probabilistic analysis are presented for a sample system. The benefits from using the probabilistic approach for component design are also discussed.
7

Kersten, Daniel, and David C. Knill. "Reflectance estimation and lightness constancy: a probabilistic approach." In OSA Annual Meeting. Washington, D.C.: Optica Publishing Group, 1987. http://dx.doi.org/10.1364/oam.1987.thc2.

Abstract:
The image of a natural scene can be considered a product of reflectance and effective illumination functions. One goal of image understanding is the estimation of the reflectance and illumination from image luminance data. Human observers are remarkably good at inferring reflectance changes from an image. One well-known aspect of this ability is lightness constancy. However, because there are two unknowns for every data point in the image, this problem is ill-posed, and it does not have a unique solution without prior constraints on the class of reflectance and illumination functions. We use Markov random fields to model and thereby constrain the class of reflectance and illumination functions. Our computational goal is taken to be the maximization of the posterior distribution of the reflectance and illumination conditional on the image. This goal is achieved using stochastic relaxation. This approach provides a general framework for quantifying the computational theory apart from specific algorithmic implementations for a number of problems of early vision.
8

Karge, Jonas, and Sebastian Rudolph. "The More the Worst-Case-Merrier: A Generalized Condorcet Jury Theorem for Belief Fusion." In 19th International Conference on Principles of Knowledge Representation and Reasoning {KR-2022}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/kr.2022/21.

Abstract: In multi-agent belief fusion, there is increasing interest in results and methods from social choice theory. As a theoretical cornerstone, the Condorcet Jury Theorem (CJT) states that given a number of equally competent, independent agents, where each is more likely to guess the true state out of two alternatives, the chances of determining this objective truth by majority voting increase with the number of participating agents, approaching certainty. Past generalizations of the CJT have shown that some of its underlying assumptions can be weakened. Motivated by requirements from practical belief fusion scenarios, we provide a significant further generalization that subsumes several of the previous ones. Our considered setting simultaneously allows for heterogeneous competence levels across the agents (even tolerating entirely incompetent or even malicious voters), and voting for any number of alternatives from a finite set. We derive practical lower bounds for the numbers of agents needed to give probabilistic guarantees for determining the true state through approval voting. We also demonstrate that the non-asymptotic part of the CJT fails in our setting for arbitrarily high numbers of voters.
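The classical two-alternative CJT summarized in this abstract can be checked numerically: with n independent voters, each correct with probability p > 1/2, the majority is correct with probability given by a binomial tail sum, and that probability grows toward 1 with n. The sketch below uses arbitrary example values and illustrates only the classical theorem, not the paper's generalization.

```python
from math import comb

def majority_correct(n: int, p: float) -> float:
    """Probability that a strict majority of n independent voters,
    each correct with probability p, picks the true alternative.
    Using odd n avoids ties."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# With p = 0.6, the majority's accuracy rises toward 1 as n grows.
probs = [majority_correct(n, 0.6) for n in (1, 3, 11, 101)]
```

For a single voter the majority just is that voter, so the probability equals p; each larger odd jury is strictly more reliable, which is the asymptotic half of the theorem.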
9

He, Jian Wen, and Ying Min Low. "Probabilistic Assessment of the Clashing Between Flexible Marine Risers." In ASME 2010 29th International Conference on Ocean, Offshore and Arctic Engineering. ASMEDC, 2010. http://dx.doi.org/10.1115/omae2010-20046.

Abstract:
Flexible marine risers are compliant to external forces from waves, current and platform motions, and clashing between risers is an important concern. In deepwater developments where the number of connected risers is large, it is not economical to space them too far apart. In this regard, it is necessary to establish the probability of riser clashing throughout the service life; however, at present there appears to be no systematic procedure for assessing this risk. This paper presents a novel procedure for estimating the probability of riser clashing based on post-processing results obtained from time domain simulations of flexible risers subjected to random wave loads. First, an efficient technique is employed to sieve out critical pairs among riser elements. From these element pairs, the time history of a normalized clearance parameter is derived from the nodal displacements of the elements. Subsequently, the mean up-crossing rate of this parameter is extracted and extrapolated to the threshold of clashing using extreme value theory. As this method is still in its early developmental stage, critical effects such as vortex-induced vibrations and wake interference will not be considered in the present work.
10

Skelonis, Carl D., M. Brett Shelton, and Glenn T. Burney. "A Probabilistic Gas Touched Length Analysis." In ASME 2011 Power Conference collocated with JSME ICOPE 2011. ASMEDC, 2011. http://dx.doi.org/10.1115/power2011-55298.

Abstract: Field measurements of the steamside oxide thickness for high temperature (> 850°F) boiler tubing subject to the accumulation of creep damage are often made to support deterministic assessments of the remaining life. Most often, these inspections are undertaken to understand the condition of the tubing at some particular location along a circuit, often as a result of a tube failure. The life assessment is based on relationships that have been developed between oxide growth kinetics and temperature. Unfortunately, because of variability in the oxide-temperature relationships reflecting different original data sets, and because of the inherent uncertainty in materials properties where heat-specific test data is not available, there typically exists a broad range of uncertainty in the deterministic assessment results. Large utility-type boilers typically contain a number of high temperature sections, including various stages of superheat and reheat, each of which will contain miles of tubing. Since the temperature derived from an oxide thickness measurement is relevant only to the specific location where the measurement was made, the deterministically derived life calculation is also specific to that location. As a result, the attempt to draw conclusions regarding the condition of an entire superheater or reheater section from measurements made at only one or two locations in those sections is fraught with difficulties. It is for this reason that the Probabilistic Gas Touched Length Analysis model has been developed. This model makes it possible to calculate creep damage accumulation/remaining life at any point along the steam path. Oxide thickness data and operating data are the primary operating inputs into the model, which performs heat transfer calculations at user-defined locations along the length of the tube circuit.

The model applies statistical methods to evaluate variations in operating conditions as well as in physical and mechanical properties, using a Monte Carlo simulation to generate values for the probability of failure at selected locations. This paper will discuss the limitations of the existing approach to estimating the remaining life of high temperature boiler tubing and present the underpinning theory of the gas touched length analysis model. A case study showing the analysis results is included.

Reports by organizations on the topic "Probabilistic number theory"

1

Hale, Christie, Norman Abrahamson, and Yousef Bozorgnia. Probabilistic Seismic Hazard Analysis Code Verification. Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, March 2018. http://dx.doi.org/10.55461/kjzh2652.

Abstract: Over the past decade, the use of Probabilistic Seismic Hazard Analysis (PSHA) to assess seismic hazards has expanded, leading to the creation of a number of new PSHA computer codes throughout the industry. Additionally, recent seismic source and ground-motion characterization studies have led to more complex source and ground-motion models, which necessitate implementation in PSHA codes. This project was undertaken to update previous PSHA computer code verification efforts by running an expanded set of verification tests on codes currently in use for PSHA calculations. Following an announcement to the community, fifteen owners of PSHA codes from private consulting companies, academic institutions, risk analysis firms, and government agencies participated in the verification project by running verification tests on their own codes. The project included three sets of tests that increased in complexity from the first test in Set 1 to the last test in Set 3. Over the course of the project the group held ten meetings to discuss and finalize the results. Tests were often re-run several times before the results for all codes were finalized. This report documents the specifications and benchmark answers for the verification tests. Common issues and programming errors are also summarized, along with standard modeling approaches and key discussion points from the meetings. Where differences in modeling approaches lead to differences in reported hazard, those different modeling approaches are described. Through participation in the project, code owners verified the primary functions of their codes as benchmark answers were reached. The PSHA codes developed in the future can be verified by running the tests and comparing the results to the benchmark answers documented in this report. Note: the scope of this project is PSHA computer code verification. This project does not make recommendations on how to model earthquake scenarios from the specified source-characterization or ground-motion characterization inputs.
2

Mylne, Ken. Communicating Probability Forecasts – will people understand? Met Office, December 2024. https://doi.org/10.62998/bwjw5735.

Abstract: Executive Summary: “People don’t understand probabilities” – or do they? Weather forecasting science has long been developing ensemble forecasts as a way to improve forecast capability and provide better information to support users’ decisions. The science is well proven and, indeed, the Met Office will soon move to an ensemble-only NWP (Numerical Weather Prediction) system. Ensemble forecasts can be used in a number of ways, but fundamentally they provide a probabilistic picture of the weather forecast, including a most likely outcome and information on the confidence, uncertainty or risks associated with forecast outcomes. In order to pull through the full benefits of this information it is important to communicate it effectively to users so that they can make appropriate risk-based decisions. There is a widely-held belief that people will find probabilistic information hard to understand or make use of, which provides a significant obstacle to communicating it. This challenge for ensemble or probabilistic forecasts has long been recognised and there has been extensive research conducted into effective communication and people’s understanding of such forecasts, including several papers led or sponsored by the Met Office. This paper offers a review of that research to help guide future communications of forecasts. The overwhelming and consistent conclusion found in the literature is that people do understand the probabilistic information and make better decisions when presented with it, provided that the information is presented appropriately. Key conclusions include:
• Nearly all of the studies indicate that people make better decisions, have more trust in information, and/or display more understanding of forecast information when forecasters use probability information in place of deterministic statements.
• Providing additional information on uncertainty does not lead to confusion and misinterpretation compared to simple deterministic forecasts.
• The inclusion of a numerical probability (e.g. 30%) alongside a visual or worded description can greatly help with correct interpretation; using both forms helps ensure that both more and less numerate individuals will understand the message.
• Careful choice of language helps to promote understanding, e.g. some people may be put off by “30% probability”, which they consider to be mathematical, but are quite comfortable with “30% chance” and interpret it correctly.
• It is important to clearly define the events to which probabilities apply, and the way in which forecasters frame messages can influence how audiences interpret risks.
Overall, the literature review provides strong support for communicating probabilistic information to forecast users, including the general public. It does not support the idea that people’s understanding should be a barrier to communicating such information. While not every single person will understand or take full advantage of the additional information, most people will benefit and make better decisions as a result. The review also offers a number of suggestions for optimising effective communication.
3

Darling, Arthur H., Diego J. Rodríguez, and William J. Vaughan. Uncertainty in the Economic Appraisal of Water Quality Improvement Investments: The Case for Project Risk Analysis. Inter-American Development Bank, July 2000. http://dx.doi.org/10.18235/0008825.

Abstract: This technical paper argues that Monte Carlo risk analysis offers a more comprehensive and informative way to look at project risk ex-ante than the traditional (and often arbitrary) one-influence-at-a-time sensitivity analysis approach customarily used in IDB analyses of economic feasibility. The case for probabilistic risk analysis is made using data from a project for cleaning up the Tietê river in São Paulo, Brazil. A number of ways to handle uncertainty about benefits are proposed, and their implications for the project acceptance decision and the consequent degree of presumed project risk are explained and illustrated.
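The contrast this abstract draws between one-influence-at-a-time sensitivity analysis and Monte Carlo risk analysis can be sketched in a few lines: draw the uncertain inputs from distributions, compute the project NPV for each draw, and read off the probability of a negative outcome. All distributions and parameter values below are invented for illustration and have no connection to the Tietê project data.

```python
import random

random.seed(42)  # reproducible draws

def simulate_npv(trials: int = 10_000,
                 rate: float = 0.12,   # hypothetical discount rate
                 years: int = 20) -> list:
    """Monte Carlo sketch: project NPV under uncertain cost and benefits.
    Each trial draws one capital cost and one annual net benefit
    (held constant over the horizon for simplicity)."""
    npvs = []
    for _ in range(trials):
        capex = random.gauss(100, 10)   # capital cost, made-up units
        annual = random.gauss(18, 6)    # annual net benefit, made-up
        npv = -capex + sum(annual / (1 + rate) ** t
                           for t in range(1, years + 1))
        npvs.append(npv)
    return npvs

npvs = simulate_npv()
prob_negative = sum(1 for v in npvs if v < 0) / len(npvs)
```

Unlike a one-at-a-time sensitivity table, the resulting distribution of `npvs` reflects all sources of uncertainty jointly; `prob_negative` is one convenient summary of project risk.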
4

Biddlecom, Ann, Taylor Riley, Jacqueline E. Darroch, Elizabeth A. Sully, Vladimíra Kantorová, and Mark C. Wheldon. Future Scenarios of Adolescent Contraceptive Use, Cost and Impact in Developing Regions. Guttmacher Institute, August 2018. http://dx.doi.org/10.1363/2018.29732.

Abstract: This report presents scenarios of adolescent contraceptive use through 2030 to highlight the potential impact and costs associated with overall increased contraceptive use among adolescents and an increased use of long-acting, reversible contraceptives (LARCs), specifically. Under a scenario that assumes the most likely level of modern contraceptive use to be reached in a particular year (median values of probabilistic projections), the number of adolescent women using modern contraceptives in developing regions would reach 19.8 million in 2030, and 57% of adolescent women would have their need for modern contraception met. The total annual cost of services in 2030 for the projected 19.8 million modern method users would be an estimated $310 million. The cost would be lower, at $275 million, if 20% of adolescent women using short-acting methods were to choose LARCs. An estimated 7.1 million unintended pregnancies would be averted under this scenario. Because LARCs are highly effective, a shift toward use of these methods would avert an additional 300,000 unintended pregnancies. Under a scenario with accelerated growth in modern contraceptive use among adolescent women in developing regions, the number of modern method users would reach 27.1 million in 2030, and the proportion of adolescent women whose need for modern contraception would be met would rise to 79%. Contraceptive services for the 27.1 million modern method users in 2030 would cost an estimated $412 million. The cost would drop to $365 million under an assumption of increased LARC use. In 2030, an estimated 9.6 million unintended pregnancies would be averted under this accelerated growth scenario, and an additional 400,000 unintended pregnancies would be averted with a shift to LARC use.
5

Mazzoni, Silvia, Nicholas Gregor, Linda Al Atik, Yousef Bozorgnia, David Welch, and Gregory Deierlein. Probabilistic Seismic Hazard Analysis and Selecting and Scaling of Ground-Motion Records (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/zjdn7385.

Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measure and assess the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) the computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 3 (WG3), Task 3.1: Selecting and Scaling Ground-motion records. The objective of Task 3.1 is to provide suites of ground motions to be used by other working groups (WGs), especially Working Group 5: Analytical Modeling (WG5) for Simulation Studies. 
The ground motions used in the numerical simulations are intended to represent seismic hazard at the building site. The seismic hazard is dependent on the location of the site relative to seismic sources, the characteristics of the seismic sources in the region and the local soil conditions at the site. To achieve a proper representation of hazard across the State of California, ten sites were selected, and a site-specific probabilistic seismic hazard analysis (PSHA) was performed at each of these sites for both a soft soil (Vs30 = 270 m/sec) and a stiff soil (Vs30=760 m/sec). The PSHA used the UCERF3 seismic source model, which represents the latest seismic source model adopted by the USGS [2013] and NGA-West2 ground-motion models. The PSHA was carried out for structural periods ranging from 0.01 to 10 sec. At each site and soil class, the results from the PSHA—hazard curves, hazard deaggregation, and uniform-hazard spectra (UHS)—were extracted for a series of ten return periods, prescribed by WG5 and WG6, ranging from 15.5–2500 years. For each case (site, soil class, and return period), the UHS was used as the target spectrum for selection and modification of a suite of ground motions. Additionally, another set of target spectra based on “Conditional Spectra” (CS), which are more realistic than UHS, was developed [Baker and Lee 2018]. The Conditional Spectra are defined by the median (Conditional Mean Spectrum) and a period-dependent variance. A suite of at least 40 record pairs (horizontal) were selected and modified for each return period and target-spectrum type. Thus, for each ground-motion suite, 40 or more record pairs were selected using the deaggregation of the hazard, resulting in more than 200 record pairs per target-spectrum type at each site. The suites contained more than 40 records in case some were rejected by the modelers due to secondary characteristics; however, none were rejected, and the complete set was used. 
For the case of UHS as the target spectrum, the selected motions were modified (scaled) such that the average of the median spectra (RotD50) [Boore 2010] of the ground-motion pairs follows the target spectrum closely within the period range of interest to the analysts. In communication with WG5 researchers, a period range of 0.01–2.0 sec was selected for ground-motion (time-series) selection and modification for this specific application of the project. The duration metrics and pulse characteristics of the records were also considered in the final selection of ground motions. The damping ratio for the PSHA and the ground-motion target spectra was set to 5%, which is standard practice in engineering applications. For the cases where the CS was used as the target spectrum, the ground-motion suites were selected and scaled using a modified version of the conditional-spectrum ground-motion selection tool (CS-GMS tool) developed by Baker and Lee [2018]. This tool selects and scales a suite of ground motions to match both the median and a user-defined variability, where the variability is defined by the correlation relationship developed by Baker and Jayaram [2008]. The computation of the CS requires a conditioning structural period. In collaboration with WG5 researchers, a conditioning period of 0.25 sec was selected as representative of the fundamental mode of vibration of the buildings of interest in this study. Working Group 5 carried out a sensitivity analysis using other conditioning periods; the results and a discussion of the selection of the conditioning period are reported in Section 4 of the WG5 PEER report entitled Technical Background Report for Structural Analysis and Performance Assessment. This WG3.1 report presents a summary of the selected sites, the seismic-source characterization model, and the ground-motion characterization model used in the PSHA, followed by the selection and modification of the suites of ground motions.
The Record Sequence Number (RSN) and the associated scale factors are tabulated in the Appendices of this report, and the actual time-series files can be downloaded from the PEER ground-motion database portal (https://ngawest2.berkeley.edu/).
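The amplitude-scaling step described above (scaling each record so the suite's median spectrum tracks the UHS target over 0.01–2.0 sec) can be illustrated with a minimal sketch. This is not the project's actual selection tool; the period grid, target shape, and synthetic record spectra below are hypothetical placeholders, and the least-squares-in-log-space scale factor is one common, simple choice.

```python
import numpy as np

def scale_factor(record_sa, target_sa):
    # Least-squares amplitude scale in log space: minimizing
    # sum_T (ln(f * Sa_rec(T)) - ln(Sa_tgt(T)))^2 over periods T gives
    # ln f = mean_T (ln Sa_tgt - ln Sa_rec).
    return np.exp(np.mean(np.log(target_sa) - np.log(record_sa)))

# Hypothetical period grid covering the 0.01-2.0 sec range of interest
periods = np.logspace(np.log10(0.01), np.log10(2.0), 50)

# Toy target UHS shape (in g) and 40 synthetic record spectra scattered about it
rng = np.random.default_rng(0)
target = 0.8 * np.exp(-0.5 * np.log(periods / 0.3) ** 2)
records = target * rng.lognormal(mean=0.3, sigma=0.4, size=(40, periods.size))

# One scale factor per record; the scaled suite's geometric mean should then
# follow the target closely across the period range
factors = np.array([scale_factor(sa, target) for sa in records])
scaled = records * factors[:, None]
suite_mean = np.exp(np.mean(np.log(scaled), axis=0))
max_log_misfit = np.max(np.abs(np.log(suite_mean / target)))
```

In practice the real tool also weights periods, enforces bounds on allowable scale factors, and (for CS targets) matches the period-dependent variance, none of which this sketch attempts.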
6

Motamed, Ramin, David McCallen, and Swasti Saxena. An International Workshop on Large-Scale Shake Table Testing for the Assessment of Soil-Foundation-Structure System Response for Seismic Safety of DOE Nuclear Facilities, A Virtual Workshop – 17-18 May 2021. Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, February 2024. http://dx.doi.org/10.55461/jjvo9762.

Abstract:
Aging infrastructure within the US Department of Energy (DOE) and the National Nuclear Security Administration (NNSA) nuclear facilities poses a major challenge to their resiliency against natural phenomena hazards. Examples of mission-critical facilities located in regions of high seismicity can be found at a number of NNSA sites, including Lawrence Livermore National Laboratory, Los Alamos National Laboratory, and the Nevada National Security Site. Most of the nation’s currently operating nuclear facilities have already reached the end of their operating lifetime, and most currently operating nuclear power plants (NPPs) have reached the end of their operating license period. While the domestic demand for electrical energy is expected to grow, if currently operating NPPs do not extend their operations and additional plants are not built quickly enough to replace them, the total fraction of electrical energy generated from carbon-free nuclear power will rapidly decline. The decision to extend operation is ultimately an economic one; however, economics can often be improved through technical advancements (McCarthy et al. 2015) and research and development (R&D) activities. Similarly, the operating lifetime of the current DOE- and NNSA-owned critical infrastructure can be extended using the Probabilistic Risk Assessment (PRA) framework to systematically identify the risks associated with designing and operating existing facilities and building new ones. Applying this framework involves several steps, including (1) system analysis considering the interaction between components, such as evaluating the soil-foundation-structure system response; and (2) assessment of areas of uncertainty. Both of these steps are essential to assessing and reducing risks to DOE and NNSA nuclear facilities.
While the risks to the DOE’s facilities arise primarily from natural hazard phenomena, data from large-scale tests of the soil-foundation-structure system response to seismic shaking are currently lacking. To address these gaps, an international workshop focused on advancing the seismic safety of nuclear facilities through large-scale shake table testing was organized. The workshop, held virtually, brought together a select group of international experts in large-scale shake table testing from the U.S., Japan, and Europe to discuss state-of-the-art experimental techniques and emerging instrumentation technologies that can produce unique experimental data to advance knowledge of the natural hazards that affect the safety of the DOE’s nuclear facilities. The experimental data generated, followed by research and development activities, will ultimately result in updates to ASCE 4-16, one of the primary design guides for DOE nuclear facilities per DOE-STD-1020-2016. The ultimate objective of the workshop was to develop a “road map” for a future experimental campaign and innovative instrumentation using the newly constructed, DOE-funded large-scale shake table facility at the University of Nevada, Reno (UNR), as well as other large-scale shake table testing facilities. This new facility resulted from a collaborative project between UNR and Lawrence Berkeley National Laboratory (LBNL). This report summarizes the proceedings of the workshop and highlights the key outcomes of the presentations and discussions.
7

Burns, Malcom, and Gavin Nixon. Literature review on analytical methods for the detection of precision bred products. Food Standards Agency, September 2023. http://dx.doi.org/10.46756/sci.fsa.ney927.

Abstract:
The Genetic Technology (Precision Breeding) Act (England) aims to develop a science-based process for the regulation and authorisation of precision bred organisms (PBOs). PBOs are created by genetic technologies but exhibit changes which could have occurred through traditional processes. This review, commissioned by the Food Standards Agency (FSA), aims to clarify existing terminologies; explore viable methods for the detection, identification, and quantification of products of precision breeding techniques; address and identify potential solutions to the analytical challenges presented; and provide recommendations for working towards an infrastructure to support detection of precision bred products in the future. The review includes a summary of the terminology relating to analytical approaches for detection of precision bred products; a harmonised set of terminology contributes towards promoting further understanding of the common terms used in genome editing. A review of the current state of the art of potential methods for the detection, identification and quantification of precision bred products in the UK has been provided. Parallels are drawn with the evolution of synergistic analytical approaches for the detection of Genetically Modified Organisms (GMOs), where molecular biology techniques are used to detect DNA sequence changes in an organism’s genome. The scope and limitations of targeted and untargeted methods are summarised. Current scientific opinion supports that modern molecular biology techniques (i.e., quantitative real-time Polymerase Chain Reaction (qPCR), digital PCR (dPCR) and Next Generation Sequencing (NGS)) have the technical capability to detect small alterations in an organism’s genome, given the specific prerequisites of a priori information on the DNA sequence of interest and the associated flanking regions. These techniques also provide the best infrastructure for developing potential approaches for the detection of PBOs.
Should sufficient information be known regarding a sequence alteration, and confidence can be attributed to this being specific to a PBO line, then detection, identification and quantification can potentially be achieved. Genome editing and new mutagenesis techniques are umbrella terms, incorporating a plethora of approaches with diverse modes of action and resultant mutational changes. Generalisations regarding techniques and methods for detection of all PBO products are not appropriate, and each genome edited product may have to be assessed on a case-by-case basis. The application of modern molecular biology techniques, in isolation and by targeting just a single alteration, is unlikely to provide unequivocal evidence of the source of that variation, be that as a result of precision breeding or as a result of traditional processes. In specific instances, detection and identification may be technically possible, if enough additional information is available to prove that a DNA sequence or sequences are unique to a specific genome edited line (e.g., following certain types of Site-Directed Nuclease-3 (SDN-3) based approaches). The scope, gaps, and limitations associated with traceability of PBO products were examined to identify current and future challenges. Alongside these, recommendations were made to provide the infrastructure for working towards a toolkit for the design, development and implementation of analytical methods for detection of PBO products. It is recognised that fully effective methods for PBO detection have yet to be realised, so these recommendations are intended as a tool for progressing the current state of the art of research into such methods. Recommendations addressing the following five main challenges were identified.
Firstly, PBOs submitted for authorisation should be assessed on a case-by-case basis in terms of the extent, type and number of genetic changes, to make an informed decision on the likelihood of a molecular biology method being developed for unequivocal identification of that specific PBO. The second recommendation is that a specialist review be conducted, potentially informed by UK and EU governmental departments, to monitor those PBOs destined for the authorisation process, and actively assess the extent of the genetic variability and mutations, to make an informed decision on the type and complexity of detection methods that need to be developed. This could be further informed as part of the authorisation process and augmented via a publicly available register or database. Thirdly, further specialist research and development, allied with laboratory-based evidence, is required to evaluate the potential of using a weight of evidence approach for the design and development of detection methods for PBOs. This concept centres on using other indicators, aside from the single mutation of interest, to increase the likelihood of providing a unique signature or footprint. This includes consideration of the genetic background, flanking regions, off-target mutations, potential CRISPR/Cas activity, feasibility of heritable epigenetic and epitranscriptomic changes, as well as supplementary material from supplier, origin, pedigree and other documentation. Fourthly, additional work is recommended, evaluating the extent/type/nature of the genetic changes, and assessing the feasibility of applying threshold limits associated with these genetic changes to make any distinction on how they may have occurred. 
Such a probabilistic approach, supported by bioinformatics, to determine the likelihood of particular changes occurring through genome editing or traditional processes, could facilitate rapid classification and pragmatic labelling of products and organisms containing specific mutations. Finally, several scientific publications on the detection of genome edited products have been based on theoretical principles. It is recommended that these be further qualified using evidence-based practical experimental work in the laboratory environment. Additional challenges and recommendations regarding the design, development and implementation of potential detection methods were also identified. Modern molecular biology-based techniques, including qPCR, dPCR, and NGS, in combination with appropriate bioinformatics pipelines, continue to offer the best analytical potential for developing methods for detecting PBOs. dPCR and NGS may offer the best technical potential, but qPCR remains the most practicable option as it is embedded in most analytical laboratories. Traditional screening approaches, similar to those for conventional transgenic GMOs, cannot easily be used for PBOs due to the deficit of common control elements incorporated into the host genome. However, some limited screening may be appropriate for PBOs as part of a triage system, should a priori information be known regarding the sequences of interest. The current deficit of suitable methods to detect and identify PBOs precludes accurate PBO quantification. Development of suitable reference materials to aid in the traceability of PBOs remains an issue, particularly for those PBOs which house on- and off-target mutations which can segregate. Off-target mutations may provide an additional tool to augment methods for detection, but unless these exhibit complete genetic linkage to the sequence of interest, they can also segregate out in subsequent generations.
Further research should be conducted regarding the likelihood of multiple mutations segregating out in a PBO, to help inform the development of appropriate PBO reference materials, as well as the potential of using off-target mutations as an additional tool for PBO traceability. Whilst recognising the technical challenges of developing and maintaining pan-genomic databases, this report recommends that the UK continue to consider development of such a resource, either as a UK-centric version or, ideally, through engagement in parallel EU and international activities to better achieve harmonisation and shared responsibilities. Such databases would be an invaluable resource in the design of reliable detection methods, as well as for confirming that a mutation is the result of genome editing. PBOs and their products show great potential within the agri-food sector, necessitating a science-based analytical framework to support UK legislation, business and consumers. Differentiating between PBOs generated through genome editing and organisms which exhibit the same mutational change through traditional processes remains analytically challenging, but a broad set of diagnostic technologies (e.g., qPCR, NGS, dPCR) coupled with pan-genomic databases and bioinformatics approaches may help fill this analytical gap and support the safety, transparency, proportionality, traceability and consumer confidence associated with the UK food chain.
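The "weight of evidence" idea described in this abstract, combining several independent observations to judge whether a change arose through genome editing or traditional processes, amounts to a Bayesian update. The sketch below shows the arithmetic only; the prior and every likelihood value are hypothetical placeholders, not measured rates from the review.

```python
from math import prod

def posterior_edited(prior_edited, lik_edited, lik_traditional):
    # Bayes' rule over independent lines of evidence (e.g., a flanking-region
    # signature, an off-target pattern, supplier documentation). lik_edited[i]
    # is P(observation i | edited); lik_traditional[i] is
    # P(observation i | traditional origin). All inputs here are hypothetical.
    num = prior_edited * prod(lik_edited)
    den = num + (1.0 - prior_edited) * prod(lik_traditional)
    return num / den

# Three hypothetical observations, each somewhat likelier under editing
p = posterior_edited(
    prior_edited=0.5,
    lik_edited=[0.9, 0.7, 0.8],
    lik_traditional=[0.2, 0.4, 0.3],
)
```

No single observation is conclusive, but their combined weight pushes the posterior well away from the 0.5 prior, which is exactly why the abstract argues for using indicators beyond the single mutation of interest. The independence assumption between observations is itself something a real assessment would need to justify.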