To see the other types of publications on this topic, follow the link: A. Le Coq (Firm).

Dissertations / Theses on the topic 'A. Le Coq (Firm)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 dissertations / theses for your research on the topic 'A. Le Coq (Firm).'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Nascimento, Rui Fonseca. "Which factors determine firm survival?" Master's thesis, Instituto Superior de Economia e Gestão, 2015. http://hdl.handle.net/10400.5/13084.

Full text
Abstract:
Master's degree in Finance
The main aim of this empirical study is to determine which factors influence the survival of new Portuguese companies. We do so through a survival analysis of 1,130 companies born in the Portuguese manufacturing sector between 2005 and 2009. The database used, SCIE, is based on the report published by the Instituto Nacional de Estatística (INE). Our survival analysis is centred on five company variables: Growth, Size, Technology, Financial indicators, and Sector. To determine the impact of these variables on survival we used the Cox regression model. Before running the Cox model we also studied the behaviour of the variables through a Kaplan-Meier survival estimate, from which we concluded that the second and third years of operation are those in which firms present the highest mortality rates (about 10% in each year). Moving on to the Cox regression analysis, we were unable to reject any of our original hypotheses.
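As an illustration of the Kaplan-Meier estimate the study relies on before fitting the Cox model, here is a minimal Python sketch of the estimator. The firm lifetimes below are invented for illustration; the thesis uses the SCIE firm data.

```python
# Minimal Kaplan-Meier survivor-function estimator: at each distinct event
# time t, multiply the running survival probability by (1 - deaths/at_risk).

def kaplan_meier(durations, observed):
    """Return [(time, S(t))] for each distinct event time.

    durations: years each firm was followed
    observed:  1 if the firm died at that time, 0 if censored
    """
    survival = 1.0
    curve = []
    for t in sorted(set(d for d, e in zip(durations, observed) if e)):
        at_risk = sum(1 for d in durations if d >= t)
        deaths = sum(1 for d, e in zip(durations, observed) if d == t and e)
        survival *= 1.0 - deaths / at_risk
        curve.append((t, survival))
    return curve

# Ten hypothetical firms: (years survived, death observed?)
durations = [1, 2, 2, 3, 3, 3, 4, 5, 5, 5]
observed  = [1, 1, 1, 1, 1, 0, 1, 0, 0, 0]
for t, s in kaplan_meier(durations, observed):
    print(t, round(s, 3))
```

Censored firms (still alive when observation ends) count in the at-risk set but never as deaths, which is what distinguishes this estimator from a naive death-rate calculation.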
APA, Harvard, Vancouver, ISO, and other styles
2

Bahrami, Abdorrahim. "Modelling and Verifying Dynamic Properties of Neuronal Networks in Coq." Thesis, Université d'Ottawa / University of Ottawa, 2021. http://hdl.handle.net/10393/42643.

Full text
Abstract:
Since the mid-1990s, formal verification has become increasingly important because it can provide guarantees, relative to a given model, that a software system is free of bugs and working correctly. The verification of biological and medical systems is a promising application. Human neural networks have recently been emulated and studied as a biological system, and recent research has modelled crucial neuronal circuits and used model-checking techniques to verify their temporal properties. In large case studies, however, model checkers often cannot prove the given property at the desired level of generality. In this thesis, we build a model of human neural networks using the Coq proof assistant and prove properties concerning the dynamic behaviour of basic neuronal structures. Understanding the behaviour of these modules is crucial because they constitute the elementary building blocks of bigger neuronal circuits. By using a proof assistant, we guarantee that the properties are true in the general case: true for any input values, any length of input, and any amount of time. We verify properties of the model starting with neurons, the smallest units in a human neuronal network. We then prove properties about functional structures of human neural networks called archetypes. Archetypes consist of two or more neurons connected in a suitable way. They are known for displaying particular classes of behaviours, and their compositions govern several important functions such as walking and breathing. Finally, we verify properties about structures that couple different archetypes to perform more complicated actions, and we prove a property about one such composition. With such a model, there is the potential to detect inactive regions of the human brain and to treat mental disorders. Furthermore, our approach can be generalized to the verification of other kinds of networks, such as regulatory, metabolic, or environmental networks.
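Work in this line typically formalizes neurons as discrete-time leaky integrate-and-fire units. A minimal Python simulation of one such neuron is sketched below; the parameter values and the specific update rule are invented for illustration, whereas the thesis itself proves such temporal properties for all inputs and all time in Coq.

```python
# Discrete-time leaky integrate-and-fire neuron: the membrane potential
# decays by a leak factor each step, accumulates weighted input spikes,
# and the neuron fires (output 1) and resets when a threshold is reached.

def lif_run(inputs, weight=0.6, leak=0.5, threshold=1.0):
    """Feed a 0/1 input spike train to one neuron; return its output train."""
    potential = 0.0
    out = []
    for spike in inputs:
        potential = leak * potential + weight * spike
        if potential >= threshold:
            out.append(1)
            potential = 0.0   # reset after firing
        else:
            out.append(0)
    return out

print(lif_run([1, 1, 1, 1, 0, 0]))   # → [0, 0, 1, 0, 0, 0]
```

The run shows the kind of temporal property such verification targets: with these parameters the neuron fires only after sustained input, never on an isolated spike.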
APA, Harvard, Vancouver, ISO, and other styles
3

Chambers, Maxwell J. "The Effect of Executive Compensation on Firm Performance through the Dot-Com Bubble." Scholarship @ Claremont, 2012. http://scholarship.claremont.edu/cmc_theses/415.

Full text
Abstract:
This thesis examines firm performance during the dot-com bubble through the lens of executive compensation. Hypotheses based on the theoretical literature of Bolton, Scheinkman and Xiong (2006) and of Bertrand and Mullainathan (2001) regarding management compensation in a speculative bubble motivate three regression models with differing market-cap-growth dependent variables and specific compensation variables. Regression analyses test the models using public compensation and security data from S&P's Execucomp and Compustat databases. Taken together, the regression results show that stock-option vesting schedules and executives' status on the board of directors may significantly affect firm performance through the dot-com bubble, but more analysis, using more robust data, is necessary to verify either claim.
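The regression setup described can be sketched as follows. The data here are synthetic stand-ins for the Execucomp/Compustat variables, and the covariate names and coefficients are invented for illustration, not taken from the thesis.

```python
import numpy as np

# Sketch of an OLS regression of firm (market-cap) growth on compensation
# covariates. Synthetic data: we plant known coefficients, then check that
# least squares recovers them.

rng = np.random.default_rng(0)
n = 200
option_vesting = rng.uniform(1, 5, n)    # hypothetical vesting horizon, years
ceo_on_board = rng.integers(0, 2, n)     # hypothetical board-status dummy
noise = rng.normal(0, 0.1, n)
growth = 0.02 + 0.05 * option_vesting - 0.10 * ceo_on_board + noise

# Design matrix with an intercept column; solve the least-squares problem.
X = np.column_stack([np.ones(n), option_vesting, ceo_on_board])
beta, *_ = np.linalg.lstsq(X, growth, rcond=None)
print(np.round(beta, 2))   # recovers roughly [0.02, 0.05, -0.10]
```

With real panel data one would add year and firm controls, but the mechanics of each model in the thesis reduce to this kind of least-squares fit.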
APA, Harvard, Vancouver, ISO, and other styles
4

Matos, Catarina. "Ageing and entrepreneurship : firm creation and performance among older individuals." Doctoral thesis, Instituto Superior de Economia e Gestão, 2018. http://hdl.handle.net/10400.5/15851.

Full text
Abstract:
Doctorate in Development Studies
Senior entrepreneurship is a phenomenon of growing interest due to the current focus on active ageing. Little is known about the determinants of senior entrepreneurship, and less about its outcomes. This thesis provides insights about senior entrepreneurs and examines firm performance from both subjective and objective perspectives. It is based on a multi-theory approach drawing on gerontology, psychology, and economics. Both primary and secondary data were used: a questionnaire and the national database "Quadros de Pessoal". The research makes four main contributions. First, a framework of analysis is developed and applied to review the senior entrepreneurship literature; a lack of evidence on firm performance was found, and more theory-based articles should be developed. Second, the Portuguese reality of senior entrepreneurship is examined: Portugal has an older population that exhibits a low willingness to engage in entrepreneurship, probably due to levels of bureaucracy, low market dynamics, and a culture not oriented to performance. Third, we explore the impact of human capital traits on firm creation and of age on firm performance. Having entrepreneurial and paid-employee experience is positively related to firm creation by older individuals, and our results confirm the negative effect of being older on firm performance. Fourth, we examine business satisfaction among senior entrepreneurs: monetary and non-monetary aspects are both important in explaining business satisfaction, having industry experience affects it positively, and having spent more than 12 months unemployed immediately before founding affects it negatively. The thesis has implications for policy makers and future research, namely on the appropriateness of considering self-perceived age (instead of chronological age) as an indicator influencing entrepreneurship. The negative effect of unemployment status before startup should be acknowledged and tackled by policy makers.
APA, Harvard, Vancouver, ISO, and other styles
5

Lundstedt, Anders. "Realizability in Coq." Thesis, KTH, Matematik (Avd.), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-174109.

Full text
Abstract:
This thesis describes a Coq formalization of realizability interpretations of arithmetic. The realizability interpretations are based on partial combinatory algebras—to each partial combinatory algebra there is an associated realizability interpretation. I construct two partial combinatory algebras. One of these gives a realizability interpretation equivalent to Kleene’s original one, without involving the usual recursion-theoretic machinery.
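A partial combinatory algebra is a set with a partial binary application and elements k and s satisfying k·x·y = x and s·x·y·z ≃ (x·z)·(y·z). A toy Python sketch of (a total instance of) this structure is below; it is an illustration of the definition only, not taken from the thesis's Coq development.

```python
# Toy combinatory-algebra sketch: elements are Python functions and
# application is function call. k and s satisfy the defining equations
# of a combinatory algebra.

def k(x):
    # k applied to x, then y, returns x
    return lambda y: x

def s(x):
    # s applied to x, y, z returns (x z) (y z)
    return lambda y: (lambda z: x(z)(y(z)))

# The classic first consequence: i = s k k is the identity element.
i = s(k)(k)
print(i(42))      # → 42
print(k(1)(2))    # → 1
```

In a genuinely partial algebra (such as Kleene's first model built from indices of computable functions) application may be undefined; here every application happens to be total, which is the simplest case of the definition.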
APA, Harvard, Vancouver, ISO, and other styles
6

Claret, Guillaume. "Program in Coq." Thesis, Sorbonne Paris Cité, 2018. http://www.theses.fr/2018USPCC068/document.

Full text
Abstract:
In this thesis, we develop new techniques to conveniently write formally verified programs. To proceed, we study the use of Coq as a programming language in different settings. Coq being a purely functional language, we mainly focus on the representation and the specification of impure effects, like exceptions, mutable references, inputs-outputs, and concurrency. First, we work on two preliminary projects that help us understand the challenges of programming in Coq. The first project, Cybele, is a Coq plugin for writing efficient proofs by reflection with effects. We compile and execute the impure effects in OCaml to generate a prophecy, a kind of certificate, and then interpret the effects in Coq using the prophecy. The second project, the compiler CoqOfOCaml, imports OCaml programs with effects into Coq, using an effect-inference system. Next, we describe different generic and composable representations of impure effects in Coq. The breakable computations combine the standard exception and mutable-reference effects with a pause mechanism that makes evaluation steps explicit, in order to represent the concurrent evaluation of two terms. By implementing the Pluto web server in Coq, we realize that the most important effects to program with are asynchronous inputs-outputs. Indeed, these effects are ubiquitous and cannot be encoded in a purely functional manner. Thus, we design the asynchronous computations as a first way to represent and compile programs with events and handlers in Coq. Then, we study techniques to prove properties about programs with effects. We start with the verification of the blog system ChickBlog, written in the language of the interactive computations. This blog runs one worker with synchronous inputs-outputs per client. We verify our blog using the method of specification by use cases. We adapt this technique to type theory by expressing a use case as a well-typed co-program over the program we verify. Thanks to this formalism, we can present a use case as a symbolic test program and symbolically debug it, step by step, using the interactive proof mode of Coq. To our knowledge, this is the first such adaptation of use-case specifications in type theory. We believe that formal specification by use cases is one of the keys to verifying effectful programs, as the method of use cases has proved convenient for expressing (informal) specifications in the software industry. We extend our formalism to concurrent and potentially non-terminating programs with the language of concurrent computations. Apart from the use-case method, we design a model checker to verify the deadlock-freedom of concurrent computations, by compiling parallel composition to the non-deterministic choice operator using the language of blocking computations.
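The central move of the thesis, representing impure effects as data that an interpreter later gives meaning to, can be sketched in Python with a generator-based program: the program yields effect descriptions instead of performing I/O, and a pure interpreter doubles as a use-case test harness. The `greeter` program and its effect vocabulary below are invented for illustration.

```python
# Effects-as-data sketch: the program yields effect descriptions; an
# interpreter decides what each effect means. Running the program against
# scripted inputs gives a pure, replayable "use case" of its behaviour.

def greeter():
    name = yield ("read",)              # ask the environment for input
    yield ("write", "hello " + name)    # describe an output effect

def run_pure(program, inputs):
    """Interpret a program against scripted inputs; collect its outputs."""
    gen = program()
    outputs = []
    inputs = iter(inputs)
    try:
        effect = gen.send(None)         # start the program
        while True:
            if effect[0] == "read":
                effect = gen.send(next(inputs))
            else:                       # ("write", text)
                outputs.append(effect[1])
                effect = gen.send(None)
    except StopIteration:
        pass
    return outputs

print(run_pure(greeter, ["coq"]))       # → ['hello coq']
```

A second interpreter performing real `input`/`print` calls could run the same `greeter` unchanged, which is the separation between program and effect handler that the thesis formalizes and proves properties about.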
APA, Harvard, Vancouver, ISO, and other styles
7

Svensson, Sofie, and Maria Rothén. "Voluntary carbon offsetting : A case study of Husqvarna AB from a firm, consumer and society wide perspective." Thesis, Jönköping University, JIBS, Economics, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-12986.

Full text
Abstract:

Global warming is an international problem that has led many corporations to an increased environmental awareness.

This thesis takes a Cost-Benefit Analysis (CBA) approach to evaluate whether carbon offsetting is a profitable alternative for corporations and for society as a whole. The study focuses predominantly on emissions of the greenhouse gas CO2. The CBA calculations show the difference between the scenarios with and without carbon offsetting; in the CBA approach, effects are divided into benefits and costs.

The study includes a case study of Husqvarna AB, carried out with the aim of providing decision support on whether or not to make the corporation carbon neutral. Basic data from Husqvarna AB has been used.
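The CBA comparison described above reduces to discounting each year's net effect (benefits minus costs) in the two scenarios and comparing the totals. A minimal Python sketch follows; all figures and the discount rate are invented, since the thesis works with Husqvarna AB's own data.

```python
# Minimal cost-benefit comparison: net present value of yearly
# (benefit - cost) figures, with and without carbon offsetting.
# All numbers are illustrative placeholders.

def npv(cash_flows, rate):
    """Present value of yearly net effects, year 0 undiscounted."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

rate = 0.04
without_offsetting = [0, 0, 0, 0, 0]        # baseline scenario
with_offsetting = [-120, 40, 40, 40, 40]    # upfront cost, later gains

gain = npv(with_offsetting, rate) - npv(without_offsetting, rate)
print(round(gain, 1))   # positive → offsetting passes the CBA
```

The decision rule is the standard one: offsetting is recommended when the difference in net present value between the two scenarios is positive.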

APA, Harvard, Vancouver, ISO, and other styles
8

Luna, Ango Luis Toribio, and Huamán José Omar Morón. "Factores que permiten el desarrollo de Startups peruanas con características de una Born Global Firm." Bachelor's thesis, Universidad Peruana de Ciencias Aplicadas (UPC), 2017. http://hdl.handle.net/10757/623507.

Full text
Abstract:
In recent years, internationalization has been the subject of several studies, and thanks to globalization many companies have come to see it as an increasingly feasible goal; this phenomenon has allowed Born Global Firms (BGFs) to achieve it in a short time and go abroad from their inception. At the same time, groups of entrepreneurs have emerged in the market who develop fast-growing, scalable businesses known as startups; recently, startups in Peru have begun to gain strength thanks to the participation of public and private entities in the ecosystem. In this context, this research aims to identify the main factors that help a Peruvian startup develop as a BGF. For this purpose, the study applied qualitative techniques based on interviews with experts and with representatives of some born-global startups in the ecosystem. In addition, as part of the quantitative study, surveys of eighty startups supported a statistical analysis of the relationship between the proposed factors and the likelihood of producing international startups. From the qualitative and quantitative results, it was concluded that a global vision of the business model is the main factor in developing a born-global startup in the Peruvian ecosystem. Furthermore, the founders' previous experience and their networks of contacts were found to be critical factors in providing a better global vision. In contrast, financing was identified as a factor that adds value to the model but is not decisive for the internationalization of a startup.
APA, Harvard, Vancouver, ISO, and other styles
9

Rothén, Maria, and Sofie Svensson. "Volontary carbon offsetting : A case study of Husqvarna AB from a firm, consumer and a society wide perspective." Thesis, Jönköping University, JIBS, Economics, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-12911.

Full text
Abstract:

Global warming is an international problem that has led many corporations to an increased environmental awareness.

This thesis takes a Cost-Benefit Analysis (CBA) approach to evaluate whether carbon offsetting is a profitable alternative for corporations and for society as a whole. The study focuses predominantly on emissions of the greenhouse gas CO2. The CBA calculations show the difference between the scenarios with and without carbon offsetting; in the CBA approach, effects are divided into benefits and costs.

The study includes a case study of Husqvarna AB, carried out with the aim of providing decision support on whether or not to make the corporation carbon neutral. Basic data from Husqvarna AB has been used.

APA, Harvard, Vancouver, ISO, and other styles
10

Jakubiec, Line. "Vérification de circuits dans Coq." Aix-Marseille 1, 1999. http://www.theses.fr/1999AIX11030.

Full text
Abstract:
The formal verification of integrated circuits rigorously guarantees their reliability, and proof assistants are increasingly used for this purpose. The Coq system, based on the Calculus of Inductive Constructions with co-inductive types, has interesting and original features, and we study what this system can contribute to the specification and verification of circuits. After showing the value of dependent types for giving precise, and therefore reliable, circuit specifications, we use Coq's extraction mechanism to synthesize a circuit that is correct by construction. We illustrate these aspects on combinational circuits with a linear architecture, and use this example to study the various proof strategies Coq offers. Our study then turns to synchronous sequential circuits specified with co-inductive types, which make it possible to define infinite objects such as streams in Coq. The structures and behaviours of circuits are modelled uniformly as automata, themselves characterized by co-recursive functions over streams. Our approach is hierarchical and modular, and rests on a general lemma expressing an equivalence between two streams produced by two automata. This lemma captures most of the temporal aspect of our correctness proofs. We then apply this methodology to a real circuit, the Fairisle ATM switch element.
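The proof obligation at the heart of this approach, that two synchronous circuits modelled as automata produce equivalent output streams, can be sketched in Python by running a specification machine and a structural implementation side by side. The circuits below (a running-XOR specification against a NAND-only implementation) are invented for illustration; a finite prefix is checked here, whereas co-induction in Coq covers all prefixes at once.

```python
# Two Mealy machines compared through the output streams they produce on
# the same input stream, the finite-prefix analogue of the stream
# equivalence lemma used for circuit correctness.

def run(step, state, inputs):
    """Drive a Mealy machine: step(state, x) -> (new_state, output)."""
    outs = []
    for x in inputs:
        state, y = step(state, x)
        outs.append(y)
    return outs

# Specification: output the XOR of all inputs seen so far.
def spec_step(acc, x):
    acc ^= x
    return acc, acc

# "Structural" implementation of the same behaviour from NAND gates only.
def nand(a, b):
    return 1 - (a & b)

def impl_step(acc, x):
    t = nand(acc, x)                       # XOR built from four NANDs
    y = nand(nand(acc, t), nand(x, t))
    return y, y

stream = [1, 0, 1, 1, 0, 1]
print(run(spec_step, 0, stream) == run(impl_step, 0, stream))   # → True
```

The correctness statement is exactly that this equality holds for every input stream, which is what the general stream-equivalence lemma discharges once and for all.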
APA, Harvard, Vancouver, ISO, and other styles
11

Ribeiro, Ivan César. "Contratos relacionais e a teoria da firma: um teste empírico com subcontratação de atividades jurídicas." Universidade de São Paulo, 2005. http://www.teses.usp.br/teses/disponiveis/12/12139/tde-06102006-115733/.

Full text
Abstract:
This work analyzes the role of relational contracts in the decision between subcontracting and vertical integration, and the conditions that make these contracts feasible. The propositions of Baker, Gibbons and Murphy (2002, hereafter BGM) and Dixit (2004) are tested, and the results support the main propositions. Firms can conduct their operations through the market or operate with a high degree of vertical integration. In the second case firms avoid what we call "holdup" problems: when it comes time to work out the terms of the deal left open at the outset, the other side might demand terms of trade that are onerous, but not so onerous that the first party would willingly forfeit the value of its transaction-specific assets by taking its business elsewhere. If all these assets belong to one firm, such problems do not arise, and this explains the second choice, vertical integration (Williamson, 1985). This rationale, however, does not explain why some companies operate through networks, arrangements in which the parties remain economically separate entities but maintain long-term relationships. Toyota and other Japanese automobile companies are the typical example (Holmström, Roberts, 1998). Relational contracts help circumvent difficulties in formal contracting, whether those difficulties come from holdup problems or from another source. A relational contract allows the parties to utilize their detailed knowledge of their specific situation and to adapt to new information as it becomes available (MacNeil, 1978). There is a caveat, however: such contracts cannot be enforced by a third party and must be self-enforcing; that is, the value of the future relationship must be sufficiently large that neither party wishes to renege (BGM, 2002; Dixit, 2004). But why do some contracts break down while others go well? BGM examine the problem in the light of relational contract theory and property rights theory. According to them, integration affects the parties' temptation to renege on a given relational contract. Thus, in a given environment, a desirable relational contract might be feasible under integration but not under non-integration, and this will be particularly true when the alternative prices of the assets traded vary widely. These assets are not restricted to physical ones: they can even be legal title to a good, or the discretion an outsourced worker has over how to allocate his time on the job (Hart, 1992). Dixit (2004) discusses the role of signalling and of formal contracts in sustaining these relational contracts. Departing from these hypotheses and grounded in the incentive literature, BGM's and Dixit's propositions were tested. Firms' decisions between contracting legal services in the market and employing an internal legal department can be explained mainly by the variation in asset values (here, the value of legal services, expressed by the degree of competition in the labour market; Bertrand, 2004), but also by the institutional environment, particularly the time needed to reach a court decision and the variability of the expected result. The results of the empirical research confirm the main assumption and suggest some lines of research in the relational contracts field.
APA, Harvard, Vancouver, ISO, and other styles
12

Santos, Júlio César dos. "Propriedade intelectual com ênfase em trade secrets: criptologia e performance econômica." Universidade Federal do Espírito Santo, 2003. http://repositorio.ufes.br/handle/10/6003.

Full text
Abstract:
Made available in DSpace on 2016-12-23T14:00:40Z (GMT). No. of bitstreams: 1 FACE.pdf: 89333 bytes, checksum: 9fe2d3791aae56820ee8b6646c8b46fd (MD5) Previous issue date: 2003-06-26
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
Trata-se de uma abordagem teórica sobre Propriedade Intelectual com ênfase em Trade Secrets como barreira à entrada. Os avanços em criptologia no campo da matemática podem vir a se somar aos esforços teóricos desenvolvidos em Microeconomia em Organização Industrial. Neste contexto, a criptologia é resgatada historicamente e são apresentadas suas principais funções na Propriedade Intelectual, em especial no Trade Secrets. Questiona-se aqui: Por que as empresas criptografam seus processos produtivos? A propriedade intelectual em especial os Trade Secrets tem desempenhado importante papel, enquanto barreira à entrada na economia contemporânea? Como a criptologia e seus respectivos desenvolvimentos na matemática podem contribuir para o fortalecimento do segredo industrial? Percebe-se que apesar da presença marcante da criptografia e dos bens incorpóreos na Economia Contemporânea, ainda são escassas as análises teóricas, os estudos de caso e os bancos de dados sobre Propriedade Intelectual e principalmente, sobre Trade Secrets. Contribui ao explicitar seus conceitos e alertar para a importância de intensificação de pesquisa científica sobre o tema possibilitando, dessa forma, uma melhor compreensão da dinâmica econômica empresarial na atualidade.
This work presents a theoretical approach to intellectual property with emphasis on trade secrets as entry barriers. Progress in cryptology in the field of mathematics can be added to the theoretical efforts developed in microeconomics and industrial organization. In this context, cryptology is surveyed historically and its main functions in intellectual property, especially in trade secrets, are presented. This research asks: Why do firms encrypt their production processes? Has intellectual property, especially trade secrets, been playing an important role as a barrier to entry in the contemporary economy? How can cryptology and its developments in mathematics contribute to strengthening industrial secrecy? It is noticed that, despite the outstanding presence of cryptography and of intangible goods in the contemporary economy, theoretical analyses are still scarce, as are case studies and databases on intellectual property and, above all, on trade secrets. This research contributes by making these concepts explicit and by calling attention to the importance of intensifying scientific research on this topic, in order to reach a better understanding of managerial economic dynamics at the present time.
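To make the first question concrete: a firm that "encrypts its production process" keeps a recipe usable only with a secret key. The sketch below is a deliberately toy construction (a SHA-256 keystream in counter mode, XORed with the data) meant only to illustrate the idea; a real system should use a vetted cipher such as AES-GCM from an audited library.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    # Derive n pseudo-random bytes from the key via SHA-256 in counter mode.
    # Toy construction for illustration only, not production cryptography.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; applying the same function again decrypts.
    ks = keystream(key, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

secret_process = b"ferment 14 days at 12C, then dry-hop"  # hypothetical recipe
key = b"firm-master-key"                                  # hypothetical key
ciphertext = toy_encrypt(key, secret_process)
assert toy_encrypt(key, ciphertext) == secret_process  # XOR is its own inverse
```

Without the key, the ciphertext reveals nothing useful about the process, which is the economic point: secrecy raises a rival's cost of imitation.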
APA, Harvard, Vancouver, ISO, and other styles
13

Vinogradova, Polina. "Formalizing Abstract Computability: Turing Categories in Coq." Thesis, Université d'Ottawa / University of Ottawa, 2017. http://hdl.handle.net/10393/36354.

Full text
Abstract:
The concept of a recursive function has been extensively studied using traditional tools of computability theory. However, with the development of category-theoretic methods it has become possible to study recursion in a more general (abstract) sense. The particular model this thesis is structured around is known as a Turing category. The structure within a Turing category models the notion of partiality as well as recursive computation, and equips us with the tools of category theory to study these concepts. The goal of this work is to build a formal language description of this computation model. Specifically, to use the Coq proof assistant to formulate informal definitions, propositions and proofs pertaining to Turing categories in the underlying formal language of Coq, the Calculus of Co-inductive Constructions (CIC). Furthermore, we have instantiated the more general Turing category formalism with a CIC description of the category which models the language of partial recursive functions exactly.
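The thesis itself works inside Coq's Calculus of Co-inductive Constructions; purely as an informal illustration of the raw material a Turing category organizes, partial maps and their composition can be modeled as functions that return None where undefined. This is a sketch of the intuition, not the thesis's formalization:

```python
from typing import Callable, Optional

# A partial map on the natural numbers: None models "undefined".
Partial = Callable[[int], Optional[int]]

def compose(g: Partial, f: Partial) -> Partial:
    # Composition of partial maps: undefined wherever f or g is undefined.
    def gf(x: int) -> Optional[int]:
        y = f(x)
        return None if y is None else g(y)
    return gf

def isqrt_partial(x: int) -> Optional[int]:
    # Defined only on perfect squares.
    r = int(x ** 0.5)
    return r if r * r == x else None

def half(x: int) -> Optional[int]:
    # Defined only on even numbers.
    return x // 2 if x % 2 == 0 else None

h = compose(isqrt_partial, half)  # first halve, then take the square root
assert h(32) == 4      # 32 -> 16 -> 4
assert h(7) is None    # 7 is odd: undefined
assert h(6) is None    # 3 is not a perfect square: undefined
```

Composition of partial maps is associative with identities, giving a category of partial functions; a Turing category additionally equips such a category with a universal "application" object modeling computation.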
APA, Harvard, Vancouver, ISO, and other styles
14

Ledovskaya, Yulia. "Marketing plan for Le Coq Sportif Russia." Master's thesis, NSBE - UNL, 2014. http://hdl.handle.net/10362/11904.

Full text
Abstract:
A Work Project, presented as part of the requirements for the Award of a Masters Degree in Management from the NOVA – School of Business and Economics
I am going to take the position of the Head Office in France and look at the Russian business performance as a part of the global business. The results of several studies indicate a clear picture of the challenges in the Russian market: low awareness of the brand, low penetration of the brand, and complexity in implementing the marketing mix due to wide differences in behavior, overall environment and climate between Russian cities. This marketing plan intends to face those challenges and create a sustainable and profitable business in the Russian market.
APA, Harvard, Vancouver, ISO, and other styles
15

Glondu, Stéphane. "Vers une certification de l'extraction de coq." Paris 7, 2012. http://www.theses.fr/2012PA077089.

Full text
Abstract:
L'assistant de preuve Coq permet de s'assurer mécaniquement de la correction de chaque étape de raisonnement dans une preuve. Ce système peut également servir au développement de programmes certifiés. En effet, Coq utilise en interne un langage typé dérivé du lambda-calcul, le calcul des constructions inductives (CIC). Ce langage est directement utilisable pour programmer, et un mécanisme, l'extraction, permet de traduire les programmes CIC vers des langages à plus large audience tels qu'OCaml, Haskell ou Scheme. L'extraction n'est pas un simple changement de syntaxe: CIC dispose d'un système de types très riche, mais en contrepartie, des entités purement logiques peuvent apparaître dans les programmes et impacter leurs performances. L'extraction se charge également d'effacer ces parties logiques. Dans cette thèse, nous nous attaquons à la certification de l'extraction elle-même. Nous avons prouvé sa correction dans le cadre d'une formalisation entière de Coq en Coq. Cette formalisation ne correspond pas exactement au CIC implanté dans Coq, mais nous avons tout de même réalisé notre étude avec l'implantation concrète de Coq en tête. Nous proposons également une nouvelle méthode de certification des programmes extraits, dans le cadre concret du système Coq existant
The Coq proof assistant mechanically checks the consistency of the logical reasoning in a proof. It can also be used to develop certified programs. Indeed, Coq internally uses a typed language derived from the lambda-calculus, the calculus of inductive constructions (CIC). This language can be directly used by a programmer, and a procedure, extraction, allows one to translate CIC programs into more widely used languages such as OCaml, Haskell or Scheme. Extraction is not a mere syntax change: the type system of CIC is very rich, but purely logical entities can appear inside programs, impacting their performance. Extraction erases these logical artefacts as well. In this thesis, we tackle the certification of the extraction itself. We have proved its correctness in the context of a full formalization of Coq in Coq. Even though this formalization is not exactly Coq, we carried out our study with the concrete implementation of Coq in mind. We also propose a new way to certify extracted programs, in the concrete setting of the existing Coq system.
APA, Harvard, Vancouver, ISO, and other styles
16

MICHETTI, Matteo. "Ancorato Vs disancorato: modelli ed evoluzioni tra locale e globale. I flussi di partecipazioni con l'estero delle imprese dell'Emilia-Romagna." Doctoral thesis, Università degli studi di Ferrara, 2017. http://hdl.handle.net/11392/2488027.

Full text
Abstract:
Il primo capitolo affronta i grandi cambiamenti a livello di scenario economico internazionale. L’obiettivo è quello di descrivere l’insieme degli eventi che nel corso degli ultimi venti/trent’anni hanno radicalmente mutato i contorni dell’organizzazione della produzione su scala mondiale, provando più in generale a dare un significato compiuto al concetto di uso comune, ma non per questo scontato, di globalizzazione (perlomeno nella sua accezione economica). Per meglio contestualizzare il presente, si è ritenuto opportuno ripercorrere alcuni passaggi importanti nella storia economica degli ultimi due secoli, ben sapendo che anche i fenomeni apparentemente più innovativi e dirompenti difficilmente accadono per la prima volta e soprattutto senza preavviso alcuno. Contestualmente, è stato affrontato il tema della nuova centralità della dimensione territoriale-regionale, nell’ambito della competizione economica internazionale, a discapito di quella sempre più superata dello Stato-nazione. Nel secondo capitolo l’analisi si concentra sulla regione Emilia-Romagna e sul suo apparato produttivo. L’elemento di interesse consiste nel quantificare il livello di internazionalizzazione delle imprese regionali, prima da un punto di vista prettamente commerciale, poi produttivo, ovvero in termini di investimenti diretti esteri in entrata e in uscita. Un altro fattore di cambiamento che ha agito in profondità nell’ambito del sistema produttivo dell’Emilia-Romagna consiste in una crescente polarizzazione nei risultati aziendali delle imprese. Viene dunque formalizzata l’ipotesi di lavoro di questa ricerca, che consiste nel mettere in relazione il fattore della multinazionalità dell’impresa con le relative performance economiche. Il terzo capitolo presenta un’analisi descrittiva delle società di capitali con sede legale in Emilia-Romagna.
Ciascuna società viene classificata sulla base del suo status proprietario, così da individuare le imprese internazionalizzate (solo in entrata, solo in uscita, sia in entrata che in uscita), mettendo in relazione la vocazione internazionale con il profilo strutturale e le performance economiche di ciascuna tipologia d’impresa. Il quarto capitolo è dedicato all’analisi quantitativa (micro)econometrica. Dopo una rassegna delle letteratura economica che indaga il rapporto tra internazionalizzazione e performance, viene enunciato il quesito valutativo: per passare dalla correlazione ad un nesso di causalità per cui l’acquisizione dello status di multinazionale produce un premio (ex-post), è strettamente necessario individuare un prima e un dopo il momento in cui l’impresa diventa multinazionale. Per implementare questo tipo di analisi empirica, è stato predisposto un panel bilanciato 2008-2015 di società di capitali, con sede legale in Emilia-Romagna, isolando il gruppo di quelle che nel periodo 2010-2013 si sono internazionalizzate per la prima volta (gruppo delle “unità trattate”), così da poterne osservare i risultati economici negli anni 2014-2015, anche in termini di Total Factor Productivity (TFP), da mettere a confronto con il gruppo di controllo delle imprese domestiche, selezionato mediante la tecnica del propensity score matching, abbinata ad un approccio difference in differences. I risultati confermano la presenza di un premio ex-post a favore delle imprese internazionalizzate, relativamente a tutte e quattro le variabili di performance considerate. Seguono alcune considerazioni di policy, cercando di portare a sintesi l’evidenza empirica risultante dall’analisi descrittiva ed econometrica, con le riflessioni inerenti il territorio e il suo accresciuto protagonismo nell’ambito della competizione internazionale.
The economic history of the last two decades has seen exponential growth in flows of goods and services, at a pace never seen before. Alongside the commercial trade of finished goods, trade in components and semi-finished goods has emerged, contributing to the growth of Global Value Chains (GVC), which have become the heart of the new international organization of labour. Given the magnitude of this phenomenon, increasingly open and global production chains are nowadays a true paradigm of industrial organization. For this reason, this structural change must be taken into consideration both in economic analysis and in economic planning policies, especially at the regional level, given that the national level has lost much of its sovereign power. After analysing these changes, the research focuses on the productive system of the Emilia-Romagna region, which has one of the strongest manufacturing sectors among European regions. The Emilia-Romagna region has historically hosted a remarkable number of productive agglomerations and, as such, has for decades been a widely studied model of local production systems based on territorial proximity. Nowadays, economic growth tends to reward ever longer production chains, and for this reason we are witnessing a serious disintegration between the medium and large enterprises situated at the top of the global value chains, on one side, and the small firms focused on subcontracting activity, on the other. This polarization involves a high risk of mortality for a large part of the regional productive system, with all the problems that this entails. The analysis initially focuses on the dynamics of international trade and then moves on to foreign direct investment in both inward and outward directions. The focus then moves to firms headquartered in Emilia-Romagna.
We classify each company in relation to its ownership status, in order to identify whether internationalized companies (in – out – in&out) present a structural profile and economic results different from those of non-internationalized companies. A (micro)econometric analysis then follows. Both the empirical and the theoretical literature show that multinational firms exhibit a competitive advantage before investing abroad. However, there are no clear empirical results regarding the ex-post effects of foreign direct investment (FDI) on firm performance, partially due to the lack of available firm-level data. We build a new firm-level dataset able to provide reliable measures of key performance indicators, especially Total Factor Productivity (TFP). We then use propensity score matching together with a difference-in-differences approach to analyze the causal relationship between FDI and firm performance. The results confirm the existence of an ex-post premium for internationalized companies for all four performance variables considered (revenues, value added, TFP, total salaries and wages). Empirical evidence on the Emilia-Romagna case is lacking, even though the subject is relevant for its policy implications.
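The evaluation design (nearest-neighbour matching on a propensity score, followed by a difference-in-differences comparison) can be sketched as below. The numbers are invented for illustration; the thesis applies this to firm-level revenues, value added, TFP and wages.

```python
# Minimal sketch of one-to-one nearest-neighbour matching followed by a
# difference-in-differences (DiD) estimate. Scores and outcomes are invented.

treated = [  # (propensity score, outcome before, outcome after)
    (0.62, 100.0, 130.0),
    (0.48, 80.0, 104.0),
]
controls = [
    (0.60, 98.0, 110.0),
    (0.50, 82.0, 92.0),
    (0.20, 40.0, 44.0),
]

def nearest_control(score, pool):
    # Match on the propensity score (smallest absolute distance).
    return min(pool, key=lambda c: abs(c[0] - score))

did_effects = []
for score, pre_t, post_t in treated:
    _, pre_c, post_c = nearest_control(score, controls)
    # DiD: treated firm's change minus its matched control's change.
    did_effects.append((post_t - pre_t) - (post_c - pre_c))

# Average treatment effect on the treated (ATT).
att = sum(did_effects) / len(did_effects)
```

The matching step makes treated and control firms comparable on observables before the investment; the DiD step then nets out common time trends.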
APA, Harvard, Vancouver, ISO, and other styles
17

Black, Roberto. "Heterogeneidade no ganho de qualidade informacional com a adoção de IFRS: evidências do Brasil." Universidade de São Paulo, 2015. http://www.teses.usp.br/teses/disponiveis/96/96133/tde-02122015-165144/.

Full text
Abstract:
Esse trabalho tem como objetivo investigar a existência de heterogeneidade no ganho de qualidade informacional com a adoção de IFRS. A adoção de IFRS está geralmente associada com um aumento de qualidade das demonstrações contábeis. Entretanto, as empresas dentro de um mesmo país provavelmente possuem diferentes incentivos econômicos em relação à divulgação da informação. Nesse sentido, tratar as empresas de forma homogênea, sem considerar os incentivos econômicos atrelados, poderia contaminar a investigação da qualidade informacional. É analisado o caso do Brasil, um país classificado como code-law, cuja legislação fiscal induzia a prática contábil e cuja adoção de IFRS foi mandatória. Em primeiro lugar, as empresas brasileiras listadas na BOVESPA foram separadas em dois grupos, a saber: as empresas que emitiram ADR até a adoção de IFRS e as empresas que não emitiram ADR até a adoção de IFRS. Em seguida, esse segundo grupo de empresas foi agrupado, por meio de uma análise de conglomerados, em dois diferentes subgrupos em função de incentivos econômicos em comum. Com base nos grupos identificados, é testada a qualidade da informação contábil para cada grupo antes e após a adoção de IFRS. Esse trabalho utiliza o reconhecimento tempestivo dos eventos econômicos, a value relevance do lucro contábil e o gerenciamento de resultados como proxies para verificar a qualidade da informação contábil. Os resultados encontrados sugerem que um determinado conjunto de empresas obteve, de fato, um incremento de qualidade da informação contábil divulgada após adoção do padrão IFRS no Brasil. Esse grupo de empresas teria incentivos suficientes para deixar para trás a conformidade contábil-fiscal e apresentar uma qualidade superior no seu conjunto de informações contábeis divulgadas. Além disso, foi verificado um segundo grupo de empresas com qualidade da informação contábil antes e após 2008. 
Em contrapartida, foi identificado um terceiro conjunto de empresas que não apresentou qualidade da informação contábil seja antes ou após 2008. Esses resultados corroboram o pressuposto de que os incentivos no nível das empresas possuem um papel relevante na qualidade das demonstrações contábeis. Isso não implica afirmar que as normas contábeis não importam, mas de que existem outros direcionadores que moldam a qualidade das demonstrações contábeis e que as normas contábeis deveriam ser vistas como um desses direcionadores.
This work aims to investigate the existence of heterogeneity in the gain in accounting information quality after the adoption of IFRS. The adoption of IFRS is generally associated with an increase in the quality of financial statements. However, companies within the same country probably have different economic incentives regarding the disclosure of information. Accordingly, treating companies uniformly, without considering their economic incentives, could contaminate the identification of information quality after the adoption of IFRS. We examine the case of Brazil, a country classified as code-law, whose tax laws induced accounting practice and where the adoption of IFRS was mandatory. First, Brazilian companies listed on the BOVESPA were separated into two groups: companies that had issued ADRs before the adoption of IFRS and companies that had not. Then, this second group of companies was split, by means of a cluster analysis, into two subgroups based on common economic incentives. Based on the identified groups, the quality of accounting information is tested for each group before and after the adoption of IFRS. This work uses the timely recognition of economic events, the value relevance of net income, and earnings management as proxies for the quality of accounting information. The results suggest that a particular group of companies did, in fact, obtain an increase in accounting information quality after the adoption of IFRS in Brazil. This group of companies would have sufficient incentives to leave behind book-tax conformity and provide superior quality in its disclosed accounting information. In addition, a second group of companies showed quality of accounting information both before and after 2008. In contrast, a third group of companies was identified that did not show quality of accounting information either before or after 2008.
These results support the assumption that firm-level incentives play an important role in the quality of financial statements. This does not imply that accounting standards do not matter, but rather that there are other drivers that shape the quality of financial statements, and accounting standards should be seen as one of those drivers.
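One of the proxies, value relevance, is typically assessed by regressing market values on earnings and inspecting the slope and fit. A minimal illustration with invented numbers (not the thesis's data or its exact model):

```python
# Minimal value-relevance check: regress share price on earnings per share
# with closed-form simple OLS. The data here are invented and perfectly
# linear, so the fit is exact; real accounting data would not be.

eps   = [1.0, 2.0, 3.0, 4.0]        # earnings per share
price = [10.0, 14.0, 18.0, 22.0]    # share price

n = len(eps)
mx = sum(eps) / n
my = sum(price) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(eps, price))
sxx = sum((x - mx) ** 2 for x in eps)

beta = sxy / sxx            # earnings response coefficient
alpha = my - beta * mx      # intercept
ss_res = sum((y - (alpha + beta * x)) ** 2 for x, y in zip(eps, price))
ss_tot = sum((y - my) ** 2 for y in price)
r2 = 1 - ss_res / ss_tot    # higher R^2 = more value-relevant earnings
```

Comparing the slope and R² across firm groups, before and after 2008, is one way to operationalize "heterogeneous gains in quality".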
APA, Harvard, Vancouver, ISO, and other styles
18

Narboux, Julien. "Formalisation et automatisation du raisonnement géométrique en Coq." Phd thesis, Université Paris Sud - Paris XI, 2006. http://tel.archives-ouvertes.fr/tel-00118806.

Full text
Abstract:
The subject of this thesis is the formalization and automation of geometric reasoning within the Coq proof assistant.
In the first part, we survey the main axiom systems for geometry and then present a formalization of the first eight chapters of the book by Schwabhäuser, Szmielew and Tarski: Metamathematische Methoden in der Geometrie.
In the second part, we present a Coq implementation of a decision procedure for affine plane geometry: the area method of Chou, Gao and Zhang. This method produces short and readable proofs.
In the third part, we turn to the design of a graphical user interface for formal proof in geometry: GeoProof. GeoProof combines dynamic geometry software with the Coq proof assistant.
Finally, we propose a diagrammatic formal system for formalizing reasoning about abstract rewriting. It is possible, for example, to formalize within this system the diagrammatic proof of Newman's lemma. The soundness and completeness of the system are proved with respect to a class of formulas called coherent logic.
APA, Harvard, Vancouver, ISO, and other styles
19

Erbsen, Andres. "Crafting certified elliptic curve cryptography implementations in Coq." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/112843.

Full text
Abstract:
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2017.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 103-106).
Elliptic curve cryptography has become a de-facto standard for protecting the privacy and integrity of internet communications. To minimize the operational cost and enable near-universal adoption, increasingly sophisticated implementation techniques have been developed. While the complete specification of an elliptic curve cryptosystem (in terms of middle school mathematics) fits on the back of a napkin, the fast implementations span thousands of lines of low-level code and are only intelligible to a small group of experts. However, the complexity of the code makes it prone to bugs, which have rendered well-designed security systems completely ineffective. I describe a principled approach for writing crypto code simultaneously with machine-checkable functional correctness proofs that compose into an end-to-end certificate tying highly optimized C code to the simplest specification used for verification so far. Despite using template-based synthesis for creating low-level code, this workflow offers good control over performance: I was able to match the fastest C implementation of X25519 to within 1% of arithmetic instructions per inner loop and 7% of overall execution time. While the development method itself relies heavily on a proof assistant such as Coq and most techniques are explained through code snippets, every Coq feature is introduced and motivated when it is first used to accommodate a non-Coq-savvy reader.
by Andres Erbsen.
M. Eng.
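At the specification level, the arithmetic being verified is ordinary modular arithmetic in the field GF(2^255 − 19). The sketch below uses Python big integers to capture the mathematical behaviour; the certified C code discussed in the thesis instead represents field elements as vectors of machine-word limbs, which is where the optimization and the proof effort lie.

```python
# Specification-level field arithmetic for the X25519 base field.

P = 2**255 - 19  # the prime modulus used by Curve25519

def fadd(a: int, b: int) -> int:
    return (a + b) % P

def fmul(a: int, b: int) -> int:
    return (a * b) % P

def finv(a: int) -> int:
    # Fermat's little theorem: a^(P-2) = a^(-1) mod P for a != 0.
    return pow(a, P - 2, P)

x = 123456789
assert fmul(x, finv(x)) == 1     # inverse really inverts
assert fadd(P - 1, 1) == 0       # arithmetic wraps at the modulus
```

An end-to-end correctness certificate ties an optimized implementation back to exactly this kind of simple specification.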
APA, Harvard, Vancouver, ISO, and other styles
20

Philipoom, Jade (Jade D. ). "Correct-by-construction finite field arithmetic in Coq." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/119582.

Full text
Abstract:
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2018.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 73-74).
Elliptic-curve cryptography code, although based on elegant and concise mathematical procedures, often becomes long and complex due to speed optimizations. This statement is especially true for the specialized finite-field libraries used for ECC code, resulting in frequent implementation bugs. I describe the methodologies used to create a Coq framework that generates implementations of finite-field arithmetic routines along with proofs of their correctness, given nothing but the modulus.
by Jade Philipoom.
M. Eng.
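The framework's interface, "given nothing but the modulus", can be loosely imitated in Python as a factory that builds the whole arithmetic from a single parameter. Unlike the Coq framework, of course, this sketch attaches no correctness proofs to what it returns.

```python
# Loose analogue of "generate field arithmetic from the modulus alone".
# Requires Python 3.8+ for pow(a, -1, m) (modular inverse).

def make_field(m: int):
    # m is assumed prime, so every nonzero element is invertible.
    add = lambda a, b: (a + b) % m
    sub = lambda a, b: (a - b) % m
    mul = lambda a, b: (a * b) % m
    inv = lambda a: pow(a, -1, m)
    return add, sub, mul, inv

# Instantiate for a Mersenne prime, as one might for a specialized library.
add, sub, mul, inv = make_field(2**127 - 1)
a = 2**100 + 3
assert mul(a, inv(a)) == 1
assert sub(add(a, 5), 5) == a
```

In the Coq framework the analogous instantiation additionally yields machine-checked proofs that each generated routine computes the right field operation.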
APA, Harvard, Vancouver, ISO, and other styles
21

Xavier, Bruno Francisco. "Formalização da lógica linear em Coq." PROGRAMA DE PÓS-GRADUAÇÃO EM MATEMÁTICA APLICADA E ESTATÍSTICA, 2017. https://repositorio.ufrn.br/jspui/handle/123456789/22622.

Full text
Abstract:
Submitted by Automação e Estatística (sst@bczm.ufrn.br) on 2017-04-03T22:46:23Z No. of bitstreams: 1 BrunoFranciscoXavier_DISSERT.pdf: 923146 bytes, checksum: c0238dcb8801e0f87397d8417f0eb689 (MD5)
Approved for entry into archive by Arlan Eloi Leite Silva (eloihistoriador@yahoo.com.br) on 2017-04-11T20:33:34Z (GMT) No. of bitstreams: 1 BrunoFranciscoXavier_DISSERT.pdf: 923146 bytes, checksum: c0238dcb8801e0f87397d8417f0eb689 (MD5)
Made available in DSpace on 2017-04-11T20:33:35Z (GMT). No. of bitstreams: 1 BrunoFranciscoXavier_DISSERT.pdf: 923146 bytes, checksum: c0238dcb8801e0f87397d8417f0eb689 (MD5) Previous issue date: 2017-02-15
Em teoria da prova, o teorema da eliminação do corte (ou Hauptsatz, que significa resultado principal) é de suma importância, uma vez que, em geral, implica na consistência e na propriedade subfórmula para um dado sistema. Ele assinala que qualquer prova em cálculo de sequentes que faz uso da regra do corte pode ser substituída por outra que não a utiliza. A prova procede por indução na ordem lexicográfica (peso da fórmula, altura do corte) e gera múltiplos casos quando a fórmula de corte é ou não principal. De forma geral, deve-se considerar a última regra aplicada nas duas premissas imediatamente depois de aplicar a regra do corte, o que gera um número considerável de situações. Por essa razão, a demonstração poderia ser propensa a erros na hipótese de recorrermos a uma prova informal. A lógica linear (LL) é uma das lógicas subestruturais mais significativas e a regra do corte é admissível no seu cálculo de sequentes. Ela é um refinamento do modelo clássico e intuicionista. Sendo uma lógica sensível ao uso de recursos, LL tem sido amplamente utilizada na especificação e verificação de sistemas computacionais. À vista disso, se torna relevante sua abordagem neste trabalho. Nesta dissertação, formalizamos, em Coq, três cálculos de sequentes para a lógica linear e provamos que são equivalentes. Além disso, provamos metateoremas tais como admissibilidade da regra do corte, generalização das regras para axioma inicial, ! e copy e invertibilidade das regras para os conectivos ⅋, ⊥, & e ?. No tocante à invertibilidade, demonstramos uma versão por indução sobre a altura da derivação e outra com aplicação da regra do corte, o que nos possibilitou conferir que, em um sistema que satisfaz Hauptsatz, a regra do corte simplifica bastante as provas em seu cálculo de sequentes. Com a finalidade de atenuar o número dos diversos casos, desenvolvemos várias táticas em Coq que nos permitem realizar operações semiautomáticas.
In proof theory, the cut-elimination theorem (or Hauptsatz, which means "main result") is of paramount importance, since it implies consistency and the subformula property for the given system. This theorem states that any proof in the sequent calculus that makes use of the cut rule can be replaced by another that does not. The proof of cut-elimination proceeds by induction on the lexicographic order (formula weight, cut height) and generates multiple cases, considering, for instance, whether the formula generated by the cut rule is principal or not. In general, one must consider the last rule applied in the two premises immediately after applying the cut rule (reading the proof bottom-up). This generates a considerable number of cases, and the proof could therefore be error-prone if carried out informally. Linear logic (LL) is one of the most significant substructural logics, and the cut rule is admissible in its sequent calculus. LL is a refinement of the classical and intuitionistic models. As a resource-sensitive logic, LL has been widely used in the specification and verification of computer systems, which makes its study relevant to this work. In this dissertation we formalize three sequent calculi for linear logic in Coq and prove them all equivalent. Additionally, we formalize meta-theorems such as the admissibility of cut, generalizations of the rules for the initial axiom, bang and copy, and the invertibility of the rules for the connectives par, bot, with and quest. Regarding invertibility, we demonstrate this theorem in two different ways: a version by induction on the height of the derivation, and one using the cut rule. This allows us to show how the cut rule greatly simplifies proofs in the sequent calculus. In order to mitigate the number of cases in the proofs, we develop several tactics in Coq that allow us to perform semi-automatic reasoning.
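For readers unfamiliar with the connectives mentioned above: classical linear-logic negation (duality) swaps tensor with par and bang with quest, and is an involution. A small informal illustration in Python (this is not the dissertation's Coq development):

```python
# Linear-logic formulas as tagged tuples, and the classical dual, which
# pushes negation to the atoms: (A tensor B)^ = A^ par B^, (!A)^ = ?A^.

def dual(f):
    tag = f[0]
    if tag == "atom":   return ("natom", f[1])          # p  ->  p^
    if tag == "natom":  return ("atom", f[1])           # p^ ->  p
    if tag == "tensor": return ("par", dual(f[1]), dual(f[2]))
    if tag == "par":    return ("tensor", dual(f[1]), dual(f[2]))
    if tag == "bang":   return ("quest", dual(f[1]))
    if tag == "quest":  return ("bang", dual(f[1]))
    raise ValueError(f"unknown connective: {tag}")

f = ("tensor", ("atom", "a"), ("bang", ("atom", "b")))
assert dual(f) == ("par", ("natom", "a"), ("quest", ("natom", "b")))
assert dual(dual(f)) == f   # duality is an involution
```

In a Coq formalization the same datatype becomes an Inductive definition, and the involution becomes a lemma proved once and reused by the tactics.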
APA, Harvard, Vancouver, ISO, and other styles
22

Carvalho, Segundo Washington Luís Ribeiro de. "Verificação de propriedades do cálculo λex em Coq." reponame:Repositório Institucional da UnB, 2010. http://repositorio.unb.br/handle/10482/7685.

Full text
Abstract:
Dissertação (mestrado) - Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Ciência da Computação, 2010.
Submitted by Allan Wanick Motta (allan_wanick@hotmail.com) on 2011-05-09T17:06:46Z No. of bitstreams: 1 2010_WashingtnLuisRibeirodeCarvalhoSegundo.pdf: 529113 bytes, checksum: 3c74f1ea1498ab7ee05b3f8cca2df3e5 (MD5)
Approved for entry into archive by Patrícia Nunes da Silva(patricia@bce.unb.br) on 2011-05-11T20:45:58Z (GMT) No. of bitstreams: 1 2010_WashingtnLuisRibeirodeCarvalhoSegundo.pdf: 529113 bytes, checksum: 3c74f1ea1498ab7ee05b3f8cca2df3e5 (MD5)
Made available in DSpace on 2011-05-11T20:45:58Z (GMT). No. of bitstreams: 1 2010_WashingtnLuisRibeirodeCarvalhoSegundo.pdf: 529113 bytes, checksum: 3c74f1ea1498ab7ee05b3f8cca2df3e5 (MD5)
O cálculo λex representa uma solução importante dentro da classe de cálculos de substituições explícitas que lidam com “nomes”, em oposição àqueles que codificam suas variáveis por índices. Delia Kesner obteve, através de um conjunto de provas construtivas, demonstrações das importantes propriedades do λex. Dentre elas, destacamos a PSN, isso é, a Preservação da Normalização Forte, cuja demonstração faz uso de uma estratégia de redução perpétua, que permitiu uma caracterização indutiva do conjunto SNλex. Estendemos a especificação em Coq, já realizada para o cálculo λ, de B. Aydemir et al, e que utiliza lógica nominal para construção de princípios de indução e recursão λ-estrutural. Dessa forma nossa especificação inclui a substituição explícita (s[x=t]) na gramática de termos. Avançamos definindo os sistemas de reescrita e as relações de redução do λex, e concluímos por formalizar alguns resultados para o cálculo, a saber: a FC (Composição Completa), a SIM (Simulação de um passo da β-redução) e ainda outros que caminham para a formalização da PSN.
The λex-calculus represents an important solution within the class of explicit substitution calculi that deal with names, as opposed to those that encode variables by indices. Delia Kesner developed, through a set of constructive proofs, the proofs of important properties of the λex-calculus. Among them, we highlight the PSN property, that is, the Preservation of Strong Normalization, whose proof uses a perpetual reduction strategy which allowed an inductive characterization of the set SNλex. We extended the specification already done in Coq for the λ-calculus by B. Aydemir et al, using nominal logic to build principles of λ-structural induction and recursion. In this way our specification includes the explicit substitution (s[x=t]) in the grammar of terms. We go forward by defining the rewriting systems and the reduction relations for the λex-calculus, and we conclude by formalizing some results for this calculus, namely: FC (Full Composition), SIM (Simulation of One Step of β-Reduction), and others that go in the direction of a formalization of PSN.
APA, Harvard, Vancouver, ISO, and other styles
23

Chabane, Nacira. "Formalisation de la théorie de réécriture dans Coq." Paris 6, 1999. http://www.theses.fr/1999PA066661.

Full text
Abstract:
Dans cette thèse, nous introduisons une théorie de réécriture dans le système Coq pour offrir aux utilisateurs un cadre convivial pour réaliser leur travail, en mettant à leur disposition des outils génériques. La formalisation de la réécriture dans Coq nous offre plusieurs avantages pour la définition d'outils génériques tels que les signatures, les termes, les substitutions, l'algorithme de filtrage, le renommage des variables et l'algorithme d'unification. Pour obtenir la notion de généricité, nous avons formalisé au mieux la notion de signature, autrement dit trouvé la représentation la plus générale et la plus commode à utiliser. La signature fait intervenir les termes de l'algèbre ; il faut donc formaliser la théorie des termes génériques, qui est la représentation la plus générale des termes. Nous avons aussi étudié et formalisé les substitutions selon deux approches : l'approche fonctionnelle et l'approche calculatoire. Par la suite, nous avons implanté le filtrage et le renommage des variables. La formalisation du filtrage et du renommage dans Coq n'a posé aucun problème de terminaison. Nous avons utilisé l'opérateur Fixpoint pour implanter l'algorithme d'unification. Ce développement a posé un problème de terminaison que nous avons résolu avec l'introduction d'un ordre lexicographique. Enfin, nous étudions les possibilités d'extension des travaux présentés dans cette thèse au développement de l'algorithme de Knuth-Bendix, une extension permettant d'avoir une théorie de réécriture complète.
APA, Harvard, Vancouver, ISO, and other styles
24

Masters, David M. "Verifying Value Iteration and Policy Iteration in Coq." Ohio University / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1618999718015199.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Lemoine, Manuela. "La réaction acrosomique du spermatozoïde chez le coq." Thesis, Tours, 2009. http://www.theses.fr/2009TOUR4005.

Full text
Abstract:
L’objectif de la thèse a été d’apporter des éléments sur la réaction acrosomique (RA) aviaire afin de mieux comprendre les processus menant à la fécondation et de mieux maîtriser la capacité des spermatozoïdes à être conservés. Nos résultats ont conforté l’hypothèse de l’absence de capacitation chez les oiseaux. De plus, il n’y a pas d’hyperactivation de la mobilité lors de la RA. Seul le Ca2+ s’avère être l’élément indispensable au déclenchement de la RA. L’évaluation de la RA avec des spermatozoïdes conservés à l’état liquide ou après cryoconservation a révélé une évolution différente en fonction du type de conservation. L’étude des voies de signalisation susceptibles d’être impliquées dans le déclenchement de la RA a suggéré l’activation de trois voies : PKA, PI3K et MAPK ERK. Ce travail ouvre de nombreuses perspectives scientifiques vers l’approfondissement des connaissances de la RA chez les oiseaux et sur l’utilisation qui peut en être faite pour mieux maîtriser la qualité des gamètes.
The aim of this work was to provide new information on the chicken acrosome reaction (AR) for a better comprehension of the mechanisms leading to this reaction and a better control of the fertilizing potential of spermatozoa after in vitro storage. Our results showed that calcium is the factor absolutely necessary to initiate the AR and supported the hypothesis that chicken spermatozoa do not need to be capacitated. Moreover, motility hyperactivation was not found at the time of the AR. Then, we showed that chicken sperm ability to undergo the AR may differ depending on the type of semen storage. Indeed, this ability was dramatically affected by liquid storage, but showed contrasting effects after cryopreservation. Finally, we investigated the potential involvement of several signaling pathways in the initiation of the chicken AR, and the results showed that the AR could be mediated by activation of the PKA, PI3K and ERK MAPK pathways.
APA, Harvard, Vancouver, ISO, and other styles
26

Quirin, Kevin. "Lawvere-Tierney sheafification in Homotopy Type Theory." Thesis, Nantes, Ecole des Mines, 2016. http://www.theses.fr/2016EMNA0298/document.

Full text
Abstract:
Le but principal de cette thèse est de définir une extension de la traduction par double négation de Gödel à tous les types tronqués, dans le contexte de la théorie des types homotopique. Ce but utilisera des théories déjà existantes, comme la théorie des faisceaux de Lawvere-Tierney, que nous adapterons à la théorie des types homotopique. En particulier, on définira le foncteur de faisceautisation de Lawvere-Tierney, qui est le principal théorème présenté dans cette thèse. Pour le définir, nous aurons besoin de concepts soit déjà définis en théorie des types, soit non existants pour l’instant. En particulier, on définira une théorie des colimites sur des graphes, ainsi que leur version tronquée, et une notion de modalités tronquées basée sur la définition existante de modalité. Presque tous les résultats présentés dans cette thèse sont formalisés avec l’assistant de preuve Coq, muni de la librairie [HoTT/Coq]
The main goal of this thesis is to define an extension of Gödel's double-negation translation to all truncated types, in the setting of homotopy type theory. This goal will use some existing theories, like the Lawvere-Tierney sheaf theory in toposes, which we will adapt to the setting of homotopy type theory. In particular, we will define a Lawvere-Tierney sheafification functor, which is the main theorem presented in this thesis. To define it, we will need some concepts, either already defined in type theory or not yet existing. In particular, we will define a theory of colimits over graphs as well as their truncated version, and the notion of truncated modalities, based on the existing definition of modalities. Almost all the results presented in this thesis are formalized with the proof assistant Coq together with the library [HoTT/Coq]
APA, Harvard, Vancouver, ISO, and other styles
27

Letouzey, Pierre. "Programmation fonctionnelle certifiée : L'extraction de programmes dans l'assistant Coq." Phd thesis, Université Paris Sud - Paris XI, 2004. http://tel.archives-ouvertes.fr/tel-00150912.

Full text
Abstract:
Nous nous intéressons ici à la génération de programmes certifiés
corrects par construction. Ces programmes sont obtenus en
extrayant l'information pertinente de preuves constructives réalisées
dans l'assistant de preuves Coq.

Une telle traduction, ou "extraction", des preuves constructives
en programmes fonctionnels n'est pas nouvelle, elle correspond
à un isomorphisme bien connu sous le nom de Curry-Howard. Et
l'assistant Coq comporte depuis longtemps un tel outil d'extraction.
Mais l'outil précédent présentait d'importantes limitations. Certaines
preuves Coq étaient ainsi hors de son champ d'application, alors que
d'autres engendraient des programmes incorrects.

Afin de résoudre ces limitations, nous avons effectué une refonte
complète de l'extraction dans Coq, tant du point de vue de la théorie
que de l'implantation. Au niveau théorique, cette refonte a entraîné
la réalisation de nouvelles preuves de correction de ce mécanisme
d'extraction, preuves à la fois complexes et originales. Concernant
l'implantation, nous nous sommes efforcés d'engendrer du code
extrait efficace et réaliste, pouvant en particulier être intégré dans des
développements logiciels de plus grande échelle, par le biais de
modules et d'interfaces.

Enfin, nous présentons également plusieurs études de cas illustrant
les possibilités de notre nouvelle extraction. Nous décrivons ainsi la
certification d'une bibliothèque modulaire d'ensembles finis, et
l'obtention de programmes d'arithmétique réelle exacte à partir d'une
formalisation d'analyse réelle constructive. Même si des progrès
restent encore à obtenir, surtout dans ce dernier cas, ces exemples
mettent en évidence le chemin déjà parcouru.
APA, Harvard, Vancouver, ISO, and other styles
28

Lu, Weiyun. "Formally Verified Code Obfuscation in the Coq Proof Assistant." Thesis, Université d'Ottawa / University of Ottawa, 2019. http://hdl.handle.net/10393/39994.

Full text
Abstract:
Code obfuscation is a software security technique in which transformations are applied to source and/or machine code to make them more difficult to analyze and understand, in order to deter reverse-engineering and tampering. However, in many commercial tools, such as Irdeto's Cloakware product, it is not clear why the end user should believe that the programs that come out the other end are still "the same program"! In this thesis, we apply techniques of formal specification and verification, by using the Coq Proof Assistant and IMP (a simple imperative language within it), to formulate what it means for a program's semantics to be preserved by an obfuscating transformation, and give formal machine-checked proofs that these properties hold. We describe our work on opaque predicate and control flow flattening transformations. Along the way, we also employ Hoare logic as an alternative to state equivalence, as well as augment the IMP language with Switch statements. We also define a lower-level flowchart language to wrap around IMP for modelling certain flattening transformations, treating blocks of code as objects in their own right. We then discuss related work in the literature on formal verification of data obfuscation and layout obfuscation transformations in IMP, and conclude by discussing CompCert, a formally verified C compiler in Coq, along with work that has been done on obfuscation there, and muse on the possibility of implementing formal methods in the next generation of real-world obfuscation tools.
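As a toy illustration of the opaque-predicate idea mentioned in this abstract (and only that; the thesis works in its own IMP development), a predicate such as x*x + 1 <> 0 over the natural numbers is always true, so a branch guarded by it always takes the same arm, while an analyzer that does not know this must consider both. In Coq the invariant is a one-line lemma:

```coq
Require Import Lia.

(* Illustrative opaque predicate: always true over nat, so a guard
   such as [if x*x + 1 =? 0 then junk else c] always runs c, and the
   transformed program is semantically equivalent to c alone. *)
Lemma opaque_true : forall x : nat, x * x + 1 <> 0.
Proof. intros x; lia. Qed.
```

A machine-checked semantics-preservation proof for the transformation then reduces the guarded program to `c` using exactly this kind of lemma.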
APA, Harvard, Vancouver, ISO, and other styles
29

Zucchini, Rébecca. "Bibliothèque certifiée en Coq pour la provenance des données." Electronic Thesis or Diss., université Paris-Saclay, 2023. http://www.theses.fr/2023UPASG040.

Full text
Abstract:
La présente thèse se situe à l'intersection des méthodes formelles et des bases de données, et s'intéresse à la formalisation de la provenance des données à l'aide de l'assistant de preuve Coq. L'étude de la provenance des données, qui permet de retracer leur origine et leur historique, est essentielle pour assurer la qualité des données, éviter les interprétations erronées et favoriser la transparence dans le traitement des données. La thèse propose ainsi la formalisation de deux types de provenance des données couramment utilisés, la How-provenance et la Where-provenance. Cette formalisation a permis de comparer leur sémantique, mettant en évidence leurs différences et leurs complémentarités. En outre, elle a conduit à la proposition d'une structure algébrique et d'une sémantique unificatrice pour ces deux types de provenance
This thesis is about the formalization of data provenance using the Coq proof assistant, at the intersection of formal methods and database communities. It explores the importance of data provenance, which tracks the origin and history of data, in addressing issues such as poor data quality, incorrect interpretations, and lack of transparency in data processing. The thesis proposes formalizations of two commonly used types of data provenance, How-provenance and Where-provenance. Formalizing both types of provenance allowed us to compare their semantics, highlighting their differences and complementarities. Additionally, the formalization of these two types of provenance led to the proposal of an algebraic structure that provides a unifying semantics
APA, Harvard, Vancouver, ISO, and other styles
30

Sall, Boubacar Demba. "Programmation impérative par raffinements avec l'assistant de preuve Coq." Electronic Thesis or Diss., Sorbonne université, 2020. http://www.theses.fr/2020SORUS181.

Full text
Abstract:
Cette thèse s’intéresse à la programmation certifiée correcte dans le cadre formel fourni par l’assistant de preuve Coq, et conduite par étapes de raffinements, avec l'objectif d’aboutir à un résultat correct par construction. Le langage de programmation considéré est un langage impératif simple, avec affectations, alternatives, séquences, et boucles. La sémantique associée à ce langage est une sémantique relationnelle exprimée dans un cadre prédicatif plus adapté à un plongement dans la théorie des types, plutôt que dans le calcul des relations. Nous étudions la relation entre d’une part la sémantique prédicative et relationnelle que nous avons choisie, et d’autre part une approche plus classique dans le style de la logique de Hoare. En particulier, nous montrons que les deux approches ont en théorie la même puissance. La démarche que nous étudions consiste à certifier, avec l’aide d’un assistant de preuve, les raffinements successifs permettant de passer de la spécification au programme. Nous nous intéressons donc aussi aux techniques de preuve permettant en pratique d’établir la validité des raffinements. Plus précisément, nous utilisons un calcul de la plus faible pré-spécification jouant ici le rôle du calcul de la plus faible pré-condition dans les approches classiques. Afin que l’articulation des étapes de raffinement reste aussi proche que possible de l’activité de programmation, nous formalisons un langage de développement qui permet de décrire l’arborescence des étapes de raffinement, ainsi qu’une logique permettant de raisonner sur ces développements, et de garantir leur correction
This thesis investigates certified programming by stepwise refinement in the framework of the Coq proof assistant. This allows the construction of programs that are correct by construction. The programming language considered is a simple imperative language with assignment, selection, sequence, and iteration. The semantics of this language is formalized in a relational and predicative setting, and is shown to be equivalent to an axiomatic semantics in the style of a Hoare logic. The stepwise refinement approach to programming requires that refinement steps from the specification to the program be proved correct. To do so, we use a calculus of weakest pre-specifications, which is a generalisation of the calculus of weakest pre-conditions. Finally, to capture the whole refinement history of a program development, we formalize a design language and a logic for reasoning about program designs in order to establish that all refinement steps are indeed correct. The approach developed during this thesis is entirely mechanised using the Coq proof assistant.
APA, Harvard, Vancouver, ISO, and other styles
31

Carvalho, Leila Moreira de. "Características bioquímicas e químicas em filés de peito de frango com anomalia pfn (pale, firm, non-exudative) e pse (pale, soft, exudative)." Universidade Federal da Paraíba, 2016. http://tede.biblioteca.ufpb.br:8080/handle/tede/9434.

Full text
Abstract:
Conselho Nacional de Pesquisa e Desenvolvimento Científico e Tecnológico - CNPq
In the past decades, intense genetic selection aiming at increasing the productive efficiency of chickens has resulted in changes in meat quality. Broiler breast meat has been classified into three quality groups: Normal, PSE (Pale, Soft, Exudative) and DFD (Dark, Firm, Dry); however, recently, owing to meat with pale colour but normal firmness, another quality category was suggested, PFN (Pale, Firm, Non-exudative). Considering the absence of studies on the incidence of colour anomalies in broiler breast in the Northeast of Brazil, this work aimed to verify the incidence of these anomalies in the region and to carry out the chemical and biochemical characterization of these meats. Samples were collected in a commercial slaughterhouse and evaluated for colour (L*, a*, b*), pH, water-holding capacity, cooking loss, shear force, myofibrillar fragmentation index, protein denaturation, mineral and fatty acid content, TBARS values, warmed-over flavour and total carbonyls. The broiler breast fillets (n = 838) were classified into three categories: Normal (44 < L* < 53, pH > 5.8), PSE (L* ≥ 53, pH < 5.8) and PFN (L* ≥ 53, pH > 5.8). The incidence of PFN breast meat was 19.8%, almost double the incidence of PSE (11.1%); Normal breast meat presented the largest percentage (69.1%). The PFN breasts presented a water-holding capacity (66.9%) similar to the Normal group and 4.2% higher than that of the PSE meat. The PFN breasts presented lower concentrations of calcium (373.02 mg/kg) and arachidonic acid (84.61 mg/100g) and a lower MFI (57.4) than the PSE meat, together with levels of lipid oxidation (0.23 mg MDA/kg) similar to the Normal and PSE breasts. The PFN and Normal breasts presented a larger concentration of total carbonyls, 8.2 and 7.4 nM/mg of protein, respectively. These results confirm the existence of the PFN anomaly in broiler breast meat, which presents functional properties similar to the Normal group.
In addition, the results confirm that the PSE syndrome in broilers involves a defect in calcium regulation that causes a fall in meat pH and the consequent compromise of its functional properties. Moreover, its less firm texture results from greater proteolytic activity, which seems to be related not only to the activation of calpains by excess calcium ions, but also to the lower level of protein oxidation.
Nas últimas décadas, a intensa seleção genética visando o aumento da eficiência produtiva de frangos resultou em modificações na qualidade da carne. Os peitos de frango têm sido classificados em três grupos de qualidade: Normal, PSE (Pale, Soft, Exudative) e DFD (Dark, Firm, Dry); porém, recentemente, devido à aparência com cor pálida e firmeza normal, outra categoria de qualidade de carne foi sugerida, a PFN (Pale, Firm, Non-exudative). Considerando-se a ausência de estudos sobre a incidência de anomalias de cor em peitos de frango no Nordeste do Brasil, objetivou-se, neste trabalho, verificar a incidência dessas anomalias na região e realizar a caracterização química e bioquímica dessas carnes. Para isso, foram coletadas amostras em um abatedouro comercial e avaliados os parâmetros de cor (L*, a*, b*), pH, capacidade de retenção de água, perda de peso por cozimento, força de cisalhamento, índice de fragmentação miofibrilar, desnaturação proteica, teor de minerais e de ácidos graxos, número de TBARS, aroma requentado e carbonilas totais. Os filés de peito de frango (n=838) foram classificados em três categorias: Normal (44<L*<53; pH>5,8), PSE (L*≥53; pH<5,8) e PFN (L*≥53; pH>5,8). A incidência de peito de frango PFN foi de 19,8%, quase o dobro da incidência de PSE (11,1%); peitos de frango Normal apresentaram-se em maior percentual (69,1%). Os peitos PFN apresentaram CRA (66,9%) similar à do grupo Normal e 4,2% maior que a da carne PSE. Os peitos PFN apresentaram concentrações de cálcio (373,02 mg/kg), ácido araquidônico (84,61 mg/100g) e IFM (57,4) menores em relação à carne PSE, observando-se também níveis de oxidação lipídica (0,23 mg MDA/kg) similares aos dos peitos Normal e PSE. Os peitos PFN e Normal apresentaram maior concentração de carbonilas totais, 8,2 e 7,4 nM/mg de proteínas, respectivamente. Os resultados relatados confirmam a existência da anomalia PFN em peitos de frango, os quais apresentam propriedades funcionais similares às do grupo Normal.
Somado a isso, os resultados relatados confirmam que a anomalia PSE em frangos apresenta defeito na regulação de cálcio, que acarreta a queda do pH da carne e o consequente comprometimento de suas propriedades funcionais; além disso, sua textura menos firme decorre da maior atividade proteolítica, que parece estar relacionada não apenas à ativação das calpaínas pelo excesso de íons de cálcio, mas também ao menor nível de oxidação proteica.
APA, Harvard, Vancouver, ISO, and other styles
32

Oda, Patrícia. "Transações com partes relacionadas, governança corporativa e desempenho: um estudo com dados em painel." Universidade de São Paulo, 2012. http://www.teses.usp.br/teses/disponiveis/12/12136/tde-02052012-211106/.

Full text
Abstract:
A pesquisa trata da relação entre as transações com partes relacionadas (RPTs) e o desempenho nas companhias do Novo Mercado. Teve como objetivo identificar se esta relação pode ser afetada pelos mecanismos de governança corporativa, especificamente os de supervisão e monitoramento por elas adotados voluntariamente. Foram consideradas as hipóteses dicotômicas apresentadas por Gordon, Henry e Palia (2004), denominadas de "conflitos de interesse" e "transações eficientes". Na tentativa de mensurar estas relações, adotou-se o modelo de análise de dados em painel, por permitir a incorporação de informações temporais e reduzir o problema do viés de variáveis omitidas. Evidências sugerem que há relação entre as transações com partes relacionadas operacionais e o desempenho das companhias estudadas. No entanto, os resultados a respeito do efeito de moderação dos mecanismos de supervisão e monitoramento na utilização de tais contratos foram inconclusivos.
The study deals with the relationship between related party transactions (RPTs) and firm performance in the Brazilian "Novo Mercado", and its goal is to identify whether this relationship can be affected by corporate governance mechanisms. The audit committee was used to measure corporate governance mechanisms. The two assumptions made by Gordon, Henry and Palia (2004), called "conflicts of interest" and "efficient transactions", were considered in this study. In an attempt to measure these relationships, the panel data analysis model was adopted to reduce the problem of omitted variable bias. The empirical results provide evidence that there is a relationship between related party transactions and firm performance. However, the results regarding the moderating effect of the supervision and monitoring mechanisms on the use of such contracts have been inconclusive.
APA, Harvard, Vancouver, ISO, and other styles
33

Gaspar, Nuno. "Support mécanisé pour la spécification formelle, la vérification et le déploiement d'applications à base de composants." Thesis, Nice, 2014. http://www.theses.fr/2014NICE4127/document.

Full text
Abstract:
Cette thèse appartient au domaine des méthodes formelles. Nous nous concentrons sur leur application à une méthodologie spécifique pour le développement de logiciels: l'ingénierie à base de composants. Le Grid Component Model (GCM) suit cette méthodologie en fournissant tous les moyens pour définir, composer, et dynamiquement reconfigurer les applications distribuées à base de composants. Dans cette thèse, nous abordons la spécification formelle, la vérification et le déploiement d'applications GCM reconfigurables et distribuées. Notre première contribution est un cas d'étude industriel sur la spécification comportementale et la vérification d'une application distribuée et reconfigurable: L'HyperManager. Notre deuxième contribution est une plate-forme, élaborée avec l'assistant de preuve Coq, pour le raisonnement sur les architectures logicielles: Mefresa. Cela comprend la mécanisation de la spécification du GCM, et les moyens pour raisonner sur les architectures reconfigurables GCM. En outre, nous adressons les aspects comportementaux en formalisant une sémantique basée sur les traces d'exécution de systèmes de transitions synchronisées. Enfin, notre troisième contribution est un nouveau langage de description d'architecture (ADL): Painless. En outre, nous discutons son intégration avec ProActive, un intergiciel Java pour la programmation concurrente et distribuée, et l'implantation de référence du GCM
This thesis belongs to the domain of formal methods. We focus on their application to a specific methodology for the development of software: component-based engineering. The Grid Component Model (GCM) endorses this approach by providing all the means to define, compose and dynamically reconfigure component-based distributed applications. In this thesis we address the formal specification, verification and deployment of distributed and reconfigurable GCM applications. Our first contribution is an industrial case study on the behavioural specification and verification of a reconfigurable distributed application: The HyperManager. Our second contribution is a framework, developed with the Coq proof assistant, for reasoning about software architectures: Mefresa. This encompasses the mechanization of the GCM specification, and the means to reason about reconfigurable GCM architectures. Further, we address behavioural concerns by formalizing a semantics based on execution traces of synchronized transition systems. Overall, this provides the first steps towards a complete specification and verification platform addressing both architectural and behavioural properties. Finally, our third contribution is a new Architecture Description Language (ADL), named Painless. Further, we discuss its proof-of-concept integration with ProActive, a Java middleware for concurrent and distributed programming, and the de facto reference implementation of the GCM
APA, Harvard, Vancouver, ISO, and other styles
34

Boutillier, Pierre. "De nouveaux outils pour calculer avec des inductifs en Coq." Phd thesis, Université Paris-Diderot - Paris VII, 2014. http://tel.archives-ouvertes.fr/tel-01054723.

Full text
Abstract:
En ajoutant au lambda-calcul des structures de données algébriques, des types dépendants et un système de modules, on obtient un langage de programmation avec peu de primitives mais une très grande expressivité. L'assistant de preuve Coq s'appuie sur un tel langage (le CIC) à la sémantique particulièrement claire. L'utilisateur n'écrit pas directement de programme en CIC car cela est ardu et fastidieux. Coq propose un environnement de programmation qui facilite la tâche en permettant d'écrire des programmes incrémentalement grâce à des constructions de haut niveau plus concises. Typiquement, les types dépendants imposent des contraintes fortes sur les données. Une analyse de cas peut n'avoir à traiter qu'un sous-ensemble des constructeurs d'un type algébrique, les autres étant impossibles par typage. Le type attendu dans chacun des cas varie en fonction du constructeur considéré. L'impossibilité de cas et les transformations de type doivent être explicitement écrites dans les termes de Coq. Pourtant, ce traitement est mécanisable et cette thèse décrit un algorithme pour réaliser cette automatisation. Par ailleurs, il est nécessaire à l'interaction avec l'utilisateur de calculer des programmes du CIC sans faire exploser la taille syntaxique de la forme réduite. Cette thèse présente une machine abstraite conçue dans ce but. Enfin, les points fixes permettent une manipulation aisée des structures de données récursives. En contrepartie, il faut s'assurer que leur exécution termine systématiquement. Cette question sensible fait l'objet du dernier chapitre de cette thèse.
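A standard miniature example of the dependent case analysis this abstract describes (illustrative only, not taken from the thesis) is taking the head of a vector whose type guarantees non-emptiness: the nil branch is impossible by typing, and Coq's elaboration synthesizes the dependent return clause that discharges it, which is exactly the kind of bookkeeping the thesis automates.

```coq
Require Import Vector.

(* The type [Vector.t A (S n)] rules out the nil case: the match needs
   no nil branch, and the underlying CIC term carries the dependent
   return clause that Coq infers automatically. *)
Definition vhead {A : Type} {n : nat} (v : Vector.t A (S n)) : A :=
  match v with
  | Vector.cons _ x _ _ => x
  end.
```

Written by hand in raw CIC, the same definition would need an explicit return annotation and a vacuous nil branch eliminated through the typing of the indices.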
APA, Harvard, Vancouver, ISO, and other styles
35

Braibant, Thomas. "Algèbres de Kleene, réécriture modulo AC et circuits en coq." Phd thesis, Université de Grenoble, 2012. http://tel.archives-ouvertes.fr/tel-00683661.

Full text
Abstract:
Cette thèse décrit trois travaux de formalisation en Coq. Le premier chapitre s'intéresse à l'implémentation d'une procédure de décision efficace pour les algèbres de Kleene, pour lesquelles le modèle des langages réguliers est initial : il est possible de décider la théorie équationelle des algèbres de Kleene via la construction et la comparaison d'automates finis. Le second chapitre est consacré à la définition de tactiques pour la réécriture modulo associativité et commutativité en utilisant deux composants : une procédure de décision réflexive pour l'égalité modulo AC, ainsi qu'un greffon OCaml implémentant le filtrage modulo AC. Le dernier chapitre esquisse une formalisation des circuits digitaux via un plongement profond utilisant les types dépendants de Coq ; on s'intéresse ensuite à prouver la correction totale de circuits paramétriques.
APA, Harvard, Vancouver, ISO, and other styles
36

Barros, Flávio José Ferro. "Uma formalização da composicionalidade do cálculo lambda-ex em Coq." reponame:Repositório Institucional da UnB, 2010. http://repositorio.unb.br/handle/10482/6601.

Full text
Abstract:
Dissertação (mestrado)—Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Ciência da Computação, 2010.
Apresenta-se uma formalização das propriedades de composicionalidade do Cálculo lambda-ex em Coq. A abordagem utilizada baseia-se na lógica nominal de acordo com o trabalho desenvolvido por [3]. Mais especificamente estendemos a formalização do lambda-cálculo contida neste trabalho de forma a incluir a operação de substituição explícita do cálculo lambda-ex. Nessa abordagem, a alpha-equivalência coincide com a igualdade pré-construída de Coq, e os princípios de recursão e indução sobre classes de lambda-termos possuem tratamento específico. Escolhemos trabalhar com o cálculo lambda-ex por ser atualmente o único cálculo que satisfaz simultaneamente todas as propriedades desejáveis para um cálculo de substituições explícitas. Ele é uma extensão do lambda-x com uma regra de reescrita para composição de substituições dependentes e uma equação para comutação de substituições independentes. O cálculo lambda-ex usa um construtor unário para a substituição explicita, mas tem o mesmo poder de expressividade de cálculos com substituições simultâneas. _________________________________________________________________________________ ABSTRACT
We present a formalization of the compositionality properties of the λex-calculus in Coq. The approach is based on nominal logic as presented in the paper [3]. More precisely, we extended a formalization of the λ-calculus in such a way that it now includes the explicit substitution operation of the λex-calculus. In this approach, α-equivalence of λ-terms coincides with Coq's built-in equality, and the principles of recursion and induction over classes of λ-terms are treated in a specific way. We chose to work with the λex-calculus because it is currently the only calculus that simultaneously satisfies all the desirable properties for a calculus of explicit substitutions. It is an extension of the λx-calculus with a rewrite rule for the composition of dependent substitutions and one equation for the commutation of independent substitutions. The λex-calculus has a unary constructor for the explicit substitution operation, but has the same expressive power as calculi with simultaneous substitutions.
APA, Harvard, Vancouver, ISO, and other styles
37

Lescuyer, Stephane. "Formalizing and Implementing a Reflexive Tactic for Automated Deduction in Coq." Phd thesis, Université Paris Sud - Paris XI, 2011. http://tel.archives-ouvertes.fr/tel-00713668.

Full text
Abstract:
In this thesis, we propose new automation capabilities for the Coq proof assistant. We obtain this mechanization via an integration into Coq of decision procedures for propositional logic, equality reasoning and linear arithmetic, which make up the core of the Alt-Ergo SMT solver. This integration is achieved through the reflection technique, which consists in implementing and formally proving these algorithms in Coq in order to execute them directly in the proof assistant. Because the algorithms formalized in Coq are exactly those in use in Alt-Ergo's kernel, this work significantly increases our trust in the solver. In particular, it embeds an original algorithm for combining equality modulo theory reasoning, called CC(X) and inspired by the Shostak combination algorithm, whose justification is quite complex. Our Coq implementation is available in the form of tactics which allow one to automatically solve formulae combining propositional logic, equality and arithmetic. In order to make these tactics as efficient as possible, we have taken special care with performance in our implementation, in particular through the use of classical efficient data structures, which we provide as a separate library.
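The reflection technique summarized in this abstract can be shown at textbook scale (this tiny example is unrelated to the Alt-Ergo procedures the thesis formalizes): a statement is reduced, via a soundness lemma, to a boolean computation that Coq's kernel evaluates, so no hand-built proof term is needed.

```coq
Require Import PeanoNat.

(* Proof by reflection in miniature: the boolean checker [Nat.even]
   plays the role of the decision procedure, [Nat.even_spec] is its
   soundness lemma, and [reflexivity] makes the kernel compute. *)
Lemma ev1024 : Nat.Even 1024.
Proof.
  apply (proj1 (Nat.even_spec 1024)).
  reflexivity.  (* Nat.even 1024 evaluates to true *)
Qed.
```

The reflexive tactics of the thesis follow the same shape, with a formally verified SMT core in place of the toy evenness checker.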
APA, Harvard, Vancouver, ISO, and other styles
38

Keller, Chantal. "Question de confiance : communication sceptique entre Coq et des prouveurs externes." Phd thesis, Ecole Polytechnique X, 2013. http://pastel.archives-ouvertes.fr/pastel-00838322.

Full text
Abstract:
This thesis presents a cooperation between the Coq proof assistant and certain external provers, based on the use of proof traces. We study in particular two kinds of provers that can return certificates: on the one hand, the answers of SAT and SMT solvers can be checked in Coq, increasing both the confidence one can place in them and Coq's automation; on the other hand, theorems established in proof assistants based on Higher-Order Logic can be exported to Coq and re-checked, which makes it possible to build formal proofs mixing these two logical paradigms. This study resulted in two pieces of software: SMTCoq, a bi-directional cooperation between Coq and SAT/SMT solvers, and HOLLIGHTCOQ, a tool importing HOL Light theorems into Coq. The architecture of each of these two developments was designed to be modular and efficient, with a clear separation between three components: an encoding in Coq of the external tool's formalism, which is then carefully translated into Coq terms; a certified checker to establish the proofs; and a preprocessor written in OCaml translating the traces coming from different provers into the same certificate format. Thanks to this separation, a change in the trace format only affects the preprocessor, with no need to modify Coq code or proofs. Another essential component for efficiency and modularity is computational reflection, which uses Coq's computation capabilities to establish proofs that are both short and generic from the certificates.
APA, Harvard, Vancouver, ISO, and other styles
39

Athalye, Anish (Anish R. ). "CoqIOA : a formalization of IO automata in the Coq proof assistant." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/112831.

Full text
Abstract:
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2017.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 51-53).
Implementing distributed systems correctly is difficult. Designing correct distributed systems protocols is challenging because designs must account for concurrent operation and handle network and machine failures. Implementing these protocols is challenging as well: it is difficult to avoid subtle bugs in implementations of complex protocols. Formal verification is a promising approach to ensuring distributed systems are free of bugs, but verification is challenging and time-consuming. Unfortunately, current approaches to mechanically verifying distributed systems in proof assistants using deductive verification do not allow for modular reasoning, which could greatly reduce the effort required to implement verified distributed systems by enabling reuse of code and proofs. This thesis presents CoqIOA, a framework for reasoning about distributed systems in a compositional way. CoqIOA builds on the theory of input/output automata to support specification, proof, and composition of systems within the proof assistant. The framework's implementation of the theory of IO automata, including refinement, simulation relations, and composition, are all machine-checked in the Coq proof assistant. An evaluation of CoqIOA demonstrates that the framework enables compositional reasoning about distributed systems within the proof assistant.
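The composition of automata described in this abstract can be illustrated with a toy model: two automata synchronize on shared actions and are unaffected by actions outside their signatures. This is a generic, much-simplified sketch of IO-automaton composition (no input/output/internal action classification, no traces or refinement), not code from CoqIOA; all names are illustrative.

```python
class IOA:
    """A minimal IO-automaton sketch: a start state, a set of actions it
    participates in (its signature), and a partial transition function."""

    def __init__(self, start, signature, step):
        self.start = start          # initial state
        self.signature = signature  # actions this automaton participates in
        self.step = step            # (state, action) -> next state, or None if disabled


def compose(a, b):
    """Parallel composition: components synchronize on shared actions and
    are left unchanged by actions outside their signature (environment
    actions). The composite state is a pair of component states."""
    def step(state, action):
        sa, sb = state
        na = a.step(sa, action) if action in a.signature else sa
        nb = b.step(sb, action) if action in b.signature else sb
        if na is None or nb is None:
            return None  # a participating component does not enable the action
        return (na, nb)
    return IOA((a.start, b.start), a.signature | b.signature, step)
```

For example, composing a one-shot sender with a receiver that both participate in a `"send"` action makes the composite step on `"send"` move both components at once.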
by Anish Athalye.
M. Eng.
APA, Harvard, Vancouver, ISO, and other styles
40

Keller, Chantal. "A Matter of Trust : Skeptical Communication between Coq and External Provers." Palaiseau, Ecole polytechnique, 2013. http://pastel.archives-ouvertes.fr/docs/00/83/83/22/PDF/thesis-keller.pdf.

Full text
Abstract:
This thesis studies the cooperation between the Coq proof assistant and external provers through proof witnesses. We concentrate on two different kinds of provers that can return certificates: first, answers coming from SAT and SMT solvers can be checked in Coq to increase both the confidence in these solvers and Coq's automation; second, theorems established in interactive provers based on Higher-Order Logic can be exported to Coq and checked again, in order to offer the possibility to produce formal developments which mix these two different logical paradigms. This work resulted in two pieces of software: SMTCoq, a bi-directional cooperation between Coq and SAT/SMT solvers, and HOLLIGHTCOQ, a tool importing HOL Light theorems into Coq. For both tools, we took great care to define a modular and efficient architecture, based on three clearly separated ingredients: an embedding of the formalism of the external tool inside Coq, which is carefully translated into Coq terms; a certified checker to establish the proofs using the certificates; and an OCaml preprocessor to transform proof witnesses coming from different provers into a generic certificate. This division ensures that a change in the format of proof witnesses only affects the preprocessor, not any proved Coq code. Another fundamental component for efficiency and modularity is computational reflection, which exploits the computational power of Coq to establish generic and small proofs based on the certificates.
APA, Harvard, Vancouver, ISO, and other styles
41

Stark, Kathrin [Verfasser], and Gert [Akademischer Betreuer] Smolka. "Mechanising syntax with binders in Coq / Kathrin Stark ; Betreuer: Gert Smolka." Saarbrücken : Saarländische Universitäts- und Landesbibliothek, 2020. http://d-nb.info/1206178590/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Ioannidis, Eleftherios Ioannis. "Extracting and optimizing low-level bytecode from high-level verified Coq." Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/121675.

Full text
Abstract:
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2019
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 51-53).
This document is an MEng thesis presenting MCQC, a compiler for extracting verified systems programs to low-level assembly, with no runtime or garbage-collection requirements and an emphasis on performance. MCQC targets the Gallina functional language used in the Coq proof assistant. MCQC translates pure and recursive functions into C++17, while compiling monadic effectful functions to imperative C++ system calls. With a series of memory and performance optimizations, MCQC combines verifiability with memory and runtime performance. By handling both effectful and pure functions, MCQC can generate executable code directly from Gallina and link it with trusted code, reducing the effort of implementing and executing verified systems.
by Eleftherios Ioannidis.
M. Eng.
M.Eng. Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science
APA, Harvard, Vancouver, ISO, and other styles
43

Soubiran, Elie. "Modular development of theories and name-space management for the Coq proof assistant." Palaiseau, Ecole polytechnique, 2010. http://tel.archives-ouvertes.fr/docs/00/67/92/01/PDF/these.pdf.

Full text
Abstract:
Proof assistants offer a formal framework for formalizing and mechanically checking mathematical knowledge. Moreover, due to the numerous applications that follow from formal methods, the scientific production being formalized and verified by such tools is constantly growing. In that context, the organization and classification of this knowledge must not be neglected. Coq is a proof assistant well-suited for program certification and mathematical formalization, and for seven years now it has featured a module system that helps users in their development processes. Modules provide a way to represent theories and offer the namespace management that is crucial for large developments. In this dissertation, we advance the module system of Coq by putting the emphasis on the latter two aspects. We propose to unify module implementations and module types into a single notion of structure, and to split our module system in two parts. We have, on one hand, a namespace system that is able to define extensible naming scopes and to deal with renaming, and on the other hand a structure system that describes how to combine and form structures. We define a new merge operator that, given two structures, builds the resulting structure by unifying components of the former two. In that dual system, a module is the association of a sub-namespace and a pair of structures; it acts as a concrete declared theory. Furthermore, we adopt an applicative semantics for higher-order functors that allows a precise propagation of information. We show that this module system is a conservative extension of the underlying base language of Coq and we present the ongoing implementation.
APA, Harvard, Vancouver, ISO, and other styles
44

Pesce, Nicola <1997&gt. "The adoption of technological solutions in response to the Sars-CoV-2 Pandemic: a multi-firm case study analysis on the Food&Beverage industry." Master's Degree Thesis, Università Ca' Foscari Venezia, 2021. http://hdl.handle.net/10579/19847.

Full text
Abstract:
Sars-CoV-2 is a coronavirus that originated in China in late 2019 and caused unprecedented disruption in almost every aspect of human life. The world's economy was not spared, as the combined action of direct effects (e.g. workers catching the virus and being forced home) and indirect effects (e.g. closures and lockdowns imposed by governments to address the virus) caused millions of firms to declare bankruptcy and many more to struggle to stay in business. Although a small minority of companies managed to thrive during the pandemic and increase their revenues, the aforementioned crisis appears to be the rule for most firms. To remain in the market, many firms were forced to change their way of carrying out business and to adopt a variety of strategies whose effectiveness was not certain at all. Among these strategies, those regarding digital transformation and the adoption of new technologies seemed to be particularly effective, especially because they allowed employees to continue working remotely and customers to shift towards digital channels. The aim of this paper is to test these assumptions with regard to the Food and Beverage industry, both by analysing the existing literature and by interviewing firms directly. The Food and Beverage industry is one of the most important industries, both in terms of its share of global GDP and for the very survival of human society. Furthermore, while the effects of the virus are easily identifiable in some industries, the effects on the F&B industry are mixed and harder to identify. This paper is organized as follows. The first part describes the Food and Beverage industry and its importance in the global economy. The second part analyses the effects of the novel coronavirus both on the economy in general and on the F&B industry. The third part analyses the existing literature regarding digital transformation and its effects applied to the crisis originated by the Sars-CoV-2 pandemic.
The fourth and last part contains the results of the interviews conducted with leading firms of the Food and Beverage industry, with the aim of confirming the previous parts and inferring a possible way for firms to mitigate the negative effects of the virus.
APA, Harvard, Vancouver, ISO, and other styles
45

Bodin, Martin. "Certified semantics and analysis of JavaScript." Thesis, Rennes 1, 2016. http://www.theses.fr/2016REN1S087/document.

Full text
Abstract:
JavaScript is now a very widely used programming language, including in domains where security may be an important issue. It thus becomes important to be able to control the quality of software written in JavaScript. This thesis explores a formal proof approach, which aims at giving a mathematical proof that a given program behaves as expected. To build this proof, we use proof assistants such as Coq, a trusted program enabling us to check formal proofs. To state that a JavaScript program behaves as expected, we first need a semantics of the JavaScript language. This thesis is thus part of the JSCert project, whose aim is to produce a formal semantics for JavaScript. Because of the size of JavaScript's semantics, it is crucial to know how it can be trusted: a typo could compromise the whole semantics. To trust JSCert, we relied on two sources of trust. On one hand, JSCert has been designed to be as similar as possible to the official JavaScript specification, the ECMAScript standard: they use the same data structures, and it is possible to relate each derivation rule in JSCert to a line of ECMAScript. On the other hand, we defined an interpreter named JSRef and proved it correct with respect to JSCert. We have also been able to run JSRef on JavaScript test suites. The JSCert semantics is not the first formal semantics of JavaScript, but it is the first to propose two distinct ways to relate the formal semantics to the JavaScript language: by having a semantics close to the official specification, and by testing this semantics against other interpreters. Instead of independently proving that each JavaScript program behaves as expected, we chose to analyse programs using abstract interpretation. It consists of interpreting the semantics of a programming language with abstract domains. For instance, the concrete value 1 can be replaced by the abstract value +.
Abstract interpretation is split into two steps: first, an abstract semantics is built and proven correct with respect to its concrete semantics; then, analysers are built from this abstract semantics. We only focus on the first step in this thesis. The JSCert semantics is huge: more than eight hundred derivation rules. Building an abstract semantics using traditional techniques does not scale to such a size. We thus designed a new way to build abstract semantics from concrete semantics. Our approach is based on a careful analysis of the structure of derivation rules. It aims at minimising the proof effort needed to build an abstract semantics. We applied our method to several languages. With the goal of applying our approach to JavaScript, we built a domain based on separation logic. This logic requires several adaptations to apply in the context of abstract interpretation. This thesis studies precisely these interactions and introduces a new approach to solve them in our abstract interpretation framework. Our domains, although very simple compared to the memory model of JavaScript, seem to enable the proof of existing analysers. This thesis thus makes three main contributions: a trusted formal semantics for JavaScript, a generic framework to build abstract semantics, and a non-trivial domain for this framework.
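The sign abstraction mentioned in the abstract (the concrete value 1 replaced by the abstract value +) can be made concrete with a toy abstract evaluator over a sign domain. This is a generic illustration of abstract interpretation, not code from the thesis; the expression encoding and function names are illustrative.

```python
# Minimal sign-domain abstract interpreter for arithmetic expressions.
# Abstract values: "+", "-", "0", and "T" (top: unknown sign).

def alpha(n):
    """Abstraction: map a concrete integer to its sign."""
    return "0" if n == 0 else ("+" if n > 0 else "-")

def abs_add(a, b):
    """Abstract addition on signs; a sound over-approximation of +."""
    if a == "0": return b
    if b == "0": return a
    if a == b: return a      # (+) + (+) = +, (-) + (-) = -
    return "T"               # (+) + (-) could have any sign

def abs_mul(a, b):
    """Abstract multiplication on signs (rule of signs)."""
    if a == "0" or b == "0": return "0"
    if "T" in (a, b): return "T"
    return "+" if a == b else "-"

def aeval(expr):
    """Evaluate an expression tree over the sign domain.
    Expressions: ('lit', n) | ('add', e1, e2) | ('mul', e1, e2)."""
    tag = expr[0]
    if tag == "lit":
        return alpha(expr[1])
    e1, e2 = aeval(expr[1]), aeval(expr[2])
    return abs_add(e1, e2) if tag == "add" else abs_mul(e1, e2)
```

For instance, `aeval(('mul', ('lit', 3), ('lit', -2)))` returns `"-"` without ever computing the concrete value -6, which is the essence of reasoning with abstract domains.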
APA, Harvard, Vancouver, ISO, and other styles
46

Zakowski, Yannick. "Verification of a Concurrent Garbage Collector." Thesis, Rennes, École normale supérieure, 2017. http://www.theses.fr/2017ENSR0010/document.

Full text
Abstract:
Modern compilers are complex programs, performing several heuristic-based optimisations. As such, and despite extensive testing, they may contain bugs leading to the introduction of new behaviours in the compiled program. To address this issue, we are nowadays able to prove correct, in proof assistants such as Coq, optimising compilers for languages such as C or ML. To date, a similar result for high-level languages such as Java nonetheless remains out of reach. Such languages indeed possess two essential characteristics: concurrency and a particularly complex runtime. This thesis aims at reducing the gap toward the implementation of such a verified compiler. To do so, we focus more specifically on a state-of-the-art concurrent garbage collector. This component of the runtime takes care of automatically reclaiming memory during the execution, removing this burden from the developer. In order to keep the induced overhead as low as possible, the garbage collector needs to be extremely efficient. More specifically, the algorithm considered is said to be "on the fly": by relying on fine-grained concurrency, the user threads are never caused to actively wait. The key property we establish is the functional correctness of this garbage collector, i.e. that a cell that a user thread may still access is never reclaimed. We present in a first phase the algorithm considered and its formalisation in Coq by implementing it in a dedicated intermediate representation. We introduce the proof system we used to conduct the proof, a variant based on the well-established Rely-Guarantee logic, and prove the algorithm correct. Reasoning simultaneously over both the garbage collection algorithm itself and the implementation of the concurrent data structures it uses would entail an undesired additional complexity. The proof is therefore conducted with respect to abstract operations: they take place instantaneously.
To justify this simplification, we introduce in a second phase a methodology inspired by the work of Vafeiadis and dedicated to the proof of observational refinement for so-called "linearisable" concurrent data structures. We provide the approach with solid semantic foundations, formalised in Coq. This methodology is instantiated to soundly implement the main data structure used in our garbage collector.
APA, Harvard, Vancouver, ISO, and other styles
47

Raharjo, Joko Purnomo. "The performance analysis of real estate construction firms (RECF) in Indonesia." Thesis, Queensland University of Technology, 2022. https://eprints.qut.edu.au/228029/1/Joko%20Purnomo_Raharjo_Thesis.pdf.

Full text
Abstract:
This thesis examines the efficiency, productivity, and survival of real estate construction firms in Indonesia. By applying a meta-frontier approach, the results show that the efficiency and productivity of the firms are relatively low. The gap between the industry frontier and the small, medium, and large firm groups is high. This suggests that the overall industry could obtain a remarkable efficiency and productivity improvement if firms could access the technologies used by more efficient firms. Designing policy and managerial interventions for each group would have greater impact. The survival analysis demonstrates that profit margin, income diversification, experience and structure are strongly related to firm survival.
APA, Harvard, Vancouver, ISO, and other styles
48

Coq, Guilhelm. "Utilisation d'approches probabilistes basées sur les critères entropiques pour la recherche d'information sur supports multimédia." Poitiers, 2008. http://theses.edel.univ-poitiers.fr/theses/2008/Coq-Guilhelm/2008-Coq-Guilhelm-These.pdf.

Full text
Abstract:
Model selection problems appear frequently in a wide array of applicative domains such as data compression and signal or image processing. One of the most used tools to solve these problems is a real quantity to be minimized, called an information criterion or penalized likelihood criterion. The principal purpose of this thesis is to justify the use of such a criterion in response to a given model selection problem, typically set in a signal processing context. The sought justification must have a strong mathematical background. To this end, we study the classical problem of the determination of the order of an autoregression. We also work on Gaussian regression, which allows the principal harmonics of a noisy signal to be extracted. In these two settings we give a criterion whose use is justified by the minimization of the cost resulting from the estimation. Multiple Markov chains model most discrete signals, such as letter sequences or the grey levels of an image. We consider the determination of the order of such a chain. Following on from this problem, we study the a priori distant one of estimating an unknown density by a histogram. For these two domains, we justify the use of a criterion by coding notions, to which we apply a simple form of the "Minimum Description Length" principle. Throughout these application domains, we present alternative methods of using information criteria. These methods, called comparative, are less complex to use than the usual methods while still allowing a precise description of the model.
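The penalized-criterion idea in this abstract can be sketched on its histogram example: score each candidate bin count by a negative log-likelihood plus a penalty that grows with the number of bins, and keep the minimizer. The BIC-style penalty and helper names below are a generic stand-in, not the thesis's exact criterion, and the code assumes the data have a positive range.

```python
import math

def hist_criterion(data, bins):
    """BIC-style penalized criterion for a histogram with `bins` equal-width
    cells on [min(data), max(data)]: negative log-likelihood plus
    (bins - 1)/2 * log(n). A generic penalized-likelihood score in the
    spirit of the criteria discussed in the abstract."""
    n = len(data)
    lo, hi = min(data), max(data)
    width = (hi - lo) / bins          # assumes hi > lo
    counts = [0] * bins
    for v in data:
        k = min(int((v - lo) / width), bins - 1)  # clamp the maximum point
        counts[k] += 1
    # The histogram density on cell k is counts[k] / (n * width).
    loglik = sum(c * math.log(c / (n * width)) for c in counts if c > 0)
    return -loglik + 0.5 * (bins - 1) * math.log(n)

def best_bins(data, max_bins=20):
    """Pick the bin count minimizing the penalized criterion."""
    return min(range(1, max_bins + 1), key=lambda b: hist_criterion(data, b))
```

On uniformly spread data the penalty dominates and a single bin wins, while strongly clustered data justify paying the penalty for more bins: this is exactly the fit-versus-complexity trade-off an information criterion arbitrates.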
APA, Harvard, Vancouver, ISO, and other styles
49

Vaughan, Jefferson Archer. "Biology of immature Culicoides variipennis ssp. australis (Coq.) (Diptera:Ceratopogonidae) at Saltville, VA." Diss., Virginia Polytechnic Institute and State University, 1985. http://hdl.handle.net/10919/51943.

Full text
Abstract:
The larval and pupal biology of a unique population of Culicoides variipennis inhabiting the brine ponds of Saltville, VA was studied. Developmental threshold temperatures (°C) and thermal constants (degree-days) were 9.6 °C and 387 degree-days for the larval stage, and 9.6 °C and 300 degree-days for the pupal stage, respectively. Accumulated heat units recorded in the field ranged from 366-376 degree-days between successive generations in the summer. Heat accumulations required for completion of immature development of C. variipennis were found to be much greater (831 degree-days) for the overwintering generation. During the summer, larval/pupal distribution within the littoral zone of a brine pond was confined to the surface centimeter of mud at or near the shoreline. Insects overwintered farther offshore, mostly as 3rd instars. In early March, most larvae had molted to 4th instars and migrated above the shoreline to pupate. Adult emergence occurred in April. Three summer generations were documented for 1983-1984 at Saltville. Life tables and survivorship curves were calculated for the overwintering generation and the first summer generations of 1983 and 1984. For the overwintering generation, there was a relatively constant mortality rate between successive age-classes (Type II survivorship curve). During the summer, there was relatively little mortality between successive larval age-classes, but a dramatic increase in mortality was evident at the pupal stage (Type I survivorship curve). Late instar larvae were found to migrate from the shoreline onto the exposed mudflats to pupate, thus becoming vulnerable to predation by ants and carabid beetles. The excellent survival rate of the larvae during the summer was attributed to habitat stability, the paucity of predators and parasites, and the abundant microfloral content (i.e. food) of the pond water. Intra-specific competition for food resources appeared to be somewhat alleviated by partitioning of those resources on a diurnal cycle.
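The thermal-constant model underlying this abstract can be sketched as a simple accumulation: development completes once the heat accumulated above the 9.6 °C developmental threshold reaches the stage's thermal constant (387 degree-days for larvae, 300 for pupae, per the abstract). The threshold and constants come from the abstract; the temperature series in the example is illustrative.

```python
def degree_days(daily_mean_temps, threshold=9.6):
    """Accumulated degree-days above a developmental threshold (°C):
    each day contributes max(0, mean temperature - threshold)."""
    return sum(max(0.0, t - threshold) for t in daily_mean_temps)

def days_to_complete(daily_mean_temps, thermal_constant, threshold=9.6):
    """Index (1-based) of the first day on which accumulated heat reaches
    the stage's thermal constant; None if development is not completed."""
    total = 0.0
    for day, t in enumerate(daily_mean_temps, start=1):
        total += max(0.0, t - threshold)
        if total >= thermal_constant:
            return day
    return None
```

For example, at a constant 22 °C each day contributes 12.4 degree-days, so the 387-degree-day larval stage completes on day 32, while temperatures at or below the 9.6 °C threshold accumulate no heat at all, which is consistent with the much larger degree-day total the abstract reports for the overwintering generation.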
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
50

Grégoire, Benjamin. "Compilation de termes de preuves : un (nouveau) mariage entre coq et OCaml." Paris 7, 2003. http://www.theses.fr/2003PA077216.

Full text
APA, Harvard, Vancouver, ISO, and other styles
We offer discounts on all premium plans for authors whose works are included in thematic literature selections. Contact us to get a unique promo code!

To the bibliography