Dissertations / Theses on the topic 'Gestion des données comptables'
Consult the top 50 dissertations / theses for your research on the topic 'Gestion des données comptables.'
Lejard, Christophe. "La titrisation : enjeux en termes de communication comptable et financière." Thesis, Montpellier 1, 2011. http://www.theses.fr/2011MON10065/document.
The financial crisis caused by subprime mortgage loans brought to public attention a complex financial engineering technique: securitization. Recourse to securitization grew exponentially over the last decade, and this kind of transaction is nowadays a preferred financing tool in the USA and Europe. This dissertation assesses the accounting and financial communication issues raised by the use of securitization in listed banking groups, taking into account the benefits the transaction brings to the manager. The research proceeds in two steps. The first determines and explains the impact of securitization announcements by banks on their share price. The second examines the consequences of the manager's choice to structure the transaction off-balance sheet on the following accounting items: return on assets, leverage, Basel ratio, and level of credit risk. The results bring to light that, despite a negative market perception at the announcement of the transaction, the use of securitization, particularly off-balance sheet, serves the manager's own interests.
Greusard, Olivier. "Anti-corruption laws and firms behavior : lessons from the FCPA enforcement activity." Thesis, Paris 1, 2019. http://www.theses.fr/2019PA01E052.
The impact of regulation on the behavior of firms is the subject of constant debate over how much public intervention in firms' decision processes is needed. In this thesis, I investigate the direct effect on targeted firms and the indirect effect on peer firms of law enforcement, using the cases enforced between 1978 and 2015 under the Foreign Corrupt Practices Act (FCPA), the U.S. anti-bribery law, as a framework. I first review the literature on earnings management, and more specifically on accrual-based models, to analyze the efficiency of accrual-based models in capturing changes in regulation. In a second paper, I investigate with my co-author the accrual quality of bribe-paying firms and their competitors and find a positive effect of law enforcement on the accrual quality of bribe-paying firms' competitors, but not of the bribe-paying firms themselves. Our results suggest a positive impact of anti-bribery law enforcement that incentivizes other firms to enhance their accounting information once they learn of a peer's bribing behavior, following the information-risk channel. In a third paper, I focus on the indirect effect of law enforcement on peer firms and investigate the real economic effects of anti-bribery enforcement on peers' level of investment. I find that peer firms decrease their investment once they learn of the opening of an FCPA investigation in their industry. More surprisingly, I find a weakening of this decrease for cases acknowledged after December 2004, when the prosecutor first used alternative resolution vehicles to conclude an FCPA case. These results suggest that, beyond the impact of law enforcement itself, the prosecution mode also affects the behavior of peer firms. In sum, this thesis shows that anti-bribery law enforcement can have a deterrent effect on peer firms, which tend to adapt their behavior in response to a regulatory stimulus.
Gayet, Amaury. "Méthode de valorisation comptable temps réel et big data : étude de cas appliquée à l'industrie papetière." Thesis, Paris 10, 2018. http://www.theses.fr/2018PA100001/document.
Context: IP Leanware is a growing start-up. Created in 2008, it has quadrupled its consolidated sales in four years and established two subsidiaries (Brazil and the United States). Its growth has since been in double digits (2015). It optimizes the performance of industrial companies with software (BrainCube) that identifies overperformance conditions. The thesis, carried out under a CIFRE agreement within the R&D department led by Sylvain Rubat du Mérac, sits at the interface of management control, production management and information systems. Aim: BrainCube manages massive descriptive data on its customers' process flows. Its analysis engine identifies overperformance situations and broadcasts them in real time through tactile interfaces. BrainCube couples two flows, informational and physical; the mission is to integrate the economic variable. A literature study shows that simultaneous real-time evaluation of physical, informational and financial flows, coupled with continuous improvement of production processes, has not yet been achieved. Result: A literature review examines the practices and methods of management control in order to propose a real-time method adapted to the specificities of BrainCube. The case study, based on engineering research, proposes a generic methodology for modeling the economic variable. Configurable generic decision models are proposed; they should facilitate the use of real-time information with high granularity. The contributions, limits and perspectives highlight the interest of this work for the company and for management science.
Perier, Stephane. "Gestion des résultats comptables et introduction en bourse." Grenoble 2, 1998. http://www.theses.fr/1998GRE21056.
Medina Marquez, Alejandro. "L'analyse des données évolutives." Paris 9, 1985. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=1985PA090022.
Full textTeller, Pierre. "Une représentation formelle des normes comptables internationales." Nice, 2007. http://www.theses.fr/2007NICE4061.
The subject of this thesis was to build a formal model designed to enhance the representation of financial information. The model allows more efficient treatment of this kind of information; in this particular field of data management, much work remains to be done to reach the level of formalism obtained in other domains. Accounting procedures are undergoing wide mutations, especially since Enron's bankruptcy. The financial scandals of the early 2000s highlighted the weaknesses of surveillance procedures, which is why regulators all over the world realized that financial information has to change to match economic reality. Two global trends emerge from the modifications of the accounting legal background: the differences between national standards are being reduced, and the requirements on financial reporting are growing, from both a quantitative and a qualitative point of view. Moreover, as the amount of information grows, the time allowed for its treatment shortens; the use of the most efficient data-manipulation techniques is therefore a major issue. We offer a model for financial information composed of a syntactic model and a semantic model, based on the new international accounting standards. The syntactic representation uses a formalism inspired by description logics. This formalism allows the use of non-monotonic logical operators, which proved very useful for representing some of the most typical features of accounting standards. The semantics of accounting standards is represented using an ontology, a tool well suited to our purpose that allows automatic knowledge-oriented reasoning, such as classification within a taxonomy. This ontology is used to fix the meaning of the concepts used in our model. The possibilities of the model are illustrated by practical case studies, which show that its availability may greatly improve tasks related to financial data manipulation.
El, Beyrouthy Marc. "Management vert : stratégies, nouvelles normes comptables et éthique marketing." Aix-Marseille 3, 2008. http://www.theses.fr/2008AIX32081.
This thesis deals with green management, which concerns food and health companies as much as companies that operate in conjunction with the environment. In this field, which has acquired great importance with all the problems of food, respect for nature and respect for human beings, the problems are endless, and the entire planet faces challenges unlike any since the beginning of the great civilizations. Starting with hunger, conflict, pollution and the depletion of natural resources, the problems appear almost insurmountable. Throughout this work we have tried to determine what a managerial approach can provide, examining three major fields of management: the general approach, concerning the laws, guidelines and provisions implemented at international and national levels; the financial approach, on how to account for the commitments of companies, whether related to food or to environmental issues, at the level of accounting and evaluation, which contributes most to the knowledge of companies' financial information systems; and the marketing approach, which requires here, probably more than elsewhere, addressing the pressures of the media and advertising organizations, to name no others, which find privileged ground for provoking rash and purely superficial reactions whereby ordinary consumers or even other companies are forced to adopt often expensive, when not harmful, measures.
Bessieux, Corinne. "Les déterminants culturels des choix comptables : le cas des éléments incorporels." Paris 9, 2002. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=2002PA090047.
Chevallier, André. "Système d'information comptable et multivision de l'entreprise : contribution à une pédagogie combinatoire des images comptables." Nice, 1989. http://www.theses.fr/1989NICE0017.
Full textMejri, Tarek. "Le rôle des manipulations comptables dans la valorisation de la firme." Thesis, Paris 13, 2013. http://www.theses.fr/2013PA131026.
Accounting indicators remain a privileged information source for the financial market. However, accounting rules give managers opportunities to exercise judgment in financial reporting, and despite the rationality of investors, it is not easy to distinguish between the different incentives behind accounting manipulations. Managers can use their knowledge of the business to improve the informativeness of financial statements; they may also have incentives to mislead the users of those statements by exercising discretion in financial reporting. This observation underlies our research question: given the importance of accounting manipulation practices, what is their role in firm valuation? We examine, in the American context, the role of discretionary accruals as measured by Dechow, Richardson and Tuna (2003) in stock returns, building on the methodology of association studies between the components of accounting results (cash flow, total accruals, discretionary accruals, non-discretionary accruals and discretionary earnings) and stock returns. In addition, we examine the nature of earnings management. Our sample consists of 480 firms over the period 2000-2011. The results show that discretionary accruals are valued by the financial market and that this valuation is explained by managers' informative motivations.
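As background, accrual-based studies of this kind typically regress total accruals on firm fundamentals and treat the residual as the discretionary component. A toy sketch of that residual-extraction step on simulated data (illustrative only; the regressors and coefficients are made up, and this is not the exact Dechow, Richardson and Tuna (2003) specification):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Simulated firm-year regressors (stand-ins for the usual scaled
# accrual-model variables such as revenue change and gross PPE)
X = np.column_stack([
    np.ones(n),                # intercept
    rng.normal(0.0, 1.0, n),   # stand-in for change in revenues
    rng.normal(0.0, 1.0, n),   # stand-in for gross PPE
])
beta_true = np.array([0.02, 0.5, -0.3])
total_accruals = X @ beta_true + rng.normal(0.0, 0.1, n)

# OLS fit of "normal" accruals; residuals proxy discretionary accruals
beta_hat, *_ = np.linalg.lstsq(X, total_accruals, rcond=None)
discretionary = total_accruals - X @ beta_hat

# With an intercept, OLS residuals average zero by construction
print(abs(discretionary.mean()) < 1e-8)  # -> True
```

In association studies like the one described above, these residuals would then be regressed, together with the other earnings components, on stock returns.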
Yang, Xi. "Essais sur la stabilité du secteur bancaire : analyses sur données comptables des banques américaines." Thesis, Paris 10, 2015. http://www.theses.fr/2015PA100169.
The 2007-2009 global financial crisis revealed the fragility of the modern banking sector and the flaws in bank regulation. In the wake of the crisis, an important number of reforms were carried out: enhancement of micro-prudential regulation, introduction of macro-prudential instruments and separation of activities. In this context, this thesis, using detailed information on the U.S. banking sector, tries to explain bank vulnerability by financial characteristics and organizational structure, and then analyzes the efficiency of some new regulatory instruments. Our findings are the following: 1) Banks adopting an aggressive business model during economic booms and banks funded massively with unstable liabilities are more likely to fail, while a healthy (well-capitalized and profitable) bank holding company is a source of strength for its bank subsidiaries; these findings support the introduction of the countercyclical capital buffer and of the liquidity requirements in the Basel III framework. 2) A high degree of diversification across different banking activities is associated with important risk-reduction benefits, whereas expansion into non-traditional activities seems to make banks more vulnerable; this indicates the necessity of structural reform for certain universal banks. 3) Leverage ratios are more efficient than the risk-weighted capital ratio in predicting failures of large banks, whereas the two types of capital ratios predict the failures of small banks equally well; these findings are in line with the reinforcement of regulation on systemically important banks.
Akhoune, Farhana. "Le comptable public à l'épreuve de la nouvelle gestion publique." Paris 1, 2007. http://www.theses.fr/2007PA010307.
Full textZehri, Fatma. "La place de la qualité de l'audit externe dans la gestion des résultats comptables : cas des entreprises tunisiennes." Montpellier 2, 2007. http://www.theses.fr/2007MON20001.
Earnings management has long carried an important weight in management-science research, within the accounting profession, and in the control mechanisms of securities exchanges. Debate on the exact effects of the financial disclosure process and on managerial behaviour remains topical. Works on this subject are numerous; they generally treat the determinants of accounting strategy, with multiple approaches leading to different results. This thesis contributes to such efforts by analyzing the case of Tunisia and by focusing on the role of audit quality in explaining the extent of earnings management. The simultaneous effects of certain corporate governance mechanisms and agency costs are included, as control variables, for a better explanation and comprehension of the reliability of financial statements.
Meloche, Hélène. "L'utilité des données comptables dans les décisions de sélection et d'évaluation de performance de portefeuille." Mémoire, Université de Sherbrooke, 1990. http://hdl.handle.net/11143/9049.
Le Béchec, Antony. "Gestion, analyse et intégration des données transcriptomiques." Rennes 1, 2007. http://www.theses.fr/2007REN1S051.
Aiming at a better understanding of diseases, transcriptomic approaches allow the analysis of several thousand genes in a single experiment. To date, international standardization initiatives have made large quantities of data generated with transcriptomic approaches available to the whole scientific community, and a large number of algorithms exist to process and analyze these data sets. However, the major remaining challenge is to provide biological interpretations of such large data sets; in particular, their integration with additional biological knowledge would certainly lead to an improved understanding of complex biological mechanisms. In my thesis work, I developed a novel and evolutive environment for the management and analysis of transcriptomic data. Micro@rray Integrated Application (M@IA) allows the management, processing and analysis of large-scale expression data sets. In addition, I elaborated a computational method to combine multiple data sources and represent differentially expressed gene networks as interaction graphs. Finally, I used a meta-analysis of gene expression data extracted from the literature to select and combine similar studies associated with the progression of liver cancer. In conclusion, this work provides a novel tool and original analytical methodologies, contributing to the emerging field of integrative biology and indispensable for a better understanding of complex pathophysiological processes.
Maniu, Silviu. "Gestion des données dans les réseaux sociaux." Thesis, Paris, ENST, 2012. http://www.theses.fr/2012ENST0053/document.
We address in this thesis some of the issues raised by the emergence of social applications on the Web, focusing on two important directions: efficient social search in online applications and the inference of signed social links from interactions between users in collaborative Web applications. We start by considering social search in tagging (or bookmarking) applications. This problem requires a significant departure from existing, socially agnostic techniques. In a network-aware context, one can (and should) exploit the social links, which can indicate how users relate to the seeker and how much weight their tagging actions should have in the result build-up. We propose an algorithm that has the potential to scale to current applications, and validate it via extensive experiments. As social search applications can be thought of as part of a wider class of context-aware applications, we consider context-aware query optimization based on views, focusing on two important sub-problems. First, handling the possible differences in context between the various views and an input query leads to view results having uncertain scores, i.e., score ranges valid for the new context. As a consequence, current top-k algorithms are no longer directly applicable and need to be adapted to handle such uncertainty in object scores. Second, adapted view-selection techniques are needed, which can leverage both the descriptions of queries and statistics over their results. Finally, we present an approach for inferring a signed network (a "web of trust") from user-generated content in Wikipedia. We investigate mechanisms by which relationships between Wikipedia contributors, in the form of signed directed links, can be inferred based on their interactions. Our study sheds light on the principles underlying a signed network captured by social interaction. We investigate whether this network over Wikipedia contributors indeed represents a plausible configuration of link signs by studying its global and local network properties and, at an application level, by assessing its impact on the classification of Wikipedia articles.
Benchkron, Said Soumia. "Bases de données et logiciels intégrés." Paris 9, 1985. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=1985PA090025.
Full textCastelltort, Arnaud. "Historisation de données dans les bases de données NoSQLorientées graphes." Thesis, Montpellier 2, 2014. http://www.theses.fr/2014MON20076.
This thesis deals with data historization in the context of graphs. Graph data have been dealt with for many years, but their exploitation in information systems, especially in NoSQL engines, is recent. The emerging Big Data and 3V contexts (Variety, Volume, Velocity) have revealed the limits of classical relational databases. Historization, for its part, has long been considered as linked only with technical and backup issues, and more recently with decisional purposes (business intelligence); however, historization is now taking on more and more importance in management applications. In this framework, graph databases, although often used, have received little attention regarding historization. Our first contribution consists in studying the impact of historized data in management information systems; this analysis relies on the hypothesis that historization is taking on more and more importance. Our second contribution aims at proposing an original model for managing historization in NoSQL graph databases. This proposition consists, on the one hand, in elaborating a unique and generic system for representing the history and, on the other hand, in proposing query features. We show that the system can support both simple and complex queries. Our contributions have been implemented and tested over synthetic and real databases.
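One common way to realize the kind of generic history representation described above is to keep, for each node property, a chain of time-stamped states instead of overwriting values, so that the graph can be queried "as of" any past instant. A minimal in-memory sketch of that idea (a hypothetical structure for illustration, not the thesis's actual model):

```python
import bisect

class HistorizedNode:
    """A graph node whose property values are never overwritten:
    each write appends a (timestamp, value) version, and reads can
    ask for the state of a property at any past instant."""

    def __init__(self):
        self._times = {}   # property name -> sorted list of timestamps
        self._values = {}  # property name -> values aligned with _times

    def set(self, prop, t, value):
        times = self._times.setdefault(prop, [])
        values = self._values.setdefault(prop, [])
        i = bisect.bisect_right(times, t)  # keep versions time-ordered
        times.insert(i, t)
        values.insert(i, value)

    def get(self, prop, t):
        """Value of `prop` as of time `t` (latest write at or before t)."""
        times = self._times.get(prop, [])
        i = bisect.bisect_right(times, t) - 1
        return self._values[prop][i] if i >= 0 else None

node = HistorizedNode()
node.set("status", 1, "draft")
node.set("status", 5, "published")
print(node.get("status", 3))  # -> draft
print(node.get("status", 9))  # -> published
```

In a real NoSQL graph engine the version chain would itself be stored as nodes and edges, which is what makes both simple ("current value") and complex ("value at time t", "all changes between t1 and t2") queries expressible.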
Padula, Antonio Domingos. "Une méthodologie de diagnostic organisationnel global pour le conseil de direction en PME-PMI." Grenoble 2, 1991. http://www.theses.fr/1991GRE21008.
Small and mid-sized firms are often confronted with management problems beyond their human and material resources, and one of the major expectations of executives is help from consultants in solving such problems. Accounting specialists are naturally called on by such firms; however, they are not currently prepared to satisfy executives' expectations, and they feel the need for specific approaches and tools to ease their professional evolution towards more general managerial solutions for such firms. Considering this situation, we formulated a research problem that takes into account the specificity of consulting for small and mid-sized firms and the development process of such organizations. We then proposed a solution resulting from action research on the issue, so as to conceptualize, formalize and validate a method of overall business diagnosis. By first enriching and then systematizing hitherto dispersed and fragmentary knowledge, we have filled gaps in what is known about consulting for small and mid-sized firms, as well as contributed to a better understanding of their transitional problems.
Maurice, Jonathan. "Fiabilité des provisions comptables environnementales : apports d'une lecture institutionnelle." Phd thesis, Université Montpellier I, 2012. http://tel.archives-ouvertes.fr/tel-00768565.
Paradis, Jocelin. "Modèle théorique de détermination du taux de capitalisation ajusté au risque basé sur des données comptables." Mémoire, Université de Sherbrooke, 1989. http://hdl.handle.net/11143/8233.
Janin, Rémi. "Gestion des chiffres comptables, contenu informationnel du résultat et mesure de la création de valeur." Grenoble 2, 2000. http://www.theses.fr/2000GRE21033.
Full textTos, Uras. "Réplication de données dans les systèmes de gestion de données à grande échelle." Thesis, Toulouse 3, 2017. http://www.theses.fr/2017TOU30066/document.
In recent years, the growing popularity of large-scale applications, e.g. scientific experiments, the Internet of Things and social networking, has led to the generation of large volumes of data. The management of this data presents a significant challenge, as the data is heterogeneous and distributed on a large scale. In traditional systems, including distributed and parallel systems, peer-to-peer systems and grid systems, meeting objectives such as acceptable performance while ensuring good data availability is a major challenge for service providers, especially when the data is distributed around the world. In this context, data replication, as a well-known technique, allows: (i) increased data availability, (ii) reduced data access costs, and (iii) improved fault tolerance. However, replicating data on all nodes is an unrealistic solution, as it generates significant bandwidth consumption in addition to exhausting limited storage space. Defining good replication strategies is a solution to these problems. The data replication strategies proposed for the traditional systems mentioned above are intended to improve performance for the user and are difficult to adapt to cloud systems. Indeed, cloud providers aim to generate a profit in addition to meeting tenant requirements: meeting the performance expectations of the tenants without sacrificing the provider's profit, as well as managing resource elasticity with a pay-as-you-go pricing model, are fundamentals of cloud systems. In this thesis, we propose a data replication strategy that satisfies the requirements of the tenant, such as performance, while guaranteeing the economic profit of the provider. Based on a cost model, we estimate the response time required to execute a distributed database query. Data replication is only considered if, for any query, the estimated response time exceeds a threshold previously set in the contract between the provider and the tenant. The planned replication must then also be economically beneficial to the provider; in this context, we propose an economic model that takes into account both the expenditures and the revenues of the provider during the execution of any particular database query. Once the decision to replicate has been made, a heuristic placement approach is used to find locations for the new replicas in order to reduce access time, and the number of replicas is adjusted dynamically to allow elastic management of resources. The proposed strategy is validated in an experimental evaluation carried out in a simulation environment. Compared with another data replication strategy proposed for cloud systems, the analysis of the obtained results shows that both strategies meet the performance objective for the tenant; nevertheless, with our strategy a data replica is created only if the replication is profitable for the provider.
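The decision rule summarized in this abstract (replicate only when the estimated response time breaches the contractual threshold and the replication remains profitable for the provider) can be sketched as a simple predicate. All names and figures below are hypothetical illustrations, not the thesis's actual cost or economic model:

```python
from dataclasses import dataclass

@dataclass
class QueryEstimate:
    response_time: float      # estimated seconds to answer the query
    sla_threshold: float      # max response time set in the contract
    revenue: float            # what the tenant pays for the query
    replication_cost: float   # storage + transfer cost of a new replica
    time_saved: float         # estimated response-time reduction

def should_replicate(q: QueryEstimate) -> bool:
    """Replicate only if (1) the SLA would otherwise be violated,
    (2) the replica would restore SLA compliance, and
    (3) the provider still earns a profit after paying for it."""
    sla_violated = q.response_time > q.sla_threshold
    helps_meet_sla = q.response_time - q.time_saved <= q.sla_threshold
    still_profitable = q.revenue - q.replication_cost > 0
    return sla_violated and helps_meet_sla and still_profitable

# SLA breached, replica restores compliance and remains profitable:
q = QueryEstimate(response_time=12.0, sla_threshold=8.0,
                  revenue=5.0, replication_cost=1.5, time_saved=6.0)
print(should_replicate(q))  # -> True
```

A query already within its threshold, or one whose replica would cost more than its revenue, would return False, which mirrors the abstract's point that replicas are created only when profitable for the provider.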
Chardonnens, Anne. "La gestion des données d'autorité archivistiques dans le cadre du Web de données." Doctoral thesis, Universite Libre de Bruxelles, 2020. https://dipot.ulb.ac.be/dspace/bitstream/2013/315804/5/Contrat.pdf.
The subject of this thesis is the management of authority records for persons. The research was conducted in an archival context in transition, marked by the evolution of international standards of archival description and a shift towards the application of knowledge graphs. The aim of this thesis is to explore how the archival sector can benefit from developments concerning Linked Data in order to ensure the sustainable management of authority records. Attention is devoted not only to the creation of the records and how they are made available but also to their maintenance and their interlinking with other resources. The first part of this thesis addresses the state of the art of developments concerning the international standards of archival description as well as those regarding the Wikibase ecosystem. The second part presents an analysis of the possibilities and limits associated with an approach in which the free software Wikibase is used. The analysis is based on an empirical study carried out with data from the Study and Documentation Centre War and Contemporary Society (CegeSoma). It explores the options available to institutions that have limited resources and have not yet implemented Linked Data. Datasets containing information on people linked to the Second World War were used to examine the different stages involved in the publication of data as Linked Open Data. The experiment carried out in the second part of the thesis shows how a knowledge base driven by software such as Wikibase streamlines the creation of multilingual structured authority data. Examples illustrate how these entities can then be reused and enriched with external data in interfaces aimed at the general public. This thesis highlights the possibilities of Wikibase, particularly in the context of data maintenance, without ignoring the limitations associated with its use. Due to its empirical nature and the recommendations it formulates, this thesis contributes to the efforts and reflections carried out within the framework of the transition of archival metadata.
Doctorate in Information and Communication
Duquet, Mario. "Gestion des données agrométéorologiques pour l'autoroute de l'information." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/MQ61339.pdf.
Rhin, Christophe. "Modélisation et gestion de données géographiques multi-sources." Versailles-St Quentin en Yvelines, 1997. http://www.theses.fr/1997VERS0010.
Full textRichaud, Emmanuel. "L'Instruction budgétaire et comptable M14 et la gestion financière des petites collectivités locales." Corte, 2003. http://www.theses.fr/2003CORT1022.
The M14 directive is a regulation establishing the financial management framework of districts and inter-district cooperative institutions, notably small communities, and has been in force since 1997. Its aim at the time was to standardize bookkeeping according to the PGC and to give a better patrimonial view of the districts. However, methodical analysis reveals that disregard for financial and accounting principles diverts the directive from its aims.
Demolli, Eric. "Vers un cadre conceptuel d'audit des systèmes d'information comptables et financiers : outils et perspectives." Nice, 1992. http://www.theses.fr/1992NICE0011.
This doctoral dissertation discusses auditing procedures for financial statements. It is based on the observation of a "gap" between accounting normalization (built on principles whose conceptual unity is being established) and its control normalization (focused on control procedures). It discusses the consistency of the audit process through its tools and practices, and considers the introduction of new tools such as expert systems. This leads the author to propose the development of a "conceptual framework" for auditing.
Ouvrard, Stéphane. "Contribution à la connaissance de la performance financière mesurée en normes IFRS." Bordeaux 4, 2010. http://www.theses.fr/2010BOR40023.
Thanks to IFRS, accountancy is no longer a dry matter exclusively reserved for accountants: everybody is interested in it, and even politicians refer to it in their speeches. For IFRS, which have been mandatorily implemented by European listed companies in their consolidated financial statements since 1 January 2005, financial performance measurement is considered a strategic stake. After more than twenty years of existence, the IASB conceptual framework, which privileges investors, is based on the simplest agency theory: analyzing the relationship between a principal (the investor) and an agent (the manager). Is this framework, a source of short-term vision and of volatility in financial statements, still adapted to the current stakes of our society, particularly in a period of crisis? The inclusion of finance in the accounting scope relaunches the debate on fair value and, more widely, on two accounting approaches: the balance-sheet approach, from which comprehensive income is derived, and the profit-and-loss approach. These different conceptions of accounting lead us to question the role of a firm. Can we consider an enterprise as a collection of portfolios for the benefit of privileged stakeholders (investors)? Or, on the contrary, is its role larger: wealth creation for the benefit of all stakeholders? Our research aims to show that it is urgent to come back to the fundamentals of financial management: understanding and analyzing operational performance through the business model of a company. The IFRS standards on segment data and on cash flow statements are very useful for reaching this goal, since they allow the determination of long-term sustainable operating performance indicators.
Zelasco, José Francisco. "Gestion des données : contrôle de qualité des modèles numériques des bases de données géographiques." Thesis, Montpellier 2, 2010. http://www.theses.fr/2010MON20232.
Full textA Digital Surface Model (DSM) is a numerical surface model formed by a set of points, arranged as a grid, used to study some physical surface: terrain in the case of Digital Elevation Models (DEMs), or other objects such as a face or an anatomical organ. The study of the precision of these models, which is of particular interest for DEMs, has been the object of several studies in the last decades. Measuring the precision of a DSM relative to another model of the same physical surface consists in estimating the expectation of the squared differences between pairs of points, called homologous points, one in each model, corresponding to the same feature of the physical surface. But these pairs are not easily discernible, the grids may not be coincident, and the differences between homologous points corresponding to benchmarks on the physical surface might be subject to special conditions, such as more careful measurements than on ordinary points, which imply a different precision. The procedure generally used to avoid these inconveniences has been to use the squared vertical distances between the models, which only addresses the vertical component of the error, thus giving a biased estimate when the surface is not horizontal. The Perpendicular Distance Evaluation Method (PDEM) avoids this bias, provides estimates for the vertical and horizontal components of the error, and is thus a useful tool for detecting discrepancies in Digital Surface Models such as DEMs. The solution includes a special reference to the simplification which arises when the error does not vary across horizontal directions. The PDEM is also assessed with DEMs obtained by means of SAR interferometry
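The vertical-distance bias the abstract describes can be reproduced numerically. The following sketch is purely illustrative (not the thesis's code): two models of the same tilted plane differ by a known error measured perpendicular to the surface, and vertical differencing inflates that error by a factor of 1/cos(slope).

```python
import math

def vertical_vs_perpendicular(true_error=1.0, slope_deg=30.0):
    """Illustrate the bias of vertical-distance DEM comparison.

    Two models of the same tilted plane differ by `true_error`
    measured perpendicular to the surface.  The vertical distance
    between them overestimates that error by 1/cos(slope).
    """
    theta = math.radians(slope_deg)
    vertical = true_error / math.cos(theta)   # what vertical differencing reports
    perpendicular = true_error                # what a PDEM-style evaluation reports
    return vertical, perpendicular

v, p = vertical_vs_perpendicular(1.0, 30.0)
print(f"vertical estimate: {v:.3f}, perpendicular (true): {p:.3f}")
# On a 30-degree slope the vertical estimate inflates the error by about 15%.
```

On flat terrain the two estimates coincide; the steeper the surface, the larger the bias that a perpendicular-distance evaluation avoids.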
Sandoval, Gomez Maria Del Rosario. "Conception et réalisation du système de gestion de multibases de données MUSE : architecture de schéma multibase et gestion du catalogue des données." Paris 6, 1989. http://www.theses.fr/1989PA066657.
Full textLiroz, Miguel. "Partitionnement dans les systèmes de gestion de données parallèles." Phd thesis, Université Montpellier II - Sciences et Techniques du Languedoc, 2013. http://tel.archives-ouvertes.fr/tel-01023039.
Full textLiroz-Gistau, Miguel. "Partitionnement dans les Systèmes de Gestion de Données Parallèles." Phd thesis, Université Montpellier II - Sciences et Techniques du Languedoc, 2013. http://tel.archives-ouvertes.fr/tel-00920615.
Full textPetit, Loïc. "Gestion de flux de données pour l'observation de systèmes." Phd thesis, Université de Grenoble, 2012. http://tel.archives-ouvertes.fr/tel-00849106.
Full textLiroz, Gistau Miguel. "Partitionnement dans les systèmes de gestion de données parallèles." Thesis, Montpellier 2, 2013. http://www.theses.fr/2013MON20117/document.
Full textDuring the last years, the volume of data that is captured and generated has exploded. Advances in computer technologies, which provide cheap storage and increased computing capabilities, have allowed organizations to perform complex analyses on this data and to extract valuable knowledge from it. This trend has been very important not only for industry, but has also had a significant impact on science, where enhanced instruments and more complex simulations call for efficient management of huge quantities of data. Parallel computing is a fundamental technique in the management of large quantities of data, as it leverages the concurrent utilization of multiple computing resources. To take advantage of parallel computing, we need efficient data partitioning techniques, which are in charge of dividing the whole dataset and assigning the partitions to the processing nodes. Data partitioning is a complex problem, as it has to consider different and often contradicting issues, such as data locality, load balancing and maximizing parallelism. In this thesis, we study the problem of data partitioning, particularly in scientific parallel databases that are continuously growing and in the MapReduce framework. In the case of scientific databases, we consider data partitioning in very large databases to which new data is appended continuously, e.g. in astronomical applications. Existing approaches are limited, since the complexity of the workload and the continuous appends restrict the applicability of traditional approaches. We propose two partitioning algorithms that dynamically partition new data elements using a technique based on data affinity. Our algorithms obtain very good data partitions in a low execution time compared to traditional approaches. We also study how to improve the performance of the MapReduce framework using data partitioning techniques.
In particular, we are interested in efficient partitioning of the input datasets to reduce the amount of data that has to be transferred in the shuffle phase. We design and implement a strategy which, by capturing the relationships between input tuples and intermediate keys, obtains an efficient partitioning that can be used to significantly reduce MapReduce's communication overhead
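The shuffle-reduction idea can be illustrated with a toy simulation (a hypothetical example, not the thesis's implementation): if each input tuple is placed on the node that hosts the reducer of the intermediate key it emits, no (key, value) pair has to cross the network during the shuffle phase.

```python
from collections import defaultdict

def shuffle_volume(assignment, tuple_keys, reducer_of):
    """Count records crossing the network in the shuffle phase:
    a (key, value) pair is transferred when the mapper node that
    emitted it differs from the reducer node owning its key."""
    volume = 0
    for t, node in assignment.items():
        for k in tuple_keys[t]:
            if reducer_of(k) != node:
                volume += 1
    return volume

# Toy workload: 6 input tuples, each emitting one intermediate key.
tuple_keys = {0: ["a"], 1: ["a"], 2: ["b"], 3: ["b"], 4: ["c"], 5: ["c"]}
reducer_of = lambda k: {"a": 0, "b": 1, "c": 0}[k]

# Round-robin input placement versus key-affinity placement.
naive = {t: t % 2 for t in tuple_keys}
affinity = {t: reducer_of(tuple_keys[t][0]) for t in tuple_keys}

print(shuffle_volume(naive, tuple_keys, reducer_of))     # prints 3
print(shuffle_volume(affinity, tuple_keys, reducer_of))  # prints 0
```

In a real MapReduce job a tuple may emit many keys and the mapping is only known after execution, which is why the strategy described above has to capture the tuple-to-key relationships from observed runs.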
Gürgen, Levent. "Gestion à grande échelle de données de capteurs hétérogènes." Grenoble INPG, 2007. http://www.theses.fr/2007INPG0093.
Full textThis dissertation deals with the issues related to scalable management of heterogeneous sensor data. In fact, sensors are becoming less and less expensive, more and more numerous and heterogeneous. This naturally raises the scalability problem and the need for integrating data gathered from heterogeneous sensors. We propose a distributed and service-oriented architecture in which data processing tasks are distributed at several levels in the architecture. Data management functionalities are provided in terms of "services", in order to hide sensor heterogeneity behind generic services. We also deal with system management issues in sensor farms, a subject not yet explored in this context
Etien-Gnoan, N'Da Brigitte. "L'encadrement juridique de la gestion électronique des données médicales." Thesis, Lille 2, 2014. http://www.theses.fr/2014LIL20022/document.
Full textThe electronic management of medical data covers both the simple automated processing of personal data and the sharing and exchange of health data. Its legal framework is provided both by the rules common to the automated processing of all personal data and by those specific to the processing of medical data. This management, even if it is a source of savings, creates privacy protection issues, which the French government tries to address by creating one of the best legal frameworks in the world in this field. However, major projects such as the personal health record are still waiting to be implemented, and health law is constantly overtaken and driven by technological advances. The development of e-health disrupts the one-to-one dialogue between the caregiver and the patient. The extension of patients' rights, shared responsibility, the increasing number of players and shared medical confidentiality pose new challenges that must now be reckoned with. Another crucial question is posed by the lack of harmonization of legislation, which increases the risks in cross-border sharing of medical data
Tort, Éric. "Contribution à la connaissance des systèmes comptables des grandes entreprises en France : approches organisationnelles, managériales et pratiques de gestion : bilan et perspectives." Paris 1, 2001. http://www.theses.fr/2001PA01A001.
Full textFaye, David Célestin. "Médiation de données sémantique dans SenPeer, un système pair-à-pair de gestion de données." Phd thesis, Université de Nantes, 2007. http://tel.archives-ouvertes.fr/tel-00481311.
Full textDjellalil, Jilani. "Conception et réalisation de multibases de données." Lyon 3, 1989. http://www.theses.fr/1989LYO3A003.
Full textMard, Yves. "Déterminants et instruments de la gestion des résultats comptables : étude empirique sur un échantillon d'entreprises françaises cotées." Aix-Marseille 3, 2002. http://www.theses.fr/2002AIX32045.
Full textStudies on earnings management analyse managers' decisions that influence earnings. The purpose of our research is to study the determinants and instruments of earnings management, and to draw practical lessons from this study. Our process consists of three stages. First, a survey leads to hypotheses about the determinants of earnings management. We then test the hypotheses on a sample of 294 French listed firms over the period 1990-1998. Two methodologies are used: the study of earnings distributions and the study of firms' accruals. Aggregate and sector-based analyses show the diversity of the determinants (financial policy, performance, control) and instruments of earnings management. We finally propose some ways to reduce earnings management: the reinforcement of governance structures, greater auditor independence, and the evolution of accounting and financial regulation
Cho, Choong-Ho. "Structuration des données et caractérisation des ordonnancements admissibles des systèmes de production." Lyon, INSA, 1989. http://www.theses.fr/1989ISAL0053.
Full textThis work deals, on the one hand, with the specification and modeling of databases for scheduling problems in a hierarchical architecture of manufacturing systems, and on the other hand, with the analytical specification of the set of feasible solutions for decision-support scheduling problems in three different types of workshops: first, workshops made up of several machines (flowshop: the sequence of operations is the same for all jobs), considering set-up times as the important criterion, under set (task group) and potential constraints; second, workshops with only one machine, under job due-date constraints; finally, workshops organised as a jobshop, under the three previous kinds of constraints: set, potential and due dates. One original contribution concerns a new structure, PQR trees, used to characterise the set of feasible task sequences
Guégot, Françoise. "Gestion d'une base de données mixte, texte et image : application à la gestion médicale dentaire." Paris 9, 1989. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=1989PA090042.
Full textIn the framework of organizational data processing, we have shown, on an actual example (a dental surgeon's practice), that image display constitutes a bonus which may prove decisive in decision making. This should be considered when laying down the principles governing a mixed database management system. A base of text data will be constituted through an S.I.A.D. (decision support system) generator, which will also perform the necessary processing of said data. A base of image data will be established in parallel with the former, from an inventory of the various image processing techniques. Finally, both bases will be connected to form the mixed data management system
Le, Mahec G. "Gestion des bases de données biologiques sur grilles de calculs." Phd thesis, Université Blaise Pascal - Clermont-Ferrand II, 2008. http://tel.archives-ouvertes.fr/tel-00462306.
Full textPierkot, Christelle. "Gestion de la Mise à Jour de Données Géographiques Répliquées." Phd thesis, Université Paul Sabatier - Toulouse III, 2008. http://tel.archives-ouvertes.fr/tel-00366442.
Full textThe military institution also uses spatial data to support decision-making. At every stage of a mission, geographic information of all types (digital data, paper maps, aerial photographs...) is used to help units in their strategic choices. Moreover, the use of communication networks fosters the sharing and exchange of spatial data between producers and users located in different places. Information is not centralized: data are replicated on each site, and users may occasionally be disconnected from the network, for example when a mobile unit takes measurements in the field.
The main problem therefore concerns the management, in a military context, of a collaborative application allowing the asynchronous and symmetric update of replicated geographic data under an optimistic weak-consistency protocol. This requires defining a consistency model appropriate to the military context, a mechanism for detecting conflicting updates tied to the type of data handled, and procedures for reconciling divergent writes adapted to the needs of the units participating in the mission.
A review of related work shows that several protocols have been defined in the systems (Cederqvist 2001; Kermarrec 2001) and database (Oracle 2003; Seshadri 2000) communities to manage data replication. However, the proposed solutions are often tailored to the specific needs of one application, and are thus not reusable in a different context, or they assume the existence of a reference server centralizing the data. The mechanisms used in geographic information systems to manage data and updates are not appropriate for our study either, since they assume that data are locked against other users until the updates have been integrated (check-in/check-out approach, ESRI 2004), or use a centralized server holding the reference data (versioning, Cellary 1990).
Our objective is therefore to propose solutions allowing the consistent and, as far as possible, automatic integration of spatial data updates in an optimistic, multi-master, asynchronous replication environment.
We propose a global strategy for integrating spatial updates, based on consistency checking coupled with update sessions. The originality of this strategy lies in the fact that it relies on metadata to provide reconciliation solutions adapted to the particular context of a military mission.
The contribution of this thesis is twofold. First, it belongs to the field of spatial data update management, a field that remains very active because of the complexity and heterogeneity of the data (we nevertheless limit our study to vector geographic data) and the relative youth of work on the subject. Second, it belongs to the field of consistency management for data replicated under an optimistic protocol, specifying in particular new algorithms for the detection and reconciliation of conflicting data, in the application domain of geographic information.
Gagnon, Bertrand. "Gestion d'information sur les procédés thermiques par base de données." Thesis, McGill University, 1986. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=65447.
Full textAntoine, Émilien. "Gestion des données distribuées avec le langage de règles: Webdamlog." Phd thesis, Université Paris Sud - Paris XI, 2013. http://tel.archives-ouvertes.fr/tel-00908155.
Full textDia, Amadou Fall. "Filtrage sémantique et gestion distribuée de flux de données massives." Electronic Thesis or Diss., Sorbonne université, 2018. http://www.theses.fr/2018SORUS495.
Full textOur daily use of the Internet and related technologies generates, at rapid and variable speeds, large volumes of heterogeneous data from sensor networks, search engine logs, multimedia content sites, weather forecasting, geolocation, Internet of Things (IoT) applications, etc. Processing such data in conventional relational database management systems can be very expensive in terms of time and memory resources. To respond effectively to the needs of rapid decision-making, these streams require real-time processing. Data Stream Management Systems (DSMSs) evaluate queries on the recent data of a stream within structures called windows. The input data come in different formats, such as CSV, XML, RSS, or JSON. This heterogeneity barrier stems from the nature of data streams and must be resolved. To this end, several research groups have taken advantage of semantic web technologies (RDF and SPARQL) by proposing RDF stream processing systems, called RSPs. However, large volumes of RDF data, high-rate input streams, concurrent queries, the combination of RDF streams with large volumes of stored RDF data, and expensive processing drastically reduce the performance of these systems. A new approach is required to considerably reduce the processing load of RDF data streams. In this thesis, we propose several complementary solutions to reduce the processing load in a centralized environment. An on-the-fly RDF graph stream sampling approach is proposed to reduce the data and processing load while preserving semantic links. This approach is deepened by adopting a graph-oriented summary approach to extract the most relevant information from RDF graphs, using centrality measures from social network analysis. We also adopt a compressed format for RDF data and propose an approach for querying compressed RDF data without a decompression phase.
To ensure parallel and distributed data stream management, this work also proposes two solutions for reducing the processing load in a distributed environment: an engine and approaches for the parallel and distributed processing of RDF graph streams. Finally, an optimized processing approach for combining static and dynamic data is also integrated into a new distributed RDF graph stream management system
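As a rough illustration of centrality-based summarization of an RDF graph window, the sketch below keeps only the triples incident to the highest-degree nodes. This is an assumption-laden toy (plain degree centrality over an in-memory window, hypothetical `:name` identifiers), not the thesis's actual sampling algorithm.

```python
from collections import Counter

def sample_by_degree(triples, keep_ratio=0.5):
    """Keep the triples incident to the highest-degree nodes of an
    RDF graph window: a crude degree-centrality summary in the
    spirit of the approach described above (illustrative only)."""
    degree = Counter()
    for s, p, o in triples:
        degree[s] += 1
        degree[o] += 1
    n_keep = max(1, int(len(degree) * keep_ratio))
    kept_nodes = {n for n, _ in degree.most_common(n_keep)}
    # A triple survives if its subject or object is a kept node.
    return [t for t in triples if t[0] in kept_nodes or t[2] in kept_nodes]

window = [
    (":alice", ":knows", ":bob"),
    (":alice", ":knows", ":carol"),
    (":alice", ":worksAt", ":acme"),
    (":dan", ":knows", ":erin"),
]
print(sample_by_degree(window, keep_ratio=0.25))
# Keeps the three triples around :alice, the most connected node.
```

More elaborate measures (betweenness, closeness) from social network analysis can be substituted for the degree count without changing the overall structure of the filter.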
Le, Mahec Gaël. "Gestion des bases de données biologiques sur grilles de calcul." Clermont-Ferrand 2, 2008. http://www.theses.fr/2008CLF21891.
Full textCheballah, Kamal. "Aides à la gestion des données techniques des produits industriels." Ecully, Ecole centrale de Lyon, 1992. http://www.theses.fr/1992ECDL0003.
Full text